On Friday, we were asked to review a technical problem a neighbour was experiencing. It turned out that a MySQL server had somehow corrupted its data file during a power cycle. Although the user thought they had been backing up their data, what was actually being saved was a snapshot of their database from 2007. The upshot: their database - in daily use since 2007 - was now hosed. The more the system was investigated, the clearer it became that the data was gone for good. While it is embarrassing to lose your database, to lose it because your backup was useless is verging on the criminal. Laugh or snigger if you want, but before you do, make sure

  • you have a backup strategy
  • and a disaster recovery plan
  • and, most important of all, that both actually do what they are supposed to do
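That last point is the one that bit our neighbour: a backup you have never restored is just a hope. A minimal sketch of the habit looks something like the script below. It uses tar on a throwaway directory so it can run anywhere; the names (SRC, DEST, CHECK) and the tar-based backup are illustrative assumptions, not a prescription - for a live MySQL database the same shape applies with mysqldump followed by a trial restore into a scratch database.

```shell
#!/bin/sh
# Sketch of a backup-and-verify loop. Illustrative only: for MySQL you
# would replace the tar step with mysqldump and the restore step with a
# load into a throwaway database, then compare row counts or checksums.
set -eu

SRC=$(mktemp -d)    # stand-in for the data you care about
DEST=$(mktemp -d)   # where backups land
CHECK=$(mktemp -d)  # scratch area for the restore test

echo "orders: 42" > "$SRC/db.txt"

# 1. Take the backup.
tar -czf "$DEST/backup.tgz" -C "$SRC" .

# 2. Restore it somewhere else - never on top of the live data.
tar -xzf "$DEST/backup.tgz" -C "$CHECK"

# 3. Prove the restored copy matches what you backed up. If this step
#    is missing from your routine, you do not have a backup strategy;
#    you have a superstition.
if diff -r "$SRC" "$CHECK" >/dev/null; then
    echo "backup verified"
else
    echo "backup is NOT restorable" >&2
    exit 1
fi
```

The point of step 3 is that verification must happen on every run, not once at setup time - a job that silently dumped a 2007 snapshot for years would fail this check on day one.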

A couple of years ago, a few of our core servers were running on rather old hardware and were regularly subjected to quite serious levels of stress. Not surprisingly, over a period of a couple of years we had a succession of disk failures. Fortunately we were able to recover within half a day of each calamitous failure. Nothing like a real-world test to keep you on your toes, and nothing like a real-world example to remind you how wrong things can get!