Having suffered hard disk failures in the past, and having been responsible for client data (software and websites) for around 17 years, I admit to being paranoid about data loss. I try to operate as if every hard disk is bound to fail when you least expect it, and I therefore have backup strategies in place that try to minimise the impact when it happens.
At another level I admit to being anxious that we are the throw-away generation whose memories – at least in photographic terms – won’t outlive us. Who amongst us can remember, as children, the curious thrill of rifling through a collection of old family snap-shots that had survived any number of power cuts? A shoe box and a sturdy desk drawer are no longer sufficient to cradle and protect our digital versions of these memories.
And so to the myriad of files that we produce daily and whose volume grows exponentially …
My main development machine has three hard disk drives, one for Windows and software, the other two for data. I use a mixed bag of external hard drives, two in their own enclosures, and a handful which can be plugged into a caddy.
With over a million files on my main machine, I’ve recently gone back to making use of RAID disks and have two big SATA drives in an external enclosure in RAID 1 configuration where both disks carry the same data and each can be hot-swapped if one fails.
Unplugging both the power and data cables from such drives is essential, especially here in south-west France where storms can be astonishingly violent – and quick to appear. Indeed, a while back a storm here burnt the data bus on an external drive that was not even plugged in. The drive was untouched, but its enclosure had to be replaced. That was a ‘lucky strike’ as they say.
For longer-term storage, DVD+RW is a quick and easy solution, and I ensure that all website projects get squirted onto at least two discs. I also try to ensure that any client work gets pushed onto one just before evening shut-down.
The 4.7GB ceiling of DVD+RW is a bit of a headache, and I recently got round this by installing a Pioneer BDR-205 Blu-ray drive that writes to rewritable 50GB discs. This gobbles up the whole of my system’s Drupal development stack of around 60k files in an hour. Awesome.
I’ve written before about Second Copy software and can’t stress enough how simple it is to configure and use. Fire it up, double-click a project icon and a whole directory tree of files gets shunted across to the designated drive. I’ve written endless software utilities in the past that do the same, but SC is a cheaper and more flexible option. I’ve also just discovered RoboCopy, which comes with Windows 7 and is a command-line file copy utility of extraordinary power.
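For the curious, the guts of such a utility boil down to surprisingly little. Here’s a minimal Python sketch (my own illustration, not Second Copy’s or RoboCopy’s actual logic) that mirrors a directory tree, copying only files that are new or have changed:

```python
import os
import shutil

def mirror_tree(source, destination):
    """Copy a directory tree across, skipping files whose size and
    modification time are unchanged at the destination."""
    for dirpath, _dirnames, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        target_dir = os.path.join(destination, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            if os.path.exists(dst):
                s, d = os.stat(src), os.stat(dst)
                # Same size and timestamp: assume unchanged, skip it
                if s.st_size == d.st_size and int(s.st_mtime) == int(d.st_mtime):
                    continue
            shutil.copy2(src, dst)  # copy2 preserves timestamps
```

RoboCopy’s /MIR switch does a comparable (and far more robust) mirror from the command line, including deleting destination files that no longer exist at the source.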
One of the things to watch out for with CD/DVD-burning software is whether it can copy standard .htaccess or .htpasswd files. (Roxio can’t but Nero can.)
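A quick way to catch this after a burn (again a hypothetical little Python snippet of my own, not part of either product) is to compare the dot-files on either side and list any that went missing:

```python
import os

def missing_dotfiles(source, copy):
    """Return dot-files (e.g. .htaccess, .htpasswd) present under
    source but absent from the corresponding place in the copy."""
    missing = []
    for dirpath, _dirnames, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        for name in filenames:
            if name.startswith("."):
                if not os.path.exists(os.path.join(copy, rel, name)):
                    missing.append(os.path.join(rel, name))
    return missing
```

An empty list back means every dot-file survived the trip.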
There’s also Acronis True Image, which I use as a heavyweight backup solution, manually burning the resulting archive file to DVD+RW or BD-RE discs. Acronis compresses 60k files that occupy 8GB of disk space into a single 1.4GB archive file, with everything dutifully included. One can even peek inside these monster archives with Windows Explorer and pull out individual files.
All of this used to be somewhat laborious, a necessary labour that was part of offering a professional service to clients. I now find it a snip thanks to the astonishing leap made in technology. For example, two of the hard drives in my everyday machine are SSDs. These ‘solid state drives’ have no moving parts and have read/write speeds that are dramatically higher than ‘traditional’ hard drives. I use an OCZ Vertex 2E (see below) and an Intel X-25. The former hosts Windows and all the software whilst the latter acts merely as a temporary drive for backups and as a scratch disk for Photoshop. Not only is Windows 7 ready to use in under 30 seconds, but the shovelling around of files that happens during backups throughout the working day goes on behind the scenes without too much of a fuss.
I will probably be hoist by my own petard in this regard, but at the moment the due safeguarding of important files seems to be working well.
Of course, there’s always on-line backup which can be set to run perpetually as a background service. Very nice idea (privacy/security issues aside), but not workable in our rural neck of the woods!
Update x 2
- February 2016 and we’re back in the UK with fast broadband: so many cloud backup options we’re spoilt for choice!
- August 2016: an interesting article in the Guardian about a small team of Australian archivists trying to access and then preserve the contents of Germaine Greer’s digital file collection (in obsolete file formats on obsolete media). This links to BitCurator, a software environment that “uses open source and public domain digital forensics, data triage, and metadata reprocessing tools”. Check it out. You may need it!