Over the weekend, I read Alan Zeichick’s article Protect your developer data and it made me smile, as I’ve been through the exact same thing many times over the years. Of course, it also made me think: are people and companies backing up their data, or just flying by the seat of their pants? Do people think that backing up their data is something they may do someday, but not today?
For me, I can’t take any chances, as my life exists in the digital world (so to speak) and I could never afford or tolerate any data loss. I have put tens of thousands of hours into creating software products (commercial and open source), documentation, tutorials, web sites, etc., and if some or all of it disappeared, I couldn’t simply recreate it without a whole lot of work (actually, it would be devastating). So, I’m a little anal about backing up my data.
A little history on my desktop PC setup – it’s a little different, kind of like me. Back in the ’90s, I used to have my PC set up to triple boot (OS/2, Windows and Linux). To get access to my programs and documents from any OS, I had to separate my files from the OSes, so I created a separate partition for each OS and other partitions to hold programs and documents. Over time, I found it easier to have separate PCs to run OS/2 (I don’t use OS/2 any more), Windows and Linux, and use VNC or telnet to them to do what I need to do. But the partitioning on my desktop PC stuck – plus I have had the ‘blue screen of death’ too many times to actually trust Windows!!
So, here’s how I partition the hard drives on my desktop PC:
- The C Drive only contains the Windows OS and absolutely nothing else. I do NOT use the Windows’ Documents directory or any other Windows directory that would put data on the C Drive.
- The D Drive is where I install all of the applications I use, e.g. LibreOffice, Firefox, Eclipse, the JDK, WinAmp, etc.
- The E Drive is where I store all of the documentation, pictures, photos and business data that I create, plus a huge PDF collection (manuals, docs, etc.) for anything and everything.
- The F Drive is for source code. It contains my Eclipse workspace, Visual Studio code base, build scripts plus I have hundreds of thousands of code samples I have collected and categorized over the years.
- The G Drive is my junk collector. It holds Firefox’s Download directory, ‘temp’ directories and a ‘MySoftware’ directory that contains all of the software that I have downloaded from the internet and installed.
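If you want to script a layout like this, Windows’ diskpart can do it. Here’s a rough sketch for the first drive (C and G) – the disk number and partition sizes are illustrative placeholders, not my actual sizes, so adjust them for your own hardware:

```shell
rem Sketch only - disk 0 and the 120GB size are illustrative placeholders.
rem Save as layout.txt and run from an elevated prompt:  diskpart /s layout.txt
select disk 0
clean
create partition primary size=122880
format fs=ntfs label="Windows" quick
assign letter=C
create partition primary
format fs=ntfs label="Junk" quick
assign letter=G
```

The same idea, with three `create partition` blocks, carves the second drive into D, E and F.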
If you are curious how I do the partitioning from a hardware perspective, I have two SSDs (Solid State Drives) in my desktop PC. The first SSD is partitioned as the C and G Drives, and the second SSD is partitioned as the D, E and F Drives. Why split them like that? Well, I’ll tell you. The two most heavily used partitions are the C (Windows OS) and D (all of the applications) drives, so I wanted them on separate physical drives to get the best throughput. Also, the G Drive is rarely used, so it made sense to pair it with the C Drive partition. So, that’s the why!!
Also, the reason I still keep nothing on the C Drive except for Windows is that roughly 5 years ago, I got the ‘blue screen of death’ (running WinXP) and Windows would not start except in bare-bones ‘safe mode’. I tried to repair it and searched the internet (from another PC) for a solution, but in the end, I formatted the C Drive and re-installed Windows. If I had had any data on the C Drive, I would have lost it. So, keeping my data on separate partitions is well worth the one minute of time it takes to create them.
So, back to Alan Zeichick’s article. One of the backup procedures (more on this in a minute) that I use is to do daily and weekly backups to my NAS (Network Attached Storage). After a lot of research, the first NAS I purchased was a Buffalo TeraStation Terabyte (1TB) NAS back in 2005. It had four 256GB drives in it for a total of 1TB. I set it up as RAID-5, which meant I got roughly 750GB of usable storage – fine for my needs back then. It ran fine for several years and then one day I got the ‘red blinking light’. Since this was new to me, I freaked out. After I realized that my data was still safe, I followed the procedure and replaced the bad drive. I ran the reconfigure option and a few hours later, the red light was gone and everything was back to normal.
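If you’re wondering where the 750GB figure comes from: RAID-5 spends one drive’s worth of space on parity, so usable capacity is roughly (number of drives − 1) × drive size. A quick back-of-the-envelope check:

```shell
# RAID-5 usable capacity: one drive's worth of space goes to parity.
drives=4
drive_gb=256
usable_gb=$(( (drives - 1) * drive_gb ))
echo "usable: ${usable_gb} GB"   # 768 GB raw, roughly 750GB after formatting overhead
```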
In 2009 (or 2010, I can’t remember), I got another ‘red blinking light’ on the nearly full NAS. I thought I could either replace the 256GB hard drive or simply get a new Buffalo TeraStation III 4TB NAS. I opted for the new NAS, since the old one was nearly full. I configured the new NAS in a RAID-5 configuration and copied everything from the old NAS to the new NAS.
A couple of months ago (in 2013), I got the ‘red blinking light’ on the 4TB NAS, so I searched around for a replacement 1TB drive and found that Best Buy happened to be dumping Seagate Barracuda 1TB drives for $74.99, so I bought four. Yes, I only needed one, but the price was excellent and they may be hard to find in the future, so it seemed logical to stock up now. I put them on the shelf right next to the NAS. The next time I see a ‘red blinking light’, I’ll just grab one of the new drives, pop out the dead drive and put in a brand new one. Oh, I forgot to mention: the NAS supports hot-swappable drives, which means you don’t have to turn it off to swap out a drive – very cool!!
Besides using the NAS as a backup device, I create ‘shared folders’ on it. I have a ‘music’ share where I put all of my converted CDs (yes, I still buy CDs) – a little over 11,000 songs. Hence, no matter what computer I am using (Windows, Mac or Linux), I can access the ‘music’ share.
Now, back to my anal backup procedures (you’ll see in a minute why I describe them that way). There are two pieces of backup software that I rely on: Cobian Backup and Robocopy.
- On a daily basis, I use Robocopy to ‘clone’ my desktop PC onto my laptop (and yes, the laptop is configured exactly the same as the desktop PC).
- On the desktop PC, I have a daily Windows task to run Robocopy to do a differential backup to the NAS.
- On the desktop PC, I have a daily Windows task to run a UFM Workflow to compress selected directories, encrypt the zip files and upload the files to a secure online resource (maybe I should do a ‘UFM How-To’ on this).
- On the desktop PC, I have Cobian Backup set up to do full weekly backups to the NAS.
- Every 4-6 weeks, I backup (burn) everything to several ‘Blu-Ray’ discs and store the discs at an offsite location that is more than 10km from the office.
- The NAS sits in a locked storage room, 6 feet (about 2 meters) off the floor, where most of the walls and the floor are concrete. It is plugged into an APC XS 900 UPS. Hence, there is a high probability that it will survive an electrical brownout, a short-term electricity loss or a fire.
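To give you an idea of what the daily Robocopy job looks like, here is a rough sketch of a batch file plus the scheduled task that runs it. The NAS share path, log location and task name below are just placeholders, not my actual setup:

```shell
:: nas-backup.bat - sketch only; share paths, log files and the task name are placeholders.
:: /MIR mirrors the tree (only changed files get copied, and deletions propagate),
:: /R and /W limit retries/waits, and /LOG+ appends a log I can check in the morning.
robocopy E:\ \\mynas\backup\E /MIR /R:2 /W:5 /LOG+:G:\temp\backup-E.log
robocopy F:\ \\mynas\backup\F /MIR /R:2 /W:5 /LOG+:G:\temp\backup-F.log

:: Register it as a daily Windows task (run once from an elevated prompt):
schtasks /Create /TN "NightlyNASBackup" /TR "G:\temp\nas-backup.bat" /SC DAILY /ST 02:00
```

Note that /MIR deletes files on the destination that no longer exist on the source, which is exactly why the weekly full backups and offsite discs matter too.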
So, those are the backup procedures that I have set up. Are they complete? I think I have covered 99.999% of the situations that could cause data loss. Are they a little over the top? Probably. But better safe than sorry.
If you don’t back up your data, please take a moment and implement a solution. You don’t have to do what I do, as there are plenty of easy solutions, like external backup hard drives. For those of you who use digital cameras or take pictures on your phone: what would you do if you lost ALL of your pictures because your hard drive crashed or your PC was stolen?
Food for thought.