Continuous Local Backup

Overview

Currently I have an off-site backup which usually occurs on a monthly basis, but I had no solution for a daily or even weekly backup.

Concept

The idea of a daily or weekly backup was not new; it allows regular backups with quick access to the data. The concept of continuous backup means that as soon as changes occur, they are written to the backup. While that is possible, I felt an instantaneous backup was too quick, especially if a virus or an accidental delete occurred, so I settled on a 2 hour delay.
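
To make the delay concrete, here is a rough Python sketch of the idea (SBS handles the scheduling itself; the paths and the run_backup stub are made up for illustration). It polls the source folder and only triggers a backup once the newest change is at least two hours old, which gives a window to catch a virus or an accidental delete before it gets mirrored.

```python
import os
import time

SOURCE = r"C:\Users\Danny\Documents"  # hypothetical source folder
DELAY = 2 * 60 * 60                   # 2 hour grace period, in seconds
POLL = 5 * 60                         # re-check every 5 minutes

def newest_mtime(root):
    """Most recent modification time of any file under root."""
    newest = 0.0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                newest = max(newest, os.path.getmtime(os.path.join(dirpath, name)))
            except OSError:
                pass  # file locked or deleted mid-scan; ignore it
    return newest

def run_backup():
    print("backup would run here")  # stand-in for kicking off the real profile

last_run = 0.0
while True:
    changed = newest_mtime(SOURCE)
    # Back up only if something changed since the last run and the newest
    # change has been left alone for at least the grace period.
    if changed > last_run and time.time() - changed >= DELAY:
        run_backup()
        last_run = time.time()
    time.sleep(POLL)
```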

SyncBack SE

SyncBack SE (SBS) uses profiles as backup tasks. The profiles were set up to mirror the source folder to the destination, including deletes. Some filters were applied beyond the defaults, such as files ending with ".tmp" or ".log". I also chose to exclude AppData directories that were locked by running programs or simply not needed, for example Skype's data and Firefox's cache folder.
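
SBS does the mirroring itself, but as a rough picture of what a "mirror with filters" profile boils down to, here is a short Python sketch (the paths, extensions and folder names are illustrative only, not my actual profile):

```python
import os
import shutil

SOURCE = r"C:\Users\Danny\Documents"   # hypothetical source
DEST   = r"E:\Backup\Documents"        # hypothetical destination on the backup drive
SKIP_EXT  = {".tmp", ".log"}           # filtered file types
SKIP_DIRS = {"AppData"}                # locked or unneeded folders

def mirror(src, dst):
    """Copy new or changed files to dst, then delete anything in dst that is
    no longer in src, skipping the filtered names (deletes are mirrored too)."""
    kept = set()
    for dirpath, dirnames, filenames in os.walk(src):
        dirnames[:] = [d for d in dirnames if d not in SKIP_DIRS]  # prune excluded dirs
        for name in filenames:
            if os.path.splitext(name)[1].lower() in SKIP_EXT:
                continue
            rel = os.path.relpath(os.path.join(dirpath, name), src)
            kept.add(rel)
            s, d = os.path.join(src, rel), os.path.join(dst, rel)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                os.makedirs(os.path.dirname(d), exist_ok=True)
                shutil.copy2(s, d)
    for dirpath, _, filenames in os.walk(dst):
        for name in filenames:
            rel = os.path.relpath(os.path.join(dirpath, name), dst)
            if rel not in kept:
                os.remove(os.path.join(dirpath, name))  # propagate the delete

mirror(SOURCE, DEST)
```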

SyncBack Log
SBS can generate reports as text or HTML. These can be saved to file or even emailed out. It supports log rolling, and I chose to keep the logs in Dropbox and only be alerted (by opening the HTML report) when a backup failed. The reports are very detailed, but one detail I found missing was how long the profile took to run, i.e. how long the backup took. With the start date and time known, I used the last modified date on the report files to work out how long each run took.
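
The calculation itself is simple: given the start time of a run and the report file's last modified timestamp, the difference is the run time. A small Python example with made-up values:

```python
import os
from datetime import datetime

# Hypothetical report path and start time for one profile run.
report = r"D:\Dropbox\SyncBack\Documents.html"
started = datetime(2012, 3, 1, 22, 0, 0)   # when the profile kicked off

# The report is written when the profile finishes, so its mtime is the end time.
finished = datetime.fromtimestamp(os.path.getmtime(report))
print("Backup took", finished - started)
```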

Profile Data
A profile can be simulated first, producing a report of what it would do before the real run, to make sure files are not overwritten or deleted by accident. It's a good way to test before executing.
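
The equivalent of a simulated run in the mirror sketch above would be to list what would be copied, without touching the destination, for example:

```python
import os

SOURCE = r"C:\Users\Danny\Documents"   # hypothetical paths, as before
DEST   = r"E:\Backup\Documents"

# Dry run: report what a mirror would copy, without writing anything.
for dirpath, _, filenames in os.walk(SOURCE):
    for name in filenames:
        rel = os.path.relpath(os.path.join(dirpath, name), SOURCE)
        target = os.path.join(DEST, rel)
        if not os.path.exists(target):
            print("would copy:", rel)
```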

Encryption

The medium I used was an external hard drive encrypted with TrueCrypt. The problem is that unless the backup program supports TrueCrypt, the disk has to be kept mounted for backups to be made. The software I settled on was SyncBack SE. It is very versatile and allows a lot of filters and rules to be implemented; the Pro version was overkill. SBS has encryption built in, but it uses compression as its method of encrypting files. This is like putting all the backup files into a zip file and putting a password on it. While this reduces storage space, it also means that if the compressed file is corrupted, every file inside it is useless. I wanted to keep the backup as a mirror of the files I was backing up.
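
One practical consequence of keeping the volume mounted is that a run should check the encrypted drive is actually there before it starts. A minimal guard, assuming the TrueCrypt volume mounts as drive E:, might look like this:

```python
import os
import sys

DEST = r"E:\Backup"   # hypothetical mount point of the TrueCrypt volume

# If the volume is not mounted the drive letter (or folder) will not exist,
# so skip this run rather than mirroring against a missing destination.
if not os.path.isdir(DEST):
    sys.exit("Encrypted backup volume is not mounted - skipping this run.")

print("Volume mounted, safe to run the backup profile.")
```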

Backup

I used USB 2.0 drives for backups, and on average it takes approximately 1-2 minutes to scan just over 725GB for changes. After the changes have been detected, the time taken depends on the amount of data that needs to be copied or deleted and the speed of the drive.

Summary

It has cut down the time it takes to back up and also gives quick access to the data. The delay can be configured to give a grace period before changes are synchronised, balancing the frequency of change against the window in which a file can still be retrieved from the backup. It will also wear out the backup drives more quickly because they are left on and connected, but hopefully they will outlast my need before I replace them with higher capacity drives.

I have to give an honorable mention to Dave Bradford for suggesting the idea.
