I just took a look at my backup progress bar on the Windows box:
Check out the elapsed time – 18 hours and counting. This is why I only do backups once a week – I just can’t have this damn thing running 24/7. This is what happens when you have way too much shit on your hard drives and you are backing up to an external USB device. Sigh…. I should get a FireWire card or something. But slow backups are better than no backups.
Oh, and one thing that is even more important than backups is auditing your backup files, because a corrupted backup is absolutely worthless. I know because I lost my Morrowind saves this way.
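Even a quick checksum pass catches a bad copy before you actually need it. Something along these lines would do the trick (file names here are just for illustration):

# record a checksum when the backup is made
sha256sum backup.tar.gz > backup.tar.gz.sha256
# verify the copy later, before you trust it
sha256sum -c backup.tar.gz.sha256
# for gzipped archives you can also test the compression stream itself
gzip -t backup.tar.gz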
I designate this a backup thread. What was the best and worst backup strategy you have ever seen?
[tags]backups, windows backup, backup strategy, usb, backup device[/tags]
I used to do backups by simply tarring up what I needed (with a script), then burning the file to a CD or DVD later on. It was a pain: I had to burn discs often (I never got a CD-RW to erase and rewrite properly), cataloging was a pain, and so was remembering to put a blank CD/DVD in the drive every Saturday night for the script I had scheduled. Basically it was a pain and I’m lazy. Then, one time I needed a backup, and the only disc that held the file I needed decided to have CRC issues. Since the files were tarred and gzipped, the CRC error was fatal to the whole archive, so I changed strategies.
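The script was nothing fancy; roughly this kind of thing (paths below are made up for the example):

#!/bin/sh
# tar and gzip the important directories into a dated archive,
# then leave it in a staging directory to burn to disc later
DATE=$(date +%Y-%m-%d)
mkdir -p "$HOME/staging"
tar -czf "$HOME/staging/backup-$DATE.tar.gz" "$HOME/docs" "$HOME/code"

The catch is that gzip compresses the whole tar stream, so one bad block on the disc can take out everything after it, which is exactly what bit me.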
The best way I’ve found to do backups is with rsync. Something as simple as “rsync -ruvh --delete ~/docs/* /media/server/docs/backups/Laptop\ backup” does the trick wonderfully. I used a USB/FireWire drive, and every so often I’d run a quick integrity check on it (e2fsck or equivalent). I found that FireWire consistently had better throughput than USB drives, even though USB is nominally faster. I’d love to see an external SATA drive on FireWire 800, but alas, I’m not made of money.
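For anyone who hasn’t memorized the rsync man page, the flags break down like this, and a --dry-run first is a cheap sanity check before letting --delete loose (the destination path is just my example):

# -r recurse into directories, -u skip files that are newer on the destination
# -v verbose output, -h human-readable sizes
# --delete remove files from the backup that no longer exist in the source
rsync -ruvh --delete ~/docs/ "/media/server/docs/backups/Laptop backup/"
# same thing, but only report what would change
rsync -ruvh --delete --dry-run ~/docs/ "/media/server/docs/backups/Laptop backup/"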
I did have problems with hard drives in external enclosures (even nice drives and enclosures) dying at a really fast rate. I must have RMA’d 2 drives and 3 or 4 enclosures in a year. So I switched strategies again.
For the last few years, I’ve just backed up to a file server (which runs its own internal backups to a second hard disk), which saves me from even occasionally running an integrity check, since the server is scheduled to do its own.
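In practice it boils down to one nightly cron entry, whether it goes over SSH or to a mounted share; something like this (hostname and paths are invented for the example):

# crontab entry: every night at 2:30, sync ~/docs to the file server
30 2 * * * rsync -avh --delete -e ssh /home/me/docs/ backupserver:/srv/backups/laptop/docs/ >> /home/me/backup.log 2>&1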
The server is local, so if there is a fire I’m hosed. I was going to do a “backup swap” with my brother (who lives in another state) where I back up to his file server and he to mine, but with both of us behind as many NAT routers as we are (stupid ISPs), it wasn’t going to work out. And I’m too cheap to rent a host, so I’ll take the fire risk.
Yeah, I do have that backup server project on the back burner. I wanted to go all out with two large drives set up as RAID 1 and so on… Never got around to buying the right parts and setting it up.
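The RAID 1 part is really only a couple of mdadm commands once the hardware exists (device names below are assumptions; check yours before running anything destructive):

# build a two-disk mirror out of /dev/sdb1 and /dev/sdc1
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1
# put a filesystem on the array and mount it as the backup target
mkfs.ext3 /dev/md0
mount /dev/md0 /srv/backups
# watch the mirror sync and keep an eye on its health afterwards
cat /proc/mdstat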
And yeah – all my backups are local too. If there is a fire or a flood, my data is gone. But then again, I figure that if my house burns down, the loss of my pr0n, warez and odd pieces of code and writings that are not checked into some external SVN repository is really the least of my worries. :mrgreen:
damn Luke ya always save the Pr0n first, I know you’ve seen that pic that’s floating around the Internet. lmao
On the more serious side: that is unreal. I think ya need a better solution than that.
[quote comment=”6223″]damn Luke ya always save the Pr0n first[/quote]
But yeah, I should really set up something better than this. :)