Just Back Up

How many days, months, and years have you put into your site? Do yourself a favor and take a few hours to back up everything, right now. You’ll feel a lot better afterward. Anyone have any favorite backup tips they’d like to share? Right now I have my servers backing up to a remote fileserver over FTP nightly, and my home stuff is mostly flying without a net.

42 replies on “Just Back Up”

Personally, I’m working towards keeping everything in Subversion. I’ve got the basic documents and stuff in there already and run a daily ‘svn commit’, but I need to buy more storage before I put everything in there.

My sites are backed up via FTP every night, to another server at another host. My PowerBook is backed up about once every two days (I aim for every day, but it ends up being about four times a week) to a FireWire drive. I use a program called DataBackup from ProSoft; it makes it pretty simple, but it does about the same job as rsync without all the fun of a CLI.

You’re right, it really does make you feel better. I can sleep well at night knowing that if there’s any trouble, I’ll be back up and at it without much trouble at all.

Until recently, I used to get gzipped tar archives of all my sites from cPanel. I did this every morning. A week ago I automated part of the process by fetching the more crucial (and frequently changing) pages. Here is what I did:

I set up a set of batch scripts (say dummy1 and dummy2) like the following:

cd /home/roy/Main/Transfer_Archives/Sites/Roy/
wget -r -l1 -t1 -N -np -erobots=off

I then set up cron jobs as in the following:

35 23 * * * /home/roy/Main/Transfer_Archives/Sites/dummy1
38 23 * * * /home/roy/Main/Transfer_Archives/Sites/dummy2
50 23 * * * tar czvf /home/roy/Main/Transfer_Archives/www-`date +\%Y-\%m-\%d`.tar.gz /home/roy/Main/Transfer_Archives/Sites
58 23 * * * cp -rf /home/roy/Desktop/cmake /home/server2/transfer/roy

The last line puts a copy on the SAN, just to be 100% covered.

So I can now sleep while my sites are being backed up…

Ryan, unless your repository is backed up in some other way you might (read: will) lose the data in a hard drive crash. Version control cannot replace backup.

7 years.

I made a backup on a DVD a week or so ago. Yes, it did feel much better.

It took up 75% of the DVD, including all photos and stuff.

You just reminded me I haven’t burned a single DVD since I bought the dang thing. DVD backup it is, at least for my photos.

> Ryan, unless your repository is backed up in some other way you might (read: will) lose the data in a hard drive crash.

However, keeping data in an SVN repository is akin to an off-site backup, as the SVN repository is presumably on a different machine and in a different location from the WP server. I do exactly the same thing, but only keep the non-default WP data, i.e., my themes, custom plugins, data, etc.

I have the luxury of having most of my websites local (rack mount server), and the data is backed up to a tape drive nightly. A couple of sites run over in the US (I’m located in Australia), so I grab the whole thing twice a week and dump it locally (which then gets dumped onto tape as well).

My machine is backed up nightly with rdiff-backup to another server – and I keep the last 7 days of files to be able to “timewarp” on problems. The databases are backed up with mysqldump and the resulting dump is thrown into RCS, so I can go back to any day in the past I want to go to.


Blogger keeps your blogs on their own server, and keeps their own backups. As far as I know there isn’t really a way to back up your Blogger blog.

You could, however, FTP to the server where Blogger uploads your blog and just copy and back up the files from there.


I used to use Carbon Copy Cloner to back up my Macintosh daily to one of three partitions on an external drive, and now I use SuperDuper!, which is much better.

The key to backup is not just doing it, but producing a backup that you can use if anything goes wrong. Having an external drive that’s a mirror of my internal is what I like so I can start up from it on another computer if this one dies completely.

SuperDuper!’s incremental backup system is quite fast and easy (ease is key or you’ll never do it) and once you do an initial backup of an entire drive it doesn’t take long to save a day’s changes, even if you pull down multiple web sites via ftp and back them up too (which I do).

I would love, just love, a plugin for WordPress that backed up the entire system locally: database, WordPress files, pictures, the whole shebang. Push a button, get it local so you can burn it onto a CD. Then, if anything happens to the server, you can reinstall WordPress, install this special plugin, and restore your weblog just as it was last backed up.

Nothing nerdy, just a simple way to do it so that people do it on a daily basis. It’s one thing to go on about its importance but quite another to build into the system an easy way to do it so that people actually do.

I second the DataBackup suggestion for Mac. The cool thing about it is it keeps versions of your files, so you can go back X number of versions to get what you need.

For PC, Backup for One is about the same thing. Used with an external hard drive, backing up is as simple as turning on the drive. My dad keeps two hard drives and rotates them every week, and it works seamlessly. They also have a product for networks.

After a MySQL database corruption caused (I think) by a flaky hardware reboot, I have been using a daily cron script I gleaned from a discussion on the WP support forums. Basically, the script runs mysqldump on my WP database and puts the contents in a single restorable file. This file is gzipped and given a name corresponding to the day.

It is possible to retrieve posts and comments from a corrupt and unrepairable database file, as I learned, but it’s no fun. I spent three hours cutting and pasting with a text editor, restoring and re-publishing about eighty posts I didn’t want to abandon.
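A daily dump-by-day script along the lines described above might look like the following sketch. The database name, paths, and the lack of credential flags are placeholders for illustration, not taken from the comment:

```shell
#!/bin/sh
# Hypothetical sketch of a daily "dump-by-day" MySQL backup.
# Each weekday's file overwrites the one from a week earlier, so a
# rolling seven days of restorable, gzipped dumps are kept.
backup_wp_db() {
    db_name="$1"                # assumed database name
    backup_dir="$2"             # assumed backup directory
    day=$(date +%A)             # e.g. "Monday"
    mkdir -p "$backup_dir"
    # Real use would add -u/-p credentials (omitted here).
    mysqldump "$db_name" | gzip > "$backup_dir/$db_name-$day.sql.gz"
}

# A nightly cron job would call something like:
# backup_wp_db wordpress /home/backups
```

Restoring a day is then just a matter of `gunzip -c` piped into the `mysql` client.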

If you’re backing everything up over FTP, consider doing it over rsync instead. Only files that have changed end up getting transferred, leading to a *MUCH* faster transmission time, and less chance of something getting corrupted over multiple transfers. It also allows you to do an on-demand backup after an important change, without having to copy the entire data set all over again.

Also, I do a “/usr/local/bin/mysqldump -Ae” every day, so that the data dump file is part of what gets rsync’ed. That way, in the event of a catastrophic failure, I’ll have my databases too. Most of the really important stuff I have is in my databases anyway.

I run a cron job on my web host which backs up MySQL once an hour. I run another cron job on my desktop which backs up those MySQL dumps to my desktop PC. I have a fairly standard install of WordPress, and could restore from a standard download without much pain. I use the Perl Template Toolkit to generate my website, and the source for that resides in my home directory, so I can regenerate my non-WordPress website at any time. I probably should modify my process to get the stuff in the weblog directories mirrored on my desktop.

We have the luxury of a third PC on our household LAN, and I run rsync jobs with cron on both my husband’s desktop and mine to back up the home directory files to the third PC. I also occasionally burn CDs of my most important files. I ought to take up-to-date copies offsite, though.

I am comfortable with reinstalling my operating systems from scratch (I run various Linux flavors of the week, currently Fedora 3 and Ubuntu) on all three systems, so I only really care about my data.

I use cron on my webserver; it calls a script that makes a backup of my web folder and my MySQL db once a night. A second cron job makes a backup every month, so I have both daily backups and monthly backups. The scripts are just a mysqldump for MySQL and a tar for my web folder. I back it up to another hard drive on the same computer; I know, it would be better to use another, external computer.
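The tar half of such a script might be sketched like this; the paths are made up for illustration, and the same function could serve both the daily and the monthly cron job by pointing it at different target directories:

```shell
#!/bin/sh
# Hypothetical sketch of a date-stamped tar backup of a web folder.
# Paths are placeholders. A daily cron job would call this with a
# "daily" backup directory; a monthly job could reuse it with a
# "monthly" directory.
archive_web_folder() {
    web_dir="$1"     # folder to archive, e.g. /var/www
    backup_dir="$2"  # where archives accumulate
    stamp=$(date +%Y-%m-%d)
    mkdir -p "$backup_dir"
    # -C keeps archive paths relative, so restores don't need
    # the original absolute layout.
    tar czf "$backup_dir/www-$stamp.tar.gz" \
        -C "$(dirname "$web_dir")" "$(basename "$web_dir")"
}

# e.g. archive_web_folder /var/www /home/backups/daily
```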

On my workstation I have an external FireWire drive that I use only for backups. I turn it on and manually back up my workstation with Ghost 9. It is possible to make a disk mirror from Windows without even turning it off.

I back mine up whenever I make changes to the code, and the MySQL is backed up nightly.

Plus I have backups of backups, thanks to the history subdomain, which keeps older versions of my site dating back to 2000.

I have a set of Python scripts that I use to back up around 25 sites at work. The scripts back up the databases and all of the files, and create tars of image folders. Then they FTP everything to a central location where I can download it all to an external hard drive and burn it to DVD.

Last night the cat bit through the charger cord on my laptop, low voltage side. Cat is fine, motherboard is fried. Not quite what I thought I was making backups for.

I keep all of my important documents, websites, etc. on my Net Integrator, which then runs hourly incremental disk-to-disk backups. I swap the backup HDD out every couple of weeks, with a six-disk rotation. Four disks are kept offsite at a secure media storage facility that only costs me about 100 CDN a month to pick up / drop off / store these things for me.

Belt. Suspenders. Duct tape. My server pants never fall down.

Thanks for reminding me. I’ve been meaning to set up a backup system, and now I have one:

My server runs a cron script which dumps my MySQL data (that’s the important stuff; the rest hasn’t been updated for a while and I have old copies on my machine) nightly. My computer then automatically downloads them on a weekly basis. I determined that this would be fine because my MySQL server is separate from my web server, according to my webhost, DreamHost (which has an excellent backup schedule of its own; check the knowledge base for details).

My Windows machines run SyncBack, which zips and uploads my backups to my FTP server weekly (overwriting the previous one).

Use rsync…

This will only back up changes that are made, so you don’t have to back up everything every day.

Hi Matt!

As a physical measure, buy two external USB or FireWire drives big enough to back up everything on your machine. 250GB drives are about $250 each.

Use rsync or whatever to copy everything from your machine onto both drives.

For about $60/year at a bank, you can get a safe deposit box big enough for a few of those drives. Stick one drive in a safe deposit box at a bank (i.e., it’s likely safe from fires, floods, etc.).

Every day, run a script to do an incremental backup to the drive at home. Every week or two, switch the drive at home with the one in the safe deposit box.

When you periodically make CDs / DVDs to save versions of things, you can also burn an extra copy and throw it in your safe deposit box too!

Has anyone had any trouble with LaCie drives, specifically the external one designed by Porsche? Mine crapped out and failed to repair. Luckily I lost nothing, but I need to get the drive repaired. Any tips on that?

I have an 80 gig portable LaCie porsche drive and so far it seems fine. It does have a bit of a whine sometimes when starting up that makes me nervous.

I have an OWC 80 gig drive that’s ugly but much better: OWC Mercury On-The-Go FireWire Portable. It’s 5400 rpm as opposed to the LaCie’s 4200, and while a bit bigger and uglier, it’s worked flawlessly with daily backups for a year. I’ve only had the LaCie for 4 months, so I’m not sure what’s up with the sound. Your comment about it crapping out is not music to my ears. Is it under warranty?


I think version control is backup when it’s on another machine (which is what I’ve got now). I’m working on getting the repository at another location, too (colo).
