Simple mysqldump script to be run prior to tarsnap

Now that I’m using tarsnap (http://www.tarsnap.com/) to back up all my important files, I need something to back up my important databases as well. To do this I’m going to use mysqldump to dump each important database to a file.

#!/bin/bash
# sonia 16-nov-05
# backup each mysql db into a different file, rather than one big file
# as with --all-databases - will make restores easier

USER="backup"
PASSWORD="secret"
OUTPUTDIR="/usr/home/somewhere"
MYSQLDUMP="/usr/local/bin/mysqldump"
MYSQL="/usr/local/bin/mysql"

# clean up any old backups - save space
# (the glob must be outside the quotes or it won't expand)
rm -f "$OUTPUTDIR"/*.bak > /dev/null 2>&1

# get a list of databases (--skip-column-names drops the header row,
# so no tr/grep post-processing is needed)
databases=$("$MYSQL" --user="$USER" --password="$PASSWORD" \
 --batch --skip-column-names -e "SHOW DATABASES;")

# dump each database in turn
for db in $databases; do
    echo "$db"
    "$MYSQLDUMP" --force --opt --user="$USER" --password="$PASSWORD" \
    --databases "$db" > "$OUTPUTDIR/$db.bak"
done
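To chain the two together nightly, something like the following crontab fragment would work. The script location, times, and archive-name scheme are my assumptions, not from the original post:

```shell
# crontab sketch -- paths, times, and archive names are assumptions
# 03:00: dump the databases with the script above
0 3 * * * /usr/local/sbin/mysql-backup.sh
# 03:30: archive the dump directory with tarsnap (note that % must be
# escaped as \% inside a crontab command field)
30 3 * * * /usr/local/bin/tarsnap -c -f "mysql-$(date +\%Y\%m\%d)" /usr/home/somewhere
```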

Note that I won’t be compressing the output of mysqldump. This is so that tarsnap only transfers the actual differential each day; if I were to gzip the output, even a small change to a database would alter most of the compressed file, and tarsnap would retransfer nearly the entire file on each backup.
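The deduplication argument is easy to demonstrate. A minimal sketch (the file names here are throwaway examples, not part of the backup script): appending a row to an uncompressed dump leaves every existing byte in place, while gzipping the same two dumps produces streams that diverge almost immediately.

```shell
# yesterday's dump, and today's dump with one row appended
printf 'row1\nrow2\n'        > a.sql
printf 'row1\nrow2\nrow3\n'  > b.sql

# uncompressed: the old dump is a byte-for-byte prefix of the new one,
# so a block-deduplicating uploader only has to send the new tail
head -c $(wc -c < a.sql) b.sql | cmp -s - a.sql && echo "prefix intact"

# compressed (-n omits the timestamp, for a fair comparison): the two
# streams differ, so the whole file looks "new" to the deduplicator
gzip -n < a.sql > a.sql.gz
gzip -n < b.sql > b.sql.gz
cmp -s a.sql.gz b.sql.gz || echo "compressed files differ"
```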

This script was taken from http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/ and works wonderfully!

Tarsnap

For backing up some already-encrypted files I’m going to give Tarsnap a try. I’m trying it with $5 of credit to keep a copy of a couple GB of web files and SQL dumps. So far the install was seamless on FreeBSD (it’s in ports and compiled quickly and easily), and running the software was just as easy: one single, simple command. The pricing seems fair for smaller backups, and they use Amazon’s S3 storage cloud to store the data, so it’s at least somewhat reliable.

Tarsnap does incremental, compressed, encrypted backups. I’m using it together with mysqldump, which dumps the important MySQL databases just before the backup runs so that the dumps get included in it.
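For reference, the single command in question looks something like this. The archive name and path are my own examples; the flags are the standard tar-style ones tarsnap uses:

```shell
# create a new encrypted, deduplicated archive of the dump directory
tarsnap -c -f web-and-sql-backup /usr/home/somewhere

# list the archives stored on the service, and extract one when needed
tarsnap --list-archives
tarsnap -x -f web-and-sql-backup
```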

http://www.tarsnap.com (The software itself is free as well).

<Update> After a couple hours of usage, I’ve noticed that even though I’ve tried it from multiple servers on 100 Mbit or GigE, I never see more than 20-30 Mbit of throughput. This isn’t necessarily a problem since it’s a backup utility, but I’d like to know what the bottleneck is.

<Update 5.3.12> I’ve used tarsnap a couple of times now and it seems to be working well, costing a couple of dollars for the initial backup and a few cents a day in storage fees. I’ve decided to stick with it for a while until I have larger backups, at which point I’ll figure out how to get something working that can back up to Amazon S3 storage (as of yet I’m having issues getting duplicity to work; I keep getting an error).