Server Backup (Revision 2)

Since trying tarsnap I’ve fallen in love with not having to maintain my own backup servers; however, tarsnap is quite expensive for anything above a few GB. I want to slowly convert all of my backup strategies to cloud storage, so I found it necessary to work out the kinks with duplicity and get it running on my FreeBSD machines, backing everything up to Amazon S3.

Here’s an example of the commands I run to get the backup going.

# note the quotes around the secret key: it contains a ';' the shell would otherwise interpret
export AWS_ACCESS_KEY_ID=SDLFJKSLDKJVLV7B
export AWS_SECRET_ACCESS_KEY='dOfih2o38h4r89hsd98hf;asdhfiuh2p98hd'
export PASSPHRASE=SDFOIJD3oihjds9f8hap9hAISU
duplicity /etc s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc
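
Restoring works the same way in the other direction; duplicity just takes the source and destination in reverse order. A quick sketch, assuming the same three environment variables are still exported and /tmp/etc-restored is an arbitrary destination of my choosing:

# see which backup sets exist in the bucket
duplicity collection-status s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc
# pull /etc back down into a local directory
duplicity restore s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc /tmp/etc-restored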

I’ve switched to duplicity as it’s free and backs up to Amazon S3 buckets with ease. Amazon even offers a free tier to get started with (though I think it’s limited to something like 5GB, which should still be plenty for individual websites and databases). I’m still running a mysqldump script to dump my databases prior to running duplicity.

Also, be sure to unset the environment variables holding your passphrase, secret key, and such when you’re done. I do so with the following commands:

unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset PASSPHRASE

So then altogether I put it into one little script:

#!/bin/sh
# credentials for S3 and for GPG encryption of the archive
export AWS_ACCESS_KEY_ID=SDLFJKSLDKJVLV7B
export AWS_SECRET_ACCESS_KEY='dOfih2o38h4r89hsd98hf;asdhfiuh2p98hd'
export PASSPHRASE=SDFOIJD3oihjds9f8hap9hAISU
# run the backup and mail the report
duplicity /etc s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc | mail -s "duplicity backup report" [email protected]
# clear the credentials again
unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY PASSPHRASE
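
One caveat: this script has the AWS keys and the encryption passphrase sitting in it in plain text, so make sure nobody else can read or run it. The path here is just an example (it matches the crontab location further down):

# keep the backup script readable and runnable only by its owner
chmod 700 /home/spike/.backups/duplicity-backup.sh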

 

MySQL

We’re still backing up MySQL the same way as before, using mysqldump to dump each database to the local filesystem, in a location that’s included in the duplicity backup to Amazon S3.

 

#!/bin/sh
# sonia 16-nov-05
# backup each mysql db into a different file, rather than one big file
# as with --all-databases - will make restores easier
# (plain POSIX sh, so it runs on FreeBSD without needing bash)

USER="backup"
PASSWORD="secret"
OUTPUTDIR="/usr/home/somewhere"
MYSQLDUMP="/usr/local/bin/mysqldump"
MYSQL="/usr/local/bin/mysql"

# clean up any old backups - save space
# (the glob has to sit outside the quotes or it never expands)
rm -f "$OUTPUTDIR"/*.bak > /dev/null 2>&1

# get a list of databases, skipping the "Database" header line
databases=$($MYSQL --user="$USER" --password="$PASSWORD" \
 -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)

# dump each database in turn
for db in $databases; do
    echo "$db"
    $MYSQLDUMP --force --opt --user="$USER" --password="$PASSWORD" \
    --databases "$db" > "$OUTPUTDIR/$db.bak"
done

Note that I won’t be compressing the output of mysqldump. This is so that duplicity only has to transfer the actual differences each day; if I were to gzip the output of mysqldump, duplicity would retransfer each entire compressed file on every backup instead of just a small incremental update.
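
On the duplicity side, each incremental is chained off the last full backup, so every so often it’s worth starting a fresh full and pruning the old chains. Something like the following should do it; the one-month and three-month windows are just my guesses at reasonable values:

# start a new full backup once the last one is over a month old
duplicity --full-if-older-than 1M /etc s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc
# delete backup chains older than three months
duplicity remove-older-than 3M --force s3+http://www27-Backup.s3-website-us-east-1.amazonaws.com/etc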

The dump script above was taken from http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/ and works wonderfully!

I then added the mysqldump script to crontab to run daily. This doesn’t need to be, and shouldn’t be, run as the root user.

Example Crontab Entry

8 8 * * * /home/spike/.backups/sqldump.sh
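
Since the dumps have to exist before duplicity uploads them, the duplicity script can simply be scheduled a bit later in the same crontab. The half-hour gap and the duplicity script path here are just my own arbitrary choices:

# dump the databases first, then upload everything half an hour later
8 8 * * * /home/spike/.backups/sqldump.sh
38 8 * * * /home/spike/.backups/duplicity-backup.sh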

 

That takes care of daily MySQL backups for the time being.

 

 
