Server Backup (Revision 1)

This section deals with backing up a FreeBSD webserver to offsite storage. The backups will not be all-inclusive, but will cover users’ home directories, configuration directories, log directories, and MySQL databases.

We will be applying a completely selective strategy, meaning only what we specifically ask to be backed up will be backed up.

I use some software and scripts to accomplish these backup goals. I did not write any of them, but will try to give credit where due.

Files and Directories

Normally rsync would provide a perfect solution for backing up files to a remote server; however, I wanted to get away from that if I could. Instead, this time I’m going to use tarsnap, software that encrypts backups and then stores them on Amazon’s S3 cloud service. This eliminates the need to keep any backup servers around causing trouble all the time.

tarsnap is in the FreeBSD ports collection and compiled and installed easily.

  1. Install from ports.
  2. Use the tarsnap-keygen utility to register the machine and generate its encryption key.
  3. Run tarsnap -c -f <backup-name> <directories to be backed up> (see the sketch below).
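
Roughly, those three steps look like this. The key location, e-mail address, machine name, and directory list are placeholders of my own, not anything tarsnap mandates:

# 1. install from ports (the port lives in sysutils/tarsnap)
cd /usr/ports/sysutils/tarsnap && make install clean

# 2. register this machine and generate its key (run once, as root;
#    the e-mail address is the one tied to your tarsnap account)
tarsnap-keygen --keyfile /root/tarsnap.key \
    --user you@example.com --machine webserver

# 3. create an archive of the directories to be backed up
tarsnap -c -f backup-2011-01-01 /usr/home /usr/local/etc /var/log
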
I created a simple script to run daily to back everything up. Eventually I would like to sidestep tarsnap and push backups directly to the Amazon S3 cloud, but at this time FreeBSD lacks good Amazon S3 support.
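
The daily script is little more than a date-stamped tarsnap invocation; a minimal sketch, assuming the same directory list as above:

#!/bin/sh
# run daily from cron; each day gets its own archive, and tarsnap
# only uploads blocks it has not already stored
DATE=`date +%Y-%m-%d`
/usr/local/bin/tarsnap -c -f "daily-$DATE" \
    /usr/home /usr/local/etc /var/log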

MySQL Database Backup

A critical part of these backups is the MySQL data, as it is very important for some websites. MySQL cannot be backed up by simply copying its data files, because tables may be written to mid-copy and the result would be inconsistent. Instead we’ll use a script that dumps the databases daily with mysqldump and places the dumps in a directory that gets backed up to our remote storage.

Now that I’m using tarsnap (http://www.tarsnap.com/) for all my important files, I need something to back up my important databases. To do this I’m going to use mysqldump to dump each important database to a file.

#!/usr/local/bin/bash
# sonia 16-nov-05
# backup each mysql db into a different file, rather than one big file
# as with --all-databases - will make restores easier
# (shebang adjusted: on FreeBSD, bash from ports lives in /usr/local/bin)

USER="backup"
PASSWORD="secret"
OUTPUTDIR="/usr/home/somewhere"
MYSQLDUMP="/usr/local/bin/mysqldump"
MYSQL="/usr/local/bin/mysql"

# clean up any old backups - save space
# (the glob must sit outside the quotes, or it will never expand)
rm -f "$OUTPUTDIR"/*.bak > /dev/null 2>&1

# get a list of databases, skipping the column header
databases=`$MYSQL --user="$USER" --password="$PASSWORD" \
 -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# dump each database in turn
for db in $databases; do
    echo "$db"
    $MYSQLDUMP --force --opt --user="$USER" --password="$PASSWORD" \
    --databases "$db" > "$OUTPUTDIR/$db.bak"
done
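
As the comment at the top says, per-database dumps make restores easier. Because the dumps are made with --databases, each file carries its own CREATE DATABASE and USE statements, so restoring one is a single command (the database name here is just an example):

# restore a single database from its dump; prompts for the password
mysql --user=root --password < /usr/home/somewhere/somedb.bak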

Note that I won’t be compressing the output of mysqldump. This is so that tarsnap only transfers each day’s actual differences: tarsnap deduplicates at the block level, and even a small change to a gzipped file alters the whole compressed stream, so tarsnap would retransfer the entire file on every backup.
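
One way to sanity-check that is tarsnap’s own statistics, which compare the total size of your archives against the unique data actually stored:

# total vs. unique (deduplicated) data across all archives
tarsnap --print-stats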

This script was taken from http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/ and works wonderfully!

I then added the mysqldump script to crontab to run daily. This does not need to be, nor should it be, run as the root user.

Example Crontab Entry

8 8 * * * /home/spike/.backups/sqldump.sh
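
For reference, the five fields are minute, hour, day of month, month, and weekday, so the entry above fires at 08:08 every day. If the tarsnap run also lives in cron, schedule it comfortably after the dump finishes; the second line here is just an illustration with a made-up path (it would normally sit in root’s crontab, since the key lives under /root):

# min hour mday month wday  command
8    8    *    *     *     /home/spike/.backups/sqldump.sh
30   8    *    *     *     /root/.backups/tarsnap-daily.sh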

That takes care of daily MySQL backups for the time being.

Summary

So there we have it, a very simple way to keep an extra copy of important files and MySQL databases from a web server (or any other sort of server).
