Maintaining data backups is crucial for any web site, big or small. Recently, I decided it was time to put a method in place to manage backups for MySQL Exchange. As the site grew and gained quality content, the risk of having to start over from scratch grew with it. I don't personally have a lot of time, especially when it comes to repetitive tasks, so I needed a way to automate the backup process. In this tutorial we'll look at the automated process I use and why: a Python script created by ehansen over at Security For Us, which we have modified to fit our unique needs, with off-site storage provided by SoftLayer Object Storage and the SoftLayer Object Storage API. You can check the references section at the end of this tutorial for links to the sites and API discussed.
As I stated in the introduction, we'll be sending our backups to SoftLayer Object Storage. Their pricing is pretty decent, to the tune of 12 cents per GB of storage, and their download rates are extremely low. Since this is used mainly for backups, I don't have to worry about paying download rates unless something crashes. I won't go into too much detail on installation and usage, but in the references section of this article you'll find links to install the SoftLayer Object Storage Python library and the SoftLayer Object Storage Backup Script. The backup script is what will actually send our backups to SoftLayer's servers, but we'll drive it through a BASH script and cron.
Backup The Database
The script used to back up the database is pretty straightforward. It was originally created by ehansen with Security For Us, but I have made some alterations to it. The script is below:
```shell
#!/bin/bash

# The path the backups will be dumped to
DUMP_DIR="/home/backups/mysql/"

# SQL user who can access the db
SQL_USER="SQL_USERNAME"

# SQL password for above user
SQL_PASS="SQL_PASSWORD"

# SQL host
SQL_HOST="localhost"

# SQL database
SQL_DB="DATABASE_NAME"

# Backup folder name (mmddyyyy)
BACKUP_DIR="`date +%m%d%Y`"

# SQL dump file name
DUMP_FILE="`date +%m_%d_%Y_%H_%M_%S`_mysqlexchange.com"

# SL container name
CONTAINER="mysql_backups"

# Create the backup dir if it doesn't exist
if [ ! -d $DUMP_DIR$BACKUP_DIR ]; then
    mkdir -p $DUMP_DIR$BACKUP_DIR
fi

# Dump the database to /tmp/...
mysqldump -u $SQL_USER -p$SQL_PASS --host=$SQL_HOST $SQL_DB > /tmp/$DUMP_FILE.sql

# Compress the dump into the dated backup folder
tar -zcvpf $DUMP_DIR$BACKUP_DIR/$DUMP_FILE.sql.gz /tmp/$DUMP_FILE.sql

# No reason to keep the dump file
rm -f /tmp/$DUMP_FILE.sql

# Make sure the archive exists
if [ -f $DUMP_DIR$BACKUP_DIR/$DUMP_FILE.sql.gz ]; then
    /root/slbackup.py -s $DUMP_DIR$BACKUP_DIR/ -o "$CONTAINER"

    # Remove the backup stored locally
    rm -rf $DUMP_DIR$BACKUP_DIR

    # Success
    exit 0
else
    echo "$DUMP_DIR$BACKUP_DIR/$DUMP_FILE.sql.gz does not exist."
    exit 1
fi
```

Let's start at the beginning of this script. Right away we define some local configuration for our backups: where we want to store them, our MySQL information, and the naming conventions. Each backup is stored in a folder named for the current date, and each file carries the current date and time along with the site it belongs to. You can alter these names as you see fit.
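If you're curious what those `date` format strings actually produce, here's a quick sketch (the site name below is a placeholder, not my real domain):

```shell
# Folder name in mmddyyyy form
BACKUP_DIR="$(date +%m%d%Y)"

# Dump file name: date and time, plus the site the backup belongs to
DUMP_FILE="$(date +%m_%d_%Y_%H_%M_%S)_example.com"

echo "$BACKUP_DIR"
echo "$DUMP_FILE"
```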
Next we actually dump the database to the /tmp/ folder using our naming convention. We do this because we are then going to tar the file into our backup directory for later transfer to SoftLayer Object Storage.
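As a minimal sketch of the dump-then-archive steps, here's the same flow with a dummy file standing in for a real mysqldump (all paths here are illustrative temp locations). Note the `-C /tmp` flag, which keeps tar from storing the leading /tmp path inside the archive:

```shell
# Stage a fake dump in /tmp, as mysqldump would
DUMP_DIR="$(mktemp -d)/"
BACKUP_DIR="$(date +%m%d%Y)"
DUMP_FILE="$(date +%m_%d_%Y_%H_%M_%S)_example.com"

mkdir -p "$DUMP_DIR$BACKUP_DIR"
echo "-- fake dump --" > "/tmp/$DUMP_FILE.sql"

# Compress the dump into the dated backup folder, then drop the original
tar -zcpf "$DUMP_DIR$BACKUP_DIR/$DUMP_FILE.sql.gz" -C /tmp "$DUMP_FILE.sql"
rm -f "/tmp/$DUMP_FILE.sql"

ls "$DUMP_DIR$BACKUP_DIR"
```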
Finally, the magic happens. We check that the tarred archive exists; if so, we send it to SoftLayer Object Storage. Since you set up your configuration files when you installed the SoftLayer Backup Script, all you have to do is call the Python script directly. The first part of the command is the path to the SoftLayer Backup Script, the -s option gives the local location of the file, and the -o option picks the container to store it in. SoftLayer stores objects in "containers"; my container is named mysql_backups. Each backup is stored there, and when a new backup comes in, the older ones are archived. By default, SoftLayer Object Storage deletes archived backups after 30 days. You can change this in your configuration settings.
At the end of it all we clean up our local backups and exit successfully. You may be wondering why we clean up the backups: if anyone on the server had access to them, they could untar them and read all the information that was dumped from your database.
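While the backups do sit on disk briefly between the dump and the upload, it also doesn't hurt to lock the staging directory down to its owner as an extra layer. A quick illustrative sketch (using a temp directory here rather than my real backup path):

```shell
# Create a staging directory and restrict it so only its owner can
# list, read, or write the dumps inside
BACKUP_ROOT="$(mktemp -d)"
chmod 700 "$BACKUP_ROOT"

# Show the resulting permission bits
stat -c %a "$BACKUP_ROOT"
```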
Finally, we are going to make an entry in crontab. This entry runs the backup script every Sunday at around 3 a.m. I picked this time because it is the slowest time on the server, which means I hopefully won't miss much data between backups:
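The entry looks something like the following (the script path is a placeholder; point it at wherever you saved the BASH script above). Reading cron's five fields left to right: minute 0, hour 3, any day of the month, any month, day-of-week 0 (Sunday):

```shell
# m  h  dom  mon  dow  command
0 3 * * 0 /root/scripts/mysql_backup.sh
```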
Some final notes before signing off. It is worth noting that SoftLayer has three storage locations, so if you wish, you could easily back up your MySQL database to more than one location for redundancy. SoftLayer Object Storage also replicates across multiple servers so that your backups are always safe. As a result, you need to allow ample time, depending on backup size and frequency, for the backups to propagate across all servers.
If you have any questions, feel free to ask in the Forums. We'll be more than happy to help!