DigitalOcean Droplets are a great value for hobbyist developers: plans start at just $5 per month, and for an extra dollar per month DigitalOcean will even take routine backups of your Droplet. But what if you need to restore one project to a week-old snapshot without restoring ALL of your projects to a week-old snapshot?
That's where a set of scheduled backup scripts can really come in handy. The scripts below dump each database to its own uniquely named file, which makes archival simple and makes it easy to pull a single database into another development environment. Each script also emails its log to a configurable address.
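For example, restoring a single project's database from one of these dumps is a two-step job. This is a self-contained sketch: the database name "myblog" and the paths are illustrative, and on the server the dumps live in /var/backup/db.

```shell
# A sketch of restoring a single project's database from its dump,
# without touching any other project. "myblog" is an example name.
WORKDIR=$(mktemp -d)
# Stand-in dump so this sketch is self-contained; on the server this
# file already exists under /var/backup/db:
printf 'CREATE DATABASE IF NOT EXISTS myblog;\n' | gzip > "$WORKDIR/myblog_daily.sql.gz"
# Decompress to a plain .sql file so it can be inspected first:
gunzip -c "$WORKDIR/myblog_daily.sql.gz" > "$WORKDIR/myblog_restore.sql"
# Then load just that one database (needs a running MySQL server):
# mysql --user=root --password < "$WORKDIR/myblog_restore.sql"
```

Because each database lives in its own file, the other projects' data is never touched by the restore.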
The contents of my root user's crontab:
# database backup daily at 12:00am
0 0 * * * /root/Scripts/mysql_daily_backup.sh
# database backup monthly on 1st day of month at 12:10am
10 0 1 * * /root/Scripts/mysql_monthly_backup.sh
# webroot backup monthly on 1st day of month at 12:20am
20 0 1 * * /root/Scripts/www_monthly_backup.sh
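One gotcha worth mentioning: the crontab entries above invoke the scripts directly, so each script needs its execute bit set or the cron runs will fail with a permission error. A self-contained sketch using a temp directory; on the server the path is /root/Scripts.

```shell
# Cron invokes these scripts directly, so they must be executable.
# Temp directory stands in for /root/Scripts in this sketch.
SCRIPTS=$(mktemp -d)
printf '#!/bin/bash\necho "backup ran"\n' > "$SCRIPTS/mysql_daily_backup.sh"
chmod 700 "$SCRIPTS/mysql_daily_backup.sh"   # rwx for the owner only
"$SCRIPTS/mysql_daily_backup.sh"             # prints: backup ran
```

Mode 700 is a sensible choice here since the scripts contain the MySQL root password.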
Daily MySQL Backup Script
The root user's crontab automates backing up each MySQL database on the system to an individual .sql.gz file, using the following script located at /root/Scripts/mysql_daily_backup.sh:
#!/bin/bash
# Run this script on a daily cron to dump all databases
# Original script courtesy of Sonia Hamilton
# http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/
# Modified by EEJ to support:
# - Email logging
# - Rackspace Cloud Backup incremental backup
# Updated: 8/6/2015

# Set this to MySQL root user
USER="root"
# Set this to MySQL root password
PASSWORD="xxxxxxxx"
# mkdir these folders if they don't yet exist
OUTPUTDIR="/var/backup/db"
# Formal Backup title
BACKUPTITLE="JenkProd Daily Database Backup"
# Email address for log file
MAILTO="ericjenkins@ericjenkins.net"
# From address
MAILFROM="JenkProd <noreply@ericjenkins.net>"
# Output file format (prefixed by dbname, suffixed by .sql.gz)
FILEDESC="_daily"
# Log file path
LOGDIR="/root/log"
# Log file name format (suffixed by date.log)
LOGFILEPREFIX="mysql_daily_backup_"

###############################################################################

# Linux command paths
MYSQLDUMP="$(which mysqldump) --single-transaction --routines --triggers --events"
MYSQL="$(which mysql)"
MAIL="$(which mail)"

# Get date in yyyy.mm.dd format
DATE="$(date +%Y.%m.%d)"
LOGDATE="$(date +%m/%d/%Y)"

# Set a couple variables
LOGFILE=$LOGFILEPREFIX""$DATE".log"

# Record start time/date
TIME_START=`date +%s`
DATE_START=`date +%c`
echo -e "*** $BACKUPTITLE started at: $DATE_START ***" >> $LOGDIR/$LOGFILE
echo -e "\nOutput directory: $OUTPUTDIR\n" >> $LOGDIR/$LOGFILE

# Get a list of databases
DATABASES=`$MYSQL --user=$USER --password=$PASSWORD \
    -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# Dump each database in turn
for DB in $DATABASES; do
    OUTPUTFILE=$DB""$FILEDESC".sql.gz"
    OUTPUTFILEBAK=$DB""$FILEDESC".bak.sql.gz"
    echo -en "Creating $OUTPUTFILE ... " >> $LOGDIR/$LOGFILE
    mv $OUTPUTDIR/$OUTPUTFILE $OUTPUTDIR/$OUTPUTFILEBAK >> $LOGDIR/$LOGFILE
    $MYSQLDUMP --log-error=$LOGDIR/$LOGFILE --user=$USER \
        --password=$PASSWORD --databases $DB | gzip --rsyncable \
        > $OUTPUTDIR/$OUTPUTFILE
    echo -e "Done." >> $LOGDIR/$LOGFILE
done

# Delete files older than 2 days
echo -e "\nDeleting files older than 2 days ..." >> $LOGDIR/$LOGFILE
find $OUTPUTDIR/*$FILEDESC*.sql.gz -type f -mtime +2 \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
find $LOGDIR/$LOGFILEPREFIX* -type f -mtime +2 \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
echo -e "... Done." >> $LOGDIR/$LOGFILE

# Set File Permissions
chgrp gitusers $OUTPUTDIR/*
chgrp gitusers $LOGDIR/*

# List the contents of the output directory
echo -e "\nContents of $OUTPUTDIR:" >> $LOGDIR/$LOGFILE
ls -lah $OUTPUTDIR >> $LOGDIR/$LOGFILE

# Log system disk usage
echo -e "\nSystem Disk Usage:" >> $LOGDIR/$LOGFILE
df -h >> $LOGDIR/$LOGFILE

# Record script end time and time elapsed
TIME_END=`date +%s`
DATE_END=`date +%c`
TIME_ELAPSED=$((TIME_END - TIME_START))
echo -e "\n*** $BACKUPTITLE completed at: $DATE_END ***" >> $LOGDIR/$LOGFILE
echo -e "Elapsed time:\t $TIME_ELAPSED seconds\n" >> $LOGDIR/$LOGFILE

# Send email of log file
$MAIL -r "$MAILFROM" -s "$BACKUPTITLE completed on $LOGDATE" "$MAILTO" \
    < $LOGDIR/$LOGFILE
Source: MySQL database backup script - QWeb Ltd
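The two-day retention relies on find's -mtime test, which counts age in whole 24-hour days, so "-mtime +2" matches files three or more days old. A self-contained sketch of how that behaves (the temp directory stands in for /var/backup/db):

```shell
# Demonstrate the retention rule: -mtime +2 matches files whose age in
# whole 24-hour days is greater than 2, i.e. files 3 or more days old.
BACKUPS=$(mktemp -d)
touch "$BACKUPS/fresh_daily.sql.gz"
touch -d "3 days ago" "$BACKUPS/stale_daily.sql.gz"   # GNU touch
# Preview with -print first, then delete the way the script does:
find "$BACKUPS" -name '*_daily*' -type f -mtime +2 -print
find "$BACKUPS" -name '*_daily*' -type f -mtime +2 -exec rm -v {} \;
```

Only the stale file is removed; anything newer than three days survives the sweep.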
Monthly MySQL Backup Script

The monthly script works the same way, but it appends the date to each file name and keeps backups for 150 days. It lives at /root/Scripts/mysql_monthly_backup.sh:
#!/bin/bash
# Run this script on a monthly cron to dump all databases
# Original script courtesy of Sonia Hamilton
# http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/
# Modified by EEJ to support:
# - Email logging
# - Append date to filename and delete after X days
# Updated: 8/6/2015

# Set this to MySQL root user
USER="root"
# Set this to MySQL root password
PASSWORD="xxxxxxxx"
# mkdir these folders if they don't yet exist
OUTPUTDIR="/var/backup/db"
# Number of days to keep backups
DAYSKEPT="150"
# Formal Backup title
BACKUPTITLE="JenkProd Monthly Database Backup"
# Email address for log file
MAILTO="ericjenkins@ericjenkins.net"
# From address
MAILFROM="JenkProd <noreply@ericjenkins.net>"
# Output file format (prefixed by dbname, suffixed by [date].sql.gz)
FILEDESC="_monthly_"
# Log file path
LOGDIR="/root/log"
# Log file name format (suffixed by date.log)
LOGFILEPREFIX="mysql_monthly_backup_"

###############################################################################

# Linux command paths
MYSQLDUMP="$(which mysqldump) --single-transaction --routines --triggers --events"
MYSQL="$(which mysql)"
MAIL="$(which mail)"

# Get date in yyyy.mm.dd format
DATE="$(date +%Y.%m.%d)"
LOGDATE="$(date +%m/%d/%Y)"

# Set a couple variables
LOGFILE=$LOGFILEPREFIX""$DATE".log"

# Record start time/date
TIME_START=`date +%s`
DATE_START=`date +%c`
echo -e "*** $BACKUPTITLE started at: $DATE_START ***" >> $LOGDIR/$LOGFILE
echo -e "\nOutput directory: $OUTPUTDIR\n" >> $LOGDIR/$LOGFILE

# Get a list of databases
DATABASES=`$MYSQL --user=$USER --password=$PASSWORD \
    -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# Dump each database in turn
for DB in $DATABASES; do
    OUTPUTFILE=$DB""$FILEDESC""$DATE".sql.gz"
    echo -en "Creating $OUTPUTFILE ... " >> $LOGDIR/$LOGFILE
    $MYSQLDUMP --log-error=$LOGDIR/$LOGFILE --user=$USER \
        --password=$PASSWORD --databases $DB | gzip > $OUTPUTDIR/$OUTPUTFILE
    echo -e "Done." >> $LOGDIR/$LOGFILE
done

# Delete files older than DAYSKEPT
echo -e "\nDeleting files older than $DAYSKEPT days ..." >> $LOGDIR/$LOGFILE
find $OUTPUTDIR/*$FILEDESC*.sql.gz -type f -mtime +$DAYSKEPT \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
find $LOGDIR/$LOGFILEPREFIX* -type f -mtime +$DAYSKEPT \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
echo -e "... Done." >> $LOGDIR/$LOGFILE

# Set File Permissions
chgrp gitusers $OUTPUTDIR/*
chgrp gitusers $LOGDIR/*

# List the contents of the output directory
echo -e "\nContents of $OUTPUTDIR:" >> $LOGDIR/$LOGFILE
ls -lah $OUTPUTDIR >> $LOGDIR/$LOGFILE

# Log system disk usage
echo -e "\nSystem Disk Usage:" >> $LOGDIR/$LOGFILE
df -h >> $LOGDIR/$LOGFILE

# Record script end time and time elapsed
TIME_END=`date +%s`
DATE_END=`date +%c`
TIME_ELAPSED=$((TIME_END - TIME_START))
echo -e "\n*** $BACKUPTITLE completed at: $DATE_END ***" >> $LOGDIR/$LOGFILE
echo -e "Elapsed time:\t $TIME_ELAPSED seconds\n" >> $LOGDIR/$LOGFILE

# Send email of log file
$MAIL -r "$MAILFROM" -s "$BACKUPTITLE completed on $LOGDATE" "$MAILTO" \
    < $LOGDIR/$LOGFILE
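Because the monthly dumps embed the date, any month's file can be located by name alone. The name is assembled exactly the way the script does it ("myblog" is an example database name):

```shell
# Reconstruct the monthly file name the same way the script does.
DB="myblog"
FILEDESC="_monthly_"
DATE="$(date +%Y.%m.%d)"
OUTPUTFILE=$DB""$FILEDESC""$DATE".sql.gz"
echo "$OUTPUTFILE"   # e.g. myblog_monthly_2015.08.06.sql.gz
```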
Monthly Webroot Backup Script

This script archives each site under /var/www into a dated .tgz, splitting each archive into parts under 2GB. It lives at /root/Scripts/www_monthly_backup.sh:
#!/bin/bash
# Create the directory specified by variable OUTPUTDIR
# Then run this script on a monthly cron to backup directory specified by INPUTDIR
# Original script courtesy of Sonia Hamilton
# http://www.snowfrog.net/2005/11/16/backup-multiple-databases-into-separate-files/
# Modified by EEJ to support:
# - Email logging
# - Easier custom configuration
# Updated: 8/6/2015

# Path to parent directory of sites to archive
INPUTDIR="/var/www"
# mkdir this folder if it doesn't yet exist
OUTPUTDIR="/var/backup/www"
# Number of days of backups to keep
DAYSKEPT="150"
# Formal Backup title
BACKUPTITLE="JenkProd Monthly Webroot Backup"
# Email address for log file
MAILTO="ericjenkins@ericjenkins.net"
# From address
MAILFROM="JenkProd <noreply@ericjenkins.net>"
# Output file format (prefixed by foldername, suffixed by [date].tgz)
FILEDESC="_monthly_"
# Log file path
LOGDIR="/root/log"
# Log file name format (suffixed by date.log)
LOGFILEPREFIX="www_monthly_backup_"

################################################################################

# Linux command paths
MAIL="$(which mail)"

# Get date in yyyy.mm.dd format
DATE="$(date +%Y.%m.%d)"
LOGDATE="$(date +%m/%d/%Y)"

# Set a couple variables
LOGFILE=$LOGFILEPREFIX""$DATE".log"

# Record start time/date
TIME_START=`date +%s`
DATE_START=`date +%c`
echo -e "*** $BACKUPTITLE started at: $DATE_START ***" >> $LOGDIR/$LOGFILE
echo -e "\nInput path: $INPUTDIR" >> $LOGDIR/$LOGFILE
echo -e "Output directory: $OUTPUTDIR\n" >> $LOGDIR/$LOGFILE

# Compress and archive each site in turn
# Split into 2GB parts if necessary
FOLDERS=`ls $INPUTDIR | grep -v cgi-bin`
for FOLDER in $FOLDERS; do
    OUTPUTFILE=$FOLDER""$FILEDESC""$DATE".tgz"
    echo -en "Creating $OUTPUTFILE ... " >> $LOGDIR/$LOGFILE
    tar -cz $INPUTDIR/$FOLDER | split -b 1920M - $OUTPUTDIR/$OUTPUTFILE"_"
    echo -e "Done." >> $LOGDIR/$LOGFILE
done

# Delete files older than DAYSKEPT
echo -e "\nDeleting files older than $DAYSKEPT days ..." >> $LOGDIR/$LOGFILE
find $OUTPUTDIR/*$FILEDESC* -type f -mtime +$DAYSKEPT \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
find $LOGDIR/$LOGFILEPREFIX* -type f -mtime +$DAYSKEPT \
    -exec rm -v {} \; >> $LOGDIR/$LOGFILE
echo -e "... Done." >> $LOGDIR/$LOGFILE

# Set File Permissions
chgrp gitusers $LOGDIR/*

# Log the contents of the output directory
echo -e "\nContents of $OUTPUTDIR:" >> $LOGDIR/$LOGFILE
ls -lah $OUTPUTDIR >> $LOGDIR/$LOGFILE

# Log system disk usage
echo -e "\nSystem Disk Usage:" >> $LOGDIR/$LOGFILE
df -h >> $LOGDIR/$LOGFILE

# Record script end time and time elapsed
TIME_END=`date +%s`
DATE_END=`date +%c`
TIME_ELAPSED=$((TIME_END - TIME_START))
echo -e "\n*** $BACKUPTITLE completed at: $DATE_END ***" >> $LOGDIR/$LOGFILE
echo -e "Elapsed time:\t $TIME_ELAPSED seconds\n" >> $LOGDIR/$LOGFILE

# Send email of log file
$MAIL -r "$MAILFROM" -s "$BACKUPTITLE completed on $LOGDATE" "$MAILTO" \
    < $LOGDIR/$LOGFILE
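To restore a site from one of these split archives, concatenate the parts back into a single stream in name order (split's default suffixes aa, ab, ... sort correctly) and pipe it to tar. A self-contained round trip under a temp directory, with "example.com" as a stand-in site name:

```shell
# Round-trip sketch of the split-archive scheme: tar a site, split the
# stream into parts, then restore by concatenating the parts in order.
WORK=$(mktemp -d)
mkdir -p "$WORK/www/example.com"
echo "hello" > "$WORK/www/example.com/index.html"
# Archive and split (tiny 1k parts here; the real script uses 1920M):
tar -C "$WORK/www" -czf - example.com | split -b 1k - "$WORK/example.com_monthly_.tgz_"
# Restore: concatenate the parts in name order and extract:
mkdir "$WORK/restore"
cat "$WORK"/example.com_monthly_.tgz_* | tar -xzf - -C "$WORK/restore"
```

The same cat-and-untar step works on the real parts in /var/backup/www, so a multi-gigabyte site restores just as easily as a small one.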