Automated Drupal database backups using Drush, Bash and Cron

Drush + Bash + Cron: Database Backup Goals

  • Scan the sites directory of a given Drupal install
  • Find all multisite folders/symlinks
  • For each multisite (a single-site sketch of these commands follows this list):
  • Use Drush to clear the cache - we don't want cache tables bloating the MySQL dump file
  • Use Drush to delete watchdog logs - we don't want the watchdog table bloating the MySQL dump file
  • Use Drush to back up the database to a pre-assigned folder
  • Use tar to compress and timestamp the Drush-generated SQL file
  • Set up crontab to run the above commands periodically from a Bash file
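
Seen from a single site, the per-site steps reduce to three Drush commands plus a tar step. The sketch below is illustrative only: example.com is a hypothetical site name, and the backup folder matches the assumption used later in this post.

# Clear caches so the cache tables don't bloat the dump
drush -y -u 1 -l example.com cc all
# Delete watchdog entries so the watchdog table doesn't bloat the dump
drush -y -u 1 -l example.com wd-del all
# Dump the database, then compress and timestamp it
drush -u 1 -l example.com sql-dump --result-file=/var/www/backup/sqldumps/example.com.sql
tar -czf /var/www/backup/sqldumps/$(date +%Y%m%d%H%M)-example.com.tar.gz -C /var/www/backup/sqldumps example.com.sql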

Assumptions and Instructions

You will need to adjust the Bash file if any of these differ on your server (a quick sanity check follows the list):

  • Drupal is installed in /var/www/html/drupal
  • Multisites are set up in the /var/www/html/drupal/sites folder
  • A backup folder exists at /var/www/backup/sqldumps
  • Drush is already installed at /root/drush/drush. If Drush is not installed, follow this Drush installation guide
  • AWK is already installed; if not, type: sudo yum install gawk
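
You can confirm these assumptions in one go before running anything; this is just a convenience check using the same paths listed above (adjust them if yours differ):

# Verify the paths and binaries the backup script expects
[ -d /var/www/html/drupal/sites ]  || echo "Drupal sites folder not found"
[ -d /var/www/backup/sqldumps ]    || echo "Backup folder not found"
/root/drush/drush --version        || echo "Drush not found at /root/drush/drush"
command -v awk                     || echo "AWK not installed"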

Drush Backup Bash File

Copy and paste the code below into a new Bash file, ideally in your /root home folder, and make the Bash file executable (example commands follow the script).

#!/bin/bash
#
 
# Adjust to match your system settings
DRUSH=/root/drush/drush
ECHO=/bin/echo
FIND=/usr/bin/find
AWK=/bin/awk
 
# Adjust to match your system settings
docroot=/var/www/html/drupal
backup_dir=/var/www/backup/sqldumps
 
multisites=$1
 
START_TIME=$(date +%Y%m%d%H%M);
 
# Build a list of all multisites for the given docroot: every entry under sites/
# that is not named all or default and does not end in .local.
if [ "${multisites}" = "all" ];then
        # If multisites are folders use -type d; if they are symlinks use -type l
        # Adjust the awk field to match your docroot depth: it must be the path
        # component holding the multisite name ($7 for /var/www/html/drupal/sites/<site>)
        multisites_list="`$FIND ${docroot}/sites/* -type l -prune | $AWK -F \/ '$7!="all" && $7!="default" && $7!~/\.local$/ { print $7 }'`"
else
        multisites_list=$multisites
fi
 
 
# Must be in the docroot directory before proceeding.
cd $docroot
 
for multisite in $multisites_list
do
        # Echo to the screen the current task.
        $ECHO
        $ECHO "##############################################################"
        $ECHO "Backing up ${multisite}"
        $ECHO
        $ECHO
 
        # Clear Drupal cache
        $DRUSH -y -u 1 -l "${multisite}" cc all
 
        # Truncate Watchdog
        $DRUSH -y -u 1 -l "${multisite}" wd-del all
 
        # SQL Dump DB
        $DRUSH -u 1 -l "${multisite}" sql-dump --result-file="${backup_dir}"/"${multisite}".sql
 
        # Compress the SQL Dump
        tar -czv -f "${backup_dir}"/"${START_TIME}"-"${multisite}".tar.gz -C "${backup_dir}"/ "${multisite}".sql
 
        # Delete original SQL Dump
        rm -f "${backup_dir}"/"${multisite}".sql
 
        $ECHO
        $ECHO
        $ECHO "Finished backing up ${multisite}"
        $ECHO
        $ECHO "##############################################################"
 
done
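
Once saved (the examples below assume /root/drush_backup.sh, the same name used in the crontab section), make the file executable and try a manual run. Pass all to back up every multisite, or a single site name, shown here with a hypothetical example.com, to back up just that site:

chmod +x /root/drush_backup.sh

# Back up every multisite found under the sites folder
/root/drush_backup.sh all

# Back up a single multisite only
/root/drush_backup.sh example.com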

Set Up Crontab

Assuming the Bash file containing the code above is saved as /root/drush_backup.sh, you can set up a crontab entry for the root user. Note the all argument, which tells the script to back up every multisite it finds.

crontab -e
1 1 * * * /root/drush_backup.sh all
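
If you want a record of each nightly run, you can redirect the script's output to a log file; the log path below is only a suggestion:

# Back up all multisites daily at 01:01 and append the output to a log
1 1 * * * /root/drush_backup.sh all >> /var/log/drush_backup.log 2>&1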

