
Simplify Importing a Production Database with Bash and Drush

Posted Sep 24, 2012 // 5 comments

Local development, with its xdebug goodness, only works if you can mimic the production site as closely as possible.

While there are very good arguments to use virtualization, sometimes a local site with prod code and a prod database is good enough.

Using a simple bash script, you can run a set of automated commands that quickly gets your local environment set up.

Decompress and Import a Database File

It's really simple to import a database, even a compressed one, using Drush's SQL commands:

$ gzcat ~/path/to/file.sql.gz | `drush @alias sql-connect`

The backticks are important: the shell runs `drush @alias sql-connect` first and substitutes its output, the MySQL connection string, into the pipeline, so the decompressed SQL from gzcat gets piped straight into mysql.
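To see that substitution in isolation, here's a tiny drush-free sketch (using `echo cat` as a stand-in for the backticked drush command):

```shell
# The shell runs `echo cat` first and splices its output ("cat") into the
# pipeline, so the piped text ends up being read by cat -- the same way the
# backticked drush command's output determines what receives the piped SQL.
echo "SELECT 1;" | `echo cat`
```

This prints `SELECT 1;`, because the command line collapses to `echo "SELECT 1;" | cat` before it runs.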

Automate Clearing Caches and Disabling Modules

When you pull down a prod database locally, a lot of times you need to disable modules that are only applicable to prod. You may also need to clear caches or set values specific to development.

Below is a script you can put in your .bash_profile or .bashrc to do this for you:


# Project
project_clean () {
  drush $1 pm-disable [ADD PROD MODULES TO DISABLE] -y &&
  drush $1 pm-enable devel dblog -y &&
  drush $1 cc all &&
  drush $1 updb -y &&
  drush $1 vset --always-set --yes less_devel 1 &&
  drush $1 vset --always-set --yes preprocess_css 0 &&
  drush $1 vset --always-set --yes preprocess_js 0
}

Replace [ADD PROD MODULES TO DISABLE] with the names of the modules you want to disable, separated by spaces (without the brackets).

You'll notice that the script also enables devel and dblog, clears cache, updates the database, and sets some development variables.

You'll also notice that the script accepts an argument. This way, if you have multiple sites set up locally, you can simply pass in the alias of the site.

For instance, if you had a site with alias 'foobar', you'd use the script at your command prompt via:

$ project_clean @foobar
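The pattern itself is plain bash: `$1` is the function's first argument, and `&&` stops the chain at the first command that fails. Here's a drush-free sketch of the same shape, with echo standing in for the drush calls:

```shell
# $1 receives the site alias; each step runs only if the previous one succeeded.
demo_clean () {
  echo "disabling prod modules on $1" &&
  echo "clearing caches on $1"
}

demo_clean @foobar
```

If any step in the chain exits non-zero, the remaining steps are skipped, which keeps a broken import from cascading into cache clears and database updates.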

Combine the Two

Hey! Can't we chain these together, so that in one line, we decompress a database file, import it to our local site, disable modules we don't need, enable the development ones we do, and run database updates?

Of course!

$ gzcat ~/path/to/file.sql.gz | `drush @foobar sql-connect` && project_clean @foobar

Mountain Lion Caveat

I updated to Mountain Lion recently and kept running into a really weird error when doing this process:

ERROR 1016 (HY000): Can't open file: './XXXXXX.frm' (errno: 24)

A quick Google search led me to believe that the error was related to a limit on the number of concurrently open files.

In OSX <=10.7, you can get around this by editing the my.cnf configuration for your MySQL setup with the appropriate values for mysqld. Here is an example I use:

[mysqld]
# Packets.
max_allowed_packet=16m
# Wait timeouts.
innodb_lock_wait_timeout=600
wait_timeout=600
connect_timeout=10
# Set this as high as possible. On a dedicated server, 60% - 80% of machine RAM.
innodb_buffer_pool_size=512m
# Set this to the number of logical cores you have on the database server.
innodb_thread_concurrency=4
# Turn this on dynamically with a Jenkins job.
slow_query_log=OFF
# Max number of connections allowed.
max_connections=400
# Don't run out of file descriptors!
open_files_limit=32768
# If you set the query cache too high, your server risks severely slowing down and taking tens of seconds after an INSERT due to query cache mutex contention.
query_cache_limit=1M
query_cache_size=32M

In OSX >=10.8, this gets ignored because Mountain Lion imposes a soft open-file limit that is capped at 256.

You can validate this in one of two ways: check open_files_limit via SHOW VARIABLES in MySQL, or simply run 'ulimit -a' at a command prompt:

$ ulimit -a

core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) 256
open files (-n) 256
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 709
virtual memory (kbytes, -v) 256
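If you only care about the open-file number, you can read it directly instead of scanning the full `ulimit -a` table:

```shell
# Print only the soft open-file limit for the current shell session.
soft_limit=$(ulimit -n)
echo "soft open-file limit: ${soft_limit}"
```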

Because open_files_limit is a MySQL server variable, you can't update it directly at runtime. Check this tutorial for how to change it.

Changing it is a moot point, however, if the open-file limit in Mountain Lion OSX 10.8 isn't raised, as that will trump any value you put in MySQL.

You can, however, increase the open-file limit for your terminal session by simply running:

$ ulimit -n [NUMBER]

Replace [NUMBER] with the limit you want to use, like 4000.

You can then check it again by running:

$ ulimit -a

core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) 256
open files (-n) 4000
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 709
virtual memory (kbytes, -v) 256

For that session, you should be able to import a bigger database: the open-file limit will be higher, your MySQL open_files_limit will be adjusted, and your world will be happier!

Fredric has many years of experience in the IT field, including as a consultant in healthcare IT and as an Interaction Designer. Since the days of the Tandy 3000, tinkering has always been a passion, and before joining Phase2, he dabbled in ...
