Jan 30 2021

When you install a Drupal site, a settings.php file is created (either by you, or by the installer) to contain various settings specific to your site (such as database configuration, trusted hostnames, etc.). This is done by taking a copy of a file provided by Drupal core, default.settings.php, and adding or modifying the required lines.

As Drupal develops, additional features mean new things going into default.settings.php.

For example, after a long discussion, a new entry was added to default.settings.php with effect from Drupal 9.2:

# $settings['update_fetch_with_http_fallback'] = TRUE;

This mitigates a potential man-in-the-middle attack when checking for updates to core and contributed modules. The point here is not to discuss that issue; it merely serves to illustrate something: any Drupal site created before 9.2 will have its settings.php file based off the earlier default.settings.php file, and so won't have this entry or its associated documentation comments.

This is going to become increasingly important. Before Drupal 8, major new Drupal releases would often involve creating a new settings.php file. Now, this file could persist as Drupal moves through 8.x, 9.x, 10.x, etc.

There needs to be some way to keep track of changes to default.settings.php between releases of Drupal core, so that any individual site's settings.php file can keep pace. Expecting site maintainers to comb the extensive release notes for every minor core release is not going to work; apart from the chance something might be missed, there is also the fact that sometimes smaller changes in a major release are documented in the release notes of a beta or release candidate.

We need a solution that:

  • Keeps track of the default.settings.php file off which the current settings.php file is based
  • Allows a (semi-)automated way to update settings.php files to incorporate changes
  • Alerts the site maintainer when the settings.php file has become stale.

This post will offer a solution, working through those three requirements in turn.

Keeping track of the defaults for the current settings.php

When you install your Drupal site, take a copy of default.settings.php. I'll call it last.default.settings.php.

Put it in web/sites/default, the same place as default.settings.php.

If your site is tracked with some kind of version control, make sure that file is included.

You now have a file that matches exactly the default.settings.php file used to create your site's settings.php file for the first time.
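The setup is a single copy command. The sketch below is self-contained so it can be tried safely: the mktemp scaffolding and the file contents are invented stand-ins that simulate a site layout; on a real site you would run only the cp, from your project root.

```shell
#!/bin/sh
# Sandboxed sketch of the one-off setup. The mktemp scaffolding only
# simulates a site layout; on a real site, run the cp from your project root.
set -e
dir=$(mktemp -d)
mkdir -p "$dir/web/sites/default"
printf '<?php\n// stand-in for core defaults\n' > "$dir/web/sites/default/default.settings.php"
cd "$dir"

# The actual step: snapshot the defaults your settings.php is based on.
cp -a web/sites/default/default.settings.php web/sites/default/last.default.settings.php
```

Commit the resulting last.default.settings.php alongside settings.php in version control.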

Checking when something has changed

We'll get to automating this in a bit.

But for now, after any core update, you can run the following to check whether default.settings.php has changed since you created settings.php.

diff -u web/sites/default/last.default.settings.php web/sites/default/default.settings.php

That command will return nothing if nothing has changed, or a diff of the changes if something has.

Incorporating changes (semi-)automatically

We can use that diff to change settings.php to incorporate any changes

cd web/sites/default
diff -u last.default.settings.php default.settings.php > settings-merge.patch
patch --dry-run settings.php < settings-merge.patch

As long as you're happy with what the dry-run says will happen, follow this with:

patch settings.php < settings-merge.patch
rm settings-merge.patch

Lastly, we need to refresh our copy of the (changed) default.settings.php file. Your settings.php file is now based off the updated version, so the snapshot we keep must be updated to match:

cp -a default.settings.php last.default.settings.php

Take care: if patch creates a file called settings.php.orig containing the unaltered file, remove that before checking the changed version back into version control. If you're using version control, you don't need a separate copy of the old file anyway.

This is all only semi-automated, because it's possible that the patch won't apply cleanly (for example, if the changed portion of default.settings.php is too close to site-specific modifications you had made), in which case you'd have to make the changes manually.
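The whole cycle can be rehearsed end to end in a scratch directory. Everything below is a self-contained simulation: the three file contents are invented stand-ins; only the diff / patch / cp sequence mirrors the real workflow.

```shell
#!/bin/sh
# Self-contained rehearsal of the merge cycle in a scratch directory.
set -e
dir=$(mktemp -d); cd "$dir"

# Invented stand-ins for the three files involved.
printf 'line one\nline two\n'                   > last.default.settings.php
printf 'line one\nline two\nnew core setting\n' > default.settings.php
printf 'line one\nline two\nmy site override\n' > settings.php

# The real sequence: diff the defaults, dry-run, apply, refresh the snapshot.
# (diff exits 1 when the files differ, hence the || true under set -e.)
diff -u last.default.settings.php default.settings.php > settings-merge.patch || true
patch --dry-run settings.php < settings-merge.patch
patch settings.php < settings-merge.patch
rm settings-merge.patch
cp -a default.settings.php last.default.settings.php
```

Afterwards, settings.php contains both the new core setting and the site-specific override, and the snapshot once again matches default.settings.php.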

Alerting the site maintainer

You could run that diff command manually after each core update. But it would be nice to automate that.

Fortunately, this is easily done.

I have a directory inside my composer root folder named hook-scripts. Your mileage may vary as to where you choose to put files like this, but working with my directory structure, create a file in hook-scripts named check-default-settings.

#!/bin/bash
DIFF=$(diff -q web/sites/default/last.default.settings.php web/sites/default/default.settings.php)
if [ ! -z "$DIFF" ]; then
  echo -e "\e[31mdefault.settings.php file has changed\e[0m"
fi

(Make sure you set the file to be executable, chmod +x)

If you execute that script, it will return nothing if default.settings.php is unchanged. But if your last.default.settings.php file no longer matches default.settings.php, it will print a message to that effect in red letters.
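You can satisfy yourself that the script's diff -q test behaves that way by inlining the same check against two throwaway files (the file contents here are invented stand-ins):

```shell
#!/bin/sh
# Self-contained rehearsal of the script's diff -q check, using stand-in files.
dir=$(mktemp -d); cd "$dir"
printf 'original line\n'                        > last.default.settings.php
printf 'original line\nnew line added by core\n' > default.settings.php

# diff -q prints a one-line "Files ... differ" message when the files differ,
# so $DIFF is non-empty and the warning fires.
DIFF=$(diff -q last.default.settings.php default.settings.php || true)
if [ ! -z "$DIFF" ]; then
  echo "default.settings.php file has changed"
fi
```

With identical files, $DIFF is empty and the script prints nothing.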

Now all we need to do is tell composer to call this script after each installation or update. I'll assume here that you know how to put pre / post update / install scripts into composer.json. But you want something like this:

    "scripts": {
        "post-update-cmd": [
            "hook-scripts/check-default-settings"
        ],
        "post-install-cmd": [
            "hook-scripts/check-default-settings"
        ]
    }

Now, every time Drupal core is updated (indeed, anything is updated) your script will run. If default.settings.php has changed, you'll be prompted in red lettering. You can then go and run the diff / patch commands above to make sure those changes are included for your site.

Jan 18 2021

What I needed to do

I develop and maintain several Drupal websites. I do some development on a server running cPanel (in a Linux account that is totally isolated from the account running any production sites), most notably theming work.

I wanted to develop a custom subtheme of the Drupal base theme Barrio (a Bootstrap implementation). Because I like using Sass as a CSS pre-processor, to give me maintainable source files, I wanted to use the Barrio SASS Starter Kit theme as a starter. (It is itself a subtheme of Barrio).

That starter kit uses Gulp to generate the final stylesheets.

So, to do any theming work from a cPanel account, I had to find out how to run Gulp scripts from within cPanel. Gulp runs on node.js.

What I'm about to describe will work with other Drupal themes that use Gulp, and will also work for Drupal themes that need other tools (such as Grunt).

The Red Herring

For cPanel hosting, I have used Veerotech for several years. They are excellent and I highly recommend them. Technical support, when needed, is timely day or night. You talk to actual techs employed by Veerotech themselves (never outsourced). As someone who is fairly competent in web hosting (having run servers myself using various control panels, and no panel), if I need help I'll ask a question phrased in the correct technical terminology. Unlike some other providers, they engage me on that level, read and understand the actual question asked, and almost always resolve it first time.

In particular, they offer two things that I thought might have been enough. First, with only a helpdesk ticket and a five-minute wait, any account can have console / shell access. Second, their CloudLinux setup includes a tool to create a Node application within the cPanel UI, including allowing you to run it from an external (sub)domain if you want an actual website powered by Node.

I don't need public access, but I hoped that this access to Node might give me a way to run the Gulp script from my Barrio subtheme.

Alas, no. The cPanel / CloudLinux Node application system assumes you'll be running Node only from within the directory where you create your app. If I tried to run the Gulp script for my theme, it couldn't see dependent modules (like PostCSS). So this was not the answer.

But I found a solution that works, and it's really quite easy when you know how. In fact, it's elegant and cleaner than creating a Node application that is accessible (but does nothing) from the public internet. Here's how:

Node Version Manager

Node Version Manager is, in its own words, "a version manager for node.js, designed to be installed per-user, and invoked per-shell. nvm works on any POSIX-compliant shell (sh, dash, ksh, zsh, bash), in particular on these platforms: unix, macOS, and windows WSL."

The important bit is "per-user". It is installed by the user wanting to run it, in the shell they wish to use. Root permissions are not needed.

Step 1: Install Node Version Manager

The docs tell you to run one of these two commands.

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.2/install.sh | bash
wget -qO- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.2/install.sh | bash

Personally, I never pipe curl or wget straight into a shell command. Should the download be corrupted or fail to finish, you could accidentally run half a script and leave things in an unstable state. So I did this:

wget https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.2/install.sh
bash install.sh

The version number (0.37.2) was current at the time I ran the install script. Obviously, use the version that is current when you run it.

This will install Node Version Manager into a directory within your home named .nvm, and attempt to append lines to your .bashrc file that set up the environment and path for future use. It also prints those commands to the screen, in case you don't wish to log back into the shell (or source .bashrc):

export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"  # This loads nvm
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion"  # This loads nvm bash_completion

Step 2: Install Node.js

In theory, you just run

nvm install node

I tried that first, but hit a problem. The gulp script for the Barrio subtheme calls node-sass.

In particular, it called version 4.14.1 of node-sass, which threw an error because that version does not support 15.x releases of Node.js. By the time you read this, a different version of node-sass may be called (5.0.0 does support Node.js 15), so you may be able simply to run the command above.

In my case, I had to look at the release notes for node-sass 4.14.1, and see that the highest supported version of Node.js was 14. I then went to the table of releases for Node.js where I saw that meant installing Node.js 14.15.4. That should be possible with

nvm install 14.15.4

but for some reason that hung without completing. The release table also told me that 14.15.4 was the current long-term support version, so I ran this command without any trouble:

nvm install --lts

Step 3: Install the required modules for the project

I was now able to call the commands on the Barrio SASS project page. Install both Bootstrap Barrio and Barrio SASS, run the command included with Barrio SASS to create your own subtheme, navigate to your custom theme directory, and run these two commands:

npm install --global gulp-cli
npm install

Step 4: Call gulp

Now, you just go to your custom theme directory and run:

gulp

That will pick up gulpfile.js in the project directory and compile the theme resources. It will then stay running, because the gulpfile includes gulp-watch, which means it will watch for changes to any of the .scss files and instantly rebuild the theme. So you may prefer to run

gulp &

to keep it in the background.

The Gulp script will also think you're running a full version of Node.js, which means it will attempt to launch a web serving process listening on a particular port on the localhost interface. None of that will work in a shared hosting environment. That's OK - you don't need it to. You just need Gulp to assemble the css files for your theme, which it will do.

Aug 05 2019

I've been putting off learning how to build sites in Drupal 8, and migrating my existing Drupal 7 sites over to Drupal 8. Why? Drupal 8 uses a lot of new tools. I want to learn how to set up a Drupal 8 site in the "right" (optimal) way so that I don't incur technical debt for myself later on. That means I have a lot of tools to learn. That takes time, which I don't have a lot of. So I've procrastinated.

Now, two deadlines are creeping up on me. The first is the release of Drupal 9, planned for June 3, 2020. At that point, there's a bit longer before Drupal 7 goes unsupported (November 2021 is the current plan), but it's time to move.

The second is the requirement for Strong Customer Authentication (SCA) on European payments from the middle of this coming September. One of my D7 sites runs Ubercart and uses Ubercart Stripe, which won't support SCA. The only alternative to upgrading is to disable Stripe as a payment method. Actually, AndraeRay has taken maintainer status of Ubercart Stripe and nearly has an SCA-compliant 7.x-3.x branch ready. Many thanks!

So it's time to start. But how do I set up my development workflow? Having done a lot of reading, here's what I propose. I'd really value input from the Drupal community here; I'm bound to have missed something important, and I've one big question I can't answer (see below). Please pile in at the Comments.

My Proposed Approach

  • Set up a development server, separate from the server that live sites will be deployed on. This will have plenty of RAM, the same version of PHP as the deployment environment, and Composer installed.
  • I plan to use a private git repository to store each site's code and configuration.
  • On the development server, use Composer to install drupal-composer/drupal-project.
  • Use .gitignore to exclude all core and contributed modules and themes from the git repository.
  • I'll also move environment-specific settings (such as database credentials) to a settings.local.php file, which will also be excluded using .gitignore.
  • Add everything else to git.
  • As I work on the site, I'll add, commit, and push changes to the git repository.
  • I'll also set up a sync directory using $config_directories['sync'], so that I can use Drupal console to export the site's configuration and include that in git as well.
  • On the production server, I can periodically pull everything from the git repository. When I do so, I'd use composer install (never composer update). This removes the need for the production server to have a large amount of RAM (since generating composer.lock is done on the development server). I'd then use Drupal console to import the synchronised configuration, and clear all caches.
  • For theming, on Drupal 7 I built my own responsive themes by subtheming Zen. Zen for Drupal 8 is in Alpha only; the last (alpha) release was 3 years ago and the last commit was 20 months ago. So it looks like the best approach is to subtheme Classy directly.
  • Previously, I've found Sass and Compass to be a great help with building a custom theme, so I'd plan to use them again. I'd include one simple .scss file that is excluded from git to override the base hue of the theme; this allows my development site to have a different colour scheme to the production one, to help me always remember which site I'm on.
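For the exclusion points above, a minimal .gitignore in the project root might look like the fragment below. This is a sketch, assuming the standard drupal-composer/drupal-project layout with a web/ docroot; adjust the paths to your own structure.

```
# Rebuilt by composer install, so excluded from git
/vendor/
/web/core/
/web/modules/contrib/
/web/themes/contrib/
/web/profiles/contrib/
/web/libraries/

# Environment-specific settings stay out of the repository
/web/sites/*/settings.local.php
```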

All of the above can be modified to allow a staging copy of the site as well, if that becomes helpful.


I have one big question.

This workflow allows me to make configuration changes on the development site, and then push them up to the deployment site when ready.

It seems to me that, whilst configuration needs pushing from development to production, content needs pushing the other way. That's to say, I'll want regularly to make sure my development site contains the latest content on the production site. New and altered pages need pushing down, as do nodes that use the new content types I've developed in development.

Configuration: Development => Staging => Production
Content: Production => Staging => Development

How do you do this?

Specifically, I don't think I want to dump the entire database from production and replace the development database contents with that SQL dump. Doing so would also overwrite configuration changes on the development site, and that's not the way this whole workflow is designed.

So is there a generally adopted method to use, to dump the site content from production and import it to development, but without making any changes to the configuration that is stored in the development database?

Over to you

So over to you, reader. I've learnt a lot, but I have a lot to learn still.

Can you help answer that question?

Have I missed anything really important? Would anything in the approach above be unwise, and if so what and why? Am I right that Classy is the best base theme, or is another preferable? (It would need to be actively developed, likely to stick around as Drupal moves through 8.8, 8.9, 9.x and so on, and not slow the site down significantly or make it harder to maintain by adding lots of preprocessing or complex div structures / css libraries). Am I using git and composer in the right way? Any other advice?

Thanks in advance!
