May 20 2020

I recently finished porting this website from a static site generator to Drupal 8, meaning that this site has now been powered by three different major versions of Drupal (6, 7 and 8) as well as by two static site generators since it was first launched in early 2010.

The majority of the content was imported using migrations from JSON feeds that I created. This included:

  • Blog tags
  • Blog posts
  • Talks
  • Redirects

In some follow-up posts, I'll be looking at each migration separately, describing any issues and looking at how it was used to import its respective content.

I'll update this post with the links to the follow-up posts, and they are also available from the blog series' page.

Apr 24 2020

I'm happy to be presenting two talks remotely at this year's CMS Philly conference (formerly Drupaldelphia).

The first talk is Deploying PHP applications with Ansible, Ansible Vault and Ansistrano at 1pm (6pm UK time), where I'll give an introduction to Ansible and show how to use Ansistrano to deploy a Drupal 8 application.

The second talk is Taking Flight with Tailwind CSS at 2pm (7pm UK time) where I'll show how to configure and use Tailwind CSS.

CMS Philly is happening virtually on Friday, May 1st via GoToWebinar.

Apr 22 2020

Some time ago, I announced that I was planning on writing a book on automated testing and test driven development with Drupal. I created a landing page and set up a mailing list, but I wasn't sure at that point what I was going to cover or create as part of the book.

I'm going to write a book on automated testing in Drupal. Join the mailing list for updates, and I'm happy to take suggestions on what to cover. https://t.co/YXNpe6f8Ft #drupal

— Oliver Davies (@opdavies) May 15, 2018

Being a meetup and DrupalCamp conference organiser, after some thought I decided to build a website for an example conference, with some of that code then included in the book as example content. This covers most of what I originally wanted, through features like a call for papers where potential speakers can propose sessions, the ability for organisers to administer and moderate those proposals, automatic notification emails to submitters, and a display of the accepted sessions.

I've started building it with Drupal 8.8, and it is now available to purchase on Gitstore, including all future updates as I continue building the application - adding new features and upgrading to Drupal 9 once it is released. There are some other interesting things there too, such as using feature flags to enable or disable functionality, and using GitHub Actions to run the tests automatically.

I've added a page for the book itself on Leanpub, and I'll be continuing to add content to it in parallel to building the example codebase. Once there is enough content, I will release the first draft for purchase.

For any purchases made via Gitstore or Leanpub, an amount will be donated to the Drupal Association and the #DrupalCares campaign to help sustain the Association during COVID-19.

Feb 04 2020

How to use the PSR-4 autoloading standard for Drupal 7 Simpletest test cases.

The Traditional Way

The typical way of including test cases in Drupal 7 is to add one or more classes within a .test file - e.g. opdavies.test. This would typically include all of the different test cases for that module, and would be placed in the root of the module’s directory alongside the .info and .module files.

In order to load the files, each file would need to be declared within the .info file for the module.

There is a convention that if you have multiple tests for your project, these can be split into different files and grouped within a tests directory.

; Load a test file at the root of the module
files[] = opdavies.test

; Load a test file from within a subdirectory
files[] = tests/foo.test
files[] = tests/bar.test

Using the xautoload Module

Whilst splitting tests into separate files makes things more organised, each file needs to be loaded separately. This can be made simpler by using the Xautoload module, which supports wildcards when declaring files.

files[] = tests/**/*.test

This would load all of the .test files within the tests directory.

Using PSR-4 Autoloading

Another option is to use PSR-4 (or PSR-0) autoloading.

This should be a lot more familiar to those who have worked with Drupal 8, Symfony etc, and means that each test case is in its own file which is cleaner, files have the .php extension which is more standard, and the name of the file matches the name of the test class for consistency.

To do this, create a src/Tests (PSR-4) or lib/Drupal/{module_name}/Tests (PSR-0) directory within your module, and then add or move your test cases there. Add the appropriate namespace for your module, and ensure that DrupalWebTestCase or DrupalUnitTestCase is also namespaced.

// src/Tests/Functional/OliverDaviesTest.php

namespace Drupal\opdavies\Tests\Functional;

class OliverDaviesTest extends \DrupalWebTestCase {
  // ...
}

This also supports subdirectories, so you can group classes within Functional and Unit directories if you like.
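For example, a module using the PSR-4 layout might be structured like this (the Unit test file name is a hypothetical example):

```
opdavies/
  opdavies.info
  opdavies.module
  src/
    Tests/
      Functional/
        OliverDaviesTest.php
      Unit/
        ExampleUnitTest.php
```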

If you want to see a real-world example, see the Drupal 7 branch of the Override Node Options module.

Digging into the simpletest_test_get_all function

This is the code within simpletest.module that makes this work:

// simpletest_test_get_all()

// ...

$module_dir = DRUPAL_ROOT . '/' . dirname($filename);

// Search both the 'lib/Drupal/mymodule' directory (for PSR-0 classes)
// and the 'src' directory (for PSR-4 classes).
foreach (array(
  'lib/Drupal/' . $name,
  'src',
) as $subdir) {

  // Build directory in which the test files would reside.
  $tests_dir = $module_dir . '/' . $subdir . '/Tests';

  // Scan it for test files if it exists.
  if (is_dir($tests_dir)) {
    $files = file_scan_directory($tests_dir, '/.*\\.php/');
    if (!empty($files)) {
      foreach ($files as $file) {

        // Convert the file name into the namespaced class name.
        $replacements = array(
          '/' => '\\',
          $module_dir . '/' => '',
          'lib/' => '',
          'src/' => 'Drupal\\' . $name . '\\',
          '.php' => '',
        );
        $classes[] = strtr($file->uri, $replacements);
      }
    }
  }
}

It looks for the tests directory (src/Tests or lib/Drupal/{module_name}/Tests) within the module, and then finds any .php files within it. It then converts the file name into the fully qualified (namespaced) class name and loads it automatically.
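As a worked example of that conversion, assuming a module named opdavies installed at sites/all/modules/opdavies (the path is illustrative), the replacements behave like this:

```php
// The keys mirror those used in simpletest_test_get_all().
$replacements = array(
  '/' => '\\',
  'sites/all/modules/opdavies/' => '',
  'lib/' => '',
  'src/' => 'Drupal\\opdavies\\',
  '.php' => '',
);

// Outputs: Drupal\opdavies\Tests\Functional\OliverDaviesTest
echo strtr('sites/all/modules/opdavies/src/Tests/Functional/OliverDaviesTest.php', $replacements);
```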

Running the Tests

You can still run the tests from within the Simpletest UI, or from the command line using run-tests.sh.

If you want to run a specific test case using the --class option, you will now need to include the fully qualified name.

php scripts/run-tests.sh --class Drupal\\opdavies\\Tests\\Functional\\OliverDaviesTest

Oct 24 2018

Today I found another instance where I decided to use Illuminate Collections within my Drupal 8 code; whilst I was debugging an issue where a Drupal Commerce promotion was incorrectly being applied to an order.

No adjustments were showing in the Drupal UI for that order, so after some initial investigation and finding that $order->getAdjustments() was empty, I determined that I would need to get the adjustments from each order item within the order.

If the order were an array, this is how it would be structured in this situation:

$order = [
  'id' => 1,
  'items' => [
    [
      'id' => 1,
      'adjustments' => [
        ['name' => 'Adjustment 1'],
        ['name' => 'Adjustment 2'],
        ['name' => 'Adjustment 3'],
      ],
    ],
    [
      'id' => 2,
      'adjustments' => [
        ['name' => 'Adjustment 4'],
      ],
    ],
    [
      'id' => 3,
      'adjustments' => [
        ['name' => 'Adjustment 5'],
        ['name' => 'Adjustment 6'],
      ],
    ],
  ],
];

Getting the order items

I started by using $order->getItems() to load the order’s items, converted them into a Collection, and used the Collection’s pipe() method and the dump() function provided by the Devel module to output the order items.

collect($order->getItems())
  ->pipe(function (Collection $collection) {
    dump($collection);

    return $collection;
  });

Get the order item adjustments

Now we have a Collection of order items, and for each item we need to get its adjustments. We can do this with map(), then call getAdjustments() on the order item.

This would return a Collection of arrays, with each array containing its own adjustments, so we can use flatten() to collapse all the adjustments into a single-dimensional array.

  ->map(function (OrderItem $order_item) {
    return $order_item->getAdjustments();
  })
  ->flatten();

There are a couple of refactors that we can do here though:

  • Use flatMap() to combine the flatten() and map() methods.
  • Use higher order messages to delegate straight to the getAdjustments() method on each order item, rather than having to create a closure and call the method within it.


In this scenario, each order item had three adjustments - the correct promotion, the incorrect one and the standard VAT addition. I wasn’t concerned about the VAT adjustment for debugging, so I used filter() to remove it based on the result of the adjustment’s getSourceId() method.

  ->filter(function (Adjustment $adjustment) {
    return $adjustment->getSourceId() != 'vat';
  })


Now that I have just the relevant adjustments, I want to be able to load each one and check its conditions. To do this, I need just the source IDs.

Again, I can use a higher order message to directly call getSourceId() on the adjustment and return its value to map().

  ->map->getSourceId();

This returns a Collection containing just the relevant promotion IDs being applied to the order that I can use for debugging.
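Putting the whole pipeline together, the debugging snippet ends up as something like this (a sketch; it assumes the collect() helper from the tightenco/collect library and Drupal Commerce's Adjustment class):

```php
use Drupal\commerce_order\Adjustment;

// Collect all non-VAT adjustment source IDs from the order's items.
$source_ids = collect($order->getItems())
  ->flatMap->getAdjustments()
  ->filter(function (Adjustment $adjustment) {
    return $adjustment->getSourceId() != 'vat';
  })
  ->map->getSourceId();
```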

Now just to find out why the incorrect promotion was applying!

Aug 23 2018

Since starting to work with Laravel as well as Drupal and Symfony, watching Adam Wathan’s Refactoring to Collections course as well as lessons on Laracasts, I’ve become a fan of Laravel’s Illuminate Collections and the object-orientated pipeline approach for interacting with PHP arrays.

In fact I’ve given a talk on using Collections outside Laravel and have written a Collection class module for Drupal 7.

I’ve also tweeted several examples of code that I’ve written within Drupal that use Collections, and I thought it would be good to collate them all here for reference.

Thanks again to Tighten for releasing and maintaining the tightenco/collect library that makes it possible to pull in Collections via Composer.

Putting @laravelphp's Collection class to good use, cleaning up some of my @drupal 8 code. Thanks @TightenCo for the Collect library! pic.twitter.com/Bn1UfudGvp

— Oliver Davies (@opdavies) August 18, 2017

Putting more @laravelphp Collections to work in my @drupal code today. pic.twitter.com/H8xDTT063X

— Oliver Davies (@opdavies) February 14, 2018

I knew that you could specify a property like 'price' in Twig and it would also look for methods like 'getPrice()', but I didn't know (or had maybe forgotten) that @laravelphp Collections does it too.

This means that these two Collections return the same result.

Nice! pic.twitter.com/2g2IfThzdy

— Oliver Davies (@opdavies) June 20, 2018

More @laravelphp Collection goodness, within my #Drupal8 project! pic.twitter.com/mWgpNbNIrh

— Oliver Davies (@opdavies) August 10, 2018

Some more #Drupal 8 fun with Laravel Collections. Loading the tags for a post and generating a formatted string of tweetable hashtags. pic.twitter.com/GbyiRPzIRo

— Oliver Davies (@opdavies) August 23, 2018

Aug 21 2018

I’ve been experimenting with moving some code to Drupal 8, and I’m quite intrigued by a different way that I’ve tried to structure it - using event subscribers, building on some of the takeaways from Drupal Dev Days.

Here is how this module is currently structured:

Note that there is no opdavies_blog.module file, and rather than calling actions from within a hook like opdavies_blog_entity_update(), each action becomes its own event subscriber class.

This means that there are no long hook_entity_update functions; instead there are descriptive, readable event subscriber class names, and simpler action code that is responsible for performing only one task. You’re also able to inject and autowire dependencies into the event subscriber classes as services - making dependency injection easier and cleaner, and making it simpler to write tests that mock dependencies when needed.

The additional events are provided by the Hook Event Dispatcher module.



# opdavies_blog.services.yml

services:
  opdavies_blog.send_tweet:
    class: Drupal\opdavies_blog\EventSubscriber\SendTweet
    autowire: true
    tags:
      - { name: event_subscriber }

Adding autowire: true is not required for the event subscriber to work. I’m using it to automatically inject any dependencies into the class rather than specifying them separately as arguments.
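For comparison, without autowiring the same dependencies would need to be declared explicitly as arguments - something like this (the service name and the '@example.twitter_client' argument are hypothetical):

```yaml
services:
  opdavies_blog.send_tweet:
    class: Drupal\opdavies_blog\EventSubscriber\SendTweet
    arguments:
      - '@example.twitter_client'
    tags:
      - { name: event_subscriber }
```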


namespace Drupal\opdavies_blog\EventSubscriber;

use Drupal\hook_event_dispatcher\Event\Entity\EntityUpdateEvent;
use Drupal\hook_event_dispatcher\HookEventDispatcherInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class SendTweet implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    return [
      HookEventDispatcherInterface::ENTITY_UPDATE => 'sendTweet',
    ];
  }

  public function sendTweet(EntityUpdateEvent $event) {
    // Perform checks and send the tweet.
  }

}

Aug 16 2018

Have you ever needed to have a 'special user' to perform tasks on your Drupal site, such as performing actions based on an API request, or for sending an internal site message?

If you just create a new user, how do you identify that user going forward? Do you hard-code the 'magic' user ID in your custom code? What if the user has a different ID on different environments of your site? You could declare it in each environment’s settings file and retrieve it from there, but what then if you need to do the same on another site? That would mean some duplication of code - and something that could have been abstracted and re-used.

I had to do this recently, and rather than just duplicate the code I decided to make it into its own module - which then became two modules.

System users

The System User module provides a re-usable, generic way to denote users as 'system users', which is not specific to a certain site or environment as this value is stored against each individual user in the database.

'System user' is a term used in Linux, which I thought also applies well to this scenario.

From https://www.ssh.com/iam/user/system-account:

A system account is a user account that is created by an operating system during installation and that is used for operating system defined purposes. System accounts often have predefined user ids. Examples of system accounts include the root account in Linux.

A system user isn’t an account that we’d expect a person to log in with and perform routine tasks like updating content, but rather for the system (site) to use to perform tasks like the earlier examples.

Declaring a user as a system user

System User module adds a base field to Drupal’s User entity, which determines whether or not each user is a system user - i.e. if this field is TRUE, that user is a system user. This means that users can easily be queried to identify which are system users, without having to rely on magic, environment and site specific user IDs. This also means that we can have multiple system users, if needed.
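Adding a boolean base field to an existing entity type can be done from a hook implementation along these lines (a sketch; the field name, label and hook prefix are assumptions, not necessarily the module's exact code):

```php
use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Implements hook_entity_base_field_info().
 */
function system_user_entity_base_field_info(EntityTypeInterface $entity_type) {
  if ($entity_type->id() === 'user') {
    // Defaults to FALSE, so regular users are not system users.
    $fields['system_user'] = BaseFieldDefinition::create('boolean')
      ->setLabel(t('System user'))
      ->setDefaultValue(FALSE);

    return $fields;
  }
}
```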


In the Drupal 8 version of the module, a SystemUser is a custom entity that contains its own create method for creating new system users. This is essentially a wrapper around User::create() that automatically sets the value of the system user field as part of the creation.
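The wrapper could look something like this (a simplified sketch - the real entity class and the 'system_user' field name may differ in the module itself):

```php
use Drupal\user\Entity\User;

class SystemUser {

  public static function create(array $values = []) {
    // Ensure the system user flag is always set on creation.
    $values['system_user'] = TRUE;

    return User::create($values);
  }

}
```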

The original intention was that system users would always be created manually in a custom install or update hook; however, since releasing the module, I’ve also added an install hook that automatically creates a new system user when the module is installed, basing the username on the site name.

There is also an open issue to add a Drush command to create a new system user, and I’d imagine I’ll also add a Drupal Console command too.

Retrieving system users

Whilst you could easily write your own query that retrieves users based on the value of the system user field, the module contains a SystemUserManager service with methods to do so. It also provides a static helper class that determines if a specified user is a system user by checking the value of the system user field.

// Retrieve the first system user.
$system_user = $this->systemUserManager->getFirst();

// Is the specified user a system user?
$is_system_user = SystemUserManager::isSystemUser($user);

But what do we return if there are no system users? You could return NULL or FALSE, but I decided to take a different approach, which became the second module.

Null users

The Null User module is an implementation of the null object pattern for users in Drupal 8. In this case, a NullUser is an extension of Drupal’s AnonymousUserSession, which means that it inherits sensible defaults to return for a non-existent User. Though, through inheritance, the id, getRoles and hasPermission methods are overridden to return relevant values.

use Drupal\Core\Session\AnonymousUserSession;

class NullUser extends AnonymousUserSession {
  // ...
}

Null User module is a dependency of System User in Drupal 8, so when no system user is found by the getFirst() method, a NullUser is returned. Whilst I could alternatively have returned NULL or FALSE, we would then need to check whether the returned value was an object or not before calling methods on it.

$system_user = $this->systemUserManager->getFirst(); // Returns NULL or FALSE.

// Need to check if a user was returned or not.
if (!$system_user) {
  return;
}

if ($system_user->isActive()) {
  // ...
}

Because instead we’re returning a NullUser, which through class inheritance has the same methods and properties as a regular user, there is no need to do the additional check as you will always receive a relevant object, and the expected methods will always be present.

$system_user = $this->systemUserManager->getFirst(); // Returns a NullUser.

if ($system_user->isActive()) {
  // ...
}

This means we have less code, which also is simpler and more readable.

System User module is the only one that I’m aware of that makes use of Null User, but I’ve added a list to the project page so let me know if you can think of any others.


Jun 04 2018

Within the Docksal documentation for Drupal settings, the example database settings include hard-coded credentials to connect to the Drupal database. For example, within a settings.php file, you could add this:

$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'db',
  'database' => 'myproject_db',
  'username' => 'myproject_user',
  'password' => 'myproject_pass',
];

Whilst this is fine, it does mean that there is duplication in the codebase, as the database credentials can also be added as environment variables within .docksal/docksal.env - this is definitely the case if you want to use a custom database name, for example.

Also if one of these values were to change, then Drupal wouldn't be aware of that and would no longer be able to connect to the database.

It also means that the file can’t simply be re-used on another project as it contains project-specific credentials.

We can improve this by using the environment variables within the settings file.

The relevant environment variables are MYSQL_DATABASE for the database name, and MYSQL_USER and MYSQL_PASSWORD for the MySQL username and password. These can be set in .docksal/docksal.env, and will need to be present for this to work.

For example:

MYSQL_DATABASE="myproject_db"
MYSQL_USER="myproject_user"
MYSQL_PASSWORD="myproject_pass"
With these in place, they can be referenced within the settings file using the getenv() function.

$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'db',
  'database' => getenv('MYSQL_DATABASE'),
  'username' => getenv('MYSQL_USER'),
  'password' => getenv('MYSQL_PASSWORD'),
];

Now the credentials are no longer duplicated, and the latest values from the environment variables will always be used.

However, you may see a message like this when you try and load the site:

Drupal\Core\Database\DatabaseAccessDeniedException: SQLSTATE[HY000][1045] Access denied for user ''@'' (using password: NO) in /var/www/core/lib/Drupal/Core/Database/Driver/mysql/Connection.php on line 156

If you see this, the environment variables aren’t being passed into Docksal’s cli container, so the values are not being populated. To enable them, edit .docksal/docksal.yml and add MYSQL_DATABASE, MYSQL_PASSWORD and MYSQL_USER to the environment section of the cli service.

version: '2.1'

services:
  cli:
    environment:
      - MYSQL_DATABASE
      - MYSQL_PASSWORD
      - MYSQL_USER

After changing this file, run fin start to rebuild the project containers and try to load the site again.

May 06 2018

This week I’ve started writing some custom commands for my Drupal projects that use Docksal, including one to easily run PHPUnit tests in Drupal 8. This is the process of how I created this command.

What is Docksal?

Docksal is a local Docker-based development environment for Drupal projects and other frameworks and CMSes. It is our standard tool for local environments for projects at Microserve.

There was a great talk recently at Drupaldelphia about Docksal.

Why write a custom command?

One of the things that Docksal offers (and is covered in the talk) is the ability to add custom commands to the Docksal’s fin CLI, either globally or as part of your project.

As an advocate of automated testing and a TDD practitioner, I write a lot of tests and run PHPUnit numerous times a day. I’ve also given talks and have written other posts on this site relating to testing in Drupal.

There are a couple of ways to run PHPUnit with Docksal. The first is to use fin bash to open a shell into the container, move into the docroot directory if needed, and run the phpunit command.

fin bash
cd /var/www/docroot
../vendor/bin/phpunit -c core modules/custom

Alternatively, it can be run from the host machine using fin exec.

cd docroot
fin exec '../vendor/bin/phpunit -c core modules/custom'

Both of these options require multiple steps as we need to be in the docroot directory where the Drupal code is located before the command can be run, and both have quite long commands to run PHPUnit itself - some of which is repeated every time.

By adding a custom command, I intend to:

  1. Make it easier to get set up to run PHPUnit tests - i.e. setting up a phpunit.xml file.
  2. Make it easier to run the tests that we’d written by shortening the command and making it so it can be run anywhere within our project.

I also hoped to make it project agnostic so that I could add it onto any project and immediately run it.

Creating the command

Each command is a file located within the .docksal/commands directory. The filename is the name of the command (e.g. phpunit) with no file extension.

To create the file, run this from the same directory where your .docksal directory is:

mkdir -p .docksal/commands
touch .docksal/commands/phpunit

This will create a new, empty .docksal/commands/phpunit file, and the phpunit command is now listed under "Custom commands" when we run fin.

You can write commands with any interpreter. I’m going to use bash, so I’ll add the shebang to the top of the file.

#!/usr/bin/env bash

With this in place, I can now run fin phpunit, though there is no output displayed or actions performed as the rest of the file is empty.

Adding a description and help text

Currently the description for our command when we run fin is the default "No description" text. I’d like to add something more relevant, so I’ll start by adding a new description.

fin interprets lines starting with ## as documentation - the first of which it uses as the description.

#!/usr/bin/env bash

## Run automated PHPUnit tests.

Now when I run it, I see the new description.

Any additional lines are used as help text with running fin help phpunit. Here I’ll add an example command to demonstrate how to run it as well as some more in-depth text about what the command will do.

#!/usr/bin/env bash

## Run automated PHPUnit tests.
## Usage: fin phpunit <args>
## If a core/phpunit.xml file does not exist, copy one from elsewhere.
## Then run the tests.

Now when I run fin help phpunit, I see the new help text.

Adding some content

Setting the target

As I want the commands to be run within Docksal’s "cli" container, I can specify that with exec_target. If one isn’t specified, the commands are run locally on the host machine.

#: exec_target = cli

Available variables

These variables are provided by fin and are available to use within any custom commands:

  • PROJECT_ROOT - The absolute path to the nearest .docksal directory.
  • DOCROOT - name of the docroot folder.
  • VIRTUAL_HOST - the virtual host name for the project. Such as myproject.docksal.
  • DOCKER_RUNNING - (string) "true" or "false".

Note: If the DOCROOT variable is not defined within the cli container, ensure that it’s added to the environment variables in .docksal/docksal.yml. For example:

version: "2.1"

services:
  cli:
    environment:
      - DOCROOT

Running phpunit

When you run the phpunit command, there are a number of options you can pass to it, such as --filter, --testsuite and --group, as well as the path to the tests to execute, such as modules/custom.

I wanted to still be able to do this by running fin phpunit <args> so the commands can be customised when executed. However, as the first half of the command (../vendor/bin/phpunit -c core) is consistent, I can wrap that within my custom command and not need to type it every time.

By using "$@" I can capture any additional arguments, such as the test directory path, and append them to the command to execute.

I’m using $PROJECT_ROOT to prefix the command with the absolute path to phpunit so that I don’t need to be in that directory when I run the custom command, and $DOCROOT to always enter the sub-directory where Drupal is located. In this case, it’s "docroot" though I also use "web" and I’ve seen various others used.


# If there is no phpunit.xml file, copy one from elsewhere.

# Otherwise run the tests.
${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"

For example, fin phpunit modules/custom would execute /var/www/vendor/bin/phpunit -c /var/www/docroot/core modules/custom within the container.

I can then wrap this within a condition so that the tests are only run when a phpunit.xml file exists, as it is required for them to run successfully.

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    # If there is no phpunit.xml file, copy one from elsewhere.
    :
else
    # Otherwise run the tests.
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
fi

Creating phpunit.xml - step 1

My first thought was that, if a phpunit.xml file didn’t exist, I would duplicate core’s phpunit.xml.dist file. However this isn’t enough to run the tests, as values such as SIMPLETEST_BASE_URL, SIMPLETEST_DB and BROWSERTEST_OUTPUT_DIRECTORY need to be populated.

As the tests wouldn't run at this point, I’ve exited early and displayed a message to the user to edit the new phpunit.xml file and run fin phpunit again.

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    echo "Copying ${DRUPAL_CORE_PATH}/phpunit.xml.dist to ${DRUPAL_CORE_PATH}/phpunit.xml."
    echo "Please edit its values as needed and re-run 'fin phpunit'."
    cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
    exit 1;
else
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
fi

However this isn’t as streamlined as I originally wanted as it still requires the user to perform an additional step before the tests can run.

Creating phpunit.xml - step 2

My second idea was to keep a pre-configured file within the project repository, and to copy that into the expected location. That approach would mean that the project specific values would already be populated, as well as any customisations made to the default settings. I decided on .docksal/drupal/core/phpunit.xml to be the potential location.

Also, if this file is copied then we can go ahead and run the tests straight away rather than needing to exit early.

If a pre-configured file doesn’t exist, then we can default back to copying phpunit.xml.dist.

To avoid duplication, I created a reusable run_tests() function so it could be executed in either scenario.

run_tests() {
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
}

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    if [ -e "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ]; then
        echo "Copying ${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml to ${DRUPAL_CORE_PATH}/phpunit.xml"
        cp "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ${DRUPAL_CORE_PATH}/phpunit.xml
        run_tests "$@"
    else
        echo "Copying ${DRUPAL_CORE_PATH}/phpunit.xml.dist to ${DRUPAL_CORE_PATH}/phpunit.xml."
        echo "Please edit its values as needed and re-run 'fin phpunit'."
        cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
        exit 1;
    fi
else
    run_tests "$@"
fi

This means that I can perform fewer steps and run a much shorter command compared to the original, and even if someone didn’t have a phpunit.xml file created, one could be copied into place and the tests run with only one command.

The finished file

#!/usr/bin/env bash

#: exec_target = cli

## Run automated PHPUnit tests.
## Usage: fin phpunit <args>
## If a core/phpunit.xml file does not exist, one is copied from
## .docksal/core/phpunit.xml if that file exists, or copied from the default
## core/phpunit.xml.dist file.

# The path to Drupal core within the docroot.
DOCROOT_PATH="${PROJECT_ROOT}/${DOCROOT}"
DRUPAL_CORE_PATH="${DOCROOT_PATH}/core"

run_tests() {
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
}

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    if [ -e "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ]; then
        echo "Copying ${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml to ${DRUPAL_CORE_PATH}/phpunit.xml"
        cp "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ${DRUPAL_CORE_PATH}/phpunit.xml
        run_tests "$@"
    else
        echo "Copying phpunit.xml.dist to phpunit.xml"
        echo "Please edit its values as needed and re-run 'fin phpunit'."
        cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
        exit 0;
    fi
else
    run_tests "$@"
fi

It’s currently available as a GitHub Gist, though I’m planning on moving it into a public GitHub repository either on my personal account or the Microserve organisation, for people to either use as examples or to download and use directly.

I’ve also started to add other commands to projects such as config-export to standardise the way to export configuration from Drupal 8, run Drupal 7 tests with SimpleTest, and compile front-end assets like CSS within custom themes.

I think it’s a great way to shorten existing commands, or to group multiple commands into one like in this case, and I can see a lot of other potential uses for it during local development and continuous integration. Also being able to run one command like fin init and have it set up everything for your project is very convenient and a big time saver!

Since writing this post, I’ve had a pull request accepted for this command to be added as a Docksal add-on. This means that the command can be added to any Docksal project by running fin addon install phpunit. It will be installed into the .docksal/addons/phpunit directory, and displayed under "Addons" rather than "Custom commands" when you run fin.


Mar 10 2018

Yay! You’ve written a new Drupal module, theme or installation profile as part of your site, and now you’ve decided to open source it and upload it to Drupal.org as a new contrib project. But how do you split it from the main site repository into its own?

Well, there are a couple of options.

Does it need to be part of the site repository?

An interesting thing to consider is, does it need to be a part of the site repository in the first place?

If you intend from the beginning to contribute the module, theme or distribution, and it’s written to be generic and re-usable from the start, then it could be created as a separate project on Drupal.org or as a private repository on your Git server from the beginning, and added as a dependency of the main project rather than being part of it. It could already have the correct branch name, adhere to the Drupal.org release conventions and be managed as a separate project, so there is no need to "clean it up" or split it from the main repo later at all.

This is how I worked at the Drupal Association - with all of the modules needed for Drupal.org hosted on Drupal.org itself, and managed as a dependency of the site repository with Drush Make.

Whether this is a viable option or not will depend on your processes. For example, if your code needs to go through a peer review process before releasing it, then pushing it straight to Drupal.org would either complicate that process or bypass it completely. Pushing it to a separate private repository may depend on your team's level of familiarity with Composer, for example.
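As a sketch, the site's composer.json could register that private repository and require the module from it. The package name and Git URL below are hypothetical:

```json
{
    "repositories": [
        {
            "type": "vcs",
            "url": "git@git.example.com:project/my_new_module.git"
        }
    ],
    "require": {
        "example/my_new_module": "^1.0"
    }
}
```

Composer would then clone the module from its own repository on each install, and the site repo only records the version constraint.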

It does, though, avoid the “we’ll clean it up and contribute it later” scenario, which probably happens less often than people intend.

Create a new, empty repository

If the project is already in the site repo, this is probably the most common method - to create a new, empty repository for the new project, add everything to it and push it.

For example:

cd web/modules/custom/my_new_module

# Create a new Git repository.
git init

# Add everything and make a new commit.
git add -A .
git commit -m 'Initial commit'

# Rename the branch.
git branch -m 8.x-1.x

# Add the new remote and push everything.
git remote add origin git@git.drupal.org:project/my_new_module.git
git push origin 8.x-1.x

There is a huge issue with this approach though - you now have only a single commit, and you’ve lost the commit history!

This means that you lose the story and context of how the project was developed, and what decisions and changes were made during the lifetime of the project so far. Also, if multiple people developed it, now there is only one person being attributed - the one who made the single new commit.

Also, if I’m considering adding your module to my project, personally I’m less likely to do so if I only see one "initial commit". I’d like to see the activity from the days, weeks or months prior to it being released.

What this approach does allow, though, is easily removing references to client names etc. before pushing the code.
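For example, a quick pre-push check could be a case-insensitive search for the client's name across the new repository. "acme" below is a placeholder, and the example file is only created to make the sketch self-contained:

```shell
#!/usr/bin/env bash

# Set up an example module directory containing a client reference.
mkdir -p /tmp/my_new_module && cd /tmp/my_new_module
echo "Integrates with the Acme Corp API." > README.md

# Case-insensitive search for the client name ("acme" is a placeholder),
# ignoring Git's own files.
if grep -ril "acme" . --exclude-dir=.git; then
    echo "Client references found - remove them before pushing."
fi
```

Any file names that the search prints should be cleaned up before making the initial commit.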

Use a subtree split

An alternative method is to use git-subtree, a Git command that can "merge subtrees together and split repository into subtrees". In this scenario, we can use split to take a directory from within the site repo and split it into its own separate repository, keeping the commit history intact.

Here is the description for the split command from the Git project itself:

Extract a new, synthetic project history from the history of the subtree. The new history includes only the commits (including merges) that affected &lt;prefix&gt;, and each of those commits now has the contents of &lt;prefix&gt; at the root of the project instead of in a subdirectory. Thus, the newly created history is suitable for export as a separate git repository.

Note: This command needs to be run at the top level of the repository. Otherwise you will see an error like "You need to run this command from the toplevel of the working tree.".

To find the path to the top level, run git rev-parse --show-toplevel.

In order to do this, you need to specify the prefix for the subtree (i.e. the directory that contains the project you’re splitting), as well as the name of a new branch to split onto.

git subtree split --prefix web/modules/custom/my_new_module -b split_my_new_module

When complete, you should see a confirmation message showing the branch name and the commit SHA of the branch.

Created branch 'split_my_new_module'

If you run git branch, you should now be able to see the new branch, and if you run git log --oneline split_my_new_module, you should only see commits for that module.
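The whole process can be reproduced in a throwaway repository to see what ends up on the split branch. All paths and commit messages below are examples:

```shell
#!/usr/bin/env bash
set -e

# Create a throwaway site repository for the demonstration.
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "you@example.com"
git config user.name "Example Author"

# One commit that touches the module, one that does not.
mkdir -p web/modules/custom/my_new_module
echo "name: My new module" > web/modules/custom/my_new_module/my_new_module.info.yml
git add -A
git commit -qm 'Add my_new_module'
echo "Site readme" > README.md
git add -A
git commit -qm 'Add site README'

# Split the module's history onto its own branch.
git subtree split --prefix web/modules/custom/my_new_module -b split_my_new_module

# Only the commit that affected the module appears on the new branch.
git log --oneline split_my_new_module
```

On the split branch, the module's files sit at the root of the tree, and the commit that only touched the site's README is absent.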

If you do need to tidy up a particular commit to remove client references etc, change a commit message or squash some commits together, then you can do that by checking out the new branch, running an interactive rebase and making the required amends.

git checkout split_my_new_module
git rebase -i --root

Once everything is in the desired state, you can use git push to push to the remote repo - specifying the repo URL, the local branch name and the remote branch name:

git push git@git.drupal.org:project/my_new_module.git split_my_new_module:8.x-1.x

In this case, the new branch will be 8.x-1.x.

Here is a screenshot of an example module that I’ve split and pushed to GitLab. Notice that there are multiple commits in the history, each still attributed to its original author.

Screenshot of a split project repo on GitLab

Also, as this is standard Git functionality, you can follow the same process to extract PHP libraries, Symfony bundles, WordPress plugins or anything else.
