May 20 2020

I recently finished porting this website from a static site generator to Drupal 8, meaning that this site has now been powered by three different major versions of Drupal (6, 7 and 8) as well as by two static site generators since it was first launched in early 2010.

The majority of the content was imported using migrations from JSON feeds that I created. This included:

  • Blog tags
  • Blog posts
  • Talks
  • Redirects

In some follow-up posts, I'll look at each migration separately, describing any issues encountered and how it was used to import its respective content.

I'll update this post with the links to the follow-up posts, and they are also available from the blog series' page.

Apr 24 2020

I'm happy to be presenting two talks remotely at this year's CMS Philly conference (formerly Drupaldelphia).

The first talk is Deploying PHP applications with Ansible, Ansible Vault and Ansistrano at 1pm (6pm UK time), where I'll give an introduction to Ansible and show how to use Ansistrano to deploy a Drupal 8 application.

The second talk is Taking Flight with Tailwind CSS at 2pm (7pm UK time) where I'll show how to configure and use Tailwind CSS.

CMS Philly is happening virtually on Friday, May 1st via GoToWebinar.

Apr 22 2020

Some time ago, I announced that I was planning on writing a book on automated testing and test driven development with Drupal. I created a landing page and set up a mailing list, but I wasn't sure at that point what I was going to cover or create as part of the book.

I'm going to write a book on automated testing in Drupal. Join the mailing list for updates, and I'm happy to take suggestions on what to cover. https://t.co/YXNpe6f8Ft #drupal

— Oliver Davies (@opdavies) May 15, 2018

As a meetup and DrupalCamp organiser, after some thought I decided to build a website for an example conference, with some of its code to be included in the book as example content. This seemed to cover most of what I originally wanted, through features like a call for papers for potential speakers to propose sessions, allowing organisers to administer and moderate those proposals, automatically sending notification emails to submitters, and displaying the accepted sessions.

I've started building it with Drupal 8.8, and access to it is now available to purchase on Gitstore, including all future updates as I continue building the application - adding new features and upgrading to Drupal 9 once it is released. There are some other interesting things there too, such as using feature flags to enable or disable functionality, and using GitHub Actions to run the tests automatically.

I've added a page for the book itself on Leanpub, and I'll continue to add content to it in parallel to building the example codebase. Once there is enough content, I will release the first draft for purchase.

For any purchases made via Gitstore or Leanpub, an amount will be donated to the Drupal Association and the #DrupalCares campaign to help sustain the Association during COVID-19.

Feb 04 2020

How to use the PSR-4 autoloading standard for Drupal 7 Simpletest test cases.

The Traditional Way

The typical way of including test cases in Drupal 7 is to add one or more classes within a .test file - e.g. opdavies.test. This file would typically include all of the different test cases for that module, and would be placed in the root of the module’s directory alongside the .info and .module files.

In order to load the files, each file would need to be declared within the .info file for the module.

There is a convention that if you have multiple tests for your project, these can be split into different files and grouped within a tests directory.

; Load a test file at the root of the module
files[] = opdavies.test

; Load a test file from within a subdirectory
files[] = tests/foo.test
files[] = tests/bar.test

Using the xautoload Module

Whilst splitting tests into separate files makes things more organised, each file needs to be loaded separately. This can be made simpler by using the Xautoload module, which supports wildcards when declaring files.

files[] = tests/**/*.test

This would load all of the .test files within the tests directory.

Using PSR-4 Autoloading

Another option is to use PSR-4 (or PSR-0) autoloading.

This should be a lot more familiar to those who have worked with Drupal 8, Symfony etc. It means that each test case is in its own file, which is cleaner; files have the more standard .php extension; and the name of each file matches the name of the test class for consistency.

To do this, create a src/Tests (PSR-4) or lib/Drupal/{module_name}/Tests (PSR-0) directory within your module, and then add or move your test cases there. Add the appropriate namespace for your module, and ensure that DrupalWebTestCase or DrupalUnitTestCase is referenced from the global namespace (e.g. \DrupalWebTestCase).

// src/Tests/Functional/OliverDaviesTest.php

namespace Drupal\opdavies\Tests\Functional;

class OliverDaviesTest extends \DrupalWebTestCase {
  // ...
}

This also supports subdirectories, so you can group classes within Functional and Unit directories if you like.

If you want to see a real-world example, see the Drupal 7 branch of the Override Node Options module.

Digging into the simpletest_test_get_all function

This is the code within simpletest.module that makes this work:

// simpletest_test_get_all()

// ...

$module_dir = DRUPAL_ROOT . '/' . dirname($filename);

// Search both the 'lib/Drupal/mymodule' directory (for PSR-0 classes)
// and the 'src' directory (for PSR-4 classes).
foreach (array(
  'lib/Drupal/' . $name,
  'src',
) as $subdir) {

  // Build directory in which the test files would reside.
  $tests_dir = $module_dir . '/' . $subdir . '/Tests';

  // Scan it for test files if it exists.
  if (is_dir($tests_dir)) {
    $files = file_scan_directory($tests_dir, '/.*\\.php/');
    if (!empty($files)) {
      foreach ($files as $file) {

        // Convert the file name into the namespaced class name.
        $replacements = array(
          '/' => '\\',
          $module_dir . '/' => '',
          'lib/' => '',
          'src/' => 'Drupal\\' . $name . '\\',
          '.php' => '',
        );
        $classes[] = strtr($file->uri, $replacements);
      }
    }
  }
}

It looks for the tests directory (src/Tests or lib/Drupal/{module_name}/Tests) within the module, and then finds any .php files within it. It then converts each file name into the fully qualified (namespaced) class name and loads it automatically.

Running the Tests

You can still run the tests from within the Simpletest UI, or from the command line using run-tests.sh.

If you want to run a specific test case using the --class option, you will now need to include the fully qualified name.

php scripts/run-tests.sh --class Drupal\\opdavies\\Tests\\Functional\\OliverDaviesTest

Oct 24 2018

Today I found another instance where I decided to use Illuminate Collections within my Drupal 8 code, whilst debugging an issue where a Drupal Commerce promotion was incorrectly being applied to an order.

No adjustments were showing in the Drupal UI for that order, so after some initial investigation and finding that $order->getAdjustments() was empty, I determined that I would need to get the adjustments from each order item within the order.

If the order were an array, this is how it would be structured in this situation:

$order = [
  'id' => 1,
  'items' => [
    [
      'id' => 1,
      'adjustments' => [
        ['name' => 'Adjustment 1'],
        ['name' => 'Adjustment 2'],
        ['name' => 'Adjustment 3'],
      ]
    ],
    [
      'id' => 2,
      'adjustments' => [
        ['name' => 'Adjustment 4'],
      ]
    ],
    [
      'id' => 3,
      'adjustments' => [
        ['name' => 'Adjustment 5'],
        ['name' => 'Adjustment 6'],
      ]
    ],
  ],
];

Getting the order items

I started by using $order->getItems() to load the order’s items, converted them into a Collection, and used the Collection’s pipe() method and the dump() function provided by the Devel module to output the order items.

collect($order->getItems())
  ->pipe(function (Collection $collection) {
    dump($collection);
  });

Get the order item adjustments

Now that we have a Collection of order items, we need to get the adjustments for each item. We can do this with map(), calling getAdjustments() on each order item.

This would return a Collection of arrays, with each array containing its own adjustments, so we can use flatten() to collapse all the adjustments into a single-dimensional array.

collect($order->getItems())
  ->map(function (OrderItem $order_item) {
    return $order_item->getAdjustments();
  })
  ->flatten(1);

There are a couple of refactors that we can do here though:

  • Use flatMap() to combine the flatten() and map() methods.
  • Use higher order messages to delegate straight to the getAdjustments() method on each order item, rather than having to create a closure and call the method within it.

collect($order->getItems())
  ->flatMap->getAdjustments();

Filtering

In this scenario, each order item had three adjustments - the correct promotion, the incorrect one and the standard VAT addition. I wasn’t concerned about the VAT adjustment for debugging, so I used filter() to remove it based on the result of the adjustment’s getSourceId() method.

collect($order->getItems())
  ->flatMap->getAdjustments()
  ->filter(function (Adjustment $adjustment) {
    return $adjustment->getSourceId() != 'vat';
  });

Conclusion

Now that I have just the relevant adjustments, I want to be able to load each one and check its conditions. To do this, I need just the source IDs.

Again, I can use a higher order message to directly call getSourceId() on each adjustment and return its value to map().

collect($order->getItems())
  ->flatMap->getAdjustments()
  ->filter(function (Adjustment $adjustment) {
    return $adjustment->getSourceId() != 'vat';
  })
  ->map->getSourceId();

This returns a Collection containing just the relevant promotion IDs being applied to the order that I can use for debugging.

Now just to find out why the incorrect promotion was applying!

Aug 23 2018

Since starting to work with Laravel as well as Drupal and Symfony, and watching Adam Wathan’s Refactoring to Collections course as well as lessons on Laracasts, I’ve become a fan of Laravel’s Illuminate Collections and the object-orientated pipeline approach for interacting with PHP arrays.

In fact I’ve given a talk on using Collections outside Laravel and have written a Collection class module for Drupal 7.

I’ve also tweeted several examples of code that I’ve written within Drupal that use Collections, and I thought it would be good to collate them all here for reference.

Thanks again to Tighten for releasing and maintaining the tightenco/collect library that makes it possible to pull in Collections via Composer.
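
To illustrate the style, here is a small example of my own (not taken from the tweets below; the tag values are made up), similar to the hashtag generation mentioned in the final tweet:

// Requires the tightenco/collect package, which provides the collect() helper.
// Build a formatted string of tweetable hashtags from an array of tag names.
$hashtags = collect(['php', 'drupal', 'drupal-planet'])
  ->map(function ($tag) {
    // Remove hyphens and prefix each tag with '#'.
    return '#' . str_replace('-', '', $tag);
  })
  ->implode(' ');

// $hashtags is now '#php #drupal #drupalplanet'.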

Putting @laravelphp's Collection class to good use, cleaning up some of my @drupal 8 code. Thanks @TightenCo for the Collect library! pic.twitter.com/Bn1UfudGvp

— Oliver Davies (@opdavies) August 18, 2017

Putting more @laravelphp Collections to work in my @drupal code today. pic.twitter.com/H8xDTT063X

— Oliver Davies (@opdavies) February 14, 2018

I knew that you could specify a property like 'price' in Twig and it would also look for methods like 'getPrice()', but I didn't know (or had maybe forgotten) that @laravelphp Collections does it too.

This means that these two Collections return the same result.

Nice! pic.twitter.com/2g2IfThzdy

— Oliver Davies (@opdavies) June 20, 2018

More @laravelphp Collection goodness, within my #Drupal8 project! pic.twitter.com/mWgpNbNIrh

— Oliver Davies (@opdavies) August 10, 2018

Some more #Drupal 8 fun with Laravel Collections. Loading the tags for a post and generating a formatted string of tweetable hashtags. pic.twitter.com/GbyiRPzIRo

— Oliver Davies (@opdavies) August 23, 2018

Aug 21 2018

I’ve been experimenting with moving some code to Drupal 8, and I’m quite intrigued by a different way that I’ve tried to structure it - using event subscribers, building on some of the takeaways from Drupal Dev Days.

Here is how this module is currently structured:

Note that there is no opdavies_blog.module file; rather than calling actions from within a hook like opdavies_blog_entity_update(), each action becomes its own event subscriber class.

This means that there are no long hook_entity_update() functions. Instead, there are descriptive, readable event subscriber class names and simpler action code that is responsible for performing only one task. You’re also able to inject and autowire dependencies into the event subscriber classes as services, making it easier and cleaner to use dependency injection, and simpler to write tests that mock dependencies when needed.

The additional events are provided by the Hook Event Dispatcher module.

Code

opdavies_blog.services.yml:

services:
  Drupal\opdavies_blog\EventSubscriber\PostToMedium:
    autowire: true
    tags:
      - { name: event_subscriber }

  Drupal\opdavies_blog\EventSubscriber\SendTweet:
    autowire: true
    tags:
      - { name: event_subscriber }

Adding autowire: true is not required for the event subscriber to work. I’m using it to automatically inject any dependencies into the class rather than specifying them separately as arguments.

src/EventSubscriber/SendTweet.php:

namespace Drupal\opdavies_blog\EventSubscriber;

use Drupal\hook_event_dispatcher\Event\Entity\EntityUpdateEvent;
use Drupal\hook_event_dispatcher\HookEventDispatcherInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class SendTweet implements EventSubscriberInterface {

  ...

  public static function getSubscribedEvents() {
    return [
      HookEventDispatcherInterface::ENTITY_UPDATE => 'sendTweet',
    ];
  }

  public function sendTweet(EntityUpdateEvent $event) {
    // Perform checks and send the tweet.
  }

}
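
As an illustration of what autowiring enables, a dependency can be injected just by type-hinting it in the constructor. The service below is a hypothetical example rather than one used by the actual class:

use Drupal\Core\Entity\EntityTypeManagerInterface;

class SendTweet implements EventSubscriberInterface {

  private $entityTypeManager;

  // With autowire: true, this argument is resolved from the container
  // automatically, without listing it under "arguments" in the services file.
  public function __construct(EntityTypeManagerInterface $entityTypeManager) {
    $this->entityTypeManager = $entityTypeManager;
  }

  // ...

}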

Aug 16 2018

Have you ever needed to have a 'special user' to perform tasks on your Drupal site, such as performing actions based on an API request, or for sending an internal site message?

If you just create a new user, how do you identify that user going forward? Do you hard-code the 'magic' user ID in your custom code? What if the user has a different ID on different environments of your site? You could declare it in each environment’s settings file and retrieve it from there, but what then if you need to do the same on another site? That would mean duplicating code - something that could have been abstracted and re-used.
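
For example, the per-environment approach could look something like this in Drupal 8 - the setting name here is hypothetical:

// In each environment's settings.php file, declare the 'magic' user ID.
$settings['mymodule_special_user_id'] = 123;

// In custom code, retrieve the ID and load the user.
$uid = \Drupal\Core\Site\Settings::get('mymodule_special_user_id');
$user = \Drupal\user\Entity\User::load($uid);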

I had to do this recently, and rather than just duplicate the code, I decided to make it into its own module - which then became two modules.

System users

The System User module provides a re-usable, generic way to denote users as 'system users', which is not specific to a certain site or environment, as this value is stored against each individual user in the database.

'System user' is a term used in Linux, which I thought also applies well to this scenario.

From https://www.ssh.com/iam/user/system-account:

A system account is a user account that is created by an operating system during installation and that is used for operating system defined purposes. System accounts often have predefined user IDs. Examples of system accounts include the root account in Linux.

A system user isn’t an account that we’d expect a person to log in with and perform routine tasks like updating content, but rather for the system (site) to use to perform tasks like the earlier examples.

Declaring a user as a system user

System User module adds a base field to Drupal’s User entity, which determines whether or not each user is a system user - i.e. if this field is TRUE, that user is a system user. This means that users can easily be queried to identify which are system users, without having to rely on magic, environment and site specific user IDs. This also means that we can have multiple system users, if needed.
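
As a minimal sketch, a boolean base field like this could be added via hook_entity_base_field_info() - the field name and definition here are illustrative rather than the module's actual code:

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Implements hook_entity_base_field_info().
 */
function mymodule_entity_base_field_info(EntityTypeInterface $entity_type) {
  if ($entity_type->id() === 'user') {
    // Add a boolean field that marks an account as a system user.
    $fields['system_user'] = BaseFieldDefinition::create('boolean')
      ->setLabel(t('System user'))
      ->setDescription(t('Whether this account is a system user.'))
      ->setDefaultValue(FALSE);

    return $fields;
  }
}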

In the Drupal 8 version of the module, a SystemUser is a custom entity that contains its own create method for creating new system users. This is essentially a wrapper around User::create() that automatically sets the value of the system user field as part of the creation.
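
Conceptually, the create method is a thin wrapper that does something like the following - the field and property names are assumptions for illustration:

use Drupal\user\Entity\User;

// Set the system user flag before deferring to User::create().
$system_user = User::create([
  'name' => 'system.user',
  'system_user' => TRUE,
]);
$system_user->save();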

The original intention was that system users would always be created manually in a custom install or update hook. However, since releasing the module, I’ve also added an install hook that automatically creates a new system user when the module is installed, basing the username on the site name.

There is also an open issue to add a Drush command to create a new system user, and I’d imagine I’ll also add a Drupal Console command too.

Retrieving system users

Whilst you could easily write your own query that retrieves users based on the value of the system user field, the module contains a SystemUserManager service with methods to do so. It also provides a static helper class that determines whether a specified user is a system user by checking the value of the system user field.

// Retrieve the first system user.
$system_user = $this->systemUserManager->getFirst();

// Is the specified user a system user?
$is_system_user = SystemUserManager::isSystemUser($user);

But what do we return if there are no system users? You could return NULL or FALSE, but I decided to take a different approach, which became the second module.

Null users

The Null User module is an implementation of the null object pattern for users in Drupal 8. In this case, a NullUser is an extension of Drupal’s AnonymousUserSession, which means that it inherits sensible defaults to return for a non-existent user. Through inheritance, the id, getRoles and hasPermission methods are overridden to return relevant values.

use Drupal\Core\Session\AnonymousUserSession;

class NullUser extends AnonymousUserSession {
  ...
}
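
For illustration, the overrides might look something like this - the exact return values are assumptions, so check the module for the actual implementation:

use Drupal\Core\Session\AnonymousUserSession;

class NullUser extends AnonymousUserSession {

  public function id() {
    // A null user never has a real user ID.
    return 0;
  }

  public function getRoles($exclude_locked_roles = FALSE) {
    // A null user has no roles.
    return [];
  }

  public function hasPermission($permission) {
    // A null user is never granted any permissions.
    return FALSE;
  }

}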

Null User module is a dependency of System User in Drupal 8, so when no system user is found by the getFirst() method, a NullUser is returned. Whilst I could alternatively have returned NULL or FALSE, we would then need to check whether the returned value was an object before calling methods on it.

$system_user = $this->systemUserManager->getFirst(); // Returns NULL or FALSE.

// Need to check if a user was returned or not.
if (!$system_user) {
  return;
}

if ($system_user->isActive()) {
  ...
}

Because instead we’re returning a NullUser, which through class inheritance has the same methods and properties as a regular user, there is no need for the additional check, as you will always receive a relevant object and the expected methods will always be present.

$system_user = $this->systemUserManager->getFirst(); // Returns a NullUser.

if ($system_user->isActive()) {
  ...
}

This means we have less code, which is also simpler and more readable.

System User module is the only one that I’m aware of that makes use of Null User, but I’ve added a list to the project page so let me know if you can think of any others.


Jun 04 2018

Within the Docksal documentation for Drupal settings, the example database settings include hard-coded credentials to connect to the Drupal database. For example, within a settings.php file, you could add this:

$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'db',
  'database' => 'myproject_db',
  'username' => 'myproject_user',
  'password' => 'myproject_pass',
];

Whilst this is fine, it does mean that there is duplication in the codebase, as the database credentials can also be added as environment variables within .docksal/docksal.env - which is definitely the case if you want to use a custom database name, for example.

Also if one of these values were to change, then Drupal wouldn't be aware of that and would no longer be able to connect to the database.

It also means that the file can’t simply be re-used on another project as it contains project-specific credentials.

We can improve this by using the environment variables within the settings file.

The relevant environment variables are MYSQL_DATABASE for the database name, and MYSQL_USER and MYSQL_PASSWORD for the MySQL username and password. These can be set in .docksal/docksal.env, and will need to be present for this to work.

For example:

DOCKSAL_STACK=default
MYSQL_DATABASE=myproject_db
MYSQL_USER=myproject_user
MYSQL_PASSWORD=myproject_pass

With these in place, they can be referenced within the settings file using the getenv() function.

$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'db',
  'database' => getenv('MYSQL_DATABASE'),
  'username' => getenv('MYSQL_USER'),
  'password' => getenv('MYSQL_PASSWORD'),
];

Now the credentials are no longer duplicated, and the latest values from the environment variables will always be used.

However, you may see a message like this when you try and load the site:

Drupal\Core\Database\DatabaseAccessDeniedException: SQLSTATE[HY000][1045] Access denied for user ''@'172.19.0.4' (using password: NO) in /var/www/core/lib/Drupal/Core/Database/Driver/mysql/Connection.php on line 156

If you see this, the environment variables aren’t being passed into Docksal’s cli container, so the values are not being populated. To enable them, edit .docksal/docksal.yml and add MYSQL_DATABASE, MYSQL_PASSWORD and MYSQL_USER to the environment section of the cli service.

version: '2.1'
services:
  cli:
    environment:
      - MYSQL_DATABASE
      - MYSQL_PASSWORD
      - MYSQL_USER

After changing this file, run fin start to rebuild the project containers and try to load the site again.

May 06 2018

This week I’ve started writing some custom commands for my Drupal projects that use Docksal, including one to easily run PHPUnit tests in Drupal 8. This post describes the process of creating that command.

What is Docksal?

Docksal is a local Docker-based development environment for Drupal projects and other frameworks and CMSes. It is our standard tool for local environments for projects at Microserve.

There was a great talk recently at Drupaldelphia about Docksal.

Why write a custom command?

One of the things that Docksal offers (and is covered in the talk) is the ability to add custom commands to Docksal’s fin CLI, either globally or as part of your project.

As an advocate of automated testing and a TDD practitioner, I write a lot of tests and run PHPUnit numerous times a day. I’ve also given talks and written other posts on this site relating to testing in Drupal.

There are a couple of ways to run PHPUnit with Docksal. The first is to use fin bash to open a shell into the container, move into the docroot directory if needed, and run the phpunit command.

fin bash
cd /var/www/docroot
../vendor/bin/phpunit -c core modules/custom

Alternatively, it can be run from the host machine using fin exec.

cd docroot
fin exec '../vendor/bin/phpunit -c core modules/custom'

Both of these options require multiple steps as we need to be in the docroot directory where the Drupal code is located before the command can be run, and both have quite long commands to run PHPUnit itself - some of which is repeated every time.

By adding a custom command, I intend to:

  1. Make it easier to get set up to run PHPUnit tests - i.e. setting up a phpunit.xml file.
  2. Make it easier to run the tests that we’d written by shortening the command and making it so it can be run anywhere within our project.

I also hoped to make it project agnostic so that I could add it onto any project and immediately run it.

Creating the command

Each command is a file located within the .docksal/commands directory. The filename is the name of the command (e.g. phpunit) with no file extension.

To create the file, run this from the same directory where your .docksal directory is:

mkdir -p .docksal/commands
touch .docksal/commands/phpunit

This will create a new, empty .docksal/commands/phpunit file, and the phpunit command is then listed under "Custom commands" when we run fin.

You can write commands with any interpreter. I’m going to use bash, so I’ll add the shebang to the top of the file.

#!/usr/bin/env bash

With this in place, I can now run fin phpunit, though there is no output displayed or actions performed as the rest of the file is empty.

Adding a description and help text

Currently the description for our command when we run fin is the default "No description" text. I’d like to add something more relevant, so I’ll start by adding a new description.

fin interprets lines starting with ## as documentation - the first of which it uses as the description.

#!/usr/bin/env bash

## Run automated PHPUnit tests.

Now when I run it, I see the new description.

Any additional lines are used as help text when running fin help phpunit. Here I’ll add an example command to demonstrate how to run it, as well as some more in-depth text about what the command will do.

#!/usr/bin/env bash

## Run automated PHPUnit tests.
##
## Usage: fin phpunit <args>
##
## If a core/phpunit.xml file does not exist, copy one from elsewhere.
## Then run the tests.

Now when I run fin help phpunit, I see the new help text.

Adding some content

Setting the target

As I want the commands to be run within Docksal’s "cli" container, I can specify that with exec_target. If one isn’t specified, the commands are run locally on the host machine.

#: exec_target = cli

Available variables

These variables are provided by fin and are available to use within any custom commands:

  • PROJECT_ROOT - the absolute path to the nearest .docksal directory.
  • DOCROOT - the name of the docroot folder.
  • VIRTUAL_HOST - the virtual host name for the project, such as myproject.docksal.
  • DOCKER_RUNNING - (string) "true" or "false".

Note: If the DOCROOT variable is not defined within the cli container, ensure that it’s added to the environment variables in .docksal/docksal.yml. For example:

version: "2.1"

services:
  cli:
    environment:
      - DOCROOT

Running phpunit

When you run the phpunit command, there are a number of options you can pass to it, such as --filter, --testsuite and --group, as well as the path to the tests to execute, such as modules/custom.

I wanted to still be able to do this by running fin phpunit <args> so the commands can be customised when executed. However, as the first half of the command (../vendor/bin/phpunit -c core) is consistent, I can wrap that within my custom command and not need to type it every time.

By using "$@" I can capture any additional arguments, such as the test directory path, and append them to the command to execute.

I’m using $PROJECT_ROOT to prefix the command with the absolute path to phpunit so that I don’t need to be in that directory when I run the custom command, and $DOCROOT to always enter the sub-directory where Drupal is located. In this case, it’s "docroot" though I also use "web" and I’ve seen various others used.

DOCROOT_PATH="${PROJECT_ROOT}/${DOCROOT}"
DRUPAL_CORE_PATH="${DOCROOT_PATH}/core"

# If there is no phpunit.xml file, copy one from elsewhere.

# Otherwise run the tests.
${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"

For example, fin phpunit modules/custom would execute /var/www/vendor/bin/phpunit -c /var/www/docroot/core modules/custom within the container.

I can then wrap this within a condition so that the tests are only run when a phpunit.xml file exists, as it is required for them to run successfully.

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    # If there is no phpunit.xml file, copy one from elsewhere.
else
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
fi

Creating phpunit.xml - step 1

My first thought was that, if a phpunit.xml file doesn’t exist, the command could duplicate core’s phpunit.xml.dist file. However this isn’t enough to run the tests, as values such as SIMPLETEST_BASE_URL, SIMPLETEST_DB and BROWSERTEST_OUTPUT_DIRECTORY need to be populated.

As the tests wouldn’t run at this point, I exit early and display a message telling the user to edit the new phpunit.xml file and run fin phpunit again.

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    echo "Copying ${DRUPAL_CORE_PATH}/phpunit.xml.dist to ${DRUPAL_CORE_PATH}/phpunit.xml."
    echo "Please edit its values as needed and re-run 'fin phpunit'."
    cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
    exit 1;
else
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
fi

However this isn’t as streamlined as I originally wanted as it still requires the user to perform an additional step before the tests can run.

Creating phpunit.xml - step 2

My second idea was to keep a pre-configured file within the project repository, and to copy that into the expected location. That approach would mean that the project specific values would already be populated, as well as any customisations made to the default settings. I decided on .docksal/drupal/core/phpunit.xml to be the potential location.

Also, if this file is copied then we can go ahead and run the tests straight away rather than needing to exit early.

If a pre-configured file doesn’t exist, then we can default back to copying phpunit.xml.dist.

To avoid duplication, I created a reusable run_tests() function so it could be executed in either scenario.

run_tests() {
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
}

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    if [ -e "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ]; then
        echo "Copying ${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml to ${DRUPAL_CORE_PATH}/phpunit.xml"
        cp "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ${DRUPAL_CORE_PATH}/phpunit.xml
        run_tests "$@"
    else
        echo "Copying ${DRUPAL_CORE_PATH}/phpunit.xml.dist to ${DRUPAL_CORE_PATH}/phpunit.xml."
        echo "Please edit its values as needed and re-run 'fin phpunit'."
        cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
        exit 1;
    fi
else
    run_tests "$@"
fi

This means that I can perform fewer steps and run a much shorter command compared to the original, and even if someone didn’t have a phpunit.xml file created, they could have one copied into place and tests running with only one command.

The finished file

#!/usr/bin/env bash

#: exec_target = cli

## Run automated PHPUnit tests.
##
## Usage: fin phpunit <args>
##
## If a core/phpunit.xml file does not exist, one is copied from
## .docksal/drupal/core/phpunit.xml if that file exists, or from the default
## core/phpunit.xml.dist file.

DOCROOT_PATH="${PROJECT_ROOT}/${DOCROOT}"
DRUPAL_CORE_PATH="${DOCROOT_PATH}/core"

run_tests() {
    ${PROJECT_ROOT}/vendor/bin/phpunit -c ${DRUPAL_CORE_PATH} "$@"
}

if [ ! -e ${DRUPAL_CORE_PATH}/phpunit.xml ]; then
    if [ -e "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ]; then
        echo "Copying ${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml to ${DRUPAL_CORE_PATH}/phpunit.xml"
        cp "${PROJECT_ROOT}/.docksal/drupal/core/phpunit.xml" ${DRUPAL_CORE_PATH}/phpunit.xml
        run_tests "$@"
    else
        echo "Copying phpunit.xml.dist to phpunit.xml"
        echo "Please edit its values as needed and re-run 'fin phpunit'."
        cp ${DRUPAL_CORE_PATH}/phpunit.xml.dist ${DRUPAL_CORE_PATH}/phpunit.xml
        exit 1;
    fi
else
    run_tests "$@"
fi

It’s currently available as a GitHub Gist, though I’m planning on moving it into a public GitHub repository either on my personal account or the Microserve organisation, for people to either use as examples or to download and use directly.

I’ve also started to add other commands to projects such as config-export to standardise the way to export configuration from Drupal 8, run Drupal 7 tests with SimpleTest, and compile front-end assets like CSS within custom themes.

I think it’s a great way to shorten existing commands, or to group multiple commands into one like in this case, and I can see a lot of other potential uses for it during local development and continuous integration. Also being able to run one command like fin init and have it set up everything for your project is very convenient and a big time saver!

Since writing this post, I’ve had a pull request accepted for this command to be added as a Docksal add-on. This means that the command can be added to any Docksal project by running fin addon install phpunit. It will be installed into the .docksal/addons/phpunit directory, and displayed under "Addons" rather than "Custom commands" when you run fin.


Mar 10 2018

Yay! You’ve written a new Drupal module, theme or installation profile as part of your site, and now you’ve decided to open source it and upload it to Drupal.org as a new contrib project. But how do you split it from the main site repository into its own?

Well, there are a couple of options.

Does it need to be part of the site repository?

An interesting thing to consider is, does it need to be a part of the site repository in the first place?

If you intend from the beginning to contribute the module, theme or distribution, and it’s written as generic and re-usable from the start, then it could be created as a separate project on Drupal.org or as a private repository on your Git server from the beginning, and added as a dependency of the main project rather than being part of it. It could already have the correct branch name, adhere to the Drupal.org release conventions and be managed as a separate project - then there is no later need to "clean it up" or split it from the main repo at all.

This is how I worked at the Drupal Association - with all of the modules needed for Drupal.org hosted on Drupal.org itself, and managed as a dependency of the site repository with Drush Make.

Whether this is a viable option or not will depend on your processes. For example, if your code needs to go through a peer review process before releasing it, then pushing it straight to Drupal.org would either complicate that process or bypass it completely. Pushing it to a separate private repository may depend on your team's level of familiarity with Composer, for example.

It does though avoid the “we’ll clean it up and contribute it later” scenario, which probably happens less often than people intend.

Create a new, empty repository

If the project is already in the site repo, this is probably the most common method - to create a new, empty repository for the new project, add everything to it and push it.

For example:

cd web/modules/custom/my_new_module

# Create a new Git repository.
git init

# Add everything and make a new commit.
git add -A .
git commit -m 'Initial commit'

# Rename the branch.
git branch -m 8.x-1.x

# Add the new remote and push everything.
git remote add origin [email protected]:project/my_new_module.git
git push origin 8.x-1.x

There is a huge issue with this approach though - you now have only a single commit, and you’ve lost the commit history!

This means that you lose the story and context of how the project was developed, and what decisions and changes were made during the lifetime of the project so far. Also, if multiple people developed it, now there is only one person being attributed - the one who made the single new commit.

Also, if I’m considering adding your module to my project, personally I’m less likely to do so if I only see one "initial commit". I’d like to see the activity from the days, weeks or months prior to it being released.

What this does allow though is to easily remove references to client names etc before pushing the code.

Use a subtree split

An alternative method is to use git-subtree, a Git command that "merges subtrees together and split repository into subtrees". In this scenario, we can use split to take a directory from within the site repo and split it into its own separate repository, keeping the commit history intact.

Here is the description for the split command from the Git project itself:

Extract a new, synthetic project history from the history of the subtree. The new history includes only the commits (including merges) that affected <prefix>, and each of those commits now has the contents of <prefix> at the root of the project instead of in a subdirectory. Thus, the newly created history is suitable for export as a separate git repository.

Note: This command needs to be run at the top level of the repository. Otherwise you will see an error like "You need to run this command from the toplevel of the working tree.".

To find the path to the top level, run git rev-parse --show-toplevel.

In order to do this, you need to specify the prefix for the subtree (i.e. the directory that contains the project you’re splitting) as well as the name of a new branch that you want to split onto.

git subtree split --prefix web/modules/custom/my_new_module -b split_my_new_module

When complete, you should see a confirmation message showing the branch name and the commit SHA of the branch.

Created branch 'split_my_new_module'
7edcb4b1f4dc34fc3b636b498f4284c7d98c8e4a

If you run git branch, you should now be able to see the new branch, and if you run git log --oneline split_my_new_module, you should only see commits for that module.

If you do need to tidy up a particular commit to remove client references etc, change a commit message or squash some commits together, then you can do that by checking out the new branch, running an interactive rebase and making the required amends.

git checkout split_my_new_module
git rebase -i --root

Once everything is in the desired state, you can use git push to push to the remote repo - specifying the repo URL, the local branch name and the remote branch name:

git push [email protected]:project/my_new_module.git split_my_new_module:8.x-1.x

In this case, the new branch will be 8.x-1.x.

Here is a screenshot of an example module that I’ve split and pushed to GitLab. Notice that there are multiple commits in the history, each still attributed to its original author.

Screenshot of a split project repo on GitLab

Also, as this is standard Git functionality, you can follow the same process to extract PHP libraries, Symfony bundles, WordPress plugins or anything else.

Feb 27 2018

My current project at Microserve is a Drupal 8 website that uses the Private Message module for users to send messages to each other.

In some cases though, the threads could contain hundreds of recipients, so I decided that it would be good to queue the message requests so that they could be processed as part of a background process for better performance. The Private Message module does not include this, so I've written and released a separate Private Message Queue module.

Queuing a Message

The module provides a PrivateMessageQueuer service (private_message_queue.queuer) which queues the items via its queue() method.

The method accepts an array of User objects as the message recipients, the message body text, and another user as the message owner. (I’m currently considering whether to make the owner optional and default to the current user if one is not specified.)

Here is an example:

$recipients = $this->getRecipients(); // An array of User objects.
$message = 'Some message text';
$owner = \Drupal::currentUser();

$queuer = \Drupal::service('private_message_queue.queuer');
$queuer->queue($recipients, $message, $owner);

These three pieces of data are then saved as part of the queued item. You can see these by checking the "queue" table in the database or by running drush queue-list.

$ drush queue-list
Queue                  Items  Class
private_message_queue  19     Drupal\Core\Queue\DatabaseQueue

Processing the Queue

The module also provides a PrivateMessageQueue queue worker, which processes the queued items. For each item, it creates a new private message setting the owner and the message body.

It uses the PrivateMessageThread class from the Private Message module to find an existing thread for the specified recipients, or creates a new thread if one isn’t found. The new message is then added to the thread.
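
A simplified sketch of what such a queue worker plugin looks like is below - the annotation values and class details are assumptions based on the description above, not the module's exact code:

namespace Drupal\private_message_queue\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Creates private messages from queued items.
 *
 * @QueueWorker(
 *   id = "private_message_queue",
 *   title = @Translation("Private message queue"),
 *   cron = {"time" = 30}
 * )
 */
class PrivateMessageQueue extends QueueWorkerBase {

  public function processItem($data) {
    // Find an existing thread for the queued recipients, or create one if
    // none exists, then add a new message using the queued owner and body.
  }

}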

The queue is processed on each cron run, so I recommend adding a module like Ultimate Cron so that you can process the queued items frequently (e.g. every 15 minutes) and run the heavier tasks like checking for updates etc less frequently (e.g. once a day).

You can also process the queue manually with Drush using the drush queue-run <queue-name> command - e.g. drush queue-run private_message_queue.

$ drush queue-run private_message_queue
Processed 19 items from the private_message_queue queue in 3.34 sec.


Feb 05 2018

What is Tailwind?

Tailwind is a utility-first CSS framework for rapidly building custom user interfaces.

It generates a number of utility classes that you can add to your theme's markup to apply different styling, as well as the ability to create components comprised of utility classes and apply them to other markup using a custom @apply PostCSS directive.

Initial Configuration

The installation and configuration steps are essentially the same as those outlined within the Tailwind documentation, and should be performed within your custom theme's directory (e.g. sites/all/themes/custom/mytheme for Drupal 7 or themes/custom/mytheme for Drupal 8):

  1. Require PostCSS and Tailwind via npm or yarn.
  2. Generate a configuration file using ./node_modules/.bin/tailwind init.
  3. Tweak the settings as needed.
  4. Add a postcss.config.js file.
  5. Configure your build tool (Gulp, Grunt, Webpack).
  6. Generate the CSS.
  7. Include a path to the generated CSS in your MYTHEME.info, MYTHEME.info.yml or MYTHEME.libraries.yml file.

PostCSS Configuration

Create a postcss.config.js file and add tailwindcss as a plugin, passing the path to the config file:

module.exports = {
    plugins: [
        require('tailwindcss')('./tailwind.js'),
    ]
}

Configuration for Drupal

There are some configuration settings within tailwind.js that you’ll need to change to make things work nicely with Drupal. These are within the options section:

options: {
    prefix: 'tw-',
    important: true,
    ...
}

Prefix

By adding a prefix like tw-, we can ensure that the Tailwind classes don’t conflict with core HTML classes like block. We can also ensure that they won't conflict with any other existing HTML or CSS.

For example, a utility class that would normally be named block becomes tw-block once the prefix is applied.

Important

We can also set the !important rule on all of Tailwind’s generated classes. We need to do this if we want to override core styles which have more specific rules.

For example, if I had this core markup, the left margin added by tw-ml-4 would be overridden by core’s .item-list ul styling.

<div class="item-list">
  <ul class="tw-ml-4">
    ...
  </ul>
</div>

With the !important rule enabled though, the Tailwind class takes precedence and is applied.

Example

For an example of Tailwind within a Drupal 8 theme, see the custom theme for the Drupal Bristol website on GitHub.


Jan 30 2018

It’s with heavy hearts that we announce there won’t be a DrupalCamp Bristol 2018. The committee have looked at the amount of work required to put the camp on and the capacity we all have, and the two numbers are irreconcilable.

Seeing Drupalists from all over the country and from overseas come to Bristol to share knowledge and ideas is something we take pride in. The past three camps have been fantastic, but as a trend we have left it later and later to organise.

This year is the latest we have left it to organise, and we believe this is because we are all a bit fatigued right now, so it seems like a good place to stop and take stock.

In our washup of last year’s camp we spoke a lot about what DrupalCamp is and who it is for. Traditionally we have tried to get a good mix of speakers from within the Drupal community and from the wider tech community. This does mean we dilute the ‘Drupal’ aspect of the camp, but the benefits it brings in terms of bringing together different views gives the camp greater value in our eyes.

It’s because of this mix of talks and wider shifts in the community in ‘getting us off the island’ that we have been thinking about rebranding to reflect the mix of talks that the camp hosts. The fact is DrupalCamps don’t just cover Drupal anymore. There is Symfony, Composer, OOP principles, React, etc.

We’ll take the gap this year to reevaluate who DrupalCamp Bristol is for and where it fits into the schedule of excellent tech events that take place in Bristol through the year, and we look forward to seeing you in 2019, refreshed and more enthusiastic than ever!

The DrupalCamp Bristol organising committee

Tom, Ollie, Emily, Sophie, Rob, Mark


May 20 2017


Yesterday I was fixing a bug in an inherited Drupal 7 custom module, and I decided to add some tests to ensure that the bug was fixed and doesn’t get accidentally re-introduced in the future. The test though required a particular content type and fields which are specific to this site, so they weren’t present within the standard installation profile used to run tests.

I decided to convert the custom module into a Feature so that the content type and it’s fields could be added to it, and therefore present on the testing site once the module is installed.

To do this, I needed to expose the module to the Features API.

All that’s needed is to add this line to the mymodule.info file:

features[features_api][] = api:2

After clearing the cache, the module is now visible in the Features list - and ready to have the appropriate configuration added to it.

The Features list showing the custom module


May 15 2017


DrupalCamp Bristol 2017 logo

In less than two months’ time, DrupalCamp Bristol will be back for our third year! (July seems to come around quicker each year.) This is this year’s schedule and venues:

Today we announced Emma Karayiannis as our Saturday keynote speaker, and we’ll be announcing some of the other speakers later this week.

Not submitted your session yet? Session submissions are open until May 31st. We’re looking for talks not only on Drupal, but also on related topics such as PHP, Symfony, server administration/DevOps, project management, case studies, being human etc. If you want to submit but would like to ask something beforehand, please send us an email or ping us on Twitter.

Not spoken at a DrupalCamp before? No problem. We’re looking for both new and experienced speakers, and have both long (45 minutes) and short (20 minutes) talk slots available.

Not bought your tickets yet? Early bird tickets for the CXO and conference days are still available! The sprint day tickets are free but limited, so do register for a ticket to claim your place.

We still have sponsorship opportunities available (big thanks to Microserve, Deeson and Proctors, who have already signed up), but be quick if you want to be included in our brochure so that we can get you added before our print deadline! Without our sponsors, putting on this event each year would not be possible.

Any other questions? Take a look at our website or get in touch via Twitter or email.


May 05 2017


TL;DR You need to include the name of your web server container as the --url option to run-tests.sh.

I’ve been a Drupal VM user for a long time, but lately I’ve been using a combination of Drupal VM and Docker for my local development environment. There were a couple of issues preventing me from completely switching to Docker, one of which being that when I tried running my Simpletest tests, a lot of them would fail where they would pass when run within Drupal VM.

Here’s an excerpt from my docker-compose.yml file:

services:
  php:
    image: wodby/drupal-php:5.6
    volumes:
      - ./repo:/var/www/html

  nginx:
    image: wodby/drupal-nginx:7-1.10
    environment:
      NGINX_BACKEND_HOST: php
      NGINX_SERVER_ROOT: /var/www/html/web
    ports:
      - "80:80"
    volumes_from:
      - php
...

Nginx and PHP-FPM are running in separate containers, the volumes are shared across both and the Nginx backend is set to use the php container.

This is the command that I was using to run the tests:

$ docker-compose run --rm \
  -w /var/www/html/web \
  php \
  php scripts/run-tests.sh \
    --php /usr/local/bin/php \
    --class OverrideNodeOptionsTestCase

This creates a new instance of the php container, sets the working directory to my Drupal root and runs Drupal’s run-tests.sh script with some arguments. In this case, I'm running the OverrideNodeOptionsTestCase class for the override_node_options tests. Once complete, the container is deleted because of the --rm option.

This resulted in 60 of the 112 tests failing, whereas they all passed when run within a Drupal VM instance.

Test summary
------------

Override node options 62 passes, 60 fails, 29 exceptions, and 17 debug messages

Test run duration: 2 min 25 sec

Running the tests again with the --verbose option, I saw this message appear in the output below some of the failing tests:

simplexml_import_dom(): Invalid Nodetype to import

Update: I later found that https://www.drupal.org/docs/7/testing/running-tests-through-command-line#troubleshooting references this error message, but I didn’t see this page within my original search.

After checking that I had all of the required PHP extensions installed, I ran docker-compose exec php bash to connect to the php container and ran curl http://localhost to check the output. Rather than seeing the HTML for the site, I got this error message:

curl: (7) Failed to connect to localhost port 80: Connection refused

Whereas curl http://nginx returned the HTML for the page, so I included it with the --url option to run-tests.sh, and this resulted in all of my tests passing.

$ docker-compose run --rm \
  -w /var/www/html/web \
  php \
  php scripts/run-tests.sh \
    --php /usr/local/bin/php \
    --url http://nginx \
    --class OverrideNodeOptionsTestCase

Test summary
------------

Override node options 121 passes, 0 fails, 0 exceptions, and 34 debug messages

Test run duration: 2 min 31 sec

Note: In this example I have separate nginx and php containers, but I've tried and had the same issue when running Nginx and PHP-FPM in the same container - e.g. called app - and still needed to add --url http://app in order for the tests to run successfully.

I don’t know if this issue is macOS specific (I know that Drupal CI is based on Docker, but I don’t know if it’s an issue there). I’m going to test on my Ubuntu desktop environment, investigate further, and compare the test run times for Docker on macOS, Docker on Ubuntu and within Drupal VM. I’m also going to test this with PHPUnit tests in Drupal 8.

May 03 2016


How to use the xautoload module to autoload migration classes within your Drupal 7 migration modules.

What is xautoload?

xautoload is a Drupal module that enables the autoloading of PHP classes, in the same way that you would autoload them in a Composer-based project such as Drupal 8 or Symfony.

It supports both the PSR-0 and PSR-4 standards, as well as providing a wildcard syntax for Drupal’s files[] syntax in .info files.

To use it, download and enable it from Drupal.org as you would for any other module, and then add it as a dependency within your module. The xautoload project page suggests including a minimum version in this format:

dependencies[] = xautoload (>= 7.x-5.0)

This will ensure that the version of xautoload is 7.x-5.0 or newer.

How to use it

Wildcard syntax for .info files

Here is an example .info file for a migrate module.

; foo_migrate.info

name = Foo Migration
core = 7.x
package = Foo

files[] = includes/user.inc
files[] = includes/nodes/article.inc
files[] = includes/nodes/page.inc

In this example, each custom migration class is stored in its own file within the includes directory, and each class needs to be loaded separately using the files[] = filename syntax.

One thing that the xautoload module does is enable the use of wildcards within this syntax. By using wildcards, the .info file can be simplified as follows:

files[] = includes/**/*.inc

This will load any .inc files within the includes directory as well as any sub-directories, like 'nodes' in the original example.

This means that any new migration classes that are added will be automatically loaded, so you don’t need to declare each include separately within foo_migrate.info again. The great thing about this approach is that it works with the existing directory and file structure.

Use the PSR-4 structure

If you want to use the PSR-4 approach, you can do that too.

In order to do so, you’ll need to complete the following steps:

  1. Rename the includes directory to src.
  2. Ensure that there is one PHP class per file, and that the file extension is .php rather than .inc.
  3. Ensure that the name of the file matches the name of the class - FooArticleNodeMigration would be in a file called FooArticleNodeMigration.php.
  4. Add a namespace to each PHP file. This uses the same format as Drupal 8, including the machine name of the module. For example, Drupal\foo_migrate.
    • If the class is within a sub-directory, then this will also need to be included within the namespace - e.g. Drupal\foo_migrate\Node.
    • You’ll also need to import any class names that you are referencing, including class names that you are extending, by adding use statements at the top of the file. You may be able to prefix them with \ instead (e.g. \DrupalNode6Migration), but I prefer to use imports.

Now your class may look something like this:

<?php

namespace Drupal\foo_migrate\Node;

use DrupalNode6Migration;

class FooArticleNodeMigration extends DrupalNode6Migration {
  ...
}

With these steps completed, the files[] declarations within your .info file can be removed as they are no longer needed, and any classes will be loaded automatically.
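For example, the resulting module structure might look something like this (the user and page migration class file names here are illustrative):

foo_migrate/
├── foo_migrate.info
├── foo_migrate.migrate.inc
├── foo_migrate.module
└── src/
    ├── FooUserMigration.php
    └── Node/
        ├── FooArticleNodeMigration.php
        └── FooPageNodeMigration.php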

Within foo_migrate.migrate.inc, I can now reference any class names using their full namespace:

$node_arguments['ArticleNode'] = array(
  'class_name' => 'Drupal\foo_migrate\Node\FooArticleNodeMigration',
  'source_type' => 'story',
  'destination_type' => 'article',
);



Feb 15 2016
Feb 15

For the past few weeks I’ve been working on a personal side project, based on Drupal VM. It’s called the Drupal VM Generator, and over the weekend I’ve added the final features and fixed the remaining issues, and tagged the 1.0.0 release.

Dec 22 2015
Dec 22

I recently had my first experience using the Entityform module in a project. It was quite easy to configure with different form types, but then I needed to embed the form into an overlay. I was expecting to use the drupal_get_form() function and render it, but this didn’t work.

Here are the steps that I took to be able to load, render and embed the form.

Loading the Form

The first thing that I needed to do to render the form was to load an empty instance of the entityform using entityform_empty_load(). In this example, newsletter is the name of my form type.

$form = entityform_empty_load('newsletter');

This returns an instance of a relevant Entityform object.

Rendering the Form

The next step was to be able to render the form. I did this using the entityform_form_wrapper() function.

As this function is within the entityform.admin.inc file and not autoloaded by Drupal, I needed to include it using module_load_include() so that the function was available.

module_load_include('inc', 'entityform', 'entityform.admin');

$output = entityform_form_wrapper($form, 'submit', 'embedded');

The first argument is the Entityform object that was created in the previous step (I’ve submitted a patch to type hint this within entityform so that it’s clearer what is expected), which is required.

The other two arguments are optional. The second argument is the mode (submit is the default value), and the last is the form context. page is the default value, for use on the submit page, however I changed this to embedded.

I could then pass this result into my theme function to render it successfully within the relevant template file.
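Putting these steps together, a minimal sketch of a helper function might look like this (the function name is illustrative):

<?php

/**
 * Builds a renderable, embedded instance of the newsletter Entityform.
 */
function mymodule_embedded_newsletter_form() {
  // Make entityform_form_wrapper() available.
  module_load_include('inc', 'entityform', 'entityform.admin');

  // Load an empty instance of the newsletter Entityform.
  $form = entityform_empty_load('newsletter');

  // Render it using the embedded form context.
  return entityform_form_wrapper($form, 'submit', 'embedded');
}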


Apr 03 2015
Apr 03

This week, my first code patch was committed to Drupal core. The patch adds the user_has_role() function to the user module, to simplify the way to check whether a user in Drupal has been assigned a specific role. This is something that I normally write a custom function for each project, but it's now available in Drupal core as of 7.36.

But what if someone is using a core version less than 7.36 and tries using the function? The site would return an error because that function wouldn't exist.

If you're building a new Drupal site, then I'd assume that you're using the latest version of core, or you have the opportunity to update it when needed. But what if you're writing a contrib module? How can you be sure that the correct minimum version of core is present?

Setting Dependencies

What I'm going to be doing for my contrib projects is defining a minimum version of Drupal core that the module is compatible with. If this dependency isn't met, the module won't be able to be enabled. This is done within your module's .info file.

Adding a Simple Dependency

You can define a simple dependency for your module by adding a line like this to your project's .info file:

dependencies[] = views

This would make your module dependent on having the Views module present and enabled, which you'd need if you were including views as part of your module, for example.

Adding a Complex Dependency

In the previous example, our module would enable if any version of Views was enabled, but we need to specify a specific version. We can do this by including version numbers within the dependencies field in the following format:

dependencies[] = modulename (major.minor)

This can be for a specific module release or a branch name:

dependencies[] = modulename (1.0)
dependencies[] = modulename (1.x)

We can also use the following as part of the field for extra granularity:

  • = or == equals (this is the default)
  • > greater than
  • < less than
  • >= greater than or equal to
  • <= less than or equal to
  • != not equal to

In the original scenario, we want to specify that the module can only be enabled on Drupal core 7.36 or later. To do this, we can use the "greater than or equal to" option.

dependencies[] = system (>=7.36)

Because we need to check for Drupal's core version, we're using the system module as the dependency and specifying that it needs to be either equal to or greater than 7.36. If this dependency is not met, e.g. Drupal 7.35 is being used, then the module cannot be enabled rather than showing a function not found error for user_has_role() when it is called.
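As a complete example, a minimal .info file for a custom module that calls user_has_role() might look like this (the module name and description are illustrative):

name = My Module
description = Uses user_has_role(), so requires Drupal core 7.36 or later.
core = 7.x

dependencies[] = system (>=7.36)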

A screenshot of the modules page showing System as a dependency for a custom module.



Dec 22 2014
Dec 22

Reroute Email module uses hook_mail_alter() to prevent emails from being sent to users from non-production sites. It allows you to enter one or more email addresses that will receive the emails instead of delivering them to the original user.

This is useful in cases where you do not want email sent from a Drupal site to reach the users. For example, if you copy a live site to a test site for the purpose of development, and you do not want any email sent to real users of the original site. Or you want to check the emails sent for uniform formatting, footers, etc.

As we don't need the module configured on production (we don't need to reroute any emails there), it's best to do this in code using settings.local.php (if you have one) or the standard settings.php file.

The first thing that we need to do is to enable rerouting. Without doing this, nothing will happen.

$conf['reroute_email_enable'] = TRUE;

The next option is whether to show the rerouting description in the mail body. I usually have this enabled. Set this to TRUE or FALSE depending on your preference.

$conf['reroute_email_enable_message'] = TRUE;

The last setting is the email address to use. If you're entering a single address, you can add it as a simple string.

$conf['reroute_email_address'] = '[email protected]';

In this example, all emails from the site will be rerouted to [email protected].

If you want to add multiple addresses, these should be added in a semicolon-delimited list. Whilst you could also add these as a string, I prefer to use an array of addresses and the implode() function.

$conf['reroute_email_address'] = implode(';', array(
  '[email protected]',
  '[email protected]',
  '[email protected]',
));

In this example, [email protected] and [email protected] would receive their emails from the site as normal. Any emails to addresses not in the array would continue to be redirected to [email protected].

Dec 20 2014
Dec 20

At the bottom of settings.php, add the following code:

$local_settings = __DIR__ . '/settings.local.php';
if (file_exists($local_settings)) {
  include $local_settings;
}

This allows for you to create a new file called settings.local.php within a sites/* directory (the same place as settings.php), and this will be included as an extension of settings.php. You can see the same technique being used within Drupal 8's default.settings.php file.

Environment specific settings like $databases and $base_url can be placed within the local settings file. Other settings like $conf['locale_custom_strings_en'] (string overrides) and $conf['allow_authorize_operations'] that would apply to all environments can still be placed in settings.php.
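For example, a settings.local.php file might look something like this (the credentials and URL here are placeholders):

<?php

// Local database credentials.
$databases['default']['default'] = array(
  'driver' => 'mysql',
  'database' => 'drupal',
  'username' => 'drupal',
  'password' => 'secret',
  'host' => 'localhost',
);

// The base URL of the local site.
$base_url = 'http://drupal.local';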

settings.php, though, is ignored by Git by default via a .gitignore file, so it won't show up as a file available to be committed. There are two ways to fix this. The first is to use the --force option when adding the file, which overrides the ignore file:

git add --force sites/default/settings.php

The other option is to update the .gitignore file itself so that settings.php is no longer ignored. An updated .gitignore file could look like:

# Ignore configuration files that may contain sensitive information.
sites/*/settings.local*.php

# Ignore paths that contain user-generated content.
sites/*/files
sites/*/private

This will allow for settings.php to be added to Git and committed, but not settings.local.php.


Nov 27 2014
Nov 27

I was recently doing some work on a site hosted on Pantheon and came across an issue, for which part of the suggested fix was to ensure that the $base_url variable was explicitly defined within settings.php (this is also best practice on all Drupal sites).

The way that was recommended was by using a switch statement based on Pantheon's environment variable. For example:

switch ($_SERVER['PANTHEON_ENVIRONMENT']) {
  case 'dev':
    // Development environment.
    $base_url = 'dev-my-site.gotpantheon.com';
    break;

  case 'test':
    // Testing environment.
    $base_url = 'test-my-site.gotpantheon.com';
    break;

  case 'live':
    // Production environment.
    $base_url = 'live-my-site.gotpantheon.com';
    break;
}

Whilst this works, it doesn't conform to the DRY (don't repeat yourself) principle and means that you also might get a rather long and complicated settings file, especially when you start using multiple switches and checking for the value of the environment multiple times.

My alternative solution to this is to include an environment-specific settings file.

To do this, add the following code to the bottom of settings.php:

// If using Pantheon, include an environment-specific settings file, for example
// settings.dev.php, if one exists.
if (isset($_SERVER['PANTHEON_ENVIRONMENT'])) {
  $environment_settings = __DIR__ . '/settings.' .  $_SERVER['PANTHEON_ENVIRONMENT'] . '.php';
  if (file_exists($environment_settings)) {
    include $environment_settings;
  }
}

This means that rather than having one long file, each environment has its own dedicated settings file that contains its own additional configuration. This is much easier to read and make changes to, and also means that less code is loaded and parsed by PHP. Settings that apply to all environments are still added to settings.php.

Below this, I also include a similar piece of code to include a settings.local.php file. The settings.php file then gets committed into the Git repository.
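That uses the same include pattern as the environment-specific settings file above:

// Include a settings.local.php file, if one exists.
$local_settings = __DIR__ . '/settings.local.php';
if (file_exists($local_settings)) {
  include $local_settings;
}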

Within the sites/default directory, I also include an example file (example.settings.env.php) for reference. This is duplicated, renamed and populated accordingly.

<?php

/**
 * This is a specific settings file, just for the x environment. Any settings
 * defined here will be included after those in settings.php.
 *
 * If you have also added a settings.local.php file, that will override any
 * settings stored here.
 *
 * No database credentials should be stored in this file as these are included
 * automatically by Pantheon.
 */

$base_url = '';

The environment specific files are also committed into Git and pushed to Pantheon, and are then included automatically on each environment.


Nov 20 2014
Nov 20

20th November 2014

Download the Stage File Proxy module from Drupal.org and enable it on your site.

As this module is only going to be needed on pre-production sites, it would be better to configure this within your settings.php or settings.local.php file. We do this using the $conf array which removes the need to configure the module through the UI and store the values in the database.

// File proxy to the live site.
$conf['stage_file_proxy_origin'] = 'http://www.example.com';

// Don't copy the files, just link to them.
$conf['stage_file_proxy_hotlink'] = TRUE;

// Image style images are the wrong size otherwise.
$conf['stage_file_proxy_use_imagecache_root'] = FALSE;

If the origin site is not publicly accessible yet (maybe it's a pre-live or staging site protected with basic access authentication), you can include the username and password within the origin URL.

$conf['stage_file_proxy_origin'] = 'http://user:[email protected]';


Nov 18 2014
Nov 18

Using a file structure similar to this, organise your font files into directories, using the font name for both the directory name and for the file names.

.
├── FuturaBold
│   ├── FuturaBold.eot
│   ├── FuturaBold.svg
│   ├── FuturaBold.ttf
│   └── FuturaBold.woff
├── FuturaBoldItalic
│   ├── FuturaBoldItalic.eot
│   ├── FuturaBoldItalic.svg
│   ├── FuturaBoldItalic.ttf
│   └── FuturaBoldItalic.woff
├── FuturaBook
│   ├── FuturaBook.eot
│   ├── FuturaBook.svg
│   ├── FuturaBook.ttf
│   └── FuturaBook.woff
└── FuturaItalic
    ├── FuturaItalic.eot
    ├── FuturaItalic.svg
    ├── FuturaItalic.ttf
    └── FuturaItalic.woff

Within your SASS file, start an @each loop, listing the names of the fonts. In the same way as PHP's foreach loop, each font name will get looped through using the $family variable and then compiled into CSS.

@each $family in FuturaBook, FuturaBold, FuturaBoldItalic, FuturaItalic {
  @font-face {
    font-family: #{$family};
    src: url('../fonts/#{$family}/#{$family}.eot');
    src: url('../fonts/#{$family}/#{$family}.eot?#iefix') format('embedded-opentype'),
         url('../fonts/#{$family}/#{$family}.woff') format('woff'),
         url('../fonts/#{$family}/#{$family}.ttf') format('truetype'),
         url('../fonts/#{$family}/#{$family}.svg##{$family}') format('svg');
    font-weight: normal;
    font-style: normal;
  }
}
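Each pass through the loop produces a standard @font-face rule. For example, the FuturaBook iteration compiles to:

@font-face {
  font-family: FuturaBook;
  src: url('../fonts/FuturaBook/FuturaBook.eot');
  src: url('../fonts/FuturaBook/FuturaBook.eot?#iefix') format('embedded-opentype'),
       url('../fonts/FuturaBook/FuturaBook.woff') format('woff'),
       url('../fonts/FuturaBook/FuturaBook.ttf') format('truetype'),
       url('../fonts/FuturaBook/FuturaBook.svg#FuturaBook') format('svg');
  font-weight: normal;
  font-style: normal;
}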

When the CSS has been compiled, you can then use the fonts in your CSS in the standard way.

font-family: "FuturaBook";


Oct 21 2014
Oct 21

If you use the Features module to manage your Drupal configuration, it can be time consuming to update features through the UI, especially if you are working on a remote server and need to keep downloading and uploading files.

If you re-create a feature through the UI, you'll be prompted to download a new archive of the feature in its entirety onto your local computer. You could either commit this into a local repository and then pull it remotely, or use a tool such as SCP to upload the archive onto the server and commit it from there. You can simplify this process by using Drush.

Finding Components

To search for a component, use the drush features-components command. This will display a list of all components on the site. As we're only interested in components that haven't been exported yet, add the --not-exported option to filter the results.

To filter further, you can also pipe the results through the grep command. For example, drush features-components --not-exported field_base | grep foo would only return non-exported field bases containing the word "foo".

The result is a source and a component, separated by a colon. For example, field_base:field_foo.

Exporting the Feature

Once you have a list of the components that you need to add, you can export the feature. This is done using the drush features-export command, along with the feature name and the component names.

For example:

$ drush features-export -y myfeature field_base:field_foo field_instance:user-field_foo

In this example, the base for field_foo and its instance on the user entity are being added to the "myfeature" feature.

If you are updating an existing feature, you'll get a message informing you that the module already exists and asking if you want to continue. This is fine, and is automatically accepted by including -y within the command. If a feature with the specified name doesn't exist, it will be created.

If you're creating a new feature, you can define where the feature will be created using the --destination option.
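For example, a new feature could be written straight into a custom features directory (the path here is illustrative):

$ drush features-export -y myfeature field_base:field_foo \
  --destination=sites/default/modules/custom/features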

Once complete, you will see a confirmation message.

Created module: my feature in sites/default/modules/custom/features/myfeature

The Result

Once finished, the feature is updated in its original location, so there's no need to download the feature and then re-upload it. You can add and commit your changes into Git or continue with your standard workflow straight away.



May 21 2014
May 21

The Problem

As an active contributor to the Drupal project, I spend a lot of time working with other people’s modules and themes, and occasionally have to fix a bug or add some new functionality.

In the Drupal community, we use a patch based workflow where any changes that I make get exported to a file detailing the differences. The patch file (*.patch) is attached to an item in an issue queue on Drupal.org, applied by the maintainer to their local copy of the code and reviewed, and hopefully committed.

There is an option that the maintainer can add to the end of their commit message.

For example:

--author="opdavies "

This differs slightly for each Drupal user, and the exact value can be found on their Drupal.org profile page.

If this is added to the end of the commit message, the resulting commit will show that it was committed by the maintainer but authored by a different user. This will then display on Drupal.org that you’ve made a commit to that project.

A screenshot of a commit that was authored by rli but committed by opdavies

The problem is that some project maintainers either don’t know about this option or occasionally forget to add it. Dreditor can suggest a commit message and assign an author, but it is optional and, of course, not all maintainers use Dreditor (although they probably should).

The git format-patch command seems to be the answer, and will be my preferred method for generating patch files in the future rather than git diff.

What does it do Differently?

From the manual page:

Prepare each commit with its patch in one file per commit, formatted to resemble UNIX mailbox format. The output of this command is convenient for e-mail submission or for use with git am.

Here is a section of a patch that I created for the Metatag module using git format-patch:

From 80c8fa14de7f4a83c2e70367aab0aedcadf4f3b0 Mon Sep 17 00:00:00 2001
From: Oliver Davies <[email protected]>
Date: Mon, 12 May 2014 14:53:55 +0100
Subject: [PATCH] Exclude comment entities when checking if this is the page,
 otherwise comment_fragment.module will break metatag

---

As mentioned above, the patch is structured in an email format. The commit message is used as the subject line, and the date that the commit was made locally is used for the date. What we’re interested in is the “From” value. This contains your name and email address from your ~/.gitconfig file and is used to author the patch automatically.

Everything below this is the same as a standard patch file, the same as if it was generated with git diff.

The full patch file can be found at https://drupal.org/files/issues/metatag-comment-fragment-conflict-2265447-4.patch.

The Process

How did I create this patch? Here are the steps that I took:

  1. Clone the source repository using $ git clone --branch 7.x-1.x http://git.drupal.org/project/metatag.git and move into that directory.
  2. Create a branch for this patch using $ git checkout -b 2265447-comment-fragment-conflict.
  3. Add and commit any changes as normal.
  4. Generate the patch file using $ git format-patch 7.x-1.x --stdout > metatag-comment-fragment-conflict-2265447-4.patch.

Note: I am defining 7.x-1.x in the last step as the original branch to compare (i.e. the original branch that we forked to make our issue branch). This will change depending on the project that you are patching, and its version number. Also, commits should always be made against the development branch and not the stable release.

By default, a separate patch file will be created for each commit that we’ve made. This is overridden by the --stdout option which combines all of the patches into a single file. This is the recommended approach when uploading to Drupal.org.
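To illustrate the difference (the file name in the first comment is an example of the default naming pattern):

# One numbered patch file per commit, e.g. 0001-Fix-conflict.patch.
$ git format-patch 7.x-1.x

# All commits combined into a single patch file.
$ git format-patch 7.x-1.x --stdout > metatag-comment-fragment-conflict-2265447-4.patch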

The resulting patch file can be uploaded onto a Drupal.org issue queue, reviewed by the Testbot and applied by a module maintainer, and you automatically get the commit attributed. Problem solved.

Committing the Patch

If you need to commit a patch that was created using git format-patch, the best command to do this with is the git am command.

For example, within your repository, run:

$ git am /path/to/file
$ git am ~/Code/metatag-comment-fragment-conflict-2265447-4.patch

You should end up with some output similar to the following:

Applying: #2272799 Added supporters section
Applying: #2272799 Added navigation tabs
Applying: #2272799 Fixed indentation
Applying: #2272799 Replaced URL

Each line is the commit message associated with that patch.

Assuming that there are no errors, you can go ahead and push your updated code into your remote repository.


Dec 31 2013
Dec 31

If you use Drush, it's likely that you've used the "drush pm-download" (or "drush dl" for short) command to start a new project. This command downloads projects from Drupal.org, but if you don't specify a project or type "drush dl drupal", the command will download the current stable version of Drupal core, which is Drupal 7 at the time of writing this post.

But what if you don't want Drupal 7?

I still maintain a number of Drupal 6 sites and occasionally need to download Drupal 6 core as opposed to Drupal 7. I'm also experimenting with Drupal 8, so I need to download that as well.

By declaring the core version of Drupal, such as "drupal-6", Drush will download that instead.

$ drush dl drupal-6

This downloads the most recent stable version of Drupal 6. If you don't want that, you can add the --select and additionally the --all options to be presented with an entire list to choose from.

$ drush dl drupal-6 --select
$ drush dl drupal-6 --select --all

If you want the most recent development version, just type:

$ drush dl drupal-6.x

The same can be done for other core versions of Drupal, from Drupal 5 upwards.

$ drush dl drupal-5 # This will download Drupal 5
$ drush dl drupal-8 # This will download Drupal 8

For a full list of the available options, type "drush help pm-download" into a Terminal window or take a look at the entry on drush.ws.

Dec 24 2013
Dec 24

Testing a patch file is usually a two-step process. First you download the patch file from the source, and then you run a separate command to apply it.

You can save time and typing by running the two commands on one line:

$ curl http://drupal.org/files/[patch-name].patch | git apply

Or, if you don't have curl installed, you can use wget:

$ wget -q -O - http://drupal.org/files/[patch-name].patch | git apply

These commands need to be run within the root of your Git repository (i.e. where the .git directory is).

These snippets were taken from Applying Patches with Git on Drupal.org.

Nov 19 2013
Nov 19

There are times when doing Drupal development that you need to run a custom PHP script, maybe moving data from one field to another, that doesn't warrant the time and effort to create a custom module. In this scenario, it would be quicker to write a .php script and bootstrap Drupal to gain access to functions like node_load() and db_query().

To bootstrap Drupal, you would need to add some additional lines of code to the top of your script. Something like:

<?php

// Bootstrap Drupal.
$drupal_path = $_SERVER['DOCUMENT_ROOT'];
define('DRUPAL_ROOT', $drupal_path);
require_once DRUPAL_ROOT . '/includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

// Do stuff.
$node = node_load(1);

The script would need to be placed in the root of your Drupal directory, and you would then have to open a browser window and visit http://example.com/foo.php to execute it. This is where the "drush script" (or "drush scr" for short) command is useful, and can be used to execute the script from the command line.

$ drush scr foo.php

It also means that I no longer need to manually bootstrap Drupal, so my script is much cleaner.

<?php

// Just do stuff.
$node = node_load(1);

I prefer to keep these scripts outside of my Drupal directory in a separate "scripts" directory (with Drupal in a "drupal" directory on the same level). This makes it easier to update Drupal as I don't need to worry about accidentally deleting the additional files. From within the drupal directory, I can now run the following command to go up one level, into the scripts directory and then execute the script. Note that you do not need to include the file extension.

$ drush scr ../scripts/foo

Or, if you're using Drush aliases:

$ drush @mysite.local scr foo

If you commonly use the same scripts for different projects, you could also store these within a separate Git repository and checkout the scripts directory using a Git submodule.
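For example (the repository URL here is a placeholder):

$ git submodule add https://example.com/drupal-scripts.git scripts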

Sep 06 2013
Sep 06

First, download the Zen theme if you haven't already done so.

$ drush dl zen

This will now enable you to use the "drush zen" command.

$ drush zen "Oliver Davies" oliverdavies --description="A Zen sub-theme for oliverdavies.co.uk" --without-rtl

The parameters that I'm passing it are:

  1. The human-readable name of the theme.
  2. The machine-readable name of the theme.
  3. The description of the theme (optional).
  4. A flag telling Drush not to include any right-to-left elements within my sub-theme as these aren't needed (optional).

This will create a new theme in sites/all/themes/oliverdavies.

For further help, type $ drush help zen to see the Drush help page for the zen command.
