Mar 19 2020

Drupal 9 is just around the corner. Here's what you need to do to ensure a smooth upgrade.

Changes from Drupal 8 to Drupal 9

Unlike the Drupal 7 → 8 update, which required a rebuild and migrate, Drupal 9 is a simple incremental update from Drupal 8.

Third-party Dependencies

Drupal 8 is built upon 30+ third-party libraries. In order to keep Drupal stable, these have been held at the same major versions throughout the Drupal 8 release cycle. Some of them will reach end of life in the coming years, so Drupal 9 bumps them to newer versions in order to maintain long-term support coverage. The major dependency changes are:

  • Symfony 3.4.x  → 4.4.x
  • Twig 1.x → 2.x

In addition, Drupal 9 requires Drush 10.x or later.

Deprecated Code Removal

Drupal 8 introduced a backwards compatibility policy that required public APIs to remain stable while new features and improvements were added. The old APIs were marked as deprecated but kept in place for backwards compatibility. This gives developers fair warning that the code will be removed in a future version of Drupal, and time to move to the new APIs.

Drupal 9 removes all deprecated code that has built up during the Drupal 8 release cycle.

Platform requirement changes

In addition, platform requirements have changed. Most importantly, this includes new minimum versions of:

  • PHP 7.3 or later
  • MySQL 5.7 or later
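
As a quick sanity check before upgrading, you can compare your environment against these minimums. Here's a rough shell sketch; the hard-coded version strings are placeholders, and in practice you'd read them from your actual environment:

```shell
# Placeholder values; in practice use e.g.:
#   php_version="$(php -r 'echo PHP_VERSION;')"
php_version="7.3.11"
mysql_version="5.7.28"

# Succeeds if $2 is at least version $1 (relies on sort -V version ordering).
version_ok() {
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

version_ok 7.3.0 "$php_version"   && echo "PHP OK"   || echo "PHP too old"
version_ok 5.7.0 "$mysql_version" && echo "MySQL OK" || echo "MySQL too old"
```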

Getting Your Site Ready

Fortunately, there are only a few steps you need to follow in order to get your site ready for Drupal 9.

Update Contributed Modules to their Latest Version

Drupal 9 will work with modules that worked on Drupal 8, so there are no major rewrites required. This means the current module version naming scheme (e.g. 8.x-1.0) doesn't make much sense any more. It will gradually be replaced with a semantic versioning scheme (e.g. 2.1.0) as new module releases are published.

If you are a site builder, the easiest way to ensure your site is compatible with Drupal 9 is to keep your contributed modules up to date. You can check the status of your contributed modules using Acquia's Drupal 9 Deprecation Status page.

Replace Deprecated API Usage

Of course, many Drupal sites include custom modules. In order to get your own modules ready, you need to:

  • replace any usage of deprecated APIs
  • specify a core version requirement that includes Drupal 9

To automatically find usage of deprecated APIs, there is the excellent Drupal Check tool by Matt Glaman. Follow the installation instructions, then run it against your custom module from the command line:

drupal-check web/modules/contrib/address

Almost all Drupal deprecations come with handy instructions on the newer alternative API to use, as well as a link to the change record that describes it in more detail. For example:

drupal_set_message() is deprecated in Drupal 8.5.0 and will be removed before Drupal 9.0.0. Use \Drupal\Core\Messenger\MessengerInterface::addMessage() instead. See

Reading the change notice, we just need to change:

drupal_set_message("example message");

to:

\Drupal::messenger()->addMessage("example message");

You can run drupal-check again and again until you have no more deprecation messages.
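As a crude first pass (or in between drupal-check runs), you can also grep your custom code for calls you already know are deprecated. This is only a sketch using a throwaway demo module; drupal-check's static analysis is far more thorough:

```shell
# Create a throwaway module file containing a known-deprecated call.
mkdir -p demo_module
cat > demo_module/demo.module <<'EOF'
<?php
drupal_set_message("example message");
EOF

# Grep for the deprecated call; each hit is a line to migrate to the
# messenger service.
grep -rn "drupal_set_message(" demo_module
```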

If you aren't comfortable with the command line, there is the excellent Upgrade Status module, which you can install to get a user-friendly report of what needs updating. There is also Drupal 8 Rector, which can automate some of this for you.

Specify Core Version Requirement

The last step is to specify your module's core compatibility. As per the change record, there is a new key to add to your module's info.yml file:

core_version_requirement: ^8 || ^9

This new key allows you to use Composer-style version constraints and tells Drupal your module is compatible with both Drupal 8 and 9.
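
For context, a minimal info.yml using this key might look like the following (the module name and description here are made up for illustration):

```yaml
# example_module.info.yml (illustrative only)
name: Example Module
type: module
description: 'Demonstrates the Drupal 8/9 compatibility key.'
package: Custom
core_version_requirement: ^8 || ^9
```

One caveat: the core_version_requirement key is only recognised from Drupal 8.7.7 onwards, so modules that still need to support older 8.x sites should keep the legacy core key alongside it.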


That's it! There aren't many steps required to get your site ready for Drupal 9, and the best news is you can do it now and be ready the day Drupal 9 is released.

Photo of Kim Pepper

Posted by Kim Pepper
Technical Director

Dated 19 March 2020

Mar 17 2020

Like many companies, PreviousNext didn't expect the Coronavirus crisis to hit so hard or so fast, nor did we anticipate the rapid effects on the global economy. However, we have always run our company on sustainable principles, so we feel well prepared to deal with what's coming over the next few months. We hope sharing our approach might help others weather the coming storm too!

First and foremost, the wellbeing of our team is our number one priority and our immediate steps were to ensure everyone is as safe as possible from contracting or spreading COVID-19, including:

  • While our team is already highly distributed with most working from home offices, we were quick to enforce a full work from home policy, including no in-person client meetings or unnecessary travel. This was communicated to clients, with video conferencing and chat tools becoming the standard form of interaction.
  • Making sure our team knew we were taking things seriously and reiterating official guidelines on personal hygiene and social distancing.
  • Noting that attending conferences and personal travel were likely to be affected well before official lockdowns started coming into force, and to plan ahead accordingly.
  • Understanding that many staff will have reduced availability due to school closures and having children at home.
  • Outlining clear provisions for staff to use personal and annual leave in the event they became ill themselves or needed to care for family members. We also made it clear that if leave allowances were exhausted due to COVID-19, leave credits would be extended until individual team members were able to return to work. The last thing we want is our team worrying that their personal finances or long term roles might be in jeopardy due to something completely outside their control.
  • Asking team members to privately notify us if anyone close to them becomes ill which may indicate their own health could be at risk in coming weeks if they've had exposure. This then allows us to forward plan potential contingencies for reduced team availability.

We also took a close look at the likely impact of an economic slowdown on the overall business, such as:

  • Identifying which clients may reduce budgets and starting conversations early with these clients around the likely impact to projects we're working on. This demonstrates that we understand things are likely to change and are willing to work with clients so that neither party is left in the lurch.
  • Working with clients to build larger backlogs of project work so that we can continue working if key people on the client side become unavailable to move a project forward.
  • Assessing what operational expenses can be reined in over the short and medium term to ensure our cash flow stays healthy. This includes evaluating current rental arrangements, subscription-based services that may not be critical to operations and other non-essential expenses.
  • Establishing a clear overview of our current project pipeline, what our break-even costs would be if projects are scaled back and how long we could maintain that state.
  • Understanding what trigger points we'd have to start drawing on our cash reserves to ride out a few slower months.

Another key consideration is our commitment to the Drupal open source project while the world moves into crisis mode. As we saw in the years following 2007's Global Financial Crisis, Drupal is well placed to thrive as a cost effective alternative to proprietary Content Management Systems and we expect similar things will occur this time around. Obviously, without a strong Drupal we don't have a strong business, so initiatives in this regard are:

  • Maintaining our open source contribution policy so that our team is consistently pushing code into the Drupal ecosystem. This is particularly important with the imminent release of Drupal 9. We would encourage other Drupal services companies to adopt similar contribution policies, especially if their teams are suddenly finding they have extra time on their hands due to the economic downturn.
  • Ensuring that the Drupal community remains strong and unified. There'll obviously be many cancellations or postponements of key conferences in coming months, but we can still collaborate closely through Drupal's issue queues and remote events, like live-streamed meetups.
  • While some initiatives are likely to be delayed, maintaining the continuity of operations for our local DrupalSouth committee and the global Drupal Association is of paramount importance. We can achieve this through donating both our time and funding as specific needs arise. Again, we'd urge other Drupal services companies to share this commitment.

This is the third major economic crisis I've lived through in my professional life, and while things will definitely get tough, they will definitely get better again in the long run. By taking pre-emptive steps now, maintaining the confidence of your team and clients and staying committed to Drupal's long term success, there will be light at the end of the tunnel!

Nov 19 2019

As one of the longest-running and largest Drupal events in the Asia-Pacific region, DrupalSouth is an opportunity for the greater community to come together and celebrate all things Drupal. In 2019, DrupalSouth will be making its way for the first time ever to Hobart, Tasmania: down under Down Under!

With our team distributed throughout Australia, PreviousNext use DrupalSouth as an opportunity to get everyone together in person each year, run a team off-site and dinner, and enjoy the conference together. We're also sponsoring and mentoring at the Sprint Day on Wednesday November 27, prior to the main conference commencing. PreviousNext staff are volunteering on the event organising team and our sister company, Skpr, is sponsoring Michelle Mannering's keynote presentation on Thursday November 28.

The conference will feature both local and international speakers, including twelve sessions by PreviousNext’s team who were selected to present this year. The event program committee had no details on the applicants throughout the selection process so all sessions were chosen based on topic and content alone, with PreviousNext set to present or be involved in the following sessions:

We find DrupalSouth is a perfect opportunity to engage socially, network and mix with our region's active Drupal contributors and community members. The conference will run over two days at the Hotel Grand Chancellor in Hobart, from 28-29 November, preceded on Wednesday 27 November by a Code Sprint for those who are keen to join in.

Photo of Lucy Vernon

Posted by Lucy Vernon

Dated 19 November 2019

Nov 14 2019

Image Styles Breadcrumb

Images on websites can be a huge pain when you are optimizing a site. We want our images to render as crisply as possible, but we also want our sites to load as fast as possible. Content creators will often ask "what size image should I upload?" and, with the thought of some tiny image being rendered pixelated out of control at full-screen width, we'll answer "as large as you've got". The content creator will then upload a 2 MB JPEG and the load time and network request size will increase dramatically.

Responsive images can be a decent solution for this, and a front end developer can achieve them in many ways. A popular approach is a <picture> element containing multiple <source> elements with srcset and media attributes, plus a default <img> tag.

I'll explain how we can do that in Drupal 8. 
The scenario I'm trying to set up in this example is a paragraph that references a media entity with an image field.


Enable the responsive images module from Drupal core.

  1. To enable the responsive image module, go to Admin > Configuration.
  2. Click the checkbox next to Responsive Image.
  3. Click Install.

This module may already be installed on your project, so just head to Admin > Configuration and ensure that the Responsive Image module is enabled.

Responsive Image Styles Config

Add / Confirm breakpoints

The default theme will already have a breakpoints YAML file. If you're using a custom theme you'll need to make sure you have a breakpoints YAML file for it. This should exist at themes/{theme-name}/{theme-name}.breakpoints.yml, where {theme-name}, is the name of your theme.
Create or open the file and configure your breakpoints. There should already be a few breakpoints in there, and they'll look something like this:

{theme-name}.small:
  label: small
  mediaQuery: "(min-width: 0px)"
  weight: 1
  multipliers:
    - 1x
    - 2x
{theme-name}.medium:
  label: medium
  mediaQuery: "(min-width: 768px)"
  weight: 2
  multipliers:
    - 1x
    - 2x
{theme-name}.large:
  label: large
  mediaQuery: "(min-width: 1024px)"
  weight: 3
  multipliers:
    - 1x
    - 2x

You can add as many breakpoints as you need to suit your requirements. The weight should go from 0 for the smallest breakpoint to the highest number for the largest breakpoint. The multipliers are used to provide crisper images for HD and retina displays.

Configure Image Styles (sizes)

Head to Admin > Config > Media > Image Styles and create a size for each breakpoint.

Configuring Image Styles UI

  1. Click Add image style.
  2. Give it an Image style name and click Create new style (e.g. Desktop 1x, Desktop 2x, Tablet 1x etc...).

    Create Image Style UI

  3. Select an effect e.g. Scale or Scale and crop.

    Edit Image Style UI

    Edit Image Style Effect Options

  4. Set a width (height is calculated automatically) or width and height when cropping.

    Image Style Effect UI

  5. When creating multiple styles, just use the breadcrumbs to get back to the Image styles listing.

    Image Styles Breadcrumb Item

When you have created all the sizes for your responsive format you can move on to the next step.

Create a responsive Image Style

Head to Admin > Config > Media > Responsive Image Styles

  1. Click Add responsive image style to create a new one.
  2. Give it a label (for example if it's for a paragraph type called profile_image then use that as the name)

    Add responsive image style UI

  3. Select your theme name from the Breakpoint Group

    Breakpoint Group Selection

  4. The breakpoints will load. Open the breakpoints that this image style will use and check the radio next to Select a single image style or use multiple.

    Configuring the breakpoint image style

  5. Select the image style from the Image style dropdown (these are the styles we created in the previous step).

    Image style selection UI

  6. Set the Fallback Image style (this will be used where the browser doesn't understand the <source> tags inside the picture element. It should be the most appropriate size to use if you could only pick one across all screen sizes)

    Fallback image style

Add a new view mode for media entities

Head to Admin > Structure > Display Modes > View Modes, click 'Add new view mode' and add your display mode. In this instance, we'll use 'Profile image' again.

Adding a view mode

Update the display of the image for the entity type

Head to Admin > Structure > Media Types > Image > Manage display

  1. On the default tab, click Custom display settings at the bottom, check the new 'Profile image' view mode, then click Save.

    Custom display settings

  2. Click on the tab that matches your new display type (in my example it's Profile image)
  3. On the Image fields row change the Label to Hidden and the Format to Responsive Image.

    Configuring the image format

  4. Click on the cog at the end of the row.

    Row configuration cog

  5. Under Responsive image style select your style.

    Format Configuration

  6. Select where the file should link to (or Nothing), then click Update.
  7. Click Save

Update your Paragraph type to use the new display format

Go to Structure > Paragraph Types > {type} > Manage Display

  1. Find the row with the field displaying your media entity and change the format to Rendered Entity
  2. Click the gear icon to configure the view mode by selecting your view mode from the list (in this instance profile image)

    Paragraph type display format

  3. Click Save

Testing Our Work

At this point you should be all set.

  1. Create an example page
  2. Select your paragraph to insert into the page
  3. Add an image
  4. Save the page and view it on the front end
  5. Inspect the image element and ensure that a <picture> element is rendered with <source> elements and a default <img>, and that when you resize the browser you see what you expect.
      <source srcset="/sites/default/files/styles/profile_image/public/2019-11/image.jpeg 1x" media="(min-width: 1024px)" type="image/jpeg">
      <source srcset="/sites/default/files/styles/profile_image_1024x1024_/public/2019-11/image.jpeg 1x" media="(min-width: 768px)" type="image/jpeg">
      <source srcset="/sites/default/files/styles/profile_image_512x512_/public/2019-11/image.jpeg 1x" media="(min-width: 320px)" type="image/jpeg">
      <img src="" alt="A profile Image" typeof="foaf:Image">
  6. To inspect further select the Network tab in your developer tools and filter by images. Resize the browser window and watch as new image sizes are loaded at your defined breakpoints.
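
If you'd rather script this check than eyeball devtools, a quick grep over saved page source works too. Here's a sketch; the markup below is a trimmed stand-in for your real page, which you might save with curl:

```shell
# Stand-in for a saved copy of the rendered page (fetch yours with curl).
cat > page.html <<'EOF'
<picture>
  <source srcset="/sites/default/files/styles/profile_image/public/2019-11/image.jpeg 1x" media="(min-width: 1024px)" type="image/jpeg">
  <img src="" alt="A profile Image" typeof="foaf:Image">
</picture>
EOF

# Count the <source> elements inside the picture markup.
grep -c "<source srcset=" page.html
```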

Photo of Nick Fletcher

Posted by Nick Fletcher
Front end Developer

Dated 15 November 2019


Thanks for this detailed step-by-step post. Much appreciated!

Thanks for putting together the instructions. This was helpful.
I noticed that in your screenshot for 'Add a new view mode for media entities', you are adding a 'Paragraph' view mode. Shouldn't it be a 'Media' view mode?
Adding a 'Media' view mode is working as explained.
Initially, I was confused about what to add looking at the screenshot.


This was the clearest guide on this topic yet, thanks. It seems complex and arbitrary for something that should be rather simple.


Nov 14 2019

PreviousNext builds open source digital platforms for large scale customers, primarily based on Drupal and hosted using Kubernetes, two of the world’s biggest open source projects. With our business reliant on the success of these open source projects, our company is committed to contributing where we can in relation to our relatively small size. We get a lot of questions about how we do this, so are happy to share our policies so that other organisations might adopt similar approaches.

We learned early on in the formation of PreviousNext that developers who are passionate and engaged in open source projects usually make great team members, so wanted to create a work environment where they could sustain this involvement. 

The first step was to determine how much billable work on client projects our developers needed to achieve in order for PreviousNext to be profitable and sustainable. The figure we settled on was 80%, or 32 billable hours of a full-time week, as the baseline. Team members then self-manage their availability to fulfil their billable hours and can direct up to 20% of their remaining paid availability to code contribution or other community volunteering activities.

From a project management perspective, our team members are not allowed to be scheduled on billable work more than 80% of their time, which is then factored into our Agile sprint planning and communicated to clients. If certain team members contribute more billable hours in a given week, this just accelerates how many tickets we can complete in a Sprint.

If individual team members aren’t involved or interested in contribution, we expect their billable hours rate to be higher in line with more traditional companies. We don’t mandate that team members use their 20% time for contribution, but find that the majority do due to the benefits it gives them outside their roles. 

These benefits include:

  • Learning and maintaining best-practice development skills based on peer review by other talented developers in the global community.
  • Developing leadership and communication skills with diverse and distributed co-contributors from many different cultures and backgrounds.
  • Staying close to and often being at the forefront of new initiatives in Drupal, whether it be as a core code contributor or maintaining key modules that get used by hundreds of thousands of people. For example, the Video Embed Field that Sam Becker co-maintains is used on 123,487 websites and has been downloaded a staggering 1,697,895 times at the time of publishing. That's some useful code!  
  • Developing close working relationships with many experienced and talented developers outside PreviousNext. In addition to providing mentoring and training for our team, these relationships pay dividends when we can open communication channels with people responsible for specific code within the Drupal ecosystem.
  • Building their own profiles within the community and being considered trusted developers in their own right by demonstrating a proven track record. After all, it's demonstrated work rather than the CV that matters most. This often leads to being selected to provide expert talks at conferences and obviously makes them highly desirable employees should they ever move on from PreviousNext.
  • If our team members do get selected as speakers at international Drupal events, PreviousNext funds their full attendance costs and treats their time away as normal paid hours.
  • Working on non-client work on issues that interest them, such as emerging technologies, proof of concepts, or just an itch they need to scratch. We never direct team members that they should be working on specific issues in their contribution time.

All of these individual benefits provide clear advantages to PreviousNext as a company, ensuring our team maintains an extremely high degree of experience and elevating our company’s profile through Drupal’s contribution credit system. This has resulted in PreviousNext being consistently ranked in the top 5 companies globally that contribute code to Drupal off the back of over 1,000 hours of annual code contribution.

In addition to this 20% contribution time, we also ensure that most new modules we author or patch during client projects are open sourced. Our clients are aware that billable time during sprints will go towards this and that they will also receive contribution credit as the sponsor of the contributions. The benefits to clients of this approach include:

  • Open sourced modules they use and contribute to will be maintained by many other people in the Drupal community. This ensures a higher degree of code stability and security and means that if PreviousNext ceases to be engaged the modules can continue to be maintained either by a new vendor, their internal team or the community at large.
  • Clients can point to their own contribution credits as evidence of being committed Drupal community supporters in their own right. This can be used as a key element in recruitment if they start hiring their own internal Drupal developers.

Beyond code contributions, PreviousNext provides paid time to volunteer on organising Drupal events, sit on community committees, run free training sessions and organise code sprints. This is then backed by our financial contributions to sponsoring events and the Drupal Association itself.

None of this is rocket science, but as a company reliant on open source software we view these contribution policies and initiatives as a key pillar in ensuring PreviousNext's market profile is maintained and the Drupal ecosystem for our business to operate in remains healthy. 

We're always happy to share insights into how your own organisation might adopt similar approaches, so please get in touch if you'd like to know more.


This is a brilliant contribution thanks Owen! And I argue that culture building is not something a rocket scientist can do.


Nov 13 2019

PreviousNext continue to be major contributors to the development and promotion of Drupal 8. As participants of the Drupal 8.8.0 Beta Testing Program, we thought it would be useful to document the steps we took to update one of our sites on Drupal 8.7 to the latest 8.8.0 beta.

Every site is different, so your mileage may vary, but it may save you some time.

Drupal 8.8 is a big release, with a number of new features added, and APIs deprecated to pave the way to a Drupal 9.0 release. Thankfully, the upgrade process was fairly straightforward in our case.

Upgrade PathAuto

The first step was to deal with the change record The Path Alias core subsystem has been moved to the "path_alias" module, which meant some classes were moved to different namespaces. To make things smoother, we installed the latest version of the Pathauto module and cleared the caches.

composer require drupal/pathauto:^1.6
drush cr

Core Dev Composer Package

We use the same developer tools for testing as Drupal core, and we want to switch to the new core composer packages, so first we remove the old one.

composer remove --dev webflo/drupal-core-require-dev

Update Patches

We sometimes need to patch core using cweagans/composer-patches. In the case of this site, we are using a patch from ckeditor_stylesheets cache busting: use system.css_js_query_string, which needed to be re-rolled for Drupal 8.8.x. We re-rolled the patch, then updated the link in the extra/patches section.

Update Drupal Core and Friends

On our first attempt, Composer could not install due to a version conflict with some Symfony packages (symfony/finder, symfony/filesystem and symfony/debug). These are transitive dependencies (we don't require them explicitly). Our solution was to temporarily require them explicitly, at versions Drupal core is compatible with, then remove them afterwards.

First require new Drupal core and dependencies:

composer require --update-with-dependencies \
  drupal/core:^8.8@beta \
  symfony/finder:^3.4 \
  symfony/filesystem:^3.4 \
  symfony/debug:^3.4

Second, require new core-dev package and dependencies:

composer require --dev --update-with-dependencies \
  drupal/core-dev:^8.8@beta

Lastly, remove the temporary required dependencies:

composer remove -n \
  symfony/finder \
  symfony/filesystem \
  symfony/debug

Update the Database and Export Config

Now that our code is updated, we need to update the database schema, then re-export our config. We use drush_cmi_tools, so your commands may be different, e.g. just a drush config-export instead of drush cexy.

drush updb
drush cr
drush cexy


We also need to update our settings.php file, now that the sync directory is defined in $settings and not $config_directories.

This is a trivial change from:

$config_directories['sync'] = 'foo/bar';

to:

$settings['config_sync_directory'] = 'foo/bar';

We also need to move the temporary files directory from config to settings, from:

$config['system.file']['path']['temporary'] = 'foo/bar';

to:

$settings['file_temp_path'] = 'foo/bar';

Final Touches

In order to make sure our code is compatible with Drupal 9, we check for any custom code that is using deprecated APIs using the excellent PHPStan and Matt Glaman's mglaman/phpstan-drupal. (Alternatively you can use Drupal Check.)

We were using an older version that was incompatible with "nette/bootstrap": ">=3", so we needed to remove that from the conflict section and do the remove/require dance once again.

composer remove --dev \
  phpstan/phpstan-deprecation-rules

composer require --dev --update-with-dependencies \
  phpstan/phpstan-deprecation-rules:^0.11.2

And that's it! Altogether not too painful once the composer dependencies were all sorted out. As we are testing the beta, some of these issues may be addressed in future betas and RCs.

I hope you found this useful! Got a better solution? Let us know in the comments!

Update: Added additional settings changes.

Photo of Kim Pepper

Posted by Kim Pepper
Technical Director

Dated 13 November 2019


This may be relatively simple for a major development company, but it would not be easy for a more casual developer with a few small sites. Just recognizing what needs to be done is non-trivial. I am just playing with a D7 to D8 migrate at this point and it looks like I should delay until 8.8.0 to avoid some of the issues. Or will some of these issues still arise on a clean install?

Drupal 8.8.0 was released today - and yes, now would probably be a better time to start your D7 to D8 migrate. Just in general, it might not be bad to wait for an 8.8.1, just in case there are any other little clean ups that go in, of course.

Thanks for this post, it helped me through the update process by addressing important issues. What I did to get the missing dependencies that prohibited the update to 8.8.0 was running "composer update" instead of your suggestion in "Update Drupal Core and Friends". It just worked; every dependency was updated and so was core. In my case an older version of typo3/phar-stream-wrapper was the problem.

Thanks for the documentation! This was very helpful for such a tricky update.

Thanks for this. I agree with Walt that Drupal updates are not easy for the casual developer, so we do appreciate the time taken to write and publish these articles. The 8.3 to 8.4 upgrade was a disaster for my project. With the help of this article, the official d.o 'special considerations' instructions and another article at , I was able to move my drupal ecommerce project from 8.7 to 8.8 with relative ease. Still, upgrading drupal feels like a leap in the dark. There is no 'one size fits all' instruction. Backup everything, then be brave!

I have used a single command `composer update drupal/core "symfony/*" --with-dependencies` to update Drupal Core and Friends.


Nov 12 2019

Since co-founding PreviousNext in 2009 with Kim Pepper, our company has put a lot of focus into supporting the Drupal open source project and community at a regional and global level.

This has included a number of key initiatives, including:

  • Providing our team with up to 20% of their paid working hours to contribute code to Drupal core software and contributed modules. This has seen PreviousNext consistently rank in the Top 5 companies contributing code to Drupal at a global level, with several of our individual team members in the Top 100 contributing developers. 
  • Supporting our team to contribute time to voluntary groups that sustain the Drupal community, such as Drupal’s Security Team, running free Drupal training days and organising Drupal events.
  • Donating funds to the Drupal Association's supporting partner program, global initiatives like Promote Drupal, regional conferences and local meetups.
  • Funding our team to travel and speak or participate in regional and global Drupal conferences.

This support plays a key role in PreviousNext’s ability to attract and retain the best Drupal talent, facilitates trusted relationships with key members of the global Drupal community and maintains our reputation as Drupal experts in the eyes of prospective clients. In other words, strong support for Drupal pays dividends to maintain PreviousNext as a sustainable company.

After a decade of leading PreviousNext myself, long-term colleague Jason Coghlan took the reins as Managing Director in late 2018. Jason is responsible for all of PreviousNext’s operations and client engagements, which he manages in concert with our internal Leadership and Delivery teams. My ongoing role is to guide PreviousNext’s strategy, marketing and finances as Chair of our Executive Team, paired with enhanced engagement with the Drupal community.

The first initiative I’ve focused on in 2019 has been the formation of the DrupalSouth Steering Committee. DrupalSouth has been running as an annual conference in Australia and New Zealand since 2008 but had always been reliant on ad-hoc volunteers to take on the significant work to organise and run each event. The Steering Committee’s objective is to provide ongoing consistency for the annual conference whilst spearheading other initiatives that support and grow Drupal’s community and commercial ecosystem in the region. We’ll be presenting our initial ideas at DrupalSouth in Hobart in late November.

I’m also honoured to have been appointed to the global Drupal Association Board of Directors and just returned from my first board retreat before DrupalCon Amsterdam. The board works alongside the new Drupal Association Executive Director, Heather Rocker, on overall strategy and direction for her team of almost 20 staff to implement. I’ve been asked to chair the board’s Revenue Committee that oversees how the DA is funded through event attendance, sponsorships and other sources, and to sit on the Strategic Planning Committee that will define where the association's focus can be best directed. My initial term will run until 2022 with a frequent presence at DrupalCon North America and Europe in coming years.


Drupal Association Board & Staff at the Amsterdam retreat

Whilst in Amsterdam, I also sat in on round table discussions with other local Drupal associations from around the world, sharing ideas about how we can scale community engagement whilst leveraging common approaches and resources. A Supporting Partners round table focused more on the needs of Drupal services vendors and large users and a CEO dinner was a great insight into the state of Drupal businesses around the world. It was inspiring to see how professionally organised the global Splash Awards were and to understand how we might bring the initiative to our local region to recognise world-class projects being developed here. To cap things off, I had a talk accepted where I could share some of PreviousNext's experience winning and retaining long term clients - essentially all the things I wish someone had told me a decade ago!

With the upcoming release of Drupal 9 in mid 2020, there’s a high degree of optimism and confidence around Drupal’s immediate future. The software is a clear choice for enterprise and large organisations, Drupal services businesses are doing well, and there’s a huge number of fresh and enthusiastic members in our community. While there are some clear challenges ahead, I’m excited to be able to play a role in helping solve them at a global and regional level.

If you ever want to connect with me to discuss how I can help with your own Drupal community or business initiatives, feel free to get in touch via or Drupal Slack.

Nov 06 2019

Skpr provides a compelling command line workflow for developers.

In this blog post we will be demonstrating Skpr by going through the fundamental commands: package, deploy and config.


Modern applications require a set of steps to prepare them for deployment. These steps might include:

  • Installing dependencies, e.g. with Composer
  • Building the theme, e.g. with Gulp
  • Installing extra packages

The outcome of this preparation then needs to be stored so it can be deployed onto the platform. This process is known as packaging and is accomplished in Skpr by running the command:

skpr package <version>

As you can see in the diagram below, this command does a lot of heavy lifting. It not only compiles your code, it also splits the application into individually scalable components and pushes them to the Skpr platform, ready for deployment.

Diagram describing that the package command builds 3 artefacts. Nginx, FPM and CLI.

Also of note, this command can be run by developers and automation alike. Typically this command would be run as part of a pipeline in conjunction with the deploy command for a continuous deployment workflow.
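For illustration, the package and deploy commands can be chained in a CI job. This CircleCI-style sketch is hypothetical: the image name, environment name and job layout are assumptions, not Skpr documentation.

```yaml
# Hypothetical CircleCI-style job chaining package and deploy.
version: 2.1
jobs:
  deploy:
    docker:
      - image: skpr/cli:latest   # assumed image; use whatever ships the skpr CLI
    steps:
      - checkout
      # Package the application, versioned by the commit being built.
      - run: skpr package ${CIRCLE_SHA1}
      # Deploy that version to the dev environment.
      - run: skpr deploy dev ${CIRCLE_SHA1}
```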


Now that our application is packaged, let’s deploy it!

Deploying the application is as simple as running the below command. Seriously, it’s that easy.

skpr deploy <environment> <version>

While simple on the surface, Skpr is actually orchestrating a catalog of cloud managed services.

  • CDN / Cache
  • Certificates
  • Database
  • Storage
  • Search
  • SMTP

Diagram of how the Skpr deploy command interacts with the API and AWS Cloud Services.

These services are then exposed to the application through Skpr’s configuration system.


The Twelve-Factor app manifesto calls for strict separation of configuration from code. This approach provides several advantages:

  • Sensitive values such as API tokens and private keys will not be leaked if the codebase is ever exposed.
  • There is no need for a switch statement for each environment defining various variables for dev, staging, etc.
  • Feature toggles can be used to dynamically enable functionality without a deployment.
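In practice this means configuration is injected at runtime rather than committed to the codebase. A minimal sketch of the idea in a Drupal settings.php follows; the environment variable names are illustrative assumptions, not Skpr's actual keys.

```php
<?php

// Connection details come from the environment, so the codebase contains no
// secrets and no per-environment switch statements.
$databases['default']['default'] = [
  'driver' => 'mysql',
  'database' => getenv('DATABASE_NAME'),
  'username' => getenv('DATABASE_USER'),
  'password' => getenv('DATABASE_PASS'),
  'host' => getenv('DATABASE_HOST'),
];
```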

Skpr out of the box will provide configuration for:

  • Database connection details
  • SMTP credentials
  • File storage locations, e.g. public / private / temporary

A terminal showing the output from Skpr config list.

As a developer you can also add your own custom configuration, e.g. API keys for an integration.

In this example we are adding an API key for Mailchimp and flagging it as a secret to prevent the key from being accidentally exposed (see the [secret] in the command line image above).

skpr config set --secret dev mailchimp.key xxxxxxxxxxxxxxxxxxx

Details on how to configure your application to consume these configuration key/values can be found here.


Skpr provides a simple set of commands for developers to "get the job done".

If you would like to dive into more of the Skpr commands check out our documentation site, or contact us for a demo via the website.

Photo of Nick Schuch

Posted by Nick Schuch
Sys Ops Lead

Dated 6 November 2019

Nov 05 2019

On a client project we were using a custom Drupal content entity to model some lightweight reusable content.

The content entity was originally single use and did not support bundles (e.g. node entities have node-type bundles).

As the project evolved, we needed to add bundle support for the custom entity-type, despite it already being in production use.

Read on to find out how we achieved this.

In this example, let's call the content entity a 'set' and the bundles a 'set type'.

Create the bundle configuration entity

As we wanted this content entity to support adding new bundles via the UI, a configuration entity made sense, allowing site-builders to create the various bundles as required. So we created a new configuration entity called 'set type' as per the examples, although we used a route provider instead of a routing file. We made sure to add the bundle_of annotation to the config entity.

bundle_of = "set",

Updating the content entity's annotation and fields

Once this was done, the next step was to update the content entity's annotation. We added the 'bundle' entity key and the 'bundle_entity_type' annotation:

*   bundle_entity_type = "set_type",
*   entity_keys = {
*     "id" = "id",
*     "label" = "name",
*     "uuid" = "uuid",
*     "uid" = "user_id",
*     "bundle" = "type",
*     "langcode" = "langcode",
*   },

We didn't need to add a new field in our baseFieldDefinitions() implementation for our content entity, because we just deferred to the parent implementation. But we made sure to match up the description, label etc. as desired, and that we called setInitialValue. As we're planning to add a new column to the entity's tables in the database, we need to populate the type column for existing records. Now, with entities that don't support bundles, Drupal defaults to the entity ID for the bundle, e.g. for the 'user' entity the bundle is always 'user', because User entities don't support bundles. So we knew our existing 'set' entities would have to have a bundle of 'set' too, but our new ones could be whatever we liked. This is why our field definition for 'type' had to set an initial value of 'set'.
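The code snippet itself appears to have been lost from this page. Reconstructed from the description above, it likely looked something like this; the label and description strings are assumptions:

```php
public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
  // The parent implementation creates the 'type' bundle field for us based on
  // the 'bundle' entity key; we tweak its label and description, and set an
  // initial value so existing records are populated with the 'set' bundle.
  $fields = parent::baseFieldDefinitions($entity_type);
  $fields['type']
    ->setLabel(t('Set type'))
    ->setDescription(t('The set type.'))
    ->setInitialValue('set');
  return $fields;
}
```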


Update hooks to get everything in place

Since Drupal 8.7, support for automatic entity updates has been removed. So whilst adding the field and entity keys and updating the annotation works for a new install (hint: there won't be one), it doesn't help our existing production and QA sites. We need an update hook to bring our existing entity-type and field definitions into sync with the code versions, which also takes care of the required database table changes.

So the steps we need to do here are:

  1. install the config entity type
  2. create a new instance of it for the existing entities
  3. add the new field definition for the type field to the content entity
  4. update the content entity definition

Installing the config entity type

The docs for installing a new entity type make it clear what we need to do. Our code ended up something like this:

use Drupal\Core\Config\Entity\ConfigEntityType;
use Drupal\Core\StringTranslation\TranslatableMarkup;

/**
 * Adds the set type.
 */
function your_module_update_8001() {
  \Drupal::entityDefinitionUpdateManager()
    ->installEntityType(new ConfigEntityType([
      'id' => 'set_type',
      'label' => new TranslatableMarkup('Set type'),
      'label_collection' => new TranslatableMarkup('Set types'),
      'label_singular' => new TranslatableMarkup('set type'),
      'label_plural' => new TranslatableMarkup('set types'),
      'label_count' => [
        'singular' => '@count set type',
        'plural' => '@count set types',
      ],
      'handlers' => [
        'list_builder' => 'Drupal\your_module\SetTypeListBuilder',
        'form' => [
          'default' => 'Drupal\your_module\Form\SetTypeForm',
          'delete' => 'Drupal\Core\Entity\EntityDeleteForm',
        ],
        'route_provider' => [
          'html' => 'Drupal\Core\Entity\Routing\AdminHtmlRouteProvider',
        ],
      ],
      'admin_permission' => 'administer set type entities',
      'entity_keys' => [
        'id' => 'id',
        'label' => 'name',
      ],
      'links' => [
        'add-form' => '/admin/structure/sets/add',
        'delete-form' => '/admin/structure/sets/manage/{set_type}/delete',
        'reset-form' => '/admin/structure/sets/manage/{set_type}/reset',
        'overview-form' => '/admin/structure/sets/manage/{set_type}/overview',
        'edit-form' => '/admin/structure/sets/manage/{set_type}',
        'collection' => '/admin/structure/sets',
      ],
      'config_export' => [
        'id',
        'name',
        'description',
      ],
    ]));
}
Creating the first bundle

In our first update hook we installed the config entity; now we need to create one for the existing entities. Because bundle-less entities use the entity type ID as the bundle, we make sure our new type has the same ID as the entity-type.

/**
 * Adds a new config entity for the default set type.
 */
function your_module_update_8002() {
  $type = SetType::create([
    'id' => 'set',
    'name' => 'Set',
    'description' => 'Provides set panes',
  ]);
  $type->save();
}

Adding the new field definition and updating the entity definition

The documentation for adding a new field definition is again very useful here, so we follow along to install our new field definition. The documentation for updating an entity type similarly covers the last step, so our final update hook looks like this:

/**
 * Updates definition for set entity.
 */
function your_module_update_8003() {
  $updates = \Drupal::entityDefinitionUpdateManager();
  $definition = BaseFieldDefinition::create('entity_reference')
    ->setLabel('Set type')
    ->setSetting('target_type', 'set_type')
    ->setInitialValue('set');
  $updates->installFieldStorageDefinition('type', 'set', 'your_module', $definition);
  $type = $updates->getEntityType('set');
  $keys = $type->getKeys();
  $keys['bundle'] = 'type';
  $type->set('entity_keys', $keys);
  $type->set('bundle_entity_type', 'set_type');
  $updates->updateEntityType($type);
}
And that's it, we're done.

Wrapping up

Kudos to those who created the documentation for this, as well as my colleagues Sam Becker, Jibran Ijaz and Daniel Phin who helped me along the way. Hopefully, you find this post useful if you're ever in the same boat.

Photo of Lee Rowlands

Posted by Lee Rowlands
Senior Drupal Developer

Dated 5 November 2019


Hello. Thanks for this blog post. Very useful to have real life examples :-) Just may be you have a typo (last lines of the post) : "pane_set_type" shouldn't be "set_type" ?

thanks, fixed

Hi Lee,

Thanks for this blog post. Suppose I want to update a base field property, e.g. its label or description, rather than its type. How would I do that?


Oct 21 2019

Hi Lee,

When I try to use container now it's started getting error when trying to install Drush.

I got following error when I ran composer require drush/drush, It seems like it's not compatible with symfony/yaml^3.4

Problem 1
- drush/drush 10.0.2 requires symfony/yaml ^3.4 || ^4.0 -> satisfiable by symfony/yaml[3.4.x-dev, 4.0.x-dev, 4.1.x-dev, 4.2.x-dev, 4.3.x-dev, 4.4.x-dev, v3.4.0, v3.4.0-BETA1, v3.4.0-BETA2, v3.4.0-BETA3, v3.4.0-BETA4, v3.4.0-RC1, v3.4.0-RC2, v3.4.1, v3.4.10, v3.4.11, v3.4.12, v3.4.13, v3.4.14, v3.4.15, v3.4.16, v3.4.17, v3.4.18, v3.4.19, v3.4.2, v3.4.20, v3.4.21, v3.4.22, v3.4.23, v3.4.24, v3.4.25, v3.4.26, v3.4.27, v3.4.28, v3.4.29, v3.4.3, v3.4.30, v3.4.31, v3.4.32, v3.4.33, v3.4.34, v3.4.35, v3.4.4, v3.4.5, v3.4.6, v3.4.7, v3.4.8, v3.4.9, v4.0.0, v4.0.0-BETA1, v4.0.0-BETA2, v4.0.0-BETA3, v4.0.0-BETA4, v4.0.0-RC1, v4.0.0-RC2, v4.0.1, v4.0.10, v4.0.11, v4.0.12, v4.0.13, v4.0.14, v4.0.15, v4.0.2, v4.0.3, v4.0.4, v4.0.5, v4.0.6, v4.0.7, v4.0.8, v4.0.9, v4.1.0, v4.1.0-BETA1, v4.1.0-BETA2, v4.1.0-BETA3, v4.1.1, v4.1.10, v4.1.11, v4.1.12, v4.1.2, v4.1.3, v4.1.4, v4.1.5, v4.1.6, v4.1.7, v4.1.8, v4.1.9, v4.2.0, v4.2.0-BETA1, v4.2.0-BETA2, v4.2.0-RC1, v4.2.1, v4.2.10, v4.2.11, v4.2.12, v4.2.2, v4.2.3, v4.2.4, v4.2.5, v4.2.6, v4.2.7, v4.2.8, v4.2.9, v4.3.0, v4.3.0-BETA1, v4.3.0-BETA2, v4.3.0-RC1, v4.3.1, v4.3.2, v4.3.3, v4.3.4, v4.3.5, v4.3.6, v4.3.7, v4.3.8, v4.4.0-BETA1, v4.4.0-BETA2, v4.4.0-RC1] but these conflict with your requirements or minimum-stability.
- drush/drush 10.0.1 requires symfony/yaml ^3.4 || ^4.0 -> satisfiable by the same list of symfony/yaml versions as above, but these conflict with your requirements or minimum-stability.
- drush/drush 10.0.0 requires symfony/yaml ^3.4 || ^4.0 -> satisfiable by the same list of symfony/yaml versions as above, but these conflict with your requirements or minimum-stability.
- Installation request for drush/drush ^10.0 -> satisfiable by drush/drush[10.0.0, 10.0.1, 10.0.2].

Oct 08 2019

Skpr - pronounced Skipper - is a cloud hosting platform specifically designed to maximise the productivity of development teams by giving them full control right from the command line.

During our consulting engagements with large organisations, we recognised a clear trend: they were moving away from narrow, single-site hosting services and building bespoke platforms on top of Kubernetes to support their multi-site, multi-technology initiatives.

Back in 2016 we had this exact need for hosting our entire portfolio of sites. Throughout this journey we found that providing developers with a simple Command Line Interface (CLI) has led to huge improvements in our team's efficiency and the overall quality of our products.

So, today we’re announcing the public launch of our hosting platform, Skpr. The platform for teams who want a simple command line tool, backed by a range of industry-leading services and supported by our own team of experts.

Why Skpr is different

Many hosting platforms provide a web interface where deployments can be dragged-and-dropped between environments.

While these solutions are more effective for non-developers, they fall short on integration and extensibility within the workflow of the developers actually doing the job. Having a Command Line Interface (CLI) means that not only do we provide the same level of control, we also provide the flexibility to extend those workflows.

  • Scripts - Having a CLI means that Skpr can integrate into existing automation, along with CI tools such as CircleCI.
  • Documentation - Complex tasks carried out via a GUI are very difficult to document. CLIs mean you spend less time describing a user interface and more time documenting the actual process.

Control on Command

With a few commands, developers have the control to package, deploy, configure and monitor their services right from the command line.

And while we want to provide a platform that's powerful, reliable and secure, we're passionate about making it easy-to-use as well.

To find out more, visit

Sep 24 2019

Drupal 8.8.0 will be released in December 2019 and the upcoming changes in JSON:API module codebase introduce huge performance benefits.

by Jibran Ijaz / 24 September 2019

Here are three things that prove it:

1. Recent patches committed to JSON:API

The first patch committed to JSON:API in Drupal 8.8 is a simple issue which makes sure that if you are requesting information about related entities, the resource type information for that relationship is statically cached, so that when multiple entities of the same entity type and bundle are requested it doesn't have to collect the resource type information for the related entities over and over again. The second adds a cache layer to store normalized entities, so that if we need the normalized version of an entity we can just get it from the cache instead of normalizing the whole entity again, which can be a very expensive process. The third introduces a new cache backend to store the JSON:API resource type information that was previously held in the static cache. This means that instead of creating JSON:API resource types on every request, we create them just once after a cache clear.

2. Profiling using

I was able to do some profiling to compare the JSON:API core module in Drupal 8.7 versus 8.8. Here are the initial conditions:

  • PHP 7.3
  • JSON:API version 8.7
  • No JSON:API Extras
  • Page Cache module disabled.
  • Dynamic Page Cache module is set to cache.backend.null, which forces a 100% cache miss rate.
  • Cleared all caches.
  • Visit user login page to rebuild the container and essential services.
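For reference, routing the Dynamic Page Cache bin to the null backend is standard Drupal and can be done from settings.php:

```php
<?php

// Force a 100% miss rate by backing the dynamic_page_cache bin with the
// null cache backend.
$settings['cache']['bins']['dynamic_page_cache'] = 'cache.backend.null';
```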

Case I

Visit the first JSON:API endpoint which loads 50 nodes with 8 fields, 2 computed fields, 2 filters, and sorted by title.

JSON:API 8.7 - URL 1

Case II

Visit the second JSON:API endpoint which loads 2 nodes with 45 paragraph fields, where each paragraph field has 6 fields and 2 computed fields, with 1 filter.

JSON:API 8.7 - URL 2

Then update the JSON:API to 8.8, all other initial conditions were the same as before.

Case I

Visit the first JSON:API endpoint which loads 50 nodes with 8 fields, 2 computed fields, 2 filters, and sorted by title.

JSON:API 8.8 - URL 1

Case II

Visit the second JSON:API endpoint which loads 2 nodes with 45 paragraph fields, each paragraph field with 6 fields and 2 computed fields, with 1 filter.

JSON:API 8.8 - URL 2


Case I

The comparison shows 79% improvement in response time.

URL1 comparison from JSON:API 8.7 to JSON:API 8.8

There are 39 more SQL queries on JSON:API in Drupal 8.8.

A detailed look at those queries shows additional calls to the new cache bin added by JSON:API, but the most important difference was 50 fewer queries to the url_alias table.

URL1 query comparison from JSON:API 8.7 to JSON:API 8.8

The function-call profile also shows a reduced number of calls to the Entity API and normalizers.

URL1 function comparison from JSON:API 8.7 to JSON:API 8.8

Case II

The comparison shows 66% improvement in response time.

URL2 comparison from JSON:API 8.7 to JSON:API 8.8

There are 35 more SQL queries on JSON:API in Drupal 8.8.

These are the same additional calls to the new cache bin.

URL2 query comparison from JSON:API 8.7 to JSON:API 8.8

Function calls also show the reduced number of function calls to Entity API and normalizers — same as before.

URL2 function comparison from JSON:API 8.7 to JSON:API 8.8

I ran the same scenarios with redis cache backends instead of the default database backends. The results show the same kind of improvements.

3. Raw response comparison:

What matters is how this all plays out on the website.

JSONAPI:8.7 first page load on cold cache


JSONAPI:8.8 first page load on cold cache


         Before     After     Improvement
URL1     2.6 sec    1.3 sec   2x faster
URL2     4.5 sec    1.8 sec   2.7x faster
URL3     7.7 sec    2.5 sec   3.1x faster
URL4     7.5 sec    2.4 sec   3.1x faster
URL5     7.2 sec    2.5 sec   2.9x faster
Overall  10.3 sec   3.8 sec   2.7x faster


In short, JSON:API in Drupal 8.8 is going to be significantly faster than its predecessor!

Improving performance like this takes enormous effort, and this was a community accomplishment. Special thanks to @ndobromirov, @kristiaanvandeneynde, @itsekhmistro, and last but not least the hardworking maintainers of the JSON:API module, @e0ipso, @gabesullice, and @Wim Leers; without their work, support and guidance this would not have been possible. Please give them a shoutout on Twitter or come say ‘hi’ in the Drupal Slack #contenta channel. If you are interested in JSON:API and its performance then please feel free to help out at

Thanks to @Wim Leers for feedback on this post!


jsonapi, Performance Optimisation, Drupal Modules
Sep 23 2019

It's a Monday morning and you push your first bit of code for the week. Suddenly all your Javascript tests start failing with crazy errors you've never seen before! And it's not just happening on one project! This post will hopefully help you track down the fix to the Bad Message 400 errors plaguing WebDriver.

Here at PreviousNext, we have automated processes to ensure our PHP images are updated on a weekly basis. On September 22nd 2019, that update included a version bump to the curl library from 7.65.1 to 7.66.0. This had a cascading effect which resulted in builds across all of our projects failing javascript tests running against selenium/standalone-chrome containers.

The errors looked something like this:

WebDriver\Exception\CurlExec: Webdriver http error: 400, payload :<h1>Bad Message 400</h1><pre>reason: Bad Content-Length</pre>

We were able to compare an old version of the PHP image (from a week ago) and track down that version change in CURL. But why was that failing? We didn't want to just go about pinning curl back to the old version and dusting our hands off.

Let's dive into the void (stacktrace)

WebDriver\Exception\CurlExec: Webdriver http error: 400, payload :<h1>Bad Message 400</h1><pre>reason: Bad Content-Length</pre>


When inspecting the code through the above trace, we found that the instaclick/php-webdriver library was responsible for issuing the actual cURL command and was throwing the exception.

Op to the Rescue

When looking through the recent commits of the library, Nick Schuch noticed a suspicious commit that sounded a bit fishy. Sure enough, manually applying those changes got all the tests green again!

But how do we fix it for good? That's where it gets a bit tricky due to some composer constraints (as per usual).

Unfortunately the instaclick/php-webdriver library's HEAD is quite far aHEAD of the latest stable release (1.4.5), and we aren't able to simply bump to dev in our composer file due to behat/mink-selenium2-driver (a Drupal core dev requirement) constraining us to 1.x.

The Fix

The easiest approach for now is to commit a patch file to your repository and apply it manually until the maintainer tags a new stable release.

First download a custom patch file I've prepared against 1.4.5:

wget -O instaclick-curl-fix.patch

Then patch the library (using cweagans/composer-patches) with the new patch file by adding the following to the patches key in your composer.json:

"instaclick/php-webdriver": { "fix cURL POST": "instaclick-curl-fix.patch" }

Then simply run composer update instaclick/php-webdriver
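Putting it together, the relevant composer.json section ends up looking something like this (cweagans/composer-patches reads patch definitions from the extra key):

```json
{
    "extra": {
        "patches": {
            "instaclick/php-webdriver": {
                "fix cURL POST": "instaclick-curl-fix.patch"
            }
        }
    }
}
```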

Photo of Adam Bramley

Posted by Adam Bramley
Senior Drupal Developer

Dated 23 September 2019


Sep 13 2019

One of the increasingly popular architectural paradigms that Drupal has been seen as a leader in is the concept of a single Drupal software package that can be spun up to power networks of websites with varying degrees of commonality. This is usually driven by the ambitious goal of being able to code and configure Drupal once and then leverage that effort as either an entire platform or a foundation for many "networked" sites.

Beginning down the path of starting a project like this is complex, and unfortunately isn't helped by some of Drupal's (confusingly named) features which describe aspects of reusability but aren't in themselves a fully fledged approach to architecting such a network. In addition to that, there are many misconceptions about Drupal's capabilities and affordances when it comes to building such networks.

In order to try and expose some of the discovery and decision making process behind starting an ambitious Drupal network project, the following is a non-exhaustive list of popular architectural paradigms that exist, evaluated on the following axes:

  • Up-front investment: the up-front cost of starting a network of sites.
  • Per-unit investment: the cost of introducing a new site to the network.
  • Flexibility: the ability to customise and create bespoke experiences within each network site.
  • Platform maintainability: the ability to iterate and evolve the network as a whole.

As with all complex projects, there are a large number of requirements and constraints which factor into technical decision making, so these approaches are a broad view of the landscape of Drupal's capabilities.

Models of networked sites


Starter-kit

A starter-kit consists of creating a Drupal distribution or install profile, catering to as much common functionality as possible across the network in an initial development phase, and then allowing each new website to make any additional required customisations as needed. These customisations may consist of writing additional code, enabling new dependencies and modifying the configuration shipped with the initial distribution.

For each individual site, this model affords the most flexibility. By allowing each site to evolve independently any new requirements or features perceived as bespoke can be implemented and deployed without making consideration to the starter-kit itself or other websites within the network.

The major drawback of this approach is being able to maintain and evolve the network of sites as a whole. Each new site in the network creates a new deployment with its own configuration, dependencies and code, meaning new features and bug fixes can't be deployed across the whole network without specific individual effort and conflict resolution for each site. In practice, once a site is live under this model, it can effectively be considered a siloed project without a significant relationship to other sites in the network.

How feature-rich an initial starter-kit is largely depends on the project. For example, early versions of aGov 8, the starter-kit distribution PreviousNext built and maintained for Australian government organisations, were intentionally fairly rudimentary in the number of content types they shipped with. The goal was a foundation to launch you into best practices, without being overly prescriptive. When future iterations of aGov were released that baked in Drupal 8's new media capabilities, it was not possible to deploy this iteration to all existing installations.

In a similar vein, I would classify govCMS8, the Australian government's own Drupal distribution, under this same model. By default, the distribution ships with a lot more features and a lot more ready-to-go configuration than most starter-kits; however, both SaaS and PaaS deployments of govCMS allow a wide scope of deep structural configuration changes to Drupal, which essentially sets up each project to function as a standalone unit after the initial build.


Product

Another less widespread approach is the product model. Under this model, a Drupal distribution is leveraged as the full feature set for all sites in the network, and all sites make use of the initial and ongoing development roadmap of the product. This approach is arguably less flexible, since each individual site doesn't have unfettered access to extend and manipulate the platform.

The major advantage of this approach is that a single team can scale their ongoing enhancement and maintenance efforts to the entire network of sites, regardless of the number of sites in the network.

Under the product model, since the feature set running on each site is a known and strictly defined set of configuration and dependencies, a team could feasibly migrate hundreds of sites to using new features of Drupal by working on a single software product. All sites would be the target of all new evolutions of the platform and benefit from its ongoing maintenance. Evolutions of the platform would not strictly be limited to features, but also include updating dependencies or moving to new major versions of Drupal.

An example of a project PreviousNext delivered under this model was a video content platform for a group of media organisations. Each organisation could leverage the product to spin up a website to share their videos and engage with their audience. New features were added regularly and at its height, 26 different websites serving vastly different audiences of users would evolve together. One of the challenges of maintaining a product distribution is the governance around flexibility and change. While it is tempting to allow each site to develop its own product roadmap, when a site required its own bespoke features, the following process was applied:

  • The site owner raises a request to change their website: "please replace the hero image on the homepage with a slideshow of images".
  • The team evaluates the request and places it on the product roadmap.
  • Instead of replacing all hero images with slideshows, the feature is developed as an optional choice for content editors: you may either upload slides or a hero image.
  • The feature would be built, tested and deployed to all sites in the network.
  • The site owner is then able to upload slides and all other site owners in the network have the same capability.

This approach certainly takes a level of control and focused organisational effort to accomplish; however, leveraged properly, it can have significant payoffs for the maintainability of a network of sites as a whole. Examples of product-based distributions in the Open Source Drupal ecosystem are Open Social and Drupal Commons.

Federated back-ends

Another approach to building out a network of sites is the notion of a federated back-end. This dovetails with terms like "service oriented architecture", "multi-tenancy" or "content hub", where a single deployment and instance of Drupal is leveraged as a large repository of content for multiple web front-ends.

Under this model, instead of the boundary between users and content being defined by different databases and deployments of Drupal, they must instead be implemented in the application layer. That is, Drupal itself is customised and tailored to meet the access needs of the organisations sharing the same Drupal site. While this is certainly additional work and complexity, if a single group of content authors is responsible for content across the whole network, it can be advantageous to lower these barriers. Maintenance for the content hub is also fairly light touch under this model, since only one installation needs to be updated and maintained.

This pattern also intersects with the "product" style of application. Since all web properties are powered by the same underlying application, they closely share a feature set and product roadmap. While this model is also often deployed in conjunction with a decoupled front-end, Drupal sites are capable of delivering integrated front-ends to network sites from a single federated back-end. In some cases, the federated model has an elevated risk of being a single point of failure, given a single deployment and instance is responsible for sites in the network.

An example of the federated back-end model can be seen in the "Tide" component of the "Single Digital Presence" project. Drupal 8 is deployed as a single instance serving multiple decoupled front-ends. The features of the single content repository are documented and available for evaluation by prospective users.

Independent sites

One option that isn't often considered by organisations evaluating reusability of features and effort across a network of Drupal sites is simply building multiple completely unrelated sites and using smaller units of functionality as the mechanism for reuse. Drupal has a mature concept for sharing functionality across sites: the humble module.

While this approach doesn't strictly fit the theme of this blog post, in some cases writing and publishing a general-purpose module, one that doesn't boil the ocean by dictating the tools, approach and features used to build a new Drupal site, is the best solution for a project.

These kinds of projects would be driven by websites that are characterised as mostly unique, with various related and unrelated feature sets. This approach is also consistent with day-to-day Drupal development outside the scope of network projects. With Drupal's open ecosystem, opportunities for collaboration and reuse often drive building functionality in reusable modules and publishing those modules for the wider community.

Common misconceptions

Install profiles

Drupal "profiles" or "distributions" are not a silver bullet for developing and organising an architecture for networks of Drupal sites. They are incredibly flexible, so how they are deployed and leveraged still involves the same governance and architectural decisions discussed above.

Configuration management

Configuration management is not a silver bullet. While configuration management has simplified a range of complex deployment problems that were present in Drupal 7 sites, it doesn't drive a particular architecture for networks of sites. The tools in Drupal core are continually getting sharper and while innovations in contrib have experimented with new mechanisms for deploying and updating configuration, it's not an end-to-end architectural solution for network sites.


Multisite

The multisite concept in Drupal is frequently misunderstood as an approach for building a network of sites. In reality, multisite is a tool for deploying any configuration or setup of Drupal sites to a shared document root. It doesn't produce any tangible outcome as far as project architecture is concerned, beyond forcing multiple sites to be hosted on the same server.

Headless Drupal & JavaScript frameworks

While some of these approaches, like the "federated back-end" are significantly benefited by a decoupled front-end, headless Drupal is compatible with all models of network sites. You could build a product or starter-kit that was either fully integrated, progressively decoupled or completely decoupled and the same back-end architectural decisions would apply.

Drupal has strong API-based functionality, which can be enabled and configured as required. Evaluating and selecting frameworks or front-ends to consume Drupal comes with its own set of considerations that need to be carefully weighed for fitness in any given project.

Styling and appearance

Styleguide-driven development has largely matured to solve issues with reusability, inheritance and extensibility of visual components. This approach has been the foundation for all new sites built by PreviousNext for the last few years. By building components and style guides, duplication of effort when building front-ends has been minimised across both traditional and network-based projects. For that reason, the visual style and consistency of sites within a network is not necessarily a factor when considering Drupal architectural paradigms.

Summing up

Given the size of Drupal's ecosystem and the increasingly rapid pace of evolution, describing all of the factors and challenges that play into a large network site project is difficult. As always, rigorous discovery and deeply understanding a project's goals and requirements should be the first step in beginning a technical project.

Photo of Sam Becker

Posted by Sam Becker
Senior Developer

Dated 13 September 2019


It should be mentioned that if Aegir is used for the hosting, management, provisioning, etc. of sites in any of the above scenarios, it'll save a lot of trouble. For example, all sites running on the same platform (Aegir-speak for a Drupal codebase) can be upgraded with a single button click, with rollback on any failures.


Add new comment

Aug 09 2019
Aug 09

Scheduled Transitions is a module allowing you to schedule a specific previously saved revision to move from one state to another. This post provides an introduction to Scheduled Transitions for Drupal 8.

Scheduled Transitions is a module allowing you to schedule a specific previously saved revision to move from one state to another. For example, an editor may keep a piece of content in a draft state throughout the drafting process. When ready, the editor may select the ready revision to be moved from draft to published.

A more complex use case involves the workflow Draft -> Needs Review -> Approved -> Published -> Archived. A Content Editor could edit a piece of content until it is in Needs Review status, then a Content Moderator approves the content by setting the state to Approved. The Content Moderator then sets up a scheduled transition for when the content should move from Approved to Published at some point in the future. If the content is time sensitive, another future scheduled transition could be created to automatically change it from Published to Archived.

Scheduled Transitions integrates tightly with Content Moderation and Workflows, inheriting transitions, states, and associated permissions automatically.

This post and accompanying video cover configuration and general usage.


[embedded content]

Another shorter version of the video is available without site building aspects, ready to be shared with an editorial team.


Requirements and dependencies are fairly bleeding edge, though this will change in the future. As of posting they are:


Download and install the module using your favourite method:

composer require drupal/scheduled_transitions
drush pm:enable scheduled_transitions # or
drupal module:install scheduled_transitions


Configure Workflows

If you have not already created a workflow, navigate to Configuration » Workflows and click the Add workflow button.

Enter a label and select Content moderation from the Workflow type dropdown.

Set up states and the transitions between them in any way you desire, and set which entity type bundles the workflow should apply to.

Configure Scheduled Transitions

Navigate to Configuration » Scheduled Transitions

Under the Enabled types heading, select the entity type bundles to enable Scheduled transitions on. Save the form.

Scheduled Transitions: Settings

User permissions

Navigate to People » Permissions.

Under the Content Moderation heading, enable all workflow transition permissions that apply. Under the Scheduled Transitions heading, enable the Add scheduled transitions and View scheduled transitions permissions that apply. These permissions apply to individual entities; in addition to these permissions, users must also have access to edit the individual entities. Make sure you grant any permissions needed for users to edit the entities; for example, nodes require the Edit any content or Edit own content permissions.

General Usage

Moving on to day-to-day functionality of Scheduled Transitions.

Navigate to a pre-existing entity. Though nodes are shown in the examples below, Scheduled Transitions works with any revisionable entity type, such as block content, terms, or custom entity types.

You'll find the Scheduled Transitions tab, with a counter in the tab indicating how many transitions are scheduled for the entity and translation being viewed.

Scheduled Transitions: Tab

Clicking the tab will send you to a listing of all scheduled transitions for an entity.

If the user has permission, an Add Scheduled transitions button will be visible.

Scheduled Transitions: List

Clicking the button presents a modal form. The form displays a list of all revisions for the entity or translation.

Scheduled Transitions: Modal

Click the radio next to the revision you wish to schedule for state change.

After the radio is selected, the form will reload showing valid workflow transitions from the selected revisions' state.

The user selects which transition is to be executed, along with the date and time the transition should be executed.

Scheduled Transitions: Revision Selected

Depending on the state of the selected source revision, an additional checkbox may display, prompting you to recreate pending revisions. This feature is useful if users have created more non-published revisions after the scheduled revision; it prevents loss of any intermediate non-published work. A diagram is provided below:

Scheduled Transitions: Recreate Pending Revisions

Click the schedule button. The modal closes and the scheduled transitions list reloads.

Scheduled Transitions: Post creation

When the time is right, the scheduled transition is executed. You can force scheduled transitions to execute by running cron manually. Cron should be set up to run automatically and regularly, preferably every 5 minutes or so.
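If you prefer to trigger cron from code, for example while testing schedules locally, a minimal sketch (assuming a fully bootstrapped Drupal 8 site, e.g. inside a drush php:script) is:

```php
// Run all cron handlers, including any due scheduled transitions.
// Assumes a fully bootstrapped Drupal site.
\Drupal::service('cron')->run();
```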

The job executes the transition and deletes itself, removing itself from the transition list. As a result of executing the transition, when navigating to the core revisions list for the entity you'll notice a new revision has been created, with a log message outlining the state change.

Scheduled Transitions: Revisions


When dealing with entities with multiple translations, transitions apply to the translation in context and are separate from other translations. For example, revisions of the English and German translations of an entity are scheduled independently.

Global List

Scheduled Transitions comes with Views integration; a view is pre-installed when the module is enabled. You can find the view by navigating to Content » Scheduled Transitions. It shows all pending scheduled transitions on the site.

Scheduled Transitions: Global List

For more information, check out the Scheduled Transitions project page or Scheduled Transitions project documentation.

Photo of Daniel Phin

Posted by Daniel Phin
Drupal Developer

Dated 9 August 2019


I wish every Drupal contrib module had an announcing blog post and accompanying video. Especially if it's this well executed. Thank you for this very high quality contribution!

ditto re: blog & vid

Looks technically solid. But I think the UI is going to be a bit much for the majority of content editors. 99% of the time the only revision that matters is the highest vid. Perhaps something on the entity edit form next to the submit buttons would be more usable.


Add new comment

Jul 29 2019
Jul 29

Page objects are a pattern that can be used to write clearer and more resilient test suites. This blog post will explore implementing page objects in PHP with the Mink library.

There are various PHP libraries for creating and maintaining page objects. In order to build something useful for the current state of PHP functional testing in Drupal, I created a library with the following design goals:

  • Working seamlessly with Drupal core test classes, traits and weitzman/drupal-test-traits.
  • Working with all of Drupal's dev dependency version constraints and not introducing additional dependencies.
  • Exclusively utilising the Mink API, to provide a fast on-ramp for moving existing tests to page objects and for developers to write new page objects using their existing knowledge of Mink.
  • Drawing inspiration from nightwatch.js to provide transferability between PHP and JS functional tests.

Taken from the project page, by implementing page objects:

  • You create tests that are easier to read and maintain.
  • You reduce coupling between test cases and markup.
  • You encourage thorough testing by making the whole process easier.

While these examples will be using sam152/mink-page-objects, the principles apply to any library, or indeed plain old objects. First, I'll examine a real project test case using Mink directly, written to test a search feature on a Drupal site:

/**
 * Test how search results appear on the site.
 */
public function testSearchItemDisplay() {
  $sample_result = $this->randomMachineName(32);
  // Create a sample published news item (creation call reconstructed here).
  $this->createNode([
    'title' => $sample_result,
    'type' => 'news_item',
    'body' => ['value' => 'Test news item body'],
    'moderation_state' => 'published',
  ]);

  // Execute a search for the sample title (form submission reconstructed).
  $this->submitForm([
    'query' => $sample_result,
  ], 'Search');

  $this->assertSession()->pageTextContains('1 results for');
  $this->assertSession()->elementContains('css', 'h1', $sample_result);
  $this->assertSession()->elementContains('css', '.sidebar-menu__item--active', 'Show all');
  $this->assertSession()->elementContains('css', '.listing', $sample_result);

  // A news item should not appear when filtering by basic pages.
  $this->clickLink('Basic page');
  $this->assertSession()->pageTextContains('0 results for');
  $this->assertSession()->elementContains('css', '.sidebar-menu__item--active', 'Basic page');

  $this->clickLink('News item');
  $this->assertSession()->elementContains('css', '.sidebar-menu__item--active', 'News item');
  $this->assertSession()->elementContains('css', '.listing', $sample_result);
}

And now the equivalent test refactored to use a page object:

/**
 * Test how search results appear on the site.
 */
public function testSearchItemDisplayPageObject() {
  $sample_result = $this->randomMachineName(32);
  // Create a sample published news item (creation call reconstructed here).
  $this->createNode([
    'title' => $sample_result,
    'type' => 'news_item',
    'body' => ['value' => 'Test news item body'],
    'moderation_state' => 'published',
  ]);

  $search_page = SearchPage::create($this);

  // Method chains reconstructed; the leading call executes the search.
  $search_page->executeSearch($sample_result)
    ->elementContains('@title', $sample_result)
    ->assertActiveFilter('Show all');

  $this->clickLink('Basic page');
  $search_page->assertActiveFilter('Basic page')
    ->assertResultCount(0);

  $this->clickLink('News item');
  $search_page->assertActiveFilter('News item')
    ->assertResultsContain($sample_result);
}

In the second test, there are a few advantages:

  • The code is more DRY, since selectors on the page aren't repeated. In fact, if the page object was used for all future search tests, they'd never be repeated in a test again!
  • The test uses a more natural language that is easier to parse by readers of the code and communicates the intentions of the author in a clearer fashion.
  • The search page object is type-hinted, making writing new tests fast and reducing the amount of page related knowledge developers must collect and remember.

The cost paid for these benefits is an additional layer of indirection between your test case and the test browser. To realise the full benefit of such an approach, I'd expect a page object to be written to service at least two different test cases; however, I haven't experimented with implementing this pattern across a large-scale test suite.

An annotated version of the page object (for the purposes of demonstration) looks like:

/**
 * A page object for the search page.
 */
class SearchPage extends DrupalPageObjectBase {

  /**
   * {@inheritdoc}
   */
  protected function getElements() {
    // Selectors found on the page, these can be referenced from any of the
    // Mink API calls within this page object.
    return [
      'title' => 'h1',
      'results' => '.listing',
      'activeFilter' => '.sidebar-menu__item--active',
    ];
  }

  /**
   * Assert the number of results on the search page.
   *
   * @param int $count
   *   The number of items.
   *
   * @return $this
   */
  public function assertResultCount($count) {
    $this->assertSession()->pageTextContains("$count results for");
    return $this;
  }

  /**
   * Assert a string appears in the results.
   *
   * @param string $string
   *   The string that should appear on the page.
   *
   * @return $this
   */
  public function assertResultsContain($string) {
    $this->elementContains('@results', $string);
    return $this;
  }

  /**
   * Assert a string does not appear in the results.
   *
   * @param string $string
   *   The string that should not appear on the page.
   *
   * @return $this
   */
  public function assertResultsNotContain($string) {
    $this->elementNotContains('@results', $string);
    return $this;
  }

  /**
   * Assert the active filter.
   *
   * @param string $filter
   *   The active filter.
   *
   * @return $this
   */
  public function assertActiveFilter($filter) {
    $this->elementContains('@activeFilter', $filter);
    return $this;
  }

  /**
   * Execute a search query.
   *
   * @param string $query
   *   A search query.
   *
   * @return $this
   */
  public function executeSearch($query) {
    // Form submission reconstructed using Mink's submitForm.
    $this->submitForm([
      'query' => $query,
    ], 'Search');
    return $this;
  }

}

While the library itself is decoupled from Drupal, the DrupalPageObjectBase base class integrates a few additional Drupal features such as UiHelperTrait for methods like ::drupalGet and ::submitForm as well as creating a ::create factory to automatically wire dependencies from Drupal tests into the page object itself.

I would be interested in hearing thoughts on if introducing page objects may benefit Drupal core's own functional test suite and details on how that might be accomplished given the tools available. 

Photo of Sam Becker

Posted by Sam Becker
Senior Developer

Dated 29 July 2019

Add new comment

May 15 2019
May 15

Display Suite is a handy module we've used for a long time. However for new projects utilising Layout Builder we've found we don't need it. Swap out Display Suite for Drupal 8 core blocks with contexts!

Positioning fields

The main use case for Display Suite (DS) is to position fields into layouts. However, Layout Builder now offers a Drupal 8 core alternative to building layouts.

As DS utilises core's Layout Discovery module, switching these layouts over to Layout Builder should be fairly straightforward. Having said that, so far we've only implemented this on new greenfield sites starting from scratch with Layout Builder.

Custom fields

One of DS's most useful features is defining custom fields as @DsField plugins.

Say we have a custom Event entity which needs custom output to format a map of that event.

DsField version


namespace Drupal\my_event\Plugin\DsField;

use Drupal\ds\Plugin\DsField\DsFieldBase;

/**
 * Plugin to render a map of an event.
 *
 * @DsField(
 *   id = "my_event_map",
 *   ...
 *   entity_type = "my_event"
 * )
 */
class EventMap extends DsFieldBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    /** @var \Drupal\my_event\Entity\Event $event */
    $event = $this->entity();
    // Logic here to build and format your map utilising $event.
  }

}


Block equivalent

This DsField converts directly to a Block plugin utilising context to get the entity.


namespace Drupal\my_event\Plugin\Block;

use Drupal\Core\Block\BlockBase;

/**
 * Block implementation to render a map of an event.
 *
 * @Block(
 *   id = "my_event_map",
 *   ...
 *   context = {
 *     "my_event" = @ContextDefinition("entity:my_event", required = TRUE),
 *   }
 * )
 */
class EventMap extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    /** @var \Drupal\my_event\Entity\Event $event */
    $event = $this->getContextValue('my_event');
    // Logic here to build and format your map utilising $event.
  }

}


This block is then available for placement as per the usual Layout Builder techniques.

Controlling field markup

Another use for DS is to control the markup of and around fields.

As an alternative to DS, we often use the Element Class Formatter module to easily inject custom classes into fields. In combination with Twig templates utilising embeds and includes, this should mostly do away with the need for DS.

Summing up

DS is a great module; full kudos to swentel, aspilicious and everyone else who's worked to make DS such a powerful UI-based tool. However, we don't really see a place for it looking to a world powered by Layout Builder.

Here's looking forward to a Drupal where all layout happens via Layout Builder!


It may be that I'm still not much more than a rookie, and all of my sites are pretty small, but I have the same problem with layout builder, paragraphs, and Gutenberg. If I build a page with one of these, I can't use the individual elements in a View (slide show, gallery, accordion, etc.).

Please tell me that I have missed something fundamental and that I am basically wrong.

Good question. Layout Builder uses core blocks, so you should be able to extract them with Views. However, there's certainly still a very important use case for structured data. E.g. for an entity type with a custom layout built with Layout Builder, also having separate teaser image and text fields attached so these can be extracted into a view is perfectly valid. It's certainly not a case of just chucking everything into Layout Builder.


Add new comment

Mar 13 2019
Mar 13

Views is a powerful and highly popular feature in most Drupal sites. For today’s blog post, we're going to look at how to create a Views Area plugin that will display sort links at the top of the page.

So what is an Area plugin?

As per the documentation, they are plugins governing areas of views, such as header, footer, and empty text.

Task at hand

For our example, we will be using a search view. Let's say that this view is created to accept full-text search of a content type, sorts by relevance by default, and can also sort by created date in descending order.

If we stick to the default Drupal behaviour and use its exposed forms, we will get a form with buttons as shown below.

However, for this task we want to alter the default behaviour and make the form show links instead. The end result of the sort links should look like image below:

In order to achieve this we build this form using an Area plugin.

Let us call this view search, and add these settings via UI:

  • Add a filter criterion: Search: Full text search

Give a path to the view page

Now to add the sort criteria. For this instance we:

  • set relevance to be the default sort, and created date to be the optional sort.

And now for the fun part!

The header section is where we need to add the area plugin. It will add logic to render these two sort options and how each link should work on a user click.

In order for us to add the plugin to the header as shown below, let's start coding.

The Code

Following the rules of Drupal annotation-based plugins, the plugin should reside under the right directory and namespace for an Area plugin. E.g. app/modules/custom/my_search/src/Plugin/views/area

There we can create a PHP class in a file named SortBy.php

As with any plugin, an area plugin needs the three main ingredients:

  • The namespace needs to follow PSR-4 standards and reside in the Drupal\my_module\Plugin\views\area namespace

  • It must use the @ViewsArea annotation

  • It must implement a particular interface or extend a base class; in this instance the base class is AreaPluginBase

Namespace needs to follow PSR-4 standards

namespace Drupal\my_search\Plugin\views\area;

use Drupal\Core\Url;
use Drupal\views\Plugin\views\area\AreaPluginBase;


/**
 * Defines an area plugin to display a header sort by option.
 *
 * @ingroup views_area_handlers
 *
 * @ViewsArea("my_search_sort_by")
 */

Extend the class from AreaPluginBase

class SortBy extends AreaPluginBase {}

To make things a bit easier, I will add a few constants into play.

// Query parameter of search form.
const KEYWORD_PARAM_NAME = 'search';

// Query parameter created by view for created date field.
const CREATED_DATE_PARAM = 'created_date';

// Query parameter created by view for relevance field.
const RELEVANCE_PARAM = 'search_api_relevance';

// Search view's route name. The actual value depends on your view and
// display IDs; 'view.search.page_1' is assumed for this example.
const SEARCH_PAGE_ROUTE = 'view.search.page_1';

For this particular example we only need to override the render() method.

/**
 * Render the area.
 *
 * @param bool $empty
 *   (optional) Indicator if view result is empty or not. Defaults to FALSE.
 *
 * @return array
 *   In any case we need a valid Drupal render array to return.
 */
public function render($empty = FALSE) {
  // Sort criteria array. This will be our render array that will be used to
  // generate the desired html.
  $sort_links = [];

  // Drupal request query.
  $request_query = \Drupal::request()->query;

  // Default query options for date sort criteria.
  $date_options = [
    'query' => [
      'sort_by' => self::CREATED_DATE_PARAM,
      'sort_order' => 'DESC',
      'search' => $request_query->get(self::KEYWORD_PARAM_NAME),
    ],
  ];

We already know the route name of the search view, so all we need to do is pass the $route_parameters and $options values into Url::fromRoute()

  // Default query options for relevance sort criteria.
  $relevance_options = [
    'query' => [
      'sort_by' => self::RELEVANCE_PARAM,
      'sort_order' => 'DESC',
      'search' => $request_query->get(self::KEYWORD_PARAM_NAME),
    ],
  ];

These query parameters will later appear in the URL when a search is made from the form and sorted by created date.


  // Determine which criteria is currently active.
  // Default is set to relevance.
  $active_link = self::RELEVANCE_PARAM;

  // On search page load we need to check if a GET query is passed in
  // having the key sort_by.
  if ($request_query->has('sort_by') && $request_query->get('sort_by') === self::CREATED_DATE_PARAM) {
    $active_link = self::CREATED_DATE_PARAM;
  }

  $sort_links = [
    [
      'title' => 'Relevance',
      'link' => Url::fromRoute(self::SEARCH_PAGE_ROUTE, [], $relevance_options),
      'active' => $active_link === self::RELEVANCE_PARAM,
    ],
    [
      'title' => 'Date',
      'link' => Url::fromRoute(self::SEARCH_PAGE_ROUTE, [], $date_options),
      'active' => $active_link === self::CREATED_DATE_PARAM,
    ],
  ];

  // Finally we return our render array.
  return [
    '#theme' => 'cdu_sort_by_links',
    '#sort_links' => $sort_links,

    // Tell Drupal this varies by url.
    '#cache' => [
      'contexts' => ['url'],
    ],
  ];
} // End of render function.

An important thing to note is the use of Url::fromRoute() to generate the link, rather than a hardcoded /search?some_stuff

This is for one good reason: if someone goes into the view and decides to change the page URL from /search to /content/search, the Url::fromRoute() code will keep working but a hard-coded link will not.

At the moment we're hardcoding the view/route name, because this is a one-off plugin. If you were building an area plugin that worked for all views, you could use $this->displayHandler and $this->view to dynamically derive the route name and support any view.
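As a sketch of that idea: route names for view page displays follow the pattern view.<view_id>.<display_id>, so the hardcoded constant could be replaced with something like the following (the current_display property and exact pattern are assumptions worth verifying against your Drupal version):

```php
// Derive the route name from the view this area plugin is embedded in,
// instead of hardcoding it. View page routes are named
// "view.<view_id>.<display_id>".
$route_name = sprintf('view.%s.%s', $this->view->id(), $this->view->current_display);
```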

Now our plugin is ready, but it is not discoverable by Drupal. So we need to add:

Create a file under app/modules/custom/my_search/ and add the code below under hook_views_data_alter():

/**
 * Implements hook_views_data_alter().
 */
function my_search_views_data_alter(array &$data) {
  $data['views']['my_search'] = [
    'title' => t('Sort by'),
    'help' => t('Provides sort by option.'),
    'area' => [
      'id' => 'my_search_sort_by',
    ],
  ];
}

We need to add this because views area plugins are a bit odd; there is a core issue to try and make them behave like normal plugins.

After adding this, we are all set.

Now we can add the Sort by plugin under the view header.

Final outcome

On the search page you should see the sort links rendered according to the applied theme.

When you click Relevance, the URL should change to include the relevance sort parameters; when you click Date, it will include the created date parameters.




Add new comment

Feb 11 2019
Feb 11

The time has come to close the book on our government Drupal distribution, aGov. We are no longer actively developing sites using aGov, and instead are focussing our efforts on the new GovCMS Drupal 8 distribution. Here’s a short history of aGov and how we got to this point.

Why we developed aGov

PreviousNext has a long history of developing sites for government at all levels. Back in 2012 there was a big shift in government moving towards accessibility, security and mobile support. There were many government sites built using legacy software that were too expensive and complex to update to conform to these requirements. At the same time government was embracing open source as a legitimate replacement.

We were seeing an increasing number of government agencies coming to us to help them meet these requirements, and we were implementing the same changes on each and every project.

aGov was developed as a response to this. The intention was to create a serious alternative to legacy applications, that could be spun up with minimal effort.

Drupal 7 was not very accessible out of the box, and aGov combined numerous contrib modules and a base theme that helped it meet WCAG 2 AA compliance.

aGov takes off

Thanks in part to the aGov distribution, we saw a huge adoption of Drupal in government. Agencies were able to create sites relatively easily, and host wherever they chose, be that in the cloud or on-premise. At its peak, aGov was in use on 600 sites and had been downloaded almost 200,000 times, consistently staying in the Top 20 Drupal distributions.

Distributions versus Starter Kits

Back in 2012, Drupal distributions were seen as the answer to the code reuse problem. In Drupal 7, configuration is all stored in the database, making it difficult to manage. The solution was the Features module, which had to accommodate myriad different configuration formats and try to wrangle them all into code that could be deployed. As aGov was modified and extended in 99% of cases, this configuration was difficult to maintain, and supporting upgrade paths for each and every site was near impossible.

We improved the situation with a 7.x-2.x release and later with a 7.x-3.x release, however the inherent limitations of Drupal 7 were still there.


GovCMS

Around the second half of 2014 we worked with the Australian Department of Finance to fork aGov into the GovCMS distribution. GovCMS is a hosting platform, procurement process and Drupal distribution that allows government agencies to build and deploy Drupal websites with minimum friction. The potential of Drupal and aGov was now being realised in a comprehensive Drupal SaaS platform.

The Department of Finance took ownership of the GovCMS codebase; however, we still saw a need for more complex sites that would benefit from aGov, so decided to keep it alive. In aGov 7.x-3.x we incorporated UI Kit, the government design system, and in 2015 started active development on a Drupal 8 version.

Drupal 8

PreviousNext invested large amounts of time into the Drupal 8 release, which greatly increased our overall expertise and confidence in using it.

Drupal 8 solved a lot of the problems aGov was initially created for. First class accessibility and mobile support, as well as a comprehensive configuration management system that made moving configuration changes around a breeze.

As such, the Drupal 8 version of aGov was much smaller and simpler than its Drupal 7 equivalent. This also meant we would often use vanilla Drupal 8 instead of aGov when working on government sites, as there was less need for an ‘opinionated’ (read: inflexible) solution for different sites.

Maintenance Fixes Only

For the last two years, we have been supporting aGov by updating core and contributed modules. However, as we no longer use it ourselves, there has been little impetus to extend or enhance the distribution. It became apparent that we could no longer continue to support it alone.

Six years after its original creation, we will be marking the project’s Maintenance Status as Seeking New Maintainer and its development status as No further development. We will no longer be active in the issue queue or creating new releases.

If you are a user of aGov, you will still be able to upgrade Drupal core and contributed modules on a site-by-site basis as before. If you encounter upgrade issues, you should create tickets in the project where the issue occurs.

If anyone is interested in taking over maintainership, please contact us through the project’s issue queue.

Where to from here?

Late last year the GovCMS team successfully launched its Kubernetes-based hosting platform and improved developer workflows. With 247 live websites and 48 sites currently in development, the Drupal-based platform is stronger than ever, and continues to grow.

PreviousNext has been developing GovCMS sites since day one, and we’re continuing to grow this part of our business. Dropping support for aGov is going to free up some of our time to put to better use on the future of government sites in Australia.

Photo of Kim Pepper

Posted by Kim Pepper
Technical Director

Dated 11 February 2019


A pat on the back for the pioneering work you guys did and for pushing the project forward throughout the years. The legacy will live on.

Huge thanks to you Kim, and the whole PreviousNext team for creating and maintaining aGov. It was a "critical success factor" for the adoption of Drupal in Government around Australia, and led to increased maturity in the community.

But thanks also for this very gracious "sunset" post. Yes, it's clearly time to let this one go.



Add new comment

Nov 22 2018
Nov 22

Update: Re-published for DrupalSouth 2018 edition

The PreviousNext team are sponsoring and helping to run the sprint day on Wednesday, December 5th 2018, and there are a few things you can do now to hit the ground running on the day.

What's a Sprint Day about anyway?

Contribution Sprints are a great opportunity to get involved in contributing to Drupal. Contributions don't have to be just about code. Issue triage, documentation, and manual testing are examples of non-code contributions.

If you are new to contributing, you can take a look at the New Contributor tasks on the Contributor Tasks page.

While there will be experienced contributors there on the day to help, keep in mind, this is not a training session. :-)

Set Up a Development Environment

There is more than one way to shear a sheep, and there is also more than one way to set up a local development environment for working on Drupal.

We've created a Drupal project starter kit for sprint attendees which should speed up this process. Head over to the starter kit repository and follow the README.

If you have any issues, feel free to post them in the Github issue queue and we'll try and resolve them before the day.

Find Issues to Work On

If you want to see what might be an interesting issue to work on, head over to the Issue Queue and look for issues tagged with 'DrupalSouth 2018'. These are issues that others have tagged.

You can also tag an issue yourself to be added to the list.

Being face-to-face with fellow contributors is a great opportunity to have discussions and put forward ideas. Don't feel like you need to come away from the day having completed lines and lines of code.

We look forward to seeing you all there!

Photo of Kim Pepper

Posted by Kim Pepper
Technical Director

Dated 22 November 2018

Add new comment

Nov 02 2018
Nov 02

As part of the session I presented in Drupal Europe, REST Ready: Auditing established Drupal 8 websites for use as a content hub, I presented a module called “Entity Access Audit”.

This has proved to be a useful tool for auditing our projects for unusual access scenarios, as part of our standard go-live security checks or when opening sites up to additional mechanisms of content delivery, such as REST endpoints. Today this code has been released as the Entity Access Audit module.

There are two primary interfaces for viewing access results, the overview screen and a detailed overview for each entity type. Here is a limited example of the whole-site overview showing a few of the entity types you might find in core or custom modules:

Entity access audit

Here is a more detailed report for a single entity type:

Entity access audit

The driving motivation behind these interfaces was being able to visually scan entity types and ensure that the access results align with our expectations. This has so far helped identify various bugs in custom and contributed code.

In order to conduct a thorough access test, the module uses a predefined set of dimensions and tests every combination of them via a cartesian product. The dimensions tested out of the box, where applicable to the given entity type, are:

  • All bundles of an entity type.
  • If the current user is the entity owner or not.
  • The access operation: create, view, update, delete.
  • All the available roles.

It’s worth noting that these are only common factors used to determine access results; they are not comprehensive. If access were determined by other factors, there would be no visibility of this in the generated reports.

The module is certainly not a silver bullet for validating the security of Drupal 8 websites, but has proved to be a useful additional tool when conducting audits.

Photo of Sam Becker

Posted by Sam Becker
Senior Developer

Dated 2 November 2018

Add new comment

Oct 08 2018
Oct 08

In this blog post, we'll have a look at how contributed Drupal modules can remove the core deprecation warnings and be compatible with both Drupal 8 and Drupal 9.

Ever since Drupal Europe, we have known Drupal 9 will be released in 2020. As per @catch’s comment in 2608496-54:

We already have the continuous upgrade path policy which should mean that any up-to-date Drupal 8 module should work with Drupal 9.0.0, either with zero or minimal changes.

Drupal core has a proper deprecation process so it can be continuously improved. Core also continuously removes deprecated code usages: thanks to proper deprecation testing, core itself should not trigger deprecated code except in tests and during updates.

The big problem for contributed modules (aka contrib) is the removal of deprecated code usage. To allow contrib to keep up with core's removal of deprecated code, contrib needs proper deprecation testing, which is being discussed in the "support deprecation testing for contributed modules" issue.

However, the DrupalCI build process can be controlled by a drupalci.yml file in the project root. The documentation about it can be found at Customizing DrupalCI Testing for Projects.

It is very easy for contributed modules to remove their usage of deprecated code. All you need to do is add the following drupalci.yml file to your contributed module and fix the failures.

# This is the DrupalCI testbot build file for Dynamic Entity Reference.
# Learn to make one for your own project:
build:
  assessment:
    validate_codebase:
      phpcs:
        # phpcs will use core's specified version of Coder.
        sniff-all-files: true
        halt-on-fail: true
    testing:
      # run_tests task is executed several times in order of performance speeds.
      # halt-on-fail can be set on the run_tests tasks in order to fail fast.
      # suppress-deprecations is false in order to be alerted to usages of
      # deprecated code.
      run_tests.unit:
        types: 'PHPUnit-Unit'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.kernel:
        types: 'PHPUnit-Kernel'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.functional:
        types: 'PHPUnit-Functional'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.javascript:
        concurrency: 15
        types: 'PHPUnit-FunctionalJavascript'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false

This drupalci.yml will check against all the Drupal core coding standards. This can be disabled with the following change:

      phpcs:
        # phpcs will use core's specified version of Coder.
        sniff-all-files: false
        halt-on-fail: false

This file also only runs PHPUnit tests; to run legacy Simpletest tests you have to add the following block:

      run_tests.simpletest:
        types: 'Simpletest'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false

But if you still have those, you probably want to start there, because they won't be supported in Drupal 9.

Last but not least, if you think the module is not yet ready to fix all the deprecation warnings, you can set suppress-deprecations: true.

As a contrib module maintainer or a contrib module consumer, I encourage you to add this file to all the contrib modules you maintain or use, or at least create an issue in the module's issue queue, so that by the time Drupal 9 is released all of your favourite modules will be ready. The JSON API module added this file first, which inspired me to add it to DER.

Photo of Jibran Ijaz

Posted by Jibran Ijaz
Senior Drupal Developer

Dated 8 October 2018


What Alex said. I did this for the JSON API module because we want it to land in Drupal core. But we are running into the problems explained in the issue linked by Alex.

Despite Drupal 9 being announced, the Drupal Continuous Integration system is not yet ready for modules trying to keep current with all deprecations for Drupal 9 while remaining compatible with both minors that have security team coverage (current + previous, ATM 8.6 + 8.5) and the next minor (ATM 8.7). Hopefully we soon will be :)

Thanks for writing about this though, I do think it's important that more module maintainers get in this mindset!

DrupalCI always runs contrib tests against the latest core branch. As a contrib module maintainer, if I have to make a compatibility change for a core minor version, I create a new release and note it in the release notes after the stable release of that core minor, e.g. 8.x-2.0-alpha8. I have never had to create a new release for a core patch release, at least so far, and I don't know how I would name the new release if I ever had to do that, but then again that's a contrib semver issue.

DrupalCI runs against whichever core branch the maintainer has configured it to run against.

If a contributed module wants to remove usages of deprecations, it should probably never do that against the "Development" branch, as there isn't a way for a contrib module to both remove those deprecations *and* still be compatible with supported or security branches. The earliest that a contrib module should try to remove new deprecations is at the pre-release phase, as at that point we're unlikely to introduce new deprecations.


Add new comment

Sep 25 2018
Sep 25

Drupal 8.6 has shipped with the Media Library! It’s just one part of the latest round of improvements from the Media Initiative, but what a great improvement! Being brand new it’s still in the “experimental” module state but we’ve set it up on this website to test it out and are feeling pretty comfortable with its stability.

That said, I highly encourage you to test it thoroughly on your own site before enabling any experimental module on a production site. Don’t just take my word for it :)

What it adds

The Media Library has two main parts to it...

Grid Listing

There’s the Grid Listing at /admin/content/media, which takes precedence over the usual table of media items (which is still available under the “Table” tab). The grid renders a new Media Library view mode showing the thumbnail and compact title, as well as the bulk edit checkbox.

The new media library grid listing page

Field Widget

Then there’s the field widget! The field widget can be set on the “Manage Form Display” page of any entity with a Media Reference Field. Once enabled, an editor can either browse existing media (by accessing the Grid Listing in a modal) or create a new media item (utilising the new Media Library form mode - which is easy to customise).

Media reference field with the new Media Library form widget

Media Library widget once media has been added, which shows a thumbnail of the media

The widget is very similar to what the ‘Inline Entity Form’ module gave you, especially when paired with Entity Browser's IEF submodule. But the final result is a much nicer display and in general feels like a nicer UX. Plus it’s in core, so you don’t need to add extra modules!

The widget also supports bulk upload, which is fantastic. It respects the Media Reference Field's cardinality: limit it to one, and only one file can be uploaded or selected from the browser; allow more than one, and you can upload or select up to that exact number. The field even tells you how many you can add and how many you have left. And yes, the field supports drag and drop :)

What it doesn’t add

WYSIWYG embedding

WYSIWYG embed support is now being worked on for a future release of Drupal 8 core; you can follow this Meta issue to keep track of the progress. It sounds like some version of Entity Embed (possibly limited to Media) will make its way in, and some form of CKEditor plugin or button will be available to achieve something similar to what the Media Entity Browser, Entity Browser, Entity Embed and Embed module set provides currently.

Until then, though, we’ve been working on integrating the Media Library's Grid Listing into a submodule of Media Entity Browser, to provide editors with the UX improvements that came with Media Library while keeping the same WYSIWYG embed process (and the contrib modules behind it) they’re currently used to (assuming they’re already using Media Entity Browser, of course). More on this submodule below.

This is essentially a temporary solution until the Media Initiative team and those who help out on their issue queue (all the way from UX through to dev) have the time and mental space to get it into core. It should hopefully have all the same bulk upload features the field widget has; it might even be able to support bulk embedding too!

View mode or image style selectors for editors

Site builders can set the view mode of the rendered media entity from the manage display page, which in turn allows you to set an image style for that view mode, but editors can’t change this per image (without needing multiple different Media reference fields).

There is work on supporting this idea for images uploaded via CKEditor directly, which has nothing to do with Media, but I think it would be a nice feature for Media embedding via WYSIWYG as well. Potentially also for Media Reference Fields. But by no means a deal breaker.

Advanced cropping

From what I can gather there are no plans to add more advanced cropping capabilities into core. This is probably a good thing, since cropping requirements can differ greatly and we don’t want core to get too big. So contrib will still be your go-to for this. Image Widget Crop is my favourite, but there’s also the simpler Focal Point.

You can test out the submodule from the patch on this issue and let us know what you think! Once the patch is added, enable the submodule then edit your existing Entity Browsers and swap the View widget over to the “Media Entity Browser (Media Library)” view.

Form for changing the Entity Browser view widget

It shouldn’t matter if you’ve customised your entity browser. If you’ve added something like Dropzone for drag-and-drop support it *should* still work (if not, check the Dropzone or Entity Browser issue queues). If you’ve customised the view it uses however, you might need to redo those customisations on the new view.

I also like updating the Form Mode of the Entity Browser's IEF widget to use the new Media Library form display, which I always pare back to just the essential fields (who really needs to manually set the author and created time of uploaded media?).

You still can’t embed more than one media item at a time. But at least now you also can’t select more than one item when browsing so that’s definitely an improvement.

Modal of the Media Entity Browser showing the same Grid listing

Plus editors will experience a fairly consistent UX between browsing and uploading media on fields as they do via the WYSIWYG.

Once set up and tested (ensuring you’ve updated any Media Reference Fields to use the new Media Library widget too), you can safely disable the base Media Entity Browser module and delete any unused configuration - it should just be the old “Media Entity Browser” view.

Please post any feedback on the issue itself so we can make sure it’s at its best before rolling another release of the module.

Happy days!

I hope you have as much fun setting up the Media Library as I did. If you want to contribute to the Media Initiative I’m sure they’ll be more than happy for the help! They’ve done a fantastic job so far but there’s still plenty left to do.

Photo of Rikki Bochow

Posted by Rikki Bochow
Front end Developer

Dated 25 September 2018


Nice and useful article on using the core Media Library in Drupal 8 projects.
Thank you!


Add new comment

Aug 17 2018
Aug 17

Allow sitebuilders to easily add classes onto field elements with the new element_class_formatter module.

Adding classes onto a field element (for example a link or image tag, as opposed to the wrapper div) isn't always the easiest thing to do in Drupal. It requires preprocessing the element's render array, using special Url::setOptions() calls, or drilling down through a combination of objects and arrays in your Twig template.

The element_class_formatter module aims to make that process easier. At PreviousNext we love field formatters! We write custom ones where needed, and have been re-using a few generic ones for quite a while now. This module extends our generic ones into a complete set, to allow for full flexibility, sitebuilding efficiency and re-usability of code. 

To use this module, add and enable it just like any other, then visit one of your Manage Display screens. The most widely available formatter is the Wrapper (with class) one, but the others follow a similar naming convention: "Formatter name (with class)". The majority of these formatters extend a core formatter, so all the normal formatter options should still be available.

The manage display page with the formatter selected for three different field types

The manage display page with new (with class) field formatters selected

Setting classes on the configuration pane of a link field

The field formatter settings, with all the default options

Use this module alongside Reusable style guide components with Twig embed, Display Suite with Layouts and some Bare templates to get optimum Drupal markup. Or just use it to learn how to write your own custom field formatters!

For feature requests or issues, please see the module's issue queue.

A quick "class formatter" module comparison

There are a couple of "class formatter" modules around, so let's do a quick comparison.

Let's say we have a node content type with a link field, and we're looking at the teaser view mode. The standard markup might look something like this:

<div class="node node--teaser">
  <div class="field">
    <div class="field-item">
      <a href="">Example link</a>
    </div>
  </div>
</div>

Then we'll use our field formatters to add my-custom-class into the markup, and see where it ends up.

Element Class Formatter

As mentioned above, this module adds a class to a field's element - the actual link of a link field, for example. The field template markup is untouched.

The class is set at the view mode (configuration) level, so content editors don't get to choose what class goes on the link. So for our node teaser with link field example, all nodes get the same class every time the teaser is displayed. The new markup would be:

<div class="node node--teaser">
  <div class="field">
    <div class="field-item">
      <a href="" class="my-custom-class">Example link</a>
    </div>
  </div>
</div>

Field Formatter Class

The Field Formatter Class module is similar to the Element Class Formatter module, except that it adds the class to the field template markup, not the link. Otherwise it works the same way, at the view mode level.

<div class="node node--teaser">
  <div class="field my-custom-class">
    <div class="field-item">
      <a href="">Example link</a>
    </div>
  </div>
</div>

I haven't actually used this module before, as it's more natural for me to work in Twig templates, utilising embeds and includes. But if you like doing things via the UI then check it out.

Entity Class Formatter

Entity Class Formatter is a very nice module which lets you (site builder or content editor) add a class to the field's parent entity. It further differs from the above modules in that it's a combination of configuration and content: you (the site builder) define a set of classes that the content editor can choose from, and each node can have a different class from the pre-defined list.

So say there are two node teasers in our markup:

<div class="node node--teaser my-custom-class">
  <div class="field">
    <div class="field-item">
      <a href="">Example link</a>
    </div>
  </div>
</div>

<div class="node node--teaser my-other-custom-class">
  <div class="field">
    <div class="field-item">
      <a href="">Example link</a>
    </div>
  </div>
</div>

Which is really handy for adding things like grid or variant classes. As you can see, it does nothing to the link field's markup; it's actually its own separate field, and just utilises a field formatter so you can define how the field should be used.


You can actually use all these modules together, as they target different parts of the markup. They're complementary, not competitors. We definitely use entity_class_formatter together with element_class_formatter, and let Twig handle the middle part.

Photo of Rikki Bochow

Posted by Rikki Bochow
Front end Developer

Dated 17 August 2018

Add new comment

Aug 14 2018
Aug 14

There is not a lot of documentation available about the difference between running a browser in WebDriver mode versus headless mode, so I did some digging...

Apparently, there are two ways to run Chrome for testing:

  • As WebDriver
  • As Headless


There are two ways to run Chrome as WebDriver:

Using Selenium:

Run the Selenium standalone server in WebDriver mode and pass the path of the ChromeDriver binary along with the config, e.g. the Selenium Dockerfile.

This works fine with Nightwatch standard setup, \Drupal\FunctionalJavascriptTests\JavascriptTestBase and also with Drupal core's new \Drupal\FunctionalJavascriptTests\WebDriverTestBase.

Using ChromeDriver:

Run ChromeDriver in WebDriver mode e.g. chromedriver Dockerfile

This works fine with Nightwatch, JTB, and WTB.


Using Chrome

Run Chrome browser binary in headless mode. e.g. Chrome headless Dockerfile

Nightwatch does not work with this setup; at least, I was unable to configure it. \DMore\ChromeDriver can be used to run the javascript tests.

Using ChromeDriver

Using selenium-webdriver, ChromeDriver can be run in headless mode something like this:

const webdriver = require('selenium-webdriver');
const chromedriver = require('chromedriver');

const chromeCapabilities = webdriver.Capabilities.chrome();
chromeCapabilities.set('chromeOptions', {args: ['--headless']});

const driver = new webdriver.Builder()
  .withCapabilities(chromeCapabilities)
  .build();

DrupalCI is running ChromeDriver without Selenium and testing Nightwatch and WTB on it.


The question is which is the best solution to run Nightwatch and JTB/WTB tests using the same setup?

  • We had seen some memory issues with Selenium containers in the past, but we haven't run into any issues recently, so I prefer this option; you can also swap the Selenium container to use different browsers for testing.
  • We have also seen some issues while running ChromeDriver in WebDriver mode. It just stops working mid-test run.
  • I was unable to get Headless Chrome working with Nightwatch but it needs more investigation.
  • Headless ChromeDriver setup on DrupalCI is quite stable. For JTB this would mean we could use either \Drupal\FunctionalJavascriptTests\DrupalSelenium2Driver or \DMore\ChromeDriver.

Please share your ideas and thoughts, thanks!

For more info:

Photo of Jibran Ijaz

Posted by Jibran Ijaz
Senior Drupal Developer

Dated 14 August 2018


We are also having a discussion about this in 'Drupal testing trait' merge request, see merge_requests/37.

Might be worth adding to this list that there's an alternative setup I successfully tested in April, using the browserless/chrome image with Cucumber and Puppeteer for behavioural tests.

YMMV, but to give a rough idea, here's the relevant docker-compose.yml extract:

b-tester:
  image: node:9-alpine
  command: 'sh -c "npm i"'
  volumes:
    - ./tests:/tests:cached
  working_dir: /tests
  network_mode: "host"

# See…
chrome:
  image: browserless/chrome
  shm_size: 1gb
  network_mode: "host"
  ports:
    - "3000:3000"

... and in tests/package.json :

"devDependencies": {
  "chai": "^4.1.2",
  "cucumber": "^4.0.0",
  "puppeteer": "^1.1.0"
},
"scripts": {
  "test": "cucumber-js"
}
... and to connect, open a page and take screenshots, e.g. in tests/features/support/world.js :

const { setWorldConstructor } = require("cucumber");
const { expect } = require("chai");
const puppeteer = require("puppeteer");
const PAGE = "";

class TodoWorld {

  // ... (snip)

  async openTodoPage() {
    // See
    // (via…)
    this.browser = await puppeteer.connect({ browserWSEndpoint: 'ws://localhost:3000' }); = await this.browser.newPage();
  }

  // ... (snip)

  async takeScreenshot() {
    await{ path: 'screenshot.png' });
  }

  // ... (snip)
}

setWorldConstructor(TodoWorld);


→ To run tests, e.g. :
$ docker-compose run b-tester sh -c "npm test"

(result in tests/screenshot.png)

I ran out of time to make a prototype repo, but my plan was to integrate


Add new comment

Aug 08 2018
Aug 08

Malicious users can intercept or monitor plaintext data transmitting across unencrypted networks, jeopardising the confidentiality of sensitive data in Drupal applications. This tutorial will show you how to mitigate this type of attack by encrypting your database queries in transit.

With attackers and data breaches becoming more sophisticated every day, it is imperative that we take as many steps as practical to protect sensitive data in our Drupal apps. PreviousNext use Amazon RDS for our MariaDB and MySQL database instances. RDS supports SSL encryption for data in transit, and it is extremely simple to configure your Drupal app to connect in this manner.

1. RDS PEM Bundle

The first step is ensuring your Drupal application has access to the RDS public certificate chain to initiate the handshake. How you achieve this will depend on your particular deployment methodology - we have opted to bake these certificates into our standard container images. Below are the lines we've added to our PHP Dockerfile.

# Add Amazon RDS TLS public certificate.
ADD  /etc/ssl/certs/rds-combined-ca-bundle.pem
RUN chmod 755 /etc/ssl/certs/rds-combined-ca-bundle.pem

If you use a configuration management tool like Ansible or Puppet, the same principle applies: download that .pem file to a known location on the app server.

If you have limited control of your hosting environment, you can also commit this file to your codebase and have it deployed alongside your application.

2. Drupal Database Configuration

Next you need to configure Drupal to use this certificate chain if it is available. The PDO extension makes light work of this. This snippet is compatible with Drupal 7 and 8.

$rds_cert_path = "/etc/ssl/certs/rds-combined-ca-bundle.pem";
if (is_readable($rds_cert_path)) {
  $databases['default']['default']['pdo'][PDO::MYSQL_ATTR_SSL_CA] = $rds_cert_path;
}

3. Confirmation

The hard work is done; you'll now want to confirm that the connections are actually encrypted.

Use drush to smoke-check that the PDO options are being picked up correctly. Running drush sql-connect should show a new flag: --ssl-ca.

$ drush sql-connect

mysql ... --ssl-ca=/etc/ssl/certs/rds-combined-ca-bundle.pem

If that looks OK, you can take it a step further and sniff the TCP connection between Drupal and the RDS server.

This requires root access to your server and the tcpflow package installed; this tool streams the data being transmitted over port 3306. You want to see illegible garbled data - definitely not content that looks like SQL queries or responses!

Run this command, and click around your site while logged in (to ensure minimal cache hits).

$ tcpflow -i any -C -g port 3306

This is the type of output which indicates the connection is encrypted.

tcpflow: listening on any

"|{mOXU{7-rd 0E
i\R[dRa+Rk4)P5mR_h9S;lO&/=lnC<U)U87'^[email protected]{4d'Qj2{10
L*HhsM5%7dXh6w`;B;;|kHTt[_'CDm:PJbs$`/fTv'M .p2<KTE
DDujr!e7D#d^[email protected]+v3Hy(('7O\2.6{0+
V{+m'[cq|6t!Zhv,_/:EJbBF9D8Qz+2t=E(6}jR}qDezq'~)ikO$Y:F:G,UjC[{qF;/srT?7mm=#DDUNa"%|"[email protected]<szV*B^g/Ij;-f~r~X~t-]}Yvr9zpO0Yf2mOoZ-{muU1w6R.'u=zCfT,S|Cp4.<vRN_gqc[vER?NLN_XGgve-O}3.q'b*][email protected](|Sm15c&=k6Ty$Ak_ZaA.`vE=]V($Bm;_Z)sp..~&!9}uH+K>JP' Ok&erw
W")wLLi1%l5#lDV85nj>R~7Nj%*\I!zFt?w$u >;5~#)/tJbzwS~3$0u'/hK /99.X?F{2DNrpdHw{Yf!fLv
[email protected]?AsmczC2*`-/R rA-0(}DXDKC9KVnRro}m#IP*2]ftyPU3A#.?~+MDE}|l~uPi5E&hzfgp02!lXnPJLfMyFOIrcq36s90Nz3RX~n?'}ZX
cB&H\hVaZIDYTP|JpTw0 |(ElJo{[email protected]#5#[email protected]#{f)ux(EES'Ur]N!P[cp`8+Z-$vh%Hnk=K^%-[KQF'2NzTfjSgxG'/p HYMxgfOGx1"'SEQ1yY&)DC*|z{')=u`TS0u0{xp-(zi6zp3uZ'~E*ncrGPD,oW\m`2^ Hn0`h{G=zohi6H[d>^BJ~ W"c+JxhIu
h*5%nV9[:60:23K Q`8:Cysg%8q?iX_`Q"'Oj
#\?M"33F6{sN?tb|&E.08, &To*H4ovTXH;IWt<zwQ(Z4kyuLr6tKkdEw3Q,Pq!'_%~MyYL~R^(=(CH;F%CKf8q,eNObGX2Oue2#b]4<
Tu>ofc)P[DQ>Qn3=<al_q8%&C\"\_{GN%iPB;@NYyr)<!oYMOgy'PM$oLr}<#0(g]B.(1LQv)fg\.]0)9$7I nXa[e[w8oRDI1:B6 
\Vbf2bCOKZ%b^/zkk=pu(9xkg|/MnsRc9<[email protected][A!.t|?|tRr (>0^fuefIm1]-YHq5rx|W(S<egZ']=h%*Qq+gR</+0^_2q5GWGam7N).'mA4*`NhwI}noV{V<ZAbgW*c\jFiVyZ0A28TB*&GffP[zb-G,\rirs2
dmkE^hB:(R;<U8 rTc[~s/w7:%QC%TQR'f,:*|[email protected]=!qKgql7D!v
 S+.Y7cg^m!g9G*KFgI)>3:~2&*6!O|DAZWB:#n9<fz/N }(e9m8]!QOHbkd48W%h#!r)uw7{O)2cG`~Vr&AA*Z=Zo<PP
Vej+^)(9MA;De2oMiG^a`tnoCH9l#tLMXGb%EjEkkjQb/=YblLSd}~;S*>|09`I`[email protected]\E\$=/L5VHm)<pI-%(:UYeZ~/1#A.`1m]lH^oVkPsx$ASIla3=E|j{H"{Z!|$[h~W/v!]Iy:I6H%nI\26E=p.ay?JbYd`q\q( VP+mFoJ#$Dt$u
wToLdFb~gay'8uBYRKsiSL?~5LD#MS$Y&Lf[,#jj/*W (E9tT&lhTywDv
poBW3|gJI}2?|&9A/kDCo:X^w<{faH_>[#|tI"lkuK.u|!2MT/@u7u(S{"H.H'Fh/4kF_2{)Jc9NQ%jA_rI1lH;k'$n~M_%t%y)t!C_4FO?idwMB]t^M::S!a=*Jee<[email protected])L;zAuTN2}v#K4AX.(`<J0#G=$FNRof2|O*`0Ef]z\g5n"LH.Z_n3LqDsoa}D&#=XyDp.o\[email protected]$jKs=Rn
Ky`i_&|H0<y]~=XJH%f_s2~u |y\o 35c#ufmrd7'GQ/ P"9 w,Q>X1<{#


Add new comment

Jul 23 2018
Jul 23

In a previous article Using ES6 in your Drupal Components, we discussed writing our javascript using the modern ES6 methods and transpiling down for older browsers. It still used jQuery as an interim step to make the process of refactoring existing components a little easier. But let's go all the way now and pull out jQuery, leaving only modern, vanilla javascript.

Why should we do this?

jQuery was first introduced 12 years ago, with the intention of making javascript easier to write. It had support for older browsers baked into it and improved the developer experience a great deal. It also adds 87KB to a page.

Today, modern vanilla javascript looks so much like jQuery! Its support in the evergreen browsers is great and it's so much nicer to write than it was 12 years ago. There are still some things that jQuery wins on, but in the world of javascript frameworks, understanding the foundation on which they are built makes learning them so much easier.

And those older browsers? We don’t need jQuery for that either. You can support older browsers with a couple of polyfills. The polyfills I needed for the examples in this post only amounted to a 2KB file.

Drupal 8 and jQuery

One of the selling points of Drupal 8 (for us front-enders at least) was that jQuery would be optional for a theme. You choose to add it as a dependency. A lot of work has gone into rewriting core JS to remove the reliance on jQuery. There are still some sections of core that need work - Ajax related stuff is a big one. But even if you have a complex site which uses features that add jQuery in, it's still only going to be on the pages that need it. Plus we can help! Create issues and write patches for core or contrib modules that have a dependency on jQuery. 

So what does replacing jQuery look like?

In the Using ES6 blog post I had the following example for my header component.

// @file header.es6.js

const headerDefaults = {
  breakpoint: 700,
  toggleClass: 'header__toggle',
  toggleClassActive: 'is-active',
};

function header(options) {
  return (($, element) => {
    const opts = $.extend({}, headerDefaults, options);
    return $(element).each((i, obj) => {
      const $header = $(obj);
      // do stuff with $header
    });
  })(jQuery, this);
}

export { header as myHeader }


// @file header.drupal.es6.js

import { myHeader } from './header.es6';

(($, { behaviors }, { my_theme }) => {
  behaviors.header = {
    attach(context) {
      myHeader.call($('.header', context), {
        breakpoint: my_theme.header.breakpoint,
      });
    },
  };
})(jQuery, Drupal, drupalSettings);

So let’s pull out the jQuery…

// @file header.es6.js

const headerDefaults = {
  breakpoint: 700,
  toggleClass: 'header__toggle',
  toggleClassActive: 'is-active',
};

function header(options) {
  const opts = Object.assign({}, headerDefaults, options);
  const header = this;
  // do stuff with header.
}

export { header as myHeader }


// @file header.drupal.es6.js

import { myHeader } from './header.es6';

(({ behaviors }, { my_theme }) => {
  behaviors.header = {
    attach(context) {
      context.querySelectorAll('.header').forEach((obj) => {
        myHeader.call(obj, {
          breakpoint: my_theme.header.breakpoint,
        });
      });
    },
  };
})(Drupal, drupalSettings);

We’ve replaced $.extend with Object.assign for our default/overridable options. We use context.querySelectorAll('.header') instead of $('.header', context) to find all instances of .header. We’ve also moved the .each((i, obj) => {}) to the .drupal file as .forEach((obj) => {}) to simplify our called function. Overall not very different at all!
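To see why Object.assign is a drop-in replacement for $.extend here, this small sketch (runnable in Node or any evergreen browser) merges caller options over the defaults. Like $.extend, later sources win:

```javascript
// Defaults as used in the header component above.
const headerDefaults = {
  breakpoint: 700,
  toggleClass: 'header__toggle',
};

// Object.assign copies sources left-to-right into the target,
// so caller-supplied options override the defaults.
const opts = Object.assign({}, headerDefaults, { breakpoint: 900 });

console.log(opts.breakpoint);           // 900 - the override wins
console.log(opts.toggleClass);          // 'header__toggle' - default retained
console.log(headerDefaults.breakpoint); // 700 - defaults untouched
```

Passing `{}` as the target keeps `headerDefaults` itself unmutated, which is the same reason $.extend is usually called with an empty object first.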

We could go further and convert our functions to Classes, but if you're just getting started with ES6 there's nothing wrong with taking baby steps! Classes are just fancy functions, so upgrading to them in the future would be a great way to learn how they work.

Some other common things:

  • .querySelectorAll() works the same as .find()
  • .querySelector() is the same as .find().first()
  • .setAttribute(‘name’, ‘value’) replaces .attr(‘name’, ‘value’)
  • .getAttribute(‘name’) replaces .attr(‘name’)
  • .classList.add() and .classList.remove() replace .addClass() and .removeClass()
  • .addEventListener('click', (e) => {}) replaces .on('click', (e) => {})
  • .parentNode replaces .parent()
  • .children replaces .children()

You can also still use .focus(), .closest(), .remove(), .append() and .prepend(). Check out You Don't Need jQuery, it's a great resource, or just google “$.thing as vanilla js”.

Everything I’ve mentioned here that’s linked to the MDN web docs required a polyfill for IE, which is available on their respective docs page.

If you’re refactoring existing JS it’s also a good time to make sure you have some Nightwatch JS tests written to make sure you’re not breaking anything :)

Polyfills and Babel

Babel is the JS transpiler we use and it can provide the polyfills itself (babel-polyfill), but due to the nature of our component library based approach, Babel would transpile the polyfills needed for each component into that component's JS file. If you bundle everything into one file then obviously this won't be an issue. But once we start having a couple of different components' JS loaded on a page, all with similar polyfills in them, you can imagine the amount of duplication and wasted KB.

I prefer to just put the polyfills I need into one file and load it separately. It means I have full control over the quality of my polyfills (since not all polyfills are created equally). I can easily make sure I'm only polyfilling what I really need, I can easily pull them out when no longer needed, and I'm only loading that polyfill file to browsers that need it:

js/polyfill.min.js : { attributes: { nomodule: true, defer: true } }

This line is from my theme's libraries.yml file, where I'm telling Drupal about the polyfill file. If I pass the nomodule attribute, browsers which DO support ES6 modules will ignore this file, but browsers like IE will load it. We're also deferring the file so it loads after everything else.

I should point out Babel is still needed. We can't polyfill everything (like Classes or Arrow functions) and we can't transpile everything either. We need both, at least until IE stops requiring support.


Posted by Rikki Bochow
Front end Developer

Dated 24 July 2018


Great article, as always!
Wondering if you still use Rollup.js as a bundler or along the way you found out a better tool?
(Or reverted to Webpack)

Thanks Gab, yeah we still use Rollup.js for the most part. Some of the more app-like projects are using Webpack, though I'm curious to try out Parcel.js one day too.

How to replace jquery.once when we're using vanilla js?

The addEventListener() function has an option you can set to ensure it only runs once, though it also requires a polyfill. Check out this post, which also shows the alternative approach of using removeEventListener().
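As a node-runnable sketch of that `once` option (using a bare EventTarget in place of a DOM element, since the listener API is the same; available globally in Node 15+):

```javascript
// `once: true` removes the listener automatically after its first
// call, so repeated events don't re-run the handler.
const target = new EventTarget(); // stands in for a DOM element here
let runs = 0;

target.addEventListener('setup', () => { runs += 1; }, { once: true });

target.dispatchEvent(new Event('setup'));
target.dispatchEvent(new Event('setup'));

console.log(runs); // 1 - the handler only ran the first time
```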



Jul 05 2018

Automated accessibility tools are only one part of ensuring a website is accessible, but they are a very simple part that can catch a lot of easy-to-fix issues. When found and corrected early in the development cycle, these issues don't get compounded into much larger issues down the track.

I’m sure we all agree that the accessibility of ALL websites is important. Testing for accessibility (a11y) shouldn’t be limited to Government services. It shouldn’t be something we need to convince non-government clients to set aside extra budget for. It certainly shouldn’t be left as a pre-launch checklist item that only gets the proper attention if the allocated budget and timeframe hasn’t been swallowed up by some other feature.

Testing each new component or feature against an a11y checker, as it's being developed, takes a small amount of time. Especially when compared to the budget required to check and correct an entire website before launch -- for the very first time. Remembering to run such tests after a component's initial development is one thing. Remembering to re-check later down the line when a few changes and possible regressions have gone through is another. Our brains can only do so much, so why not let the nice, clever computer help out?


NightwatchJS is going to be included in Drupal 8.6.x, with some great Drupal-specific commands to make functional javascript testing in Drupal super easy. It's early days so the documentation is still being formed. But we don't have to wait for 8.6.x to start using Nightwatch, especially when we can test interactions against our living styleguide rather than booting up Drupal.

So let's add it to our build tools:

$ npm install nightwatch

and create a basic nightwatch.json file:

  "src_folders": [
  "output_folder": "build/logs/nightwatch",
  "test_settings": {
    "default": {
      "filter": "**/tests/*.js",
      "launch_url": "",
      "selenium_host": "",
      "selenium_port": "4444",
      "screenshots": {
        "enabled": true,
        "on_failure": true,
        "on_error": true,
        "path": "build/logs/nightwatch"
      "desiredCapabilities": {
        "browserName": "chrome"

We're pointing to our theme and custom modules as the source of our JS tests as we like to keep the tests close to the original JS. Our test settings are largely based on the Docker setup described below, with the addition of the 'filter' setting which searches the source for .js files inside a tests directory.

A test could be as simple as checking for an attribute, like the following example:

/**
 * @file responsiveTableTest.js.
 */

module.exports = {
  'Responsive tables setup': (browser) => {
    browser
      .url(browser.launch_url) // The styleguide page with the table component.
      .pause(100); // Wait a beat for the JS to initiate.
    browser.expect.element('td').to.have.attribute('data-label');
    browser.end();
  },
};

Which launches the styleguide's table component, waits a beat for the JS to initiate, then checks that our td elements have the data-label that our JS added. Or it could be much more complex.

aXe: the Accessibility engine

aXe is a really nice tool for doing basic accessibility checks, and the Nightwatch Accessibility node module integrates aXe with Nightwatch so we can include accessibility testing within our functional JS tests without needing to write out the rules ourselves. Even if you don't write any component-specific tests with your Nightwatch setup, including this one accessibility test will give you basic coverage.

$ npm install nightwatch-accessibility

Then we edit our nightwatch.json file to include the custom_commands_path and custom_assertions_path:

  "src_folders": ["app/themes/previousnext_d8_theme/src/"],
  "output_folder": "build/logs/nightwatch",
  "custom_commands_path": ["./node_modules/nightwatch-accessibility/commands"],
  "custom_assertions_path": ["./node_modules/nightwatch-accessibility/assertions"],
  "test_settings": {

Then write a test to do the accessibility check:

/**
 * @file Run Axe accessibility tests with Nightwatch.
 */

const axeOptions = {
  timeout: 500,
  runOnly: {
    type: 'tag',
    values: ['wcag2a', 'wcag2aa'],
  },
  reporter: 'v2',
  elementRef: true,
};

module.exports = {
  'Accessibility test': (browser) => {
    browser
      .url(browser.launch_url)
      .assert.accessibility('.kss-modifier__example', axeOptions)
      .end();
  },
};

Here we're configuring aXe core to check for wcag2a and wcag2aa, for anything inside the .kss-modifier__example selector of our Styleguide. Running this will check all of our components and tell us if it's found any accessibility issues. It'll also fail a build, so when hooked up with something like CircleCI, we know our Pull Requests will fail.

If we want to exclude a selector, instead of the .kss-modifier__example selector, we pass an include/exclude object { include: ['.kss-modifier__example'], exclude: ['.hljs'] }.

If you only add one test, make it one like this. Hopefully once you get started writing Nightwatch tests you'll see how easy it is and eventually add more :)

You can include the accessibility test within another functional test too, for example a modal component. You'll want to test it opens and closes ok, but once it's open it might have some accessibility issues that the overall check couldn't test for. So we want to re-run the accessibility assertion once it's open:

/**
 * @file dialogTest.js
 */

const axeOptions = require('../../../axeOptions.js'); // axeOptions are now shareable.

const example = '#kssref-6-18 .kss-modifier__example';
const trigger = '#example-dialog-toggle';
const dialog = '.js-dialog';

module.exports = {
  'Dialog opens': (browser) => {
    browser
      .url(browser.launch_url)
      .click(trigger)
      .pause(100); // Wait a beat for the dialog to open.
    browser.assert.attributeEquals(dialog, 'aria-hidden', 'false');
    browser.assert.accessibility(example, axeOptions);
    browser.end();
  },
};


As mentioned above this all needs a little Docker & Selenium setup too. Selenium has docs for adding an image to Docker, but the setup basically looks like this:

@file docker-compose.yml

services:
  app:
    [general docker image stuff...]

  selenium:
    image: selenium/standalone-chrome
    network_mode: service:app
    volumes:
      - /dev/shm:/dev/shm

Then depending on what other CI tools you're using you may need some extra config. For instance, to get this running on CircleCI, we need to tell it about the Selenium image too:

@file .circleci/config.yml

     [other docker images...]
     - image: selenium/standalone-chrome

If you're not using docker or any CI tools and just want to test this stuff locally, there's a node module for adding the selenium-webdriver but I haven't tested it out with Nightwatch.

Don’t forget the manual checks!

There’s a lot more to accessibility testing than just these kinds of automated tests. A layer of manual testing will always be required to ensure a website is truly accessible. But automating the grunt work of running a checklist against a page is one very nice step towards an accessible internet.


Jul 03 2018

Back in the Drupal 6 days, I built the BOM Weather Drupal module to pull down weather data from the Australian Bureau of Meteorology (BOM) site, and display it to users.

We recently had a requirement for this in a new Drupal 8 site, so decided to take a more modern approach.

Not that kind of decoupled Drupal

We often hear the term Decoupled Drupal but I'm not talking about a Javascript front-end and Drupal Web Service API backend.

This kind of decoupling is removing the business logic away from Drupal concepts. Drupal then becomes a wrapper around the library to handle incoming web requests, configuration and display logic.

We can write the business logic as a standalone PHP package, with its own domain models, and publish it to be shared by both Drupal and non-Drupal projects.

The Bom Weather Library

We started by writing unit-testable code that pulled in weather forecast data in an XML format, and produced a model in PHP classes that is much easier for consuming code to use. See the full BOM Weather code on GitHub.

For example:

$client = new BomClient($logger);
$forecast = $client->getForecast('IDN10031');

$issueTime = $forecast->getIssueTime();

$regions = $forecast->getRegions();
$metros = $forecast->getMetropolitanAreas();
$locations = $forecast->getLocations();

foreach ($locations as $location) {
  $aac = $location->getAac();
  $desc = $location->getDescription();

  /** @var \BomWeather\Forecast\ForecastPeriod[] $periods */
  $periods = $location->getForecastPeriods();

  // Usually 7 days of forecast data.
  // Usually 7 days of forecast data.
  foreach ($periods as $period) {
    $date = $period->getStartTime();
    $maxTemp = $period->getAirTempMaximum();
    $precis = $period->getPrecis();
  }
}

The library takes care of fetching the data, and the idiosyncrasies of a fairly crufty API (no offence intended!).

Unit Testing

We can have very high test coverage with our model. We can test the integration with mock data, and ensure a large percentage of the code is tested. As we are using PHPUnit tests, they are lightning fast, and are automated as part of a Pull Request workflow on CircleCI.

Any consuming Drupal code can focus on testing just the Drupal integration, and not need to worry about the library code.

Dependency Management

As this is a library, we need to be very careful not to introduce too many runtime dependencies. Also, any versions of those dependencies need to be more flexible than what you would normally use for a project. If you make your dependency version constraints too strict they can introduce incompatibilities when used at a project level. Consumers will simply not be able to add your library via composer.

We took a strategy with the BOM Weather library of having high-low automated testing via CircleCI. This means you test using both: 

composer update --prefer-lowest


composer update

The first will install the lowest possible versions of your dependencies as specified in your composer.json. The second will install the highest possible versions. 

This ensures your version constraints are set correctly and your code should work with any versions in between.


At PreviousNext, we have been using the decoupled model approach on our projects for the last few years, and can certainly say it leads to more robust, clean and testable code. We have had projects migrate from Drupal 7 to Drupal 8 and as the library code does not need to change, the effort has been much less.

If you are heading to Drupal Camp Singapore, make sure to see Eric Goodwin's session on Moving your logic out of Drupal.


Posted by Kim Pepper
Technical Director

Dated 4 July 2018


Thanks for writing this! It's great to see this approach gain traction in Drupal 8. We're doing the same thing with the Drupal 8 version of the media_mpx module. As you say, test coverage of the critical functionality is so much simpler when you aren't dealing with the testing difficulties of Drupal 8 entities.

We've had good success bridging Drupal services back into non-Drupal libraries. For example, we use the cache PSRs to allow the PHP library to save data to Drupal's cache. You might be interested in the library which does the same thing for locks.

Thanks Andrew. I will check them out!



Jun 18 2018

In Drupal 8.5.0, the "processed" property of text fields is available in REST which means that REST apps can render the HTML output of a textarea without worrying about the filter formats.

In this post, I will show you how you can add your own processed fields to be output via the REST API.

The "processed" property mentioned above is what is known as a computed property on the textarea field.

The ability to make the computed properties available for the REST API like this can be very helpful. For example, when the user inputs the raw value and Drupal performs some complex logical operations on it before showing the output.

Drupal fieldable entities can also have computed properties, and those properties can also be exposed via REST. I used the following solution to expose the data of an entity field which takes raw data from the user and performs some complex calculations on it.

First of all, we need to implement hook_entity_bundle_field_info() to add the property. Because it is a computed field, we don't need to implement hook_entity_field_storage_info().

<?php
// my_module/my_module.module

/**
 * @file
 * Module file for my_module.
 */

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\my_module\FieldStorageDefinition;
use Drupal\my_module\Plugin\Field\MyComputedItemList;

/**
 * Implements hook_entity_bundle_field_info().
 */
function my_module_entity_bundle_field_info(EntityTypeInterface $entity_type, $bundle, array $base_field_definitions) {
  $fields = [];
  // Add a property only to nodes of the 'my_bundle' bundle.
  if ($entity_type->id() === 'node' && $bundle === 'my_bundle') {
    // It is not a base field so we need a custom field storage definition.
    $fields['my_computed_property'] = FieldStorageDefinition::create('string')
      ->setLabel(t('My computed property'))
      ->setDescription(t('This is my computed property.'))
      ->setComputed(TRUE)
      ->setClass(MyComputedItemList::class)
      ->setReadOnly(FALSE)
      ->setInternal(FALSE)
      ->setDisplayOptions('view', [
        'label' => 'hidden',
        'region' => 'hidden',
        'weight' => -5,
      ])
      ->setDisplayOptions('form', [
        'label' => 'hidden',
        'region' => 'hidden',
        'weight' => -5,
      ])
      ->setTargetEntityTypeId($entity_type->id())
      ->setTargetBundle($bundle)
      ->setName('my_computed_property')
      ->setDisplayConfigurable('form', FALSE)
      ->setDisplayConfigurable('view', FALSE);
  }
  return $fields;
}

Then we need the MyComputedItemList class to perform some magic. This class will allow us to set the computed field value.

<?php
// my_module/src/Plugin/Field/MyComputedItemList.php

namespace Drupal\my_module\Plugin\Field;

use Drupal\Core\Field\FieldItemList;
use Drupal\Core\TypedData\ComputedItemListTrait;

/**
 * My computed item list class.
 */
class MyComputedItemList extends FieldItemList {

  use ComputedItemListTrait;

  /**
   * {@inheritdoc}
   */
  protected function computeValue() {
    $entity = $this->getEntity();
    if ($entity->getEntityTypeId() !== 'node' || $entity->bundle() !== 'my_bundle' || $entity->my_some_other_field->isEmpty()) {
      return;
    }
    $some_string = some_magic($entity->my_some_other_field);
    $this->list[0] = $this->createItem(0, $some_string);
  }

}

The field we add is not a base field so we can't use \Drupal\Core\Field\BaseFieldDefinition. There is an open core issue to address that but in tests there is a workaround using a copy of \Drupal\entity_test\FieldStorageDefinition:

<?php
// my_module/src/FieldStorageDefinition.php

namespace Drupal\my_module;

use Drupal\Core\Field\BaseFieldDefinition;

/**
 * A custom field storage definition class.
 *
 * For convenience we extend from BaseFieldDefinition although this should not
 * implement FieldDefinitionInterface.
 *
 * @todo Provide and make use of a proper FieldStorageDefinition class instead.
 */
class FieldStorageDefinition extends BaseFieldDefinition {

  /**
   * {@inheritdoc}
   */
  public function isBaseField() {
    return FALSE;
  }

}

Last but not least we need to announce our property definition to the entity system so that it can keep track of it. As it is an existing bundle we can write an update hook. Otherwise, we'd need to implement hook_entity_bundle_create.

<?php
// my_module/my_module.install

/**
 * @file
 * Install file for my module.
 */

use Drupal\my_module\FieldStorageDefinition;
use Drupal\my_module\Plugin\Field\MyComputedItemList;

/**
 * Adds my computed property.
 */
function my_module_update_8001() {
  $fields['my_computed_property'] = FieldStorageDefinition::create('string')
    ->setLabel(t('My computed property'))
    ->setDescription(t('This is my computed property.'))
    ->setComputed(TRUE)
    ->setClass(MyComputedItemList::class)
    ->setReadOnly(FALSE)
    ->setInternal(FALSE)
    ->setDisplayOptions('view', [
      'label' => 'hidden',
      'region' => 'hidden',
      'weight' => -5,
    ])
    ->setDisplayOptions('form', [
      'label' => 'hidden',
      'region' => 'hidden',
      'weight' => -5,
    ])
    ->setTargetEntityTypeId('node')
    ->setTargetBundle('my_bundle')
    ->setName('my_computed_property')
    ->setDisplayConfigurable('form', FALSE)
    ->setDisplayConfigurable('view', FALSE);

  // Notify the storage about the new field.
  \Drupal::service('field_definition.listener')->onFieldDefinitionCreate($fields['my_computed_property']);
}

The beauty of this solution is that I don't have to write a custom serializer to normalize the output. Drupal Typed Data API is doing all the heavy lifting.

Related Drupal core issues:


Posted by Jibran Ijaz
Senior Drupal Developer

Dated 18 June 2018


Jun 14 2018

If you've ever patched Drupal core with Composer you may have noticed patched files can sometimes end up in the wrong place like core/core or core/b. Thankfully there's a quick fix to ensure the files make it into the correct location.

When using cweagans/composer-patches it's easy to include patches in your composer.json

"patches": {
    "drupal/core": {
        "Introduce a batch builder class to make the batch API easier to use": ""

However in certain situations patches will get applied incorrectly. This can happen when the patch is only adding new files (not altering existing files), like in the patch above. The result is the patched files end up in a subfolder core/core. If the patch is adding new files and editing existing files the new files will end up in core/b. This is because composer-patches cycles through the -p levels trying to apply them: 1, 0, 2, then 4.
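A sketch of why this happens (illustrative only, not composer-patches' actual code, and the file path is hypothetical): core patches carry git-style paths like b/core/..., and the patch is applied from inside the core/ directory, so -p1 strips only the b/ prefix and the file lands in core/core/:

```javascript
// A -pN patch level strips the first N components from each path in
// the patch before the file is written, relative to the directory the
// patch is applied in (core/ for the drupal/core package).
function targetPath(patchPath, p, applyDir) {
  const stripped = patchPath.split('/').slice(p).join('/');
  return `${applyDir}/${stripped}`;
}

// Hypothetical new-file path from a drupal/core patch.
const patchPath = 'b/core/lib/Drupal/Core/Batch/BatchBuilder.php';

// -p1 only strips the git "b/" prefix, so the core/ in the patch path
// is duplicated under the apply directory:
console.log(targetPath(patchPath, 1, 'core'));
// core/core/lib/Drupal/Core/Batch/BatchBuilder.php

// -p2 strips "b/core/" as well, landing the file where it belongs:
console.log(targetPath(patchPath, 2, 'core'));
// core/lib/Drupal/Core/Batch/BatchBuilder.php
```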

Thankfully there is an easy fix!

"extra": {
   "patchLevel": {
       "drupal/core": "-p2"

Setting the patch level to p2 ensures any patch for core will get applied correctly.

Note that until composer-patches has a 1.6.5 release, specifically this PR, you'll need to use the dev release like:

"require": {
    "cweagans/composer-patches": "1.x-dev"

The 2.x branch of composer-patches also includes this feature.

Big thanks to cweagans for this great tool and jhedstrom for helping to get this into the 1.x branch.


thanks for the blog @Saul Willers .

Fantastic, thanks Cameron!

Thanks for the trick.

And just as an addendum about *creating* patches from a split core git repository: make sure to use "git diff --src-prefix=a/core/ --dst-prefix=b/core/".




May 09 2018

Several times in the past I've been caught out by Drupal's cron handler silently catching exceptions during tests.

Your test fails, and there is no clue as to why.

Read on to find out how to shine some light on this, by making your kernel tests fail on any exception during cron.

If you're running cron during a kernel test and expecting something to happen, but it doesn't - it can be hard to debug why.

Ordinarily an uncaught exception during a test will cause PHPUnit to fail, and you can pinpoint the issue.

However, if you're running cron in the test this may not be the case.

This is because, by default Drupal's cron handler catches all exceptions and silently logs them. This is colloquially known as Pokemon exception handling.

The act of logging an exception is not enough to fail a test.

So your test skips the exception and carries on, failing in other ways unexpectedly.

This is exacerbated by the fact that PHPUnit throws an exception for warnings. So the slightest issue in your code will cause it to halt execution. In an ordinary scenario, this exception causes the test to fail. But the pokemon catch block in the Cron class prevents that, and your test continues in a weird state.
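The problem in miniature (a language-agnostic sketch in javascript, not Drupal's actual code): a catch-all around the job means a failure inside it never reaches the test runner:

```javascript
// A cron runner that swallows everything, like Drupal's Cron class.
function runCron(job, log) {
  try {
    job();
  } catch (e) {
    log.push(e.message); // logged, but never re-thrown
  }
}

const log = [];
let reachedEnd = false;

runCron(() => { throw new Error('queue worker blew up'); }, log);
reachedEnd = true; // the "test" carries on as if nothing happened

console.log(log);        // [ 'queue worker blew up' ]
console.log(reachedEnd); // true - the failure was silently absorbed
```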

This is the code in question in the cron handler:

try {
  // ...
}
catch (\Exception $e) {
  // In case of any other kind of exception, log it and leave the item
  // in the queue to be processed again later.
  watchdog_exception('cron', $e);
}

So how do you make this fail your test? In the end, it's quite simple.

Firstly, you make your test a logger and use the handy trait to do the bulk of the work.

You only need to implement the log method, as the trait takes care of handling all other methods.

In this case, watchdog_exception logs exceptions as RfcLogLevel::ERROR. The log levels are integers, from most severe to least severe. So in this implementation we tell PHPUnit to fail the test with any messages logged where the severity is ERROR or worse.

use Drupal\KernelTests\KernelTestBase;
use Psr\Log\LoggerInterface;
use Drupal\Core\Logger\RfcLoggerTrait;
use Drupal\Core\Logger\RfcLogLevel;

class MyTest extends KernelTestBase implements LoggerInterface {

  use RfcLoggerTrait;

  /**
   * {@inheritdoc}
   */
  public function log($level, $message, array $context = []) {
    if ($level <= RfcLogLevel::ERROR) {
      $this->fail(strtr($message, $context));
    }
  }

}
Then in your setUp method, you register your test as a logger.

$this->container->get('logger.factory')->addLogger($this);
And that's it - now any errors that are logged will cause the test to fail.

If you think we should do this by default, please comment on this core issue.


Posted by Lee Rowlands
Senior Drupal Developer

Dated 9 May 2018


Mar 15 2018

In most of the projects we build, the HTML markup provided by core just gets in the way. There are way too many wrapper divs. This can cause issues when trying to create lean markup that matches what is produced in a generated styleguide.

In this post, I'll introduce you to the concept of bare templates, and how you can remove unnecessary markup from your Twig templates.

In Drupal 8, a couple of themes are shipped by default to serve a common set of end user needs.

Among them are:

  • Bartik: A flexible, recolourable theme with many regions and a responsive, mobile-first layout.
  • Seven: The default administration theme for Drupal 8 was designed with clean lines, simple blocks, and sans-serif font to emphasise the tools and tasks at hand.
  • Stark: An intentionally plain theme with almost no styling, to demonstrate Drupal's default HTML and CSS.
  • Stable: A base theme. Stable theme aggregates all of the CSS from Drupal core into a single theme. Theme markup and CSS will not change so any sub-theme of Stable will know that updates will not cause it to break.
  • Classy: A sub-theme of Stable, designed with a lot of extra markup and CSS classes to help beginner themers.

But in an actual business scenario, the requirements and expectations of a client towards the look and feel of the website are far more distinct than the themes that are provided in Drupal core.

When building your site based upon one of these themes it is common to face issues with templating during the frontend implementation phase. Quite often the default suggested templates for blocks, nodes, fields etc. contain HTML wrapper divs that your style guide doesn’t require.

Usually the most effective way is to build themes using the Stable theme. In Stable, the theme markup and CSS are fixed between any new Drupal core releases, making any sub-theme less likely to break on a Drupal core update. It also has verbose field template support for debugging.

Which leads us to use bare templates.

What is a bare template?

A bare template is a twig file that has the minimum number of HTML wrappers around actual content. It could be as simple as a file with a single content output like {{ content }}.

Compared to the traditional approach, bare templates provide benefits such as:

  • Ease of maintenance: With minimum markup the complexity of the template is much lower, making it easy to maintain.
  • Cleaner markup: The markup will only have the essential or relevant elements, whereas in the traditional approach there are a lot of wrappers leading to a complex output.
  • Smaller page size: Less markup means a smaller page size.
  • Avoids the need for markup removal modules: With the bare markup method we do not need to use modules like fences or display suite. That means fewer modules to maintain and less configuration to worry about.

Our Example

We need to create a bare template for the field type, and suggest it only when rendering the name and field_image fields of the my_vocabulary taxonomy entity. This will prevent Drupal from suggesting this bare template for other fields belonging to different entities.

Field template

Let's have a look at the field template, which resides at app/core/themes/stable/templates/field/field.html.twig

{% if label_hidden %}
  {% if multiple %}
    <div{{ attributes }}>
      {% for item in items %}
        <div{{ item.attributes }}>{{ item.content }}</div>
      {% endfor %}
    </div>
  {% else %}
    {% for item in items %}
      <div{{ attributes }}>{{ item.content }}</div>
    {% endfor %}
  {% endif %}
{% else %}
<div{{ attributes }}>
  <div{{ title_attributes }}>{{ label }}</div>
  {% if multiple %}
    <div>
  {% endif %}
  {% for item in items %}
    <div{{ item.attributes }}>{{ item.content }}</div>
  {% endfor %}
  {% if multiple %}
    </div>
  {% endif %}
</div>
{% endif %}

As you see there are quite a lot of div wrappers used in the default template, which makes it difficult to style components. If you are looking for simple output, this code is overkill. There is, however, a lot of valuable information provided in the comments of field.html.twig which we can use.

* @file
* Theme override for a field.
* To override output, copy the "field.html.twig" from the templates directory
* to your theme's directory and customize it, just like customizing other
* Drupal templates such as page.html.twig or node.html.twig.
* Instead of overriding the theming for all fields, you can also just override
* theming for a subset of fields using
* @link themeable Theme hook suggestions. @endlink For example,
* here are some theme hook suggestions that can be used for a field_foo field
* on an article node type:
* - field--node--field-foo--article.html.twig
* - field--node--field-foo.html.twig
* - field--node--article.html.twig
* - field--field-foo.html.twig
* - field--text-with-summary.html.twig
* - field.html.twig
* Available variables:
* - attributes: HTML attributes for the containing element.
* - label_hidden: Whether to show the field label or not.
* - title_attributes: HTML attributes for the title.
* - label: The label for the field.
* - multiple: TRUE if a field can contain multiple items.
* - items: List of all the field items. Each item contains:
*   - attributes: List of HTML attributes for each item.
*   - content: The field item's content.
* - entity_type: The entity type to which the field belongs.
* - field_name: The name of the field.
* - field_type: The type of the field.
* - label_display: The display settings for the label.
* @see template_preprocess_field()

The code

Building the hook.

We will be using hook_theme_suggestions_HOOK_alter() to suggest the two fields to use our bare template when rendering.

It is important to note that only these two fields will be using the bare template and the other fields (if any) in that entity will use the default field.html.twig template to render.

/**
 * Implements hook_theme_suggestions_HOOK_alter() for field templates.
 */
function my_custom_theme_theme_suggestions_field_alter(array &$hooks, array $vars) {

    // Get the element names passed on when a page is rendered.
    $name = $vars['element']['#field_name'];

    // Build the string layout for the fields:
    // <entity type>:<bundle name>:<view mode>:<field name>
    // The view mode ('full') is assumed here; adjust to match your display.
    $bare_hooks = [
        'taxonomy_term:my_vocabulary:full:name',
        'taxonomy_term:my_vocabulary:full:field_image',
    ];

    // Build the actual var structure from the render element.
    $hook = implode(':', [
        $vars['element']['#entity_type'],
        $vars['element']['#bundle'],
        $vars['element']['#view_mode'],
        $vars['element']['#field_name'],
    ]);

    // Check if the strings match and assign the bare template.
    if (in_array($hook, $bare_hooks, TRUE)) {
        $hooks[] = 'field__no_markup';
    }
}

The hook key field__no_markup mentioned in the code corresponds to a twig file which must reside under app/themes/custom/my_theme/templates/field/field--no-markup.html.twig
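As an aside, the hook-to-filename conversion follows Drupal's standard convention: double underscores become double hyphens and remaining underscores become single hyphens. It can be demonstrated with plain PHP:

```php
<?php
// Illustration of Drupal's hook-to-template-filename convention:
// '__' becomes '--' and any remaining '_' becomes '-'.
$hook = 'field__no_markup';
$template = strtr($hook, ['__' => '--', '_' => '-']) . '.html.twig';
echo $template; // field--no-markup.html.twig
```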

Debugging Output

In order to see how this is working, we can fire up PHPStorm and walk the code in the debugger.

As you can see in the output below, the implode() call creates the actual variable structure from its second parameter. We compare this against the $bare_hooks array we created, which lists the fields, specific to content entity types, that we need to assign the bare template to.

Note: as best practice, make sure you pass TRUE as the third argument to in_array(), so the comparison validates the data type as well.
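The matching logic can be exercised outside Drupal. In the sketch below, the element values are hypothetical stand-ins for what the render array contains at runtime:

```php
<?php
// Hypothetical stand-in for the $vars['element'] render array values.
$element = [
  '#entity_type' => 'taxonomy_term',
  '#bundle' => 'my_vocabulary',
  '#view_mode' => 'full',
  '#field_name' => 'field_image',
];

$bare_hooks = [
  'taxonomy_term:my_vocabulary:full:name',
  'taxonomy_term:my_vocabulary:full:field_image',
];

// Assemble <entity type>:<bundle name>:<view mode>:<field name>.
$hook = implode(':', [
  $element['#entity_type'],
  $element['#bundle'],
  $element['#view_mode'],
  $element['#field_name'],
]);

// Strict comparison (third argument TRUE) also validates the data type.
var_dump(in_array($hook, $bare_hooks, TRUE)); // bool(true)
```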


Bare Template Markup

The following is the contents of our bare template file. Notice the lack of any HTML?

{#
/**
 * @file
 * Theme override to remove all field markup.
 */
#}
{% spaceless %}
{% for item in items %}
  {{ item.content }}
{% endfor %}
{% endspaceless %}

Bare templating can be applied to other commonly used templates as well, making them render a minimal set of elements.


We can always use custom templating to avoid fighting complicated markup, and it gives us the flexibility to maintain templates that render for specific entities.



Great post! I love getting to the cleanest markup possible.

Since the field templates don't have the `attributes`, have you run into any issues with Contextual Links & Quick Edit working? I've run into this issue trying to achieve the same thing using different methods:




Add new comment

Mar 12 2018
Mar 12

Since the release of Drupal 8, it has become tricky to determine what and where override configuration is set.

Here are some of the options for a better user experience.

Drupal allows you to override configuration by setting variables in settings.php. This allows you to vary configuration based on the environment your site is served from. In Drupal 7, when overrides are set, the overridden value is immediately visible in the administration UI. Though the true value is transparent, when a user attempts to change the configuration the changes appear to be ignored: they are saved and stored, but Drupal exposes the overridden value again when the configuration form is (re)loaded.

With Drupal 8, the behaviour of overridden configuration has reversed. You are always presented with the active configuration, usually set by site builders. When configuration is accessed by code, overrides are applied on top of the active configuration seamlessly. This setup is great if you want to deploy the active configuration to other environments, but it can be confusing on sites with overrides, since it's not immediately obvious which values Drupal is using.
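The layering can be modelled outside Drupal. The snippet below is a minimal sketch of the behaviour described above, not the config system's actual implementation:

```php
<?php
// Active configuration: what site builders save through admin forms.
$active = ['system.logging' => ['error_level' => 'hide']];

// Overrides: what settings.php supplies for this environment.
$overrides = ['system.logging' => ['error_level' => 'verbose']];

// What code reading configuration effectively sees: overrides win.
$effective = array_replace_recursive($active, $overrides);

echo $effective['system.logging']['error_level']; // verbose
// ...while admin forms keep displaying the active value ('hide').
```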

An example of this confusion: your configuration form shows that PHP error messages are switched on, but no messages are visible. Or perhaps you are overriding Swiftmailer with environment-specific email servers, but emails aren't going to the servers displayed on the form.

A Drupal core issue exists to address these concerns. However, this post aims to introduce a stopgap, in the form of a contrib module, of course.

Introducing Configuration Override Inspector (COI). This module makes configuration overrides completely transparent to site builders, and provides a few ways overridden values can be exposed to them.

The following examples show error settings set to OFF in active configuration, but ON in overridden configuration (such as a local.settings.php override on your dev machine):

// settings.php
$config['system.logging']['error_level'] = 'verbose';

Hands-off: Allow users to modify active configuration, while optionally displaying a message with the true value. This is most like out-of-the-box Drupal 8 behaviour:

Coi Passive

Expose and Disable: Choose whether to disable form fields that have overrides, displaying the true value as the field value:

Coi Disabled

Invisible: Completely hide form fields with overrides:

Coi Hidden

Unfortunately, Configuration Override Inspector doesn't yet know how to map form fields to the appropriate configuration objects. The contrib module Config Override Core Fields exists to provide this mapping for Drupal core forms. Further documentation is available for contrib modules to map their own fields to configuration objects, which looks a bit like this:

$config = $this->config('system.logging');
$form['error_level'] = [
  '#type' => 'radios',
  '#title' => t('Error messages to display'),
  '#default_value' => $config->get('error_level'),
  // ...
  '#config' => [
    'key' => 'system.logging:error_level',
  ],
];

Get started with Configuration Override Inspector (COI) and Config Override Core Fields:

composer require drupal/coi
composer require drupal/config_override_core_fields

COI requires Drupal 8.5 and above, thanks to improvements in the Drupal core API.

Have another strategy for handling config overrides? Let me know in the comments!

Photo of Daniel Phin

Posted by Daniel Phin
Drupal Developer

Dated 12 March 2018

Add new comment

Feb 14 2018
Feb 14

In one of our recent projects, our client asked us to use the LinkIt module to insert links to content from the Group module, with the added requirement that only content in the same group as the content being edited is suggested in the matches.

Here’s how we did it.

The LinkIt module

First, let me give you a quick overview of the LinkIt module.

LinkIt is a tool that is commonly used to link internal or external artifacts. One of the main advantages of using it is that LinkIt maintains links by UUID, which means no broken links. It can link any type of entity: core entities like nodes, users, taxonomy terms, files and comments, as well as custom entities created by developers.

Once you install the module, you need to set up a LinkIt profile, which consists of information about which plugins to use. Profiles are managed at /admin/config/content/linkit. The final step is to enable the LinkIt plugin on the text format you want to use; formats are found at /admin/config/content/formats. You should then see the link icon when editing a content item.

Once you click on the LinkIt icon, it will open a modal as shown below.

By default LinkIt ships with a UI to maintain profiles that enables you to manage matchers.


Matchers are responsible for managing the autocomplete suggestion criteria for a particular LinkIt field. They provide bundle restriction and bundle grouping settings.

Proposed resolution

To solve the issue, we started off by creating a matcher for our particular entity type. LinkIt has an EntityMatcher plugin that uses Drupal's Plugin Derivatives API to expose one plugin for each entity type, so we started by adding the matcher that the LinkIt module exposes for our custom group content entity type.

We left the bundle restrictions and bundle grouping sections un-ticked so that all existing bundles are allowed, and content from those bundles will be displayed.

Now that the content is ready, we have to let the matcher know that we only need to load content belonging to the particular group in which the user is editing or creating the page.

Using the deriver

In order to do that, we have to create a new class at /modules/custom/your_plugin_name/src/Plugin/Linkit/Matcher/YourClassNameMatcher.php, extending the existing EntityMatcher class found at /modules/contrib/linkit/src/Plugin/Linkit/Matcher/EntityMatcher.php.

Because the LinkIt module's plugin deriver exposes each entity-type plugin with an ID of the form entity:{entity_type_id}, we simply need to create a new plugin with an ID that matches our entity type ID. This then takes precedence over the default derivative-based plugin provided by the LinkIt module. We can then modify the logic in either the ::execute() or ::buildEntityQuery() method.

Using LinkIt autocomplete request

But here comes the challenge: on the content edit page, the LinkIt modal doesn't know about the group of the content being edited, so we cannot easily filter the suggestions based on that content. We need to take some fairly extreme measures to make the group ID available to our new class, so that it can filter the content once the modal is loaded and the user starts typing in the field.

In this case the group ID is available from the page URI.

So in order to pass this along, we can make use of the fact that the LinkIt autocomplete widget has a data attribute, data-autocomplete-path, which is used by its JavaScript to perform the autocomplete request. We can add a process callback to the LinkIt element to extract the current page URI and pass it as a query parameter in the autocomplete path.

The code

To do so, we need to implement hook_element_info_alter() in our custom module. There we add a new process callback, and in that callback we add the current browser URL as a query parameter to the data-autocomplete-path attribute of the element.

The getInfo() method of \Drupal\linkit\Element\Linkit is as follows:

public function getInfo() {
  $class = get_class($this);
  return [
    '#input' => TRUE,
    '#size' => 60,
    '#process' => [
      [$class, 'processLinkitAutocomplete'],
      [$class, 'processGroup'],
    ],
    '#pre_render' => [
      [$class, 'preRenderLinkitElement'],
      [$class, 'preRenderGroup'],
    ],
    '#theme' => 'input__textfield',
    '#theme_wrappers' => ['form_element'],
  ];
}

Below is the code to add the process callback and alter the data-autocomplete-path attribute. We rely on the HTTP Referer header, which Drupal sends in the AJAX request used to display the LinkIt modal, which in turn builds the LinkIt element.

/**
 * Implements hook_element_info_alter().
 */
function your_module_name_element_info_alter(array &$info) {
  $info['linkit']['#process'][] = 'your_module_name_linkit_process';
}

/**
 * Process callback.
 */
function your_module_name_linkit_process($element) {
  // Get the HTTP referrer (current page URL).
  $url = \Drupal::request()->server->get('HTTP_REFERER');

  // Parse out just the path.
  $path = parse_url($url, PHP_URL_PATH);

  // Append it as a query parameter to the autocomplete path.
  $element['#attributes']['data-autocomplete-path'] .= '?uri=' . urlencode($path);
  return $element;
}
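The round trip between the process callback and the matcher we write next can be checked standalone. The group path structure here is a hypothetical example:

```php
<?php
// What the browser sent as the Referer for the content edit page
// (hypothetical group path structure: /group/{id}/...).
$referer = 'https://example.com/group/12/content/create';

// Process callback side: extract the path and append it to the
// data-autocomplete-path attribute as a query parameter.
$path = parse_url($referer, PHP_URL_PATH);
$suffix = '?uri=' . urlencode($path);

// Matcher side: read the uri parameter back and pull out the group ID.
parse_str(ltrim($suffix, '?'), $query);
list(, , $group_id) = explode('/', $query['uri']);

echo $group_id; // 12
```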

Once this is done, we can proceed to create the new plugin class extending the EntityMatcher class.

namespace Drupal\your_module\Plugin\Linkit\Matcher;

use Drupal\linkit\Plugin\Linkit\Matcher\EntityMatcher;
use Drupal\linkit\Suggestion\EntitySuggestion;
use Drupal\linkit\Suggestion\SuggestionCollection;

/**
 * Provides specific LinkIt matchers for our custom entity type.
 *
 * @Matcher(
 *   id = "entity:your_content_entity_type",
 *   label = @Translation("Your custom content entity"),
 *   target_entity = "your_content_entity_type",
 *   provider = "your_module"
 * )
 */
class YourContentEntityMatcher extends EntityMatcher {

  /**
   * {@inheritdoc}
   */
  public function execute($string) {
    $suggestions = new SuggestionCollection();
    $query = $this->buildEntityQuery($string);
    $query_result = $query->execute();
    $url_results = $this->findEntityIdByUrl($string);
    $result = array_merge($query_result, $url_results);

    if (empty($result)) {
      return $suggestions;
    }

    $entities = $this->entityTypeManager->getStorage($this->targetType)->loadMultiple($result);

    $group_id = FALSE;
    // Extract the Group ID from the uri query parameter.
    if (\Drupal::request()->query->has('uri')) {
      $uri = \Drupal::request()->query->get('uri');
      list(, , $group_id) = explode('/', $uri);
    }

    foreach ($entities as $entity) {
      // Check the access against the defined entity access handler.
      /** @var \Drupal\Core\Access\AccessResultInterface $access */
      $access = $entity->access('view', $this->currentUser, TRUE);
      if (!$access->isAllowed()) {
        continue;
      }

      // Exclude content that is from a different group.
      if ($group_id && $group_id != $entity->getGroup()->id()) {
        continue;
      }

      $entity = $this->entityRepository->getTranslationFromContext($entity);
      $suggestion = new EntitySuggestion();
      // Populate the suggestion (label, path, etc.) as the parent
      // implementation does, then collect it.
      $suggestions->addSuggestion($suggestion);
    }

    return $suggestions;
  }

}

And we are done.

By re-implementing the execute() method of the EntityMatcher class, we are now able to make the LinkIt field display only content from the same group as the content the user is editing or creating.

So the next challenge here is to create some test coverage for this, as we're relying on a few disparate pieces of code - a plugin, some JavaScript in the LinkIt module, an element info alter hook and a process callback - any of which could change and render all of this non-functional. But that's a story for another post.

Photo of Pasan Gamage

Posted by Pasan Gamage
Drupal Developer

Dated 14 February 2018


I am the maintainer for the Linkit module, and I really liked this post. Glad you found it (quite) easy to extend.

Hey there, I've got linkit 8.x-4.3 and the class EntitySuggestion and SuggestionCollection don't even seem to exist at all? So the use statements fail and everything after that. Is there some aspect you did not include in this description?


Add new comment

Feb 07 2018
Feb 07

Great to see this project - a really good page builder is badly needed for Drupal. Looks like a very good start, well done Lee.

Not sure if you are familiar with the layout builder and visual composer built by NikaDevs (a theme company), but you could do a lot worse than having a look at their approach. It's a very good page builder, which they have on all their themes.



Jan 24 2018
Jan 24

When optimising a site for performance, one of the options with the best effort-to-reward ratio is image optimisation. Crunching those images in your Front End workflow is easy, but how about author-uploaded images through the CMS?

Recently, a client of ours was looking for ways to reduce the size of uploaded images on their site without burdening the authors. To solve this, we used the Image Optimize module, which allows you to use a number of compression tools, both local and 3rd party.

The tools it currently supports include:

  • Local
    • PNG 
    • JPEG
  • 3rd Party 

We decided to avoid the use of 3rd party services, as processing the images on our servers could reduce processing time (no waiting for a third party to reply) and ensure reliability.

In order to pick the tools which best served our needs, we chose an image that closely represented the type of image the authors often used: a person's face with a complex background, in both PNG and JPEG versions. We then ran it through each of the tools with a moderately aggressive compression level.

PNG Results

Resized by Drupal 8's default image style:

| Compression Library | Compressed size | Percentage saving |
|---------------------|-----------------|-------------------|
| Original            | 234kb           | -                 |
| AdvPng              | 234kb           | 0%                |
| OptiPng             | 200kb           | 14.52%            |
| PngCrush            | 200kb           | 14.52%            |
| PngOut              | 194kb           | 17.09%            |
| PngQuant            | 63kb            | 73.07%            |

Full-size original image:

| Compression Library | Compressed size | Percentage saving |
|---------------------|-----------------|-------------------|
| Original            | 1403kb          | -                 |
| AdvPng              | 1403kb          | 0%                |
| OptiPng             | 1288kb          | 8.19%             |
| PngCrush            | 1288kb          | 8.19%             |
| PngOut              | 1313kb          | 6.41%             |
| PngQuant            | 445kb           | 68.28%            |

JPEG Results

Resized by Drupal 8's default image style:

| Compression Library | Compressed size | Percentage saving |
|---------------------|-----------------|-------------------|
| Original            | 57kb            | -                 |
| JfifRemove          | 57kb            | 0%                |
| JpegOptim           | 49kb            | 14.03%            |
| JpegTran            | 57kb            | 0%                |

Full-size original image:

| Compression Library | Compressed size | Percentage saving |
|---------------------|-----------------|-------------------|
| Original            | 778kb           | -                 |
| JfifRemove          | 778kb           | 0%                |
| JpegOptim           | 83kb            | 89.33%            |
| JpegTran            | 715kb           | 8.09%             |

Using a combination of PngQuant and JpegOptim, we could save anywhere between 14% and 89% in file size, with larger images bringing greater percentage savings.
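For reference, the percentage savings in the tables are simple arithmetic; the helper below is a hypothetical illustration, not part of the module:

```php
<?php
// Hypothetical helper showing how a percentage saving is calculated.
function saving_percent(int $original_kb, int $compressed_kb): float {
  return round((1 - $compressed_kb / $original_kb) * 100, 2);
}

// JpegOptim on the full-size JPEG: 778kb down to 83kb.
echo saving_percent(778, 83); // 89.33
```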

Setting up automated image compression in Drupal 8

The Image Optimize module allows us to set up optimisation pipelines and attach them to our image styles. This allows us to set both site-wide and per-image style optimisation.

After installing the Image Optimize module, head to the Image Optimize pipelines configuration (Configuration > Media > Image Optimize pipeline) and add a new optimization pipeline.

Now add the PngQuant and JpegOptim processors. If they have been installed to the server Image Optimize should pick up their location automatically, or you can manually set the location if using a standalone binary.

JpegOptim has some additional quality settings; I'm setting "Progressive" to always and "Quality" to a sweet spot of 60. 70 could also be used as a more conservative target.

JpegOptim Settings

The final pipeline looks like the following:


Back on the Image Optimize pipelines configuration page, we can now set the new pipeline as the sitewide default:

Default sitewide pipeline

And boom! Automated sitewide image compression!

Overriding image compression for individual image styles

If the default compression pipeline is too aggressive (or conservative) for a particular image style, we can override it in the Image Styles configuration (Configuration > Media > Image styles). Edit the image style you’d like to override, and select your alternative pipeline:

Override default pipeline

Applying compression to existing images

Flushing the image cache will recreate existing images with compression the next time the image is loaded. This can be done with the drush command 

drush image-flush --all


Setting up automated image optimisation is a relatively simple process, with potentially large impacts on site performance. If you have experience with image optimisation, I would love to hear about it in the comments.

Photo of Tony Comben

Posted by Tony Comben
Front end Developer

Dated 24 January 2018


Did you consider using mod_pagespeed at all? If you did what made you decide against it?

Good question, Drupal performs many of the optimisations that mod_pagespeed does but allows us more granular control. One of the benefits of this approach is being able to control compression levels per image style. As Drupal is resizing and reshaping images then anyway, I feel it makes sense to do the compression at the same time.

Hi Tony,
Nice of you to post some details on this.

How does this integrate with Responsive Images and the Picture fields?

Can it crop and scale immediately after upload to get multiple files for multiple view ports?


Hi Saj, Responsive Images picks up its variants from the Image Styles so this works seamlessly. You can set your image dimensions and cropping in the image style, and the compression is applied after that.

Nice write up! I never knew about this Drupal module.

It'd be nice to compare the Original Image + Drupal Compression + Final Selected compression library output through some image samples.

Also might be worth mentioning that PngQuant is a lossy image compression algorithm - and the others aren't (hence the big compression difference).

I'd recommend running optipng or pngcrush after pngquant to get an even more compressed image. Careful though, this can burn CPU cycles, especially with the module's obsessive parameter choices. Have a look at the $cmd entries in binaries/*.inc if you're curious.

Hi Christoph,

How do you define the order by which these compressions are applied?

Great article btw! The comparison metric is quite useful in knowing which tool is the best performer. I initially went for jpegtran but jpegoptim is producing way better results.


I recommend one of the external services when you’re on a host where you can’t install extra server software (like Pantheon).

Hi! Nice post. Which version did you use with Drupal 8 ?

8.x-2.0-alpha3 ? No issues with alpha version ? Thanks

Hi! Did you try this version of the module with actual Drupal version?
What result have you got?


Hi Tony,
Nice explanation of use of imageAPI module. m a newbie in D8, working on a live site.. I had a question regarding, setting manual path of pngquant.. As understood in windows, php folder should have its dll file, but as i am working on server directly, i dont know how to proceed from that step. Please do help.


Add new comment

Jan 22 2018
Jan 22

In November 2017 I presented at Drupal South on using Dialogflow to power conversational interfaces with Drupal.

The video and slides are below, the demo in which I talk to Drupal starts in the first minute.

by Lee Rowlands / 23 January 2018


Conversational UI, Drupal 8, Chatbots, DrupalSouth
Jan 22 2018
Jan 22

All PreviousNext Drupal 8 projects are now managed using Composer. This is a powerful tool that allows our projects to define both public and private modules or libraries and their dependencies, and bring them all together.

However, if you require public or private modules which are hosted on GitHub, you may run into the API rate limits. In order to overcome this, it is recommended to add a GitHub personal access token to your composer configuration.

In this blog post, I'll show how you can do this in a secure and manageable way.

It's common practice when you encounter a Drupal project to see the following snippet in a composer.json file:

"config": {
    "github-oauth": {

What this means is that everyone is sharing a single account's personal access token. While this may be convenient, it's also a major security risk should the token accidentally be made public, or should a team member leave the organisation while still having read/write access to your repositories.

A better approach is for each team member to have their own personal access token configured locally. This ensures that individuals can only access repositories they have read permissions for, and once they leave your organisation they can no longer access any private dependencies.

Step 1: Create a personal access token

Go to GitHub's personal access tokens page (https://github.com/settings/tokens) and generate a new token.

Generate GitHub Token

You will need to specify all repo scopes.

Select GitHub Scopes

Finally, hit Generate Token to create the token.

GitHub token

Copy this, as we'll need it in the next step.

Step 2: Configure Composer to use your personal access token

Run the following from the command line, replacing the placeholder with your token:

composer config --global github-oauth.github.com XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
You're all set! From now on, composer will use your own individual personal access token which is stored in $HOME/.composer/auth.json

What about Automated Testing Environments?

Fortunately, composer also accepts an environment variable COMPOSER_AUTH with a JSON-formatted string as an argument. For example:

COMPOSER_AUTH='{"github-oauth": {"github.com": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"}}' composer install
You can simply set this environment variable in your CI Environment (e.g. CircleCI, TravisCI, Jenkins) and have a personal access token specific to the CI environment.


By using personal access tokens, you can now safely remove any tokens from the project's composer.json file, removing the risk that they get exposed. You also know that by removing access for any ex-team members, they are no longer able to access your organisation's repos using a token. Finally, in the event of a token being compromised, you have reduced the attack surface and can more easily identify which user's token was used.

Photo of Kim Pepper

Posted by Kim Pepper
Technical Director

Dated 22 January 2018

Add new comment

Jan 18 2018
Jan 18

After reading a blog post by Matthias Noback on keeping an eye on code churn, I was motivated to run the churn-php library over some modules in core to gauge their level of churn.

Is this something you might like to do on your modules? Read on for more information.

What is churn

As Matthias details in his blog post, churn is a measure of the number of times a piece of code has been changed over time. The red flags start to crop up when you have both high complexity and high churn.

Enter churn-php

churn-php is a library that analyses PHP code with its history in git to identify files with high churn/complexity scores.

You can either install it with composer (composer require bmitch/churn-php --dev) or run it using Docker (docker run --rm -ti -v $PWD:/app dockerizedphp/churn run /path/to/code).
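The intuition behind the scores, combining change frequency with complexity and ranking files against the worst offender, can be sketched in a few lines. This is a simplification, not churn-php's actual scoring formula, using the times-changed and complexity figures reported for some comment module files below:

```php
<?php
// Simplified churn ranking: multiply times-changed by complexity and
// normalise against the worst offender. Not churn-php's exact formula.
$files = [
  'CommentForm.php' => ['times_changed' => 60, 'complexity' => 45],
  'Comment.php' => ['times_changed' => 55, 'complexity' => 25],
  'CommentManager.php' => ['times_changed' => 29, 'complexity' => 15],
];

$raw = [];
foreach ($files as $name => $file) {
  $raw[$name] = $file['times_changed'] * $file['complexity'];
}
$max = max($raw);

$scores = [];
foreach ($raw as $name => $value) {
  $scores[$name] = round($value / $max, 3);
}
arsort($scores);

print_r($scores);
// CommentForm.php scores 1: the prime refactoring candidate.
```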

Some results from core

So I ran it for some modules I look after in core, as well as the Drupal\Core\Entity namespace.

Block Content

| File | Times Changed | Complexity | Score |
|------|---------------|------------|-------|
| core/modules/block_content/src/Entity/BlockContent.php | 41 | 6 | 1 |
| core/modules/block_content/src/BlockContentForm.php | 32 | 6 | 0.78 |
| core/modules/block_content/src/Plugin/Block/BlockContentBlock.php | 20 | 6 | 0.488 |
| core/modules/block_content/src/Tests/BlockContentTestBase.php | 16 | 6 | 0.39 |
| core/modules/block_content/src/BlockContentTypeForm.php | 18 | 4 | 0.347 |
| core/modules/block_content/src/Controller/BlockContentController.php | 8 | 6 | 0.195 |


Comment

| File | Times Changed | Complexity | Score |
|------|---------------|------------|-------|
| core/modules/comment/src/CommentForm.php | 60 | 45 | 1 |
| core/modules/comment/src/Entity/Comment.php | 55 | 25 | 0.548 |
| core/modules/comment/src/Tests/CommentTestBase.php | 33 | 29 | 0.426 |
| core/modules/comment/src/Controller/CommentController.php | 32 | 20 | 0.274 |
| core/modules/comment/src/CommentViewBuilder.php | 37 | 16 | 0.25 |
| core/modules/comment/src/Plugin/Field/FieldFormatter/CommentDefaultFormatter.php | 32 | 18 | 0.24 |
| core/modules/comment/src/Form/CommentAdminOverview.php | 29 | 17 | 0.191 |
| core/modules/comment/src/CommentAccessControlHandler.php | 17 | 28 | 0.19 |
| core/modules/comment/src/CommentLinkBuilder.php | 15 | 29 | 0.17 |
| core/modules/comment/src/CommentManager.php | 29 | 15 | 0.157 |


Drupal\Core\Entity

| File | Times Changed | Complexity | Score |
|------|---------------|------------|-------|
| core/lib/Drupal/Core/Entity/ContentEntityBase.php | 115 | 173 | 0.808 |
| core/lib/Drupal/Core/Entity/Sql/SqlContentEntityStorage.php | 61 | 196 | 0.465 |
| core/lib/Drupal/Core/Entity/Sql/SqlContentEntityStorageSchema.php | 56 | 203 | 0.427 |
| core/lib/Drupal/Core/Entity/Entity.php | 131 | 43 | 0.212 |
| core/lib/Drupal/Core/Entity/ContentEntityStorageBase.php | 41 | 105 | 0.16 |


So, what to do with these results?

Well, I think if you're looking to simplify your code-base and identify places that warrant refactoring, the files with a high churn score would be a good place to start.

What do you think? Let us know in the comments.

Photo of Lee Rowlands

Posted by Lee Rowlands
Senior Drupal Developer

Dated 19 January 2018



Add new comment


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web