May 19 2020

How amazing does it feel when you walk into a coffee shop and the barista greets you by name and asks if you’d like the usual? Or when you meet someone you haven’t seen in a long time and they ask about some obscure and specific hobby you once mentioned you had?

These personalized experiences give you the warm and fuzzies. You typically come away from those interactions a fan of the place or person. Heck, if someone were to criticize them, you'd speak up to say that's not your experience. And you wouldn't hesitate to recommend that place or person to others.

At an event a few years ago, I noticed someone who seemed a little hesitant. I introduced myself and invited them to join me at my table, and we chatted a little. We never spoke much after that. But on multiple occasions over the past few years, that person has given me a glowing reference when I came up in conversation. 

Personalization makes us feel valued and understood. And that's how you want your customers to feel. Because if they do, they will buy more and advocate for your brand.

Personalized Marketing Options to Consider

Broadly speaking, there are two ways to do web personalization: with real-time data or historical data.

Real-time data involves using in-the-moment signals, such as location, device, or traffic source, to serve up a specific site, content, or offer. Here are a few examples:

  • Using device type or operating system to either manage how content is displayed or make assumptions on product needs
  • Using traffic source to tailor content (i.e., looking at where and when the user came from)
  • Basing promotions on products or services that have proven popular with others

Historical data goes deeper. This involves presenting personalized content, products, or offers based on users' previous interactions. You could look at factors like:

  • The number of orders they made
  • Their average order size
  • The total amount they spent
  • The products they looked at
  • The carts they abandoned
  • The time that has elapsed since their last transaction and/or visit

The options are as vast as the data you have collected. But through segmentation and rules, you can greatly increase the odds of a user converting.

Why You Need to Tread Carefully

Many consumers are becoming increasingly concerned about privacy and data management. You need to ensure that the personalization you supply is helping them in making a conversion decision and not simply showing them how much you know about them.

For instance, your barista asking if you fancy trying the new mocha latte (because they knew you had recently bought one from another brand) is much less creepy than being greeted with, "I heard you’re now into chocolate, so try this new mocha latte." The difference is small, but crucial.

Choose the Right Tools

With the overwhelming array of personalization options, it's important to work with an experienced team that can help guide you. At Acro, we love Drupal, and it can handle many entry-level personalization functions within its platform (far more than most content management systems).

However, if you need to get very sophisticated, you need a third-party platform. We love Acquia Lift: for features, usability, and support, it is unparalleled. If you would like a personalized introduction to Acquia, hit me up and I'll set you up, personally.

The Bottom Line

Global research and advisory firm Gartner stated that the three key takeaways on personalization are:

  1. Consumers want to receive personalized help as they navigate the buying journey.
  2. Focusing solely on personalized recognition is potentially detrimental to a company’s commercial objectives.
  3. When it comes to help, consumers prioritize information, a simpler purchase process and saving time.1

Personalization isn't the ultimate goal. It's another tool for achieving your actual goal, whether that's increased sales, increased order value, increased frequency, or brand loyalty. Once you define what your goals are, you can explore whether personalization will deliver the required ROI.

If you would like to have a conversation about your business goals and see if personalization is an appropriate tool for you, give me a call. And if not, if we ever meet out and about, you’re always welcome to sit at my table.

1 - Source: Gartner, "Maximize the Impact of Personalization," April 2019

May 19 2020

Bitly provides a rich API that Drupal modules can use to access its link-shortening functionality. Today we'll discuss how to integrate Bitly with Drupal 8 easily in just a few simple steps.

What are the first three reasons that come to your mind for shortening a URL? Here are mine –

  • To make more space for content to be posted on micro-blogging sites such as Twitter
  • To mask the original URL 
  • To simply reduce the length of a super-long, ugly URL

Bitly has been our go-to URL shortening service for a long time now, and we love its efficacy and speed. For those of you who haven't stumbled upon it yet, Bitly is a link management platform that offers link-shortening products and services. It allows you to shorten URLs for your websites.


Getting Started: Bitly Integration with Drupal 8

Bitly allows for easy integration with your Drupal 8 website. Since there is no stable module to integrate Bitly with Drupal 8, we will learn how to call the Bitly API and create the forms to integrate it. Here are the steps to get you started:

Step 1 – First, create a Bitly account. Go to https://bitly.com/ and sign up, then log in to your new account.

You will then see your Dashboard page.


Step 2 - Next, click your profile at the top right of the screen and go to 'Profile Settings'.


Go to Registered OAuth Applications -> Register New App -> Get Registration Code.
A registration email will be sent to the email address you used to sign up.


Step 3 - Click on Complete Registration in the email. It will take you back to the Bitly website, where you have to add the details below:


Save this OAuth app. Once saved, a 'CLIENT ID' and 'CLIENT SECRET' key will be generated as shown below.


Step 4 - Next, go to your browser and enter this URL:
Once you hit this URL, it will redirect to {YOUR_WEBSITE_URL}?code={YOUR_CODE}. Copy this code and paste it somewhere safe so it's not lost.

Step 5 – Next, use Postman to call the API, or use a cURL command instead. Learn more about how the Postman tool can help you with API development and testing.

The method is POST, the URL is https://api-ssl.bitly.com/oauth/access_token, and the parameters are client_id, client_secret, code, and redirect_uri.
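If you prefer scripting the call, the request can be sketched language-neutrally. This Python snippet only composes the request body (the credential values are placeholders, and nothing is sent over the network):

```python
from urllib.parse import urlencode

# Placeholder credentials -- substitute the values from your OAuth app.
params = {
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
    "code": "YOUR_CODE",
    "redirect_uri": "https://www.example.com/",
}

endpoint = "https://api-ssl.bitly.com/oauth/access_token"
body = urlencode(params)

# This is the body a POST (via Postman or cURL) would carry.
print("POST", endpoint)
print(body)
```

The same body can be passed directly to `curl -d` against the endpoint above.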


When you hit this API, it will return the following –

  • access_token – The OAuth access token for the specified user
  • login – The end-user's Bitly username
  • apiKey – The application key to be used

Store the login and apiKey values; they can then be integrated with your Drupal modules.

Implementing the integration

Now let's create a custom form for integrating with Bitly, since there is no stable module for this in Drupal 8.


namespace Drupal\bitly\Form;

use Drupal\Core\Form\ConfigFormBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * Class bitlyForm.
 *
 * @package Drupal\bitly\Form
 */
class bitlyForm extends ConfigFormBase {

  /**
   * {@inheritdoc}
   */
  protected function getEditableConfigNames() {
    return ['bitly.shorten_url'];
  }

  /**
   * {@inheritdoc}
   */
  public function getFormId() {
    return 'bitly_shorten_url_configuration_form';
  }

  /**
   * {@inheritdoc}
   */
  public function buildForm(array $form, FormStateInterface $form_state) {
    $config = $this->config('bitly.shorten_url');
    $form['configuration'] = [
      '#type' => 'fieldset',
      '#title' => $this->t('Configuration'),
    ];
    $form['configuration']['login_name'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Login Name'),
      '#default_value' => $config->get('login_name'),
      '#required' => TRUE,
    ];
    $form['configuration']['app_key'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Application Key'),
      '#default_value' => $config->get('app_key'),
      '#required' => TRUE,
    ];
    return parent::buildForm($form, $form_state);
  }

  /**
   * {@inheritdoc}
   */
  public function submitForm(array &$form, FormStateInterface $form_state) {
    $this->config('bitly.shorten_url')
      ->set('login_name', trim($form_state->getValue('login_name')))
      ->set('app_key', trim($form_state->getValue('app_key')))
      ->save();
    parent::submitForm($form, $form_state);
  }

}


Add the Login name and Application Key that you have generated and stored previously.

Now, let’s create a function to integrate your Drupal website with Bitly.


/**
 * Generates a shortened URL.
 *
 * @param string $url
 *   The url.
 * @param string $login
 *   The login name.
 * @param string $appKey
 *   The api key.
 *
 * @return string
 *   Returns the bitly url.
 */
public static function makeBitlyUrl($url, $login, $appKey) {
  // Create the request URL.
  $bitly = 'http://api.bit.ly/shorten?version=2.0.1&longUrl=' . urlencode($url) . '&login=' . $login . '&apiKey=' . $appKey . '&format=xml';
  // Fetch the response; cURL could also be used here.
  $response = file_get_contents($bitly);
  // Parse the XML response.
  $xml = simplexml_load_string($response);
  return 'https://bit.ly/' . $xml->results->nodeKeyVal->hash;
}

Let's now call the makeBitlyUrl function to generate the Bitly URL.

$config = \Drupal::config('bitly.shorten_url');

$bitlyUrl = (string) makeBitlyUrl('https://www.example.com', $config->get('login_name'), $config->get('app_key'));

The shortened Bitly URL is now available in the $bitlyUrl variable.
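For comparison, the same request construction and XML parsing can be sketched in Python. The endpoint and parameter names mirror the PHP above, while the sample response and its hash value are hypothetical (the structure results -> nodeKeyVal -> hash is assumed from the PHP parsing):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

def make_bitly_request_url(url, login, app_key):
    """Build the legacy v2 shorten request, mirroring makeBitlyUrl() above."""
    query = urlencode({
        "version": "2.0.1",
        "longUrl": url,
        "login": login,
        "apiKey": app_key,
        "format": "xml",
    })
    return "http://api.bit.ly/shorten?" + query

# Hypothetical sample of the XML the endpoint returns.
sample_response = """
<bitly>
  <results>
    <nodeKeyVal>
      <hash>31IqMl</hash>
    </nodeKeyVal>
  </results>
</bitly>
"""

hash_value = ET.fromstring(sample_response).find("results/nodeKeyVal/hash").text
short_url = "https://bit.ly/" + hash_value
print(short_url)  # https://bit.ly/31IqMl
```

A real call would fetch the request URL over HTTP and parse the live response the same way.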

May 19 2020

Web accessibility helps make the world wide web usable for everyone. Many of the most common accessibility issues, which make sites difficult or impossible to use with assistive technology, can be easily fixed.

The following hook modifies any exposed view filter to have aria labels on each form element as well as inserts the view name into the form element, form element label, form actions wrapper, and form submit button ID. (E.g. edit-submit-[VIEW_NAME]-[VIEW_DISPLAY])

Developers need to ensure that page markup does not contain duplicate IDs because many screen readers and assistive technologies use IDs within the page as landmarks for the user to easily parse the content.

The addition of aria-labels acts as an agnostic accessibility feature to inform the user of the purpose of the element:

use Drupal\Component\Utility\Html;
use Drupal\Core\Form\FormStateInterface;
use Drupal\views\Views;

/**
 * Implements hook_form_alter().
 */
function MYMODULE_form_alter(&$form, FormStateInterface &$form_state, $form_id) {
  // Add aria-labels to views exposed filters.
  if ($form_id === 'views_exposed_form' && strpos($form['#id'], '-block-') !== FALSE) {
    // Assumed structure (e.g. views_exposed_form_[view_name]_[block_name]).
    $exploded_form_id = explode('views_exposed_form_', str_replace('-', '_', $form['#id']));

    if (!empty($exploded_form_id[1])) {
      // Get the view id.
      $view = explode('_block_', $exploded_form_id[1]);
      $view = isset($view[0]) ? $view[0] : '';

      if (!empty($view)) {
        // Get the view display id.
        $view_display = explode($view . '_', $exploded_form_id[1]);
        $view_display = $view_display[1];
        // Get the view display title.
        $view = Views::getView($view);
        $clean_view_display_title = Html::getClass($view->getTitle());

        foreach ($form['#info'] as $info) {
          if (isset($form[$info['value']])) {
            $clean_info_value = Html::getClass($info['value']);

            // Add an aria-label to the form element.
            $form[$info['value']]['#attributes']['aria-label'] = $view->getTitle() . ' ' . trim($info['label'], ':');
            // Update the id on the form element and label.
            $form[$info['value']]['#id'] = 'edit-' . $clean_view_display_title . '-' . $clean_info_value;
          }
        }

        if (isset($form['actions'])) {
          $form['actions']['#id'] = 'edit-actions-' . $clean_view_display_title;

          // Update the id on the submit button.
          if (isset($form['actions']['submit'])) {
            $form['actions']['submit']['#id'] .= '-' . $clean_view_display_title;
          }
        }
      }
    }
  }
}
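The form-ID parsing the hook relies on can be sketched standalone. The example form ID below is hypothetical, and the assumed structure is views_exposed_form_[view_name]_block_[n]:

```python
def split_views_form_id(form_html_id):
    """Derive (view_name, display_name) from an exposed-form HTML id,
    assuming the views_exposed_form_[view_name]_block_[n] structure."""
    normalized = form_html_id.replace("-", "_")
    remainder = normalized.split("views_exposed_form_", 1)[1]
    view_name, _, display_suffix = remainder.partition("_block_")
    return view_name, "block_" + display_suffix

# Hypothetical exposed-filter form id for a "content" view, display "block_1".
view, display = split_views_form_id("views-exposed-form-content-block-1")
print(view, display)  # content block_1
```

This is the same split the hook performs with explode() before handing the names to Views::getView().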

This second hook modifies any Table view display to prepend the view name and view display name to the table header ID attribute and the corresponding table row headers attribute (e.g. [VIEW_NAME]--[VIEW_DISPLAY]--[EXISTING_ID]):

/**
 * Implements template_preprocess_views_view_table().
 */
function MYMODULE_preprocess_views_view_table(&$variables) {
  $view_name = Html::getClass($variables['view']->id());
  $view_display = Html::getClass($variables['view']->getDisplay()->display['id']);
  $id_prefix = $view_name . '--' . $view_display . '--';

  // Update each table header 'id' to be
  // prepended with [VIEW_NAME]--[VIEW_DISPLAY]--.
  foreach ($variables['header'] as $header_key => $header) {
    if (
      isset($header['attributes']) &&
      isset($header['attributes']->storage()['id'])
    ) {
      $variables['header'][$header_key]['attributes']->setAttribute('id', $id_prefix . $header['attributes']->storage()['id']);
    }
  }

  // Update each row 'headers' to be
  // prepended with [VIEW_NAME]--[VIEW_DISPLAY]--.
  foreach ($variables['rows'] as $row_key => $row) {
    if (!empty($row['columns'])) {
      foreach ($row['columns'] as $column_key => $column) {
        if (
          isset($column['attributes']) &&
          isset($column['attributes']->storage()['headers'])
        ) {
          $variables['rows'][$row_key]['columns'][$column_key]['attributes']->setAttribute('headers', $id_prefix . $column['attributes']->storage()['headers']);
        }
      }
    }
  }
}
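To see why the prefixing matters, here is a minimal sketch (hypothetical view, display, and column names) of the header/cell pairing that assistive technology depends on:

```python
# Hypothetical view and display names, mirroring the prefix scheme above.
id_prefix = "content--block_1--"

# The header keeps its existing id, now namespaced per view display...
header_id = id_prefix + "view-title-table-column"
# ...and each cell's `headers` attribute references that same id, which is
# how screen readers associate a data cell with its column header.
cell_headers = id_prefix + "view-title-table-column"

print(header_id)  # content--block_1--view-title-table-column
```

Because both sides carry the same prefix, the pairing survives even when several displays of the same view render tables on one page.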

Comment below with other ideas, or if you need help using these.

May 19 2020

Hi friends. I'm very excited about DrupalSpoons, and would love for you to give it a try. DrupalSpoons is a model and a movement. It shows how great development for Drupal Contrib modules can be. DrupalSpoons is a particular configuration of groups and projects built on GitLab.com. DrupalSpoons offers issues, merge requests (same as PRs), and CI to Contrib projects. It uses zero custom code (except for the issue migration), since I have no special access to GitLab.com.

Please read its README.md and click through the many links there. There you will learn the goals of the project, and more about its implementation. Devel and KeyCDN modules are now developed on DrupalSpoons. For example, observe that all of Devel’s open issues were migrated.

If you maintain a Contrib module, please consider moving it to DrupalSpoons. If you are a contributor, open a "Move to DrupalSpoons" issue for your favorite module. We can be done with rolling and re-rolling patches like it's 1999. If we get enough momentum, I'm hoping that the DA will adopt a similar approach for its GitLab instance and all projects can move back to git.drupalcode.org.

If you like my proposal, please retweet and join! I led prior movements in Drupal like groups.drupal.org, the #D7CX pledge, and the Drupal Code of Conduct. These movements only succeed with early vocal support.

To discuss this blog post and DrupalSpoons in general, let's use this issue or #drupalspoons on Drupal Slack. Please bring your most positive self :)

P.S. I never set out to build a Drupal project platform. A Devel co-maintainer asked me to fix our Travis tests. I did so on Gitlab.com and it Corona-spiralled from there. “If you wish to make an apple pie from scratch, you must first invent the universe” – Carl Sagan.


May 18 2020


This past Friday, the Olivero project reached a major milestone and tagged our first alpha release. You can download it on the Olivero project page. This release marks a point in time where the code is stable(ish!), looks amazing, and is ready for additional testing!

What is Olivero?

Olivero is a new theme that is slated to make it into Drupal core in version 9.1 as the new default theme (replacing Bartik). It’s named after Rachel Olivero, who was a valued community member and accessibility advocate. 

About this release

This release has been a long time coming (read about the origin here) and has countless hours from dedicated volunteers poured into it with love. 

This release is not perfect; in fact, we've stated that "perfect is the enemy of good" for our alpha releases! That being said, we've done extensive testing with various devices and browsers (yes — this theme supports Internet Explorer 11), and we've done lots of accessibility work, although more still needs to be done!

You get a feature! You get a feature! Everyone gets a feature!

Well… almost. Besides cross browser and accessibility work, we’ve included some common features in this initial release.

  • Dropdown menu support — this is self-explanatory, but until Drupal 9.1, core themes did not support multi-level menus. 
  • Option for “always-on” mobile navigation — This is intended for the use case where the page has more top-level menu items than can fit within the header.
  • Background color options for site branding — You can change the site branding (logo) background to suit your needs. The current options are white, gray, and blue (which is the default). 

As of this release, we’re not including support for dark mode or changing the color of the theme (via color module, or anything else). However, this is on the horizon (likely after the initial core inclusion). 

How you can help

We're at the point now where we can use some real-world testing. Please find the things that break; we know they're there!

For example, I loaded up Lullabot.com with the new theme and discovered various default styling that can be tightened up (e.g., links within headers).

We’re also at the point where we can use additional accessibility testing. Please fire up your screen readers, and decrease (or increase) your contrast and take a look!

As with all free open source projects, please take a look at the issue queue to make sure that the issue isn’t already created.

Tugboat helps save the day!

We also want to give out a huge shoutout to the Tugboat team. If you aren’t familiar with Tugboat, it’s an automated service that generates live, working sites on every PR, branch, or what-not.

Through the Tugboat service, we worked to set up a Drupal multisite install that has previews with content, without content, and using the minimal profile (note this is currently broken). 

Check out the Tugboat preview here! Note that the “working” status of this Tugboat preview will be in flux, as we commit code, rebuild the preview, etc.

Next steps

We’re off to the races! The next step is incremental alpha releases (either weekly or biweekly). We’re also going to make a list of features that are necessary to tag our first beta and work toward that. 

We hope to create our first Drupal core patch in a month or so. This will give core committers the opportunity to put an eye on the codebase, so it can be included in version 9.1. We're aiming to get this committed into core in late July-ish.

Thank you, thank you, thank you!

We have so many people helping out, and we wouldn’t be here without them. Here is the full list of committers (16 as of today), and this does not even count the people who are doing testing, etc! 

Mike Herchel


A senior front-end developer, Mike is also a lead of the Drupal 9 core "Olivero" theme initiative, organizer for Florida DrupalCamp, maintainer for the Drupal Quicklink module, and an expert hammocker.

Putra Bonaccorsi


Putra Bonaccorsi is a Senior Front-end Developer with a flair for creative uses of CMS and a dedication to finding unique solutions to tough problems.

May 18 2020

Of all the discussions in the Drupal community, few have generated such a proliferation of blog posts and conference sessions as decoupled Drupal, which is also the subject of a 2019 book and an annual New York conference—and has its share of risks and rewards. But one of the most pressing concerns for Drupal is how we can ensure a future for our open-source content management system (CMS) that doesn't relegate it to the status of a replaceable content repository. In short, we have to reinvent Drupal to provide not only the optimal back-end experience for developers, but also a front end that ensures Drupal's continued longevity for years to come.

A few months ago, Fabian Franz (Senior Technical Architect and Performance Lead at Tag1 Consulting) offered up an inspirational session that presents a potential vision for Drupal's front-end future that includes Web Components and reactivity in the mix. In Fabian's perspective, by adopting some of the key ideas that have made popular JavaScript frameworks famous among front-end developers, we can ensure Drupal's survival for years to come.

In this multi-part blog series that covers Fabian's session in detail from start to finish, we summarize some of the key ideas that could promise an exciting vision not only for the front-end developer experience of Drupal but also for the user experience all Drupal developers have to offer their customers. In this fifth installment in the series, we continue our analysis of some of the previous solutions we examined and consider some of the newfangled approaches made possible by this evolution in Drupal.

The "unicorn dream"

Before we get started, I strongly recommend referring back to the first, second, third, and fourth installments of this blog series if you have not already. They cover essential background information and insight into all of the key components that constitute the vision that Fabian describes. Key concepts to understand include Drupal's render pipeline, virtual DOMs in React and Vue, the growing Twig ecosystem, universal data stores, and how reactivity can be enabled in Drupal.

One of the final questions Fabian asks in his presentation is about the promise unleashed by the completion of work to enable shared rendering in Drupal, as well as reactivity and offline-enabled functionality. During his talk, Fabian recalls a discussion he had at DrupalCon Los Angeles with community members about what he calls the unicorn dream: an as-yet unfulfilled vision to enable the implementation of a Drupal site front end with nothing more than a single index.html file.


Fabian argues that the component-driven approach we have outlined in this blog series makes this unicorn dream achievable thanks to slots in Web Components. Because React, Vue, and Twig all support slots as part of their approaches to componentization, this possibility is closer than ever before. Front-end developers can insert repeatable blocks with little overhead while still benefiting from configuration set by editors who never touch a line of code but still affect rendered output. Developers can extend such a block rather than overriding it.

Consider, for instance, the following example that illustrates leveraging an attribute to override the title of a block:

    <sidebar type="left">
      <block slot="header" id="views:recent_content">
        <h2 slot="title">I override the title</h2>
      </block>
    </sidebar>

When Fabian attempted this with pure Twig, he acknowledges that the complexity quickly became prohibitive, and the prototype never reached core-readiness. However, thanks to this approach using Web Components slots, one could create plugins for modern editors that simply use and configure custom elements. Among the editors that could support this hypothetical scenario are heavyweights like CKEditor 5, ProseMirror (which Tag1 recently selected as part of a recent evaluation of rich-text editors), and Quip.

Developer experience improvements

This means that we as developers no longer have the need to convert the display of tokens through a variety of complex approaches. Instead, we can simply render HTML and directly output the configured component; Drupal will handle the rest:

    <drupal-image id="123" />

Moreover, leveraging BigPipe placeholders with default content finally becomes simple thanks to this approach, argues Fabian. We can simply place default content within the component, and once the content arrives, it becomes available for use:

    <block id="views:recent_content" drupal-placeholder="bigpipe">
      I am some default content!
    </block>

In this way, we can take advantage of our existing work implementing BigPipe in Drupal 8 rather than resorting to other JavaScript to resolve this problem for us.

Performance improvements

Finally, some of the most important advancements we can make come in the area of performance. For front-end developers who need to serve the ever-heightening demands of customers needing the most interactive and reactive user experience possible, performance is perennially a paramount consideration. When using a universal data store, performance can be improved drastically, particularly when the store is utilized for as many data requirements as possible.

We can simply update the real-time key-value store, even if this happens to solely be located on Drupal. As Fabian argues, a data-driven mindset makes the problem of shared rendering and componentization in Drupal's front end much simpler to confront. Developers, contends Fabian, can export both the data and template to a service such as Amazon S3 and proceed to load the component on an entirely different microsite, thus yielding benefits not only for a single site but for a collection of sites all relying on the same unified component, such as <my-company-nav />.

Such an approach would mean that this company-wide navigation component would always be active on all sites requiring that component, simplifying the codebase across a variety of disparate technologies.

Editorial experience improvements

Nonetheless, perhaps some of the most intriguing benefits come from improvements to the editorial experience and advancements in what becomes possible despite the separation of concerns. One of the chief complaints about decoupled Drupal architectures, and indeed one of their most formidable disadvantages, is the loss of crucial in-context functionality that editors rely on daily, such as contextual links and in-place editing.

With Fabian's approach, the formerly impossible dream of achieving contextual administrative interfaces within a decoupled Drupal front end becomes not only possible but realistic. We can keep key components of Drupal's contextual user interface, such as contextual links, as part of the data tree rather than admitting to our customers that such functionality must vanish in a scenario enabling greater reactivity and interactivity for users.

After all, one of the key critiques of decoupled Drupal and JavaScript approaches paired with Drupal, as I cover in my book Decoupled Drupal in Practice, is the lack of support for contextual interfaces and live preview, though I've presented on how Gatsby can mitigate some of these issues. Not only does this solution allow for contextual interfaces like contextual links to remain intact; it also means that solutions like progressive decoupling also become much more feasible.

Moreover, one of the key benefits of Fabian's approach is Drupal's capacity to remain agnostic to front-end technologies, which guarantees that Drupal is never coupled to a framework that could become obsolete in only a few years, without having to reinvent the wheel or create a Drupal-native JavaScript framework. And one of the key defenses of Fabian's vision is this rousing notion: We can potentially enable single-page applications with Drupal without having to write a single line of JavaScript.

Outstanding questions

Despite the rousing finish to Fabian's session, pertinent questions and concerns about the viability of his approach remain, and several were raised during the Q&A following the session. One audience member cited the large number of examples written in Vue and asked whether other front-end technologies could truly be used to implement the pattern Fabian prescribes. Fabian responded that some work would be necessary to implement this in each framework's own virtual DOM, but in general the approach is possible, as long as a customizable render() function is available.

Another member of the audience asked how Drupal core needs to evolve in order to enable the sort of future Fabian describes. Fabian answered by recommending that more areas in Drupal responsible for rendering should be converted to lazy builders. This is because once no dependencies in the render tree are present, conversion to a component tree would be much simpler. Fabian also cited the need for a hook that would traverse the DOM to discover custom components after each rendering of the Twig template. Thus, the main difference would be writing HTML in lieu of a declaration in Twig such as {% include menu-item %}.


In this fifth and final installment of our multi-part blog series about a visionary future for Drupal's front end, we examined Fabian's rousing DrupalCon Amsterdam session to discuss some of the benefits that reactivity and offline-first approaches could have in Drupal, as well as a framework-agnostic front-end vision for components that potentially extends Drupal's longevity for many years to come. For more information about these concepts, please watch Fabian's talk and follow our in-house Tag1 Team Talks for discussion about this fascinating subject.

Special thanks to Fabian Franz and Michael Meyers for their feedback during the writing process.

Photo by Stephen Leonardi on Unsplash

May 18 2020

The Bibliography & Citation project (aka BibCite) is developed and maintained by ADCI Solutions and helps to organize and save bibliographic data about the content ranging from web pages to books and scientific research works. The project also helps to design citation of sources according to thousands of standards that have been accepted at different times in different organizations. 

As we can tell from the drupal.org statistics, at the time of the publication of this article BibCite is used on more than 280 websites. Its solutions are very specific, so each use case is highly likely to be a conscious choice, and we as the project's creators appreciate that a lot.

We tend to think that submitted issues and concerns about the module's performance are legitimate indicators of BibCite's success and demand. So, we have hundreds of issues to discuss. We decided to contact users to gather first-hand information about the organizations they work in and the problems they solve using BibCite.

We would like to thank our respondents and co-authors of this article.

Ricardo Marcelino, Omibee

Omibee is a Drupal agency that works with organizations such as research centers to help them promote and manage their activities.

Based on that experience, we developed Omibee Research, a prebuilt Drupal installation that enables research centers and institutes to have their own online platform with low entry effort.

Omibee Research can be fully customized according to specific needs; features and modules that are frequently used in this field are already included in its core: scientific activity management, research group collaboration, grant and annual report management, etc. The core package includes Bibliography & Citation along with several other modules, preset entities, integrations, and a specifically designed theme.

The product is free to download, carries the same license as Drupal (GPL v2), and includes a theme and several ways to import information. The paid services we provide for this product focus on installation, customization, and appearance.

There are currently around 10 centers in Portugal using Omibee Research. A few examples: 

May 18 2020

We recently introduced the concept of digital experience frameworks as the essential tools for creating and managing digital experiences. That blog post covered the basics of DXF: what’s meant by the term, some different types of DXF, and how to choose the right ones for your needs, supported by a short look into the choices for Agiledrop’s own suite of DXF.

This post focuses on the advantages of open-source digital experience frameworks and how they can be leveraged to streamline operations and drive growth for your business. 

We’ll discuss the main reasons for opting for open-source DXF rather than custom development or proprietary tools, and take a look at the benefits of using them, both for your products/services as well as for your internal operations. 

How can you drive business growth with open-source digital experience frameworks?

In the experience economy, websites are just one of the numerous channels through which your customers interact with your business. To truly drive business growth, you'll want to take advantage of all the available channels where your target audience spends their time (and money!).

It's true that with custom development you get all the customizability you desire and aren't constrained by the limitations of a specific, already established DXF. However, custom development, especially in the multichannel digital landscape, is not only much more costly but also much more time-consuming than relying on an open-source DXF.

Just think of it - while you may achieve more functionality with custom development, the question is whether the custom code will be completed while that functionality is still relevant. Chances are high that it won't be.

With SaaS solutions for creating digital experiences, the story is a bit different. They are incredibly time- and resource-efficient, but that’s also reflected in the budget. 

For big companies with a reasonable budget, it might make sense to rely on established SaaS providers - but what about small to medium-sized businesses that don’t have the luxury to afford a premium subscription to, say, Salesforce?

Furthermore, while initial development is faster, a SaaS platform still likely has certain limitations which can't be worked around as flexibly as with a digital experience framework. Plus, if you ever decide to migrate from a SaaS to another solution, you'll have a very hard time getting all your data out - technically, it won't be yours.

Luckily, there are a lot of really good open-source solutions - this is basically free software, supported and vetted by a community of experts (with frameworks such as WordPress or React, for example, these communities are downright huge). 

There are 2 crucial things to take into account here:

  • Vetting by experts guarantees a very high level of security. This is especially true of Angular and Drupal: the former is backed by Google, while the latter is renowned as the most secure open-source CMS and as such a favorite of governments, nonprofits and similar organizations.
  • A wide range of customization options makes it easier to do personalization well, which contributes to a better customer experience, with higher conversion and lower bounce rates.

All three digital experience frameworks which we utilize at Agiledrop satisfy these two criteria. As the two leading CMSes with huge communities backing them, WordPress and Drupal are able to respond efficiently to ever-changing market demands, both introducing sought-after features while allowing for better and better integration with other technologies. 

Both of them can also be used as “headless” or “decoupled” content management systems, relying on a front-end framework such as Angular or React for the presentation of that content.

While both of them offer out-of-the-box support for React, they can function with basically any framework. There have been a few articles recently on using WordPress with Vue, and one of our developers has been working on a project that uses Vue in combination with Drupal. 

Our front-end framework of choice, however, is the TypeScript-based Angular, due to its enterprise capabilities. These especially make it a perfect fit with Drupal, which is also predominantly used for bigger, enterprise platforms. 

Angular is developed and maintained by leading tech company Google, and the TypeScript language by another tech giant, Microsoft. On top of that, the framework’s regular release cycle guarantees constant additions and optimizations to functionality and security. 

All three frameworks provide enough out-of-the-box features, along with the plugins, modules and other tools contributed by the community, that you can significantly cut down on costs: you'll require less custom development to achieve the same functionality. 

What’s more, to cater to the recent explosion of digital channels, they also come with excellent mobile support, as well as the ability to integrate with any kind of channel, allowing your business to leverage all the channels it needs to, from the web to IoT. 

In addition to powering all sorts of digital experiences for your audiences, open-source DXF are also the ideal tools for all of your internal operations, from WebOps to project and resource management. 

WordPress and especially Drupal are perfect for internal platforms where you need good content management capabilities, media handling and well-defined permissions and user roles. 

Frameworks such as Angular or React are then suited towards more specific use cases - at Agiledrop, we recently revamped our resource management dashboards which now utilize Angular, for example. And, as our project managers testify, their day-to-day work has been greatly facilitated thanks to this upgrade!

So, by providing a great experience for both your users and customers, as well as your employees, open-source digital experience frameworks are a cost-effective and future-proof solution for establishing and scaling your digital presence. Leveraging them allows for more innovation and flexibility, enabling you to better tailor your digital experiences to the needs of your audiences.


To sum up, open-source digital experience frameworks such as Drupal, WordPress and Angular can be used to power any kind of digital experience, from products to operations, from web to mobile to physical digital displays.

They are by their very nature future-proof enough to guarantee business relevance a few years down the line when new trends emerge, allowing you to scale and grow without having to worry about migrating your entire codebase every few years, or losing any user data. 

This is only enhanced with the frameworks’ commitment to backward compatibility, which will make upgrades between future versions even easier. 

If you’re looking for the right suite of digital experience frameworks for your next project, and proven engineers versed in those frameworks, reach out to us and we’ll craft a team with the perfect skill-set for your needs.

May 18 2020

As you may have heard, Drupal 9 porting day a couple weeks ago was a huge success! Gábor Hojtsy wrote up a great summary on the event including all the money that was raised for the Drupal Association. And, you can read my porting day recap as well. Thanks again to all who helped out.

The April porting day was so successful that Surabhi Gokte and Gabriele Maira (gambry) encouraged Gábor to organize a repeat performance and, like magic, we have Drupal 9 porting weekend coming May 22 and 23. This is scheduled during the time we had hoped to contribute at DrupalCon Minneapolis. Now we'll join each other virtually in a few days to make Drupal even better.

The goal for Drupal 9 porting weekend is to make more Drupal 8 contributed projects (modules, themes, and distributions) compatible with Drupal 9. As of May 16, 2020, there are 1,617 out of 8,982 projects that already work on Drupal 9. So, during the porting weekend, we'll work on the 7,365 that don't. While that might seem like a daunting number, about half of those (3,405) likely only need a one-line change to make the project Drupal 9 compatible!
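For reference, that one-line change is adding the core_version_requirement key to the project's info.yml file. A minimal sketch, with a hypothetical module name (projects that must also support Drupal core older than 8.7.7 need a slightly different constraint):

```yaml
# example_module.info.yml (hypothetical module)
name: Example Module
type: module
# The one-line addition: declares compatibility with Drupal 8 and 9.
core_version_requirement: ^8.8 || ^9
```

When all supported versions are 8.7.7 or newer, the legacy core: 8.x key is dropped in favor of this one.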

There is a wonderful effort underway right now to automate Drupal 9 compatibility issue and patch creation for contributed projects. Even with this effort, we still need your help during porting weekend. We hope you can participate!

Image credit: Aaron Deutsch

I've listed some useful Drupal 9 porting resources you can review and information about collaborating in Slack during the event. I've also added specific sections for preparation depending on the type of contribution you are hoping to do:

There are a lot of Drupal 9 resources, but here are some that are particularly helpful for preparing for porting weekend:

  1. Drupal 9 readiness (#d9readiness) channel on Slack
  2. Drupal 9 compatibility contribution quickstart guide
  3. Drupal 9 Deprecation Status
  4. Upgrade Status
  5. drupal-rector
  6. Upgrade Rector
  7. Running PHPUnit tests
  8. simplytest.me
  9. dreditor Chrome browser plugin
  10. Drupal 9 porting day recap
  11. How to prepare for Drupal 9 Porting Weekend
  12. Drupal 9: Status, Resources, and Ways to Contribute

We'll be collaborating in the Drupal 9 readiness (#d9readiness) channel on Slack. If you don’t have access to Drupal Slack click here to get an invite.

When you are ready to join the sprint, announce yourself in the Slack channel with something like "I'm here for the sprint and will be helping for the next X hours. [insert what you are hoping to do or if you need help here]", e.g.

  1. I will be working on my personal Foobar module and don't need any help but will post status updates here.
  2. I will be working on my personal Foobar module and would like help reviewing/testing.
  3. I will be finding my own issues to work on and will post them here as I work on them so others can collaborate.
  4. I don't have an issue I'm working on but am open to creating patches.
  5. I don't have an issue I'm working on but am open to reviewing/testing patches.
  6. I'm not sure what to help with and need guidance.

We will be using Slack threads extensively during the sprint. Each project or issue that is being worked on should ideally have a Slack thread associated with it in the channel. Also, if you are responding to questions in the channel, please respond via a thread unless the answer is a simple yes/no type response that won't have other people chiming in. It's best to err on the side of creating threads during the sprint in order to keep the "chatter" organized so that it's easier for people to understand what's happening and being worked on.

There will be mentors available to help match people with projects and issues. The mentors are listed on the Drupal 9 porting weekend event page. If you have questions, please do not send direct messages to the mentors unless there is a Drupal Code of Conduct issue that you need help with. You can send questions directly in the #d9readiness channel. You don't have to be a mentor to answer questions during the sprint. If you know the answer and have time to respond, please do so.

Do you want to help out during porting weekend but don't want to think about it until then?

No worries! Show up in the #d9readiness Slack channel and announce your arrival along with what you'd like to help with. If you don't know how to help, that's okay too. The mentors can find something for you. :)

Do you like the idea of helping but haven't done it before and you're not sure you'd like to participate?

No pressure! Join the #d9readiness Slack channel and watch in real time as people contribute together. It's okay to just hang out and see how it happens. You won't be forced to do anything but, who knows, maybe you'll change your mind and jump in after all. :)

Note: There are people interested in livestreaming the event on Twitch.tv. If that happens —which we hope it does!— we'll announce it via the Slack channel and Twitter. Thanks to Ofer Shaal for this great idea!

Have you prepared any websites or contributed projects for Drupal 9 already? Or helped with Drupal 9 compatibility issues in the issue queues?

Nice! We could use your guidance. If you are up for mentoring during the sprint, contact Gábor Hojtsy and let him know what days and times (with time zone) you can cover. Even if you can only help for an hour or two, that's okay! Ideally, we'd love to get 1 to 2 people covering all times across all time zones over the 2 days. You can check the current coverage on the Drupal 9 porting weekend event page.

After you sign up, make sure to join the #d9readiness Slack channel and arrive during your planned time. If there are other mentors there already, check in with them to see if they need any help with anything. Otherwise, scroll backwards in the Slack channel to catch up on what's been happening and if anyone has asked for help and didn't get a response.

How you mentor others will depend on what they need help with. Someone might need help finding an issue or project to work on. Another might have a technical question when reviewing a patch. If you are worried you might not know how to answer all the questions, it's okay! If you don't know the answer, just let them know and, if possible, see if someone else in the channel knows the answer. This is how we learn together.

If time allows, we hope to have an up-to-date list of issues to start with at the beginning of the sprint. Stay tuned in the Slack channel for more information about this.

Are you good at searching the issue queues?

Searching the issue queues for existing issues is a very valuable skill that should not be underrated. Sometimes we forget to check whether there is already an issue for our Drupal 9 compatibility problem, create a new one, and end up with duplicates. It happens to the best of us. I've done it myself!

Maybe you aren't up for doing any issue patching, reviewing, or testing but would be happy to look for existing issues or see if issues are duplicates and close out the extras. We'd love that type of help! If you would like to focus on this aspect of contribution, here are some tips:

Check the Drupal 9 plan of the project

Around a thousand projects have set up Drupal 9 plans on their project pages. Project maintainers can do this by editing their project manually. If a Drupal 9 plan is provided, and it does not say the project is already Drupal 9 compatible, it is the best starting point for finding the related issues. There still may be other issues that need triage, though.

Searching the issue queues

Whether or not a project has a Drupal 9 plan, searching for other issues may turn up things you can clean up. You'll want to use the "Advanced search" when searching the queue and make sure to look for all issues rather than just open issues. I've made the mistake of looking only at open issues and missing ones that were fixed but closed.

Keep in mind that some issues are tagged with "Drupal 9 compatibility" but not all are. In some cases, issues won't be tagged at all. There is an old tag called "Drupal 9 readiness" that was mistakenly used. If you find relevant issues that aren't tagged with "Drupal 9 compatibility", you can add the tag as you come across them. Only remove the "Drupal 9 readiness" tag if the issue is tagged with "Drupal 9 compatibility".

The issue titles aren't consistent so you'll need to search for various keywords within a project's issue queue, e.g. "drupal 9", "compatibility", "deprecated", "deprecation", "port", "update", "readiness", "core_version_requirement". Searching for these types of keywords using the "Advanced search" should help you find most, if not all, relevant issues.

If you need a refresher on the issue status codes, the "Status settings of issues" documentation is helpful. For the porting weekend, it would be good to work on issues with "Active", "Needs work", and "Needs review" statuses. You can also double check that the "Reviewed & tested by the community" (RTBC) issues were actually reviewed and tested by someone before moving to the RTBC status. The person adding the patch should not set the issue status to RTBC themselves.

What do these issues look like?

It's good to be familiar with what these Drupal 9 compatibility issues typically look like. Here are some examples with their current status as of May 16, 2020:

  1. Info.yml file core_version_requirement changes
  2. Deprecations
  3. Miscellaneous

What issues can be closed?

It's important to only close issues if you know they are duplicates or are irrelevant. If you find more than one issue where they are for the same thing (e.g. updating the info.yml file), then you'll have to assess which, if any, should be closed. This is a judgment call that's handled on a case-by-case basis.

Example 1: Both issue 1 and issue 2 are for info.yml file changes. Neither have had any work on them or any comments. In this case, I'd keep the first issue and close the second as a duplicate and specify the first issue in the "related issue" field on the second issue.

Example 2: Issue 2 had work done on it and has a patch, while issue 1 doesn't have any work done yet. In this case, I'd keep the second issue, close the first, and specify the first one in the "related issue" field of the second issue.

Example 3: Both issue 1 and issue 2 have had work done. This one is trickier. If it's obvious who did the work first, I'd close out the later issue and add the "related issue" to the other one. If it's not obvious which should be closed, I'd add a comment to both issues and set both of them up to be related to each other.

Example 4: This is a scenario that happened to me recently. Issue 1 was for the info.yml file and issue 2 was for dealing with the jQuery library dependency. Issue 1 had a patch and was RTBC. Issue 2 eventually added the info.yml file fix into the patch for the jQuery dependency changes. This made issue 1 a duplicate so it was closed. The person who did the patch for issue 1 was given an issue credit on issue 2 even though they didn't work directly on that issue. (Issue credits can only be assigned by project maintainers, so when working on issues of projects you don't maintain, please leave a note suggesting to add the appropriate credit for the issues you closed).

When should issues be created?

If it's clear from searching the project's issue queues that there are no existing issues for Drupal 9 compatibility, the next step is to double check using the Upgrade Status module. This can be done using simplytest.me or your local machine.

If it's a module, you would enable the module you want to check and enable the Upgrade Status module. Then you would check the Upgrade Status page for the module's Drupal 9 compatibility info. Ideally, you would check the module's most recent release as well as the module's dev version. This is because it's pretty common for compatibility fixes to be in the dev version but not in an official release yet.

It is useful to have the dreditor Chrome browser plugin when creating issues so that you can follow the issue formatting guidelines. But, don't let this keep you from creating issues. You can look at this issue example for formatting.

Please tag all Drupal 9 compatibility issues with the tag "Drupal 9 compatibility" and, if you are working on it during the porting weekend also tag it with "Drupal 9 porting weekend". Please do not tag it with outdated tags like "D9 readiness", "Drupal 9 readiness", "Drupal 9", "D9 update", or create new tags. It is difficult to find issues when lots of different tags are used. Other tags that may be relevant to add during the sprint are "Novice", "Needs tests", "Needs manual testing", "Needs reroll", and "Needs issue summary update".

Are you a developer interested in creating or updating code patches?

For some projects there are already patches ready for review and testing but, for others, new patches need to be created manually or using the drupal-rector utility or the Upgrade Rector module.

Note: It is very important to check the issue queue first to make sure you are not creating duplicate patches.

Using a local development environment

To create and update code patches locally, you have a number of options. I'm agnostic on local development tools and have used LAMP, MAMP, MAMP PRO, Lando, and DDEV at one time or another. I've known others who've successfully used Docksal and WAMP.

Pick your favorite or try something new. If you want to be more efficient during the porting weekend, I highly recommend you set up your local environment a few days beforehand. You'll want to make sure you know how to "blow away" your testing site easily and recreate it.

Generating patch files

If you are new to creating patches for Drupal projects, I highly recommend scanning the Advanced patch contributor guide. Since I don't create patches regularly, this is where I go to refresh my memory every time I make one.
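To make the mechanics concrete, here is a minimal sketch of the patch-generation workflow; the repository, file, and patch names are purely illustrative, not from any real project:

```shell
# Simulate a project checkout in a scratch directory (illustrative only).
set -e
rm -rf /tmp/patch_demo && mkdir -p /tmp/patch_demo && cd /tmp/patch_demo
git init -q demo_module && cd demo_module
printf 'name: Demo Module\ntype: module\ncore: 8.x\n' > demo_module.info.yml
git add demo_module.info.yml
git -c user.email=demo@example.com -c user.name=demo commit -qm "Initial code"

# Make the Drupal 9 compatibility fix (here, the info.yml one-liner).
printf 'core_version_requirement: ^8 || ^9\n' >> demo_module.info.yml

# Export the uncommitted change as a patch file to attach to the issue.
git diff > drupal9_compatibility.patch
```

The resulting patch file can then be uploaded to the issue, following the naming conventions from the patch contributor guide.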

For additional information for creating patches manually or using Drupal Rector, Tsegaselassie Tadesse (tsega) wrote a great post for the event: "How to prepare for Drupal 9 Porting Weekend".

Are you a developer who understands Drupal coding standards and has an eye for detail?

For every patch that gets committed, it's best practice for someone who didn't write the patch to review the code. Code patches might be simple, one-line changes or complex changes that span many files.

The patches we are focused on for the porting weekend will be mostly deprecation patches and info.yml file patches. Sometimes these are combined into one issue and sometimes they are split out. You can review some example issues above. For the info.yml file patches, these are trivial to code review. Ideally, someone should test the patch as well before marking these issues RTBC.

For reviewing deprecation patches, you'll likely need to check the change record for the deprecation to see if the code is updated properly. Some of these are pretty straightforward, such as replacing drupal_set_message with the Messenger service, while some will be more involved, like handling the removal of jQuery UI from Drupal core.
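As a minimal sketch of what such a replacement looks like (the surrounding code is hypothetical, and in real classes the messenger service should be injected rather than called statically):

```php
// Before: drupal_set_message() is deprecated since Drupal 8.5.0
// and removed in Drupal 9.
drupal_set_message(t('Settings saved.'));

// After: the Drupal 9 compatible Messenger service equivalent.
\Drupal::messenger()->addStatus(t('Settings saved.'));
```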

If the changes look good and you have time to successfully test the patch as well, the issue can be marked RTBC. If you review it but don't test it, add a comment with your notes and make it clear that it still needs manual testing and add the "Needs manual testing" tag. If the patch needs changes, add your comment with feedback and move the issue back to "Needs work".

When doing code reviews, it's very useful to use the dreditor Chrome browser plugin. This will provide a UI for reviewing the code as well as some shortcuts when editing your comment.

Do you like testing things?

Manual testing is a vital part of fixing issues. For each project that's made Drupal 9 compatible, we need people to try out the updates to make sure they don't break anything. How you test depends on the project. Some will be easy to test and some won't be. Ideally, you'll need to be familiar with the contributed project features to test properly though, for simpler projects, looking through the project page and README file might be enough to get you going.

Testing using simplytest.me

In many cases, you can test using simplytest.me. If you haven't used this wonderful tool, you've been missing out! There are times when I don't want to install a site locally but want to contribute by testing a patch. This is very common for me right now because my laptop is almost out of disk space. ;) But, it could be that you are on a computer that doesn't have a development environment set up or you just don't want to set one up for whatever reason.

If you'll be using simplytest.me, I highly recommend installing the dreditor Chrome browser plugin so that when you see patches in the issue queue, you'll see a SIMPLYTEST.ME button. Click that and it will open a new tab with the relevant project and patch. You don't have to install the plugin though; you can go directly to simplytest.me and choose the project and add the link to the patch.

Note, at the time of writing, simplytest.me is working for testing Drupal 8 versions but not Drupal 9. For contributed projects that are sticking with a Drupal 8 version and making that version Drupal 9 compatible, then you can test the Drupal 8 version on simplytest.me. For projects that are making a Drupal 9 version, then you'll need to test locally. If you are able to help with updating simplytest.me for Drupal 9 support, please contact Adam Bergstein (@nerdstein) in the #simplytestme Slack channel.

Side note: When testing core patches, something to double check is that you are testing the correct version. The version selected automatically on simplytest.me is the same as the version on the issue. This isn't always the version for the patch you are testing. This is not a problem with contributed project issues as long as the version on the issue is correct.

Testing with a local development environment

If you want to test locally, you have a number of local development options. One advantage of local testing compared to simplytest.me is that it's usually faster to spin up and refresh your site in between testing different patches. With local testing, you can also test what happens when you are on an older version of the project and update the code with composer.
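For that update test, the commands look something like this on a Composer-managed site (the module name and versions are illustrative; run this against your local test site, not production):

```shell
# Install an older release of the hypothetical module...
composer require drupal/example_module:1.5.0
# ...then update it to pick up the Drupal 9 compatible release.
composer update drupal/example_module --with-dependencies
```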

Reporting your results

As you manually test a project's features, it's good to take screenshots of key things as you go so that you can upload a select number in your comment. You don't have to take tons of screenshots, but you especially want to screenshot any bugs you find during testing and also copy any error text you encounter.

When you are done testing, it is very helpful to explain your steps in a bulleted list. For example, here's a Smart Trim issue comment I added with steps to reproduce and screenshots. Bonus points if you add the testing instructions to the issue summary so it's easier for others (and yourself!) to find it later.

Please add information about your testing to your issue comment instead of a "works for me" type comment. It is much more helpful to know what you tested, if it worked as expected or not, error text if available, and select screenshots showing it working or not working.


Big thanks to Gábor Hojtsy and Ofer Shaal for reviewing and fine-tuning this post. It takes a village to make Drupal its best! We hope you join us for Drupal 9 porting weekend, no matter how you'd like to participate.


p.s. Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I've only started posting here again after many years and I've been very busy reviewing Drupal 9 patches and surviving a pandemic. :) Please ignore the cobbler's old shoes for now, thanks!

May 17 2020

We missed a week, but we're back with a super-sized retro episode where we talk with two of the original DrupalEasy podcast hosts! First, Andrew Riley drops by to cover some recent Drupal news and make some module picks-of-the week (including a Drupal logo memory game!) Then, Mike speaks with Ryan Price about feature flags - it's not a module, not a service, but rather a design pattern. Finally, Chris Weber also has some new change records for us.

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

May 17 2020

Last week, the 15th semester of Drupal Career Online concluded, and we're proud to announce 10 new graduates of the program! Congrats to Aida, Ashley, Avery, Carla, Jada, Kim, Matt, Micah, Tonderlier, and Tyler!

Drupal Career Online is a 12-week, live-instructor, online Drupal training program designed to teach best practices and sound fundamentals for Drupal developers. 

The course includes not only (virtual) classroom instruction, but also a number of other experiences designed to provide students with various opportunities to learn and practice the material. 

  • Community mentors - each student is (optionally) introduced to a volunteer Drupal community mentor. The mentor and student decide how best to work together to achieve the student's goals. Thanks so much to this semester's mentor volunteers: Ofer, Doug, Adam, Philip, Andy, Brian, Albert, and Corey.
  • Office hours - each week during the 12-week class, in addition to 7 hours of classroom time, there are 4 additional (optional) office hours. These operate much like traditional college office hours; we open a Zoom room, students show up and ask questions, or just hang out and listen in on others. 
  • Screencasts - in addition to dedicated, produced screencasts for each lesson, every classroom period (all 3.5 hours) is recorded and provided to students as raw video. This way, if they have to miss class for any reason, they can watch the recording. Many students who attend class also take advantage of these videos to rewatch portions of class that they need additional practice with. 
  • Weekly activity reports and evaluations - both the students and the instructor provide weekly feedback. This allows us to ensure that each student is getting exactly what they need to succeed in the DCO. 

While the DCO has been around for over 6 years, that doesn't mean that our curriculum is stale. In fact, just the opposite - every semester, the curriculum is updated with the latest best practices. In the Spring 2020 semester, for example, lessons were updated to include the new drupal/recommended-project Composer template, our module development lesson was updated to avoid Drupal 9 deprecations, our class project was updated, and best-practice layout tools were updated (with more of an emphasis on Layout Builder). These ongoing curriculum iterations ensure our graduates are learning current best practices!

Perhaps the most important aspect of the DCO is that our alumni have access to weekly office hours - each and every week throughout the year. These office hours provide an opportunity for alumni to reconnect, ask questions when they get stuck, and sometimes just lurk to hear what others are asking about. We feel that it provides a level of comfort to our alumni - that we are as invested in their future as they are.

The next semester of Drupal Career Online begins August 31. Interested in learning more about it? Come to one of our free, 1-hour Taste of Drupal webinars where we'll tell you all about the DCO and you can ask any questions you may have. 

May 15 2020

There is a lot of excitement in the Drupal community about the pending release of Drupal 9. In particular, one of the most appealing elements is how the transition to Drupal 9 promises to be the easiest major upgrade in more than a decade.

Reminiscing with other community members about some painful upgrades from long ago got me thinking about how the nature of the majority of Drupal modules has changed.

Early Days - Modules as Solutions

My first big Drupal project was using the 4.6 version of core. One of the things I remember was the way many of the modules available tried to anticipate a particular use case and provide a “solution” -- essentially an attempt at a fully-formed (if configurable) way to make Drupal meet a set of needs. 

An example of this is the Job Search module I helped to maintain for a time. It had a preconfigured relationship between jobs and resumes, and permissions associated with different actions you would expect users to take: post jobs, apply for jobs, and so on.

There were frequent requests from users who wanted it to work just a little differently from how it was made. Sometimes these changes got incorporated back into the module, but sometimes they remained a special case - a customization that was really only useful in the context of their site. That started to change with the rise in popularity of toolsets and modules like CCK, which were more about making Drupal more flexible so it could be quickly customized to meet a very specific set of needs.

Rise of the Toolsets

What we’ve seen since that time is an increasingly powerful set of modules that extend the capabilities of Drupal, but leave it up to the site builder to decide how these capabilities should be applied. Image API, Responsive Images, Metatags, and many more are examples of modules that gave Drupal important new abilities, but without any recommended starting point on how they should be used.

Even a tool like Features was built to help make those configuration decisions portable between sites or environments on the same site. But increasingly, all decisions on how these capabilities should be set up fell entirely on the site builder. That was fine for those of us used to Drupal (or fortunate enough to work among an experienced team of Drupal experts), but more daunting for someone trying to put together a simple site.

In that time we’ve seen Drupal become the CMS of choice for governments, record labels, and major universities, but we’ve seen competitors slowly take over niches where Drupal used to be popular, such as startups and charities. Having to build from scratch can be less attractive for an organization with limited resources, so it’s understandable they’d be tempted to go an easier route, if available.

A Middle Way

Distributions have been one attempt at addressing this problem, such as the popular Commerce Kickstart, which helped to install a ready-to-use e-commerce site. The challenge we’ve seen in using distributions is that they’re complex to maintain, so often you’re not able to use the latest versions of core or popular contrib modules. Or when it comes time to upgrade, it has built-in assumptions about what’s installed, which can make it more complex to upgrade the component pieces. And finally, a distribution is typically only an option when you’re starting to build (or potentially re-build) a site, not for adding incremental functionality.

One of the exciting features Drupal introduced in version 8 was configuration management, which allows site configuration to be exported to files and imported elsewhere.

In addition, Drupal Console gives us an easy way to export individual elements: a content type, a view, and related dependencies. At Digital Echidna we’ve been experimenting with using these to create modules that are effectively just sets of related configuration meant to be a starting point to help us quickly address a particular use case: a locations map, an events calendar, and yes, even a jobs board.
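As a sketch of that workflow (the command names come from Drupal Console's config export group; the module name here is a hypothetical example, and exact options may differ between Drupal Console versions):

```shell
# Export the "event" content type (with its field configuration)
# into the config/install directory of a custom module.
drupal config:export:content:type event --module="my_starter_kit"

# Export a view and its dependencies the same way.
drupal config:export:view upcoming_events --module="my_starter_kit"
```

Once the exported YAML lives in the module's config/install directory, enabling that module on another site recreates the content type and view.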

Smart Date Module

Now, I’ve adapted this approach to help anyone interested in using (or even just trying out) the Smart Date module I’ve mentioned many times in this blog. It’s easy to install the Smart Date Starter Kit and it will give you a functional (if basic) Event content type and a view with displays to show upcoming and past events. 

It isn’t preconfigured for recurring events (since not every site needs that), but if you want to add that capability, it’s as simple as installing the Smart Date Recurring submodule and then updating the configuration of the “When” field to allow recurring events. That’s it! The view has already been set up to properly aggregate recurring events with multiple instances.

If you also need your events to show in a calendar, you can use the Smart Date Calendar Kit. Installing it via composer gives you all the dependencies (including Smart Date and Fullcalendar View) plus everything described above in the starter kit. The calendar is connected to the list views as navigation tabs, so it should be a robust starting point to manage events on your site.
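The install described above can be sketched in two commands (assuming the kit's Composer package follows the usual drupal/* naming convention; verify the exact project name on drupal.org):

```shell
# Pull in the kit plus its dependencies (Smart Date, Fullcalendar View).
composer require drupal/smart_date_calendar_kit

# Enable the kit; its dependencies are enabled automatically.
drush en smart_date_calendar_kit -y
```

After that, the Event content type, list views, and calendar display should all be in place and ready to customize.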

Both of these are just an initial set of configurations so you can add as much additional complexity as necessary to suit the specific needs of your site. And we’ve tried to build in some admin experience best practices, such as built-in administrative links to add content, so it’s intuitive to maintain out of the box. 

I hope you’ll try them out and post an issue if you think there are ways these could be made even better.

Future Considerations

The recent Drupal business survey posted by Dries hints that there may be similar conversations already happening elsewhere in the community, so it will be interesting to see the results when they’re announced at DrupalCon Global in July. It’s yet another reason to be excited about Drupal 9, and the future direction for this platform. With all kinds of innovations happening each and every day, it is an exciting era in the history of Drupal. 

May 15 2020
May 15

The editorial experience is becoming more and more important for every CMS. Users want more control but simpler interfaces. In Drupal, one of the biggest and most popular open-source CMSes on the market, the search for the best experience continues to this day. Various approaches are tested by the community, and with each release some new approaches are investigated and some are abandoned. From the perspective of a Drupal agency, it is quite interesting to watch this process unfold.

Let’s explore how the Drupal editorial experience got to where it is today and where it is heading.

Drupal beginnings

Initially, Drupal pages were built on nodes. Think of a node as an article or page in a typical CMS. A node had a Title and a Body: you titled the node and inserted all the content into one big text field. Typically, people would type the content of the page into the body field. This would be mostly text, but you could stick anything you liked in there (HTML, images, etc.). Quite quickly, people incorporated WYSIWYG editors into the body (CKEditor, TinyMCE and others were integrated as community-contributed modules). You could now author a quite complex page without any knowledge of HTML.

The WYSIWYGs were so popular that in Drupal 8 CKEditor was added to Drupal core.

On the database side, everything was still quite simple: a node table with the title, and an additional table for the body. That is roughly where WordPress has stayed until today. In Drupal, however, the evolution continued.

Drupal 6 - the domination of CCK & templating

At the time of Drupal 5, an initially small module was created which, in Drupal 6, completely changed the rules of the game: the Content Construction Kit module, a.k.a. CCK.

CCK was a contributed module which allowed site builders to add additional fields to nodes. This does not sound too exciting, but it was. The absolutely brilliant CCK module let users add various field types (number, text, boolean, select, etc.), and it created a separate database table for each field. The table's column type matched what you wanted to store in it (a decimal, a float, a varchar, a text, etc.). On top of that, it added the field to the default content form.

This was magnificent because you could create a form with multiple fields and then display the data in a template pre-built by the developer (an image on the right, stats on the left, long texts at the bottom -- that sort of thing). This is how one built pages in Drupal. The editor no longer had to ‘design’ the layout in a WYSIWYG. You could just fill in a form with fields and the template took care of the rest.

Moreover, you could now query in SQL for particular nodes by field content. E.g., if you created a City node type and added a population decimal field to it, you could search for all cities with a population larger or smaller than a set amount.
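A query of that kind could look roughly like the sketch below (the table and column names follow the typical CCK naming pattern of the Drupal 6 era -- content_field_FIELDNAME with a FIELDNAME_value column -- and are assumptions for illustration):

```sql
-- Find all published City nodes with more than one million inhabitants.
SELECT n.nid, n.title
FROM node n
INNER JOIN content_field_population p
  ON p.nid = n.nid AND p.vid = n.vid
WHERE n.type = 'city'
  AND n.status = 1
  AND p.field_population_value > 1000000;
```

Because each field got its own properly typed table, comparisons like this ran as ordinary indexed numeric queries rather than string matching inside a body blob.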

Very quickly after CCK, another module was created - the Views module. Views allowed users to build the queries in the admin interface. You could now create a list of cities ordered by population with a title and a teaser and some other data without the need to code anything. This was a massive breakthrough which allowed developers to create very compelling websites without writing a line of code. 

CCK was so popular that it was incorporated into Drupal core in version 7, and Views followed in Drupal 8.

This is how Drupal websites were built for quite a while. Many are still built like this today.

Drupal 7 - First attempts at page layouts 

From Drupal 7 onwards, fields were considered a standard. Templating, however, was not sufficient for the community and clients. Drupal developers began searching for solutions that would allow more control over the content display through the UI alone.

The main reason for these search efforts was the way websites began to be built. The knowledge that every additional click reduces the chance the customer will get to the content was spreading. The approach of having a sidebar and dividing information into pages was no longer interesting. Long scrollable pages were born.

The advent of long landing pages with content in sections began somewhere around 2010. It was, however, mobile that effectively killed the sidebar. You just could not fit a submenu in a sidebar on mobile. You now had to put everything on one long scrollable page with multiple sections (scrolling on mobile is much easier than clicking links). And each section had to be interesting, compelling and different.

Drupal developers started to search for solutions that would let editors create sections on pages more easily.

The initial solutions were:

  • Panelizer - a module based on another one (Panels) which effectively took over the node display. Instead of just fields, you could now design your page to include blocks, fields, static images, views and various other elements Drupal renders. Editors could override the default predefined layout on a node-by-node basis. The solution was great and got a lot of traction in the Drupal world.
  • Paragraphs - a bit late to the party on Drupal 7, Paragraphs nonetheless made a splash and started to gain popularity very quickly. The main reason was that it bridged two worlds: the Drupal form-building experience and the freedom to add blocks, while maintaining an ease of use for editors which the other solutions did not have.
  • Context - a more general module which gave users a mechanism for, well, acting on contexts (e.g. which page you are on, which user or role you have, etc.). Using these conditions, one could add reactions (e.g. show this block, set this setting). Context was very widely used for a while to arrange blocks on pages: if I am on this page, I want to see these blocks. The downsides were that you managed the layouts from a central location, needed admin privileges to manage them, and the UI was not straightforward. It was not suitable for large websites.
  • Blockreference - a simple yet powerful module which allowed referencing blocks from a node, effectively stacking them one over the other. This solution did not get a lot of traction.

Current state in Drupal 8 and onwards

Drupal 8, being a very big rewrite of Drupal, evened out the playing field a little, allowing the community to vote again on what it thinks should be the approach to building pages.

Blockreference did not get a D8 version, mostly because the Entity Reference module was now in Drupal 8 core and blocks became referenceable entities. One could theoretically build pages like this using what Drupal gives out of the box, but that did not catch on.

Context did not manage to gather much usage in D8 and to this day does not have a stable release.

Paragraphs - initial winner

The Paragraphs module came out as the clear early winner. It was stable very quickly and became the de facto standard in Drupal 8 for over a year. With over 140k installations, it now runs roughly one-third of all Drupal 8 websites. It is also worth mentioning that popular Drupal distributions created on Drupal 8 chose Paragraphs as the base of their content editing experience, in particular Thunder, a distribution for publishing houses, and Droopler, for building corporate websites.

Here is an overview of how paragraphs work in Drupal 8. A lot of work is also being done to further improve the editorial experience in the Experimental widget.

Panelizer moved to the core (became Layout Builder)

Panelizer took a different road. It lingered behind Paragraphs in terms of the number of installs, but because of its popularity in D7, work was underway to migrate its functionality into Drupal core (just as CCK's was in D7). It was, however, only in Drupal 8.5 that Layout Builder became available as an experimental module. In Drupal 8.7 it became stable.

Layout Builder offers great flexibility and great promise, but even as of this writing the UI still has a long way to go to be self-explanatory (one needs a bit of training, as many things are not obvious). Also, there is no clear “best practice” as to how content should be managed and what should compose the pages. Integrations are also lagging, most importantly with the search modules.

Currently, there is no clear winner, and best practice is not yet established. There is the Paragraphs module with 100k+ installations, multiple integrations and a clear UI. On the other hand, there is Layout Builder, which ships in Drupal core - an incredible strength.

Still, there are many modules which did not stand the test of time and were removed from core.

Gutenberg (a WordPress editor)

Last but not least, there is the Gutenberg project, the newest of the interesting editors in Drupal. It was ported from WordPress, where it is the main editor.

Gutenberg is a React-based editor which takes over the whole editing experience, giving the user a WYSIWYG-like experience. It differs in approach from Paragraphs and Layout Builder in that it does not store the layout or entities in the database; it stores the generated HTML. You create the content with a WYSIWYG editor and the resulting HTML is saved. This makes it a true WYSIWYG, but it reduces the readability of the content for machines (automatic updates or migrations of such content may be troublesome). Nonetheless, it continues to be integrated into Drupal better and better. With 900 or so installs it is by no means comparable to the two modules above, but the speed of adoption is impressive. Check out a quick overview of Gutenberg in Drupal.

As you can see, there is no clear winner. The Drupal community is still testing various approaches to building websites and empowering editors. On the one hand, this is fantastic because competition helps the best solution win. On the other, the efforts of developers are spread across multiple approaches, making progress slower. Which is best? I do not know.


May 15 2020
May 15

Drupal 8 comes with a Diff component in core. It's mainly used for showing changes to configuration or content revisions. However, it can be used in other ways too. Here are some quick steps to show how to use the Diff component in your custom forms or controllers.

One way Drupal core uses the Diff component is to show the difference between 'Active' and 'Staged' config. See \Drupal\Core\Config\ConfigManager::diff for more details.

Recently, I was building a form where I had to show the diff between two versions of an entity. The Diff component is designed as a stand-alone component that can be used to show the diff between any two arrays or strings. So why not use that in our own forms?

Here's a quick example:

use Drupal\Component\Diff\Diff;
use Drupal\Core\Serialization\Yaml;

    // Encode both entity versions as YAML and split into lines to diff.
    $diffFormatter = \Drupal::service('diff.formatter');
    $from = explode("\n", Yaml::encode($current->toArray()));
    $to = explode("\n", Yaml::encode($revision->toArray()));
    $diff = new Diff($from, $to);
    $diffFormatter->show_header = FALSE;
    // Add the CSS for the inline diff.
    $form['#attached']['library'][] = 'system/diff';

    $form['diff'] = [
      '#type' => 'table',
      '#attributes' => [
        'class' => ['diff'],
      ],
      '#header' => [
        ['data' => t('From'), 'colspan' => '2'],
        ['data' => t('To'), 'colspan' => '2'],
      ],
      '#rows' => $diffFormatter->format($diff),
    ];

I hope this helps. Let me know how you have used the Diff component in your project in the comments!


Posted by Jibran Ijaz
Senior Drupal Developer

Dated 15 May 2020

May 15 2020
May 15

Matt and Mike talk with Putra Bonaccorsi and host Mike Herchel about Drupal 9's new front-end theme, and its past, present, and future. 

May 14 2020
May 14

Continuing our short series of articles highlighting ways that the Drupal software and its community are building solutions to help combat the effect of COVID-19, today we hear from Harish R. Rao of Interpersonal Frequency. Here, he describes their project at San Mateo County.

Under normal circumstances, a client asking for a new website “today” might seem laughable. But in the age of COVID-19, a quick turnaround on a new site could literally be a matter of life and death. In mid-March, San Mateo County, California reached out to the Interpersonal Frequency (I.F.) Solutions & Support team because their county, home to roughly 750,000 residents, had been one of the first in the U.S. to adopt a shelter-in-place order. The traffic on their county website had more than quintupled, and their current homepage was not working to convey vital and quickly evolving information in simple and accessible ways. 

Fortunately, San Mateo’s Drupal site was well-suited to the challenge. We knew we had the foundation for a solution, equipped to handle even a surge of visitors and able to support the rapid deployment of new content templates. 

We knew that our company was well-suited to the challenge, as well. Having collected aggregated analytics data and best practices from other civil emergencies, we understood the essentials needed, and we had clear indications of an urgent public need for COVID-19-related responses from municipal digital infrastructure. I.F. assembled a team of seven experts in content, user experience, and Drupal development. The team dove in to solve the specifics with the goal of supporting San Mateo’s residents and County staff as efficiently and effectively as possible. 

Our team worked with the client to marry best practices with their specific needs, such as clearly communicating and providing access to the public services still available online and breaking down information silos between government departments. We identified priorities and set out to provide a one-stop-shop for local COVID-19 updates that would be accessible, mobile-responsive, user-centered, and easy to maintain and update. We also set out to create a solution that could launch and assist the public even as the county was still developing new interior pages and that could morph to support new priorities as the situation evolved. We needed a site that would work now, next week, next month, and beyond. 

Drupal allowed us to address all of these goals. After a near round-the-clock effort over six days -- including a weekend -- we deployed. The design launched as a system of content cards that could be rearranged and repurposed as needed. It was purposely mobile-forward and flexible. The county has continued to shape and adjust the site as the situation and their priorities evolve, and you can see the latest iteration at SMCgov.org

May 13 2020
May 13

As the global pandemic continues to spread — causing widespread sickness and death, restricting in-person human contact, creating additional responsibilities at home or financial hardships, or any of the countless other changes to daily life that have resulted in feelings such as fear, anger, boredom, or uncertainty — this virus has forced some of us to reassess our values and our place in the world. While the majority of us who participate in the Drupal community remain focused squarely on technical issues, others might find now is an especially good time to take a closer look at Drupal's Values and Principles. For those of us privileged enough to have the time and space to consider more philosophical questions, we can ask if Drupal's stated values (still) align with our values, or even consider the role of Drupal in our lives when the pandemic subsides.

This article — the first in a series of articles exploring Drupal's values and principles — considers Drupal's first principle, "impact gives purpose," which is one aspect of the first value, "prioritize impact." On one level, the first principle is merely practical. It concludes by prioritizing the "stakeholders" we should consider: "When faced with trade-offs, prioritize the needs of the people who create content (our largest user base) before the people who build sites (our second largest user base) before the people who develop Drupal (our smallest user base)." In its simplest form, this principle tells us that Drupal ranks the needs of content creators before the needs of the developers.

However, the first principle offers considerably more depth. While acknowledging the practical nature of the Drupal software, it calls on us to aspire to a higher goal: "When contributing to Drupal, the goal is often to optimize the project for personal needs ('scratching our own itch'), but it has to be bigger than that." Thus, Drupal is presented as much more than simply a good product.

The phrase "scratching our own itch" has become a platitude. It's everywhere. The Harvard Business Review called it "one of the most influential aphorisms in entrepreneurship." The phrase is well known among software developers in part because in his influential 1999 book, The Cathedral and the Bazaar, (the highly controversial) Eric S. Raymond wrote, "Every good work of software starts by scratching a developer's personal itch." In the Drupal community, however, we see ourselves as aspiring to much more. 

As the first principle states, "Slowly, focus shifted from writing the perfect code to growing the project, and amplifying Drupal's impact through better marketing, user experience, and more." 

Countless individuals and Drupal subgroups express their desire to impact people. For instance, the Drupal agency Palantir prioritizes impact that is "positive," "lasting," "thoughtful," and "deliberate." Over at ThinkShout, a Drupal agency that works "with courageous organizations that put people and the planet first," the "impact" they aspire to in their first core value "is driven by our sense of connectedness and desire to deliver meaningful, measurable results." Countless individuals and organizations in the Drupal community feel motivated by a sincere desire to positively "impact" other human beings.

Drupal's first principle is especially ambitious in describing the impact of the Drupal community: "Prioritizing impact means that every community member acts in the best interest of the project." It seems unlikely that "every community member" can or should make the Drupal project their top priority. Though it may be idealized, it's a worthy goal. We also must reiterate that people will necessarily begin with their own needs.

Contributions to the Drupal project should not come at personal expense. Imagine telling a single parent, who recently lost their job and wants to build a career with Drupal, to consistently act "in the best interest of the project." Change should come from individuals who have the capacity to help others. Part of why some of us contribute to Drupal is because we imagine another human being finding value in our work. We do not require those who benefit to give back. In this idealized form, we encourage people to participate, but we give with an open hand and no expectation of reciprocation. We contribute because we believe our actions have meaning. As the first principle states, "We derive meaning from our contributions when our work creates more value for others than it does for us."

When we look inward to examine our value systems, we probably do not want to find a heap of clichés, and phrases like "prioritize impact" and "create value for others" might sound rather cliché to some ears. In fact, on various lists of "business buzzwords," the word "impact" takes the top slot. The noun "impact" comes from the Latin impactus, which means "collision" or "striking one thing against another." The cultural and historical context of "impact" doesn't negate its usefulness, but if the real goal is to "derive meaning," it might be helpful to reconsider this principle in more human terms.

As previously noted, much of Drupal's first principle points toward bigger goals that extend beyond the conference room to a human-centered skill that good people work to cultivate: generosity. We seek to help others, both at home and in our careers. The business-friendly language in the first principle like, "maximize the impact the project can have on others," could, for at least some of us, be read as "practice generosity toward others." We seek to use Drupal for Good or even live in (with) Drutopia.

Thanks to Drupal and its community, some of us possess the fortunate capacity to help others. If that describes you, then consider in what ways you have the opportunity to be generous. Toni Morrison — the iconic writer, activist, and college professor who became the first African-American woman to win the Nobel Prize in Literature — used to tell her students:

"When you get these jobs that you have been so brilliantly trained for, remember that your real job is that if you are free, you need to free somebody else. If you have some power, then your job is to empower somebody else. This is not just a grab-bag candy game."

In this case, Morrison's inspirational words apply not just to students, but to countless people in the Drupal community. Many in our community have freedom and power. We have the opportunity to help others. Help other Drupalers. Help kids. Help the homeless. Help anyone in need. Maybe even help Drupal and give to #DrupalCares. If your actions produce positive results, keep going!

Ultimately, action matters more than language. Whether you feel motivated by the desire to make an impact, or you want to practice generosity, don't let up because the world has changed. Take another look at Drupal's Values & Principles and determine for yourself if they motivate you to action. This is not just a grab-bag candy game.

May 13 2020
May 13

Web accessibility is inclusive design that ensures everyone can access your website, no matter their abilities. In the same way a ramp on the sidewalk makes sure someone in a wheelchair can get over the curb, alternative (alt) text on an image can make sure someone using a screen reader can understand what the image conveys.

Types of Disabilities

Globally, 1 in 5 people have some form of a disability and someone's ability can change over time. For example, as adults age, they may lose some of their sight or hearing. A disability can also be temporary such as a broken arm or misplaced glasses. Sometimes the disability can be situational: someone on a busy subway who cannot hear the audio in a video would rather read captions, and someone by the pool in bright sunlight is in need of high contrast.  

Disabilities vary considerably. Some users with poor eyesight need high contrast and to increase the font size to access the content. Users who are blind need a screen-reader to access websites. Users who cannot hear need alternatives to access the audio content. Users with a mobility impairment may need to use voice activated commands or a mouse alternative such as a mouth-operated joystick to access the content. Users with epilepsy need to avoid quickly flashing content.   

How to Make Your Website Accessible

The best way to ensure your website is accessible is to provide multiple ways of accessing your content, such as alt text for images, captions for video and the ability to navigate with a keyboard instead of a mouse. Making sure the layout and structure of your website is logical, intuitive and simple to navigate can also make it easier to use for everyone.   

Why Web Accessibility Matters

Well, maybe now you're saying to yourself: equal access sounds nice, but why should I care about web accessibility? This seems like an extra headache that is going to cost me money. In fact, not caring about accessibility is what is going to cost you money. First, if your website is not accessible, you risk losing a significant portion of potential clients, since 20% of the population will not be able to use your site. Second, the law mandates it.

In the US, Title III of the Americans with Disabilities Act includes websites and now applies to any public-facing businesses and private businesses in 12 categories, including sales, entertainment, service establishments, recreation and more. A website which is deemed inaccessible to someone with a disability can be forced to immediately redesign the website and to pay monetary damages and the other party's attorney fees.   

In Canada, four provinces currently have web accessibility laws: Quebec, Manitoba, Nova Scotia and Ontario. The Accessibility for Ontarians with Disabilities Act (AODA) is the most comprehensive of the four, and aims to create a barrier-free Ontario by 2025. To this end, by 2021 all private and non-profit organizations with more than 50 employees and all public organizations must make their websites compliant with Web Content Accessibility Guidelines (WCAG). The federal government is looking to pass web accessibility guidelines in the near future with the aim of enforcing WCAG. We can expect Canadian federal web accessibility laws to be rolled out soon.   

With all this in mind, it makes sense to build accessibility into your website design and updates now before you are hastily forced to do so by new laws. Or, before you get sued. In 2019, the pizza giant Domino's famously lost a US Supreme Court Case (Domino's Pizza v. Guillermo Robles) by failing to make their website accessible to a blind man who used a screen reader to access their site and mobile app. And their brand will forever be remembered as the big bad pizza chain who went up against a blind man in court. And lost. Don't be Domino's. The damage to your brand alone is reason enough. 

Accessibility Is a Human Right

Web accessibility is a human rights issue. It is imperative that everyone can access the same services in society, no matter their abilities. Now more than ever, many essential services, such as banking, healthcare, utility bills and education, are moving online. The laws have been slow to catch up to the internet, but as the internet becomes more and more integrated into our everyday lives, the courts are finally catching up. There is no time like the present to design a website with accessibility in mind, or to incorporate accessibility as you update your website. Besides, it simply makes your site easier for everyone to navigate.

We're happy to help get you there. Contact us if you want us to help make your website accessible, or take our Web Accessibility training.

May 13 2020
May 13

What is the day-to-day life of a Drupal core committer like? Besides squashing bugs and shepherding the Drupal project, the maintainers responsible for Drupal core are also constantly thinking of ways to improve the developer experience and upgrade process for novice and veteran Drupal users alike. With Drupal 9 coming just around the corner, and with no extended support planned for Drupal 8 thanks to a more seamless transition to the next major release, Drupal's core developers are hard at work building tools, approving patches, and readying Drupal 9 for its day in the spotlight. But Drupal 9 isn't the only version that requires upkeep and support—other members of the Drupal core team also ensure the continued longevity of earlier versions of Drupal like Drupal 7 as well.

The impending release of Drupal 9 has many developers scrambling to prepare their Drupal implementations and many module maintainers working hard to ensure their contributed plugins are Drupal 9-ready. Thanks to Gábor Hojtsy's offer of #DrupalCares contributions in return for Drupal 9-ready modules, there has been a dizzying acceleration in the growth of modules available as soon as Drupal 9 lands. In addition, the new Rector module allows for Drupal contributors to have access to a low-level assessment of what needs to change in their code to be fully equipped for the Drupal 9 launch.

In this inaugural episode of Core Confidential, the insider guide to Drupal core development and Tag1's new series, we dive into the day-to-day life of a core committer and what you need to know about Drupal 9 readiness with the help of Fabian Franz (VP of Software Engineering at Tag1), Michael Meyers (Managing Director at Tag1), and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice). Learn more about how Drupal's core team continues to support the Drupal project as it gets ready for the latest and greatest in Drupal 9, due to be released this summer for eager CMS practitioners worldwide.

[embedded content]


May 13 2020
May 13

Keeping a castle secure requires you to watch out in different directions. The same applies to websites — protecting your “digital castle” involves many different aspects.

Luckily, Drupal's security is strong out of the box, so you mainly need to observe its best practices. You can always count on our Drupal support & maintenance team to improve your website security and implement measures like:

  • switching to HTTPS
  • applying Drupal security updates
  • bringing order to roles and permissions
  • blocking access to important files
  • installing security modules
  • and many more.

Our team can protect not just your website but also your budget from extra expenses, thanks to our reasonable pricing and quick problem solving.

Today, we are taking a closer look at one of the practices to improve website security that is not so frequently described — website session timeout. Let’s see how this is done by the Automated Logout Drupal module.

How to improve website security with automated logout

You might have noticed that online banking applications show a countdown of your session time. This session time is usually very, very short.

Of course, not all apps or websites deal with this level of sensitive operations. So their session expiration time may vary. Still, if you want to improve website security, your site needs automated logout in any case.

The explanation is simple: even if attackers intercept a user’s session ID, a short session limits the window in which they can intrude into your site. This makes automated logout one of the website security basics used to improve the protection level.

  • According to OWASP (the Open Web Application Security Project), insufficient session expiration increases exposure to session-based attacks. The shorter your website session is, the fewer chances you leave to attackers. It is therefore recommended to keep a good balance between security and usability, depending on the purpose of your website.

Website security features of the Automated Logout Drupal module

As part of a website's security measures, the Automated Logout contributed module for Drupal allows site admins to specify a period of inactivity after which users are automatically logged out.

The module is very flexible in its settings. Among its features to improve website security are:

  • different session timeouts for different user roles
  • individual website session timeout on a per-user basis
  • customized notifications about the upcoming logout
  • JS mechanisms to keep users logged in when they have multiple tabs open or are working on a form
  • and more

How to work with the Automated Logout Drupal module to improve security

Let’s see the module in action. With the module installed and enabled, go to Configuration — People — Automated logout settings. Here are the key details to configure:

1) The main time settings

  • Set the timeout value in seconds (60 or longer). If role-based timeout is activated, this setting will not be used.
  • Set the maximum timeout in seconds. This is the maximum time that can be set by users who are allowed to set their own timeout.

2) Time for a response

  • Set the timeout padding in seconds. This is the time a user has for responding to the dialog before the logout (whether they want to resume the session or not).

Automated Logout Drupal module - settings for time
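To make the timeout and padding settings concrete, here is a minimal sketch in plain JavaScript of the decision the module's client-side logic has to make. This is an illustration only, not code from the Automated Logout module; the function name and state labels are hypothetical, and it assumes the padding window is added on top of the timeout, as the description above suggests:

```javascript
// Given how long the user has been idle (in seconds), the configured
// session timeout, and the padding window, decide whether to keep the
// session, show the logout dialog, or log the user out.
// Illustrative sketch only -- not the Automated Logout module's code.
function sessionState(idleSeconds, timeoutSeconds, paddingSeconds) {
  if (idleSeconds >= timeoutSeconds + paddingSeconds) {
    return 'logout'; // padding expired: log out and redirect
  }
  if (idleSeconds >= timeoutSeconds) {
    return 'warn'; // timeout reached: show the dialog, start the padding countdown
  }
  return 'active'; // user is still within the timeout: keep the session
}
```

Under these assumptions, with a 60-second timeout and 10 seconds of padding, the dialog would appear after 60 seconds of inactivity and the logout would fire at 70.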

3) Where to redirect users

You need to set the redirect URL to which a user is redirected after the session is over.

Automated Logout Drupal module - settings for redirect URL

4) User-specific and role-specific timeouts

You can disable user-specific logout thresholds if you want to prevent users from setting their own maximum logout times. If it is allowed, it can be configured in individual user profiles in the People section of the admin dashboard; however, it never exceeds the sitewide maximum timeout you set in Point 1.

Automated Logout Drupal module - settings for user-specific and role-specific timeouts

  • You can enable role timeout if you want to allow specific user roles to set their per-role maximum timeouts and redirect URLs. The permissions for specific roles can be set in the People — Permissions section of the admin dashboard.

Automated Logout Drupal module - roles and permissions

5) The logout dialog settings

When logout is imminent, it’s good practice to show users a dialog window that informs them and gives them a chance to answer “yes” or “no” to the option of resetting their session. Here are the things to customize:

  • The dialog title
  • The message to display in the logout dialog
  • The message to display after the logout
  • The type of message (status or warning)
  • The time for a user’s response (see Point 1).

6) The response buttons

It’s also possible to customize the “confirm” and “decline” button text in the dialog window, or to disable the response buttons entirely.

If you need to restyle the buttons to match your brand’s identity, or to customize the process described above in any other way, this is all possible if you contact a Drupal team.

Improve website security with our support experts!

It’s easy to stay safe when the best security measures for websites are taken. Our Drupal development company knows how to improve website security, so let us help you make your site a protected place.

As support and maintenance experts, we strive to improve sites in every aspect, so you can ask us to improve not only your site’s security, but also its performance, SEO, etc.

You can reach out to us with tasks of any scope — from installing and configuring specific security modules to performing a comprehensive security audit at a good price. Drop us a line to improve website security today!

May 13 2020
May 13

Security above all. The Drudesk team would love to help all businesses make their websites secure.

Hence our very popular Drupal website security audit service, offered at affordable prices. During the audit, we perform in-depth checks and find security vulnerabilities of various kinds. After a good clean-up, we always recommend helpful tools that let you keep a regular eye on website security.

One of these tools is the Security Review module that can quickly check a website for security vulnerabilities. In this post, the Drudesk help team will show how it performs a quick Drupal website security check on a number of important points.

Why is security important for a website?

Hackers use plenty of techniques (SQL Injection, cross-site scripting, remote code execution, and so on) to intrude into websites and manipulate their data for malicious purposes.

Hence the importance of web security checks: they help you discover your site’s weak points. Securing your website allows you to:

  • protect your reputation and build long-term customer relationships
  • protect your important business information from disclosure
  • keep your customers’ data (names, credit card numbers, etc.) intact
  • avoid being blacklisted by Google or other services due to security vulnerabilities
  • protect yourself against legal proceedings over mishandled sensitive data
  • safeguard your business against direct money losses related to compromised data
  • keep your overall website performance smooth and never lose conversions

The best Drupal security practices

If you are using Drupal, your starting position on website safety is favorable, because Drupal is rated highly for security. However, to keep a Drupal website secure, there are good practices to observe:

  • keeping your website up-to-date with the security releases published by the Drupal Security Team
  • using the HTTPS protocol for encrypted data transfer
  • setting the right roles and permissions to relevant users
  • taking precautions with the super admin user
  • using strong logins and passwords for admin accounts
  • doing regular backups
  • blocking access to important files
  • removing outdated modules
  • using helpful Drupal security modules

As for the last point, the collection of modules for checking and improving Drupal website security includes the Security Review module, which performs quick security checks. Next, we describe how it works.

Website security check with the Security Review module

The Security Review Drupal module can quickly check a website for security vulnerabilities. It performs an automated website security check on-demand — with a click on the admin dashboard.

This will give you a quick answer to the question “How secure is my website?” based on a number of factors. The module works through a thorough security testing checklist:

  • Safe extensions only are used for uploaded files and images.
  • Dangerous tags in submitted content are not found.
  • Untrusted roles do not have access to important permissions.
  • Error reporting is set to log only.
  • PHP files in Drupal files directory are not executable.
  • Files and directories are not writable on the server.
  • The private files directory is not in the server root.
  • Sensitive temporary files are not found.
  • Untrusted users are not allowed to add dangerous HTML tags.
  • The $base_url or trusted_host_patterns setting is configured.
  • Views are access controlled.

With the module installed and enabled, go to Reports — Security Review and click “Run and review” to start the website security check. After a few seconds of the check, you get a list of things to improve. Each of them can be opened for more details or skipped.

Security Review Drupal module

If you click on the “Settings,” you can select the website security check steps that should be skipped, set the “untrusted roles,” and choose to log the check results and skips.

Security Review Drupal module settings

The module does not make any improvements by itself; it is a scanning tool. Its website security check is meant to inform you of what needs to be improved.

Let us check your website for security vulnerabilities!

Dealing with safety is difficult, even when the security test runs from a single click on your dashboard. You also need to know how to properly fix the vulnerabilities the check discovers. But you can always call Drudesk!

Our Drupal support and maintenance team is here to check your website for security vulnerabilities and increase its safety. We respect your budget and offer very reasonable pricing for our full stack of support services, including security tests. After years of experience with all kinds of site vulnerabilities, Drudesk devs can quickly find the cause of a problem.

So drop us a line if you would like to:

  • install and configure the security modules that check and/or improve your site
  • interpret their website security check results properly and take measures
  • check the website security comprehensively — with an audit by our experts

Never hesitate to contact us! What can be more affordable than our services? A totally free consultation! Drudesk: here is where your Drupal website security starts.

May 13 2020
May 13

Drupal 9 roadmap

Drupal 9 will be released in 2020.

If you maintain a module on drupal.org and want the sites that use it to live long after Drupal 8’s end of life in 2021, you need to prepare it for Drupal 9.

Why should you do it at all? First, users of your module may want to have it on Drupal 9. Second, your clients may ask to migrate their websites to Drupal 9, so it’s better to be ready.

May 12 2020
May 12

For more than 13 years, BADCamp has been dedicated to providing the Drupal community with fun, inspiring, and safe opportunities to teach, learn and share together. While 2020 has seen many changes to the ways we interact with one another, our passion for open-source, and the people who contribute to it, boogies on!

For the safety of all members of our local and global communities, BADCamp XIV will be delivered in an online format. 

Mark your calendars for Oct 14-17, 2020 – the first-ever virtual BADCamp! Featuring all the things you love about BADCamp: trainings that help you level up your skills, sessions that promote learning from members of your community, opportunities to make new friends and connections, and lots of fun! Now, from the comfort of your own kitchen table, home office, Animal Crossing headquarters, sofa, or wherever your WiFi signal is strongest.

While we still have many details to work out, we want you to know that, wherever you are, we are in this together. We look forward to seeing you in October.

May 12 2020
May 12

As part of the Drupal Community Working Group's (CWG) continuing expansion of their Community Health Team, we are pleased to welcome Dr. Michelle Drapkin, a Behavior Scientist and Clinical Psychologist, as one of our two mental health subject matter experts. This team will focus on proactive programs to help promote the overall health of the Drupal community. With the COVID-19 pandemic causing various levels of anxiety for individuals around the world, we are working to provide resources to community members.

On May 22 we met with Dr. Drapkin for a Wellbeing Hour, where she walked us through some strategies to support our wellbeing during this challenging time. Dr. Drapkin is an expert in evidence-based approaches to managing stress and anxiety and gave us a tour down a buffet of options to support our health and wellbeing. We got a taste of using present-moment awareness to manage your stress and learned about leaning into your values, changing your relationship with your thoughts, and being more intentional with communication to support your relationships (at work and home!).

About our speaker:
Dr. Michelle Drapkin is the Owner/Director of the CBT Center of Central NJ. She is one of the newest members of the CWG team helping as a mental health subject matter expert on the Community Health Team.

About the Drupal Community Working Group (CWG):
The CWG is a volunteer group whose mission is to foster a friendly and welcoming community for the Drupal open source project and to uphold the Drupal Community Code of Conduct.

File attachments:  CWG-Wellness-hour.png Drupal Wellbeing Hour 5.22.20 handouts.pdf
May 12 2020
May 12

This post was written collaboratively by tedbow and hestenet.

Project Update Bot now enabled for all projects

The initial testing period for the Project Update Bot is now complete, and the Bot has been enabled for all projects.

If you are a project maintainer and would like to understand how to control the bot for your projects, please consult the instructions on the bot's account page.


Drupal 9.0.0-beta2 has been released, and Drupal 9.0.0 is scheduled to be released on June 3, 2020. The upgrade from Drupal 8 to Drupal 9 should be the easiest major version update in the last decade of Drupal’s history. One of the reasons for this is that over 1,700 contributed modules already have a Drupal 9 compatible release. Making a module that's already compatible with Drupal 8 compatible with Drupal 9 just requires removing deprecations. When looking at just one module, these changes are usually trivial, but when we consider managing over 8,800 Drupal 8 modules on drupal.org, the upgrade process could easily take hundreds or thousands of hours from the Drupal community.

Fortunately we can accelerate this process dramatically by using Drupal Rector, a tool developed by Palantir.net and Pronovix. Drupal Rector can be used by developers to automatically fix many of the deprecations needed to make a module Drupal 9 compatible. In fact, Drupal Rector can currently fix deprecations in more than 2,000 existing projects on Drupal.org. Although using Drupal Rector is not difficult, running it manually on all Drupal 8 modules would have taken hundreds of developer hours to complete.

To accelerate upgrading of modules to Drupal 9 the Drupal Association has partnered with Acquia and Palantir to automatically provide patches generated by Drupal Rector for all contributed modules possible. For many modules these patches will be able to make the modules fully compatible with Drupal 9. For some modules it will still be necessary for a developer to manually replace other deprecations. Patches will also be provided for contributed themes but preparing a theme for Drupal 9 will require other changes such as updates for Twig 2.

These patches will be posted to issues created by the new Project Update Bot. An example issue can be seen here: https://www.drupal.org/project/entity_block_visibility/issues/3134823

The Project Update Bot is not associated with any individual or company and therefore will not affect issue credits for any individual or company. If maintainers find the patches provided by the bot useful they are welcome to credit the bot account as a way to provide feedback on this initiative.

It will be up to the project maintainers to decide if they want to use these patches but in many cases the patches can help speed up the process of updating a module for Drupal 9 dramatically.

Maintainer Options

For project maintainers, there are a few options for dealing with these issues:

  1. Leave the issue open and apply the provided patch to remove some or all Drupal 9 deprecations. The Project Update Bot will check weekly if Drupal Rector is able to remove new deprecations and post a new patch if possible.

  2. Remove the “ProjectUpdateBotD9” tag from the issue to stop new patches from being posted. If you would like to use the issue and the patch as a starting point simply remove this tag and the bot will not post any new patches. Add the tag back and the bot will post new patches if possible.

  3. Close the issue to stop the bot from posting new patches. If you are already handling deprecations in another issue or otherwise don’t find the patches helpful simply close the issue and the bot will not post any new patches

Providing feedback

If there is a problem with one of the patches posted by the Project Update Bot, such as a deprecation it does not correctly replace, you can file an issue in the Rector issue queue. For other issues with the bot, for instance an unclear issue summary, use the Infrastructure project issue queue with the component “Bot: Drupal Rector”.

How can you help?

While Drupal Rector can currently fix some or all of the deprecations in over 2,000 contributed modules, its rules only cover about 50% of the total Drupal 9 deprecations. New Rector rules are being added with every new release of Drupal Rector. You can help by writing new Drupal Rector rules, which will make it possible for Drupal Rector to upgrade even more modules.

May 12 2020
May 12

When your website users can quickly find what they are looking for, this naturally translates into increased conversions and more. A good UX consultant will always recommend adding search to a website.

Read on to discover more details about why search is important. If you are using Drupal, this post will be of special interest to you because we will describe how to make the website search functionality really fast and user-friendly with Lunr.js.

Why is search important on a website?

The question “Does my website need a search?” is asked by many customers. On large and complex websites, internal website search is a must-have in order to improve website navigation. The decision to add a search box to a website is especially vital for content-rich websites, e-commerce stores, knowledge bases, and so on.

However, it is also very helpful for small and medium-sized websites. Here we will discuss in what ways search functionality on any website, when it is fast and user-friendly enough, can benefit its owner:

  • If a user finds the needed thing through the search box quickly, they are likely to purchase it.
  • Search results give you new ideas about products or services users are interested in.
  • You get information about the user behavior for your marketing strategies.
  • A search feature on your website provides you with new SEO keywords.
  • Adding search functionality to a website makes your design more customer-centric.
  • A search box increases your session duration and reduces bounce rate.

You can always reach out to our web development team for a free consultation and further creation of a search feature on your website at very affordable prices. We respect your budget and will recommend a fast, easy-to-use, and efficient search solution that will not require extra costs. Meanwhile, let’s discuss how search functionality is created on websites.

How to add a search function to a website built with Drupal?

How to set up search functionality on a website depends on the CMS you are using. Each CMS has its own extensions that provide the search feature.

In Drupal, to create basic search options for smaller sites, it’s enough to use the built-in Search module. It allows your users to search for full words in Drupal entities (content nodes, users, etc.). You can also specify the indexing settings and choose the ranking factors like:

  • publication time
  • activity in comments
  • keyword relevance

Through the use of extra modules, Drupal websites can also get more complex search features like:

  • faceted search
  • search by alternate spellings
  • similar content suggestions
  • result highlighting
  • search through attachments
  • multisite search
  • and much more

To achieve this, they connect to robust search platforms like Apache Solr or Elasticsearch through contrib modules like Search API Solr Search and Elasticsearch Connector, as well as use modules like Search API, Facets, and many more.

Superfast and easy-to-use internal website search with Lunr.js

In addition to the ones described above, there are other interesting, JavaScript-based options for adding search to a Drupal website. They offer especially fast search on the client side. We are sharing one of them with you right now: your Drupal site can have search functionality based on Lunr.js.

What is Lunr?

Lunr.js is a full-text search library for use in the browser. It is a small but full-featured library that provides a great search experience. Lunr.js offers a simple search interface for finding the content that best matches a search query. It requires no server-side search services and has no external dependencies.

You might have noticed its “lunar” brand design. Lunr.js is an alternative to the famous search engine Solr. “A bit like Solr, but much smaller and not as bright,” runs Lunr’s official slogan.


Client-side JavaScript solutions are known for their speed, and Lunr is no exception. Lunr.js search functionality is especially good for cases where instant search in Drupal is needed.
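To make the idea concrete, here is a small, self-contained sketch of the client-side inverted-index approach that libraries like Lunr.js use. This is not Lunr's actual API; the function names, tokenization, and scoring here are simplified illustrations of the technique:

```javascript
// Toy illustration of client-side full-text search via an inverted
// index, the approach libraries like Lunr.js use. NOT Lunr's API;
// it only shows why no server round-trip is needed at query time.
function buildIndex(docs) {
  const index = new Map(); // token -> Map(docId -> term count)
  for (const doc of docs) {
    const tokens = (doc.title + ' ' + doc.body).toLowerCase().match(/\w+/g) || [];
    for (const token of tokens) {
      if (!index.has(token)) index.set(token, new Map());
      const postings = index.get(token);
      postings.set(doc.id, (postings.get(doc.id) || 0) + 1);
    }
  }
  return index;
}

// Score documents by total term frequency across all query tokens,
// and return document ids ordered by descending score.
function search(index, query) {
  const scores = new Map();
  for (const token of query.toLowerCase().match(/\w+/g) || []) {
    for (const [id, count] of index.get(token) || []) {
      scores.set(id, (scores.get(id) || 0) + count);
    }
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

With Lunr itself, the equivalent steps are building the index with the `lunr()` builder (declaring a `ref` and one or more `field`s, then `add`ing documents) and querying it with `idx.search('...')`, which returns scored results entirely in the browser.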

A sum-up of Lunr.js search features

  • instant search results with suggestions
  • client-side search without overloading Drupal
  • keyword-based search
  • partial and fuzzy search options
  • a scoring system for showing relevant results
  • paging
  • location history
  • lazy-loading search results
  • and more

Modules for the integration between Lunr.js and Drupal

  • Lunr search

The Lunr search module integrates Lunr.js with Drupal. It uses Drupal Views to pre-build a search index and the search result pages. These are delivered to the client using JavaScript. Its features include no-configuration multilingual support, custom field/facet searches, and more.

Lunr search Drupal module
  • Search API Lunr

The Search API Lunr module provides a Search API backend for Lunr search with configurable fields. Instead of content being added to a search backend, JSON files are loaded directly into the browser.

Add a search feature to your website with our experts!

Ready to give your website a new boost through adding a search functionality? Our website support agency experts are ready to do it at affordable prices.

We have both developers who specialize in Drupal and those who master WordPress. JavaScript frameworks are another passion of our team, because we know they can enrich your search box, like any other website elements, with special speed and interactivity.

Contact us and let’s discuss your best price!

May 12 2020
May 12

Many people researching Drupal Commerce 2.x for Drupal 8 (or the upcoming Drupal 9) are likely wanting to either remove the extra ecommerce shopping carts or allow checkout for multiple carts. This blog post will explain why we have multiple carts—and why being able to checkout with multiple carts is challenging, but possible.

Why you can have more than one Drupal Commerce cart

First, let’s demonstrate behavior that Commerce 2.x exhibits out of the box for a single user, behavior that is often mistaken for a bug.

  1. Go to Acro Media’s demo store.
  2. Start out as anonymous and register as a user.
    1. Register here.
    2. Check your email/spam and click a link.
    3. Set your password because you’ll need to log back in shortly.
      Note: Acro doesn’t use your email address used on the account sign up on this site to contact you for marketing purposes. You can opt into marketing materials by clicking the large red help question mark on the right.
  3. Once registered, add something to your cart, and log out.
  4. Add something to your cart and log in.
  5. Go to /cart.


If you are seeing two carts, then you have discovered, like many others, that Drupal Commerce 2.x shows multiple carts by design. Drupal Commerce 1.x created multiple carts like this as well, but would only show one cart at a time. In 1.x, you could follow the five steps outlined above, then checkout and your original cart would display.

Why? Because the system will not delete carts. We’re using a simple anonymous session to create two carts in a potentially common edge case.

The pros and cons of multiple carts

Pros

  • Customers never lose a cart, even if their use of the site means they have more than one.
  • You could have multiple sellers, enabling a marketplace feature to be built on top of the existing functionality.
  • You could enable different checkout workflows (one for digital services, one for recurring services, and another for physical items that require shipping).

Cons

  • You could end up with a confusing user experience by making your customers check out multiple times.
  • Payment and fulfillment must be handled separately for different items or different vendors.
  • More than one cart presents a significant visual challenge for designers. In the cart dropdown, for example, how do you show more than one cart? On the cart page, for another example, how do you handle more than one checkout button?

Turning off multiple carts in Drupal Commerce 2.x

There are two relatively simple Drupal modules you can use to show a single cart to a user:

Commerce Combine Carts—If this module is turned on, the multi-cart demo above would not produce two carts.

Commerce Cart Advanced—This module packs a lot of features into it for the crowd of users who want management tools around their multiple cart experience, but it also includes the feature to display only one cart at a time. It was created and is maintained by Acro Media’s senior developer known as krystalcode (Dimitris Bozelos).

Checking out multiple carts, Etsy/Amazon style

The holy grail of marketplace commerce is multi-store and single-checkout. The idea is that you could have a site that features multiple stores and customers could check out once from more than one store. 

According to the original author and former maintainer of Drupal Commerce 2.x, bojanz, you can do this by coding a form that acts like a checkout flow-form, but changes more than one order simultaneously.

However, you also have to consider a number of other issues: 

  • Fulfillment—If the stores are selling physical products, how will these orders appear to the customer and to customer service for each store? Likely, each store would want to only see the products for which they are responsible.
  • Order management—Even Amazon does some weird things with orders for its customers. Often, orders are split up for seemingly no reason, changing order totals and making order management challenging for customers and for customer service.
  • Payment—If you, as the site owner, plan to pay stores from your own bank account, you’ll want to set up a single, site-wide payment gateway and manage disbursement payments to your store owners. If not, then you’ll require each store to have its own payment gateway credentials or some other even more complex setup.
  • Taxes—Assuming you have good solutions for all of the above, taxes will still likely make it very hard to move forward. Tax law is hard in the best of times, and depending on how you take payment, tax rules would need to be created and maintained per store. Solutions like Avalara AvaTax only work per store and can be overly expensive for small retailers.

The bottom line

Basically, you have a few contrib options if you want to manage carts for your customers. But if you want that elusive multi-vendor, single checkout, you’ll have to plan carefully according to your business needs. Regardless, Drupal’s cart functionality is flexible enough to build the best ecommerce shopping carts out there; you just need to know how to do it.

May 12 2020
May 12

Drupal 9 is just around the corner, and the beta release is already out for testing! What’s your Drupal 9 plan? Are you still in the “Why should I upgrade to Drupal 9 at all?” phase? Or are you wondering what your next steps should be for Drupal 9 readiness? We have answers to all your questions, including Drupal 9 features and a quick checklist on how to prepare for Drupal 9.

The best way to prepare yourself for tomorrow is to give your best today. And very apparently, the Drupal community has done just that. I know, migrating to Drupal 8 from previous versions was hard. It meant a complete rebuild of the code and a lot of learning. But once you are fully onboard Drupal 8, life gets easier. Think of it as a hard climb for a gorgeous view from the mountaintop. Truly worth all the effort, isn’t it? So, if you’re still on Drupal 7 (or 6), this is a good time to migrate to Drupal 8 and then simply upgrade to Drupal 9 from there. As you read this, modules are being ported (and made compatible) to Drupal 9 as a collective effort by the amazing Drupal community.


The Much-Talked-About Drupal 9 Release Date

One of the most frequently asked Drupal questions lately has been about the Drupal 9 release date. So, when is the most-awaited Drupal 9 release date? Drupal 9 is currently scheduled to release on June 3rd, 2020. The Drupal community has been successfully releasing minor versions every six months since the adoption of semantic versioning in Drupal 8. Every minor version came with several valuable new features. 

Drupal 8 depends extensively on third-party libraries like Symfony, Twig, Guzzle, and CKEditor, and must keep pace with their updates as well. For example, Symfony 3 (Drupal 8’s biggest dependency) will reach EOL (end of life) by November 2021, so Drupal 8 reaches end of life then as well. Drupal 9 will be released with support for the latest Symfony 4.4 and will not be backwards compatible with Symfony 3.

The Drupal 9 Readiness Roadmap (Image Credits: Drupal.org)

Drupal 9 Features - What’s New in Drupal 9?

Here are some key features of Drupal 9, which we will discuss in detail below:

  • Easy to upgrade to Drupal 9 (if already on Drupal 8)
  • You get the latest of the best! (Symfony, Composer, Twig, PHP, CKEditor and more)
  • Will receive community support and security fixes after November 2021 (after Drupal 8 EOL)
  • Subsequent Drupal 9 minor versions will be backwards compatible with Drupal 9
  • Drupal 9.1 will offer an all-new, modern default theme called Olivero
  • Many exciting features and enhancements to look forward to, starting with Drupal 9.1!

Drupal 9 is already being built within Drupal 8. Drupal 8.9 will be released along with Drupal 9.0 in June 2020. This is because Drupal 9 will be the same as Drupal 8.9, except that it will be a cleaned-up version updated to support its third-party dependencies. Hence one of the greatest Drupal 9 features: it is easy to upgrade to!

Every new minor version of Drupal 8 brought many new features, but it also contained a lot of old code kept around for backwards compatibility. This “old code” is also known as “deprecated code”. Because of the dependencies on third parties like Symfony, Twig, etc., Drupal 9 will incorporate updates to these dependencies. Drupal contributors and module developers are collectively making the road to Drupal 9 easier by eliminating “bad smelling code” (as Jacob Rockowitz calls it in his blog about deprecating code for his Webform module) from various Drupal 8 modules.

       Drupal 9.0 = Drupal 8.9 – Deprecated code + Upgraded dependencies

What’s new in Drupal 9.0 (Image Credits: Drupal.org)

The Drupal 9 Readiness Checklist

Regardless of whether you’re upgrading from Drupal 7 to Drupal 9 or from Drupal 8 to Drupal 9, you will need to start planning for Drupal 9. The scheduled release is fast approaching, and now is a good time to get prepared.

  • Drupal 7 to Drupal 9 Migration

If you are still on Drupal 7 and looking to get on board with Drupal 9, it is not too late. Ideally, split the process into two parts: 1. migrate to Drupal 8, and 2. upgrade to Drupal 9.

  • Migrate content and code to Drupal 8
  • Check for availability of modules in Drupal 8 using the Upgrade Status Drupal module
  • Upgrade your Drupal 7 modules to Drupal 8 with the help of modules such as Drupal Module Upgrader
  • Stay updated with the latest core releases.
  • Remove any deprecated code
  • Upgrade to Drupal 9

And as already discussed, upgrading from Drupal 8’s latest version to Drupal 9 is easy as pie. A direct Drupal 7 to Drupal 9 migration will take about as much resource time (and budget) as a Drupal 7 to Drupal 8 to Drupal 9 upgrade. Drupal 7 reaches end of life in November 2021 and will continue to receive community support until then. For detailed instructions on how to install Drupal 9, check this article.

  • Stay Up to Date with Drupal 8

Every new minor release of Drupal 8 brings more than access to new features and enhancements; it also takes you one step closer to Drupal 9. Drupal 8.8 is the last minor release before Drupal 8.9 (which arrives at the same time as Drupal 9!), making it the last Drupal 8 version to contain significant feature additions. Drupal 8.9 releases on June 3rd, 2020 and will include more stable modules (previously experimental) and a few UX and API enhancements. So, the best thing to do now is to keep Drupal core updated and move your website to Drupal 8.8.

  • Weed Out the Deprecated Code

Make way for new and improved Drupal 9 features by removing old, deprecated code from your Drupal 8 codebase. Keeping Drupal core and contributed modules up to date also means embracing cleaner code, since updated versions remove usage of deprecated code and APIs. There are various ways to check for deprecated code:

  • Sometimes functions are marked with @deprecated annotations that warn the developer that the code is deprecated and indicate what to use instead.
  • Use a command-line tool like Drupal Check (by Matt Glaman from Centarro) to check for deprecated code and other bugs. It can also be integrated into continuous integration environments.
  • Install the Upgrade Status module (built on top of Drupal Check) for a more UI-based solution. It can scan your entire Drupal project and generate a clean visual report that highlights compatibility issues, modules that need upgrading, and overall Drupal 9 readiness.
  • Drupal.org’s testing system also supports checking for Drupal 9 readiness and deprecations, for example by enabling static analysis with PHPStan or by raising a trigger_error() when deprecated code is reached.

    Once identified, it is time for some manual work to remove the deprecated code and refine the existing codebase. Automated tools like Drupal Rector can resolve many code issues, although some manual intervention is still needed.

Drupal 9 Checklist
May 12 2020
May 12

Load testing is one of the tools we leverage regularly at Tag1. It can help prevent website outages, stress test code changes, and identify bottlenecks. The ability to run the same test repeatedly gives critical insight into the impact of changes to the code and/or systems. Often -- as part of our engagements with clients -- we will write a load test that can be leveraged and re-used by the client into the future.

In some cases, our clients have extensive infrastructures and multi-layered caches, including CDNs, that also need to be load tested. In these instances, it can take a considerable amount of computing power to generate sufficient load to apply stress and identify bottlenecks. This ultimately led us to write and open source Goose, a new and powerful load testing tool.

Discovering Locust was a breath of fresh air, solving so many of the frustrations we used to have when load testing with jMeter. Instead of working with a clunky UI to build sprawling, bloated JMX configuration files, Locust allows the writing of truly flexible test plans in pure Python. This allows code to easily be re-used between projects, and swarms of distributed Locusts can easily be spun-up to apply distributed load during testing. Locust added considerable power and flexibility to our load testing capabilities, and made the entire process more enjoyable.

Though Python is a great language that allows for quickly writing code, it's not without flaws. Locust uses resources more efficiently than jMeter, but the Python GIL, or Global Interpreter Lock, locks Python to a single CPU core. Fortunately, you can work around this limitation by starting a "slave process" for each core, and then performing the load test with a "master process" all running on the same server. Locust is therefore able to work around some of Python's limitations thanks to its excellent support for distributed load testing.

Recently we've been hearing a lot about the Rust language, and were curious to see if it could improve some of our standard toolset. The language has a steep learning curve primarily due to its concept of ownership, an ingenious solution to memory management that avoids the need for garbage collection. The language focuses on correctness, trading slower compilation times for extremely performant and reliable binaries. And there's a lot of (well earned) hype about how easy it is to write safe multithreaded code in Rust. It seemed like an excellent way to increase our ability to load test large client websites with fewer load testing server resources.

The Rust ecosystem is still fairly young and evolving, but there are already fantastic libraries providing much flexibility when load testing. The compiler ensures that you're writing safe code, and the resulting binaries tend to be really fast without extra programming effort. For these reasons, it was looking like Rust would be an excellent language to use for load testing.

Indeed, once we had an early prototype of Goose, we were able to run some comparisons, and have seen amazing performance improvements compared to similar load tests run with Locust. With the same test plan, Goose is consistently able to generate over five times as much traffic as Locust using the same CPU resources on the test server. As you add more CPU cores to the testing infrastructure, Goose's multithreaded Rust implementation seamlessly takes advantage of the added resources without additional configuration.

When writing Goose, we were primarily interested in preserving specific functionality from Locust that we use regularly. We first identified the run-time options we depend on the most, and then used the Rust structopt library to add them as command line options to the as-of-yet then non-functional Goose. We then worked option by option, studying how they are implemented in Locust and reimplementing them in Rust. The end result can be seen by passing the -h flag to one of the included examples.

CLI Options

The easiest way to develop Rust libraries and applications is with Cargo, the Rust package manager. Goose includes some example load tests to demonstrate how to write them, each of which can be run with Cargo. To compile and run the included simple example and pass the resulting application the -h flag, you can type:

$ cargo run --example simple --release -- -h
    Finished release [optimized] target(s) in 0.06s
     Running `target/release/examples/simple -h`
client 0.5.8
CLI options available when launching a Goose loadtest, provided by StructOpt

    simple [FLAGS] [OPTIONS]

    -h, --help            Prints help information
    -l, --list            Shows list of all possible Goose tasks and exits
    -g, --log-level       Log level (-g, -gg, -ggg, etc.)
        --only-summary    Only prints summary stats
        --print-stats     Prints stats in the console
        --reset-stats     Resets statistics once hatching has been completed
        --status-codes    Includes status code counts in console stats
    -V, --version         Prints version information
    -v, --verbose         Debug level (-v, -vv, -vvv, etc.)

    -c, --clients         Number of concurrent Goose users (defaults to available CPUs)
    -r, --hatch-rate      How many users to spawn per second [default: 1]
    -H, --host            Host to load test in the following format: [default: ]
        --log-file        [default: goose.log]
    -t, --run-time        Stop after the specified amount of time, e.g. (300s, 20m, 3h, 1h30m, etc.) [default: ]


Goose displays the same statistics as Locust, though we chose to split the data into multiple tables in order to make the tool more useful from the command line. The following statistics were displayed after running a fifteen-minute load test using the included drupal_loadtest example with the following options (which should look familiar to anyone who has experience running Locust from the command line):

    cargo run --release --example drupal_loadtest -- --host=http://apache.fosciana -c 100 -r 10 -t 15m --print-stats --only-summary -v

The load test ran for fifteen minutes, then automatically exited after displaying the following statistics:

 Name                    | # reqs         | # fails        | req/s  | fail/s
 GET (Auth) comment form | 13,192         | 0 (0%)         | 14     | 0    
 GET (Auth) node page    | 43,948         | 0 (0%)         | 48     | 0    
 GET (Auth) login        | 20             | 0 (0%)         | 0      | 0    
 GET (Anon) user page    | 268,256        | 0 (0%)         | 298    | 0    
 GET static asset        | 8,443,480      | 0 (0%)         | 9,381  | 0    
 GET (Auth) user page    | 13,185         | 0 (0%)         | 14     | 0    
 GET (Anon) node page    | 894,176        | 0 (0%)         | 993    | 0    
 POST (Auth) login       | 20             | 0 (0%)         | 0      | 0    
 GET (Auth) front page   | 65,936         | 1 (0.0%)       | 73     | 0    
 POST (Auth) comment f.. | 13,192         | 0 (0%)         | 14     | 0    
 GET (Anon) front page   | 1,341,311      | 0 (0%)         | 1,490  | 0    
 Aggregated              | 11,096,716     | 1 (0.0%)       | 12,329 | 0    
 Name                    | Avg (ms)   | Min        | Max        | Median    
 GET (Auth) comment form | 108        | 16         | 6271       | 100       
 GET (Auth) node page    | 109        | 14         | 6339       | 100       
 GET (Auth) login        | 23147      | 18388      | 27907      | 23000     
 GET (Anon) user page    | 13         | 1          | 6220       | 4         
 GET static asset        | 4          | 1          | 6127       | 3         
 GET (Auth) user page    | 57         | 8          | 6205       | 50        
 GET (Anon) node page    | 13         | 1          | 26478      | 4         
 POST (Auth) login       | 181        | 98         | 234        | 200       
 GET (Auth) front page   | 83         | 16         | 6262       | 70        
 POST (Auth) comment f.. | 144        | 25         | 6294       | 100       
 GET (Anon) front page   | 5          | 1          | 10031      | 3         
 Aggregated              | 6          | 1          | 27907      | 3         
 Slowest page load within specified percentile of requests (in ms):
 Name                    | 50%    | 75%    | 98%    | 99%    | 99.9%  | 99.99%
 GET (Auth) comment form | 100    | 100    | 200    | 300    | 1000   |   1000
 GET (Auth) node page    | 100    | 100    | 200    | 300    | 1000   |   1000
 GET (Auth) login        | 23000  | 25000  | 27907  | 27907  | 27907  |  27907
 GET (Anon) user page    | 4      | 8      | 90     | 100    | 200    |    200
 GET static asset        | 3      | 6      | 10     | 10     | 30     |     30
 GET (Auth) user page    | 50     | 60     | 100    | 100    | 2000   |   2000
 GET (Anon) node page    | 4      | 7      | 200    | 200    | 300    |    300
 POST (Auth) login       | 200    | 200    | 200    | 200    | 200    |    200
 GET (Auth) front page   | 70     | 100    | 200    | 200    | 1000   |   1000
 POST (Auth) comment f.. | 100    | 200    | 300    | 300    | 400    |    400
 GET (Anon) front page   | 3      | 6      | 10     | 10     | 30     |     30
 Aggregated              | 3      | 6      | 40     | 90     | 200    |   4000

Reviewing the above statistics, you can see there was a single error during the load test. Looking in the Apache access_log, we find that it was a 500 error returned by the server when loading the front page as a logged-in user:

     - - [07/May/2020:01:26:34 -0400] "GET / HTTP/1.1" 500 4329 "-" "goose/0.5.8"

Goose introduces per-status-code counts, something not available in Locust. Enable this by specifying the --status-codes flag when running a load test; it provides more insight into what sorts of errors or other response codes the web server returned during the load test. During one round of testing, Goose generated the following warning:

    06:41:12 [ WARN] "/node/1687": error sending request for url (http://apache.fosciana/node/1687): error trying to connect: dns error: failed to lookup address information: Name or service not known

In this particular case, no request was made as a DNS lookup failed, and so there was no status code returned by the server. Goose assigns client failures such as the above a status code of 0, which shows up in the status code table as follows:

 Name                    | Status codes            
 GET static asset        | 125,282 [200]          
 GET (Auth) comment form | 1,369 [200]            
 GET (Anon) user page    | 11,139 [200]           
 GET (Anon) front page   | 55,787 [200]           
 POST (Auth) login       | 48 [200]               
 GET (Auth) node page    | 4,563 [200]            
 GET (Auth) front page   | 6,854 [200]            
 GET (Anon) node page    | 37,091 [200], 1 [0]    
 GET (Auth) login        | 48 [200]               
 POST (Auth) comment f.. | 1,369 [200]            
 GET (Auth) user page    | 1,364 [200]            
 Aggregated              | 244,914 [200], 1 [0]    

As with all other statistics tables, Goose breaks things out per request, as well as giving an aggregated summary of all requests added together.

Weighting Tasks

Load tests are collections of one or more task sets, each containing one or more tasks. Each "client" runs in its own thread and is assigned a task set, repeatedly running all contained tasks. You can better simulate real users or your desired load patterns through weighting, causing individual tasks to run more or less frequently, and individual task sets to be assigned to more or fewer client threads.

When using Locust, we’ve frequently found its heuristic style of assigning weights frustrating, as large weights mixed with small weights within a task set can lead to individual tasks never running. Goose is intentionally very precise when applying weights. If a task set has two tasks -- for example, task "a" with a weight of 1 and task "b" with a weight of 99 -- it will consistently run task "a" one time and task "b" ninety-nine times each and every time it loops through the task set. The order of tasks, however, is randomly shuffled each time the client thread loops through the task set.
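
The precise weighting described above can be modeled in a few lines of plain Rust. This is an illustrative sketch with a hypothetical helper, not Goose's actual internals: each task is expanded into the per-loop run list exactly `weight` times, so even a 1:99 mix runs the rare task once every loop.

```rust
// Illustrative model of precise task weighting (not the real Goose code):
// each task appears in the run list exactly `weight` times per loop.
fn weighted_run_list(tasks: &[(&'static str, usize)]) -> Vec<&'static str> {
    let mut run_list = Vec::new();
    for (name, weight) in tasks {
        for _ in 0..*weight {
            run_list.push(*name);
        }
    }
    // Goose then shuffles this list each time the client loops through it;
    // a real implementation would randomize the order here.
    run_list
}

fn main() {
    let run_list = weighted_run_list(&[("a", 1), ("b", 99)]);
    assert_eq!(run_list.len(), 100);
    assert_eq!(run_list.iter().filter(|&&t| t == "a").count(), 1);
    assert_eq!(run_list.iter().filter(|&&t| t == "b").count(), 99);
    println!("task \"a\" runs exactly once per loop through the task set");
}
```

Unlike a probabilistic draw, this expansion guarantees every task runs in every loop, which is the behavior the heuristic Locust approach can't promise.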

Sequencing Tasks

A client is assigned one task set, and by default will run all contained tasks in a random order, shuffling the order each time it completes the running of all tasks. In some cases, it can be desirable to better control the order client threads run tasks. Goose allows you to optionally assign a sequence (any integer value) to one or more tasks in a task set, controlling the order in which client threads run the tasks. Tasks can be both weighted and sequenced at the same time, and any tasks with the same sequence value will be run in a random order, before any tasks with a higher sequence value. If a task set mixes sequenced tasks and unsequenced tasks, the sequenced tasks will always all run before the unsequenced tasks.
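
As a rough sketch (using a hypothetical data model, not the real Goose API), the sequencing rules above amount to a stable sort in which unsequenced tasks always order after every sequenced task:

```rust
// Illustrative model of Goose task sequencing: sequenced tasks run in
// ascending sequence order; unsequenced tasks (None) always run last.
#[derive(Debug, Clone)]
struct Task {
    name: &'static str,
    sequence: Option<u32>,
}

fn order_tasks(mut tasks: Vec<Task>) -> Vec<&'static str> {
    // None must sort after Some(n) for any n; a stable sort preserves the
    // (shuffled) relative order of tasks sharing the same sequence value.
    tasks.sort_by_key(|t| match t.sequence {
        Some(n) => (0u8, n),
        None => (1u8, 0),
    });
    tasks.iter().map(|t| t.name).collect()
}

fn main() {
    let tasks = vec![
        Task { name: "browse", sequence: None },
        Task { name: "load", sequence: Some(2) },
        Task { name: "login", sequence: Some(1) },
    ];
    assert_eq!(order_tasks(tasks), vec!["login", "load", "browse"]);
    println!("sequenced tasks run first, in ascending order");
}
```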

On Start

Tasks can also be flagged to only run when a client thread first starts. For example, if a task set is intended to simulate a logged in user, you likely want the user to log in only one time when the client thread first starts. For maximum flexibility, these tasks can also be sequenced and weighted if you want the tasks to run more than once, or multiple tasks to run in a specific order only when the client first starts.

On Stop

Similarly, Goose also allows tasks to be flagged to only run when a client thread is stopping. For example, you can have a client thread simulate logging out at the end of the load test. Goose client threads will only execute these tasks when a load test reaches the configured run time, or is canceled with control-c. As expected, these tasks can also be sequenced and weighted. You can also flag any task to run both at start time and at stop time.
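
Conceptually, the on-start and on-stop flags partition a task set by phase. Here's an illustrative model (not the actual Goose API) of how a client thread might select which tasks to run at startup, in the main loop, and at shutdown:

```rust
// Illustrative sketch of phase-based task selection: flagged tasks run at
// startup and/or shutdown, unflagged tasks run in the main loop.
#[derive(Debug)]
struct Task {
    name: &'static str,
    on_start: bool,
    on_stop: bool,
}

fn tasks_for_phase<'a>(tasks: &'a [Task], phase: &str) -> Vec<&'a str> {
    tasks
        .iter()
        .filter(|t| match phase {
            "start" => t.on_start,
            "stop" => t.on_stop,
            // The main loop runs every task not reserved for start/stop.
            _ => !(t.on_start || t.on_stop),
        })
        .map(|t| t.name)
        .collect()
}

fn main() {
    let tasks = vec![
        Task { name: "login", on_start: true, on_stop: false },
        Task { name: "browse", on_start: false, on_stop: false },
        Task { name: "logout", on_start: false, on_stop: true },
    ];
    assert_eq!(tasks_for_phase(&tasks, "start"), vec!["login"]);
    assert_eq!(tasks_for_phase(&tasks, "loop"), vec!["browse"]);
    assert_eq!(tasks_for_phase(&tasks, "stop"), vec!["logout"]);
    println!("each lifecycle phase gets its own task list");
}
```

A task flagged for both phases would simply appear in both the start and stop lists, matching the "run at start time and at stop time" behavior described above.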

Wait Time

If no wait time is assigned to a task set, any client threads running that set will execute tasks one after the other as rapidly as they can. This can generate large amounts of load, but it can also result in generating unrealistic loads, or it can bottleneck the load testing server itself. Typically you'd specify a wait time, which tells Goose client threads how long to pause after executing each task. Wait time is declared with a low-high integer tuple, and the actual time paused after each task is a randomly selected value from this range.
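
A minimal sketch of this behavior, assuming a simple (low, high) tuple in seconds and substituting a toy pseudo-random generator for a real RNG:

```rust
use std::time::Duration;

// Illustrative sketch of a Goose-style wait time: pick a pause of
// pseudo-random length from the configured low..=high range after each
// task. The tiny LCG below is a stand-in for a real RNG, for demo only.
fn wait_time(low: u64, high: u64, seed: &mut u64) -> Duration {
    *seed = seed
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    Duration::from_secs(low + *seed % (high - low + 1))
}

fn main() {
    let mut seed = 42;
    for _ in 0..10 {
        let pause = wait_time(5, 15, &mut seed);
        // Every generated pause falls within the configured range.
        assert!(pause >= Duration::from_secs(5));
        assert!(pause <= Duration::from_secs(15));
    }
    println!("all pauses within the configured 5..=15 second range");
}
```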

Clients

Rust has no global lock and thus is able to make far better use of available CPU cores than Python. By default Goose will spin up 1 client per core, each running in its own thread. You can use the --clients option to control how many total clients are launched, and the --hatch-rate option to control how quickly they are launched by specifying how many to launch per second. When you build more complex test plans and start launching thousands of concurrent clients, you’ll likely need to increase kernel level limits on the maximum number of open files. You'll also need to add some delays to the task set, by specifying a wait time as described above.

Run Time

If you don't specify a run time, Goose will generate load until you manually stop it. If you've enabled the display of statistics, they will be displayed as soon as you cancel the load test with control-c.

Naming Tasks

When using Goose's built-in statistics, by default each request is recorded and identified by the URL requested. As load tests get more complex, this can result in less useful statistics. For example, when load testing the Drupal Memcache module, one of our tasks loads a random node, which can generate up to 10,000 unique URLs. In this case, the Drupal-powered website follows the same code path to serve up any node, so we prefer that the statistics for loading nodes all be grouped together, instead of being broken out per node ID.

This can be achieved by applying custom names at the task level, which causes all requests made within that task to be grouped together when displaying statistics. Names can also be specified at the request level, giving total flexibility over how statistics are grouped and identified. Naming tasks and requests is only relevant when displaying statistics.
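
The grouping described above can be sketched as follows (a hypothetical data model, not Goose's real statistics code): a request tallies under its task name when one is set, and under its raw URL otherwise.

```rust
use std::collections::HashMap;

// Illustrative sketch of name-based statistics grouping: requests recorded
// with an explicit task name are tallied under that name rather than the
// raw URL, so /node/1 .. /node/10000 collapse into a single row.
fn group_requests(requests: &[(&str, Option<&str>)]) -> HashMap<String, usize> {
    let mut counts = HashMap::new();
    for (url, name) in requests {
        let key = name.unwrap_or(url).to_string();
        *counts.entry(key).or_insert(0) += 1;
    }
    counts
}

fn main() {
    let requests = [
        ("/node/1", Some("(Anon) node page")),
        ("/node/4723", Some("(Anon) node page")),
        ("/user/3", None),
    ];
    let counts = group_requests(&requests);
    assert_eq!(counts["(Anon) node page"], 2);
    assert_eq!(counts["/user/3"], 1);
    println!("thousands of node URLs would become one statistics row");
}
```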

Making Requests

The primary use-case of Goose is generating HTTP(S) requests. Each client thread initializes a Reqwest blocking client when it starts, and then this client is used for all subsequent requests made by that individual thread. And no, that's not a typo, the Rust library we're using is spelled "Reqwest". The Reqwest client automatically stores cookies, handles headers, and much more, simplifying the task of writing load test tasks. All available Reqwest functions can be called directly, but it's important to use the provided Goose helpers if you want accurate statistics, and if you want to be able to easily change the host the load test is applied against with a run-time flag.

Goose provides simple GET, POST, HEAD and DELETE wrappers for the most common request types. There are also two-part helpers giving raw access to the underlying Reqwest objects for more complex GET, POST, HEAD, DELETE, PUT and PATCH requests.

By default, Goose will check the status code returned by the server, identifying 2xx codes as successes, and non-2xx codes as failures. It allows you to override this within your task if necessary, for example if you want to write a task that tests 404 pages and therefore considers a 404 status code as a success, and anything else including 2xx status codes as a failure. It can also be useful to review response bodies or headers and verify expected text or tags are where you expect them, flagging the response as a failure if not.
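
As a sketch of this logic (the closure-based validator here is an assumption for illustration, not the exact Goose API), the default 2xx check and a per-task override might look like:

```rust
// Illustrative sketch of success classification: 2xx is a success unless
// the task supplies its own rule, e.g. a task that deliberately load
// tests 404 pages and treats only 404 as a success.
fn is_success(status: u16, validator: Option<&dyn Fn(u16) -> bool>) -> bool {
    match validator {
        Some(check) => check(status),
        None => (200..300).contains(&status),
    }
}

fn main() {
    // Default behavior: 2xx succeeds, anything else fails.
    assert!(is_success(200, None));
    assert!(!is_success(404, None));
    // A 404-testing task inverts the rule: only 404 counts as success.
    let expect_404 = |status: u16| status == 404;
    assert!(is_success(404, Some(&expect_404)));
    assert!(!is_success(200, Some(&expect_404)));
    println!("status checks validated");
}
```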

Our first proof of concept for Goose was to load test a new version of the Drupal Memcache module. Years ago we started load testing each release with jMeter, an effective way to validate changes in the low-level code that's trusted to help the performance of tens of thousands of Drupal websites. A few years ago these tests were rewritten in Python, as Locust had become our favored load testing tool at Tag1. Thus, rewriting the tests again in Rust for Goose seemed like an excellent place to start testing Goose, and offered a chance to make some early comparisons between the tools.

8-core Test System Running Against a 16-core Web Server

All of our current testing is being done on a single system with a 32-core AMD Threadripper, managed with Proxmox. We set up two VMs running Debian 10 during initial development, with a 16-core VM running Apache, PHP, and MySQL, and an 8-core VM running Goose. All server processes are restarted between tests, and the database is reloaded from a backup.

Once Goose supported all the functionality required by the Drupal Memcache load test, it was a good time to run some comparisons to better understand whether we were indeed benefiting from using Rust. To begin, we simply used the existing load testing VMs already set up for development. Of course, you generally wouldn't have (or want to need) so many cores dedicated to the load testing tool compared to the web server.

Goose

Our old test plans "simulated" 100 users pounding the web pages as fast as possible (without any wait time), so we started with the same configuration for Goose. This is not a very realistic test, as real users would generally pause on each page, but we wanted to change as few variables as possible when getting started. The primary intent of this load test is, after all, to put some stress on the Drupal Memcache module's code.

We launched the first Goose load test as follows:

    cargo run --release --example drupal_loadtest -- --host=http://apache.fosciana -c 100 -r 10 -t 1h --print-stats --only-summary -v

This initially didn't put much strain on the load testing VM, taking only about 40% of the available CPU resources. That was surprising: Goose creates a new thread for each client, and Rust has no global lock, so it should have been able to use all 8 cores available to it, yet clearly wasn't using them:

8 core Goose CPU

During this test, it generated nearly 60 Mbit/second of traffic for the duration of the test:

8 core Goose traffic

Further analysis revealed that the shared web and database server was the bottleneck. Specifically, several of the Goose task sets were logging in and posting comments so quickly that Drupal's caches were flushing 10 times a second, causing the MySQL database to bottleneck and slow everything down. This resulted in all the clients being blocked, waiting for the web page to return results.

The following are the Goose statistics output after one such load test run:

 Name                    | # reqs         | # fails        | req/s  | fail/s
 GET (Anon) user page    | 475,275        | 0 (0%)         | 132    | 0    
 POST (Auth) login       | 20             | 0 (0%)         | 0      | 0    
 GET (Auth) comment form | 37,295         | 0 (0%)         | 10     | 0    
 GET (Anon) front page   | 2,376,330      | 1 (0.0%)       | 660    | 0    
 GET (Auth) node page    | 124,327        | 0 (0%)         | 34     | 0    
 GET (Auth) login        | 20             | 0 (0%)         | 0      | 0    
 GET static asset        | 5,125,594      | 0 (0%)         | 1,423  | 0    
 GET (Auth) user page    | 37,293         | 0 (0%)         | 10     | 0    
 GET (Auth) front page   | 186,468        | 0 (0%)         | 51     | 0    
 POST (Auth) comment f.. | 37,295         | 0 (0%)         | 10     | 0    
 GET (Anon) node page    | 1,584,250      | 0 (0%)         | 440    | 0    
 Aggregated              | 9,984,167      | 1 (0.0%)       | 2,773  | 0    
 Name                    | Avg (ms)   | Min        | Max        | Median    
 GET (Anon) user page    | 75         | 1          | 6129       | 80        
 POST (Auth) login       | 149        | 68         | 309        | 100       
 GET (Auth) comment form | 189        | 23         | 6204       | 200       
 GET (Anon) front page   | 11         | 1          | 6043       | 7         
 GET (Auth) node page    | 190        | 17         | 6267       | 200       
 GET (Auth) login        | 49         | 3          | 140        | 40        
 GET static asset        | 2          | 1          | 100        | 1         
 GET (Auth) user page    | 93         | 7          | 6082       | 80        
 GET (Auth) front page   | 121        | 14         | 6115       | 100       
 POST (Auth) comment f.. | 281        | 40         | 1987       | 300       
 GET (Anon) node page    | 127        | 1          | 6280       | 100       
 Aggregated              | 34         | 1          | 6280       | 4         
 Slowest page load within specified percentile of requests (in ms):
 Name                    | 50%    | 75%    | 98%    | 99%    | 99.9%  | 99.99%
 GET (Anon) user page    | 80     | 100    | 200    | 200    | 300    |    300
 POST (Auth) login       | 100    | 200    | 300    | 300    | 300    |    300
 GET (Auth) comment form | 200    | 200    | 300    | 400    | 900    |    900
 GET (Anon) front page   | 7      | 10     | 30     | 40     | 100    |    100
 GET (Auth) node page    | 200    | 200    | 300    | 400    | 6000   |   6000
 GET (Auth) login        | 40     | 80     | 100    | 100    | 100    |    100
 GET static asset        | 1      | 2      | 10     | 20     | 30     |     30
 GET (Auth) user page    | 80     | 100    | 200    | 200    | 5000   |   5000
 GET (Auth) front page   | 100    | 100    | 200    | 200    | 6000   |   6000
 POST (Auth) comment f.. | 300    | 300    | 500    | 500    | 700    |    700
 GET (Anon) node page    | 100    | 200    | 400    | 400    | 500    |    500
 Aggregated              | 4      | 10     | 300    | 300    | 400    |   6000

The single failure above was a timeout, for which Goose displayed the following easy-to-understand error:

failed to parse front page: error decoding response body: operation timed out

Locust

We then configured Locust to generate the same load from the same 8-core VM. Python's Global Interpreter Lock quickly made an appearance, limiting how much traffic a single instance of Locust can generate.

The load test was launched with the following options:

    locust -f locust_testplan.py --host=http://apache.fosciana --no-web -c 100 -r 10 -t 1h --only-summary

Locust saturated a single core of the 8-core VM:

8 core Locust CPU

It also generated considerably less traffic, around 2.3 Mbit/second compared to the 58 Mbit/second generated by Goose:

8 core Locust network

The following are the complete Locust statistics output after one such load test run:

 Name                                                          # reqs      # fails     Avg     Min     Max  |  Median   req/s failures/s
 GET (Anonymous) /node/[nid]                                    74860     0(0.00%)     296      21   11697  |     270   20.74    0.00
 GET (Anonymous) /user/[uid]                                    22739     5(0.02%)     297      14    8352  |     270    6.30    0.00
 GET (Anonymous) Front page                                    112904     0(0.00%)     312       4   12564  |     290   31.28    0.00
 GET (Auth) /node/[nid]                                         17768     0(0.00%)     296      24   10540  |     270    4.92    0.00
 GET (Auth) /user/[uid]                                          5200     1(0.02%)     293      15    6120  |     270    1.44    0.00
 GET (Auth) Comment form                                         5306     0(0.00%)     293      18    2330  |     270    1.47    0.00
 GET (Auth) Front page                                          26405     0(0.00%)     289      20   10137  |     260    7.32    0.00
 POST (Auth) Logging in: /user                                     20     0(0.00%)     370     105     706  |     350    0.01    0.00
 GET (Auth) Login                                                  20     0(0.00%)    2600     909    5889  |    2200    0.01    0.00
 POST (Auth) Posting comment                                     5306     0(0.00%)     448      34    5147  |     440    1.47    0.00
 GET (Static File)                                             835603     0(0.00%)     293       4   11965  |     270  231.51    0.00
 Aggregated                                                   1106131     6(0.00%)     296       4   12564  |     270  306.46    0.00

Percentage of the requests completed within given times
 Type                 Name                                                           # reqs    50%    66%    75%    80%    90%    95%    98%    99%  99.9% 99.99%   100%
 GET                  (Anonymous) /node/[nid]                                         74860    270    330    370    390    440    480    530    570    810   5100  12000
 GET                  (Anonymous) /user/[uid]                                         22739    270    330    370    390    440    480    530    570    800   4800   8400
 GET                  (Anonymous) Front page                                         112904    290    350    390    410    450    490    530    560    790   5800  13000
 GET                  (Auth) /node/[nid]                                              17768    270    330    370    390    440    480    530    580    770   6600  11000
 GET                  (Auth) /user/[uid]                                               5200    270    330    370    390    440    480    540    580    840   6100   6100
 GET                  (Auth) Comment form                                              5306    270    330    370    390    440    480    530    580    750   2300   2300
 GET                  (Auth) Front page                                               26405    260    320    370    390    430    470    520    560    800   3600  10000
 POST                 (Auth) Logging in: /user                                           20    360    410    460    530    610    710    710    710    710    710    710
 GET                  (Auth) Login                                                       20   2400   3200   4000   4000   4400   5900   5900   5900   5900   5900   5900
 POST                 (Auth) Posting comment                                           5306    440    550    620    650    710    760    840    890   1200   5100   5100
 GET                  (Static File)                                                  835603    270    330    370    390    440    480    520    560    780   5200  12000
 None                 Aggregated                                                    1106131    270    330    380    400    440    480    530    570    800   5400  13000

Error report
 # occurrences      Error                                                                                               
 5                  GET (Anonymous) /user/[uid]: "HTTPError('404 Client Error: Not Found for url: (Anonymous) /user/[uid]')"
 1                  GET (Auth) /user/[uid]: "HTTPError('404 Client Error: Not Found for url: (Auth) /user/[uid]')"      

Distributed Locust

Fortunately, Locust has fantastic support for running distributed tests, and this functionality can also be utilized to generate more load from a multi-core server.

We first started the master Locust process as follows:

    locust -f locust_testplan.py --host=http://apache.fosciana --no-web -t 1h -c100 -r10 --only-summary --master --expect-slaves=8

We then launched eight more instances of Locust running in slave-mode, starting each one as follows:

    locust -f locust_testplan.py --host=http://apache.fosciana --no-web --only-summary --slave

The end result was eight individual Python instances all working in a coordinated fashion to generate load using all available CPU cores. The increased load is visible in the following CPU graph, where the load from a single Locust instance can be seen on the left, and the load from one master and eight slaves can be seen on the right:

8 core distributed Locust CPU

Perhaps more importantly, this resulted in considerably more network traffic, as desired:

8 core distributed Locust network

With both distributed Locust and standard Goose, we are using all available CPU cores, but our requests are being throttled by bottlenecks on the combined web and database server. In this distributed configuration, Locust was able to sustain a little over half as much network load as Goose.

1-core Testing System Running Against a 16-core Web Server with Varnish

From here we made a number of configuration changes, running new load tests after each change to profile the impact. Ultimately we ended up on a single-core VM for running the load tests, against a 16-core VM for running the LAMP stack. We also added Varnish to cache anonymous pages and static assets in memory, offloading these requests from the database.

We tuned the database for the most obvious gains, giving InnoDB more memory, relaxing full ACID durability to minimize flushing to disk, and turning off the query cache to avoid its global lock. We also configured Drupal to cache anonymous pages for a minimum of 1 minute. Our goal was to remove the server-side bottlenecks to better understand our load testing potential.

    innodb_buffer_pool_size = 1G
    innodb_flush_log_at_trx_commit = 0
    query_cache_size = 0

These combined changes removed the most extreme server-side bottlenecks.

Measuring Goose Performance on a Single Core VM

With the web server able to sustain more simulated traffic, we launched another Goose load test in order to see how much traffic Goose could generate from a single-CPU system. With a little trial and error, we determined that 12 clients loading pages as fast as they could produced the optimal load from a 1-core VM, initiating the test with the following options:

    cargo run --release --example drupal_loadtest --  --host=http://apache.fosciana -c 12 -r 2 -t1h --print-stats --only-summary -v

Goose was now bottlenecked only by running from a single CPU core, fairly consistently consuming 100% of its CPU resources:

1 core Goose CPU

And perhaps more importantly, Goose was able to generate 35Mbit/second of network traffic, all from a single process running on a single-core VM:

1 core Goose network

Using top to look at the server load, you can see that MySQL, Varnish, Memcached and Apache are all getting a healthy workout:

apache server top

And with varnishstat we can get some insight into where Varnish is spending its time. It's successfully serving most requests out of memory:

apache server varnishstat

Measuring Locust Performance on a Single Core VM

From the same single-core VM, we also ran the equivalent load test with Locust. We started it with similar command line options:

    locust -f locust_testplan.py --host=http://apache.fosciana --no-web -t 1h -c12 -r2 --only-summary

As seen below, Locust again pegs the single CPU at 100%. In fact, it's much more consistent about doing so than Goose, due to an apparent bug in Goose (note the dips and valleys on the left side of the chart below) that still needs to be profiled and fixed:

1 core Locust CPU

However, while Locust produces steady load, it's only generating about 3Mbit/second of traffic versus Goose's 35Mbit/second. Now that there's no server bottleneck, Goose's true potential and advantages are far more visible. The following graph shows network traffic generated by Goose on the left side of the graph, and Locust on the right side. In both instances they are utilizing 100% CPU on the load test VM:

1 core Locust network

Speeding Up Locust through Optimization

We've used Locust enough to know it can generate significantly more load than this. Through profiling, we identified that the bottleneck was the use of Beautiful Soup to extract links from the pages. Parsing the HTML is really expensive! To solve this, we replaced the Beautiful Soup logic with a simple regular expression.
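To illustrate the kind of change involved (the pattern below is illustrative, not the exact one from our test plan), extracting static asset URLs with a DOM parser can be replaced by a single regular expression applied to the raw HTML:

```python
import re

# Illustrative pattern: grabs src/href attribute values that point at
# local static asset paths, without building a full DOM tree.
STATIC_ASSET = re.compile(r'(?:src|href)="(/(?:sites|core|misc)/[^"]+)"')

def extract_static_assets(html):
    """Return the unique local asset paths found in a page of HTML."""
    return sorted(set(STATIC_ASSET.findall(html)))

html = (
    '<img src="/sites/default/files/logo.png"><a href="/node/1">x</a>'
    '<link href="/core/themes/bartik/css/style.css">'
)
```

Skipping DOM construction entirely is what recovers the CPU time: the regular expression makes one linear pass over the response body instead of building and traversing a parse tree.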

The load testing client continued to use 100% of the available CPU, but network traffic grew nearly four times, to 8 Mbit/second:

1 core Locust CPU

This was definitely a big step in the right direction! But the question remained, could we generate even more load from a single core?

Speeding Up Locust with FastHttpLocust

Locust includes an alternative HTTP client, called FastHttp, which the documentation suggests can increase Locust's performance. We updated our test plan, switching from HttpLocust to FastHttpLocust. The defined tasks are simple enough that no other changes were necessary.

We then launched the load test again with the same parameters, and saw another dramatic improvement. Locust was now able to generate nearly 20 Mbit/second of sustained traffic.

1 core Locust CPU

Further optimizations, such as completely replacing Beautiful Soup with regular expressions, didn't produce any additional measurable gains.

On the web server, we see that Locust is doing a decent job of simulating load, putting some visible stress on server processes:

FastHttp Locust network

However, recall from our earlier tests that Goose was able to generate over 35 Mbit/second. What's even more interesting is that Goose did this while leveraging heavy libraries to parse the HTML, extract links, and post comments. These libraries make writing load tests easier, but they lead to an obvious question: can we speed up Goose with the same optimizations we made to Locust?

Speeding Up Goose through Optimization

We did two rounds of optimizations on Goose. First, we replaced the Select library with regular expressions optimizing how we extract static assets from the page. Next, we also replaced the Scraper library with regular expressions optimizing how we log in and post comments.

As with Locust, we saw a considerable improvement. Goose was now able to generate 110 Mbit/second of useful network traffic, all from a single VM core!

1 core Optimized Goose Network

On the web server, Goose is giving all server processes a truly impressive workout:

1 core Optimized Goose Top

This additional load is consistent:

1 core Optimized Goose Network

And Varnish continues to serve most requests out of RAM:

1 core Optimized Goose Varnishstat

After an hour, Goose displayed the following statistics:

 Name                    | # reqs         | # fails        | req/s  | fail/s
 GET (Auth) node page    | 112,787        | 0 (0%)         | 31     | 0    
 GET (Anon) user page    | 416,767        | 0 (0%)         | 115    | 0    
 GET (Auth) login        | 3              | 0 (0%)         | 0      | 0    
 POST (Auth) login       | 3              | 0 (0%)         | 0      | 0    
 GET (Auth) front page   | 169,178        | 0 (0%)         | 46     | 0    
 GET static asset        | 13,518,078     | 0 (0%)         | 3,755  | 0    
 GET (Auth) comment form | 33,836         | 0 (0%)         | 9      | 0    
 GET (Anon) node page    | 1,389,225      | 0 (0%)         | 385    | 0    
 GET (Auth) user page    | 33,834         | 0 (0%)         | 9      | 0    
 GET (Anon) front page   | 2,083,835      | 0 (0%)         | 578    | 0    
 POST (Auth) comment f.. | 33,836         | 0 (0%)         | 9      | 0    
 Aggregated              | 17,791,382     | 0 (0%)         | 4,942  | 0    
 Name                    | Avg (ms)   | Min        | Max        | Median    
 GET (Auth) node page    | 27         | 10         | 5973       | 30        
 GET (Anon) user page    | 5          | 1          | 12196      | 1         
 GET (Auth) login        | 8899       | 6398       | 11400      | 9000      
 POST (Auth) login       | 64         | 57         | 74         | 60        
 GET (Auth) front page   | 22         | 14         | 6029       | 20        
 GET static asset        | 0          | 1          | 6030       | 1         
 GET (Auth) comment form | 27         | 10         | 5973       | 30        
 GET (Anon) node page    | 7          | 1          | 6038       | 1         
 GET (Auth) user page    | 13         | 6          | 6014       | 10        
 GET (Anon) front page   | 0          | 1          | 6017       | 1         
 POST (Auth) comment f.. | 38         | 20         | 265        | 40        
 Aggregated              | 1          | 1          | 12196      | 1         
 Slowest page load within specified percentile of requests (in ms):
 Name                    | 50%    | 75%    | 98%    | 99%    | 99.9%  | 99.99%
 GET (Auth) node page    | 30     | 30     | 50     | 50     | 70     |     70
 GET (Anon) user page    | 1      | 10     | 20     | 30     | 40     |     40
 GET (Auth) login        | 9000   | 9000   | 11000  | 11000  | 11000  |  11000
 POST (Auth) login       | 60     | 60     | 70     | 70     | 70     |     70
 GET (Auth) front page   | 20     | 20     | 40     | 40     | 50     |     50
 GET static asset        | 1      | 1      | 4      | 6      | 10     |     10
 GET (Auth) comment form | 30     | 30     | 50     | 50     | 70     |     70
 GET (Anon) node page    | 1      | 3      | 50     | 50     | 70     |     70
 GET (Auth) user page    | 10     | 10     | 20     | 30     | 40     |     40
 GET (Anon) front page   | 1      | 1      | 5      | 7      | 20     |     20
 POST (Auth) comment f.. | 40     | 40     | 60     | 70     | 100    |    100
 Aggregated              | 1      | 1      | 30     | 30     | 50     |     70
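As a quick sanity check, the aggregated requests-per-second figure follows directly from the total request count over the one-hour test:

```python
# Values taken from the Goose summary table above.
total_requests = 17_791_382   # Aggregated "# reqs"
duration_seconds = 60 * 60    # the -t1h test duration

requests_per_second = total_requests / duration_seconds
print(round(requests_per_second))  # 4942, matching the reported req/s
```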

More optimizations are certainly possible. For example, just as Locust offers FastHttpLocust, the Rust ecosystem also has HTTP clients faster than Reqwest. And as Goose is written in Rust, adding more cores to the load testing server gives it more power without any additional configuration.


This Is A Goose

While Goose has proven quite capable at generating a lot of load, it's hard to miss the periodic dips visible in the Goose network traffic graphs. Some effort is required to profile the load testing tool under load, to understand what bottlenecks are causing this, and determine if it's fixable. Best case, the generated load should be steady, as is generally seen when load testing with Locust. Hopefully this issue can be fully understood and resolved in a future release.

Beyond that, this is a very early version of Goose, and as such is totally unoptimized. We are confident that with a little time and effort Goose's ability to generate load can be greatly improved.

Automated Testing

Cargo has built-in support for running tests, and Goose would benefit from considerably better test coverage. While there are already quite a few tests written, over time we aim for nearly complete coverage.

More Examples

As of the 0.5.8 release, which was used when writing this blog, Goose comes with two example load tests. The first, simple.rs, is a clone of the example currently found on the Locust.io website. It doesn't do much more than demonstrate how to set up a load test, including a simple POST task and some GET tasks. It is primarily useful to someone familiar with Locust who wants to understand the differences in building a load test in Rust with Goose.

The second example, drupal_loadtest.rs, was previously discussed and is a clone of the load test Tag1 has been using to validate new releases of the Drupal Memcache module. It leverages much more Goose functionality, including weighting task sets and tasks, as well as parsing the loaded pages to confirm expected elements exist. Prior to our regular expression optimization, it leveraged the scraper library to extract the form elements required to log into a Drupal website and post comments. It also used the select library to extract links from returned HTML in order to load static elements embedded in image tags.
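Task weighting determines how often each task runs relative to the others. Conceptually (this is illustrative Python with made-up task names, not the actual Goose or Locust API), a weighted scheduler behaves something like this:

```python
import random

# Hypothetical weights: the front page is requested ten times as often
# as a comment is posted, mirroring real traffic patterns.
task_weights = {"front_page": 10, "node_page": 5, "post_comment": 1}

def pick_task(rng):
    """Choose the next task to run, honoring the configured weights."""
    names = list(task_weights)
    weights = [task_weights[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(42)
picks = [pick_task(rng) for _ in range(1600)]
```

Over a long test run, the proportion of each request type converges on the configured weights, which is why the summary tables above show far more front page loads than comment posts.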

The plan is to add several other useful examples, providing additional recipes on how you might leverage Goose to load test your websites, API endpoints, and more. Contributed examples leveraging different libraries from the Rust ecosystem are very welcome!


Currently Goose is controlled entirely through run-time options specified on the command line. The development plan is to expose an API allowing the same functionality to be controlled and monitored in other ways.


The first intended use-case of the Goose API will be to add support for distributed load testing. Two or more Goose instances working together will be referred to as a Gaggle. A Goose Manager instance will be able to control one or more Goose Worker instances. If enabled, Goose Workers will also regularly send statistics data to the Goose Manager instance. We are also exploring the possibility of multi-tiered Gaggles, allowing a single Goose instance to be both a Worker and a Manager, making it possible to group together multiple Gaggles.

Web User Interface

The second intended use-case of the Goose API will be to add a simple UI for controlling and monitoring load tests from a web browser. As with everything else in Goose, the initial goal of this UI will be to clone the functionality currently provided in the Locust UI. Once that is working, we will consider additional functionality.

The web user interface will live in its own Cargo library for a couple of reasons. First, if you don't need the UI, you won't have to install it and its dependencies. Second, we hope eventually alternative UIs will be contributed by the open source community!


Currently Goose uses Reqwest's blocking HTTP Client to load web pages. The Reqwest documentation explains:

"The blocking Client will block the current thread to execute, instead of returning futures that need to be executed on a runtime."

With each Goose client running in its own thread, blocking is likely the best simulation of a real user when building load tests. That said, as of Rust 1.39, released in November of 2019, Rust gained async-await syntax. We intend to explore adding support for Reqwest's default async-based Client as an optional alternative, as well as adding support for defining tasks themselves as async. This should allow individual Goose client threads to generate much more network traffic.

Multiple HTTP Clients

Related, we will also explore supporting completely different HTTP clients. There's nothing in Goose's design that requires it to work only with Reqwest. Different clients will have different performance characteristics, and may provide functionality required to load test your project.

The current intent is to keep Reqwest's blocking HTTP client as the default, and to make other clients available as compile-time Cargo features. If another client library proves to be more flexible or performant, it may ultimately become the default.


One of our favorite features of Locust is how easy it is to write load plans, partially thanks to their use of Python decorators. We hope to similarly simplify the creation of Goose load plans by adding macros, simplifying everything between initializing and executing the GooseState when writing a load plan. Our goal is that writing a load plan for Goose essentially be as simple as defining the individual tasks in pure Rust, and tagging them with one or more macros.

Though Goose is still in an early stage of development, it is already proving to be very powerful and useful. We're actively using it to prepare the next release of the Drupal Memcache module, ensuring there won't be unexpected performance regressions as mission critical websites upgrade to the latest release. We're also excited to leverage the correctness, performance, and flexibility provided by Rust and its ecosystem with future client load tests.

To get started using Goose in your own load testing, check out the comprehensive documentation. The tool is released as open source, and contributions are welcome!

May 12 2020
May 12

Do you want to serve personalized experiences to your users? Are you managing a large amount of data, served from disparate systems? Is your IT team unable to support you with fast changes to your marketing platform? Are you worried about data ethics and how your technology is geared to ensure standardization? Are you looking to quickly prototype your marketing platform and test it out for your specific use case? Drupal 9 is here to solve many of your content problems of the 2020s.

Drupal 9 is scheduled to be released on June 3rd, 2020. Drupal's next major release comes with features aligned toward a future-proof offering. Here are 9 things to watch out for in Drupal 9.


1. User Experience for First-time Users

You never get a second chance to make a first impression. Improving the experience of first-time users, beginners, and evaluators would facilitate increased adoption of Drupal overall.
In Drupal 9 there is going to be an improved user experience for first-time users. Specific features on the roadmap include installation profiles for common use cases, separation of journeys by role, and an upgraded ‘Try Drupal’ experience that helps users familiarize themselves with the platform. Installation profiles are ready-to-use bundles of Drupal that solve specific use cases: a profile for conferences, a profile for a government project, a profile for setting up a multilingual project, and so on. This will make it easier for first-time users not only to try Drupal, but to try it in a context that is relevant to them. Separation of journeys by role means that what users experience differs based on their role; today this is limited to an anonymous user and an admin role. ‘Try Drupal’ is a quick, free build-out of Drupal in the cloud by popular Drupal cloud hosting providers.


2. Decoupled CMS Capabilities

Tomorrow’s content can flow from a central content repository to digital screens at conferences, kiosks, and other marketing channels. Decoupled/headless CMS capabilities are a prerequisite for this “cross-platform publishing”.

Drupal 9 brings enhanced decoupled CMS capabilities. Key enhancements in this area include auto-generated API documentation, a JSON:API explorer to quickly generate APIs, and optimized performance. The JSON:API explorer is an interactive web application that makes building JSON:API queries simple. The performance optimizations are much needed in the context of the large-scale applications powered by a decoupled CMS design.

2. Native DAM

Enterprises have a large volume of content. Digital Asset Management (DAM) brings this content to a central location, making it accessible and contextually available so end-users get the content they need. This is important for marketers whose objective is to engage with their users and customers. AI and content analytics are tightly coupled with DAMs.

Drupal 9 roadmap includes building out a DAM solution.


3. Integration Framework

Today’s marketing solutions are a mash-up of different tools, cloud services, and content publishing channels that provide specialized capabilities. These include social integrations, integrations with bots, CRM solutions, marketing automation solutions, and more.

Drupal 9 continues to act as an integration framework. Based on popular needs, this layer is normally provided by contributed modules. To ensure you have access to these modules, it is important to stay up to date.


4. Improvements to Editor Tools

Editors are one of the main stakeholders who use Drupal. They are constantly using Drupal to churn out fresh content to the world wide web. Drupal has consistently prioritized improvements to the Content Editor tools.

These tools enable marketers and content teams to work independently of technology teams and to publish content with faster time-to-market.

The Claro admin theme brings a modern look to Drupal 9, improving visual appeal with a focus on accessibility and mobile compatibility.

Rich media content like images, videos, and tweets is an integral part of today’s digital content. Managing this rich media in a centralized repository and delivering it to various devices, channels, and experiences further enhances the speed with which editors can publish.

Drupal 9 brings with it the Media module as part of core. Media helps organize multimedia assets in a manner that makes them easily reusable. This is augmented by existing capabilities to manage responsive images and integrations with streaming services or external video hosting services.


5. Structured Data

Structured data is simply data organized according to specific data models. The information around a particular piece of content is captured, stored, and displayed based on the meaning each piece of data contributes to the content as a whole. Structured data also helps build relationships between content.

As we move into an era of personalization and individualized experiences, with data volumes forecast to grow in the coming years, structured data becomes extremely important. Structured data is also important in the context of search engine optimization.
New features under consideration for Drupal 9 include support for GraphQL, which makes it possible to query and manage large amounts of structured data.


6. Open Web Standards

“Open Web Standards” define a set of best practices to follow when publishing on the World Wide Web. With increasing data privacy challenges and the ability of certain organizations and browsers to take control of user data, there has been an increased focus on open web standards that help preserve the open web.

Aspects of open web standards that will remain a priority in Drupal 9 include GDPR compliance, accessibility, privacy, being fast by default, Webmentions, and semantic markup.

GDPR stands for General Data Protection Regulation; it is the core of Europe’s digital privacy legislation. Web accessibility means ensuring Drupal is designed and developed so that people with disabilities can use it, and Drupal has adhered to WCAG accessibility guidelines since Drupal 8. Webmention is a (now) standardized protocol that enables one website address (URL) to notify another website address that the former contains a reference to the latter.

7. Simple Upgrade Process

Ease of software maintenance lowers the cost of maintenance for the end user.

Maintenance of Drupal is easier with the simple upgrade process in Drupal 9. The tightly integrated Composer initiative, which facilitates updates of third-party dependencies, will continue to be a key focus area in Drupal 9. Further along the roadmap are automatic updates, which will enhance this process even more.


8. Managing Application Changes

Agile development is the norm of the day. Customers want their products developed iteratively, quickly testing their ideas. Key to this is a technical process, supported by tooling, that makes deploying changes to applications easy.

Continued emphasis on supporting automated deployment using DevOps tools like Jenkins or Kubernetes, along with improved configuration management, makes this possible.


9. Community Centric Approach

Drupal, with its strong community-centric approach, is poised to deliver cutting edge, innovative tools for Marketers, Content editors, Site builders and all its different stakeholders.

What’s unique to the Drupal 9 release is that the upgrade process is really simple: you need just four steps to upgrade your Drupal 8 site. Even if you are still on Drupal 7, there is no need to fret, as there is an upgrade path directly to 9 as well. Many of the most complex sites go beyond Drupal core, and today, well ahead of the Drupal 9 launch, close to 1,500 contributed modules are fully compatible and 3,400+ other modules are near completion.

Are you evaluating migrating to Drupal 9?  Write to us to get a Free Audit & Approach note for your site.


May 11 2020
May 11


PORTLAND, Ore., May 11, 2020—The CDC, NIH, and Medecins Sans Frontieres/Doctors Without Borders—among many others—depend on the power of Drupal, the largest independent open source content management system, to keep their websites dynamic and secure. But the cancellation of the Drupal Association’s annual keystone fundraising event—originally scheduled for May 2020—put the nonprofit’s finances in jeopardy.

“COVID-19 has delivered a particularly hard economic hit to non-profits, and the Drupal Association (DA) is no exception,” says Heather Rocker, executive director of the DA. “When we made the decision to cancel DrupalCon North America 2020 for the safety of our attendees, the next question was how to recover those funds so we could continue operations for our community of millions around the world.”

Enter #DrupalCares, a global fundraiser conceived with the hopes of bridging the significant funding gap left as a result of the pandemic. While the campaign had a strong start, what really put the fundraising into overdrive was the #DrupalCares match challenge, a $100,000 matching grant for individual contributions funded by Drupal creator Dries Buytaert and his wife Vanessa. Then a coalition of Drupal businesses came together to match those contributions again—bringing the potential impact up to $300,000. These contributions, together with the contributions from Drupal service providers and end-users, accelerated the campaign dramatically.
As of today, May 11, 2020, #DrupalCares has raised $500,000, meeting its 60-day goal in just over 30 days. Nearly 150 businesses and organizations, along with over 2,000 individual donors and members, donated to reach the goal in record time.
“I’m in awe of how quickly the Drupal community rallied to raise funds for the Drupal Association,” said Dries Buytaert, founder of Drupal. “With this fundraising campaign behind us, the Drupal Association can refocus on key initiatives such as the Drupal 9 launch next month.”

“DrupalCon has been an important reason for Drupal's success,” said Buytaert. “Even though we'll be gathering virtually this summer, I'm very excited that DrupalCon will live on. I'd like to thank everyone who helped us reach our goals—the Drupal community is stronger than ever."
While the nonprofit Drupal Association was impacted by COVID-19, the Drupal ecosystem remains strong. As Buytaert wrote in March, open source software seems to be recession-proof.
“Open Source has grown to be more secure, more flexible, and more stable than ever before,” said Buytaert. “Today, the benefits of Open Source are even more compelling than during past recessions.”
Open source contribution also provides an excellent opportunity for individuals to expand their skills—or even re-skill—during this time of record unemployment. Drupal has demonstrated once again that the power of community and the open source model make projects like Drupal the best possible investment in uncertain times.
In addition to the #DrupalCares campaign, the majority of original DrupalCon 2020 sponsors allowed the Association to retain their sponsorship dollars as the event prepares to shift to its first-ever virtual DrupalCon Global 2020.  
“Like so many organizations, we had to pivot quickly on a major keystone event, but we also had to pivot quickly on a product launch, as we were planning to introduce Drupal 9, our first major software upgrade in almost five years, at DrupalCon,” says Rocker. “DrupalCon was originally scheduled to host approximately 3,000 attendees in May 2020 in Minneapolis, so we didn’t have time for a ‘wait and see’ approach. I’m grateful to a solid, creative Association staff and the extended leadership of our Drupal community who are willing to do whatever it takes to make this event a success.”
“Additionally, Drupal 9 is scheduled to launch on schedule in early June, which is a testament to how dedicated this community is to continuing to be trailblazers—even now, when a delay caused by these world events would have been no surprise,” said Rocker.
The #DrupalCares fundraising campaign remains active through May 31, 2020. To learn more about Drupal or make a donation, visit www.Drupal.org.
About Drupal and the Drupal Association
Drupal is the open source content management software behind millions of websites and applications, boasting a community of 46,000-plus developers and more than 1.3 million users on Drupal.org. The Drupal Association is the non-profit organization dedicated to accelerating the Drupal software project, fostering the community, and supporting its growth.
For more information contact Heather Rocker,  [email protected]

May 11 2020
May 11

May’s DrupalCon Minneapolis has morphed into July’s DrupalCon Global! Hurrah! Heather Rocker, Executive Director of the Drupal Association, recently used two perfect words to describe the Drupal community: flexible and adaptable. Extraordinary times require nothing less and with the Drupal Association’s adaptive and flexible response to the pandemic, it might just mean that more of the world’s 3 million Drupal users get access to the ideas, celebration, and kinship that DrupalCon embodies.

My teammates Joe, Amber, and Ashley were set to give a beginner-level theming workshop at DrupalCon Minneapolis in May. It’s a course they’ve worked hard to craft as an excellent resource for learning how to make custom Drupal themes, and they’ve taught it in person at many workshops over the years. Sadly, this year they won’t be able to do this in Minneapolis, but we do have the workshop material online in the form of our Hands-On Theming course. To make this even more accessible and to mark the original DrupalCon Minneapolis week, during May 18-22, we’ll make our entire Hands-On Theming course free for anyone wanting to learn.

We encourage you to make a donation of any size to the Drupal Association in lieu of payment to us, if that is something you can do. Small but mighty, the association supports Drupal.org projects and has the massive task of scheduling (and rescheduling) DrupalCon.

Come back next week to learn Drupal theming for free, support the Drupal Association if you can, and have a great “DrupalCon” week.

May 11 2020
Image Credit: Aaron Deutsch (contributed by proud mom, Kristen Pol)

Excited! Humbled! Appreciative! Energized! Thankful! Those are just a few of the emotions our team is feeling today, as I had the honor of announcing that you helped us meet our #DrupalCares emergency funding goal. Today, we issued a press release to recognize the contributions of the Drupal community and demonstrate to the world that Drupal is strong.

We proudly announce that #DrupalCares has raised $500,000, meeting its 60-day goal in just over 30 days. Nearly 150 businesses and organizations, along with over 2,000 individual donors and members, donated to reach the goal in record time. Drupal has demonstrated once again that the power of community and the open source model make projects like Drupal the best possible investment in uncertain times.

While the campaign had a strong start, what really put the fundraising into overdrive was the #DrupalCares match challenge, a $100,000 matching grant for individual contributions funded by Drupal creator Dries Buytaert and his wife Vanessa. Then a coalition of Drupal businesses came together to match those contributions again—bringing the potential impact up to $300,000. These contributions, together with the contributions from Drupal service providers and end-users, accelerated the campaign dramatically.

"I'm in awe of how quickly the Drupal community rallied to raise funds for the Drupal Association,” said Dries Buytaert, founder of Drupal. “With this fundraising campaign behind us, the Drupal Association can refocus on key initiatives such as the Drupal 9 launch next month.

“DrupalCon has been an important reason for Drupal's success,” said Buytaert. “Even though we'll be gathering virtually this summer, I'm very excited that DrupalCon will live on. I'd like to thank everyone who helped us reach our goals—the Drupal community is stronger than ever."

Part of the success of #DrupalCares was thanks to community-developed fundraisers encouraging Drupal users around the globe to donate. Gábor Hojtsy, Ron Northcutt, and Ofer Shaal started the Drupal 9 module challenge, donating €9 for each module that published its first Drupal 9 compatible release. The amazee.io team created and hosted Pixels for Drupal (with help from Alanna Burke, Sean Hamlin, Brandon Williams, Eli Stone, and Michael Schmid), which awarded donors pixels for fun recognition. Jeff Geerling helped amplify our message on YouTube, making a donation for every like. Oliver Davies turned purchases of Test Driven Drupal in April into donations. These and other creative community-led campaigns helped drive #DrupalCares awareness and giving even further.

On behalf of the Drupal Association staff and board, we hope you’ll enjoy this token of our sincere #DrupalThanks for the support and encouragement you’ve given during this #DrupalCares journey. Tackling this hurdle of emergency funding means that we can pivot to other important projects on the horizon such as the launch of Drupal 9 and the virtual version of DrupalCon. For those that wish to continue contributing, or for those that haven’t had an opportunity yet, the official campaign stays open through May 31. Every donation and membership continues to drive our diversification of funding in the right direction.

May 11 2020

Drupal is one of the largest and most active open-source software projects in the world. Behind the scenes is the Drupal Association, the non-profit organization responsible for enabling it to thrive by architecting and introducing new tooling and infrastructure to support the needs of the community and ecosystem. Many of us know the Drupal Association as the primary organizer of the global DrupalCon conference twice a year. But it's less common knowledge that the Drupal Association is actively engaged in Drupal development and maintains some of the most important elements of the Drupal project. This runs across the spectrum of software localizations, version updates, security advisories, dependency metadata, and other "cloud services" like the Drupal CI system that empower developers to keep building on Drupal.

With the ongoing coronavirus pandemic, the Drupal Association is in dire financial straits due to losses sustained from DrupalCon North America (one of the largest sources of funding) having to be held as a virtual event this year. As part of the #DrupalCares campaign, we at Tag1 Consulting implore organizations that use Drupal, companies that provide Drupal services, and even individuals who make their living off Drupal development to contribute in some shape or form to the Drupal Association in this time of need.

We are putting our money where our mouth is. For years we have donated at least eighty hours a month to support the DA and Drupal.org infrastructure and tooling. I’m proud to announce that we are expanding this commitment by 50% to 120 hours a month of pro-bono work, from our most senior resources, to help the DA offset some of its operating expenses. Furthermore, we contributed to help #DrupalCares reach its $100,000 goal and so that any donation you make is doubled in value.

To gain insights into building software communities at scale in open source, Michael Meyers (Managing Director at Tag1) and I (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) recently kicked off a Tag1 Team Talks miniseries with the Drupal Association's engineering team, represented by Tim Lehnen (Chief Technology Officer at the Drupal Association) and Narayan Newton (Chief Technology Officer at Tag1), to examine all the ways in which the DA keeps the Drupal community ticking.

Why Tag1 supports the Drupal Association

Here at Tag1, we work with a diverse range of technologies, but Drupal has been our passion for many years. It's been a critical part of our business since Tag1's inception, and we're grateful to the Drupal Association for sustaining such an essential part of our work today. By no means is it an understatement to characterize the Drupal Association as the lifeblood of the Drupal ecosystem. Because of our appreciation for what Drupal has given us, we're committed to doing our part to giving back to Drupal, not only over the course of our many years working in concert with the Drupal Association but also right now during the #DrupalCares campaign.

How we contribute to Drupal

Though Tag1 is well-known for being the all-time number-two contributor to the Drupal project, with the largest concentration of core committers, branch managers, release managers, and core maintainers of any organization in the community, we're much less known for how we support the underlying foundations of the ecosystem. Beyond the more visible contributions of staff members like Moshe Weitzman, Nathaniel Catchpole (catch), Francesco Placella (plach), and Fabian Franz (fabianx), we also do much more than add our support to Drupal core development. After all, supporting Drupal requires more than just code; it also requires the tooling and infrastructure that keep the project's blood flowing.

During our Tag1 Team Talks episode with the Drupal Association, Tim Lehnen eloquently made the case for the non-profit that has driven Drupal's success for so many years: While the software makes up the bulk of open-source contributions, offering surrounding services that buttress the software's core is another key function that the Drupal Association performs. To that end, for many years, Tag1 has donated 80 hours of pro-bono work a month to ensure that Drupal.org and all the tooling the community relies on stays up and running. Tag1 is honored to increase our monthly contribution of pro-bono hours to the Drupal Association by 50% from 80 to 120 hours of expert work from our most senior resources. And now with our increased work hours and financial contributions, critical projects like the migration to GitLab can continue to move forward, even during a situation like the current pandemic.

Supporting Drupal's test infrastructure

In Drupal, a key aspect of code contribution is running tests that verify a patch will work against a massive variety of environments, be compatible with a spectrum of versions of Drupal, and not introduce any functional regressions in the code. One of the key questions many community members ask is why Drupal maintains its own testing infrastructure in lieu of a service such as TravisCI.

Unfortunately, whenever existing continuous integration solutions were tasked with running a Drupal core test for every Drupal patch, they would consistently time out, maxing out available resources. To solve the challenges associated with developing and testing at scale, the DA partnered with Tag1. We deployed our expertise in infrastructure, mission-critical application development, and performance and scalability to help run and maintain Drupal.org's servers and the DrupalCI test runner system. The CI system ensures that contributors have a reliable center for collaboration and a dependable test infrastructure for all of their patches and modules. Tag1's deep expertise has been critical to the success of the DrupalCI system, which we scaled dynamically to the extent that it is now concurrently running more than an entire decade's worth of testing in a single year.

The new testing infrastructure was an enormous undertaking for the Drupal Association due to its complexity. Narayan Newton opted from early days to leverage standard Unix tools to build out the environments for testing targets. And rather than use Kubernetes for the orchestration of tests, the Drupal Association opted to use Jenkins and the EC2 Fleet plugin for DrupalCI. Jenkins manages the orchestration of virtual machines (VMs) and initializes them as test targets before actually running the tests themselves in a clean room environment. As Narayan notes during our conversation, one of the most fascinating quirks of Drupal's infrastructure is that many of its core elements were installed before standardized tooling emerged to handle those use cases in a regimented way.

Supporting Drupal's migration to GitLab

In addition to our contributions to Drupal's underlying infrastructure, Tag1 also assists with key initiatives run by the Drupal Association such as the ongoing migration from Drupal's homegrown Git system to GitLab, a source control provider. According to Narayan, the migration to GitLab has been much more straightforward than previous historical migrations in Drupal's past, more specifically the original migration from Drupal's previous CVS source control system to Git, which it has used ever since. Code management in Drupal has long employed a bespoke Git approach with a homegrown Git daemon written by the community and cgit as the web-based front end for Git repositories.

One of the key benefits GitLab provides to the Drupal Association is the fact that the DA is no longer responsible for building and supporting a source control system for Drupal at the scale at which it operates. After all, GitLab has a dedicated site reliability engineering (SRE) team focused on ensuring source availability even at high loads. And as Narayan notes, GitLab has been responsive to security issues, in addition to facilitating "one of the smoothest migrations I've been a part of." But this doesn't mean there weren't complications.

Because GitLab has a superset of features that include some existing Drupal.org functionality, the Drupal Association, supported by Tag1, worked closely with the GitLab team to ensure that certain features could be disabled for use with the Drupal project, avoiding many of the issues that have plagued the GitHub mirror of Drupal since its conception. Narayan contributed key features to ensure that GitLab's integration points could be toggled on and off in order to enable the unique needs and requirements of the Drupal community and ecosystem.

Tim adds that, in terms of avoiding downtime and disruption, forklifting the entire Git code management infrastructure without interrupting the development community was a rousing success, especially given that there was no impact on a minor version release. In the process, the Drupal community has gained a number of key features that will enable accelerated development and conversation between contributors in ever-richer ways. In the coming months, the Drupal Association will also facilitate the addition of GitLab's merge requests feature, which will introduce yet more efficiencies for those making code contributions.

Why #DrupalCares is so important

For us, Drupal is a key reason we exist, and the Drupal Association has done wonders to ensure the longevity of an open-source software project we hold dear. This is why in these troubling times for the Drupal Association, it could not be more important to uphold the ideals of open source and ensure the survival of our beloved community and ecosystem. Over the course of the past month, we've witnessed an incredible outpouring of support from all corners of the community, buttressed by the various matches provided by community members like none other than project lead Dries Buytaert. We at Tag1 Consulting have contributed toward #DrupalCares' $100,000 goal in order to multiply the impact of community donations and buttress our existing support.

Without your support, whether as a company or an individual, we may never see another DrupalCon grace our stages or celebrate yet another major version release that introduces innovative features to the Drupal milieu. And it's not just about the more visible elements of the Drupal experience like DrupalCon. It's also about the invisible yet essential work the Drupal Association does to keep the Drupal project rolling along. Thanks to the innumerable contributions the Drupal Association has made to maintain DrupalCI, the GitLab migration, Composer Façade, and a host of other improvements to Drupal's infrastructure and tooling, with the support of Tag1, the Drupal project remains one of the most impressive open-source projects in our industry.


Here at Tag1, we believe in the enduring value of open source and its ability to enrich our day-to-day lives in addition to the way we do business. We're dedicated to deepening our already extensive support for the Drupal Association in ways both financial and technological. And now it's your turn to return the favor. If you're an individual community member, we strongly encourage you to start or renew a membership. If you're an organization or company in the Drupal space, we encourage you to contribute what you can to ensure the continued success of Drupal. Together, we can keep Drupal alive for a new era of contribution and community.

Special thanks to Jeremy Andrews and Michael Meyers for their feedback during the writing process.

Photo by Jon Tyson on Unsplash

May 11 2020

What is a views display extender

A display extender plugin lets you add options or configuration to a view regardless of its display type (e.g. page, block, ...).

For example, if you wanted to allow site users to add certain metadata to the rendered output of every view display regardless of display type, you could provide this option as a display extender.

What we can do with it

Let's see how to implement such a plugin. As an example, we will add some metadata (admittedly useless meta tags) to the document head when the view is displayed.

We will call the display extender plugin HeadMetadata (id: head_metadata) and we will implement it in a module called views_head_metadata.

The implementation

Make our plugin discoverable

Views does not discover display extender plugins through an info hook as usual. For this particular plugin type, Views keeps a list in its views.settings configuration object.

You need to add your plugin ID to views.settings.display_extenders (which is a list).

To do so, I recommend implementing hook_install() (and hook_uninstall()) in the module's .install file. To manipulate config objects, you can look at my previous notes on CMI.
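A minimal sketch of such an install file, assuming the module name views_head_metadata and the plugin ID head_metadata from above:

```php
<?php

/**
 * @file
 * Install hooks for views_head_metadata (sketch).
 */

/**
 * Implements hook_install().
 */
function views_head_metadata_install() {
  $config = \Drupal::configFactory()->getEditable('views.settings');
  $extenders = $config->get('display_extenders') ?: [];
  $extenders[] = 'head_metadata';
  $config->set('display_extenders', array_unique($extenders))->save();
}

/**
 * Implements hook_uninstall().
 */
function views_head_metadata_uninstall() {
  $config = \Drupal::configFactory()->getEditable('views.settings');
  $extenders = array_diff($config->get('display_extenders') ?: [], ['head_metadata']);
  $config->set('display_extenders', array_values($extenders))->save();
}
```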

Make the plugin class

As seen in the previous post on Drupal 8 plugins, you need to implement the class in the plugin type namespace, extend the base class for this type of plugin, and add the metadata annotation.

In the case of display extender plugins, the namespace is Drupal\views_head_metadata\Plugin\views\display_extender, the base class is DisplayExtenderPluginBase, and the metadata annotation is defined in \Drupal\views\Annotation\ViewsDisplayExtender.

A display extender plugin's methods are nearly the same as a display plugin's; you can think of them as a set of methods that alter the display plugin.

The important methods to understand are:

  • defineOptionsAlter(&$options): defines the options your plugin will store, a sort of schema for your plugin.
  • optionsSummary(&$categories, &$options): adds a category (a section of the Views admin interface) if you need one, and declares your options' settings (which category they belong to and the value displayed as the summary).
  • buildOptionsForm(&$form, FormStateInterface $form_state): builds the form(s) for your plugin, paired of course with validate and submit methods.
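Putting those pieces together, a sketch of the plugin class could look like this (the option names and form details are illustrative):

```php
<?php

namespace Drupal\views_head_metadata\Plugin\views\display_extender;

use Drupal\Core\Form\FormStateInterface;
use Drupal\views\Plugin\views\display_extender\DisplayExtenderPluginBase;

/**
 * Adds head metadata options to every views display.
 *
 * @ViewsDisplayExtender(
 *   id = "head_metadata",
 *   title = @Translation("Head metadata")
 * )
 */
class HeadMetadata extends DisplayExtenderPluginBase {

  /**
   * {@inheritdoc}
   */
  public function defineOptionsAlter(&$options) {
    $options['head_metadata'] = ['default' => ['author' => '', 'keywords' => '']];
  }

  /**
   * {@inheritdoc}
   */
  public function optionsSummary(&$categories, &$options) {
    $categories['head_metadata'] = [
      'title' => $this->t('Head metadata'),
      'column' => 'second',
    ];
    $options['head_metadata'] = [
      'category' => 'head_metadata',
      'title' => $this->t('Meta tags'),
      'value' => $this->options['head_metadata']['author'] ?: $this->t('None'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function buildOptionsForm(&$form, FormStateInterface $form_state) {
    if ($form_state->get('section') == 'head_metadata') {
      $form['head_metadata'] = ['#tree' => TRUE];
      foreach (['author', 'keywords'] as $name) {
        $form['head_metadata'][$name] = [
          '#type' => 'textfield',
          '#title' => $this->t('Meta @name', ['@name' => $name]),
          '#default_value' => $this->options['head_metadata'][$name],
        ];
      }
    }
  }

  /**
   * {@inheritdoc}
   */
  public function submitOptionsForm(&$form, FormStateInterface $form_state) {
    if ($form_state->get('section') == 'head_metadata') {
      $this->options['head_metadata'] = $form_state->getValue('head_metadata');
    }
  }

}
```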

Generate the metadata tags in the document head

Now that our settings are added to every views display, we need to use them to generate the tags in the document head as promised.

To act on the view's render output we will use the dedicated hook, hook_views_pre_render($view), together with the render array property #attached.

We implement that hook in the .module file of views_head_metadata:
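A sketch of that hook, assuming the plugin stores an array of meta name/content pairs in its head_metadata option; getExtenders() returns the display's active extender plugins:

```php
<?php

/**
 * @file
 * Module file for views_head_metadata (sketch).
 */

use Drupal\views\ViewExecutable;

/**
 * Implements hook_views_pre_render().
 */
function views_head_metadata_views_pre_render(ViewExecutable $view) {
  $extenders = $view->display_handler->getExtenders();
  if (!isset($extenders['head_metadata'])) {
    return;
  }
  // The option is assumed to be an array of meta name => content pairs.
  $options = $extenders['head_metadata']->options['head_metadata'];
  foreach (array_filter($options) as $name => $content) {
    $view->element['#attached']['html_head'][] = [
      [
        '#tag' => 'meta',
        '#attributes' => ['name' => $name, 'content' => $content],
      ],
      'views_head_metadata_' . $name,
    ];
  }
}
```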

May 11 2020

As an example, we are going to implement an area handler that presents some links and text in a custom way. It may not be terribly useful, but that's not the point of this article.

The Plugin system

For this first post on plugins, I will briefly introduce the concept. If you have already used the Ctools plugin system, you already know what the plugin system is for.

For those who don't know it, the plugin system is a way to let other modules implement their own use cases for an existing feature. Think of a field formatter (provide your own render array for a particular field display) or a widget (provide your own form element for a particular field type), and so on.

The plugin system has three base elements:

Plugin Types

The plugin type is the central controlling class that defines how the plugins of this type will be discovered and instantiated. The type will describe the central purpose of all plugins of that type; e.g. cache backends, image actions, blocks, etc.

Plugin Discovery

Plugin Discovery is the process of finding plugins within the available code base that qualify for use within this particular plugin type's use case.

Plugin Factory

The Factory is responsible for instantiating the specific plugin(s) chosen for a given use case.

Detailed information: https://www.drupal.org/node/1637730

In our case, Views is responsible for those implementations, so we won't go further into them. Let's now see how to implement a plugin definition.

The Plugin definitions

The existing documentation on plugin definitions (https://www.drupal.org/node/1653532) is still a little too abstract to understand how it really works.

You simply have to understand that a plugin is, in most cases, a class implementation namespaced under the plugin type's namespace; in our example this is \Drupal\module_name\Plugin\views\area.

So if I implement a custom views area plugin in my module, the class will be located at module_name/src/Plugin/views/area/MyAreaHandler.php.

To know where to implement a plugin definition for a given plugin type, you can in most cases look at the module's docs, or directly at the module's source code (looking at an example definition will be enough).

In most cases, the module that implements a plugin type also provides a base class for the plugin definitions; in our example, views area provides the base class \Drupal\views\Plugin\views\area\AreaPluginBase.

Drupal also provides a generic base class for plugin definitions, should you implement a custom plugin type: \Drupal\Component\Plugin\PluginBase.

Your custom plugin definition class must also carry annotation metadata, which is defined by the module that implements the plugin type; in our example: \Drupal\views\Annotation\ViewsArea.

In the case of Views, you will also need to implement hook_views_data() in a module_name.views.inc file; that is where you tell Views the name and metadata of your area handler.

Hands on implementation

So we have a custom module; let's call it module_name for the example. :)

We will create the class that implements our plugin definition and give it the plugin ID my_custom_site_area.

We save this file as module_name/src/Plugin/views/area/MyCustomSiteArea.php
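A sketch of what that class could contain; the rendered text and links are purely illustrative:

```php
<?php

namespace Drupal\module_name\Plugin\views\area;

use Drupal\Core\Url;
use Drupal\views\Plugin\views\area\AreaPluginBase;

/**
 * An area handler that renders some links and text in a custom way.
 *
 * @ViewsArea("my_custom_site_area")
 */
class MyCustomSiteArea extends AreaPluginBase {

  /**
   * {@inheritdoc}
   */
  public function render($empty = FALSE) {
    // Honor the "display even if the view is empty" setting.
    if ($empty && empty($this->options['empty'])) {
      return [];
    }
    return [
      'text' => ['#markup' => $this->t('Some useful links:')],
      'links' => [
        '#theme' => 'item_list',
        '#items' => [
          ['#type' => 'link', '#title' => $this->t('Home'), '#url' => Url::fromRoute('<front>')],
        ],
      ],
    ];
  }

}
```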

Now we just have to implement hook_views_data(), and that's it: you can use your awesome views area handler in any view and any area.

Define this hook in the file module_name/module_name.views.inc:
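A minimal sketch of that hook; global area handlers are exposed under the 'views' table:

```php
<?php

/**
 * @file
 * Views hooks for module_name (sketch).
 */

/**
 * Implements hook_views_data().
 */
function module_name_views_data() {
  $data['views']['my_custom_site_area'] = [
    'title' => t('My custom site area'),
    'help' => t('Renders some links and text in a custom way.'),
    'area' => [
      // The plugin ID from the @ViewsArea annotation.
      'id' => 'my_custom_site_area',
    ],
  ];
  return $data;
}
```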

May 11 2020

There are three types of configuration data:

The Simple Configuration API

  • Used to store a unique configuration object.

  • Namespaced by module name.

  • Can contain a list of structured values (string, int, array, ...).

  • Default values can be found in YAML: config/install/module_name.config_object_name.yml

  • Has a schema defined in config/schema/module_name.schema.yml

Code example:
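For instance (a sketch; module_name.settings and some_key are placeholder names):

```php
// Read a value from an immutable config object.
$mail = \Drupal::config('system.site')->get('mail');

// Write a value through an editable config object.
\Drupal::configFactory()
  ->getEditable('module_name.settings')
  ->set('some_key', 'new value')
  ->save();
```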

The States

  • Not exportable; simple values that depend heavily on the environment.

  • Values can differ between environments (e.g., last_cron and maintenance_mode have different values on your local and production sites).
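The State API is the D8 home for such values. A small sketch (the key name is illustrative):

```php
// Set, read (with a default), and delete an environment-specific value.
\Drupal::state()->set('module_name.last_sync', REQUEST_TIME);
$last_sync = \Drupal::state()->get('module_name.last_sync', 0);
\Drupal::state()->delete('module_name.last_sync');
```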

The Entity Configuration API

  • Configuration objects that can have multiple instances (e.g. views, image styles, CKEditor profiles, ...).

  • New configuration types can be defined in custom modules.

  • Have a schema defined in YAML.

  • Not fieldable.

  • Values can be exported and stored as YAML, and can be shipped by modules in config/install.

Code example:
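For instance, loading and saving an existing config entity (here, core's 'thumbnail' image style):

```php
// Load, modify, and save a configuration entity.
$style = \Drupal::entityTypeManager()
  ->getStorage('image_style')
  ->load('thumbnail');
$style->set('label', 'Tiny thumbnail')->save();
```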


Store configuration object in the module :

Config objects (not states) can be stored in a module and imported during the module's install process.

To export a config object in a module you can use the configuration synchronisation UI at /admin/config/development/configuration/single/export

Select the configuration object type, then the object; copy the content and store it in your custom module's config/install directory, following the naming convention shown below the textarea.

You can also use the Features module, which is now a simple configuration packager.

If, after the module has been installed, you want to update the config object, you can use the following drush command:
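With drush 8 you can re-import just the module-provided config, for example (a sketch; adjust the module path to your setup):

```shell
drush config-import --partial --source=modules/custom/module_name/config/install -y
```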

Configuration override system

Remember the $conf variable in settings.php in D6/D7 for overriding variables?

In D8, you can likewise override values from the Configuration API:
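For example, in settings.php (the keys shown are standard core config objects):

```php
// settings.php: override configuration values for this environment,
// much like $conf in D6/D7.
$config['system.site']['name'] = 'Local development site';
$config['system.performance']['css']['preprocess'] = FALSE;
```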

You can also do overrides at runtime.

Example: getting a value in a specific language:
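A sketch using the language manager to read a value under a specific language's overrides (assumes French is configured on the site):

```php
// Switch the config override language, read the value, then restore.
$language_manager = \Drupal::languageManager();
$original = $language_manager->getConfigOverrideLanguage();
$language_manager->setConfigOverrideLanguage($language_manager->getLanguage('fr'));
$site_name_fr = \Drupal::config('system.site')->get('name');
$language_manager->setConfigOverrideLanguage($original);
```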

Drupal provides storage for overrides, and a module can define its own override mechanism; for deeper information, look at:


Configuration schema

The config objects of the Config API and of the configuration entity API have an attached schema defined in module_name/config/schema/module_name.schema.yml.

These schemas are not mandatory, but if you want translatable strings, form configuration, or a consistent export, you must take the time to implement the schema for your configuration object. If you don't want to, you can instead implement the toArray() method in your config entity class.

Examples, docs and information: https://www.drupal.org/node/1905070

Configuration dependencies calculation

By default, dependencies are declared in the .info.yml of the module that defines the config object, as in D6/D7.

But a config entity can implement the calculateDependencies() method to provide dynamic dependencies based on the config entity's values.

Think of a config entity that stores field display information for a content entity's view modes: the modules that hold the fields and formatters must be listed as dependencies, but they vary with the content entity display.

More information : https://www.drupal.org/node/2235409

May 11 2020


Migrate in Drupal 8

Migrate is now included in Drupal core to provide the upgrade path from the 6.x and 7.x versions to Drupal 8.

Drupal 8 has two new modules:
Migrate: "Handles migrations"
Migrate Drupal: "Contains migrations from older Drupal versions."

Neither of these modules has a user interface.

Migrate contains the core framework classes; the destination, source, and process plugin schemas and definitions; and, last, the migration config entity schema and definition.

Migrate Drupal contains implementations of destination, source, and process plugins for Drupal 6 and 7. You can use it or extend it; it's ready to use. But this module doesn't contain the configuration needed to migrate all your data from an older Drupal site to Drupal 8.

Core provides migration configuration entity templates, located in a folder named 'migration_templates' inside each core module that needs one. To find all the templates, you can use this command in your Drupal 8 site:
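For example, from the Drupal root a plain find works fine:

```shell
find . -type d -name migration_templates
```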

To make a Drupal core-to-core migration, you will find all the information at https://www.drupal.org/node/2257723; a UI for upgrading is in progress.

A migration framework

Let's have a look at each big piece of the migration framework:

Source plugins

Drupal provides an interface and base classes for migration source plugins:

  • SqlBase: base class for SQL sources; you need to extend this class to use it in your migration.
  • SourcePluginBase: base class for every custom source plugin.
  • MenuLink: for D6/D7 menu links.
  • EmptySource (id: empty): source plugin that returns an empty row.
  • ...

Process plugins

Process plugins are the equivalent of the D7 MigrateFieldHandler, but they are not limited to fields or to a particular field type.
Their purpose is to transform a raw value into something acceptable to your new site's schema.

The plugin's transform() method is in charge of transforming your $value, or skipping the entire row if needed.
If the source property has multiple values, transform() runs on each one.

Drupal ships migration process plugins in each core module that needs them (for the core upgrade).
To find out which ones exist and where they live, you can use this command:
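For example, grepping for the process plugin annotation from the Drupal root:

```shell
grep -rl "@MigrateProcessPlugin" core/modules --include="*.php"
```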

Destination plugins

Destination plugins are the classes that handle where your data is saved in the new Drupal 8 site's schema.

Drupal provides a lot of useful destination classes:

  • DestinationBase: base class for migrate destination classes.
  • Entity (id: entity): base class for entity destinations.
  • Config (id: config): class for importing simple configuration.
  • EntityBaseFieldOverride (id: entity:base_field_override): class for importing base field overrides.
  • EntityConfigBase: base class for importing configuration entities.
  • EntityImageStyle (id: entity:image_style): class for importing image styles.
  • EntityContentBase (id: entity:%entity_type): the destination class for all content entities lacking a specific class.
  • EntityNodeType (id: entity:node_type): class for migrating node types.
  • EntityFile (id: entity:file): class for migrating files.
  • EntityFieldInstance: class for migrating field instances.
  • EntityFieldStorageConfig: class for migrating field storage.
  • EntityRevision, EntityViewMode, EntityUser, Book...
  • And many more…

Builder plugins:

"Builder plugins implement custom logic to generate migration entities from migration templates. For example, a migration may need to be customized based on the data that is present in the source database; such customization is implemented by builders." - doc API

This is used in the user module: the builder creates a migration configuration entity based on a migration template, then adds field mappings to the process pipeline based on the data found in the source database (see /Drupal/user/Plugin/migrate/builder/d7/User).

Id map plugins:

"It creates one map and one message table per migration entity to store the relevant information." - doc API
This is where rollback, update and the map creation are handled.
Drupal provides the Sql plugin (@see /Drupal/migrate/Plugin/migrate/id_map/Sql) based on the core base class PluginBase.

And we have only been talking about core so far.
All the examples (which means docs for devs) are in core!

As of now:

While there is *almost* a simple UI for Drupal-to-Drupal migrations in Drupal 8, Migrate can be used for every kind of data input. Work is in progress on http://Drupal.org/project/migrate_plus to bring a UI and more source plugins, process plugins, and examples. There is already a CSV source plugin and a pending patch for the code example. The primary goal of migrate_plus is to reach feature parity (UI, sources, destinations...) with the Drupal 7 version.

Concrete migration

(migrations with Drupal 8 made easy)

I need to migrate some content with images, attached files, and categories from custom tables in an external SQL database to Drupal.

To get started quickly:

  • Drush 8 (dev master) and Drupal Console installed.
  • Create the custom module (in the code, I assume the module name is “example_migrate”):
    $ drupal generate:module
    or create the module by yourself; you only need the info.yml file.
  • Enable migrate and the migrate_plus tools:
    $ drupal module:install migrate_tools
    or $ drush en migrate_tools
  • What we have in Drupal for the code example:
    • a taxonomy vocabulary: ‘example_content_category’
    • a content type ‘article’
    • some fields: body, field_image, field_attached_files, field_category
  • Define the connection to your external database in settings.php:
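For example (a sketch; the credentials and database name are placeholders), declaring the legacy database under a dedicated 'migrate' key:

```php
// settings.php: the external source database for the migration.
$databases['migrate']['default'] = [
  'driver' => 'mysql',
  'host' => 'localhost',
  'database' => 'legacy_db',
  'username' => 'user',
  'password' => 'secret',
  'prefix' => '',
];
```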

We are going to tell the migrate source to use this database. This happens in each migration configuration file, through a configuration property used by the SqlBase source plugin:
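In a migration's source section, that property looks like this (a sketch; the plugin ID is from the example below, and 'key' names the database connection declared in settings.php):

```yaml
source:
  plugin: example_file
  key: migrate
```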

This is one of the reasons SqlBase provides a wrapper for select queries: in your source plugin you need to call it, as $this->select(), instead of building the query with your bare hands.

N.B. Each time you add a custom yml file to your custom module, you need to uninstall/reinstall the module for the config/install files to be imported. To avoid that, you can import a single migration config file by copy/paste in the configuration synchronisation section under admin/config.

The File migration

The content has images and files to migrate, I suppose in this example that the source database has a unique id for each file in a specific table that hold the file path to migrate.

We need a migration from each source file to a Drupal 8 file entity, so we write the source plugin for the file migration:

File: src/Plugin/migrate/source/ExampleFile.php
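A minimal sketch of what this source plugin could look like (the source table and column names are assumptions):

```php
<?php

namespace Drupal\example_migrate\Plugin\migrate\source;

use Drupal\migrate\Plugin\migrate\source\SqlBase;

/**
 * Source plugin for the files stored in the external database.
 *
 * @MigrateSource(
 *   id = "example_file"
 * )
 */
class ExampleFile extends SqlBase {

  /**
   * {@inheritdoc}
   */
  public function query() {
    // Use the SqlBase select() wrapper so the query runs against the
    // database defined by the "key" of the migration source config.
    return $this->select('files', 'f')
      ->fields('f', ['fid', 'filename', 'filepath']);
  }

  /**
   * {@inheritdoc}
   */
  public function fields() {
    return [
      'fid' => $this->t('File ID'),
      'filename' => $this->t('File name'),
      'filepath' => $this->t('Path of the file on disk'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function getIds() {
    return [
      'fid' => ['type' => 'integer'],
    ];
  }

}
```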

We have the source class with our source fields, and each row generates a path to the file on my local disk.

But we need to transform the external file path into a local Drupal public file system URI, and for that we need a process plugin. In our case, the process plugin will take the external file path and filename as arguments and return the new Drupal URI.

File: src/Plugin/migrate/process/ExampleFileUri.php
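A sketch of such a process plugin (the destination subdirectory is an assumption):

```php
<?php

namespace Drupal\example_migrate\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Builds a Drupal public URI from the external file path and name.
 *
 * @MigrateProcessPlugin(
 *   id = "example_file_uri"
 * )
 */
class ExampleFileUri extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    // $value holds the two source values configured in the migration:
    // the external file path and the file name.
    list($filepath, $filename) = $value;
    // Map everything into the public file system; the "migrated"
    // subdirectory is an assumption for this example.
    return 'public://migrated/' . $filename;
  }

}
```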

We need another process plugin to transform our source date values into timestamps (created, changed). As the date format is the same across the source database, this plugin will be reused in the content migration for the same purpose:

File: src/Plugin/migrate/process/ExampleDate.php
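A sketch of the date process plugin:

```php
<?php

namespace Drupal\example_migrate\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Converts a 'Y-m-d H:i:s' date string to a UNIX timestamp.
 *
 * @MigrateProcessPlugin(
 *   id = "example_date"
 * )
 */
class ExampleDate extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    // strtotime() understands the 'Y-m-d H:i:s' format used
    // throughout the source database.
    return strtotime($value);
  }

}
```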

For the destination we use the core plugin: entity:file.

Now we have to define our migration config entity file; this is where the source, destination, and process (field mappings) are defined:

File: config/install/migrate.migration.example_file.yml
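A possible version of this file (the plugin IDs, connection key, and field names are assumptions matching the sketches in this article):

```yaml
id: example_file
label: Example file migration
migration_group: example
source:
  plugin: example_file
  key: migrate
process:
  filename: filename
  uri:
    plugin: example_file_uri
    source:
      - filepath
      - filename
destination:
  plugin: 'entity:file'
```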

We are done with the file migration. You can execute it with the migrate_tools (from the migrate_plus project) drush command:
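Assuming the migration id is example_file:

```shell
$ drush migrate-import example_file
```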

The Term migration

The content has categories to migrate.
We need to import them as taxonomy terms. In this example, I assume the categories don't have unique IDs; each is just a column of the article table holding the category name…

First we create the source:

File: src/Plugin/migrate/source/ExampleCategory.php
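A minimal sketch (the article table and column names are assumptions):

```php
<?php

namespace Drupal\example_migrate\Plugin\migrate\source;

use Drupal\migrate\Plugin\migrate\source\SqlBase;

/**
 * Source plugin for the categories of the external articles.
 *
 * @MigrateSource(
 *   id = "example_category"
 * )
 */
class ExampleCategory extends SqlBase {

  /**
   * {@inheritdoc}
   */
  public function query() {
    // The categories have no IDs of their own: collect the distinct
    // values of the "category" column of the article table.
    return $this->select('articles', 'a')
      ->fields('a', ['category'])
      ->distinct();
  }

  /**
   * {@inheritdoc}
   */
  public function fields() {
    return [
      'category' => $this->t('Category name'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function getIds() {
    // The category name itself acts as the unique identifier.
    return [
      'category' => ['type' => 'string'],
    ];
  }

}
```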

And we can now create the migration config entity file:

File: config/install/migrate.migration.example_category.yml
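A possible version, mapping each category name onto a term of the ‘example_content_category’ vocabulary (plugin IDs are assumptions):

```yaml
id: example_category
label: Example category migration
migration_group: example
source:
  plugin: example_category
  key: migrate
process:
  vid:
    plugin: default_value
    default_value: example_content_category
  name: category
destination:
  plugin: 'entity:taxonomy_term'
```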

This is done; to execute it:
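Assuming the migration id is example_category:

```shell
$ drush migrate-import example_category
```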

The Content migration

The source content has an HTML body, a raw excerpt, an image, attached files, categories, and creation/update dates in the format Y-m-d H:i:s.

We create the source plugin:

File: src/Plugin/migrate/source/ExampleContent.php
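A simplified sketch of the content source plugin (the table and column names are assumptions; the attached-files column is omitted for brevity):

```php
<?php

namespace Drupal\example_migrate\Plugin\migrate\source;

use Drupal\migrate\Plugin\migrate\source\SqlBase;

/**
 * Source plugin for the articles of the external database.
 *
 * @MigrateSource(
 *   id = "example_content"
 * )
 */
class ExampleContent extends SqlBase {

  /**
   * {@inheritdoc}
   */
  public function query() {
    return $this->select('articles', 'a')
      ->fields('a', [
        'id',
        'title',
        'body',
        'excerpt',
        'image_fid',
        'category',
        'created',
        'updated',
      ]);
  }

  /**
   * {@inheritdoc}
   */
  public function fields() {
    return [
      'id' => $this->t('Article ID'),
      'title' => $this->t('Title'),
      'body' => $this->t('HTML content'),
      'excerpt' => $this->t('Raw excerpt'),
      'image_fid' => $this->t('Source ID of the image file'),
      'category' => $this->t('Category name'),
      'created' => $this->t('Creation date (Y-m-d H:i:s)'),
      'updated' => $this->t('Last update date (Y-m-d H:i:s)'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function getIds() {
    return [
      'id' => ['type' => 'integer'],
    ];
  }

}
```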

Now we can create the content migration config entity file:

File: config/install/migrate.migration.example_content.yml
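A simplified sketch; the entity-reference lookup plugin is called migration here (as in early Drupal 8 / migrate_plus; it was later renamed migration_lookup in core), and the field and plugin names are assumptions:

```yaml
id: example_content
label: Example content migration
migration_group: example
source:
  plugin: example_content
  key: migrate
process:
  type:
    plugin: default_value
    default_value: article
  title: title
  'body/value': body
  'body/summary': excerpt
  field_image:
    plugin: migration
    migration: example_file
    source: image_fid
  field_category:
    plugin: migration
    migration: example_category
    source: category
  created:
    plugin: example_date
    source: created
  changed:
    plugin: example_date
    source: updated
destination:
  plugin: 'entity:node'
migration_dependencies:
  required:
    - example_file
    - example_category
```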

Finally, execute it:
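Assuming the migration id is example_content:

```shell
$ drush migrate-import example_content
```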

Group the migration

Thanks to migrate_plus, you can specify a migration group for your migrations.
You need to create a config entity for that:

File: config/install/migrate_plus.migration_group.example.yml
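A possible version of the group config entity (the label and description are assumptions):

```yaml
id: example
label: Example migrations
description: Migrations of the content from the external SQL database.
source_type: External SQL database
```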

Then, in each migration config YAML file, be sure to have the migration_group line next to the label:
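For example, at the top of the content migration file:

```yaml
id: example_content
label: Example content migration
migration_group: example
```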

You can then use a single command to run the migrations together; the order of execution will depend on the migration dependencies:
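With migrate_tools, the whole group runs with:

```shell
$ drush migrate-import --group=example
```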

I hope that you enjoyed our article.

Best regards,

Delta https://www.drupal.org/u/delta

May 11 2020

At Studio.gd we love the Drupal ecosystem, and it has become very important to us to give back and participate.
Today we're proud to announce a new module that we hope will help you!

The Inline Entity Display module will help you handle the display of referenced entity fields directly in the parent entity.
For example, if you reference a taxonomy term "Tags" from an Article node, you will be able to display the tags' fields directly in the article's manage display. It can become very useful with more complex referenced entities, like field collections for example.

SEE THE MODULE: https://www.drupal.org/project/inline_entity_display


Features:

- You can control, for each compatible reference field instance, whether the fields from the referenced entities should be available as extra fields. Disabled by default.

- You can manage the visibility of the referenced entities' fields on the manage display form. Hidden by default.

- View modes are added to represent this context and manage custom display settings for the referenced entities' fields in this context: {entity_type}_{view_mode}. Example: "Node: Teaser" is used to render referenced entities' fields when you reference an entity into a node and view that node as a teaser. If there are no custom settings for this view mode, fields are rendered using the default view mode settings.

- Extra data attributes are added on the default field markup, so the fields of the same entity can be identified.

Compatible with Field group on manage display form.

Compatible with Display Suite layouts on manage display form.


Requirements:

- Entity API
- One of the compatible reference field modules.


The simplytest.me install of this module comes automatically with these modules: entity_reference, field_collection, field_group, and Display Suite.

SEE THE MODULE: https://www.drupal.org/project/inline_entity_display

We are currently developing a similar module for Drupal 8, but more powerful and more flexible. Stay tuned!

May 11 2020

Join us on May 28th for an Amazee Labs Webinar about Test-Driven Development with Storybook and Cypress

Test-driven development is central to any agile development process, and navigating to a browser manually for testing is time that can be spent on more important things. Mastering the testing tools that accompany different languages and frameworks might seem daunting, but it’s one of the most rewarding and efficient things a developer can learn, and we’re here to help.

In our last webinar, we talked about applying Test-Driven Development with Cypress.io to Drupal modules and Gatsby Websites. 

Now we’ll take the next step on our journey to total confidence with testing by teaching you how to mix Storybook into your workflow in order to: 

  • Improve functional and visual test coverage for user interface components 

  • Speed up development 

  • Test without ever touching a browser

Join us on May 28th, 2020 at 04:00 PM -- Register online now! 

Want to review before you join? Check out the resources and full video recording of our last webinar on TDD with Gatsby.

Watch our previous Webinars:

