Oct 21 2019

The 2019 Drupal South sprint is shaping up to be the biggest contribution event in the Australia-Pacific region since Drupalcon Sydney 2013.

This year, core-contributors with over 3000 commit credits between them will be in attendance, including 3 members of the core committers team, 3 members of the Drupal security team, 7 core module/subsystem maintainers as well as maintainers of major contrib modules and command-line tools.

With Drupal 9 just around the corner, this will be a great chance to help the community get popular modules ready for Drupal 9, meet some great people and help shape the future of Drupal.

The PreviousNext team are sponsoring and helping to run the sprint day on Wednesday, November 27th 2019, and there are a few things you can do now to hit the ground running on the day.

by Lee Rowlands / 21 October 2019

What's a Sprint Day about anyway?

Contribution Sprints are a great opportunity to get involved in contributing to Drupal. Contributions don't have to be just about code. Issue triage, documentation, and manual testing are examples of non-code contributions.

If you are new to contributing, you can take a look at the New Contributor tasks on the Drupal.org Contributor Tasks page.

While there will be experienced contributors there on the day to help, keep in mind, this is not a training session. :-)

Set Up a Development Environment

There is more than one way to shear a sheep, and there is also more than one way to set up a local development environment for working on Drupal.

If you don't already have a local development environment, we recommend using Docker Compose for local development - follow the instructions for installing Docker Compose on OSX, Windows and Linux.

Once you've set up Docker Compose, you need to set up a folder containing your docker-compose.yml and a clone of Drupal core. The instructions for that vary depending on your operating system; we have instructions below for OSX, Windows and Linux, although please note the Windows version is untested.


OSX:

mkdir -p ~/dev/drupal
cd ~/dev/drupal
wget https://gist.githubusercontent.com/larowlan/9ba2c569fd52e8ac12aee962cc9319c9/raw/e69795e7219c9c73eb8d8d171c31277eeb5bcbaa/docker-compose.yml
git clone --branch 8.9.x https://git.drupalcode.org/project/drupal.git app
docker-compose up -d
docker-compose run -w /data/app app composer install


Windows (untested):

git clone --branch 8.9.x https://git.drupalcode.org/project/drupal.git app
docker-compose up -d
docker-compose run -w /data/app app composer install


Linux:

mkdir -p ~/dev/drupal # or wherever you want to put the folder
cd ~/dev/drupal
wget https://gist.githubusercontent.com/larowlan/63a0f6efacee71b483af3a2184178dd0/raw/248dff13557efa533c0ca297d39c87cd3eb348fe/docker-compose.yml
git clone --branch 8.9.x https://git.drupalcode.org/project/drupal.git app
docker-compose up -d
docker-compose exec app /bin/bash -c "cd /data/app && composer install"
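The contents of the gists aren't reproduced in this post, but to give an idea of what you're fetching: a minimal docker-compose.yml for this kind of setup might look like the sketch below. The image names and credentials are illustrative assumptions, not the actual contents of the gists - the only hard requirements implied by the commands above are a service named app with the folder mounted so the clone sits at /data/app.

```yaml
# Illustrative sketch only - the real file is in the gist linked above.
version: '3'
services:
  app:
    image: php:7.3-apache      # assumed image; the gist may use a different one
    volumes:
      - .:/data                # mounts the folder so the clone is at /data/app
    ports:
      - "8080:80"
  db:
    image: mariadb:10.3        # assumed database service
    environment:
      MYSQL_DATABASE: drupal
      MYSQL_USER: drupal
      MYSQL_PASSWORD: drupal
      MYSQL_ROOT_PASSWORD: root
```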

If you have any issues, join us on Drupal slack in the #australia-nz channel beforehand and we'll be happy to answer any questions you might have.

Install dreditor browser extension

Dreditor is a browser extension that makes it easier to review patches on Drupal.org. It's a must for anyone contributing to Drupal.

There are versions for Firefox and Chrome.

Find Issues to Work On

If you want to see what might be an interesting issue to work on, head over to the Drupal.org Issue Queue and look for issues tagged with 'DrupalSouth 2019'. These are issues that others have tagged.

You can also tag an issue yourself to be added to the list.

Being face-to-face with fellow contributors is a great opportunity to have discussions and put forward ideas. Don't feel like you need to come away from the day having completed lines and lines of code.

We look forward to seeing you all there!


DrupalSouth, Code Sprint
Oct 20 2019

There has been amazing progress with Drupal’s API-First Initiative in the past few years. A major milestone was when JSON:API was added to core as a stable module in Drupal 8.7.

An API-first Drupal enables many things, but probably the best known use case of APIs is a decoupled front-end. Numerous blog posts and at least one book have been written about decoupled Drupal. There have been many conference sessions on the topic, and there is an entire conference focused only on decoupled CMS architectures.

These exciting developments ensure that Drupal stays relevant, and that it is a first-class choice for building ambitious digital experiences in the modern web development landscape, where component-based JavaScript front-end libraries and frameworks like React and Vue are so popular.

A lot of the activity around decoupled Drupal originates from developers and other employees of Drupal agencies working on projects for large enterprises and organizations. These projects might require a content service not just for app-like websites, but also for other publishing channels like native mobile apps and digital kiosks.

However, from a practical requirements and resources point of view, the vast majority of website owners just need a traditional website. And for someone like me, who often works as the only developer on (non-ambitious?) traditional website projects, JavaScript-based decoupled front-ends are interesting, but not required.

There are other ways

When so much attention is focused on APIs and JavaScript, it is possible to miss or forget the fact that loose coupling and components are general architectural principles that can be implemented in many different ways. A different approach can be used to get many of the benefits of a decoupled, component-based front-end.

Component-based theming is a commonly used name for a specific way to develop and organize Twig templates and related assets in Drupal themes. I am not sure how well known it is that component-based theming is actually a way to decouple most of the front-end from Drupal.

The best known benefit of this decoupling approach is the possibility to develop most of the theme layer independently of Drupal, with the help of a tool like Pattern Lab. But while there are already some very interesting examples out there, I believe that the full potential of decoupled Twig components is yet to be realized. By this I mean Twig component libraries that can be used with minimal integration effort in both Drupal and WordPress, for example. I believe that agencies offering services for both platforms could benefit a lot from this.

For ‘small Drupal’ too

Since it is often associated with style guides and design systems, it is possible to get the impression that component-based theming is only useful for large teams working on projects for large clients. However, I know from my own experience that even single-developer projects can benefit from it.

Component-based theming adds very little overhead to regular Drupal theming. Thinking in components can improve the quality of your work in many ways, and most importantly, building anything from blocks is fun!

There is no requirement to develop a style guide – although you might find that you end up with the beginnings of one as a happy side result.

A decoupled front-end means that work on the theme can be started even before Drupal is installed, and clients can be shown the design in a real browser much sooner.

So unless you are working on a project that requires a JavaScript-based front-end, I can warmly recommend component-based theming to teams and projects of all sizes.

Decoupling the front-end without APIs and JavaScript


Oct 20 2019

Supercharge Drupal 8 migration process by externalizing data export and transformation.

Nuvole offers a training at DrupalCon Amsterdam: "Drupal 8 Migration as a process" - register until October 27.

When you have to import data into a Drupal 8 site, there is little choice but to rely on core's Migrate API and its contrib ecosystem. The Migrate API in Drupal 8 implements a rather classic Extract, Transform, Load (ETL) process, with the following "Drupal-lingo" twists:

  • The extract phase is called "source" and it uses a source plugin to read data from external systems, be it a Drupal 7 database, a CSV file, a REST web service, etc.
  • The transform phase is called "process" and it uses process plugins to process and transform data.
  • The load phase is called "destination" and it uses destination plugins to import data into specific Drupal 8 entity storages (e.g. nodes, taxonomy terms, etc.).

To recap, Drupal core implements the ETL process as follows:

  • Source plugins extract the data from the source.
  • Process plugins transform the data.
  • Destination plugins save the data as Drupal 8 entities.
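The recap above maps directly onto a migration definition. As a hedged sketch (the field names are illustrative, and the csv source plugin shown here comes from the contrib migrate_source_csv module, not core), a minimal migration YML wires the three phases together like this:

```yaml
# Illustrative migration definition - plugin options and field names are assumptions.
id: articles
label: Import articles
source:
  plugin: csv                # extract: a source plugin reads the raw data
  path: /tmp/articles.csv
  ids: [id]
process:
  title: name                # transform: process plugins map and massage values
  field_summary: summary
destination:
  plugin: 'entity:node'      # load: a destination plugin saves Drupal entities
  default_bundle: article
```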

Limitations of standard Drupal 8 ETL process

In the process described above, the three steps are executed sequentially, at the same moment in time, every time we run a migration import. In this scenario we lose one of the most valuable aspects of an ETL process: testing and validating data prior to import.

Also, the process above does not easily accommodate consolidating multiple data sources into one coherent dataset. With the Drupal Migrate API this is only possible by using chained source plugins, and it can only run on the destination site. This is quite a limitation in complex enterprise scenarios, where displayed data is often the result of complex and convoluted transformations in the backend.

A middle-format approach

At Nuvole we have adopted a so-called "middle-format approach". The process simply moves the Extract and Transform parts outside Drupal, so as to ease the production of an easy-to-import dataset in a well-known middle format (such as JSON).

This approach has proved very successful in complex scenarios and, while fully leveraging the standard Drupal 8 migration process, it also allows you to:

  1. Aggregate data from different sources (not only Drupal 7 databases)
  2. Test the data transformation process
  3. Test imported Drupal 8 data
  4. Easily automate the points above

The process looks like the following: data is extracted and transformed outside Drupal into the middle-format dataset, which is then imported into Drupal 8 using standard migrations.

The approach outlined above brings the following benefits:

  • Exported data can be reviewed by the client and iteratively refined
  • Since Drupal 8 is not a precondition to export and transform data, data export and site building can run in parallel, making the whole migration process much more efficient
  • Exported data can be presented to the different stakeholders using a user friendly UI, even before starting any Drupal 8 development
  • Since the data uses a well-known middle format, building the import process (as Drupal core Migrate plugins) is straightforward and maximizes code reusability
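As an illustrative sketch of the import side, assuming the contrib migrate_plus module's url source plugin and a hypothetical export URL: because the heavy lifting happened during export, the Drupal-side migration reduces to an almost one-to-one field mapping.

```yaml
# Illustrative only - the URL, selectors and fields are assumptions.
id: articles_from_export
label: Import articles from the middle-format export
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - https://example.com/export/articles.json
  item_selector: data
  ids:
    id:
      type: string
  fields:
    - name: id
      label: ID
      selector: id
    - name: title
      label: Title
      selector: title
process:
  title: title               # data is already transformed, so mapping is 1:1
destination:
  plugin: 'entity:node'
  default_bundle: article
```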

Middle-format migration at work

At Nuvole we have successfully used the middle-format approach to incrementally transform, review and import data over several years, on complex sites such as the World Food Programme main website. In that scenario we had to consolidate data coming from two Drupal 7 sites (plus a number of external data sources) in 13 different languages, over a two-year period.

We have recently open-sourced a simplified version of the tool we have used to export and transform data, you can find its boilerplate code here.

We will run a hands-on session on how to use this tool to plan and execute data exports in our upcoming DrupalCon Amsterdam training.

Oct 18 2019

“For me, forgiveness and compassion are always linked: how do we hold people accountable for wrongdoing and yet at the same time remain in touch with their humanity enough to believe in their capacity to be transformed?”

–bell hooks

One of the questions that we hear regularly at the Drupal Community Working Group (CWG) is how we handle people who have been banned from various spaces in the Drupal community for violations of our community Code of Conduct.

As our conflict resolution policy and process states, one of the potential actions that the CWG can take is to impose permanent or temporary bans from some or all Drupal community spaces, which may be both physical (DrupalCamps and other meetups) and virtual (community Slack channels, Drupal.org, etc.).

While we work to avoid bans whenever possible, sometimes they are unavoidable as an individual's behavior can have impacts that go far beyond just those individuals who may be directly involved. While the CWG’s original charter focused on interpersonal conflicts, the updated version we adopted at the end of last year makes it clear that the group’s primary responsibility is for the overall health of the community, which means that we need to consider the impact of a person’s ongoing participation on others in the community, not just on those who may have raised concerns.

In those cases where an individual is banned, we keep records in case they need to be reviewed in the future or additional action is required. However, we do not maintain a comprehensive “master list” of banned individuals.

If and when the CWG needs to issue an indefinite ban (or when an individual chooses to leave the community on their own, rather than working with us), we always let the individual involved know that they can reach out to us at any time. We generally try to avoid using the term “permanent ban”, as we recognize that people can transform and change over time; however, we are clear that indefinite bans remain in effect until such time as we agree to lift them, and that any attempts to circumvent bans may result in additional action.

On those occasions when a banned individual reaches out to us expressing interest in rejoining the community, we follow an established process that begins with examining the situation very carefully before agreeing to proceed. This involves reviewing the individual’s past history and the interactions they have had with other community members since their ban, as well as reaching out to those who have filed past reports and/or others who we are aware may have been impacted by this person’s past words and actions to get their feedback. If we are aware that the individual has already taken action in both public and private settings to address their past behavior before coming to us, that is a very positive sign.

To be clear, however, what we are looking for is for the individual to not just show remorse or regret for their actions, but also demonstrate a commitment to address and repair the damage their past behavior has caused. They need to be willing to take responsibility for their words and actions and the impact they have on others. In addition to demonstrating the ability to hold themselves accountable, they must also agree to allow others in the community to hold them accountable. If the CWG is not confident that these conditions can be met, we will not proceed further, instead recommending that the individual engage in additional reflection before coming back to us.

If and when we agree that the individual is ready to move forward with reintegration into the community, the next step is to collaboratively develop an action plan with clear goals and milestones. This plan must not only satisfy the questions and concerns that have been raised by others, but also include safeguards to address any potential for relapses in negative behavior, such as regular check-ins with the CWG and/or others and limitations in access to various Drupal community spaces. As the individual continues to demonstrate consistently positive behavior and is able to re-establish trust with more members of the community, they may gain additional access and privileges over time. However, if they engage in behavior that violates the Code of Conduct at any point in the process, they remain subject to immediate reinstatement of previous bans and/or other actions.

This process is difficult, and it’s not for everyone. We understand that, and we don’t blame anyone who chooses to walk away. For those who are willing to be open and vulnerable, face their mistakes head on, and learn from them, we’re there every step of the way. As our values and principles state, “The expectation of leaders is not that they are perfect or have years of experience. The expectation of leaders is that they learn from their mistakes, rise to the challenge, support others ahead of their own needs or ego, and continuously work to improve themselves.”

Oct 18 2019

The Promote Drupal team invites you to take part in a 15-minute 1:1 interview to help grow our understanding of your needs during the evaluation process.
We have made great strides with the Promote Drupal initiative to provide product and project information resources that highlight the value of Drupal in helping enterprises and agencies leverage an open source CMS platform for their businesses. We want to learn from you in order to continue to deliver a project that offers significant value and meets organizations’ needs.

Calling all members of the Drupal Community

The Promote Drupal team also invites you to take part in a 15-minute 1:1 interview to help grow our understanding of your marketing needs. From agencies to developers, we value your insight as we work to advance the Promote Drupal initiative.

Come with your ideas, thoughts on the initiative and where you think our focus should be for the next 12 months!

Oct 18 2019

This is a beginner’s tutorial that will quickly get you up and running with media entities and a media entity browser in Drupal 8.

One of the best parts of Drupal 8 is the in-core feature of media entities. This allows Drupal to manage resources that it otherwise could not, by creating a bridge relationship between Drupal’s CMS, media files and external sources. A media entity is basically an entity referencing all kinds of media, such as images, videos and documents, or external sources like tweets and embedded videos.

This tutorial is for the core “media” module setup, not to be confused with the contributed “media_entity” module. If your Drupal setup already has media_entity and its sub-modules enabled, you may want to migrate your setup to the core “media” module following the instructions on the project page at https://www.drupal.org/project/media_entity

First Things First, Enable the Media Module:

Enable the media module (NOT media_entity). The “media” module comes with Drupal core so you don’t need to install it. All you have to do is enable it.

Using the Drush command:

drush en media -y

- Or: In the Drupal’s interface, go to /admin/modules, search for media in the search box, check the Media module and install to enable the module.

Configure or Create Media types:

Once the Media module is enabled, it will create the basic media types. You can find these under the Structure menu. Media types are entity bundles, just like content types, allowing you to add custom fields and adjust or create new displays to fit your needs.

To keep things simple in this tutorial, I will concentrate on the “Image” media type, but you can apply the same principles to any other type. The difference will be in how the sources are added and displayed, just like with node types.

Display the ”name” field (optional) :

It’s my personal preference to always enable the “Name” field in the forms of media types. This way you will be forced to add a name to your media type that you can later use for searching in the media browser.
(For even more searching options, you can add taxonomy tag field to group resources and so on. But for this example, let’s keep it simple.)

For this example navigate to /admin/structure/media/manage/image/form-display

Now drag the “Name” field into the enabled area, and drag to hide or disable all other fields with the exception of the “Image” field, so all you have displayed on the form is the image and name fields. This will come in handy when creating the media browser view in the next step.

Installing the Entity Browser Modules and Creating Media Browsers:

The media browser is what you will use to pick and add new images to the content where the media entity reference field is being used. It basically adds a widget for your field. The media entities can also be managed via the content page in the “Media” tab.

Install these Modules: Chaos Tools, Entity Browser:

composer require drupal/ctools
composer require drupal/entity_browser
composer require drupal/inline_entity_form

Then enable Chaos Tools, Entity Browser, and Entity Browser IEF.

drush en ctools entity_browser inline_entity_form entity_browser_entity_form -y

- Or: In the Drupal’s interface, go to /admin/modules  Ctools, Entity Browser, and Entity Browser IEF (you may have to enable the entity browser first).

Create a View to Use in the Media Browser.

We create this view first, in order to use it in the media browser we will create later, to list the available images in the media browser.

1. Go to /admin/structure/views/add:

  • Name your new view “Image Media Entity Browser Listing” 
  • View Settings: Show: Media of type: Image sorted by: Newest first
  • Click on Save and Edit

2. Now Configure the view:

  • Top left under “Displays” click on the “+Add” button and select “Entity browser”
  • FORMAT: pick Table
  • Add FIELDS:
    • Name
    • Thumbnail - options:
      • Label: Thumbnail
      • Formatter: Image
      • Image Style: Thumbnail
    • Entity browser bulk select form - options:
      • Label: Select
  • Rearrange the fields in this order: Entity Select, Thumbnail, Name
  • Add FILTER CRITERIA:
    • Name (media) - options:
      • Check “Expose this filter to visitors”
      • Label: Search by name
      • Operator: Contains
  • Save the view.

Create the Entity Browser for Image media:

Go to /admin/config/content/entity_browser and click on the “+Add Entity Browser” button.

  • General Setting tab:
    • Label: Image Media Entity Browser
  • Widget Settings tab:
    • Add widget plugin:
      • View - options:
        • Label: Image Library
        • Submit button text: Select
        • View, view display: Media Browser: Image Media Entity Browser Listing
      • Entity Form - options:
        • Label: Image Upload
        • Submit button text: Upload
        • Entity type: Media
        • Bundle: Image
        • Form mode: Default

Create a Media Entity Reference Field to Use the Image Media Browser

Congrats! You have already set up the media entities and the media browser. Now add a field of type “Media” entity reference to your content type, and on its form display set the widget to “Entity Browser”, choosing the “Image Media Entity Browser” you created. Then you can start using it on your content builds.

Now, whenever you’re creating content, you will have a field that opens the media browser on a modal window with options to search, select, and upload new images.

That’s it! - Now all you have to do is repeat the process, configure things to your preferences and soon you will be a pro on media entities. Next, I’ll add a few other recommendations if you want to improve upon your setup.

Recommended modules that can add more power to your entity setups:

  • embed
  • video_embed_field
  • entity_embed
  • dropzonejs

Migrating Existing File Fields:

If you are working on an existing Drupal 8 setup and you want to migrate old file fields to a media entity field, follow this tutorial for instructions: Basic Migration Of File Fields To Media Entities.

Happy coding!

Oct 18 2019

The Migrate File to Media module provides an easy way to migrate old file fields like images, files, videos, etc., into media entities with a few Drush commands.

So you can gain an understanding of how the migration process works, in this tutorial we will run through a few quick step-by-step instructions on how to migrate image fields to media entity reference fields.

Media Module

The core media module is what creates the “media” entity types. If you haven’t set this up yet, or if you are not too familiar with media entity setup, I highly recommend following this tutorial “Drupal 8 Basic Media And Media Browser Setup For Beginners”, before you continue with this tutorial. 

If you still want to continue with a basic setup of media entities, simply enable/install the core media module that already comes with Drupal 8 core (NOT the media_entity module). Upon installation, it will automatically create a few media entity bundles for you, including the image media bundle we will be using for this tutorial.

Migrate File to Media module

Now for the migration, install the Migrate File to Media module (only compatible with Drush 9 and up):

composer require drupal/migrate_file_to_media

Enable the “migrate_file_to_media” module:

drush en migrate_file_to_media

It will warn you that it will enable the following modules if they’re not enabled yet: “migrate”, “migrate_drupal”, “migrate_plus” and “migrate_tools”.

Once we have all of this in place, we can start using some simple commands to do the work for us. First, however, we need to create some mapping so that the migration module knows which source fields map to which destination fields.


We can’t migrate a source without a road map and destination. In the example of this tutorial, we will be performing a migration of the field “field_image” (image field type), in the content bundle “article” (node entity type) into a new media entity reference field that we will be creating using drush commands.

When migrating a site with a vast amount of content, I recommend becoming familiar with the site’s content structure, documenting where all the images are located and in which content type bundles.

Generate the destination media field in the article node types using drush:

drush migrate:file-media-fields node article image image

Command breakdown:

drush migrate:file-media-fields [entity_type] [bundle] [source_field_type] [destination_media_bundle]

  • [entity_type]: (The parent entity type) In the case of our example, it is a node content type. But other bundle types, like paragraphs or taxonomies for example, can be used.
  • [bundle]: (The parent bundle type) In this example, the “article”
  • [source_field_type]: (The source field type) This will be the source field type. Not to be confused with the name of the field, but the type of field. When running the command, this will check for all the fields of type “image” as in this tutorial.
  • [destination_media_bundle]: (The destination media bundle) This will be the destination field’s bundle type. The command will create a media entity reference field targeting the “image” media bundle for every image field found on the parent node bundle. It will also give it the same name with the suffix “_media”, as in this tutorial.


Ok, so we created the destination. Now let’s create the roadmap so that the migration knows where and how to migrate data from the old to new by generating some YML files.

Before generating the YML files, we need to generate/create a custom module where the new files will reside: (Just generate the module, don’t enable it yet.)

Basic module generate command (Drupal Console), run in your web root:

drupal generate:module
# Note: follow the instructions on the screen and give it the machine name “basic_migration_example”

- Or: Module generate command with all options for this tutorial, run on your web-root:

drupal generate:module --module="Basic Migration Example 101" --machine-name="basic_migration_example" --module-path="modules/custom" --description="Basic Migration Tutorial 101" --core="8.x" --package="Custom" --features-bundle="no" --test="yes" --dependencies="no" --twigtemplate="no" --composer --module-file

Once the module is generated, clear the cache:

drush cr

Now we are ready to start generating the YML files, run:

drush generate yml-migrate_file_to_media_migration_media

You will get something like this. For this tutorial please follow the instructions as follows:

Welcome to yml-migrate_file_to_media_migration_media generator!


 Module machine name:
 ➤ basic_migration_example

 Module name:
 ➤ Basic Migration Example 101

 Plugin label [Example]:
 ➤ article_media_basic_migration

 Plugin ID [example]:
 ➤ article_media_basic_migration

 Migration Group [media]:
 ➤ ( Press Enter Button to use suggested "media" )

 Entity Type [node]:
 ➤ ( Press Enter Button to use suggested "node")

 Source Bundle:
 ➤ article

 Source Field Names (comma separated) [field_image]:
 ➤ ( Press Enter Button to use suggested "field_image" )

 Target Media Type [image]:
 ➤ ( Press Enter Button to use suggested "image" )

 Target Field [field_media_image]:
 ➤ ( Press Enter Button to use suggested "field_media_image" )
# Be sure to use the name of the field inside the Media Entity for the Target Field, and not the name of the media field on the node. 
 Language Code [en]:
 ➤ ( Press Enter Button to use suggested "en" )

 Translation languages (comma separated) [none]:
 ➤ ( Press Enter Button "none" )

The new files will be created in your new module folder /config/install.

There will be two YML files generated per migration ID, these file names will be suffixed with _step1 and _step2. You can open these files and adjust them as needed, but for this tutorial, the configuration we gave it on the YML generation process is just what we need.

If you want to look at YML examples, you will find some under the migrate_file_to_media module folder: modules/contrib/migrate_file_to_media/migrate_file_to_media_example/

Once you install/enable the new module we just created for this migration, all the YML configurations will be loaded to your Drupal 8 database setup.

Editing Migration Configurations after installation:

Once loaded into the system, if you need to make modifications, you need to get familiarized with the configuration management interface and how it all works to make changes. Here is a good read on that: https://www.ostraining.com/blog/drupal/config/

Install a New Module and YML files:

Ok, so now that the YML files are created, let’s enable/install the new module, and the new YML configuration files will be added to the system as well.

Enable new custom module:

drush en basic_migration_example

And now, check the migration status by running: (This will also give you the generated migration ids)

drush migrate:status

You will get a result of something like this:

Group   Migration ID                          Status  Total  Imported  Unprocessed
media   article_media_basic_migration_step1   Idle    3      0         3
media   article_media_basic_migration_step2   Idle    3      0         3

Duplicate File Detection (!)

– This is an important step, DO NOT SKIP – In order to run migrations, first you need to run a file duplication check for all migrations.

drush migrate:duplicate-file-detection [migration-id]

For our tutorial, run the file duplication check on “_step1” of the migration. You will have to do this for every first step of your migrations, one by one, as I have not found a way to run them all with a single command.

For this tutorial, run:

drush migrate:duplicate-file-detection article_media_basic_migration_step1

Check for existing media (optional)

If you already have images or files loaded in the media entities, run this command to check for media duplicates:

drush migrate:duplicate-media-detection image --all
# Run: drush migrate:duplicate-media-detection --help for extra options

Migration time:

Now, if all is set up correctly, we can run a migration:

Migrate all migrations of the media group by group ID:

drush migrate:import --group=media 

- Or: Run single migration by ids:

drush migrate:import article_media_basic_migration_step1 
drush migrate:import article_media_basic_migration_step2 

If all goes well, you should see that Imported matches Total and Unprocessed is 0. Run:

drush migrate:status

Results should look like this:

Group   Migration ID                          Status  Total  Imported  Unprocessed
media   article_media_basic_migration_step1   Idle    3      3         0
media   article_media_basic_migration_step2   Idle    3      3         0

Take a look at the content of your articles. You will now see that the new fields have been populated with the right image media entity references. Just adjust your displays to show the new media field, and hide or remove the old image field altogether when your migration is complete.

A few more drush migration commands that may come in useful:

  • drush migrate --help
  • drush migrate:rollback
  • drush migrate:stop
  • drush migrate:reset-status
  • drush migrate:import [OPTIONS]:
    • --feedback - Frequency of progress messages, in seconds or items processed (great for debugging)
    • --update - In addition to processing unimported items from the source, update previously-imported items with new data
    • --group - Name of the migration group to run
    • --idlist - A comma-delimited list of IDs to import or roll back. If unspecified, migrate imports all pending items or rolls back all items for the content set.
    • --limit - Limit on the length of each migration process, expressed in seconds or number of items
    • --rollback - Roll back the specified migration(s) if applicable
    • --stop - Stop the specified migration(s) if applicable
    • Less commonly used options:
      • --file_function - Override the file function to use when migrating images
      • --force - Force an operation to run, even if all dependencies are not satisfied
      • --needs-update - Reimport up to 10K records where needs_update=1. This option is only needed when your Drupal DB is on a different DB server from your source data. Otherwise, these records get migrated with a plain migrate:import.
      • --instrument - Capture performance information (timer, memory, or all)
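Putting a few of these options together, a targeted re-import might look like the sketch below. The stub `drush` shell function only echoes the command so the snippet can run anywhere; on a real site, drop the stub so the actual drush binary is called. The item IDs are illustrative.

```shell
# Stub so this sketch runs outside a Drupal site; remove it on a real site.
drush() { echo "would run: drush $*"; }

# Re-import two specific source items, refreshing already-imported data,
# with a progress message every 10 items:
drush migrate:import article_media_basic_migration_step1 --idlist=1,2 --update --feedback=10

# Roll the same migration back if the result looks wrong:
drush migrate:rollback article_media_basic_migration_step1

# Clear a migration stuck in the "Importing" state:
drush migrate:reset-status article_media_basic_migration_step1
```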

For further information on Drush Migrate Tools commands, visit: https://www.drupal.org/node/1561820

Happy coding!

Oct 18 2019
Oct 18

In this digitally altered landscape, where every millisecond is crucial for marketers delivering customer responsiveness and enhanced productivity, enterprises should start leveraging marketing automation in earnest, and in a full-fledged manner, to take their game to the next level.



After all, marketing automation is an efficient marketing strategy that combines your content with your customers’ demographic information to help you convert potential leads at the most opportune time.

The digital integration of marketing automation tools will make a huge difference to marketing communication in the times to come. These software tools and technologies can be incorporated seamlessly into a Drupal-based website through custom modules.

Now, let’s take a deeper look at marketing automation, its benefits, and the Drupal modules available to make the work of marketing teams easier:

Marketing automation facilitates organizations to consolidate, streamline, and automate tasks with the help of modules and tools that were earlier bulky, repetitive, and consumed a large chunk of time, such as emails, social media, analyzing the audience, and pushing them the right content at the right time.


This way, marketing teams can view a centralized dashboard and also tweak their strategies to enhance the overall ROI of the organization. These time and cost-saving effects keep increasing along with the growth of organizations in size and complexity.

The benefits of implementing marketing automation tools are manifold, for the business overall as well as for marketing teams. Below is a list of them:

1. Reduce your staffing costs

Marketing automation tools let your team set up lead nurturing and marketing campaigns that can be automatically triggered based on certain defined criteria.

Within a few months of setting up automated campaigns, your business can easily send thousands of personalized emails each day on autopilot.

2. Grow revenue

Once your company starts automating cross-sells, up-sells, and customer follow-ups, you can observe an increase in customer lifetime value. Combined with better lead management and prioritization, this will likely increase your sales activity too, boosting your ROI.

3. Improve accountability of sales and marketing teams

Marketing automation ensures that obstacles are identified in good time, thanks to its tangible processes and bird’s-eye-view reporting of the company’s projects.

If the marketing team is unable to convert the identified leads into “sales-qualified leads”, it will get instant and impartial feedback for improving its nurturing campaigns.

This feedback mechanism not only reduces heated arguments but also lets staff members take ownership of their part and perform it more diligently.

4. Less repetition and more focus on creativity


When manual work is replaced with automated rules and campaigns, enterprises naturally free up their team members’ time to emphasize more creative tasks.

This improves team members’ productivity and efficiency, and lets them enjoy creative work every day instead of performing mundane, repetitive tasks.

5. Refine your marketing processes

Enticing customers in this digital era is no longer a cakewalk. If you want to engage customers with your services, visualize their entire journey and build marketing strategies that gradually refine your lead targeting and nurturing process.

Mapping this journey will show you where leads are falling off during the nurturing process so that you can refine those areas accordingly.

6. Target potential customers across various channels


Personalization has become the go-to way to show customers that companies take an interest in their likes and dislikes. Marketing automation lets enterprises reach customers in a personalized way across different online and offline channels.

The services these tools offer range from multi-channel targeting, email, social media, and phone calls to text messages and postcard targeting.

7. Schedule posts and campaigns ahead of time

Marketing automation tools give an edge to companies where they can easily schedule the different posts for different segments of their audience, based on the evaluation and insights gained on audience type.

Marketing teams can add variations in the content to send it to different sections of their audience, making it more personalized as per customers’ needs and interests.

8. Get a reality check: what’s fruitful and what’s not


Marketing automation tools can help you figure out which campaigns worked, and for which customer segments, based on the data obtained through your CRM. This level of detail enables you to create in-depth statistical reports and shore up weak links for better results.

Here is a list of the best Drupal marketing automation modules that you can incorporate into your website to reap these benefits:

1. Marketo MA

The Marketo MA module can help you incorporate the tracking capabilities of Marketo along with the ability to collect lead data during user registration and form submission.

Features include-

  1. Adds Munchkin tracking code to your web pages
  2. Collects lead data using Marketo’s Munchkin JavaScript or API integrations
  3. Lets you determine which user actions trigger lead capture
  4. Lets you stipulate how user profile fields should map to Marketo fields

2. Marketo

Marketo is one of the prominent marketing automation modules in Drupal that offers consolidation of various Drupal components and the Marketo email marketing system.

It provides a framework by which Munchkin javascript can be embedded into your pages and an API as well for linking the tracking cookie and lead information. 

3. Eloqua

Eloqua is an automated marketing tool that streamlines your end-to-end sales processes, such as demand generation and revenue performance management.

It further boosts the sales process by capturing a substantial amount of quality data about your customers’ sales leads and re-posting form submissions to the platform, provided you have an Eloqua subscription to begin with.

4. Pardot

Well known for its marketing and customer-relationship management tooling, Pardot uses Pardot Analytics to collect details about your potential and current customers. For example, it can track whether a potential client was engaged or discouraged by a particular price. Enterprises can also upload links and files to analyze whether particular pages are accessed, and build a list of links that customers can use to find other products that might interest them.

Pardot offers a top-notch path-based tracking system which can be leveraged throughout a Drupal-powered website.


5. HubSpot

The HubSpot module integrates with HubSpot API and Webform to submit Webforms directly to HubSpot’s lead management system. HubSpot’s JavaScript tracking code can be directly embedded into your Drupal website.

For example, a Webform-based contact form on the website can submit its data to HubSpot, where marketing teams may already track potential clients and contacts; or a Webform-based e-newsletter signup could transmit the lead to HubSpot’s targeted marketing system, allowing you to reuse your existing email campaigns.

6. Poptin

With Poptin, you can create engaging popups, opt-ins, and forms in no time.

Poptin popup plugin tracks the behavior of website visitors and accordingly shows them the right content at the right time.

This, as a result, increases leads, gets customers to subscribe to a newsletter, increases their engagement, and retains visitors who are just about to leave the website, using exit-intent technology and many other triggers.

7. Mailchimp

This module provides collaboration with MailChimp, a well-known email delivery service. The module provides the following features-

  1. Allows website users or visitors to choose the email lists they want to opt in and out of
  2. Lets marketing teams generate and send MailChimp email campaigns from the site
  3. Lets both the marketing team and users view a history of the emails sent from MailChimp
  4. Ensures that the email delivery service is efficient, simple and precise

8. Personalize

The Personalize module comes with an array of extendable plug-ins and APIs to tweak Drupal content and give it a personalized touch. Two fundamentals of this module are:

  1. Personalizing content should be as easy for anonymous users as it is for authenticated users
  2. Personalization should continue to work even when pages are fully cached (including in Varnish and CDNs)

9. Loopfuse Integration

Whether you are running an SMB or an MNC, you can integrate this module on your Drupal website with LoopFuse OneView to automate marketing processes. It enables enterprises to automate web activity tracking, lead qualification, and lead scoring.


10. Silver Pop Engage

The Silverpop Engage module exposes the capabilities of this sophisticated marketing automation product by integrating its Web Tracking API and XML API, allowing a user to be tracked through various flows and levels of processes.

An anonymous user is allocated a distinctive cookie value, so whenever they trigger any custom events, these get stored in Silverpop’s Engage database.

With the Engage Marketing Automation product, you can:

  1. Elevate the number of leads entering your pipeline and nurture them until they are sales-ready
  2. Easily create multi-track drip campaigns driven by leads’ behavior
  3. Implement relevant communications and follow-ups consistently to keep your potential customers engaged
  4. Use several scoring models to score leads on specified behavior and demographics
  5. Establish a substantial marketing ROI by measuring the influence of campaigns

11. Wildfire Email Marketing Platform

Wildfire is an email marketing system that integrates completely into your website, allowing the marketing team to send intuitive bulk emails to subscribers in just a handful of clicks.

Any normal Drupal content can be put straight into an email hassle-free. The module also offers mail templates, list management, content management, and job-tracking tools right inside your Drupal website.

The marketing team just has to choose the stories they want to include in their mail, and Wildfire takes care of the rest. However, a prerequisite for using this module is an account with Wildfire HQ to perform a mail-out.

12. Automatr Marketing Automation

This is currently the first and only marketing automation tool built specifically for Drupal and other open-source platforms. The module installs the basic integration code for Automatr on your Drupal website.

Below are the features it offers-

  1. Employs cookie and IP information to track every visitor on the website, registering each page visit and download
  2. Keeps a record of form submissions and ties them to the respective visitors so that you can analyze their actions
  3. Showcases consolidated visitor history
  4. Provides powerful reporting
  5. Deploys SendGrid (a highly efficient email distribution cloud) on the back end, included in the cost

13. Salesforce suite

This suite of modules supports integration with Salesforce by mapping Drupal entities (such as users, nodes, and files) to Salesforce objects (such as contacts, organizations, and opportunities), enabling far more than simply pushing or pulling information between Drupal and Salesforce.

Changes can be made in real time, or independently during scheduled runs.


To sum it up, marketing automation combined with Drupal is the way ahead for businesses to design and deliver excellent experiences. It will help enterprises evaluate the level of engagement their site provides to potential and existing customers.

As consumers become more demanding and data growth explodes, progressive companies will look to emerging technologies to lead in an era of intelligent, automated customer experience.

However, the success of such content management and marketing automation solutions will depend on the ease of adoption and the ability to scale across millions of customers and locations, while ensuring that errors and bugs are fixed on a timely basis.

Oct 18 2019
Oct 18

This post is going to be somewhat unusual, as it’s being written by four people. It’s going to be on a single topic (kind of): group functionality under a single code base and database.

It’s notably about Organic groups for Drupal 8, but not only. In fact, I think that at this point in time, my story about it is just one out of three. Pieter Frenssen and Maarten Segers, the co-maintainers, have their own voices.

But to make this one even more complete (and interesting), we should also hear Kristiaan Van den Eynde’s voice, the author of Group, a competing/complementing module.

Coincidentally, apart from me, all those people heavily invested in group functionality are Flemish. So, if you ever wondered how the story of “Three Flemish and an Israeli walk into a bar” goes… here’s your chance.

Amitai Burstein

Ten years ago, I did my very first DrupalCon presentation about how the UN approached Gizra, my company, to ask us to build them a Drupal site, powered by OG to serve all UN country members.

The presentation video is of poor quality, which is in line with my poor acting and low talent for imitations. I watched it just recently and while I did occasionally laugh at my own past-me jokes, I was cringing at the rest. I sure had a lot of nerve at that time. I remember that I came to it with a “Make it or break it” attitude.

I think it succeeded, in the sense that overnight, I – along with Gizra – became more known in the community. It surely helped land a few big contracts. But still, in one of those “I didn’t see this coming” cases reality is so good at, in 2019 Gizra really is building that UN site, and it really is powered by OG8. So yeah, now in “Truth or Lie” games I can say “I’ve built a site for Kim Jong-un” and get away with it.

Here are some of my thoughts about Drupal, OG8 and being a maintainer of a popular module:

1) My concept of group functionality has not changed much in the past ten years. Framing the problem is quite simple: have permissions and membership on the group level, similar to how one has it on the site as a whole. The implementation, or rather the flexible implementation is where it gets complex.

I’d argue that one can build very hard-coded group functionality in a short time, in the same way that I’m constantly tempted to think I could build an e-commerce site with hard-coded, limited functionality in a short time. Maybe I could. But nowadays I know it often ends badly. OG’s value (or for that matter Group’s as well) isn’t just the code itself. That could change from one version to another. The biggest value is what I like to call a “codified knowledge base.”

Almost everything I and the rest of the co-maintainers know is somewhere in the code. It’s not just about having the right IF and ELSE. It’s making sure all those pieces of knowledge and lessons learned are dumped from our brains into a more Git-friendly form.

2) Drupal – the CMS is quite amazing. It’s surely one of the best CMS out there. But like every big project it has many moving parts.

I believe that my biggest concern has always been trying to avoid breaking things once they are released. Automatic testing is a big part of it, and having quite extensive test coverage is what allows us to code and refactor. Over the years, I have grown more and more fond of, and familiar with, statically-typed languages and friendly compilers. As a result, I’ve become more and more afraid of doing those refactors in PHP.

Since I’m not a purist, and rewriting Drupal to use [enter-your-favorite-statically-typed-language] is not an option, I’ve also learned to overcome these fears. Write code; Write tests; Cross fingers; Rinse and Repeat.

3) Drupal – the community is quite amazing. It’s surely one of the best communities out there. No buts.

While I have been less involved with the actual coding of OG8 in the past couple of years, I keep reading almost every single PR. This is such a great learning experience for me: from going over Pieter’s well-thought-out, well-implemented, and thoroughly tested PRs, to learning Maarten’s techniques for code review – manual and automated alike.

It’s also interesting to see what I’ve felt for some time now, being put into numbers in Dries’s post Who sponsors Drupal development? :

Most contributions are sponsored, but volunteer contributions remains very important to Drupal’s success.

OG8, unlike the days of OG7, isn’t being built by someone(s) spending their evenings on writing features and fixing bugs of community “strangers”. It’s done by senior developers working for organizations that sponsor their work (and of course, some of their spare time, which I value even more!)

This means that currently OG8 is a snapshot of different needs, and almost nothing more. As crazy as it may sound, the UI for setting permissions for OG roles isn’t there yet. It’s surely needed by others, but not enough for a big organization to sponsor it. I suspect this means that while in the past OG was adopted by site builders, nowadays it is more oriented towards developers. I’d like it to eventually allow site builders to set it up – but evidently I don’t want it enough to do it in my spare time.

4) I’ve been with OG for a long time – since Drupal 4.7. First as a site builder, then slowly learning how to code and submitting patches, then becoming the maintainer, and then letting others join the ride.

I love the fact that I was able to evolve inside this little island, and it’s still an interesting journey. Before Drupal 8 was released, for a year or two I felt a bit bad for not having it in me to give OG8 more love. I noticed how the Group module was gaining momentum, while OG wasn’t advancing as fast as I was used to. It was probably one of the very few times that contribution (and the lack of it) to open source caused me some discomfort. I wasn’t burned out or too stressed; thinking about OG just didn’t give me as much pleasure as it once did.

But during those two years I also made some more significant changes in my life/work style: starting to work from home (which I love); working fewer hours (which I adore); and training 5-6 times a week (which I can’t stop talking about).

I think that every maintainer of a popular module reaches this crossroads at a certain point, asking themselves how to continue. Finding the spot I currently occupy – which is more of “let’s make sure the ship is heading the right way, and not falling into pitfalls I have known for so long” – is, I believe, valuable for the project, but also for me as a person.

Pieter Frenssen

If I were to make a list of all the skills I have learned in a decade of working with Drupal, then “contributing to open source” would be right at the top. I have been an enthusiastic user of open source software for many years, but I thought it would be impossible for a self-taught programmer like me to make any meaningful contributions to the software packages I was using every day. Among a bunch of other software I used Drupal to build my websites, but – apart from reporting the occasional bug in the issue queue – I did not make any contributions. This changed at my first Drupal conference.

Like many people in our field, I originally worked in a different industry. I am an audio engineer by trade, and I spent the first 15 years of my career in recording studios and on concert stages. Programming was a fun hobby and a great way to pass the long waiting periods in between concerts. Over time, I got more and more requests to create websites, and at some point I realized it was time to take my hobby more seriously. I needed to get in touch with other programmers, and a conference seemed like a great way to do this.

Some unit tests are done by grabbing a microphone and shouting “Test 1, 2, 3!”

Drupalcon Copenhagen was the first time I attended a tech conference and it was an eye opening experience. Here I met the people whose names I saw in the issue queue, and not only were they giving presentations to share their knowledge, but also actively working on building the software. And they were very happy to include newcomers. I was suddenly taking part in discussions and was invited to work on fixing some bugs in the upcoming Drupal 7. This was very interesting and tons of fun.

Back home I started to regularly contribute small fixes for bugs that I stumbled upon. It was in this period that I made my first contributions to Organic Groups: I fixed some small bugs like PHP notices and mistakes in the help texts. But apart from these my involvement with OG in the Drupal 7 era was as an end user. I made several projects with it, and interestingly enough one of those projects was done in collaboration with Kristiaan Van den Eynde who later went on to create the Group module based on his vision of how groups were supposed to work.

When Drupal 8 came around, my team was evaluating the modules we needed to have for porting a large Drupal 6 based project to Drupal 8. There were a few modules that already had a stable Drupal 8 version, but most were either in alpha or beta, or did not even have a release yet. One of those modules was Organic Groups. At the time it was not more than a proof of concept and most of the functionality that was used in our D6 project was not there yet. We decided to go ahead with it, and implement any functionality that we needed from OG in scope of our project. There were a few factors that made this not as crazy as it might seem:

There was a large time frame involved. The project is large and very complex, having been in continuous development for 8 years. It was estimated that the D8 port – including a completely new backend architecture – would take around 18 months of time. Any work that we would put towards improving Organic Groups on Drupal 8 would benefit our organisation in the future. We have a large number of websites using Organic Groups on Drupal 7 which will need to be migrated to D8 at some point in the future. If we were only considering a single site it would have been more economical to migrate to Group instead.

Organic Groups is one of the most complex modules in the Drupal ecosystem, and has evolved over a period of 15 years, being first conceived in 2004. It implements many edge cases that came out of real life use of the module in tens of thousands of different websites. And now it needed to be ported to Drupal 8. How do you approach a task like this? Our project manager did not like the idea of having a Jira ticket called “Port OG to D8”, estimated to 500 hours, and locking away a developer for a couple of months. So we decided on the following:

  • We would contribute work in small, well scoped PRs. Small PRs are easier to review and accept by Amitai and his team of co-maintainers. We considered that this would reduce the risk that our sprint goals would be affected by a PR not being accepted in time. There was always a direct link between a PR in the Organic Groups issue queue and a ticket on our Jira board regarding some business objective.

  • Try to be as clear as possible in the description of the PR about what the change is intended to achieve. If there was any doubt about the approach to take, we would create a discussion issue before starting any development. On our side, the Jira ticket describing the business need could then be postponed to the next sprint and blocked on the result of the discussion. Since we are working for a paying client we want to avoid spending time developing a solution that would end up being rejected by the module maintainers.

  • To further assist the speedy acceptance of PRs, we assigned members of our own team to do code review on the upstream work. This meant we could already do a couple of iterations and improve the code quality to a state that could hopefully be accepted right away, or only require minor modifications. We also work with clear labels on the PRs to effectively communicate the state of the code. A label like “Ready for review” is very helpful and avoids people wasting time reviewing unfinished work.

  • Provide extensive and well-documented tests on every PR. It cannot be overstated how much a good set of tests helps module maintainers quickly understand the goal of a PR, and trust that the code does what it is supposed to do and will not break existing functionality.

An important rule to follow for making contributions to an open source project is to always put the requirements of the project first. It is tempting when working against a sprint deadline to propose a solution that provides some narrow functionality that your project needs, but often these are not aligned with the needs of the project’s wider user base. It is understandable in this case that a PR will be rejected. Often a Drupal module can be configured to cater to different use cases that are needed for different kinds of projects. This flexibility comes at the cost of added development time. One example in OG is the cleaning up of orphaned group content when a group is being deleted: for small sites this is done immediately, but for large sites OG offers an option to do this on a cron job.

As a developer working for a paying customer it can be difficult to justify why we are spending time to develop certain use cases which are not directly needed by the project, but are needed upstream. This means we need to pick our battles carefully. Often the additional work is not substantial, and will be compensated by the fact that the code will be maintained upstream or can be used by other projects within our organisation. But in other cases the work simply cannot be justified from the customer’s viewpoint. One example is the Organic Groups UI module. The way groups are created and managed in our project differed so much from the way this is done in OG that we decided to implement our group administration UI in our own custom module.

The decision to only work on what is strictly needed for the project also meant that we had to accept the fact that our project would probably go to production before the module was completely finished. We mitigated this risk by implementing our own set of functional tests that fully cover our functionality and UI. We run these in addition to the test coverage of OG. This turned out to be a very good plan, since at the time of launch not only was OG still unstable, but so were 20 of the other modules we were using.

From our perspective the collaboration was a great success. We were able to contribute a significant part of the D8 port, the communication with the other OG contributors in the issue queue went super smooth, and we paved the way for using OG in our other websites. Since our project went live the nature of our contributions has changed – our grouping functionality is feature complete so we no longer need to make any sweeping changes, but we still discover bugs from time to time and were able to identify some performance bottlenecks. Along the way I also got promoted to co-maintainer, and whenever I am at a code sprint I make time to review PRs.

Maarten Segers

The first project where I used Organic Groups was a platform for the European Commission that enables learning and knowledge exchange through online groups where members can keep in touch with each other and share their work. At that time the platform was using Drupal 6, and we didn’t have a lot of the cool things we now have in Drupal 8, like configuration management, services and dependency injection, and the new and improved APIs for caching, entities, plugins, etc. On an earlier project I had used another module (Domain Access) to solve the problem of having different subsites, but I ran into a lot of bugs related to its access layer, and it was primarily designed to support different domain names, which we didn’t need for the EC platform.

What I liked, and still like, most about OG was that it was really well tested. If you make any changes to the default access system you’d better make sure you have tests for it because:

  • The Access API is one of the toughest APIs in Drupal
  • Bugs with access can quickly become security issues (e.g. unauthorized access)

Developing this platform taught me that adding a “group” feature can quickly make your website very complex from a functional perspective, as it can affect other existing or new features. Let’s say we have a feature to show the latest news articles; once we have groups, there might be a feature for:

  • The latest news articles for a given group
  • The latest news articles for your own groups
  • The latest news articles for a given user’s groups

If we have comments there might be:

  • Comments in a given group
  • Comments in your own groups
  • Comments for a given user’s groups

Now replace “comments” with “most recent comments” or “most active comments” and repeat. If we have a site search there will likely be a group search or filter and now imagine some more complex features like multilingual or content moderation and all the variations groups may introduce on them.

Before I met Amitai at DrupalCon Amsterdam in 2014, I wasn’t contributing very actively to Drupal but as Dries Buytaert puts it, I was a taker.

In my early days as a developer back in the ‘90s I used to build client-server solutions and desktop applications. I built the software on Windows and Linux and assembled the hardware. Years later, I discovered these inexpensive microcomputers and microcontrollers that enabled me to build IoT solutions as a hobby. I’m still amazed how much more bang for the buck we get these days when it comes to hardware. But it wasn’t only the cheap hardware that enabled people to build IoT solutions. In the ‘90s most of the software was still proprietary. Although I did use SourceForge (again, as a taker), it wasn’t until I discovered GitHub that I really started contributing myself. I couldn’t believe how much was already out there: if you needed something, anything, someone had probably already written a library for it. Also, the ease of use and the low threshold to add or change a line of code made me very active in contributing to a lot of different projects related to IoT libraries. Note that a lot of commits for bug fixes are one-liners, and a lot of PRs for new features on a well-maintained project are just a few lines of code.

When Amitai asked me why I wasn’t so visible on drupal.org, I did some reflection, and it was mostly for two reasons. First, I didn’t like to attach .patch files to issues in the issue queue. I still don’t like it, but now I try to convince myself that it’s worth it, and I hope that the GitLab integration will arrive soon! In the meantime, I’ve been contributing mostly to modules that also live on GitHub, like OG and Search api solr. The second reason was that most of our solutions were “tailored fit” custom modules.

Once you start contributing to open source projects, you quickly start to understand how you in turn benefit from them. I learned that by open sourcing it’s possible to reduce the total cost of ownership of a project. I won’t go into much detail here on the advantages of doing open source right, but I’ll list a few:

  • with public scrutiny, others can identify and fix possible issues for you
  • you learn a lot
  • you may receive new features
  • your own contributions may be further improved
  • you avoid reinventing the wheel

In my opinion, collaboration tools like GitHub have drastically improved open source workflows over the years. Did you know that you can now leave multi-line or inline comments? Don’t hesitate to open an issue or a pull request on GitHub, or join the Organic Groups Slack channel, if you want to get updates, have questions, or want to contribute yourself; it will be appreciated!

When developing the website for the European Year of Development (in Drupal 7) we used Organic Groups to model the countries and organisations. When designing the website’s features we made sure that the group feature didn’t have a huge impact on the complexity, as we only had a few months to build it. For instance, with Organic Groups it would have been possible to manage a hierarchical editorial workflow, but instead we decided to use the existing platform to prepare and discuss the initial content.

When building websites using Drupal 6 and 7 I never really hesitated to use Organic Groups as the module had proved it was stable and well maintained. There were some performance issues but nothing we couldn’t handle. It wasn’t until we started building Drupal 8 sites that I started looking for alternatives as the Drupal 8 release for OG wasn’t in sight.

I met Kristiaan at DrupalCon in Dublin in 2016 and we discussed the Group module. By that time a lot of work had already gone into both the Organic Groups and the Group modules. While the two have different architectures, they try to solve issues in the same problem space. For example, each module has its own submodule to create group navigation (Group menu and OG menu).

A similar situation existed for the search problem space, more specifically for the Apache Solr integration in Drupal: there was the Apachesolr project and the Search api solr project. Both had a huge number of submodules that tried to solve the same problems: sorting search results, autocompletion, geospatial search, etc. This meant that two groups of developers were spending time solving the same issues. The maintainers succeeded in joining their efforts in Search api solr, which also led to a single Search API ecosystem.

Perhaps one day we can define one Group API as a basis for Drupal 10 and build our own Group ecosystem!

Kristiaan Van den Eynde

I’ve never really gone into the specifics of why I started developing Group and how I ended up in a position where I am invited to add my thoughts to a blog post written by three people working on a competing/complementing module. Maybe it’s time I cross that bridge publicly once and for all, because in my heart and mind I’ve already left that part of my past far behind me.

Way back in 2010, I started my first official IT job at a university college in Antwerp, Belgium. I had been tinkering with websites many years before that, but this time it was the real deal. After a round of headaches using closed source applications, we managed to convince the higher-ups to allow us to use Drupal for our websites.

One of the first Drupal 7 projects we built was a student platform which needed the functionality people have grown to expect from OG or Group, except there was only OG at the time. Having just had my first taste of what Drupal 7 development was like, I was still in this phase where everything is both daunting and complicated, but the solutions offered almost never seem to suit your needs. OG was not an exception to this rule.

This often triggered the naive, energetic caffeine junky that I was at the time to do all the things maintainers hate to see from their users: I complained in issues about how broken things were, about patches not being accepted, about the general approach of the module and so on.

And thus came the fateful day when, after a round of “frustrating” issues I encountered with OG, I went to see my boss and pitched him the idea that I could write this better and more tailored towards our needs. The fact that he was a very understanding boss and we had way more time on our hands than we needed led him to say yes, and in a single instant changed my career in the most unimaginable way ever.

When I initially started Group development, my main focus wasn’t to reinvent the functionality OG offered but rather to come up with a different data architecture, and as a result, a different developer and user experience. I must have struck gold somehow, because over the next few months of development, I got a lot of positive feedback and was therefore even more motivated to spend my every waking moment polishing my module.

Over the years to come, I went overboard in my enthusiasm and added far more features to Group than I cared to maintain. I was starstruck by the sudden popularity and wanted to appease everyone to make sure I kept that momentum going. This was a really bad idea.

Around the end of 2015, the fact that I was the maintainer of Group landed me a job at a prestigious Drupal agency in the UK: Deeson. My personal life had seen a few drastic changes which meant I no longer had all the spare time in the world to work on Group. So I brokered a deal where I got to work one paid day a week on Group. This is when I was first encouraged to begin development on the Drupal 8 version of Group and to start speaking at events.

I have since grown a lot both as a developer and a person and learned a few valuable lessons, the most important being to respect anyone willing to invest such a huge amount of time into writing a module the size of OG or Group. Next on that list is to first try to collaborate on a project before deciding to reinvent the wheel. Even if I have my career to thank for it, writing Group from scratch was an undertaking that should have definitely crashed and burned, yet somehow didn’t.

So here we are: A rookie turned senior who has come to respect and like Amitai, even if many years ago I strongly disliked his product. Talking to Amitai now, I realize that we both share the same knowledge and that we both want to fix this problem space the best way possible.

It is my hope that one day we can combine OG and Group into a single module that uses the best from both worlds and can be as well written and tested as Commerce. While that ship has definitely sailed for Drupal 8 and we might still have some minor disagreements about how to best approach certain functionalities, I hope to sit down with Amitai, Pieter and Maarten one day to make this happen for Drupal 9 or, more realistically, Drupal 10. In the meantime, I’ll just keep spending my one day a week (now at Factorial) to work on Group and Core.

And who knows: If our “burying the hatchet” someday leads to a single module whose approach people tend to disagree with over time, we might see another person like me step up and try to do it differently. I would certainly welcome the competition as, in the end, it improves the product for both parties involved. I would like to leave that person one piece of advice, though: Do not get dragged down the rabbit hole, keep a healthy work-life balance and try to respect the people who came before you. I know I should have.

For those wondering how "competing" module maintainers (og and group) get along in #Drupal.#DrupalCon pic.twitter.com/rCXJkW97dL

— Amitai Burstein (@amitaibu) September 27, 2016
Oct 17 2019

We’re excited to once again be sponsoring Acquia Engage. At Engage, today’s most impressive digital leaders share their expertise, their insights, and their secrets to creating customer experiences that truly make a difference.

Join Sr. Director of Consulting, Ken Rickard for a session on the search challenges commonly presented to large organizations and how using an open source solution solves these challenges.

The Digital Services team for the state of Georgia (DSGa) runs a Drupal 7 platform for over 100 websites. During 2019, they began to transition those sites to a new Drupal 8 platform. Their flagship site, Georgia.gov, needs to search content from across the entire site network. While both sets of sites are hosted on Acquia and use Acquia Search, their Drupal 7 search solution could not incorporate content from the new Drupal 8 sites.

Fortunately, open source software gave them a different option. What we built is called Federated Search, and is freely available on Drupal.org. Using Drupal, Acquia Search, and React, Palantir collaborated with the DSGa and their development partners (Lullabot and MediaCurrent, respectively) to re-launch network-wide search in both Drupal 8 and Drupal 7.

In this session, we’ll explore how Federated Search integrates with Acquia Search and hosting and details for getting started using the application in Drupal 7 and Drupal 8.

  • Date: Tuesday, November 5, 2019
  • Time: 11:00 - 11:45 AM ET
Oct 17 2019

Our team is so enthusiastic to participate in the third iteration of Decoupled Days. Palantir is excited to sponsor this year’s event and continue to share our insights into Content Management Systems.

Join Senior Engineer and Technical Architect Dan Montgomery for a session on content modeling. He’ll break down:

  • How a master content model can enable scalable growth
  • How to create a standardized structure for content
  • How Drupal can function as a content repository that serves other products

You’ll walk away with an understanding of how to develop architecture and structures that are scalable for future, unknown endpoints.

  • Date: Thursday, July 18
  • Time: 9:00am
  • Location: Room 
Oct 17 2019

Facilitating design workshops with key stakeholders allows them to have insight into the process of "how the sausage is made" and provides the product team buy-in from the get-go.

Join Palantir's Director of UX Operations, Lesley Guthrie, for a session on design workshops. She'll go over:

  • How to choose the right exercises 
  • How to play to the team skill sets
  • Ways to adjust the workshop to fit the needs of the project 

You'll learn how to sell the idea of a design workshop to stakeholders and collaborate with them on a solution that can be tested and validated with real users.

Oct 17 2019

When it comes to slow and broken digital user experiences, none of us has any patience. When someone can't access a website to get the information they need, they click the browser's back button and move on to the next link in their search results. Drupal has continually improved performance by adding a powerful cache management layer to Drupal 8. Meanwhile, any time database changes are deployed to a Drupal website, the recommended and default behavior is to display the below maintenance page across the entire website including the homepage.

Site under maintenance

For better or worse, displaying Drupal's maintenance page is a broken user experience.

There are technical reasons why Drupal's maintenance page exists, but end-users don't care about them. End-users come to a website to get information. If they can't get that information immediately, then from their perspective the website is broken. Sure, the maintenance page can provide some more information and reasons why the website is unavailable. Still, a website's digital door is temporarily shut. The expectation is that the internet superhighway is available 24/7, yet in the Drupal community we are okay with a little downtime every week or so.

We need to examine this problem and come up with some better solutions.

Why does maintenance mode exist?

The best metaphor as to why Drupal needs to display a maintenance page when deploying code and database changes is…

You can't work on a car's engine while it is driving down the highway.

Putting a Drupal site in maintenance mode is part of the steps to updating core, modules, and themes. Drush, Drupal's command-line interface, automatically switches a site into maintenance mode when deploying database changes.

Drupal's maintenance mode has been around since Drupal 4.7. Khalid Baheyeldin (kbahey) contributed the first patch in Issue #32622: Site Offline Under Maintenance feature. The Drupal community came to an immediate consensus that stands today - being able to take a site offline was and still is a useful feature.

Drupal's maintenance page provides a way to stop visitors from accessing an undesirable user experience while allowing administrators and developers to access a site.

The existence of Drupal’s maintenance mode makes sense. Even preventing a user from accessing a broken website, while that website is being updated, makes sense. The problem is that displaying a maintenance page across an entire website feels like a broken user experience to end-users. The possible solutions to this problem lie in not having, or not displaying, a broken website.

Is maintenance mode always needed?

Drupal is a complex architecture with a lot of moving parts; therefore, certain parts of Drupal are going to need to be updated in isolation. We can't assume that a deployment which works in a staging/testing environment is going to work the same in a production/live environment. Production environments are under considerably more load than any staging/testing environment. Production servers will have concurrent page requests and form submissions, which may fail as code and database records are updated and even when Drupal's cache is cleared.

I explored with the maintainers of Drush the possibility of allowing a developer to decide when a site should not be put into maintenance mode during database updates, and the answer is that deploying code and database changes without maintenance mode can result in data loss.

The risk of data loss means we must still use maintenance mode when deploying code and database changes.

The impact of a broken website during code and database updates can be mitigated by reducing the downtime. Code changes generally take only a minute or two; database changes, meanwhile, can require long batch processes to be executed and can take several minutes.

I can speak from my experience building and maintaining the Webform module for Drupal 8. I have written over 150 incremental updates to Webform-related configuration and data schemas. There might be an opportunity to review and rethink how certain aspects of Drupal's Update API work. Maybe a module maintainer could explicitly state that a specific update hook requires a website to be in maintenance mode.

Stepping back from the technical challenges and issues around deploying code, what is needed when deploying changes to a production environment is...

A safe environment that won't have any data loss and provide the optimal user experience.

How to provide a safe environment for deploying updates?

When I initially stated that a site's homepage is replaced with Drupal's maintenance page when changes are deployed, I ignored the current poor man's workaround, which is that most enterprise sites are heavily cached by a reverse proxy. When changes are deployed, the reverse proxy can continue to serve cached pages. In many cases, a site appears to be up and running for most anonymous users as long as they are requesting cached pages; as soon as they click a form's submit button or log in, they will see a maintenance page. The approach of relying on cached pages during deployments is the recommended solution for most Drupal hosting providers.

Serving a cached site is essentially providing users with a "read-only" user experience, except certain actions will unexpectedly result in a maintenance page. This solution still provides a somewhat broken and unpredictable user experience. The notion that a read-only version of a website can be served to end-users when deploying changes made me wonder…

Instead of displaying a maintenance page site-wide, could a site be switched to read-only mode during deployments?

What is a read-only mode?

In the Drupal community, "there is always a module for that" and Sharique Farooqui (Sharique) has created a Read only mode module which…

"Provides an alternate to the built in Maintenance Mode in Drupal. Instead of displaying a static text file to users while the site is in maintenance mode, Read Only Mode will allow access (reading) of new content while preventing the addition of new content (posting / submitting forms / etc)."

In read-only mode forms are replaced with the below message.

Site is currently in maintenance. During this maintenance it is not possible to change site content (like comments, pages and users).

A read-only Drupal instance should...

  • Allow users to access content. 
  • Disable content editing, comments, and webform submission.
  • Display custom messages when a user can't create or update content.

An interesting nuance to read-only forms and comments is that if the application data is being stored remotely, a site may not have to disable forms and comments. For example, if a website uses Disqus for comments, then it would not need to disable comments. If a webform does not save results to the database and instead either sends an email notification or posts submission data to a third-party server, a site's webforms might also not need to be disabled.
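That nuance can be sketched as a tiny decision rule. This is purely illustrative: the storage-type names below are made up for this sketch and do not come from the Read only mode module.

```shell
# Hypothetical helper: a form only needs to be disabled in read-only
# mode if its data would be written to the local database, where it
# could be lost mid-deployment.
form_needs_disabling() {
  case "$1" in
    database)                 echo yes ;;  # local write: must be blocked
    email|remote_post|disqus) echo no ;;   # data leaves the site: safe
    *)                        echo yes ;;  # unknown storage: be conservative
  esac
}

form_needs_disabling database     # yes
form_needs_disabling remote_post  # no
```

The default branch is deliberately conservative: when the read-only mode cannot tell where a form's data goes, disabling the form is the safer choice.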

How can a site safely be switched to read-only mode?

The read-only mode must prevent creating, updating, and deleting any records stored in the database. Ideally, the entire database should be set to read-only access. If you know how complicated a Drupal site can be, you know this is not realistic. The critical thing is for the read-only mode to prevent users from writing data that might be lost.

Another challenge is this: if a site is set to read-only while code and database updates are being executed, it is still possible to run into cache clear, locking, and performance issues. The best solution is to create an isolated read-only instance that is independent of the production instance while code is being deployed.

Enterprise Drupal websites use load-balanced servers to increase reliability and availability through redundancy. Most Drupal hosting providers have some form of redundancy where, if a server stops responding, the website will fail over to another server. Load balancers support calling a health-check script, which monitors the server's database and filesystem. The health-check script is called every few seconds. If a server's database or filesystem becomes unavailable, the load balancer will direct user traffic to a stable server.

We can apply a similar approach to create a load-balanced environment in which a site is switched to maintenance mode and user traffic is directed to an isolated, read-only instance of the website.

Here is a step-by-step look at leveraging a read-only server during a Drupal code and database deployment.

  1. A dedicated read-only copy of a production Drupal site is set up. The "read-only" site needs to be configured to always enable the Read only mode module. 
  2. The read-only Drupal site is synced with production nightly or manually. The read-only site must never be synced during a deployment.
  3. A health check script is set up on the production site which returns FALSE when a site is switched to maintenance mode.
  4. When the production site is switched to maintenance mode, the load balancer, using the health check, should direct all traffic to the read-only site.
  5. Once the production site switches off maintenance mode, the load balancer should now direct all traffic back to the production site.
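The health-check side of steps 3 through 5 might look something like the following shell sketch. This is an assumption-laden illustration, not a recipe from any hosting provider: a real check would ask Drupal directly (for example via `drush state:get system.maintenance_mode`), while here an environment variable stands in for that call so the sketch is self-contained.

```shell
# Illustrative only: MAINTENANCE_MODE stands in for a real query
# against Drupal, e.g. `drush state:get system.maintenance_mode`.
site_in_maintenance() {
  [ "${MAINTENANCE_MODE:-0}" = "1" ]
}

# The load balancer polls this every few seconds; a non-zero return
# means "fail over to the read-only site".
health_check() {
  if site_in_maintenance; then
    echo "unhealthy: route traffic to the read-only site"
    return 1
  fi
  echo "healthy: route traffic to production"
  return 0
}

MAINTENANCE_MODE=0
health_check                         # normal operation

MAINTENANCE_MODE=1
health_check || echo "failing over"  # deployment in progress
```

In practice the load balancer would call this script (or an HTTP endpoint wrapping it), and traffic would flow back to production automatically once maintenance mode is switched off and the check passes again.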

What are some of the downsides to a read-only server?

A load-balanced read-only server is not a perfect solution. End-users will still not be able to submit certain forms and comments. Adding another server to a load-balanced hosting environment increases infrastructure costs. Fortunately, the read-only server is not used frequently and, by being read-only, requires fewer computational resources.

What is missing and what is next for deploying an enterprise Drupal website with minimal downtime?

This blog post is just walking through a proof-of-concept of leveraging a read-only server during code and database deployments. Everything discussed needs more thought, work, and community contribution. I suspect some enterprise Drupal sites have come up with other and possibly better solutions to reducing downtime during deployments.

In a previous blog post, I talked about how companies within open source should work together to solve shared and challenging problems to benefit the broader community. Improving deployments is a challenging problem that is impacting everyone in the Drupal community. Reducing downtime during deployments helps everyone, especially the customers of Drupal hosting providers. I am sure this question is asked of every hosting provider, and I am optimistic that hosting providers can contribute their ideas, solutions, and resources to solving this problem.

The next step is for an organization or hosting provider to implement a full proof-of-concept and document what it takes to minimize downtime during deployments using a read-only server.



Oct 17 2019

The DrupalCon Minneapolis program committee shares some important session proposal hints, tips, and pointers.

Oct 17 2019

Content is at the forefront of users’ digital experiences. Content marketing is driven by copywriters who need to constantly optimize their content strategy as we move more and more towards multichannel content.

But this content strategy typically only focuses on marketing copywriting, while completely discounting another major facet of copywriting: UX copywriting, or microcopy, a crucial aspect of any user experience.

Where longer forms of content, such as blog posts, videos or infographics, work to inform or persuade customers about a product or service, or tell stories and provide an experience in and of themselves, microcopy is intended more as a guide that facilitates the use of said product or service, in the same way as an intuitive UI would.

In this post, we’ll take a look at some best practices of writing microcopy, and we’ll see on the basis of some examples why good microcopy is so important, especially in the negative interactions customers and users may have with your brand.

What exactly is microcopy and how does it differ from traditional copy(writing)?

Microcopy refers to all the little bits of copy that are not content and don’t tell a story of their own. Things like the text on buttons of an interface, in error messages, in forms - all of this falls under the domain of UX writing and requires a different approach than traditional, marketing-oriented copywriting. 

The psychological process is where the first major difference lies. While it’s true that both require the ability to empathize and put oneself in the user’s shoes, this thought process is realized differently in these two forms of copywriting. 

With content marketing, the greater focus is on the customer - their desires, the problems that they wish to solve with your particular product or service. 

By contrast, microcopy focuses on the user, who is the same entity, just at a different stage of their digital journey. Microcopy steps in once the customer becomes the user and now needs some help with adequately using the product or service and getting their desired benefit out of it. 

Because, as we know, a beautiful UI and cutting-edge technology have little effect if they’re difficult to use and are unable to guide the user to the solution to their pain.

This could be succinctly recapped as: good copy brings the customer in, while good microcopy retains the customer now turned user.

Who writes microcopy?

Traditionally, microcopy has been taken care of by UX/UI designers themselves. Since consistency is as important in the copy as in the visual design, designers are perfectly suited for the role of UX writing, especially if the product or service they’re designing is in their native language.

And yet, writing and language proficiency are not typically something designers should invest a lot in. It’s outside the scope of their expertise and responsibilities, and tasking them with producing stellar copy alongside their design could lead to burnout on the one hand and a subpar user experience on the other, as they would need to disperse their focus. 

This is precisely why we’re witnessing a rise of a new position that focuses exclusively on microcopy - the UX (copy)writer. Whereas the UX designer focuses on the visual aspects of the design, the UX writer takes care of its linguistic peculiarities. 

Logically, it’s very important that the UX writer is included in the project from its very start. They have to be aligned with designers and other stakeholders in order to best capture the essence of the product in words. If UX copy is viewed as nothing more than an afterthought, it’s unlikely to provide a good user experience. 

Error messages and other points of friction

Clear, concise and user-friendly copy is important in all touchpoints users have with your digital presence. Where it’s downright vital, however, is in the negative touchpoints. 

As Tom Wentworth, SVP of Product Marketing at Acquia, points out in their webinar on the Digital Experience Platform, a single bad experience can break a brand in the digital space, and it’s extremely difficult to come back from it.

With that in mind, and also knowing that it’s practically impossible to eliminate all negative experiences for all users, alleviating these negative touchpoints becomes a priority. Luckily, as we’re well aware, words have a tremendous impact and can completely change how we feel about a certain situation.

And the magic of microcopy lies exactly there: it has the potential to turn these negative experiences into positive interactions with your brand. 

Take, for example, an error message such as the “404 - page not found”. Instead of a simple blank page and these three empty words, you can use this page to guide the user to other important pages on your site (e.g. related products in the case of an e-commerce platform). 

Or, perhaps you want to liven the mood and empathize with the user not being able to find what they’re looking for. Staying true to your brand’s tone and voice, of course, you can instill some humor and/or empathy in the “page not found” message, or explain to the user why the error occurred (e.g. “the page you’re looking for may have been removed or its link has been updated”).

Consider this “Access denied” page on Agaric’s website: 

While encountering such a page would typically lead to frustration, the verse from Marvin Moore’s “Green Door” appeases that frustration by self-referencing the page itself with an ironic comment about its “thin” hospitality. The page contains both an apology and humor, which work in tandem to transform the negative touchpoint into a memorable one. 

An area where good microcopy is especially important is e-commerce, which has a lot of potential friction points. Even with its recent rise in popularity and better security, a lot of people are still reluctant to share personal information such as credit card info and spend money online.

Because of this, they may abandon a purchase if the checkout process is convoluted and non-transparent. In order to lead customers to the purchase step and retain them afterwards, you need helpful on-point copy accompanying each step of the process. 

E.g., when an item has been added to the cart, tell the user that this has been successful. After they place an order, tell them that the order has been placed - and when they can expect a reply and/or delivery, for some additional spice to the customer experience.

Examples of good microcopy

One of the coolest examples of good microcopy is Slack’s welcome messages. You’re able to customize them and set your own, but the default ones provided by Slack truly strike a chord with the user. As Slack is one of the first things people check when they arrive at work in the morning, being greeted with such a warm message can do wonders for one’s day.

Here are a couple of these welcome messages that really stuck with us:

Both examples are a testament to how great of an impact a simple sentence or two can have. 

The first one feels as if you’re being greeted by a close friend who’s been eagerly awaiting your arrival. The second one is even cooler and even warmer (pun definitely intended); it plays on the antonymy of the two words, at the same time showing concern for both the mental and physical comfort of the reader in a fun and easy-going way. 

This personal touch in both of these is accentuated as well as justified with the signature “Your friends at Slack” - naturally, your friends are concerned with your well-being and are happy to see you, so this signature is very fitting. 

The rapid pace and the increasingly distributed nature of digital work often restrict the time we can spend with loved ones and at times altogether prevent us from doing so. This makes thoughtfulness and recognition that much more valuable, in any way we can get them, even if they’re coming from a bot or another automated source.

More examples of excellent copy can be found in the newsletter messages of best-selling author and entrepreneur Nir Eyal. Even the subscription pop-up itself is very empathetic and answers exactly the questions reluctant new subscribers typically have:

First, the copy is clear, and the benefit the new subscriber will get is clearly outlined in the subtitle. But it’s the innocuous little line at the end that truly shines. In just two short sentences, Nir addresses and appeases two of the major hesitations to subscribing: he promises to keep your email safe from spam, while stressing that you can unsubscribe whenever you wish. 

The latter fact especially serves to build trust with the reader and prevents them from feeling cheated or tricked into opting in. And he keeps his promise - every email you receive contains an unsubscribe link, in plain sight rather than intentionally made barely noticeable.

Making unsubscribing extremely difficult is, unfortunately, a pretty common dark UX pattern, so, those businesses that simplify this process and even point to it automatically get points with the user. This is especially true for users from the EU, who have benefited from more transparency since the implementation of GDPR in May 2018.

The message you get confirming your subscription, then, is even more heartwarming:

The page reads easily, with short sentences and highlighted bits. The second point is very welcoming and promises the reader a positive interaction if they choose to connect, encouraging them to do so. 

But, again, it’s the very last sentence that truly hits the spot - “it’s great to have you here!” This feels perfectly genuine and gives the reader a strong impression that it was written specifically for them, personalized and manual rather than something automated. Even knowing that it is in fact an automated response, you can’t help but feel that Nir is genuinely happy to have you. 

Notice that this last sentence is very similar to one of the previously mentioned Slack loading messages - “You’re here! The day just got better.” Both make the reader feel special and valuable, without having to be super personalized and to solve particular pains. 

Feeling inadequate or incompetent is actually a pain in and of itself, one that a lot of people face daily, and hence such small instances of recognition can truly go a long way towards brightening their day.


We hope this blog post has given you a better idea of what microcopy is and what some best practices for writing microcopy are. Ideally, you’ll start noticing more and more examples of exceptionally good (or poor!) microcopy and, in time, subconsciously adopt some of these practices and incorporate them into your own writing. 

For further reading, we highly recommend the excellent book Microcopy: The Complete Guide by Kinneret Yifrah. It’s truly an invaluable resource for anyone undertaking UX writing and further elaborates on a lot of the points mentioned in this post. 

Oct 17 2019

The Drupal 8 Salesforce Suite allows you to map Drupal entities to Salesforce objects using a 1-to-1 mapping. To do this it provides a series of field mapping types that allow you to select how you want to relate the data between the two systems. Each field type provides handling to help ensure the data is handled correctly on each side of the system.

As of this writing the suite provides six usable field mapping types:

  • Properties — The most common type to handle mapping data fields.
  • Record Type — A special handler to support Salesforce record type settings when needed.
  • Related IDs — Handles translating SFIDs to Drupal Entity IDs when two objects are related in both systems.
  • Related Properties — For handling properties across a relationship (when possible).
  • Constant — A constant value on the Drupal side that can be pushed to Salesforce.
  • Token — A value set via Drupal Token.

There is a seventh called Broken to handle mappings that have changed and need a fallback until it’s fixed. The salesforce_examples module also includes a very simple example called Hardcoded that shows how to create a mapping with a fixed value (similar to, but less powerful than, the Constant field).

These six handle the vast majority of use cases, but not all. Fortunately the suite was designed using Drupal 8 annotated plugins, so you can add your own as needed. There is an example in the suite’s example module, and you can review the code of the ones that are included, but I think some people would find an overview helpful.

As an example, I’m using the plugin I created to add support for related entities to the webform submodule of the suite (I’m referencing the patch in #10 because that’s current as of this writing, but you should actually use whatever version is most recent or has been accepted).

Like all good annotated plugins, to tell Drupal about it all we have to do is create the file in the right place. In this case that is: [my_module_root]/src/Plugin/SalesforceMappingField/[ClassName].php, or more specifically: salesforce_webform/src/Plugin/SalesforceMappingField/WebformEntityElements.php

At the top of the file we need to define the namespace and add some use statements.

namespace Drupal\salesforce_webform\Plugin\SalesforceMappingField;
use Drupal\Core\Entity\EntityInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\salesforce_mapping\Entity\SalesforceMappingInterface;
use Drupal\salesforce_mapping\SalesforceMappingFieldPluginBase;
use Drupal\salesforce_mapping\MappingConstants;

Next we need to provide the required annotation for the plugin manager to use. In this case it just provides the plugin’s ID, which needs to be unique across all plugins of this type, and a translated label.

/**
 * Adapter for Webform elements.
 *
 * @Plugin(
 *   id = "WebformEntityElements",
 *   label = @Translation("Webform entity elements")
 * )
 */

Now we define the class itself which must extend SalesforceMappingFieldPluginBase.

class WebformEntityElements extends SalesforceMappingFieldPluginBase {

With those things in place we can start the real work.  The mapping field plugins are made up of a few parts: 

  • The configuration form elements which display on the mapping settings edit form.
  • A value function to provide the actual outbound value from the field.
  • Nice details to limit when the mapping should be used, and support dependency management.

The buildConfigurationForm function returns an array of form elements. The base class provides some basic pieces of that array that you should plan to use and modify. So first we call the function on that parent class, and then make our changes:

  /**
   * {@inheritdoc}
   */
  public function buildConfigurationForm(array $form, FormStateInterface $form_state) {
    $pluginForm = parent::buildConfigurationForm($form, $form_state);

    $options = $this->getConfigurationOptions($form['#entity']);

    if (empty($options)) {
      $pluginForm['drupal_field_value'] += [
        '#markup' => t('No available webform entity reference elements.'),
      ];
    }
    else {
      $pluginForm['drupal_field_value'] += [
        '#type' => 'select',
        '#options' => $options,
        '#empty_option' => $this->t('- Select -'),
        '#default_value' => $this->config('drupal_field_value'),
        '#description' => $this->t('Select a webform entity reference element.'),
      ];
    }

    // Just allowed to push.
    $pluginForm['direction']['#options'] = [
      MappingConstants::SALESFORCE_MAPPING_DIRECTION_DRUPAL_SF => $pluginForm['direction']['#options'][MappingConstants::SALESFORCE_MAPPING_DIRECTION_DRUPAL_SF],
    ];
    $pluginForm['direction']['#default_value'] = MappingConstants::SALESFORCE_MAPPING_DIRECTION_DRUPAL_SF;

    return $pluginForm;
  }

In this case we are using a helper function to get a list of entity reference elements on the webform (details are in the patch and unimportant to this discussion). We then make those the list of Drupal fields for the settings form. The array we got from the parent class already provides a list of Salesforce fields in $pluginForm['salesforce_field'], so we don't have to worry about that part. Since the salesforce_webform module is push-only on its mappings, this plugin was designed to be push-only as well, and so limits the direction options to push only. The default set of options is:

'#options' => [
  MappingConstants::SALESFORCE_MAPPING_DIRECTION_DRUPAL_SF => t('Drupal to SF'),
  MappingConstants::SALESFORCE_MAPPING_DIRECTION_SF_DRUPAL => t('SF to Drupal'),
  MappingConstants::SALESFORCE_MAPPING_DIRECTION_SYNC => t('Sync'),
],

And you can limit those any way that makes sense for your plugin.

With the form array completed, we now move on to the value function. This is generally the most interesting part of the plugin since it does the work of actually setting the value returned by the mapping.

  /**
   * {@inheritdoc}
   */
  public function value(EntityInterface $entity, SalesforceMappingInterface $mapping) {
    $element_parts = explode('__', $this->config('drupal_field_value'));
    $main_element_name = reset($element_parts);
    $webform = $this->entityTypeManager->getStorage('webform')->load($mapping->get('drupal_bundle'));
    $webform_element = $webform->getElement($main_element_name);
    if (!$webform_element) {
      // This reference field does not exist.
      return NULL;
    }

    try {
      $value = $entity->getElementData($main_element_name);
      $referenced_mappings = $this->mappedObjectStorage->loadByDrupal($webform_element['#target_type'], $value);
      if (!empty($referenced_mappings)) {
        $mapping = reset($referenced_mappings);
        return $mapping->sfid();
      }
    }
    catch (\Exception $e) {
      return NULL;
    }
  }

In this case we are finding the entity referred to in the webform submission, loading any mapping objects that may exist for that entity, and returning the Salesforce ID of the mapped object if it exists.  Yours will likely need to do something very different.

There are actually two related functions defined by the plugin interface, defined in the base class, and available for override as needed for setting pull and push values independently:

  /**
   * An extension of ::value, ::pushValue does some basic type-checking and
   * validation against Salesforce field types to protect against basic data
   * errors.
   *
   * @param \Drupal\Core\Entity\EntityInterface $entity
   * @param \Drupal\salesforce_mapping\Entity\SalesforceMappingInterface $mapping
   *
   * @return mixed
   */
  public function pushValue(EntityInterface $entity, SalesforceMappingInterface $mapping);

  /**
   * An extension of ::value, ::pullValue does some basic type-checking and
   * validation against Drupal field types to protect against basic data
   * errors.
   *
   * @param \Drupal\salesforce\SObject $sf_object
   * @param \Drupal\Core\Entity\EntityInterface $entity
   * @param \Drupal\salesforce_mapping\Entity\SalesforceMappingInterface $mapping
   *
   * @return mixed
   */
  public function pullValue(SObject $sf_object, EntityInterface $entity, SalesforceMappingInterface $mapping);

But be careful overriding them directly. The base class provides some useful handling of various data types that need massaging between Drupal and Salesforce, and you may lose that if you aren’t careful. I encourage you to look at the details of both pushValue and pullValue before working on those.
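If you do need to override one of them, a safe pattern is to call the parent implementation first so you keep that handling, and then adjust the result. A hypothetical sketch (the trimming step is invented purely for illustration and is not part of the suite):

  /**
   * {@inheritdoc}
   */
  public function pushValue(EntityInterface $entity, SalesforceMappingInterface $mapping) {
    // Keep the base class type-checking and massaging.
    $value = parent::pushValue($entity, $mapping);
    // Hypothetical extra step: trim string values before pushing.
    return is_string($value) ? trim($value) : $value;
  }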

Okay, with the configuration and values handled, we just need to deal with programmatically telling Drupal when it can pull and push these fields. Most of the time you don’t need to do this, but you can simplify some of the processing by overriding pull() and push() to make sure they have the right response hard-coded instead of derived from other sources. In this case pulling the field would be bad, so we block that:

  /**
   * {@inheritdoc}
   */
  public function pull() {
    return FALSE;
  }

Also, we only want this mapping to appear as an option if the site has the webform module enabled. Without it there is no point in offering it at all. The plugin interface provides a function called isAllowed() for this purpose:

  /**
   * {@inheritdoc}
   */
  public static function isAllowed(SalesforceMappingInterface $mapping) {
    return \Drupal::service('module_handler')->moduleExists('webform');
  }

You can also use that function to limit a field even more tightly based on the mapping itself.

To further ensure the configuration of this mapping entity defines its dependencies correctly, we can define additional dependencies in getDependencies(). Again here we are tied to the Webform module, and we should enforce that during config exports:

  /**
   * {@inheritdoc}
   */
  public function getDependencies(SalesforceMappingInterface $mapping) {
    return ['module' => ['webform']];
  }

And that is about it. Once the class exists and is properly set up, all you need to do is rebuild the caches and you should see your new mapping field as an option on your Salesforce mapping objects (at least while isAllowed() is returning TRUE).

Oct 16 2019

Our normally scheduled call to chat about all things Drupal and nonprofits will happen Thursday, October 17, at 1pm ET / 10am PT. (Convert to your local time zone.)

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

We have an hour to chat so bring your best Drupal topics and let's do this thing!

Some examples to get your mind firing: how do I recreate [feature] on my Drupal 7 site in Drupal 8? I need to explain [complicated thing] to a non-technical stakeholder -- any advice? How can I get Drupal and my CRM to play nicely?

This free call is sponsored by NTEN.org but open to everyone.

View notes of previous months' calls.

Oct 16 2019
Integrating CRM into web technology has been a turning point in business process automation. Enterprises can now easily evaluate and manage their contacts, existing clients, and the leads or prospects generated through marketing campaigns on social media platforms without any hassle.


Given the fact that Drupal integrates seamlessly with third-party applications and systems catering to varied verticals and industries, it accounts for a perfect building block to create a robust functional system.


Drupal, when integrated with CRM, makes an apt solution for handling colossal user and content databases. All this is possible because of Drupal’s highly modular and scalable architecture. The best part is that it delivers substantial operational efficiency through marketing automation.

This blog will elucidate some of the best enterprise-level CRM integration for your Drupal website and its modules and distributions that will help you create a robust CRM strategy for your business.

Benefits Of Drupal and CRM Integration

There are several benefits of integrating a Drupal site with a third-party CRM solution. The most obvious benefits of a native CRM in Drupal include-


  1. The ability to utilize CRM data as content, or the ability to showcase aggregate CRM data on the website.
  2. More chances to integrate CRM data with Drupal dedicated tools- such as data visualization tools, geo-mapping tools, etc.
  3. Reduced staff training costs- as the staff doesn’t need to get trained on multiple platforms.
  4. A considerable amount of reduction in technical risk as all the tools will rely on a single, Drupal framework.
  5. Potential reductions in hosting and IT costs.
  6. The ability to customize your CRM solution up to 360 degrees.
  7. Users or site visitors registering for events, making donations or payments, or engaging in other website transactions are ensured a more seamless and interface-friendly experience.
  8. CRM interfaces can easily leverage Drupal’s growing suite of mobile and responsive tools and themes. 
  9. Tracks competitors and manages every opportunity through various built-in tools, enabling a faster response to any client that enquires about services or products and showing them you care about their business.
  10. Proficiency in managing complex engagement scoring or engagement analytics.

Modules and Distributions For Integration

There are a plethora of modules and distributions in Drupal that one can leverage and integrate with third-party tools and CRMs. Take a look-

1. Webform CiviCRM Integration (CiviCRM)

Various features available in CiviCRM
This module lets you receive the data input in the specific format you want. You can also create & update contacts, sign-up forms, opt-in forms, tags, relationships, cases, and membership & contributions forms via these robust and user-friendly webforms that incorporate effortlessly into your Drupal website.

Its other features include-

  • Auto-fill of forms for already signed-in users, or for anonymous users following a personalized link from CiviMail.
  • Use CiviCRM contact data to showcase customized messages, emails or set access restrictions to the form.
  • Approve credit-card payments for events, memberships, and contributions.

Available for - Drupal 7 | Covered by Security Advisory

2.  RedHen CRM

RedHen is a flexible and lightweight Drupal-native CRM that manages information about contacts, organizations, and their relationships with each other. It includes an automated filter in the UI to let you filter by any field you add to a contact, and a smart find-and-dedupe interface to keep contacts and the associated data spick and span.

It also provides engagement tracking, event registration integration, and customizable one-page donation forms. 

Available for - Drupal 7 | Covered by Security Advisory

3. Salesforce Suite

The Salesforce Suite enables the integration of Drupal and Salesforce, as a result of which, Drupal entities (for example, users, nodes, files) are synchronized with Salesforce objects (for example, contacts, organizations, opportunities). 

In addition to this, it also provides a mapping tool that can push Drupal data to Salesforce as well as pull, or import, Salesforce data into Drupal.

The changes can be made in real-time or independently in batches during cron run.

Available for - Drupal 8 | Covered by Security Advisory

Watch this video to find out more on the same-

[embedded content]

4. Webform2Sugar (SugarCRM)

There are two modules that connect Drupal to SugarCRM:

  1. Drupal to Sugar
  2. Webform2Sugar

This module itself, however, is available only for Drupal 7 and is not covered by the security advisory.

  1. Drupal to Sugar- 

The integration of Drupal and SugarCRM at the webform level results in a more powerful business solution, where an intuitive GUI integrates both in the admin section to provide a more intimate mapping of modules and related fields without the need for any custom coding.

However, in case you have a multi-page webform, this module won’t function as required.

Available for - Drupal 7 | Covered by Security Advisory

2. Webform2Sugar:

This user-friendly module works fine even if you have a multi-page webform. As with Drupal to Sugar, the mapping of the fields can be done directly from the admin UI. It can also capture webform results as a new lead in SugarCRM.

Available for - Drupal 7 | Not Covered by Security Advisory

5.  Zoho CRM

The Zoho CRM suite of modules provides integration of Drupal with Zoho CRM. It synchronizes Drupal objects (users, nodes, Ubercart orders) with Zoho CRM modules (Contacts, Accounts, Potentials, etc.) to enable easy exchange of data from both ends.

It also defines the mapping between fields in the Drupal object and Zoho CRM module to send and receive data.

Available for - Drupal 7 | Not Covered by Security Advisory

6.  CRM Core

With this module, you can manage contacts, activities, and relationships in your Drupal website. It also provides entity support and integrates several tools to make further work on the website easier.

Its features include-

  1. User sync allows contacts to be linked with user accounts.
  2. Match provides support for detecting duplicate records.
  3. Offers a central repository for reports.
  4. Allows administrators to manage the UI for managing contacts.

Available for - Drupal 7 | Covered by Security Advisory

7. Leadsquared Integration

The Leadsquared CRM module integrates with Drupal forms to capture leads. Leads are captured mainly during:

  • Registration
  • Form Submit
  • After publishing/payment/update of content

Available for - Drupal 7 | Not Covered by Security Advisory

8. Pardot integration

Pardot provides a software-as-a-service marketing automation application that helps marketing and sales departments create, deploy, and manage online marketing campaigns to increase ROI and maximize efficiency. It features certified CRM integrations with salesforce.com, NetSuite, Microsoft Dynamics CRM, and SugarCRM, empowering marketers with lead nurturing, lead scoring, and ROI reporting for creating and qualifying sales leads, shortening sales cycles, and demonstrating marketing accountability.

Available for - Drupal 7 | Covered by Security Advisory

9. Freshchat

This module helps you in engaging your site visitors and users to improve sales, gather feedback, and provide support.

Features include-

  • Uses a bot for quick responses

Engage dynamically with users through contextual and timely messages. The bot helps in capturing leads, validating and qualifying responses, and auto-uploading them into your CRM. Customize bot messages to deliver the right experience.

  • Fetches more info to improve sales

With details such as an events timeline, user information, and social profiles available, know what your visitors are up to. It fetches more info from external tools like CRM or order management systems for more relevance.

  • Retains customers

Set up in-app campaigns to onboard new users, retain old ones, and re-engage lapsing customers. Send announcements, get feedback, and share product best practices inside the product to bridge the gap and maximize the impact.

Available for - Drupal 8 | Not Covered by Security Advisory

10. SnapEngage

It is a straightforward and streamlined live chat service. Increase engagement with your visitors and improve your conversion rate to turn visitors and potential leads into customers. Live chat with your visitors from your favorite instant messenger or mobile device, and easily integrate with your existing CRM or helpdesk solution.

Available for - Drupal 7 | Covered by Security Advisory

11. Optify

Optify offers a digital marketing software suite to easily create and manage several lead generation programs, nurture prospects, prioritize the best-performing programs, and align the reporting of client results, all from one login.

In addition to this, it includes integrated SEO, email marketing, landing pages, lead intelligence and nurturing, contact management, and seamless integration with Drupal.

Available for - Drupal 7 | Covered by Security Advisory

12. Vtiger CRM


Vtiger CRM comprises three separate modules, each with distinct functionality. Let’s find out more-

  1. Vtiger CRM

This is the base module. It provides forms for the settings needed to connect with Vtiger CRM, along with the synchronization page and the VtigerCrmApi class and its methods.

2. Vtiger Entity

It provides functionality that allows transferring any fieldable Drupal entity to any Vtiger CRM entity.

3. Vtiger Webform 

This module provides a similar field-mapping UI to specify mappings between Webform components and Vtiger CRM modules. A new Vtiger record is created each time the Webform is submitted.

Available for - Drupal 7 | Covered by Security Advisory

13. Re:plain


Re:plain makes communication simpler by letting you chat with customers instantly, directly from your own messenger. It is a flexible and scalable module, as it lets you add only the functions you need as per your business requirements. As a result, you can see a significant boost in revenue for you and your customers.

Its features include-

1. Unlimited Operators

An unlimited number of people can respond to clients with the further option of transferring clients between operators.

2. Template answers

Insert and reuse saved answers to similar types of questions to save time.

3. Scheduling

Automatically switch this module on or off according to your working hours.

4. Integration With Google Analytics

Send data directly to Google Analytics for deep insights.

5. Banners

Use this feature to announce a sale, promote a new service, or just draw your clients’ attention to an FAQ section.

6. Custom Forms

Collect your visitors’ information in customized forms. You can also customize the fields as per your requirement.

7. Video Welcome

Increase user engagement by creating a video, uploading it to YouTube, and setting the link under the Video welcome feature.

8. Active Invitation

Specify the time when you want your visitors to see the welcome message.

9. Relevancy

Integrate it with your mobile applications, CRM, support services, and ERP to leverage the data insights it offers. It will assist in reaching potential customers at the right time, thus increasing the overall ROI.

Available for - Drupal 8 | Not Covered by Security Advisory

Final words

If your organization integrates and deploys Drupal with any of the above-listed CRMs, there is no doubt that it will enjoy streamlined and optimized business processes in the long term. Consequently, this will strengthen sales and make the whole process much more effective and capability-driven.

That said, Drupal installations are all unique because of the different modules and customizations that they use, so integration has to be set up differently by an expert.

Not every integration module is ready to use, so special care must be taken to ensure compatibility and effectiveness.

Srijan has vast experience and the skills required to integrate Drupal with CRM applications, and has helped its clients build reliable CRM strategies. Contact us now!

Oct 16 2019

The topic of migration from Drupal 7 to Drupal 8 is getting hotter than ever. After all, the ninth Drupal version is already on the horizon. The best way to prepare for it is to upgrade to Drupal 8 now. See why, and also discover a case study of migrating a career website to Drupal 8 performed by our team.

Why it’s high time for migrating your website to Drupal 8

Drupal 8 was released in November 2015, and Drupal 9 is planned for June 2020. All websites that are still on D7 are lagging a little behind, and soon this gap will be more and more noticeable.

Here’s why it’s worth migrating to Drupal 8 now:

  • It paves the way for easy and instant upgrades in the future. Drupal 9 will be almost like the latest version of Drupal 8, with just a good clean-up from deprecated code and the introduction of the latest libraries.
  • Many experts recommend migrating your website to Drupal 8 as part of preparing for Drupal 9. For example, Drupal creator Dries Buytaert pictured D7 as an abandoned rail track, while the D8 track leads to D9 and beyond, so “you will never have to change the tracks again.”
  • By migrating your site now, you are not wasting time but enjoying Drupal 8’s numerous and irresistible benefits for business. D8 is mobile-first, multilingual, open to integration with third-party apps, handy for content creation, respectful of web accessibility standards, and clean and efficient in code thanks to OOP, Twig, HTML5, and so on.

Case study of migrating a website to Drupal 8

Of course, we take care of migration for our customers. One of them is the international multidomain online store JYSK. It has Danish roots and owns 2,740 stores in 39 countries across the globe.

JYSK international online store

All JYSK websites inside this huge multidomain “machine” needed to be migrated to D8. Today, we will describe one of them: the JYSK career website. This simple case study will become part of our future blog series about migrating websites to Drupal 8!

JYSK career website

The website offers careers in various departments of the JYSK chain:

  • the store
  • distribution center
  • customer service
  • headquarters

Among the sections of the grid menu, there are also ones about the student programs, internship, JYSK values, and more.

JYSK career website menu

The position descriptions are accompanied by the stories of real people who work in those roles:

JYSK career website job description

For this website, we performed all the classic steps of migrating a website to Drupal 8:

  • migrating the functionality
  • migrating the configuration
  • migrating the content

We would now like to emphasize the most interesting points in this particular Drupal 8 migration case.

Database migration

During the database migration process, we used the Drupal 8 core Migrate module, as well as contributed modules such as Migrate Tools and Migrate Plus that extend the core migrate functionality.

To migrate the databases, including content nodes, users, user roles, and taxonomy terms, our development team created a custom module specifically for the JYSK job site migration.

The migration module is interesting because it approaches migration in terms of configuration entities. First, we create migration groups (for the nodes, users, taxonomy terms, etc.) and then create a configuration for each subgroup (content type, taxonomy vocabulary, etc.). The plugin maps the Drupal 7 and Drupal 8 fields and is responsible for transforming field values during the migration.
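As an illustration, each such migration is a configuration entity defined in YAML. The sketch below is hypothetical (the IDs, group name, and field names are invented and are not the actual JYSK configuration):

  # Hypothetical Migrate Plus migration config entity.
  id: job_node
  label: 'Job nodes'
  migration_group: jysk_nodes
  source:
    plugin: d7_node
    node_type: job
  process:
    title: title
  destination:
    plugin: 'entity:node'
    default_bundle: job

Migrate Tools then provides Drush commands to run migrations like this one individually or by group.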

Code rewriting

However, one of the most challenging steps was migrating the functionality. The job.jysk.dk website features were provided by a number of custom modules. They all needed a good rewrite to meet Drupal 8 standards.

Drupal 8 deprecated many functions and APIs, which we needed to replace. All custom modules were rewritten according to the OOP (object-oriented programming) style adopted by D8. The eighth version also introduced the use of plugins, so we needed to perform code refactoring.


We used the contributed module Twig Tweak to change the page templates in accordance with Drupal 8’s new template engine — Twig. The module improved the development experience by offering useful functions and filters.
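For instance, Twig Tweak lets a template render views and blocks directly. A small hypothetical snippet (the view and display IDs are made up for illustration):

  {# Render a view display and a block from within a Twig template. #}
  {{ drupal_view('jobs_listing', 'block_1') }}
  {{ drupal_block('system_branding_block') }}

Helpers like these reduce the need for custom preprocess code when theming.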

We rebuilt the main website’s theme on the basis of the Bootstrap theme. It builds a bridge between Drupal and the Bootstrap Framework — a powerful and intuitive, mobile-first front-end framework.

Migrate to Drupal 8 with us!

If you are still on Drupal 7, consider migrating your website to Drupal 8 now. No matter whether your website is simple or complex, our Drupal team will smoothly and carefully upgrade it to Drupal 8, as well as help you prepare for Drupal 9.

Oct 16 2019

Every online store owner is looking to boost e-commerce sales. A lot here depends on the platform the website is built with — some of them are able to give you more than others.

We believe — and we can prove — that Drupal is the best solution for e-commerce websites. Drupal online stores are usually based on one of the two pillars of Drupal e-commerce development:

  • Ubercart
  • or Drupal Commerce

However, one of these pillars has been getting weaker lately. So if you have a store on Ubercart, let’s discuss why you should consider an Ubercart to Commerce migration.

Making the shift should make your store richer in features, give it reliability for the future, and help you boost e-commerce sales.

Why you should consider Ubercart to Commerce migration

Ubercart and Drupal Commerce are both officially contributed modules, but they are more than that. They represent full-fledged e-commerce platforms and serve as the basis for online stores with all the necessary features present out-of-the-box.

Add-on modules can further extend their capabilities — for example, we posted a collection of useful Drupal 8 modules for e-commerce that help you boost sales in many ways. This collection featured both Ubercart and Drupal Commerce add-ons. However, the situation is changing now — let’s see in what way.

Ubercart: how e-commerce began for Drupal

Ubercart was the first solution that appeared in Drupal's e-commerce development and gave birth to Drupal Commerce. Its first release for Drupal 5 was in 2007. Ubercart is known for its user-friendliness and simplicity.

Already out-of-the-box, it has configurable product catalogs, single-page checkout, flexible system of product attributes, handy reports about the store activity, easy order creation, and other standard features your store may need.

However, if you want to think about your online store’s future and boost e-commerce sales, the next chapter may be a little discouraging for store owners with Ubercart.

Vague future prospects for Ubercart in Drupal 8

Unfortunately, it looks like the development community is shifting their efforts to Commerce and has begun to forget about Ubercart. All this considerably impacts the creation of new and innovative features for the UC.

Even Ryan Szrama, the lead developer for Ubercart, is now the lead developer for Drupal Commerce. One of the most painful things here is the absence of a stable Drupal 8 release. The latest version of Ubercart for Drupal 8 is in an alpha state, which means it is not ready for live sites. It was released back in 2016, which is a sign of a very weak development process. Commits to the Drupal 8 branch are getting very rare.

Sooner or later, everyone will have to face the fact that Drupal 7 is reaching its end-of-life in 2021. The Drupal world is already preparing for Drupal 9, which is coming in June of 2020. Experts recommend an upgrade from Drupal 7 to Drupal 8 as the best way to prepare for Drupal 9. In this case, websites that are free of deprecated code will jump to Drupal 9 easily.

Since Drupal Commerce fully supports Drupal 8 and Ubercart effectively does not, your Ubercart to Drupal Commerce migration will be needed as part of the natural website upgrade strategy.

Drupal Commerce benefits to boost e-commerce sales

Drupal Commerce is experiencing a true development boom. Drupal Commerce 2.0, the first stable release for Drupal 8, came in September 2017. And we already have the 8.x-2.14 version, released in August 2019.

Drupal Commerce demo

Our colleagues from Drudesk have outlined a few of the great Drupal Commerce 2 features that boost e-commerce sales by making the stores convenient and effective:

  • 70+ payment gateways supported
  • single or multiple stores with multiple currencies
  • unique SKUs for product variations
  • a great choice of payment types
  • automatic country-based tax calculation
  • customizable workflows for different order types
  • configurable shipping methods and shipping integrations
  • flexible checkouts

and much more.

The rise of the decoupled commerce architecture

It should be especially noted that Drupal Commerce supports the decoupled architecture that is a hot trend in today’s development. In decoupled Drupal Commerce, the presentation layer is entrusted to a JavaScript framework and is separated from the e-commerce backend.

decoupled Drupal Commerce example

This helps you boost e-commerce sales by significantly enhancing the user experience. Namely, it provides high website speed, real-time cart updates with product previews when the user changes the product quantity, sizes, etc., push notifications, and more. Among the interesting modules specifically for decoupled commerce are:

  • Commerce Cart API provides a RESTful interface for the interaction with shopping carts through a lightweight API.
  • Commerce Cart Flyout creates a sidebar that flies out whenever a user clicks on the shopping cart and wants to view or edit the product details.

Commerce Cart Flyout module for Drupal

Migrate from Ubercart to Drupal Commerce with us

So let a future-ready and feature-rich online store help you boost e-commerce sales! If you are with Ubercart, entrust our Drupal team with doing a smooth Ubercart to Commerce migration, as well as your complete website's upgrade to Drupal 8. Drop us a line and let’s discuss the details.

Oct 16 2019
Oct 16

Events like BADCamp are staples for our team at OpenSense Labs. We thrive on such camps and Drupal community affairs. Last week, our team attended BADCamp 2019 from 2-5 October in Berkeley, California.

Engrossed in the activities from the planning and organizing of the camp to sponsoring and speaking at the sessions, we were thrilled to be a part of such an astounding camp.

BADCamp photo

This year, the four-day event organized in Berkeley had a great lineup of sessions and training events for us to make the most of.

As a Contrib sponsor, we had a dedicated booth at the sponsor area. The team had a great time connecting with fellow Drupalers, budding developers and community members who share a common mission and philosophy. Our cool t-shirts and Drupal stickers were a hit among the community! As a Drupal agency, helping and strengthening the community with people across the globe is the best thing about Drupal for us. 

With engaging conversations and new friendships, it gives us immense pleasure to say that BADCamp 2019 was a success. Let's hear it from the team now!

Vidhatanand V

CEO, OpenSense Labs
Vid at BADcamp 2019

Over the course of four amazing days, BADCamp 2019 was an exceptional Drupal event to be a part of in Berkeley! From connecting with the community across the globe to enjoying the best of social gatherings, we had a great time. I would like to thank everyone who came to my session on Federated Search. Apart from being an opportunity to share my experience, it helped me grow and learn from the discussions that followed, and I absolutely loved how the session turned into an engrossing discussion. It's always a good feeling to share knowledge; after all, it's the Drupal community.

With a lot of professional experiences that we take home, we now look forward to another spectacular year of friendly faces, engaging conversations, and activities in 2020.

Thank you, BADCamp! 

Devanshu Arora

Business Head, OpenSense Labs
Dev at BADCamp 2019

Being one of the largest Drupal events in the West, BADCamp is one of my favorite events. As a second time exhibitor and third-time attendee, we were excited to reunite with the community in Berkeley this year. It was incredibly well organized and full of experiences.

Having a dedicated table at the sponsor area, Vid and I were able to hold fun discussions with the fellow community members and made new friends, too. Anne Stefanyk’s session on How to Work Remotely had great takeaways. Platform.sh as usual, hosted a fantastic social.

It was so great to be back in the Bay Area and I cannot wait to see what more can unfold in 2020!

Until next time!


Thank you BADCamp for a remarkable experience! And thanks to everyone who attended BADCamp and stopped by our booth to say hi. It was our pleasure to be a sponsor and we are glad we could help and contribute.  

You can access our session on Federated Search here.

Next Up

We are heading to the Netherlands for DrupalCon next week! With more exciting sessions on Content Marketing and Theming of Drupal 8, our team is all set to contribute as a silver sponsor at DrupalCon Amsterdam 2019. 

Oct 16 2019
Oct 16

According to Statista, there were an estimated nearly 2 billion digital buyers in 2019 alone. Online shopping, since it was invented by Michael Aldrich in 1979 in the form of teleshopping, has evolved into a mammoth industry. A lot has changed since the first products were sold online in the 1990s. Today, the eCommerce shift is an important, palpable movement in most economies. Digital innovation in customer experience, business models and technology has been changing digital commerce.


Remarkable eCommerce solutions propel great customer journeys. In other words, the technology you use has to deliver on the customer promise. Being a leading content management system and enabler of digital transformation, Drupal can’t be far behind. Its eCommerce solution in Drupal Commerce is built to deliver marvellous customer experience. And when another great solution called React Native is used in tandem, there is no limit to what can be achieved.

The decoupled approach with Drupal Commerce


Drupal Commerce is one of the prolific e-commerce solutions that is powering thousands and thousands of online stores of all sizes. It integrates commerce, content and community for creating engrossing web experiences. As a matter of fact, it is the only commerce platform to have been built upon an enterprise CMS which makes it an immensely content-driven solution. Moreover, being an open-source eCommerce framework, highly modular and configurable, easy to use and highly extensible, Drupal Commerce helps organisations in powering product marketing solutions to a great extent.

Usage statistics of Drupal Commerce | Source: Drupal.org

Decoupling Drupal Commerce can be even better for enhancing scalability and flexibility. In decoupled Drupal commerce, the frontend of your shopping experience is separated from the backend. In this, Drupal’s astounding content management capabilities are leveraged for greater flexibility in developing your commerce experience. And the fast and reactive JavaScript interfaces communicate with powerful Drupal backends through REST API.

Decoupling Drupal Commerce can be even better for enhancing scalability and flexibility

There is a plenitude of modules that can come in handy in the development of your decoupled Drupal Commerce-powered shopping platform. First up is the Drupal Commerce module, your go-to option for building a Drupal-powered digital commerce platform. You can replace the default cart block with the Commerce Cart Flyout module for a progressively decoupled implementation. If you need a RESTful interface that can communicate with carts in Drupal Commerce through a lightweight public API, the Commerce Cart API module helps in building fully decoupled or progressively decoupled cart experiences.

And modules like JSON:API and JSON:API Extras can be highly useful. While the JSON:API module allows you to generate an API server implementing the JSON:API specification, JSON:API Extras helps in customising your API. JSON:API is now also part of Drupal core, which makes it an even more quintessential asset.

To get a complete list of essential modules in decoupled Drupal ecosystem, check them out here.

React Native: An efficacious mobile app solution


React Native, a mobile counterpart of the React JavaScript library, helps in building native mobile applications. Applications built using React Native are distinct from apps built using Java or Objective-C.

React Native, with its top-of-the-line business value, offers new perspectives

React Native allows you to iterate at lightning speed, works tremendously well on targeted platforms and streamlines the debugging process. It leverages the same fundamental UI (user interface) building blocks as regular iOS and Android applications, but the difference is that you assemble the building blocks with the help of React and JavaScript. It works well with components written in Swift, Java or Objective-C.

Since its inception as Facebook's internal hackathon project in 2013, React Native has become one of the most sought-after frameworks. Shoutem states that the first public preview of React Native took place in 2015 at React.js Conf. Later in the same year, React Native was made an open-source framework available on GitHub.

Today, React Native, with its top-of-the-line business value, offers new perspectives and in various contexts for building mobile applications. Not surprisingly, top-rated mobile apps show an inclination towards React Native for a great mobile presence.


Decoupled Drupal Commerce + React Native

Think about the astronomical preeminence you can attain in your e-commerce venture when you combine the greatness of Drupal Commerce and React Native. Decoupled Drupal Commerce, when leveraged along with React Native, can work wonders.
This is exactly what Eldum Rétt, an Icelandic market-leading subscription service that delivers food boxes to private households, chose for its digital commerce presence. The original Eldum Rétt website was built with Drupal 7 and Drupal Commerce; over time, the demand for more flexibility grew, eventually calling for an upgrade. A digital agency helped them move up to Drupal 8 and Drupal Commerce 2, along with a React Native-based mobile application that interacts with the main Drupal-powered website.

Homepage of the Eldum Rétt website

The revamp of Eldum Rétt was crucial for exploring new market opportunities, offering a better user experience via a native application, and enabling fantastic customer engagement. A modern solution required a modern tech stack. A spectacular content store like Drupal, with its commerce suite in the form of Drupal Commerce, ensured that Eldum Rétt got a modern e-commerce platform that could improve its digital presence. Moreover, being open source and highly extensible, Drupal made it easy to extend features via its APIs and enable intricate functionalities. Here, Drupal acted as the main data store for all products, user data and order information, and exposed these to the mobile application for customers to interact with.
A flexible subscription system was created where any product declared as subscribable can be bought as a subscription. It also enables flexible configuration of meal kits and menus, and schedules can be easily set. Moreover, it uses a smart packing algorithm, and the shipping process lets customers choose small deliveries instead of bundle deliveries. It has an advanced notification system and leverages OAuth for authentication. Recipes include comprehensive, structured information, and most of the data was migrated from the old system to the new one.


The combination of decoupled Drupal Commerce and React Native can be fruitful for a great digital commerce presence.

We have been offering digital innovation solutions with our expertise in Drupal Commerce. Contact us at [email protected] and let us know how you want us to build a unique and innovative solution using Drupal Commerce.

Oct 16 2019
Oct 16

Join the team making Drupal 9 as part of the contribution events in person at DrupalCon Amsterdam and remotely! We'll be spending the last day of DrupalCon Amsterdam working on three things.

  1. Removing deprecated API use from Drupal 8 contributed projects to get them ready for Drupal 9.
  2. Updating core dependencies and removing deprecated APIs from Drupal 9 itself.
  3. Improving the tools and documentation that help people prepare for Drupal 9.

On October 31st, join us on Slack, even if you are there in person, and definitely if you are remote. We'll meet in the #d9readiness channel on Drupal Slack. If you're at DrupalCon Amsterdam, come to the Europe 2 Forum!

To remove deprecated API use from contributed projects:

  1. Tools: Pull up the getting started info at https://tiny.cc/d9readiness and get set up with the drupal-check tool to check your code for deprecated API use.
  2. Help people who want to learn how to remove deprecated API use from projects, or remove some deprecated API use yourself! In minutes, you too could help make Drupal 9! Here are tips and tricks for the most common deprecated APIs: https://github.com/mglaman/drupal-check/wiki/Deprecation-Error-Solutions.
  3. Pick an issue: Check the list of issues tagged Drupal 9 compatibility + Amsterdam2019 (or use your own module!).
  4. If you're a module developer who would like a better chance of having your module reviewed / patched by a new contributor there, please create (or tag an existing) issue with the Drupal 9 compatibility + Amsterdam2019 tags!

To remove the deprecated APIs themselves, update dependencies in Drupal 9 or improve the deprecated API checking tools and documentation, please ask for specifics in the Slack channel or in person. The Drupal 9 tables will be easy to find!

You can also help spread the word by retweeting this:

Join #IMadeDrupal9 at @DrupalConEur and remotely! https://t.co/85IUEelGqK pic.twitter.com/uKWDx16Nid

— Gábor Hojtsy (@gaborhojtsy) October 16, 2019

1xINTERNET will even supply a limited number of stickers to proudly show that you made a difference!

Thanks everyone for contributing to making Drupal 9!

Oct 16 2019
Oct 16

Unit tests are the fastest, most reliable kinds of tests: they confirm that the smallest units of your code, i.e. class methods, work as expected.

Unit tests do not require a full environment with a database and external libraries; this makes unit tests extremely fast.

In this article we will look at how to take any PHP code – a Drupal site or module, or indeed any other PHP codebase unrelated to Drupal – and start unit testing it today. We’ll start by setting up tests which work for any PHP code, and then we’ll see how to run your tests on the Drupal testbot if you so desire.

This article accompanies a talk I gave about unit testing at Drupalcamp Ottawa on October 18, 2019; here are the accompanying slides.

Before we start testing

Unit tests are useless unless they are run on every change (commit) to a codebase through continuous integration (CI). And it’s excruciatingly painful to make CI work without some sort of platform-agnostic DevOps setup (we’ll use a Docker-based workflow), so before we even start testing, we’ll set up CI and Docker.

Docker for all things

In the context of this article, we’ll define DevOps as a way to embed all dependencies within our code, meaning we want to limit the number of dependencies on our computer or CI server to run our code. To do this, we will start by installing and starting Docker Desktop.

Once you’ve set it up, confirm you have Docker running:

docker -v
# Docker version 19.03.2, build 6a30dfc

At this point, we can be assured that any code we run through Docker will run on any machine which has Docker installed. In this article we’ll use mostly PHPUnit, so instead of installing and configuring PHPUnit on our computer and our CI server and our colleagues’ computers, we can simply make sure our computer and our CI server have Docker installed, and run:

docker run --rm phpunit/phpunit --version

The first time this is run on an environment, it should result in:

Unable to find image 'phpunit/phpunit:latest' locally
latest: Pulling from phpunit/phpunit
Digest: sha256:bbbb143951f55fe93dbfed9adf130cae8623a1948f5a458e1aabbd175f7cb0b6
Status: Downloaded newer image for phpunit/phpunit:latest
PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

On subsequent runs it will result in:

PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

Installing PHPUnit can also be done through Composer. In this article we won’t use Composer because

  • that would require us to manage a specific version of PHP on each machine;
  • Composer does not work for programming languages other than PHP (say, for example, we want to unit test Javascript or Python).

Let’s get started!

Host your code on Github or Bitbucket

We will avoid getting ahead of ourselves by learning and using Drupal’s unit test classes (which are based on PHPUnit) and testing infrastructure (we’ll do that below): we want to start by understanding how to unit test any PHP code (Drupal or otherwise).

To that end, we will need to host our code (or a mirror thereof) on non-Drupal infrastructure. Github and Bitbucket both integrate with CircleCI, a free, fast, and easy cloud continuous integration (CI) service with no vendor lock-in; we’ll use CircleCI later on in this article. With understanding of general unit testing principles under your belt, you can later move on to use framework-specific (including Drupal-specific) testing environments if you deem it necessary (for example if you are a contributor to core or to contrib modules which follow Drupal’s testing guidelines).

To demonstrate the principles in this article, I have taken a random Drupal 8 module which, at the time of this writing, has no unit tests: Automatic Entity Label. My selection is completely arbitrary; I don't use this module myself, and I'm not advocating for or against its use.

So, as my first step, I have added v. 8.x-3.0-beta1 of this module as is to Github, and tagged it as “original”.

You can see the version I uploaded to Github, without tests, here. There are no unit tests – yet.

Start continuous integration

Because, as we mentioned above, automated testing is all but useless without continuous integration (CI) to confirm your tests are passing, the next step is to set up CI. Attaching CircleCI to Github repos is straightforward. I started by adding a test that simply confirms that we can access PHPUnit on our CI environment.

Here are the changes I made to my code to add continuous integration. At this stage, this code only confirms that PHPUnit can be run via Docker, nothing else. If you want to follow along with your own codebase, you can add the same minor changes (in fact you are encouraged to do so). The change to the README.md document is a “Badge” which displays as green if tests pass, and red if they don’t, on the project’s home page. The rest is straightforward.

Once your code is set up for CI integration, create an account and log on to CircleCI using your Github account (Bitbucket works also), select your project from your list of projects (“Set Up Project” button), and start building it (“Start Building” button); that’s it!

Here is my very first build for my version of Auto Entity Label. It is worth unfolding the “Tests” section and looking at the test results:

Unable to find image 'phpunit/phpunit:latest' locally
latest: Pulling from phpunit/phpunit
Digest: sha256:bbbb143951f55fe93dbfed9adf130cae8623a1948f5a458e1aabbd175f7cb0b6
Status: Downloaded newer image for phpunit/phpunit:latest
PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

You’ll notice that you have output very similar to what you have on your own computer. That’s the magic of Docker: build once, run anywhere. Without it, Continuous Integration is like pulling teeth.

Setting up PHPUnit to actually run tests

Before we can test anything, PHPUnit needs to know where the tests reside, which tests to run, and how to autoload classes based on their namespace. Different frameworks, including Drupal, have recommendations on all this, but to get a good idea of how PHPUnit works, let’s start from scratch by creating four new files in our project (keep them empty for now):

  • ./phpunit.xml, at the root of our project, will define where our tests are located, and where our autoloader is located.
  • ./phpunit-autoload.php, at the root of our project, is our autoloader; it tells PHPUnit that, for example, the namespace Drupal\auto_entitylabel\AutoEntityLabelManager corresponds to the file src/AutoEntityLabelManager.php.
  • ./phpunit-bootstrap.php, we’ll leave empty for now, and look at it later on.
  • ./tests/AutoEntityLabelManagerTest.php, which will contain a test for the AutoEntityLabelManager class.


./phpunit.xml

In this file, we’ll tell PHPUnit where to find our tests, and where the autoloader is. Different developers have their own preferences for what to put here, and Drupal has specific recommendations, but for now we’ll just use a simple file declaring that our tests are in ./tests (although they could be anywhere), and that the file phpunit-autoload.php (you could name it anything) should be loaded before each test is run:

<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="phpunit-autoload.php">
    <testsuite name="myproject">
        <directory>./tests</directory>
    </testsuite>
</phpunit>


./phpunit-autoload.php

In this file, we’ll tell PHPUnit how to find files based on namespaces. Different projects do this differently. For example, Drupal 7 has a custom Drupal-only way of autoloading classes; Drupal 8 uses the PSR-4 standard. In our example, we’re telling PHPUnit that any code which uses the class Drupal\auto_entitylabel\Something will load the corresponding file ./src/Something.php:


<?php

/**
 * @file
 * PHPUnit class autoloader.
 *
 * PHPUnit knows nothing about Drupal, so provide PHPUnit with the bare
 * minimum it needs to know in order to find classes by namespace.
 *
 * Used by the PHPUnit test runner and referenced in ./phpunit.xml.
 */

spl_autoload_register(function ($class) {
  if (substr($class, 0, strlen('Drupal\\auto_entitylabel\\')) == 'Drupal\\auto_entitylabel\\') {
    $class2 = str_replace('Drupal\\auto_entitylabel\\', '', $class);
    $path = 'src/' . str_replace('\\', '/', $class2) . '.php';
    require_once $path;
  }
});


./phpunit-bootstrap.php

We’ll leave this one empty for now; later on we’ll use it to put dummy versions of classes that Drupal code expects to find.


./tests/AutoEntityLabelManagerTest.php

Here is our first test. Let’s start with a very simple unit test: one which tests a pure function with no externalities.

Let’s take AutoEntityLabelManager::auto_entitylabel_entity_label_visible().

Here it is in context, and here is the actual code we want to test:

public static function auto_entitylabel_entity_label_visible($entity_type) {
  // @codingStandardsIgnoreEnd
  $hidden = [
    'profile2' => TRUE,
  ];
  return empty($hidden[$entity_type]);
}

This is actual code which exists in the Auto Entity Label project; I have never tried this function in a running Drupal instance, and I’m not even sure why it’s there, but I can still test it. I assume that if I call AutoEntityLabelManager::auto_entitylabel_entity_label_visible('whatever'), I should get TRUE as a response. This is what I will test for in ./tests/AutoEntityLabelManagerTest.php:


<?php

namespace Drupal\auto_entitylabel\Tests;

use Drupal\auto_entitylabel\AutoEntityLabelManager;
use PHPUnit\Framework\TestCase;

/**
 * Test AutoEntityLabelManager.
 *
 * @group myproject
 */
class AutoEntityLabelManagerTest extends TestCase {

  /**
   * Test for auto_entitylabel_entity_label_visible().
   *
   * @cover ::auto_entitylabel_entity_label_visible
   */
  public function testAuto_entitylabel_entity_label_visible() {
    $this->assertTrue(AutoEntityLabelManager::auto_entitylabel_entity_label_visible('whatever') === TRUE, 'Label "whatever" is visible.');
  }

}
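A quick aside on why passing 'whatever' should yield TRUE: the method's behaviour hinges on how PHP's empty() treats array lookups. Here is a standalone sketch in plain PHP (not module code):

```php
<?php

// empty() on a missing array key returns TRUE without raising a notice,
// while empty(TRUE) returns FALSE. That is exactly why unknown entity
// types are "visible" and 'profile2' is not.
$hidden = ['profile2' => TRUE];

var_dump(empty($hidden['whatever']));  // missing key: TRUE, so label is visible
var_dump(empty($hidden['profile2'])); // value is TRUE: FALSE, so label is hidden
```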


For test methods to be called by PHPUnit, their names need to start with lowercase test.

(If you have looked at other Drupal unit testing tutorials, you might have noticed that Drupal unit tests are based not on PHPUnit\Framework\TestCase but on Drupal\Tests\UnitTestCase. The latter provides some useful, but not critical, helper code. In our case, using PHPUnit directly without Drupal means we don’t depend on Drupal to run our code; and we can better understand the intricacies of PHPUnit.)


Finally we’ll need to tweak ./scripts/ci.sh a bit:

docker run --rm -v "$(pwd)":/app phpunit/phpunit \
  --group myproject

Adding -v "$(pwd)":/app shares our code on our host computer or server with a directory called /app on the PHPUnit Docker container, so PHPUnit actually has access to our code. --group myproject runs all tests in the “myproject” group (recall that in tests/AutoEntityLabelManagerTest.php, we have added @group myproject to the class comment).

Here are the changes we made to our code.

Running our first test… and running into our first problem

With all those changes in place, if you run ./scripts/ci.sh, you should have this output:

$ ./scripts/ci.sh
PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

…and this Fatal error…

PHP Fatal error:  Trait 'Drupal\Core\StringTranslation\StringTranslationTrait' not found in /app/src/AutoEntityLabelManager.php on line 16

So what’s happening here? It turns out AutoEntityLabelManager uses something called StringTranslationTrait. A PHP trait is a code sharing pattern. It’s a fascinating topic and super useful to write testable code (we’ll get to it later); but right now we don’t need it and don’t really care about it, it’s just getting in the way of our test. We somehow need to tell PHPUnit that Drupal\Core\StringTranslation\StringTranslationTrait needs to exist, but we don’t really care – right now – what it does.
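For readers who have not used traits before, here is a minimal standalone illustration; the names (HelloTrait, Greeter) are made up for this sketch and have nothing to do with Drupal or the module:

```php
<?php

// A trait is a reusable bundle of methods that any class can mix in
// with "use". Hypothetical names, purely for illustration.
trait HelloTrait {
  public function hello(): string {
    return 'Hello, ' . $this->name;
  }
}

class Greeter {
  use HelloTrait;

  public $name;

  public function __construct(string $name) {
    $this->name = $name;
  }
}

echo (new Greeter('world'))->hello(); // prints "Hello, world"
```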

That’s where our phpunit-bootstrap.php file comes in. In it, we can define Drupal\Core\StringTranslation\StringTranslationTrait so that PHP will not complain that it does not exist.

In phpunit-autoload.php, require phpunit-bootstrap.php:

require_once 'phpunit-bootstrap.php';

And in phpunit-bootstrap.php, define a dummy version of Drupal\Core\StringTranslation\StringTranslationTrait:


<?php

/**
 * @file
 * PHPUnit knows nothing about Drupal. Declare required classes here.
 */

namespace Drupal\Core\StringTranslation {
  trait StringTranslationTrait {}
}

Here is the diff in our repo.

Running our first passing test!

This is a big day for you, it’s the day of your first passing test:

$ ./scripts/ci.sh
PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

.                                                                   1 / 1 (100%)

Time: 124 ms, Memory: 4.00MB

OK (1 test, 1 assertion)

Because of the magic of Docker, the same output can be found on our CI infrastructure’s equivalent passing test (by unfolding the “Tests” section) once we push our code to Github.

Introducing test providers

OK, we’re getting into the jargon of PHPUnit now. To introduce the concept of test providers, consider this: almost every time we run a test, we’d like to bombard our unit (our PHP method) with a variety of inputs and expected outputs, and confirm our unit always works as expected.

The basic testing code is always the same, but the inputs and expected outputs change.
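The pattern can be sketched in plain PHP, outside PHPUnit: one piece of test logic driven by a list of (input, expected) pairs. The isVisible() function below is a simplified, hypothetical stand-in for the method under test:

```php
<?php

// Simplified stand-in for
// AutoEntityLabelManager::auto_entitylabel_entity_label_visible().
function isVisible(string $type): bool {
  $hidden = ['profile2' => TRUE];
  return empty($hidden[$type]);
}

// The "provider": data only, no logic.
$cases = [
  ['input' => 'whatever', 'expected' => TRUE],
  ['input' => 'profile2', 'expected' => FALSE],
  ['input' => '', 'expected' => TRUE],
];

// The test logic: identical for every case.
foreach ($cases as $case) {
  assert(isVisible($case['input']) === $case['expected']);
}
```

PHPUnit's @dataProvider annotation automates exactly this loop, reporting each case as a separate test.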

Consider our existing test:

/**
 * Test for auto_entitylabel_entity_label_visible().
 *
 * @cover ::auto_entitylabel_entity_label_visible
 */
public function testAuto_entitylabel_entity_label_visible() {
  $this->assertTrue(AutoEntityLabelManager::auto_entitylabel_entity_label_visible('whatever') === TRUE, 'Label "whatever" is visible.');
}

Maybe calling our method with “whatever” should yield TRUE, but we might also want to test other inputs to make sure we cover every possible use case for the method. In our case, looking at the method, we can reasonably surmise that calling it with “profile2” should yield FALSE. Again, I’m not sure why this is; in the context of this tutorial, all I want to do is to make sure the method works as expected.

So the answer here is to separate the testing code from the inputs and expected outputs. That’s where the provider comes in. We will add arguments to the test code, and define a separate function which calls our test code with different arguments. The end result looks like this (I also like to print_r() the expected and actual output in case they differ, but this is not required):

/**
 * Test for auto_entitylabel_entity_label_visible().
 *
 * @param string $message
 *   The test message.
 * @param string $input
 *   Input string.
 * @param bool $expected
 *   Expected output.
 *
 * @covers ::auto_entitylabel_entity_label_visible
 * @dataProvider providerAuto_entitylabel_entity_label_visible
 */
public function testAuto_entitylabel_entity_label_visible(string $message, string $input, bool $expected) {
  $output = AutoEntityLabelManager::auto_entitylabel_entity_label_visible($input);

  if ($output != $expected) {
    print_r([
      'output' => $output,
      'expected' => $expected,
    ]);
  }

  $this->assertTrue($output === $expected, $message);
}

/**
 * Provider for testAuto_entitylabel_entity_label_visible().
 */
public function providerAuto_entitylabel_entity_label_visible() {
  return [
    [
      'message' => 'Label "whatever" is visible',
      'input' => 'whatever',
      'expected' => TRUE,
    ],
    [
      'message' => 'Label "profile2" is invisible',
      'input' => 'profile2',
      'expected' => FALSE,
    ],
    [
      'message' => 'Empty label is visible',
      'input' => '',
      'expected' => TRUE,
    ],
  ];
}
Here is the diff in GitHub.

At this point, we have one test method being called with three different sets of data, so the same test method is being run three times; running the test now shows three dots:

$ ./scripts/ci.sh
PHPUnit 6.5.13 by Sebastian Bergmann, Julien Breux (Docker) and contributors.

...                                                                 3 / 3 (100%)

Time: 232 ms, Memory: 4.00MB

OK (3 tests, 3 assertions)

Breaking down monster functions

It must be human nature, but over time, during development, functions tend to get longer and longer, and more and more complex. Functions longer than a few lines tend to be hard to test, because of the sheer number of possible execution paths, especially if there are several levels of control statements.

Let’s take, as an example, auto_entitylabel_prepare_entityform(). With its multiple switch and if statements, it has a cyclomatic complexity of 7, the highest in this codebase, according to the static analysis tool Pdepend. If you’re curious about finding your cyclomatic complexity, you can use the magic of Docker, run the following, and take a look at ./php_code_quality/pdepend_output.xml:

mkdir -p php_code_quality && docker run -it --rm -v "$PWD":/app -w /app adamculp/php-code-quality:latest php /usr/local/lib/php-code-quality/vendor/bin/pdepend --suffix='php,module' --summary-xml='./php_code_quality/pdepend_output.xml' .

See adamculp/php-code-quality for more details. But I digress…

Testing this completely would require close to 2 to the power 7 (128) provider entries, so the easiest approach is to break it down into smaller functions with a lower cyclomatic complexity (that is, fewer control statements). We’ll get to that in a minute, but first…

Procedural code is not testable; use class methods

For all but pure functions, procedural code like our auto_entitylabel_prepare_entityform(), as well as private and static methods, is untestable with mock objects (which we’ll get to later). Therefore, any code you’d like to test should exist within a class. For our purposes, we’ll put auto_entitylabel_prepare_entityform() within a Singleton class, like this, and name it prepareEntityForm(). (You don’t need to use a Singleton; you can use a Drupal service or whatever you want, as long as everything you want to test is a non-static class method.)
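The linked commit shows the real conversion; as a rough self-contained sketch of the pattern (the method body and the message strings here are placeholders, not the module’s actual logic), a singleton wrapper around a procedural function can look like this:

```php
<?php

/**
 * Sketch of a singleton wrapper; the real class in the module may differ.
 */
class AutoEntityLabelSingleton {

  /**
   * The single instance of this class.
   *
   * @var AutoEntityLabelSingleton
   */
  private static $instance;

  /**
   * Get the single instance of this class, creating it on first use.
   */
  public static function instance() : AutoEntityLabelSingleton {
    if (self::$instance === NULL) {
      self::$instance = new static();
    }
    return self::$instance;
  }

  /**
   * The body of auto_entitylabel_prepare_entityform() moves here, where it
   * can be tested (and partially mocked) as a non-static class method.
   */
  public function prepareEntityForm($form) {
    // ... the original procedural logic goes here ...
    return $form;
  }

}

// The legacy procedural function then simply delegates to the class:
function auto_entitylabel_prepare_entityform($form) {
  return AutoEntityLabelSingleton::instance()->prepareEntityForm($form);
}
```

With this in place, tests can mock prepareEntityForm()’s collaborators, while existing callers of the procedural function keep working unchanged.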

Our second test

So we put our procedural code in a class. But the problem remains: it’s too complex to fully cover with unit tests, so as a next step I recommend surgically removing only those parts of the method we want to test, and putting them in a separate method. Let’s focus on these lines of code, which can lead to this change in our code.

Object and method mocking, and stubs

Let’s consider a scenario where we want to add some tests to EntityLabelNotNullConstraintValidator::validate().

Let’s start by splitting the validate method into smaller parts, like this. We will now focus on testing a more manageable method with a lower cyclomatic complexity:

/**
 * Manage typed data if it is valid.
 *
 * @return bool
 *   FALSE if the parent class validation should be called.
 */
public function manageTypedData() : bool {
  $typed_data = $this->getTypedData();
  if ($typed_data instanceof FieldItemList && $typed_data->isEmpty()) {
    return $this->manageValidTypedData($typed_data);
  }
  return FALSE;
}

Recall that in unit testing, we are only testing single units of code. In this case, the unit of code we are testing is manageTypedData(), above.

In order to test manageTypedData() and nothing else, we conceptually need to assume that getTypedData() and manageValidTypedData() are doing their jobs. We will not call them: calling the real methods would interfere with our testing of manageTypedData(). Instead, we replace them with stub methods within a mock object and make them return whatever we want.

PHPUnit achieves this by making a copy of our EntityLabelNotNullConstraintValidator class, where getTypedData() and manageValidTypedData() are replaced with our own methods which return what we want. So in the context of our test, we do not instantiate EntityLabelNotNullConstraintValidator, but rather, a mock version of that class in which we replace certain methods. Here is how to instantiate that class:

$object = $this->getMockBuilder(EntityLabelNotNullConstraintValidator::class)
  ->setMethods([
    'getTypedData',
    'manageValidTypedData',
  ])
  ->getMock();
// We don't care how getTypedData() figures out what to return to
// manageTypedData(), but we do want to see how our function will react
// to a variety of possibilities.
$object->method('getTypedData')
  ->willReturn($input);
// We will assume manageValidTypedData() is doing its job; that's not
// what we are testing here. For our test, it will always return TRUE.
$object->method('manageValidTypedData')
  ->willReturn(TRUE);

In the above example, our new object behaves exactly as EntityLabelNotNullConstraintValidator, except that getTypedData() returns $input (which we’ll define in a provider); and manageValidTypedData() always returns TRUE.

Keep in mind that private methods cannot be mocked, so for that reason I generally avoid using them; use protected methods instead.

Here is our initial test for this.

Our provider, at this point, only makes sure that if getTypedData() returns a new \stdClass() which is not an instanceof FieldItemList, then the method we’re testing will return FALSE.

Here is how we could extend our provider to make sure our method reacts correctly if getTypedData() returns a FieldItemList whose isEmpty() method returns TRUE, and one whose isEmpty() method returns FALSE.

Testing protected methods

Let’s say we want to (partially) test the protected AutoEntityLabelManager::getConfig(). To do that, we need to introduce a new trick.

Start by taking a look at our test code which fails. If you try to run this, you will get:

There was 1 error:

1) Drupal\auto_entitylabel\Tests\AutoEntityLabelManagerTest::testGetConfig
Error: Cannot access protected property Mock_AutoEntityLabelManager_0f5704cf::$config

So we want to test a protected method (getConfig()), and, in order to test it, we need to modify a protected property ($config). These two will result in “Cannot access”-type failures.

The solution is to use a trick known as class reflection; it’s a bit opaque, but it does allow us to access protected properties and methods.

Take a look at some changes which result in a working version of our test.

Copy-pasting is perhaps your best friend here, because this concept kind of plays with your mind. But basically, a ReflectionClass allows us to retrieve properties and methods as objects, then set their visibility using methods of those objects, then set their values or call them using their own methods… As I said, copy-pasting is good, sometimes.
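To make the trick concrete, here is a minimal self-contained sketch; the ConfigHaver class below is invented purely for illustration (the real test targets AutoEntityLabelManager):

```php
<?php

/**
 * Illustrative stand-in for a class with protected internals.
 */
class ConfigHaver {

  /**
   * A protected property we cannot normally set from a test.
   */
  protected $config = [];

  /**
   * A protected method we cannot normally call from a test.
   */
  protected function getConfig(string $key) {
    return isset($this->config[$key]) ? $this->config[$key] : NULL;
  }

}

$object = new ConfigHaver();
$reflection = new \ReflectionClass($object);

// Make the protected $config property writable, then set it.
$property = $reflection->getProperty('config');
$property->setAccessible(TRUE);
$property->setValue($object, ['foo' => 'bar']);

// Make the protected getConfig() method callable, then call it.
$method = $reflection->getMethod('getConfig');
$method->setAccessible(TRUE);
print $method->invokeArgs($object, ['foo']);
// Prints "bar".
```

The same two moves — getProperty()/setAccessible() and getMethod()/setAccessible() — are all the reflection the working test needs.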

A note about testing abstract classes

There are no abstract classes in Auto Entity Label, but if you want to test an abstract class, here is how to create a mock object:

$object = $this->getMockBuilder(MyAbstractClass::class)
  ->getMockForAbstractClass();

Using traits

Consider the following scenario: a bunch of your code uses the legacy drupal_set_message() method. You might have something like:

class a extends some_class {
  public function a() {
    // ...
    drupal_set_message('Hello from a', 'status');
  }
}

class b extends some_other_class {
  public function b() {
    // ...
    drupal_set_message('Hello from b', 'status');
  }
}

Your tests will complain if you try to call, or mock, drupal_set_message() when unit-testing a::a() or b::b(), because drupal_set_message() is procedural and you can’t do much with it (thankfully there is less and less procedural code in Drupal modules, but you’ll still find a lot of it).

So in order to make drupal_set_message() mockable, you might want to do something like:

class a extends some_class {
  protected function drupalSetMessage($x) {
    drupal_set_message($x);
  }
  public function a() {
    // ...
    $this->drupalSetMessage('Hello from a');
  }
}

class b extends some_other_class {
  protected function drupalSetMessage($x) {
    drupal_set_message($x);
  }
  public function b() {
    // ...
    $this->drupalSetMessage('Hello from b');
  }
}

Now, however, we’re in code duplication territory, which is not cool (well, not much of what we’re doing is cool, not in the traditional sense anyway). We can’t define a base class which has drupalSetMessage() as a method, because PHP doesn’t (and probably shouldn’t) support multiple inheritance. That’s where traits come in: a technique for code reuse which is exactly adapted to this situation:

trait commonMethodsTrait {
  protected function drupalSetMessage($x) {
    drupal_set_message($x);
  }
}

class a extends some_class {
  use commonMethodsTrait;

  public function a() {
    // ...
    $this->drupalSetMessage('Hello from a');
  }
}

class b extends some_other_class {
  use commonMethodsTrait;

  public function b() {
    // ...
    $this->drupalSetMessage('Hello from b');
  }
}

Drupal uses this a lot: the t() method is peppered throughout most of core and contrib; earlier in this article we ran into StringTranslationTrait, which allows developers to use $this->t() instead of the legacy t(), therefore making it mockable when testing methods which use it. The great thing about this approach is that we do not even need Drupal’s StringTranslationTrait when running our tests: we can mock t() even if a dummy version of StringTranslationTrait is used.

Check out this test for an example.

What about JavaScript, Python and other languages?

PHP has PHPUnit; other languages have their own test frameworks, and they, too, can run within Docker. JavaScript has AVA; Python has unittest.

All unit test frameworks support mocking.

Let’s look a bit more closely at AVA. We do not want to install and maintain it on all our developers’ machines and on our CI server, so we’ll use a Dockerized version of AVA. We can download that project and, specifically, run tests against example 3:

git clone [email protected]:dcycle/docker-ava.git
docker run -v $(pwd)/example03/test:/app/code \
  -v $(pwd)/example03/code:/mycode dcycle/ava

The result here, again due to the magic of Docker, should be:

So what’s going on here? We have some sample JavaScript code which has a function we’d like to test:

module.exports = {
  dangerlevel: function() {
    return this.tsunamidangerlevel() * 4 + this.volcanodangerlevel() * 10;
  },

  tsunamidangerlevel: function(num) {
    // Call some external API.
    // During tests, we want to ignore this function.
    return this_will_fail_during_testing();
  },

  volcanodangerlevel: function(num) {
    // Call some external API.
    // During tests, we want to ignore this function.
    return this_will_fail_during_testing();
  }
};

In this specific case we’d like to mock tsunamidangerlevel() and volcanodangerlevel() during unit testing: we don’t care that this_will_fail_during_testing() is unknown to our test code. Our test could look something like this:

import test from 'ava'
import sinon from 'sinon'

var my = require('/mycode/dangerlevel.js');

test('Danger level is correct', t => {
  sinon.stub(my, 'tsunamidangerlevel').returns(1);
  sinon.stub(my, 'volcanodangerlevel').returns(2);

  t.true(my.dangerlevel() == 24);
})

What we’re saying here is that if tsunamidangerlevel() returns 1 and volcanodangerlevel() returns 2, then dangerlevel() should return 24.

The Drupal testbot

Edit (December 10, 2019): until this issue is fixed I recommend using the CircleCI technique and not testing on the Drupal infrastructure.

Drupal has its own Continuous Integration infrastructure, or testbot. It’s a bit more involved to reproduce its results locally; still, you might want to use it if you are developing a Drupal module, and indeed you’ll have to use it if you are submitting patches to core.

In fact, it is possible to tweak our code a bit to allow it to run on the Drupal testbot and CircleCI.

Here are some changes to our code which allow exactly that. Let’s go over the changes required:

  • Tests need to be in ./tests/src/Unit;
  • The @group name should be unique to your project (you can use your project’s machine name);
  • The tests should have the namespace Drupal\Tests\my_project_machine_name\Unit or Drupal\Tests\my_project_machine_name\Unit\Sub\Folder (for example Drupal\Tests\my_project_machine_name\Unit\Plugin\Validation);
  • The unit tests have access to Drupal code. This is actually quite annoying: for example, we can no longer just create an anonymous class for FieldItemList; rather, we need to create a mock object using disableOriginalConstructor(). Because the unit test code is aware of Drupal, it knows that FieldItemList requires parameters to its constructor, and therefore complains when we don’t supply any (as is the case with an anonymous class).

To make sure this works, I created a project (it has to be a full project, as far as I can tell; it can’t be a sandbox project, or at least I didn’t figure out how to do this with a sandbox project) at Unit Test Tutorial. I then activated automated testing under the Automated testing tab.

The results can be seen on the Drupal testbot. Look for these lines specifically:

20:32:38 Drupal\Tests\auto_entitylabel\Unit\AutoEntityLabelSingletonT   2 passes
20:32:38 Drupal\Tests\auto_entitylabel\Unit\AutoEntityLabelManagerTes   4 passes
20:32:38 Drupal\Tests\auto_entitylabel\Unit\Plugin\Validation\EntityL   1 passes
20:32:38 Drupal\Tests\auto_entitylabel\Unit\Form\AutoEntityLabelFormT   1 passes

My main annoyance with using the Drupal testbot is that it’s hard to test locally; you need to have access to a Drupal instance with PHPUnit installed as a dev dependency, and a database. To remedy this, the Drupal Tester Docker project can be used to run Drupal-like tests locally, here is how:

git clone https://github.com/dcycle/drupal-tester.git
cd drupal-tester/
mkdir -p modules
cd modules
git clone --branch 8.x-1.x https://git.drupalcode.org/project/unit_test_tutorial.git
cd ..
./scripts/test.sh "--verbose --suppress-deprecations unit_test_tutorial"
docker-compose down -v

This will give you more or less the same results as the Drupal testbot:

Drupal\Tests\auto_entitylabel\Unit\AutoEntityLabelManagerTes   4 passes
Drupal\Tests\auto_entitylabel\Unit\AutoEntityLabelSingletonT   2 passes
Drupal\Tests\auto_entitylabel\Unit\Form\AutoEntityLabelFormT   1 passes
Drupal\Tests\auto_entitylabel\Unit\Plugin\Validation\EntityL   1 passes

In conclusion

Our promise, from the title of this article, is “Start unit testing your PHP code today”. Hopefully the tricks herein will allow you to do just that. My advice to you, dear testers, is to start by using Docker locally, then to make sure you have Continuous Integration set up (on Drupal testbot or CircleCI, or, as in our example, both), and only then start testing.

Happy coding!


Oct 15 2019
Oct 15

I began my DrupalEasy journey with the greatest of intentions. Jumping in head first, I upgraded to Windows 10 Pro, set up a new local development environment — I highly recommend DDEV for its power and flexibility, and because it allows development teams to use Docker in their workflow — and reacquainted myself to Composer and the command line. If there was a roll, I was on it.

Then week 2 happened. What I learned then is that unfortunately, having a teacher doesn’t automatically make the path to Drupal proficiency a smooth, easy ascent to greatness — at least not for me. The greatest challenge that I encountered, and totally underestimated, was the whole concept of time.

Now, if you’re anything like me, you’re learning Drupal while also working a full-time job. This was fine when I was teaching myself on my own time. But with an actual course like DrupalEasy, I totally underestimated the time commitment of scheduled class times and assignments. While the homework is optional, I have to at least attempt it to get the most out of the course.

In week 2, I had a vacation, a wedding, and a team retreat on my calendar. To say I fell behind in the class is an understatement. On top of catching up with email and work tasks, I now had to find time to watch hours of video lecture and complete the homework assignments. The class was learning without me and I felt totally frazzled.

I realized I had to get focused — to get really intentional with my time and plan, plan, plan. It was the only way to balance Drupal, work, and life. Thankfully, both Michael (my instructor) and Addi (my boss) were extremely supportive. I also knew there was a gap week scheduled that would allow me time to catch up. (Hello gap week!) Soon, I’ll be right back in line with all of my classmates as if I had been there all along.

So if your Drupal journey is anything like mine, know there’ll be bumps along the way. Mine was time. Just don’t let a bump on your path become a deterrent. It’s okay to fall behind or get a bit lost. Just don’t stop. There’s hope. Your “gap week” is approaching.

Oct 15 2019
Oct 15

Last month I began my second decade of working with Drupal! How crazy is that? I started at ComputerMinds in 2009. Drupalcon Paris was my first week on the job - I just remember taking so many notes, as if it were a university course! I had a lot to learn then, but now I can look back with a much more experienced head, hopefully wiser, with some sort of perspective.

The conference room before the keynote at Drupalcon Paris 2009, when my career with Drupal began.

My first steps with Drupal were on Drupal 6. It could do a lot (at least once CCK and Views were installed), and WordPress was much smaller, so the two were still occupying a similar space in the market. Most of the sites we built were trying to be a mini social network, a blog, or a brochure site, sometimes with e-commerce thrown in. There were rounded corners everywhere, garish pinks, and some terribly brittle JavaScript. Supporting Internet Explorer 6 was a nightmare, but still necessary.

It's probably only over the last few years that it became clear that Drupal's strength is its ability to be turned to just about anything that a website needs to do. That has meant that whilst alternative products have picked up the simpler sites, Drupal has been brilliant for projects with complex requirements. Integrating with CRMs and all sorts of other APIs, handling enormous traffic loads, providing content for apps - this kind of stuff has always been Drupal's jam. You just have to know how to get it hooked up in all the right places!

Speaking of hooks, it's been interesting to see Drupal move from its famous magical hooks towards event-driven architecture. For me, that single shift represented an enormous change in direction for Drupal. I believe the events/subscriber pattern, as a part of a wider effort in Drupal 8 to deliberately use existing well-defined design patterns and solutions, is a sign of a much more mature and professional platform. Most coding problems have been solved well elsewhere; we shouldn't reinvent the wheel! (Although I know I can be guilty of that!) That's just one example of how the Drupal ecosystem has become more professional over the last ten years. Formal testing is another example. Many people have felt left behind by this shift, which happened as Drupal identified a need it could meet in the enterprise world. 'Enterprise' gets used as a dirty word sometimes - but to be frank, there's more money and more suitable opportunity there!

That is something the Drupal community has to be honest about. It is rightly aiming to champion diversity, and be as accessible as possible for as many as possible across the world (I especially love that Drupal is now so good for multilingual projects). It's not like Drupal is suddenly inappropriate for smaller projects - in fact, I'd still suggest it's far ahead in many aspects.

But money makes things happen, and gives people their livelihood. I appreciate seeing honesty and innovation about this coming from community leaders. Thankfully those kinds of values are what drive the Drupal project, even if money is often the facilitator. As a community we must always fight to keep those things in the right relation to each other: money has an inevitable influence that we must accept, but it must be led by us and our values, not the other way around. I should add that I am very aware that I am privileged to be a developer in a leading Drupal agency, so my opinion will be shaped by that position!

To end with honesty and innovation myself, I would love to see the following carried into the Drupal project's next 10 years, which I saw the community espouse over the last decade. I know I need to grow in these things myself!

  • Maintain excellence, even as the make-up of the community changes.
  • Deliberately listen to under-represented portions of the community, as an intentional force against the skewing effect of money/power.
  • Keep watch for what competitors and other relevant products are doing, to incorporate worthwhile things wherever possible.
  • Reflect on the strengths & weaknesses of Drupal (and its community) with honesty. Let's make the most of what makes Drupal what it is and not be afraid for it to focus on that; the same goes for us as individuals.

I'm proud to work with Drupal, long may it continue!

Oct 15 2019
Oct 15

Drupal 8 is making things so much easier for content authors. One of the most important reasons is the addition of the Layout Builder module to Drupal 8 core. With its easy drag-and-drop interface, preview, and customization abilities, Layout Builder is fast becoming a favorite page-building and designing tool.

In my previous article, I wrote about how you can get started with installing and using Drupal 8’s Layout Builder. Here, I want to share my knowledge on customizing a layout for unique requirements.

If your Drupal website needs multiple sections with multiple blocks, and you can’t use the default sections provided by Drupal, you can create your own custom layout.

Getting Started with Layout Builder:

We will first create a custom module for our custom layout. We will name the folder custom_layout. Next, we will create an info.yml file and specify the basic keys for it. Our custom layout will have a dependency on the Layout Builder module, so let us specify it here.

name: 'Custom Layout'
type: module
description: 'Provides a way for building layout'
core: 8.x
package: 'Custom'
dependencies:
 - layout_builder:layout_builder

Next we will create layouts.yml file to specify regions for our custom layout.

  • custom_layout: Key for our custom layout.
  • label: Label for our custom layout.
  • category: Category for our custom layout.
  • default_region: The region that acts as the default for the layout.
  • icon_map: Icon which will show up when we choose our layout.



To create the above icon map, we follow these steps:

1. The first row is “Header Left” and “Header Right”. We have specified [header_left, header_left, header_right]: header_left is defined twice, so it takes two of the three columns of the screen width, and header_right takes the remaining one.
2. The second row is “Content” and “Sidebar”. We have specified [content, content, sidebar]; the same logic applies here.
3. The third row is “Footer Left” and “Footer Right”. We have specified [footer_left, footer_right]; since there are only 2 regions, each takes 50%.

  • regions: The regions we need for our layout: header_left, header_right, sidebar, content, footer_left, footer_right.


custom_layout:
  label: 'Custom Layout'
  category: 'Custom Layouts'
  default_region: content
  icon_map:
    - [header_left, header_left, header_right]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [footer_left, footer_right]
  regions:
    header_left:
      label: Header Left
    header_right:
      label: Header Right
    sidebar:
      label: Sidebar
    content:
      label: Content
    footer_left:
      label: Footer Left
    footer_right:
      label: Footer Right

Next, let us create the HTML structure for our layout. We will create a folder named “layouts” within our module. In that folder we will create another folder named “custom_layout”.

And within that folder, we will create a Twig file named “custom-layout.html.twig”.


We have to specify the twig file in layouts.yml 

  • path: Specifies the folder in which your HTML structure (the Twig template) lives.
  • template: Specifies which Twig template under the path to use for this layout.

custom_layout:
  label: 'Custom Layout'
  path: layouts/custom_layout
  category: 'Custom Layouts'
  template: custom-layout
  default_region: content
  icon_map:
    - [header_left, header_left, header_right]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [footer_left, footer_right]
  regions:
    header_left:
      label: Header Left
    header_right:
      label: Header Right
    sidebar:
      label: Sidebar
    content:
      label: Content
    footer_left:
      label: Footer Left
    footer_right:
      label: Footer Right

Next we will write the HTML structure for our regions in the “custom-layout.html.twig” file.
We will set the classes as “layout” and “layout--custom-layout” and wrap the whole content inside them.
We will output the regions which were defined in layouts.yml; we can access those regions like “{{ content.header_left }}”.



{% set classes = [
  'layout',
  'layout--custom-layout',
] %}
{% if content %}
  <div{{ attributes.addClass(classes) }}>
    {% if content.header_left %}
      <div {{ region_attributes.header_left.addClass('layout__region', 'layout__region--header_left') }}>
        {{ content.header_left }}
      </div>
    {% endif %}
    {% if content.header_right %}
      <div {{ region_attributes.header_right.addClass('layout__region', 'layout__region--header_right') }}>
        {{ content.header_right }}
      </div>
    {% endif %}
    {% if content.content %}
      <div {{ region_attributes.content.addClass('layout__region', 'layout__region--content') }}>
        {{ content.content }}
      </div>
    {% endif %}
    {% if content.sidebar %}
      <div {{ region_attributes.sidebar.addClass('layout__region', 'layout__region--sidebar') }}>
        {{ content.sidebar }}
      </div>
    {% endif %}
    {% if content.footer_left %}
      <div {{ region_attributes.footer_left.addClass('layout__region', 'layout__region--footer_left') }}>
        {{ content.footer_left }}
      </div>
    {% endif %}
    {% if content.footer_right %}
      <div {{ region_attributes.footer_right.addClass('layout__region', 'layout__region--footer_right') }}>
        {{ content.footer_right }}
      </div>
    {% endif %}
  </div>
{% endif %}

After the HTML structure is written, we will have to write CSS for each region. We will now create a libraries.yml file in our custom module.



custom_layout:
  version: VERSION
  css:
    theme:
      css/custom_layout.css: {}

We will define that library in layouts.yml




custom_layout:
  label: 'Custom Layout'
  path: layouts/custom_layout
  category: 'Custom Layouts'
  template: custom-layout
  library: custom_layout/custom_layout
  default_region: content
  icon_map:
    - [header_left, header_left, header_right]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [content, content, sidebar]
    - [footer_left, footer_right]
  regions:
    header_left:
      label: Header Left
    header_right:
      label: Header Right
    sidebar:
      label: Sidebar
    content:
      label: Content
    footer_left:
      label: Footer Left
    footer_right:
      label: Footer Right

Now let’s style our region blocks. We will specify the structure for each region as below:



.layout--custom-layout {
  display: -webkit-box;
  display: -ms-flexbox;
  display: flex;
  -ms-flex-wrap: wrap;
  flex-wrap: wrap;
}

@media screen and (min-width: 40em) {
  .layout--custom-layout .layout__region--header_left {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 70%;
    flex: 0 1 70%;
  }

  .layout--custom-layout .layout__region--header_right {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 30%;
    flex: 0 1 30%;
  }

  .layout--custom-layout .layout__region--content {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 70%;
    flex: 0 1 70%;
  }

  .layout--custom-layout .layout__region--sidebar {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 30%;
    flex: 0 1 30%;
  }

  .layout--custom-layout .layout__region--footer_left {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 50%;
    flex: 0 1 50%;
  }

  .layout--custom-layout .layout__region--footer_right {
    -webkit-box-flex: 0;
    -ms-flex: 0 1 50%;
    flex: 0 1 50%;
  }
}

Next, let us enable our custom module


Let us go to Structure -> Content types and click on “Manage display” for any content type. For now we will use the ‘Article’ content type.


After that, we can choose our custom layout for the display.

Oct 15 2019
Oct 15

In our recent rebrand at Third and Grove, we took on a headless Drupal build with a Gatsby front end. With any project where you are pushing the limits of what technologies are capable of, there were some growing pains.

These growing pains resulted from a few other places too (developers less familiar with React and Gatsby, using new and actively changing tools). We ran into some bundle size issues that we thought were really strange, which turned out to be due to the way we were querying images. We also ran into load time issues with some SVGs that were being handled with a library called svgr. And we had a few fonts to load. Well, 18 (yeah, that’s right). As a last resort, implementing our own lazy loading helped bring us to a perfect Lighthouse score!

Oct 14 2019
Oct 14

There’s no question that literacy in the 21st Century is a multi-faceted concept that extends far beyond books on the shelves.

The American Library Association not only gets it, it’s embracing the evolving role of libraries, by driving a wide range of programs designed to spark excitement about STEM (Science, Technology, Engineering, and Math) careers among youth, and more specifically Computational Thinking. Recently, the ALA Ready to Code site, which was largely funded by a grant from Google, has raised the bar even further.

Libraries Ready to Code grew out of a conviction that helping youth to become comfortable with technology is key to the mission of libraries in the current climate. 

Resources on the Drupal 8 site are aligned with the ALA’s emphasis on inclusivity. Engaging exercises are targeted toward ages that range from preschool to early teen years, and the intention is to reach children as young as possible -- before stereotypes concerning who could or should pursue technology careers begin to take hold. 

The ALA has been developing resources designed to help librarians prepare youth to succeed in the high-tech future for many years. Ready to Code has leveraged an amazing collection of educational assets to create an inviting and visually appealing learning path.

Discovery Journey

While Computational Thinking is required for 21st Century literacy, ALA leadership is well aware of the fact that coding expertise has not traditionally been something that librarians learned while studying for a degree in library science. At the same time, there was a recognition that the interests and inclinations of librarians tend to gravitate more toward literature and content than technology.

As such, the objective of the site was not for librarians to actually teach coding to youth or to learn how to code themselves. Instead, the site was designed to help librarians who have varying comfort levels relative to tech to:

  • Understand what Computational Thinking is all about, 
  • Get on-board with the idea of libraries taking a lead in teaching youth about it, and
  • Facilitate programs that broaden perspectives. 

Multifaceted Mission

Having developed Drupal websites for the ALA prior to this project, Promet Source had an established relationship and was thrilled to step up to the role of designing and developing Ready to Code. Working with both Google and the ALA was a huge inspiration for us, as their intentional and well-organized programs for youth were very thought provoking. The content organization on the site was a creative challenge, as we serve up relevant content to users based on their experience level and interest in certain topics. In addition to the UX and UI design of the site, Promet designers also developed branding for the site, providing a new logo and style guide that became the brand for the newly established ALA Ready to Code initiative.

Once the site went live, the response exceeded expectations on every front. The primary objective was for librarians to embrace the site and actively introduce youth to all that Ready to Code has to offer. That objective has been boosted by a steady stream of awards. Key among them is the American Association of School Librarians' 2019 Best Websites for Teaching & Learning. Since this award is very much on the radar of librarians, it has added validity to the site and accelerated its adoption.

The site has also received two additional prestigious awards.

And then there’s the input from ALA clients, such as this review on the Clutch website:

"They (Promet Source) masterfully took our ideas and translated them into specific pieces of the website."
      -- Marijke Visser, Senior Policy Advocate, American Library Association

Among those of us at Promet Source who were engaged with ALA leadership from the very outset and took a personal stake in the success of Ready to Code, this ongoing validation that the site is achieving what it set out to accomplish has been the gift that keeps on giving. 

Interested in collaborating on big ideas that stand to make a big difference in igniting digital possibilities? Contact us today.

Oct 14 2019
Oct 14

Drupal 8 brought a lot of new features along with it, making it easier to create rich and beautiful pages. Among the new features included in Drupal 8.7, we saw the stable built-in drag-and-drop Layout Builder, an updated Media Library interface, and more. These changes affect site builders, administrators, editors, module developers, themers, and distribution developers. Drupal 8.8 will be released in December 2019 and will be the last minor release to introduce new features before Drupal 9.
Let’s look at what Drupal 8.8 has in store for us:

| Drupal's WYSIWYG editor will allow media embedding in Drupal 8.8!

The Drupal community has been waiting a decade for this most-wanted end-user feature: better media handling! The addition of WYSIWYG integration completes the final milestone. Read more about it here - https://wimleers.com/blog/media-embedding-drupal-8.8. Drupal 8.8 will ship with complete media management, enabling site builders and content authors to easily embed media in Drupal.
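Under the hood, media embeds are stored in the field markup as a custom HTML tag that gets rendered server-side on output. A rough illustration of what the editor produces (the UUID below is a placeholder, and the exact set of data- attributes depends on which filters are enabled):

```html
<!-- Embedded media item; the entity UUID here is a placeholder. -->
<drupal-media
  data-entity-type="media"
  data-entity-uuid="a1b2c3d4-0000-0000-0000-000000000000">
</drupal-media>
```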

Check out the media embedding here - https://youtu.be/sTc2JJzs9iU

| New and modern administration theme Claro

Claro is a concise, clean, responsive administration theme with an improved look and enhanced web accessibility. Built on top of the Seven theme, it is currently a contributed project, and it is likely to be added to Drupal 8.8.0 core as an experimental theme.

| Composer in Drupal core initiative

To learn more, visit - https://www.drupal.org/docs/develop/using-composer/using-drupals-composer-scaffold

| JSON:API module in Drupal 8.8 is expected to be significantly faster

This was made possible by the resolution of several performance issues, most notably the introduction of new cache layers that minimize cost by reusing computed entities.

| Content Moderation and Workspaces core modules can be used on the same site with Drupal 8.8.0!

Starting with Drupal 8.8.0, the Content Moderation and Workspaces modules are no longer incompatible, so they can be installed and used together.

When both modules are installed, the Latest revision local task provided by Content Moderation is no longer available, because Workspaces always shows the latest workspace-specific revision on the canonical entity view page (e.g. /node/1). Additionally, when a moderation workflow is enabled for an entity type/bundle and there are entities in draft (non-default/unpublished) moderation states in a workspace, that workspace cannot be published to Live until all the draft entities reach a publishable moderation state.

| jQuery UI is being phased out from the Drupal core

jQuery UI allowed module developers to add rich effects to their code. Added to Drupal core in 2009, jQuery UI has been unmaintained since 2017 and is listed as an Emeritus project (one whose maintainers have reached, or are nearing, end of life for the project). jQuery UI is being deprecated in Drupal core and will be removed in Drupal 9. With jQuery UI's end of life, it will not work with future jQuery versions. Drupal core is in the process of switching to pure JavaScript solutions. Modules and themes that still depend on jQuery UI will need to declare it as a dependency and manage those libraries themselves.

| Configuration Management improvements in Drupal 8.8

  • Sync directory is defined in $settings['config_sync_directory'] and not $config_directories.

The ability to support multiple configuration directories via $config_directories is now deprecated. If you have custom or contributed code that relies on this ability, you need to move your setting either to $settings or to another storage. To learn more, visit - https://www.drupal.org/node/3018145
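In practice the change is a one-line update in settings.php. A before/after sketch (the relative path is just an example):

```php
<?php
// Before Drupal 8.8 (deprecated, removed in Drupal 9):
$config_directories['sync'] = '../config/sync';

// Drupal 8.8 and later:
$settings['config_sync_directory'] = '../config/sync';
```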

New configuration transformation events allow modules to interact with the configuration deployment workflow. This was previously only possible with the contrib Config Filter module.

A few modules help in developing a Drupal site but are not intended to be deployed to production. Until Drupal 8.8.0, developers had to rely on contrib solutions such as Config Split to separate the development configuration. But sometimes it is not necessary to share the development configuration; instead, it is more important to guarantee that development modules cannot be included in the configuration export. This is precisely what the lesser-known Config Exclude contrib module did, and its functionality is now available for everyone.
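With Drupal 8.8, excluding development modules from configuration export becomes a settings.php entry. A sketch, where the module names are examples only:

```php
<?php
// Modules listed here are treated as uninstalled during configuration
// export/import; 'devel' and 'stage_file_proxy' are example names.
$settings['config_exclude_modules'] = ['devel', 'stage_file_proxy'];
```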

| Path aliases have been converted to revisionable entities

With Drupal 8.8.0, custom URL aliases are provided by a new revisionable path_alias content entity type. The path.alias_storage service has been kept in place for backward compatibility, and its hooks have been deprecated. Check out this link for the code changes recommended to fully utilize the new system and prepare your code for Drupal 9 - https://www.drupal.org/node/3013865
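For developers, the deprecated path.alias_storage calls map onto ordinary entity API operations against the new entity type. A minimal sketch (the path and alias values are invented for illustration):

```php
<?php
// Creating a URL alias with the path_alias entity type (Drupal 8.8+).
$alias = \Drupal::entityTypeManager()
  ->getStorage('path_alias')
  ->create([
    'path' => '/node/1',
    'alias' => '/about-us',
    'langcode' => 'en',
  ]);
$alias->save();
```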


With all of these features, the upcoming and previous versions of Drupal 8 are very compelling for content editors. Kudos to all the Drupalers who have worked on Drupal 8.8 and are charting the path to Drupal 9.

Reference - Drupal.org

Oct 13 2019
Oct 13

The other day I was reviewing my read later items and stumbled upon the New command line tool to install & run Drupal change record I had completely forgotten about. This was timely because I was extensively testing the excellent Acquia Developer Studio for work and was trying to think about how it could help me review core changes quickly or contribute more easily. Turns out, you can’t ask for a tool to do everything and sometimes it’s important to get back to finding the right tool for the job. And in this instance, quick-start has no equivalent that I know of in terms of ease of use and required dependencies.

After playing with it a bit, I realized I could probably create a wrapper to speed up operations even more. The workflow I had in mind was this:

  • Work exclusively from the local Drupal Git clone
  • Don’t install any dependency like Drush or Drupal Console
  • Install Drupal with a one-liner
  • Optionally select a different install profile
  • Optionally install a patch
  • Clean up everything with a one-liner

And then I came up with this repo (make sure to review the README file!). Disclaimer: I’ve only tested it on Linux. It works very simply with two commands: quick-start and quick-clean.

When I type quick-start, I can either pass a Drupal profile or use the default (standard). The script takes care of pulling Composer dependencies and installing Drupal with default parameters so I can concentrate on the task at hand, not the install process itself. At some point I even experimented with assigning a dynamic port (shuf -i8000-8999 -n1), but that was so over-engineered I gave up. This is how it looks now:

$ quick-start umami
Drupal codebase detected. Proceeding...
> Drupal\Core\Composer\Composer::ensureComposerVersion
Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
Package operations: 106 installs, 0 updates, 0 removals
- Installing composer/installers (v1.7.0): Loading from cache
> Drupal\Core\Composer\Composer::vendorTestCodeCleanup
Generating autoload files
> Drupal\Core\Composer\Composer::preAutoloadDump
> Drupal\Core\Composer\Composer::ensureHtaccess
Skipped installation of bin bin/composer for package composer/composer: file not found in package
18/18 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓]
Congratulations, you installed Drupal!
Username: admin
Password: UvtRWr-Z82WKfV2Q
Drupal development server started: <http://localhost:8888>
This server is not meant for production use.
One time login url: <http://localhost:8888/en/user/reset/1/1570899448/vVyaEHQkIRKGLLtuRttsdXtCBfNP1DBMWJwQIH4_XKc/login>
Press Ctrl-C to quit the Drupal development server.

If I want to, I can even pass a patch file:

$ quick-start minimal https://www.drupal.org/files/issues/2019-09-10/2966607-127.patch
Drupal codebase detected. Proceeding...
HEAD is now at 10c41e77a5 Issue #3079810 by jhodgdon, andypost, mikelutz: core/help_topics directory does not work
Already up to date.
--2019-10-15 16:27:51--  https://www.drupal.org/files/issues/2019-09-10/2966607-127.patch
Resolving www.drupal.org (www.drupal.org)...,,, ...
Connecting to www.drupal.org (www.drupal.org)||:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 33247 (32K) [text/plain]
Saving to: ‘2966607-127.patch’
2966607-127.patch             100%[===============================================>]  32,47K  --.-KB/s    in 0,02s   
2019-10-15 16:27:51 (1,27 MB/s) - ‘2966607-127.patch’ saved [33247/33247]
Checking patch core/lib/Drupal/Core/Cache/CacheTagsChecksumInterface.php...
Applied patch core/tests/Drupal/KernelTests/Core/Cache/EndOfTransactionQueriesTest.php cleanly.
> Drupal\Core\Composer\Composer::ensureComposerVersion
Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
Package operations: 107 installs, 0 updates, 0 removals
- Installing composer/installers (v1.7.0): Loading from cache
18/18 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓]
Congratulations, you installed Drupal!
Username: admin
Password: yIk4BtdbLEtyZ80X
Drupal development server started: <http://localhost:8888>
This server is not meant for production use.
One time login url: <http://localhost:8888/user/reset/1/1571149676/oWK8NQiIUWvOzVqieC-jEfjfpjy0hkINpX4rDAKNOL4/login>
Press Ctrl-C to quit the Drupal development server.

The one annoyance I have is that this whole solution doesn't really scale. I understand quick-start is for dev only and I should keep my expectations low, but it'll fail randomly when using devel_generate, clicking through too many pages in a short period of time, or installing too many modules at once. When this happens to you, just shut down the server (Ctrl-C) and run quick-start again. This is a severe limitation I've reported here.

Anyway, once I’m done and want to clean up my repo, there’s the quick-clean command for that. It’ll wipe everything within your Git clone (seriously, be careful) so you come back to a clean Git state, with the latest commits from upstream. It looks like this:

$ quick-clean
Drupal codebase detected. Proceeding...
[sudo] password for anavarre:
HEAD is now at 03bdf28929 Issue #2860644 by daffie, shashikant_chauhan, dww: Add support of NOT REGEXP operator to PostgreSQL + fix testRegexCondition
Already up to date.

To my knowledge, there's no easier or quicker way to install, test or contribute to Drupal with the bare minimum requirements to run a PHP app. Here's a demo.


Oct 11 2019
Oct 11

Read our Roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community.

Note from the author: "I'm back in the hot seat! As many of you know, I took on the role of interim executive director in September 2018 while the Drupal Association underwent an executive search. This summer we found a fantastic new leader in Heather Rocker, and now that she's had a few months to settle in, I'm able to return to my regular duties, like bringing you these updates. Thanks for your patience!"

- Tim Lehnen(hestenet)

September was a flurry of activity here at the Drupal Association at large. Coming off a season of travel to a number of Drupal events, we headed straight into our semi-annual staff off-site here in Portland, and followed that up by attending Google's second-ever CMS Leadership summit.

Despite the whirlwind of events taking place in September, we've also landed some major milestones on our roadmap, and are hard at work getting some exciting things together to talk about with you all at DrupalCon Amsterdam at the end of October. As an added bonus, this month's report includes a short retrospective about the impact of the GitLab migration on our maintenance work. 

Project News

Composer Initiative work committed for release in Drupal 8.8.0


A major community initiative for Drupal 8.8.0 has been the push to make Drupal's internal scaffolding and filetree consistent, whether you start using Drupal by using the .zip download, or by using Composer from the get-go. Starting with Drupal 8.8.0 - no matter how you start your journey with Drupal, you'll be ready to use Composer's advanced dependency management when you need it.

Drupal Association engineering team member Ryan Aslett(mixologic) has been the initiative lead for this effort for more than a year. We're thrilled that his work and the work of many volunteers has been committed for release in Drupal 8.8.0!

We want to thank the following contributors for participating in this initiative in collaboration with us: 

The work is not over! There are still a number of clean ups and refinements being worked on in the Drupal core queue, and the Drupal Association team is working hard in October to ensure that Drupal.org itself will be ready to deliver these Composer-ready packages of Drupal 8.8.0 on release. 

Reminder: Drupal 8.8.0 is coming soon!

Speaking of Drupal 8.8.0 - it enters the alpha phase during the week of October 14th, in preparation for release in December of this year.

Drupal 8.8.0 is the last minor release of Drupal before the simultaneous release of Drupal 8.9.0 and 9.0.0 next year. You can find more information about the Drupal release cycle here.

If you want to help ensure a smooth release, we invite you to join the Drupal Minor Release beta testing program.

Drupal.org Update

Preparing our infrastructure for Automatic Updates

In September we spent a good amount of time outlining the architectural requirements that will need to be met in order to support delivering the update packages that are part of the Automatic Updates initiative.

We are only in the first phase of this initiative, which focuses on: 1) Informing site owners of upcoming critical releases, 2) Providing readiness checks that site owners can use to validate they are ready to apply an update, and 3) offering in-place automatic updates for a small subset of use-cases (critical security releases).

As this initiative progresses, and begins to cover more and more use cases, it should greatly reduce TCO for site owners, and friction for new adopters. However, to make that forward progress we are seeking sponsors for the second phase of work.

Readying our secure signing infrastructure

With the help of a number of community contributors (see below), a new architecture for a highly secure signing infrastructure has been laid out. As we roll into Q4 we'll get ready to stand this new infrastructure up and begin securing the first automatic updates packages.

Going into early October, a number of contributors came together at BadCamp to help advance this effort further. Without the collaboration between community members and Drupal Association staff, these initiatives would not be possible.

We'd like to thank the following contributors to the Automatic Updates/Secure Signing Infrastructure initiative: 

Supporting Drupal 9 readiness testing

In conjunction with the Drupal core team, the DA engineering team has been supporting the work to ensure that contributed projects are ready for the release of Drupal 9.

Early testing has shown that over 54% of projects compatible with Drupal 8 are *already* Drupal 9 ready, and we'll be continuing to work with the core team to get out the word about how to update the modules that are not yet compatible.

Infrastructure Update

A brief retrospective on the GitLab migration

Drupal.org's partnership with GitLab to provide the tooling for Drupal and the ~40,000 contributed projects hosted on Drupal.org has been a significant step forward for our community. We're no longer relying on our own, home-brew git infrastructure for the project, and we're gradually rolling out more powerful collaboration tools to move the project forward. 

But what has that meant in terms of maintenance work for the Drupal Association engineering team?

There was some hope as we were evaluating tooling providers that making a switch would almost entirely eliminate the maintenance and support burden. While that was a hopeful outlook, the reality is that maintaining 'off-the-shelf' software can be at least as much work as maintaining mature existing tools.

GitLab in particular is still iterating at a tremendously rapid pace, releasing updates and new features every month. However, that speed of development has also meant frequent maintenance and security releases, meaning the DA team has had to update our GitLab installation almost once a week in some months.

Does that mean we're unhappy with the change? Absolutely not! We're still thrilled to be working with the GitLab team, and are excited about the new capabilities this partnership unlocks for the Drupal community (with more coming soon!).

However, it is a good lesson to anyone running a service for a large community that there's no free lunch - and a great reminder of why the support of Drupal Association members and supporting partners is so essential to our work.


As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular, we want to thank:

  • Tag1 - Renewing Signature Supporting Partner
  • Gitlab - *NEW* Premium Technology Supporter
  • Four Kitchens - Renewing Premium Supporting Partner
  • Phase2 - Renewing Premium Supporting Partner
  • WebEnertia - Renewing Premium Supporting Partner
  • Thunder - Renewing Premium Supporting Partner
  • Palantir - Renewing Premium Supporting Partner
  • Specbee - Renewing Premium Supporting Partner 
  • Pantheon - Renewing Premium Hosting Supporter
  • Cyber-Duck - *NEW* Classic Supporting Partner
  • Websolutions Agency - *NEW* Classic Supporting Partner
  • Unic - *NEW* Classic Supporting Partner
  • Kalamuna - Renewing Classic Supporting Partner 
  • ThinkShout - Renewing Classic Supporting Partner 
  • Amazee - Renewing Classic Supporting Partner 
  • Access - Renewing Classic Supporting Partner 
  • Studio Present - Renewing Classic Supporting Partner 
  • undpaul - Renewing Classic Supporting Partner 
  • Mediacurrent - Renewing Classic Supporting Partner 
  • Appnovation - Renewing Classic Supporting Partner 
  • Position2 - Renewing Classic Supporting Partner 
  • Kanopi Studios - Renewing Classic Supporting Partner 
  • Deeson - Renewing Classic Supporting Partner 
  • GeekHive - Renewing Classic Supporting Partner 
  • OpenSense Labs - Renewing Classic Supporting Partner 
  • Synetic - Renewing Classic Supporting Partner 
  • Axelerant - Renewing Classic Supporting Partner 
  • Centretek - Renewing Classic Supporting Partner 
  • PreviousNext - Renewing Classic Supporting Partner 
  • UniMity Solutions - Renewing Classic Supporting Partner 
  • Code Koalas - Renewing Classic Supporting Partner 
  • Vardot - Renewing Classic Supporting Partner 
  • Berger Schmidt - Renewing Classic Supporting Partner 
  • Authorize.Net - Renewing Classic Technology Supporter
  • JetBrains - Renewing Classic Technology Supporter
  • GlowHost - Renewing Classic Hosting Supporter
  • Sevaa - Renewing Classic Hosting Supporter
  • Green Geeks - Renewing Classic Hosting Supporter

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Oct 11 2019
Oct 11

We recently completed a special data integration project for the University of Minnesota, Office of the Executive Vice President and Provost, Faculty and Academic Affairs. Faculty Affairs, as they are referred to, uses a product called Activity Insight from Digital Measures (referred to internally as “Works”) to capture and organize faculty information. Works acts as both a departmental LinkedIn and their internal performance review tool. They needed help getting faculty data out of Works in order to display it in web profiles on departmental websites. Many universities use Digital Measures to manage their faculty information, so when we took on the project, we wanted to be able to open source the solution we came up with and make the code available to those universities who wanted to do similar projects. Luckily, Faculty Affairs was a willing partner in working towards that goal!

The Digital Measures Migration

Once the project was kicked off, our DevOps Engineer Tess Flynn took a peek under the covers. “Digital Measures posed a significant challenge to open sourcing anything,” said Tess, “because it doesn’t really have a set data schema. It’s a REST-based application for storing structured data. It’s not as specific as a database; it’s somewhere between a document store and a database. All Digital Measures lets you do is say: this is a data object that we want to store, and these are the fields it can have. The University creates the structure, and the data can then be accessed through an API.”

For this project, we created one module specific to Digital Measures (Digitalmeasures Migrate) to access the Digital Measures API from within Drupal. This, in turn, allows data extracted from the API to be leveraged through Drupal 8 migrations. Due to the volume of data, we write it to a “staging” database table. This way, we have more flexibility in processing it, without bringing down the Digital Measures API, or encountering rate limitations. Once we get the XML out of the API and store it in the database, then we can do the transform part. That’s where the bulk of the work is done.

The other seven modules (process plugins) solve specific migration problems in the transform step of the Extract-Transform-Load (ETL) process. We wrote a series of Drupal migrations for Faculty Affairs which relied on these plugins.

The Framework

Here are the modules we created and contributed to Drupal.org:

  • Digitalmeasures Migrate
    Provides a method to access Digital Measures API through Drupal.
  • Migrate Process XML
    One of the most-used modules when writing migrations for Faculty Affairs. It reads XML and allows you to extract particular key sections using XPath.
  • Migrate Process S3
    This is the second-most useful module. It allows you to download objects from an S3 bucket as a file to your Drupal site. For Faculty Affairs, this was used to download a professor’s profile photo, resume/CV, and other attached files. Once downloaded, this is then attached to the professor’s profile so that direct linking to S3 is not required.
  • Migrate Process URL
    Allows you to manipulate URL values that are provided within the data.
  • Migrate Process Regex
    While a small module, it’s useful in a surprising number of cases! This module provides a way to use Regular Expressions in a Drupal migration. This allows for matching and text replacements where XPath alone wouldn’t be enough. “This is a tiny thing, but it’s really handy in migrations,” said Tess. “For example, when you have two quotes next to each other that should have been one quote.”
  • Migrate Process Vardump
    Often used for debugging, this module takes any data given to it and dumps it to the terminal output and then passes it on. “It doesn’t sound very useful, unless you’re writing migrations, but then it’s the best thing EVER!” joked Tess. Every time Tess has written migrations, she said she had to write a custom version of this. So having this as a standard module will help many users. This module currently has the most users on Drupal.org out of all the modules in the framework.
  • Migrate Process Skip
    When processing lists of data, the migration system has very specific ideas of what's considered "empty": is it zero, false, an empty string, or NULL? Drupal has a "skip_on_empty" migration process plugin, but it may not give the desired result if you don't know what you're doing. This module provides a few different mechanisms to define what counts as "empty" and should be skipped.
  • Migrate Process Trim
    Often, we need to remove leading or trailing characters—such as spaces—after extraction. This module provides a quick and simple means to achieve this in a Drupal migration.
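To give a sense of how these plugins chain together, here is a hedged sketch of a migration process pipeline that extracts a value with XPath and then trims it. The migration field names, source field, XPath expression, and plugin IDs are all illustrative; check each module's README for the actual plugin IDs it registers:

```yaml
# Fragment of a hypothetical migration definition; names are invented.
process:
  field_last_name:
    - plugin: migrate_process_xml   # illustrative ID from Migrate Process XML
      source: faculty_record_xml
      xpath: '//Record/LNAME'
    - plugin: trim                  # illustrative ID from Migrate Process Trim
      side: both
```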

We Open Sourced the Tools, Not the Solution

“Originally I really wanted to open source a solution, but it didn’t end up working that way,” remarked Tess. While multiple colleges and departments within the University of Minnesota (and outside of it) are all using Digital Measures, they all have their own customized implementation of it. Because of this, there will always be custom work required to display the Digital Measures data in each department’s web profiles. So the best you can do is to release tools to get to the solution. The framework we created will give each department’s IT team a head start on the process. “The tools we created are like socket wrenches to work on a car,” said Tess. “Every car is different, but the socket wrenches make the work easier.”

If we’d open sourced the Faculty Affairs solution, it would have been specific to them, and not transferable to another school. But what we can open-source are the tools to create a solution.

“There are mechanisms to retrieve types of data, but the types of data are up to the user of the API to decide,” Tess continued. “Faculty Affairs created the schema and data formats. All we did was use their existing forms, tweak them here and there as we needed to, and then write tools to create the necessary things to import data from Digital Measures into Drupal. Digital Measures is so customizable, you can’t do anything but release tools.”

This framework has broader applications than just Digital Measures. “Each module solves a specific issue that came up with Digital Measures,” said Tess, “but the specific problems they solve are common to migration processes in general. With the exception of Digitalmeasures Migrate, these modules are fairly generic and can also be used for a variety of other migrations.”

If you want to use the framework yourself on a Digital Measures migration, the place to start is with the Digitalmeasures Migrate module. “That gets you the data,” said Tess, “but it doesn’t necessarily put it into a form that you can immediately use in Drupal, because unfortunately there’s no way of telling how the data is structured, so it’s up to you to make that part.” Tess did include a rough process for using the framework in the README for the Digitalmeasures Migrate module.

Got a Data Migration Problem? We Can Solve It!

Are you using Digital Measures and wishing you could pull the data into your Drupal deployment? Do you have any other Drupal problem we can help you solve? Drop us a line!  

Oct 10 2019
Oct 10

Guest blog by Diana Connolly, Production Manager, Groundswell Marketing


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web