Sep 10 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

I'm excited to share that when Drupal 8.8 drops in December, Drupal's WYSIWYG editor will allow media embedding.

You may wonder: Why is that worth announcing on your blog? It's just one new button in my WYSIWYG editor.

It's a big deal because Drupal's media management has been going through a decade-long transformation. The addition of WYSIWYG integration completes the final milestone. You can read more about it on Wim's blog post.

Drupal 8.8 should ship with complete media management, which is fantastic news for site builders and content authors who have long wanted a simpler way to embed media in Drupal.

Congratulations to the Media Initiative team for this significant achievement!

Sep 10 2019

Open Waters

In this episode, we talk with Mediacurrent's Mario Hernandez about why training is so important for web teams to grow and stay competitive. And yes, we are once again interviewing one of the hosts. 

Audio Download Link

About Mario 

In addition to his position as Head of Learning, Mario is a Senior Front End Developer with over 15 years of Drupal experience. He and I actually started on the same day, 5 years ago. Mario is a regular speaker and trainer at tech conferences including Drupal Camps and DrupalCons. He is a co-host of the Open Waters podcast and an active participant in the Drupal core project and other open source projects. Prior to Mediacurrent, Mario gained over 10 years of experience in the Federal Government.

Project Pick

Apollo GraphQL

  • Server
  • Client
  • Platform


The best way to learn is to teach. 

  1. How did you get started with Drupal and front end development in general?
  2. How did you get started doing training?
  3. What is your favorite part of training people?
  4. Is Mediacurrent’s training limited to only events and/or only Drupal?
  5. How do you think training is most effective when working with a client’s internal development team?
  6. In addition to FE training, does Mediacurrent offer training in other areas?  Yes! We offer training in Accessibility, SEO, Back End, Digital Strategy, GatsbyJS and more
  7. How can organizations interested in our training offerings get more information?
Sep 10 2019

This is the second of three blog posts about creating an app with React Native. To get started with React Native, read the Basics of React Native. Once you are familiar with the system and have an app, it is time to fill it out with content. If you don’t have content on your Drupal website, read Understanding Drupal 8’s Migrate API.

Exposing Drupal content

Some helpful modules for exposing Drupal content are Views, RESTful Web Services, and Serialization. The concept of getting Drupal content to an endpoint is simple:

  1. Build a view containing the data you want to expose.
  2. Add a “REST export” display to the view. During this step, select the appropriate serialization type.
  3. This will automatically create a REST endpoint at the path you set.

The dataflow should look something like this: Drupal Content -> View -> Serializer -> REST endpoint.
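For illustration, a hypothetical endpoint exposing people data with the json serialization format might return something like the following (the path and field names are assumptions; they depend entirely on how the view is configured):

```javascript
// Hypothetical output of a view's "REST export" display, e.g. at
// /api/people?_format=json. One object per view row, one property per
// field added to the view -- these names are illustrative only.
const sampleResponse = [
  { name: "Ada Lovelace", position: "Developer" },
  { name: "Grace Hopper", position: "Engineer" }
];

// The client only needs to know the endpoint URL and the field names.
console.log(sampleResponse.length); // 2
```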

Using fetch to asynchronously retrieve data

React Native’s compiler is Babel, which means ES6 code can be used anywhere in the project. This is useful for retrieving data, because ES6 enables the async and await keywords. When used in conjunction with something like fetch, you can create a smooth and elegant solution for pulling data from a server, such as content from a Drupal website.

The code for pulling data from a Drupal website's REST endpoint is the same as for any other REST endpoint. It should look something like this:

async getSomeData() {
  let url = "";
  let response = await fetch(url);
  let data = await response.text();
  return data;
}

The advantage to making a call like this asynchronously is that it allows other threads to continue running while the fetch is waiting for the server call to return with all of the data it ordered. This improves the user experience because it allows them to continue using other functions while the data loads.
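Building on the idea above, a slightly fuller sketch might parse the response as JSON and guard against failures. The URL, function name, and error-handling strategy here are illustrative assumptions, not part of the original example:

```javascript
// Hypothetical async consumer of a Drupal REST export endpoint.
// The URL and fallback behavior are assumptions for illustration.
async function loadPeople() {
  const url = "https://example.com/api/people?_format=json";
  try {
    const response = await fetch(url);
    if (!response.ok) {
      throw new Error(`HTTP ${response.status}`);
    }
    return await response.json(); // parse the serialized view rows
  } catch (err) {
    console.warn("Failed to load people:", err.message);
    return []; // fall back to an empty list so the UI can still render
  }
}
```

Returning an empty list on failure is one possible design choice; it keeps the component rendering even when the server is unreachable.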

Building a FlatList from a data set

After pulling in data from the endpoint, add a component to display the data. <FlatList> is an excellent component built into React Native. These components are useful because they can handle very large amounts of data without impacting performance, since they only render the part of the list that is currently on screen.

A <FlatList> component takes two props for displaying data. The first, data, is the set of data it will display; you may need to massage the data to make it easier to use inside a <FlatList>. The second, renderItem, describes how the data should be displayed: a JSX object that tells the <FlatList> component how to represent each list item and which fields to pull from the data. You can use any component inside renderItem.
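As a sketch of that massaging step (the field names and row shape are hypothetical), you might map the raw rows into objects that each carry a unique key, which <FlatList> can use to track items efficiently:

```javascript
// Hypothetical rows returned by a Drupal REST export.
const rawRows = [
  { name: "Ada Lovelace", position: "Developer" },
  { name: "Grace Hopper", position: "Engineer" }
];

// Massage the data: give each row a unique key for <FlatList>.
const peopleData = rawRows.map((row, index) => ({
  key: String(index),
  name: row.name,
  position: row.position
}));

// peopleData can now be passed as the data prop:
// <FlatList data={peopleData} renderItem={...} />
```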

The ListItem component provided by React Native Elements has lots of styling features, like color gradients and automatic chevron placement.

Here is an example <FlatList>:

<FlatList
  style={{backgroundColor: '#ededed'}}
  data={this.state.peopleData}
  renderItem={({item}) => (
    <View>
      <ListItem
        title={item.name}
        titleStyle={{ color: '#00AEED', fontWeight: 'bold' }}
        subtitle={item.position}
      />
    </View>
  )}
/>

With the skills to expose, retrieve, and display your data, you can integrate a Drupal website with your new React Native app.

Sep 10 2019

Dropsolid is a Diamond sponsor at DrupalCon Amsterdam, 28-31 October. In this post, I’d like to share a bit about our vision for delivering the best customer experiences, our open integrated Digital Experience Platform, our partner program, and a special opportunity for DrupalCon attendees.

Are you working in a digital agency and coming to DrupalCon? We’d love to meet you at DrupalCon and talk about how our tools, infrastructure, and expertise could help you as a digital agency partner. We’ll be at Stand 13, by the catering area, so if you fancy a coffee, stop by for a chat. We’re running a very special giveaway, too. Complete a quick survey and we’ll donate 15 minutes of core contribution time as a thank you.

Sign up for Dropsolid News

A vision for Drupal to improve customer experience

In my previous post, I wrote about why we’re sponsoring DrupalCon. Simply put, without it, we wouldn’t exist. I also wrote about what we’re working on for the future, inspired by the market changes around digital experience management. I think we have something unique to offer our partner digital agencies right now.

I’ve gone from being a developer to a CEO, and I know the attraction of solving problems by building your own solutions. Yet, like many agencies, we discovered a few years ago that doing everything in-house was hindering us from growth. To solve this, we ended up pivoting our entire company, defining and offering solutions in a completely different way.

We found that many of our clients’ and partners’ teams were working in silos, with different focuses—one on marketing, another on hosting, and so on. We believe we have to take an integrated approach to solving today’s problems, and a big part of that is offering a stellar customer experience. We discovered that investing in customer experience means your customers stick around longer. This translates to increased customer lifetime value, lower customer acquisition costs, and lower running costs. But what does it take to get there?

We have to recognize how problems are connected, so we can build connected solutions. You can see this in problems like search engine optimization. SEO is as much about great user experience as it is about your content. Today, for example, the speed and performance of your website affects your search engine rankings. Incidentally, my colleagues Wouter De Bruycker (SEO Specialist) and Brent Gees (Drupal Architect) will be talking about avoiding Drupal SEO pitfalls at DrupalCon Amsterdam.

Similarly, it seemed that various solutions out there were narrowly focused on a single area. We saw the potential and power of integrating these as parts of a unified Digital Experience Platform. Stand-alone, any one of these tools offers benefits, but integrated together, the whole is greater than the sum of its parts.

We are taking this approach with our clients already. With each successful engagement, we add what we learn to our toolbox of integrated solutions. We are building these solutions out for customers with consultation and training to make the most out of their investments. These include our hosting platform; our local dev tool, Launchpad; our Drupal install profile, Dropsolid Rocketship; Dropsolid Personalization; and Dropsolid Search optimized with Machine Learning. 

But our vision is bigger. We are working towards an open, integrated Digital Experience Platform that our partner agencies can leverage for greater creative freedom and increased capacity, without getting in their own way.

Stop by at DrupalCon or get in touch and see what we’re building for you. 

Read more: Open Digital Experience Platform

A Partner for European Digital Agencies

Dropsolid is the only European company sponsoring DrupalCon Amsterdam at the top-tier, Diamond sponsor level. With all due respect for our American colleagues, we believe a robust European company should exist to support all of us here. We want to help other European companies build successful digital experiences with Drupal at the core for organizations, governments, and others.

Like many Drupal agencies, we’ve gotten to where we are now by providing services to our local market. Based in Belgium, we design, strategize, build, maintain, and run websites and applications for clients, mainly in the Benelux region.


Now, we are looking for partners outside of Belgium to benefit from using our Drupal Open Digital Experience Platform for themselves and their customers. Dropsolid has the tools, infrastructure, and expertise to support creating sustainable digital experiences for anyone. Furthermore, we have the advantage of knowing and understanding the differing needs of our colleagues and clients across Europe.

Come join us!

We are looking for more partners to join us on this journey. By leaning on our tools and expertise, those who have already joined us now have more capacity for creative growth and opportunity.

What you might see as tedious problems and cost-centers holding your agency back, we see as our playground for invention and innovation. Our partners can extend and improve their core capabilities by off-loading some work onto us. And you gain shared revenue from selling services that your customers need.

You might be our ideal partner if you prefer

  • benefitting from recurring revenue, and 
  • not taking on additional complexity that distracts you from your core creative business.

Partners who sign up with us at DrupalCon will get significant benefits including preferred status and better terms and conditions compared to our standard offerings. Talk to us about it at our booth at Stand 13 or contact us to arrange a time to talk.

Facets of the Digital Experience - Dropsolid Sessions at DrupalCon

Check out my other post to see where to meet the Dropsolid Team at DrupalCon. You’re welcome to come say hello at our booth at Stand 13, and we can show you the facets of digital experience management as we see them, and also share our vision for the future.

Each one of our talks focuses on different facets of improving the digital experience for customers:

Sep 10 2019
Just like the poem says, “Little drop makes the mighty Ocean,” all contributions matter in the growth of the Drupal global community.  A diverse community results in great things. To ensure the longevity of Drupal digital experiences and the…
Sep 10 2019

DrupalCon Minneapolis 2020 is accepting session proposal submissions until Dec. 4, 2019! We welcome a wealth of perspectives and a vast knowledge base as presenters in Minneapolis for DrupalCon North America.

Sep 10 2019

Download the Menu Item Extras module like you would any other Drupal module. You can download it with Composer using the following command:

composer require drupal/menu_item_extras

Install the Menu Item Extras module. The module also provides a Demo module that can be used to see some examples of a menu with fields and display modes configured. In this case, we will just look at the base Menu Item Extras module.

Navigate to Structure > Menus > Main Navigation. The first thing you should notice is that there are additional links to Manage fields, Manage form display, Manage display, and View mode settings. This is very similar to what you have probably used on other entity types.

If you need to store any additional data for a menu link, you can do this on the Manage fields page. One potential use of this is to add an image field:

Manage Fields

You can then manage the way this is displayed on the menu link add/edit form:

Manage Form Display

You can also control how this menu item is displayed:

Manage Display

If you navigate to Structure > Display modes > View modes you can add additional view modes for menu items. In this example, I created a new view mode for Custom menu links. I called the view mode Image Link.

Custom Menu Link

You can now navigate back to Structure > Menus > Main Navigation and go to Manage display. In the Custom Display Settings section, you can enable the Image Link view mode and configure the display settings for the Default links and the Image Link view mode displays.

Custom Display Settings

You can now navigate to the View Modes Settings tab and select what view mode to use for each link in your menu.

View Mode Settings

This additional flexibility allows you to do a lot with your menu items. You could use this to build out a customized mega menu (this would require additional theme and template development). You could also use this to customize the display of menu items (perhaps by adding icons next to menu links, adding additional menu link descriptions, and more). The module provides the site building tools to customize your menu items; now it’s up to you to decide how you want to use them!

Sep 09 2019

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week.

You'll find that the meetings, while providing updates on completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide assistance on, we encourage you to get involved.

Drupal 9 Readiness Meeting

September 02, 2019

Meetings are for core and contributed project developers, as well as people who have integrations and services related to core. Site developers who want to stay up to date so that the eventual Drupal 9 upgrade of their sites is as easy as possible are also welcome.

  • It usually happens every other Monday at 18:00 UTC.
  • It is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • The transcript will be exported and posted to the agenda issue.

Multiple Version Compatibility for info.yml Files

Congratulations on landing multiple version compatibility for info.yml files! The new core_version_requirement key in *.info.yml files for modules and themes now supports semantic versioning as implemented by the Composer project. This allows modules and themes to specify that they are compatible with multiple major versions of Drupal core. For more information, read the issue: New 'core_version_requirement' key in info.yml files for modules and themes allows Composer semantic version constraints including specifying multiple major versions of core.

To follow up, the issue Don't catch exception for invalid 'core_version_requirement' in info.yml files was opened and Gábor Hojtsy posted Drupal 8.7.7+ will support extensions compatible with both Drupal 8 and 9! to explain the multi-version support in the newest release.
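For example, a module compatible with both Drupal 8.8+ and Drupal 9 might declare this in its info.yml file (the module name is hypothetical; the constraint syntax follows Composer semantic versioning):

```yaml
# example_module.info.yml -- hypothetical module declaring
# compatibility with both Drupal 8.8+ and Drupal 9.
name: Example Module
type: module
core_version_requirement: ^8.8 || ^9
```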

Drupal 9 Requirements Issue

The Drupal 9 requirements issue has been updated to list 3 requirements: 

  1. Multi-version compatibility,
  2. Green tests on Symfony 4.4, and
  3. Little or no use of deprecated APIs.

There is also a fallback date in the week of October 14th, alongside 8.9's branching. See [META] Requirements for opening the Drupal 9 branch.

Symfony Updates

Drupal Core Deprecation

Drupal core's own deprecation testing results are posted here.

DrupalCon Amsterdam

  • DrupalCon Amsterdam is approaching fast! It would be lovely to run Drupal 9 compatibility contribution events at next month's DrupalCon.
  • Tools for PHP deprecations should be there.

  • Tooling is still missing for constants, JS deprecations, and Twig deprecations.

Admin UI Meeting

September 04, 2019

  • Meetings are for core and contributed project developers as well as people who have integrations and services related to core. 
  • Usually happens every other Wednesday at 2:30pm UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • There are roughly 5-10 minutes between topics for those who are multitasking to follow along.
  • The Admin UI Meeting agenda is public and anyone can add new topics in the document.

Core Issue Draft

  • Review core issue draft to add Claro administration theme to core.
  • We need to open an issue to start evaluating what's needed to add Claro to core and to start getting feedback from everybody involved in the process.
  • In the roadmap issue, we still need several things.

Underlining Link

Today some concerns were raised about the latest issue on link underlining.

There are two proposed options to solve the issue:

  1. Black underlined links that turn blue on hover (black text with a black underline on load; on hover, the text changes to blue).
  2. Blue underlined links that lose their underline on hover (blue text with a black underline on load; on hover, the underline disappears, leaving only blue text).

Action Link Styles and Padding

The current design of action links leads to several issues:

  • Spacing between mixed elements button | button | action-link | action-link | button.
  • No explicit visual feedback that they're links.
  • Extra whitespace if an action link is the first element in a content row.

In the last review we had with Product Management for Claro, we received some feedback that we need to address before adding Claro to core.

Some initial tests are moving in this direction (spatial adjustments to the settings toolbar in Claro).

Composer in Core Initiative Meeting

September 04, 2019

While working toward issue 2982680: Add composer-ready project templates to Drupal core, we discovered that the Vendor Hardening Plugin sometimes fails to work, throwing an error. The issue is documented in Move Drupal Components out of 'core' directory.

A related issue, [meta] Reorganize Components so they are testable in isolation, covers the ability to test Drupal's core Components in isolation as much as possible.

Vendor Hardening plugin

Add Composer vendor/ hardening plugin to core.

Ryan Aslett summarized: duplicate the contents of core-file-security into core-vendor-hardening, because core-vendor-hardening is unable to find core-file-security when composer-installers moves it, due to Composer's behavior around plugin interaction. Moving all of the components might have other consequences and require additional effort that we eventually want to tackle, just not right now.

Broken Builds

  • While creating the 1.4.0 release of this project, this line was changed, resulting in broken builds for all packages that depended on behat/mink-selenium2-driver: 1.3.x-dev, such as Drupal core and many others.
  • It also looks like we're inching closer to a 1.7.2 release in behat/mink.    

Template Files

The templates we have so far look good. Once the Vendor Hardening plugin lands, the question is: will they be done? The tests are probably not something we want in DrupalCI like that. Possibly we should wait for "Add a new test type to do real update testing" before adding tests. It's okay to add this to core without tests for the time being.

Automatic Updates

Potential conflicts have been uncovered and need to be addressed.

Angie Byron has spent the past couple of days getting up to speed on the Automatic Updates initiative. It seems they're taking a “quasi-patch” approach, doing in-place updates. You can find Greg Anderson's summary in the issue comments to follow along.

Migration Initiative Meeting

September 05, 2019

This meeting:

  • Usually happens every Thursday and alternates between 14:00 and 21:00 UTC.
  • Is for core migrate maintainers and developers and anybody else in the community with an interest in migrations.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public migration meeting agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.
  • For anonymous comments, start with a bust in silhouette emoji. To take a comment or thread off the record, start with a no entry sign emoji.

Issues Needing Review

Great progress is being made, and there are a lot of issues that are awaiting review.

31 Days of Migrations

31 Days of Migrations has been an outstanding success, thanks to Mauricio Dinarte and his brother! How best can we include these guides in the official documentation and make sure they are disseminated to those who could use them? Mauricio suggested:

  • It could be copied over in full, like the book Drupal 7 - The Essentials.
  • It could be broken down into "recipes" (self-contained examples).
  • It could be broken apart, adding the relevant pieces to the existing documentation topics/pages.
Sep 09 2019

Rain logo updated

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.


The Mediacurrent development team uses a Composer project template that extends the official Drupal Composer template to add Rain projects as well as additional tools and scripts.

Our template by default leverages a fork of DrupalVM, which provisions the local environment. Note that Docker-based environments such as Lando or DDEV could be used as an alternative to Vagrant.

In this tutorial, we will walk through each step to get you up and running quickly. Below, you can also watch a narrated tutorial video to see these steps in action.

Installation instructions

First, you will want to create a repository wherever you typically host your Git projects (e.g. GitHub, Bitbucket, or GitLab). Once you have that set up, you can clone Mediacurrent’s repo and point the origin back to your Git repo. The example command below illustrates how this is done.


git remote set-url origin [email protected]:mediacurrent/shortcode_project.git

Next, you will want to initialize the project. You can do that by running the following commands with your local host name and IP (see example below).


composer install

composer drupal-scaffold

./scripts/hobson project:init example.mcdev

Finally, to build the project and run the install, you can simply run the following build command, which executes the Composer install and the Drupal install:


Note that this command requires Mediacurrent’s Vagrant environment in order to work. If you are using an alternative local environment, run composer install followed by the drush site install command instead of running the build script.

Once you get a full install working with the sample profile that’s been provided, you will want to follow the project README documentation for further setup instructions. Remember to commit all of your files and push up to your Git origin. That’s it!

Questions or comments? Let me know at

Sep 09 2019

When you build a new website, going live is relatively easy. You get ahold of a domain name, point it at a webhost, put the website code there, and you're up and running!

After a site is live, it gets a lot more complicated.

What's important about deployment?

If you have a simple brochure site, deploying updates doesn't have to be complicated. The more your site does, the more complex deployment becomes. A deployment plan can help you stay out of trouble, keep your site online, and minimize data loss. So when going live with an update to a site, you should ask:

  • How much downtime is acceptable?
  • How much testing do we need before we make a change to the production site?
  • What data could we lose, from the production site?
  • What might go wrong with this deployment strategy?
  • How can we recover if something does go wrong?

A good deployment plan should make you smile with comfort, knowing you have all the bases covered. Are you smiling? If not, read on.

Common deployment strategies

Here are the main strategies we've seen or used for deployment:

  • Do all work in the production environment so there's nothing to deploy
  • Copy the entire new site into the production environment
  • Compile/build a site and put the result into the production environment
  • Dev/Stage/Production pipeline
  • Blue/Green deployments

Let's take a deeper look at each one.

No Deployment - work in production

All too often, this is what you get if you aren't careful when hiring a freelancer. This really seems to be the standard approach for most WordPress sites, which to me is horrifying.

Coding is often a process of trying something, breaking things, and then fixing them. Rinse and repeat. If you're doing this on a live production website, your site visitors will see broken pages, weird issues, or sometimes nothing at all. If your site is already getting traffic, working in production is irresponsible and dangerous -- especially if you aren't extremely careful about backups, and aren't extremely proficient.

The only benefit of "no deployment" deployment strategies is that it's cheap -- you're saving the cost of managing a copy of your site, and deploying changes.

Copy site to production

This also seems to be a pretty common way of deploying sites -- simply copy the new site in its entirety to the production server and make it live.

For major upgrades, such as going from Drupal 7 to Drupal 8, or changing from one platform to an entirely different one, this is the main strategy we use. And there are definitely times when this strategy makes sense. However, for day-to-day maintenance, theme refreshes, or most typical deployments, this is not a very good approach.

If your production site has a database that changes regularly, you need to be extremely careful not to lose production data. For example, if your site allows user comments, takes e-commerce orders, or manages sales leads, simply copying a new site up risks losing something.

Save this one for entirely new sites. Don't do this for day-to-day work -- unless your site doesn't even have a database.

Build site and deploy

"Static site generators" like Gatsby and Jekyll have become quite popular recently, because they generate static sites that do not have a database -- greatly simplifying security. If you're running a full-blown Content Management System (CMS) like Drupal or WordPress, you're putting an application with bugs on the Internet where anyone can attack it. If your site is just a collection of files, they can't really attack it -- they can attack the hosting environment, but your site itself has far less "attack surface" for an attacker to go after.

Gatsby in particular is becoming quite popular as a front-end to Drupal and WordPress -- you write your content in a private CMS on a private LAN, not reachable from the Internet, export the entire site using Gatsby (the build step), and then copy the resulting code up to the web host (much like the previous deployment strategy).

If you use this approach, you still need to consider how to keep your CMS up to date, though if it's not online, updating it in place becomes a far more reasonable proposition.

Dev/Stage/Production pipeline

Now we've reached what we consider to be the "standard" deployment practice -- run 3 copies of your site:

  • Dev, or Development -- this copy is where you do the development work, or at least integrate all the various developer copies, and conduct the initial round of testing.
  • Stage, or Test -- The main purpose of this copy is to test the deployment process itself, and understand what might break when you roll out to production.
  • Production, or Live -- The site that's available to the public.

In general, code flows from dev to production, whereas content/data flows from production to dev. If your site takes orders, collects data using forms, supports ratings/reviews or comments, or does anything sophisticated, you'll probably end up with this deployment strategy.

Several of the more professional hosts, like Pantheon, Acquia, WP Engine, and others provide these 3 environments along with easy ways to deploy code up to prod, and copy data down from prod.

Many larger companies or highly technical startups have built out "continuous integration/continuous deployment" pipelines along these lines -- including Freelock. "Continuous Integration" basically kicks off automatic tests after code is pushed to a particular branch, and "Continuous Deployment" automates the deployment of code to production once tests have passed.

This is the key service we provide to nearly all our clients -- two different kinds of testing, a fully automatic pipeline, with automatic backups, release scheduling, release note management, and more. And we've built our pipeline to work with a variety of hosts, including Pantheon and Acquia but also bare Linux servers at any cloud provider.

The main downsides of this type of deployment are that it can be slow to deploy, very hard to set up, and prone to breaking as code and standards evolve, and different platforms have different challenges around deploying configuration changes. For example, when you move a WordPress database to another location, you need to do a string search/replace in the database to update the URL and the disk location, and you may need to do manual steps after the code gets deployed. Drupal, on the other hand, may put the site in maintenance mode for a few minutes while database updates are applied.

All in all, when done well, this is a great deployment strategy, but can be very expensive to maintain. That's why our service is such a great value -- we do all the hard work of keeping it running smoothly across many dozens of clients, have automated a lot of the hard bits, and streamlined the setup.

Blue/Green deployments

If even a minute of downtime costs a significant amount of income, you may want to consider a Blue/Green deployment strategy. This is a strategy made for "high availability" -- doing your utmost to both minimize maintenance windows, and provide a rock-solid roll-back option if something goes awry.

With a Blue/Green deployment strategy, you essentially create two full production environments -- "blue" and "green". One of them is live at any given instance, the other is in standby. When you want to deploy an update, you deploy all the new code and configuration changes to the offline environment, and when it's all ready to go, you simply "promote" it to be the live one. For example, if Blue is live, you deploy everything to Green, possibly using a normal dev/stage/prod deployment process. The configuration changes happen while the green site is offline, so the public never gets a "down for maintenance" message. When it's all ready, you promote Green to live, and Blue becomes the offline standby copy. And if you discover a problem after going live, you simply promote Blue back to live, and Green goes into standby where it can get fixed.

There is a big downside here -- if your site takes orders, or otherwise changes the production database, there's a window where you could lose data, much like the "Copy Site to Production" strategy. You might be able to somewhat mitigate this by setting the live site to "read only" but still available, while you copy the database to the standby site and then apply config and promote. Or you might be able to create a "journal" or log of changes that you replay on the new site after it gets promoted. Or move to a micro-service architecture -- but then you're just moving the problem into individual microservices that still need a deployment strategy.

Which deployment strategy is best?

There is no "best" deployment strategy -- it's all about tradeoffs, and what is most appropriate for a particular site's situation. If you break up your site into multiple pieces, you may end up using multiple strategies -- but each one might be quite a bit simpler than trying to update the whole. On the other hand, that might actually lower availability, as various pieces end up with different maintenance schedules.

If you're running a PHP-based CMS, and you want to rest easy that your site is up-to-date, running correctly, and with a solid recovery plan if something goes wrong, we can help with that!

Sep 09 2019
Sep 09

With Drupal 9 slated to be released in June 2020, the Drupal community has less than a year to go. So, before mapping out a transition plan, now is the time to discuss what to expect from Drupal 9.

Switch to Drupal 9

You must be wondering:

Is Drupal 9 a reasonable plan for you?

Is it easy to migrate from recent Drupal versions to the new one?

This blog post has all your questions answered.

Let’s see the major changes in Drupal 9

Drupal 9 is being built on Drupal 8, and the migration will be far easier this time. The major changes include:

  • Updated dependency versions so that they remain supported
  • Removal of deprecated code before release

The foremost update in Drupal 9 is the move to Symfony 4 or 5, and the team is working hard on its implementation.

Planning to Move to Drupal 9?

Drupal 8.7, released in 2019, optionally supports Twig 2, which has helped developers start testing their code against that version of Twig. Drupal 8.8 is expected to support the most recent version of Symfony. Ideally, the Drupal community would like to release Drupal 9 with support for Symfony 5, which is due at the end of 2019.

Drupal 9

If you are already using Drupal 8, the best advice is to keep your site up to date. Drupal 9 is an updated version of Drupal 8 with updated third-party dependencies and deprecated code removed.

Ensure that you are not using any deprecated APIs or modules and, wherever possible, use the latest versions of dependencies. If you do that, your upgrade experience should be problem-free.

Since Drupal 9 is being built within version 8, developers will have the choice to test their code and make updates before the release of Drupal 9. This is an outstanding update and was not possible with the previous versions of Drupal!
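As a rough illustration of what "checking for deprecated code" can look like, here is a sketch that greps a made-up custom module for drupal_set_message(), which was deprecated in Drupal 8.5. The file and its contents are fabricated for this example; in practice, dedicated tools such as the drupal-check CLI or the Upgrade Status module do this analysis properly:

```shell
# Create a throwaway custom module file that still calls a deprecated API.
mkdir -p modules/custom/example
echo 'drupal_set_message("hi");' > modules/custom/example/example.module

# Scan all custom modules for the deprecated call.
grep -rn "drupal_set_message" modules/custom
```

Real deprecation checkers go further, flagging any API marked @deprecated in core, so you know exactly what to replace before Drupal 9 lands.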

So, where are you in the Drupal journey?

Here are some scenarios to support your migration process:

Are you on Drupal 6?

You are way behind! We strongly suggest you move to Drupal 8 as soon as you can; migration from 8 to 9 will be straightforward. Drupal 8 includes migration support for Drupal 6 that probably won't be included in Drupal 9. There may be contributed modules to fill the gap, but it is better to be safe.

Are you on Drupal 7?

Support for both Drupal 7 and 8 will end by 2021. Since the release of Drupal 9 is set for June 2020, you should plan your upgrade and go live. If you haven't migrated by then, your website may be vulnerable to security threats. Since a Drupal 7 to 9 migration is similar to a new development project, you just need to factor in the timeline involved.

Are you on Drupal 8?

Great work if you are already on Drupal 8! For you, moving to the next major version will take minimal effort.

The big difference between the last version of Drupal 8 and the first release of Drupal 9 is that deprecated code will have been removed. You just need to check that your themes, modules, and profiles don't include any such code. So, there's no need to worry about migrating your content at all!

Want to know more about Drupal and its migration process?

Sep 09 2019
Sep 09

The node_list cache tag is added automatically when we create a view that displays node entities in Drupal 8. This cache tag invalidates the cache of all views that list any kind of node (page, article, ...) whenever we perform a CUD (create, update, delete) action on any node.

This is a very good cache invalidation strategy, since when we modify a node through a CUD action, the cache of every view that displays nodes is invalidated to reflect the change. Yes, the improvements in the new Drupal 8 Cache API are amazing! Thanks to all the contributors (like Wim Leers) who made this possible.

node_list cache tag views drupal9

(to see the cache tags in your header response, just enable your settings.local.php)

So far so good, but... what would happen if we had a high-traffic website with hundreds of different node bundles and hundreds of views displaying different kinds of nodes?

In that case, if we modify a node (let's say a page), the cache of every view listing page nodes (but also of views listing article nodes and other node bundles) will be invalidated. This means that if we modify a page node, the cache of all views displaying article nodes will also be invalidated. This could be a serious performance issue, especially on a site with a lot of updates, like a newspaper website.

How can we invalidate the view caches in a more specific way, let's say only for nodes of the same type?

To do so, we'll use a two steps process:

- Place a custom cache tag for all views displaying the same node type, like node:type:page for all views listing page nodes, node:type:article for views listing articles nodes and so on...
- Invalidate these custom cache tags when a CUD operation is performed on a specific node type with a hook_node_presave() 

Add custom cache tags with the Views Custom Cache Tags contrib module

Luckily, to solve the node_list cache tag problem, the community made a contrib module that lets us place custom cache tags in our views: the Views Custom Cache Tags module. This module allows us to define custom cache tags for any view we build.

In this case, we are going to place in our views a custom cache tag for each type of node we are displaying:
node:type:article for views listing articles
node:type:page for views listing pages
node:type:<your custom node type> ....

First, we are going to download and install the module.

# Download the module with composer
composer require drupal/views_custom_cache_tag
# Install the module with Drupal Console
drupal moi views_custom_cache_tag 

Then we can create two block views, one listing 5 articles (name it Block Articles) and one listing 5 pages (name it Block Pages). Next, we place a custom cache tag, node:type:article, in the view that lists articles (Block Articles). We do the same, but with another custom cache tag, node:type:page, in the view listing pages (Block Pages).

Let's see how to do it for the view (block) listing articles:

1. Edit the view
2. Go to ADVANCED and click on 'Tag based' in Caching

custom cache tags views drupal8

3. Then click on Custom Tag based

custom cache tags views drupal 8

4. Insert the custom cache tag for this node type. In our case, as we are listing articles, we introduce node:type:article. For views listing other kind of nodes, we'll introduce node:type:<node-type>. Don't forget to click on Apply when you're done.

custom cache tags views drupal8

5. Save the view

custom cache tags views Drupal 8

Once we have placed custom cache tags in all views listing nodes, we can move on to the second step: invalidating these tags when needed.

Invalidate custom cache tags with a hook_node_presave

Now we need to invalidate these custom cache tags when a CUD action is performed on a specific node type thanks to the hook_node_presave.

To do that, we're going to create a module. You can download the code of this module here.

Let's first create our module with Drupal Console as follow:

drupal generate:module  --module="Invalidate custom cache tags" --machine-name="kb_invalidate_custom_cache_tags" --module-path="modules/custom" --description="Example to invalidate views custom cache tags" --core="8.x" --package="Custom"  --module-file --dependencies="views_custom_cache_tag"  --no-interaction

Then we enable the module, we can do it also with Drupal Console:

drupal moi kb_invalidate_custom_cache_tags

Now we can edit our kb_invalidate_custom_cache_tags.module and place the following hook:

use Drupal\Core\Cache\Cache;
use Drupal\Core\Entity\EntityInterface;

/**
 * Implements hook_ENTITY_TYPE_presave().
 *
 * Invalidates cache tags for node lists.
 */
function kb_invalidate_custom_cache_tags_node_presave(EntityInterface $entity) {
  $cache_tag = 'node:type:' . $entity->getType();
  Cache::invalidateTags([$cache_tag]);
}
Yes, we still have hooks in Drupal 8... This hook is fired each time a node is created or updated. We first retrieve the node type (page, article, ...) with $entity->getType() and store it in the $cache_tag variable; this variable corresponds to our custom tag for this kind of node. Next, we mark this cache tag as invalid in all bins with Cache::invalidateTags([$cache_tag]);, so the cache of every view carrying this custom cache tag will be invalidated.

In this case, if we insert or update a node of type article, the views custom cache tag will be node:type:article and the cache of all views with this custom cache tag will be invalidated. The cache of the views with other custom cache tags like node:type:page will remain valid. This was just what we were looking for! Thank you Drupal!


To avoid invalidating the cache of all node views whenever we make a CUD operation on a node, we need to replace the generic node_list cache tag with a more specific custom cache tag like node:type:<node-type>.

Thanks to the Views Custom Cache Tags contrib module, we can now insert custom cache tags in our views based on the node type listed in the view, such as node:type:article, node:type:page and so on.

Next, we mark these custom cache tags as invalid when we insert or update a specific node type, thanks to the hook_ENTITY_TYPE_presave implementation we've placed in our custom module.

Voilà! If you have another kind of strategy to tackle the node_list problem, please share it with us in the comments.

Sep 09 2019
Sep 09

The staff and board of the Drupal Association would like to congratulate our newest At-Large board member:

Leslie Glynn

Leslie has more than 30 years of experience in the tech field as a software developer and project manager. She has been a freelance Drupal Project Manager and Site Builder since 2012. Glynn is very active in the Drupal community as an event organizer (Design 4 Drupal, Boston and NEDCamp), sprint organizer, mentor, trainer and volunteer. She is the winner of the 2019 Aaron Winborn Award. This annual award recognizes an individual who demonstrates personal integrity, kindness, and above-and-beyond commitment to the Drupal community.

Being a volunteer at numerous Drupal camps and DrupalCons has given me the opportunity to meet and learn from many diverse members of the Drupal community. I hope to bring that knowledge and experience to my work on Drupal Association initiatives. One of the things I would like to help with is growing Drupal adoption through new initiatives that reach out to under-represented and diverse groups through an increased presence at secondary schools and universities and to groups such as "Girls Who Code" and other groups in the tech space.

We are all looking forward to working with you, Leslie.

Thank you to all our candidates

On behalf of all the staff and board of the Drupal Association, and I’m sure the rest of the Drupal community, I would like to thank all of those people who stood for election this year. It truly is a big commitment to contribution and one to be applauded. We wish you well for 2019 and hope to see you back in 2020!

About the Elections Methodology: Instant Run-off Voting (IRV)

Elections for the Community-at-large positions on the Drupal Association Board are conducted through Instant Run-off Voting. This means that voters can rank candidates according to their preference. When tabulating ballots, the voters' top-ranked choices are considered first. If no candidate has more than 50% of the vote, the candidate with the lowest votes is eliminated. Then the ballots are tabulated again, with all the ballots that had the eliminated candidate as their first rank now recalculated with their second rank choices. This process is repeated until only two candidates remain and a clear winner can be determined. This voting method helps to ensure that the candidate who is most preferred by the most number of voters is ultimately elected. You can learn more about IRV (also known as Alternative Vote) in this video.

Detailed Voting Results

There were 12 candidates in contention for the single vacancy among the two community-at-large seats on the Board. 1,050 voters cast their ballots out of a pool of 49,498 eligible voters (2.2%).

The full results output is below. The system allows for candidates to keep their name hidden, if they choose, so we replaced the names of those who did with a candidate number:

The number of voters is 1050 and there were 998 valid votes
and 52 empty votes. Removed withdrawn candidate Tushar Thatikonda from 
the ballots.

Counting votes using Instant Runoff Voting.

 R|Candi|Candi|Imre |Brian|Candi|Shada|Ahmad|Candi|Alann|Manji|Lesli|Exhau
  |date |date |Gmeli| Gilb|date |b Ash| Khal|date |a Bur|t Sin|e Gly|sted 
  |4    |3    |g Mei|ert  |2    |raf  |il   |1    |ke   |gh   |nn   |     
  |     |     |jling|     |     |     |     |     |     |     |     |     
  |     |     |     |     |     |     |     |     |     |     |     |     
 1|   71|   74|  166|  119|   36|   45|    7|  115|   67|  116|  182|    0
  | Count of first choices.
 2|   71|   75|  167|  120|   36|   46|     |  116|   67|  117|  183|    0
  | Count after eliminating Ahmad Khalil and transferring votes.
 3|   72|   76|  177|  124|     |   47|     |  118|   68|  117|  185|   14
  | Count after eliminating Candidate 2  and transferring votes.
 4|   74|   76|  178|  125|     |     |     |  132|   70|  130|  186|   27
  | Count after eliminating Shadab Ashraf and transferring votes.
 5|   89|   77|  183|  133|     |     |     |  142|     |  131|  211|   32
  | Count after eliminating Alanna Burke and transferring votes.
 6|   93|     |  192|  134|     |     |     |  151|     |  134|  217|   77
  | Count after eliminating Candidate 3 and transferring votes.
 7|     |     |  199|  149|     |     |     |  177|     |  136|  248|   89
  | Count after eliminating Candidate 4 and transferring votes.
 8|     |     |  208|  163|     |     |     |  228|     |     |  254|  145
  | Count after eliminating Manjit Singh and transferring votes.
 9|     |     |  239|     |     |     |     |  247|     |     |  296|  216
  | Count after eliminating Brian Gilbert and transferring votes.
10|     |     |     |     |     |     |     |  288|     |     |  359|  351
  | Count after eliminating Imre Gmelig Meijling and transferring votes.
  | Final round is between Candidate 1 and Leslie Glynn.
  | Candidate Leslie Glynn is elected.

Winner is Leslie Glynn.
Sep 09 2019
Sep 09

We’re back with an overview of the blog posts we wrote last month. If there are some you particularly enjoyed, this is the perfect opportunity to revisit them, as well as catch up on the ones you might have missed.

Recap of Acquia's webinar on the Digital Experience Platform

The first post we wrote in August is a recap of Acquia's webinar on the DXP (Digital Experience Platform), which was presented by Tom Wentworth, SVP of Product Marketing at Acquia, and Justin Emond, CEO of Third and Grove.

They talked about digital experiences in general, then explained what a DXP is, why an open approach is best for a DXP, and how Acquia can serve as the basis for an open DXP.

The high emphasis placed on digital experiences is due to the fact that a single negative one can do irreparable damage to a brand. It is thus important to deliver integrated experiences on a platform that’s future-ready. 

As the only truly open DXP, Acquia’s Open Experience Platform is likely the best choice, as integrations with future technologies will be easier due to this open nature.

Read more

Interview with Ricardo Amaro: The future is open, the future is community and inclusion

Our second post is part of the series of our Drupal Community Interviews. This one features a prominent and prolific member of the community - Ricardo Amaro, Principal Site Reliability Engineer at Acquia and an active member of the Portuguese as well as the broader Drupal communities.

Ricardo has been involved in numerous important projects and initiatives, ranging from more technical endeavors such as Docker and containers, to more community-oriented things such as the Promote Drupal initiative.

Apart from that, he has presented at Drupal events and participated in the organization of several of them in Portugal as the president of the Portuguese Drupal Association.

He is also a strong advocate for Free Software and encourages collaboration with other projects in the ecosystem. He strives to keep the future of the web and technology in general open and rich in possibilities.

Read more

Top 10 Drupal Accessibility Modules

Even though Drupal is already quite optimized for accessibility, it never hurts to have even more resources at one’s disposal. This was our reasoning behind researching Drupal’s available accessibility modules and putting together this list. 

The modules on the list touch different aspects of accessibility and take into account everyone who interacts with the site in any way: there are modules for developers building the site, those for admins and content editors, and those that are geared towards users of the site (e.g. the Fluidproject UI Options module).

Some of the modules have particularly interesting functionality. Namely, the a11y module provides support for simulating specific disabilities, which helps developers feel empathy for users with these disabilities. The htmLawed module can also be especially useful, as it improves both accessibility and security.

Read more

Interview with pinball wizard Greg Dunlap, Senior Digital Strategist at Lullabot

Next up, we have another community interview, this one with pinball enthusiast Greg Dunlap, Lullabot's Senior Digital Strategist. Interestingly, his first interaction with Drupal was with Lullabot, the company he's now working for, more than 10 years later!

Greg points out that it was actually Lullabot’s Jeff Eaton who gave him the push to start contributing, and the two became really good friends. He believes (and we agree!) that who you do something with is more important than what you do - very fitting, then, that he and Jeff now form Lullabot’s strategy team.

One of the things he has particularly enjoyed recently was working with the Drupal Diversity and Inclusion group. Since welcoming diverse backgrounds and viewpoints into the community is instrumental to the future of Drupal, he encourages anyone who’s interested to join the initiative.

Read more

Agiledrop recognized as a top Drupal development company

Our final post from August is a bit more company oriented. In a press release published in early August, the IT directory and review platform listed us among the top 10 Drupal development companies of August 2019.

Of course, we’re very happy with the recognition and, with our diverse contribution to the Drupalverse and the numerous successful client projects, we feel it is well deserved. 

Among the reasons for selecting us, the platform's spokesperson listed the super-fast integration of development teams into clients' teams, our clear and frequent communication with clients, and our adherence to strict coding and security standards. 

To learn more about our work, you can also check out our portfolio of references and case studies, as well as our profile page on their platform, which their team helped us build.

Read more

These were all our blog posts from August. We'll be back again next month with an overview of September's posts. Till then - enjoy!

Sep 09 2019
Sep 09

Some different modules and plugins can alter the display of a view in Drupal, for instance, to alternate the order of image and text every new row, or to build some kind of stacked layout. 

It is possible to alter the display of a view with just a few lines of CSS code instead. This approach has many advantages, the most relevant being that you don't have to install and update a module.

Keep reading to learn how!

Step #1. - The Proposed Layout

190828 theming views

As you can see, we can divide the layout into six columns and five rows. There are empty cells in the grid, while other cells contain one item of the view spanning several cells (a grid area). The Drupal view shows a list of articles and their titles. The view format is an unformatted list.

Step #2. - Create an Image Style

To ensure that all images are squared, it is necessary to configure an image style and set it as display style in the view.

  • Click Configuration > Image styles
    190828 theming views 001
  • Click Add image style
    190828 theming views 002
  • Give the new image style a proper name, for example, Crop 600x600.

It is always a good idea to include some reference about the dimension or the proportion of the image style. That helps when having multiple image styles configured.

  • Click Create new style
  • Select Crop from the dropdown
  • Click Add
    190828 theming views 003
  • Set height and width for the crop effect (make sure both dimensions are equal)
  • Leave the default crop anchor at the center of the image
  • Click Add effect
    190828 theming views 004
  • Make sure the image effect was recorded properly and click Save
    190828 theming views 005

Step #3. - Create the View

You can read more about rewriting results in Views here.

  • Save the view
  • Click Structure > Block layout
  • Scroll down to the Content section
  • Click Place block
    190828 theming views 013
  • Search for your block
  • Click Place block
  • Uncheck Display title
  • Click Save block
    190828 theming views 014
  • Drag the cross handle and place the block above the Main content
  • Scroll down and click Save blocks

Step #4. - Theming the View

There are 3 selectors you need to target to apply the layout styles to the view:

  • .gallery-item  (each content card will be a grid)

  • #block-views-block-front-gallery-block-1 .view-content

  • #block-views-block-front-gallery-block-1 .view-row

We set the specificity of the CSS styles on the block. The classes .view-content and .view-row are default Views classes. Theming only with these would break the layout of other views on the site, for example, the teaser view on the front page.

Hint: I am working on a local development environment with a subtheme of Bartik. There is much more about Drupal theming at OSTraining here.
If you don’t know how to create a subtheme in Drupal yet, and you are working on a sandbox installation, just add the code at the end of the file and please remember always to clear the cache.


Let’s start with the content inside the .gallery-item container. It will be a grid with one column and 4 rows. The image will cover all 4 rows, whereas the title text will be located on the last row. To center the title on its cell, we declare the link tag as a grid container too.

  • Edit the CSS code:
.gallery-item {
    display: grid;
    grid-template-rows: repeat(4, 1fr);
}

.gallery-item a:first-of-type {
    grid-row: 1 / span 4;
    grid-column: 1;
}

.gallery-item a:last-of-type {
   grid-row: 4;
   grid-column: 1;
   display: grid; /* Acting as a grid container */
   align-content: center;
   justify-content: center;
   background-color: rgba(112, 97, 97, 0.5);
   color: white;
   font-size: 1.2em;
}
 Make the images responsive.

  • Edit the CSS code:
img {
   display: block;
   max-width: 100%;
   height: auto;
}

As already stated, we need a grid with 5 rows and 6 columns. After declaring it, map every position in the grid according to the layout with an area name. The empty cells/areas will be represented with a period. 

  • Edit the CSS code:
#block-views-block-front-gallery-block-1 .view-content {
 display: grid;
 grid-template-columns: repeat(6, 1fr);
 grid-template-rows: repeat(5, 1fr);
 grid-gap: 0.75em;
 grid-template-areas:
  ". thumb1 main main main thumb2"
  ". thumb3 main main main thumb4"
  ". thumb5 main main main thumb6"
  "secondary secondary thumb7 thumb8 thumb9 ."
  "secondary secondary . . . .";
 max-width: 70vw;
 margin: 0 auto;
}

Now it’s time to assign each grid item to its corresponding region.

  • Edit the CSS code:
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(1) {
   grid-area: main;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(2) {
   grid-area: secondary;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(3) {
   grid-area: thumb1;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(4) {
   grid-area: thumb3;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(5) {
   grid-area: thumb5;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(6) {
   grid-area: thumb2;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(7) {
   grid-area: thumb4;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(8) {
   grid-area: thumb6;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(9) {
   grid-area: thumb7;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(10) {
   grid-area: thumb8;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(11) {
   grid-area: thumb9;
}

I think this is a practical way to layout Views items the way you want without the need of installing extra modules, which could unnecessarily affect the performance of your site. 

The Media Queries

The layout will break at around 970px, because of the font size. 

  • Edit the CSS code:
@media screen and (max-width: 970px) {
 .views-row > div .gallery-item > a:nth-child(2) {
   font-size: .9em;
 }
}

To change the layout, just add a media query with a new grid-template-areas distribution, and of course, we have to change the way the rows and columns are distributed. The items are already assigned to their respective areas.

  • Edit the CSS code:
@media screen and (max-width: 700px) {
 .view-content {
   grid-template-columns: repeat(2, 1fr);
   grid-template-rows: repeat(10, auto);
   grid-template-areas:
     "main main"
     "main main"
     "thumb1 thumb2"
     "thumb3 thumb4"
     "secondary secondary"
     "secondary secondary"
     "thumb5 thumb6"
     "thumb7 thumb8"
     "thumb9 thumb9"
     "thumb9 thumb9";
 }
}

This layout will work even with the smallest device screen.

I hope you liked this tutorial. Thanks for reading!

About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other open source content management systems and technologies.
Sep 09 2019
Sep 09

3 minute read Published: 9 Sep, 2019 Author: Colan Schwartz
Drupal Planet , Aegir , DevOps

Have you been looking for a self-hosted solution for hosting and managing Drupal sites? Would you like to be able to upgrade all of your sites at once with a single button click? Are you tired of dealing with all of the proprietary Drupal hosting providers that won’t let you customize your set-up? Wouldn’t it be nice if all of your sites had free automatically-updating HTTPS certificates? You probably know that Aegir can do all of this, but it’s now trivial to set up a temporary trial instance to see how it works.

The new Aegir Development VM makes this possible.


Throughout Aegir’s history, we’ve had several projects striving to achieve the same goal. They’re listed in the Contributed Projects section of the documentation.

Aegir Up

Aegir Up was based on a VirtualBox virtual machine (VM), managed by Vagrant and provisioned with Puppet. It was superseded by Valkyrie (see below).

Aegir Development Environment

Aegir Development Environment took a completely different approach using Docker. It assembles all of the services (each one in a container, e.g. the MySQL database) into a system managed by Docker Compose. While this is a novel approach, it’s not necessary to have multiple containers to get a basic Aegir instance up and running.

Valkyrie
Valkyrie was similar to Aegir Up, but provisioning moved from Puppet to Ansible. Valkyrie also made extensive use of custom Drush commands to simplify development.

Its focus was more on developing Drupal sites than on developing Aegir. Now that we have Lando, it’s no longer necessary to include this type of functionality.

It was superseded by the now current Aegir Development VM.

Aegir Development VM
Like Valkyrie, the Aegir Development VM is based on a VirtualBox VM (but that’s not the only option; see below) managed with Vagrant and provisioned with Ansible. However, it doesn’t rely on custom Drush commands.


Customizable configuration

The Aegir Development VM configuration is very easy to customize as Ansible variables are used throughout.

For example, if you’d like to use Nginx instead of Apache, simply replace:

    aegir_http_service_type: apache

with:
    aegir_http_service_type: nginx

…or override using the command line.

You can also install and enable additional Aegir modules from the available set.

Support for remote VMs

For those folks with older hardware who are unable to spare extra gigabytes (GB) for VMs, it’s possible to set up the VM remotely.

While the default amount of RAM necessary is 1 GB, 2 GB would be better for any serious work, and 4 GB is necessary if creating platforms directly from Packagist.

Support for DigitalOcean is included, but other IaaS providers (e.g. OpenStack) can be added later. Patches welcome!

Fully qualified domain name (FQDN) not required

While Aegir can quickly be installed with a small number of commands in the Quick Start Guide, that process requires an FQDN with global DNS configuration. That is not the case with the Dev VM, which assumes aegir.local by default.
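For a local VM this usually just means a hosts-file entry on your workstation; for example (the IP address below is an assumption for illustration -- use whatever address your VM actually gets):

```
# /etc/hosts
192.168.56.10  aegir.local
```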

Simplified development

You can use it for Aegir development as well as trying Aegir!

Unlike the default set-up provisioned by the Quick Start Guide, which would require additional configuration, the individual components (e.g. Hosting, Provision, etc.) are cloned repositories making it easy to create patches (and for module maintainers: push changes upstream).


We’ve recently updated the project so that an up-to-date VM is being used, and it’s now ready for general use. Please go ahead and try it.

If you run into any problems, feel free to create issues on the issue board and/or submit merge requests.

The article Try Aegir now with the new Dev VM first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Sep 07 2019
Sep 07

In the previous article, we began talking about debugging Drupal migrations. We gave some recommendations of things to do before diving deep into debugging. We also introduced the log process plugin. Today, we are going to show how to use the Migrate Devel module and the debug process plugin. Then we will give some guidelines on using a real debugger like XDebug. Next, we will share tips so you get used to migration errors. Finally, we are going to briefly talk about the migrate:fields-source Drush command. Let’s get started.

Example configuration for debug process plugin

The migrate_devel module

The Migrate Devel module is very helpful for debugging migrations. It allows you to visualize the data as it is received from the source, the result of field transformation in the process pipeline, and values that are stored in the destination. It works by adding extra options to Drush commands. When these options are used, you will see more output in the terminal with details on how rows are being processed.

As of this writing, you will need to apply a patch to use this module. Migrate Devel was originally written for Drush 8, which is still supported but no longer recommended. Instead, you should use at least version 9 of Drush. Between versions 8 and 9 there were major changes in Drush internals, and commands need to be updated to work with the new version. Unfortunately, the Migrate Devel module is not fully compatible with Drush 9 yet. Most of the benefits listed on the project page have not been ported. For instance, automatically reverting the migrations and applying the changes to the migration files is not yet available. The partial support is still useful, and to get it you need to apply the patch from this issue. If you are using the Drush commands provided by Migrate Plus, you will also want to apply this patch. If you are using the Drupal composer template, you can add this to your composer.json to apply both patches:

"extra": {
  "patches": {
    "drupal/migrate_devel": {
      "drush 9 support": ""
    },
    "drupal/migrate_tools": {
      "--limit option": ""
    }
  }
}
With the patches applied and the modules installed, you will get two new command line options for the migrate:import command: --migrate-debug and --migrate-debug-pre. The major difference between them is that the latter runs before the destination is saved. Therefore, --migrate-debug-pre does not provide debug information of the destination.

Using any of the flags will produce a lot of debug information for each row being processed. Many times, analyzing a subset of the records is enough to spot potential issues. The patch to Migrate Tools will allow you to use the --limit and --idlist options with the migrate:import command to limit the number of elements to process.

To demonstrate the output generated by the module, let’s use the image migration from the CSV source example. You can get the code in the ud_migrations module. The following snippets show how to execute the import command with the extra debugging options and the resulting output:

# Import only one element.
$ drush migrate:import udm_csv_source_image --migrate-debug --limit=1

# Use the row's unique identifier to limit which element to import.
$ drush migrate:import udm_csv_source_image --migrate-debug --idlist="P01"
$ drush migrate:import udm_csv_source_image --migrate-debug --limit=1
│                                   $Source                                    │
array (10) [
    'photo_id' => string (3) "P01"
    'photo_url' => string (74) ""
    'path' => string (76) "modules/custom/ud_migrations/ud_migrations_csv_source/sources/udm_photos.csv"
    'ids' => array (1) [
        string (8) "photo_id"
    ]
    'header_offset' => NULL
    'fields' => array (2) [
        array (2) [
            'name' => string (8) "photo_id"
            'label' => string (8) "Photo ID"
        ]
        array (2) [
            'name' => string (9) "photo_url"
            'label' => string (9) "Photo URL"
        ]
    ]
    'delimiter' => string (1) ","
    'enclosure' => string (1) """
    'escape' => string (1) "\"
    'plugin' => string (3) "csv"
]
│                                 $Destination                                 │
array (4) [
    'psf_destination_filename' => string (25) "picture-15-1421176712.jpg"
    'psf_destination_full_path' => string (25) "picture-15-1421176712.jpg"
    'psf_source_image_path' => string (74) ""
    'uri' => string (29) "./picture-15-1421176712_6.jpg"
]
│                             $DestinationIDValues                             │
array (1) [
    string (1) "3"
]
Called from +56 /var/www/drupalvm/drupal/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_csv_source_image'

In the terminal, you can see the data as it is passed along in the Migrate API. In the $Source, you can see how the source plugin was configured and the different columns for the row being processed. In the $Destination, you can see all the fields that were mapped in the process section and their values after executing all the process plugin transformations. In $DestinationIDValues, you can see the unique identifier of the destination entity that was created. This migration created an image, so the destination array has only one element: the file ID (fid). For paragraphs, which are revisioned entities, you will get two values: the id and the revision_id. The following snippet shows the $Destination and $DestinationIDValues sections for the paragraph migration in the same example module:

$ drush migrate:import udm_csv_source_paragraph --migrate-debug --limit=1
│                                 $Destination                                 │
array (3) [
    'field_ud_book_paragraph_title' => string (32) "The definitive guide to Drupal 7"
    'field_ud_book_paragraph_author' => string UTF-8 (24) "Benjamin Melançon et al."
    'type' => string (17) "ud_book_paragraph"
]
│                             $DestinationIDValues                             │
array (2) [
    'id' => string (1) "3"
    'revision_id' => string (1) "7"
]
Called from +56 /var/www/drupalvm/drupal/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_csv_source_paragraph'

The debug process plugin

The Migrate Devel module also provides a new process plugin called debug. The plugin works by printing the value it receives to the terminal. As Benji Fisher explains in this issue, the debug plugin offers the following advantages over the log plugin provided by the core Migrate API:

  • The use of print_r() handles both arrays and scalar values gracefully.
  • It is easy to differentiate debugging code that should be removed from logging plugin configuration that should stay.
  • It saves time as there is no need to run the migrate:messages command to read the logged values.
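
As an illustration, swapping log for debug in a field mapping is a one-word change. In this sketch, the field and source names are hypothetical:

```yaml
field_title:
  - plugin: debug
    label: 'Raw title value: '
    source: src_title
  - plugin: callback
    callable: trim
```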

In short, you can use the debug plugin in place of log. There is a particular case where using debug is really useful: placed in the middle of a process plugin chain, it lets you see how elements are transformed at each step. The following snippet shows an example of this setup and the output it produces:

  - plugin: skip_on_empty
    source: src_fruit_list
    method: process
    message: 'No fruit_list listed.'
  - plugin: debug
    label: 'Step 1: Value received from the source plugin: '
  - plugin: explode
    delimiter: ','
  - plugin: debug
    label: 'Step 2: Exploded taxonomy term names '
    multiple: true
  - plugin: callback
    callable: trim
  - plugin: debug
    label: 'Step 3: Trimmed taxonomy term names '
  - plugin: entity_generate
    entity_type: taxonomy_term
    value_key: name
    bundle_key: vid
    bundle: tags
  - plugin: debug
    label: 'Step 4: Generated taxonomy term IDs '
$ drush migrate:import udm_config_entity_lookup_entity_generate_node --limit=1
Step 1: Value received from the source plugin: Apple, Pear, Banana
Step 2: Exploded taxonomy term names Array
(
    [0] => Apple
    [1] =>  Pear
    [2] =>  Banana
)
Step 3: Trimmed taxonomy term names Array
(
    [0] => Apple
    [1] => Pear
    [2] => Banana
)
Step 4: Generated taxonomy term IDs Array
(
    [0] => 2
    [1] => 3
    [2] => 7
)
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_config_entity_lookup_entity_generate_node'

The process pipeline is part of the node migration from the entity_generate plugin example. In the code snippet, a debug step is added after each plugin in the chain. That way, you can verify that the transformations are happening as expected. In the last step you get an array of the taxonomy term IDs (tid) that will be associated with the field_tags field. Note that this plugin accepts two optional parameters:

  • label is a string to print before the debug output. It can be used to give context to what is being printed.
  • multiple is a boolean that, when set to true, signals the next plugin in the pipeline to process each element of an array individually. The functionality is similar to the multiple_values plugin provided by Migrate Plus.

Using the right tool for the job: a debugger

Many migration issues can be solved by following the recommendations from the previous article and the tools provided by Migrate Devel. But there are problems so complex that you need a full-blown debugger. The many layers of abstraction in Drupal, and the fact that multiple modules might be involved in a single migration, makes the use of debuggers very appealing. With them, you can step through each line of code across multiple files and see how each variable changes over time.

In the next article, we will explain how to configure XDebug to work with PHPStorm and DrupalVM. For now, let’s consider good places to add breakpoints. In this article, Lucas Hedding recommends adding them in:

  • The import method of the MigrateExecutable class.
  • The processRow method of the MigrateExecutable class.
  • The process plugin if you know which one might be causing an issue. The transform method is a good place to set the breakpoint.

The use of a debugger is no guarantee that you will find the solution to your issue. It will depend on many factors, including your familiarity with the system and how deep the problem lies. Previous debugging experience, even if not directly related to migrations, will help a lot. Do not get discouraged if it takes you a long time to discover what is causing the problem, or if you cannot find it at all. Each time, you will gain a better understanding of the system.

Adam Globus-Hoenich, a migrate maintainer, once told me that the Migrate API "is impossible to understand for people that are not migrate maintainers." That was after we had spent about an hour together trying to debug an issue and failing to make it work. I mention this not to discourage you, but to illustrate that no single person knows everything about the Migrate API, and even its maintainers can have a hard time debugging issues. Personally, I have spent countless hours in the debugger tracking how the data flows from the source to the destination entities. It is mind-blowing, and I barely understand what is going on. The community has come together to produce a fantastic piece of software. Anyone who uses the Migrate API is standing on the shoulders of giants.

If it is not broken, break it on purpose

One of the best ways to reduce the time you spend debugging an issue is having experience with a similar problem. A great way to learn is by finding a working example and breaking it on purpose. This will let you get familiar with the requirements and assumptions made by the system and the errors it produces.

Throughout the series, we have created many examples. We have made our best effort to explain how each example works, but we were not able to document every detail in the articles, in part to keep them within a reasonable length, and in part because we do not fully comprehend the system ourselves. In any case, we highly encourage you to take the examples and break them in every imaginable way. Make one change at a time, see how the migration behaves, and note what errors are produced. These are some things to try:

  • Do not leave a space after a colon (:) when setting a configuration option. Example: id:this_is_going_to_be_fun.
  • Change the indentation of plugin definitions.
  • Try to use a plugin provided by a contributed module that is not enabled.
  • Do not set a required plugin configuration option.
  • Leave out a full section like source, process, or destination.
  • Mix the upper and lowercase letters in configuration options, variables, pseudofields, etc.
  • Try to convert a migration managed as code to configuration; and vice versa.
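
For instance, the first item above trips on a YAML parsing subtlety: without a space after the colon, the line is parsed as a single plain scalar instead of a key/value pair, so the key is never defined (a sketch):

```yaml
# Parsed as a mapping with key "id" and value "this_is_going_to_be_fun".
id: this_is_going_to_be_fun

# Parsed as one plain string, "id:this_is_going_to_be_fun"; the "id" key
# is never defined, so the migration definition ends up without an id.
# id:this_is_going_to_be_fun
```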

The migrate:fields-source Drush command

Before wrapping up the discussion on debugging migrations, let’s quickly cover the migrate:fields-source Drush command. It lists all the fields available in the source that can be used later in the process section. Many source plugins require that you manually set the list of fields to fetch from the source. Because of this, the information provided by this command is redundant most of the time. However, it is particularly useful with CSV source migrations. The CSV plugin automatically includes all the columns in the file. Executing this command will let you know which columns are available. For example, running drush migrate:fields-source udm_csv_source_node produces the following output in the terminal:

$ drush migrate:fields-source udm_csv_source_node
 -------------- -------------
  Machine Name   Description
 -------------- -------------
  unique_id      unique_id
  name           name
  photo_file     photo_file
  book_ref       book_ref
 -------------- -------------

The migration is part of the CSV source example. By running the command you can see that the file contains four columns. The values under "Machine Name" are the ones you are going to use for field mappings in the process section. The Drush command has a --format option that lets you change the format of the output. Execute drush migrate:fields-source --help to get a list of valid formats.

What did you learn in today’s blog post? Have you ever used the Migrate Devel module for debugging purposes? What is your strategy when using a debugger like XDebug? Any debugging tips that have been useful to you? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at as well as here on, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 06 2019
Sep 06

Our team has always been engaged and hands-on when it comes to web accessibility and inclusion. Through learning, teaching, auditing, remediating or supporting others doing the same, providing information access to all is at the core of how we run our business. Inclusivity, equality, and global access is still very much a work in progress for humanity. Any step that Hook 42 can take towards improving inclusion and access, you bet we’re going to do it. This year’s Bay Area Drupal Camp (BADCamp) is no exception.

This camp is packed with content covering accessibility topics. Aimee and Lindsey are leading a web accessibility training session on Thursday. On Friday and Saturday, four sessions will touch on accessibility best practices. Thank you to each of the presenters for broadening the audience for one of our favorite subjects. We hear you, and we are so happy to join in on the discussion!

Web Accessibility Sessions at BADCamp

Your Code is Terrible!, presented by Jayson Jaynes, will cover the topic of semantic HTML. In the talk, Jayson will explain its importance, how best to practice it, and what developers gain by understanding it. The best part? Jayson will explore some tools that make creating semantic code in Drupal easier, and how to use those tools to ensure accessibility compliance.

It's a Bird... It's a Plane... It's ¯\_(ツ)_/¯? Using Machine Learning to Meet Accessibility Requirements, presented by Danny Teng and Neha Bhomia, will explore a new world where manual input for alt tags becomes a thing of the past. The pair will explore how machine learning can be leveraged to take care of the tedious task of alt text generation. 

Shine a Light on Me, presented by Paul Sheldrake, will cover a broad overview of a chrome extension, Lighthouse, that checks pages for not only accessibility compliance, but site performance and SEO concerns as well. Paul will cover the basics for why it’s important to run these scans, and how Lighthouse can help make that process a little better for everyone.

SVG Magic!, by Anthony Horn, will talk about the glory of SVGs. There is more to an SVG than putting scalable vectors on the web, and Anthony will walk through exactly what that entails, from animation to accessibility compliance and everything in between.

Web Accessibility Training at BADCamp

Our accessibility training session at BADCamp, Web Accessibility Training, will take place on Thursday from 8:00 am to 5:00 pm. We hope you’re ready for a deep-dive into all things inclusion on the web. 

Aimee and Lindsey will cover as much as they can squeeze into one full-day crash course. The course starts with accessibility laws and principles, then works through design, content, media, code, and testing tools. We cover the topic with an approach that is both broad and deep, so all attendees gain the most exposure to such a big subject.

At the end of the day, our goal is to make sure everyone becomes an advocate for web accessibility! We hope you’ll gain a better understanding of where your organization stands with accessibility, what specific role you’ll play in ensuring your websites stay compliant, and how you can show others where to go in order to apply accessible best practices in their areas of expertise.

It’s going to be a busy camp, and we are thrilled to be part of such a hot topic in the web community. If you want to chat further about accessibility, you can always stop by our booth while you’re there to pick our brains. We’re always in the mood to talk accessibility!

Sep 06 2019
Sep 06

To get started, you will need a Drupal 8 site. If you don’t have one, you can create a free Drupal 8 development site on Pantheon. Once you have your Drupal 8 site installed, make sure you have some Article content. You will also need to enable the JSON:API module.

API Module

That is it for setup on the Drupal site. The next step is to install the Gatsby Source Drupal plugin.

npm install --save gatsby-source-drupal


or, with Yarn:

yarn add gatsby-source-drupal

Next open the gatsby-config.js file so we can configure the plugin. Add the following code to the Gatsby config.

Gatsby Drupal Config

If you re-run your Gatsby development server, you will be able to open the GraphiQL explorer at http://localhost:8000/___graphql and explore the data the Drupal database is providing. In this case, I am filtering by the article content type. You can use the Explorer panel on the left to build the query and the play button at the top to run the query.

GraphQL Explorer
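A query along these lines (a sketch; exact field availability depends on your Drupal configuration) lists the title and path of each Article:

```graphql
{
  allNodeArticle {
    nodes {
      title
      path {
        alias
      }
    }
  }
}
```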

Now that you know your Gatsby site can see your Drupal data, we need to consume this data during the Gatsby build process and create pages. In this case, we want to create a page on our Gatsby site for each Article. We want to make sure we use the path that was configured in Drupal as the path to our Article. Anytime we want to create pages during the Gatsby build process, we can do so in the gatsby-node.js file. Go ahead and open that file up.

Add the following code to that file:

const path = require('path');
exports.createPages = async ({ actions, graphql }) => {
  const { createPage } = actions;
  // Fetch the id and path alias of every Article from the Drupal source.
  const articles = await graphql(`
    {
      allNodeArticle {
        nodes {
          id
          path {
            alias
          }
        }
      }
    }
  `);
  // Create a Gatsby page for each Article at its Drupal path alias. => {
    createPage({
      path: articleData.path.alias,
      component: path.resolve(`src/templates/article.js`),
      context: { ArticleId: },
    });
  });
};
This code does a few things. First it uses GraphQL to pull in the id and path of all the Articles from your Drupal site. It then loops through this list and calls Gatsby’s createPage method which will create a Gatsby page for this Article. We make sure we pass in the correct path and a template (which we still need to create). We also pass in the Article id as the context. You will see why this is important in a few minutes.

Create the src/templates folder if it doesn’t already exist, then create a file called article.js. Add the following code to this new article.js file:

import React from 'react';
import PropTypes from 'prop-types';
import { graphql } from 'gatsby';
import Layout from '../components/layout';
const Article = ({ data }) => {
  const post = data.nodeArticle;
  return (
    <Layout>
      <h1>{post.title}</h1>
      <div dangerouslySetInnerHTML={{ __html: post.body.processed }} />
    </Layout>
  );
};
Article.propTypes = {
  data: PropTypes.object.isRequired,
};
export const query = graphql`
  query($ArticleId: String!) {
    nodeArticle(id: { eq: $ArticleId }) {
      title
      body {
        processed
      }
      field_image {
        alt
      }
      relationships {
        field_image {
          localFile {
            publicURL
          }
        }
      }
    }
  }
`;
export default Article;

This might seem a bit confusing at first. The first section contains the React component called Article. It receives a data prop containing the Article data and outputs the title and the body text inside a Layout component.

The second part contains the propTypes. This just says that our Article component will receive a prop called data that will be an object and it will be required. This is just a way to validate our props.

The third part is where it gets a bit more confusing. As you already know, we ran one query in the gatsby-node.js file to get the data, but here we are also running a page query. When using Drupal as a backend, it’s useful for each template to run its own page query so it can build the page in a self-contained manner. This is especially important when you want to implement live preview (which we will cover in the future). This query takes the id and loads additional Article data, such as the title and the body field. Notice that the relationships field has to be used to pull in the actual image.

Another thing to note here is that this is not the best way to pull in images. It’s recommended to use the Gatsby Image component, but that makes the GraphQL a little more complicated so we will revisit that in more depth in the future.

If you shut down and re-run your Gatsby development server, it should create the article pages at the same path they were created on your Drupal site. There is no listing page (another thing we will fix in the future), but you can manually paste one of the known paths into the address bar to see your Article content on your Gatsby Site!

Gatsby Article Page

Sep 06 2019
Sep 06

I'm excited to share that when Drupal 8.8 drops in December, Drupal's WYSIWYG editor will allow media embedding.

You may wonder: Why is that worth announcing on your blog? It's just one new button in my WYSIWYG editor.

It's a big deal because Drupal's media management has been going through a decade-long transformation. The addition of WYSIWYG integration completes the final milestone. You can read more about it on Wim's blog post.

Drupal 8.8 should ship with complete media management, which is fantastic news for site builders and content authors who have long wanted a simpler way to embed media in Drupal.

Congratulations to the Media Initiative team for this significant achievement!

Sep 06 2019
Sep 06


Today, IT security is paramount to succeeding in business. Enterprises are spending heftier amounts on security than ever before. Progress in both security and hacking technologies, such as intrusion detection systems, honeypots, honeynets, and various other security-related hardware and software solutions, showcases the pressing need for transformation in the information security domain.

One report by Gartner cited that enterprises in India alone are going to spend heavily on information security, up to US$2 billion in 2020.

The increasing awareness of the benefits of risk assessment, and the realization that security is one of the driving forces for digital transformation, are boosting enterprise security globally.

The battle between open-source and proprietary software has raged for a long time. Multiple issues and concerns have been examined and scrutinized by both sides. In the most recent phase of this fanatical dispute, both camps have inspected the issue of security with serious tenacity.

Having said that, let’s take a sneak peek into this blog for further insights on the same.

Myths Are Meant to Be Debunked

The myth goes: proprietary software is more secure than open-source software. This myth stems from many prejudices, but a commercial license doesn’t assure security. Unlike proprietary software, open-source software is transparent about potential vulnerabilities.

#Myth1: Anyone can view the code 

Because it is open source, anyone can view the code. People often argue that being able to view the code allows nefarious hackers to study it and exploit vulnerabilities.

However, this openness enables collaboration. Unlike, say, a proprietary product, which is developed and maintained by a single company, Drupal is developed and maintained by more than one hundred thousand programmers around the world. These programmers might work for companies that compete with each other, or they might volunteer to create something new that is then given away for free.

In fact, in 2015 Google open sourced its artificial intelligence engine, TensorFlow, a core part of its business. It hoped more developers would make the software better as they adapted it to their own needs. And they did: since making it open source, Google boasts that more than 1,300 developers outside Google have worked on TensorFlow, making it one of the standard frameworks for developing AI applications, which could bolster its cloud-hosted AI services.

#Myth2: Proprietary software is secure and not prone to attacks

There have been multiple instances in the past showing that proprietary software has been attacked several times, such as:

Melissa virus and ILoveYou worm - both spread through email attachments. If the victim’s system had the Microsoft Outlook application installed, the virus would send the email to the first 50, or even all, contacts in the Outlook program’s address book. The worm would also overwrite, and consequently destroy, various types of files on the victim’s device, including MP3 files, JPEG files, and more. It led Microsoft to shut down its inbound email system.

WannaCry - a worldwide cyberattack that took place in 2017. It was a ransomware cryptoworm that targeted computers running Windows operating systems, encrypting all the files on their hard drives. It didn’t let users access the files until they paid a ransom in the cryptocurrency Bitcoin.

The WannaCry attack impacted major entities all over the world, such as the National Health Service in Britain and Scotland, the University of Montreal in Canada, State Government websites in India, and Russian Railways.

With that said, it's evident that proprietary software is also vulnerable to attacks!

Although countermeasures like anti-virus programs and security patches were implemented to mitigate the threats and weaknesses, the long-term and especially exorbitant effects of these dangers have been permanently engraved in the memories of people all over the world. These attacks not only damaged vital electronic data but also shut down business operations and services, and facilitated malicious infiltration and theft of money and proprietary information.

History of Open source Software

The term “open-source”, popular since its inception in the late ’70s and early ’80s, came from the “open-source revolution”, which completely revamped the way software is developed, resulting in the birth of the community-generated software development method.


In 1971, Richard Stallman, a young software engineer from Harvard, joined the MIT Artificial Intelligence Lab with the intent of developing computing platforms. After he had served there for several years, in the early 1980s the lab withered as the boom of proprietary software in the market drew its talented developers away to privately held tech companies.

Stallman, who was closely involved in the field and knew customers’ software requirements, believed customers should be empowered to fix and debug the software themselves instead of simply operating it.

“Users should be empowered enough to fix and debug the software themselves-instead of simply operating it”

The majority of software until then was controlled in its entirety by the developer, and individual user rights were completely discarded. This was also a pain point for the MIT AI Lab, since they failed to incorporate this feature into their software development strategies.

The Disembarkation of the Free Software Movement

But this was only until 1984, when Stallman began his GNU Project. Starting with a compiler, GCC, and a new operating system, Stallman felt the GNU Project was the major turning point in the evolution of the free software community.

“The Free Software Foundation was formulated to let users run the software as they wanted”

Stallman believed that software should be freely accessible. Hence, the Free Software Foundation (FSF) was formed so that users could run, modify, update, and disseminate software in the community.

Later on, he also introduced the concept of copyleft, wherein a program is first copyrighted, and then additional distribution terms are added for its further use.

Challenges Associated With Proprietary CMS 

A proprietary CMS comes with a set of restrictions that make it less flexible in comparison to open-source software.

“The contribution and development teams of proprietary cms are smaller, which makes it evident that there is a probability of missing out on mistakes and bugs in the code”

It might appear that closed-source or proprietary software is more secure since the code is not available. But unfortunately, that is not the case! The contribution and development teams of a proprietary CMS are smaller, which means mistakes and bugs in the code are more likely to be missed.

You might not know what issues the proprietary system has had in the past, or is having currently because the provider of the proprietary CMS isn’t going to voluntarily reveal this information. This sets a major drawback for proprietary CMS users in terms of security as well.

Let’s further see the challenges associated with proprietary CMS-

Not many customizations options

Since these proprietary CMSs are developed for a specific kind of industry and audience, it gets difficult to customize the website to fit people's exact needs. Users are not building their own system, so they will naturally have limited flexibility.

Portability is beyond the bounds of possibility

Users don’t have an option to extract data and files out of their system with a proprietary solution. They are quite restricted; they won’t even be able to move their website from one hosting service to another.

“Several CMS vendors don’t upgrade their platforms, so it's better to do a bit of research first and then jump onto doing business with a vendor”

You don’t have any option other than trusting the company blindly

Since the company owns the platform and the storage space your website will be built upon, you’ll have to place a lot of trust in your vendor. They will have to continuously develop and refine their software to better handle their consumers’ needs. The vendor should also be reachable whenever you need assistance with your website.

Several CMS vendors don’t upgrade their platforms, so it's better to do a bit of research first and then jump onto doing business with a vendor.

You are just renting software

Even if you have bought the proprietary CMS, you won’t own the code it’s built with. The code is not yours, and hence you pay a monthly rent to keep your website running.

Benefits of Open-source Software

“People in the open-source community come forward to find solutions, assist each other, and to share extensions that would benefit the masses”

  • It is open-source!

This implies that the source code is available for anyone who wishes to study it, analyze it, and modify it in any way.

Thanks to this, people can easily extend the code and add specific functionality as per their requirements.

  • An open-source CMS is maintained by a large community

As with WordPress, there is always a primary group of developers, but the project is also supported by its user base. People in the open-source community come forward to find solutions, assist each other, and share extensions that benefit the masses.


  • An open-source CMS can be hosted almost anywhere

Most of them, like Drupal, offer one-click installs in the control panel of the accompanying hosting service, which again is very user-friendly and convenient.

  • The CMS software itself is usually free of cost

You can easily make use of plenty of extensions, themes, and a variety of tools for free. However, there are plenty of paid extensions and themes as well. Some solutions can only be leveraged with paid software. An open-source CMS is usually the most budget-friendly solution.

Alternatives to Proprietary Software

It is interesting to see that there are many open-source alternatives to existing proprietary software that are equally or more reliable, secure, and flexible.

If you are contemplating migrating from proprietary software to open-source, you surely can, and with ease!

Software Category       | Proprietary Software          | Equivalent Open-source Software
------------------------|-------------------------------|--------------------------------
Operating System        | Microsoft Windows             | Linux Ubuntu
Web Browser             | Internet Explorer             | Mozilla Firefox
Office automation       | Microsoft Office              | Open Office
                        |                               | Sci Lab
Graphics Tool           | Adobe Photoshop               | GIMP (GNU Image Manipulation Program)
Drafting tool           | Auto CAD                      |
Web Editors             | Adobe Dreamweaver             |
Desktop Publishing      | Adobe Acrobat                 | PDF Creator
Media Player            | Windows Media Player          | VLC Player
Database                | Oracle, Microsoft SQL Server  |
Server Operating System | Microsoft Windows Server      | Red Hat Server, Ubuntu Server
Web Server              |                               |

Open-source Security in Drupal

Drupal, with a proven track record of being the most secure CMS, has been rolling with the punches against critical internet vulnerabilities. Thanks go to the Drupal security team for earnestly finding anomalies, validating them, and responding to security issues.

The responsibilities of the security team include documenting these findings and the alterations made, so that developers aren’t caught off guard when faced with a similar kind of situation.

“The Drupal community comprises over 100,000 contributors working towards its enhancement”

Besides, the team also assists the infrastructure team to keep the infrastructure secure. They ensure that any security issues for code hosted on Drupal are reviewed, reported, and solved in the shortest period possible.

Important features that make Drupal 8 the best WCMS in regards to Security-

  • The Security Working Group (SecWG) ensures that Drupal core and Drupal’s contributed project ecosystem provide a secure platform, and that best practices are followed.
  • The community makes sure that people are notified the day patches are released: every Wednesday for contributed projects, and the third Wednesday of every month for core.
  • Drupal abides by the OWASP (Open Web Application Security Project) standards, and its community is devoted to preventing any security breaches.
  • The Drupal community comprises over 100,000 contributors working towards its enhancement. In this open-source code base, contributed modules are properly reviewed and verified, and marked when they are acceptable for use.
  • Apart from encrypting and hashing passwords, Drupal provides modules that support two-step authentication and SSL certificates.
  • Any member can make changes to Drupal modules and report any issues or bugs that occur in their system.
  • Access controls offered by Drupal are a superb feature. Dedicated accounts can be created for certain user roles with specified permissions. For instance, you can create separate user accounts for Admin and Editor.
  • Its multilayered cache structure helps mitigate Denial of Service (DoS) attacks, making it the best CMS for some of the world’s highest-traffic websites like NASA, the University of Oxford, the Grammys, Pfizer, etc.

The Statistics Say It All

Sucuri, a security platform for websites, curated the “Hacked Website Report 2018”, which evaluated more than 34,000 compromised websites. Among the statistics it shared was a comparison of the affected open-source CMS applications.


The results were clearly in Drupal’s favor, declaring it a better WCMS than other leading platforms at preventing security hazards.

The infections crept into these websites due to improper deployment, configuration, and maintenance.

Additionally, the Cloud Security Report by Alert Logic also marked Drupal as the web content management system with the least number of web application attacks. (Source: Alert Logic)

Difference Between Open-source and Proprietary Software

Cost

Open-source software is free, which makes it an alluring option if you have the in-house capacity to meet your business requirements.

Proprietary software costs anywhere from a couple of thousand dollars to one hundred thousand dollars, depending upon the complexity of the framework needed.

Service and support

Open-source software communities of developers are huge and steadfast, which helps clients with prompt solutions to their problems.

Proprietary software vendors offer ongoing support to clients - a key selling point for clients without technical expertise.

Innovation

Open-source software boosts innovation by giving users the opportunity to modify, extend, or distribute it as per their requirements.

Proprietary software vendors don’t permit users to view or adjust the source code, making it unfit for organizations that desire scalability and flexibility. Only the vendor’s developers can incorporate new features into the product, as and when requested by users.

Security

As open-source code is available to everybody, there is a greater possibility of finding vulnerabilities early. It is also worth noting that open-source communities fix security vulnerabilities twice as quickly as commercial software vendors do.

Proprietary software is considered secure because it is developed in a controlled environment by employees working under consistent direction. However, ruling out the possibility of backdoor Trojans, as well as lowering the threat of other bugs or obstacles, can be troublesome with proprietary software.

Availability

Open-source software is available for free on the web, with 24/7 support from the community.

Proprietary software is accessible only if companies have the rights to the bundle or have purchased it from the respective vendors. A trial version is usually accessible for free to test.

Upgrades

As organizations aim at deriving more business value from less, open-source software can deliver high flexibility, lower IT costs, and increased opportunities for innovation.

With proprietary software, such as Microsoft Windows and Office, companies are required to upgrade both software and hardware on a timely basis, and updates must be installed for proper working. However, not all updates are compatible with all versions of the software.

In The End

Website security has always been a hindrance in the journey of digital transformation and survival due to several potential threats.

Open-source software can be considered a more fitting solution than closed-source or proprietary software. Further, this report indicates that there is an obvious desire among companies to adopt open-source technology and also to prioritize the task of enhancing security in their organization.
Source: Gartner

However, it all depends on the preferences and needs of the organization and the ongoing project for their digital business.

Drupal, an open-source content management framework, comes out as the most secure CMS in comparison to the leading players in the market.

It has been the pacesetter when it comes to choosing a security-focused CMS. More individuals working on and reviewing the code always means a higher chance of a secure product!

Sep 06 2019

Diagnosing a website for accessibility and fixing any issues that are uncovered is not a one-size-fits-all endeavor. 

Every site has a distinct set of strengths and challenges. Plus, site owners vary widely in their level of expertise and the availability of resources that can be devoted to accessibility -- which includes diagnosing the site, making the necessary modifications, and ensuring that the tools and expertise are in place to maintain compliance. 

That’s why flexibility is an essential criterion when seeking ADA Section 508 accessibility solutions.

Another key: a consultative approach. Generally speaking, developers and content editors aren’t hired for their knowledge of WCAG 2.1, and for most organizations, this expertise is not mission critical. Tapping the expertise of a Certified Professional in Accessibility Core Competencies (CPACC) is the most efficient and effective path for bringing a site into compliance.

For organizations that partner with Promet Source to transition their websites into compliance, the process consists of a series of value-added steps in a specific order.  

The following graphic depicts the essential steps involved in an Accessibility Audit, in which Promet reviews all facets of a website’s accessibility relative to WCAG 2.1 and consults with site owners on remediation.

A circular graphic that indicates the six steps in a Promet Source Accessibility Audit. 1. PA11Y Setup 2. PA11Y Remediation 3. Round 1 Manual Audit 4. Round 1 Remediation 5. Round 2 Manual Audit 6. Final Statements

PA11Y Setup

A11Y is an abbreviation for accessibility, with the number 11 representing the number of letters between the first and last letter. PA11Y is an automated testing tool that scans web pages to detect inaccessibility. While automated testing is an essential component of the accessibility audit process, it cannot be counted on to be comprehensive. 

On average, automated testing detects approximately 30 percent of a site’s accessibility errors. The errors detected by automated testing tend to be the “low-hanging fruit” found within global elements across the site, as well as logins, landing pages, and representative page templates.
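For teams that want to try this kind of automated scan on their own, the open-source pa11y-ci runner accepts a JSON configuration listing the pages to test and the standard to test against. This is only a sketch with placeholder URLs, not Promet’s actual setup:

```json
{
  "defaults": {
    "standard": "WCAG2AA"
  },
  "urls": [
    "https://example.com/",
    "https://example.com/search",
    "https://example.com/contact"
  ]
}
```

Saved as .pa11yci in the project root, this file is picked up automatically when pa11y-ci runs.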

PA11Y Remediation

What sets Promet apart following this initial, automated testing phase is a high degree of consultation, along with a list of custom code fixes for bringing the site into compliance. Additionally, for a year following the audit, clients have the advantage of a dashboard that serves as a tool from which pages can be scanned and red-flagged for accessibility errors.

It’s also important to point out that at the outset of the audit process, it might not be clear what remediation will entail. For any number of reasons, clients who initially intended to manage the remediation in house might opt for a different approach once they gain an understanding of the scope of work involved.

Round 1 Manual Audit 

The manual audit does not occur until all of the issues flagged by the PA11Y scan are fixed. This process is facilitated by the customized code fixes that Promet provides, along with a dashboard that provides a roadmap of sorts for tracking progression and red flagging issues that need to be fixed.  

As mentioned above, the PA11Y scan cannot be counted on to detect all of the accessibility errors on a site. Manual testing is required to root out the deeper errors, which are the issues that have a greater tendency to expose site owners to legal liability. Manual testing includes:

  • Keyboard testing,
  • Color contrast testing,
  • Screen reader testing,
  • Link testing,
  • Tables and forms testing,
  • Cognitive testing, and 
  • Mobile testing.

If a site is revealed to be unresponsive, this finding can result in a recommendation to not move forward with remediation. Another potential remediation deal breaker: a mobile site that is not consistent in terms of content and functionality with the desktop site, as a mobile site is required to have the same content as its desktop counterpart.

It’s important to note that a strong accessibility initiative has been built into Drupal 8, and that will continue to be the case for Drupal 9 and subsequent updates. At this point, we have found Drupal to be the best CMS in terms of accessibility.

Round 1 Remediation

Promet is in close consultation with clients during the manual audit, and walks through every component of success criteria before the client moves forward with Round 1 Remediation.

A customized plan is created that varies according to depth and breadth of remediation work required, as well as the in-house expertise, and available resources. Depending on client needs, the plan can incorporate various levels of consultation, and either online or in-person training.

Working closely with both content editors and developers, the training focuses on the required remediation steps, as well as how to write code that’s accessible. Ensuring the accessibility of PDFs is another key area of focus.  

The remediation dashboard serves as an essential tool during and following Round 1 remediation. The dashboard flags errors and issues warnings which then need to be manually reviewed and addressed.

Round 2 Manual Audit

The Round 2 Audit represents the final review, along with ongoing consultation concerning any remediation challenges that have proven to be complex, and best practices for maintaining compliance. The Round 2 Audit won’t begin until all errors reported in the Round 1 Audit have been remediated to 0 errors.

Final Statements

Once all recommended remediation has been completed and verified, final statements are prepared. The final statements provide official language that the audit and remediation are complete. A final Statement of Accessibility and Statement of Work Completed will be provided. Optimally, a complete Statement of Conformance is issued, but in instances where the site links to third-party vendors (which is often the case) and the vendor sites are not accessible, a Statement of Partial Conformance is issued, along with an explanation of the site owner’s good-faith efforts.

It is recommended that instances of inaccessibility be reported to third parties that are linked to the site. Often the result is ongoing remediation work and, ultimately, a comprehensive Statement of Conformance.

Moving Forward

Without exception, Promet clients report a high degree of added value during and following an accessibility audit. The education, consultation, and opportunity to dig deep and deconstruct aspects of a site that no longer serve the organizational mission fuels a better and wiser team of developers and content editors. Plus, the dashboard, which remains in place for a full year, is an essential resource for staying on track.

In the current climate, websites are highly dynamic and serve as the primary point of engagement for customers and constituents. Constantly evolving sites call for an ongoing focus on accessibility, and an acknowledgement that staff turnover can erode the education, expertise, and commitment to accessibility that is in place at the conclusion of an audit. For this reason, a bi-annual or annual audit, which can be viewed essentially as an accessibility refresh, is a highly recommended best practice. Interested in kicking off a conversation about auditing your site for accessibility? Contact us today.

Sep 05 2019

Throughout the series we have shown many examples, and I do not recall any of them working on the first try. When working on Drupal migrations, it is often the case that things do not work right away. Today’s article is the first in a two-part series on debugging Drupal migrations. We start by giving some recommendations of things to do before diving deep into debugging. Then, we talk about migration messages and present the log process plugin. Let’s get started.

Example configuration for log process plugin.

Minimizing the surface for errors

The Migrate API is a very powerful ETL framework that interacts with many systems provided by Drupal core and contributed modules. This adds layers of abstraction that can make the debugging process more complicated compared to other systems. For instance, if something fails with a remote JSON migration, the error might be produced in the Migrate API, the Entity API, the Migrate Plus module, the Migrate Tools module, or even the Guzzle HTTP Client library that fetches the file. For a more concrete example, while working on a recent article, I stumbled upon an issue that involved three modules. The problem was that when trying to roll back a CSV migration from the user interface, an exception would be thrown, making the operation fail. This is related to an issue in the core Migrate API that manifests itself when rollback operations are initiated from the interface provided by Migrate Plus. The issue then triggers a condition in the Migrate Source CSV plugin that fails, and the exception is thrown.

In general, you should aim to minimize the surface for errors. One way to do this is by starting the migration with the minimum possible setup. For example, if you are going to migrate nodes, start by configuring the source plugin, one field (the title), and the destination. When that works, keep migrating one field at a time. If the field has multiple subfields, you can even migrate one subfield at a time. Commit every bit of progress to version control so you can go back to a working state if things go wrong. Read this article for more recommendations on writing migrations.

What to check first?

Debugging is a process that might involve many steps. There are a few things that you should check before diving too deep into trying to find the root of the problem. Let’s begin by making sure that changes to your migrations are properly detected by the system. One common question I see people ask is where to place the migration definition files. Should they go in the migrations or config/install directory of your custom module? The answer depends on whether you want to manage your migrations as code or as configuration. Your choice will determine the workflow to follow for changes in the migration files to take effect. Migrations managed in code go in the migrations directory and require rebuilding caches for changes to take effect. On the other hand, migrations managed in configuration are placed in the config/install directory and require configuration synchronization for changes to take effect. So, make sure to follow the right workflow.

After verifying that your changes are being applied, the next thing to do is verify that the modules that provide your plugins are enabled and that the plugins themselves are properly configured. Look for typos in the configuration options. Always refer to the official documentation to know which options are available and to find their proper spelling. Other places to look are the code for the plugin definition, or articles like the ones in this series documenting how to use them. Things to keep in mind include proper indentation of the configuration options. An extra whitespace or a wrong indentation level can break the migration. You can either get a fatal error or the migration can fail silently without producing the expected results. Something else to be mindful of is the version of the modules you are using, because the configuration options might change per version. For example, the newly released 8.x-3.x branch of Migrate Source CSV changed various configuration options as described in this change record. And the 8.x-5.x branch of Migrate Plus changed some configurations for plugins related to DOM manipulation as described in this change record. Keeping an eye on the issue queue and change records for the different modules you use is always a good idea.

If the problem persists, look for reports of similar problems in the issue queue. Make sure to include closed issues as well, in case your problem has already been fixed or documented. Remember that a problem in one module can affect a different module. Another place to ask questions is the #migrate channel in the Drupal Slack, where the support on offer is fantastic.

Migration messages

If nothing else has worked, it is time to investigate what is going wrong. In case the migration outputs an error or a stacktrace to the terminal, you can use that to search in the code base where the problem might originate. But if there is no output or if the output is not useful, the next thing to do is check the migration messages.

The Migrate API allows plugins to log messages to the database in case an error occurs. Not every plugin leverages this functionality, but it is always worth checking if a plugin in your migration wrote messages that could give you a hint of what went wrong. Some plugins like skip_on_empty and skip_row_if_not_set even expose a configuration option to specify messages to log. To check the migration messages use the following Drush command: drush migrate:messages [migration_id]. If you are managing migrations as configuration, the interface provided by Migrate Plus also exposes them.
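As an illustration, core’s skip_on_empty plugin accepts a message option. A sketch of such a field mapping, where the field and source names are hypothetical:

```yaml
field_image:
  plugin: skip_on_empty
  method: row
  source: src_image
  message: 'Cannot import row: src_image is empty'
```

Each row skipped this way leaves a trail in the messages table that drush migrate:messages can display.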

Messages are logged separately per migration, even if you run multiple migrations at once. This could happen if you execute dependencies or use groups or tags. In those cases, errors might be produced in more than one migration. You will have to look at the messages for each of them individually.

Let’s consider the following example. In the source there is a field called src_decimal_number with values like 3.1415, 2.7182, and 1.4142. It is necessary to separate each number into two components: the integer part (3) and the decimal part (1415). For this, we are going to use the explode process plugin. Errors will be purposely introduced to demonstrate the workflow to check messages and update migrations. The following example shows the process plugin configuration and the output produced by trying to import the migration:

# Source values: 3.1415, 2.7182, and 1.4142
  plugin: explode
  source: src_decimal_number

$ drush mim ud_migrations_debug
[notice] Processed 3 items (0 created, 0 updated, 3 failed, 0 ignored) - done with 'ud_migrations_debug'

In MigrateToolsCommands.php line 811:
ud_migrations_debug Migration - 3 failed.

The error produced in the console does not say much. Let’s see if any messages were logged using: drush migrate:messages ud_migrations_debug. In the previous example, the messages will look like this:

 ------------------- ------- --------------------
  Source IDs Hash    Level   Message
 ------------------- ------- --------------------
  7ad742e...732e755   1       delimiter is empty
  2d3ec2b...5e53703   1       delimiter is empty
  12a042f...1432a5f   1       delimiter is empty

In this case, the migration messages are good enough to let us know what is wrong. The required delimiter configuration option was not set. When an error occurs, usually you need to perform at least three steps:

  • Rollback the migration. This will also clear the messages.
  • Make changes to the definition file and make sure they are applied. This will depend on whether you are managing the migrations as code or configuration.
  • Import the migration again.

Let’s say we performed these steps, but we got an error again. The following snippet shows the updated plugin configuration and the messages that were logged:

  plugin: explode
  source: src_decimal_number
  delimiter: '.'
 ------------------- ------- ------------------------------------
  Source IDs Hash    Level   Message
 ------------------- ------- ------------------------------------
  7ad742e...732e755   1       3.1415000000000002 is not a string
  2d3ec2b...5e53703   1       2.7181999999999999 is not a string
  12a042f...1432a5f   1       1.4141999999999999 is not a string

The new error occurs because the explode operation works on strings, but we are providing numbers. One way to fix this is to update the source to add quotes around the number so it is treated as a string. This is of course not ideal and many times not even possible. A better way to make it work is setting the strict option to false in the plugin configuration. This makes sure the input value is cast to a string before the explode operation is applied. This demonstrates the importance of reading the plugin documentation to know which options are at your disposal. Of course, you can also have a look at the plugin code to see how it works.
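Applied to our example, the corrected plugin configuration would look like this (shown as a bare plugin chain, as in the snippets above; in a real migration it would sit under a field name in the process section):

```yaml
plugin: explode
source: src_decimal_number
delimiter: '.'
strict: false
```

With strict set to false, the numeric input is cast to a string and the explode operation succeeds.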

Note: Sometimes the error produces a non-recoverable condition. The migration can be left in a status of "Importing" or "Reverting". Refer to this article to learn how to fix this condition.

The log process plugin

In the example, adding the extra configuration option will make the import operation finish without errors. But, how can you be sure the expected values are being produced? Not getting an error does not necessarily mean that the migration works as expected. It is possible that the transformations being applied do not yield the values we think or the format that Drupal expects. This is particularly true if you have complex process plugin chains. As a reminder, we want to separate a decimal number from the source like 3.1415 into its components: 3 and 1415.

The log process plugin can be used for checking the outcome of plugin transformations. This plugin offered by the core Migrate API does two things. First, it logs the value it receives to the messages table. Second, the value is returned unchanged so that it can be used in process chains. The following snippets show how to use the log plugin and what is stored in the messages table:

  - plugin: explode
    source: src_decimal_number
    delimiter: '.'
    strict: false
  - plugin: log
 ------------------- ------- --------
  Source IDs Hash    Level   Message
 ------------------- ------- --------
  7ad742e...732e755   1       3
  7ad742e...732e755   1       1415
  2d3ec2b...5e53703   1       2
  2d3ec2b...5e53703   1       7182
  12a042f...1432a5f   1       1
  12a042f...1432a5f   1       4142

Because the explode plugin produces an array, each of the elements is logged individually. And sure enough, in the output you can see the numbers being separated as expected.

The log plugin can be used to verify that source values are being read properly and that process plugin chains produce the expected results. Use it as part of your debugging strategy, but make sure to remove it when done with the verifications. It makes the migration run slower because it has to write to the database. The overhead is not needed once you verify things are working as expected.

In the next article, we are going to cover the Migrate Devel module, the debug process plugin, recommendations for using a proper debugger like XDebug, and the migrate:fields-source Drush command.

What did you learn in today’s blog post? What workflow do you follow to debug a migration issue? Have you ever used the log process plugin for debugging purposes? If so, how did it help to solve the issue? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Next: How to debug Drupal migrations - Part 2

This blog post series, cross-posted at as well as here on, is made possible thanks to these generous sponsors: by Osio Labs has online tutorials about migrations, among other topics, and Agaric provides migration trainings, among other services.  Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 05 2019

tl;dr Are you an event organizer who wants to increase the number of speakers from marginalized and underrepresented groups at your event? We're holding an online training on November 16 so you can learn how to hold your own Speaker Diversity Workshop in your local area! Register here now!

One of the most common questions we get in the Drupal Diversity and Inclusion channel is from event organizers, asking how to increase the number of speakers at their events who are from marginalized and underrepresented groups.

We were delighted to offer an online two-day Speaker Diversity Workshop, on September 21 and 28, to help people from marginalized and underrepresented groups to prepare for submitting talks to events. We had a tremendous response and great attendance!

To help event organizers even further, we are very excited to share that on Saturday, November 16, from 1-4 p.m. ET, we’ll be offering a Train the Trainers online workshop on how to run the Drupal Speaker Diversity Workshop at your local events. This will allow meetups and DrupalCamps to offer the workshop all around the world! Register for this workshop here.

The Drupal Diversity and Inclusion group has partnered with Jill Binder on this effort, as she successfully pioneered this approach within the WordPress community. By both offering direct speaker training workshops herself and teaching communities how to run these workshops, she’s been able to make a huge difference in the WordPress community since she began this outreach in 2018. So far the workshop has been run by twelve WordPress meetup groups in the US, Canada, Brazil, South Africa, and Venezuela.

All of the communities that held this workshop experienced a real change in the speaker roster for their annual conferences; many of their WordCamps went from having 10% women speakers to having 50% or more women speakers in less than a year. In 2017, Seattle had 60% women speakers and in 2018, Vancouver had 63%.

— “Want to See a More Diverse WordPress Contributor Community? So Do We.”, Automattic blog post

This workshop includes the following:

  • A three hour online workshop

  • PDFs with dos and don’ts for running an event that supports people from marginalized and underrepresented groups

We strongly encourage local meetup groups and Drupal Camp organizers to identify one or more people within their local community to attend the Train the Trainers workshop. Having trainers who are themselves people from marginalized and underrepresented groups is a big help in connecting trainers with the people they will be working with in the workshop. Allies can assist the primary leaders of the workshop, so all folks are welcome to attend the Train the Trainers event. 

Marginalized? Underrepresented? Could you clarify that?

Some people have asked or wondered if these workshops are right for them, and what we mean when we say “people from marginalized and underrepresented groups”. On our Statement of Values page, we say the following:

There are many intersecting oppressions in society today. Some of them can make it difficult for people to take part in open source communities. We oppose excluding people due to racism, misogyny, homophobia, transphobia, ableism, Islamophobia, class, and BDSM or kink lifestyles, as a non-exhaustive list. We seek to amplify the voices of those affected by oppressions. We also want to create safer spaces in the Drupal community where individuals can work and grow.

If folks have felt tension around speaking due to an aspect of their identity that is underrepresented or marginalized in the Drupal community, we’d love for you to bring your experiences to learn how to lead a Speaker Diversity Workshop!


A huge thanks to our partner in this workshop, Pantheon: without their matching fund, this workshop would not be possible.

We also want to pass on a special thanks to our corporate sponsors, Lullabot and Kanopi Studios, who helped to kick off this fundraising drive, and who believed in this effort from the very start. And another big thanks to individual sponsors Dries Buytaert and Drew Griffiths.

And finally we want to thank all the other organizations and individuals who stepped up to make this possible. Thank you, thank you, thank you!

Our work is never done! We welcome one-time donations and ongoing sponsorships to help us do the work of diversity and inclusion in the Drupal community: donate today with Open Collective!

Sep 05 2019
Sep 05

We love to say that Drupal 8’s logo resembles the infinity sign, which means infinite opportunities for websites. It includes plenty of ways to make your website user-friendly and engaging. 

One of the techniques used in this area is infinite scrolling, which can be implemented through a nice Drupal 8 module called Views Infinite Scroll. Let’s see what infinite scrolling is and how to design it with the help of this Drupal 8 module. 

A glimpse at the technique: what is infinite scrolling? 

Infinite scrolling means continuous content uploading as the user scrolls down the page. This can optionally be accompanied by the “load more” button at the bottom of the page, which infinitely uploads the content upon click.

Endless pages make the user’s interaction with the website more natural because it is convenient to not have to click on the next page. This technique is especially popular with content-rich websites, social networks, e-commerce stores, etc. It is incredibly useful for long pages and mobile navigation. 

However, infinite scrolling should be used carefully so it does not annoy the user, distract their attention from their main goal, or block the calls-to-action. For example:

  • If the footer “disappears” together with your contacts every time your user scrolls, consider using a sticky footer or move the key links to the sidebar. 
  • Try the “load more” button to give the user more control and never block anything.
  • You can also add more usability by letting the user choose the number of displayed items before hitting “load more.”

All this and more is provided by the Views Infinite Scroll module in Drupal 8, which we will now move on to.

Introduction to the Views Infinite Scroll module in Drupal 8

The Drupal 8 Views Infinite Scroll module works with the Drupal Views. This allows you to present any Drupal data you wish — collections of images, articles, products, lists of user-profiles, or anything else.  

You can:

  • let the content be infinitely uploaded upon the scroll
  • add a “load more” button with any text on it
  • expose some viewing options to users

The Views Infinite Scroll module is available both for Drupal 7 and Drupal 8. However, the Drupal 8 version has a special bonus — it uses the built-in AJAX system of Drupal Views and requires no third-party libraries. In the next chapter, we will look more closely at how it works.

How the Views Infinite Scroll Drupal module works

Installing the module

We start by installing the Views Infinite Scroll Drupal module in any preferred way and enabling it. In our example, we are using its 8.x-1.5 version.

Installing the Views Infinite Scroll module

Creating the Drupal view

Now let’s prepare our Drupal view with a few elements in it. In our case, the grid view will show two columns of car images. 

When creating the page, we choose the “create a page” option. The default “Use a pager” and “10 items to display” settings can remain unchanged so far — we will take care of this in the next step.

Drupal 8 Views

Setting up the Drupal Views infinite scrolling

On the Drupal Views dashboard, we select the Pager section and open the “Use pager — mini” option.

Setting Drupal 8 Views pager to infinite scroll

There, we switch the pager to “Infinite Scroll.”

Setting Drupal 8 Views pager to infinite scroll

Next, we configure the settings for the Infinite Scroll. We set the number of items per page to 4.

And, most importantly, we can check or uncheck the automatic loading of content. If we uncheck it, there will be a “load more” button, for which we can write our custom text. “View more luxury cars” sounds good for this example.

Configuring infinite scroll in Drupal 8 Views

After saving the view and checking the page, we see 4 cars on the page with a nice “View more luxury cars” button. Success!

View more button in Drupal 8 Views
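For reference, once the view is saved, the pager portion of its exported configuration looks roughly like this (a sketch based on the module’s configuration schema; exact keys may vary between versions):

```yaml
# Sketch: pager settings in the exported view config after enabling
# Views Infinite Scroll. Key names may differ between module versions.
display:
  default:
    display_options:
      pager:
        type: infinite_scroll
        options:
          items_per_page: 4
          views_infinite_scroll:
            button_text: 'View more luxury cars'
            automatically_load_content: false
```

Exporting the view with configuration management is a handy way to deploy the same infinite scroll setup to other environments.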

Exposing choices to users

In our Infinite Scroll settings, there is the “Exposed options” section. By checking its options, you will allow users to:

  • choose the number of items displayed
  • see all items
  • specify the number of items skipped from the beginning

Exposed options in Drupal 8 infinite scroll

With these applied, our collection now looks like this.
Infinite scroll in Drupal 8 Views with exposed options

Additional CSS tweaks will make the view look exactly the way you want as far as colors, fonts, distances between elements, etc.

Apply infinite scrolling on your website

Make your customer satisfaction infinite with the infinite scroll technique! If you want to:

  • install and configure the Views Infinite Scroll module
  • customize the output with CSS
  • create a custom Drupal module for your ideas in scrolling
  • design the scrolling effect from scratch using the latest techniques

contact our Drupal team!

Sep 05 2019
Sep 05

The Drupal 8 Field Defaults module is a handy little module that allows you to bulk update the default values of a Drupal field. This is helpful if you have ever added a field to a content type or entity and wished you could have the default value apply to all the existing content on your Drupal site.

Download and install the Field Defaults module just like any other Drupal 8 module.

You can visit the configuration page by going to Admin > Configuration > System > Field Defaults settings. The only setting is the ability to retain the original entity updated time when default values are updated.

Field Default Module Settings

Navigate to a Content Type and go to Manage Fields. Edit one of the fields on the content type. In the screenshot below, I am editing a text field called Example. You will notice under the default value section there is a new fieldset called Update Existing Content. If you needed to change the default value and wanted it to apply to all of your existing content on the site, you would use the checkboxes to update the defaults.

Field Default Module Update Content

That’s it! There really is not a lot to it, but it’s useful when you are adding new fields to existing sites.

Sep 04 2019
Sep 04
Date: 2019-September-04

In June of 2011, the Drupal Security Team issued Public Service Advisory PSA-2011-002 - External libraries and plugins.

8 years later that is still the policy of the Drupal Security team. As Drupal core and modules leverage 3rd party code more and more it seems like an important time to remind site owners that they are responsible for monitoring security of 3rd party libraries. Here is the advice from 2011 which is even more relevant today:

Just like there's a need to diligently follow announcements and update contributed modules downloaded from Drupal.org, there's also a need to follow announcements by vendors of third-party libraries or plugins that are required by such modules.

Drupal's update module has no functionality to alert you to these announcements. The Drupal security team will not release announcements about security issues in external libraries and plugins.

Current PHPUnit/Mailchimp library exploit

Recently we have become aware of a vulnerability that is being actively exploited on some Drupal sites. The vulnerability is in PHPUnit and has been assigned CVE-2017-9841. The exploit targets Drupal sites that currently or previously used the Mailchimp or Mailchimp commerce module and still have a vulnerable version of the file sites/all/libraries/mailchimp/vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php. See below for details on whether a file is vulnerable or not. The vulnerable file might be at other paths on your individual site, but an automated attack exists that is looking for that specific path. This attack can execute PHP on the server.


Follow release announcements by the vendors of the external libraries and plugins you use.

In this specific case, check for the existence of a file named eval-stdin.php and check its contents. If they match the new version in this commit then it is safe. If the file reads from php://input then the codebase is vulnerable. This is not an indication of a site being compromised, just of it being vulnerable. To fix this vulnerability, update your libraries. In particular you should ensure the Mailchimp and Mailchimp Ecommerce modules and their libraries are updated.

If you discover your site has been compromised, we have a guide of how to remediate a compromised site.

Also see the Drupal core project page.

Sep 04 2019
Sep 04

Your website’s users are its dearest treasure. Drupal 8 offers everything to make your users happy and satisfied. They can publish content with ease, quickly find things through the robust search in Drupal 8, use their native language thanks to Drupal 8’s multilingual improvements, and so much more.

But let’s get to the beginning of their journey — user profiles. We will take a tour of building the structure of user profiles in Drupal 8.

First, let’s talk about some basics before moving on to a few interesting tweaks that make profiles richer and more engaging. These will involve new Drupal 8 core modules such as:

  • the Media Library to enrich profiles with multimedia
  • the Layout Builder to shape the profile layout with the handy drag-and-drop feature

And we will also use the not so new but always essential Views module that is part of the Drupal 8 core to help us display the needed data more precisely. Let’s begin!

Building user profiles in Drupal 8

1. Introduction: using fields to build user profiles

Users are fieldable entities in Drupal 8 just like content types. This means you can build user profiles with any fields, and every account will have them.

These can be any fields imaginable — first name, last name, picture, email, link to the website, and so forth. They can be created in Configuration — People — Account settings — Manage fields with the use of the relevant field types.

Managing fields in Drupal 8 profiles

The order in which the fields appear to visitors can be set by drag-and-drop on the “Manage display” tab, where you can also hide or show their labels and choose formatters.

Manage display to reorder fields in Drupal 8

Every field can be made required or optional on the field “Edit” page.

Making fields required or optional in Drupal 8

A special group of field types is “Reference”. It allows you to connect to other entity types. With this, you can allow users to:

  • list other users of your website (e.g. “My mentors”)
  • select options from taxonomy vocabularies (e.g. countries, cities, or spoken languages)
  • list their favorite content from your website

and much more.

One interesting entity reference use case comes next with the Media Library.

adding referenced entity field in Drupal 8

2. Making profiles richer with the Media Library

You can allow users to embed media of various types from Drupal 8’s Media Library into their profiles. This includes images, videos, audio, files, and remote videos from YouTube or Vimeo. For example, they can list their featured photos, favorite music videos, and so on.

The Media Library appeared as an experimental module in Drupal 8.6 for media handling. The new Media Library interface in Drupal 8.7 impressed even the experts with its stylish design and handy features.

Media Library in Drupal 8

For media embedding, the Media and the Media Library core modules need to be enabled. Then it’s necessary to set the Reference field type to “Media,” specify the allowed number of values and select the media type.

Media field settings in Drupal 8

Adding Media field in Drupal 8

With the settings shown in the screenshot, the user account will have a field for up to 5 featured photos that users can embed into their profiles directly from the Media Library.
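Under the hood, such a field is stored as configuration. A minimal sketch of the field storage definition, with a hypothetical field name, could look like this:

```yaml
# Sketch: storage config for a hypothetical user media field
# (field.storage.user.field_featured_photos.yml). The field name
# is invented for illustration; Drupal generates this for you.
field_name: field_featured_photos
entity_type: user
type: entity_reference
settings:
  target_type: media
cardinality: 5
```

This illustrates that the field is simply an entity reference targeting media entities, with a cardinality of 5.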

Adding media from Media Library in Drupal 8

It’s great to know that in Drupal 8.8, a media embed button will be added to the CKEditor toolbar.

3. Displaying the needed data in profiles via Views

More opportunities are open thanks to adding collections of entities, or Drupal Views. Remember, for example, we mentioned the referenced entity field open to other users such as the “My mentors” field?

referenced entity user field in Drupal 8

However, this just listed the mentor’s usernames on the user profile. What if we want the mentors’ pictures to be shown?

Views comes to the rescue! We can arrange mentors’ photos as Views and attach it as a block or page to the user profile.

We need to:

  • go to Structure — Views and create a new view block of the “user” type that will use fields
  • add the field for the user picture and, in the field settings, relate it to our “My Mentors” field
  • add a relationship in the “Advanced” section of Views to the “My Mentors” field
  • create a contextual filter in the “Advanced” section of Views that will display only the mentors of the particular user

Views in Drupal 8 that shows each user's mentors

Contextual filter by user ID in Drupal 8

With this done, we now have a block that shows the pictures of mentors on each user page. However, this block is not yet added anywhere to the site.

4. Shaping user profiles with the Layout Builder

It’s time to finally unite it all together. The Views block can be attached to the profile using the traditional Drupal block layout as a simple option. However, the new Layout Builder that appeared in Drupal 8.5 offers an amazing drag-and-drop interface for this purpose!

The Layout Builder is used with all fieldable entities, including user profiles. In addition to enabling the module, we need to enable the Layout Builder on the Manage Display tab of the particular entity type (in this case — user account settings).

Enable Layout Builder for entity type
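Enabling Layout Builder on the display simply adds a third-party setting to the display’s configuration entity. The relevant fragment looks roughly like this:

```yaml
# Sketch: what enabling Layout Builder adds to the user display config
# (core.entity_view_display.user.user.default.yml).
third_party_settings:
  layout_builder:
    enabled: true
    allow_custom: false
```

The `allow_custom` flag corresponds to the checkbox that lets individual entities override the default layout.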

The Manage layout button takes us to the drag-and-drop interface where we can add sections with a different number of columns, set their width proportions, and add blocks to them. Blocks include Drupal fields, Views blocks, forms, menus, and much more.

adding blocks in Layout Builder Drupal 8

Every Drupal block is configured on the right sidebar with all settings traditionally available in “Manage display.”

Configuring field in Drupal 8's Layout Builder

We are creating a three-column section and adding profile fields as blocks, including the “My featured photos” field and the “My mentors” Views block. We will save the result and see how our profile looks.

User profile created in Drupal 8's Layout Builder

Of course, it still needs a good touch of HTML and CSS. However, we have only touched the tip of the iceberg of what core Drupal 8 modules can do for building user profiles. The opportunities are endless!

Building user profiles in Drupal 8 with our team

Let your user profiles look exactly as you wish, with no limits to your imagination. Entrust building user profiles to our Drupal team who will use core, contributed, or custom modules created specifically for your case. Contact us!

Sep 04 2019
Sep 04

In recent posts we have explored the Migrate Plus and Migrate Tools modules. They extend the Migrate API to provide migrations defined as configuration entities, groups to share configuration among migrations, a user interface to execute migrations, among other things. Yet another benefit of using Migrate Plus is the option to leverage the many process plugins it provides. Today, we are going to learn about two of them: `entity_lookup` and `entity_generate`. We are going to compare them with the `migration_lookup` plugin, show how to configure them, and explain their compromises and limitations. Let’s get started.

What is the difference among the migration_lookup, entity_lookup, entity_generate plugins?

In the article about migration dependencies we covered the `migration_lookup` plugin provided by the core Migrate API. It lets you maintain relationships among entities that are being imported. For example, if you are migrating a node that has associated users, taxonomy terms, images, paragraphs, etc. This plugin has a very important restriction: the related entities must come from another migration. But what can you do if you need to reference entities that already exist in the system? You might already have users in Drupal that you want to assign as node authors. In that case, the `migration_lookup` plugin cannot be used, but `entity_lookup` can do the job.

The `entity_lookup` plugin is provided by the Migrate Plus module. You can use it to query any entity in the system and get its unique identifier. This is often used to populate entity reference fields, but it can be used to set any field or property in the destination. For example, you can query existing users and assign the `uid` node property which indicates who created the node. If no entity is found, the module returns a `NULL` value which you can use in combination with other plugins to provide a fallback behavior. The advantage of this plugin is that it does not require another migration. You can query any entity in the entire system.

The `entity_generate` plugin, also provided by the Migrate Plus module, is an extension of `entity_lookup`. If no entity is found, this plugin will automatically create one. For example, you might have a list of taxonomy terms to associate with a node. If some of the terms do not exist, you would like to create and relate them to the node.

Note: The `migration_lookup` offers a feature called stubbing that neither `entity_lookup` nor `entity_generate` provides. It allows you to create a placeholder entity that will be updated later in the migration process. For example, in a hierarchical taxonomy terms migration, it is possible that a term is migrated before its parent. In that case, a stub for the parent will be created and later updated with the real data.
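For comparison, a typical `migration_lookup` configuration points at another migration instead of querying arbitrary entities. The migration and source names below are hypothetical:

```yaml
# Sketch: relating to entities created by another migration
# (migration and source field names are hypothetical).
field_parent_term:
  plugin: migration_lookup
  migration: udm_parent_terms
  source: src_parent_id
```

Here the plugin maps the source identifier to the destination ID recorded in the `udm_parent_terms` migration’s map table.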

Getting the example code

You can get the full code example at The module to enable is `UD Config entity_lookup and entity_generate examples` whose machine name is `ud_migrations_config_entity_lookup_entity_generate`. It comes with one JSON migration: `udm_config_entity_lookup_entity_generate_node`. Read this article for details on migrating from JSON files. The following snippet shows a sample of the file:

  {
    "data": {
      "udm_nodes": [
        {
          "unique_id": 1,
          "thoughtful_title": "Amazing recipe",
          "creative_author": "udm_user",
          "fruit_list": "Apple, Pear, Banana"
        }
      ]
    }
  }

Additionally, the example module creates three users upon installation: 'udm_user', 'udm_usuario', and 'udm_utilisateur'. They are deleted automatically when the module is uninstalled. They will be used to assign the node authors. The example will create nodes of type "Article" from the standard installation profile. You can execute the migration from the interface provided by Migrate Tools at `/admin/structure/migrate/manage/default/migrations`.

Using the entity_lookup to assign the node author

Let’s start by assigning the node author. The following snippet shows how to configure the `entity_lookup` plugin to assign the node author:

  uid:
    - plugin: entity_lookup
      entity_type: user
      value_key: name
      source: src_creative_author
    - plugin: default_value
      default_value: 1

The `uid` node property is used to assign the node author. It expects an integer value representing a user ID (`uid`). The source data contains usernames so we need to query the database to get the corresponding user IDs. The users that will be referenced were not imported using the Migrate API. They were already in the system. Therefore, `migration_lookup` cannot be used, but `entity_lookup` can.

The plugin is configured using three keys. `entity_type` is set to the machine name of the entity to query: `user` in this case. `value_key` is the name of the entity property to lookup. In Drupal, the usernames are stored in a property called `name`. Finally, `source` specifies which field from the source contains the lookup value for the `name` entity property. For example, the first record has a `src_creative_author` value of `udm_user`. So, this plugin will instruct Drupal to search among all the users in the system for one whose `name` (username) is `udm_user`. If a value is found, the plugin will return the user ID. Because the `uid` node property expects a user ID, the return value of this plugin can be used directly to assign its value.

What happens if the plugin does not find an entity matching the conditions? It returns a `NULL` value. Then it is up to you to decide what to do. If you let the `NULL` value pass through, Drupal will take some default behavior. In the case of the `uid` property, if the received value is not valid, the node creation will be attributed to the anonymous user (uid: 0). Alternatively, you can detect if `NULL` is returned and take some action. In the example, the second record specifies the "udm_not_found" user which does not exist. To accommodate this, a process pipeline is defined to manually specify a user if `entity_lookup` did not find one. The `default_value` plugin is used to return `1` in that case. The number represents a user ID, not a username. Particularly, this is the user ID of the "super user" created when Drupal was first installed. If you need to assign a different user, but the user ID is unknown, you can create a pseudofield and use the `entity_lookup` plugin again to find its user ID. Then, use that pseudofield as the default value.

Important: User entities do not have bundles. Do not set the `bundle_key` nor `bundle` configuration options of the `entity_lookup`. Otherwise, you will get the following error: "The entity_lookup plugin found no bundle but destination entity requires one." Files do not have bundles either. For entities that have bundles like nodes and taxonomy terms, those options need to be set in the `entity_lookup` plugin.

Using the entity_generate to assign and create taxonomy terms

Now, let’s migrate a comma separated list of taxonomy terms. An example value is `Apple, Pear, Banana`.  The following snippet shows how to configure the `entity_generate` plugin to look up taxonomy terms and create them on the fly if they do not exist:

  field_tags:
    - plugin: skip_on_empty
      source: src_fruit_list
      method: process
      message: 'No src_fruit_list listed.'
    - plugin: explode
      delimiter: ','
    - plugin: callback
      callable: trim
    - plugin: entity_generate
      entity_type: taxonomy_term
      value_key: name
      bundle_key: vid
      bundle: tags

The terms will be assigned to the `field_tags` field using a process pipeline of four plugins:

  • `skip_on_empty` will skip the processing of this field if the record does not have a `src_fruit_list` column.
  • `explode` will break the comma-separated string into individual elements.
  • `callback` will use the `trim` PHP function to remove any whitespace from the start or end of the taxonomy term name.
  • `entity_generate` takes care of finding the taxonomy terms in the system and creating the ones that do not exist.

For a detailed explanation of the `skip_on_empty` and `explode` plugins see this article. For the `callback` plugin see this article. Let’s focus on the `entity_generate` plugin for now. The `field_tags` field expects an array of taxonomy term IDs (`tid`). The source data contains term names so we need to query the database to get the corresponding term IDs. The taxonomy terms that will be referenced were not imported using the Migrate API. And they might not exist in the system yet. If that is the case, they should be created on the fly. Therefore, `migration_lookup` cannot be used, but `entity_generate` can.

The plugin is configured using five keys. `entity_type` is set to the machine name of the entity to query: `taxonomy_term` in this case. `value_key` is the name of the entity property to lookup. In Drupal, the taxonomy term names are stored in a property called `name`. Usually, you would include a `source` that specifies which field from the source contains the lookup value for the `name` entity property. In this case it is not necessary to define this configuration option. The lookup value will be passed from the previous plugin in the process pipeline. In this case, the trimmed version of the taxonomy term name.

If, and only if, the entity type has bundles, you also must define two more configuration options: `bundle_key` and `bundle`. Similar to `value_key` and `source`, these extra options will become another condition in the query looking for the entities. `bundle_key` is the name of the entity property that stores which bundle the entity belongs to. `bundle` contains the value of the bundle used to restrict the search. The terminology is a bit confusing, but it boils down to the following. It is possible that the same value exists in multiple bundles of the same entity. So, you must pick one bundle where the lookup operation will be performed. In the case of the taxonomy term entity, the bundles are the vocabularies. Which vocabulary a term belongs to is associated in the `vid` entity property. In the example, that is `tags`. Let’s consider an example term of "Apple". So, this plugin will instruct Drupal to search for a taxonomy term whose `name` (term name) is "Apple" that belongs to the "tags" `vid` (vocabulary).

What happens if the plugin does not find an entity matching the conditions? It will create one on the fly! It will use the value from the source configuration or from the process pipeline. This value will be used to assign the `value_key` entity property for the newly created entity. The entity will be created in the proper bundle as specified by the `bundle_key` and `bundle` configuration options. In the example, the terms will be created in the `tags` vocabulary. It is important to note that values are trimmed to remove whitespace at the start and end of the name. Otherwise, if your source contains spaces after the commas that separate elements, you might end up with terms that seem duplicated like "Apple" and " Apple".

More configuration options

Both `entity_lookup` and `entity_generate` share the previous configuration options. Additionally, the following options are available:

  • `ignore_case` contains a boolean value to indicate whether the lookup query should ignore case. It defaults to true.
  • `access_check` contains a boolean value to indicate whether the system should check that the user has access to the entity. It defaults to true.
  • `values` and `default_values` apply only to the `entity_generate` plugin. You can use them to set fields that could exist in the destination entity. An example configuration is included in the code for the plugin.
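As a sketch of the `entity_generate`-only options, the following hypothetical configuration would populate a description on any taxonomy term the plugin creates (the `src_tag` and `src_tag_description` source columns are invented for illustration):

```yaml
# Sketch: setting extra fields on terms created by entity_generate.
# The src_tag and src_tag_description source columns are hypothetical.
field_tags:
  plugin: entity_generate
  entity_type: taxonomy_term
  value_key: name
  bundle_key: vid
  bundle: tags
  source: src_tag
  values:
    description: src_tag_description
```

The keys under `values` are fields on the generated entity, and the values name the source properties to copy into them.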

One interesting fact about these plugins is that none of the configuration options is required. The `source` can be skipped if the value comes from the process pipeline. The rest of the configuration options can be inferred by code introspection. This has some restrictions and assumptions. For example, if you are migrating nodes, the code introspection requires the `type` node property defined in the process section. If you do not set one because you define a `default_bundle` in the destination section, an error will be produced. Similarly, for entity reference fields it is assumed they point to one bundle only. Otherwise, the system cannot guess which bundle to lookup and an error will be produced. Therefore, always set the `entity_type` and `value_key` configurations. And for entity types that have bundles, `bundle_key` and `bundle` must be set as well.

Note: There are various open issues contemplating changes to the configuration options. See this issue and the related ones to keep up to date with any future change.

Compromises and limitations

The `entity_lookup` and `entity_generate` plugins violate some ETL principles. For example, they query the destination system from the process section. And in the case of `entity_generate` it even creates entities from the process section. Ideally, each phase of the ETL process is self contained. That being said, there are valid use cases for these plugins, and they can save you time when their functionality is needed.

An important limitation of the `entity_generate` plugin is that it is not able to clean after itself. That is, if you rollback the migration that calls this plugin, any created entity will remain in the system. This would leave data that is potentially invalid or otherwise never used in Drupal. Those values could leak into the user interface like in autocomplete fields. Ideally, rolling back a migration should delete any data that was created with it.

The recommended way to maintain relationships among entities in a migration project is to have multiple migrations. Then, you use the `migration_lookup` plugin to relate them. Throughout the series, several examples have been presented. For example, this article shows how to do taxonomy term migrations.

What did you learn in today’s blog post? Did you know how to configure these plugins for entities that do not have bundles? Did you know that reverting a migration does not delete entities created by the `entity_generate` plugin? Did you know you can assign fields in the generated entity? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at as well as here on, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 04 2019
Sep 04


Dominique De Cooman

Dropsolid was conceived at DrupalCon, and now we’re a Diamond Sponsor! We’ll be in Amsterdam to show off the Dropsolid platform and our vision for Drupal. We’d love your feedback. And we are donating 15 minutes of core contributor time for everyone who completes our survey at our booth.

We hope to see you there! Contact us, sign up for our newsletter, or stop by our booth!

Stop by Our Booth, Help Make Drupal Shine

We will donate 15 minutes of core contribution time for each person who fills out a short survey at our booth (Stand 13, by the catering area) at DrupalCon—one per person.

We didn’t want to be Diamond sponsors just for the sake of it. Drupal and DrupalCon got us here, made us what we are today. We want to make a difference. We asked ourselves what kind of a booth-giveaway would make a lasting impact on Drupal? A t-shirt of just the right shade of blue? ;-) We decided to invest in Drupal, paying a core contributor for their work.

Sponsors are a DrupalCon’s Best Friend

DrupalCon sparked the formation of Dropsolid, and we are very proud to be able to be Diamond Sponsors in Amsterdam this year. I wanted to take a moment to reflect on what DrupalCon has meant for us.

In 2012, after five years as a developer, I attended my very first DrupalCon in Munich. I saw Dries speak, attended so many sessions, met so many community members. There was so much incredible, positive energy; I was overwhelmed.

At that DrupalCon, I met some extraordinary people who helped persuade me that founding a Drupal company was a great idea. The experience convinced me to invest everything I ever owned into a company with Drupal at its core. I felt it was now or never. And so Steven Pepermans and I founded Dropsolid.

Now, seven years later, we are one of the Diamond sponsors at DrupalCon Amsterdam. It’s hard to believe Dropsolid can do this; sponsoring DrupalCon is a dream come true and already a huge achievement for us. The DrupalCon experience helped create our company, and now we get to give back to it ourselves.

We are grateful to be here and want to make a difference.

The Dropsolid Vision for Better Customer Experience with Drupal

At the conference, we want to share a vision for possibilities with Drupal. We see Drupal as the linchpin of an integrated digital experience platform that enables teams to deliver great digital customer experiences more effectively and at a lower cost. Our vision starts with the best practices of working with Drupal, hosting, deployment and development tools, and digital marketing capabilities. It’s what we offer customers today.

Out in the market, these “digital experience platforms” make connecting all the parts together easier. It means you can avoid getting nickeled-and-dimed on individual services and dealing with quirks in integrations. This is all possible right now with Drupal, when you have the skills and knowledge to put everything together. It’s what we do for our clients every day. We build flexible integrated platforms, and we provide training and consultation along the way.

In building these solutions with Drupal, we discovered best practices, components that can be recycled and reused, and real advantages and economies of scale. We’ll be talking about that in our sessions: Nick’s talk on improving on-site search with machine learning, Wouter and Brent’s talk on avoiding Drupal SEO pitfalls, and Mattias’s insights from working on Launchpad, our local development tool. These are very practical and direct ways to get more out of your investment in Drupal.

But we have a bigger vision. Next we’re working on our integrated service so you can get these capabilities with one offering in Drupal. If you want to know more about this vision, and how to get there today, come along to my talk about Open Digital Experiences to Increase Customer Lifetime Value.
You can also stop by our booth to see demos of our Dropsolid hosting platform, see how to use Dropsolid Personalization, and see Rocketship in action.

Facets of the Digital Experience - Dropsolid Sessions at DrupalCon

Where to meet the Dropsolid team: in addition to visiting our booth (and making us pay a core contributor!) at Stand 13, we’ll be showing many facets of what goes into digital experiences, from investing in digital customer experiences to search engine optimization tips for Drupal, and we’ll be on a panel about local development tools, too.

Demo: A future vision of Drupal as a Digital Experience Platform

  • What: In our live demo, we’ll show you the power of our Platform, Launchpad, Rocketship, Search & Machine Learning, and Personalization tools working together to break down silos and create engaging customer experiences with Drupal. 
  • When: Wed, 30 Oct, 12:40 -13:10
  • Where: Sponsor Stage

Stop Buying Drupal Websites, Buy Open Digital Experiences to Increase Customer Lifetime Value

  • Who: Dominique De Cooman, Founder and CEO Dropsolid 
  • What: My talk is a distillation of what we learned about the difference between Drupal being “just a CMS” for “just building a website,” and how Drupal can be a truly comprehensive Digital Experience Manager.
  • When: Tue, 29 Oct, 11:55 to 12:15
  • Where: Room G 107

The Battle of the Local Development Tools [Panel Discussion]

  • Who: Mattias Michaux, Drupal Developer and DevOps Engineer, joins a panel chaired by Michael Schmid from Amazee
  • What: DrupalCon website: “In this session, creators and users of different local development tools will provide their story of why they made the choices they made.”
  • When: Wed, 30 Oct, 11:30 - 12:10
  • Where: Room G 102

Machine Learning: Creating More Relevant Search Results With “Learn To Rank”

  • Who: Nick Veenhof, CTO Dropsolid, and Mattias Michaux
  • What: A summary of what machine learning is, but more importantly, how you can use it for a pervasive problem: the relevance of your internal site search.
  • When: Wed, 30 Oct, 16:15 to 16:55
  • Where: Auditorium

Drupal SEO Pitfalls and How To Avoid Them

  • Who: Wouter De Bruycker, SEO Specialist and Brent Gees, Drupal Architect
  • What: Drupal can be your perfect technical SEO platform, but to get the most out of it, you have to make sure it’s set up as it should be for the search engines. We will go into the details of how to detect common and rare SEO issues (on real Drupal sites!) and explain their impact on SEO.
  • When: Wed, 30 Oct, 16:40 to 17:00
  • Where: Room G 102

See you there! Be in touch!

We hope to see you there! Sign up for our newsletter or stop by our booth at Stand 13 and help us contribute!

And remember, we will donate 15 minutes of core contribution time for each person who fills out a short survey at our booth (Stand 13, by the catering area) at DrupalCon—one per person.

DrupalJam 2019

Sep 04 2019
Sep 04

Drupal 8 makes it easier and easier to create rich, interesting, and beautiful content pages. Among the new features of the Drupal 8.7 release, we saw the stable Layout Builder and the new Media Library user interface. 

Another great piece of news is coming now! The Media Library in Drupal 8 has an embed button added to the CKEditor panel, and media embedding without a mouse is possible. This Media Library and CKEditor integration is now in the dev branch and will be officially available with the Drupal 8.8 stable release in December 2019. 

Consider scheduling your Drupal website update to 8.8 with our team. Meanwhile, let’s learn more about the new features.

The Media Library in Drupal 8 and rich content creation

Thanks to the Media Library and Media modules being part of the Drupal core, media handling in Drupal 8 is very convenient. It’s possible to add various types of media, store them in the Library, and reuse the content whenever you need it. 

You can display the items in a grid or table view, select and insert them, sort and filter them by various criteria, bulk upload, and so on. With the new Library user interface introduced in Drupal 8.7, everything looks and works especially well. Here are our screenshots from this version.

Media Library in Drupal 8

Media Library in Drupal 8: adding or selecting images

The default Drupal 8 media types are:

  • Audio
  • File
  • Image
  • Remote video (with links from YouTube, Vimeo, etc.)
  • Video

Using items from Media Library in content

Content editors appreciate the ability to select items from the Library and insert them directly into the content. To achieve this, it is necessary to add a Media field of the relevant type to a content type (or another fieldable entity, such as a user account). 

Media Library in Drupal 8: oEmbed videos

Media Library in Drupal 8: adding or selecting videos

Great news: media button in the CKEditor panel

To make media selection and embedding experiences even smoother, the embed button has now been added to the CKEditor panel in the Drupal 8.8.x-dev release. This Media Library and WYSIWYG integration was announced in a tweet by “The Drop is Always Moving.”

Media Library and CKEditor integration tweet

As we see, the Media Library button has an icon that looks attractive and clearly shows its purpose to users. 

The Media subsystem maintainer Phenaproxima shared nice screenshots and wrote that the icon design has been agreed upon, usability tests have passed, and the button is well-tested. Congrats and thanks to this team of amazing experts for their work!

The work is successfully committed to the Drupal 8.8.x dev branch, waiting for the official release on December 4, 2019.

Media Library button added to Drupal 8.8 CKEditor

Users can click on the button, see the Media Library, select media, and click “Insert Selected.”

Media Library button added to Drupal 8.8 CKeditor

The button can be enabled or disabled by drag-and-dropping, which is a great capability of the CKEditor in Drupal 8.

CKEditor panel now with Media Library button

Breaking news: final patch for Media Library and WYSIWYG integration

As we were preparing this article for publication, more awesome news arrived: the final feature patch for the Media Library. It allows for media embedding in the WYSIWYG editor with no mouse needed. Wim Leers, one of the gurus who make such things happen, posted a video on his blog.

Media embedding without mouse in Drupal 8.8 CKEditor

Enjoy the Media library’s new features!

Start producing richer content in a snap of a finger — use the Media Library in Drupal 8. Our Drupal support and development team can assist you at every step of the way. For example, we can:

  • update you to Drupal 8.7 so you can use the new Media Library user interface
  • upgrade your website to Drupal 8 if you are still on Drupal 7
  • adjust your website’s settings for easy media handling workflows
  • advise you and set up other attractive ways to display content in Drupal 8
  • and, of course, update you to the upcoming Drupal 8.8 as soon as it arrives in December

Follow our news about Drupal support services and always feel free to contact us!

Sep 04 2019
Sep 04

Well, summer’s officially come to a close. We at Agiledrop were a little bummed about it and decided to try to do something to prolong it, even if just for a bit. To that end, here’s a recap of our favorite Drupal blog posts from the sunny August - we hope you enjoy it!

Agaric’s series of posts on Drupal migrations

We’re kicking off August’s list of top Drupal blog posts not with a single blog post, but rather with a series of posts concerning different aspects of Drupal migrations written by Mauricio Dinarte of Agaric.

As stated, this extensive series covers it all, from the basics of the migration process in Drupal, to the more advanced things such as migrating different entities, migrating from different types of source files (CSV, JSON and XML) and managing migrations as configuration entities. 

Since this series spans an entire month, we’re not going to link individual blog posts; see Agaric’s blog for a specific chapter on Drupal migrations. Or you can dive right into the series as a whole, starting with part 1!

Read part 1

Contribution and Client Projects: Part Two

Next on our list is a blog post by Amazee Labs’ Christophe Jossart, which serves as a helpful guide for developers who are just starting out with their Drupal contributions and don’t yet know their way around.

The post neatly recaps the documentation for new contributors and educates the reader on working on the issue queue on, explaining both creating a patch and contributing a full project.

Christophe then also provides some basic information about next year’s Drupal 9 release and a list of helpful developer tools. He concludes by suggesting other ways to contribute beyond just code, thus supplying any kind of newcomer with the know-how to get actively involved in the community. 

Read more

How to Choose a Digital Experience Platform in 2019

This next post doesn’t address Drupal specifically, but rather the CMS as a system; namely, the latest stage in the evolution of the content management system, which is already ushering in a new contender: the Digital Experience Platform, or DXP.

In this post, Justin Emond of Third and Grove first explains the concept of DXP and what characterizes a successful DXP in practice. He then gives an overview of the top four DXP providers, backed up by Gartner’s Magic Quadrant for Digital Experience Platforms and for Web Content Management, respectively. 

Comparing time to market, ease of extension, cost, and commerce capabilities, Justin’s overview is unbiased and thus a great resource for determining the right DXP for a certain organization. 

Read more

Use Taxonomy Terms as Webform Options in Drupal 8

Moving on, we have a blog post accompanied by a video by Ivan Zugec of WebWash. As with all his video tutorials, this one too strikes the perfect balance between informative and accessible, and is thus suitable for both beginners and more experienced Drupalists. 

The aim of this tutorial is to enable content editors to manage Webform options without having to tinker with the configuration. As Ivan points out, the best way is to do this with the Taxonomy system. 

The post and video take you through creating a taxonomy vocabulary and then creating a “Term select” element. For more control over the reference of entities, Ivan suggests using “Entity select” instead of “Term select”.

Read more

Contributing to Open Source, what's your math?

Just like the first blog post on this list, this one again focuses on open source contribution. In it, Baddý Sonja Breidert breaks down 1xINTERNET’s contribution to Drupal in 2018, both in terms of numbers and the reasons why they (and why everyone benefiting from open source should!) invest so much in contribution. 

Baddý’s post divides contribution into three main areas: community work, sponsorships and memberships, and source code contribution. Adding up their contributions in all three areas, she calculates that as much as 7.5% of 1xINTERNET’s annual budget goes into giving back to Drupal. She finishes with a call to action inviting other organizations to do and share their open source contribution math. 

Read more

Component-based theming with Layout Builder

Our previous few recaps of top Drupal blog posts have each included a post about the Layout Builder - rightly and logically so, since this powerful new feature has recently become stable in Drupal. And, after all, as content editors, our marketing team is always interested in new functionalities and technologies that have the potential to facilitate our work.

Continuing with this trend, the next post we wanted to highlight this time is Aleksi Peebles’ Component-based theming with Layout Builder, whose straightforward title already specifies what it is about. The post describes the steps needed to display a simple Code paragraph using the Layout Builder; in case of redundant HTML markup, Aleksi’s own Full Reset module can deal with it.

Read more

Drupal Tome + Docksal + Netlify

In the next post on this month’s list, Aaron Crosman describes his proof-of-concept implementation of the Drupal Tome distribution with Docksal and Netlify. Since this was in the context of SCDUG’s competition for the cheapest possible Drupal 8 hosting, Netlify was his platform of choice as he only had to pay for domain registration.

Besides describing the setup, Aaron also tackled getting Drupal’s Umami demo install profile onto the newly set-up site. While reliable, this process is a slow one (Sam Mortenson has pointed out that the reason for this is that Umami installs a lot of content which Tome then reinstalls). There are still some problems with the whole approach, such as lack of support for forms, but it works well enough for this purpose.

Read more

Module Mayhem in the Drupal Kitchen

The last blog post to make it on our list for this month was written by Cheeky Monkey Media’s Kodie Beckley and plays upon the metaphor of Drupal as a professional kitchen. Specifically, Kodie compares the abundance of Drupal modules to having too many cooks all working in one kitchen.

The main problems with having too many modules are increased load time, compatibility issues and the fact that a lot of available modules are (or will be) outdated or no longer supported. 

Kodie suggests sticking to modules that are supported and still actively developed, and keeping only those that are truly necessary. For a more thorough insight into the workings of your site, he recommends performing a site audit.

Read more

This concludes our selection of the top Drupal-related blog posts from August. We’ll be doing a similar recap early next month, so, don’t worry about missing some interesting Drupal content - we’ll have you covered!

Sep 04 2019
Sep 04

I'm excited to announce that Acquia has acquired Cohesion, the creator of DX8, a software-as-a-service (SaaS) visual Drupal website builder made for marketers and designers. With Cohesion DX8, users can create and design Drupal websites without having to write PHP, HTML or CSS, or know how a Drupal theme works. Instead, you can create designs, layouts and pages using a drag-and-drop user interface.

Amazon founder and CEO Jeff Bezos is often asked to predict what the future will be like in 10 years. One time, he famously answered that predictions are the wrong way to go about business strategy. Bezos said that the secret to business success is to focus on the things that will not change. By focusing on those things that won't change, you know that all the time, effort and money you invest today is still going to be paying you dividends 10 years from now. For Amazon's e-commerce business, he knows that in the next decade people will still want faster shipping and lower shipping costs.

As I wrote in a recent blog post, no-code and low-code website building solutions have had an increasing impact on the web since the early 1990s. While no-code and low-code has been a trend for 25 years, I believe we're only at the beginning. There is no doubt in my mind that 10 years from today, we'll still be working on making website building faster and easier.

Acquia's acquisition of Cohesion is a direct response to this trend, empowering marketers, content authors and designers to build Drupal websites faster and cheaper than ever. This is big news for Drupal as it will lower the cost of ownership and accelerate the pace of website development. For example, if you are still on Drupal 7, and are looking to migrate to Drupal 8, I'd take a close look at Cohesion DX8. It could accelerate your Drupal 8 migration and reduce its cost.

Here is a quick look at some of my favorite features:

An animated GIF showing how to edit styles with Cohesion.
An easy-to-use “style builder” enables designers to create templates from within the browser. The image illustrates how easy it is to modify styles, in this case a button design.

An animated GIF showing how to edit a page with Cohesion.
In-context editing makes it really easy to modify content on the page and even change the layout from one column to two columns and see the results immediately.

I'm personally excited to work with the Cohesion team on unlocking the power of Drupal for more organizations worldwide. I'll share more about Cohesion DX8's progress in the coming months. In the meantime, welcome to the team, Cohesion!

September 04, 2019

1 min read time

Sep 03 2019
Sep 03

2007 is the year of my first DrupalCon, and the year the #1 most wanted end-user feature was Better media handling. 2019 is the year that Drupal will finally have it. Doing things right takes time!

Back then I never would’ve believed I would some day play a small role in making it happen :)

Without further ado, and without using a mouse:

The text editor assisted in producing this HTML:

<p>Let's talk about llamas!</p>

<drupal-media alt="A beautiful llama!" data-align="center" data-entity-type="media" data-entity-uuid="84911dc4-c086-4781-afc3-eb49b7380ff5"></drupal-media>

<p>(I like llamas, okay?)</p>

If you’re wondering why something seemingly so simple could have taken such a long time, read on for a little bit of Drupal history! (By no means a complete history.)

2007 and Drupal five

Twelve years ago, in Dries’ State of Drupal talk, Better media handling was deemed super important. I attended a session about it — this is the (verbatim) session description:

  • Drupal’s core features for file management and media handling
  • common problems and requirements (restrictions, performance issues, multi-lingual content, dependencies between nodes and files)
  • first approaches: own node types for managing, improved filemananger.module (example: Bloomstreet,European Resistance Archive, Director’s Cut Commercials)
  • next step: generic media management module with pluggable media types, mutli server infrastructure, different protocols, file systems, file encoding/transcoding

It’s surprisingly relevant today.

By the way, you can still look at the session’s slides or even watch it!

2007–2013 (?)

The era of the venerable Media module, from which many lessons were learned, but which never quite reached the required level of usability for inclusion in Drupal core.

2013 (?) – 2019

The Media initiative started around 2013 (I think?), with the Media entity module as the first area of focus. After years of monumental work by many hundreds of Drupal contributors (yes, really!), only one missing puzzle piece was left: WYSIWYG embedding of media. The first thing I worked on after joining Acquia was shipping a WYSIWYG editor with Drupal 8.0, so I was asked to take this on.

To help you understand the massive scale of the Media Initiative: this last puzzle piece represents only the last few percent of work!

Drupal has always focused on content modeling and structured content. WYSIWYG embedding of media should not result in blobs of HTML being embedded. So we’re using domain-specific markup (<drupal-media>) to continue to respect structured content principles. The result is document transclusion combined with an assistive “WYSIWYG” editing UX — which we wished for in 2013.

A little less than two months ago, we added the MediaEmbed text filter to Drupal 8.8 (the domain-specific markup), then gave those embeds previews using CKEditor Widgets for assistive “WYSIWYG” editing, followed by Media Library integration and per-embed metadata overriding (for example, overriding alt, as shown in the screencast).

I was responsible for coming up with an architecture that addressed all needs, but it’s phenaproxima, oknate and rainbreaw who got this actually committed to Drupal core!

Complete media management shipped in increments

Fortunately, for many (most?) Drupal 8 sites, this will not require significant rework, only gradual change. Drupal 8.8 will ship with complete media management, but it’ll be the fifth Drupal core release in a little over two years that adds layers of functionality in order to arrive at that complete solution:

  • Drupal 8.4 added foundational Media API support, but still required contributed modules for it to be usable
  • Drupal 8.5 made Media usable out-of-the-box
  • Drupal 8.6 added oEmbed support (enabling YouTube videos for example) and added an experimental Media Library
  • Drupal 8.7 made the Media Library feature-complete: bulk uploads, massively improved UX
  • Drupal 8.8 will contain the key thing that was blocking Media Library from being marked stable (non-experimental): WYSIWYG integration

Today is the perfect moment to start looking into adopting it shortly after Drupal 8.8 ships in December!

Sep 03 2019
Sep 03

Redfin Solutions started using React Native in early June when a client needed an app that could integrate with their Drupal website. Since it was our first project with React Native, we recorded useful information to share with the rest of the team. This is the first in a series of three blog posts that will cover what we learned and what we found the most useful while using React Native.

Expo & React Native CLI

There are two main framework options for building a React Native app: React Native CLI and Expo.

React Native CLI is the option most people from a native app development background choose because it trades a streamlined workflow for the ability to add native modules written in Java or Objective-C.

Expo is easier for people from a web development background because it provides a streamlined workflow to those who don’t need to link native modules to their app. It comes with integrated libraries, a client app for development, and it doesn’t require the use of Android Studio or XCode to build the project for Android and iOS separately. With a signing key, Expo handles the building process. This speeds up the development process and frees up time to spend on new features for the app.

Components

The React Native design philosophy separates each screen into a hierarchy of components. At the lowest level are simple components like <Text>. Larger components are constructed out of other smaller components. They can also be designed as a specific case of another component. For example, if you are repeatedly creating <Text>Hello World!</Text> components, it may be simpler to create a <HelloWorld> component as a more specific version of <Text>.
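Such a wrapper component can be a few lines. As a minimal sketch (the component name and file layout are our own, not from the original project):

```jsx
import React from 'react';
import { Text } from 'react-native';

// A more specific version of <Text> that always renders the same greeting.
export default function HelloWorld() {
  return <Text>Hello World!</Text>;
}
```

Anywhere you would have written <Text>Hello World!</Text>, you can now write <HelloWorld />.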

Props & State

Every component has two stores of data that contain information about itself: props and state. Props are the parameters of a component, primarily set when the component is created. For example, the URL of an image component is passed in when it’s created and stored in props. 

Conversely, state is used to store data that changes. When state changes, the component is re-rendered to show the change. For example, the current value of a volume slider might be stored in state.
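Combining the two ideas, here is a sketch of a component that receives its starting volume as a prop and keeps the changing value in state (the component and prop names are our own, for illustration):

```jsx
import React from 'react';
import { Button, Text, View } from 'react-native';

export default class VolumeControl extends React.Component {
  constructor(props) {
    super(props);
    // Props hold the initial value passed in; state holds the changing value.
    this.state = { volume: props.initialVolume };
  }

  render() {
    return (
      <View>
        <Text>Volume: {this.state.volume}</Text>
        {/* Calling setState re-renders the component with the new value. */}
        <Button
          title="Louder"
          onPress={() => this.setState({ volume: this.state.volume + 1 })}
        />
      </View>
    );
  }
}
```

It would be used as <VolumeControl initialVolume={5} />.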

JSX

JSX is a syntax extension for JavaScript that comes with React Native. It is a simple way to express how React Native components should be rendered into elements on the screen. JSX is intuitive because it functions within React Native the same way HTML functions in a webpage. Take a look at this JSX for putting text and an image on a screen:

<View>
  <Text> Hello World! </Text>
  <Image source={require('../assets/images/hello.png')} />
</View>

Lifecycle API

Every component follows a lifecycle API. This is a set of methods that React Native calls during certain events in a component’s life. The only required method, besides the constructor of a component, is the render() method, which expresses how to render the component on the screen by returning React elements that are usually defined by JSX. 
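As a sketch, a class component using two of the most common lifecycle methods alongside the required render() (the component is our own example, not from the original project):

```jsx
import React from 'react';
import { Text } from 'react-native';

export default class Clock extends React.Component {
  state = { now: new Date() };

  // Called once, right after the component is first rendered on screen.
  componentDidMount() {
    this.timer = setInterval(() => this.setState({ now: new Date() }), 1000);
  }

  // Called just before the component is removed; clean up resources here.
  componentWillUnmount() {
    clearInterval(this.timer);
  }

  // The only required method: returns the elements to render.
  render() {
    return <Text>{this.state.now.toLocaleTimeString()}</Text>;
  }
}
```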

Styling

Styling in React Native is similar to CSS. Every component can be styled via its style prop, which takes a set of CSS-style rules defined in a JavaScript object, usually created with StyleSheet. The styles even support Flexbox. For example: 

const styles = StyleSheet.create({
  header: {
    fontWeight: 'bold',
    fontSize: 30,
  },
});

And when you want to apply it to a component:

<Text style={styles.header}> Hello World! </Text>

To see all the style options available, check out the documentation for each component. 

React Navigation

It is easier to start an app design with navigation, planning out what each screen will contain and how to navigate between them. This top-down approach prevents context switching between screens while writing. 

React Navigation, one of Expo’s integrated libraries, provides tools for creating a navigation system within React Native. Choose the ‘tabs’ option when initializing the project, and Expo will build a simple navigation system.

A StackNavigator is a good way to control screens because screens lower in the stack stay alive when you swap between different screens. Each screen will retain its information.
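As a sketch, wiring two screens into a stack with the react-navigation v3-era API current at the time of writing (the screen names and paths are our own; in later versions createStackNavigator moved into the separate react-navigation-stack package):

```jsx
import { createAppContainer, createStackNavigator } from 'react-navigation';

import HomeScreen from './screens/HomeScreen';
import DetailScreen from './screens/DetailScreen';

// Each entry becomes a screen on the stack; navigating pushes a new screen
// on top while the previous screen keeps its state underneath.
const AppNavigator = createStackNavigator(
  {
    Home: HomeScreen,
    Detail: DetailScreen,
  },
  { initialRouteName: 'Home' }
);

export default createAppContainer(AppNavigator);
```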

To learn more about technical details check out the React-Navigation Docs.

Building Screens

To create a simple, static screen, you don’t need many moving parts because React Native provides robust components for these already. For example, components like <ScrollView>, <Image>, <Text>, and <Linking> can do most of the lifting on a page that only has to display information and images. A simple screen might look like this:

import React from 'react';
import { Image, ScrollView, StyleSheet, Text } from 'react-native';

export default function HamsterScreen() {
  return (
    <ScrollView style={styles.container}>
      <Text> Your mother is a hamster! </Text>
      <Image source={require('../assets/images/hamster.png')} />
    </ScrollView>
  );
}

HamsterScreen.navigationOptions = {
  title: 'Hamster',
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    paddingTop: 15,
    backgroundColor: '#fff',
  },
});

This is just the beginning of learning how to use React Native. Keep an eye out for our upcoming blog post about using React Native with Drupal. In the meantime, watch Designing an App with Drupal and React Native, a Design 4 Drupal session presented by our summer intern developer, Jacob Morin.

Sep 03 2019
Sep 03

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week.

You'll find that the meetings, while also providing updates of completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide assistance on, we encourage you to get involved.

Out of the Box Initiative Meeting

August 27, 2019

Mark Conroy (markconroy) and Ofer Shaal (shaal) talked about:

Drupal Getting Involved Guide Refresh Meeting

August 27, 2019

This meeting:

  • Usually happens on alternate Tuesdays.
  • Is text only!
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public agenda issue anyone can add to: Meeting - Getting Involved Guide - 27 August 2019
  • Meeting transcripts are still a work in progress, but please comment on the meeting agenda issue so we can grant you credit for attending/contributing!

During the meeting, the following topics were discussed:

Personas

We have consensus on personas! The issue Consensus on contribution personas tracks all of the progress. We need to store those personas as they are somewhere we can easily reference them.

Writing Style Toolset

The styles are already agreed upon in the issue Evaluate, pick, and configure checking/linting tools for the Getting Involved Guide. The toolset is making great progress so far.

Proposed outline 

The outline needs a review - how do we make it attractive and welcoming to new potential contributors? Find documentation and progress updates in the issue Proposed Outline for the Getting involved Guide.

Community Section

Admin UI Meeting 

August 28, 2019

  • Meetings are for core and contributed project developers as well as people who have integrations and services related to core. 
  • Usually happens every other Wednesday at 2:30pm UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • There are roughly 5-10 minutes between topics for those who are multitasking to follow along.

Beta & Stable Blockers Revision

  • Field cardinality can be unpostponed once the table drag work is finished.
  • The cards issue is blocked by action links, so that could use some help.
  • Image and file fields are blocked by design work.
  • The status report page is blocked on design, unless we go with the existing one for beta.
  • The Media designs probably won’t be ready in time, so Media should probably be moved to stable.
  • We need some help with accessibility revisions.

Layout Redesign: Max-width Limit

We opened this issue for the layout redesign after facing some problems with the card style updates on wide screens.

Customizable Settings

Looking to determine what we would like to have as customizable options. The discussion is started in the issue Provide form for customizable settings.

Sep 03 2019
Sep 03

Often corporate design is focused on logos, fonts, colors and content, but predefined interaction components such as buttons, cards, accordions, sliders, shopping carts, etc., are missing.

In many cases such organizations lack the methodologies and tooling to create a complete corporate design, make it available to the whole organization, and maintain it. By maintaining, I mean extending, updating, or changing the design, as well as rolling those changes out.

Design systems to the rescue

A design system is a methodology for building a corporate design and breaking it down into components. Each of these components is named and has a clear definition of how it is to be used.

Typically a design system comprises two elements: a design language and an online system that makes the components available, including their styles and template source code.

When creating a design system, all stakeholders must work together and agree on the outcome. Stakeholders usually include designers, marketers, frontend developers, managers, and everyone else who creates or works with the corporate design.

All stakeholders agree on the naming of the components. This is referred to as creating a design language. They also agree on how the components are supposed to be used across all properties. This includes digital properties such as websites, social media profiles, etc., as well as offline properties such as brochures, advertorials, merchandising articles, etc.

Let’s look at an example: on a website you typically have overview pages that link to landing pages with the actual content. Think of a listing of products or services where the user clicks on the most suitable item. For these items you would typically use an ‘Image Text Card’. Such a card could be assembled from a ‘Squared Image’, ‘Regular Text’, and a ‘Standard Button’.

Example: ‘Image Text Card’

As can be seen above, the design language that all stakeholders have agreed on contains:

  • ‘Image Text Card’,     
  • ‘Squared Image’,     
  • ‘Regular Text’, and     
  • ‘Standard Button’.

This methodology would be used for all aspects of the corporate design.

It is good practice to start with the smallest possible components such as colors, texts, headings, and images. Each must be named and have a clear definition of how it is used. Colors could be named ‘Primary Color’, ‘Secondary Color’, ‘Danger Color’, etc. Fonts could be named ‘Regular Text’, ‘Summary Text’, ‘Cited Text’, ‘Eyebrow Text’, ‘Link Text’, etc. Headings would typically be organized as ‘1st Level Heading’, ‘2nd Level Heading’, ‘3rd Level Heading’, etc. Images would be organized as ‘Squared Image’, ‘Hero Image’, ‘Banner Image’, ‘Portrait Image’, etc. Starting with the smallest possible components is also referred to as Atomic Design, and my colleague Jule wrote about this here.

Headlines, colors, image formats

Components like those in the examples above are then used when the layout for a new page type is created, say for a news article. Such an article typically consists of a ‘1st Level Heading’, a ‘Hero Image’, an ‘Eyebrow Text’, a ‘Summary Text’, and ‘Regular Text’ combined with ‘Cited Text’ and ‘Link Text’.

Example: ‘News Article’

When working with a design system methodology, an important aspect is having a concise design language. If all your components have sensible names, it is easy for the stakeholders of the design system to communicate.

You could for example ask the designer to change the ‘Regular Button’ on the ‘Text Image Card’ to an ‘Icon Button’.

Example: ‘Regular Button’ and ‘Icon Button’ on ‘Image Text Card’

Making design systems available

A design system is typically made available through a web-based application that allows easy navigation of all components, simulating different use cases.

Such software also provides templates that other applications can use to render the components of the design system.

There are many popular, publicly documented design systems that can serve as references.

At 1xINTERNET we use Pattern Lab for creating online design systems. We primarily develop with Drupal and React and have built our own set of tooling for creating decoupled frontends. As a starting point we used the well-established Particle theme.

The design patterns are made available as templates that can be integrated into other software. The design system typically supplies the templates in the templating language needed by each of the different systems.

Working like this allows us to completely separate frontend development from the development of the actual websites. Once the frontend is ready and all components are made available as templates, they are used by all websites.

With this approach the highest possible reusability of the frontend is achieved, and after the initial creation cost, the development time of new websites can be significantly reduced.

A nice demo is provided by Pattern Lab.

How do you handle different designs?

Often large organizations have different designs. Sometimes they have different brands, or the same brand is used differently for varying audiences. Such audiences could be students, adults, or pensioners, who are best reached with different communication.

Different designs could be different logos, fonts, colors, etc. But they could also include completely different interaction components like fancy sliders, or traditional accordions.

Depending on the use case, such designs could be integrated into the standard design system. The number of available sliders could be increased, and it could be agreed that certain sliders may only be used for certain use cases.

Alternatively, a variant of the design system could be created. Here fonts and colors could be changed, the majority of components could be reused from the main design system, some components could be excluded, and other components could be added.

Example: Alternative design for headlines, colors, and image formats

Example: Application of alternative design to different ‘Image Text Cards’

Applying different designs in web projects works the same way as with a single design system.

Testing component based designs

An important aspect of component-based designs is that they can easily be tested.

All components of the system are developed for different screen resolutions, work on touch and non-touch screens, and are optimized for accessibility and for the best possible user experience.

In such a system all components can be tested individually with a variety of testing tools. We always use visual regression testing during quality assurance to analyze which components are affected by changes.

Screenshot: Visual regression testing of different patterns

How to share design systems across implementation teams

Given that you have created a design system and made it available, the question arises of how users actually work with it.

Imagine a scenario in which you have a standardized CMS technology such as Drupal in your organization. Frontends in Drupal are created with so-called Twig templates; Twig is a flexible templating engine for PHP.

These templates can simply be included in your CMS. Technically speaking, the CMS includes the Twig templates from the design system and parametrizes them with the content generated by the CMS.

Code snippet: Include another Twig template in Twig
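
The snippet itself is only referenced above as an image caption. As a hedged sketch (the template namespace, path, and variable names are illustrative assumptions, not taken from the original), a Twig include that parametrizes a design-system component might look like this:

```twig
{# Illustrative only: include the 'Image Text Card' template from the
   design system and parametrize it with content generated by the CMS. #}
{% include '@design-system/components/image-text-card.twig' with {
  image: content.image,
  text: content.summary,
  button_label: 'Read more'
} only %}
```

The `only` keyword keeps the included template isolated from the caller's context, so the component renders from exactly the parameters it is given.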

Staying with the news article example above, the CMS would call the template for ‘1st Level Heading’ to display the headline, the template for ‘Hero Image’ to display the image, and so on.

A good way to integrate your design system into a website project is to make it available as a regular source code dependency. That way, when all dependencies of the website are updated, the design system is updated too, and the newer version is automatically included in the next release of your website. This is especially useful for automatically rolling out extensions, updates, or changes of your design system to all websites.

Let’s look at an example. Say you discover that the ‘Cited Text’ component does not comply with WCAG 2.1 standards for web accessibility. The design team is asked to provide an updated visual design; the frontend developers then update the component with new styles and supply a new template. Once the new design system is tested, a new version is created. As soon as the websites using ‘Cited Text’ are updated, the fixed component is rolled out.

When to use Pattern Lab

We primarily use Drupal and React for building websites. The technology we use for creating design systems is Pattern Lab (see above).

For both we provide templates: for Drupal we provide Twig templates, and for React we provide JSX templates for the different React components.

Pattern Lab can be extended to ship other types of templates as well (Angular, Vue, etc.). Whether Pattern Lab should be used to build a design system depends on the technology used in the website projects.

For building standalone JS based applications Storybook is a great tool for developing UI components. 

Sep 03 2019
Sep 03

I have spent my two previous blog posts exploring how requesting an appointment online begins a patient's digital journey and how top US hospitals approach online appointment request forms. Now, I would like to offer some recommendations and strategies for building an exemplary appointment request form, followed by an explanation of how these recommendations are applied to the Webform module's "Request a Medical Appointment" form template.


Creating an exemplary appointment request form is an iterative process that requires experimentation and testing with analytics to determine which solutions work and which ones don't.

There are many different levels of statistics that can be captured from a form.

The most immediately available statistic is the form's completion rate, which indicates how many people successfully filled out and submitted the form. Comparing the number of completions with the number of visitors to the form provides a general sense of the form's drop-off rate. What is missing from these statistics is more nuanced information about the drop-off that can give a better understanding of the form's user experience.
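
As a minimal sketch of the arithmetic involved (the function name and sample numbers are illustrative, not from any particular analytics product):

```javascript
// Compute completion and drop-off statistics for a form.
// In practice the visit and completion counts come from your analytics tool.
function formFunnelStats(visits, completions) {
  const completionRate = completions / visits;
  return {
    completionRate,                   // share of visitors who submitted the form
    dropOffRate: 1 - completionRate,  // share of visitors who left without submitting
    dropOffs: visits - completions,   // absolute number of abandoned visits
  };
}

const stats = formFunnelStats(1000, 250);
console.log(stats.completionRate); // 0.25
console.log(stats.dropOffs);       // 750
```

Numbers like these only describe *how many* users drop off; event tracking, discussed below, is needed to learn *where* and *why*.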


Tracking a form's events shows which inputs were filled in and in what sequence they were entered. This detailed information helps to determine how a user is interacting with a form. Furthermore, knowing where users drop off from completing a form can help indicate which questions may need to be removed or reworked.

Analytics provides the digital team with insights that lead to experimentation and testing.

A good iterative process should always include testing, which confirms that changes and improvements are successful.


Forms built using an iterative process typically have some current version where inputs or labeling could be improved. The best way to confirm that an improvement or experiment is successful is to perform an A/B test, where the original and the improved variant are randomly presented to end-users and each variant's success rate is tracked and then compared.
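
The comparison step can be sketched as follows (variant names and counts are illustrative; a real analysis should also check statistical significance before declaring a winner):

```javascript
// Compare two form variants in an A/B test by conversion rate.
// This sketch only compares raw rates; production analysis should
// additionally apply a significance test (e.g. a two-proportion test).
function compareVariants(a, b) {
  const rateA = a.conversions / a.visits;
  const rateB = b.conversions / b.visits;
  return {
    rateA,
    rateB,
    winner: rateA === rateB ? 'tie' : (rateA > rateB ? 'A' : 'B'),
  };
}

const result = compareVariants(
  { visits: 500, conversions: 120 },  // original form
  { visits: 500, conversions: 150 }   // improved variant
);
console.log(result.winner); // B
```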


At some point, the best way to get direct and helpful feedback about an appointment request form is to ask users their opinion. User testing an appointment request form is tricky because you risk inconveniencing potential patients. Large healthcare institutions should consider doing professional usability testing with real patients who volunteer and are compensated for their participation. Seeing and hearing a user's frustrations with an appointment request form inspires a digital team to care more about the patient's experience.

Personally, my experience watching user testing made me realize that the most important thing that needs to happen on an appointment request form is that a patient is able to book an appointment, even if it requires them to make a phone call. Always include a phone number on an appointment request form.

On a related note, witnessing and understanding how users with disabilities navigate a website inspires everyone to care about accessibility.


There are a lot of resources about form design and usability that provide general form recommendations. The goal here is to highlight common suggestions and recommendations explicitly applicable to appointment request forms.



Make it easy for users to find the appointment request form. Appointment request forms should be accessible within three clicks from a hospital's homepage.


Create landing pages that route users to the correct appointment request form. Landing pages can route domestic and international patients to the proper form and direct referring physicians to doctor portals.



Set expectations by indicating how long and what type of response is expected. For example, will patients receive a confirmation email with a callback within 24 hours?


Group related inputs by type of information. Common groupings for appointment request forms include contact, patient, diagnosis, and insurance information.


Explain why specific healthcare information is needed. Tell users why providing insurance, referring physicians, and diagnoses will help when booking an appointment.


Only collect the relevant information required to complete the immediate task of booking an appointment. Scrutinize each input and question, and ask whether the information is truly needed.


Make sure the layout of inputs is easy to understand and complete. Top-aligned labels are the recommended approach for mobile and desktop forms.


Use conditional logic to ask simple yes/no questions that then reveal more specific follow-up questions. For example, asking who is filling out the form should be one of the first questions on all appointment request forms.


Clearly indicate which inputs are required or optional. If most of a form's inputs are required, it may be more appropriate to note which inputs are optional.

Required elements should come before optional elements. Placing the required inputs first makes it easier for patients to skip optional inputs.

Make sure users with disabilities can complete the form. Forms should be fully accessible to screen readers and keyboard navigation.

The Webform module's templates are intended to provide a starting point. Site builders and architects can customize the suggested elements and information for their specific business requirements.

Request a Medical Appointment Template

Please note: The new 'Request a Medical Appointment' template is only available to new installations of the Webform module using the latest release of Webform 8.x-5.4+. Otherwise, you can manually install webform.webform.template_medical_appointment.yml via Drupal's Configuration Management UI (/admin/config/development/configuration/single/import) or API.


Accessibility should not be an afterthought. Appointment request forms need to be easy to understand and complete for all users.


For mission-critical forms, it is best to avoid any unnecessary JavaScript enhancements or behaviors that can cause accessibility issues. This is why the appointment request template avoids using Select2 to enhance select menus. There is a fascinating related discussion happening about improving the usability and accessibility of long select lists. Ben Mullins (bnjmnm) is doing a fantastic accessibility audit of several of the available select menu enhancement libraries.


Forms should show an explanation of required (*) fields, so I decided to display the 'Indicates required field' message at the top of the form.


The appointment request form is configured to display a native browser warning message when a user navigates away from a form with unsaved changes. Because the warning is generated from the web browser and not custom JavaScript, screen readers and other devices for people with disabilities should adequately handle this behavior. For example, Gmail also uses this functionality when a user is about to lose unsaved changes.
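
Under the hood, this native warning is requested through the browser's `beforeunload` event. The following is a hedged sketch of how such a handler is generally wired, not Webform's actual implementation (the function and flag names are illustrative):

```javascript
// Ask the browser to warn before leaving a page with unsaved form changes.
// The browser, not this script, renders the actual dialog, which is why
// the behavior is handled well by screen readers and other devices.
let formIsDirty = false;

function markDirty() { formIsDirty = true; }
function markSaved() { formIsDirty = false; }

function handleBeforeUnload(event) {
  if (!formIsDirty) return undefined;   // no changes: allow silent navigation
  event.preventDefault();               // standard way to request the dialog
  event.returnValue = '';               // required by some older browsers
  return '';                            // legacy fallback
}

// In a browser you would register the handler like this:
// window.addEventListener('beforeunload', handleBeforeUnload);
// document.querySelector('form').addEventListener('input', markDirty);
```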


The goal of the appointment request is to collect patient and caregiver information. Therefore, it is key to ask for this information directly and make it clear what information is required to complete the form.

For example, the caregiver and patient contact information inputs ask for the same type of information. This makes it easy for an end-user reviewing the form to know that they need the caregiver's or patient's first name, last name, phone, and email.


This appointment request form tries to keep the labeling as simple as possible. Because the form targets patients and caregivers, it is challenging to create generic labeling that addresses both audiences. In the past, I have written custom JavaScript which conditionally changes form element labels and descriptions based on the audience. For example, a caregiver would see 'Patient First Name' and a patient would see 'Your First Name.' This approach becomes unmanageable. There are no form builders on the market that support conditional form element labels and descriptions. Conditional form element labels and descriptions would be a tremendous and challenging feature for the Webform module to explore.
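
To illustrate why such scripts become unmanageable, here is a hedged sketch of the kind of label-swapping code described; the field names and label text are hypothetical, not the author's actual code:

```javascript
// Swap element labels based on the selected audience.
// Every new audience or field adds entries that must be maintained by
// hand, which is why this approach quickly becomes unmanageable.
const labelsByAudience = {
  caregiver: { first_name: 'Patient First Name', phone: 'Patient Phone' },
  patient:   { first_name: 'Your First Name',    phone: 'Your Phone' },
};

function applyAudienceLabels(audience, setLabel) {
  // setLabel(fieldName, text) abstracts the actual DOM update.
  const labels = labelsByAudience[audience] || {};
  for (const [field, text] of Object.entries(labels)) {
    setLabel(field, text);
  }
}
```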


Required elements always come first. Besides indicating which elements are required, all the medical information is denoted as (optional) because a care specialist can also collect this information during the appointment scheduling phone calls.


For phone numbers, I nudge the user to enter the phone number using the standard US format (XXX) XXX-XXXX. Still, we don't need to require users to enter this format. Once again, I avoid using a JavaScript enhancement, like a phone number input mask. Input masks create an accessibility issue for screen readers because the (__)-___-____ is read aloud as "opening parenthesis, underscore, underscore, underscore, closing parenthesis, dash, etc."
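
A gentler alternative to an input mask is to normalize whatever the user typed after entry. The sketch below is illustrative only and is not the template's actual behavior:

```javascript
// Normalize a loosely entered US phone number to (XXX) XXX-XXXX.
// Returns the input unchanged when it doesn't contain exactly 10 digits,
// so users are never blocked by formatting.
function formatUsPhone(raw) {
  const digits = raw.replace(/\D/g, '');
  if (digits.length !== 10) return raw;
  return `(${digits.slice(0, 3)}) ${digits.slice(3, 6)}-${digits.slice(6)}`;
}

console.log(formatUsPhone('555.867.5309'));  // (555) 867-5309
console.log(formatUsPhone('not a number'));  // not a number
```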


The appointment request template displays a straightforward confirmation page and deliberately does not send a confirmation email to the end-user. Sending insecure emails to a patient can be considered a Health Insurance Portability and Accountability Act (HIPAA) violation. Generally, it is recommended to use secure messaging to communicate with the patient or caregiver.


Drupal and the Webform module provide a responsive user experience out of the box. Radios can be challenging to use on a small screen, so I decided to enhance the radios so that they visually appear and behave like buttons.

In the Webform module, there is a dedicated buttons element that uses jQuery UI Buttons. I opted to use a recent enhancement to radios which makes it possible to style radios as buttons without using jQuery UI.


Even though single-column forms are easier to complete, allowing some elements to be laid out in two columns makes it possible for someone to see the entire form and, in turn, realize that it is very easy to fill out.


Drupal's Bartik theme provides a good starting point for a form's visual design. I do find the default table element, used for the phone preferences, to be a little bit visually heavy. Phone preferences are optional and therefore should not be the most visually prominent element on the page. BTW, another reason I chose buttons for the first question, 'Who are you?', was to further draw a user's eye to this question, which is inside a container with a light gray background.


The saving of results is disabled to prevent protected health information (PHI) from being stored in Drupal. The appointment request template sends confirmation and notification emails. In a production environment, submissions should be remotely posted to a CRM or database.


The simpler a form is to fill out, the more likely end-users will complete it. At the same time, it is important to capture the most accurate information. Since an appointment request form results in a phone call back, it is beneficial to know the best time to call a prospective patient or caregiver. I decided to experiment with adding a custom composite element allowing users to enter the best days and times to receive a callback. This widget adds complexity to the form, which is why it is optional. To further reduce this widget's complexity, I removed the adding, removing, and sorting of the days and times.

Honestly, we don't know if the enhancement is useful. If we track all the requests, we can look at how many people provide this information and how many days/times they are providing. A more direct approach is to perform usability testing that would allow us to watch end-users fill out and submit the form.

There is always room for improvement. I hope my suggestions and tips inspired you to take a second look at your existing webforms and think differently when building new webforms. Building webforms is an iterative process and we should always welcome feedback.

I welcome any feedback to help me improve the webform's 'Request a Medical Appointment' template. Thanks for taking the time with me to research, review, and explore building an exemplary appointment request form.




About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web