Feb 13 2020

Integrating With Key Systems

One of the overall goals when embarking on this redesign was to create an experience that mimicked the streamlined usability of an ecommerce site. HonorHealth wanted to create a place where patients could easily make choices about their care and schedule appointments with minimal effort. To provide this experience, it was imperative that the new platform play well with HonorHealth’s external services and create a firm foundation for integrating more self-service tools in the future.

In particular, Palantir integrated with three services as part of the build: SymphonyRM, Docscores, and Clockwise.

  • SymphonyRM offers a rich set of data served by a dynamic API. HonorHealth leverages SymphonyRM’s Provider Operations services to hold its practice location and physician data. Palantir worked closely with SymphonyRM to help define the data structure. Through this work, HonorHealth was able to reduce the number of steps and people required to maintain its provider and location data. By leveraging the strategy work and the technical consultation of Palantir’s Technical Architecture Owner, HonorHealth was able to stay focused on the content most valuable to its users across all of its integrated systems.
  • Docscores provides a platform for gathering high-quality ratings and review data on healthcare practitioners and locations. Palantir integrated this data directly with the physician and location data provided from SymphonyRM to create a research and discovery tool for HonorHealth users. On the new HonorHealth website, users can find physicians near a specific location and read about other patients’ experiences with them.
  • Clockwise provides real-time wait estimates for people looking for Urgent Care services in their area.

On its own, each of these “under the hood” integrations doesn’t represent a significant shift for website users. But when all of these changes are coupled with an intense focus on putting the user experience of the site first, the result speaks for itself: a beautiful website that works well and empowers people to engage in their ongoing healthcare in meaningful ways.

Feb 13 2020

Identifying “Top Tasks”

The biggest negative factor of the previous ETF site’s user experience was its confusing menus. The site presented too many options and pathways for people to find information such as which health insurance plan they belong to or how to apply for retirement benefits, and the pathways often led to different pages about the same topic. Frequently, people would give up or call customer support, which is only open during typical business hours.

Palantir knew the redesign would have the most impact if the site was restructured to fit the needs of ETF’s customers. In order to guarantee we were addressing customers’ most important needs, we used the Top Task Identification methodology developed by customer experience researcher and advocate Gerry McGovern.

Through the use of this method, we organized ETF’s content by the tasks site users deemed most important, with multiple paths to get to content through their homepage, site and organic search, and related content.

Feb 13 2020

Hi. Please read the next paragraph in your best David Attenborough voice.

When the majestic monarch takes flight on its marathon migration to Mexico, it’s easy to be moved by their mission. Each year hundreds of thousands of the small, winged creatures take to the sky with a common goal: survival and prosperity. No single butterfly completes the trip by itself, but as a group, they complete one of the most remarkable journeys in the world.

Okay, back to your normal voice now.

Migrating your website doesn’t usually have the same visual impact, but behind the scenes, it can be just as impressive as the monarch’s multigenerational marathon migrations. Ask any experienced Drupal developer, and they’ll tell you that a website migration involves a comparable amount of work: transferring an entire website from an older version of Drupal to a new one (Drupal 7 to Drupal 9, for instance), making sure everything stays the same and still works.

There are a plethora of different reasons to migrate your website to a new platform, and we’ll talk about some of the common situations a little later. Specifically, we’re going to be focusing on migrating a Drupal 6 or 7 site to Drupal 8. This information will also get you ready for Drupal 9, the next major release of Drupal, coming later in 2020. However, a lot of this general information applies to site migrations using other Content Management Systems (CMSs), so if you aren’t on Drupal, don’t worry! This article isn’t useless!

So what’s migration, and what does it mean for your site? In straightforward terms, website migration is the process of copying data and configuration from Site A (built on Platform X) to Site B (built on Platform Y) and using that data to create users, nodes, configs, and components automatically. Essentially, you’re transferring every piece of content and data from one system to another. These systems can be anything: an old WordPress installation to a new Drupal installation, for example, or a Joomla site to a Concrete5 site, or even an old Drupal 6 site to a fresh Drupal 8 installation. It’s important to remember that your beautiful monarch, built out of content and configuration, isn’t what’s changing. It’s migrating to a newly built home that’s already familiar to it.

Uhh… That sounds like too much work. Why would you ever want to migrate your site? Fear not, true believers! I’m going to walk you through an example! Let’s play make-believe for a minute and say that you’re the owner of WidgCo, makers of the most excellent widgets in the whole world. Still on the same page as me? Okay, cool.

You originally built your website back in 2008 using Drupal 6 so that WidgCo had an online catalogue for the 347 widgets you offer. Users would find the part numbers on your website, and then call your ordering phone line to make the purchase. The types of products you provide, their price point, and most importantly, the buying habits of your typical customer meant that online purchasing wasn’t a feasible way to go right off the bat. Your clients are used to picking up the phone, reading off the part numbers they want to one of your sales reps, and then receiving their invoice in the mail at the end of the month.

It’s 2020 now, though, and your customers want to be able to finish all their widget ordering and invoice payments online. You’ll need to install a module enabling e-commerce on your site to meet their modern expectations. Since you’re dealing with sensitive financial information, you’re going to want to make sure that it’s as stable and secure as possible. Bad news, however. You’re still on Drupal 6, and the module developers have stopped supporting the Drupal 6 and 7 versions and are only updating the Drupal 8 version. To get the new features, you’ll have to migrate your site from your current older version of Drupal to a more modern version.

Now that we know why we need to migrate, the next question, of course, is, “How do I migrate?” As usual with these sorts of things, there are different options around how to migrate your website. No matter the method, though, the first step is backing everything up! If something goes wrong and your site breaks, you’re going to be left high and dry if you don’t have a proper backup ready to rock. Once you’ve got everything safely backed up, we can move on to the next step, where we finally start the migration.
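A minimal pre-migration backup might look like this. This is a sketch only: it assumes Drush is installed and a standard docroot, and the backup directory name and paths are illustrative, not prescribed by any particular workflow.

```shell
# Pre-migration backup sketch. Assumes Drush and a standard docroot;
# the backup directory name and paths are illustrative.
BACKUP_DIR="backup-$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"

if command -v drush >/dev/null 2>&1; then
  # --gzip writes db.sql.gz, which keeps large Drupal 6/7 dumps manageable
  drush sql-dump --gzip --result-file="$PWD/$BACKUP_DIR/db.sql"
else
  echo "drush not found; dump the database another way (e.g. mysqldump)"
fi

# Archive uploaded files; point this at your real files directory
tar -czf "$BACKUP_DIR/site-files.tar.gz" sites/default/files 2>/dev/null \
  || echo "no sites/default/files here; adjust the path for your site"
```

Keep the resulting dump and archive somewhere off the server you are about to change, so a failed migration can always be rolled back.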

Here’s where we hit a fork in the road. You can choose one of two main methods to perform your migration: automatically using the Migrate Upgrade module, or writing some good ol’ fashioned code by hand and doing it manually. To pick which method is best for you, you need to ask, “is my content model going to change?” Is your website content going to be staying mostly the same? Is it going to be organized in the same way on the new site?

If your content is going to stay pretty much the same, then automated migration using the Migrate Upgrade module is the way to go. You’ll save time, money, and sanity. When you go the automatic route, the module analyzes your Drupal 6 or 7 website’s content model, detecting content types and configurations on your site all by itself. Once it’s done that, it automatically generates the proper content types and configuration for your new Drupal 8 site. If there are some little changes you need to make, the Migrate Upgrade module has an API available to developers so that they can make alterations or customizations to the process.
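For the automated route, the setup might look something like this. Treat it as a sketch: the module names are the usual contrib projects (migrate_upgrade, migrate_plus, migrate_tools), the drush migrate-upgrade command comes from the Drush 8-era migrate_upgrade tooling, and the database URL and legacy site address are placeholders.

```shell
# Placeholders; substitute your real legacy database and site address
LEGACY_DB_URL="mysql://user:password@localhost/old_d7_db"
LEGACY_ROOT="https://old.example.com"

# Only attempt this inside a real Composer-managed Drupal 8 checkout
if command -v drush >/dev/null 2>&1 && command -v composer >/dev/null 2>&1 \
  && [ -f composer.json ]; then
  # Fetch the migrate helper modules and enable core's migrate stack
  composer require drupal/migrate_upgrade drupal/migrate_plus drupal/migrate_tools
  drush en -y migrate migrate_drupal migrate_upgrade
  # Point the importer at the old Drupal 6/7 database and files
  drush migrate-upgrade --legacy-db-url="$LEGACY_DB_URL" --legacy-root="$LEGACY_ROOT"
else
  echo "This sketch needs drush and composer on a fresh Drupal 8 checkout."
fi
```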

I don’t know about you, but that’s all a bit confusing to me, and I’m the monkey writing it. I’ll use another one of those handy-dandy, fancy-schmancy metaphors to explain it a bit differently. Imagine that your website is the building where our fictional WidgCo is headquartered. The building is no longer suitable for your company, and to keep growing and progressing, you’ve got to migrate to a new office. See what I did there?

When your workers built your office, they used the best, most advanced materials they had available, but over time those materials have deteriorated, decayed, or proven unsafe. The ceilings are filled with asbestos, the horrid fluorescent lights won’t stop flickering, and nobody is totally sure what the original colour of the carpet was (wait, is this even a carpet?). Having said all that, the layout of the building has been great for WidgCo. The locations of team leaders relative to their team members have been carefully optimized. The processes of the quality assurance folks on the assembly line are perfect, with every widget leaving WidgCo meeting your exacting tolerances. Even the way that employees clock in and out every day is just how everyone likes it. You need a new building that can be updated, upgraded, or renovated moving ahead. At the same time, you need to keep your optimized office layout, your excellent employee roster, and your perfected processes precisely the same. Otherwise, WidgCo will lose productivity, fall behind, and miss out on money. You need to plan your move so that your workers and clients can come to your building and keep going about business as usual, as if nothing changed.

That’s what migration can do for you. Automated migration will look at the old building (your old website), identify the different teams, where they sit, and what their processes are, and it will automatically categorize all of that information into a plan. With this plan in hand before you move into the new building, you’re able to replicate the old office floor plan exactly, right down to the thermostat setting. In web terms, that means all your content, pages, blog articles, galleries, users, and everything else are functioning correctly in the correct places.

Performing a migration by hand is a similar process, except that instead of having your plan automatically generated, you and your development team write custom scripts and code to categorize and sort your site data manually. Rather than a program classifying your data for you, your team can identify specific elements of your website and dictate precisely how they should move to the new site.
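In Drupal 8, a hand-written migration is typically described in YAML. This fragment is purely illustrative; the machine name, node type, and field mappings are invented for the WidgCo example, though the source and destination plugins (d6_node, entity:node) are the ones core provides for this kind of move.

```yaml
# Illustrative hand-written migration for WidgCo's widget nodes,
# moving Drupal 6 content into Drupal 8. Machine names are invented.
id: widgco_widgets
label: WidgCo widget catalogue
source:
  plugin: d6_node        # core source plugin for Drupal 6 nodes
  node_type: widget
process:
  title: title
  field_part_number: field_part_number
destination:
  plugin: entity:node
  default_bundle: widget
```

The process section is where your team dictates exactly how each piece of old data maps onto the new site, which is the real advantage of going manual.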

Okay, that’s enough for now, I think. I don’t want to overload my brain, let alone yours! This article is just to serve as a high-level overview of what migrations are, how they work, and why you’d need to perform one. It’s been a lot of info, but we’re only scratching the surface of migrations. There are tons of tiny technical tidbits we could touch on, but unless you’re a web developer, it’ll mostly be gibberish. For now, though, I hope this has been a good primer on the topic.

Our expert team at Cheeky Monkey Media kicks ass at this stuff. In fact, we’ve put together a full Drupal whitepaper, just for you. If you need a hand, have any questions, or are interested in learning more about working together, don’t be afraid to reach out! We’re always happy to help and share some of our nerdy knowledge.

Feb 13 2020

Table of contents

What is Tag1 Quo?
How does Tag1 Quo work?
What makes Tag1 Quo unique
--Immediate notice of vulnerabilities
--Backports of LTS patches
--Automated QA testing for Drupal 7 LTS
--Customer-driven product development

One of the challenges of securing any Drupal site is the often wide range of modules to track, security advisories to follow, and updates to implement. When it comes to Drupal security, particularly on older versions such as Drupal 6 and Drupal 7, even a slight delay in patching security vulnerabilities can jeopardize mission-critical sites. With Drupal 7 and Drupal 8 fast approaching their end of life (EOL) in November 2021 (Drupal 6 reached EOL on February 24, 2016), now is the time to prepare your Drupal sites for a secure future, regardless of which version you are using.

Fortunately, Tag1 Consulting, the leading Drupal performance and security consultancy, is here for you. We’ve just redesigned Tag1 Quo, the enterprise security monitoring service trusted by large Drupal users around the world, from the ground up, with an all-new interface and capabilities for multiple Drupal versions, from Drupal 6 to Drupal 8. Paired with the Tag1 Quo module, available for download on Drupal.org, and Tag1 Quo’s services, you can ensure the security of your site with full peace of mind. In this blog post, we’ll cover some of the core features of Tag1 Quo and discuss why it is essential for your sites’ security.

What is Tag1 Quo?

Tag1 Quo is a software-as-a-service (SaaS) security monitoring and alerting service for Drupal 6, Drupal 7, and Drupal 8. In addition, it includes long-term support (LTS) for Drupal 6 and is slated to commence backporting security patches for both Drupal 7 and Drupal 8 when both major versions no longer have community-supported backports. The centerpiece of Tag1 Quo integration with Drupal is the Tag1 Quo module, which is installed on your servers and communicates securely with our servers.

In addition, we can provide a self-hosted version of Tag1 Quo for sites hosted on-premise. This requires a setup fee and entails higher per-site licensing fees, so we encourage you to reach out to us directly if you’re interested in pursuing this option.

How does Tag1 Quo work?

When a new module update is released on Drupal.org, or when a security advisory is announced that directly impacts your Drupal codebases, the Tag1 Quo system alerts you immediately and provides all of the necessary updates required to mitigate the vulnerability, with a direct link to the code you need to install to address the issue. Not only are these alerts sent over e-mail by default; they can also flow directly into your internal project workflows, including issue tracking and ticketing systems.

Tag1 Quo doesn’t stop there. As part of our long-term support (LTS) offering, when security releases and critical updates emerge, or when new security vulnerabilities are announced for community-supported Drupal versions, Tag1 audits these and determines whether the identified vulnerability also impacts end-of-life (EOL) versions of Drupal such as Drupal 6 and, in November 2021, Drupal 7. If those EOL versions are also susceptible to the vulnerabilities, we backport and test all patches to secure the EOL versions as well and distribute them to you through the Tag1 alert system.

Moreover, when a new security vulnerability is discovered in an EOL version of Drupal without an equivalent issue in a currently supported version, Tag1 creates a patch to rectify the problem and collaborates with the Drupal Security Team (several of whom are part of the Tag1 team) to determine whether the EOL vulnerability also applies to currently supported versions of Drupal so that they can be patched too. In short, no matter where across Drupal’s versions a vulnerability occurs, you can rest easy with Tag1 Quo’s guarantees.

What makes Tag1 Quo unique

Tag1 Quo features a centralized dashboard with an at-a-glance view of all of your Drupal sites and their current status, regardless of where each one is hosted. After all, most enterprise organizations juggle perhaps dozens of websites that need to remain secure. Such a perspective at an organizational level is essential to maintain the security of all of your websites. But the Tag1 Quo dashboard is only one among a range of capabilities unique to the service.

Immediate notice of vulnerabilities

Several members of the Tag1 team are also part of the Drupal Security Team and are aware of vulnerabilities as soon as they are reported. However, the Security Team’s policy is to collaborate privately on a fix before revealing the nature of an issue publicly, so that public advisories and patches are released before nefarious actors can successfully attack Drupal sites. This protects both your sites and the viability of the released patches.

Thanks to our deep knowledge of both the projects our clients' websites use and the security advisories that affect them, Tag1 is among the very first to notify Tag1 Quo customers as soon as the official announcement is released. Immediately afterwards, Tag1 Quo prepares you to apply the updates as quickly as possible to ensure your web properties’ continued safety.

Backports of LTS patches

If a fix for a vulnerability is reported for currently supported versions of Drupal but also applies to EOL versions, the patch must be backported so that all Drupal sites can benefit from it. Unfortunately, this process can be complex and require considerable planning and analysis of the problem across multiple versions, and it can sometimes only begin after the patch targeting supported versions has been architected or completed. This means it may take more time to develop patches for LTS versions of Drupal.

Luckily, we have a head start in developing LTS patches thanks to our advance notice of vulnerabilities in currently supported versions of Drupal. While we cannot guarantee that LTS updates will always be released simultaneously with those targeting supported versions, Tag1 has an admirable track record of releasing critical LTS updates at the same time as, or within hours of, the issuance of patches for supported Drupal versions.

Automated QA testing for Drupal 7 LTS

Throughout Drupal’s history, the community encouraged contributors to write tests alongside code as a best practice, but this rarely happened until it became an official requirement for all core contributions during the Drupal 7 development cycle, beginning in 2007. Tag1 team members were instrumental in making tests a core code requirement, and we created the first automated quality assurance (QA) testing systems distributed with Drupal. In fact, Tag1 maintains the current Drupal CI (continuous integration) systems, which perform more than a decade’s worth of testing within a single calendar year.

Because the Drupal Association has ended support for Drupal 7 tests and decommissioned those capabilities on Drupal.org, Tag1 is offering the Tag1 Quo Automated QA Testing platform as a part of Tag1 Quo for Drupal 7 LTS. The service will run all tests for Drupal 7 core and any contributed module tests that are available. Where feasible and appropriate, Tag1 will also create new tests for Drupal 7’s LTS releases. Therefore, when you are notified of LTS updates, you can rest assured that they have been tested robustly against core and focus your attention on integration testing with your custom code instead, all the while rolling out updates with the highest possible confidence.

Customer-driven product development

Last but certainly not least, Tag1 Quo is focused on your requirements. We encourage our customers to request development in order for us to make Tag1 Quo the optimal solution for your organization. By working closely with you to determine the scope of your feature requests, we can provide estimates for the work and an implementation timeline. While such custom development is outside the scope of Tag1 Quo’s licensing fees, we allot unused Tag1 Quo consulting and support hours to minor modifications on a monthly basis.

Examples of features we can provide for custom code in your codebases include ensuring your internal repositories rely on the latest versions of dependencies and providing insights into your custom code through site status views on your Tag1 Quo dashboard. We can even add custom alerts to notify the specific teams and users responsible for these sites, and customize the alerts to flow into support queues or other ticketing systems. Please get in touch with us for more information about these services.


The new and improved Tag1 Quo promises you peace of mind and renewed focus for your organization on building business value and adding new features. Gone are the days of worrying about security vulnerabilities and anxiety-inducing weekends spent applying updates. Thanks to Tag1 Quo, regardless of whether your site is on Drupal 6, Drupal 7, or Drupal 8, you can rest assured that your sites will remain secure and monitored for future potential vulnerabilities. With a redesigned interface and feature improvements, there is perhaps no other Drupal security monitoring service better tuned to your needs.

Special thanks to Jeremy Andrews and Michael Meyers for their feedback during the writing process.

Photo by Ian Schneider on Unsplash

Feb 13 2020

Welcome! If you need to update your Drupal 8 site to the latest feature branch, this post is for you. 

Drupal 8.8.0 introduced many exciting changes. On the heels of this release, we also saw a security release for Drupal 8.8.1, which covers four distinct security advisories (SAs). Drupal 8.7.11 also has fixes for those SAs, but developers might consider this a golden opportunity to take care of both updates at once.

This post will cover four common pitfalls of upgrading to the 8.8.x branch:

  • Pathauto version conflicts
  • New sync directory syntax
  • Temporary file path settings
  • Composer developer tools


These instructions assume you are:

  • Maintaining a Drupal 8 site on version 8.7.x
  • Using Composer to manage dependencies
  • Comfortable using the command line tool Drush

Pathauto version conflicts

The Pathauto module has been around for a long time. With over 7 million downloads and 675K reported sites using Pathauto, chances are high that this section applies to you.

Drupal core 8.8.0 introduced the path_alias module into core. However, this module conflicts with the Pathauto contrib module at version 8.x-1.5 or below. If you have Pathauto installed on your site, you must first update to Pathauto 8.x-1.6 or later.

I strongly suggest updating Pathauto as a first step, and deploying all the way to production. The order of operations is important here, because updating Pathauto after core will result in data loss. While the release notes say it is safe to update “before or at the same time” as core, it is good to have some extra precaution around the sequence of events. 

Visit the full change record for more details: Drupal 8.8.0 requires pathauto version 8.x-1.6 or higher if installed.
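As a sketch, the update-Pathauto-first step could look like this. It assumes a Composer-managed site with Drush available; the commands are standard Composer and Drush usage, not a prescribed recipe, and the whole change should be deployed to production before core is touched.

```shell
# Update Pathauto ahead of core (sketch; Composer-managed Drupal assumed)
if command -v composer >/dev/null 2>&1 && [ -f composer.json ]; then
  # Pull in Pathauto 8.x-1.6+ without moving core
  composer update drupal/pathauto --with-dependencies \
    && drush updb -y \
    && drush cr
  RESULT="pathauto updated; deploy this before updating core"
else
  RESULT="skipped: run from a Composer-managed Drupal root"
fi
echo "$RESULT"
```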

Diagnosing path alias issues

What if I neglect to update Pathauto? How can I identify the symptoms of this problem? After running drush updb, I would expect to see this SQL error:

[error]  SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry '7142' for key 'PRIMARY': INSERT INTO {path_alias}

This Drupal core bug report provides more detail. It also describes how to walk it back to a working state if you encounter this problem.

New sync directory syntax

The configuration management system introduced a new sync directory syntax in 8.8.0. The default location for Drupal’s config sync is sites/default/files/config_[HASH]. It is very common practice to customize this location, since it makes config files easier to understand and manage. If you do customize this location, note that Drupal no longer uses the $config_directories variable.

Here is what a custom config sync directory might have looked like in Drupal 8.7.x or lower. In your settings.php file:

$config_directories['sync'] = dirname(DRUPAL_ROOT) . '/config';

Now in Drupal 8.8.x, this setting should be updated to look like this:

$settings['config_sync_directory'] = dirname(DRUPAL_ROOT) . '/config';

Read the full change record for more technical details: The sync directory is defined in $settings and not $config_directories.

Diagnosing sync directory issues

You can tell right away if there is a problem with your sync directory if config files are showing up in an unexpected place. You can also use drush to discover the current value of your config sync directory. 


$ drush php
>>> Drupal\Core\Site\Settings::get('config_sync_directory')
=> "/var/www/html/foo/bar"


The same value can also be retrieved as a one-liner:

drush php-eval '$path = Drupal\Core\Site\Settings::get("config_sync_directory"); print $path;'

Here are a few more ways to retrieve the same information, using the drush status command.

For Drush 8 only:

drush status config_sync

For Drush 9 or 10:

drush status --fields=%paths

Temporary file path settings

In this new feature release, the old procedural function file_directory_temp() is deprecated. Drupal now uses the FileSystem service instead, and this has implications if you are setting a custom value for temporary file storage.

To customize the temporary file location the old way, you may have something like this in your settings.php file:

$config['system.file']['path']['temporary'] = '/tmp';

Change this setting to the new syntax before running database updates:

$settings['file_temp_path'] = '/tmp';

Read the full change record to learn more: file_directory_temp() is deprecated and moved to the FileSystem service.

Diagnosing temp directory issues

Take a look at your database logs. In Drupal’s logs at /admin/reports/dblog, you can filter on “file system”. Any errors about temporary file placement may indicate an issue.
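You can also confirm the effective setting directly, mirroring the sync-directory trick from the previous section. A sketch, assuming Drush is available from your Drupal root; the fallback message is ours, not Drupal output:

```shell
# Read the effective temp path setting (sketch; run from a Drupal root)
if command -v drush >/dev/null 2>&1; then
  TEMP_PATH="$(drush php-eval 'print Drupal\Core\Site\Settings::get("file_temp_path");')"
fi
# Fall back to a note when drush is missing or the setting is empty
TEMP_PATH="${TEMP_PATH:-(not set; Drupal falls back to the OS default)}"
echo "$TEMP_PATH"
```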

Composer developer tools

Composer Support in Core is just one of many strategic initiatives in the Drupal community. Some early ways of using Composer are now deprecated in favor of this new support. For example, Drupal now has an official Composer template project. If you previously installed Drupal using the unofficial (but recommended at the time) template drupal-composer/drupal-project, you will have a bit of updating to do.

Manually edit composer.json to remove deprecated packages. Remove this line:

   "require-dev": {
        "webflo/drupal-core-require-dev": "^8.7.0"

And replace it with this one:

   "require-dev": {
        "drupal/core-dev": "^8.8"

Then edit the require statement for Drupal core itself: 

   "require": {
        "drupal/core": "^8.8"

Now that composer.json is up to date, you can go ahead and run Composer updates in the usual way, with composer update --lock.

If you are starting a new Drupal 8 site from scratch, refer to this guide on Starting a Site Using Drupal Composer Project Templates. It has instructions on how to use the new recommended way of handling Composer templates, using drupal/recommended-project.
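If you prefer the CLI to hand-editing, the same composer.json changes can be made with Composer commands. A sketch: the run helper is ours and simply skips the commands when no Composer project is present, and the version constraints follow the edits shown above.

```shell
# Helper (ours): run a command only inside a real Composer project,
# otherwise print what would have run.
run() {
  if command -v composer >/dev/null 2>&1 && [ -f composer.json ]; then
    "$@"
  else
    echo "skip: $*"
  fi
}

# Swap the deprecated dev package for drupal/core-dev, bump core,
# then resolve everything in one update.
run composer remove --dev --no-update webflo/drupal-core-require-dev
run composer require --dev --no-update "drupal/core-dev:^8.8"
run composer require --no-update "drupal/core:^8.8"
run composer update --lock
```

The --no-update flags defer dependency resolution so that all three constraint changes are applied together by the final composer update.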

Diagnosing deprecated Composer tools

If you try to run composer update without following the steps above, Composer will fail with this error:

Your requirements could not be resolved to an installable set of packages.

Depending on what package versions are installed, and the syntax of your composer.json file, the rest of the error output will vary. Here is an example:

The requested package webflo/drupal-core-require-dev (locked at 8.7.11, required as ^8.8) is satisfiable by webflo/drupal-core-require-dev[8.7.11] but these conflict with your requirements or minimum-stability.

The important thing to know here is that Composer is being helpful. It is preventing you from upgrading a deprecated package. 

You can verify this by using the Composer prohibits command:

$ composer prohibits drupal/core:^8.8
webflo/drupal-core-require-dev  8.7.11          requires          drupal/core (8.7.11)
drupal/core                     8.8.0           requires          typo3/phar-stream-wrapper (^3.1.3)
drupal-composer/drupal-project  dev-my-project  does not require  typo3/phar-stream-wrapper (but v2.1.4 is installed)

Or its alias:

$ composer why-not drupal/core:^8.8

But wait! There’s more!

These are just a few pitfalls. There are other considerations to make before updating to 8.8.x. Make sure to read the release notes carefully to see if any other advisories apply to you.

Feb 13 2020

Amazee Labs has been awarded the Daimler Key Supplier Inspiration Award for 2020! 

Markus Schäfer, Member of the Board of Management of Daimler AG and Mercedes-Benz AG, responsible for Group Research and Mercedes-Benz Cars Development, Procurement and Supplier Quality, presented the award to Amazee Labs CEO, Stephanie Lüpold on 12 February 2020 in Stuttgart’s Carl Benz Arena.

In presenting the award for outstanding suppliers, Mr Schäfer said “In order to fulfil our role as innovation and technology leader in the future, we also expect courageous impulses with inspiring visions in all areas from our partners. Together we are creating ground-breaking mobility solutions that are in line with our social, environmental and economic targets.”

Amazee Labs Daimler Supplier Awards

Amazee Labs won the award for developing the content management system for Daimler’s new smart.com website. They were able to set new standards in mapping internationalization processes, thereby turning a former challenge of Drupal into a major advantage.

Stephanie Lüpold, CEO of Amazee Labs is exceptionally pleased: "We want to thank Daimler for the award. As a strong player in the Drupal open source community, Amazee Labs has solved one of the top challenges faced by Drupal by developing a new, revolutionary and easy way to manage content internationally. Not only have we solved this issue for Drupal, but it’s also helped Drupal achieve a significant advantage over other systems. The team has done an outstanding job. We are extremely pleased that our solution is already being used successfully by several multinational companies, including Daimler. This award is recognition of our hard work and great motivation to continue to do everything possible for our customers".

Amazee Labs - Daimler Supplier Awards

Amazee Labs wants to congratulate all the other winners; it’s an honour to be included in such esteemed company.

Feb 12 2020

Drupal 9.0.0-alpha1 came out yesterday! I decided to take it for a test drive by updating this very blog from Drupal 8 to Drupal 9. It was easier than I expected.

I started by doing a bit of preparatory housecleaning to make the update process easier. I recommend you do the same! Specifically, I:

  • Updated to the latest version of Drupal 8
  • Updated all of my contributed modules to their latest versions
  • Removed all contributed modules that were not enabled (found via drush pm-list --status=disabled --no-core)
  • Removed my require-dev dependencies from composer.json to reduce the possible version conflicts (I'll add them back in later)

Some background on the Drupal application that I updated:

  • Dependencies are managed by Composer
  • There are no custom modules (though I show you how to deal with them a bit)
  • There is a custom theme

Check out this video to watch me perform the update and troubleshoot issues as they arise:

[embedded content]

Here are the key tips and techniques I used along the way:
  • Use Composer's alias capability to "trick" various modules into considering Drupal 9.0.0-alpha1 to be 8.8.2: "drupal/core": "9.0.0-alpha1 as 8.8.2", This allows you to test against contributed modules that may not be 100% Drupal 9 ready.
  • Use Acquia's Drupal 9 Deprecation Status page to quickly search for the Drupal 9 readiness status of contributed modules and find related issues and patches
  • Use cweagans/composer-patches to apply patches to contributed modules where necessary
  • Use Drupal Check to scan custom modules and themes for deprecated code usage
  • Clear caches, reload pages, fix issues, and repeat until things stop blowing up!
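
For reference, the Composer alias and a patch entry sit in composer.json roughly like this (the patched module name and patch URL are placeholders, not real values):

```json
{
    "require": {
        "cweagans/composer-patches": "^1.6",
        "drupal/core": "9.0.0-alpha1 as 8.8.2"
    },
    "extra": {
        "patches": {
            "drupal/example_module": {
                "Drupal 9 compatibility (placeholder description)": "https://www.drupal.org/files/issues/example-d9.patch"
            }
        }
    }
}
```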

The process ended up being far smoother and faster than I had expected. Total time was actually a bit less than 10 minutes to get a working updated application locally.

By the time we have a beta or stable release, I expect it to be even better. For those of you following along at home, good luck!

Feb 12 2020
Feb 12

If your website has plenty of media files (PDF presentations, price lists, product images, etc.), you know how cumbersome it can be to replace them. When replacing files, special attention needs to be paid to SEO — because, as every SEO expert knows, every detail matters there.

Luckily, your Drupal 8 website offers more and more ways to manage media easily while staying in line with SEO best practices.

Discover how to replace Drupal media files easily, with no fuss or extra manual effort, and with your SEO rankings preserved. This is possible thanks to a new Drupal module: Media Entity File Replace.

When do you need to replace Drupal media files?

Content never stays unchanged: it needs to keep up with new business circumstances. Media files are no exception. For example, you may need to:

  • update a PDF presentation for your company/products/services
  • change your price-list
  • update your how-to checklist
  • upload new product images with better quality than before
  • make changes to your corporate video

and make many other kinds of content changes.

What is the problem with the standard file replacement?

The standard procedure is to remove the old file and upload another one. The replacement file gets a different name and path: instead of overwriting the original, Drupal appends a number to the end of the new file’s name (_0, _1, etc.). For example, re-uploading document.pdf results in document_0.pdf.

Standard file replacement in Drupal 8

File replacement becomes especially tedious when the file is used in multiple places throughout your website, which means additional expense in manual work for you or your staff. A special point of concern is the impact on SEO. Read on to find out more.

How does media file replacement influence SEO?

File names play a part in your SEO rankings. Human-readable, keyword-rich names with words separated by dashes are a great way to tell Google what your image is about (in addition to the ALT tag), and this benefits your SEO.

Changing your file name and path can lead to a certain loss in SEO rankings because Google treats the newly uploaded files as new content and needs to recrawl, reprocess, and reindex them. This can take a long time, during which search results will show the old, irrelevant content.

And, of course, if your files are used in content throughout your website and you change them but forget to re-upload them everywhere, they will be unavailable to your users. File path changes have the potential to cause broken links, which is one of the most annoying things both for search engines and users.

Never lose any SERP position to your competitors. Use helpful tools to replace files easily and without losing SEO.

How the Media Entity File Replace module can help

The new Media Entity File Replace module for Drupal 8 offers a smart and SEO-friendly way to replace Drupal media files. The module replaces the source files of Drupal media entities that have a file-based field by overwriting the old file with the new one. Even more importantly, the file’s name and path are preserved in the process.

Media Entity File Replace works with Media entities in Drupal 8, so if you use the Media system to manage your media files, this module will suit you.

Note: To use the Media module, consider updating your Drupal website to the latest version where it has been greatly improved — our Drupal support team can do it for you.

Installing the Media Entity File Replace module

The module is installed like any other. It depends on several core modules: Media, File, Field, Image, User, and System.
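
If you manage your site with Composer, installation might look like this:

```shell
composer require drupal/media_entity_file_replace
drush en media_entity_file_replace
```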

Installing the Media Entity File Replace module in Drupal 8

Configuring the File Replace widget

The module comes with a File Replace widget, which you need to enable for each media type whose files you want to be overwritable. As a reminder, Drupal 8 has five default media types (Audio, Document, Image, Remote video, and Video), and others can be created to your liking.

To enable the File Replace widget, go to Structure > Media types > {Our media type} > Manage form display. In our example, we work with the Document media type.

Let’s drag the Replace File widget to the enabled ones. The perfect place is just below the File field.

Configuring the File Replace widget in Drupal 8

Replacing your Drupal media files

Let’s create a new document entity at Media > Add media > Document and upload an “Our services” PDF to it.

Creating a Drupal document Media entity

Drupal media file path

The PDF is now saved in our Media Library (Content > Media), where we can go and edit the entity in order to replace the PDF.

Instead of the usual Remove button, we now see a Replace button. If the “Overwrite original filename” option is checked, the original name will be kept and the content will be replaced.

Media Entity File Replace Drupal 8 module in action

We click “Choose file” and upload a new one, “Updated services.”

Replacing files with the Media Entity File Replace module

After saving, we see that the filename in this media entity is the same as before.

Drupal document file path

However, the content of the source file available at the same path has been rewritten. It now shows our updated services.

Media file replaced with its path unchanged in Drupal 8

This PDF can be used as an attachment in your content. You just need to add a Media field to your content type, and then you can easily fetch media from the Media Library. Visit our “Media handling in Drupal 8” article to learn how media can serve as building blocks for content.

Adding documents from Media Library to content

In this example, we added a “Document” Media field to the Basic page, and then created a content page with our PDF attached to it.

Content entity with a document Media field

Wherever else the file appears throughout your website, it is rewritten automatically after a replacement, with no need to re-upload.

While using the Media Entity File Replace module, special attention should also be paid to caching, so that your users see the updated content sooner.

Entrust your media setup to our Drupal team

Drupal 8’s Media capabilities and its ecosystem of Media Library management modules keep growing at an amazing pace, offering you more every day for managing your media effortlessly and with no SEO losses.

Ask our Drupal support & maintenance team to configure the ideal media management processes on your Drupal 8 website!

Feb 12 2020
Feb 12

An effective administrative interface is table stakes for any content management system that wishes to make a mark with users. Claro is a new administration theme now available in Drupal 8 core thanks to the Admin UI Modernization initiative. Intended to serve as a logical next step for Drupal's administration interface and the Seven theme, Claro was developed with a keen eye for modern design patterns, accessibility best practices, and careful analysis of usability studies and surveys conducted in the Drupal community.

Claro demonstrates several ideas that illustrate not only the successes of open-source innovation but also the limitations of overly ambitious proposals. By descoping some of the more unrealistic ideas early on and narrowing its focus to incremental improvements that facilitate the work of later initiatives, the Claro initiative is an exemplar of sustainable open-source development.

In this closer look at how Claro was made possible and what its future holds for Drupal administration, join Cristina Chumillas (Claro maintainer and Front-End Developer at Lullabot), Fabian Franz (Senior Technical Architect and Performance Lead at Tag1), Michael Meyers (Managing Editor at Tag1), and Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a Tag1 Team Talks episode about the newest addition to Drupal's fast-evolving front end.

[embedded content]

Feb 12 2020
Feb 12

This article is from our resident Drupal 8 core maintainer Nathaniel Catchpole (@catch) who is helping to build Drupal 9.

Drupal 9 will be the first major core release to have a continuous upgrade path, meaning that Drupal 8 contributed modules and themes, and actual Drupal 8 sites, should be able to upgrade smoothly from Drupal 8 to Drupal 9 with only slightly more disruption than a Drupal 8 minor release. Since this is the first time Drupal has ever attempted to provide a smooth upgrade path between major releases, we understand it can be hard to imagine what it will be like.

The best way to feel confident is to actually understand the changes that will be landing in Drupal 9 over the next few months, and to know when you should test it, use it to build new sites, and update existing sites.

What are the main changes in Drupal 9?

Drupal 9 will update third party dependencies, such as Symfony and Twig, so that we are able to continue to receive bugfix and security support from those projects.

Drupal 9 (and Drupal 8 versions from 8.7.7 onwards) will have support for running the same branch of a contributed module across multiple major core versions, reducing the maintenance burden of supporting both.

Drupal 9 will remove backward compatibility layers and deprecated code marked for removal in Drupal 8.

These three things are really the entire focus of the initial Drupal 9.0.0 release — updating dependencies, smoothing the update from Drupal 8, and removing deprecations.

When will Drupal 9 be available?

Drupal 9.0.0 is planned for release on June 3rd, 2020, the same day that Drupal 8.9.0 is released. However, the Drupal 9 branch is already open for development and you can download and test Drupal 9.0.0-alpha1!

The first tasks in Drupal 9 are updating to Symfony 4 and Twig 2, as well as other PHP and JavaScript dependencies. This process has already started, and we have already raised the minimum PHP requirement to PHP 7.2.

You can already enable testing of a Drupal 9-compatible module against the Drupal 9 branch and make sure all your tests pass with Symfony 4 and Twig 2.

Removal of deprecated code and backward compatibility layers is also in progress now that Drupal 9 is open. We will continue updating dependencies and removing deprecated code throughout the next few months.

Alphas will continue to come out regularly until we hit beta.

Isn’t this a short list? Surely, there’s more to it?

This is intentionally a short list, because we’ve spent years working out how to minimize the disruption of the Drupal 8 to 9 update. However, the question of ‘when will it be ready?’ should always be paired with ‘ready for whom?’. Drupal core developers? Contrib developers? New site builders? Organizations with an existing site on Drupal 8? Organizations on Drupal 6 or 7?

Drupal 9 becomes ready for core development when the branch is open; once it is, Drupal 9-specific patches can be committed.

Drupal 9 becomes ready for contributed module developers as soon as the branch begins to look like Drupal 9, i.e. once dependencies are updated and deprecated code starts to be removed. Contrib developers can already support multiple core versions using the new core_version_requirement key, but the way to test whether a module is really Drupal 9 compatible is to install it and try to use it on Drupal 9.
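
For many modules, this amounts to a single line in the module’s .info.yml file. A minimal sketch (the module name is invented for illustration):

```yaml
name: Example Module
type: module
core_version_requirement: ^8.8 || ^9
```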

Drupal 9 becomes ready for new site development once the contributed modules you need for that new site are compatible with Drupal 9. For many contributed modules this will be a one-line change, so you should be able to start developing new Drupal 9 sites once 9.0.0-beta1 is released, or earlier if you’re feeling adventurous and want to help get contributed modules ready. Drupal 9’s beta period will be longer than what we allow for minor releases, to enable plenty of advance testing and to flush out any unexpected issues with actual sites.

Once Drupal 9.0.0 is released, you should never start building a new site on Drupal 8 again.

Drupal 9 becomes ready for Drupal 8 upgrades once all of the modules you’re already using, and any custom code, are Drupal 9 compatible. Drupal 8 will be supported until November 2021. However, it’s a good idea to start work on this as soon as Drupal 9.0.0 is released, and you can prepare for that upgrade now by doing things like ensuring you’re up-to-date with Drupal 8 core and contributed updates. While June 2020 to November 2021 is a much shorter support window than previous major Drupal releases like 6.x and 7.x, it also means hundreds of thousands of sites making a smaller change all around the same time, which should improve reliability for everyone.

If you’re still on Drupal 6 or 7, you can migrate sites that don’t use content translation to Drupal 8 right now! There is no reason to wait for Drupal 9 to do this, since the subsequent update from Drupal 8 to Drupal 9 will be so small. Sites using content translation should keep an eye on this critical Drupal core issue to finalize the multilingual upgrade path for translated nodes, and help to test it if you can.

When will Drupal 9.0.0 actually be released though?

We’re aiming for Drupal 9.0.0 to be released on June 3rd, 2020. To hit this date, we’ll need to hit beta by March 2020. The full list of requirements for tagging beta is tracked in this issue.

There is always the possibility that we won’t have resolved every beta blocker by March. If we don’t have everything in place by then, there are two fallback dates: we’ll either start the beta in May, with a 9.0.0 release date of August/September 2020, or start the beta in September, with a release date of December 2nd, 2020.

The more people test the branch and Drupal 9-compatible modules before June 2020, the more confident everyone can be that it’s really ready.

Feb 11 2020
Feb 11

Exporting the contents of a Drupal 8 project is a recurring need, whether for analysis or for mass updates via a subsequent import. Drupal 8 offers several solutions, each with advantages and disadvantages in terms of the entity types that can be exported, the options for column headers and values, the access rights required, and the highly variable configuration options. Here we present a new module, Entity Export CSV, which seems able to handle many of these use cases.

Another new CSV export module?

We have many solutions for setting up a CSV export on a project. Let’s try to list some of them, without aiming to be exhaustive.

The Views Data Export module, as its name indicates, is based on the core Views module to configure and set up CSV exports. Using Views, we can configure an export for any Drupal 8 entity type, usually for a particular bundle. We then need to configure as many views as we have exports, and some limitations can appear when it comes to exporting multi-value fields. Setting up a CSV export with this module requires administrative rights to create and manage the views of a Drupal 8 project, along with some understanding of how Views works. It is therefore not really intended for end users.

The Content Export CSV module allows you to quickly export the nodes of a Drupal 8 site. Its configuration options are very limited, especially regarding the choice of exported fields and their values, and only nodes can be exported with this module. Conversely, it can be used directly by end users.

The Entity Content Export module offers many configuration options. It can export all Drupal 8 content entities, and the export of each entity can be configured based on the entity’s view modes, including field formatters that can be selected and configured for a given content export. However, it requires substantial initial configuration, with very high administrative access rights, for each entity bundle that you want to make exportable.

For the needs of a Drupal web factory project, each of these solutions could meet the requirements partially but not totally. In particular, it was impossible to know in advance which entity types or content types would need to be exportable, or how a content type would be configured and used (generic content types are customizable per web factory instance). Above all, end users had to be able to configure and run these CSV exports themselves, without any particular knowledge of Drupal or any access rights to the configuration of a Drupal 8 project.

Introduction of the Entity Export CSV module

The Entity Export CSV module was created to meet these challenges.

  • Be able to export any content entity from Drupal 8
  • Be able to select which fields are exportable for each entity and each bundle
  • Be able to configure how each field of an entity is exported
  • Be able to configure, field by field, the export behavior of multi-value fields (export all values in a single column with a separator, or export each value in a separate column)
  • Be easily customizable for the export of a particular field, with a specific business need
  • Be usable by an end user without special administrative rights on the configuration of a Drupal 8 project (Views, Entity View Mode, etc.).

In terms of architecture, to meet these challenges Entity Export CSV relies on:

  • A simple configuration page that allows you to select which content entities will be exportable from among the content entities present in a project and, if necessary, to limit them to one or more bundles, and also to limit (or not) the fields of these entity bundles that will be exportable.
  • A simple export page where, based on that initial configuration, users select the entity to export and then configure, for each field, whether it should be included in the export and, if so, how it should be exported.

Detailed presentation of how Entity Export CSV works

Once the module is installed, as indicated above, the first step is to configure the exportable entities, which will be accessible in the export page.

The configuration is quite simple. For each entity type available in the project, we select the ones we want to make exportable, and we can limit which bundles of each entity type will be exportable.

Entity Export CSV settings

Then, for each enabled entity type and bundle, we can also limit the fields that will be exportable, for example to avoid overloading the export page with an entity’s technical fields (uuid, revision, or any other field that does not contain content as such).

Entity Export CSV settings details

Then just go to the actual CSV export page.

Entity Export CSV export page

For each field, users can configure whether it is included in the export, how the column header is populated (human-readable field name or system name), the number of columns to use for multi-value fields, and the export format to use.

Entity Export CSV page export

Once the configuration is finished, the user can save it so that it does not have to be redone for the next export. Each configuration is saved per entity type, bundle, and user. A planned evolution is to support an unlimited number of export configurations per entity type and bundle, which could then be reused here as templates.

Entity Export CSV save export settings

Each field can be configured differently, both for the column header used in its export and for the exported value. To do this, the module has a FieldTypeExport plugin system that allows you to easily create configurable and/or specific field exports. For example, below, the "Modified" field is configured with the Timestamp export plugin, which exposes the date format as an option.

Entity Export CSV timestamp export

Another basic plugin provided by the module lets you configure how Entity Reference fields are exported: for example, whether to export the ID of the referenced entity or its label, and, in the case of a multi-value field, the number of columns to use to export the values.

Entity Export CSV entity reference export

Extension of the module Entity Export CSV

One of the challenges the module meets is being easily extensible and customizable. Rare is the Drupal 8 project in which every field fits into a generic box and none requires special treatment.

The Entity Export CSV module relies on a Plugin system to export all fields and can therefore be easily extended by a Drupal 8 developer to support any type of special case or fields created by contributed modules (for example, the module includes a Plugin for the fields of the Geolocation module and the Address module).

To create a field export plugin, add a FieldTypeExport plugin class in the src/Plugin/FieldTypeExport namespace of any Drupal 8 module.

The annotations of this plugin allow you to control certain behaviors of the Plugin and its availability. Let's look at these annotations with the example of the Geolocation plugin included in the module.

namespace Drupal\entity_export_csv\Plugin\FieldTypeExport;

use Drupal\Core\Field\FieldDefinitionInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\entity_export_csv\Plugin\FieldTypeExportBase;
use Drupal\Core\Field\FieldItemInterface;

/**
 * Defines a Geolocation field type export plugin.
 *
 * @FieldTypeExport(
 *   id = "geolocation_export",
 *   label = @Translation("Geolocation export"),
 *   description = @Translation("Geolocation export"),
 *   weight = 0,
 *   field_type = {
 *     "geolocation",
 *   },
 *   entity_type = {},
 *   bundle = {},
 *   field_name = {},
 *   exclusive = FALSE,
 * )
 */
class GeolocationExport extends FieldTypeExportBase {

  /**
   * {@inheritdoc}
   */
  public function getSummary() {
    return [
      'message' => [
        '#markup' => $this->t('Geolocation field type exporter.'),
      ],
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function buildConfigurationForm(array $form, FormStateInterface $form_state, FieldDefinitionInterface $field_definition) {
    $configuration = $this->getConfiguration();
    $build = parent::buildConfigurationForm($form, $form_state, $field_definition);
    return $build;
  }

  /**
   * {@inheritdoc}
   */
  public function massageExportPropertyValue(FieldItemInterface $field_item, $property_name, FieldDefinitionInterface $field_definition, $options = []) {
    // Stuff to format the field item value.
  }

  /**
   * {@inheritdoc}
   */
  protected function getFormatExportOptions(FieldDefinitionInterface $field_definition) {
    $options = parent::getFormatExportOptions($field_definition);
    return $options;
  }

}

The annotations of a FieldTypeExport plugin are:

  • weight: the weight of the plugin. The plugin with the highest weight will be the plugin selected by default if more than one plugin is available for a field.
  • field_type: the type of field to which the plugin applies. Multiple field types can be specified if necessary. This option is mandatory.
  • entity_type: it is possible here to limit the plugin to only certain entity types. Leave empty and the plugin will be available for the field type on any entity type.
  • bundle: it is possible here to limit the plugin to only certain entity bundles. Leave empty and the plugin will be available for the field type on any bundle.
  • field_name: here it is possible to limit the plugin to one particular field. Leave empty and the plugin will be available for the field type on all fields of that type.
  • exclusive: if set to TRUE, this option makes the plugin exclusive, i.e. all other plugins available for this field type will no longer be visible. Useful if you want to limit the options available for exporting a specific field. The default value is FALSE.

You can then override all the methods available on the Base Plugin in order to customize the export rendering of the field. In particular you can expose new configuration options, and of course implement the massageExportPropertyValue() method which is in charge of formatting the export of a field instance.

Your turn to export

The Entity Export CSV module provides access to advanced export features for end users who may not have extensive knowledge of Drupal or advanced configuration rights. It allows you to quickly provide export functions on any entity type, while remaining completely independent of a project’s configuration; as such, it can be integrated into any type of project, including web factory projects. The simplified interface it offers users does not come at the expense of more complex use cases, thanks to its plugin system, which lets somewhat specific business needs be handled at very little cost.

Need a CSV export? Think about Entity Export CSV!

Feb 11 2020
Feb 11

Recent and abundant evidence that ADA accessibility enhances SEO is broadening perspectives on WCAG compliance -- from a complicated and potentially costly requirement to an excellent opportunity that should be pursued ASAP.

Google as a Gatekeeper

Google has emerged as a gatekeeper within our digitally-driven business climate. If a site doesn’t grab Google’s attention, that means lots of lost traffic. A recent article in Search Engine Journal reported that sites which appear on the first page of a Google search receive, on average, 91.5 percent of the traffic share. 

This means that in the current climate, websites need to be accessible to the major search engines -- just as they need to be accessible to people with disabilities. Structuring a site to align with what Google looks for in determining listing order on a Search Engine Results Page (SERP) is a critical business imperative.

Accessibility and SEO in Alignment

Many of the factors that fuel Search Engine Optimization are also essential for ADA Web accessibility compliance. 

Here’s a short list of reasons why and how SEO and ADA web accessibility best practices converge to enhance both objectives:

Google gets it: Great UX equals greater accessibility.

When websites are designed with a high degree of empathy for those who visit the site -- including people with disabilities -- SEO follows. Many metrics pertaining to great user experience have an impact on a site’s search result ranking. Among them: how long it takes a site to load, straightforward navigation, quality content, mobile responsiveness, and internal linking.

Screen readers and search engines both rely on title tags.

Title tags are the first component of a web page that’s read by a screen reader. They’re also essential to a search engine’s ranking of a page and where it appears on the SERP. Even though title tags don’t appear on the page itself, the title tag appears as the heading of the SERP listing, and it’s essential -- for both accessibility and SEO -- that it includes keywords that accurately reflect the content on the page. From the standpoint of visually impaired screen reader users, it’s also important that every page on the site has a distinct title tag.

Search engines also scan alt text.

It’s tempting for content editors to slap in perfunctory alt text. Carefully describing an image to help a visually impaired person envision it takes time and thought, and descriptive alt text can make or break the user experience for someone who relies on a screen reader. At the same time, alt text that weaves in key search terms also serves to enhance SEO.
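
As a sketch, the contrast might look like this (file names and descriptions are invented for illustration):

```html
<!-- Perfunctory: tells screen readers and search engines almost nothing. -->
<img src="img_4032.jpg" alt="photo">

<!-- Descriptive: helps a screen reader user envision the image
     and weaves in relevant search terms. -->
<img src="golden-retriever-agility-training.jpg"
     alt="A golden retriever leaping through a hoop during agility training">
```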

Meaningful header hierarchies support WCAG and SEO.

Accessibility compliance requires that content follow a logical H1 to H6 header sequence and that headers accurately describe the content that follows. Adherence to a logical content structure serves all users, in particular those who have cognitive impairments or rely on a screen reader. From an SEO standpoint, breaking content up into meaningful pieces of information, with headers that incorporate key search terms, can also lead to the content appearing as a featured snippet on a SERP.

The Big Picture: Web Accessibility Fuels SEO

ADA Web accessibility compliance and SEO are two distinct areas of expertise. Fortunately, a sharp focus on one enhances the other. Viewed from another angle, when Google is treated as a distinct user for whom a site needs to be accessible, the result is significant progress toward achieving ADA accessibility.

Looking to continue the conversation on how web accessibility can improve search engine rankings? Contact us.

Feb 11 2020
Feb 11

Table of Contents

If you’ve touched a Drupal site at any point in the last ten years, it’s very likely you came into contact with Drush (a portmanteau of “Drupal shell”), the command-line interface (CLI) used by countless developers to work with Drupal without touching the administrative interface. Drush has a long and storied trajectory in the Drupal community. Though many other Drupal-associated projects have since been forgotten and relegated to the annals of Drupal history, Drush remains well-loved and leveraged by thousands of Drupal professionals. In fact, the newest and most powerful version of Drush, Drush 10, is being released jointly with Drupal 8.8.0.

As part of our ongoing Tag1 Team Talks at Tag1 Consulting, a fortnightly webinar and podcast series, yours truly (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) had the opportunity to sit down with Drush maintainer Moshe Weitzman (Senior Technical Architect at Tag1) as well as Tag1 Team Talks mainstays Fabian Franz (Senior Technical Architect and Performance Lead at Tag1) and Michael Meyers (Managing Director at Tag1) for a wide-ranging and insightful conversation about how far Drush has come and where it will go in the future. In this two-part blog post series, we delve into some of the highlights from that chat and discuss what you need to know and how best to prepare for the best version of Drush yet.

What is Drush?

The simplest way to describe Drush, beyond its technical definition as a command-line interface for Drupal, is as an accelerator for Drupal development. Drush speeds up many development functions that are required in order to take care of Drupal websites. For instance, with Drush, developers can enable and uninstall modules, install a Drupal website, block or delete a user, change passwords for existing users, and update Drupal’s site search index, among many others — all without having to enter Drupal’s administrative interface.
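
For instance, those tasks map onto one-line Drush commands (the module, user, and password values below are placeholders):

```shell
drush pm:enable example_module             # enable a module
drush pm:uninstall example_module          # uninstall a module
drush site:install standard                # install a Drupal site
drush user:block exampleuser               # block a user
drush user:password exampleuser "newpass"  # change a user's password
```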

Because Drush employs Drupal’s APIs directly to execute actions like creating new users or disabling themes, it performs far more quickly than a full Drupal page request: there is no need to traverse Drupal’s render pipeline and theme layer. In fact, Drush is also among the most compelling real-world examples of headless Drupal (a topic on which this author has written a book), because the purest definition of headless software is an application that lacks a graphical user interface (GUI). Drush fits that bill.

The origins and history of Drush

Though many of us in the Drupal community have long used Drush since our earliest days in the Drupal ecosystem and building Drupal sites, it’s likely that few of us intimately know the history of Drush and how it came to be in the first place. For a piece of our development workflows that many of us can’t imagine living without, it is remarkable how little many of us truly understand about Drush’s humble origins.

Drush has been part of the Drupal fabric now for eleven years, and during our most recent installment of Tag1 Team Talks, we asked Moshe for a Drush history lesson.

Drush’s origins and initial years

Though Moshe has maintained Drush for over a decade to “scratch his own itch,” Drush was created by Arto Bendiken, a Drupal contributor from early versions of the CMS, and had its tenth anniversary roughly a year ago. Originally, Drush was a module available on Drupal.org, just like all of the modules we install and uninstall on a regular basis. Users of the inaugural version of Drush would install the module on their site to use Drush’s features at the time.

The Drupal community at the time responded with a hugely favorable reception and granted Drush the popularity that it still sees today. Nonetheless, as Drush expanded its user base, its maintainers began to realize that they were unable to fully realize the long list of additional actions that Drupal users might want, including starting a web server to quickstart a Drupal site and one of the most notable features of Drush today: installing a Drupal site on the command line. Because Drush was architected as a Drupal module, this remained an elusive objective.

Drush 2: Interacting with a remote Drupal site

Drush 2 was the first version of Drush to realize the idea of interacting with a remote Drupal website, thanks to the contributions of Adrian Rousseau, another early developer working on Drush. Today, one of the most visible features of Drush is the ability to define site aliases to target different Drupal sites as well as different environments.

Rousseau also implemented back-end functionality that allowed users to rsync the /files directory or sql-sync the database on one Drupal installation to another. With Drush 2, users could also run the drush uli command to log in as the root user (user 1 in Drupal) on a remote Drupal site. These new features engendered a significant boost in available functionality in Drush, with a substantial back-end API that was robust and worked gracefully over SSH. It wasn’t until Drush 9 that much of this code was rewritten.
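In today's syntax, that remote workflow looks roughly like this (a sketch; @live and @local are hypothetical site aliases, and drush uli is the classic spelling of user:login):

```shell
# Copy the database from the live site to a local copy.
drush sql:sync @live @local

# Sync the files directory the same way.
drush rsync @live:%files @local:%files

# Generate a one-time login link for the remote site.
drush @live user:login
```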

Drush 3: From module to separate project

During the development of Drush 3, Drush’s maintainers made the decision to switch from Drush’s status as a module to a project external to Drupal in order to enable use cases where no Drupal site would be available. It was a fundamental shift in how Drush interacted with the Drupal ecosystem from there onwards, and key maintainers such as Greg Anderson, who still maintains Drush today seven versions later, were instrumental in implementing the new approach. By moving off of Drupal.org, Drush was able to offer site installation through the command line as well as a Drupal quickstart and a slew of other useful commands.

Drush 5: Output formatters

Another significant step in the history of Drush came with Drush 5, in which maintainer Greg Anderson implemented output formatters, which allow users to rewrite certain responses from Drush into other formats. For instance, the drush pm-list command returns a list of installed modules on a Drupal site, including the category in which they fit, formatted as a human-readable table.

Thanks to output formatters, however, the same command could be extended to generate the same table in JSON or YAML formats, which for the first time opened the door to executable scripts using Drush. During the DevOps revolution that overturned developer workflows soon afterwards, output formatters turned out to be a prescient decision, as they are particularly useful for continuous integration (CI) and wiring successive scripts together.
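For example (a sketch; these formatter options are as found in recent Drush versions):

```shell
# Human-readable table of modules (the default formatter).
drush pm:list

# The same data as JSON or YAML, ready for scripting.
drush pm:list --format=json
drush pm:list --format=yaml

# A single machine-readable column, handy in CI pipelines.
drush pm:list --status=enabled --field=name
```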

Drush 8: Early configuration support

Drush 8, the version of Drush released in preparation for use with Drupal 8 sites, was also a distinctly future-ready release due to its strong command-line support for the new configuration subsystem in Drupal 8. When Drupal 8 was released, core maintainer Alex Pott contributed configuration commands such as config-export, config-import, config-get, and config-set (with Moshe’s config-pull coming later), all of which became key commands for interacting with Drupal’s configuration.

Due to Drush 8’s early support for configuration in Drupal 8, Drush has been invaluable in realizing the potential of the configuration subsystem and is commonly utilized by innumerable developers to ensure shared configuration across Drupal environments. If you have pushed a Drupal 8 site from a development environment to a production environment, it is quite likely that there are Drush commands in the mix handling configuration synchronization.
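A typical deployment using those commands might look like this (a sketch; directory names and git remotes are illustrative):

```shell
# On development: export active configuration and commit it.
drush config-export -y
git add config/sync && git commit -m "Export config changes"

# On production: pull the code and import the same configuration.
git pull origin master
drush config-import -y
```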

Drush 9: A complete overhaul

About a year ago, Drush’s indefatigable maintainers opted to rewrite Drush from the ground up for the first time. Drush had not been substantially refactored since the early days in the Drush 3 era, when it was extracted out of the module ecosystem. In order to leverage the best of the Composer ecosystem, Drush’s maintainers rewrote it in a modular way with many Composer packages for users to leverage (under the consolidation organization on GitHub).

This also meant that Drush itself became smaller, because site-to-site communication was modularized into its own packages. Declaring commands in Drush also underwent a significant simplification from a developer-experience perspective. Whereas Drush commands had previously been written as procedural PHP functions, as was the case in Drush 8, developers could now write a Drush command as a PHP method, with the lines of Doxygen above it declaring the name, parameters, and other details of the command. The same release also made YAML the default format for configuration and site aliases in Drush, and began the move to Symfony Console as the runner of choice for commands.
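Such an annotated command might look like the following minimal sketch (the example module, namespace, and command name are hypothetical):

```php
<?php

namespace Drupal\example\Commands;

use Drush\Commands\DrushCommands;

/**
 * Drush 9+ command file for a hypothetical "example" module.
 */
class ExampleCommands extends DrushCommands {

  /**
   * Greets a user from the command line.
   *
   * @command example:greet
   * @param string $name
   *   The name to greet.
   * @option shout Uppercase the greeting.
   * @usage drush example:greet Moshe --shout
   */
  public function greet($name, array $options = ['shout' => FALSE]) {
    $greeting = "Hello, $name!";
    $this->output()->writeln($options['shout'] ? strtoupper($greeting) : $greeting);
  }

}
```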

Drush 9 introduced a diverse range of new commands, including config-split, which allows for different sets of modules to be installed and different sets of configuration to be in use on distinct Drupal environments (though as we will see shortly, it may no longer be necessary). Other conveniences that entered Drush included running commands from Drupal’s project root instead of the document root as well as the drush generate command, which allows developers to quickly scaffold plugins, services, modules, and other common directory structures required for modern Drupal sites. This latter scaffolding feature was borrowed from Drupal Console, which was the first to bring that feature to Drupal 8. Drush’s version leverages Drupal’s Code Generator to perform the scaffolding itself.


As you can see, Drush has had an extensive and winding history that portends an incredible future for the once-humble command-line interface. From a pet project and a personal itch scratcher to one of the best-recognized and most commonly leveraged projects in the Drupal ecosystem, Drush has a unique place in the pantheon of Drupal history. In this blog post, we covered Drush’s formative years and its origins, a story seldom told among open-source projects.

In the second part of this two-part blog post series, we’ll dive straight into Drush 10, inspecting what all the excitement is about when it comes to the most powerful and feature-rich version of Drush yet. In the process, we’ll identify some of the key differences between Drush and Drupal Console, the future of Drush and its roadmap, and whether Drush has a future in Drupal core (spoiler: maybe!). In the meantime, don’t forget to check out our Tag1 Team Talk on Drush 10 and the story behind Drupal’s very own CLI.

Special thanks to Fabian Franz, Michael Meyers, and Moshe Weitzman for their feedback during the writing process.

Photo by Jukan Tateisi on Unsplash

Feb 11 2020
Feb 11

1) Built-in support for multi-language sites and admin portals

Let's jump right in! For business owners, ecommerce eliminates many restrictions of traditional business practices. One opportunity is the ability to sell your product to overseas consumers, expanding your possible market to include virtually the whole world. Of course, one of the barriers to entry into certain markets may be language.

Imagine this: You are a Brazilian business owner who just invented chewing gum that never loses its flavour. Obviously, the demand for this product is worldwide. The only problem is that you do not feel comfortable writing the script for your new online product page in English, or any language other than Portuguese for that matter. In a perfect world, the ideal solution might be to hire translators for every language of each country that you want to sell this amazing gum in. However, the costs of such an endeavour are enough to make even those with the deepest of pockets think twice.

In my opinion, the next best and completely viable option is to develop your chewing gum site using Drupal, then make use of the many multilingual modules to automatically translate your content (just Google “Drupal automatic translate” for a list of options). The advantages of these Drupal translation modules are, first, that the language switcher can appear at the top of the page and is therefore easily accessible to the customer. Second, additional modules can automatically show users the site in their local language based on their browser’s language setting. Third, you can choose which blocks of text you want to translate and which you do not; so let us say for aesthetic reasons or brand awareness you do not want a certain block of the site translated, you simply do not enable translation for that block in the admin portal. Additionally, while your site frontend is being translated for your visitors, as an admin you can maintain Portuguese as the primary language of your backend admin portal.

Speaking from my own experience, I shop online for bicycle components quite often. The problem is that many of the niche manufacturers I am looking to buy from are based in Italy and Germany. Google Translate can do an adequate job of helping you navigate a site, but when it comes to the finer details like product specifications or return policies, I quickly find myself out of my depth. The great thing about using Drupal's translation tools is that you can manually enter the translation for each language of every block on your website. So, for example, instead of paying for a full site translation in each language, you could hire professionals to translate the important areas like the fine print and leave the less critical areas up to Drupal.

2) Features on features

Okay, Drupal is not exactly an episode of Pimp My Ride, but it can do pretty much anything you can dream of. Say, for some reason, you want to design a site that sources all of the types of chicken wings sold in restaurants across your city, then create a catalogue that breaks down the various chicken wings by texture, flavour, price, size, messiness, etc. Now you want to integrate a system that uses logic and intelligence to recommend the best beer your company sells to accompany any wing selection made. This is all possible with Drupal.

The cost to develop such a unique site with these custom modules on Drupal would not be cheap. However, the point remains that a feature such as the one mentioned above is quite crazy, but completely possible. If there is functionality that you need, it can be built on Drupal. The other big takeaway is that once you have paid for the development of the module you are now the owner and do not have to worry about any ongoing licensing costs. For reasons like these, it is my opinion that Drupal is the best CMS for such robust and custom site requirements.

3) Security

Of course, nothing can ever be fully secure especially without regular upkeep, but Drupal does a few things differently that should help you sleep better at night. Unlike the many popular SaaS platforms, Drupal is open source and non-proprietary. This means that you are the owner of your data and you are the one who decides how it is managed, meaning you can fine tune every aspect of your Drupal site from the site itself to your hosting environment. If you have a security team or security-focused partner that you work with, Drupal provides the flexibility they need to keep your data safe.

The official Drupal Security Team is also thoroughly on top of the security of the core Drupal software’s code and helps module developers keep their modules secure. This team frequently releases security patches that address any vulnerabilities that come up. In addition to the official Drupal team, the large Drupal community of developers donates time to develop and monitor Drupal’s code. Drupal and all of its modules are built using a core set of coding standards, so the many thousands of developers working with Drupal’s code ensure security issues are found and addressed quickly.

Lastly, one of the features of Drupal that is best known is its ability to integrate into third-party applications. As such, Drupal is also capable of easily integrating into other security systems and platforms on the market. You’re not restricted to Drupal alone.

4) Open source community

In my mind, there are two main reasons that the open-source nature of Drupal and the community that surrounds it are such an advantage.

First, because of the large community of developers and its open-sourced nature, there are countless plug-and-play ready modules available free of licensing fees just waiting to be added to your website. This means, in addition, you are the owner of your own code and data. Furthermore, you never have to worry about losing development support for your website. There will always be another Drupal agency out there waiting to pick up the pieces if something were to go wrong.

Second, because there is such a large community of developers behind the expansion of Drupal, you have a veritable fusion of diverse ideas and designs. Instead of a single organization pushing code in a certain direction, you can find incredibly creative and unique libraries of code. This means a deeper pool of free talent to pull from. Even with the creative minds driving the development of Drupal, there is still consistency in the underlying code. This enables easier upkeep of the code itself and allows a lower barrier of entry when onboarding new developers. The advantage to the end-user is that, when compared to a fully custom build, using Drupal means that should your partner agency ever go out of business or the relationship deteriorates, you will have other experts in Drupal to turn to.

5) Future-proof

I keep bringing this up, but it really enables so many possibilities: because Drupal is so open to API integrations, you can design Drupal to work as modular middleware behind the scenes. This means that as you acquire new technology and software, it really is as simple as plugging it in and configuring an API hook.

Furthermore, as long as Drupal is paired with the right server, it can handle endless amounts of traffic and scale from small business to enterprise. This is a reason why Drupal is such a popular CMS of choice for medium-sized to enterprise-level organizations.

Finally, Drupal as a CMS is kind of like Play-Doh. You can build out your frontend experience for the market you are presently targeting using Drupal’s built-in theming layer or by using one of the many other frontend frameworks. Drupal’s APIs allow it to run headless, so it can hold your backend data but you’re not tied down to any specific way of building your frontend. Ten years down the road, though, you may have a completely different set of needs for your frontend framework. No problem, you can rest assured that Drupal won't get in your way.

Are you considering options for your digital experience platform?

Choosing the right DXP now is important to your business now and in the future. Protect your tech investment by assessing the trade-offs of buy or build deployment options and how they relate to your digital experience goals and business outcomes. This Gartner report has been made available to our readers for a limited time and will help you get started. Check it out.

Click to access the Gartner report today

Feb 11 2020
Feb 11
Sun through a lense

"A picture is worth more than a thousand words". True, but a large picture will make your webpage slower, which will affect your SEO in a negative way. And eat away at your server's disk space, megabyte after megabyte.

There are several ways to remedy this, but one is to use an image compression service to save space. With online services or programs on your computer you can remove unnecessary information and compress images, sometimes saving up to 80%.

Here I'm going to show you how to integrate the TinyPNG service in your Drupal installation which automatically compresses your images.


There are many different services on the internet, but one of the best I have found is TinyPNG - and it's super easy to implement on your Drupal site.

It's also super easy to see if you can benefit from using their service. If you visit their Page Analyzer and enter your site URL, you will be presented with statistics. If your potential savings are over 25%, I would suggest you start using a compression service.

Statistics over how much your website can benefit from using an image compression service, in this case TinyPNG

Step 1: Installing the Drupal module

By using Composer to install the module and the TinyPNG library, it's super easy to get started. Run


composer require drupal/tinypng


in your terminal. Composer downloads the module, places it in the correct folder, and downloads its dependency - the TinyPNG PHP Library - placing it in the vendor folder.
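Under the hood, that library talks to the TinyPNG API directly; used standalone it looks roughly like this (a sketch following TinyPNG's documented PHP client, with a placeholder key and file names):

```php
<?php

require_once 'vendor/autoload.php';

// Placeholder key - use the API key you get in step 2.
\Tinify\setKey('YOUR_API_KEY');

// Upload the image, let TinyPNG compress it, and save the result.
$source = \Tinify\fromFile('images/original.png');
$source->toFile('images/compressed.png');

// How many of your 500 free monthly compressions are used.
echo \Tinify\compressionCount();
```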


Head into your Drupal website and click Extend in the menu. Scroll down (or filter) to TinyPNG and activate the module. 

Step 2: Getting an API key

API key? What's that? Well, to make TinyPNG accept the requests from your website, you need an API key. It's a way of saying "howdy, can I get some service". It's also a way for TinyPNG to track how many images you get compressed per month. Don't worry, you get 500 for free every month, so unless you upload more than that, you're in the clear.

If you send more than 500 requests, you won't get access to the service until next month - unless you pay for the service.

For normal use, 500 requests should be enough.

Getting an API key couldn't be simpler. Just visit the developer section of Tinypng.com and enter your name and email. 

You get an email with a link. Click it, and - boom! - you're in. On the page you can see your API key and also a counter that lets you know how many requests TinyPNG has processed using your unique API key.

Screenshot from TinyPNG's developer page with an API key

Step 3: Make the magic work in Drupal

Click the Configuration link in Drupal's menu and look under Media. There you find TinyPNG Settings. Click it.

Now it's time to copy the API code you got from the TinyPNG service. Paste it into the field on the settings page and hit Save configuration.

Screenshot from the settings page of TinyPNG inside Drupal

Step 4: Choose your compression method

The module facilitates two different kinds of image compression: on upload or via Drupal's own image styles - or both. I myself use the on-upload kind, since then I know that I won't reach the monthly API limit. If I used the image style version, I could reach - and pass - the limit quickly, since I manage a site with a lot of images. Sure, I don't need to enable the image action on every single image style I have in Drupal, but I sure would be tempted to do so.

If you choose to use the TinyPNG API when uploading, you get two options under Integration method: Download and Upload. They do the same thing; the only thing to remember is to use Upload on your local installation and Download on your live server. The help text says it all: "The download method requires that your site is hosted in a server accessible through the internet. The upload method is required on localhost." Personally, I think the names could be better. But anyway, it does the job.

Step 5: Save some megabytes

Well, actually there isn't a step 5. After installing the module with its dependencies and entering your API key, there isn't much more to do. Just sit back, relax, and watch the images shrink when uploading and/or showing them to the users, making their experience on your website faster and better.

Some numbers

Here is also a comparison before and after using TinyPNG.

Type           Before compression   After compression   Saved, %
Image 1, PNG   1.1 Mb               267 Kb              75%
Image 2, PNG   1.1 Mb               287 Kb              75%
Image 3, PNG   1.2 Mb               269 Kb              77%
Image 4, PNG   985.7 Kb             274.0 Kb            72%
Image 5, PNG   5.6 Mb               1.5 Mb              73%
Image 6, JPG   3.5 Mb               524 Kb              84%
Image 7, JPG   197 Kb               104 Kb              47%
Feb 11 2020
Feb 11

Our client Senec operates in a competitive environment and has to react quickly to changing privacy requirements. At the same time, the user experience cannot be harmed by regulations. At the beginning of the year, Senec asked us to build a two-step process for YouTube/Vimeo video playback.

Senec's website is built with Drupal and won the International Splash Awards 2019.

When loading a page that contains a video, a preview is displayed with a custom image and a play button, according to their design system.

Screenshot video on Senec website before privacy overlay

But when the visitor decides to click on the video playback button, an information message is displayed as shown in the image below.

Screenshot video on Senec website with privacy overlay warning about video content

It is only after the user clicks on 'Play video' that the request is made to the video provider to fetch the content and display it.

Displaying the video like this allows for both an engaging visual experience, and at the same time protects the user's privacy until consent has been explicitly granted.
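A two-step embed like this can be sketched in a few lines of JavaScript (illustrative class names and data attribute; not Senec's actual code):

```javascript
// Step 1: the markup ships only a preview image, a play button, and the
// video URL in a data attribute - no request to the provider yet.
document.querySelectorAll('.video-consent').forEach(function (wrapper) {
  wrapper.querySelector('.play-video').addEventListener('click', function () {
    // Step 2: only after explicit consent do we create the iframe,
    // which makes the browser contact YouTube/Vimeo.
    var iframe = document.createElement('iframe');
    iframe.src = wrapper.dataset.videoUrl;
    iframe.setAttribute('allow', 'autoplay; encrypted-media');
    iframe.setAttribute('allowfullscreen', '');
    wrapper.innerHTML = '';
    wrapper.appendChild(iframe);
  });
});
```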

Feb 11 2020
Feb 11

A Drupal 7 to 8 migration is anything but boring, because there are so many ways to perform one! Depending on the complexity of the project, we can choose the technique that suits it best. The one we are going to discuss in this blog is migrating content and configuration from Drupal 7 to Drupal 8 using a CSV import.

Drupal provides various modules for importing data from different sources like JSON, XML and CSV. The Drupal 8 core migration API system offers a whole suite of APIs that can essentially handle any kind of migration from a previous version of Drupal to Drupal 8.

Some prep work before the Drupal 7 to 8 migration

In order to migrate from Drupal 7 to Drupal 8 using CSV import, we will need these modules.

Drupal 7 Modules -

  • Views Data export: This module needs to be installed in our Drupal 7 site. The Views Data export module helps in exporting the data in CSV format.

  • Views Password Field: This module helps to migrate passwords which will send passwords in hashed format. 

Drupal 8 Modules -

  • Migrate – The Drupal 8 Migrate module helps in extracting data from various sources to Drupal 8.

  • Migrate Plus – This Drupal 8 module will help in manipulating the imported source data

  • Migrate Drupal – This module offers support in migrating content and configurations to Drupal 8.

  • Migrate source CSV – This module offers a source plugin that can migrate entities and content to Drupal 8 from .csv files.

  • Migrate Tools – This Drupal 8 module helps by offering UI tools/Drush commands to manage migrations.

  • Configuration Development Module – This module helps in importing configuration files to Drupal 8.

Let the Drupal 8 migration begin!

First, we need to create a custom module for our Drupal 8 migration. Let’s name this module test_migrate. As with any custom module, we then need to create the info.yml file.


Above screenshot shows keys that are required for info.yml.

Once the info.yml file is created, we need to create a migration group for the migration. This migration group needs to be created in the path: test_migrate > config > install. The name of the group file should be migrate_plus.migration_group.test_migration.yml.


Above screenshot shows the folder structure to create a migration group.

Inside the migrate_plus.migration_group.test_migration.yml file, we need to write id, label and description for the migration group which is shown in the screenshot below.


After creating the migration group, we need to install this group in our info.yml file. 


Now, we are going to write migration scripts for the users, taxonomy terms, paragraphs, and content types. Note that you should migrate in this order, since there are links between these entities. For example, content is created by a particular user, so we first need to migrate users, and after that the taxonomy terms and content types.

Now let’s write the yaml script for the user migration. For this we need a user yaml file named migrate_plus.migration.test_migration_users.yml; the migration script is shown below.


These are the keys required for the migration; the source here is the CSV file to be migrated. CSV files should be placed in the path assets > csv (e.g. user.csv). The users.csv file is also shown below.



Path - It indicates the path to the csv file.

header_row_count - The number of rows at the top of the file that make up the header.

Keys - The columns whose values are unique for every row.

Process - In this section we map csv columns to fields.


The above image shows the mapping between fields and csv columns. Here, name is the machine name of the user name field and title is the csv column title. If we have multiple values for a single field, we use delimiters. Users may have multiple roles; in that case, we write it as shown in the above image.

Images are migrated by writing a custom plugin. A custom plugin can be placed in the path src > Plugin > migrate > process. In the above picture you can see that user_image_import_process is a custom plugin written to migrate user images.


Inside UserImportProcess.php we write the function that will copy the image and save it to the destination. The script is shown in the image below.


In order to identify where images should be saved, we will write one more function, ImageImportProcess. In that function we will mention the machine name of the image.


In the user migration yaml file there is a destination section, which indicates where the migrated data is to be stored - in this case an entity. This is marked in the image below.


After creating the code for users, we need to write the yaml for taxonomy terms. Note that if your taxonomy has only the title field, you do not need to write a separate yaml file; if a taxonomy term has multiple fields, you do. For taxonomy terms we use tid as the key, since tid is unique for each term.


After this we will migrate paragraphs. For that we need to create a separate yaml file. The code to migrate is shown in the below image.


Lastly, let's migrate the content type. The yaml file for the content type is shown in the code below.


id: test_migration_content

label: 'Migrate Content type data from the csv file'

migration_group: test_migration

source:
  plugin: csv
  # Full path to the file.
  path: 'modules/custom/test_migrate/assets/csv/content.csv'
  header_row_count: 1
  keys:
    - nid

process:
  # Adding the mapping between the fields and the csv columns.
  title: title
  promote: promoted_to_front_page
  sticky: sticky
  field_display_name: display_name
  field_marketing_tagline: marketing_tagline
  # (The field name for this mapping was not shown in the original.)
  field_taxonomy:
    plugin: entity_lookup
    source: Taxonomy
    entity_type: taxonomy_term
    bundle_key: vid
    bundle: taxonomy
    value_key: name
  body/value: body
  body/format:
    plugin: default_value
    default_value: "full_html"
  # (The field name for this mapping was not shown in the original.)
  field_paragraphs:
    - plugin: explode
      delimiter: "|"
      source: fcid
    - plugin: skip_on_empty
      method: process
    - plugin: migration_lookup
      migration: test_migration_paragraphs
      no_stub: true
    - plugin: iterator
      process:
        target_id: '0'
        target_revision_id: '1'

destination:
  plugin: 'entity:node'
  default_bundle: content

migration_dependencies:
  required:
    - test_migration_paragraph
    - test_migration_taxonomy

dependencies: { }

After writing all the yaml files, the test_migrate.info.yml file will contain the installs below.


Once you finish all these steps, go to your site and install your custom module.


Next, go to your project in the terminal and run the “drush ms” command to check the migration status, as shown in the image below.


To migrate, use the command drush mim migration-id. You can see the migration ID in the above image.

Once done, if you check the migration status you can see the number of migrated items.


Now you can observe that all the content is migrated. If there is any error in the data, the migration will terminate at that particular row. Fix the issue with that content, and then you can restart the migration.

Things to Remember

  • If the migration is terminated mid-process, the status of the migration will display as “importing”. In order to change the status back to idle, run the command drush mrs migration-id. Next, run drush mim migration-id again.

  • If you want to roll back the migrated content, then run the command drush mr migration-id

  • If you have changed anything in the code after starting the migration process, then make sure you run the command drush cdi test_migration. This command re-imports the module's configuration so your changes are reflected when migrating. Once done, do a thorough check on your site to see that all the content is migrated.
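Putting those commands together, a typical run looks like this (a sketch; test_migration_users stands in for whichever migration ID drush ms reports for your group):

```shell
# Check the status of all migrations in the group.
drush ms

# Import one migration.
drush mim test_migration_users

# If a run was interrupted and is stuck on "importing":
drush mrs test_migration_users   # reset the status to idle
drush mim test_migration_users   # run it again

# Roll back if needed, and re-import config after code changes.
drush mr test_migration_users
drush cdi test_migration
```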

Feb 11 2020
Feb 11

Florida DrupalCamp 2020 is the event that celebrates open-source software and brings together a worldwide community of Drupal users, developers, marketers and content specialists in one spot. The brightest minds share their expertise, level up their skills, and make new friends in the community every year.

This year, OpenSense Labs is a Silver Sponsor of Florida DrupalCamp 2020! Held from 21-23 February 2020, the event will provide a platform where developers, designers, and marketers gather to explore the most ambitious and cutting-edge case studies.

Catch us here!

If you're going to be around during the camp, do not miss out on these sessions: 

Session 1: Centralised content distribution and syndication demystified. Why and how?

Saturday, February 22 | 2:15 pm - 3:00 pm


A central content repository allows the content editors to edit content directly from the backend of one site. Using the publisher site, organizations can publish, reuse, and syndicate content across a variety of subscriber sites and publishing channels.

The session will stress the importance of having centralized reporting to boost the editorial teams’ productivity and article publication pace.

By the end of the session, attendees will be able to take away the following:

  • Centralized Content Distribution Architecture.
  • Real-time content syndication by setting up publisher and subscriber sites.
  • Configuring content schema between publisher and subscriber sites.
  • Minimizing Failures during data transmission.
  • Choosing the right infrastructure for content distribution.

Session 2: Architecting a Highly Scalable, Voice-Enabled and Platform Agnostic Federated Search 

Sunday, February 23 | 9:30 am - 10:15 am

Vidhatanand will share how we built an enterprise search that goes beyond the traditional approach by combining the robust Apache Solr with Drupal 8, using JavaScript for portability across a diverse range of CMSs, thereby increasing efficiency by 40%.

He will walk you through the complex architecture of federated search and challenges amidst architecting a microservice. You will be equipped with the know-how of:

  • Enhancing the website search experience while retaining a blend of useful and accurate results.
  • Expanding inter-site searchability while decreasing bounce rate and latency.
  • Increasing data discovery and interoperability of information through cross-functional support for a plethora of platforms.

See you there!

We’re taking this great opportunity to be a part of Florida DrupalCamp 2020, and we can’t wait to connect with you about the amazing things our team has to offer. Come stop by and say hello, and get your hands on some cool Drupal swag!

When: 21-23rd February 2020 

Where: Florida Technical College, 12900 Challenger Parkway, Orlando, Florida 32826

Feb 11 2020
Feb 11

The Drupal Community Working Group (CWG) is pleased to announce that registration is now open for a full-day Mental Health First Aid workshop on Sunday, May 17, 2020 (the day before DrupalCon Minneapolis begins) in Bloomington, Minnesota. 

The workshop will be held "field trip" style; it will be held off-site, at the Health Counseling Services facility in Bloomington, Minnesota, from 8:30am-5pm. Transportation will be provided to and from a location near the Minneapolis Convention Center (the site of DrupalCon) to the workshop. Following the workshop, attendees are invited to (optionally) attend a pay-on-your-own group dinner to decompress and discuss the day's workshop.

The CWG believes that these types of proactive workshops will help improve our community's mental health literacy and awareness, as well as make it easier for us to have open, honest, and respectful conversations and to spot signs that community members may be in need of assistance.

The Drupal Association is generously sponsoring the workshop by providing funding to help defray its cost as well as providing transportation.

From the Mental Health First Aid website:

Mental Health First Aid is a course that gives people the skills to help someone who is developing a mental health problem or experiencing a mental health crisis. The evidence behind the program demonstrates that it does build mental health literacy, helping the public identify, understand, and respond to signs of mental illness.

Mental Health First Aiders learn a single 5-step action plan known as ALGEE, which includes assessing risk, respectfully listening to and supporting the individual in crisis, and identifying appropriate professional help and other support. Participants are also introduced to risk factors and warning signs for mental health or substance use problems, engage in experiential activities that build understanding of the impact of illness on individuals and families, and learn about evidence-supported treatment and self-help strategies.

Over the past few years, the CWG has organized proactive community health events, including on-going Code of Conduct contact training, as well as previous DrupalCon North America trainings on leadership, teamwork, and communications. 

In order for the workshop to proceed, we need at least ten community members to register by April 1, 2020 at https://healthcounselingservices.com/events/adult-mental-health-first-aid-11/

When registering:

  • Choose the "Pay now" option (do not select the "Bill my organization" option).
  • Use the coupon code: MHFA30 to receive $30 off the regular price.
  • For the "Name of organization", "Name of site", "Supervisor's name", and "Supervisor's phone" fields, feel free to use "not applicable".
Feb 10 2020
Feb 10


To perform A/B testing, segmentation, and personalization of a webform, a site builder needs to create a variant of the form that can be triggered based on certain contexts - which can be as simple as a custom URL.

A variant is a form or version of something that differs in some respect from other forms of the same thing or from a standard.

-- https://www.lexico.com/en/definition/variant

A webform variant might alter a form's labels, descriptions, and even its confirmation page. A webform variant could be used to create an A/B test to confirm if a tweak or improvement to a form's user experience increases the rate at which the user completes a form. A basic A/B test would randomly load two variants, allow a defined number of users to complete the form, and then review the results to determine which variant had the highest completion rate. The most successful variant can then be permanently applied to the webform.

A webform variant can also be used to create a personalized webform based on a user's demographics. For example, webform's available inputs, labels, and even options could be altered based on a user's gender, age, locale, employer, etc. Even subtle tweaks can improve a form's user experience - for example, removing inappropriate disease options or inputs based on a user's gender can simplify an appointment request form.


Right now, the only out-of-the-box solution is to create multiple instances of a webform and route users to the appropriate one. The biggest issue with having multiple webforms is that they collect two separate datasets. Ideally, all submission data should go into the same results table so that only the user experience changes. You can also use conditional logic to hide/show elements and disable/enable certain behaviors.

Both approaches have limitations and lack some organization. For A/B testing, it is possible to alter a form via JavaScript. Still, this approach is limited to front-end tweaks - for example, you can't change an element's server-side conditional logic using client-side JavaScript.

The best solution is to provide an administrative user interface for defining and managing webform variants.


Variant definition (element)


Site builders need to define a variant type using an element. Once a variant element is added to a webform, a dedicated "Variants" tab is then displayed in the form’s management tabs. The "Variants" tab is used to create and manage variants of a webform.

Storing the variant definition in a variant element makes it easy to track and report submissions by variant. By default, the variant element is not displayed on the form or submission. Also, by default, a variant element is allowed to be prepopulated using a query string parameter. Using a query string parameter to set the variant makes it very easy to track completion rates by examining page views by URL. A variant can also be set using the element's default value or a webform field's default data.

When prepopulation is enabled, a site builder can choose to enable A/B testing by checking the randomize property. When randomize is checked, visitors are randomly redirected via JavaScript to an enabled variant.
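As a rough sketch, a variant element supporting prepopulation and randomization might be defined in a webform's YAML elements as follows. The element key and property names here are illustrative assumptions modeled on standard Webform element syntax, not confirmed details:

```yaml
# Hypothetical variant element definition (key and properties are illustrative).
letter:
  '#type': variant
  '#title': 'A/B test letter'
  # Allow a query string parameter (e.g. ?letter=a) to select the variant.
  '#prepopulate': true
  # Randomly redirect visitors to an enabled variant for A/B testing.
  '#randomize': true
```

With prepopulation enabled, a URL such as example.com/form/donate?letter=a would load one variant, making completion rates easy to track by page views per URL.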

Variant instances (plugin)


Once a variant element is placed on a form, variant instances need to be added to the webform. Variant plugins work very similarly to webform handlers. A variant plugin can alter any aspect of the webform.

The default webform variant plugin that ships with the Webform module is called the 'Override' plugin. This plugin allows site builders to alter elements, settings, and behaviors using YAML.

Altering settings

Using the 'Override' plugin, site builders can alter a webform's form, submission, and confirmation settings and behaviors. For example, a variant can change a webform's confirmation type and message. It is also possible to set up variant-specific submission limits.

Altering elements

Altering a webform's elements makes it possible to change an element's type, label, validation, and even conditional logic. Elements can also be hidden or displayed in a variant. In YAML, a site builder enters the element name and the element properties that need to be altered. For example, using a variant, a select menu can be changed to radio buttons.
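As an illustrative sketch (the exact configuration keys of the 'Override' plugin are assumptions), an Override variant that swaps a select menu for radio buttons and tweaks the confirmation message might look like:

```yaml
# Hypothetical 'Override' variant configuration (keys are illustrative).
settings:
  confirmation_message: 'Thanks for registering! (variant B)'
elements:
  contact_method:
    '#type': radios   # was 'select' in the master webform
    '#title': 'How should we contact you?'
```

Only the properties listed are overridden; everything else is inherited from the master webform, which is what keeps all submissions in one results table.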

Altering handlers

Altering a webform's handler configuration is mostly applicable to email handlers because a variant can alter an email's subject, message, and recipients.

Custom Variant Plugins


The 'Override' variant plugin is very flexible and makes it very easy to create A/B tests. For webform segmentation, where multiple similar variants are needed, a developer might want to implement a custom variant plugin, which provides a dedicated configuration form and custom business logic to apply the variant.

Managing variants

Variants are very similar to Handlers, with the sole purpose of variants being to alter a webform. Variants are managed using the same user interface as Handlers with the addition of "View", "Test", and "Apply" operations. The "View" and "Test" operations allow site builders to review the variant's changes and test that submission handling is working as expected.

Applying variants


The "Apply" operation allows a site builder to apply a webform variant to the master webform. As a variant is applied, the selected variant or all variants can be deleted. The “Apply” operation is used to finalize an A/B test by applying the winner.

Webform nodes


Since variants are defined as element types, they can be populated using a webform field's default data. When a webform has variants, the “References” tab, which tracks all webform nodes, will now display the variant information for each webform node. An “Add reference” button is placed at the top of the page; it opens a dialog where site builders can select the variant type when creating a webform node.

Placing variant instances in individual webform nodes makes it easy to create segmented dedicated webforms that still route data to the same submission table. For example, an event node can use a shared registration form while using variants to change the form based on the event type.

The concept and use case for variants is relatively complex, and it helps to see variants in action. There is a Webform Example Variant module, which includes an example of an A/B test and an example of a segmented webform. The “Example: Variant: Segments” webform demonstrates how a webform can leverage multiple variants to create a long and a short form for multiple organizations using a custom WebformVariants plugin.

The screencast below walks through what webform variants are and how they can be used to create A/B tests and segmentation.

What is next?

Experiment, Experiment, Experiment

The single word that describes why I felt it was essential to add variant support to the Webform module is "Experimentation." As a community, being continuously open to new ideas and experimentation is what keeps Drupal moving forward.

Variants allow sites to experiment with and improve their existing webforms using A/B testing. Variants open up the possibility of creating segmented and targeted webforms for specific user audiences. As with every aspect of Drupal and Webform, variants are extendable, which opens up the possibility that external personalization services can be connected to the backend of a webform to create personalized and individualized webform experiences.

I look forward to seeing what people can accomplish using variants. And at the very least, we can perform A/B tests and build more awesome webforms.

Who sponsored this feature?

Memorial Sloan Kettering Cancer Center (MSKCC) has been my primary client for the past 20 years. Without MSKCC's commitment to Drupal and their early adoption of Drupal 8, I would most likely not be maintaining the Webform module for Drupal 8. Most of my work on the Webform module is done during my free time. Occasionally, MSKCC will need a Webform-related enhancement that can be done using billable hours. In the case of adding support for variants, MSKCC, like most healthcare systems, needs webforms that target multiple and segmented audiences. Being able to create variants makes it easier for MSKCC to manage and track segmented forms and submissions.

I am very fortunate to have an ongoing relationship with an institution like MSKCC. MSKCC appreciates the value that Drupal provides, and the work that I am doing within the Drupal community.

Backing the Webform module

Open Collective is providing us, Drupal, and Open Source, with a platform to experiment and improve Open Source sustainability. If you appreciate and value what you are getting from the Webform module, please consider becoming a backer of the Webform module's Open Collective.



Feb 10 2020
Feb 10

Growing the community is the implicit goal of every Drupal meetup and event I attend. It's a constant topic of conversation at Drupal event organizing meetings, agency roundtables, and panels about recruitment and selling Drupal. Last year, I created a presentation for DrupalCamp Atlanta called "Growing the Drupal Community". Since then, it's been my hallway track conversation of choice, and everyone I talk to seems onboard with the goal of growing Drupal. As part of my role on the Drupal Association board, I'm chairing the Community & Governance Committee. We've been having lots of conversations about facilitating community growth, and I wanted to share some of what I've been thinking.

Our Target Audiences

By definition, if we want to grow Drupal, that means talking to people outside the Drupal-sphere. So who would we be targeting?  

  • Decision makers selecting a technology (Marketing/Communications and IT)
  • Developers and technologists curious about Drupal
  • Drupal users who aren't active in the community
  • Users who inherit a Drupal project
  • Agencies who are using Drupal for the first time
  • People looking to switch careers  

These are who I think of when I think of growing the community. It's important to remember that we're not just talking to developers or decision makers, but people from a wide range of backgrounds. The Drupal community is made up of designers, project managers, developers, translators, content and accessibility experts, and folks with other roles or who do Drupal as one of their many responsibilities.

One Step Closer to Engagement

Growing the Drupal community means bringing our audiences one step closer to participating in the community. That could mean different things for different people depending on what type of user they are and where they're at in their "Drupal Journey." Here are some tasks early on in this journey that we should make easier:

Try it Out

  • Install Drupal
  • Try out a demo
  • Watch a video about how Drupal works

First Contact

  • Attend a first Drupal event
  • Attend Global Training Days 
  • Make an account on Drupal.org and/or Drupal Slack
  • Talk to another Drupal user in the community
  • Join a Drupal user group on meetup.com 

Stay Informed

  • Join a mailing list to learn more about Drupal
  • Read a case study or download promo material
  • Watch a video from a Drupal event
  • Search for help on Drupal.org or Drupal StackExchange  

Later in the journey, we hope to take users beyond feeling like "Newbies." We want them to use Drupal successfully, become members of the Drupal Association, make contributions, and become Drupal ambassadors. But arguably, the steps above are more important for growing the community.

What does this mean for Drupal.org?

Drupal.org is the home of the Drupal project and it should help move users further along their journey to being part of the community. It's a big ask. Drupal.org is also a place for the existing community to communicate and collaborate, and it's a complex website with a lot of moving pieces.   

That being said, here are some key places we could focus on to build community engagement:  

  • Community page: At DrupalCon Amsterdam, I conducted a UX feedback session and collected some feedback about the Community page. One audience member said "I feel like this is structured in a way that people who are very familiar with the community would think about it, rather than from the point of view of someone who is new to the community." I think repositioning this page for newcomers and focusing on local events (camps, meetups, and local training days), joining the Drupal Slack, local associations, and getting started using Drupal would be a big improvement.
  • Groups.drupal.org is still a useful community organizing tool for some topics and groups, but many of its features have effectively been replaced by meetup.com, confusing many new users who stumble across abandoned groups on the website. When a user stumbles across a group, clearly pointing them to the place where they can find upcoming events and the most relevant content would be really helpful.
  • The Evaluator Guide is a valuable tool for developers trying out Drupal for the first time. I think adding in an evaluator guide for different audiences (especially decision makers) is essential to creating a smooth and welcoming onboarding experience.

How You Can Help

  • Spread the success stories of Drupal in your local communities and networks, especially to those outside the Drupal community. Post those stories on LinkedIn, attend events outside the Drupal-sphere, and look for ways to promote Drupal in outlets where non-Drupal folks hang out.
  • Volunteer with the Promote Drupal initiative 
  • Be active in your local Drupal community
  • Welcome newcomers on Slack, Drupal.org, and at the Drupal events you attend 
  • Look for opportunities to hire and train those outside the Drupal community  

Let me know your thoughts and what you think of the ideas above. I'd love to start a conversation.

Feb 10 2020
Feb 10

Drupal, one of the most prominent WCMSs in the world, delivers its services to giants like Al Jazeera, URRWA, and Georgetown University. With an impressive network of committed developers, it is a robust, flexible, scalable, and highly secure WCMS for small and big businesses alike. Migrating to Drupal 8 can be a compelling move for enterprises looking to enhance their workflow efficiency; however, the move must be undertaken with utmost care.

Further, a continuous concern that looms while migrating is whether the site's SEO and content value will remain intact.

SEO plays a crucial role in building your brands’ credibility digitally, alongside a huge impact on traffic, leads, conversions, and sales (ROI). And failure to protect SEO assets may likely result in a reduction of all the above-mentioned factors to the new website.

Therefore, enterprises planning to carry out the migration to Drupal 8 must not only pin down the right solution in advance but also put content auditing into effect and strategize around safeguarding SEO value as a priority.

Also, before you begin shifting to a newer Drupal version, make sure all the modules are updated. This, in turn, will contribute to a smooth Drupal installation process.

  1. Perform a complete site audit

In order to know how your current site is doing, it is imperative to perform a complete audit of your website. An in-depth site analysis should include the evaluation of every component on the website and how they appear to search engines, thereby providing you an overview to help you maintain and improve search engine rankings essential to your business.

The Google Analytics tool is a good choice for evaluating the site, as it detects pivotal issues on your top pages and can help you prevent traffic loss.

Further, you can also set up a priority list for fixing your pages and get actionable information on how to fix issues.

The following elements are evaluated-

  1. Page titles
  2. URLs
  3. Headings on Pages
  4. Internal Links
  5. Site Map
  6. Navigation on your website
  7. Structure of your website
  8. Paging
  9. Meta tags
  10. Robots.txt
  11. Optimization of graphic element
  12. Page loading times


There can be more components also apart from the above-mentioned ones, which can be included for some possible changes on the website.

During this step, you should also get your page score via the Google Lighthouse tool. The score ranges from 0 (lowest) to 100 (highest) and depends on page performance, accessibility, best practices followed, and SEO tactics. This site score will be checked again post-migration to understand the changes.

After the analysis, an audit report is prepared that encompasses an extensive list of recommendations on how to optimize your site effectively, depending on the scope and structure of the website.

Download your pre and post-migration SEO checklist now!

2.   Keep every stakeholder and decision-maker on the same page and agree upon goals for the newly migrated website

After the site audit, all stakeholders and decision-makers must be informed about the state of the current website. Based on that, set the marketing goals you wish to achieve with the migration to Drupal’s newer version. These could be leads, traffic, rankings, sales, etc.

Besides, all sites experience a slump in traffic post-migration. Therefore, companies should align their strategies mindfully to recover traffic in the first two-three months itself. They can also set up pay-per-click campaigns to keep the lead numbers up during the migration process.

3.   Review content: keep, improve or redirect

Creating effective and relevant content takes both time and effort; hence, enterprises must meticulously preserve and refine their content pieces during the Drupal 8 migration process.

Also, enterprises should not make the impulsive decision of removing something just because it’s old. Old does not imply bad; teams should keep a keen eye out for ways in which content can be updated, because even old content can be valuable and contribute significantly to SEO value.

So, filter the content and trash only what is obsolete, while uncovering and retaining everything that is important for user engagement and conversion.

Teams can use a reliable content audit tool like SEMrush or Google Analytics to track all URLs and content and learn about their usefulness as well as their limitations. The content audit exercise will also help you achieve your long-term business objectives, as it will help in:

  • Identifying pages that need editing/copywriting
  • Accessing a variety of metrics like visits, conversions, PA, risk score, etc.
  • Identifying pages that need improvement/updates
  • Uncovering content marketing opportunities

4.    Focus on the technical SEO of the new website - handy Drupal SEO modules

Now that you’ve crafted valuable content on the foundation of solid keyword research, it’s important to make sure it’s not only readable by humans, but by search engines too!

Technical SEO has no connection with content; rather, it ensures that your website is optimized for the crawling and indexing phase. It helps search engines access, crawl, interpret, and index your website without any problems.

Here are the Drupal modules that contribute towards SEO- 

A.  Metatag

The Metatag module allows you to set up Drupal 8 to provide title tags and structured metadata (meta tags) on each page of your site.

It gives you complete control over your HTML title tags and also creates meta tags for your website. Meta tags are snippets of text that tell search engines and social media what each page on your website is about and how you want it described in search results.
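For example, a sketch of site-wide Metatag defaults using Drupal's token system might look like this. The exact configuration structure is an assumption; tokens such as [site:name] and [node:summary] come from Drupal's standard token system:

```yaml
# Hypothetical Metatag defaults (config sketch; structure is illustrative).
tags:
  title: '[current-page:title] | [site:name]'
  description: '[node:summary]'
  canonical_url: '[current-page:url]'
```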

B.   Pathauto

The Pathauto module generates URLs for your website content without needing to enter the path alias manually. Simply put, if the title of your new blog post is “My Big Cat”, then Pathauto will set the URL to yourDrupal8site.dev/my-big-cat instead of yourDrupal8site.dev/node/87.

Keeping the URL neat and relevant, with the right words in it, is great for SEO, which makes this module a must for your project.

If you don’t use the Pathauto module, you will have to create every single content URL on your website manually.
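A Pathauto pattern pairs a token-based path with an entity type. As a sketch (the id and label here are illustrative), a pattern generating blog URLs like /blog/my-big-cat might look like:

```yaml
# Sketch of a Pathauto pattern config (ids and labels are illustrative).
id: blog_posts
label: 'Blog posts'
type: 'canonical_entities:node'
# [node:title] is transliterated and cleaned into the alias automatically.
pattern: 'blog/[node:title]'
```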

C.   Google Analytics

The Google Analytics module empowers marketing teams to trace the footprints and general behavior of users as they interact with the landing pages and content on the website. It adds the Google Analytics code snippet to the website and allows you to control how and when it is used.

Additionally, it provides insights into your visitors, including demographics, where they found you online, what keywords they used to find you, and a lot more. It can also exclude in-house employees, who might visit the website frequently, from being counted as visitors and unique sessions.

D.    Real-time SEO for Drupal

You can improve your Drupal SEO by optimizing content around keywords in a fast, natural, and non-spam way.

The Real-time SEO module works best in combination with the Metatag module. It also checks whether your posts are long enough and whether you have used subheadings within them, to make sure that your content is approachable by both search engines and users.

E.    Node Noindex

This module lets the administrator set the HTML robots metatag to Noindex for a specific node. As a result, it will ask search engine robots to not index the node, preventing it from appearing in search engine results. 

This module comes in handy for not indexing pages that are:

  1. Unimportant
  2. Transitory
  3. Contain personal or sensitive data

Note: The same functionality is not achieved by disallowing a node in robots.txt.

F.    Smart Paging

Smart Paging lets you split long Drupal content pages into subpages based on the number of characters or words, or by a placeholder HTML tag, for node, user, and taxonomy term entities.

The URLs for subpages are SEO-friendly and work even for aliased URL paths.

G.   Query Parameters to URLs

This module enables you to rewrite URL query parameters into Clean URL components on specified paths.

The URL path quickly gets unwieldy if multiple filters and filter values are used, and because the path contains query parameters, it can impact SEO results.


H.   SEO Preview

Shows a preview of how your title tag, URL, and meta description will appear on popular search engines (to users with permission). It also provides best-practice recommendations and warnings for titles and descriptions.

I.    Dynamic Internal linking

It is always considered a best practice to link your valuable pages within the website to establish site structure and redirect link juice to valuable landing pages.

This module does exactly that! You can map your high-ranking keywords to links to other pages on the site; the first occurrence of the keyword in the content is dynamically replaced with a link.

J.   HitTail

This module provides integration with the keyword tracking tool HitTail.com.

It helps you target the most promising search terms based on your existing traffic in real-time with the help of an algorithm that is tuned by analyzing over 1.2 billion keywords.

It analyzes your visitor stream in real time and provides you with a simple, actionable list of precise keywords that you can use to target your audience and grow your organic search traffic.

K.  Stop Broken Link in Body

Stop Broken Link in Body is a great module for editors and content publishers, letting them check the validity of embedded links before publishing. If the content has broken links in the body field, it won’t be published.

It determines broken links by visiting the destinations and evaluating the HTTP response code.

L.  Search 404

Instead of showing a standard “404 Page not found” error, this module performs a search on the keywords in the URL. For instance, if a user visits http://example.com/does/not/exist, the module will search for "does not exist" and present results instead of the 404 page.

This approach helps in retaining visitors coming in from old URLs linked from other sites or search indices.

5.  Evaluate your landing and transaction pages responsible for conversions

The ultimate goal of improving traffic on websites with SEO and user-friendly content is to generate leads and revenue. To execute the same, you need to keep a tab on your landing & transaction pages and the conversion rates of each one of them on your site. 

Also, figure out how you can improve these rankings; for instance, check your calls to action (CTAs), contact-us forms, shopping carts, and so on. Make sure you don’t ask for too much form-filling or information in the first step, or annoyed users will leave your site.

Executing these techniques will certainly help sales grow. Further, you can also download your pre- and post-migration SEO checklist to avoid missing out on any important step.

6.   Let search engines know about your site

When you are migrating a website and planning to change the domain name in any way, it is imperative to let Google know what you are up to.

Google provides a mechanism in its search console to inform them that a site is going to move to a new address. However, if all the URLs remain intact, then you need not tell Google.

Google Analytics is another key place where you will need to make some adjustments. The GA or Google Tag Manager module for Drupal will install the JavaScript tracking code on every page of your new site. You can also run a crawler to find out whether the code is showing up.

In addition, there are some Drupal modules you can implement to help search engines crawl your website smoothly:

  1. Redirect

The Redirect module ensures that visitors are redirected from old URLs to new URLs whenever a content piece is moved to another section of your site or its URL is inadvertently changed. It creates 301 redirects to make sure that any URL that ranks in Google resolves when a visitor arrives. If you don’t install this module, you will have to manually look for URLs that have changed and fix them.

This module highlights the power of Drupal, automating what used to be a strenuous and constant SEO chore.

2.   Robots txt

This module is useful when you run multiple Drupal sites from a single code base and need a different robots.txt file for each one. It generates the file dynamically and enables you to edit it, on a per-site basis, from the web UI.

Note: It is mandatory to delete or rename the robots.txt file in the root of your Drupal installation for this module to display its robots.txt file(s).

3.   Schema.org Metatag

It maintains structured data tags that you can add to your HTML code to improve the way search engines read and represent your pages on SERPs.

4.    XML Sitemap

The Drupal XML Sitemap module helps SEO by giving your website a sitemap of its own, making it searchable and crawlable by search engines. This helps search engines understand the hierarchy of your website and crawl it in a tree-like manner; otherwise, they could miss pages or even whole sections because of the site's size or complex structure.

The module is also highly flexible, as it allows you to include or exclude certain pages from your website's sitemap. This means pages you are no longer using don't need to get indexed.

Further, you can set up cron, too. Cron is a built-in system on the server responsible for running maintenance tasks regularly. Drupal's cron keeps your site spic and span: it checks that content is indexed and rebuilds XML sitemaps when required.

5.  Image Sitemaps

It's up to you whether you use a separate sitemap for listing images or add the required info to an existing sitemap; the whole purpose is to let Google (or any other search engine) easily discover the images on your site for effective crawling and indexing, and hence be able to showcase them in image search results.
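As a rough illustration of what an image sitemap entry looks like, here is a small generator sketch (the URLs are placeholders, not from the article):

```python
from xml.sax.saxutils import escape

def image_sitemap_entry(page_url: str, image_urls: list) -> str:
    """Build a minimal <url> block in the sitemap image-extension format."""
    images = "\n".join(
        f"    <image:image><image:loc>{escape(u)}</image:loc></image:image>"
        for u in image_urls
    )
    return (
        "  <url>\n"
        f"    <loc>{escape(page_url)}</loc>\n"
        f"{images}\n"
        "  </url>"
    )

entry = image_sitemap_entry("https://example.com/page", ["https://example.com/a.jpg"])
print(entry)
```

The full file would wrap these entries in a `<urlset>` element declaring the `xmlns:image` namespace.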

7.  Measure and monitor your new website pre and post-migration

Analyzing your site after deploying the solution, or in this case the migration, will give you a better picture of what worked and what didn't. It is also wise to keep your site under observation for at least three months post-migration so you can fix the errors and bugs discovered now and then.

Final Words

It’s not over yet! In fact, SEO has just begun post-migration. It is an ongoing process that enterprises need to pay attention to for enhancing their digital presence and boosting growth.  

Drupal 8 is a dynamic open-source software platform for site-building. Organizations that have outgrown their existing non-Drupal CMS should consider migrating to Drupal 8. However, migrating has its unique challenges, including data migration, and SEO retention. 

Srijan is an award-winning Drupal web solutions provider that can assist you in navigating through the process of migrating to Drupal efficiently, without compromising on data and SEO. Contact us now!

Feb 10 2020
Feb 10

We debut a new podcast format, and talk with Leslie Glynn about the Aaron Winborn Award and Mauricio Dinarte about Drupal 8 migrations.

URLs mentioned

DrupalEasy News



Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Feb 09 2020
Feb 09

By default, the field chosen for the description metatag(s) is not trimmed by Metatag. Most of the time it's therefore way too long for SEO. Here is a way to add 2 custom tokens that take care of that for nodes and terms.

Let's start by declaring our new tokens in our custom module (named gd_global here):


/**
 * Implements hook_token_info().
 */
function gd_global_token_info() {
  return [
    'tokens' => [
      'node' => [
        'metatag_description' => [
          'name' => t('Node description for Metatag'),
          'description' => t('The node body trimmed down to 197 characters with ellipsis, for Metatag.'),
        ],
      ],
      'term' => [
        'metatag_description' => [
          'name' => t('Term description for Metatag'),
          'description' => t('The term description trimmed down to 197 characters with ellipsis, for Metatag.'),
        ],
      ],
    ],
  ];
}

Then let's provide the replacements:

/**
 * Implements hook_tokens().
 */
function gd_global_tokens($type, $tokens, array $data, array $options, BubbleableMetadata $bubbleable_metadata) {
  $replacements = [];
  if (in_array($type, ['node', 'term'])) {
    foreach ($tokens as $name => $original) {
      switch ($name) {
        case 'metatag_description':
          if (isset($data['node']) && $data['node'] instanceof Node) {
            $replacements[$original] = \Drupal::service('gd_global.manager')->getMetatagDescription($data['node']->body->value);
          }
          if (isset($data['term']) && $data['term'] instanceof Term) {
            $replacements[$original] = \Drupal::service('gd_global.manager')->getMetatagDescription($data['term']->getDescription());
          }
          break;
      }
    }
  }
  return $replacements;
}

Of course, we must add these at the top of the file:

use Drupal\Core\Render\BubbleableMetadata;
use Drupal\node\Entity\Node;
use Drupal\taxonomy\Entity\Term;

We are not using BubbleableMetadata here, so it could be removed, but I'm used to adding all the possible hook parameters, which is why its use statement is also at the top of the file.

Next: to prepare the Metatag description, I'm using a custom service. It might be a bit overkill, but I always try to implement things in an extendable way. So here is our service:


Class: Drupal\gd_global\GdGlobalManager

<?php

namespace Drupal\gd_global;

use Drupal\Component\Utility\Unicode;

/**
 * GD global manager.
 */
class GdGlobalManager {

  /**
   * Gets the Metatag description.
   */
  public function getMetatagDescription($string = '') {
    $string = strip_tags($string);
    $string = str_replace("\r", '', $string);
    $string = str_replace("\n", ' ', $string);
    $string = str_replace("\t", ' ', $string);
    $string = trim(preg_replace('/ {2,}/', ' ', $string));
    $string = Unicode::truncate($string, 197, TRUE, TRUE);
    return $string;
  }

}


This can be used as a temporary workaround to Automatically trim meta tag lengths (D8).

Feb 09 2020
Feb 09

Headless and decoupled architectures continued to gain popularity in 2019, driven by the number of channels that need to be supported and the increase in JavaScript frameworks being used to build more engaging digital experiences.

Lately, as an evolution of microservices architecture, we are also seeing a trend in micro frontends, which brings the microservices concept to the frontend. With micro frontends, instead of building a Single Page Application (SPA), you structure your application vertically, with functionalities being grouped together from the backend (microservices) all the way to the frontend.

Here at 1xINTERNET we have adopted React as the JavaScript framework to build more dynamic experiences. React is a modern JavaScript framework used to build fast, high-performing user interfaces. It was originally created by Facebook and it has been released under an open-source license. While React allows us to create experiences that customers love using component-based and flexible JavaScript that our frontend developers love to write, Drupal’s API-first approach gives marketers the editorial experience they need.

Most of the websites we built in 2019 use React in the frontend in some way or other. Where needed, we expose functionality as microservices with Drupal and consume and display it with micro frontends built with React. For us this has become a best practice, and we will continue to build web-based solutions like this. We have recently published a separate article about this topic: Headless applications with Drupal and React in 2020.

Feb 09 2020
Feb 09

With performance being a continuous bottleneck for enterprises trying to deliver an exceptional digital experience to customers and please search engines, this blog should be a sight for sore eyes.

Earlier, we covered tips and tricks for beginners to help them optimize site performance. Today, we take this forward with some intermediate-level techniques for optimal results. The many ways to tune web performance for your Drupal website are listed below:


1.  Theme optimization

Theme optimization is an essential technique for enhancing performance. When developers create new themes, they often override only the required templates, such as the home page layout or node page, and override CSS for the rest of the pages, which adds a lot of unwanted layers to the HTML.

Follow the steps mentioned below to optimize the theme for your Drupal website-

  1. Remove blank spaces and comments manually from .tpl
  2. Ensure there is no indentation in .tpl
  3. Turn on CSS & JS aggregation in the performance page
  4. Remove duplicate files and combine similar ones to reduce the CSS file size
  5. Move codes to functions in a custom common module. Use functions for similar problems instead of coding separately.


2.  Drupal external caching:



There are a few methods in Drupal that can manage the CMS's interaction with an external cache. This can be done using contributed modules such as:

A.   Advanced CSS/ JSS Aggregation

Known as AdvAgg, this module improves the frontend performance of your site. You can also compare before-and-after results using Google's PageSpeed Insights tool.

B.    Memcache

With Memcache, you can move cache bins into RAM, thereby speeding up the cache and giving MySQL room to breathe.

C.     Redis

Redis is an open-source (BSD licensed), in-memory data structure store that can be used as a smart cache with a proper eviction policy. Implemented this way, it lets the application access the content stored within it quickly and can improve the cache hit ratio substantially.
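The eviction behavior such a cache relies on can be illustrated with a tiny LRU store. This is an analogy sketch of the policy (similar in spirit to Redis's allkeys-lru), not Redis itself:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache illustrating an eviction policy."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # Mark as recently used.
        return self.store[key]

    def set(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # Evict the least recently used entry.

cache = LRUCache(2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")         # "a" is now the most recently used.
cache.set("c", 3)      # Capacity exceeded: evicts "b".
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```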

Know more about the clear cache tag module and how it helps in optimizing the website performance.

3.  Devel module:  

Devel is an amalgamation of modules encompassing helper functions, admin pages, and additional development support. You can use its Drush commands during development to evaluate query execution time or the number of times a function was executed on a particular page.

4.  Sprite image: 

A sprite is a single 2-D image composed of many small images, each located at known X and Y coordinates. To display one of the sub-images, you use the CSS background-position property to point at the exact position of the image to be displayed.
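As an illustration, the CSS rule for one icon in a sprite sheet can be generated like this (the selector, coordinates, and file name are made up):

```python
def sprite_css(selector: str, x: int, y: int, sprite_url: str = "sprite.png") -> str:
    """Emit a CSS rule that shows the sub-image at (x, y) in the sprite sheet."""
    return (
        f"{selector} {{\n"
        f"  background-image: url('{sprite_url}');\n"
        f"  background-position: -{x}px -{y}px;\n"
        f"}}"
    )

print(sprite_css(".icon-search", 32, 0))
```

The negative offsets shift the sprite sheet so only the desired icon appears within the element's box.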

5.  Lazy-load images

Lazy loading is a shrewd technique that displays content only when it becomes visible to users as they scroll down the screen. It comes in handy for sites that contain a lot of images and don't want to waste bandwidth by loading the whole page every time a user visits.

Thus, images load only as the user scrolls.
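Modern browsers support this natively via the loading="lazy" attribute on image tags. Here's a rough sketch of a markup filter that adds it (a simplification for illustration, not any module's actual code):

```python
import re

def add_lazy_loading(html: str) -> str:
    """Add loading="lazy" to <img> tags that don't already declare a loading attribute."""
    def rewrite(match):
        tag = match.group(0)
        if "loading=" in tag:
            return tag  # Respect an explicit loading attribute.
        if tag.endswith("/>"):
            return tag[:-2].rstrip() + ' loading="lazy" />'
        return tag[:-1] + ' loading="lazy">'
    return re.sub(r"<img\b[^>]*>", rewrite, html)

print(add_lazy_loading('<img src="a.jpg">'))
# <img src="a.jpg" loading="lazy">
```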

6.  Implementing AMP standard to provide lightning-fast page loading on mobile devices

AMP refers to Accelerated Mobile Pages. It ensures the optimization of web pages for faster loading on mobile devices by providing content through lightweight pages. This certainly aligns with Google’s motive of making the web a more accessible and enjoyable place for users of mobile devices.

AMP is an open-source framework that lets you build stripped-down, lightweight versions of your main pages by eliminating the speed-taxing elements that impact load time.

So, whenever an AMP alternative of a standard web page is available, a link to the AMP version is placed on the page via an HTML tag, and that version is what gets displayed to mobile users.

This way, the implementation of AMP can boost the loading speed of your web pages, and hence, improve the end-user experience. 

7.  Accelerating your 404 responses with the Fast 404 module


The average site with an average module load consumes around 60-100MB of memory on the server to deliver a 404. The Drupal Fast 404 module provides a common method for handling both missing image and file 404 errors, using less than 1MB of memory, depending on the method you choose: aggressive or super-aggressive.

8.  S3 File System

S3 File System (s3fs) provides an additional file system for your Drupal site, stored in Amazon's Simple Storage Service (S3) or any other S3-compatible storage service. It's up to you whether to use it as the default file system or only for individual fields. This is beneficial for sites that distribute their load across multiple servers, as the mechanism used by Drupal's default file systems is not sustainable under such a configuration.

9.  Content delivery network 

Content Delivery Network module facilitates easy integration for Drupal sites. It modifies file URLs so that files (for example, CSS, JS, images, fonts, videos, etc.) are downloaded from a CDN instead of your web server.

However, only origin-pull CDNs are supported. You just need to replace the existing URL with another domain name, thereby allowing the CDN to automatically pull the files from your server (the origin).

10.  Pick Nginx over Apache 

Apache and Nginx are the two most common open-source web servers; however, the latter is considerably faster and consumes less memory than the former.

Nginx has been designed to resolve the C10K problem - the most common problem that web servers (like Apache) face in supporting a large number of simultaneous connections, i.e, more than 10k connections at once.

Thus, Nginx comes as a quick fix for performance issues. You can opt for it without changing your actual application code; it will integrate seamlessly with your Drupal setup.

11.  Leverage browser’s cache for images and files

Leveraging your browser's cache means specifying how long web browsers should retain images, CSS, and JS locally. That way, the user's browser downloads less data while browsing your site, improving site performance. Below are examples for different web servers:

  • For the Nginx web server:

location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
  expires 365d;
}

location ~* \.(pdf)$ {
  expires 30d;
}


  • For Apache web server :

It should be added to your .htaccess file.


<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpg "access 1 year"
  ExpiresByType image/jpeg "access 1 year"
  ExpiresByType image/gif "access 1 year"
  ExpiresByType image/png "access 1 year"
  ExpiresByType text/css "access 1 month"
  ExpiresByType text/html "access 1 month"
  ExpiresByType application/pdf "access 1 month"
  ExpiresByType text/x-javascript "access 1 month"
  ExpiresByType application/x-shockwave-flash "access 1 month"
  ExpiresByType image/x-icon "access 1 year"
  ExpiresDefault "access 1 month"
</IfModule>



  • Cache-Control

# 1 month for most static assets
<filesMatch ".(css|jpg|jpeg|png|gif|js|ico)$">
  Header set Cache-Control "max-age=2592000, public"
</filesMatch>


  • gzip compression

mod_gzip_on Yes
mod_gzip_dechunk Yes
mod_gzip_item_include file .(html?|txt|css|js|php|pl)$
mod_gzip_item_include handler ^cgi-script$
mod_gzip_item_include mime ^text/.*
mod_gzip_item_include mime ^application/x-javascript.*
mod_gzip_item_exclude mime ^image/.*
mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*

12.  Meet Drupal coding standards

Coding standards specify the set of rules for programmers to ensure best practices like consistent formatting, indentation, and many more rules.

Drupal's coding standards are written in English so that anyone working with Drupal teams around the globe, which is common today, can follow them; shared standards eliminate communication differences.

Further, many programmers come to Drupal from other programming languages, each with its own syntax and style. A precise set of standards to refer to helps keep the codebase consistent.

These standards, further, specify the guidelines on how to style and format your code precisely, especially concerned with its appearance and readability ease. This includes things like indentation, whitespace, and line length. 

By ensuring that all code abides by the given guidelines, Drupal maintains consistency and integrity across all its projects, which also makes documentation easier. The API module parses the specially formatted comments and tags in the source code to produce detailed documentation.

13.  Ensuring code security is a MUST

Security is the foremost concern that any developer must take care of, whether writing a PHP snippet or an entire module.

The following basic rules, followed properly, will help you avoid security breaches:

  • Apply check functions to output to prevent cross-site scripting attacks.
  • Never place user-submitted content directly, as-is, into HTML.
  • Take advantage of the database abstraction layer to avoid SQL injection attacks.
  • Use db_rewrite_sql() to respect node access restrictions.

14.  DB query optimization in codes

Performance tuning can be a challenging task, especially when you tend to work with large-scale data. Even the smallest change can have a significant (positive or negative) impact on performance.

Query optimization is when a developer or the database engine rewrites a query in such a way that the database server can return the same results more efficiently.

For more ease, you can follow these steps:

1.  Join DB queries whenever possible

2.  Use the core API for any DB updates and insertions

3.  Follow Drupal coding standards

15.  DB table optimization

DB table optimization in Drupal refers to optimizing the administrator-selected tables in the database and displaying their sizes. You can enable notifications about tables that need analysis, maintenance, or repair operations.

With it, you can prevent table crashes during regular cron.php executions.

16.  Table indexing 

A database index is a data structure that speeds up operations on a table. You can create indexes using one or more columns, enabling rapid random lookups and efficient ordered access to records.
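The effect of an index can be demonstrated with SQLite; the table and column names here are illustrative, not Drupal's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (nid INTEGER PRIMARY KEY, title TEXT, created INTEGER)")
conn.executemany(
    "INSERT INTO node (title, created) VALUES (?, ?)",
    [(f"Post {i}", i) for i in range(1000)],
)

# Index the column used for lookups and ordering.
conn.execute("CREATE INDEX idx_node_created ON node (created)")

# The query planner now uses the index instead of scanning the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT title FROM node WHERE created = 500"
).fetchall()
print(plan)
```

The query plan output names the index, confirming the lookup no longer requires a full table scan.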

17.  Use the defer attribute on external JS files to load the page faster

The defer attribute tells the browser to load the script in the background while it continues working on the page; the script then runs once the page has been parsed.

The advantage here is that scripts with the Defer attribute never let the browser block the page. 

E.g. js/admin_toolbar.js: { attributes: { defer: true } }

18.  Upload compressed images for better performance results

Using compressed or smaller-sized images helps save bandwidth, which both networks and browsers appreciate.

Also, before starting the modification of images, ensure that you have chosen the best file type. There are several types of file that you can use:

  1. PNG - creates higher quality images, but the downside is its large file size. Though it was created as a lossless image format, it can also be lossy.
  2. JPEG - uses lossy and lossless optimization. You can adjust the quality level for a good balance of quality and file size.
  3. GIF - utilizes 256 colors only, making it suitable for animated images. It only uses lossless compression. 

Wrapping Up

With all the given methods, from implementing a CDN to caching, lazy-loading images, fixing 404s, and aggregating CSS/JS files, you can improve your Drupal-powered website's performance extensively.

Feb 08 2020
Feb 08

Why Acquia Dev Desktop 2?

So if we could get Acquia Dev Desktop working as expected, it would be a good alternative for Drupal devs who use macOS.

Clean up

Remove export PATH="$PATH:/Applications/DevDesktop/tools" from ~/.profile to avoid conflicts; meanwhile, add aliases to ~/.profile for easy switching between PHP environments:

  1. alias p71="export PHP_ID=php7_1; export PATH=/Applications/DevDesktop/php7_1_x64/bin:/Applications/DevDesktop/mysql/bin:/Applications/DevDesktop/drush_9:$PATH"

  2. alias p72="export PHP_ID=php7_2; export PATH=/Applications/DevDesktop/php7_2_x64/bin:/Applications/DevDesktop/mysql/bin:/Applications/DevDesktop/drush_9:$PATH"

  3. alias p73="export PHP_ID=php7_3; export PATH=/Applications/DevDesktop/php7_3_x64/bin:/Applications/DevDesktop/mysql/bin:/Applications/DevDesktop/drush_9:$PATH"

Install extensions

Tools required:

  1. # required to build extensions

  2. $ brew install autoconf pkg-config

PECL extensions

Just follow Add a PHP extension to Acquia Dev Desktop.

Non-PECL/built-in extensions

As far as I know (as of Feb 8, 2020), there is no resource on the Internet covering how to install non-PECL and built-in extensions for Acquia Dev Desktop.


  1. $ brew install krb5 openssl@1.1 imap-uw

  2. ### Activate PHP 7.3

  3. $ p73

  4. ### Download

  5. $ wget https://www.php.net/distributions/php-7.3.39.tar.gz && tar xzf php-7.3.39.tar.gz

  6. $ cd php-7.3.39/ext/imap

  7. $ phpize

  8. $ ./configure --with-kerberos=/usr/local/opt/krb5 --with-imap-ssl=/usr/local/opt/openssl\@1.1 --with-imap=/usr/local/opt/imap-uw

  9. $ make

  10. $ make install


  1. $ brew install openldap

  2. ### Activate PHP 7.3

  3. $ p73

  4. ### Download

  5. $ wget https://www.php.net/distributions/php-7.3.39.tar.gz && tar xzf php-7.3.39.tar.gz

  6. $ cd php-7.3.39/ext/ldap

  7. $ phpize

  8. $ ./configure --with-ldap=/usr/local/opt/openldap

  9. $ make

  10. $ make install


Append two lines to settings.php

  1. $databases['default']['default']['charset'] = 'utf8mb4';

  2. $databases['default']['default']['collation'] = 'utf8mb4_general_ci';

Where the database settings are:

  1. # On macOS

  2. $ open ${HOME}/.acquia/DevDesktop/DrupalSettings

Remove all tools under the following folders except drush, and reinstall by running composer install, especially if you already have phpcs configured.

  • /Applications/DevDesktop/drush_9/drush
  • /Applications/DevDesktop/tools

Change drush version

  1. $ cd /Applications/DevDesktop/tools

  2. # edit composer.json, change the version

  3. $ composer install -vvv


Feb 08 2020
Feb 08

Since joining Planet Drupal in 2016, I hadn't published serious new blog posts for a long time. Last year, my personal site was rebuilt with Drupal 8 again, and a few months ago I found my feed had been removed from Planet Drupal. Recently, I wanted to join it again, and the first task is to produce a valid RSS feed. That's the background.

Suggestions from How to get your feed to the Planet

If your site uses Drupal, you can create a "Drupal Planet" taxonomy term and use it to tag any content you want to appear on the Planet. Drupal automatically creates an RSS feed for each taxonomy term, and you can submit that term’s feed URL in your Drupal Planet application. Alternatively, you can use the Drupal Planet feature module, which uses Flag and Views modules to create the feed. If you are not using Drupal, you will need to make sure your site can create a feed just for your Planet content.

Using the Drupal Planet feature module is inapplicable, as it has no Drupal 8 release.

Previously, my feed had the term "Drupal Planet" exactly, and I did submit http://ranqiangjun.com/taxonomy/term/1/feed. This time I want to avoid using the term and instead create a content type Planet Drupal and a view Planet Drupal to provide an RSS feed.

Content type: Planet Drupal

Create a content type called Planet Drupal (machine name: planet_drupal) with a body field. Nothing fancy here.

View: Planet Drupal

Add a view Planet Drupal (machine name: planet_drupal) with a page display and a feed display.

For the page display, you're free to make it whatever you want; let's focus on the feed display.

Feed display

  1. TITLE

  2. Title: Planet Drupal


  4. Format: RSS Feed | Settings

  5. Show: Fields | Settings


  7. Content: Title

  8. Content: Authored by [hidden]

  9. Content: Authored on [hidden]

  10. Content: Body [hidden]

  11. Content: Link to Content [hidden]

  12. Content: UUID [hidden]

  13. ...


  15. Path: /planet.xml

  16. Attach to: Page

For the fields added

  • Title: remember to uncheck Link to the Content.
  • Authored by: Formatter: Label, and uncheck Link label to the referenced entity.
  • Authored on: Date format: Custom, Custom date format: r. This is important: it must be an RFC 2822 formatted date, and r is the proper format character.
  • Body: Formatter: Summary or Trimmed, Trimmed limit: 600 characters.
  • Link to Content: check Output the URL as text; Use absolute link (begins with "http://") can be checked or unchecked, both are OK.
  • UUID: a UUID is unique, so it's used as the GUID in the feed. By default there is no formatter for the UUID field type, but the uuid_extra module provides one, so install it and choose Formatter: UUID.
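To see the shape that r produces, here is the same RFC 2822 format generated with Python's standard library for comparison:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

# PHP's `r` format and RFC 2822 both look like: Sun, 09 Feb 2020 12:30:00 +0000
dt = datetime(2020, 2, 9, 12, 30, 0, tzinfo=timezone.utc)
stamp = format_datetime(dt)
print(stamp)  # Sun, 09 Feb 2020 12:30:00 +0000

# A valid stamp round-trips through the RFC 2822 parser.
assert parsedate_to_datetime(stamp) == dt
```

Feed validators reject other date shapes, which is why the custom format character matters here.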

Next mapping the fields to feed fields

  • Title field <-- Content: Title
  • Link field <-- Content: Link to Content
  • Description field <-- Content: Body
  • Creator field <-- Content: Authored by
  • Publication date field <-- Content: Authored on
  • GUID field <-- Content: UUID, and uncheck GUID is permalink

So far so good; let's check whether our feed is valid. Go to the W3C Feed Validation Service and submit it.

Bingo, it's valid, but it complains.

Let's fix it.

Checking https://dri.es/taxonomy/term/1/feed, which is a valid feed, shows it contains only one namespace, xmlns:dc="http://purl.org/dc/elements/1.1/", so let's keep only that one. Fixing "Missing atom:link with rel='self'" requires adding one namespace and inserting an atom:link into the channel section of the feed.

Solution: if you haven't already done so, declare the Atom namespace at the top of your feed. Then insert an atom:link to your feed in the channel section. Below is an example to get you started; be sure to replace the value of the href attribute with the URL of your feed.

So our solution is to alter the namespaces inside an implementation of template_preprocess_views_view_rss, and to add a new variable called feed_url as the href of the atom:link tag being inserted. Let's assume we have a theme called planet.

Inside the planet.theme file, add the implementation of hook_preprocess_views_view_rss (remember to add use Drupal\Core\Template\Attribute; at the top of the file):

/**
 * Implements hook_preprocess_HOOK().
 */
function planet_preprocess_views_view_rss(&$variables) {
  /** @var \Drupal\views\ViewExecutable $view */
  $view = $variables['view'];
  if ($view->current_display === 'feed_1' && $view->id() === 'planet_chinese') {
    // Provide a new variable for the custom template.
    $display = $view->getDisplay();
    $variables['feed_url'] = $display->getUrl()->setAbsolute()->toString();
    // Alter namespaces: keep only xmlns:dc, then add the Atom namespace.
    $style = $view->getStyle();
    $style->namespaces = array_filter($style->namespaces, function ($key) {
      return $key === 'xmlns:dc';
    }, ARRAY_FILTER_USE_KEY);
    $style->namespaces['xmlns:atom'] = 'http://www.w3.org/2005/Atom';
    $variables['namespaces'] = new Attribute($style->namespaces);
  }
}

Copy the views-view-rss.html.twig from core/modules/views/templates into the planet theme's templates folder and insert the atom:link tag inside channel.

...
<channel>
{% if feed_url %}<atom:link href="{{ feed_url }}" rel="self" type="application/rss+xml" />{% endif %}
...


Feb 08 2020
Feb 08

Since joining Planet Drupal in 2016, no seriously new blog posts for a long time. Last year, my personal site was rebuilt with Drupal 8 again. And a few months ago, I found my feed was removed from Planet Drupal. Recently, I wanted to join in it again. The first task is to make a valid RSS. This is the background.

Suggestions from How to get your feed to the Planet

If your site uses Drupal, you can create a "Drupal Planet" taxonomy term and use it to tag any content you want to appear on the Planet. Drupal automatically creates an RSS feed for each taxonomy term, and you can submit that term’s feed URL in your Drupal Planet application. Alternatively, you can use the Drupal Planet feature module, which uses Flag and Views modules to create the feed. If you are not using Drupal, you will need to make sure your site can create a feed just for your Planet content.

Using Planet content is inapplicable as it has no Drupal 8 release.

Previously, my feed had the term "Drupal Planet" exactly, and I did submit http://ranqiangjun.com/taxonomy/term/1/feed. This time I want to avoid using the term, instead, creating a content type Planet Drupal and a View Planet Drupal to provide an RSS feed,

Content type: Planet Drupal

Creating content type called Planet Drupal (machine name: planet_drupal) with a body filed. nothing fancy here.

View: Planet Drupal

Add a view Planet Drupal (machine name: planet_drupal) with a page display and a feed display

For the page display, it's free to make it whatever you want it to be, let's focus on the feed display

Feed display

  1. TITLE

  2. Title:Planet Drupal


  4. Format:RSS Feed | Settings

  5. Show:Fields | Settings


  7. Content: Title

  8. Content: Authored by [hidden]

  9. Content: Authored on [hidden]

  10. Content: Body [hidden]

  11. Content: Link to Content [hidden]

  12. Content: UUID [hidden]

  13. ...


  15. Path:/planet.xml

  16. Attach to:Page

For the fields added:

  • Title: remember to uncheck Link to the Content.
  • Authored by: Formatter: Label, with Link label to the referenced entity unchecked.
  • Authored on: Date format: Custom, Custom date format: r. This part is important: the date must be in RFC 2822 format, and r is the proper format character for it.
  • Body: Formatter: Summary or Trimmed, Trimmed limit: 600 characters.
  • Link to Content: check Output the URL as text; Use absolute link (begins with "http://") can be either checked or unchecked.
  • UUID: since a UUID is unique, it is used as the GUID in the feed. By default there is no formatter for the UUID field type, but a module called uuid_extra provides one, so install it and choose Formatter: UUID.

Next, map the fields to the feed fields:

  • Title field -> Content: Title
  • Link field -> Content: Link to Content
  • Description field -> Content: Body
  • Creator field -> Content: Authored by
  • Publication date field -> Content: Authored on
  • GUID field -> Content: UUID, and uncheck GUID is permalink
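With this mapping in place, each node should be rendered in the feed roughly as follows (an illustrative sketch; the title, URL, author, date, and UUID are made-up values):

```xml
<item>
  <title>My first post</title>
  <link>http://example.com/node/1</link>
  <description>The trimmed body summary goes here...</description>
  <dc:creator>admin</dc:creator>
  <pubDate>Thu, 13 Feb 2020 10:00:00 +0000</pubDate>
  <guid isPermaLink="false">f6f2e2a4-0000-0000-0000-000000000000</guid>
</item>
```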

So far so good. Let's check whether our feed is valid: go to the W3C Feed Validation Service and submit the feed URL.

Bingo, it's valid, but the validator still has some complaints.

Let's fix them.

Checking https://dri.es/taxonomy/term/1/feed, which is a valid feed, shows that it contains only one namespace, xmlns:dc="http://purl.org/dc/elements/1.1/", so let's keep only that one. Fixing the "Missing atom:link with rel='self'" warning requires adding one more namespace and inserting an atom:link element into the channel section of the feed.

The validator's suggested solution: if you haven't already done so, declare the Atom namespace at the top of your feed, then insert an atom:link element pointing to your feed in the channel section, replacing the value of the href attribute with the URL of your own feed.
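The markup the validator suggests looks like the following (the href value is a placeholder; replace it with your own feed URL):

```xml
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <atom:link href="http://example.com/planet.xml" rel="self" type="application/rss+xml" />
    ...
  </channel>
</rss>
```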

So our solution is to alter the namespaces inside the implementation of template_preprocess_views_view_rss, and to add a new variable called feed_url to use as the href of the inserted atom:link tag. Let's assume we have a theme called planet.

Inside the planet.theme file, add an implementation of hook_preprocess_views_view_rss():

use Drupal\Core\Template\Attribute;

/**
 * Implements hook_preprocess_views_view_rss().
 */
function planet_preprocess_views_view_rss(&$variables) {
  /** @var \Drupal\views\ViewExecutable $view */
  $view = $variables['view'];
  if ($view->id() === 'planet_drupal') {
    $style = $view->style_plugin;
    // Keep only the dc namespace.
    $style->namespaces = array_filter($style->namespaces, function ($key) {
      return $key === 'xmlns:dc';
    }, ARRAY_FILTER_USE_KEY);

    /** @var \Drupal\views\Plugin\views\display\Feed $display */
    $display = $view->getDisplay('feed_1');
    if ($display !== NULL) {
      $variables['feed_url'] = $display->getUrl()->setAbsolute()->toString();
    }

    // Add the missing Atom namespace.
    $style->namespaces['xmlns:atom'] = 'http://www.w3.org/2005/Atom';
    $variables['namespaces'] = new Attribute($style->namespaces);
  }
}

Copy the views-view-rss.html.twig from core/modules/views/templates into the planet theme's templates folder and insert the atom:link tag inside channel.

...
<channel>
  {% if feed_url %}<atom:link href="{{ feed_url }}" rel="self" type="application/rss+xml" />{% endif %}
  ...

I will put it all together and publish a module, probably on GitHub or as the Drupal 8 port of drupal_planet on d.o.


  • The "Missing atom:link with rel='self'" warning did not make sense to me at first, since I am creating an RSS feed, not an Atom feed; as far as I understand, they are two different formats. Anyway, let's just follow what the Validation Service suggests.

  • An issue reported to Drupal core: Let StringFormatter support UUID field type explicitly

Feb 08 2020
Feb 08

New year, new possibilities, as we say in Norway. Which is why I have relaunched my blog using Gatsby.js. I could write a blog post about that, but I am not going to do that today. There are a lot of tutorials on how to set it up (one of my personal favorites is this one from Lullabot), and there is even an official page in the Gatsby.js documentation.

I could probably write many blog posts about different aspects I tweaked and looked at in the migration, but one area I feel is not often talked about is geography and performance.

With regards to servers, many Drupal sites (at least basic ones) are probably geographically limited by the actual server that is supposed to serve the web requests, and the location of this particular server. With a static site, made for example with Gatsby.js, you can deploy it to a Content Delivery Network (CDN) and have the same static html files on servers all around the world. This could mean that a website visitor from Tokyo to your static site could get a response from a server in Tokyo. The traditional Drupal site however might be on a server in Ireland, and then a visitor from Tokyo would quite often have to send their request all around the world to get responses.

This idea is not very new. In fact, there are several providers that let you deploy your static site on their CDN for free (more or less). They will then serve your static HTML from different parts of the world, depending on the visitor's location. What a world to live in. But instead of comparing their service and the user experience of deploying, I decided to compare which ones perform well from all parts of the world. A geographic performance cup, if you like.

The competitors in the cup are:

  • Surge.sh
  • Zeit Now
  • Netlify
  • S3 with Cloudfront (a service from Amazon Web Services - AWS)

Instead of doing a very long analysis, let's just start with the results!

CDN ping

The fastest service is S3 with Cloudfront. S3 is a static file storage, and Cloudfront is the CDN service from Amazon.

In the same way I could write many things about my migration to Gatsby, I could also speculate and write many things about this result. Instead I want to just show some animated gifs about interesting aspects of the different geography results for the providers. I am going to do them in reverse order, best result last.

Fourth place: Surge.sh:

Surge.sh geography

Third place: Netlify:

Netlify geography

Then, slightly behind on second place, Zeit Now:

Zeit now geography

Lastly, the winner, here is AWS S3 with Cloudfront:

S3 with Cloudfront geography

Conclusions and reflections

The numbers are one thing, but let's talk a bit about their significance. The tests were performed from AWS datacenters, and the two services scoring highest are either an AWS service (S3/Cloudfront) or built on AWS (Zeit Now). This means the actual numbers do not necessarily show that Netlify is 144% slower than S3/Cloudfront. It also does not mean I think any of these services have been proven better or worse than the others.

I think it means that now that we are able to serve static HTML pages for our blogs or websites in a somewhat dynamic way, we can make the performance more democratic and less unfair. I don't want to discriminate readers of my blog based on their location (or anything else for that matter). Performance matters, but performance also differs from different parts of the world.

I guess what I am trying to say is: Let's make the world a better place by thinking about everyone that lives there, no matter where they live. So I will finish this post with an animated gif about just that. The world.

Feb 07 2020
Feb 07

We hope to see you in Berkeley October 14th - 17th, 2020, for this year’s BADCamp! If you’re ready to get a jumpstart on community participation, we’ve got some ideas to get the BADCamp vibes going:

Sponsor the Camp

We truly couldn’t do any of this without you. BADCamp runs on sponsorships.  

With packages starting at $500, your organization can help us plan a weekend full of learning, community, and connections. And if you’re looking for top talent, now is a great time to sponsor! You’ll have access to post to our job board, and get your position in front of hundreds of eager and community-minded Drupallers. Visit our sponsorships page for more details.

Book Your Hotel

Berkeley hotels often fill up early for BADCamp week due to other events in the area. Book early to confirm your hotel! We have rooms reserved at The Hotel Shattuck. The Hotel Shattuck is in the heart of Berkeley only a few blocks from the UC Berkeley campus. 

Book online following this link.

Please feel free to reach out to organizers via the contact form should you have any problems utilizing the link or if any modifications to reservations are needed. 

Answer Our Call for Trainers

One of the things we love best about BADCamp is the community learning together. Every year, we bring in Drupal experts to offer skills trainings to our fabulous attendees, with a wide variety of subjects from beginner to advanced.

Do you have expertise you’d love to share? We’d love to talk! Please contact us with details about a training you’d like to provide.

Feb 07 2020
Feb 07

Read our Roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community.

Project News

Get Ready for Drupal 9

Are you wondering what it will take to upgrade to Drupal 9? Good news - it's going to be easier than any other major version upgrade in Drupal's history.

The upgrade to Drupal 9 is just like any other Drupal upgrade, except that the new codebase has updated the key dependencies Drupal relies on and removed deprecated code. As long as the modules and custom code you use don't rely on deprecated code, you should be good to go.

As it turns out, many contributed or even custom modules only need a one-line change to be ready for Drupal 9. You can use these community created tools to check the status of your modules: the upgrade status module, or the Drupal Check command line tool. In many cases, you may just need to remove some deprecated code in favor of the more modern implementations. Drupal Rector can provide you with automated fixes for many of these deprecations. 
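As a sketch, running the Drupal Check tool over a project's custom code might look like this (the package name is real, but the paths are assumptions about your project layout):

```shell
# Install the checker as a dev dependency of your Composer-managed project.
composer require --dev mglaman/drupal-check

# Scan custom modules for deprecated API usage.
./vendor/bin/drupal-check web/modules/custom
```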

Still getting to grips with Composer?


If you're still getting to grips with using Composer after the changes in Drupal's 8.8.0 release, don't worry - there's help to be found. The community has extensively documented the different scenarios a site owner may find themselves in with this update.

If you've previously used one of the community-created templates to manage your site with Composer, there are instructions for migrating to the officially supported method.

If you've never used Composer at all - you're in luck - with 8.8.0 and beyond everything you need is already in place. 

Drupal.org Update

Drupal.org Packaging updates

As mentioned in our December update, we've been making major improvements to the Drupal.org packaging pipeline to support packaging Drupal using Composer's create-project command. We reached a major milestone at DrupalCamp New Jersey, allowing our packaging pipeline to properly support the composer create-project command when generating tar and zip files, and paving the way for enhancements to the core subtree splits.

Updating this pipeline is critical for ongoing releases of Drupal, and is part of paving the way for the Drupal 9 alpha release. We want to thank Acquia for donating time to help us get this work ready.

Preparing for contrib Semver

Per our roadmap for supporting Semver for contributed projects on Drupal.org, we have updated the way contrib version numbers are stored, making existing version numbers parseable when we convert to full semver. We also collaborated with core contributors at DrupalCamp New Jersey to identify and resolve a number of other related issues.

Drupal.org now has an example project which uses semantic versioning, which we are using as the testbed for this support, and to prove out any additional UI changes that we want to make before rolling this out to all other contributed projects.

Want to learn more about Semantic Versioning and how to use it properly within your projects? Semver.org can walk you through it.

More accessible formatting for the DrupalCon program schedule

It's almost time for the DrupalCon Minneapolis program to be published! To prepare for this launch, we've made updates to the program schedule to improve accessibility and readability for attendees.

In particular these updates have focused on line weights, spacing, and other formatting changes that should improve readability. With the accepted sessions being announced soon,  we're excited to see what you think!

Better social event submission tools for DrupalCon events

DrupalCon Minneapolis | May 18-22, 2020

Some of the best parts of DrupalCon are the social events that take place around it. They're a chance for the community to celebrate and build camaraderie, and an established tradition. We've made updates to the social event submission process to make getting your event listed easier than ever.

Join the Drupal Community in person! 

By the way… have you registered for DrupalCon yet?

DrupalCon is the best place to come together with other members of the Drupal community in person. It's also the central meeting point for all facets of the Drupal business ecosystem, so if you are an end-user looking for training or a vendor to support your Drupal deployment, there's no better place to be than DrupalCon.

DrupalCon Minneapolis is going to be here any day now - so get your tickets before prices go up!

Can't make it to Minneapolis? Join us at DrupalCon Barcelona 2020 in September.


As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular, we want to thank:

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.
Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Feb 07 2020
Feb 07

The handful of hotels we have selected in Minneapolis each offer an ideal hub — connecting you to a rewarding DrupalCon community experience. This year, choose from multiple Hiltons, a Hyatt, a Holiday Inn and more as part of our special DrupalCon attendee offerings. Why is staying at a DrupalCon partner hotel great for you and the community? Read on!

Feb 07 2020
Feb 07

In Drupal 8.4.x and later releases, Drush 9 is the only supported and recommended version. One of the key changes introduced in this version is a new model for writing custom Drush commands. From now on, .inc files are obsolete and you will no longer use them for your commands, which are now classes based on the AnnotatedCommand format.

The underlying structure of a module containing your custom Drush command will look as follows:

Module structure with custom Drush 9 command

drush9_custom_commands.info.yml file

Default information file with basic information for the module.

name: Drush 9 Custom Commands
description: Example of Drush 9 custom command.
core: 8.x
type: module
package: Examples

drush.services.yml file

Using the standard services.yml file will not work anymore and will lead to an error, since Drush now expects a separate drush.services.yml file with service definitions for your custom Drush commands.

services:
  drush9_custom_commands.commands:
    class: \Drupal\drush9_custom_commands\Commands\Drush9CustomCommands
    tags:
      - { name: drush.command }

Drush9CustomCommands.php file

This is the most crucial file, containing the entire definition of your commands.


<?php

namespace Drupal\drush9_custom_commands\Commands;

use Drush\Commands\DrushCommands;

/**
 * A Drush command file.
 *
 * @package Drupal\drush9_custom_commands\Commands
 */
class Drush9CustomCommands extends DrushCommands {

  /**
   * Drush command that displays the given text.
   *
   * @param string $text
   *   Argument with message to be displayed.
   *
   * @command drush9_custom_commands:message
   * @aliases d9-message d9-msg
   * @option uppercase
   *   Uppercase the message.
   * @option reverse
   *   Reverse the message.
   * @usage drush9_custom_commands:message --uppercase --reverse drupal8
   */
  public function message($text = 'Hello world!', $options = ['uppercase' => FALSE, 'reverse' => FALSE]) {
    if ($options['uppercase']) {
      $text = strtoupper($text);
    }
    if ($options['reverse']) {
      $text = strrev($text);
    }
    $this->output()->writeln($text);
  }

}
Your commands class inherits from DrushCommands, and it can include numerous commands, all of which are separate methods described with annotations.

Some of the annotations available for use include:
@command – the command definition, which needs to follow the module:command structure;
@aliases – aliases for your command, separated with spaces;
@param – defines the input parameters for your command;
@option – defines the options available for your command; they should be put in an associative array, where the name of the option is the key;
@usage – an example showing how the command should be used.

Usage examples for the above command:

drush d9-message
Hello world!

drush d9-message --uppercase --reverse drupal8
8LAPURD

At our Drupal agency, we are pretty sure that those who wrote custom Drush commands before Drush 9 was released will notice quite a number of differences here: we're seeing a departure from the old way, well-known to Drupal 7 developers, towards the new Symfony-based solution. Those who haven't had a chance to work with Drush commands yet will probably find the above example pretty basic.

What can your custom Drush command be used for in practice? For example, I recently had an opportunity to integrate Drupal with an external blogging service. Using cron, the posts are added to Drupal at specified intervals. Thanks to a custom Drush command, I can run that operation at any time without using the UI. What is more, a parameter enables the user to download any number of posts, using a numeric value or an "all" setting. This proved very useful during the initial migration, when all the existing posts and entries had to be downloaded. What kind of processes are you going to make easier thanks to custom Drush commands?

Feb 06 2020
Feb 06

I was recently hired to implement SEO optimization recommendations from an SEO agency on a Drupal 8 site in a highly competitive sector. I also recently submitted a proposal for a call for offers including SEO, and talked a lot about it with an SEO expert friend of mine. Here's my feedback, in case it interests you. I'll keep the most obvious things for the end and start with the most difficult.

View filters

Oh, that one is big. Let's suppose you have a Property content type with fields referencing taxonomy terms in vocabularies such as Area and City. You have a view that lists the properties with Area and City as exposed filters. Two minutes to build this, thank you Drupal! However... if your view page URL is /properties, with submitted exposed filters it becomes /properties?field_area=123&field_city=456.

If you know a little bit about SEO, you know that query parameters are not taken into account. Those URLs are worth nothing in comparison to /properties/california/san-francisco. I thought: "It sounds bad, but it might be easy if I can just automate alias creation from URLs with query strings to SEO-friendly URLs." But it's not possible to create an alias with query strings.

You'd better know this from the start. I might have missed an elegant solution, but all in all this required 5 full development days on the website. In my opinion, as awesome as Drupal is, it still misses this feature.

Exceptional potential otherwise

Honestly, beyond that, Drupal core and the well-known contrib modules come very close to perfection. Name it, you have it: Pathauto, Metatag, Simple sitemap, Redirect... Like I said, I'm not going to expand on this. Just one last special mention for Schema metatag, which also adds incredible value.

In SEO, performance is also important, but I will save that for a new article: how to optimize performance for a very simple website like mine.

SEO analysis modules

There are some SEO analysis modules, just a thousand times less numerous than in WordPress. But even in WordPress, in my opinion, they are all extremely limited. Yoast is incredible, but only for vanilla WordPress post types. ACF is the leader in WordPress "fields"; there are extensions bridging Yoast and ACF, but more than half of the field types are not taken into account in the analysis.

My conclusion? Forget the SEO analysis modules. Instead, set up a free account on an SEO platform for your clients if they ask for it. That's what I would do anyway if I were to take charge of the SEO of any website myself.

Feb 06 2020
Feb 06

A useful Drupal 9 guide that features everything you wanted to know about the upcoming release ;)

Drupal keeps moving forward, offering more and more advanced digital experiences. The scheduled release of Drupal 9 in 2020 was announced a while ago, and this year has come. What about you — are you doing some Drupal 9 planning for your website?

We are here to help you with it! So you can:

  • either directly contact our Drupal support & maintenance team with one simple request “Prepare me for Drupal 9 please”
  • or start with reading our D9 checklist for website owners.

If the word “checklist” sounds a little cumbersome, we will give you a little hint: a Drupal 9 upgrade is going to be incredibly easy for those who are on D8! For those who are on D7, we will also share another Drupal 9 checklist. In addition to useful checklists, our blog post is also going to feature everything you might want to ask about Drupal 9. So let’s begin!

When is it coming? The Drupal 9 release date

The Drupal 9 release date is one of the most frequently asked questions. We are pleased to answer it — the D9 release is planned for June 3, 2020!

Although we expect D9 to be released in June, the exact day is not certain as was recently announced by the Initiative Coordinator Gábor Hojtsy. Most people took the June 3 release date for granted, but it is still not guaranteed, writes Gábor in his blog post.

The community has a huge job to do in meeting the alpha and beta release requirements, so Gábor Hojtsy encourages more contributors to participate. Depending on that effort, the release date may be:

  • June 3, 2020 (most likely)
  • August 5, 2020
  • December 2, 2020

Why do you need to upgrade to Drupal 9?

Next year, in November 2021, both Drupal 7 and Drupal 8 reach their end-of-life, which means an end of official support. It is desirable that all sites move to D9 before then (see the Drupal 9 checklist chapter to learn how to do this).

And, of course, a great reason for a Drupal 9 upgrade is that all development efforts will soon be focused on creating modern features for it. The big changes are not directly coming in June — D9.0 is going to be just a cleaned-up version of D8’s latest minor release. The most exciting things will start with Drupal 9.1 (several months later).

The community is preparing something amazing, but not all features are uncovered yet. Let’s take a glimpse at a couple of things that are known.

Some of the new features in Drupal 9:

  • D9 will be available to D8 websites from day one thanks to full backward compatibility! No need to fuss with upgrades this time, they are going to be very easy. Only a few preparations are needed (see the checklist chapter).
  • Drupal 9 will also be using the latest libraries and components so your website has better performance, cleaner code, more efficient development, and much more. Among the most important examples are the latest versions of the Symfony framework (4 or 5) and the Twig template engine (version 2).
  • We have already taken a glimpse at the new Drupal front-end theme Olivero. Its official overview was titled “Drupal 9: The Most Beautiful CMS Ever!”. Olivero’s color palette, typography, forms, buttons, sidebars, and everything else is meant to give websites an attractive and professional look, increase their usability and accessibility to all users.

New Drupal 9 front-end theme Olivero

Dries has also announced four strategic areas for Drupal 9:

  1. reducing development costs and efforts
  2. prioritizing the beginner experience
  3. driving the future of the open web with accessibility, inclusiveness, security, privacy, and interoperability
  4. making Drupal the best-structured data engine for more integrations, more devices, and more channels

Strategic Drupal 9 areas announced by Dries

How to prepare for Drupal 9?

This is the main question of today's Drupal 9 guide, and the answer depends on your current version. So we are sharing two checklists with you: one for D7 and one for D8.

Drupal 8 site owner's checklist for a smooth Drupal 9 upgrade

As of November 2021, Drupal 8 will reach its end-of-life and stop receiving support (including security updates). But there is no need to worry, because you have already made the best decision by choosing Drupal 8. Since D9 is being built inside D8, your upgrade checklist is very easy:

  • 1) Review your website’s goals

Time flies, and what was relevant yesterday, may become unnecessary today. On the contrary, you may need something new. So the first optional point in the checklist is to think about what digital experiences your website provides to your customers and what else it could do.

  • 2) Talk to your editors and administrators

The next optional point in the checklist is to ask your content editors, administrators, and other team members who do things on your website every day. What could work better? Drupal 8 has brought so many wonderful improvements in content editing, and the ninth version should continue this path.

  • 3) Make a module audit with developers

Based on your reviewed requirements and ideas in points (1) and (2) of this checklist, ask developers to help you prepare a cleaned-up and updated list of modules for your desired functionalities. To get an audit of modules, you can contact our Drupal support & maintenance team. During the module audit, the best practices are to use the core where possible, since many nice features have been moved to it. Examples include using the core Content Moderation instead of Workbench Moderation, the core Media instead of the contributed Media, the core Layout Builder instead of Panels, etc.

  • 4) Order an update to the latest minor version (required)

Use the latest version of the core, modules, themes, and third-party libraries. According to the new innovation model, every six months a new core release with valuable features comes out, and between them, there also are patch releases. By keeping up with the latest minor version, you are already preparing for Drupal 9. What version are you on? The current one is 8.8.2! Ask our developers to update you and you will be much closer to D9. These updates do not take long (usually from half an hour to an hour).

  • 5) Order a cleanup from deprecated code (required)

This is the cornerstone of the Drupal 9 readiness checklist. Websites that are not using deprecated APIs and functions will be instantly ready for Drupal 9. Ask our Drupal support experts to do this cleanup; it is not going to take much time. They will use D9 preparation tools like the Drupal Check command-line tool, the Upgrade Status contributed module with a UI for deprecated code checking, the Rector tool for automatically fixing deprecated code, etc.

Drupal 7 site owner's checklist for a smooth Drupal 9 upgrade

Similarly to D8, D7 reaches its end-of-life in November 2021, which means an end of support as well. Drupal 7 website owners face a few options:

  • commercial support
  • direct upgrade to Drupal 9
  • upgrade to D8 and an easy jump to D9 as soon as it comes

We recommend option 3, and here’s why. Staying on Drupal 7 with commercial support means you are shut out of the innovation because no one is going to prepare new and exciting features in D7.

When it comes to a direct update to Drupal 9, we are ready to do it if you want, but consider the following. You’ll need one big upgrade anyway because D7 is totally different from both D8 and D9. The option to stick to a well-studied D7-D8 upgrade path looks much better. In addition, you can start enjoying modern functionalities much sooner.

Dries Buytaert in his State-of-Drupal presentation compared the versions to railway tracks: once you move to the Drupal 8 track, you will never have to change your tracks again, and all future upgrades will be easy. But the D7 track leads to nowhere.

Once you choose Drupal 8 track you will never change tracks again

So your checklist will look like this:

  • 1-2-3) Repeat the first 3 steps of the checklist for Drupal 8

The first three checklist steps look similar. However, with D7, they are more required because more serious changes are needed. Review your website’s goals and features, talk to your editors, and share the results with our support team so we can help you make an audit of modules. Based on the necessary functionalities, we’ll check which modules have a D8 version, which ones need a replacement, which ones need a rewrite, and so on.

  • 4) Schedule an upgrade to Drupal 8

Order your website upgrade to D8 in accordance with the best practices. We’ll move your content and configuration to a clean Drupal 8 instance of the latest version. The upgrade time may vary depending on your website’s complexity. However, it has become much smoother and faster now with new, stable D8 migration modules.

  • 5) Order one final little check for deprecated code

Your fresh Drupal 8 website should be as close to Drupal 9 as possible. However, you can ask us after the D9 release to do one more check and clean it up from deprecated code. And then congrats — you are ready for Drupal 9!

Let our Drupal team complete your checklist!

Hopefully, the above Drupal 9 checklist will come in handy. Checklists may look a little scary, but in reality it's not difficult to complete them. Just contact our Drupal support & maintenance team, and we will help you with every item in the checklist.

Let your business flourish from the most modern features Drupal development can offer!

Feb 06 2020
Feb 06

The saying “content is king” is an indisputable truth for well-performing websites with high traffic, good SEO rankings, and conversions.

If content is the king, then attractive images are the crown! To make the crown fit perfectly, the easy content creation processes on every website should include handy ways to add and optimize images.

One of the ways to optimize images is cropping. Today, we will tell you about intelligent, or smart cropping of images that makes sure they are presented in their best form.

We will review a very helpful Drupal module that takes care of intelligent image cropping — the Focal Point.

What is smart cropping?

Images need to be optimized so that they look neat on your content pages and do not hamper your website’s speed. Image cropping combined with scaling is a great solution here. In relation to this, people often ask:

  • How do you crop an image precisely?
  • How do I auto-crop a picture?
  • How do I crop a picture to a specific size?

Image cropping

No worries — if your website is built with Drupal, all these questions are already resolved in intelligent ways. You will not need to crop anything manually. Drupal’s system of image styles allows you to set up automatic crop & scale effects for all images with specific dimensions in specific scenarios. We will show how this works in today’s blog post.

There is one more thing to consider while answering the question “what is clever cropping?”. When cropping is automatic, it could potentially cut off important parts of an image, especially when they are not situated in the center.

How automatic image cropping can cut off important image parts

How to avoid cutting off important pieces and have all images cropped properly? An intelligent approach is needed! Read on to discover the solution.

The Focal Point Drupal module: the smartest way to crop your images

The intelligent approach is offered by the Focal Point module in Drupal. Thanks to it, every content editor or website user who uploads an image to your website can mark the point in the image that they think is important and should never be cropped off.

The subsequent crop-and-scale process then keeps this specified point in view. The result is clever, content-aware image cropping.
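Conceptually, a focal-point-aware scale and crop first scales the image so it covers the target dimensions, then positions the crop window over the focal point, clamped so the window never leaves the image. The following is a minimal Python sketch of that idea, not the module’s actual PHP implementation; the function name and the coordinate convention (relative focal coordinates in [0, 1]) are our own:

```python
def focal_scale_and_crop(img_w, img_h, target_w, target_h, focal_x, focal_y):
    """Compute the scale factor and crop offsets for a focal-point-aware
    scale and crop. focal_x and focal_y are relative coordinates in [0, 1].
    Returns (scale, crop_left, crop_top) measured in scaled-image pixels."""
    # Scale uniformly so the image fully covers the target dimensions.
    scale = max(target_w / img_w, target_h / img_h)
    scaled_w, scaled_h = img_w * scale, img_h * scale
    # Center the crop window on the focal point...
    left = focal_x * scaled_w - target_w / 2
    top = focal_y * scaled_h - target_h / 2
    # ...then clamp it so the window stays inside the image.
    left = min(max(left, 0), scaled_w - target_w)
    top = min(max(top, 0), scaled_h - target_h)
    return scale, round(left), round(top)

# A subject at 80% of the width stays in frame: the crop window slides
# toward it and stops at the image edge instead of cutting it off.
print(focal_scale_and_crop(1600, 900, 400, 400, 0.8, 0.5))
```

Note that with a centered focal point (0.5, 0.5) this degrades to an ordinary center crop, which is exactly why off-center subjects need an explicitly chosen focal point.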

Focal Point Drupal module for smart cropping

How to work with the Focal Point module in Drupal 8

We will show you how intelligent cropping works from beginning to end with the Focal Point Drupal module.

The Focal Point module installation

In Drupal 8, the Focal Point module should be installed and enabled together with the module it depends on, the Crop API.
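Assuming a Composer-based Drupal 8 project with Drush available, the installation might look like the following sketch; your project may use different tooling:

```shell
# Download the Focal Point module; Composer also pulls in the Crop API dependency.
composer require drupal/focal_point

# Enable the module; Drush enables the required crop dependency as well.
drush en focal_point -y
```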

Focal Point module installation in Drupal 8

The special Focal Point field widget

For intelligent cropping, the Focal Point module provides a dedicated field widget. On the “Manage form display” tab of a particular entity type, you need to select the “Focal Point” widget next to the image field. In our example, this is the “Photo” content type.

Focal Point field widget in Drupal 8

The focal point of your images

Content editors and users will now be able to set a focal point for every image added via this field in the “Photo” content type. Let’s go to “Add content — Photo” and upload an image.

We see that there is now a crosshair that we can move around. Let’s place the focal point at the center of an important object in this beach picture: a cliff.

Setting focal point on an image with the Focal Point module in Drupal 8

We can then add a few more pictures, setting a focal point for each. Here is a closer look at another example:

Setting focal point on an image with the Focal Point module in Drupal 8

The Focal Point Scale & Crop Effect

Let’s now move on to our intelligent cropping. To make all our photos look uniform in a particular display, we will naturally use Drupal image styles (Configuration — Media — Image styles — Add image style).

While adding an effect, we see that there is now a new special effect to select from — “Focal Point Scale and Crop.” Let’s click “Add.”

Focal Point scale and crop effect in Drupal 8

Then we will set the desired dimensions and save both the new effect and the whole image style. We can change the dimensions at any time and see all the images change automatically.

Specifying image dimensions — Focal Point scale and crop effect in Drupal 8
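As a side note, the saved image style is ordinary Drupal configuration and can be exported. A hypothetical export might look like this; the effect plugin ID `focal_point_scale_and_crop` is the one the module provides, while the style name, UUID, and dimensions are just this example’s choices:

```yaml
# Hypothetical export: config/sync/image.style.photo_teaser.yml
langcode: en
status: true
name: photo_teaser
label: 'Photo teaser'
effects:
  2f8e6c1a-0000-4000-8000-000000000000:
    uuid: 2f8e6c1a-0000-4000-8000-000000000000
    id: focal_point_scale_and_crop
    weight: 1
    data:
      width: 400
      height: 300
```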

Now let’s go to the “Manage display” tab of our content type and set the image style in which all photos will appear on the website. We will use the cogwheel next to the image field and select our newly created style.

Selecting image style in Drupal 8

Checking the smart crop results

Let’s open the content items to see the intelligent cropping results. It looks good: our focal point has not been cropped off.

Focal Point scale and crop result in Drupal 8

To see all our photos processed by the intelligent cropping together, let’s create a Drupal view:

Creating a Drupal view

All the images had their key elements outside the center, but thanks to the Focal Point module, none of those elements were cropped off. This is how intelligent cropping works.

Focal Point scale and crop effect result in a Drupal 8 view

Entrust your image optimization to our Drupal experts!

If the intelligent image cropping by the Focal Point module inspires you, drop us a line. Our Drupal support and web development experts will set up this and many other ways of image optimization on your website. Let all your processes be intelligent!

Feb 06 2020

Like every month, we’ve compiled a list of our favorite Drupal blog posts and pieces of news from last month. Here’s what we enjoyed the most in January. 

Nominations now open for the 2020 Aaron Winborn Award

We’re starting off with a short but important post by Michael Anello of DrupalEasy which serves as the announcement that nominations for the 2020 edition of the Aaron Winborn Award are open. 

He briefly describes the background of the award: what kind of work it is intended to recognize and how it originated (namely, it was launched to honor the late Aaron Winborn, who lost his battle with ALS). 

Nominations for the award are open until the end of March. So, if you have in mind a member of the Drupal community who is truly outstanding in their work and who you feel deserves the recognition, make sure to nominate them and increase their chances of winning!

Read more

Our Drupal 9 Masterplan

With Drupal 9 fast approaching, doing your due diligence and preparing for the upgrade beforehand can truly pay off. The second place on our list for January thus goes to MD Systems’ post describing their plan for the upgrade to 9. 

The post’s author, Sascha Grossenbacher, first explains Drupal’s release cycle and how the release of Drupal 9 differs from the traditional one, then confirms that the MD Systems Primer will be ready for all users who have an active maintenance contract with them.

The second half of Sascha’s post is then dedicated to the status of Drupal core and tooling with regards to Drupal 9 readiness. 

Read more

New Language Hierarchy release for Drupal 8

In the next post on this month’s list, James Williams of ComputerMinds presents the first stable release of the Language Hierarchy module that he has developed for Drupal 8. This module enhances Drupal’s already powerful multilingual capabilities by allowing you to set a fallback language to use when a certain translation is missing. 

Some of Language Hierarchy’s features include the ability to set the same configuration for each ‘child’ of a ‘parent’ language, the ability to build hierarchies by dragging and dropping languages, and the inclusion of automated tests. To learn more about the module and what it can do, have a read of James’ entire post. 

Read more

Happy nineteenth birthday, Drupal

January 15th marked the 19th birthday of the Drupal CMS and one year of its “maturity”. Drupal and the huge and powerful community that has sprung up around it have truly come a long way since their beginnings, and it’s really nice to see Dries commemorating the occasion every January by taking a look back through the years. 

It makes perfect sense, then, that Dries is so proud of the community and how much we have grown, both in the technological aspect and the ethical, humane one. And we’ll continue with this trend in Drupal’s 20th year: Drupal 9 will be an unprecedented release in its backwards compatibility, and the community will keep on striving towards a more inclusive and open experience for every user. 

Read more

Drupal’s Admin UI and How it Compares to WordPress

We continue with a comparison of the administration UIs of the two biggest CMSs, Drupal and WordPress, written by Mike Hubbard of Acro Media. We particularly enjoyed Mike’s post because his comparison is not based on the misconception of WordPress as easy and Drupal as hard to use, but is instead an objective look at different aspects of their admin user interfaces. 

In his post, Mike analyzes 10 different elements of the administration UI and notes which CMS wins out in each category. The final results show that the two are actually very close, with Drupal getting 6 points and WordPress 4. WordPress had the clear advantage in the management of media and plugins, for example, while Drupal won out in the management of user roles and permissions and in its admin toolbar.

Read more

How to Contribute to Drupal: Module Maintenance

Another blog post from January that we really enjoyed was Digital Echidna’s guide to becoming and being a maintainer of a Drupal module. It takes the form of an interview by their Communications Specialist Sasha Kristoff with their developer Jordan Thompson, who maintains three Drupal modules. 

Jordan gives answers to questions such as what the time commitment is and how he became a maintainer, complete with a word of advice to anyone wanting to become a maintainer (spoiler: work on drupal.org issue queues!). Sasha then finishes her post with some information about the Global Contribution weekend which took place at the end of January. 

Read more

Contribution Recognition Committee Selected

The ability for Drupal contributors to be recognized for their contributions is something really groundbreaking, but, at the same time, it is difficult to pull off perfectly, leaving a lot of room for gaming the system.

The announcement of the formation of a Contribution Recognition Committee at DrupalCon Amsterdam was an important step in the right direction, and, in January, the members of the committee were selected and announced by Tim Lehnen in his post. 

They are: Mike Lamb, Tiffany Farriss, Ryan Szrama, Stella Power, Jacob Rockowitz, Rakhi Mandhania and Paul Johnson. To learn more about them and how they will operate, we suggest you read Tim’s entire post. 

Read more

How to Recover After a Painful Drupal 8 Migration

The final post on our list for January is dedicated to everyone who has recently (unsuccessfully) migrated to Drupal 8 and is unsure about how to salvage the migration. Its author, Third and Grove’s CEO Justin Emond, suggests starting with a retrospective before moving on to the actual fixes, which should be carefully planned out.

Justin’s post covers some of the most important questions site owners might have after such a migration. Near the top of these is making sure the site stays Drupal 9 ready, but he also provides advice on regaining lost SEO ranking and accessibility guidelines compliance, as well as how to fix a lacking back-end editor experience.

Read more

That’s it for our favorite Drupal posts from January. If you enjoyed the read, we recommend taking a look at some of our other blog posts - you never know, you might be just a click away from that amazing insight you’ve been looking for!

Feb 06 2020

Thomas Edison famously said, "The three great essentials to achieve anything worthwhile are, first, hard work; second, stick-to-itiveness; third, common sense." This quote made me wonder whether "sticking to it" is contradictory to innovation; does it make you resistant to change? But the more I pondered it, the more I realized that innovation is fueled by perseverance.

Before Drupal 8 was introduced, the Core committee had not just promised to innovate; they decided to be persistent. Persistent in continuous reinvention. Persistent in making Drupal easier to adopt—not only by the market but also by developers with various levels of expertise. However, to be able to make Drupal successful and relevant in the long run, a drastic change was needed—a change that would build a better future. For this, Drupal 8 had to dismantle the Drupal 7 architecture and lay a fresh foundation for a promising future. Moving on to Drupal 9 (coming soon) and subsequent versions will now be easy and straightforward.

Freedom to innovate with open source

Innovation brings freedom, and freedom creates innovation. Open source gives you the freedom to access, learn, contribute, and, most importantly, the freedom to innovate. The ability to learn, catch up, and reinvent is extremely crucial today. Drupal began as a small internal news website and later went on to become an open source content management system (CMS) because there was a potential to make it much more compelling by attracting more contributions. It gave developers the freedom to collaborate, re-use components, and improvise on it to create something more modern, powerful, and relevant.

Promises delivered: Drupal 8 version history

The web is always changing. To stay relevant, Drupal had to introduce changes that were revolutionary but, at the same time, not so hard to accept. Drupal 7, as a content management system, was widely welcomed. But it was lacking in certain aspects, like developer adoptability, easy upgrade paths, and better API support. Drupal 8 changed everything. The community did not choose to build upon Drupal 7, which would have been the easier choice for an open source CMS. To create a more future-proof CMS that is ready to accept changes, Drupal 8 had to be rebuilt on more modern components like Symfony, Twig, and PHP 7, and around initiatives like the API-first initiative, the mobile-first initiative, and the Configuration Management Initiative.

Drupal 8 was released with a promise of enabling more ambitious digital experiences, with better UX and mobile compatibility. The goal was to continuously innovate and reinvent itself. For this to work, these practices needed to be put in place: semantic versioning (major.minor.patch), scheduled releases (two minor releases per year), and the introduction of experimental modules in Core, all while providing backward compatibility and removing deprecated code.

Let’s look at some of the promises that have been delivered with each minor version of Drupal 8.

  • Drupal 8.0
    • Modern and sophisticated PHP practices, object-oriented programming, and libraries.
    • Storage and management of configuration was a bit of a messy affair with Drupal 7. The Configuration Management Initiative was introduced with Drupal 8.0, which allowed for cleaner installations and config management. Configurations are now stored in easily readable YAML format files. These config files can also be readily imported. This allows for smooth and easy transitions to different deployment environments.
    • Adding Symfony components drastically improved Drupal 8’s flexibility, performance, and robustness. Symfony is an open source PHP framework, and it abides by the MVC (Model-View-Controller) architecture.
    • Twig, a powerful template engine for PHP, replaced PHPTemplate, Drupal’s theme engine since 2005. With Twig, the code is now more readable, and the theme system is less complex, uses inheritance to avoid redundant code, and offers more security by sanitizing variables and functions.
    • The Entity API, which was quite limited and a contributed module in Drupal 7, is now full-fledged and is in Drupal 8 Core. Since Drupal 8 treats everything as an "entity," the Entity API provides a standardized method of working with them.
    • CKEditor, a WYSIWYG (What You See Is What You Get) editor, was introduced. It allows for editing on the go, in-context editing, and previewing your changes before they get published.
  • Drupal 8.1
    • The alpha version of the BigPipe module got introduced to Core as an experimental module. BigPipe renders Drupal 8 pages faster using methods like caching and auto-placeholder-ing.
    • A Migrate UI module suite got introduced to Core as an experimental module. It makes migrating from Drupal 7 to Drupal 8 easier.
    • The CKEditor now includes spell-check functionality and the ability to add optional languages in text.
    • Improved testing infrastructure and support, especially for JavaScript interactions.
    • Composer is an essential tool to manage third-party dependencies of websites and modules. With Drupal 8.1, Drupal Core and all its dependencies are now managed and packaged by Composer.
  • Drupal 8.2
    • The Place Block module is now an experimental module in Core. With this module, you can easily play around with blocks right from the web UI. Configuring and editing blocks can be done effortlessly.
    • A new Content Moderation module that is based on the contributed module Workbench Moderation has been introduced as an experimental module in Core. It allows for granular workflow permissions and support.
    • Content authoring experiences have been enhanced with better revision history and recovery.
    • Improved page caching for 404 responses.
  • Drupal 8.3
    • The BigPipe module is now stable!
    • More improvements in CKEditor: a smooth copy-paste experience from Word, drag-and-drop images, and an Autogrow plugin that lets you work with bigger screen sizes.
    • Better admin status reporting for improved administrator experience.
    • The Field Layout module was added as an experimental module in Core. This module replaces the Display Suite in Drupal 7 and allows for arranging and assigning layouts to different content types.
  • Drupal 8.4
    • The 8.4 release brings stable versions of many previously experimental modules.
    • The Inline Form Errors module, which was introduced in Drupal 8.0, is now stable. With this module, form errors are placed next to the form element in question, and a summary of the errors is provided at the top of the form.
    • Another stable release is the DateTime Range module, which allows date formats to match those of the Calendar module.
    • The Layout Discovery API, which was added as an experimental module in Drupal 8.3, is now stable and ready to roll. With this module, the Layout API is added to Drupal 8 Core. It has adopted the previously popular contributed modules—Panels and Panelizer—that were used extensively to create amazing layouts. Drupal 8’s Layout initiative has ensured that you have a powerful Layout building tool right out of the box.
    • The very popular Media module is added as an API so that developers can port a wide range of contributed media modules from Drupal 7, such as Media Entity, Media Entity Document, Media Entity Browser, Media Entity Image, and more. However, this module is still hidden from site builders until the porting is complete and the issues are fixed.
  • Drupal 8.5
    • One of the top goals that Drupal 8 set out to reach was making rich images, media integration, and asset management easier and better for content authors. It has successfully achieved this goal by adding the Media module now in Core (and it isn’t hidden anymore). 
    • Content Moderation module is now stable. Defining various levels and statuses of workflow and moving them around is effortless.
    • The Layout Builder module is introduced as an experimental module. It gives site builders full control and flexibility to customize and build layouts from other layout components, blocks, and regions. This has been one of the top goals for Drupal 8 site builders.
    • The Migrate UI module suite that was experimental in Drupal 8.1 is now considered stable.
    • The BigPipe module, which became stable in version 8.3, now comes enabled by default in the standard installation profile. All Drupal 8 sites are now faster by default.
    • PHP 7.2 is here, and Drupal 8.5 now runs on it and fully supports the new features and performance improvements that it offers.
  • Drupal 8.6
    • The very helpful oEmbed format is now supported in the Drupal 8.6 Media module. The oEmbed API helps in displaying embedded content when a URL for that resource is posted. Also included within the Media module is support for embedding YouTube and Vimeo videos.
    • An experimental Media Library module is now in Core. Adding and browsing multiple media is now supported and can also be customized.
    • A new demo site called Umami has been introduced that demonstrates Drupal 8's Core features. This installation profile can give a new site builder a peek into Drupal’s capabilities and allows them to play around with views, fields, and pages for learning purposes. It also acts as an excellent tool for Drupal agencies to showcase Drupal 8 to their customers.
    • Workspaces module is introduced as an experimental module. When you have multiple content packages that need to be reviewed (status change) and deployed, this module lets you do all of it together and saves you a lot of time.
    • Installing Drupal has gotten easier with this version, which offers two new ways of installing. One is a "quick start" command that only requires you to have PHP installed. With the other option, the installer automatically detects a previous installation and lets you continue from there.
  • Drupal 8.7
    • One of the most significant additions to Drupal Core, arriving directly as a stable module, is the JSON:API module. It takes forward Drupal’s API-first initiative and provides an easy way to build decoupled applications.
    • The Layout Builder module is now stable and better than ever before. It now even lets you work with unstructured data as well as fieldable entities.
    • The Media Library module gets a fresh new look with this release. Marketers and content editors now have it much easier with the ability to search, attach, drag, and drop media files whenever and wherever they need them.
    • Drupal now fully supports PHP 7.3.
    • Taxonomy terms and menu items are now revisionable, which means that they can be used in editorial workflows and can be assigned statuses.
  • Drupal 8.8
    • This version is going to be the last minor version of Drupal 8 where you will find new features or deprecations. The next version, Drupal 8.9, will not include any new additions but will be very similar to Drupal 9.0.
    • The Media Library module is now stable and ready to use.
    • Workspaces module is now enhanced to include adding hierarchical workspaces. This gives more flexibility in the hands of the content editor. It also works with the Content Moderation module now.
    • Composer now receives native support and does not need external projects to package Drupal with its dependencies. You can create new projects with just a one-line command using Composer.
    • Keeping its promises on making Drupal easier to learn for newbies, a new experimental module for Help Topics has been introduced. Each module, theme, and installation profile can have task-based help topics.
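Of the features listed above, the JSON:API module is the one developers interact with most directly. Its collection URLs follow the pattern /jsonapi/{entity_type}/{bundle}, with filter, sort, and page query parameters. The small helper below is our own illustrative sketch (the function name and example site URL are hypothetical), showing only how such request URLs are composed:

```python
from urllib.parse import urlencode

def jsonapi_collection_url(base, entity_type, bundle, params=None):
    """Build a JSON:API collection URL for a Drupal site.

    `params` holds JSON:API query parameters such as filter, sort, and page.
    """
    url = f"{base}/jsonapi/{entity_type}/{bundle}"
    if params:
        url += "?" + urlencode(params)
    return url

# Published articles, newest first, ten per page:
print(jsonapi_collection_url(
    "https://example.com", "node", "article",
    {"filter[status]": 1, "sort": "-created", "page[limit]": 10},
))
```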

Opening doors to a wider set of developers

Although Drupal was largely accepted and loved for its flexibility, resilience, and, most of all, its content management abilities, there was a nagging problem: the "steep learning curve" issue. While many Drupalers argue that a steep learning curve is part and parcel of a CMS that can build highly complex and powerful applications, finding Drupal talent is a challenge. Dries, the founder of Drupal, says, "For most people new to Drupal, Drupal 7 is really complex." He also adds that this could be because of holding on to procedural programming, heavy use of structured arrays, and more such "Drupalisms" (as he calls them).

This issue needed to be tackled. With Drupal 8 adopting modern platforms and standards like object-oriented programming, the latest PHP standards, the Symfony framework, and established design patterns, the doors are now flung wide open to a broad range of talent (site builders, themers, developers).

Final thoughts

"The whole of science is nothing more than a refinement of everyday thinking."– Albert Einstein.

Open source today is more than just free software. It is a body of collaborative knowledge and effort that is revolutionizing the digital ecosystem. The digital world is moving at a scarily rapid pace, and I believe it is only innovation and perseverance from open source communities that can keep up with it. The Drupal community unwaveringly reinvents and refines itself each day, as is especially evident in the latest releases of Drupal 8.

