Feb 22 2020

Here is a quick tip if you want to create a step definition that has an argument with multiple lines. A multiline string argument if you like.

I wanted to test that an email was sent, with a specific subject, to a specific person, and containing a specific body text.

My idea was to create a step definition that looked something like this:

Then an email has been sent to "[email protected]" with the subject "Subject example" and the body "one of the lines in the body

plus this is the other line of the body, after an additional line break"

So basically my full file is now this:

@api @test-feature
Feature: Test this feature
  Scenario: I can use this definition
    Then an email has been sent to "[email protected]" with the subject "Subject example" and the body "one of the lines in the body

    plus this is the other line of the body, after an additional line break"

My step definition looks like this:

  /**
   * @Then an email has been sent to :email with the subject :subject and the body :body
   */
  public function anEmailHasBeenSentToWithTheSubjectAndTheBody($email, $subject, $body) {
    throw new PendingException();
  }

Let’s try to run that.

$ ./vendor/bin/behat --tags=test-feature

In Parser.php line 393:
  Expected Step, but got text: "    plus this is the other line of the body, after an additional line break”" in file: tests/features/test.feature  

Doing it that way simply does not work. You see, by default a line break in the Gherkin DSL has an actual meaning, so you cannot put a line break in your argument and expect it to just pass along everything up until the closing quote. What we actually want is to use a PyString. But how do we use them, and how do we define a step to receive them? Let’s start by converting our feature file to use the PyString multiline syntax:

@api @test-feature
Feature: Test this feature
  Scenario: I can use this definition
    Then an email has been sent to "[email protected]" with the subject "Subject example" and the body
      """
      one of the lines in the body

      plus this is the other line of the body, after an additional line break
      """

Now let’s try to run it:

$ ./vendor/bin/behat --tags=test-feature                                                                                        
@api @test-feature
Feature: Test this feature

  Scenario: I can use this definition                                                                 # tests/features/test.feature:3
    Then an email has been sent to "[email protected]" with the subject "Subject example" and the body
      one of the lines in the body
      plus this is the other line of the body, after an additional line break

1 scenario (1 undefined)
1 step (1 undefined)
0m0.45s (32.44Mb)

 >> default suite has undefined steps. Please choose the context to generate snippets:

A bit closer. Our output actually tells us that we have a missing step definition, and suggests how to define it. That’s better. Let’s try the suggestion from the output, now defining our step like this:

  /**
   * @Then an email has been sent to :email with the subject :subject and the body
   */
  public function anEmailHasBeenSentToWithTheSubjectAndTheBody2($email, $subject, PyStringNode $string) {
    throw new PendingException();
  }

The difference here is that we do not add the variable name for the body in the annotation, and we specify that we want a PyStringNode type parameter last (remember to import Behat\Gherkin\Node\PyStringNode in your context class). This way Behat will know™.

After running the behat command again, we can finally use the step definition. Let's have a look at how we can use the PyString class.

  /**
   * @Then an email has been sent to :email with the subject :subject and the body
   */
  public function anEmailHasBeenSentToWithTheSubjectAndTheBody2($email, $subject, PyStringNode $string) {
    // This is just an example.
    $mails = $this->getEmailsSomehow();
    // This is now the important part: you get the raw string from the PyStringNode class.
    $body_string = $string->getRaw();
    foreach ($mails as $item) {
      // Still just an example, but you probably get the point?
      if ($item['to'] == $email && $item['subject'] == $subject && strpos($item['body'], $body_string) !== FALSE) {
        // We found the mail, so the step passes.
        return;
      }
    }
    throw new \Exception('The mail was not found');
  }

And that about wraps it up. Writing tests is fun, right? As a bonus, here is an animated gif called "Testing".

Feb 21 2020

Hooray! Today is the day. We are officially announcing our return to DrupalCon 2020 in Minneapolis. We're sorry we kept you waiting so long, but we were still finalizing some moving pieces. With the recent announcement of accepted sessions for this year's DrupalCon, the cat is finally out of the bag.

This year we will be fully immersed in all things Drupal and open source. Some of our team will be leading sessions, some mentoring, and some manning a booth. Whatever it is, we're excited for another fun conference filled with amazing opportunities to learn and grow from fellow community members. We're honored to be able to be involved, to be sponsoring, and to be part of a community that has proven time and time again to be inclusive and open to curious minds.

Accepted Sessions

We'll start by telling you exactly what we'll be talking your ears off about this year. While we're in Minneapolis for DrupalCon 2020, you can look forward to a diverse lineup of topics. The full schedule is still to be determined (don't worry, we'll let you know when you can hear one of our sessions a little later), but we're eager to share what we know. For now, here are the topics you can expect to hear from our team.

Build Healthy Multilingual Relationships: Nest Entities Correctly

First up, we have CEO Aimee Hannaford teaming up with Christian López Espínola from Lingotek to give a great session about multilingual relationships...in a manner of tech speak, of course!

As component-based design principles evolve, the actual implementation in Drupal can be performed in many ways. Most, if not all, component-based site building approaches rely on Entity Relationships. Adding multilingual support that is Correct and Predictable can be a challenge to configure. In this session, we'll surface best practices of Multilingual configuration with entity relationships.

You can read the full session description for healthy multilingual relationships on the DrupalCon website.

Progressively Enhance your Workflow with CI: A Roadmap

Next, we have another duo working together, architects Jonathan Daggerhart and Ryan Bateman. Together they are teaming up to provide a clearer pathway to integrating CI into your project process.

Adding Continuous Integration (CI) services to projects is often seen as an expensive and time consuming operation. At worst, clients may even perceive CI as an expensive set of developer tools that rarely benefit a project in any meaningful way. However, by changing our perspective and looking at CI as a progressive enhancement, slowly adding more and more helpful automations to our projects, CI can become the time-saving technology many projects desperately need.

You can read the full session description for progressive CI workflow on the DrupalCon website.

Getting Started With Nightwatch.js For Automated Testing

Architect Jonathan Daggerhart will be leading a solo session built around the capabilities of Nightwatch.js and Drupal. By the end of this session, you should be very familiar with the capabilities of Nightwatch.js, how you can use it for functional testing of any website (including your Drupal sites), as well as use it for unit testing your contributions to Drupal.

You can read the full session description for Nightwatch.js and Drupal on the DrupalCon website.

Preparing yourself and your site for Drupal 9

Last, but certainly not least, we have CTO Kristen Pol leading a session on a particularly hot subject...how do you prepare yourself for Drupal 9? The release of Drupal 9 is fast approaching, hopefully on June 3, 2020 but, worst case, by December 2020. Drupal 9 is not a large departure from previous major releases: the only changes between Drupal 9 and the last release of Drupal 8 are that deprecated code will be removed and third-party dependencies will be updated. This will make it the easiest upgrade ever.

You can read the full session description to prepare yourself, and your website, for Drupal 9 on the DrupalCon website.

Women In Drupal Sponsorship

We are proud and honored to be part of a community that acknowledges a safe space for women. As a woman-owned business, we are excited to be sponsoring the Women in Drupal Lunch again for DrupalCon 2020. Last year we led a Q&A session with those who attended our lunch to get each other talking about the ups and downs we've had along the journey to success within the tech community. To prepare yourself for what that might sound like this year, you can read some reflections from our owners about what it's like to be a woman in technology from our post last year on women's equality.

With all of that said, we're looking forward to another great opportunity to connect with the women, and those who support them and identify as them, in our community. We can't wait to see you all again. Also, we'd like to take this as an opportunity to welcome all newcomers too. The more the merrier!

We're Feeling Boothy

We have made one of the biggest commitments ever this year, by having a booth for the entire duration of DrupalCon. We wanted to make sure you'd always be able to find us while we're there. Whether you have questions about services we offer, a session we've given, or you're just looking for an old friend to say hello to, now you'll know exactly where to find us!

Our team cannot express enough just how excited we are to be returning for another DrupalCon. We hope you all share the same excitement as we do, and look forward to the journey we're about to embark on together. 

Feb 21 2020

The accepted sessions for the next DrupalCon are here—and what an impressive lineup!

Curated content for the 45-minute sessions is packed with compelling insight, information, and your chance to learn the latest in Drupal. Read on for a sampling of sessions. 

Feb 20 2020

Join us on March 19th for an Amazee Labs Webinar about Advanced BDD with Cypress

This session will build on the basics of Behaviour-Driven Development (BDD) and Cypress.io covered in our last webinar. We will take an in-depth look at how and when scenarios are written, common beginner mistakes and general workflow improvements. 

Topics will include:

  • Writing good scenarios
  • Establishing personas
  • Using tags to organize test runs
  • The screenplay pattern - organizing step implementations

Beginners to Experts – all are welcome. Please join us! 

Date: 19 March 2020
Time: 5 pm CEST

Register online now!

Watch our previous Webinars:

Feb 20 2020

As a creative digital agency based in Amsterdam, we’ve gotten used to having two languages of conduct: Dutch and English. We switch back and forth without even noticing and have learned to read, write and conduct business in both languages effortlessly. Our clients are mostly Dutch, but many cater to an international audience or operate beyond borders, so by now there are quite a few multilingual websites in our portfolio.

Despite the potential complexities multilingual websites may pose, Drupal has always been steadfast about supporting all languages - including those with non-Latin scripts and those read from right to left. Whereas multilingual sites in Drupal 7 often required a plethora of internationalization and translation modules, combined with some custom code, Drupal 8 was going to solve all our multilingual headaches once and for all. Or would it?

Everything is perfect. Almost.

Admittedly, Drupal 8 has made it easier than ever to support multiple languages. Its architectural overhaul simplified the structure, thereby making internationalization of content much more logical and efficient. A lot of internationalization functionality was moved to core in order to improve maintenance and support. And of course, to enable Drupal site builders to create multilingual sites out of the box.

So Drupal 8 solves every pet-peeve you could’ve had with multiple languages in a single site perfectly and for good. Right? Not quite. Things are never exactly how we (or our clients) want them and that’s fine. That’s why there are Drupal Developers.

Sleek and minimal will do.

Let’s talk about the language switcher in Drupal 8. It can be enabled, placed as a block in the region of your choice and it pretty much works. It shows all added languages written out fully, like so:

  • English
  • Nederlands

However, as we like our site sleek and minimal and consider our visitors tech-savvy, we would like to customize the links and determine exactly where and how the language switcher links get rendered.

Customize the language block

In order to control the output of the language switcher block, we want to be able to render the necessary links only. We don’t need a block title or any of those pesky elements, we just want an HTML list with links, for example:

<ul class="language-switcher-language-url links">
  <li hreflang="en" class="en">
    <a href="https://www.thesavvyfew.com/" class="language-link" hreflang="en">en</a>
  </li>
  <li hreflang="nl" class="nl is-active">
    <a href="https://www.thesavvyfew.com/nl" class="language-link is-active" hreflang="nl">nl</a>
  </li>
</ul>

Luckily, Drupal 8’s Twig templating system makes it pretty easy to render exactly what we want. Just place this in the page.html.twig of your custom theme where you want your language switcher links (note that the drupal_block() Twig function is provided by the contributed Twig Tweak module):

{# Language switcher #}
{{ drupal_block('language_block:language_interface', wrapper=false) }}

Changing the link labels

Although it’s probably best practice to fully write out the available languages - English, Nederlands, Français - we could do with some more sleek minimalism. Besides, we consider our visitors tech-savvy, so we can probably suffice with showing only the language codes as our links: EN and NL.

Let’s tackle this one with a preprocess function. Just paste this code in your CUSTOMTHEMENAME.theme and the links in your switcher should magically transform into language codes:

/**
 * Use language code for the language switcher links.
 */
function CUSTOMTHEMENAME_preprocess_links__language_block(&$variables) {
  foreach ($variables['links'] as $i => $link) {
    /** @var \Drupal\language\Entity\ConfigurableLanguage $linkLanguage */
    $linkLanguage = $link['link']['#options']['language'];
    $variables['links'][$i]['link']['#title'] = $linkLanguage->get('id');
  }
}

Don’t forget to change CUSTOMTHEMENAME to the name of your custom theme.

Hide links for untranslated content

Once the language switcher is enabled and placed, it’s always there. Even when there is no translated version of the page you are viewing, there is a link to the translation of that page. Which doesn’t exist and will just lead you back to the same node, possibly on an unaliased path. That’s bad for SEO and worse for usability.

Let’s fix this by only rendering translation links when translations really do exist. We’re going to create a custom module. Let’s call it “untranslated”.

Our untranslated.info.yml file looks like this:

name: Untranslated
type: module
description: "Disables language switcher links for untranslated content."
package: Custom
core: 8.x
dependencies:
  - language

And untranslated.module looks like this:


<?php

/**
 * @file
 * Hide language switcher links for untranslated languages on an entity.
 */

use Drupal\Core\Entity\ContentEntityInterface;

/**
 * Implements hook_language_switch_links_alter().
 */
function untranslated_language_switch_links_alter(array &$links, $type, $path) {
  if ($entity = untranslated_get_page_entity()) {
    $new_links = array();
    foreach ($links as $lang_code => $link) {
      try {
        if ($entity->getTranslation($lang_code)->access('view')) {
          $new_links[$lang_code] = $link;
        }
      }
      catch (\InvalidArgumentException $e) {
        // This language is untranslated so do not add it to the links.
      }
    }
    $links = $new_links;

    // If we're left with less than 2 links, then there's nothing to switch.
    // Hide the language switcher.
    if (count($links) < 2) {
      $links = array();
    }
  }
}

/**
 * Retrieve the current page entity.
 *
 * @return \Drupal\Core\Entity\ContentEntityInterface|false
 *   The retrieved entity, or FALSE if none found.
 */
function untranslated_get_page_entity() {
  $params = \Drupal::routeMatch()->getParameters()->all();
  $entity = reset($params);
  if ($entity instanceof ContentEntityInterface) {
    return $entity;
  }
  return FALSE;
}

Now enable this module and your untranslated links should magically vanish until you publish a translation of your page.

There’s been some debate about whether you should remove the links to untranslated content as we demonstrate here, or go for a less radical approach and add CSS classes in order to display the links as grayed out or struck through. This decision will depend on your particular case or client.

The rest of the customizations we applied were just styling, so we’ll leave those up to you and your designers.

Have fun out there!


A big thank you to these very helpful references:

Feb 20 2020

Drupal Camp London is a 3-day celebration of the users, designers, developers and advocates of Drupal and its community! Attracting 500 people from across Europe, it’s one of the biggest events in the Drupal calendar after DrupalCon. As such, we're pleased to sponsor the event for the 6th time!

Drupalcamp weekend (13th-15th March) packs in a wide range of sessions featuring seminars, Birds of a Feather talks, sprints and much more. Over the weekend there are 3 keynotes addressing the biggest upcoming changes to the technical platform, its place in the market, and the wider Drupal community.

Check out all of the accepted sessions on the Drupal Camp London website here. Or keep reading to see our highlights…

CXO Day - Friday 13th of March

From Front Room to Front Runner: how to build an agency that thrives, not just survives - Talk from Nick Rhind

Few digital agency start-ups reach their first birthday, let alone celebrate over 16 years of success. Our CEO Nick Rhind will be sharing anecdotes and advice from 2 decades of building the right teams to help his agency group, CTI Holdings, thrive.

Catch up with Nick, or any of our team attending Drupal Camp by connecting with them on LinkedIn, or via our contact form.

Come dine with us - Agency Leaders Dinner London

Hosts Paul Johnson (CTI Digital), Piyush Poddar (Axelerant), and Michel Van Velde (One Shoe) cordially invite agency leaders to join them for a night of meaningful discussions, knowledge sharing, and of course great food, excellent wine, and the best company you could ask for. Details of the dinner can be found here.

DCL Agency Leaders Dinner 2020

Agency Leaders Dinner London

Drupal Camp Weekend

Drupal in UK Higher Education - A Panel Conversation

Paul Johnson, Drupal Director at CTI Digital, will be hosting influential bodies from the Higher Education community as they discuss the challenges facing universities in a time of light-speed innovation and changing demand from students. In addition, they will explore the role Drupal has played in their own success stories and the way open source can solve problems for other universities. Drupal camp panel details available here.

The Panellists:

Adrian Ellison, Associate Pro Vice-Chancellor & Chief Information Officer University of West London - Adrian has been involved in Registry, IT and Library Services in higher education for over 20 years. He joined UWL in 2012 from the London School of Economics, where he was Assistant Director of IT Services. Prior to that, he was IT Director at Royal Holloway, University of London, and held several roles at the University of Leeds.

Adrian is a member of the UCISA Executive Committee, representing the voice of IT in UK education. He has spoken about information technology at national and international conferences and events and co-wrote the Leadership Foundation for Higher Education’s 'Getting to Grips with Information and Communications Technology' and UCISA’s ‘Social Media Toolkit: a practical guide to achieving benefits and managing risks’.

Billy Wardrop, CMS Service Support Officer at Edinburgh University - Billy is a Senior Developer with 15+ years experience and the current technical lead for the migration to Drupal 8 at The University of Edinburgh. He has worked with many platforms but his passion lies in developing websites and web applications using open source such as Drupal, PHP, JavaScript and Python. Billy is an advocate in growing the open-source community. As part of his role in the university, he regularly mentors at events and encourages software contribution. 

Iain Harper Head Of Digital, Saïd Business School, University of Oxford - Iain started his career at leading medical insurer MPS, developing their first online presence. He then ran digital projects at a leading CSR consultancy business in the Community before joining the Civil Service. Iain worked with the Government Digital Service on Alphagov, the precursor to GOV.UK. He then joined Erskine Design, a small digital agency based in Nottingham where he supervised work with the BBC on their Global Experience Language (GEL). He now leads the digital team at Oxford University’s Saïd Business School.

Open source has won. How do we avoid dying from success? - A Panel Conversation

Drupal, founded on a philosophy of open source, has steadily grown into a global community, a feat some may label as having achieved ‘Success’. Drupal users and contributors will be discussing the sustainability of Drupal and the future of open source in an open panel session.

What are the challenges faced by different roles? How can we make the whole ecosystem fair and future proof? What does an open source business model look like? 

Join host Paul Johnson and Drupal panellists for this thought provoking discussion on the future of open source. More details on the session are available here.

Why should you attend Drupal Camp?

Share useful anecdotes and up-to-date knowledge 

Discover the latest in UX, design, development, business and more. There’s no limit to the types of topics that could come up...as long as they relate to Drupal that is!

Meet peers from across the industry

From C-Level and Site managers to developers and designers over 500 people attended last year. Meet the best and brightest in the industry at talks and breakouts.

Find your next project or employer

A wide variety of business and developers attend Drupal Camp, make the most of it by creating connections to further your own career or grow your agency team.

Feb 20 2020
DrupalCamp London is a great event for all Drupal agencies. It is a conference that brings together hundreds of Drupal experts from around the world. It's a great opportunity to meet people who use, develop, design and support the Drupal platform. Therefore, as usual, we will be there this year as well. We are proud to sponsor and participate in this year's DrupalCamp London, which will take place on March 13-15, 2020 in London.
Feb 20 2020

My appreciation for form API in Drupal is on the same level as my attempt to avoid it when it comes to user facing forms. Both are pretty high. The reasons I love it are because it’s extendible and security is built in. I’ve worked with a few other frameworks in different languages, and my impression is that Drupal’s form API is significantly more advanced than any other solution I’ve seen.

The reason I try to avoid it, on the other hand, is mainly because it’s hard to create forms that satisfy the end users, and achieve their expectations. Basically, forms are bulky, and going with a mix of JS/Ajaxy solutions is often a pain. Having a JS form (i.e. some JS widget that builds and controls the entire form), that POSTs to a RESTful endpoint takes more code, but often times provides a more streamlined user experience.

Not sure why and how, but over the years we’ve been tasked quite a few times with creating form wizards. They’re frequently used for more complex registrations, like students or alumni applying for different programs. In the early Drupal 7 days we went with CTools’ wizard, and then switched to Elm (i.e. a JS form) along with RESTful endpoints. Drupal 8 however has one major feature that makes it very appealing to work once more with form API - that is “Form modes.”

This post has an example repo that you should be able to reliably download and run locally thanks to DDEV. I will not try to cover every single line of code - but rather share the concepts behind our implementation, with some references to the code. The audience is intermediate-level Drupal developers, who can expect to have a good sense of how to use Form modes to build wizards after reading this post and going over the code.

Before diving in, it’s important to recognize that “wizards” come in many flavors. I personally hold the opinion that a generic module cannot be (easily) built to accommodate all cases. Instead, I look at Drupal 8 with its core and a couple of contrib modules as the “generic” solution to build complex - sprinkled with lots of custom business logic - wizards.

In our case, and for demonstration purposes, we’ve built a simplified “Visa application” wizard – the same you may find when visiting other countries. As an aside, I do wish some of the countries I’ve visited lately would have implemented a similar solution. The experience of their wizards did bring me to tear some of the remaining hair I’ve still got.

Our objectives are:

  1. A user can have only a single Visa application.
  2. Users should have an easy overview of the state of their application.
  3. Sections (i.e. the wizard pages) must be completed to be able to submit it to final processing; however, it doesn’t have to happen in a single go. That is, a user can save a partial draft and return to it later.
  4. After a user “signs” the last section, the application is locked. The user can still see what they have submitted, but cannot edit it.
  5. A site admin should have easy access to view and edit existing applications.

With the above worthy goals, we were set for the implementation. Here are the two main concepts we’ve used.

Sections as Form Modes

In Drupal 7 we could define View modes for a node (well, for any entity, but let’s be node centric for now). For example, a “Teaser” view mode would show only the title and trimmed body field; and a “Full” view mode would show the entire content. The same concept was applied to the node forms. That is, the node add/edit form we know is in fact a “Default” form mode. With Drupal 8, we can now define multiple form modes.

That’s pretty big. Because it means we can have multiple forms, showing different fields - but they are all accumulated under a single node.

For each wizard “page” we’ve added a section: section_1, section_2, etc. If you have the example repo running locally, you can see it in:


Form modes configuration page.

Next we have to enable those form modes for our Visa application content type.


Enable form modes for the content type.

Then we can see those form modes enabled, allowing us to setup fields and disable others under each section.


Configure fields under each form mode.

It’s almost enough. However, Drupal still doesn’t really know about those sections, and so we have to explicitly register them.

The next ingredient in our recipe is having our wizard page controller recognize which section needs to be rendered. Drupal 8 has made it quite elegant, and it requires only a few lines to get it. So now, based on the URL, Drupal serves us the node form - with the correct form mode.
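As a rough sketch of what that registration can look like (the module name mymodule is an assumption here, not necessarily what the example repo uses), a form class can be registered for each form mode with hook_entity_type_build():

```php
<?php

/**
 * Implements hook_entity_type_build().
 *
 * Registers a form handler for each wizard section, so that the entity form
 * builder can serve the node form in that form mode, e.g.:
 *   \Drupal::service('entity.form_builder')->getForm($node, 'section_1');
 */
function mymodule_entity_type_build(array &$entity_types) {
  /** @var \Drupal\Core\Entity\EntityTypeInterface[] $entity_types */
  foreach (['section_1', 'section_2'] as $form_mode) {
    // Reuse the default node form class; only the displayed fields differ
    // per form mode, as configured in the form display.
    $entity_types['node']->setFormClass($form_mode, 'Drupal\node\NodeForm');
  }
}
```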

Faux Required is Not Required

One of our goals, as often requested by clients, was to allow saving the application in a draft state. We can easily do it by not checking the “required” option on the field settings, but from a UX perspective - how could a user know the field would eventually be required? So we’ve decided to mark the “required” fields with the usual red asterisk, along with a message indicating they are able to save a draft.

As for the config of the field, there is a question: should the fields be marked as required, and then made un-required on the sections? Or should it be the other way around? We have decided to make them optional, as that has the advantage that a site admin can edit a node – via the usual node edit form – in case of some troubleshooting, and won’t be required to fill in all the fields (don’t worry – we’ll cover the fact that only admins can directly access the node view/edit/delete in the “Access” section).

So to reconcile the opposing needs, we came up with a “faux-required” setting. I have entertained the idea of calling it a “non-required required” just to see how that goes, but yeah… Faux-required is a 3rd party setting, in field config lingo.


Faux required setting in the field configuration page.

By itself it doesn’t do much. It’s just a way for us to build our custom code around it. In fact, we have a whole manager class that helps us manage the logic, and have a proper API to determine, for example, the status of a section – is a section empty, partially filled, or completed. To do that we basically ask Drupal to give us a list of all the faux-required fields that appear under a given Form mode.
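A hedged sketch of how such a lookup could work (the module name mymodule and the third-party setting name faux_required are assumptions for illustration, not necessarily the names in the example repo):

```php
<?php

use Drupal\field\Entity\FieldConfig;

/**
 * Returns the "faux required" field names for a bundle's form mode.
 *
 * Illustrative sketch: assumes the flag is stored as a third-party setting
 * called "faux_required", owned by a module named "mymodule".
 */
function mymodule_faux_required_fields($entity_type, $bundle, $form_mode) {
  $names = [];
  // Load the form display for this form mode to see which fields it shows.
  $display = \Drupal::entityTypeManager()
    ->getStorage('entity_form_display')
    ->load("$entity_type.$bundle.$form_mode");
  foreach (array_keys($display->getComponents()) as $field_name) {
    // Base fields have no FieldConfig, so loadByName() returns NULL for them.
    $field = FieldConfig::loadByName($entity_type, $bundle, $field_name);
    if ($field && $field->getThirdPartySetting('mymodule', 'faux_required', FALSE)) {
      $names[] = $field_name;
    }
  }
  return $names;
}
```

With a helper like this, a section counts as "completed" when every returned field has a value on the node.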

Access & Application Status

We need to make sure only site-admins have direct access to the node, and we don’t want applicants to be able to edit the node directly. So we have a Route subscriber that redirects to our own access callback, which in turn, relies on our implementation of hook_node_access.
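The hook_node_access() part could be sketched roughly like this (the bundle name visa_application and the permission check are assumptions for illustration):

```php
<?php

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Session\AccountInterface;
use Drupal\node\NodeInterface;

/**
 * Implements hook_node_access().
 *
 * Illustrative sketch: only users with an administrative permission may
 * reach visa application nodes directly; applicants use the wizard routes.
 */
function mymodule_node_access(NodeInterface $node, $op, AccountInterface $account) {
  if ($node->bundle() !== 'visa_application') {
    // Don't influence access to other content types.
    return AccessResult::neutral();
  }
  return AccessResult::forbiddenIf(!$account->hasPermission('administer nodes'))
    ->cachePerPermissions()
    ->addCacheableDependency($node);
}
```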

As for knowing in which status the application is, we have a single required field (really required, not faux-required) - a field called “Status” with New, Locked, Accepted and Rejected options. Those are probably the most basic ones, and I can easily imagine how they could be extended.

With this status, we can control when the form is editable by the user or disabled and without submit buttons. Having to write $form['#disabled'] = TRUE; and knowing it will disable any element on the form, is one of the prettiest parts of form API.
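As a sketch of how the lock could be wired up (the bundle name visa_application, the field name field_status, and its "new" value are assumptions here):

```php
<?php

use Drupal\Core\Form\FormStateInterface;
use Drupal\node\NodeForm;

/**
 * Implements hook_form_alter().
 *
 * Illustrative sketch: disables every wizard section form once the
 * application's status field is no longer "new".
 */
function mymodule_form_alter(&$form, FormStateInterface $form_state, $form_id) {
  $form_object = $form_state->getFormObject();
  if (!$form_object instanceof NodeForm) {
    return;
  }
  $node = $form_object->getEntity();
  if ($node->bundle() !== 'visa_application') {
    return;
  }
  if ($node->get('field_status')->value !== 'new') {
    // Disables every element on the form in one go.
    $form['#disabled'] = TRUE;
    // And hide the submit buttons entirely.
    $form['actions']['#access'] = FALSE;
  }
}
```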

Theming that Freaking Required Symbol

The subtitle says it all. Theming, like always and forever, is one of the hardest parts of the task. That is, unlike writing some API functions and having the feeling - “that’s the cleanest way possible” - with theming I often have the feeling of “it works, but I wonder if that’s the best way”. I guess the reason for this is that theming is indeed objectively hard to get “right.”

Anyway, with a bunch of form alters, and preprocess and process callbacks, we were able to tame the beast.

Field Groups

Many fields in different sections can make the job of site admins (or whoever is responsible for processing or supporting the applications) quite hard. Field group is a nice way to mimic the structure of the sections both in the default node form and in the node view.


Wizard section appear as fieldsets in the regular node edit page.


Form modes in Drupal 8 are a very useful addition, and a cornerstone in our implementation of wizards. I encourage you to jump into the code of the example repository, and tinker with it. It has lots of comments, and if you ignore some of the ugly parts, you might find some pretty ones hidden there: extending the complex inline entity form widget with custom logic to be more user friendly, using Form API states for the “other” option and getting that to work with faux-required, implementing a theme negotiator, our DDEV config and more…

And finally, if you spot something you have a better idea for, I’d love to get PRs!

Feb 19 2020
Feb 19

Our normally scheduled call to chat about all things Drupal and nonprofits will happen TOMORROW, Thursday, February 20, at 1pm ET / 10am PT. (Convert to your local time zone.)

This month, in addition to our usual free-for-all, we'll be talking about hosting on Pantheon. There has been a lot of discussion in the community and on the Drupal Slack #nonprofits channel about some of the pricing changes they have implemented. If you would like to discuss and contribute to the conversation, please join us.

We will also have an update on our community's plans for the upcoming Nonprofit Technology Conference (20NTC).

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

This free call is sponsored by NTEN.org but open to everyone.

REMINDER: New call-in information -- we're on Zoom now!

  • Join the call: https://zoom.us/j/308614035
    • Meeting ID: 308 614 035
    • One tap mobile
      • +16699006833,,308614035# US (San Jose)
      • +16465588656,,308614035# US (New York)
    • Dial by your location
      • +1 669 900 6833 US (San Jose)
      • +1 646 558 8656 US (New York)
  • Follow along on Google Docs: https://nten.org/drupal/notes
  • Follow along on Twitter: #npdrupal

View notes of previous months' calls.

Feb 19 2020
Feb 19

Podcast audio: TEN7-Podcast-Ep-082-Mike-Gifford-Accesibillity-Is-a-Journey.mp3


Mike Gifford’s mission is to build better and more inclusive software. He’s a Drupal 8 Core Accessibility Maintainer, founder of OpenConcept, an accessibility consulting firm, and all-around gem of the Drupal community. Mike’s been spearheading website accessibility improvements for over a decade, and we’re thrilled to have him on our podcast. You’ll learn a lot from this episode!


Mike Gifford of OpenConcept Consulting


  • What is accessibility?
  • The four pillars of accessibility: POUR
  • Designing for 80%
  • Permanent, temporary, and situational disabilities
  • How Mike got started fixing Drupal accessibility issues
  • Drupal’s Accessibility Maintainers
  • Rachel Olivero, and the Drupal Olivero project
  • Pivoting OpenConcept Consulting from a Drupal shop to a consultancy
  • How OpenConcept eats its own dog food
  • The future of accessibility is personalization
  • The preferences framework widget
  • Establish trust by showing individual needs of website visitors are important
  • Accessibility and fonts
  • How do you help users when we can’t see their cues?
  • How accessible is Drupal Core?
  • Frontend pages AND backend pages must be made as accessible as possible
  • How are other CMSes and frameworks doing with regards to accessibility?
  • We are all affected by some sort of disability



IVAN STEGIC: Hey everyone! You’re listening to The TEN7 Podcast, where we get together every fortnight, and sometimes more often, to talk about technology, business and the humans in it. I’m your host Ivan Stegic. My guest today is Mike Gifford, Founder and President of OpenConcept Consulting, Inc. in Ottawa, Canada. OpenConcept is a web development agency specializing in Drupal, much like ours, and a benefit corporation, a B corp. Mike is also Drupal’s core accessibility maintainer and has been since 2012. Hey Mike, welcome. It’s great to have you on the podcast.

MIKE GIFFORD: It’s great to be on once again. It’s been a while, but definitely enjoy having an opportunity to talk again with you about Drupal and accessibility and things involved with digital tech.

IVAN: I love it. I can’t believe it’s been almost three years ago, episode 6. We’re coming up to episode 84. Wow. It’s been a long time. I feel like we’ve just scratched the surface.

MIKE: Absolutely. It’s amazing how time passes when you’re busy serving client needs and keeping up with changes in the Drupal community.

IVAN: Busy having fun I think, is what we call it. [laughing] So, let’s talk about accessibility. People throw that word around quite a bit, don’t they? I think we believe we know what it means, and some people say, “Oh your site has to be ADA compliant.” Other people say, “Yeah, we need WCAG or WCAD compatibility.” Why don’t we start with the definition for accessibility? What do we mean in our industry when we talk about that for a website?

MIKE: So, essentially it means removing barriers to make sure the people are able to access your content, whatever disabilities they have or whatever tools and devices they’re using to go off and to access it. The World Wide Web Consortium’s Web Accessibility Initiative broke this down into four main pillars that they’re using for the WCAG 2.0 framework, and I think to summarize it quite nicely:

Let’s make sure that your web content is perceivable so that people can understand it and read it and absorb the information.

Let’s make sure that it’s operable so if you’ve got some sort of interface people are working with, then the people can interact with the web forms, and they can engage with it, and click on the buttons and navigate the website.

Let’s make sure that it’s understandable. This is one that particularly government websites fail on. And there’s very, very few websites that really excel at plain language and making sure that things are written so that people can absorb the information on the flow and eliminate all that technical jargon and information that gets in the way of comprehension.

Finally, let’s make sure that it’s robust. So many websites work great if you’re in a dark room with a really new monitor that’s sitting there, but if you’re on your phone and navigating on a bright day, you’re going to have a hard time going off and viewing the information, accessing the information. So, let’s think about this in the real world situations that people engage with technology on. It’s not always going to be in that ideal environment where those light-gray-on-dark-gray backgrounds work.

You really need to be able to think about the context with which people are using the technology. So that’s how I think about technology within the web accessibility framework.

IVAN: And that’s POUR right? P-O-U-R. Perceivable, Operable, Understandable, Robust.

MIKE: That’s right. And there’s a whole bunch of criteria that you can use to evaluate your website based on that. It’s an interesting framework that’s designed to be technology agnostic. It doesn’t really matter what kind of technology you’re working with, whether it’s a PDF file or whether it’s a website—those principles are things that you can use to guide your thinking around accessibility.

IVAN: So those are high-level principles. That’s how we want to design from a design-first perspective so that the context of the design itself is available and accessible from anyone using the site.

MIKE: That’s right. There’s some really interesting work being done by people looking at inclusive design, and this isn’t a new movement, but there’s some neat work that Jutta Treviranus has set up to look at designing for the fringes. So often people think about the 80/20 rule, and it’s like, Well let’s just go off and design for that 80%, and then worry about that additional 20% later. We won’t necessarily factor that into the equation. Whereas, if you look at the fringes and design for the extremes, then you can be confident that everyone’s needs are going to be met, and you can work ahead to see that you’re able to deal with it.

And also, the 80/20 rule is a great concept for many things, but it doesn’t make a lot of sense. Like, how many businesses would just write off 20% of the United States? Would you get rid of New York and New Jersey? Whatever you think about them, would you just eliminate selling to those as potential customers, because it’s inconvenient for some reason? Probably not. It’s a large chunk of the population to ignore. So, thinking about the fringes and accessibility early on in the design process really allows you to go off and to serve a much broader range of the population than most people are aware of.

IVAN: The 80/20 rule is by definition exclusive. You are actually saying we’re excluding one-fifth of the population because we choose to.

MIKE: Because that’s inconvenient.

IVAN: Cause it’s inconvenient.

MIKE: Yeah. There’s really neat work done by Microsoft actually with the Inclusive Design Toolkit, and what they’ve done is try to look at, not just people with permanent disabilities, but to try and extend the definition out more broadly so that you have people that have permanent disabilities. But then there’s people who have temporary disabilities, and then there’s people who have situational disabilities. So, for example, a temporary disability might be you left your glasses at home, so you’re having trouble reading stuff at the office. Or, you’re in a situation where you’re taking medication, and so your eyesight may not be as good while you’re on this particular medication. Or, maybe you’ve broken your dominant arm, so you’re trying to navigate the mouse with your left hand, and it’s just not as good as it was with your dominant hand.

There are things like that, that are temporary issues that we all undergo as part of living in a complex world. But the situational ones are things like you’re in a noisy environment and you can’t use Siri to go off and interact, because Siri can’t cancel out all the noise that is coming from the area. Or, you’re in a situation where you want to use your laptop outside on a sunny day, and you can’t because there just isn’t enough contrast in the pages to be able to read the information effectively. So, again, thinking about all of these different ways that people interact, even if they’re not defined as having their disability, but there are times and places where everyone has a disability.

IVAN: Context is important, isn’t it?

MIKE: Yeah.

IVAN: You’re so passionate about accessibility. And besides it being the right thing to do, to be inclusive, to think about others that might not have the same abilities as you do. How did that happen? How did you get to be so passionate? Where did your start in accessibility come from?

MIKE: I think a huge part of my background came from having a good friend who has cerebral palsy, who is a real champion for disability rights and who schooled me on some of the theory about how to think about disability, and to think about the abilities that I have. So that certainly is a really key element to it. I started making changes to the Drupal community, getting them involved in affecting Drupal. And it suddenly became addictive because many people who work in the accessibility field go off and address a particular issue, and they fix a particular issue for a particular website. But that’s not what I was focusing my time and energy on. I was focusing my time and energy on fixing Drupal, which is 3% of the web.

So, I was able to go off, working with a whole team of other people to transition Drupal from being a reasonably good standards-compliant CMS to being by far the best, most accessible content management system out there, because of some of the work that I was spearheading. It was interesting to go off and look at ways of supporting people, and so the first Drupal Accessibility Maintainer was Everett Zufelt, who also taught me a great deal. When he was contributing the most to the Drupal community, he was working with me as staff at OpenConcept.

So, I was able to go off and learn from him as a blind user, and to learn what his experiences were with Drupal and review my own assumptions about what was possible, and how to address that. Everett is no longer an accessibility maintainer, but in Drupal 8 there are two other accessibility maintainers, Andrew MacPherson and Rain Breaw, who are taking on a role of pushing things ahead and addressing the accessibility community within Drupal.

So, we’re going to be having regular office hours, and I think it is the last Tuesday of every month.

IVAN: I’ve just been so impressed with the amount of accessibility that Drupal has garnered in the last five years. It’s just been so nice to see the improvements that have happened. Of the maintainers, do we have any maintainers that have disabilities that have the inclusive perspective of actually using the web and being able to maintain Core from a disability perspective?

MIKE: Actually no, and that’s an area where we could actually use a lot of additional work. I don’t think that either Andrew or Rain have a disability, at least not that I’m aware of. But we haven’t had enough people in the Drupal community who have disclosed their disabilities and who have stepped up to get involved. We had a few people who have done that. Everett is one. Vincenzo Rubano was another. He came to DrupalCon Portland from Italy. And as an individual, Vincenzo has contributed more to Drupal 8 accessibility than all of the governments in the world combined, and this is what he did in the year before he started university. So, it’s quite an amazing accomplishment in many ways. And also, why is it that governments around the world are not doing more on accessibility? It’s a bit baffling.

But the last person we’ve had to sort of highlight in terms of people with disabilities who have been involved with the accessibility team, was Rachel Olivero, who unfortunately died last year. So, that was a sad thing. She was quite involved in the diversity community within Drupal and had gone to two other DrupalCons and unfortunately, she died suddenly and is no longer with us.

IVAN: I recall that. That was devastating news that we’d heard. I’d had such a wonderful dinner with her at that DrupalCon that we both attended on trivia night.

MIKE: Yes.

IVAN: I think she was part of the National Association of the Blind, if I’m not mistaken.

MIKE: That’s right. She was working with the National Federation of the Blind and had moved through a few different roles, but had actually just launched their Drupal 8 website and had made a lot of advancements in that, and it was really nice to see that. She had also made a couple of contributions to the Drupal community with accessibility bugs that she had identified. There is now the Olivero project within Drupal, with the new theme that is going to be coming out with Drupal 9. I’m looking forward to seeing that, a theme that’s being named in her honor. So, that’s lovely.

IVAN: That’s really lovely. We will link to that from the podcast episode page. So, if you’re listening, do visit the website for more information. Now, OpenConcept is focused on accessibility as a core part of your business, isn’t it?

MIKE: Yes, it is. We’re actually in a position where we’re pivoting from being a Drupal shop where that’s primarily what we do, to actually having a role as a digital agency and doing more consultation and support work with others. Because of the work that I’ve done on accessibility, we’ve been able to take systems perspective to accessibility and sustainability and security and really look at this at a higher level and to step back and address these issues. So, we’re doing more work as a digital agency going ahead, and not just as a Drupal shop.

IVAN: Well that’s a wonderful development for you. You certainly have the wherewithal and the knowledge to be providing that kind of consulting, so I love to hear that evolution. I love the fact that your website itself, openconcept.ca, eats its own dog food, so to speak. There’s a widget drawer at the top with some sort of preferences. I haven’t seen that before. It allows anyone to change, essentially, the design and the contrast and the typeface, and everything you would want to make the site more accessible, I would guess. Tell me about that preferences pane.

MIKE: Our own website is one that often doesn’t get as much attention as we’d like to. So, we started a process to rebuild our website, and I’ve had to put that on hold because of some other issues. But, yeah, our website, we’ve definitely built it for accessibility, but accessibility is a journey, you need to be able to invest in that on a regular basis. So, I’d like to be doing more with our website than we are.

But specifically about the widget that we have on our website: I realized that one of our challenges with the WCAG process is that it’s building a single website, and meeting the needs of everyone through a single website. But unfortunately, disabilities are such that that doesn’t really make any sense. There are people who have low vision and need high contrast. There are people who have dyslexia and need low contrast. That’s just one example. There’s people who get really frustrated when they see Comic Sans as a font. There’s other people for whom the best way they can read content is with Comic Sans or OpenDyslexic or some other customized font.

So, how do you try to give people the range of exposure to go off and absorb information in a way that suits their needs? Having a single site is not going to be able to achieve all of those goals. So, we see that the future of accessibility is really towards providing personalization. Yes, you want to go off and meet a minimum standard requirement. You want to make sure that your default website is meeting the base level, the Perceivable, Operable, Understandable, Robust guidelines that are set forth in WCAG 2.1, which is the latest version. That’s the goal that people should be aiming for. But you can extend it to even more people by going off and allowing individuals to have personal choices.

The IDRC, which is the Inclusive Design Research Centre in Toronto, put forward a preferences framework widget that we’ve incorporated in our website. We did this because we were working with the Canadian National Institute of the Blind, which is the equivalent of the NFB in the US, or the RNIB in the UK, and we wanted to incorporate that within their website.

So, we first tested it on our website and looked to make sure that we could work through the bugs and unknowns and uncertainties with that tool on our website, before we went off and implemented it with our clients. Again, that idea of eating your own dog food and evaluating this and building the best practice by demonstrating the best practice is something that we wanted to be able to do.

So, we implemented this widget and have contributed back to the IDRC, because the preferences framework is an open source widget that we were able to build on and incorporate into Drupal as a Drupal module. There’s now Drupal 7 and Drupal 8 implementations of the preferences framework.

IVAN: If you design a site so that it’s meeting these accessibility parameters in WCAG 2.1, then do you need the preferences framework?

MIKE: You don’t for the legislation. If your goal is to try and make sure that you’re just checking a box, and that you’re meeting the requirements and you’re not going to get sued, then no, you don’t need to worry about the widget. But if your goal is actually increasing the use and participation and usability of your site, if your goal is improving usability, then this widget is actually quite useful to go off and to give your users the ability to provide a custom interface, or custom display for the site. And there are ways that people can override the CSS pages that are custom built to their own browser, but that’s more complicated than most users would go off and know how to do, and often it is something that doesn’t work well with the website.

But, if you build in this framework then you can evaluate, Well, how does it work with the dark background? How does it work with the light background? How does it work with a yellow/black high contrast mode? And your developers and designers can evaluate some basic ideas to make sure that your SVG files show up appropriately, that you’re able to go off and provide a good experience for somebody, even if they do need to invert the color scheme. It’s useful to do that even for the number of people who are now preferring to use dark mode for their websites.

IVAN: How do you deal with marketers and brand stewards of corporations and organizations who will, I’m sure, inevitably say something like, “That preferences pane destroys our brand. It’s not consistent with what our brand guidelines say.” How do you deal with that kind of pushback?

MIKE: I think I would say that if something like this destroys your brand, then your brand is not very strong. You want to be able to go off and have some control over your presentation, your site, the default settings, but ultimately a brand is about establishing trust with the customer. And there’s no better way to establish trust with your customer than to demonstrate that their individual needs are important to you, and that you can serve their individual needs.

So, what could possibly be better than having a widget on their website that says, You can buy your Nike runners even if you're blind or if you’re dyslexic. We’re going to make it easy for you to buy our products and give you the support you need, however it is that it serves you best. And you’re not to get stuck on stupid proprietary fonts or highly custom color combinations that were approved by some highly paid branding office. Ultimately the brand has to be strong enough that the trust and that care for the user shines through, and I think that this preferences framework is a part of that.

IVAN: I’ve been tracking the variable fonts recently. I know they’ve been around for quite a while now, but they’ve only recently been getting more traction, I would guess. Do you know anything about variable fonts and how they affect or not affect accessibility? I would imagine there’d be a relationship there.

You have more control over the kerning and the size. They’re somewhat different than regular fonts that have specific sizes. And when you say bold, for example, they go to a specific bold typeface. You can actually change the size and characteristics of these variable fonts with CSS. And I would imagine that would be highly useful from an accessibility perspective.

MIKE: Fonts are definitely an interesting area and it’s amazing to see the changes in the web that make it look more attractive, more compelling. But there isn’t a lot in WCAG to address fonts. Sometimes it just comes down to a matter of judgment. A lot of times fonts are too narrow to be easily read, and there’s no standard way to evaluate how thick or how thin a font can be without affecting the accessibility of it. And, as our monitors get more and more refined, you can create thinner and thinner fonts.

So, I think we are going to get to the point where fonts are going to be more easily evaluated, but some of it comes down to even just base readability. Like, all the debates have happened between, you know, is Helvetica better than Arial? Better than Times? There’s a lot of studies on this, but again there’s nothing in the standards that we’re looking at that say, This is the best font, or This is the way that you’re going to address fonts in meeting accessibility. Because it’s hard to pin down, hard to go off and quantify, and so much about WCAG is about making it quantifiable.

It’s not just about opinions. It’s about a quantifiable, demonstrable barrier that you’re able to address. I can see that with variable fonts, one of the neat opportunities there is to be able to say, How would you use something like the preferences framework to allow a user to go off and customize the fonts?

IVAN: Exactly.

MIKE: Let’s say you want to have a fat font for this. The font doesn’t need to be larger, it just needs to be bolder, just make everything bolder. And you could with this, very easily go off and have a setting that allows users to have that ability to build that in, and to think about ways—or just switching fonts entirely. You can either stick with just making the font that was chosen customizable, to go off to meet your specific user’s needs. Or you can go off and say, Let’s give them the option to pick from five or six other fonts that might meet their needs and allows them to more easily absorb the information. Because ultimately what we want is the ability for an author to communicate to another person.

So, how do you communicate that information so that the author or presenter is able to go off and convey as rich in information as possible for the person receiving that information to absorb. And, if you were doing it in a face-to-face conversation, we can know how to go off and slow the pace of our speech, or to speak more loudly if somebody has hearing impairments. We know how to do that because of personal cues that allow people to change their presentation for a particular individual.

But it’s harder on the web when you’ve got technology mediating that communication, and we don’t have those personal cues to guide us. So we need to actually give that opportunity for feedback to the user and encourage them to select preferences that allow them to choose something that allows them to more easily use your website.

IVAN: The standard that we’re looking at right now is WCAG 2.1, and people usually say that [level] A is the bare minimum. AAA is, Are you insane, do you have a lot of money? What is the goal of doing AAA? And AA is usually the one that organizations land on, if I’m not mistaken, to describe it loosely. Where are we at with Drupal for accessibility? For Drupal Core? And what’s next?

MIKE: So, just a bit of a correction. In the United States, the standard is still the revised Section 508 standard, which is pegged to WCAG 2.0 AA, and more or less that is the standard. Internationally, we’ve moved on from WCAG 2.0 AA, because that was the standard written in 2008.

IVAN: I didn’t realize it was that old.

MIKE: Yeah, so the original Section 508 that was in place up until January of 2017 was written in 1997, so that was how old the standard was for the Section 508. Which is still better than the—actually no, it wasn’t better than the WCAG 1.0—these are old, old standards. But WCAG 2.1 was released in 2018, so it’s a much more current guideline, and WCAG 2.2 should be released later this year. And the plan is to go off and make these releases much more regular, in order to keep up with the pace of technology. Waiting a decade or two between updates of accessibility standards truly does leave out a lot of people. So, the standards are evolving.

So, as far as where Drupal is, we’ve done a good job at Drupal Core of meeting WCAG 2.0 AA for both the frontend web pages and for the administrative pages. It’s not perfect, and there’s a lot of known issues in the accessibility queue, but we’ve addressed a lot of the base issues that people run into.

Because Drupal 8 has a lot of interactive elements, there’s content that changes on the fly with JavaScript, and we need to be able to add more instances for support for this, so that we can alert screen reader users that the screen has changed. So, ARIA live is something that was introduced in Drupal 8. There’s a Drupal JavaScript function called Drupal.announce() that allows developers to go off and to codify how screen readers are alerted to changes to dynamic content on the page. So, we need a lot more work done to implement that in Drupal 8. Drupal 8 has done a lot of work on ATAG 2.0. ATAG 2.0 is the Authoring Tool Accessibility Guidelines, and this essentially says, we’re going to look at the authoring experience, and Drupal is an authoring tool, and we’re going to say, part A of ATAG is Let’s make sure that the authors with disabilities can actually be publishers of content and not just consumers of content. Right?
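The Drupal.announce() function Mike mentions is a small JavaScript API. As a hedged sketch of how it might be used inside a custom behavior after dynamic content changes (the behavior name, selector, and message below are made up for illustration, not taken from Drupal core):

```javascript
// Sketch only: assumes Drupal core's announce library is loaded on
// the page (e.g. by declaring a dependency on core/drupal.announce).
(function (Drupal) {
  Drupal.behaviors.resultsAnnounce = {
    attach(context) {
      // Hypothetical markup: a list of results swapped in via AJAX.
      const list = context.querySelector('.search-results');
      if (!list) {
        return;
      }
      // Tell screen reader users that the page content has changed.
      // The optional second argument maps to the aria-live priority:
      // 'polite' (the default) or 'assertive'.
      Drupal.announce(
        Drupal.t('@count results loaded', { '@count': list.children.length }),
        'polite'
      );
    },
  };
})(Drupal);
```

A production behavior would typically also guard against re-announcing on repeated attach calls, but the core idea is just this: one function call that routes a translated message into a visually hidden aria-live region.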

And this is something that Everett Zufelt really went off and drove home to us: we couldn’t just rely on making the frontend pages accessible, we also needed to make the backend pages accessible, or people like Everett were not going to be able to publish content using Drupal. So, in Drupal 7, we made some big advances in addressing backend accessibility. That’s been carried over in Drupal 8, and that’s Part A of ATAG: making the authoring interface as accessible as possible.

Part B is actually, I think, more interesting and more useful, particularly for institutions, and I’m sad to see that there isn’t more attention paid to Part B of ATAG, because that’s about how we use these systems to make it easier for authors to create accessible content. We can’t expect authors to be accessibility experts, so we need to think through the tools and the technology that they use to support and guide users in doing the right thing.

We need to set good defaults for accessibility in the authoring tool. With millions of users adding new content to Drupal, which powers 3% of the web, there’s a huge number of people using it on a regular basis who need to be involved. And if you don’t have the system helping authors make the right decisions, then it should be no surprise that you have accessibility issues being added by authors who are not familiar with best practices.

But if you get the tools involved in setting up proper constructs, then you can limit the damage that users can do. You can guide them to make the right decisions. And there’s a lot more that can be done in that space. We’ve done more than anyone else, but we have not done enough in that space.

IVAN: I like to hear that we’ve done more than anyone else, but I’d be even happier to hear that “everyone else” is close on our heels, and equally accessible. So, before we close, I wanted to hear your assessment of other competitors, other CMSes and frameworks out there, and what their accessibility is looking like. How do they compare? Let’s talk about maybe a couple of the open source ones we all know and love, like WordPress, and maybe talk about React or Gatsby, any of the things that come to mind for you. How do they compare?

MIKE: So, I was really quite hopeful with the WordPress community up until the Gutenberg debacle that came out. When I was in Holland in the fall, I had a great meeting with one of the WordPress accessibility leads that had stepped down because of how that was handled. It was a really interesting presentation that she and I had with others in Holland around accessibility. So, I’m less optimistic about the future of WordPress accessibility than I was, because of leadership issues within the WordPress community.

But there’s lots of good people involved in creating accessible themes in WordPress and that’s great. But it does require it to be a priority for the leadership in order for it to be really ingrained in the community, and that’s one of the things that Drupal has really stood out in.

I’m really impressed by Gatsby and with Marcy Sutton and others who have a really deep ingrained passion for accessibility that they’re building into the process. So, if you’re building a Gatsby site or a Gatsby page, accessibility checks are now just part of the process of doing a Gatsby build. Just having that built into the framework, into how you build a good Gatsby website, is so wonderful. Marcy was involved with axe-core, which is an automated accessibility engine that the Deque folks built a while back, and it’s really been taking off. And the Microsoft community is jumping on board with that; Microsoft has built a tool called Accessibility Insights that uses it.

There’s also the Google Lighthouse tool that uses axe-core as well. So, it’s nice to go off and see that that’s built into Gatsby, and that there’s a commitment to that from senior levels in the Gatsby community.

I hadn’t really seen a lot of other examples where content management systems are taking this seriously. I do think that Microsoft is an organization that we do really need to be aware of, both because of their interest in open source and their passion for accessibility, and that has really been a real transition in the last two or three years.  

IVAN: Yeah, who saw that coming?

MIKE: Yeah, like what the heck? So, they’re incredible leaders in the space and making a lot more money because of it, and that’s both wonderful and fascinating. I certainly did not see that coming [laughing]. I was definitely not one of the people that expected this. But it’s quite wonderful, and I think it will be neat to see what Microsoft comes up with, and I’m not sure there’s enough money in the world to go off and make SharePoint accessible [laughing].

IVAN: [laughing] Yeah, SharePoint doesn’t have a great, stellar accessibility. I mean, even for the rest of us, the user experience could be better.

MIKE: That’s right, but it is interesting that Microsoft has made a cultural shift in how they think about both open source and accessibility, and sustainability, for that matter. They’re committed to being carbon negative by 2030, so they are making some big bold leadership commitments in the tech space, and I think that they will pay off for Microsoft, and I think that is something that others will follow. But I haven’t seen a lot from most others. Like React itself, I haven’t seen a lot of pickup and movement around this, and I haven’t seen it from Angular, either.

IVAN: What about Sitecore or any of the proprietary CMSes?

MIKE: I don’t think it’s part of the process. They’re not looking at building it accessible by default, and partly that’s because clients are not demanding it. There are not enough organizations who are demanding accessibility as part of the default system. I think this is changing. The governments in Europe are starting to be aware of this and looking at Drupal as a model, but it’s not being incorporated into the procurement process. So sales folks are not hearing about this; they’re not losing their contracts over it.

IVAN: Yeah, and you would expect there to be a more vocal demand. As you know there was this report from the World Health Organization in 2011 that said that about 15% of the world’s population lives with some form of disability. And it’s probably higher than that, that’s almost 10 years ago that that report was done. So, you would think that there was a demand, that corporations would see this, and would move in that direction, but I guess not.

MIKE: That’s only the people with permanent disabilities that they were addressing. They weren’t counting temporary or situational disabilities. If you’re looking at temporary or situational, it’s a higher percentage.

IVAN: Right. We all experience bright sunlight when we go out and use our phones, don’t we?

MIKE: That’s right. It’s a universal thing. Unless you don’t go outside [laughing].

IVAN: Right. [laughing] and the same thing with getting older, we all lose our eyesight as we’re getting older. Our vision becomes impaired and so there’s that to think about as well.

MIKE: That’s right, and you’ve got the aging baby boomer population. You think about the gray tsunami, all of those ideas require us to think differently about the web, because it’s not just the way that we see color change as we age, the way that we navigate websites changes. Disability is just part of life. We do not have the abilities we did when we were 20. This should not be a shocker to anyone right?

IVAN: I wish I had those abilities still, Mike. [laughing] It’s been so great talking to you. I feel like we didn’t even cover some of the things I wanted to get to, talking about the government and your work with the Canadian government and how you’ve been keeping track of the Accessible Canada Act. Would you come back sooner than the next three years, and we could have another recording and another episode, and we can get into those ideas as well?

MIKE: I would absolutely be keen on doing that, and then hopefully it’ll be something that’ll be done in person as well, which will be way more fun than doing it remotely.

IVAN: You know what, that would be great. Next time you’re in Minneapolis, I know you won’t be here for DrupalCon this year, but next time you are, let’s do that.

MIKE: That’d be great. And who knows, maybe I’ll find a way to get to DrupalCon. Maybe I’ll make that possible but it’s not in the cards right now, but, yeah, it’d sure be a lot of fun.

IVAN: That would be so much fun. Thank you so much for spending your time with me today. It’s been a great pleasure talking to you.

MIKE: No problem.

IVAN: Mike Gifford is Founder and President of OpenConcept Consulting Inc. in Ottawa, Canada. You can find them at openconcept.ca. They’re a web development agency that specializes in Drupal and is pivoting to be more of a strategic consulting firm. They’re also a B Corp. Mike is one of Drupal’s few core accessibility maintainers, and you can find him online @mgifford on Twitter. You’ve been listening to the TEN7 podcast. Find us online at ten7.com/podcast. And if you have a second, do send us a message. We love hearing from you. Our email address is [email protected]. Until next time, this is Ivan Stegic. Thank you for listening.

Feb 19 2020
Feb 19

Content collaboration has long been table stakes for content management systems like WordPress and Drupal, but what about real-time peer-to-peer collaboration between editors who need direct interaction to work on their content? The WordPress Gutenberg team has been working with Tag1 Consulting and the community of Yjs, an open-source real-time collaboration framework, to enable collaborative editing on the Gutenberg editor. Currently an experimental feature that is available in a Gutenberg pull request, shared editing in Gutenberg portends an exciting future for editing use cases beyond just textual content.

Yjs is both network-agnostic and editor-agnostic, which means it can integrate with a variety of editors like ProseMirror, CodeMirror, Quill, and others. This represents substantial flexibility when it comes to the goals of WordPress to support collaborative editing, and the potential for other CMSs like Drupal to begin exploring the prospect of shared editing out of the box. Though challenges remain to enable truly bona fide shared editing off the shelf in WordPress and Drupal installations, Gutenberg is brimming with possibility as the collaboration with Tag1 continues to bear significant fruit.

In this Tag1 Team Talks episode that undertakes a technical deep dive into how the WordPress community and Tag1 enabled collaborative editing in the Gutenberg editor, join Kevin Jahns (creator of Yjs and Real-Time Collaboration Systems Lead at Tag1), Michael Meyers (Managing Director at Tag1), and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for an exploration of how CMSs around the landscape can learn from Gutenberg's work to empower editors to collaborate in real time in one of the most exciting new editorial experiences in the CMS world.

[embedded content]

Feb 19 2020
Feb 19

Why should businesses go for it?

  • Law. There are already many lawsuits, and plaintiffs consistently win and receive compensation. For example, the Californian retailer Bag’n Baggage was forced to pay $4,000 to a plaintiff who wasn’t able to shop on their inaccessible website. Oftentimes there is a dedicated accessibility law in a particular country; check the link to government policies in the Useful links block.
  • Morals. This is the century of inclusion, so let’s keep up with it. Your users are many different, unique people, so make sure they all have access to your services.
  • Lost profit. As long as you want to make money, remember that, according to the World Health Organization, “about 15% of the world's population lives with some form of disability”. That’s 1.1 billion people, a huge market.

Which businesses are the most vulnerable to lawsuits?

According to 3PlayMedia, these industries are in danger of being sued:

  • retail
  • food service
  • travel
  • banking

If you have an app that expects users to actively interact with it, double-check the accessibility standards.

Accessibility and Drupal

Now that you’ve recognized yourself among those who need an adapted website, what technical solution should you go for?

We often advise using the Drupal CMS, as it has many accessibility features out of the box and the Drupal community actively contributes new functionality.

So what does Drupal have?

  • Special markup in HTML, for example, language tags
  • Image descriptions and alt text
  • Built-in responsive images
  • Headings that can be used for page-level navigation
  • Controlled tab order that enables non-visual and non-mouse users to access all the elements on the page

There are also contributed modules for Drupal 8. Check the link in the block Useful links.

Accessibility best practices

Here we would like to address some common-sense practices that you should take into account from the very first stage of development.

According to WCAG 2.1 (Web Content Accessibility Guidelines), your website or app should be perceivable, operable, understandable, and robust.

There are 3 levels of accessibility conformance: A, AA, and AAA. Usually, implementing the AA requirements is enough for legal compliance. Below we list some A and AA practices that, in our experience, are violated most often.

1. All content should be perceivable

  • All non-text content should have a text equivalent if possible.
  • You should create captions for audio and video.
  • The reading sequence should be understandable and obvious; otherwise, it should be programmatically determinable.
  • It’s considered poor practice to rely solely on color, shape, or sound to convey content. Some people don’t perceive colors, some have hearing impairments, and some cannot follow overly complicated logic due to aging or cognitive impairments.
  • The contrast ratio of text and images should be at least 4.5:1.
  • Text should be resizable up to 200% without any assistive devices.

2. The content should be operable

  • To cut a long story short, all functionality must be operable through a keyboard
  • If there are time limits for some actions (for example, a photo carousel), users should be able to turn them off
  • No more than 3 flashes per second; don’t trigger seizures

3. The content should be understandable

  • Navigational mechanisms that are repeated throughout the website should occur in the same order
  • Components with the same functionality should be identified the same way throughout the website
  • If an error occurs, it should be identified to a user

4. The content should be robust and compatible with different assistive technologies. For example, elements should not contain duplicate attributes, and all IDs should be unique. The name, role, and value of user interface components should be programmatically determinable.

Write and publish an accessibility statement on your website: there you can specify exactly how you improved accessibility and how you address the topic in general. Perhaps you have an accessibility professional on your team, or you conduct accessibility audits regularly.

Find a more detailed description in the PDF in the drop-down banner.

Useful links

  1. Web Accessibility Laws & Policies
  2. Web Content Accessibility Guidelines (WCAG) 2.1
  3. Drupal 8 accessibility features
  4. Drupal 8 modules for extending accessibility
Feb 18 2020
Feb 18

Paragraphs is a new way of creating content. It allows site builders to keep things cleaner and gives more editing power to end users.
Paragraphs is a very popular Drupal module for handling content. It is similar to content fields and provides a wide range of options for the design, layout, and grouping of content as per your requirements.

Types of Drupal Paragraphs and Their Usage

Paragraphs can be of different types: anything from an image to a video, or from simple text to a configurable slideshow.
Instead of putting all the data in one body field, we can create multiple Paragraphs types with different structures. This allows the end user to choose between pre-defined Paragraphs types. By using Paragraphs, we can create multiple featured Paragraphs, allowing end users to pick the one most suitable for their needs.
The Drupal Paragraphs module is easily manageable for non-technical users, while also offering Drupal developers the ability to control the appearance of the elements at the theme level.

How to use the Drupal 8 Paragraphs module

1. Install and enable the Drupal Paragraphs module.

[Screenshot: Drupal Paragraphs module]

The Drupal Paragraphs module requires the Entity Reference Revisions module. To work with the Paragraphs module, install and enable Entity Reference Revisions as well.

[Screenshot: Drupal Paragraphs field]
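For a Composer-based site with Drush available, a sketch of the install could look like this (assuming the standard drupal/paragraphs project name; adjust to your workflow):

```shell
# Download Paragraphs; Composer pulls in the required
# Entity Reference Revisions module automatically.
composer require drupal/paragraphs

# Enable both modules on the site.
drush pm:enable paragraphs entity_reference_revisions -y
```

If you install through the admin UI instead, the dependency is enforced there as well.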

2. Create a new Paragraphs Type

  • To take advantage of the Drupal Paragraphs module, create at least one Paragraphs type.
  • Navigate to Structure > Paragraphs types.
  • Click “Add paragraphs type”.
  • On the next page, there are three fields to fill in: Label, Paragraph type icon, and Description. The Label field (mandatory) is the name given to the paragraph type. If required, an icon and a description can also be provided.
  • After editing, click “Save and manage fields”.
  • In Manage fields, click “Add field”.
  • Here you can add as many fields as required, including text fields, image fields, booleans, etc. This is similar to adding fields to a content type.
  • After adding the field, click “Save and continue”.
  • Clicking “Save and continue” takes you to the “Field settings” tab. Each field has settings such as maximum length and allowed number of values, which is useful when a field needs to accept more than one value.
  • Click “Save field settings”.
  • In the next tab, set the title for the field that is displayed when new content is created.
  • Then click “Save settings”.
  • The field you created can now be seen inside the Paragraphs type.
[Screenshot: Managing fields in the Paragraphs module]

3. Add Paragraphs to the content type:

  • Navigate to Structure > Content types. Choose the content type that should use the paragraph you created.
  • Go to “Manage fields” of the content type and click “Add field”.
  • To use Paragraphs, open the “Select a field type” dropdown list and select “Paragraph” under “Reference revisions”. After selecting “Paragraph”, give the field a label, which is used while creating content. After labeling, click “Save and continue”.
  • On the next page, there are two settings: “Type of item to reference” and “Allowed number of values”. “Type of item to reference” should be set to “Paragraph”, and it is usually best to set the number of values to “Unlimited” so that values can be added as many times as needed. Click “Save field settings”.
[Screenshot: Choosing the type of item to reference]
  • Clicking “Save field settings” takes us to the next tab, where we can choose which Paragraphs types can be used in this content type. If you want to restrict the field to particular paragraph types, check the ones required. Otherwise, click “Save settings” without checking any; this shows a dropdown during content creation, letting editors use any of the paragraph types that have been created.
  • After clicking “Save settings”, we can see the field with the type “Entity reference revisions”.

4. Adding content to a content type that uses Drupal Paragraphs.

  • Go to Content > Add content, and choose the content type to which the paragraph was added.
  • You will see a paragraph field with all the fields added to the paragraph, plus two more options: “Remove” and “Add (label of the field referencing the paragraph)”. To add multiple values with the same structure, click add; to remove a paragraph, click remove.

Features of the Paragraphs module

  1. Allows the editor to create different structures on each page.
    If there are different structures on the same page or on different pages, Paragraphs can help. For example, an image with text at the top and a slideshow at the bottom: we can use Paragraphs to make this less complicated.
  2. Allows the editor to change the order of the paragraphs.
    Suppose the display changes: there is an image at the top followed by a title and description, and you want the title at the top followed by the image and description. Such changes can be made easily with Paragraphs.
    Go to “Manage display” of the paragraph type and change the order; this changes the display order of all content wherever the paragraph is used.
  3. Paragraphs can be nested.
    One paragraph field can reference another paragraph field. While adding a field to a Paragraphs type, select “Paragraph” under “Reference revisions” and choose which Paragraphs type you want to reference.
Feb 18 2020
Feb 18

In today’s fast-paced digital environment, time is one of our most precious, if not the most precious, commodities. One would then figure that we would value it and manage it well, right? 

Well - not really. While most of us understand the importance of time and its limited amount, we somehow fail to put this understanding into practice and waste our time on distractions and procrastination rather than taking the reins and living the life we’ve always dreamt of living. 

Even if we dedicate our time to the things that matter, we quickly realize that there are just not enough hours in a day to cover all the important fields, i.e. work, family, hobbies and personal care. This is why good time management is even more important to a happy and fulfilled life.

In this post, I’ll further discuss the importance of time management and give you some tips on how to effectively manage your time and get more out of your life. By the end of the post, you’ll be one step closer to having full control over how and where you spend your time.

Why is time management so important?

If we don’t take control over how we spend our time, all areas of our lives suffer. Either we procrastinate by binge watching Netflix and endlessly scrolling through social media feeds, or we stretch ourselves too thin by trying to juggle everything - both cases can lead to health problems, i.e. burnout and depression/anxiety, respectively. 

That’s why it is crucial to, on the one hand, have clearly set goals and schedules, and, on the other, maintain a healthy work-life balance by not allowing your work to start seeping into your you-time. 

If we don’t make these plans and keep our work and personal life separate, we start damaging our relationships, diminishing our overall quality of life.

In the end, we and we alone own our time - and we must learn how to effectively manage it. Achieving this, we’ll be more efficient and productive, and have more opportunities in life while greatly reducing stress and avoiding a poor professional reputation. 

It’s not something new

The science behind time management isn’t some groundbreaking new trend; it was a popular topic among Ancient Roman philosophers, was sidelined during the Middle Ages, and then resurged at the end of the 19th century with the industrial era. I doubt anyone today could’ve put it as nicely as Seneca did over two millennia ago.

Then the start of the 20th century and the concurrent rise of capitalism brought new advances to time management. Frederick W. Taylor, the first “time management guru”, was able to increase production from 12.5 to 50 tonnes of steel per day - but, seeing how he used this much higher figure as the new daily standard, this wasn’t exactly beneficial to the workers.

Just a few decades later, in 1930, British economist John Maynard Keynes predicted that the work week would be reduced to only 15 hours within the following hundred years. Now, a mere 10 years before the end of that hypothetical timeframe, it is clear that this is not the case - all the more reason, then, to become the master of your time.

How to effectively manage your time

There’s an abundance of time management tips out there; trying to follow all of them right from the get-go, however, would likely just have the opposite effect and cause you even more time-related issues.

So, as a general rule of thumb, remember that “less is more”, but also keep in mind that not everyone is the same and that different approaches work for different people. Some prefer following a set of just a few guidelines, while others would be left immobilized without at least 20. 

Here are some tips that have helped me personally and can serve as a great starting point:

  • Set goals the right way,
  • find a good time management system and/or tool,
  • audit your time for seven days straight,
  • spend your morning on MITs (most important tasks),
  • follow the 80-20 rule (the Pareto principle),
  • instill keystone habits into your life,
  • schedule email response times,
  • eliminate bad habits, or at least reduce the amount of time spent on them.

Once you get the hang of it, you can start working even more intently on mastering your time:

  • Take frequent breaks while working,
  • meditate or exercise every morning,
  • at the end of each day, make a to-do list for the next day,
  • find inspiration when you’re feeling lackluster,
  • get a mentor who can guide and support you,
  • turn off social media notifications,
  • and, finally - declutter and organize.

In the end, the most important thing is to incorporate these rules into your routine. The results likely won’t be visible immediately, so it’s important to persevere and not throw in the towel immediately. Once you internalize them and make them an integral part of your life, they’ll become intuitive and you’ll really start seeing the benefits of following them.

Real-life example of the importance of valuing time

Some time ago I was present for a call between a client and one of our developers. The client is from central U.S., which means an 8-hour time difference. It was supposed to be a short, 15 minute-ish meeting intended to keep all sides updated on the progress of the project that week.

Unfortunately, it quickly became apparent that the client’s project manager wasn’t adequately prepared for the meeting - they were trying to get answers to our developer’s questions on the fly by contacting their CTO, which stretched the meeting into 45 rather than the intended 15 minutes. 

Worst of all, the developer did not get any of her questions answered, while both of us had to prolong our workday to accommodate the call while the project manager was scrambling to find answers in-house. 

This could easily have been avoided with proper preparation: you should always have a predefined meeting agenda, which enables you to go through the issues quickly. Then send a follow-up email to everyone involved with a recap of the meeting and steps to go forward. Even saving just 30 minutes is something most people will greatly appreciate.

In conclusion

I think we can all agree on how important time management is to all aspects of our lives. If you were not convinced before, or just never really knew how to get started, I hope this post has helped you realize this importance and put this mindset into practice. 

In case you want to learn more about and really dive deep into managing your time, I highly recommend Nir Eyal’s excellent book Indistractable which will help you tackle one of the major obstacles to effective time management - distraction.

If, however, you’re looking for something shorter, this Harvard Business Review article by Erich C. Dierdorff is another great resource with useful insights. 

Time is your most valuable asset, and those who understand and live by this are primed for a successful and happy life. I wish you the best of luck on your journey of mastering your time!

If you'd like to work with experienced developers who always value their clients' time, reach out to us and we'll craft the perfect team for your project's needs. 

Feb 18 2020
Feb 18

In the first part of our two-part blog series on Drush 10, we covered the fascinating history of Drush and how it came to become one of the most successful projects in the Drupal ecosystem. After all, many of us know many of the most common Drush commands by heart, and it’s difficult to imagine a world without Drush when it comes to Drupal’s developer experience. Coming on the heels of Drupal 8.8, Drush 10 introduces a variety of new questions about the future of Drush, even as it extends Drush’s robustness many years into the future.

Your correspondent (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) had the unique opportunity to discuss Drush’s past, present, and future with Drush maintainer Moshe Weitzman (Senior Technical Architect at Tag1), Fabian Franz (Senior Technical Architect and Performance Lead at Tag1), and Michael Meyers (Managing Director at Tag1), as part of the Tag1 Team Talks series at Tag1 Consulting, our biweekly webinar and podcast series. In the conclusion to this two-part blog series, we dig into what’s new in Drush 10, what you should consider if you’re making a choice between Drush and Drupal Console, and what the future for Drush might hold in store for Drupal’s first CLI.

What’s new in Drush 10

Drush 10 is the version of Drush optimized for use with Drupal 8.8. It embraces certain new configuration features available as part of the upcoming minor release of Drupal, including the Exclude and Transform APIs as well as config-split in core. Nevertheless, the maintainers emphasize that the focus of Drush 10 was never on new additive features; instead they endeavored to remove a decade’s worth of code from Drush and prepare it for many years to come.

To illustrate this fact, consider that Drush 9 was a combination of both old APIs from prior versions of Drush and all-new APIs that Drush’s maintainers implemented to modernize Drush’s commands. Therefore, while Drush 9 commands generally make use of the newly available APIs, if you call a site with Drush 9 installed from Drush 8, it will traverse all of the old APIs. This was a deliberate decision by Drush’s maintainers in order to allow users a year to upgrade their commands and to continue to interoperate with older versions. As a result of the removals of these older approaches, Drush 10 is extremely lean and extremely clean, and it interoperates with sites having Drush 9 but not those with earlier versions.

How should developers in the Drupal community adopt Drush 10? Moshe recommends that users upgrade at their earliest convenience through Composer, as Drush’s maintainers will be able to offer the best support to those on Drush 10.
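Under that recommendation, the upgrade for a Composer-managed site is essentially a one-liner (the version constraint below is an assumption; adjust as needed):

```shell
# Update the project's Drush dependency to the Drush 10 series.
composer require drush/drush:^10

# Verify the installed version.
./vendor/bin/drush --version
```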

Why Drush over Drupal Console?

One key question that surfaces frequently concerning Drupal’s command-line ecosystem is the distinction between Drush and a similar project, Drupal Console, and when to use one over the other. Though Drush and Drupal Console accomplish a similar set of tasks and share similar architectures because they both depend on Symfony Console, there are still quite a few salient differences that many developers will wish to take into account as they select a command-line interface to use with Drupal.

Commands, for instance, are one area where Drush and Drupal Console diverge. Command authors will find that commands are written quite differently. Drush leverages an annotated command layer on top of Symfony Console where developers employ annotations to write new commands. Drupal Console instead utilizes Symfony Console’s approach directly, with a few methods attached to each command. However, this is a minor consideration, as there is little to no difference in the CLI’s functionality, and it is merely a stylistic preference.

Drush and Drupal Console also differ significantly in their approaches to testing. Whereas Drupal Console performs unit testing, Drush prefers functional testing, with a full copy of both Drupal and Drush in their test suite. All Drush CLI commands are run on a real, fully functional Drupal site, whereas Drupal Console opts to leverage more mocking. There are admittedly many advantages to both approaches. But perhaps the most important distinction is of a less technical variety: Drupal Console has seen a bit less contribution activity as of late than Drush, which is an important factor to consider when choosing a CLI.

The future of Drush: Drush in core?

Though Moshe and Greg have committed themselves to maintaining and supporting Drush in the future, there are doubtlessly many questions about Drush’s roadmap that will influence decision-making around Drupal.

Drush’s inclusion in core has long been a key talking point with a variety of potential resolutions. Drupal already has two CLI commands in it unrelated to Drush, namely the site-install and quick-start commands, which are seldom used as they have limited coverage of key use cases. For instance, site-install only installs Drupal successfully on SQLite databases and lacks consideration for configuration. Drush’s maintainers are keen on considering a version of Drush in core, and an active discussion is ongoing.

Moreover, now that the starter template for Drupal projects is deprecated in favor of core-recommended, there is an opportunity for Drush 10 to serve as a key dependency in those starter templates, initially as a suggested dependency and eventually as a required one. Some of the key commands that a hypothetical Drush in core would encompass include enabling and uninstalling modules as well as clearing caches and logging in as a user. In the not-too-distant future, a Drupal user could start a Drupal project and immediately have Drush and all its commands available from the very outset.
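For reference, those operations look like this in Drush 10 today (the module name is a placeholder; substitute your own):

```shell
./vendor/bin/drush pm:enable pathauto -y      # enable a module
./vendor/bin/drush pm:uninstall pathauto -y   # uninstall a module
./vendor/bin/drush cache:rebuild              # clear all caches
./vendor/bin/drush user:login                 # one-time login link
```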


Drush 10 is an inflection point not only in the history of Drupal but in how Drupal developers interact with Drupal on a daily basis. Thanks to its leaner, faster state, Drush 10 marks a new era for remote interactions with Drupal. Because Drush 10 has tracked closely to the Drupal 8 development cycle, many of the core changes present in Drupal 8.8 are reflected in Drush 10, and the ongoing discussion surrounding the potential of Drush in core will doubtlessly continue apace.

For many of us in the Drupal community, Drush is more than a cherished tool; it is one of the primary entry points into Drupal development. With the help of your contributions, Drush can reach even greater heights. Moshe recommends that new contributors get started with improving Drush’s documentation and content concerning Drush, whether it comes in the form of blog posts or step-by-step tutorials that make learners’ experiences much better. The Drush maintainers are always happy to link to compelling content about Drush, to address bugs and issues in Drush’s issue queue, and to offer co-maintainership to prolific contributors.

While this was an exhaustive look at Drush 10, it by no means includes all of the insights we gathered from Moshe, and we at Tag1 Consulting encourage you to check out our recent Tag1 Team Talk about Drush 10 to learn even more about Drush’s past, present, and future.

Special thanks to Fabian Franz, Michael Meyers, and Moshe Weitzman for their feedback during the writing process.

Photo by Bill Oxford on Unsplash.com

Feb 17 2020
Feb 17

I was recently reading an interesting blog post concerning web accessibility. It seemed insightful and informative, until I came upon a link to “download a PDF …” for more information. 

But wait. These PDFs were not accessible.

I shouldn’t have been surprised. This kind of thing happens all the time. As the concept of web accessibility enters the mainstream as a “must-have,” the bigger picture is often getting missed.

Free training: Making PDFs Web Accessible

The PDF Problem

People are sometimes reluctant to accept that PDFs are included in the scope of web accessibility requirements. It’s not uncommon for organizations to have hundreds of inaccessible PDFs on a site, or to have multiple links off to inaccessible PDFs. The surface of this challenge has barely been scratched.

It’s true that PDFs can be remediated and made accessible, and WCAG 2.1 provides steps for creating accessible PDFs. Generally speaking, though, the process of making PDFs accessible is time-consuming and cumbersome. In the current accessibility climate, the most efficient approach is a mindset shift away from PDFs as the standard for document saving, and toward a realization that properly structured HTML files are the most accessible document type.

PDF Proliferation

When marketing an event online or promoting a product, a PDF is viewed as the go-to solution for saving a file. Organizations design and link offsite to PDFs that serve as digital posters for downloading and printing. 

It’s not uncommon for trifold brochures to be saved as a PDF and posted to a site -- not a good UX and abysmal for accessibility. 

PDFs are also relied upon in organizational governance. Meeting notes or agendas tend to be created as Word docs and then saved as PDFs. Governments, healthcare providers, corporations, educational institutions, and large organizations of every kind have hundreds if not thousands of PDFs on their sites that they are just now realizing need to be remediated for ADA accessibility. 

Automation: Not the Answer

Automated remediation tools serve as only a partial solution, at best. For one thing, automated tools cannot be counted on to accurately create alt text for images. That’s a task that will always call for human judgment. 

A bigger issue is the fact that content on a WCAG compliant web page is required to be structured within the H1 to H6 HTML hierarchy of headings. Headings on a PDF promotional page or product announcement cannot be counted on to follow any particular logic that either a screen reader or automated remediation tool can make sense of. 

PDF Evaluation and Prioritization

An accessibility audit on a site that includes hundreds of PDFs functions akin to triage, with each PDF being evaluated as necessary or not for remaining on the site, followed by a strategic evaluation as to whether it makes more sense to remediate the PDF for WCAG compliance or convert it to an HTML file.  

This exercise proves to be strategically valuable for many reasons. I’ve yet to encounter a website that does not benefit from an overall inventory and focus on clearing out the “clutter.” 

As is the case with all initiatives designed to ensure WCAG compliance, inventory and auditing of PDFs for accessibility sets the stage for more sustainable web strategies, greater inclusivity, and better UX for all.

There’s much more to learn. Please join me next month for our upcoming training where we will cover the PDF accessibility requirements within the broader scope of WCAG compliance.

Register NOW

Making PDFs Web Accessible
Wednesday, March 18, 12 p.m. - 1 p.m. CST


Feb 15 2020
Feb 15

This month for SC DUG I gave a talk on the importance of self-directed learning for professional development as a developer — or really any other modern career. It was an extension and revision of my December blog post on the same topic. The presentation runs a hair over 30 minutes, and parts of the discussion are included as well.

[embedded content]

We frequently use these meetups to practice new presentations, try out heavily revised versions, and test new ideas with a friendly audience. If you want to see a polished version, check out our group members’ talks at camps and cons. If some of the content of these videos seems a bit rough, please understand we are all learning all the time, and we are open to constructive feedback.

If you would like to join us, please check out our upcoming events on MeetUp for meeting times, locations, and remote connection information.

Feb 14 2020
Feb 14
When creating an e-commerce platform for a given industry, you need to adapt the system to specific types of products. Every product type has different characteristics, e.g. size, colour, etc. Check out how easily you can customise Drupal Commerce to sell any type of product.

We are happy to tell you how to easily add attributes to products for several selected industries. We are a Drupal agency, and we deal with this every day.

Attributes are the characteristics of given product types. For example, when you sell your own original T-shirts, attributes mean you do not have to add a separate product every time you want another shirt size or colour. You create one product, add a size attribute and a colour attribute, and create other versions of the same product.
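The combinatorial idea behind attributes can be sketched outside Drupal as well. This illustrative Python snippet (not Drupal Commerce API code; the product and attribute names are invented) shows how one base product plus two attributes yields every sellable variation:

```python
from itertools import product

def build_variations(sku_base, attributes):
    """Generate one product variation per combination of attribute values."""
    names = list(attributes)
    variations = []
    for combo in product(*(attributes[n] for n in names)):
        sku = "-".join([sku_base, *combo])
        variations.append({"sku": sku, **dict(zip(names, combo))})
    return variations

# One T-shirt product with a size and a colour attribute:
shirt = build_variations("TSHIRT", {"size": ["S", "M", "L"],
                                    "colour": ["red", "blue"]})
print(len(shirt))       # 3 sizes x 2 colours = 6 variations
print(shirt[0]["sku"])  # TSHIRT-S-red
```

Adding a fourth size or a third colour simply produces more combinations, which is exactly why you maintain one product rather than dozens.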
Feb 14 2020
Feb 14

Drupal core security updates can’t be predicted; they can arrive at any time, and we need to keep our sites updated. Maintainers in other parts of the world stay awake through the night when security updates are released.

Consider the numerous sites built for small businesses today. If a site maintainer is on hand to manage these updates, there’s no problem at all. But what if there is no maintainer?

People often have questions like:

  1. “Has anyone built the script which will download, backup, and install the updates?”
  2. “Why upgrade, with all security updates which pop up? It seems like I need to upgrade every month.”

New updates arrive frequently. That is part of the software world, be it open source or commercial. The Drupal security team provides security releases as quickly as possible rather than leaving you with an insecure site. 

There has been talk for the past few years about automating Drupal core updates, and so the “Automatic Updates” Drupal core strategic initiative was formed. If successful, it will secure a lot of vulnerable Drupal sites. 

Currently, the Automatic Updates feature is being developed as a contributed module. Eventually it will ship in Drupal core as an experimental module and, if all goes well, could land as a stable core feature. 

Since the work for Automatic Updates is so vast, tasks are being worked on in phases. 

Automatic Updates is divided into the following two phases, of which Phase I is now stable:

Objectives of Phase I

  • Providing a JSON feed of Drupal PSAs from Drupal.org

  • Displaying PSAs in the Drupal admin interface

  • Providing an extensible update readiness check system

  • Generating update packages from Drupal.org

  • Securing the update packages with a signing system

  • Applying the updates, manually or automatically, with roll-back

In this first phase, the Automatic Updates module includes the Public Service Announcement and Readiness Check features and can apply in-place updates manually or on cron. An update that contains database updates will be rolled back.

Objectives of Phase II

  • Providing an A/B front-end controller for more robust testing/roll-back features

  • Supporting contributed module automatic updates

  • Supporting composer-based site installs

The goal is to implement a secure system for automatically installing updates in Drupal, lowering the total cost of ownership of maintaining a Drupal site, and improving the security of Drupal sites.

Public service announcements (PSAs)

Announcements of highly critical security releases for core and contrib modules are made infrequently. When a PSA is released, site owners should review their sites to verify that they are up to date with the latest releases and that the site is in good shape to update quickly once the fixes are provided to the community.

Drupal.org provides a JSON feed of Drupal Public Security Announcements to be consumed by the automatic updates module.

That feed includes values for the following: 

  • type: the project type (core, module, theme, etc.)

  • project: the short name of the project the PSA is for

  • title: the title of the PSA

  • is_psa: a flag indicating that the post is a PSA (and not another kind of Security Advisory)

  • link: the link to the full PSA on drupal.org

  • insecure: metadata about which versions of the affected project are known to be insecure

  • pubDate: the date the PSA was published
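
A hypothetical entry in that feed might look like the following (the values are invented for illustration; the real feed’s exact formatting may differ):

```json
{
  "type": "core",
  "project": "drupal",
  "title": "Drupal core - Moderately critical - Example advisory",
  "is_psa": 1,
  "link": "https://www.drupal.org/psa-2020-example",
  "insecure": ["8.7.0", "8.7.1"],
  "pubDate": "Wed, 15 Jan 2020 12:00:00 +0000"
}
```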

Update Checklist

The module runs a set of readiness checks to decide whether a site is ready for an upgrade or not, e.g. pending hook updates or changes made to Drupal core files.

Demonstrating Automatic Updates

Step 1: First, check whether an update is available

Drupal 9 automatic updates

Step 2: Configuring Automatic Updates

Drupal 9 automatic updates

Step 3: Now examine the PSAs and readiness checks in the configuration

Click on the ‘Manually run the readiness checks’ link under READINESS CHECKS.

Drupal 9 automatic updates

Step 4: Under ‘Errors found’ on the status report page, you can see the failed checks along with the reasons

drupal 9 automatic updates


Drupal 9 automatic updates

Wish to contribute to Automatic Updates?

Feb 14 2020
Feb 14

Consider a small to mid-size Drupal project. Often, once development is complete, sites (Drupal, WordPress, or any other framework) are left forgotten. This leaves the site vulnerable to attack, especially when a new Drupal security release is announced, since the release itself exposes the vulnerability publicly. A site that is properly maintained and updated at regular intervals is fine; one left unattended is not.

People often have questions like:

  • “Has anyone built the script which will download, backup, and install the updates?”
  • “Why upgrade, with all security updates which pop up? It seems like I need to upgrade every month.”

What if we had a process by which Drupal could automatically update itself, removing the vulnerability altogether? 

There has been talk for the past few years about automating Drupal core updates, and so the “Automatic Updates” Drupal core strategic initiative was formed. If successful, it will secure a lot of vulnerable Drupal sites. 

Currently, the Automatic Updates feature is being developed as a contributed module. Eventually it will ship in Drupal core as an experimental module and, if all goes well, could land as a stable core feature. 

Since the work for Automatic Updates is so vast, tasks are being worked on in phases. Automatic Updates is divided into the following two phases, of which Phase I is now stable.

Objectives of Phase I

  • Providing a JSON feed of Drupal Public service announcements from Drupal.org

  • Displaying PSAs in the Drupal admin interface

  • Providing an extensible update readiness check system

  • Generating update packages from Drupal.org

  • Securing the update packages with a signing system

  • Applying the updates, manually or automatically, with roll-back

In this first phase, the Automatic Updates module includes the Public Service Announcement and Readiness Check features and can apply in-place updates manually or on cron. An update that contains database updates will be rolled back.

Objectives of Phase II

  • Providing an A/B front-end controller for more robust testing/roll-back features

  • Supporting contributed module automatic updates

  • Supporting composer-based site installs

The goal is to implement a secure system for automatically installing updates in Drupal, lowering the total cost of ownership of maintaining a Drupal site, and improving the security of Drupal sites.

Public service announcements (PSAs)

Announcements for highly critical security releases for core and contrib modules are done infrequently. When a PSA is released, site owners should review their sites to verify they are up to date with the latest releases and the site is in a good state to quickly update once the fixes are provided to the community.

Drupal.org provides a JSON feed of Drupal Public Security Announcements to be consumed by the automatic updates module.

That feed includes values for the following: 

  • type: the project type (core, module, theme, etc.)

  • project: the short name of the project the PSA is for

  • title: the title of the PSA

  • is_psa: a flag indicating that the post is a PSA (and not another kind of Security Advisory)

  • link: the link to the full PSA on drupal.org

  • insecure: metadata about which versions of the affected project are known to be insecure

  • pubDate: the date the PSA was published

Readiness Checks

Below are the checks used to decide whether a site is ready for an upgrade. Sites can’t receive automatic updates if:

  • they don’t have sufficient disk space;

  • they are on a read-only file system;

  • they have pending (un-run) database updates;

  • the Drupal core source code has been modified.

When a PSA is released and the site is failing readiness checks, it is important to resolve the readiness issues so the site can be updated quickly.
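
As a rough illustration, the checks above could be expressed like this in Python (this is a sketch of the kinds of checks described, not the module’s actual API; the function and message strings are invented):

```python
import os
import shutil
import tempfile

def readiness_errors(site_root, pending_db_updates, core_modified,
                     required_free_bytes=10 * 1024 ** 2):
    """Return a list of reasons why a site cannot receive automatic updates."""
    errors = []
    # Check 1: enough disk space to stage and apply the update package.
    if shutil.disk_usage(site_root).free < required_free_bytes:
        errors.append("Insufficient disk space.")
    # Check 2: the codebase must be writable, not on a read-only file system.
    if not os.access(site_root, os.W_OK):
        errors.append("Site is on a read-only file system.")
    # Check 3: pending database updates must be run first.
    if pending_db_updates:
        errors.append("Pending database updates must be run first.")
    # Check 4: patched core files can't be safely overwritten.
    if core_modified:
        errors.append("Drupal core source code has been modified.")
    return errors

print(readiness_errors(tempfile.gettempdir(),
                       pending_db_updates=False, core_modified=True))
```

An empty list would correspond to the “No issues found” message shown in Step 5 below.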


A quick look at how to use Automatic Updates

Step 1: First, check whether an update is available by going to “Reports » Available Updates” in the administration pages.

Drupal 9 automatic updates

Step 2: Install and configure the Automatic Updates contrib module. Go to “Config » System » Automatic Updates”.

Drupal 9 automatic updates

Step 3: Now check the PSAs and Readiness checks in the configurations. Notice the PSA shown in the messages section.

PSA notification


Step 4: Click on the “Manually run the readiness checks” link under READINESS CHECKS.

If the readiness check has failed, a list of failed checks is shown in the messages. These error messages, with reasons, can also be found under “Errors found” on the status report page.


drupal 9 automatic updates


Drupal 9 automatic updates

Step 5: If the readiness check shows “No issues found. Your site is ready for automatic updates”, the site is ready for an automatic upgrade.

Readiness check done

Step 6: Click on the “manually update now” link inside the “Experimental” section to upgrade the site.

update successful

drupal version updated

Wish to contribute to Automatic Updates?

Feb 13 2020
Feb 13

Integrating With Key Systems

One of the overall goals when embarking on this redesign was to create an experience that mimicked the streamlined usability of an ecommerce site. HonorHealth wanted to create a place where patients could easily make choices about their care and schedule appointments with minimal effort. In order to provide this experience, it was imperative that the new platform play well with HonorHealth’s external services and create a firm foundation to integrate more self-service tools in the future.

In particular, Palantir integrated with three services as part of the build: SymphonyRM, Docscores, and Clockwise.

  • SymphonyRM offers a rich set of data served by a dynamic API. HonorHealth leverages SymphonyRM’s Provider Operations services to hold its practice location and physician data. Palantir worked closely with Symphony to help define the data structure. Through this work, HonorHealth was able to reduce the number of steps and people required to maintain their provider and location data. By leveraging the strategy work done and the technical consultation of Palantir’s Technical Architecture Owner, HonorHealth was able to keep focused on the most valuable content to their users throughout all of their integrated systems.
  • Docscores provides a platform for gathering high-quality ratings and review data on healthcare practitioners and locations. Palantir integrated this data directly with the physician and location data provided from SymphonyRM to create a research and discovery tool for HonorHealth users. On the new HonorHealth website, users can find physicians near a specific location and read about other patients’ experiences with them.
  • Clockwise provides real-time wait estimates for people looking for Urgent Care services in their area.

None of these individual “under the hood” integrations represents a significant shift for website users on its own, but coupled with the intense focus on putting the user experience of the site first, the result speaks for itself: a beautiful website that works well and empowers people to engage in their ongoing healthcare in meaningful ways.

Feb 13 2020
Feb 13

Identifying “Top Tasks”

The biggest negative factor of the previous ETF site’s user experience was its confusing menus. The site presented too many options and pathways for people to find information such as which health insurance plan they belong to or how to apply for retirement benefits, and the pathways often led to different pages about the same topic. Frequently, people would give up or call customer support, which is only open during typical business hours.

Palantir knew the redesign would have the most impact if the site was restructured to fit the needs of ETF’s customers. In order to guarantee we were addressing customers’ most important needs, we used the Top Task Identification methodology developed by customer experience researcher and advocate Gerry McGovern.

Through the use of this method, we organized ETF’s content by the tasks site users deemed most important, with multiple paths to get to content through their homepage, site and organic search, and related content.

Feb 13 2020
Feb 13

Hi. Please read the next paragraph in your best David Attenborough voice.

When the majestic monarch takes flight on its marathon migration to Mexico, it’s easy to be moved by their mission. Each year hundreds of thousands of the small, winged creatures take to the sky with a common goal: survival and prosperity. No single butterfly completes the trip by itself, but as a group, they complete one of the most remarkable journeys in the world.

Okay, back to your normal voice now.

Migrating your website doesn’t usually have the same visual impact, but behind the scenes, it can be just as impressive as the monarch’s multigenerational marathon. Ask any experienced Drupal developer, and they’ll tell you that a website migration can rival the monarch migration in effort. They’re transferring an entire website from an older version of Drupal to a new one (Drupal 7 to Drupal 9, for instance), making sure everything stays the same and still works.

There are a plethora of different reasons to migrate your website to a new platform, and we’ll talk about some of the common situations a little later. Specifically, we’re going to be focusing on migrating a Drupal 6 or 7 site to Drupal 8. This information will get you ready for Drupal 9, the latest release of Drupal coming later in 2020. However, a lot of this general information also applies to site migrations using other Content Management Systems (CMSs), so if you aren’t on Drupal, don’t worry! This article isn’t useless!

So what’s migration, and what does it mean for your site? In straightforward terms, website migration is the process of copying data and configurations from Site A (built on Platform X) to Site B (built on Platform Y) and using that data to create users, nodes, configs, and components automatically. Essentially, you’re transferring every piece of content and data from one system to another. These systems can be anything, an old Wordpress installation to a new Drupal installation, for example, or a Joomla site to a Concrete5 site, or even an old Drupal 6 to a fresh Drupal 8 installation. It’s important to remember that your beautiful monarch, built out of content and configurations, isn’t what’s changing. It’s migrating to a newly built home that’s already familiar to it.
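
In code terms, a migration is essentially an extract-transform-load pipeline. The sketch below is illustrative Python with invented field names, not a real CMS API, but it shows the shape of the process:

```python
# Simplified sketch of a migration pipeline (invented data, no real CMS APIs).
def extract(source_rows):
    """Read records from the old site (Site A, Platform X)."""
    yield from source_rows

def transform(row):
    """Map old field names/formats onto the new site's content model."""
    return {
        "title": row["node_title"].strip(),
        "body": row["body_value"],
        "status": "published" if row["status"] == 1 else "draft",
    }

def load(destination, rows):
    """Create the corresponding records on the new site (Site B, Platform Y)."""
    for row in rows:
        destination.append(transform(row))

old_site = [{"node_title": " Hello ", "body_value": "<p>Hi</p>", "status": 1}]
new_site = []
load(new_site, extract(old_site))
print(new_site[0]["title"], new_site[0]["status"])  # Hello published
```

Real migration tooling adds a lot on top of this (rollbacks, ID mapping, incremental runs), but every migration boils down to these three steps.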

Uhh… That sounds like too much work. Why would you ever want to migrate your site? Fear not true believers! I’m going to walk you through an example! Let’s play make-believe for a minute and say that you’re the owner of WidgCo, makers of the most excellent widgets in the whole world. Still on the same page as me? Okay, cool.

You originally built your website back in 2008 using Drupal 6 so that WidgCo had an online catalogue for the 347 widgets you offer. Users would find the part numbers on your website, and then call your ordering phone line to make the purchase. The types of products you provide, the price point of those products, and most importantly, the buying habits of your typical customer meant that online purchasing wasn’t a feasible way to go right off the bat. Your clients are used to picking up the phone, reading off the part numbers they want to one of your sales reps, and then they get mailed their invoice at the end of the month.

It’s 2020 now, though, and your customers want to be able to finish all their widget ordering and invoice payments online. You’ll need to install a module enabling e-commerce on your site to meet their modern expectations. Since you’re dealing with sensitive financial information, you’re going to want to make sure that it’s as stable and secure as possible. Bad news, however: you’re still on Drupal 6, and the module developers have stopped supporting the Drupal 6 and 7 versions and are only updating the Drupal 8 version. To get the new features you need, you’ll have to migrate your site from your current, older version of Drupal to a more modern version.

Now that we know why we need to migrate, the next question, of course, is, “How do I migrate?” As usual with these sorts of things, there are different options around how to migrate your website. No matter the method, though, the first step is backing everything up! If something goes wrong and your site breaks, you’re going to be left high and dry if you don’t have a proper backup ready to rock. Once you’ve got everything safely backed up, we can move on to the next step, where we finally start the migration.

Here’s where we hit a fork in the road. You can choose one of two main methods to perform your migration: automatically using the Migrate Upgrade module, or writing some good ol’ fashioned code by hand and doing it manually. To pick which method is best for you, you need to ask, “is my content model going to change?” Is your website content going to be staying mostly the same? Is it going to be organized in the same way on the new site?

If your content is going to stay pretty much the same, then automated migration using the Migrate Upgrade module is the way to go. You’ll save time, money, and sanity. When you go the automatic route, the module analyzes your Drupal 6 or 7 website’s content model. The module will detect content types and configurations on your site, all by itself. Once it’s done that, it will automatically generate the proper content types and configuration for your new Drupal 8 site. If there are some little changes you need to make, the Migrate Upgrade module has an API available to developers so that they can make alterations or customizations to the process. 

I don’t know about you, but that’s all a bit confusing to me, and I’m the monkey writing it. I’ll use another one of those handy-dandy, fancy-schmancy metaphors to explain it a bit differently. Imagine that your website is the building where our fictional WidgCo is headquartered. The building is no longer suitable for your company to keep growing and progressing, so you’ve got to migrate to a new office. See what I did there?

When your workers built your office, they used the best, most advanced materials they had available, but over time those materials have deteriorated, decayed, or been shown to be unsafe. The ceilings are filled with asbestos, the horrid fluorescent lights won’t stop flickering, and nobody is totally sure what the original colour of the carpet was (wait, is this even a carpet?). Having said all that, the layout of the building has been great for WidgCo. The locations of team leaders relative to their team members have been carefully optimized. The processes of the quality assurance folks on the assembly line are perfect, with every widget leaving WidgCo meeting your exacting tolerances. Even the way that employees clock in and out every day is just how everyone likes it. You need a new building that can be updated, upgraded, or renovated moving ahead. At the same time, you need to keep your optimized office layout, your excellent employee roster, and your perfected processes precisely the same. Otherwise, WidgCo will lose productivity, fall behind, and miss out on money. You need to plan your move so that your workers and clients can come to your building and keep going about business as usual, as if nothing changed.

That’s what migration can do for you. Automated migration will look at the old building (your old website), identify the different teams, where they sit, what their processes are, and it will automatically categorize all of that information into a plan. By having this plan in hand before you move into the new building, you’re able to replicate the old office floor plan exactly. Even the thermostat was set to the same setting. In web terms, that means all your content, pages, blog articles, galleries, users, and everything else are functioning correctly in the correct places.

Performing a migration by hand is a similar process. Instead of having your plan automatically generated, you and your development team go through with custom scripts and code to manually categorize and sort your site data. Rather than a program sorting and classifying your data automatically, your team identifies specific elements of your website and dictates precisely how they should be moved to the new site.

Okay, that’s enough for now, I think. I don’t want to overload my brain, let alone yours! This article is just to serve as a high-level overview of what migrations are, how they work, and why you’d need to perform one. It’s been a lot of info, but we’re only scratching the surface of migrations. There are tons of tiny technical tidbits we could touch on, but unless you’re a web developer, it’ll mostly be gibberish. For now, though, I hope this has been a good primer on the topic.

Our expert team at Cheeky Monkey Media kicks ass at this stuff. In fact, we’ve put together a full Drupal whitepaper, just for you. If you need a hand, have any questions, or are interested in learning more about working together, don’t be afraid to reach out! We’re always happy to help and share some of our nerdy knowledge.

Feb 13 2020
Feb 13

Table of contents

What is Tag1 Quo?
How does Tag1 Quo work?
What makes Tag1 Quo unique
--Immediate notice of vulnerabilities
--Backports of LTS patches
--Automated QA testing for Drupal 7 LTS
--Customer-driven product development

One of the challenges of securing any Drupal site is the often wide range of modules to track, security advisories to follow, and updates to implement. When it comes to Drupal security, particularly older versions of Drupal such as Drupal 6 and Drupal 7, even a slight delay in patching security vulnerabilities can jeopardize mission-critical sites. Now that Drupal 7 and Drupal 8 are fast approaching their end of life (EOL) in November 2021 (Drupal 6 reached end of life on February 24, 2016), the time is now to prepare your Drupal sites for a secure future, regardless of what version you are using.

Fortunately, Tag1 Consulting, the leading Drupal performance and security consultancy, is here for you. We’ve just redesigned Tag1 Quo, the enterprise security monitoring services trusted by large Drupal users around the world, from the ground up, with an all-new interface and capabilities for multiple Drupal versions from Drupal 6 to Drupal 8. Paired with the Tag1 Quo module, available for download on Drupal.org, and Tag1 Quo’s services, you can ensure the security of your site with full peace of mind. In this blog post, we’ll cover some of the core features of Tag1 Quo and discuss why it is essential for your sites’ security.

What is Tag1 Quo?

Tag1 Quo is a software-as-a-service (SaaS) security monitoring and alerting service for Drupal 6, Drupal 7, and Drupal 8. In addition, it includes long-term support (LTS) for Drupal 6 and is slated to commence backporting security patches for both Drupal 7 and Drupal 8 when both major versions no longer have community-supported backports. The centerpiece of Tag1 Quo integration with Drupal is the Tag1 Quo module, which is installed on your servers and communicates securely with our servers.

In addition, for a fee, we can help you with a self-hosted version of Tag1 Quo for sites hosted on-premise. This does require setup fees and entails higher per-site licensing fees, so we encourage you to reach out to us directly if you’re interested in pursuing this option.

How does Tag1 Quo work?

When a new module update is released on Drupal.org, or when a security advisory is announced that directly impacts your Drupal codebases, the Tag1 Quo system alerts you immediately and provides all of the necessary updates required to mitigate the vulnerability, with a direct link to the code you need to install to address the issue. Not only are these alerts sent over e-mail by default; they can also flow directly into your internal project workflows, including issue tracking and ticketing systems.

Tag1 Quo doesn’t stop there. As part of our long-term support (LTS) offering, when security releases and critical updates emerge, or when new security vulnerabilities are announced for community-supported Drupal versions, Tag1 audits these and determines whether the identified vulnerability also impacts end-of-life (EOL) versions of Drupal such as Drupal 6 and, in November 2021, Drupal 7. If those EOL versions are also susceptible to the vulnerabilities, we backport and test all patches to secure the EOL versions as well and distribute them to you through the Tag1 alert system.

Moreover, when a new security vulnerability is discovered in an EOL version of Drupal without an equivalent issue in a currently supported version, Tag1 creates a patch to rectify the problem and collaborates with the Drupal Security Team (several of whom are part of the Tag1 team) to determine whether the vulnerability also applies to currently supported versions of Drupal so that they can be patched too. In short, no matter where a vulnerability occurs across Drupal’s versions, you can rest easy with Tag1 Quo’s guarantees.

What makes Tag1 Quo unique

Tag1 Quo features a centralized dashboard with an at-a-glance view of all of your Drupal sites and their current status, regardless of where each one is hosted. After all, most enterprise organizations juggle perhaps dozens of websites that need to remain secure. Such a perspective at an organizational level is essential to maintain the security of all of your websites. But the Tag1 Quo dashboard is only one among a range of capabilities unique to the service.

Immediate notice of vulnerabilities

Although several members of the Tag1 team are also part of the Drupal Security Team and are aware of vulnerabilities as soon as they are reported, the Drupal Security Team’s policy is to collaborate privately to address an issue before revealing its nature publicly. This facilitates responsible disclosure: public advisories and public patches are released before nefarious actors are able to attack Drupal sites with success. This is for your safety and for the viability of released patches.

Thanks to our deep knowledge of both projects used by our clients' websites and security advisories, Tag1 has the distinction of being among the very first to notify Tag1 Quo customers as soon as the official announcement is released. Immediately afterwards, Tag1 Quo will prepare you to apply the updates as quickly as possible to ensure your web properties’ continued safety.

Backports of LTS patches

If a fix for a vulnerability is reported for currently supported versions of Drupal but also applies to EOL versions, the patch must be backported for all Drupal sites to benefit from the patch. Unfortunately, this process can be complex and require considerable planning and analysis of the problem across multiple versions—and it can sometimes only occur after the patch targeting supported versions has been architected or completed. This means it may take more time to develop patches for LTS versions of Drupal.

Luckily, we have a head-start in developing LTS patches thanks to our advance notice of vulnerabilities in currently supported versions of Drupal. Despite the fact that we cannot guarantee that LTS updates will be consistently released simultaneously with those targeting supported versions, Tag1 has an admirable track record in releasing critical LTS updates at the same time as or within hours of the issuance of patches for supported Drupal versions.

Automated QA testing for Drupal 7 LTS

Throughout Drupal’s history, the community encouraged contributors to write tests alongside code as a best practice, but this was rarely done until it became an official requirement for all core contributions beginning with the Drupal 7 development cycle in 2007. Tag1 team members were instrumental in making tests a core code requirement, and we created the first automated quality assurance (QA) testing systems distributed with Drupal. In fact, Tag1 maintains the current Drupal CI (continuous integration) systems, which perform more than a decade’s worth of concurrent testing within a single calendar year.

Because the Drupal Association has ended support for Drupal 7 tests and decommissioned those capabilities on Drupal.org, Tag1 is offering the Tag1 Quo Automated QA Testing platform as a part of Tag1 Quo for Drupal 7 LTS. The service will run all tests for Drupal 7 core and any contributed module tests that are available. Where feasible and appropriate, Tag1 will also create new tests for Drupal 7’s LTS releases. Therefore, when you are notified of LTS updates, you can rest assured that they have been tested robustly against core and focus your attention on integration testing with your custom code instead, all the while rolling out updates with the highest possible confidence.

Customer-driven product development

Last but certainly not least, Tag1 Quo is focused on your requirements. We encourage our customers to request development in order for us to make Tag1 Quo the optimal solution for your organization. By working closely with you to determine the scope of your feature requests, we can provide estimates for the work and an implementation timeline. While such custom development is outside the scope of Tag1 Quo’s licensing fees, we allot unused Tag1 Quo consulting and support hours to minor modifications on a monthly basis.

Examples of features we can provide for custom code in your codebases include ensuring that your internal repositories rely on the latest versions of dependencies, and providing insights into your custom code through site status views on your Tag1 Quo dashboard. We can even add custom alerts to notify the specific teams and users responsible for these sites, and route those alerts into support queues or other ticketing systems. Please get in touch with us for more information about these services.


The new and improved Tag1 Quo promises you peace of mind and renewed focus for your organization on building business value and adding new features. Gone are the days of worrying about security vulnerabilities and anxiety-inducing weekends spent applying updates. Thanks to Tag1 Quo, regardless of whether your site is on Drupal 6, Drupal 7, or Drupal 8, you can rest assured that your sites will remain secure and monitored for future potential vulnerabilities. With a redesigned interface and feature improvements, there is perhaps no other Drupal security monitoring service better tuned to your needs.

Special thanks to Jeremy Andrews and Michael Meyers for their feedback during the writing process.

Photo by Ian Schneider on Unsplash

Feb 13 2020
Feb 13

Welcome! If you need to update your Drupal 8 site to the latest feature branch, this post is for you. 

Drupal 8.8.0 introduced many exciting changes. On the heels of this release, we also saw a security release for Drupal 8.8.1, which covers four distinct security advisories (SAs). Drupal 8.7.11 also has fixes for those SAs, but developers might consider this a golden opportunity to take care of both updates at once.

This post will cover four common pitfalls of upgrading to the 8.8.x branch:

  • Pathauto version conflicts
  • New sync directory syntax
  • Temporary file path settings
  • Composer developer tools


These instructions assume you are:

  • Maintaining a Drupal 8 site on version 8.7.x
  • Using Composer to manage dependencies
  • Comfortable using the command line tool Drush

Pathauto version conflicts

The Pathauto module has been around for a long time. With over 7 million downloads and 675K reported sites using Pathauto, chances are high that this section applies to you.

Drupal core 8.8.0 introduced the path_alias module into core. However, this module conflicts with the Pathauto contrib module at version 8.x-1.5 or below. If you have Pathauto installed on your site, you must first update to Pathauto 8.x-1.6 or later.

I strongly suggest updating Pathauto as a first step, and deploying all the way to production. The order of operations is important here, because updating Pathauto after core will result in data loss. While the release notes say it is safe to update “before or at the same time” as core, it is wise to take extra precautions with the sequence of events.
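As a rough sketch of that order of operations (assuming a Composer-managed codebase with Drush available; adapt the commands to your own deployment workflow):

# 1. Update Pathauto on its own, then deploy all the way to production.
composer update drupal/pathauto --with-dependencies
drush updb -y
drush cr

# 2. Only after that deployment, update Drupal core to 8.8.x.
composer update drupal/core --with-dependencies
drush updb -y
drush cr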

Visit the full change record for more details: Drupal 8.8.0 requires pathauto version 8.x-1.6 or higher if installed.

Diagnosing path alias issues

What if I neglect to update Pathauto? How can I identify the symptoms of this problem? After running drush updb, I would expect to see this SQL error:

[error]  SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry '7142' for key 'PRIMARY': INSERT INTO {path_alias}

This Drupal core bug report provides more detail. It also describes how to walk it back to a working state if you encounter this problem.

New sync directory syntax

The configuration management system introduced a new sync directory syntax for 8.8.0. The default location for Drupal’s config sync is sites/default/files/config_[HASH]. It is very common practice to customize this location. It makes config files easier to understand and manage. If you do customize this location, note that Drupal no longer uses the $config_directories variable. 

Here is what a custom config sync directory might have looked like in Drupal 8.7.x or lower. In your settings.php file:

$config_directories['sync'] = dirname(DRUPAL_ROOT) . '/config';

Now in Drupal 8.8.x, this setting should be updated to look like this:

$settings['config_sync_directory'] = dirname(DRUPAL_ROOT) . '/config';

Read the full change record for more technical details: The sync directory is defined in $settings and not $config_directories.

Diagnosing sync directory issues

You can tell right away if there is a problem with your sync directory if config files are showing up in an unexpected place. You can also use drush to discover the current value of your config sync directory. 


$ drush php
>>> Drupal\Core\Site\Settings::get('config_sync_directory')
=> "/var/www/html/foo/bar"

The same value can be retrieved in a single command:

drush php-eval '$path = Drupal\Core\Site\Settings::get("config_sync_directory"); print $path;'

Here are a few more ways to retrieve the same information, using the drush status command.

For Drush 8 only:

drush status config_sync

For Drush 9 or 10:

drush status --fields=%paths

Temporary file path settings

In this new feature release, the old procedural function file_directory_temp() is deprecated. Drupal now uses the FileSystem service instead, and this has implications if you are setting a custom value for temporary file storage.

To customize the temporary file location the old way, you may have something like this in your settings.php file:

$config['system.file']['path']['temporary'] = '/tmp';

Change this setting to the new syntax before running database updates:

$settings['file_temp_path'] = '/tmp';

Read the full change record to learn more: file_directory_temp() is deprecated and moved to the FileSystem service.
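To double-check which temporary directory Drupal is actually using after the change, one option (assuming Drush is available) is to query the file_system service, whose getTempDirectory() method replaces the deprecated function in 8.8:

drush php-eval 'print \Drupal::service("file_system")->getTempDirectory();'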

Diagnosing temp directory issues

Take a look at your database logs. In Drupal’s logs at /admin/reports/dblog, you can filter on “file system”. Any errors about temporary file placement may indicate an issue.

Composer developer tools

Composer Support in Core is just one of many strategic initiatives in the Drupal community. Some early ways of using Composer are now deprecated in favor of this new support. For example, Drupal now has an official Composer template project. If you previously installed Drupal using the unofficial (but recommended at the time) template drupal-composer/drupal-project, you will have a bit of updating to do.

Manually edit composer.json to remove the deprecated package. Remove this entry:

   "require-dev": {
        "webflo/drupal-core-require-dev": "^8.7.0"
    }

And replace it with this one:

   "require-dev": {
        "drupal/core-dev": "^8.8"
    }

Then edit the require statement for Drupal core itself: 

   "require": {
        "drupal/core": "^8.8"

Now that composer.json is up to date, you can go ahead and run Composer updates in the usual way, with composer update --lock.

If you are starting a new Drupal 8 site from scratch, refer to this guide on Starting a Site Using Drupal Composer Project Templates. It has instructions on how to use the new recommended way of handling Composer templates, using drupal/recommended-project.

Diagnosing deprecated Composer tools

Without following the steps above, if you try to run composer update, Composer will fail with this error:

Your requirements could not be resolved to an installable set of packages.

Depending on what package versions are installed, and the syntax of your composer.json file, the rest of the error output will vary. Here is an example:

The requested package webflo/drupal-core-require-dev (locked at 8.7.11, required as ^8.8) is satisfiable by webflo/drupal-core-require-dev[8.7.11] but these conflict with your requirements or minimum-stability.

The important thing to know here is that Composer is being helpful: it is refusing to upgrade because a deprecated package still appears in your requirements.

You can verify this by using the Composer prohibits command:

$ composer prohibits drupal/core:^8.8
webflo/drupal-core-require-dev  8.7.11          requires          drupal/core (8.7.11)
drupal/core                     8.8.0           requires          typo3/phar-stream-wrapper (^3.1.3)
drupal-composer/drupal-project  dev-my-project  does not require  typo3/phar-stream-wrapper (but v2.1.4 is installed)

Or its alias:

$ composer why-not drupal/core:^8.8

But wait! There’s more!

These are just a few pitfalls. There are other considerations to make before updating to 8.8.x. Make sure to read the release notes carefully to see if any other advisories apply to you.

Feb 13 2020
Feb 13

Amazee Labs has been awarded the Daimler Key Supplier Inspiration Award for 2020! 

Markus Schäfer, Member of the Board of Management of Daimler AG and Mercedes-Benz AG, responsible for Group Research and Mercedes-Benz Cars Development, Procurement and Supplier Quality, presented the award to Amazee Labs CEO, Stephanie Lüpold on 12 February 2020 in Stuttgart’s Carl Benz Arena.

In presenting the award for outstanding suppliers, Mr Schäfer said “In order to fulfil our role as innovation and technology leader in the future, we also expect courageous impulses with inspiring visions in all areas from our partners. Together we are creating ground-breaking mobility solutions that are in line with our social, environmental and economic targets.”

Amazee Labs Daimler Supplier Awards

Amazee Labs won the award for developing the content management system for Daimler’s new smart.com website. They were able to set new standards in mapping internationalization processes, thereby turning a former challenge of Drupal into a major advantage.

Stephanie Lüpold, CEO of Amazee Labs is exceptionally pleased: "We want to thank Daimler for the award. As a strong player in the Drupal open source community, Amazee Labs has solved one of the top challenges faced by Drupal by developing a new, revolutionary and easy way to manage content internationally. Not only have we solved this issue for Drupal, but it’s also helped Drupal achieve a significant advantage over other systems. The team has done an outstanding job. We are extremely pleased that our solution is already being used successfully by several multinational companies, including Daimler. This award is recognition of our hard work and great motivation to continue to do everything possible for our customers".

Amazee Labs - Daimler Supplier Awards

Amazee Labs congratulates all the other winners; it’s an honour to be included in such esteemed company.

Feb 12 2020
Feb 12

If your website has plenty of media files (PDF presentations, price-lists, product images, etc.), you know how cumbersome it may be to replace them. Special attention in file replacement needs to be paid to SEO — because, as every SEO expert knows, every detail matters there.

Luckily, your Drupal 8 website offers more and more ways of easy media management that will also allow you to stay in line with the best practices of SEO.

Discover how to replace Drupal media files easily, with no fuss or extra manual effort, and with your SEO rankings preserved. This is possible thanks to a new Drupal module: Media Entity File Replace.

When do you need to replace Drupal media files?

Content never stays unchanged: it needs to keep up with new business circumstances. Media files are no exception. For example, you may need to:

  • update a PDF presentation for your company/products/services
  • change your price-list
  • update your how-to checklist
  • upload new product images with better quality than before
  • make changes to your corporate video

and many other types of content.

What is the problem with the standard file replacement?

The standard procedure involves removing the old file and uploading another one. When replaced, your file gets a different name and path: Drupal appends a number (_0, _1, etc.) to the end of the new file’s name instead of overwriting the original.

Standard file replacement in Drupal 8

File replacement becomes an especially tedious process when the file is used in multiple places throughout your website. This means additional expenses on your or your staff’s manual work. A special point of concern here is SEO. Read on to find out more about the impact of file replacing on your SEO.

How does media file replacement influence SEO?

File names play a part in your SEO rankings. Human-readable names enriched with keywords and separated by dashes are a great way to tell Google what your image is about (in addition to the ALT tag), and this benefits your SEO.

Changing a file’s name and path can lead to a certain loss in SEO rankings, because Google treats newly uploaded files as new content and needs to recrawl, reprocess, and reindex them. This can take a long time, during which Google will show old and irrelevant content.

And, of course, if your files are used in content throughout your website and you change them but forget to re-upload them everywhere, they will be unavailable to your users. File path changes can cause broken links, which are among the most damaging things for both search engines and users.

Never lose any SERP position to your competitors. Use helpful tools to replace files easily and without losing SEO.

How the Media Entity File Replace module can help

The new Media Entity File Replace module for Drupal 8 offers a smart and SEO-friendly way to replace Drupal media files. The module replaces the source files in Drupal media entities that have a file-based field. It does so by overwriting the old file with the new one. Importantly, the name and path are preserved in the process.

Media Entity File Replace works with Media entities in Drupal 8. If you use the Media system to manage your media files, this module will suit you.

Note: To use the Media module, consider updating your Drupal website to the latest version, where it has been greatly improved. Our Drupal support team can do it for you.

Installing the Media Entity File Replace module

The module is installed like any other. It depends on a bunch of core modules: Media, File, Field, Image, User, and System.
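For example, a typical Composer-based installation looks like this (assuming Drush is available; media_entity_file_replace is the project’s machine name on Drupal.org):

composer require drupal/media_entity_file_replace
drush en media_entity_file_replace -y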

Installing the Media Entity File Replace module in Drupal 8

Configuring the File Replace widget

The module comes with the File Replace widget that you need to enable in the media types for which you want your files to be overwritable. Let us remind you: Drupal 8 has 5 default media types: Audio, Document, Image, Remote video, and Video, while others can be created to your liking.

To enable the File Replace widget, we need to go to Structure — Media types — {Our media type} — Manage form display. In our example, we work with the Document Media type.

Let’s drag the Replace File widget to the enabled ones. The perfect place is just below the File field.

Configuring the File Replace widget in Drupal 8

Replacing your Drupal media files

Let’s create a new document entity in Media — Add media — Document. We then upload an “Our services” PDF to it.

Creating a Drupal document Media entity

Drupal media file path

The PDF is now saved in our Media Library (Content — Media) where we can go and edit the entity in order to replace the PDF.

Instead of the usual Remove button, we now see the Replace button. If the “Overwrite original filename” option is checked, the original name will be kept and the contents will be replaced.

Media Entity File Replace Drupal 8 module in action

We click “Choose file” and upload a new one — “Updated services.”

Replacing files with the Media Entity File Replace module

After saving, we see that the filename in this media entity is the same as before.

Drupal document file path

However, the content of the source file available by the same path has been rewritten. It now shows our updated services.

Media file replaced with its path unchanged in Drupal 8

This PDF can be used as an attachment in your content. You just need to add a Media field to your content type, and then you can easily fetch media there from the Media Library. Visit our “Media handling in Drupal 8” guide to learn how media can serve as building blocks for content.

Adding documents from Media Library to content

In this example, we added a “Document” Media field to the Basic page, and then created a content page with our PDF attached to it.

Content entity with a document Media field

Wherever else we add the file throughout your website, it is going to be rewritten automatically after a replacement, with no need to reupload.

While using the Media Entity File Replace module, special attention needs also to be paid to caching so your users are able to see the updated content sooner.
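As a minimal example (assuming Drush is available), rebuilding Drupal’s caches after a replacement helps ensure stale file metadata is not served; note that browser and CDN caching are separate concerns outside Drupal’s control:

drush cr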

Entrust your media setup to our Drupal team

The Media capabilities and the ecosystem of Media Library management modules in Drupal 8 keep growing at an amazing pace. Drupal offers you more every day for managing your media effortlessly, with joy, and with no SEO losses.

Ask our Drupal support & maintenance team (https://drudesk.com/) to configure the ideal media management processes on your Drupal 8 website!

Feb 12 2020
Feb 12

An effective administrative interface is table stakes for any content management system that wishes to make a mark with users. Claro is a new administration theme now available in Drupal 8 core thanks to the Admin UI Modernization initiative. Intended to serve as a logical next step for Drupal's administration interface and the Seven theme, Claro was developed with a keen eye for modern design patterns, accessibility best practices, and careful analysis of usability studies and surveys conducted in the Drupal community.

Claro demonstrates several ideas that not only illustrate the successes of open-source innovation but also the limitations of overly ambitious ideas. By descoping some of the more unrealistic proposals early on and narrowing the focus of the Claro initiative on incremental improvements and facilitating the work of later initiatives, Claro is an exemplar of sustainable open-source development.

In this closer look at how Claro was made possible and what its future holds for Drupal administration, join Cristina Chumillas (Claro maintainer and Front-End Developer at Lullabot), Fabian Franz (Senior Technical Architect and Performance Lead at Tag1), Michael Meyers (Managing Editor at Tag1), and Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a Tag1 Team Talks episode about the newest addition to Drupal's fast-evolving front end.


Feb 12 2020
Feb 12

This article is from our resident Drupal 8 core maintainer Nathaniel Catchpole (@catch) who is helping to build Drupal 9.

Drupal 9 will be the first major core release to have a continuous upgrade path, meaning that Drupal 8 contributed modules and themes, and actual Drupal 8 sites, should be able to upgrade smoothly from Drupal 8 to Drupal 9 with only slightly more disruption than a Drupal 8 minor release. Since this is the first time Drupal has ever attempted to provide a smooth upgrade path between major releases, we understand it can be hard to imagine what it will be like.

The best way to feel confident will be to actually understand the changes that will be landing in Drupal 9 over the next few months, and when you should try to test it, use it to build new sites and to update existing sites.

What are the main changes in Drupal 9?

Drupal 9 will update third party dependencies, such as Symfony and Twig, so that we are able to continue to receive bugfix and security support from those projects.

Drupal 9 (and Drupal 8 versions from 8.7.7 onwards) will have support for running the same branch of a contributed module across multiple major core versions, reducing the maintenance burden of supporting both.

Drupal 9 will remove backward compatibility layers and deprecated code marked for removal in Drupal 8.

These three things are really the entire focus of the initial Drupal 9.0.0 release — updating dependencies, smoothing the update from Drupal 8, and removing deprecations.

When will Drupal 9 be available?

Drupal 9.0.0 is planned for release on June 3rd, 2020, the same day that Drupal 8.9.0 is released. However, the Drupal 9 branch is already open for development and you can download and test Drupal 9.0.0-alpha1!

The first tasks in Drupal 9 are updating to Symfony 4 and Twig 2, as well as other PHP and JavaScript dependencies. This process has already started, and we have also raised the minimum PHP requirement to PHP 7.2.

You can already enable testing for a Drupal-9 compatible module against the Drupal 9 branch and make sure all your tests pass with Symfony 4 and Twig 2.

Removal of deprecated code and backward compatibility layers is also in progress now that Drupal 9 is open. We will continue updating dependencies and removing deprecated code throughout the next few months.

Alphas will continue to come out regularly until we hit beta.

Isn’t this a short list? Surely, there’s more to it?

This is intentionally a short list, because we’ve spent years working on minimizing the disruption of the Drupal 8 to 9 update. However, the question “when will it be ready?” should always come with “ready for whom?”. Drupal core developers? Contrib developers? New site builders? Organizations with an existing site on Drupal 8? Organizations on Drupal 6 or 7?

Drupal 9 becomes ready for core development when the branch is open — once this is done, Drupal 9-specific patches can be committed.

Drupal 9 becomes ready for contributed module developers as soon as the branch begins to look like Drupal 9, i.e. once dependencies are updated and deprecated code starts to be removed. Contrib developers can already support multiple core versions using the new core_version_requirement key, but the way to test whether a module is really Drupal 9 compatible is to install and try to use it on Drupal 9.
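For illustration, a contributed module’s info file declaring support for both major versions might look like the sketch below (the module name is a placeholder; note that the legacy core: 8.x key is omitted, since core_version_requirement alone is read by Drupal 8.7.7 and later):

name: My Module
type: module
core_version_requirement: ^8.8 || ^9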

Drupal 9 becomes ready for new site development once the contributed modules you need for that new site are compatible with Drupal 9. For many contributed modules this will be a one-line change, so you should be able to start developing new Drupal 9 sites once 9.0.0-beta1 is released, or earlier if you’re feeling adventurous and want to help get contributed modules ready. Drupal 9’s beta period will be longer than what we allow for minor releases, to enable plenty of advance testing and flush out any unexpected issues with actual sites.

Once Drupal 9.0.0 is released, you should never start building a new site on Drupal 8 again.

Drupal 9 becomes ready for Drupal 8 upgrades once all of the modules you’re already using, and any custom code, are Drupal 9 compatible. Drupal 8 will be supported until November 2021. However, it’s a good idea to start work on this as soon as Drupal 9.0.0 is released, and you can prepare for that upgrade now by doing things like ensuring you’re up-to-date with Drupal 8 core and contributed updates. While June 2020 to November 2021 is a much shorter support window than previous major Drupal releases like 6.x and 7.x, it also means hundreds of thousands of sites making a smaller change all around the same time, which should improve reliability for everyone.

If you’re still on Drupal 6 or 7, you can already migrate sites that don’t use content translation to Drupal 8 now! There is no reason to wait for Drupal 9 to do this, since the subsequent update from Drupal 8 to Drupal 9 will be small. Sites using content translation should keep an eye on this critical Drupal core issue to finalize the multilingual upgrade path for translated nodes, and help to test it if you can.

When will Drupal 9.0.0 actually be released though?

We’re aiming for Drupal 9.0.0 to be released on June 3rd, 2020. To hit this date, we’ll need to hit beta by March 2020. The full list of requirements for tagging beta is tracked in this issue.

There is always the possibility that we won’t have resolved every beta blocker by March. If we don’t have everything in place by then, there are two fallback dates—we’ll either start the beta in May with a 9.0.0 release date of August/September 2020 or start the beta in September with a release date of December 2nd, 2020. 

The more people test the branch and Drupal 9-compatible modules before June 2020, the more confident everyone can be that it’s really ready.

Feb 11 2020
Feb 11

Exporting the contents of a Drupal 8 project is a recurring need, whether for analysis or for mass updates followed by a re-import. Drupal 8 offers several solutions, each with advantages and disadvantages: in the entity types that can be exported, the options for column headers and values, the level of access rights required, and the widely varying configuration options. Here we present a new module, Entity Export CSV, which can respond to many use cases.

Another new CSV export module?

We have many solutions to set up a CSV export on a project. Let's try to list some of them without wanting to be exhaustive.

The Views Data Export module, as its name indicates, is based on the core Views module to configure and set up CSV exports. Using Views, we can configure an export for any Drupal 8 entity, usually for a particular bundle. We then need to configure as many views as we have exports, and some limitations appear when exporting multi-value fields. Setting up CSV export with this module requires administrative rights to create and manage views on a Drupal 8 project, with some understanding of how Views works. It is therefore not really intended for end users.

The Content Export CSV module allows you to quickly export the nodes of a Drupal 8 site. Its configuration options are very limited, especially regarding the choice of exported fields and their values, and only nodes can be exported. On the other hand, the module can be used directly by end users.

The Entity Content Export module offers many configuration options. It can export all Drupal 8 content entities, and the export of each entity can be configured based on the entity’s view modes, including field formatters that can be selected and configured for a given export. However, it requires substantial initial configuration, with very high administrative access rights, at the level of each entity bundle that you want to make exportable.

For the needs of a Drupal web factory project, each of these solutions could partially, but not fully, meet the requirements. In particular, it was impossible to know in advance which content types or entity types would need to be exportable, or how a content type was configured and used (generic content types can be customized per web factory instance). Above all, end users had to be able to configure and run these CSV exports themselves, without any particular knowledge of Drupal and without any access rights to the project’s configuration.

Introduction of the Entity Export CSV module

The Entity Export CSV module was created to meet these challenges.

  • Be able to export any content entity from Drupal 8
  • Be able to select which fields are exportable for each entity and each bundle
  • Be able to configure how each field of an entity is exported
  • Be able to configure, field by field, the export behavior for multi-value fields (export in a single column with a separator, or export of each value in a separate column)
  • Be easily customizable for the export of a particular field, with a specific business need
  • Be usable by an end user without special administrative rights on the configuration of a Drupal 8 project (Views, Entity View Mode, etc.).

In terms of architecture, in order to meet these challenges, Entity Export CSV relies on:

  • A simple configuration page that allows you to select which content entities will be exportable from among the content entities present in a project and, if necessary, to limit them to one or more bundles, and also to limit (or not) the fields of these entity bundles that will be exportable.
  • A simple export page that, based on that initial configuration, lets you choose the entity to export and then configure, for each field, whether it should be included in the export and, if so, how.

Detailed presentation of how Entity Export CSV works

Once the module is installed, as indicated above, the first step is to configure the exportable entities, which will be accessible in the export page.

The configuration is quite simple. For each entity type available on the project we select the ones we want to make exportable, and we can limit the bundles by entity type that will be exportable.

Entity Export CSV settings

Then, for each enabled entity and bundle, we can also limit the fields that will be exportable, for example so as not to overload the export page with an entity’s technical fields (uuid, revision, or any other field that does not contain content as such).

Entity Export CSV settings details

Then just go to the actual CSV export page.

page export CSV

Users can configure, for each field, whether it is to be included in the export, how the column header will be populated (the field's readable name or its system name), the number of columns to use for multi-value fields, and the export format to use for each field.

Entity Export CSV page export

Once the configuration is finished, users can save it so they do not have to reconfigure everything for the next export. Each configuration is saved per entity type, bundle, and user. A planned evolution is to allow an unlimited number of export configurations per entity type and bundle, which could then be used here as reusable templates.

Entity Export CSV save export settings

Each field can be configured differently, both in the column header used for its export and in the exported value. To do this, the module has a FieldTypeExport plugin system that allows you to easily create configurable and/or specific field exports. For example, below, the "Modified" field is configured with the Timestamp export plugin, which exposes the date format as an option.

Entity Export CSV timestamp export

Another basic plugin provided by the module lets you configure how Entity Reference fields are exported: for example, whether to export the ID of the referenced entity or its label, and, for multi-value fields, how many columns to use for the values.

Entity Export CSV entity reference export

Extending the Entity Export CSV module

One of the challenges the module meets is being easily extensible and customizable. Rare is the Drupal 8 project without at least one field that does not fit into a generic box and requires special treatment.

The Entity Export CSV module relies on a Plugin system to export all fields and can therefore be easily extended by a Drupal 8 developer to support any type of special case or fields created by contributed modules (for example, the module includes a Plugin for the fields of the Geolocation module and the Address module).

To create a field export plugin, a FieldTypeExport plugin class must be created in the src/Plugin/FieldTypeExport namespace of any Drupal 8 module.

The annotations of this plugin allow you to control certain behaviors of the Plugin and its availability. Let's look at these annotations with the example of the Geolocation plugin included in the module.

namespace Drupal\entity_export_csv\Plugin\FieldTypeExport;

use Drupal\Core\Field\FieldDefinitionInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\entity_export_csv\Plugin\FieldTypeExportBase;
use Drupal\Core\Field\FieldItemInterface;

/**
 * Defines a Geolocation field type export plugin.
 *
 * @FieldTypeExport(
 *   id = "geolocation_export",
 *   label = @Translation("Geolocation export"),
 *   description = @Translation("Geolocation export"),
 *   weight = 0,
 *   field_type = {
 *     "geolocation",
 *   },
 *   entity_type = {},
 *   bundle = {},
 *   field_name = {},
 *   exclusive = FALSE,
 * )
 */
class GeolocationExport extends FieldTypeExportBase {

  /**
   * {@inheritdoc}
   */
  public function getSummary() {
    return [
      'message' => [
        '#markup' => $this->t('Geolocation field type exporter.'),
      ],
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function buildConfigurationForm(array $form, FormStateInterface $form_state, FieldDefinitionInterface $field_definition) {
    $configuration = $this->getConfiguration();
    $build = parent::buildConfigurationForm($form, $form_state, $field_definition);
    return $build;
  }

  /**
   * {@inheritdoc}
   */
  public function massageExportPropertyValue(FieldItemInterface $field_item, $property_name, FieldDefinitionInterface $field_definition, $options = []) {
    // Stuff to format field item value.
  }

  /**
   * {@inheritdoc}
   */
  protected function getFormatExportOptions(FieldDefinitionInterface $field_definition) {
    $options = parent::getFormatExportOptions($field_definition);
    return $options;
  }

}


The annotations of a FieldTypeExport plugin are:

  • weight: the weight of the plugin. The plugin with the highest weight will be the plugin selected by default if more than one plugin is available for a field.
  • field_type: the type of field to which the plugin applies. Multiple field types can be specified if necessary. This option is mandatory.
  • entity_type: it is possible here to limit the plugin to only certain entity types. Leave empty and the plugin will be available for the field type on any entity type.
  • bundle: it is possible here to limit the plugin to only certain entity bundles. Leave empty and the plugin will be available for the field type on any bundle.
  • field_name: here it is possible to limit the plugin to one particular field. Leave empty and the plugin will be available for the field type on all fields of that type.
  • exclusive: if set to TRUE, this option makes the plugin exclusive, i.e. all other plugins available for this field type will no longer be visible. Useful if you want to restrict the export options for a particular field to this plugin alone. The default value is FALSE.

You can then override all the methods available on the Base Plugin in order to customize the export rendering of the field. In particular you can expose new configuration options, and of course implement the massageExportPropertyValue() method which is in charge of formatting the export of a field instance.
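To make this concrete, here is a minimal sketch of such an override. The structure mirrors the Geolocation example above, but the module namespace (my_module), the plugin id, and the uppercase formatting are purely hypothetical illustrations, not part of the module:

```php
namespace Drupal\my_module\Plugin\FieldTypeExport;

use Drupal\Core\Field\FieldDefinitionInterface;
use Drupal\Core\Field\FieldItemInterface;
use Drupal\entity_export_csv\Plugin\FieldTypeExportBase;

/**
 * Hypothetical exporter that upper-cases plain string fields.
 *
 * @FieldTypeExport(
 *   id = "string_uppercase_export",
 *   label = @Translation("String uppercase export"),
 *   description = @Translation("Exports string fields in uppercase"),
 *   weight = 0,
 *   field_type = {
 *     "string",
 *   },
 *   entity_type = {},
 *   bundle = {},
 *   field_name = {},
 *   exclusive = FALSE,
 * )
 */
class StringUppercaseExport extends FieldTypeExportBase {

  /**
   * {@inheritdoc}
   */
  public function massageExportPropertyValue(FieldItemInterface $field_item, $property_name, FieldDefinitionInterface $field_definition, $options = []) {
    if ($field_item->isEmpty()) {
      return NULL;
    }
    // Upper-case the value of the requested property before export.
    return mb_strtoupper($field_item->get($property_name)->getValue());
  }

}
```

Because the field_type annotation targets "string" and exclusive is FALSE, this exporter would simply appear as an extra option next to the default one on the export page.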

Over to you and your exports

The Entity Export CSV module gives end users who may not have extensive Drupal knowledge or advanced configuration rights access to advanced export features. It allows you to quickly provide export functions for any entity type while remaining completely independent of a project's configuration, so it can be integrated into any type of project, including web factory projects. Its simplified interface does not come at the expense of more complex use cases: the plugin system allows specific business needs to be handled at very little cost.

Need a CSV export? Think about Entity Export CSV!

Feb 11 2020
Feb 11

Recent and abundant evidence that ADA accessibility enhances SEO is broadening perspectives on WCAG compliance -- from a complicated and potentially costly requirement to an excellent opportunity that needs to be accomplished ASAP.

Google as a Gatekeeper

Google has emerged as a gatekeeper within our digitally-driven business climate. If a site doesn’t grab Google’s attention, that means lots of lost traffic. A recent article in Search Engine Journal reported that sites which appear on the first page of a Google search receive, on average, 91.5 percent of the traffic share. 

This means that in the current climate, websites need to be accessible to the major search engines -- just as they need to be accessible to people with disabilities. Structuring a site to align with what Google looks for in determining listing order on a Search Engine Results Page (SERP) is a critical business imperative.

Accessibility and SEO in Alignment

Many of the factors that fuel Search Engine Optimization are also essential for ADA Web accessibility compliance. 

Here’s a short list of reasons why and how SEO and ADA web accessibility best practices converge to enhance both objectives:

Google gets it: Great UX equals greater accessibility.

When websites are designed with a high degree of empathy for those who visit the site -- including people with disabilities -- SEO follows. Many metrics pertaining to great user experience have an impact on a site’s search result ranking. Among them: how long it takes a site to load, straightforward navigation, quality content, mobile responsiveness, and internal linking.

Screen readers and search engines both rely on title tags.

Title tags are the first component of a web page that’s read by a screen reader. They’re also essential to a search engine’s ranking of a page and where it appears on the SERP. Even though title tags don’t appear on the page itself, the title tag appears as the heading of the SERP listing, and it’s essential -- for both accessibility and SEO -- that it include keywords that accurately reflect the content on the page. From the standpoint of visually impaired screen reader users, it’s also important that every page on the site has a distinct title tag.

Search engines also scan alt text.

It’s tempting for content editors to slap in perfunctory alt text. But carefully describing an image, so that a visually impaired person can envision it, takes time and thought, and descriptive alt text can make or break the user experience for someone who relies on a screen reader. At the same time, alt text that weaves in key search terms also serves to enhance SEO.

Meaningful header hierarchies support WCAG and SEO.

Accessibility compliance requires that content follow a logical H1 to H6 header sequence and that headers accurately describe the content that follows. Adherence to a logical content structure serves all users, particularly those who have cognitive impairments or rely on a screen reader. From an SEO standpoint, breaking content up into meaningful pieces of information with headers that incorporate key search terms is also key, and can lead to the content appearing as a featured snippet on a SERP.
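As an illustrative sketch (all page content here is hypothetical), the title tag, alt text, and heading practices described in the sections above come together in markup like this:

```html
<head>
  <!-- Distinct, descriptive title tag: read first by screen readers,
       shown as the heading of the SERP listing. -->
  <title>Classic Chocolate Chip Cookies | Example Bakery</title>
</head>
<body>
  <h1>Classic Chocolate Chip Cookies</h1>
  <!-- Descriptive alt text helps screen reader users envision the image. -->
  <img src="cookies.jpg"
       alt="Stack of golden chocolate chip cookies cooling on a wire rack">
  <h2>Ingredients</h2>
  <h2>Step-by-step directions</h2>
  <h3>Mixing the dough</h3>
  <h3>Baking</h3>
</body>
```

Note the single H1 followed by H2/H3 sections in order, with no skipped levels.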

The Big Picture: Web Accessibility Fuels SEO

ADA Web accessibility compliance and SEO are two distinct areas of expertise. Fortunately, a sharp focus on one enhances the other. Viewed from another angle, when Google is treated as a distinct user for whom a site needs to be accessible, the result is significant progress toward achieving ADA accessibility.

Looking to continue the conversation on how web accessibility can improve search engine rankings? Contact us.

Feb 11 2020
Feb 11


If you’ve touched a Drupal site at any point in the last ten years, it’s very likely you came into contact with Drush (a portmanteau of “Drupal shell”), the command-line interface (CLI) used by countless developers to work with Drupal without touching the administrative interface. Drush has a long and storied trajectory in the Drupal community. Though many other Drupal-associated projects have since been forgotten and relegated to the annals of Drupal history, Drush remains well-loved and leveraged by thousands of Drupal professionals. In fact, the newest and most powerful version of Drush, Drush 10, is being released jointly with Drupal 8.8.0.

As part of our ongoing Tag1 Team Talks at Tag1 Consulting, a fortnightly webinar and podcast series, yours truly (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) had the opportunity to sit down with Drush maintainer Moshe Weitzman (Senior Technical Architect at Tag1) as well as Tag1 Team Talks mainstays Fabian Franz (Senior Technical Architect and Performance Lead at Tag1) and Michael Meyers (Managing Director at Tag1) for a wide-ranging and insightful conversation about how far Drush has come and where it will go in the future. In this two-part blog post series, we delve into some of the highlights from that chat and discuss what you need to know and how best to prepare for the best version of Drush yet.

What is Drush?

The simplest way to describe Drush, beyond its technical definition as a command-line interface for Drupal, is as an accelerator for Drupal development. Drush speeds up many development functions that are required in order to take care of Drupal websites. For instance, with Drush, developers can enable and uninstall modules, install a Drupal website, block or delete a user, change passwords for existing users, and update Drupal’s site search index, among many others — all without having to enter Drupal’s administrative interface.

Because Drush employs Drupal’s APIs in order to execute actions like creating new users or disabling themes, Drush performs far more quickly than Drupal’s bootstrap itself, because there is no need to traverse Drupal’s render pipeline and theme layer. In fact, Drush is also among the most compelling real-world examples of headless Drupal (a topic on which this author has written a book), because the purest definition of headless software is an application that lacks a graphical user interface (GUI). Drush fits that bill.

The origins and history of Drush

Though many of us in the Drupal community have long used Drush since our earliest days in the Drupal ecosystem and building Drupal sites, it’s likely that few of us intimately know the history of Drush and how it came to be in the first place. For a piece of our development workflows that many of us can’t imagine living without, it is remarkable how little many of us truly understand about Drush’s humble origins.

Drush has been part of the Drupal fabric now for eleven years, and during our most recent installment of Tag1 Team Talks, we asked Moshe for a Drush history lesson.

Drush’s origins and initial years

Though Moshe has maintained Drush for over a decade to “scratch his own itch,” Drush was created by Arto Bendiken, a Drupal contributor from early versions of the CMS, and had its tenth anniversary roughly a year ago. Originally, Drush was a module available on Drupal.org, just like all of the modules we install and uninstall on a regular basis. Users of the inaugural version of Drush would install the module on their site to use Drush’s features at the time.

The Drupal community at the time responded with a hugely favorable reception and granted Drush the popularity that it still sees today. Nonetheless, as Drush expanded its user base, its maintainers began to realize that they were unable to fully realize the long list of additional actions that Drupal users might want, including starting a web server to quickstart a Drupal site and one of the most notable features of Drush today: installing a Drupal site on the command line. Because Drush was architected as a Drupal module, this remained an elusive objective.

Drush 2: Interacting with a remote Drupal site

Drush 2 was the first version of Drush to realize the idea of interacting with a remote Drupal website, thanks to the contributions of Adrian Rousseau, another early developer working on Drush. Today, one of the most visible features of Drush is the ability to define site aliases to target different Drupal sites as well as different environments.

Rousseau also implemented back-end functionality that allowed users to rsync the /files directory or sql-sync the database on one Drupal installation to another. With Drush 2, users could also run the drush uli command to log in as the root user (user 1 in Drupal) on a remote Drupal site. These new features engendered a significant boost in available functionality in Drush, with a substantial back-end API that was robust and worked gracefully over SSH. It wasn’t until Drush 9 that much of this code was rewritten.
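In today's Drush syntax, the workflow described above looks roughly like the following transcript. The @prod alias is hypothetical, and exact command names have changed across versions (for example, drush uli is now user:login):

```
# Copy the remote database to the local site (@prod is a hypothetical alias).
$ drush sql:sync @prod @self

# Sync the public files directory from the remote site.
$ drush core:rsync @prod:%files @self:%files

# Generate a one-time login link (historically `drush uli`).
$ drush user:login
```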

Drush 3: From module to separate project

During the development of Drush 3, Drush’s maintainers made the decision to switch from Drush’s status as a module to a project external to Drupal in order to enable use cases where no Drupal site would be available. It was a fundamental shift in how Drush interacted with the Drupal ecosystem from there onwards, and key maintainers such as Greg Anderson, who still maintains Drush today seven versions later, were instrumental in implementing the new approach. By moving off of Drupal.org, Drush was able to offer site installation through the command line as well as a Drupal quickstart and a slew of other useful commands.

Drush 5: Output formatters

Another significant step in the history of Drush came with Drush 5, in which maintainer Greg Anderson implemented output formatters, which allow users to rewrite certain responses from Drush into other formats. For instance, the drush pm-list command returns a list of installed modules on a Drupal site, including the category in which they fit, formatted as a human-readable table.

Thanks to output formatters, however, the same command could be extended to generate the same table in JSON or YAML formats, which for the first time opened the door to executable scripts using Drush. During the DevOps revolution that overturned developer workflows soon afterwards, output formatters turned out to be a prescient decision, as they are particularly useful for continuous integration (CI) and wiring successive scripts together.
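For instance, the same module listing can be requested in different formats. The flags below follow Drush's documented output-formatter options, though the available format names vary by version:

```
# Human-readable table (the default).
$ drush pm-list

# Machine-readable output for scripts and CI pipelines.
$ drush pm-list --format=json
$ drush pm-list --format=yaml
```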

Drush 8: Early configuration support

Drush 8, the version of Drush released in preparation for use with Drupal 8 sites, was also a distinctly future-ready release due to its strong command-line support for the new configuration subsystem in Drupal 8. When Drupal 8 was released, core maintainer Alex Pott contributed key configuration commands such as config-export, config-import, config-get, and config-set (with Moshe’s config-pull coming later), all of which were key commands for interacting with Drupal’s configuration.

Due to Drush 8’s early support for configuration in Drupal 8, Drush has been invaluable in realizing the potential of the configuration subsystem and is commonly utilized by innumerable developers to ensure shared configuration across Drupal environments. If you have pushed a Drupal 8 site from a development environment to a production environment, it is quite likely that there are Drush commands in the mix handling configuration synchronicity.
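A typical deployment therefore pairs the export and import commands across environments, along the lines of this sketch (directory layout and deployment steps vary per project):

```
# On the development environment: write active configuration to sync files.
$ drush config-export

# Commit the exported YAML files, deploy the code, then on production:
$ drush config-import
```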

Drush 9: A complete overhaul

About a year ago, Drush’s indefatigable maintainers opted to rewrite Drush from the ground up for the first time. Drush had not been substantially refactored since the early days in the Drush 3 era, when it was extracted out of the module ecosystem. In order to leverage the best of the Composer ecosystem, Drush’s maintainers rewrote it in a modular way with many Composer packages for users to leverage (under the consolidation organization on GitHub).

This also meant that Drush itself became smaller, because site-to-site communication was modularized more tightly. Declaring commands in Drush also became significantly simpler from a developer-experience perspective. Whereas Drush commands had previously been written as procedural PHP, as was the case in Drush 8, developers could now write a Drush command as a PHP method, with the Doxygen lines above the callback housing the name, parameters, and other details of the command. The same release also brought YAML as the default format for configuration and site aliases in Drush, as well as the beginning of Symfony Console as the runner of choice for commands.
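A command file in that style looks roughly like the following sketch; the class and command names are hypothetical, while the annotation keys (@command, @aliases) follow the Drush 9 conventions described above:

```php
<?php

namespace Drush\Commands;

/**
 * Hypothetical custom command file in the Drush 9 style.
 */
class ExampleCommands extends DrushCommands {

  /**
   * Greets a user by name.
   *
   * @command example:hello
   * @param string $name
   *   The name to greet.
   * @aliases ex-hello
   */
  public function hello($name) {
    $this->output()->writeln("Hello, $name!");
  }

}
```

Drush discovers the method via the @command annotation; no separate registration hook is needed as in older releases.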

Drush 9 introduced a diverse range of new commands, including config-split, which allows for different sets of modules to be installed and different sets of configuration to be in use on distinct Drupal environments (though as we will see shortly, it may no longer be necessary). Other conveniences that entered Drush included running commands from Drupal’s project root instead of the document root as well as the drush generate command, which allows developers to quickly scaffold plugins, services, modules, and other common directory structures required for modern Drupal sites. This latter scaffolding feature was borrowed from Drupal Console, which was the first to bring that feature to Drupal 8. Drush’s version leverages Drupal’s Code Generator to perform the scaffolding itself.


As you can see, Drush has had an extensive and winding history that portends an incredible future for the once-humble command-line interface. From a pet project and a personal itch scratcher to one of the best-recognized and most commonly leveraged projects in the Drupal ecosystem, Drush has a unique place in the pantheon of Drupal history. In this blog post, we covered Drush’s formative years and its origins, a story seldom told among open-source projects.

In the second part of this two-part blog post series, we’ll dive straight into Drush 10, inspecting what all the excitement is about when it comes to the most powerful and feature-rich version of Drush yet. In the process, we’ll identify some of the key differences between Drush and Drupal Console, the future of Drush and its roadmap, and whether Drush has a future in Drupal core (spoiler: maybe!). In the meantime, don’t forget to check out our Tag1 Team Talk on Drush 10 and the story behind Drupal’s very own CLI.

Special thanks to Fabian Franz, Michael Meyers, and Moshe Weitzman for their feedback during the writing process.

Photo by Jukan Tateisi on Unsplash

Feb 11 2020
Feb 11

1) Built-in support for multi-language sites and admin portals

Let's jump right in! For business owners, ecommerce eliminates many restrictions of traditional business practices. One opportunity is the ability to sell your product to overseas consumers, expanding your possible market to include virtually the whole world. Of course, one of the barriers to entry into certain markets may be the language.

Imagine this: You are a Brazilian business owner who just invented chewing gum that never loses its flavour. Obviously, the demand for this product is worldwide. The only problem is that you do not feel comfortable writing the script for your new online product page in English, or any language other than Portuguese for that matter. In a perfect world, the ideal solution might be to hire translators for every language of each country that you want to sell this amazing gum in. However, the costs of such an endeavour are enough to make even those with the deepest of pockets think twice.

In my opinion, the next best and completely viable option is to develop your chewing gum site on Drupal and make use of the many multilingual modules to automatically translate your content (just Google “Drupal automatic translate” for a list of options). These Drupal translation modules have several advantages. First, the language switcher can appear as an option at the top of the page and is therefore easily accessible to the customer. Second, additional modules can automatically show users the site in their local language based on their browser’s language setting. Third, you can choose which blocks of text you want to translate and which you do not; so if, say, for aesthetic reasons or brand awareness you do not want a certain block of the site to be translated, you simply do not enable translation for that block in the admin portal. Additionally, while your site frontend is being translated for your visitors, as an admin you can keep Portuguese as the primary language of your backend admin portal.

Speaking from my own experience, I shop online for bicycle components quite often. The problem is that many of the unique manufacturers I am looking to buy from are based in Italy and Germany. Google Translate can do an adequate job of helping you navigate a site, but when it comes to finer details like product specifications or return policies, I quickly find myself out of my depth. The great thing about using Drupal Translate is that you can manually enter the translation for each language of every block on your website. So, for example, instead of paying for a full site translation in each language, you could hire professionals to translate the important areas like the fine print and leave the less critical areas up to Drupal.

2) Features on features

Okay, Drupal is not exactly an episode of Pimp My Ride, but it can pretty much do anything you can dream of. Suppose, for some reason, you want to design a site that sources all of the types of chicken wings sold in restaurants across your city, then create a catalogue that breaks down the various chicken wings by texture, flavour, price, size, messiness, etc. Now you want to integrate a system that uses logic and intelligence to recommend the best beer your company sells to accompany any wing selection made. This is all possible with Drupal.

The cost to develop such a unique site with these custom modules on Drupal would not be cheap. However, the point remains that a feature such as the one mentioned above is quite crazy, but completely possible. If there is functionality that you need, it can be built on Drupal. The other big takeaway is that once you have paid for the development of the module you are now the owner and do not have to worry about any ongoing licensing costs. For reasons like these, it is my opinion that Drupal is the best CMS for such robust and custom site requirements.

3) Security

Of course, nothing can ever be fully secure, especially without regular upkeep, but Drupal does a few things differently that should help you sleep better at night. Unlike many popular SaaS platforms, Drupal is open source and non-proprietary. This means that you are the owner of your data and you are the one who decides how it is managed: you can fine-tune every aspect of your Drupal site, from the site itself to your hosting environment. If you have a security team or security-focused partner that you work with, Drupal provides the flexibility they need to keep your data safe.

The official Drupal Security Team is also thoroughly on top of the security of the core Drupal software’s code and helps module developers keep their modules secure. This team frequently releases security patches that address any vulnerabilities that come up. In addition to the official Drupal team, the large Drupal community of developers donates time to develop and monitor Drupal’s code. Drupal and all of its modules are built using a core set of coding standards, so the many thousands of developers working with Drupal’s code ensure security issues are found and addressed quickly.

Lastly, one of Drupal's best-known features is its ability to integrate with third-party applications. As such, Drupal is also capable of easily integrating with other security systems and platforms on the market. You’re not restricted to Drupal alone.

4) Open source community

In my mind, there are two main reasons that the open-source nature of Drupal and the community that surrounds it are such an advantage.

First, because of the large community of developers and its open-sourced nature, there are countless plug-and-play ready modules available free of licensing fees just waiting to be added to your website. This means, in addition, you are the owner of your own code and data. Furthermore, you never have to worry about losing development support for your website. There will always be another Drupal agency out there waiting to pick up the pieces if something were to go wrong.

Second, because there is such a large community of developers behind the expansion of Drupal, you have a veritable fusion of diverse ideas and designs. Instead of a single organization pushing code in a certain direction, you can find incredibly creative and unique libraries of code. This means a deeper pool of free talent to pull from. Even with the creative minds driving the development of Drupal, there is still consistency in the underlying code. This enables easier upkeep of the code itself and allows a lower barrier of entry when onboarding new developers. The advantage to the end-user is that, when compared to a fully custom build, using Drupal means that should your partner agency ever go out of business or the relationship deteriorates, you will have other experts in Drupal to turn to.

5) Future-proof

I keep bringing this up, but it really enables so many possibilities: because Drupal is so open to API integrations, you can design Drupal to work as modular middleware behind the scenes. This means that as you acquire new technology and software, it really is as simple as plugging it in and configuring an API hook.

Furthermore, as long as Drupal is paired with the right server infrastructure, it can handle very high traffic and scale from small business to enterprise. This is one reason why Drupal is such a popular CMS choice for medium-sized to enterprise-level organizations.

Finally, Drupal as a CMS is kind of like Play-Doh. You can build out your frontend experience for the market you are presently targeting using Drupal’s built-in theming layer or by using one of the many other frontend frameworks. Drupal’s APIs allow it to run headless, so it can hold your backend data but you’re not tied down to any specific way of building your frontend. Ten years down the road, though, you may have a completely different set of needs for your frontend framework. No problem, you can rest assured that Drupal won't get in your way.

Are you considering options for your digital experience platform?

Choosing the right DXP is important to your business now and in the future. Protect your tech investment by assessing the trade-offs of buy or build deployment options and how they relate to your digital experience goals and business outcomes. This Gartner report has been made available to our readers for a limited time and will help you get started. Check it out.

Click to access the Gartner report today

Feb 11 2020
Feb 11
Sun through a lens

"A picture is worth more than a thousand words". True, but a large picture will make your webpage slower, which will affect your SEO in a negative way. And eat away at your servers space, megabyte after megabyte.

There are several ways to remedy this, but one is to use an image compression service to save space. With online services or programs on your computer, you can strip unnecessary information and compress images, sometimes saving up to 80%.

Here I'm going to show you how to integrate the TinyPNG service in your Drupal installation which automatically compresses your images.


There are many different services on the internet, but one of the best I have found is TinyPNG - and it's super easy to implement on your Drupal site.

It's also easy to see if you can benefit from using their service. Visit their Page Analyzer and enter your site URL, and you will be presented with statistics. If your potential savings are over 25%, I would suggest you start using a compression service.

Statistics over how much your website can benefit from using an image compression service, in this case TinyPNG

Step 1: Installing the Drupal module

Using Composer to install both the module and the TinyPNG library makes it super easy to get started. Run


composer require drupal/tinypng

in your terminal. Composer downloads the module, places it in the correct folder and downloads its dependency - TinyPNG PHP Library - and places it in the vendor folder.


Head into your Drupal website and click Extend in the menu. Scroll down (or filter) to TinyPNG and activate the module. 

Step 2: Getting an API key

An API key? What's that? Well, to make TinyPNG accept the requests from your website, you need an API key. It's a way of saying "howdy, can I get some service?". It's also a way for TinyPNG to track how many images you compress per month. Don't worry, you get 500 for free every month, so unless you upload more than that, you're in the clear.

If you send more than 500 requests, you won't get access to the service until the next month - unless you pay for the service.

For normal use, 500 requests should be enough.

Getting an API key couldn't be simpler. Just visit the developer section of Tinypng.com and enter your name and email. 

You get an email with a link. Click it, and - boom! - you're in. On the page you can see your API key and also a counter that lets you know how many requests TinyPNG has processed using your unique API key.

Screenshot from TinyPNG's developer page with an API key

Step 3: Make the magic work in Drupal

Click the Configuration link in Drupal's menu and look under Media. There you find TinyPNG Settings. Click it.

Now it's time to copy the API key you got from the TinyPNG service. Paste it into the field on the settings page and hit Save configuration.
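If you prefer the command line, the same setting can likely be stored with Drush. Note that the configuration object and key names below (tinypng.settings, api_key) are assumptions on my part, so verify them against the module's configuration schema first:

```
# Object/key names are assumptions - check the module's config schema first.
$ drush config:set tinypng.settings api_key YOUR_API_KEY
$ drush config:get tinypng.settings
```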

Screenshot from the settings page of TinyPNG inside Drupal

Step 4: Choose your compression method

The module facilitates two different kinds of image compression: on upload, via Drupal's own Image Styles, or both. I use the on-upload kind myself, since that way I know I won't reach the monthly API limit. If I used the image style version, I could reach - and pass - the limit quickly, since I manage a site with a lot of images. Sure, I wouldn't need to enable the image action on every single image style I have in Drupal, but I sure would be tempted to do so.

If you choose to use the TinyPNG API when uploading, you get two options under Integration method: Download and Upload. They do the same thing; the only thing to remember is to use Upload on your local installation and Download on your live server. The help text says it all: "The download method requires that your site is hosted in a server accessible through the internet. The upload method is required on localhost." Personally, I think the options could be named better, but they do the job.

Step 5: Save some megabytes

Well, actually there isn't a step 5. After installing the module with its dependencies and entering your API key, there isn't much more to do. Just sit back, relax, and watch the images shrink when uploading and/or showing them to users, making their experience on your website faster and better.

Some numbers

Here is also a comparison before and after using TinyPNG.

Type           Before compression   After compression   Saved, %
Image 1, PNG   1.1 Mb               267 Kb              75%
Image 2, PNG   1.1 Mb               287 Kb              75%
Image 3, PNG   1.2 Mb               269 Kb              77%
Image 4, PNG   985.7 Kb             274.0 Kb            72%
Image 5, PNG   5.6 Mb               1.5 Mb              73%
Image 6, JPG   3.5 Mb               524 Kb              84%
Image 7, JPG   197 Kb               104 Kb              47%
Feb 11 2020
Feb 11

Our client Senec works in a competitive environment and has to react quickly to changing privacy requirements. At the same time, the user experience cannot be harmed by regulations. At the beginning of the year, Senec asked us to build a two-step process for YouTube/Vimeo video playback.

Senec's website is built with Drupal and won the International Splash Awards 2019.

When loading a page that contains a video, a preview is displayed with a custom image and a play button according to their design system.

Screenshot video on Senec website before privacy overlay

But when the visitor decides to click on the video playback button, an information message is displayed as shown in the image below.

Screenshot video on Senec website with privacy overlay warning about video content

It is only after the user clicks on 'Play video' that the request is made to the video provider to fetch the content and display it.

Displaying the video like this allows for both an engaging visual experience, and at the same time protects the user's privacy until consent has been explicitly granted.

Feb 11 2020
Feb 11

A Drupal 7 to 8 migration is anything but boring because there are so many ways to perform a migration! Depending on the complexity of the project, we can choose a technique that suits it best. The one we are going to discuss in this blog is to migrate content and configuration from Drupal 7 to Drupal 8 using a CSV import method.

Drupal provides various modules for importing data from different sources like JSON, XML and CSV. The Drupal 8 core Migrate API offers a whole suite of APIs that can handle essentially any kind of migration from a previous version of Drupal to Drupal 8.

Some prep work before the Drupal 7 to 8 migration

In order to migrate from Drupal 7 to Drupal 8 using CSV import, we will need these modules.

Drupal 7 Modules -

  • Views Data Export: This module needs to be installed on our Drupal 7 site. It helps in exporting the data in CSV format.

  • Views Password Field: This module helps to migrate passwords by exporting them in hashed format.

Drupal 8 Modules -

  • Migrate – The Drupal 8 Migrate module helps in extracting data from various sources into Drupal 8.

  • Migrate Plus – This Drupal 8 module helps in manipulating the imported source data.

  • Migrate Drupal – This module offers support for migrating content and configuration to Drupal 8.

  • Migrate Source CSV – This module offers a source plugin that can migrate entities and content to Drupal 8 from .csv files.

  • Migrate Tools – This Drupal 8 module offers UI tools and Drush commands to manage migrations.

  • Configuration Development – This module helps in importing configuration files into Drupal 8.

Let the Drupal 8 migration begin!

First, we need to create a custom module for our Drupal 8 migration. Let’s name this module test_migrate. As with any custom module, we then need to create the info.yml file.


Above screenshot shows keys that are required for info.yml.
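Since the screenshot is not reproduced here, a minimal info.yml along these lines should work; the name, description, and package values are illustrative:

```yaml
name: 'Test Migrate'
type: module
description: 'Migrates content and configuration from Drupal 7 CSV exports.'
core: 8.x
package: Custom
dependencies:
  - migrate
  - migrate_plus
  - migrate_tools
  - migrate_source_csv
```
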

Once the info.yml file is created, we need to create a migration group for the migration. This migration group needs to be created in the path test_migration > config > install, and the file should be named migrate_plus.migration_group.test_migration.yml.


Above screenshot shows the folder structure to create a migration group.

Inside the migrate_plus.migration_group.test_migration.yml file, we need to write id, label and description for the migration group which is shown in the screenshot below.
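The screenshot is not included here; as a sketch, the group file typically contains just these three keys (the label and description text is illustrative):

```yaml
id: test_migration
label: 'Test Migration'
description: 'Groups the CSV-based Drupal 7 to Drupal 8 migrations.'
```
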


After creating the migration group, we need to install this group in our info.yml file. 


Now, we are going to write migration scripts for users, taxonomy terms, paragraphs, and content types. Note that you should migrate in this order, since these entities link to one another. For example, content is created by a particular user, so we first need to migrate users, and after that the taxonomy and the content type.

Now let’s write the yaml file for the user migration. We need a file named migrate_plus.migration.test_migration_users.yml; the migration script is shown below.


These are the keys required for the migration; the source is the CSV file to be migrated. The CSV file should be placed in the path assets > csv > users.csv. The users.csv file is also shown below.
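The CSV itself is not reproduced here; a hypothetical users.csv (column names assumed, matching the mapping described below) might look like:

```csv
uid,title,mail,pass,roles
1,admin,admin@example.com,$S$hashed_password_1,administrator
2,editor,editor@example.com,$S$hashed_password_2,editor|author
```
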



path - the path to the CSV file.

header_row_count - the number of rows at the top of the file that make up the header.

keys - the columns whose values are unique for every row.

process - the section that maps CSV columns to Drupal fields.


The above image shows the mapping between fields and CSV columns. Here, name is the machine name of the user name field and title is the CSV column title. If a single field holds multiple values, we use delimiters - users may have multiple roles, for example, in which case we write the mapping as shown in the above image.
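For illustration, the delimiter-based roles mapping described above could be written like this (the column and field machine names are assumed, not taken from the screenshots):

```yaml
process:
  name: title
  mail: mail
  roles:
    - plugin: explode
      delimiter: '|'
      source: roles
```
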

Images are migrated by writing a custom plugin, placed in the path src > Plugin > migrate > process. In the above picture, you can see that user_image_import_process is a custom plugin written to migrate user images.


Inside UserImportProcess.php, we write the function that copies the image and saves it to the destination. The script is shown in the image below.


In order to identify where images should be saved, we write one more function, ImageImportProcess, in which we specify the machine name of the image field.


In the user migration yaml file there is a destination section, which indicates where the migrated data is to be stored - in this case, the user entity. This is marked in the image below.


After creating the code for users, we need to write the yaml for taxonomy terms. Note that if your taxonomy has only the title field, you do not need to write a separate yaml file. If your taxonomy terms have multiple fields, you need a separate yaml file. For taxonomy terms we use tid as the key, since tid is unique for each term.
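As a sketch (the field names and the vocabulary bundle are assumed), a taxonomy term migration file could look like:

```yaml
id: test_migration_taxonomy
label: 'Migrate taxonomy terms from the csv file'
migration_group: test_migration
source:
  plugin: csv
  path: 'modules/custom/test_migrate/assets/csv/taxonomy.csv'
  header_row_count: 1
  keys:
    - tid
process:
  name: name
  description: description
destination:
  plugin: 'entity:taxonomy_term'
  default_bundle: taxonomy
```
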


After this we will migrate paragraphs. For that we need to create a separate yaml file. The code to migrate is shown in the below image.
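The image is not reproduced here; a hedged sketch of a paragraphs migration (the bundle and field names are assumed) could look like:

```yaml
id: test_migration_paragraphs
label: 'Migrate paragraphs from the csv file'
migration_group: test_migration
source:
  plugin: csv
  path: 'modules/custom/test_migrate/assets/csv/paragraphs.csv'
  header_row_count: 1
  keys:
    - fcid
process:
  field_text: text
destination:
  plugin: 'entity_reference_revisions:paragraph'
  default_bundle: text_paragraph
```
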


Lastly, let's migrate the content type. The yaml file for the content type is shown in the code below.


id: test_migration_content
label: 'Migrate Content type data from the csv file'
migration_group: test_migration
source:
  plugin: csv
  # Full path to the file.
  path: 'modules/custom/test_migrate/assets/csv/content.csv'
  header_row_count: 1
  keys:
    - nid
process:
  # Adding the mapping between the fields and the csv columns.
  title: title
  promote: promoted_to_front_page
  sticky: sticky
  field_display_name: display_name
  field_marketing_tagline: marketing_tagline
  # The destination field name below is a placeholder.
  field_taxonomy:
    plugin: entity_lookup
    source: Taxonomy
    entity_type: taxonomy_term
    bundle_key: vid
    bundle: taxonomy
    value_key: name
  body/value: body
  body/format:
    plugin: default_value
    default_value: "full_html"
  # The destination field name below is a placeholder.
  field_paragraph:
    - plugin: explode
      delimiter: "|"
      source: fcid
    - plugin: skip_on_empty
      method: process
    - plugin: migration_lookup
      migration: test_migration_paragraphs
      no_stub: true
    - plugin: iterator
      process:
        target_id: '0'
        target_revision_id: '1'
destination:
  plugin: 'entity:node'
  default_bundle: content
migration_dependencies:
  required:
    - test_migration_paragraph
    - test_migration_taxonomy
dependencies: { }

After writing all the yaml files, the module's test_migrate.info.yml will list all of them under its install section, as shown below.


Once you finish all these steps, go to your site and install your custom module.


Next, go to your project in the terminal and run the drush ms command to check the migration status, as shown in the below image.


To migrate, use the command drush mim migration-id. You can see the migration ID in the above image.

Once done, if you check the migration status you can see the number of migrated items.


Now you can observe that all the content has been migrated. If there is an error in the data migration, the process terminates at that particular item. Fix the issue with that content, and then you can restart the migration.

Things to Remember

  • If the migration is terminated mid-process, its status will display as "Importing". To reset the status to "Idle", run the command drush mrs migration-id, then run drush mim migration-id again.

  • If you want to roll back the migrated content, run the command drush mr migration-id.

  • If you change anything in the code after starting the migration process, make sure you run the command drush cdi test_migration so that the changes are reflected while migrating. Once done, do a thorough check of your site to confirm that all the content has been migrated.

Feb 11 2020
Feb 11

Florida DrupalCamp 2020 is the event that celebrates open-source software and brings together a worldwide community of Drupal users, developers, marketers and content specialists in one spot. The brightest minds share their expertise, level up their skills, and make new friends in the community every year.

This year, OpenSense Labs is a Silver Sponsor of Florida DrupalCamp 2020! Held from 21-23 February 2020, the event will provide a platform where developers, designers, and marketers gather to explore the most ambitious and cutting-edge case studies.

Catch us here!

If you're going to be around during the camp, do not miss out on these sessions: 

Session 1: Centralised content distribution and syndication demystified. Why and how?

Saturday, February 22 | 2:15 pm - 3:00 pm


A central content repository allows the content editors to edit content directly from the backend of one site. Using the publisher site, organizations can publish, reuse, and syndicate content across a variety of subscriber sites and publishing channels.

The session will stress the importance of a centralized content repository in boosting editorial teams’ productivity and article publication pace.

At the end of the session the attendees would be able to take away the following:

  • Centralized Content Distribution Architecture.
  • Real-time content syndication by setting up publisher and subscriber sites.
  • Configuring content schema between publisher and subscriber sites.
  • Minimizing Failures during data transmission.
  • Choosing the right infrastructure for content distribution.

Session 2: Architecting a Highly Scalable, Voice-Enabled and Platform Agnostic Federated Search 

Sunday, February 23 | 9:30 am - 10:15 am


Vidhatanand will share how we built an enterprise search that goes beyond the traditional approach by combining the robust Apache Solr with Drupal 8, leveraging JavaScript for portability across a diverse range of CMSs, thereby increasing efficiency by 40%.

He will walk you through the complex architecture of federated search and challenges amidst architecting a microservice. You will be equipped with the know-how of:

  • Enhancing website search experience retaining a blend of useful and accurate results.
  • Expanding inter-site searchability decreasing the bounce rate and latency.
  • Increasing data discovery and interoperability of information by cross-functional support to a plethora of platforms. 

See you there!

We're taking this great opportunity to be a part of Florida DrupalCamp 2020, and we can’t wait to connect with you about the amazing things our team has to offer. Come stop by and say hello to get your hands on some cool Drupal swag!

When: 21-23rd February 2020 

Where: Florida Technical College, 12900 Challenger Parkway, Orlando, Florida 32826

Feb 11 2020
Feb 11

The Drupal Community Working Group (CWG) is pleased to announce that registration is now open for a full-day Mental Health First Aid workshop on Sunday, May 17, 2020 (the day before DrupalCon Minneapolis begins) in Bloomington, Minnesota. 

The workshop will be held "field trip" style, off-site at the Health Counseling Services facility in Bloomington, Minnesota, from 8:30am to 5pm. Transportation will be provided between the workshop and a location near the Minneapolis Convention Center (the site of DrupalCon). Following the workshop, attendees are invited to (optionally) attend a pay-on-your-own group dinner to decompress and discuss the day's workshop.

The CWG believes that these types of proactive workshops will help improve our community's mental health literacy and awareness, as well as make it easier for us to have open, honest, and respectful conversations and to spot signs that community members are in need of assistance.

The Drupal Association is generously sponsoring the workshop by providing funding to help defray its cost, as well as providing transportation.

From the Mental Health First Aid website:

Mental Health First Aid is a course that gives people the skills to help someone who is developing a mental health problem or experiencing a mental health crisis. The evidence behind the program demonstrates that it does build mental health literacy, helping the public identify, understand, and respond to signs of mental illness.

Mental Health First Aiders learn a single 5-step action plan known as ALGEE, which includes assessing risk, respectfully listening to and supporting the individual in crisis, and identifying appropriate professional help and other support. Participants are also introduced to risk factors and warning signs for mental health or substance use problems, engage in experiential activities that build understanding of the impact of illness on individuals and families, and learn about evidence-supported treatment and self-help strategies.

Over the past few years, the CWG has organized proactive community health events, including on-going Code of Conduct contact training, as well as previous DrupalCon North America trainings on leadership, teamwork, and communications. 

In order for the workshop to proceed, we need at least ten community members to register by April 1, 2020 at https://healthcounselingservices.com/events/adult-mental-health-first-aid-11/

When registering:

  • Choose the "Pay now" option (do not select the "Bill my organization" option).
  • Use the coupon code MHFA30 to receive $30 off the regular price.
  • For the "Name of organization", "Name of site", "Supervisor's name", and "Supervisor's phone" fields, feel free to use "not applicable".
Feb 10 2020
Feb 10


To perform A/B testing, segmentation, and the personalization of a webform, a site builder needs to create a variant of the form that can be triggered based on certain contexts, which can be as simple as a custom URL.

A variant is a form or version of something that differs in some respect from other forms of the same thing or from a standard.

-- https://www.lexico.com/en/definition/variant

A webform variant might alter a form's labels, descriptions, and even its confirmation page. A webform variant could be used to create an A/B test to confirm if a tweak or improvement to a form's user experience increases the rate at which the user completes a form. A basic A/B test would randomly load two variants, allow a defined number of users to complete the form, and then review the results to determine which variant had the highest completion rate. The most successful variant can then be permanently applied to the webform.

A webform variant can also be used to create a personalized webform based on a user's demographics. For example, webform's available inputs, labels, and even options could be altered based on a user's gender, age, locale, employer, etc. Even subtle tweaks can improve a form's user experience - for example, removing inappropriate disease options or inputs based on a user's gender can simplify an appointment request form.


Right now, the only out-of-the-box solution is to create multiple instances of a webform and route users to the appropriate one. The biggest issue with having multiple webforms is that they collect two different datasets. Ideally, all submission data should go into the same results table for analysis, with just the user experience changing. You can also use conditional logic to hide/show elements and disable/enable certain behaviors.

Both approaches have limitations and lack some organization. For A/B testing, it is possible to alter a form via JavaScript. Still, this approach is limited to front-end tweaks - for example, you can't change an element's server-side conditional logic using client-side JavaScript.

The best solution is to provide an administrative user interface for defining and managing webform variants.


Variant definition (element)


Site builders need to define a variant type using an element. Once a variant element is added to a webform, a dedicated "Variants" tab is then displayed in the form’s management tabs. The "Variants" tab is used to create and manage variants of a webform.

Storing the variant definition in a variant element makes it easy to track and report submissions by variant. By default, the variant element is not displayed on the form or submission. Also, by default, a variant element is allowed to be prepopulated using a query string parameter. Using a query string parameter to set the variant makes it very easy to track completion rates by examining page views by URL. A variant can also be set using the element's default value or a webform field's default data.

When prepopulation is enabled, a site builder can choose to enable A/B testing by checking the randomize property. When randomize is checked, visitors are randomly redirected via JavaScript to an enabled variant.

Variant instances (plugin)


Once a variant element is placed on a form, variant instances need to be added to the webform. Variant plugins work very similarly to webform handlers. A variant plugin can alter any aspect of the webform.

The default webform variant plugin that ships with the Webform module is called the 'Override' plugin. This plugin allows site builders to alter elements, settings, and behaviors using YAML.

Altering settings

Using the 'Override' plugin, site builders can alter a webform's form, submission, and confirmation settings and behaviors. For example, a variant can change a webform's confirmation type and message. It is also possible to set up variant-specific submission limits.

Altering elements

Altering a webform's elements makes it possible to change an element's type, label, validation, and even conditional logic. Elements can also be hidden or displayed in a variant. In YAML, a site builder enters the element name and the element properties that need to be altered. For example, using a variant, a select menu can be changed to radio buttons.

Altering handlers

Altering a webform's handler configuration is mostly applicable to email handlers, because a variant can alter an email's subject, message, and recipients.
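Putting the three kinds of overrides together, the YAML for an 'Override' variant instance might look roughly like this; the element and handler names are illustrative, not taken from a real webform:

```yaml
settings:
  confirmation_type: message
  confirmation_message: 'Thanks for registering for the short course!'
elements:
  # Change a select menu to radio buttons.
  how_did_you_hear:
    '#type': radios
  # Hide an element in this variant.
  company_name:
    '#access': false
handlers:
  # Alter the email handler's subject for this variant.
  email_confirmation:
    settings:
      subject: 'Short course registration received'
```
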

Custom Variant Plugins


The 'Override' variant plugin is very flexible and makes it very easy to create A/B tests. For webform segmentation, where multiple similar variants are needed, a developer might want to implement a custom variant plugin, which provides a dedicated configuration form and custom business logic to apply the variant.

Managing variants

Variants are very similar to Handlers, with the sole purpose of variants being to alter a webform. Variants are managed using the same user interface as Handlers with the addition of "View", "Test", and "Apply" operations. The "View" and "Test" operations allow site builders to review the variant's changes and test that submission handling is working as expected.

Applying variants


The "Apply" operation allows a site builder to apply a webform variant to the master webform. As a variant is applied, the selected variant or all variants can be deleted. The “Apply” operation is used to finalize an A/B test by applying the winner.

Webform nodes


Since variants are defined as element types, they can be populated using a webform field's default data. When a webform has variants, the “References” tab, which tracks all webform nodes, will now display the variant information for each webform node. An “Add reference” button is placed at the top of the page. The “Add reference” button opens a dialog where site builders can select the variant type when creating a webform node.

Placing variant instances in individual webform nodes makes it easy to create segmented dedicated webforms that still route data to the same submission table. For example, an event node can use a shared registration form while using variants to change the form based on the event type.

The concept and use case for variants is relatively complex, and it helps to see variants in action. There is a Webform Example Variant module, which includes an example of an A/B test and an example of a segmented webform. The “Example: Variant: Segments” webform demonstrates how a webform can leverage multiple variants to create a long and a short form for multiple organizations using a custom WebformVariants plugin.

The screencast below walks through what webform variants are and how they can be used to create A/B tests and segmentation.

What is next?

Experiment, Experiment, Experiment

The single word that describes why I felt it was essential to add variant support to the Webform module is "experimentation." As a community, being continuously open to new ideas and experimentation is what keeps Drupal moving forward.

Variants allow sites to experiment with and improve their existing webforms using A/B testing. Variants open up the possibility of creating segmented and targeted webforms for specific user audiences. As with every aspect of Drupal and Webform, variants are extendable, which opens up the possibility of connecting external personalization services to the backend of a webform to create personalized, individualized webform experiences.

I look forward to seeing what people can accomplish using variants. And at the very least, we can perform A/B tests and build more awesome webforms.

Who sponsored this feature?

Memorial Sloan Kettering Cancer Center (MSKCC) has been my primary client for the past 20 years. Without MSKCC's commitment to Drupal and their early adoption of Drupal 8, I would most likely not be maintaining the Webform module for Drupal 8. Most of my work on the Webform module is done in my free time. Occasionally, MSKCC needs a Webform-related enhancement that can be done using billable hours. In the case of adding support for variants: MSKCC, like most healthcare systems, needs webforms that target multiple, segmented audiences. Being able to create variants makes it easier for MSKCC to manage and track segmented forms and submissions.

I am very fortunate to have an ongoing relationship with an institution like MSKCC. MSKCC appreciates the value that Drupal provides, and the work that I am doing within the Drupal community.

Backing the Webform module

Open Collective is providing us, Drupal, and Open Source, with a platform to experiment and improve Open Source sustainability. If you appreciate and value what you are getting from the Webform module, please consider becoming a backer of the Webform module's Open Collective.



Feb 10 2020
Feb 10

Growing the community is the implicit goal of every Drupal meetup and event I attend. It's a constant topic of conversation at Drupal event organizing meetings, agency roundtables, and panels about recruitment and selling Drupal. Last year, I created a presentation for DrupalCamp Atlanta called "Growing the Drupal Community". Since then, it's been my hallway track conversation of choice, and everyone I talk to seems onboard with the goal of growing Drupal. As part of my role on the Drupal Association board, I'm chairing the Community & Governance Committee. We've been having lots of conversations about facilitating community growth, and I wanted to share some of what I've been thinking.

Our Target Audiences

By definition, if we want to grow Drupal, that means talking to people outside the Drupal-sphere. So who would we be targeting?  

  • Decision makers selecting a technology (Marketing/Communications and IT)
  • Developers and technologists curious about Drupal
  • Drupal users who aren't active in the community
  • Users who inherit a Drupal project
  • Agencies who are using Drupal for the first time
  • People looking to switch careers  

These are who I think of when I think of growing the community. It's important to remember that we're not just talking to developers or decision makers, but people from a wide range of backgrounds. The Drupal community is made up of designers, project managers, developers, translators, content and accessibility experts, and folks with other roles or who do Drupal as one of their many responsibilities.

One Step Closer to Engagement

Growing the Drupal community means bringing our audiences one step closer to participating in the community. That could mean different things for different people depending on what type of user they are and where they're at in their "Drupal Journey." Here are some tasks early on in this journey that we should make easier:

Try it Out

  • Install Drupal
  • Try out a demo
  • Watch a video about how Drupal works

First Contact

  • Attend a first Drupal event
  • Attend Global Training Days 
  • Make an account on Drupal.org and/or Drupal Slack
  • Talk to another Drupal user in the community
  • Join a Drupal user group on meetup.com 

Stay Informed

  • Join a mailing list to learn more about Drupal
  • Read a case study or download promo material
  • Watch a video from a Drupal event
  • Search for help on Drupal.org or Drupal StackExchange  

Later in the journey, we hope to take users beyond feeling like "Newbies." We want them to use Drupal successfully, become members of the Drupal Association, make contributions, and become Drupal ambassadors. But arguably, the steps above are more important for growing the community.

What does this mean for Drupal.org?

Drupal.org is the home of the Drupal project and it should help move users further along their journey to being part of the community. It's a big ask. Drupal.org is also a place for the existing community to communicate and collaborate, and it's a complex website with a lot of moving pieces.   

That being said, here are some key places we could focus on to build community engagement:  

  • Community page: At DrupalCon Amsterdam, I conducted a UX feedback session and collected some feedback about the Community page. One audience member said "I feel like this is structured in a way that people who are very familiar with the community would think about it, rather than from the point of view of someone who is new to the community." I think repositioning this page for newcomers and focusing on local events (camps, meetups, and local training days), joining the Drupal Slack, local associations, and getting started using Drupal would be a big improvement.
  • Groups.drupal.org is still a useful community organizing tool for some topics and groups, but many of its features have effectively been replaced by meetup.com, confusing many new users who stumble across abandoned groups on the website. When a user lands on a group, clearly pointing them to where they can find upcoming events and the most relevant content would be really helpful.
  • The Evaluator Guide is a valuable tool for developers trying out Drupal for the first time. I think adding in an evaluator guide for different audiences (especially decision makers) is essential to creating a smooth and welcoming onboarding experience.

How You Can Help

  • Spread the success stories of Drupal in your local communities and networks, especially to those outside the Drupal community. Post those stories on LinkedIn, attend events outside the Drupal-sphere, and look for ways to promote Drupal in outlets where non-Drupal folks hang out.
  • Volunteer with the Promote Drupal initiative 
  • Be active in your local Drupal community
  • Welcome newcomers on Slack, Drupal.org, and at the Drupal events you attend 
  • Look for opportunities to hire and train those outside the Drupal community  

Let me know your thoughts and what you think of the ideas above. I'd love to start a conversation.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
