Oct 01 2019

Drupal accessibility is vital for your website

It is vital to create accessible content on your website. Accessibility ensures that people with impairments are not excluded from your audience. On top of that, the website itself becomes more user-friendly and you come closer to meeting the Drupal accessibility standards that exist today. In this blog post, we will go over 20 tips that will improve your content and website accessibility, then briefly describe the W3C and the WCAG guidelines, and finally suggest 5 Drupal modules that will aid you in your quest to improve your Drupal accessibility. Let's get started.

1. Incorporate a Site Map

Drupal accessibility map

A site map is a beneficial tool that lets users assess the logical structure of your website. This, in turn, makes it easier for them to get an overview of your content, makes that content easier to find and access, and increases your Drupal accessibility.

2. ALT attributes to describe pictures

Alt attributes are a very important part of making your website's content accessible. The main purpose of an ALT attribute is to help search engines, and the site itself, describe and understand what a picture is about. This is essential for people who cannot see the image and instead receive its description as audio feedback from a screen reader. Your website was probably built with alt attributes in the content from the start, but you need to train your writers and site maintainers not to skip the alt attributes when updating the website.

3. Clean and distraction-free content

Drupal accessibility clutter

Another paramount point in making your content more accessible is to host it on a clutter-free website. This enables easier access to the content and, in turn, makes it less frustrating for impaired visitors to navigate your website and get to the important parts of it.

4. Clear and simple language

Language is another factor to take into account if you want to make accessible content. It's important to adapt your language so it can be understood by a wide range of people. That is why, even in writing, the language level should remain conversational: no fancy words that make the text harder to follow, whether it is read on screen or aloud by a screen reader. If you install the Yoast SEO Drupal module you'll get a real-time score of how easy to read your content is!

5. Meaningful link text

It is important that link text is as clear as possible. Link texts like “click here” or “read here” are not descriptive enough. Instead, link a sentence or group of words that describes what the link leads to. This greatly reduces frustration for users who cannot see the page and rely on a screen reader.

6. Ensure keyboard accessibility

People who have motor disabilities or visual impairments, or who are amputees, often have trouble using a mouse or any device that requires a high degree of motor coordination. That's why keyboard accessibility is so important. The main point of keyboard accessibility is to make every element and link selectable using the TAB key. To test whether your website has this functionality, just press TAB repeatedly and see if every element can be reached. This way, you will greatly reduce the struggles of impaired users.

7. Provide videos and audio with transcripts or captions

To make accessible content, video and audio should have transcripts or captions. This is a crucial step in making the content on your website accessible: screen readers can read the transcript aloud to aid the visually impaired, while deaf and hard-of-hearing users can read the captions.

8. Support screen readers

Drupal accessibility keyboard

Screen reader support is the most important piece of improving your Drupal accessibility. With this kind of software support, the text displayed on your website can be read out loud. Basically, it lets blind people hear the text from your website. On top of that, paired with captions and transcripts, a screen reader can also convey what is happening in a video. Screen readers give feedback either through speech or through braille. A general awareness of how screen readers work is a great first step in training your writers on accessibility.

9. Don’t use automated media

What is automated media? Automated media is media that starts playing automatically when a website is accessed, whether an ad or a video. In either case, it can be frustrating for somebody with an impairment to have to find and mute or close the media. This is why autoplaying media should be turned off on your website.

10. Review your website using automated accessibility assessment tools

It's always a good idea to assess your website's Drupal accessibility with an automated testing tool for accessible content. Such a tool automatically scans your site and reports how compliant it is. Afterwards, you can see the areas of your website where you're doing great and the areas where you could still improve your accessibility.
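To make the idea of automated scanning concrete, here is a minimal sketch in Python of the kind of check these tools run, limited to two of the rules discussed in this post: images without alt text and form inputs that no label can reference. Real tools (AChecker, axe-core, and others) run hundreds of checks; the class and sample markup below are purely illustrative.

```python
from html.parser import HTMLParser

class A11yAudit(HTMLParser):
    """Toy accessibility auditor: flags <img> tags that lack an alt
    attribute and non-hidden <input> tags that lack an id (so no
    <label for="..."> can point at them)."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt")
        if tag == "input" and attrs.get("type") != "hidden" and "id" not in attrs:
            self.issues.append("input without id (no label can reference it)")

# Hypothetical page fragment: one good image, one bad image, one
# unlabeled text field.
page = """
<img src="team.jpg" alt="Our team at DrupalCon">
<img src="logo.png">
<input type="text" name="email">
"""
auditor = A11yAudit()
auditor.feed(page)
print(auditor.issues)
```

Running this prints the two issues found, which is exactly the kind of report an automated assessment tool would expand into a full compliance score.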

11. Make your website seizure-proof

Drupal accessibility brain

It's really important to make sure that your website is not causing unwanted seizures in your audience. For example, someone who suffers from epilepsy can have a seizure triggered by rapidly flashing animations. A simple rule to avoid such an unfortunate event is to have no content that flashes more than three times per second. This way you can be sure you're not going to trigger any photosensitive seizures.
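The three-flashes-per-second rule can be checked mechanically. The sketch below is a simplified illustration, assuming you have a list of timestamps (in seconds) at which a flash occurs; it only counts flashes per one-second window and ignores the luminance and screen-area conditions that the full WCAG flash thresholds also consider.

```python
def flash_rate_ok(flash_timestamps, limit_per_second=3):
    """Return True if no one-second window contains more than
    limit_per_second flashes. flash_timestamps must be sorted."""
    for start in flash_timestamps:
        # count flashes inside the one-second window starting here
        in_window = [t for t in flash_timestamps if start <= t < start + 1.0]
        if len(in_window) > limit_per_second:
            return False
    return True

print(flash_rate_ok([0.0, 0.5, 1.2, 2.0]))       # spread-out flashes: OK
print(flash_rate_ok([0.0, 0.2, 0.4, 0.6, 0.8]))  # five flashes in one second: not OK
```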

12. Provide clear instructions for content input by the user

If a website requires its users to input content, then the instructions have to be crystal clear in order to avoid confusion. The easy way to do this is to provide labels for every form control, such as drop-down menus, text fields, and checkboxes. On top of that, the labels have to describe the function and purpose of the control. This ensures that assistive technology refers to the correct form control, increasing your Drupal accessibility.

13. Character key shortcuts

Drupal accessibility characters

If a website supports keyboard shortcuts that consist of numbers, letters, punctuation or symbols, then it should be possible to turn them off. This ensures that people will not accidentally trigger a shortcut by pressing the wrong key.

14. Users are allowed to turn animations off

Another important feature your website needs in order to be more inclusive and accessible is the ability to turn animations off. This matters because animations can be distracting and can make navigating your website harder.

15. Pointer gestures

Complex gestures such as pinching to zoom or swiping should also be achievable through other means. This ensures that the people in your audience who cannot perform these gestures, for various reasons, are not left out. This is a vital point for your Drupal accessibility.

16. Motion actuation

Interactions triggered by moving your phone, for example shaking it, should also be achievable through the interface, without the need to physically perform the motion. This increases the Drupal accessibility, inclusiveness, and user-friendliness of your website.

17. No time limits

Drupal accessibility hourglass

Having no time limits is really important. Imposing time limits on your website can make it hard for people with motor, visual or hearing disabilities to reach their goal in time, which in turn leads to user frustration. To avoid that, disabling time limits (or letting users extend them) is the way to go.

18. Text resizability

Another important aspect of improving your Drupal accessibility is text resizability. Basically, your website has to allow its users to resize content up to 200% of the original size. This ensures that users with some form of visual impairment can still read your text and view your images.

19. Visual presentation

This is another important criterion when you are making your website more inclusive. Adhering to this guideline gives your end users the ability to choose how your website is presented, including the colors, line spacing, and text sizes. This gives each user the freedom to choose the visual presentation that suits them best.

20. Bypass Blocks

Drupal accessibility stop sign

Another tip to make your website more inclusive and user-friendly is to let users bypass blocks of repeated content. This is important because a screen reader will read all the navigation links, header links and other repetitive content present on a page, regardless of how many links there are. Imagine how frustrating it is for a person to sit and listen to a long list of links, often irrelevant to them, before actually getting to see or hear the content they were originally searching for. Many visitors will simply become frustrated and leave your page. The easiest way to avoid this is to provide a skip-to-content link in your header. With this, you create better Drupal accessibility for your website.
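A skip-to-content link is a small piece of markup: an anchor placed first in the body that jumps to the main content region. The sketch below shows hypothetical markup (the id and class names are illustrative, not Drupal defaults) together with a trivial check that the link's fragment actually matches the target element's id, which is the one thing that must hold for the link to work.

```python
# Hypothetical skip-link markup: the anchor is the first focusable
# element in <body>, is usually hidden by CSS until focused, and jumps
# straight past the repeated navigation to the main region.
skip_link = '<a href="#main-content" class="visually-hidden focusable">Skip to main content</a>'
main_region = '<main id="main-content">'

# The link only works if its fragment matches the target element's id.
target = skip_link.split('href="#')[1].split('"')[0]
print(target)                                  # main-content
print(('id="%s"' % target) in main_region)     # True
```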

WCAG Guidelines


WCAG was developed by the W3C (World Wide Web Consortium) as a set of regulations that help make digital content accessible to all users, including those with disabilities.


There are three versions of these guidelines: WCAG 1.0, WCAG 2.0 and WCAG 2.1. The latter two are built around four basic principles that have to be met in order for a website to be compliant. These principles are as follows:

  • Perceivable: The information and user interface have to be presented to users in ways they can perceive.

  • Operable: Navigation and user interface components have to be operable.

  • Understandable: Information and the operation of the user interface have to be understandable.

  • Robust: The content has to be robust enough to be interpreted reliably by a wide variety of user agents, including assistive technologies and future technologies.


On top of that, each guideline has a level of compliance that is assigned to it. The levels of compliance are as follows:

  • A: This is the minimum level of conformance and the easiest to achieve of the three.

  • AA: This is a more extensive guideline. It is usually regarded as the standard to meet.

  • AAA: This is the most extensive standard to meet. It is also the strictest, and thus the least commonly met.


Currently, most laws require websites to be WCAG 2.0 compliant; you only need to adopt WCAG 2.1 if the laws in your country explicitly require it. However, the W3C does suggest that new websites be built in compliance with WCAG 2.1, since it tends to make them more inclusive and user-friendly.

Drupal Modules that help with your Content accessibility

Drupal accessibility disability

Now that you have an idea of the compliance levels for accessibility, it's time to look at a list of Drupal modules that can help you improve the Drupal accessibility of your website and make it more user-friendly and inclusive.

Automatic alternative text

This Drupal module makes it easy for the images on your website to have alternative text even when the editor has not specified any. It uses Microsoft's Azure Cognitive Services API to identify what the image is about and generate one or more descriptions, depending on the confidence level.

Text resize

This module allows the text on your website to be adjusted to the needs of your user. This module is available for both Drupal 7 and 8. 

Style Switcher

This module provides a high degree of functionality for users with color blindness. It lets themers create themes with alternative stylesheets, which in turn lets each user select the right color scheme for their particular type of color blindness.

Accessibility Scanner

This module allows you to perform website accessibility assessments to see where your website can be improved. It has to be used in conjunction with AChecker, and it can assess both local and remote websites.

Fluidproject UI Options

This module provides the user with the ability to control and modify a page's font size, font style, line height, contrast and link style. On top of that, those preferences are remembered across the website using cookies. A perfect module for ensuring a higher degree of customizability for all your users.


Hopefully, now that you know these tips and have a better understanding of the WCAG guidelines published by the W3C, you can put your newfound knowledge into practice and use the suggested Drupal modules to make the most amazing, inclusive and user-friendly website you can. This will take the Drupal accessibility of your website to new heights!

Oct 01 2019

We have witnessed rapid developments around voice assistants over the past few years. With mobile users increasing every passing day, it is fair to assume that voice searches will rise along with them. Fiction has turned into reality: you can pose questions to a device and get human-like responses. Stunning, isn't it? This is what millions of users are doing every day with Alexa, Apple's HomePod, Google Assistant, and others. User interfaces have changed over time, and each new user interface has brought a bundle of new challenges.

Alexa Multi-turn conversation


Conventional user interfaces are displayed as controls in an application (text boxes, buttons) or web pages. They are vigorously utilized and have been demonstrated to be sufficiently effective for human-machine interaction. 

| The question persists, why build voice assistants? What are the advantages of having voice assistants? 

  1. The magic of conversational interfaces is that users don’t have to learn how to use them. An Alexa skill should leverage the power of natural language understanding to adapt to the user’s word choices, instead of forcing them to memorize a set of commands. 
  2. As the saying goes, “Don’t play the odds, play the man”. A voice assistant can do exactly that, because voice search queries are normally longer than text searches, which makes them more conversational. 
  3. One of the significant benefits of voice assistants is their machine learning capabilities. The more we interact with these devices, the more the assistants learn. Over time, they can return highly customized results.
  4. With voice assistants, you can cater to customers based on who they are and not simply their behavior. While it's still early days for personalizing the customer experience through voice assistants, this is tremendous for businesses.

Conversations themselves are classified into two types: single-turn and multi-turn dialogs.

| Single-turn Vs Multi-turn Dialog with Drupal

Single turn: a dialog where the conversation ends with one question and one response in return. For example, asking Alexa to set an alarm or a reminder, play a song, or adjust the volume requires no back-and-forth. This is called a single-turn conversation.

Alexa Multi-turn

Let’s consider an example in the context of Drupal and the Alexa module. Here we have created an Alexa skill that provides information related to Drupal. The user asks Alexa “who is the founder of Drupal?” and she responds “Dries”. But when you ask her “Which year was it open-sourced?”, Alexa fails to determine the context of the question, i.e. “Drupal”, and treats it as a brand new query.

A few questions cannot be answered in a single turn. A user may pose a question that has to be filtered or refined to determine the correct answer. That is where multi-turn conversations come into the picture.

Multi-turn conversation with Alexa


| Dialog Management

Genuine conversations are dynamic, moving among topics and thoughts smoothly. To make conversational Alexa skills, design for flexibility and responsiveness. Skills should be able to handle variations of conversation, contingent gathering of information, and switching context mid-discussion. Dialog management makes these natural interactions possible. - Definition from the Alexa docs

| How do you make this work?

Create an Alexa skill: 

  • Amazon Alexa skills are the equivalent of Android apps. You have to create a custom Alexa skill using the Alexa Skills Kit (ASK) on the Amazon developer console. 
  • You define an interaction model and point it to your Drupal site.

Interaction model: 

  • With the Alexa Skills Kit, you can create skills with a custom interaction model. 
  • You implement the logic for the skill, and also define the voice interface through which users interact with the skill. 
  • To define the voice interface, you map users' spoken input to the intents your cloud-based service can handle.

Components for Alexa custom skill:

  • Use an invocation name to start a conversation with a particular custom skill. For example, if the invocation name is "Drucom", the users can say “Alexa, open Drucom”.  
  • The invocation name alone can be used to get things going, or you can combine it with an intent, such as “Alexa, open Drucom, order wine”.
  • Each intent corresponds to something that the Alexa skill is capable of doing; it captures what your users want to do. Each intent in an Alexa skill contains the following:
  1. Intent name
  2. Utterances
  3. Slot (optional)
  • Utterances are the phrases users speak to invoke an intent. Users can express the same intent using different statements. For example, if we were building a help intent, there are different ways one can express that they require help:
  1. I need help
  2. Help me
  3. Alexa, can you help me?
  • The way Alexa works is that it parses what the user says; it does not just send the raw sentence to your service but also passes along the intent that was triggered. 
  • The utterances you provide for an intent do not have to be perfect or cover every case and scenario; they are training data from which Alexa figures out what the intent is.
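The mapping from utterances to intents can be pictured with a toy resolver. Alexa's real natural language understanding generalizes far beyond the listed samples; the sketch below only does exact matching over hypothetical sample utterances, purely to illustrate that the skill backend receives an intent name rather than the raw sentence.

```python
# Hypothetical interaction-model fragment: intent names and sample
# utterances are illustrative, not taken from a real skill.
INTERACTION_MODEL = {
    "HelpIntent": ["i need help", "help me", "can you help me"],
    "FounderIntent": ["who is the founder of drupal", "who founded drupal"],
}

def resolve_intent(utterance):
    """Toy resolver: normalize the sentence and look it up among the
    sample utterances; return the matching intent name, or a fallback."""
    text = utterance.lower().strip(" ?!.")
    for intent, samples in INTERACTION_MODEL.items():
        if text in samples:
            return intent
    return "FallbackIntent"

print(resolve_intent("Help me"))              # HelpIntent
print(resolve_intent("Who founded Drupal?"))  # FounderIntent
print(resolve_intent("Order a pizza"))        # FallbackIntent
```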

Let's start with implementing the interaction model for the Add to cart functionality.

Step 1:  Create a new skill with Drucom as the skill name

Custom skill for Alexa

Step 2: Set an invocation name



Step 3: Create an intent

For our interaction model, we will create an intent called AddToCartIntent, which will be responsible for handling the utterances for adding items to the cart. Adding utterances: when users interact with our skill, they may phrase what they want to order in many different ways, so we add several sample utterances.


Looking at the above utterances, we can say that AddToCartIntent will only be invoked when the user tries to add Red Wine to the cart, but not when the user tries to add some other product. That's where custom slot types come to the rescue. 

Step 4: Create slot types and using them in AddToCartIntent

  • Glancing through the utterances above, we can identify the two slots that we have to capture, i.e. productName and quantity.
  • We have to create one custom slot type for productName, and we will use the built-in slot type AMAZON.NUMBER for quantity.
  • Custom slot types are lists of possible values for a slot. They are used for lists of things that are not covered by the built-in slot types provided by Amazon.

Once we have set up the slot types, it's time to apply them in our intent. When you are done with the changes, the intent will look something like this:
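In rough JSON form, the resulting intent definition looks like the sketch below. The custom slot-type name "PRODUCT_NAME" and the sample utterances are illustrative assumptions, not values from a real skill; the keys mirror the shape of an Alexa Skills Kit custom-intent entry, so treat it as an approximation rather than the exact schema.

```python
import json

# Approximate interaction-model entry for AddToCartIntent.
add_to_cart_intent = {
    "name": "AddToCartIntent",
    "slots": [
        {"name": "productName", "type": "PRODUCT_NAME"},  # custom slot type
        {"name": "quantity", "type": "AMAZON.NUMBER"},    # built-in slot type
    ],
    "samples": [
        "add {productName} to cart",
        "add {quantity} bottles of {productName} to my cart",
        "order {quantity} {productName}",
    ],
}

print(json.dumps(add_to_cart_intent, indent=2))
```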



Step 5: Activating Dialog management

To activate the dialog, you will have to mark at least one slot as ‘required’.

In the slot's settings you need to provide the prompts that Alexa will use when asking the user for the value, along with sample utterances showing how the user might supply the slot value. Now our interaction model for AddToCartIntent is ready.
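The dialog-management flow can be pictured as a small loop: while a required slot is empty, the skill keeps eliciting it; once every required slot is filled, the intent can be fulfilled. The slot names and prompts below are illustrative, and in a real skill Alexa drives this loop itself from the interaction model (for example via the Dialog.Delegate directive); this sketch just simulates the turn-taking.

```python
# Required slots and hypothetical elicitation prompts for each.
REQUIRED_SLOTS = {
    "productName": "Which product would you like to add?",
    "quantity": "How many would you like?",
}

def next_prompt(filled_slots):
    """Return the prompt for the first empty required slot, or None
    when the dialog is complete and the intent can be fulfilled."""
    for slot, prompt in REQUIRED_SLOTS.items():
        if not filled_slots.get(slot):
            return prompt
    return None

turns = {}
print(next_prompt(turns))        # Which product would you like to add?
turns["productName"] = "red wine"
print(next_prompt(turns))        # How many would you like?
turns["quantity"] = 2
print(next_prompt(turns))        # None -> ready to add to cart
```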


I have covered what single-turn and multi-turn conversations are, and why multi-turn conversations with Alexa and Drupal are vital. I have also described the steps to create a custom Alexa skill. In my next blog, we will learn more about configuring the Drupal site.

Sep 30 2019

Automatic updates are coming to Drupal at the end of October! Long one of the most commonly requested features in the Drupal community, Drupal 7 and Drupal 8 will soon have an automatic updater that will allow Drupal installations to stay up to date more easily. How does Drupal's new auto updater work, and what do you need to know about it? In this Tag1 Team Talk, we dive into not only Drupal's new automatic updates feature itself but also its architecture, components, and roadmap, as well as why it's such an important part of Drupal's Core Strategic Initiatives.

Join moderator Preston So (Contributing Editor, Tag1 Consulting) and guests Lucas Hedding (Senior Architect and Data and Application Migration Expert, Tag1), Tim Lehnen (CTO, Drupal Association), Fabian Franz (Senior Technical Architect and Performance Lead, Tag1), and Michael Meyers (Managing Director, Tag1) for a deep dive into the nuts and bolts of Drupal's groundbreaking automatic updates feature, directly from the module maintainer, and the strategic initiative sponsors including the Drupal Association, MTech, Tag1, and the European Commission.

Further reading

Automatic Update - Module

Automatic Updates - D7 and D8 Documentation Pages

Automatic Updates - Issue Queue
To provide your feedback on this first generation of the Automatic Updates module, create an issue in the Automatic Updates issue queue

Drupal Core Strategic Initiative

More information about the PSA.json feed can be found in this issue: https://www.drupal.org/project/drupalorg/issues/3068539

Drupal.org uses a package hashing and signing architecture based on the BSD Signify project

Drupal contributors created a PHP implementation of Signify: https://github.com/drupal/php-signify

Sponsors & Supporting organizations:
Drupal Association, Funding & Direction http://association.drupal.org
European Commission, Funding https://ec.europa.eu/
Tag1 Consulting, Development http://tag1.com
MTech, Development https://www.mtech-llc.com/

Text Transcript

[Preston So] - Hello, and welcome once again to the Tag Team Talks series. Today I'm joined by several guests from all around the world and talking about a subject that is very near and dear to my heart, I'm very excited to hear about this topic, Drupal's new automatic update feature. This is part of a Drupal core strategic initiative happening as part of the Drupal core roadmap. And today we're gonna talk with the maintainer of the project, as well as several special guests, about what it is, why it's important, and how you can get started with it right away and help us report any bugs or any issues you might encounter. But before we get started I just want to remind everyone who's joining us today, don't forget to check out our previous Tag Team Talks at tag1.com/tagteamtalks. We've done several amazing webinars and sessions with guests from all around the world about realtime collaborative editing, rich text editors, and of course, our work, working with a top 50 Fortune company in all of these issues. If you do like this session, if you enjoyed this conversation today, please remember to upvote, subscribe, and share it with all your friends and colleagues at your team. So, first thing I wanna do today is introduce myself. My name is Preston So. I'm a contributing editor at Tag1 Consulting, moderator here at Tag Team Talks. Joined by my friend Michael Meyers, managing director of Tag1, located here in New York City. Both of us are separate places here in New York. We're joined today by Lucas Hedding from Leon, Nicaragua, the senior architect and data and application migration expert at Tag1. Lucas is one of the top 20 most active contributors to Drupal 8. He's also the Drupal core Migrate subsystem maintainer, a core contribution mentor, a drupal.org project application reviewer, and the maintainer for the thing we're talking about today, Automatic Updates. 
We're also joined today by Fabian Franz in Switzerland, senior technical architect and performance lead at Tag1. Fabian is one of the five Drupal 7 core branch maintainers. He's also one of the top 50 contributors to Drupal 8 and maintainer for several Drupal 8 core subsystems, including BigPipe, Dynamic Page Cache, and Theme API, all very important systems in Drupal. We're also joined today by Tim Lehnen, located in Portland, Oregon, the CTO at the Drupal Association. Tim has been leading the Drupal Association engineering team for a number of years now, to great success, overseeing amazing initiatives to support Drupal development in the community. This includes the contribution credit system, the recent move to GitLab, and he's taken point on managing the relationship with the European Commission as well as they sponsor this wonderful initiative here. So, just to kick things off I wanna turn over the mic to Mike Meyers to talk briefly about Tag1 and why it is that we're talking about Automatic Updates, which is, once again, one of the most incredible topics, I think, that we can talk about in this series. [Michael Meyers]- Awesome. Thanks, Preston. Happy to be here. Tag1 is the second all-time leading contributor to Drupal. We have the largest concentration of core maintainers, and our team maintains over a dozen core systems like User and Views. We do a lot of our business based on Drupal, and the auto updater is key to keeping your sites up to date. So I'm excited to hear how this great new feature is gonna help end users, agency partners, platform companies, better manage their Drupal sites. [Preston]- Absolutely. And automatic update, this whole notion of auto update is, it's been something that's on the back burner of Drupal for a long time. It's also part of a very interesting aspect of the Drupal community which I think is very important to call out, which are the 10 or so currently ongoing Drupal core strategic initiatives. 
Before we jump into the technology itself let's just step back for a bird's eye view for a moment. What is a strategic initiative in the Drupal community, and what are some of the past and current ones? How did this whole initiative come to be? [Tim Lehnen]- I think that's a great question, and I'd be happy to speak to it a little bit. This is Tim with the DA. The strategic initiatives for Drupal are laid out on maybe an annual basis, maybe a little bit longer than that in terms of horizon, frequently in Dries' keynotes at DrupalCon, where he talks about key priorities for the Drupal project that he's discussed together with the whole core maintainer team and prioritized and settled on as the major efforts that he wants to rally the community around, and certainly the core developers around, in terms of moving the Drupal project forward. In the past, these have been things like adding Composer support to the Drupal project along with the release of Drupal 8. Things like updating the Migrate system, things like providing the API-first features in Drupal, and a number of things like that. And so, it was, I think in 2017 or 2018 when Dries first called attention to the need for automatic updates as a sort of foundational feature for Drupal, pointing out that this is something that a lot of commercial and proprietary software already does, and does well, and even some other open source projects are doing it. And it really makes a big difference to the total cost of ownership of people who run Drupal sites. [Preston]- Let's talk a little bit more about that angle, Tim. I think there's a lot of folks on this session here today who are really interested in learning more about what you just mentioned, the TCO, total cost of ownerships. What exactly is the benefit of auto update to these small and medium-sized businesses? You mentioned agencies as well. What sorts of use cases are we supporting with this? [Tim]- Yeah, it's a great question. 
Let me paint a picture first for what happens right now around updates, and particularly the time period that's, I think, the most time intensive and critical for most people who maintain sites, whether they're agencies maintaining on behalf of clients or the end users themselves. So, that's security updates. Security updates are, of course, something that are very important, something that people need to keep up with as quickly as they can. And so there's sort of this culture and community within the Drupal community of everybody getting together during a core security release window on Wednesdays, hanging out in Slack, talking to each other, and waiting for that security release to drop. Now, while we have these security release windows, sometimes they come out right on time, sometimes they take several hours, but that window is sort of U.S.-centric because that's where a lot of Security Team members are. And so what that means is teams all over the world are up on high alert whatever time it might be locally waiting for this critically important patch to drop so they can update sites and make sure they're secure. And so, this is something that takes a lot of time that people can't really plan for, so much. They have to be there in that window, and they don't really have much of a choice. And so this is a situation that can be expensive in terms of just keeping your developers on call, paying overtime if you're outside of the timezone that these windows happen, all those sorts of things. And there's a lot of anxiety around them. That was one of the first elements that we wanted to address is making the security update process in particular easier, something you don't have to be so worried about, and then expanding on that to other use cases. [Fabian]- One question I had because it's pretty fascinating for me is for years, the Drupal community has said, we need auto updates, we need auto updates, we need auto updates. 
And it was always like, hey, but it's not possible in all. So, how come this direction change that now we are thinking it's possible finally? [Tim]- That's a really good question. I think there's a few factors that all came into play. And I think Lucas can speak to this a little bit more than I can. But just from an overall view, I think just various things came together. The architecture of Drupal 8 had changed, starting to do things like support Composer workflows, but also incorporating other elements like those, Symfony as a core element, all those kinds of things. But also we began observing just in the open source world other people doing this. So, for example, WordPress has an auto update system. Some other projects have an auto update system. So I think people have been thinking about how we're gonna architect something that would work in a Drupal context, for quite a while. Lucas, do you wanna maybe speak to what came together to make it more possible to consider? I think we lost your audio, Lucas.

- [Fabian] Lucas, you're on mute. Can you hear us? [Lucas Hedding]- Hello, can you hear me?

- [Fabian] Yes, much better.

- [Lucas] Ah, great. I think it's a bit of a myth that we had bought into. We had said to ourselves, we can't do this, and we said it to ourselves long enough that we had convinced ourselves. But then as we started seeing the competition in WordPress and others doing in-place updates or auto updates of some form, we said, well, let's see if we can solve this tough problem. And we're close. We've spent a lot of time architecting this. We're still not 100% there, but I think we can see the light at the end of the tunnel. I think it's gonna be a pretty good success as we get to the finish of this first part of the initiative.

- [Fabian] Now I'm really curious. How did you split up this mega task of even getting started with this tough problem?

- [Lucas] Well, I mean... Let's see, we did a lot of architectural thinking, a lot of discussions at various camps and DrupalCons.

- [Tim] Yeah, I think it was a confluence of a few factors that came together and made it work well. We had a meeting of minds between some key contributors in the core team at Midwest Drupal Summit in 2018 and did a lot of architectural work, and at the same time, these conversations with the European Commission had started, and they were talking about how Drupal was a critical part of their infrastructure and they wanted to find something to support. And so we put two and two together and said, hey, we can do this. But yeah, scope is always a problem. It's a huge project. And so, what we said was, okay, well, if we wanna do some sort of phased scenario, what's the most important thing to start with, what has the biggest impact on people, and what simplifies the task a little bit? And that's where we came to security updates in particular as being the focus of this first phase, because it both lets us focus on just patch releases, which shouldn't be destructive in terms of what's being changed, and also fixes a critical need that a lot of the community has.

[Fabian]- That's really great. [Michael Meyers]- Maybe before we jump into the underlying architecture and how all this comes together, can you give us a background on how this works? Is this gonna update my production website in the background? Is this something I can do on my development environment and put through a CICD system? How does this work at a high level?

- [Lucas] I think the answer is yes to all of those questions. It's supposed to be flexible. At least that's how we're designing it. I think it's gonna be a little bit like the configuration management system, in that there was an original intent for how we do this. And I think the original intent, at this point, is that you'd be a small site owner and you'd wanna in-place update on a live site, so definitely we're trying to make that as stable as possible. But if you've got a continuous integration system set up, there's nothing stopping you as a site owner from wiring into that. And then we'll iterate. This is an initiative that has multiple phases planned out already.

[Fabian]- So it could be possible for me as... I mean, if I had 100 Drupal sites or something like that it could be possible for me to just get, instead of waiting for many, many hours on this Wednesday I would just get the package pushed to me, just as potential scenario, and then once the patch arrives then I can distribute it, test it, if my basic tests are working, distribute it automatically, basically.

- [Lucas] Yeah, I mean... Well, some of the initial conversations I had were at MidCamp with folks from Pantheon and from, at the time, it was Drupal Commerce. They've renamed themselves to... I'm gonna butcher their name.

- [Tim] Centarro is the name.

- [Lucas] Centarro, right. So, we've got different folks in the room, and we're all just chatting about this idea of auto update, and everyone... Pantheon has its upstreams. There's nothing stopping any of these, even infrastructure owners, from certifying things and updating the URLs where you pull down files, so that it's from a vetted source where they've already done QA and added their secret sauce to this. I'm not saying that that's what's gonna happen, because I don't know the future. But we're definitely trying to build this as flexibly as possible, so that even the hosting providers and your service providers can add their own secret sauce. Some of this might or might not play real well with what we're doing around the hashing and cryptographic signing of some of the resources. But at the very least, there's flexibility in the base architecture.

- [Preston] Sorry, go ahead, Fabian.

- [Fabian] I'm really curious as a Drupal 7 maintainer. Is this only for Drupal 8, or would Drupal 7, where we still have probably a million sites out there, also be covered?

- [Lucas] Because of the source of our funding here, we're in an interesting place. The backport policy for Drupal is you have to do it in Drupal 8. Eventually it'll be Drupal 9, and then we backport from there. But the funding and the interest from the European Commission has really been Drupal 7. So we're playing to two camps here. We've gotta fulfill our contracts with the European Commission, and so all of the goodness of Drupal 8 is getting backported as we go along. And at this point, essentially everything that's in there for 8 has already been built out for 7.

- [Preston] Very interesting. And I wanna get back to what you just mentioned, Lucas, about flexibility and some of the really interesting elements there. But first, we've established that this is gonna be for both Drupal 7 and Drupal 8, which is, I'm sure, music to the ears of everyone listening right now. I'm curious, though, because this is such an ambitious project, as you, and Tim, and Fabian have all mentioned: where are we right now in the life cycle? I know, by the way, you have a full roadmap already outlined. There's gonna be a link in the description of this video if you wanna check out the full roadmap for auto updates. But what's currently part of the initial rollout as of right now?

- [Lucas] So, we did an alpha 1, I don't know, three or four weeks ago. And we did that because we now have something of value. There's this concept that we're building into the auto updates called PSAs, or public safety announcements. So when these Drupalgeddon-type events happen, when these majorly critical things are coming down the pipe, we can now have another channel, yet another channel, to alert even small site owners that their site is now out of date. We're doing yet another channel because we've already got the existing, hey, you've got security updates; hey, we've got Twitter; hey, we've got Slack. That only alerts and notifies folks if they're following the Twitter feed. And we already send out the emails about your site having updates available for everything already, so it's a little bit like the boy who cried wolf every other week. So this is just another channel, but it's going to be used with great care. It's gonna be used once or twice a year. And then we'll be able to send out these alerts to folks, and we have that now for both 7 and 8.

- [Fabian] Just to clarify for me: so, I'm one of those, probably not, but just imagine I'm one of those that always clicks away this message, oh, there's updates available. Yes, I know, I haven't updated this module in a while, or there's security updates, oh, but this module I can't, and so I'm missing a crucial core update. But instead of that generic message, and many also disable the Update module because it makes things slower in some parts, instead of that, I would be getting the message, on Wednesday, the date, there's a critical security update, please pay attention, or something, so that it's more direct communication to...

- [Lucas] Direct communication, and it'll happen in the days preceding the Wednesday as well. So, whenever the Security Team decides that, hey, this is important, they send out the emails a day or so in advance. This would also be part of their process. So the other part of what we've got built in in this alpha is what's called readiness checks, or preflight checks, if you wanna put it another way.

- [Preston] Sorry, go ahead.

- [Lucas] The readiness checks just basically go out and say, are you ready? If you got an update tomorrow, or tonight, or in 10 minutes, would you be ready to update using this technology? And then we spent a lot of time thinking, well, how do we go about this? We even renamed it from preflight checks because, with our non-English-speaking community, we thought that doesn't have as much meaning as a readiness check. Readiness checking goes out and sees, hey, is your site hacked? And I'll get into that in a second. Do you have enough disk space? If we were to apply an update, do you only have two megabytes of disk space, and you're gonna fail? There's about eight of these things that we decided on. And some of them are errors, some of them are warnings. As an example, we'll warn you if you're not running cron on your site frequently enough. If you're not running it frequently enough, we know that you're not gonna automatically update your site in the next couple hours. But that's a warning. We're not gonna block you. But if your site is mounted on a read-only hard drive for reasons of security, you can't really go out and update a read-only hard drive. That's just gonna fail. And that would be an error. So we've got this whole concept around readiness checks. It's a plugin, part of the Drupal 8 plugin system. In Drupal 7 we've done something similar to that, not plugins, but the same business logic is in there. Can we update you quickly? And both of these things at this point are now available in the 7 and 8 alphas.
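Conceptually, each readiness check is a small function that reports an error (blocking), a warning (advisory), or nothing. A rough Python sketch of the error-versus-warning idea described above — all names here are invented for illustration, not the Automatic Updates module's actual API:

```python
import os
import shutil

ERROR, WARNING = "error", "warning"

def check_disk_space(path=".", minimum_bytes=10 * 1024 * 1024):
    """Error if there is not enough free disk space to stage an update."""
    free = shutil.disk_usage(path).free
    if free < minimum_bytes:
        return (ERROR, f"Only {free} bytes free; need {minimum_bytes}.")
    return None

def check_writable(path="."):
    """Error if the codebase lives on a read-only filesystem."""
    if not os.access(path, os.W_OK):
        return (ERROR, f"{path} is not writable; updates cannot be applied.")
    return None

def run_readiness_checks(checks):
    """Run every check and collect blocking errors and advisory warnings."""
    errors, warnings = [], []
    for check in checks:
        result = check()
        if result is None:
            continue
        severity, message = result
        (errors if severity == ERROR else warnings).append(message)
    return errors, warnings

errors, warnings = run_readiness_checks([check_disk_space, check_writable])
print("errors:", errors, "warnings:", warnings)
```

A cron-frequency check in this scheme would return a `WARNING` tuple instead, so it is reported but does not block the update.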

- [Fabian] We can download this?

- [Lucas] Yeah, right now.

- [Preston] That's amazing. We've already spent a great deal of time talking about readiness checks, and I wanna get more into the features of the Automatic Updates module. But first I wanna move into a very interesting subject for our audience, because this has been a uniquely challenging problem for Drupal for a very long time. How did you all manage to architect the right solution? And also one that could potentially be expanded out to other PHP projects, like WordPress, for example.

- [Lucas] I don't know that I was involved in the early, early conversations. But since the beginning of 2019 I've been involved. Tim, you wanna bring up some of the early conversations?

- [Tim] I can get into some of the early architectural discussions. I think the concern has to do... As Lucas was saying earlier, we had this myth that Drupal use cases and Drupal sites were so varied and so complex that that meant there wasn't a good way to have a set of standards for doing an auto update process. And we just really started interrogating those things in the last two years or so and said, well, is that really true? And what it came down to is that the concerns were all about confidence. They were all about that ability to understand, hey, can we really be confident that these updates will apply cleanly, that they're not gonna break folks' sites, and what do we do if we miss something and it does break folks' sites? So we spent a lot of this time saying, okay, what kind of architecture can we create to give us the maximum confidence, so that whether you're running this as an attended or even unattended auto update, you'll be able to feel like it's not gonna break your site. And so, that meant that we needed to architect these readiness checks; we needed to talk about something that's coming in the second phase, which is this notion of having an A/B version of your codebase, so that you can run an update, but if something fails, flip back to the known good version — a bootloader concept is what we were inspired by. Then we thought about, okay, well, what else is involved? Well, we need to make sure that we can generate something that is the equivalent of a patch. Not literally the patch that's just been released, but a quasi-patch, as Lucas termed it, that will apply all these changes cleanly to sites. And then finally, we need to give people confidence in what's being delivered. So that means having a good way to secure the package delivery system that comes from drupal.org, and ensuring that it's signed and verifiable, so that people know that what they're installing is in fact coming from a trusted source. All of these concerns are about confidence and integrity of the system, and so that's kind of what we built into it. And then once we realized, hey, you know, there's ways to solve each of these problems — there's ways to solve confidence that the update will apply before you do it, there's ways to solve confidence in being able to roll back if you need it, there's ways to solve confidence in the trusted source of the package — then we realized it's almost significantly simpler. If you assume that we've solved those problems, we're just applying patches and a couple of processes to make sure those things apply cleanly. And from there we got into the real weeds of the architecture, which I think Lucas could probably speak to more.

- [Lucas] It's interesting that you say it's simple, because the patch that we've built right now for this service to auto update your site... it's a service. Half of this patch is testing. Actually, probably more of the patch is testing. It's like 500 lines, 400 lines of code, to update your site. Seriously, it's a really small amount of code to grab an artifact off of drupal.org, download it, and overlay that. I'm calling it a quasi-patch because it's not a full patch, it's not a regular patch. It's all the files that would be touched between versions, between 8.7.3 and 8.7.4, as an example. If there's 10 files, we grab all those 10 files and put them in this patch. That way we don't have to worry about git apply, or patchutils, or anything on the server. We just have to deal with what's already built into PHP: copy and paste. And we do the copy and paste, and we do all of this... I'm gonna credit Nate Lampton here. His vision was, do it all in the same HTTP request. If you do it all in the same HTTP request and you roll back in the same HTTP request, the site can't actually go down. Well, knock on wood. It still probably could in very rare edge cases. It's about, I don't know, 200 or 300, 400, 500 lines of code, depending on if you start counting some of our tests in there as well.

- [Preston] Very interesting. I think the way that you've solved this problem with these quasi-patches, and just the thinking that's gone into this, really reflects the amazing commitment that you all have shown to making this a really successful feature. I wanna dig into some of the features now. We've already delved a little bit into some of the public safety messaging and the readiness checks. But just for our readers, and our listeners, and all the folks in the audience today, I wanna define very clearly what those three major components of auto updates are. The first is public safety messaging, which we talked about briefly already. The second is readiness checks, or preflight checks, which really indicate when a site is ready to go. And then finally, I think the most compelling of all of these, the actual in-place updates that occur. What can we learn... Lucas, you mentioned earlier about Drupalgeddon and some of the ways in which, for example, every time I get a security advisory tweet or see that pop up, sometimes just the urgency of it doesn't really register with me, or things of that nature. I know Fabian also mentioned that as well. So I'm curious, what sorts of other aspects of public safety messaging did your team consider during this whole effort? And what other components are involved in that?

- [Lucas] Well, I really wish there were more folks on this call, because it's not just the folks here on this call that have really contributed. It's been a whole team in the auto updates channel in Slack, and folks at different camps and conferences. So, what have we... I mean, there were other thoughts around figuring out what is on the site already, and calling home to drupal.org and sending information. But there's security and privacy concerns around that, and so for the scope of what we're doing right now we're trying to keep it simple for messaging.
- [Tim] Yeah, I would add to that: basically, we coordinated very closely with the Security Team, naturally, because this first focus was on the security side of Automatic Updates. And so, the main thing that we did was say, look, they already have their security advisory content type architecture on drupal.org, which is the basis of how the regular, standard, canonical announcements roll out. So what we asked ourselves was, okay, what are the requirements for changing that to provide a feed that could be consumed, a JSON feed that could be consumed by this auto update system? And then, how do we put policy restrictions in place to say, hey, look, this is gonna be a more prominent message appearing in people's admin interface, but we don't want to cry wolf, as people have said, we don't want to overuse it, so what controls do we put in place? And mostly that will be at the discretion of the Security Team to say when it is worthwhile to put out this message about those additional updates.
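The consumption side of such a JSON feed can be sketched very simply: poll the feed, then surface only the PSA entries that apply to the site's installed version. The field names below are hypothetical — the actual drupal.org feed schema may differ:

```python
import json

# A hypothetical PSA feed payload; real field names on drupal.org may differ.
sample_feed = json.loads("""
[
  {"title": "Critical core release on Wednesday",
   "type": "core",
   "is_psa": true,
   "insecure": ["8.7.3"]}
]
""")

def psa_messages(feed, installed_version):
    """Return PSA titles that apply to this site's installed version."""
    return [
        entry["title"]
        for entry in feed
        if entry.get("is_psa") and installed_version in entry.get("insecure", [])
    ]

print(psa_messages(sample_feed, "8.7.3"))
# ['Critical core release on Wednesday']
```

Because only entries flagged as PSAs are surfaced, the rarely used "pay attention on Wednesday" channel stays separate from the routine update notices.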

- [Preston] I think the interesting thing here is that we're really trying to reduce the burden on the user through this JSON feed, through the way in which we're gonna be able to get this alert directly to the audience. Now, just to move into more of the way that the user experience, or the site owner experience, is really improving here: these readiness checks, these preflight checks, that run every six hours, or via cron jobs, how do they really play into this whole feature and this whole notion of automatic updates?

- [Tim] I was just gonna add something before Lucas gets into the architecture. From my two cents, what Lucas and others have architected is really great because it's an extensible system. New kinds of readiness checks can be created as time goes on. And what's been developed so far has been the result of core contributors, and the Security Team, and just other general contributors coming together and saying, okay, what are those scenarios and use cases that are gonna be most important to determining whether or not an end user can actually run this auto update process? What's gonna block them from taking advantage of this new system that will make things easier? And so, building this in so these currently eight or so different categories of readiness checks run on a regular basis and then provide their results, so the site owner knows, oh, hey, if I fix this read-only directory, or if I fix whatever these warnings or errors are in advance of this next round of updates going on, it's gonna be much more of a breeze for me. I just think those elements are gonna be really valuable. They're already valuable. They're already something that I think people should be taking advantage of if they wanna try out the alpha. But I think the underlying message is they're only gonna become more valuable. As people test the alpha and as we move into the first phase stable release and things like that, we're only gonna learn more from contact with the world about other things that we could be checking, other things that we could be creating warnings, or errors, or messages for, in order to help people be ready to take full advantage of the system.

- [Fabian] A quick question on the readiness checks, because that's really interesting. As I've understood it, it's plugin based?

- [Lucas] It's plugin based indeed.

- [Fabian] So as the site owner or, for example, a hosting provider, I could add my own readiness checks. So if I need something more, like, for example, running some very basic tests, or checking that my homepage is still reachable, or whatever I wanted, I could just do that with a plugin?

- [Lucas] Yeah, you could. We tried to come up with the basic ones that are very reusable. Most of these readiness checks are somewhere in the range of five to 10 lines of real code. Can we write to a file on your hard drive? You know, I mean, that doesn't take that long to figure out that you can't do that. It's really simple. You tag a plugin with the right tags, look at an example, and now as an agency, or even as a large company, if you have some specific rules around, hey, we need to make sure that the moon is in the third phase right now, you can do that. There's nothing stopping you.
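The extensibility described here — tag a plugin, and the system discovers and runs it alongside the base checks — can be sketched with a simple registry. The real module uses Drupal 8's plugin system; this Python decorator-based registry and its check names are only illustrative:

```python
import os

# Illustrative registry standing in for Drupal 8's plugin discovery.
READINESS_CHECKS = {}

def readiness_check(name):
    """Decorator that registers a check under a name, like a plugin tag."""
    def register(func):
        READINESS_CHECKS[name] = func
        return func
    return register

# One of the small reusable base checks: can we write to the codebase?
@readiness_check("file_writable")
def file_writable():
    return os.access(".", os.W_OK)

# A site-specific check a hosting provider might bolt on, e.g. "is the
# homepage still reachable" (stubbed so the sketch stays self-contained).
@readiness_check("homepage_reachable")
def homepage_reachable():
    return True  # a real plugin might issue an HTTP request here

results = {name: check() for name, check in READINESS_CHECKS.items()}
print(results)
```

Anything registered this way runs with the base set, which is how an agency-specific rule slots in without touching the core checks.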

- [Tim] I would add...

- [Fabian] That's really cool. And I have a question. If I have hacked core, and as a core maintainer I sometimes have to do that, can I still use the auto updates? I mean, is it preventing me from that, especially if a file is updated that I never hacked?

- [Lucas] We're not blocking you from updating core, or any contrib module, if you have a file that's modified. Remember, we're using a quasi-patch approach here. So, if you are about to update something... For the readiness check, if you've got a modified, patched file in core, we will warn you, saying there's some uncertainty here. Because what happens if a quasi-patch between 8-dot-something and 8-dot-something-else comes out tomorrow? Well, you can look at what's been modified. If it's in the Book module and you don't even use the Book module on your site, or the Book module hasn't been modified in six years or something... If you can look at it and say, well, I've patched this thing and it's rarely patched, or there's no security issues in it that have come out for many years, then you have greater confidence that you'd probably be okay. That said, when we go to apply that update, we're gonna take that specific one and we're gonna say, no, no, no, no, you can't override a file if it's been modified.

- [Tim] So any of your unchanged files, as long as you've checked the warnings and you think it's otherwise okay, any of your files that are unchanged from what the architecture has, when it does its comparison, then yeah, those can still apply.

- [Lucas] I wanna go into a little bit about how we're figuring out if files have changed. Because I think that's a really interesting component to all of this, one that I wish I could say I had more thought into, but it's really the community. There's this whole concept, though, that makes sense after it was explained to me the first time, about doing a file hash. Just do an MD5 sum... well, not an MD5 sum. A SHA-256 or a SHA-512 sum. We do that for every single file in all of core, all of contrib. The Drupal Association has really helped us out a lot here. They're gonna provide an official feed of hashes for all of these things. This is the type of stuff that, if you look at the Linux distributions out there in the world, they've been doing for a super long time. If you try to download Composer, even, there's a SHA sum there. And you grab that SHA sum and you're supposed to do a SHA sum comparison to make sure that what you just downloaded is actually what was provided on the website. That's the type of technology we're building in. We're doing it not only to see if your site has been modified, we're also doing it to have greater certainty and assurance that these quasi-patch archives that we're downloading haven't been hacked or modified by a man-in-the-middle attack, or some other attack. And we'll link this if you're really wanting to look into it. There's a whole project that the Drupal community has now provided to the world on GitHub called PHP Signify. And it's using chained signatures. We've been going through it, trying to figure out... But you know what it is? It's following a pattern that BSD is using. BSD has used the same thing. We're just standing on their shoulders. To the current concerns about, well, did you write this hashing and verification system yourself? No, we didn't. And we also had some super, super smart people in the community contribute to this at the Midwest Drupal Summit this summer.
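The Composer-installer comparison Lucas makes is the whole idea in miniature: hash what was downloaded and compare it to the hash published out-of-band. This sketch covers only the hash comparison; the real system layers Signify-style signatures on top so the published hash itself can be trusted:

```python
import hashlib

def verify_download(payload: bytes, published_sha256: str) -> bool:
    """Compare downloaded bytes against a checksum published out-of-band."""
    return hashlib.sha256(payload).hexdigest() == published_sha256

# What drupal.org would publish alongside the archive:
published = hashlib.sha256(b"official quasi-patch archive").hexdigest()

assert verify_download(b"official quasi-patch archive", published)
assert not verify_download(b"tampered archive", published)  # MITM caught
```

The same per-file comparison against the official hash feed is what answers "has this site been modified?" before an update is applied.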
- [Preston] Very interesting, and I think this just indicates the amount of thought that went into this. And the notion of actually introducing your own readiness checks is something very interesting. We mentioned checking what phase the moon is in. I can see this being useful if you have, for example, a decoupled Drupal implementation and you have to wait on other code that you're working with, such as your decoupled application or frontend, to be ready before starting an automatic update. So now, the third feature we just named was in-place updates. Which, as you mentioned just now, Lucas, you wanna be very secure. You wanna make sure that these ZIPs that are being downloaded are not in any way affected by man-in-the-middle attacks. How do you look at that from the standpoint of the in-place update feature? What's going on in there, in the actual Drupal site itself, now that you've got those updates ready to go?

- [Lucas] So, the mechanics of this small patch that we've got right now is that we download a ZIP file. The ZIP file has six files in it. Bootstrap.inc is one of them, as in my example here. But another file that we have in here is this signature file that's called CSIG. It's using the same format that BSD is using. The project that we're using is called PHP Signify. It's standing on the shoulders of libsodium, which is baked into PHP as of 7.0 or 7.1. Someone will comment in the feed here and correct us. It also has backwards compatibility shims all the way back to PHP 5.3. So if you're still on a Drupal 7 site and you're on an old version of PHP, you could still have this stuff work for you. It's doing a hash, and it's using things like SHA to hash these files and compare the hash against a public and private key pair. We're even talking about... not talking, we're doing: the Drupal Association is in the process of acquiring and installing an HSM so that we can do this in a very, very secure manner. And the HSM will then be taken offline. You can't get more secure than this. Is Drupal really a target? A conspiracy theorist might say that it is. But even if it isn't a target, or even if it is, we're doing this in the most secure manner that we can.

- [Tim] Sorry, just to add to that a little bit: I think it's really interesting and robust the way it's architected. Drupal.org delivers these hashed and signed files, with the quasi-patch plus all of the appropriate signature files and the relevant metadata to verify all that. And then, again, as Lucas said, anyone using the auto update system, as long as they're on even a very old version of PHP with these shims in place, the PHP process can do the verification of the signatures, and the hashing, and all of that, and then we know that it's not an interrupted package that's been modified in flight, there wasn't a man-in-the-middle attack, or anything like that. And then on our side, using an HSM, we're able to do secure key rotations on a regular basis, we're able to validate, hey, you know, when was the last time this was checked, is it within a valid window of being signed, all this kind of stuff. So it's a really, really robust system. And I have to say, we sort of have the advantage... It took us a while to get here, but we have the advantage of being able to look at the example of other projects, whether that's a BSD system that a lot of this is based on, but also look at, say, WordPress or other projects that have started implementing their own systems, and say, these are the impressive things they've done, but they may be missing this piece in terms of the signatures and man-in-the-middle protection, or maybe they're missing this other piece, and how do we get a lot of that right from the start? And part of this is only possible because Drupal has maintained its status as kind of a centralized project. Drupal.org is the home of most, but not all, code related to the Drupal project, and that helps us be able to sign and deliver all the relevant packages. And with our move to GitLab, I think some of the folks who aren't here are probably coming back as well.

- [Fabian] And I'm very glad that you're doing that, especially with an HSM, because the recent Webmin hack showed how bad it is if you deliver hacked software to your users.

- [Preston] Oh, absolutely. And I think a lot of our audience is very concerned about that, just given the amount of risk that's possible here. So, I have one question for you both, Tim and Lucas. One of the things that I know a lot of our audience is curious about is, well, you have these archives, it makes sense. But what happens if we're using Composer? I use Composer. I've got Drupal 8.8. What exactly is planned for Composer? Or does it already work for sites that use Composer?

- [Lucas] So, when you go to 8.8.0, when that gets launched in October, you will have Composer whether you know it or not. The Composer initiative has been really rolling and rocking quite quickly here in the last few months, and a lot of things have come together such that when you upgrade to 8.8.0, you're gonna have Composer just by installing and upgrading to 8.8. So that does open up the question. Because when this whole initiative of auto update started, we said, well, we don't have any official Composer support in core. Remember, the funding is coming from the European Commission. That funding will dry up here shortly. Let's focus on the needs of tarball. So we have spent a lot of time on tarball for pragmatic reasons. But none of that means that we've not thought about Composer. And we have thought about it. And there's two or three different scenarios that we've considered with Composer. And in the first two of them, you're fine. You're not gonna have an issue. Let me talk about those. The first one is you're on a site that's been using tarball since day one. And again, let me pause there. This is really an issue in Drupal 8. When we go back to Drupal 7, the number of sites that are using Composer is very minimal. So you can do it, I guess. I've never done it. Drupal 8, though, it is an issue. If you've been doing tarballs and you're still running before 8.8, let's just make it simple: you're not gonna have any issues at all. 8.8 comes around, you've upgraded to that because you're being a good boy or girl. Now you've got Composer. You're still okay, because we're just overlaying files. The problem comes in when you start using the capabilities of Composer and you do a composer require, and your composer.json and your composer.lock file start to get modified. In that case, you're gonna have some additional considerations. Again, if it's just an overlay of files, you'll still probably be fine nine times out of 10. Nothing that we've done will cause an issue for your site. If you're wanting to patch, say, the Token module, that's one of the more popular modules out there, and a security release comes out for Token and it's been managed through Composer, we'll do that, that's not a problem. It's when you're dealing with a thing like... I'm gonna pick on Commerce. Commerce has a lot of vendor dependencies as well. If you want to update Commerce with this in phase one, and I wanna be clear, phase one, there are lots more plans in the future, it might be a little bit more tricky. If it's just one file in Commerce, again, you still probably won't have a problem. But if it's a file... I've got this all written down in an issue on drupal.org. We'll link to it, because this gets tricky. If it's a file that isn't in a vendor folder, then you're fine. But if you start dealing with a modified composer.json, vendors... But again, if you think about the audience here, it's folks that probably wouldn't have been doing composer update or composer require up til now, because they were on tarball. And so I think, at least for version 1.0 of this module, we're gonna learn so much. So, all that to say, Composer is thought of. There's more to come with it. And we'll make it better with time. We're gonna beef up that support.

- [Preston] Absolutely. And I wanna ask about the other elephant in the room here that I know a lot of people have on their minds. Is this gonna be a part of Drupal core? Is all of this going into core? At what point do we see this being in Drupal core?

- [Lucas] We've been working with the core contributors, and we even have Fabian on the call today. We've been working with core contributors from day one. This is an initiative out of the Drupal Association, and this is not done in a vacuum. The simple answer is, all of this will go into Drupal core. We're gonna incubate it in contrib. That said, some of the actual implementations of how we're doing this are not gonna go into core. We're not gonna do this whole overlaying-of-files thing on the live site, or at least not unless something changes, because there's too much risk there. Even though it's still very low risk, the core maintainers and release managers are still feeling that that's too much. So we've got plans for phase two, unfunded plans. I'm gonna put a request out there for funds. We need more funds for this A/B controller, where you've got a live site on A, and over on B you've got the about-to-be-updated site, and we're gonna flip back and forth between A and B, and if B has a problem we switch it back to A. And then we just keep doing that. And as we do that, we can also start doing more with Composer, perhaps, beefing up that support. The simple answer is, all of this will go into core.

- [Fabian]- What's the roadmap for the public service announcements? Because there doesn't seem to be that much to it besides it needing to be a secure channel, so I could commit that to Drupal 7 kind of right now. So what are the plans from the 8 maintainers there, when could we get this in?

- [Lucas]- We're still in discussions with you guys, Fabian. Because we wanna figure out when's the most ideal time to take these features over. We've really been focusing on the end of the year deadline. But if someone were to grab the code, roll a patch, and throw it up there, there's no stopping anyone from RTBCing that, reviewing, testing, and getting that into core now because it's pretty stable. I say that, but I also say we've gotta learn a lot. This whole thing, we need to learn more. More information needs to be gathered. I was hoping that we would have a little bit of a chance to have several hundred sites, thousand sites, with the contrib module installed with this on there, figure out the right wording for this messaging so that we don't annoy people to death, but we don't do it too infrequently either, and iterate and improve on it in contrib where the gates are a little bit lower, the stakes are a little bit lower. And then when it's ready, move it into core.

- [Fabian]- Yeah, I mean, you could always start with a simple experimental module. That has worked very well. And just one real quick question on the Composer thing, just if I got this right. So, the problem with Composer is, for example, that a security patch might itself depend on a vendor library update, and then I've got problems because then I can't know what kind of inner libraries are in there. Is that where the problems come in?

- [Lucas]- Yeah. I really probably should link it, and I feel like I did a poor description. But yeah, like, some of our more recent Drupal 8 updates worked with a tar file and we've got a vendor folder in there. If someone has run composer update at some point and they've already updated to a newer version, and in the midst of all of this they got a newer version of Symfony and some other components, it could get really messy. I think what we're hoping to do is just release this so that folks can give us that feedback: is this gonna blow up on me? There's gonna be a few folks that are gonna be like, I don't care. Just give me the updated site. And those are gonna be the risky folks. Those are the guys who go down the black diamond ski slopes without any protection and half kill themselves. We need those guys out there to do this because they're gonna report, and probably get upset, that they broke their site. But we need those people to give us feedback. But if they don't give us feedback, if we give the folks the flexibility to hang themselves and there's silence, well that also tells us a lot too. It says, well, maybe we can relax some of our... Relax some of what we're doing here, and even maybe promote some of these more risky adventures.

- [Fabian]- One quick question for Composer itself. Have you thought about, the idea came out originally two years ago or something like that, to just run Composer inside of that process with a virtual file system?

- [Lucas]- Composer right now takes too much memory. And on shared hosting where you're capped out at 512 megabytes, or 256 if you've got other processes, it becomes cumbersome. But a lot of the same folks that have been in these rooms have been thinking about this for a while. So, more to come on that. I think we've got some ideas to drop that in phase two as funding comes available for the community. We might have some more surprises in the wings.

- [Fabian] Sounds great.

- [Lucas]- Tim, you know more about the future. Do you wanna talk about where you see this going?

- [Tim]- Why don't we talk a little bit, yeah, just about phase two and what's going on, and just move into a discussion of the future. As we've said several times, the European Commission, because they rely on Drupal and they have a commitment to all the open source technologies that they're a part of as part of their FOSSA program, approached us before the beginning of the year and said that they wanted to set something up, and generously they've been supporting the work so far to do this phase one of the project. But as we said, that's gonna run out. Then in the next phase when we wanna have better Composer support and this A/B system so that it can be actually brought into core, those are really, I think, probably the two most major priorities, we're going to really need to figure out how we create a sustainable funding model for that. For folks who are out there who may be at large end user organizations who rely on Composer in your workflows but are also interested in the possibilities of Automatic Updates, it'd be great to hear from you. You can reach out to me at the Drupal Association. And that phase two we're gonna try and really plant a flag around that A/B controller and more robust Composer support and see who we can bring to the table to help us make that happen. It's really important to us that we move this forward. It's a huge priority for the project, of course, but it's also, it's not been an easy problem to solve. As people have mentioned, we've been talking about this in the Drupal community for probably five plus years and are only now beginning to make some real progress. But it's taken work. As we've alluded to before, there's a huge group of people involved. Staff on the Drupal Association side. People like Tag1 who've been working with us closely to actually do implementation, Lucas in particular. The Security Team, all the volunteers in the auto updates channel. Just a huge amount of people. The core maintainers have all been involved. 
And without funding, we might make a little bit of progress but I don't wanna go another five years without really delivering an even more robust system that we're capable of doing.

- [Preston]- Yeah, I think this definitely shows the amount of support and backing from the community but also from these sponsors that have really contributed a lot to make this a success. Obviously we mentioned the Drupal Association, MTech, Tag1 Consulting, the European Commission and their Free and Open Source program. I know that you all are looking for more sponsors. Tim, just to be very clear, to reach out to you they can go to [email protected].

- [Tim]- That's right. You can reach me there directly.

- [Preston]- Absolutely. I just wanna mention briefly that this is a very important initiative for the future of Drupal and for the ability for site owners to really have people, or to have that governance and maintenance that they've been used to all along with so many of these projects. And so Tim, just to jump back into the notion of sponsorship and looking for contributions. Why should folks look at sponsoring this project? We've talked about obviously the benefits technically, but are there other benefits also to this?
- [Tim]- Yeah, I mean, in addition to the technical benefits and the benefits of just being a part of good citizens of an open source community and raising the tide for all ships, there's also, because this is a strategic initiative for core, there's a lot of visibility into this project. There's opportunity for marketing and awareness campaigns around your support of this initiative if you choose to get involved. Cross-promotional activities that can be done with the Drupal Association. There's also opportunities for recruiting some of the best talent in the Drupal community if your organization is looking to recruit some more key talent. A lot of the contributors who've been involved, whether on a volunteer basis or a contract basis with this initiative are some of the most talented folks in the whole community, and you could recruit those people to your team. And then also, this is something that gets talked about on stage during the keynote at DrupalCon. If, for example, we managed to get someone on board before DrupalCon Amsterdam, I'm sure we would be thrilled to announce that partnership with another organization. Moving forward into the next year during the keynote, during Dries' keynote, we're certainly gonna be talking about the progress on this initiative and all the organizations who've been involved in the first phase so far.

- [Preston]- Wonderful, and I'm looking forward to hearing a lot more about this at DrupalCon Amsterdam. Before we close things up here, though, I just wanna ask, are there any other things that we wanted to mention about where this project is headed, or any other things that the audience might be interested in?

- [Tim]- I would just reiterate something that we said before, which is, there's an alpha available now. It's been out for several weeks. So if you're in a position to try it out, to test it out and provide feedback, that would be wonderful. If you're interested in being involved in the initiative there's a lot of activity in the auto updates channel in Drupal Slack as well as in the drupal.org/projects/autoupdates so you can get involved there and help out. And yeah, certainly feel free to reach out to myself or others involved if you have questions.

- [Fabian]- Awesome work.

- [Preston]- Yes, this is fantastic.

- [Lucas]- I mentioned it before but this has been the work of many, many people, and we've only got five or six on this call which is a poor representation of the larger community. So my shout out to Drupal and the Drupal community.

- [Preston]- Absolutely. All righty, well, we have run out of time here at the Tag1 Team Talk. So, just to remind the audience here today, thanks again for joining. It was a really amazing conversation about Automatic Updates coming to Drupal. A really amazing thing. I'm looking forward to hearing about it in Amsterdam and Minneapolis. Just for your information, we post all of these, by the way, at tag1.com/tagteamtalks. All the links are gonna be available alongside this video and audio recording. If you like this talk, if you like what you heard today, please remember to upvote, subscribe, and share it with all your colleagues, your parents, your friends, your grandparents too. And as always, please, if you have any topics you wanna hear about, if Automatic Updates is interesting to you, you wanna bring Lucas and Tim back into the fray here, please write to us at [email protected]. I wanna say a big thank you to Lucas, a big thank you to Tim, and to Fabian, and to Michael as well, for joining us today for yet another Tag Team Talk. Thank you all, and goodbye.

Sep 30 2019

All about Drupal development and the people behind it.

More and more businesses are looking for Drupal developers, as the market has been skyrocketing for the past decade. Drupal has emerged as an enterprise-level content management system compared to rivals WordPress and Joomla.

As Drupal development involves various segments, there is always scope for confusion regarding skill sets and responsibilities. We at OpenSense Labs comprise Drupal developers, architects, themers and back-end experts. All of us lay the foundation for any project we pursue.

Welcome to the first of our three-part series of articles on Drupal developers. Let's dive in and understand the distinct categories and the skills that make Drupal development a success.

[Illustration: blurred Drupal icon with multicoloured icons on a white background]

The entire development process in Drupal comprises various segments that contribute equally to the overall well-being of the website. From laying the foundation of the website to giving it a UX-friendly design, it is collaboration at its peak. Let us decode every single role here:

Drupal Site Builder 

[Illustration: a person on stairs building a site layout in multiple colours]

Site building is the core Drupal competency needed to create a site. It covers getting Drupal up and running and configuring its options to build a fully functional website.

One of the most rewarding aspects of site building is that a Drupal site builder builds sites using only point-and-click in the admin UI (user interface), without writing a single line of custom code. Site builders lay the foundation of any Drupal website.

That means they build taxonomies, content types, image presets, lists with Views, layouts, menus and rules, and set up roles and permissions.

In practice, a fully functional Drupal website is assembled from many Drupal core and contributed modules (such as References, Scheduler, and Automatic Nodetitles). A site builder has sound experience with these core and contributed modules.

They know how to combine modules, and understand their limitations, to solve a particular problem or set of problems. Every module has its own capabilities, which site builders understand. Beyond the above, site builders also:

  • Have a general understanding of how the web works and of installing dynamic web applications, both important prerequisites for Drupal site building. In addition, familiarity with HTML and CSS and some ability to read code is a plus.
  • Can install and set up Drupal manually or by using an application or a service, configure core, add new features and evaluate contributed modules.
  • Can test configuration changes before deploying them to a live website.

Drupal Themer

[Illustration: two screens showing colour combinations in blue, orange and white]

A Drupal themer, also known as a front-end developer, sits between the designer and the developer. They specialize in front-end design and development and are responsible for implementing the client-facing architecture of an application or website. Along with HTML and CSS expertise, they know:

  • Front-end technologies like JavaScript, jQuery and AngularJS.
  • Basic theming skills like installing themes, creating sub-themes, and tweaking sub-themes with CSS and custom template files. They use some PHP in template files; in Drupal 8, Twig is used for templating.
  • The Drupal theme layer in depth. They should be able to take a design and turn it into a working implementation, for example a responsive layout.
  • Expert front-end developers also create "glue code" modules or functions in PHP that expose configuration options to site builders.
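
To make the Twig point concrete, here is a minimal sketch of a sub-theme template override, loosely modelled on the variables exposed by Drupal 8 core's node template (`label`, `content`, `attributes`); the theme name and exact markup are hypothetical and depend on the theme's preprocess functions:

```twig
{# mytheme/templates/node.html.twig — hypothetical sub-theme override. #}
<article{{ attributes.addClass('node') }}>
  {% if label %}
    <h2{{ title_attributes }}>{{ label }}</h2>
  {% endif %}
  <div{{ content_attributes.addClass('node__content') }}>
    {{ content }}
  </div>
</article>
```

Placed in a sub-theme's templates folder, a file like this takes precedence over the base theme's template after a cache rebuild.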

Drupal Backend Developer

A click on the front end is of no use if there is no functionality behind it. A backend developer writes the code that hooks the distinct sections together so that the application functions properly as a whole.

Also known as Drupal module developers, they are proficient coders who write a lot of PHP and other server-side code. Backend developers in Drupal are fully aware of basic site-building architectures and best practices. In addition, they are:

  • Well versed in creating and shipping new modules, and adequately equipped to customize and extend existing Drupal modules.
  • Involved in the advanced side of the theme layer, automated tests, consuming web services, automated deployment, etc.
  • Knowledgeable in HTML, CSS and JavaScript/jQuery, with a clear and in-depth understanding of back-end tools like PHP and MySQL.
  • For D8, familiar with the concepts related to architecture and planning, the development of custom modules, and D8 performance and security concerns.
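
As a concrete illustration of where custom module development starts in Drupal 8, a module is declared with an `.info.yml` file; the machine name and description below are hypothetical:

```yaml
# mymodule.info.yml — minimal, hypothetical skeleton for a custom module.
name: My Module
type: module
description: 'Starting point for a site-specific custom module.'
package: Custom
core: 8.x
```

With this file in place under modules/custom/mymodule, the module can be enabled, and custom PHP (hooks, plugins, services) is added alongside it.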

Drupal Architect

A Drupal architect understands the complete project architecture and sets the direction for the project. In this lead role, a Drupal architect performs backend development, various front-end tasks and theming. The following are must-have skills for a Drupal architect:

  • Strong understanding of front-end and back-end development tools and other web development aspects.
  • Well-versed with the optimization of Drupal.
  • Highly proficient in languages such as PHP, SQL, JQuery, and CSS.
  • Well versed in implementing tools like Varnish, GeoIP, Commerce, Ubercart, Solr, and CRM integrations, to name a few.

Drupal DevOps/Sysadmin Engineer

DevOps has been defined in a variety of ways: as a culture, a trend, a perspective, etc. A Drupal DevOps engineer handles the tasks of both software development and IT operations. They run the live stack and deploy Drupal websites from the development environment to the live server environment. Additionally, a DevOps engineer handles performance-related hurdles that might interrupt business operations or cause harm, for example by setting up Varnish, a CDN, and Memcache.

The following are skills of a Drupal sysadmin that every Drupal ecosystem requires:

  • Linux is a must for a Drupal DevOps engineer; that includes proficiency in managing Linux servers and expertise in Linux internals and how the kernel works.
  • Bash scripting and Continuous Integration (CI), to automate time-consuming and repetitive tasks in the application development process, like deployments to the server and database backups, restores and refreshes.
  • Hands-on experience with automation technologies such as Chef, Puppet and Ansible for configuration management and deployment.
  • The capability to perform multifaceted roles, such as Site Reliability Engineer (SRE), Build Engineer, Systems Operations Engineer and Database Administrator (DBA).
  • A solid understanding of Infrastructure as Code (IaC) in order to manage networks, virtual machines, load balancers and connection topology in a descriptive, source-controlled model.
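
The Bash-scripting point above can be sketched with a small, hypothetical example: rotating site database dumps so only the newest N are kept. The directory layout, file names, and the dump step itself (e.g. a cron job running `drush sql:dump`) are assumptions for illustration.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of routine Drupal ops automation: keep only the
# newest $keep database dumps in a backup directory.

rotate_dumps() {
  local dir="${1:-backups}" keep="${2:-5}"
  # List dumps newest-first, then delete everything after the first $keep.
  ls -1t "$dir"/*.sql.gz 2>/dev/null | tail -n +"$((keep + 1))" | while read -r old; do
    rm -f -- "$old"
  done
}
```

Called as `rotate_dumps /var/backups 7`, for example, right after each nightly dump completes.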

Drupal QA Engineer

A Drupal QA engineer thinks like an end user while having the skills of a developer. This role ensures the quality of product deliveries. QA engineers run manual as well as automated tests to meet quality thresholds.

To ensure quality delivery of projects, a Drupal QA engineer develops corrective action programs as part of the Quality Assurance process. The following are must-have skills for a Drupal QA engineer:

  • Sound understanding of the product or industry-specific requirements.
  • Experience in testing web technologies. 
  • Well versed with Drupal 7 and higher.
  • Strong command of web languages such as HTML, CSS and JavaScript.
  • Ability to document test cases, capture test result details, set up an automated test environment, etc.

Drupal Project Manager/Scrum Master

Also known as the Scrum Master, this role ensures agile practices throughout the project. They manage and run scrum teams and take responsibility for daily progress to meet project delivery timelines. A quality project manager acts as a central node between the client and the team while ensuring transparency for both ends. The following are must-have skills of a Drupal project manager:

  • Skilled in the client-servicing domain, with sufficient technical expertise to regulate the team's workload.
  • Capable of forecasting potential risks and adjusting the project plan accordingly.
  • Well versed with content strategy, implementation, and other existing and emerging technologies in order to integrate them with the Drupal CMS.
  • Knowledge of SEO and reporting tools like Google Analytics to check how the content is performing across the web.

Drupal Designer

A Drupal designer drives the user experience (UX) and user interface (UI) design in order to create the best experience for end users. They know what the technology stack is capable of, so they can deliver on design requirements and win over stakeholders before development kicks off. The following are must-have skills of a Drupal designer:

  • Knowledge of the capabilities of Twig, which is imperative for current and upcoming Drupal versions.
  • Knowledge of HTML, CSS and Javascript.
  • A clear understanding of the basics of theme creation and site-building.

Drupal Product Owner

Most of the time, product owners are the clients who have the final sign-off on all project changes, but they can also come from the Drupal development team. A product owner (PO) comes up with the requirements of a project and has extensive experience in various industry domains. They work in close coordination with the project managers to prioritize the backlog. The following are must-have skills of a Drupal product owner:

  • They should be capable of seeing how things integrate and work together in order to decide the future direction and usability of the project. A clear vision and commitment from the product owner sets a strong base for a Drupal project.
  • Also acting as organizers of the project, they should have excellent communication skills to get their message across the application development teams.
  • Excellent reporting and record-keeping capabilities to measure the current state of the project.
  • With excellent decision-making skills and the ability to manage business feedback, product owners should be capable of driving projects to successful completion.

Content Marketer 

How to market content so that it delivers the maximum output is the major concern of a content marketer. They own the complete content publishing process and ensure that the content follows the latest search engine optimization (SEO) and search engine marketing (SEM) practices. The following are must-have skills for a content marketer:

  • Well versed with the latest Drupal versions.
  • Knowledge of administration functions and the ability to perform changes that don't require any code changes in the project.

So we have seen that, as with other web development life cycles, projects developed under the Drupal roof require a range of roles that streamline the process of building and supporting Drupal websites and applications.

Stay tuned for more!


Drupal has brought a major paradigm shift by becoming a leading content management system for enterprise-level organizations. A successful Drupal website is powered by a range of different roles with substantial knowledge and skills on the platform.

Want to join the Drupal league? There are seemingly unlimited opportunities for anyone who is passionate about Drupal. With over 15 years of experience in the Drupal community, we at OpenSense Labs are a growth-bound team of architects, developers, designers, themers and more.

We love contributing to resolving community hurdles and helping escalate the potential of Drupal as an enterprise content management system. Let's talk about your enterprise needs at [email protected]

Or you can connect with us on our social media channels: Facebook, LinkedIn, and Twitter.

Sep 27 2019

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

For our latest interview, we chatted with Cristina Chumillas, front-end developer at Lullabot, designer, and one of the coordinators of Drupal's Admin UI and JavaScript Modernization initiative. Give it a read to learn more about Cristina, the supportive and welcoming attitude of her colleagues at Lullabot, and her work on modernizing Drupal's administration UI.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I am both a designer and a front-end developer, and in the Drupal community right now I'm a core usability maintainer and co-organizer of the Drupal Admin UI and JavaScript Modernization initiative. Apart from that, I'm helping out in the local community here in Barcelona; professionally, I work for a Drupal agency as a front-end developer, and then on the admin UI I'm mainly helping out as a designer and managing a little bit.

I actually just started a job at Lullabot about three weeks ago. I previously worked at Ymbra, one of the oldest Drupal companies here in Spain, but now I’ve moved over to Lullabot. I’m really enjoying the kinds of projects and learning how we do everything. I’d been at the same company for 5 years, so any change that I wanted to see I had to do by myself. 

So, right now, it’s really great seeing how other teams get organized; Lullabot is a distributed company, so it’s great to see how they’re super used to these kinds of situations where you don’t really get to get in real contact with people. They have a lot of alternatives to make you feel welcome and to help you get to know other people on the team. 

I have to say that before joining Lullabot I already knew some members of the team, and I knew that they’re very nice people. But now that I started I have to say that most of the people there really take into account that you’re a little bit lost when you’re starting out, so everybody’s super nice. They know how you feel and how to act because they have been in the same situation before. 

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I started doing Drupal because I’d been working for a design agency. When I started as a freelancer, I wanted to do my own things - I wanted to do websites as well, and if I was doing the design I had to know how to actually make the website, and as a freelancer I had to do everything by myself.

So I learned how to make websites with Drupal and then after a while I got in contact with the local community; during that time they were organizing Drupal DevDays in Barcelona, it was 2012 I think. It was at that point that I got in contact with the community; I helped organize the Drupal DevDays just a few months after getting to know people from the community.

And that’s why I stayed, I really liked the community and I just kept moving forward, helping with more things and, after some years, ending up at the company I mentioned before, Ymbra. After that I got in contact with the international community and, thanks to that, I ended up at Lullabot. So, getting in contact with the community has helped me a lot on the professional side.

3. What impact has Drupal made on you? Is there a particular moment you remember?

One of the people with whom I worked most closely during the aforementioned Drupal DevDays was Pedro Cambra; he was actually one of the people that put me in contact with the Drupal Association when DrupalCon Barcelona was going to happen. They asked him to be the local contact for the community, but since he was moving to - I think - London at that time, he put me in contact with them. 

Thanks to that, I helped first Stephanie, then later Amanda to come up with some things around Barcelona, helping them find locals and places to have parties, these kinds of things - essentially helping with the organization. I would say that DrupalCon Barcelona is one of the happier moments or one of the moments that I remember the most, because Pedro came and also helped during the ‘Con, and after a full year of working with Amanda I finally got to know her. 

Fun fact here: I was talking with her in English and my English at that time was really bad. Before getting to know her in person, I was growing nervous, thinking “Oh my God, this is going to be the moment that I have to speak in English”, but when I got to meet her, she said “Hola Cristina!” - she was speaking in Spanish! At that moment I realized I had been talking in English for a full year with someone that I could have otherwise understood perfectly. So, in a way, she totally helped me take my English to the next level.

4. How do you explain what Drupal is to other, non-Drupal people?

It depends if they’re technical people or if they aren’t in the tech industry. If they are, I just say “an open source CMS” and that’s all. When they don’t know what I’m talking about I usually say “just like WordPress but on a different level”. 

If I’m talking with someone that has no idea about that, I usually say “I make websites, but not the websites for the bar around the corner, bigger websites”, I don’t try to explain more than that. Because you can see websites that take a team of, let’s say, 5 or 10 people working full-time one or two years to complete, and then you have the small websites, e.g. for a small local business. Both are websites - how do you explain that difference to people who aren’t in tech?

5. How did you see Drupal evolving over the years? What do you think the future will bring?

Looking back to when I started out, I would say Drupal has evolved into something more professional, more high-level or more enterprise (that’s the word!). I actually wouldn’t be able to start with Drupal if I had to take the same path right now. So that’s actually one of the big differences today, the way people start with Drupal, it’s not like freelancing anymore.

And about Drupal’s future, I think just like everything is different today than it was 8, 10 years ago in the website industry, we have a lot of different levels right now that we didn’t have some years ago. 

I see the future of Drupal having to choose which of these next levels we’re going to focus on, because we’re seeing a lot of new technologies and trends; a lot of projects are decoupled right now, the internet of things is something that’s going to be here in no time, and a lot of people expect to have the content everywhere.

So, Drupal will need to put itself in a place that can actually give access to that content everywhere; where exactly is going to be Drupal’s place in this situation, I don’t know. But that’s the need that we’re going to have in the future, so Drupal will have to quickly evolve to make that possible.

I think there are a lot of smart people working towards these features, these needs, e.g. everybody working on the API initiative and other related initiatives. There are a lot of smart people that know how to do these things and I’m pretty sure that if there are such people investing their time, it can happen. It’s just that if we forget about pushing Drupal forward in order to solve these needs, it’s going to be risky as Drupal may start stagnating.

6. What are some of the contributions to open source code or to the community that you are most proud of?

I’m really happy with the Admin UI initiative and, although I’m not doing everything, I’m helping others get involved. So, everything I’ve done there, from designing to actually helping others get involved and contribute by themselves, as well as all the UX studies that happened there, where I mostly managed rather than did all the work.

I would say getting so many diverse people helping on the Admin UI is something I’m really proud of and happy with. Because most of my time working on the Admin UI is not dedicated to actually getting things done, but to helping others get things done, so I’m happy about that.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

The Admin UI, of course. The admin UI is actually something that I think is really important for Drupal, I think the current admin UI was great at the moment it started, but it’s been many years since then and it actually needs a refresh. I think a lot of people, especially end-users, are expecting that, so I think it really can have a huge impact.

And it’s also a kind of contribution that can be done by people who aren’t specifically back-end developers, but also front-end developers, designers, project managers … We’ve actually even had some users that were content creators helping with the tests that we did at the beginning.

So, we’ve also had a diverse group of people doing user tests, e.g. people from the usability perspective; I think the project has so many professionals involved and so many skills needed that almost everyone that wants to help is welcome. If you’re interested, you can join the #admin-ui channel on Slack, that’s the place where everything is organized. 

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

I like to get involved with other communities and I try to help others. E.g. I'm trying to organize things for DrupalCon, but when I can, I also get involved in local initiatives that help others improve. For example, I'm trying to organize an event to promote open source among women in Barcelona. These kinds of things, where I can use my skills to help others by getting involved and organizing, are something I really like.

Sep 27 2019
Sep 27

Recording my experiences of DrupalCamp Pune before they fade away. If you are connected with me on Twitter, you must have seen a spike in my tweets over the weekend of 14th-15th September 2019.

I was privileged to attend this 2-day event, and my experience of co-presenting a workshop, attending several amazing sessions, and meeting old friends and new was great. I had the chance to meet a lot of people from the Drupal community who were earlier familiar to me only via their usernames. The diversity of the sessions was really impressive: #DCP19 had sessions for back-end devs, front-end devs, quality analysts, managers, students, the community, etc., ranging from beginner to expert levels.

Having co-organized a Drupal event earlier, I knew how important it was to draw an audience on the day of the event. Attendance exceeded expectations on both days, which was a good sign for the organizers.

| Keynote

Undoubtedly, the star of the event was none other than Mr. Preston So. It was great to interact with him. I had initially expected his keynote to be around Gatsby; instead, his topic was a broader one: he highlighted the transition from content management systems to the content management stack.

DCP19 Keynote Preston So (Gatsby)


He also showed how modern applications are being developed and the role of Drupal and Gatsby in them. His keynote sparked a thought in my head about how applications can be developed and what the way forward is. I would like to share a couple of non-technical highlights of his keynote:

  • Preston started his keynote in Hindi and everyone in the auditorium was in awe. He truly is a master of languages.
  • He gave references to the Bollywood movie “Sui Dhaaga” for explaining the challenges developers face in our day-to-day lives.
  • He gave away 2 copies of his book “Decoupled Drupal in practice”.
  • Preston also shared his love for India, especially Mumbai.

Post the Keynote, Preston was surrounded by people and he was busy answering dozens of questions (I was a part of that group). Questions ranged from technical aspects of Drupal, Gatsby, to him learning so many languages, etc.

| Drupal India Association

DCP19 Drupal India Association (DIA)


The Drupal India Association board members addressed the audience and showcased the brand new DIA logo designed by QED42's design team! For more updates around DIA, follow their Twitter handle - @india_drupal

| Drupal in a Day

A massive part of my role at #DCP19 was to co-present a 5-hour “Drupal in a Day” workshop for students. I co-presented with Nitesh Sethia and Meena Bisht, introducing students who hadn't heard of Drupal to concepts like open source, Drupal, the community, etc. Students gained hands-on experience with Drupal through:

  • Familiarization with Drupal concepts
  • Installing all prerequisites and Drupal itself
  • Introduction to the basic building blocks of Drupal like Content Types, Fields, Blocks, Menus, Views, etc.

We also spoke about the Drupal Campus Ambassador Programme which aims to bridge the gap between students and the Industry.

DCP19 Drupal in a Day


One of my favourite moments from the workshop was the attendees' reactions when they witnessed the power of Views. They were amazed at how Views can be used to fetch the data we want from the database and display it according to our needs. The responses and the students' eagerness to learn more were a really satisfying experience.

| Sponsors

Sponsors are one of the building blocks in making DrupalCamps successful! This year we had 6 sponsors. 

DCP19 QED42 Platinum Sponsors


QED42 was the platinum sponsor for DrupalCamp Pune 2019. We were not only a sponsor but also an organizer of the event. QED42's booth, vibrant standees, quizzes around Drupal, JavaScript, and machine learning, and a hackathon appealed to the students and event attendees. We also carried out an internship drive for students. QED42 is known in the Drupal community for its designs and goodies; this year we had T-shirts, stickers, notepads, and designed quiz cards as giveaways.

| After-Party!

As a tiring day one drew to a close, we received an update regarding the after-party from the @drupalcamppune Twitter handle!

The after-party was one of the memorable moments of #DCP19, where I had numerous great conversations. I met a lot of people informally and got to know the jolly side of their lives. I was so engaged in the conversations that I totally missed the dance floor. We reminisced about past Drupal events, discussed the current one, and talked about future events too. I'm sharing a few snaps from the party at the end of this blog.

Since I was caught up with the “Drupal in a Day” workshop on the first day, I missed most of the sessions presented that day. You can find out more about the sessions here - http://camp2019.drupalpune.com/accepted-sessions. However, I was lucky to attend sessions on the second day. Here are some sessions I loved:

1. “Multi-turn conversations with Alexa” — Anand Toshniwal

DCP19 - Alexa Multi-turn session Anand Toshniwal


The demo amazed the audience and received loud applause. Anand had set up a Drupal e-commerce store and showcased how he could place an order with Alexa via a multi-turn dialog. PS: Reach out to me for the recorded video of the demo!

2. “Pixel Perfect Web” — Kiran Kadam

DCP19 Pixel perfect web session - Kiran Kadam QED42


In a room filled with front-end enthusiasts, Kiran Kadam spoke passionately about what the pixel-perfect web is and how to achieve it.

3. “Effective storytelling with Clients and Teams” — Nikhil Anant

DCP19 Effective Story telling with clients and teams - Nikhil Anant


Nikhil shared his experience of visiting Manali and the challenges it brought with it, describing how things can be explained in the form of stories for effective team communication.

4. “Making Front-end Testing Easier using Visual Regression” — Ambuj Gupta and Kanchan Patil

DCP19 Visual Regression session


Automation is my favorite part of the Quality Assurance process, and these guys took it to the next level.

5. “Good UX = Accessible UI Design” — Nikita Aswani and Asmita Wagh

DCP19 Good UX = Accessible UI Design QED42


The best thing about the session was that not only QAs but also developers were equally interested in implementing A11Y and considered it an inseparable part of their web development practices.

| DrupalCamp Pune Closing Session

Overall, it was a great event put on by the organizers of #DCP19. The closing session was hosted by Sushyl and Ajit, where we acknowledged the organizing team's efforts and thanked them for making DrupalCamp Pune a huge success. For everything from the swag kits, keynote, sessions, speakers, venue, food, and after-party to countless other details, the organizers deserve a huge round of applause.

DCP19 Closing Session

Next year, I am looking forward to being a part of the organizing team and experiencing the excitement of planning DrupalCamp Pune!

I have collected some pictures from the event and would like to share them with you. 

DCP19 Memories


| Conclusion

I really appreciate and thank you for taking the time to read this post. I hope we cross paths at the next Drupal event. #DrupalThanks

Sep 27 2019
Sep 27

First, we need to add our Gatsby site to Gatsby Cloud. To do this, go to the root directory of your Gatsby project and use git to commit all your recent changes. You will need to push your git repository to GitHub, so make sure you have a repo created in your GitHub account for this Gatsby project (the repository can be private or public).

cd [gatsby-project-folder]
git add .
git commit -m "Adding recent changes"
git push

Next, go to gatsbyjs.com and click on the Gatsby Preview link to get started. Click the 14-day free trial and sign in with your GitHub account.

Once you are logged in, click the purple Create Site button.

Gatsby Create Site Button

Select the option to Add my own site.

Gatsby Add Site

Select the correct GitHub repository from the list and click the Next button. You don't need to add any integrations, so click Next. Note the preview URL, as we will need it on our Drupal site.

Now it’s time to open up your Drupal site. We need to download and install the Gatsby Drupal module.

composer require drupal/gatsby

Once the module is installed, go to the Gatsby configuration page which is located under Admin > Configuration > System > Gatsby Live Preview settings.

Add the preview URL from your Gatsby Cloud account and click save. That’s it! You can now start editing content on your site and it will be sent to the Gatsby Live Preview server.

If it is not working, double-check that you are passing only the id into your page templates on your Gatsby site and that you are building your pages by pulling all the data from a GraphQL page query in your page component. Go through the previous lessons for more information on how to correctly set up your Gatsby site to work with Live Preview.

Sep 27 2019
Sep 27

The Layout Builder is one of the most exciting new features in Drupal 8. It's a site building tool that makes it easier to configure how your content is displayed in Drupal. You can use a drag-and-drop interface to combine fields, nodes, and other content, and actually control the layout used to contain that content.

You can also use it to build landing pages from the ground up: creating custom content blocks and placing them where you want in a layout. I was curious about how content editors would react to the Layout Builder interface, and if they would be able to easily build a landing page in this way. I did a short user test at DrupalCon Seattle and the test subject (an experienced Drupal content editor with a lot of patience) had a hard time figuring out where to start.

That's how this comparative study came about. The goal was to see how content editors use the Layout Builder in the context of creating landing pages. My colleague Annika Oeser created a script and conducted the user testing, my colleagues Michiel Huiskens and Jigar Mehta set up the configuration in Drupal, and Sean Conner at Charles Schwab helped us recruit volunteers for the study.

A lot of work has gone into the Layout Builder already, and the user interface is undergoing constant improvement. This study specifically addresses the use case of content editors creating landing pages using the Layout Builder.

Layout builder demo
Using the Layout Builder to add a custom block

The Setup

To organize the study, we created a mockup of a simple landing page design. Our main instruction was open-ended: we asked participants to create the landing page following the design we provided, and then to move some of the content to the top of the page.

We had all the study participants do the task using Drupal with the Layout Builder and, as time allowed, also tested how they used WordPress with Gutenberg and Drupal with Paragraphs to give us some benchmarking.

We created three demo sites:

  • Drupal with the Layout Builder: we configured a landing page content type that has no fields, and the Layout Builder enabled on a per-node basis. The site includes block types to model the content components that appear on the landing page: text, image, call to action.
  • Drupal with Paragraphs: we configured a landing page content type and Paragraph types for the content components, as well as nested Paragraph types like "2-column wrapper" to allow the content editors to build the layout.
  • WordPress Gutenberg: no custom configuration.

Landing Page Mockup
The design for the sample landing page

First Impressions

As one participant said, "the biggest question is: 'Where do I create content?'"

Although at first, many participants asked themselves what the difference between a Block and a Section is, they were all able to quickly figure out the model of adding Sections. And they found that selecting the layout for a Section was easy.

Add blocks
Interface for adding sections and blocks

Adding Blocks

Clicking the "Add block" link was obvious to all the participants, and once they found the "Add custom block" link, they had no trouble using this to populate their layout with content. However, along the way, they found a few aspects of the UI confusing:

  • All the participants observed that "When you go to add [a block], it's confusing to have all these options." The "Add custom block" link gets lost, even though it's at the top of the list.
  • Once the user selects "Add custom block", they can guess which block type to use, but it would be nice to have a way to explain the difference between the types. Block type names like "Text", "Call to Action", or "Basic Block" are abstract and hard to differentiate.
  • After adding several custom blocks through the Layout Builder, one user looked for a "Block Library", because he wanted to reuse one of the blocks he had just created.

List of options when adding a block
The list of available block types when adding a block through the Layout Builder.

Editing Blocks

The most common complaint we heard about the block editing interface was about the word "Configure" when editing the content of a block. Content editors look for the word "Edit".

Another thing that content editors found confusing was the "Display title" checkbox next to the title field. Many participants asked "What is [the title] used for if it's not displayed?" In the case of adding custom blocks through the Layout Builder, it seems like the content editor shouldn't have to make this decision. And it would be nice if there was a clear way to indicate to the user what the purpose of this field is if it's not displayed.

Other feedback included:

  • When editing a block, there's no "Cancel" button, only an "Update" button.
  • "When I [double-]click on the content of a block, I feel like it should go into edit mode, like MailChimp."
  • Using this method of having custom block types to construct a landing page, the onus is still on the site builder to configure fields that are well labelled and easy for content editors to populate. So we heard feedback like "I would like the default [text format] to be Full HTML."

Interface for editing a custom block
Interface for editing a custom block through the Layout Builder

Editing the Layout and Sections

Learning how to use the Layout Builder involves learning new terminology, and how to manipulate the Blocks and Sections. We heard several observations about this experience:

  • One piece of feedback we heard many times was that the links to "Add section" and "Add block" should look more like buttons. This could help because, when participants tried to drag and drop blocks on the page, they tried to move blocks into the "Add section" areas, which look like part of the layout.
  • One user noticed that the "Add section" links "interfered" with her layout. Another user said "'Add section' feels intuitively like a place I should be able to put something."
  • Once a Section is created, it's hard to tell that it's a section, which can add to the initial confusion about the difference between a Block and a Section.
  • Also, when trying to move content from the bottom to the top of a layout, one participant said "It looks like the sections are movable. But I don't know how to select an entire section."

Findings About the Overall UI

Configuring the permissions for content editors to limit what they can do will be key to making the overall interface less distracting and easier to use. Some specific observations about the overall UI:

  • One participant clicked on the "Edit the template for all Landing Page content items instead" link. The interfaces are so similar that it wasn't clear to her what had happened and she continued editing as if she were editing a single landing page node.
  • Having the publishing status more visible on the "Layout" page would be helpful.
  • Having the "Save" link at the bottom of the "Edit" page, and the "Save Layout" link at the top of the "Layout" page seemed disorienting.
  • The fact that your default Layout can't be empty means that you have to have one block in the layout when the content editor first clicks on the "Layout" tab. This block prompted some questions and mild confusion from the content editors.

Comparison with Paragraphs and WordPress Gutenberg

When trying to create the same landing page layout with Paragraphs, participants found:

  • The nested-Paragraphs interface we provided for creating the two-column layout was more confusing and less flexible than the Layout Builder.
  • The Paragraphs interface is more familiar for someone who is used to working with Drupal fields. Using Paragraphs was faster for creating and editing content.
  • One participant observed that "Paragraphs works well if you have simple content, but once the content and layout is complex, then it gets bloated. I would be curious to see how the Layout Builder handles complex content like that."

Comparing WordPress Gutenberg and the Layout Builder:

  • Participants observed that the two interfaces offer similar features and work in a similar way.
  • With Gutenberg, some of the styling options are hidden, in order to make the interface more sleek, and this can make it harder to find content editing options.
  • Gutenberg provides the flexibility of adding a wide variety of types of content to a landing page, while the Layout Builder allows (and requires) the site builder to pre-define the set of block types that can be added.

What Did We Learn?

One of the most interesting things we learned in the study was the workflow that content editors use. One said "I would like to be able to preview my layout before I start adding content to it. Just like a blank template [that I can send as a preview to my colleagues]." I noticed that some participants created the landing page in two rounds: first adding the content, and then doing another round of work to try and get it styled correctly by using the WYSIWYG and changing block types.

By the end of the testing sessions, all the participants were able to easily add/edit blocks. But getting used to the layout tools and figuring out where to go to add custom blocks in the first place was difficult for all of them. I know that controlling the list of available blocks is on the roadmap for the Layout Builder, and I think this will help immensely.

Although all the editors were able to figure out how to use the "Layout" tab, orienting the whole content editing process around the "Layout" tab would be helpful for editors. As one participant observed: "My habit is to go to the "Edit" tab, but all the useful things are in the "Layout" tab."

Terminology is hard to get right, and even harder to change. I think it's hard because what we call things changes depending on what role we play. One very observant participant said: "the word 'Block' is throwing me. To me, it should be content. When I have my content editor hat on, I'm looking for a link to add content." Likewise, content editors look for the word "Edit" instead of "Configure".

I hope these findings are useful for understanding how content editors think, and will be helpful for improving the UI of the Layout Builder for this use case. I also hope that site builders and developers can use this input to create better configuration and documentation as we start to use the Layout Builder on our projects. As one content editor exclaimed at the end of the testing session: "I'm excited about this new feature!"

Sep 26 2019
Sep 26

Rich text editors are an integral part of content creation and content management workflows, but they can often present challenges for developers when it comes to robustness, extensibility, flexibility, and accessibility. What are some of the considerations you should keep in mind when evaluating rich text editors, especially for mission-critical systems like the application Tag1 is building for a top Fortune 50 company?

In this Tag1 Team Talk, we explore the new generation of rich text editors, which are based on a well-defined data structure rather than HTML, but can still export to Markdown or HTML. This allows us to tackle new requirements organizations have, including video embedding, cross-device support, and keyboard-navigable editors. After diving into some of the open-source solutions available in the market, such as Draft.js, CKEditor 5, Quill, Slate, and Tiptap, join moderator Preston So (Contributing Editor) and guests Nik Graf (Senior Software Engineer), Kevin Jahns (Real-time Collaboration Systems Lead, Yjs creator), Fabian Franz (Senior Technical Architect and Performance Lead), and Michael Meyers (Managing Director) for an in-depth conversation about why ProseMirror is the best tool for our client's project requirements.

Be sure to check out our related #TagTeamTalk, A Deep Dive Into Real Time Collaborative Editing solutions (e.g., Yjs, Collab, CKSource, etc.)

Further reading

ProseMirror
CZI ProseMirror: https://github.com/chanzuckerberg/czi-prosemirror/
ProseMirror Tables Demo: http://cdn.summitlearning.org/assets/czi_prosemirror_0_0_1_b_index.html
ProseMirror Atlaskit Yjs Demo: https://yjs-demos.now.sh/prosemirror-atlaskit/

CKEditor 5
CKEditor 5 Demo: https://ckeditor.com/ckeditor-5/demo/

Quill
Quill.js Demo: https://quilljs.com/standalone/full/

Slate
Slate Demo: https://www.slatejs.org/#/rich-text

Fidus Writer


Text Transcript

Preston So: - Hello, and welcome to the second ever episode of the Tag Team Talks. Today we're gonna be talking about rich text editors and some of the solutions that are out there in this very exciting and growing space. First thing I want to do, though, is get a little bit of a look at our guests today. My name is Preston So. I am the moderator and contributing editor to Tag1 Consulting. And I'm joined today by several amazing folks from all around the world here to talk about rich text editing.

Michael Meyers: - Awesome. My name is Michael Meyers. I'm the managing director at Tag1. I handle business development, sales, partnerships, marketing, strategy, client relations, things of that nature.

Kevin Jahns: - Hi, I'm Kevin Jahns. I'm located in Berlin, and I'm an expert in shared editing and CRDTs. I currently work for Tag1 Consulting on a realtime system.

Nik Graf: - Hey, I'm Nick. I've done a lot of frontend development over the last couple years, and also was digging into Draft.js, actually built a plugin system on top of Draft.js. And now doing a lot of work on the same project as Kevin, the realtime collaboration stuff with ProseMirror.

Fabian Franz: - Hi, my name is Fabian. At Tag1 I'm currently a senior technical architect and performance lead, but on this project I'm especially excited about bridging the gap for the editors. I'm a Drupal enthusiast and a Drupal 7 core maintainer, but also a longtime Drupal 8 contributor, where we're also having this switchover from CKEditor 4 to maybe CKEditor 5, going to the next generation. So it's really exciting to be working on a project where we're exploring all of that.

Preston: - Thanks very much to all of our guests. It's a real pleasure to be here with all of you today. This is a very meaty topic. We're gonna be talking for quite some time about this, I'm sure. But first, I just want to say good morning, good afternoon, good evening, to wherever you are in the world. And if you haven't already checked it out there's actually a previous webinar that we've done related to this topic on collaborative editing. It's about how it relates to the ways in which people work today. And I want to make sure that we refer back to that webinar so please take a look at the link also available on this page. Alrighty, so let's go ahead and get a quick background on Tag1. Why are we interested in rich text editing, Mike?

Michael: - So, Tag1, we handle mission critical systems, emergency management. We've helped organizations like the American Civil Liberties Union go from raising $4 million a year in donations to over $120 million a year after President Trump in the U.S. came into power. So we do a lot of performance and scalability. We do high availability. We work with a lot of Fortune 500 companies like Symantec doing cybersecurity, infrastructure management. For this particular project that we're gonna be talking about today we're working with one of the top 10 Fortune 50 companies. They are rebuilding their intranet. It's a highly available, highly scalable, mission critical system used across 200 countries with over 20,000 active users in well over a dozen languages. Realtime collaboration is key to how the modern workforce operates. I spend a lot of my time in things like Google Docs collaborating with the team on all sorts of things. And while our goal with this intranet is to integrate a lot of different systems and not reinvent the wheel, so for example, you'll get a summary of what's going on in Slack on the intranet but all that information comes from Slack and the idea is just to link you off to Slack. These days, people use a lot of third-party tools for what they do best. The challenge with that is that they are disparate systems. And so if you have Box, and Slack, and Quip, and all these other things, it's hard to know what's where. So this system really organizes all of that with centralized authentication and user management so you can, say, create a space for a particular group and it will spin up all of the necessary artifacts we need, from say Slack to Quip, manage permissions. You can use any of these systems independently but everything is sort of synced, meta searched, and managed across this centralized system. And then a key component of this system itself is collaborative editing. 
And they have, as you can imagine with a global workforce of 150,000+ employees, a lot of people with different use cases and needs. And so, some people, let's say technical people, love Markdown and want to work in one type of editor. People in other groups and departments might prefer WYSIWYG. Some people want to be able to edit HTML directly. And so, the reason that we're looking at editors on top of the ability to do realtime collaboration and work together on information in realtime is that we need to accommodate a lot of features, plugins, enhancements, and different users in different spaces. And so we took an assessment of a wide range of editors in the marketplace, did an analysis based on our feature requirements, narrowed it down to the field that we're gonna talk about today, and ultimately selected an editor.

Preston: - I think this landscape is quite broad. There are so many options out there, and it's very difficult to choose which ones are appropriate, especially given that there are so many requirements that people have today. And being able to actually choose based on a variety of different features, which we'll talk about in just a little bit, is a huge prerogative. I mean, there's two areas that you just mentioned, Mike, that are very interesting. And the first is the realtime collaboration which has its own challenges and its own difficulties. Which was the subject, by the way, of our inaugural Tag Team Talk. And of course our second topic today, which is really what a rich text editor is. And combining those two really unleashes a lot of power for these content editors, but it is also very challenging from a technical standpoint. But let's go down to the very, very basics here, and the fundamentals. Sort of in its most basic sense, how would we, as a group, define a rich text editor?

Kevin: - I think it's really, really hard to give a general description of a rich text editor. I think most people think about Google Docs when they hear that. But I would say that a really basic rich text editor is something that supports bold, italics, and headlines. That's it for me. Because often you really need that feature set. That's basically what you have in Markdown and what you want to have in all the other editors. Sometimes you write a blog post and you basically only need these features. For us developers it's really important to have code blocks too. I think that's a really important feature. But I don't think everyone agrees here. There are links and tables. Actually, a lot of people expect tables, but not all editors support them. So for me, a rich text editor is something that supports all of this, as opposed to plain text editors that only support working on text, maybe only paragraphs, with no rich text formatting.

Preston: - Was there a certain minimum, like a threshold that you wanted to reach in terms of the number of features? I know that you all have done a really strong comparison of all of the features available. Was there a certain point where you said, okay, well, we can put a dividing line here where we can say, all right, everything above here we can definitely look for, but everything below this line perhaps maybe we should strike out of our requirements?

Kevin: - I think a baseline for this project, yeah, we had a baseline, a feature set that we want to implement. And for our use case it was really important that our editor is adaptable. And this is not a requirement for all the projects that are out there. Sometimes you really just want to have a plug-in editor that just works and does the basic stuff right. But for us, we wanted to do some custom stuff, and some editors support that, and some not as well.


Nik: - I could dive in here and give one example that Kevin mentioned: tables. I worked a lot with Draft.js in the past, and I know you can do tables; it's possible. But if you want to do more than just a simple text field and have rich content, again, inside a table cell, this is really, really hard to do with Draft.js. So people came up with ideas like one Draft.js editor per field in the table. And then this gets really, really heavy, because it all has to run in the web browser. Other editors support this because the internal structure, how the data is managed, is completely different. Basically, depending on what your needs are, it completely rules out certain editors right away.

Fabian: - Yeah, that's also what I found in my research of editors. Tables are really tricky; with an image in a table, every normal person is like, hey, that's so easy, it should just work. I've also seen, for two other editors, either Slate or Quill, that the table plugin basically instantiates another complete editor within the cell and then does some magic to hide one toolbar and show the other, so that it's still a seamless experience. The basic features like bold and italic, they can all do; code and quotations are maybe a little bit more complicated, but basically, what you're used to from all the old editors, most can do. That's not a problem. But once you get into the nitty-gritty and really want features like autocomplete, where you type something and get a table or something like that (we don't have that yet, but it's so useful, so practical, and so nice), some editors make it way harder to implement than others.

Preston: - I think we can all agree that as it gets more and more complex you kind of question the usefulness of some of these, especially the inline tables or some of those formatting options.

Preston: - Well, I think we've talked a lot about formatting, and clearly formatting is of very, very strong interest to a lot of the content editors that we work with on a daily basis. But Mike mentioned something very interesting earlier about document formats and the interchangeability between them. That's also a very important feature of rich text editors, because you can do whatever editing you want, but if you can't extract it and move it into a different format, or make it usable for other systems, it doesn't make much sense. So I'm curious: do all of these editors support Markdown and HTML? Do they all support rich text? And my even more pertinent question here is how easy it is to switch between them in these editors, or is it possible at all?

Kevin: - I think it's important to distinguish the underlying document model of the editor, which is often JSON-based, especially in rich text editors, from how you interact with the editor. Most editors support some kind of positional parameters: insert something at position X, something like that. Because that's how most humans comprehend text editors, so we try to port that to rich text editors. There are some editors like ProseMirror that are more structured, so you really need to say, okay, I want to insert something in this paragraph, inside this table, at this position. But this is also translated to index parameters, because even ProseMirror, which is structured internally, accepts something like index parameters: insert something at position one, for example. And I really like that. In comparison, Quill.js has an internal model that is purely based on positional parameters. It accepts changes that are defined in the delta format, and I really love this data format because it's a really simple description of how to insert something into the editor. It's index-based, and it's also perfectly suited for collaboration. But something that is really hard when you only work with index parameters is handling tables. When you work with tables, something like ProseMirror, which is more structured, is really cool.
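To make the index-based idea concrete, here is a minimal sketch of what a Quill-style delta looks like and how it is applied. This is a simplified, plain-text-only illustration; the real quill-delta library also handles formatting attributes and embeds, and the `applyDelta` helper below is ours, not Quill's API.

```javascript
// A simplified, plain-text-only illustration of Quill's delta format.
// A delta is a list of index-based operations: retain (keep N chars),
// insert (add content here), delete (skip N chars).
function applyDelta(text, delta) {
  let result = '';
  let pos = 0; // cursor into the original text
  for (const op of delta.ops) {
    if (op.retain !== undefined) {        // keep the next N characters
      result += text.slice(pos, pos + op.retain);
      pos += op.retain;
    } else if (op.insert !== undefined) { // insert new content at the cursor
      result += op.insert;
    } else if (op.delete !== undefined) { // skip (remove) N characters
      pos += op.delete;
    }
  }
  return result + text.slice(pos); // the unchanged tail is implicitly retained
}

// "Insert ' rich' at position 5" expressed as a delta:
const delta = { ops: [{ retain: 5 }, { insert: ' rich' }] };
console.log(applyDelta('Hello text', delta)); // → "Hello rich text"
```

Because every change is a small, position-based description like this, deltas are easy to transmit and compose, which is part of why this format suits collaboration well.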

Fabian: - What Kevin said is very good and important, but it might have been a little too deep already for what our audience expects. So I would really like to step back and just show what this document model is. And we are at a very exciting point for me personally, because we are at a transition point. All of the old generation editors, CKEditor 4 and whatever else you have, some nicer, some not so nice, have all been built on something called contenteditable. This contenteditable was basically supplied by the browser. It allowed basic formatting, and every one of the trillion browsers out there, even browsers with the same name, implemented it differently. It was a huge headache. So all the editors said, no, no more contenteditable, we really don't want that anymore. The huge advantage of the old generation of editors is that you could throw some HTML at them, even HTML output from Word, and they could take it. It might not have looked nice, but they could take it, they could display it, you could even edit it. You threw some HTML at them and you got HTML back out. For something like Drupal, that's perfectly suited: you load the HTML from the database, the user edits the HTML, and it's saved back to the database. The new generation of editors, CKEditor 5, ProseMirror, Quill, all have some internal document model. And we are seeing that a lot in other areas of the web as well, that we're using these extra technologies, these languages that allow you to express the same thing that was in the HTML, but differently. And because they all have these internal document models, what you can do is, for example, take...
In theory at least, you can take the same document model and display it once as What You See Is What You Get, and once as Markdown, as long as you don't have something in it that Markdown doesn't support, and you can basically transfer it back and forth. Because the only thing that changes is the transformation from the document model to what the user is editing, and how you work on the document model. And that makes for really cool demos. We'll put in a link to a ProseMirror demo where you have the technical person who knows Markdown, all the commands off the top of their head, just typing Markdown, and you have a non-technical person who can collaborate on the same document, because they can just click on bold and it's bold, and they see it as What You See Is What You Get. And that's so cool about the new generation of editors. Later we'll talk a little bit about the challenges, but I think that was a good introduction.
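The "same model, two renderings" idea can be sketched in a few lines. The toy document below is loosely inspired by ProseMirror's JSON shape but is not its real schema or API; the two renderers are illustrative only.

```javascript
// A toy JSON document model (loosely inspired by ProseMirror's shape,
// not its real API), rendered two ways from the same data.
const doc = {
  type: 'doc',
  content: [
    { type: 'paragraph', content: [
      { type: 'text', text: 'Hello ' },
      { type: 'text', text: 'world', marks: ['strong'] },
    ]},
  ],
};

// Render the model as HTML for a WYSIWYG view.
function toHTML(node) {
  if (node.type === 'text') {
    return (node.marks || []).includes('strong')
      ? `<strong>${node.text}</strong>` : node.text;
  }
  const inner = (node.content || []).map(toHTML).join('');
  return node.type === 'paragraph' ? `<p>${inner}</p>` : inner;
}

// Render the very same model as Markdown for a technical user.
function toMarkdown(node) {
  if (node.type === 'text') {
    return (node.marks || []).includes('strong') ? `**${node.text}**` : node.text;
  }
  const inner = (node.content || []).map(toMarkdown).join('');
  return node.type === 'paragraph' ? inner + '\n' : inner;
}

console.log(toHTML(doc));     // → "<p>Hello <strong>world</strong></p>"
console.log(toMarkdown(doc)); // → "Hello **world**\n"
```

Only the transformation layer changes; the document model itself stays the single source of truth, which is what makes the collaborative "Markdown user plus WYSIWYG user" demo possible.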

Preston: - And just to add a little more context here: Fabian, when you talk about the ways in which we've evolved over time, long gone are the days when we had those phpBB formatting toolbars that were limited to, let's say, three or four buttons and never worked half the time. Nowadays we have this very advanced, almost abstract way of working that is really a layer above, where you're working with JSON as opposed to direct HTML or direct text. We're actually talking about an object tree, which is really amazing and very compelling. So let's move a little bit towards some of the business requirements here. I do want to talk about this Fortune 50 client that you mentioned. We know that all of these editors and powerful tools have the functionality to do these formatting changes, and have the abstraction layer as part of this new document model that we talked about. But there are differences in how each of these editors, like ProseMirror, Draft.js, Quill, manages the underlying data, and also in how customizable they are. Can we talk about some of the key requirements here? What were some of the major characteristics you all wanted to see come out of this project?

Michael: - Before we jump into the technical stuff, I think one of the key things, well, first of all, it had to be collaboration ready because we're integrating this with a realtime collaboration system. But beyond the extensibility that Kevin talked about, which is critical because their needs are constantly changing, we need to integrate it with a lot of different third-party tools and systems. We want to add things like @mentions that tie into central authentication. I'll let these guys dig into that. There were a couple of business requirements. One of them was, you know, prove it. We looked at some really interesting editors that are still in the earlier stages of development, and we could swap them in in the future. That is another aspect of extensibility. We may choose to change editors in the future, or give different users different editors. But for launch we need something that's proven. Something that is really stable, that has a robust open source community behind it that is continuing to develop it with maintainers that are really responsive. We wanted to make sure that it was being used in enterprise production by large organizations. So, ProseMirror, for example, is used by The New York Times. And they've written some great posts about it. They were generous enough to get on the phone and talk to us a lot about their experience to sort of confirm some of our research and thinking in real world scenarios. That was really critical just from a, before we could even evaluate these editors and dig into the features, there was sort of a minimum bar.


- Yeah, and what was also important from the proving standpoint: ProseMirror, for example, and we will come to that later. Almost everyone knows Confluence; many work with it. It's built on Atlaskit, and the Atlaskit editor itself is built on ProseMirror, so that was another plus point. The CZI, the Chan Zuckerberg Initiative nonprofit, is building a Google Docs-like clone based on ProseMirror. Also very interesting. So we had several things to just work with, to see and use. You use those demos and they just work. Tables look great, things work, so that was a huge plus point for ProseMirror: being proven by being used by other large organizations.

Nik: - Maybe I can add a word about Atlaskit. I mean, we'll dig in later. But as Fabian already mentioned, Confluence is built on Atlaskit, and not only Confluence. Basically, Atlaskit is the design system from Atlassian, and everything they're building at the moment, everything they're rebuilding and redesigning, is built on top of Atlaskit. So the Atlaskit core editor is built on top of ProseMirror within their design system. And this also gave us a good headstart in the beginning, in terms of, I don't know, showing off to the client. I mean, they had a different design and different widgets, but you could take a lot of that stuff, put a different design on top of it, and get a lot of these tools out there. So while it was not really a requirement, it was a really, really good way to impress early on. And because Atlassian has done a great job with Atlaskit, it gave us a good headstart. And accessibility, multiplatform, all of that is built in.

Preston: - Let's dig into some of these. Nik, you just mentioned multiplatform. This is a really interesting idea: you should be able to use a rich text editor on whatever device. On a phone, on a tablet, in an Electron app. Can you talk a little bit about how you thought about multiplatform and why it was so important to these folks?

Nik: - I think in general the world is becoming way more mobile. And while desktop is still the main use case for this intranet, people more and more often want to edit something on mobile. And while we currently don't target it, we wanted to pick a platform that we can later expand to. I can tell you that some editors have their fair share of troubles with mobile, simply because the environment is different and browsers behave differently, so the underlying document model sometimes already struggles. Mostly they work fine and it's a matter of your UX. But you basically want to pick something that definitely works on all platforms, so you can expand in all directions.

Preston: - And one thing you also just mentioned, Nik, that I wanted to call out is the notion of extensibility. You know, third-party integrations, being able to work with other tools. One thing Mike mentioned was being able to tie in @mentions and integrate that with the central authentication system. I also know there are other third-party tools you want to integrate with, and that you see as important for these rich text editors. Can you give some examples of those? Everyone in the group, feel free to jump in as well.

Nik: - Yeah, absolutely. Let's say you want to reference a Dropbox file, or a Box file, or you want to mention another user. These are custom nodes. If you have an editor that only supports standard HTML text and doesn't allow you to define your own nodes, then you can't do this. That's why this goes back to the document model. Basically, the document model of the editor has to be extensible so you can actually extend it with your own nodes, and then build a user interface to add these custom nodes. And however you want to implement them, you could just reference an ID to a GitHub issue, for example, and then load the data on demand, or you could actually put the data in the document. This ties into the authentication system, how you load the data, and so on. It's very dependent on the security needs and customer requirements. But in the end, the gist of it is that you want to be able to add a toolbar item that adds GitHub issues, connects them, and puts them in the rich text document. You could even do something like this with custom Markdown syntax, but this is where WYSIWYG usually outperforms Markdown and other systems by far, because the experience is so much better.
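A custom node of this kind can be sketched as plain data plus a renderer. This is a hypothetical illustration only: the `githubIssue` node type, its attributes, and the `renderGithubIssue` helper are our own names, not the schema API of ProseMirror or Atlaskit.

```javascript
// Hypothetical sketch of a custom "githubIssue" node in a JSON document
// model. The node stores only stable identifiers (repo and issue number);
// issue details like title and state could be fetched lazily on display.
const issueNode = { type: 'githubIssue', attrs: { repo: 'acme/app', number: 42 } };

// Render the custom node to display HTML using the stored attributes.
function renderGithubIssue(node) {
  const { repo, number } = node.attrs;
  return `<a class="gh-issue" href="https://github.com/${repo}/issues/${number}">` +
         `${repo}#${number}</a>`;
}

console.log(renderGithubIssue(issueNode));
// → '<a class="gh-issue" href="https://github.com/acme/app/issues/42">acme/app#42</a>'
```

Storing just the reference keeps the document small and lets the client decide, per the security requirements Nik mentions, when and how to resolve it into live data.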

Fabian: - For example, you hover over them and it shows you all the details of that GitHub issue, whether there's a pull request or not, et cetera. The possibilities are endless, obviously, and I think that's so cool about it. What's also really great about this kind of editor integration is that there are so many possibilities for extending it. For example, one thing we haven't talked much about yet, correct me if I'm wrong, is that we're building everything on React components. You just have your standard React component, for example an autocomplete, and you can put it in the editor. And another nice thing about ProseMirror is that how something is displayed inside the document and how it's represented outside the document can be different, and that's also an important part of accessibility, which we probably also want to talk about.

Preston: - Absolutely, yeah. Accessibility is a topic that's very near and dear to my heart personally. I know that, Mike, you just mentioned earlier as well that when it comes to a large Fortune 50 company like this one, being able to work with this very large workforce that has a variety of different abilities and a variety of different needs is important. We alluded earlier to some of the challenges around accessibility with rich text editors. We talked about things like contenteditable, the contenteditable attribute on DOM elements, ARIA labels. I know that we've looked at some of the capabilities and we've talked about some of the nice things that these editors have. Are there any that I've missed besides the contenteditable and some of the ARIA features?

Nik: - Kevin, you want to take this, or should I?

Kevin: - You do, please.

Nik: - In general, a lot of accessibility you get out of the box if you have a sane document structure in HTML. If your headlines are well structured and so on, it makes it easier for screen readers to parse the content and for people to jump around just by voice input. But if you actually build popups, dialogs, toggle buttons, once you get into the nitty gritty details and make your own custom ones, you really have to take care of accessibility yourself. If you look at all the standard toolbars and buttons that a lot of these editors provide, or come with, they have accessibility built in. And that's really good, because it shows that this is already a standard, that it's expected. But as soon as you start to build your own, like @mentions, or a GitHub plugin to reference pull requests, and you're doing your own popup or dialog, you really have to take care of it yourself. That is still a lot of work. We were fortunate that Atlaskit did a lot of good stuff out of the box. We've already gotten feedback that there are a couple of improvements we can make, but that's okay; the initial response was quite impressive. Maybe the gist of it is: even with these new editors, although they're using contenteditable, you can make them very accessible, but as soon as you do custom stuff, you have to take care of it yourself.

Preston: - Yeah, and you just mentioned this notion that all of the custom work you do might potentially challenge some of the accessibility. This is where that great flexibility, extensibility, and customizability comes with great responsibility as well. One of the things you mentioned was popups. For example, having that autocomplete widget show up with the @mention is very challenging. As somebody who likes to think through how I would build that accessibly, I actually don't know where I would start. That's a very challenging one.

Nik: - We very recently had a call with an accessibility expert to talk through that one. And there are things like... I've built a lot of React components in the past that focused on accessibility, but even I learned a lot in this call about concepts like live regions. You can have a live region in your document and then you can announce state changes. For example, one thing we learned that we're currently not doing yet, but definitely want to: if you have some text and you toggle it to be bold, you should announce the resulting state. Is it now bold, or is it not bold? Because by just hitting the toggle, if you listen to the screen reader, it will just tell you that you toggled the bold status. Like, okay, but which one is it now? This is very, very interesting. What I learned, basically, is: turn on the screen reader, dim your screen so it's black, and try to do all the actions that you usually do in this text editor just by navigating with your keyboard or with voice. If you can get through it, then you're in a pretty good state already. By doing this test and this call, and learning about all these things, we noticed a bunch of things that we're missing. But we're working on it. It's an interesting journey.
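The "announce the resulting state, not the action" idea can be sketched as a tiny helper. The function name and message wording below are our own illustration, not any editor's API; in a browser, the message would be written into an element marked `aria-live="polite"` so the screen reader reads it out.

```javascript
// Sketch of announcing a formatting toggle to screen readers.
// Announce the resulting state, not just "toggled bold" — otherwise the
// user doesn't know whether the text is now bold or not.
function formatAnnouncement(format, isActive) {
  return isActive ? `${format} on` : `${format} off`;
}

// In a real page (illustration only, not runnable in Node):
//   const region = document.getElementById('a11y-announcer'); // has aria-live="polite"
//   region.textContent = formatAnnouncement('bold', editorStateIsBold);

console.log(formatAnnouncement('bold', true));  // → "bold on"
console.log(formatAnnouncement('bold', false)); // → "bold off"
```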

Preston: - Well, I know what I'm gonna be doing this evening after I get off work. It sounds like actually a lot of fun. Like, playing the game Myst or something. I know that there are also some specific requirements that were more interesting. And I think that there are definitely some interesting implications that come about when you mix rich text editing with some of the other ways in which people like to work with components on the page. Like, maybe the most important component or most popular component right now, React components. How exactly have you thought about things like embedding React components or embedding videos? I know that you've thought about actually placing React views straight into these rich text editors. How has that worked out for you all?

Kevin: - I think that's definitely one of the more interesting things about ProseMirror, because a lot of people seem to do that. They plug in their rendering engine, like Vue.js; I know a project, TipTap, that uses Vue.js instead of React. Other projects like Atlaskit build on React to render custom node views. And you can basically render everything in the editor that you would render on a website. I saw a project that renders a PDF using a React view, because there's this great project, React-PDF I think it's called. It's a really cool project. You just plug it in, and you have a PDF inside your editor. That's really cool, right? There's a lot of other stuff you can do just like that. And because ProseMirror is already built on this concept of immutable state, it's a really natural fit to use React within ProseMirror. But you can do everything without React, too. I would argue that in Quill.js it's really hard to use something like React inside the editor, but you can still do everything you want; you can build your custom tables and stuff like that. React certainly makes a lot of stuff easier, though, because a lot of people have knowledge in React. So it really makes stuff easier.

Fabian: - Not only that. There's also the possibility to just reuse components and combine the best out of the React world. That's also important from the perspective of how to get developers for this project: we focused on collaboration developers as well as React developers to get the best of the best.


- I think we can all agree that you shouldn't manipulate the DOM manually anymore. For the editor itself, we have ProseMirror to handle the DOM. And for custom views, for any kind of custom stuff, for example how a table is built, there are a lot of divs and a lot of CSS, and I wouldn't do that directly in the DOM or manipulate that information in the DOM directly. There are a lot of edge cases you need to handle, and you can do a lot of stuff wrong. So React really helps, I think.

Nik: - There is one more specific requirement that is probably worth mentioning: comments. We have these annotations or comments, and this is a very interesting aspect that we learned over time. For one, there was the requirement that, because of different permission levels, comments shouldn't be part of the document model, so we wanted to have them outside the document model. But it's also really interesting that if you start writing an annotation while you share the document model collaboratively in realtime, you don't really want to share a draft of a comment in realtime. And the same goes for, let's say, @mentions: if you start typing, you don't want the person on the other end to see your autocomplete suggestions. This needs a little bit of rethinking, because you basically have two parts. The document model is really the content that you want to share in realtime, but there are other parts, like user interface elements or annotations in draft mode, that you want to keep out of it. And then it's really, really useful to share the same component library, so you can stay in the same system and not build the editor with one UI library and these other user interface elements with another. It's really handy to use the same thing. It keeps us sane and makes it easy to move forward.
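The split Nik describes can be sketched as two state buckets: one that would be replicated (for example through Yjs) and one that never leaves the client. All names here are illustrative, not the project's actual code.

```javascript
// Sketch of keeping draft annotations out of the shared document model.
// Only `shared` would be synced to other users; `local` stays on this
// client until the comment is explicitly published.
const state = {
  shared: { doc: { type: 'doc', content: [] }, comments: [] }, // replicated
  local:  { draftComment: null },                              // never synced
};

function startDraft(state, text) {
  state.local.draftComment = { text }; // visible only to this user
}

function publishComment(state) {
  // Moving the draft into shared state is the moment it gets replicated.
  state.shared.comments.push(state.local.draftComment);
  state.local.draftComment = null;
}

startDraft(state, 'Looks good!');
// Other users see nothing yet; the draft lives only in local state.
publishComment(state);
console.log(state.shared.comments.length); // → 1
```

The same separation covers transient UI like autocomplete popups: they read from the shared document but write only to local state until the user commits.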

- [Kevin] Well put.

Preston: - Let's jump into some of the actual tools that are out there. I think that we've heard a lot of names thrown around. There's been a lot of very interesting technologies mentioned, and a few that we haven't mentioned. We've talked about ProseMirror briefly. We talked about Draft.js. Very briefly talked about Quill and CKEditor 5. But there's also some others. There's Slate and TipTap. What are some of the top open source... When we look at these open source editors on the market, which ones were the ones that really were compelling for you all? And what were some of the strengths and weaknesses?

Kevin: - I think, for me, the biggest strength... Well, we can talk about each of the editors separately. Maybe we go through them starting from the most popular ones. Maybe Fabian can explain something about CKEditor 5; he has the most experience with that.

Fabian: - Sure. CKEditor 5 is the successor of the very, very popular CKEditor 4. It also switched to a JSON-based model. However, they upcast and downcast everything, so what you basically still get in the end is HTML, but structured HTML. For example, you cannot have your own hello tag. The document model would just not know what the heck a hello tag or a blink tag is; it would just ignore it and everything in it. Because when it loads the HTML, it loads it into its own document model, which is JSON-based, and then it puts it out again. Basically, CKEditor 5 is pretty strong. It has good accessibility, and it also has nice collaboration. The collaboration just had one big flaw: it was not open source. That was unfortunately a deal breaker, in terms of extensibility and also in terms of putting anything out for everyone. I mean, Drupal is open source. We work a lot with open source at Tag1. We love open source, and it's so cool that Kevin, as the developer of Yjs, is here. That's also kind of how we found Kevin and the others: we talked directly with the people who are developing these editors and checked them for our project when we were interested in some part, et cetera. That was the team we then settled on. But CKEditor 5 is still a little young; it has only recently gotten out of beta, while ProseMirror has a somewhat longer history of being stable and being used for these things. That won't remain a concern, because some other big players are settling on CKEditor 5. But experience, how long something has been in use, counts for something. And then there's a huge compatibility break with CKEditor 4. So what could have been a huge advantage of CKEditor 5, that all of our Drupal backends would directly work with it, et cetera, didn't materialize, because there's a real break between them. CKEditor 5 is a completely different product than CKEditor 4.
That has its advantages, but since there is no backwards compatibility, and the collaboration module was not open source, we looked more at the other editors. Slate, for example, we haven't talked much about. It's a great editor. It has, from what I've seen, the nicest plugin system. It's really cool, very nice, but it's beta, and it's been in beta for a long time. We wanted something proven, something stable, and something in beta, where there could be hard BC breaks, was just too much risk for us on this project. Nik can maybe talk more about Slate because he knows it. Draft.js was more like the original Facebook thing, a monolith. It's a great editor; it has a nice plugin system and it's React-based. But it is harder to extend overall, and it has also aged a little; it's one of the older editors. Also, the community is not as active as, for example, ProseMirror's; it's mainly Facebook committing some patches here and there and maintaining it in a stable state. In the end, it didn't have the features we needed. Then there's TipTap for Vue. If anyone needs an editor for Vue.js, use TipTap, it's great. And it's basically ProseMirror; it's kind of like a packaging of ProseMirror for Vue. That's cool. And then we ended up with ProseMirror and Quill, and that was the big race: ProseMirror versus Quill. Now, Yjs supported both, so that wasn't a deciding factor. But in the end, ProseMirror won, basically, on the experience. The tables plugin also looked much nicer, in its experience and how it looked and everything. Quill's data format is great. It's also a collaboration-ready editor; it directly works. But you then need to use the data format it provides, and you need to use ShareDB. And that again put our flexibility a little bit to the test. It's also OT-based, which we talk a little about in the other talk.
If you're interested, check that out. And we really wanted something where, in the end, maybe we'll never get there, maybe we will, but where we could at least think about a future of offline editing. That's again something we talk about there. But Quill versus ProseMirror was a really nice race, in that ProseMirror gives you more of a framework, where you start with nothing and build your own editor, while Quill is a ready-made editor: you plug it into your application and it just works. It's great at that. But once we added Atlaskit, Quill was out of the race.

Preston: - Yeah, I understand that... Oh, sorry, go ahead.

Kevin: - I think this was one of the bigger selling points. We had Quill and ProseMirror listed at the end, and we compared them. Quill.js has ShareDB; it's a proven concept, operational transformation. It also works with Yjs; there are a lot of companies that already use Yjs with Quill.js. And then there's ProseMirror. ProseMirror has all these features and a great community. I think it has a really interesting concept, and most modern editors nowadays, all the new editors that pop up, for example Atlassian's, are built on ProseMirror. It also has the Collab module, which is kind of similar to OT; it's a reconciliation approach. It doesn't handle conflicts as well as operational transformation, but it clearly works. And Yjs works with ProseMirror too, so we were covered there: either way, we could choose any of the editors with Yjs. And this is what I really wanted to do; I explained that, and why we did it, in the last webinar. But I think the biggest selling points, I felt, were, one, the community behind ProseMirror, and two, when we saw Atlaskit and realized we could just build on top of it. We had this existing editor, a lot of features, nice tables, nice interactions. That is, I guess, a big selling point of ProseMirror: a lot of open source components that you can just plug into the editor, and it just works.

Preston: - Absolutely. One of the things I know you mentioned about ProseMirror, Nik, was the fact that Atlaskit helped so much. Is there anything more you wanted to mention about Atlaskit? I think it's a very compelling story.

Nik: - I think there's not much more to add, but I can quickly recap, because there's so much there that you can simply start using Atlaskit and you have a good headstart. The biggest trouble you might have, and we went through this, is that it's a big mono-repository, so we had to take out the parts we needed, the core editor, and then continue to use the rest from Atlaskit, slowly replacing the bits and pieces that we actually needed to change. But from the experience of this project, this was very good, because in a very short period of time, I think it was just a matter of two weeks or so, we had something ready to show that the client could try, use, and actually feel. And obviously, if you can then test with real users, or potential real users, you make better decisions than just coming up with, hey, we might do this and that and have a button here. Rather than taking a long time to slowly build it up, starting from something fully fleshed out and then replacing bits and pieces was, for product thinking and product development, a really compelling story. And Atlaskit made that possible.

Fabian: - Atlaskit was definitely great for us. And what Nik was saying was part of our strategy with this client: we show progress every two weeks with a big demo, and not only to the client itself; there's a big stakeholder team that can all watch the progress and how it's done. That was really great for making a great impression quickly. But not only a great impression quickly. What you shouldn't undersell, Nik, is how much you improved the build system. I think it took like three minutes, at the start, to just build everything, and now you've gotten it down to 30 seconds, and rebuilds are like 10 seconds. No more changing some CSS, waiting 30 seconds, drinking a coffee.

Nik: - We should say there: props to Sebastian, not to me. He was digging into the webpack configuration and getting hot rebuilding working with good compile times. That was really helpful for faster development. One thing I could add, though, about Atlaskit: Atlaskit is not built with realtime collaboration in mind. For certain features, they do things like changing the document model just to show different user interfaces. So for example, for the @mentions that we are building, and the annotation or commenting section, we cannot use what's there in Atlaskit as-is; we have to adapt and change it. Otherwise it would be synced across to other users, and we don't want that. So while Atlaskit was a good start, especially with this realtime collaboration in mind, we now really have to change a lot of things. But that's fine. I think the strategy and the approach were good ones. Highly recommended.

Kevin: - I think it was built with realtime in mind, but they use a custom version of the Collab module, which is a different realtime approach. So we just plugged in the Yjs plugin, so all the data that you have is shared immediately. And I'm sure that they have some filtering happening on the backend to filter out all the data that you don't want to share. I'm not exactly sure how that works. But also, the backend to the Atlaskit collaboration approach is, I think, proprietary. The source code is not available, I think. I'm not sure.

Fabian: - I haven't seen it. I've searched everything on collaboration that's out there on the internet. There are even some prototypes from The New York Times that can still be found. There's a five-year-old ProseMirror version, if someone wants to dig into history.


Preston: - Absolutely. Well, we are starting to run out of time, but I do want to give some opportunity to talk about some of the more interesting aspects here. By the way, what sort of additions have you made to ProseMirror? Just very, very quickly. I know, Fabian, you've done some work on this.


Fabian: - One of the important things, and I've already talked a little bit about that when I explained the document models, is reintegrating ProseMirror with Drupal. Now someone says, well, Drupal supports many editors, but yes, only those of the old generation. So what we are now talking about is that we have these JavaScript mega beasts that are usually run with Node, and they are coming to the old giants of PHP. Old not in terms of being outdated, but Drupal has been around almost 20 years and it's traditionally PHP-based. It's just plain HTML that you store in a database. And you have this ProseMirror and it has this JSON model. How you usually would do that is you would take this JSON and run it through Node. Node would launch a little instance of the editor, display it, and then the webpage would be delivered. We cannot do that, because we are basically bridging this old generation of editors with the new generation of editors. And that's very, very interesting, because when I was starting with that, the React developers were like, why do you want to output HTML? Why do we need that? And the Drupal developers were like, JSON? Why would we put JSON in the database? We are storing HTML. Well, we are storing both in the database. We're storing the HTML only as kind of a cache, for display purposes. That's what we will be displaying to the user. And the JSON is what we then feed again into ProseMirror, or Atlaskit, or our custom editor, in this case, for loading up the same state as it was before. That's very important, so that we don't need to store HTML, load the HTML again, store it again, and convert it back and forth, where we could be losing data. Instead we are storing the document model of ProseMirror directly in the database, and we are also storing the HTML so Drupal can display it. That was a little bit of a challenge.
A challenge that the whole Drupal ecosystem, at some point or another, will also face, because now we're going, with Drupal itself, with Drupal core, in the direction of this new generation of editors. So, that's a lot of challenges, and I hope we can speak in more detail about that at some other point. But it's really interesting. And then also, just loading this whole React app when you have a frontend that is still jQuery-based and AJAX-based, the traditional way Drupal used to work. But now this new framework comes in, and you want your @mentions to also work with some traditional Drupal comments, and you have to combine those two worlds. And that's very, very interesting.

Preston: - So, it seems that... Oh, sorry, go ahead.

Fabian: - There's one part that we did that was really exciting for me, besides all of those mentions, collapsible sections, and the collaboration and shared editing.

Preston: - Well, unfortunately we are out of time. I did want to get to talking about the integrations, but clearly we will have to save that for another time. I just wanted to say thank you so much to all of you in the audience for watching or listening to this live Team Talk. For all of the things that you heard in this webinar, things like ProseMirror, Yjs, and Draft.js, we're gonna have links to all of these technologies that you can take a look at. By the way, please don't forget to check out our previous webinar, the inaugural Tag Team Talk about shared and collaborative editing. And if you're interested in learning about a particular area or a certain topic, please feel free to reach out to us and the team at [email protected]. I want to give a big thank you to our guests today. First and foremost, Nik Graf, our senior software engineer, based in Austria. Fabian Franz, senior technical architect and performance lead. And Kevin Jahns, realtime collaboration systems lead and creator of Yjs. And of course, the managing director of Tag1, Michael Meyers. This is Preston So. Thank you all so much. And until next time, take care.

Sep 26 2019

I am writing this quick tutorial in the hope it helps someone else out there. There are a few guides out there for similar tasks, but they are not quite what I wanted.

To give everyone an idea on the desired outcome, this is what I wanted to achieve:

Example user profile with 2 custom tabs in it.

Before I dive into this, I will mention that you can do this with views, if all that you want to produce is content supplied by views. Ivan wrote a nice article on this. In my situation, I wanted a completely custom route, controller and theme function. I wanted full control over the output.

Steps to add sub tabs

Step 1 - create a new module

If you don't already have a module to house this code, you will need one. These commands make use of Drupal console, so ensure you have this installed first.

drupal generate:module --module='Example module' --machine-name='example' --module-path='modules/custom' --description='My example module' --package='Custom' --core='8.x'

Step 2 - create a new controller

Now that you have a base module, you need a route:

drupal generate:controller --module='example' --class='ExampleController' --routes='"title":"Content", "name":"example.user.contentlist", "method":"contentListUser", "path":"/user/{user}/content"'
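For reference, here is a minimal sketch of what the resulting controller class might look like. The render array body is a placeholder of my own; replace it with whatever output you actually want. The `$user` parameter arrives as a full user entity once the route options from the next step are in place.

```php
<?php

namespace Drupal\example\Controller;

use Drupal\Core\Controller\ControllerBase;
use Drupal\user\UserInterface;

/**
 * Controller for the custom user profile tabs.
 */
class ExampleController extends ControllerBase {

  /**
   * Builds the "Content" tab for the given user.
   */
  public function contentListUser(UserInterface $user) {
    // Placeholder render array; swap in your own markup or theme hook.
    return [
      '#markup' => $this->t('Content list for %name', ['%name' => $user->getDisplayName()]),
    ];
  }

}
```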

Step 3 - alter your routes

In order to use magic autoloading, and also proper access control, you can alter your routes in example.routing.yml to look like this. This is covered in the official documentation.

# Content user tab.
example.user.contentlist:
  path: '/user/{user}/content'
  defaults:
    _controller: '\Drupal\example\Controller\ExampleController::contentListUser'
    _title: 'Content'
  requirements:
    _permission: 'access content'
    _entity_access: 'user.view'
    user: \d+
  options:
    parameters:
      user:
        type: entity:user

# Reports user tab.
example.user.reportlist:
  path: '/user/{user}/reports'
  defaults:
    _controller: '\Drupal\example\Controller\ExampleController::reportListUser'
    _title: 'Reports'
  requirements:
    _permission: 'access content'
    _entity_access: 'user.view'
    user: \d+
  options:
    parameters:
      user:
        type: entity:user

Step 4 - create the tabs

This is the code that actually creates the tabs on the user profile, and it lives in example.links.task.yml. There is no Drupal console command for this, unfortunately. The key part is defining base_route: entity.user.canonical.

example.user.contentlist:
  title: 'Content'
  route_name: example.user.contentlist
  base_route: entity.user.canonical
  weight: 1

example.user.reportlist:
  title: 'Reports'
  route_name: example.user.reportlist
  base_route: entity.user.canonical
  weight: 2

Step 5 - enable the module

Don't forget to actually turn on your custom module; nothing will work until then.

drush en example

Example module

The best (and simplest) example module I could find that demonstrates this is the Tracker module in Drupal core. The Tracker module adds a tab to the user profile.

Sep 26 2019
Sep 26

Search engine optimization (SEO) is the chief ingredient in the recipe for a top ranking on Google. SEO helps websites acquire traffic from organic, natural, or editorial search engine results. Several other factors also affect a website's ranking, such as content quality, site loading time, backlinks, and responsive design.

Further, Drupal, being a robust and highly customizable content management system, is considered one of the most SEO-friendly platforms. Its architecture encourages site builders to implement ethical SEO practices in their workflows, from correct tagging of content to SEO-friendly naming conventions, making a site both search-engine friendly and user-friendly.

Thus, targeted content, a properly coded website and theme, and the right SEO modules can help organizations build a success story seamlessly.

Integrate These Drupal SEO Modules For Better Visibility & Ranking

Following is a checklist of SEO-friendly Drupal modules that you can use to give your site a boost:

  1. Pathauto

One of the most essential Drupal modules is Pathauto. It saves the valuable time you would otherwise devote to creating path/URL aliases, by automatically generating URL/path aliases for content (nodes, taxonomy terms, users) based on configurable patterns.

For instance, if we configure our blog entry pattern as blog/[node:title] and publish a post with the title “Embracing Drupal SEO Modules”, Pathauto will instantly generate an SEO-friendly URL such as “blog/embracing-drupal-seo-modules” instead of “node/92”.

2. SEO Checklist

If you are aware of the SEO basics and manage multiple websites at a time, then this module is perfectly suitable for you. With SEO Checklist, you can keep your SEO practices in check. 

It eliminates guesswork by creating a functional to-do list of modules and tasks that remain pending. The module's regular updates keep your on-page SEO search-engine friendly without any hassle.


(Image source: Drupal.org)

It makes work simpler by breaking tasks down into functional areas like Title Tags, Paths, Content, and much more. Next to each task is a link to download the relevant module and a link to the proper admin screen of your website, so that you can optimize the settings perfectly. It also places a date and time stamp next to each item when a task has been finished. This, in turn, provides a simple report that you can share with others showing what's been done.

3. Metatag

This module allows you to automatically provide structured metadata, i.e., “meta tags”, about a website. In the context of SEO, when people refer to meta tags, they usually mean the meta description tag and the meta keywords tag, which may help enhance visibility and rankings in search engine results.
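To illustrate, the rendered page header might contain tags like the following (the title, description, and keywords here are hypothetical examples; the Metatag module generates these for you based on the tokens you configure):

```html
<head>
  <title>Embracing Drupal SEO Modules | Example Site</title>
  <meta name="description" content="A checklist of Drupal SEO modules that help improve visibility and rankings.">
  <meta name="keywords" content="drupal, seo, modules">
</head>
```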

4. XML Sitemap

Once installed, the Drupal XML Sitemap module provides your website with a sitemap and makes it discoverable by search engines. This, as a result, helps search engines understand the hierarchy of your website and crawl it in a tree-like manner.

The best part of having this module is the flexibility to include or exclude certain pages from your website's sitemap. This means that pages you are no longer using don't need to get indexed.

5. Google Analytics

The Google Analytics Module helps in tracing the footprints and general behavior of users concerning their interaction with the landing pages and the content present on the website. Not only this, but it also provides insights into your visitors including demographics, where they found you online, what keywords they used to find you and a lot more.

Further, it can also exclude the tracking of in-house employees, who might visit the website very often and would otherwise be counted as visitors and unique sessions.

6. Real-time SEO For Drupal

The real-time SEO module for Drupal relieves you from the tedious task of optimizing content by including keywords in a fast & natural way.

It works best in combination with the Metatag module. How?

It checks whether your posts are long enough to secure a place in SERPs, whether the meta tag includes your high-ranking keyword, whether there are subheadings in the post, and so on.

This evaluation makes sure that you don't miss out on a single opportunity to increase organic traffic and improve your ranking.

7. Search 404

The search 404 module rescues your website by controlling the bounce rate, which search engines use as a criterion to rank websites’ quality. Whenever users come across some pages showing 404, this module automatically redirects them to the internal site search with the related term in the URL.

Besides, it helps you in retaining visitors coming in from old URLs linked from other sites or search indices.

8. Alinks

The Alinks module automatically replaces keyword phrases in the content body with links. You can set the content types on which this should work and define the phrases and links through the administration interface. From there onwards, the module takes over and replaces the keyword phrases in the body field with links to the pages you specify.

9. SEO Compliance Checker

The SEO Compliance Checker analyzes node content for search engine optimization whenever it is created or modified. Whenever a publisher saves or previews a node, the module performs audits and gives the editor feedback on rule compliance in the form of a result table.

This can help SEO beginners immensely as they will get to know about the areas where they need to optimize content more accurately.

This comprises scanning alt tags in images, checking the usage of keywords in node titles, measuring keyword density in the body, and so on.

10. Schema.org Metatag

It maintains structured data tags that you can add to your HTML to improve the way search engines read and represent your pages on SERPs.

11. Taxonomy Title

The Taxonomy Title module lets you edit the heading tag (H1) of taxonomy pages. The importance of the H1 tag can't be neglected or overlooked, for it forms a crucial element in SEO and helps pages rank well in SERPs.

People interested in SEO may prefer to add more user-friendly, keyword-rich, and descriptive words to this heading element.

This is the only module that lets you control that title individually for every term. 

12. Menu Breadcrumbs

As per its name, it appends a breadcrumb menu line to the top of the website, which provides substantial benefits for both users and search engines. First, it lets users know where they are in the navigation hierarchy, and second, each breadcrumb contains anchor text that internally links to the appropriate URL.

13. Power Tagging

The PowerTagging module evaluates content from Drupal nodes and the associated file attachments. It interprets content and concepts automatically through a thesaurus or taxonomy, even if synonyms are used. Users can consolidate all suggested tags or index a bunch of Drupal content nodes automatically, leading to the formation of a semantic index. This practice makes search more comfortable than ever before.


  • Customizable entity tags, with manual tags combined with auto-completion of tags already used
  • Multilingual tagging is supported
  • All content can be tagged automatically in one go with bulk tagging




14. Similar By Terms

The Similar By Terms module provides a framework for relating content items by showcasing a view block with links to other analogous content. The similarity is based on the taxonomy terms assigned to content. Views are available based on similarity within each of the vocabularies defined for a site, as well as similarity across all vocabularies.

15. Footnotes

The Footnotes module can be utilized to create automatically numbered footnote references in an article or post, for instance to add a reference to a URL.

16. Require on Publish

This module comes in handy when fields should only be required at the time of publishing content, or once it is already live. It can be used when you have fields, such as tags or SEO information, that editors generally don't need to fill in until the content is going live.

17. Auto Recommend Content Tags (Thru Apache Stanbol)

This module uses Apache Stanbol, via a web socket, to recommend tags in real time, or even find keywords, while the editor is writing, editing, or creating a new piece of content.

18. Drupal SEO Tools

This is an all-inclusive SEO suite. The module offers a dashboard that encapsulates a plethora of SEO functions for a site, from keywords, titles, tags, paths, redirects, and sitemaps to Google Analytics, webmaster tools, and more.

However, it has some prerequisites that must be met to make full use of the suite. 

19. Redirect

The Redirect module lets you rechannel an existing URL to another one, while keeping both links working on your website without delivering a 404 Not Found error. It also works wonders when you want to handle duplicate content.


20. Global Redirect

The problem with the alias system in Drupal is that the default URL is still present, i.e., there are still two URLs pointing to the same content on your website. Search engine bots can identify duplicate content easily, and that can impact your website's ranking.

The Global Redirect module cures this problem by checking whether there is an alias for the current URL and, if there is, redirecting to the alias URL.

Besides, it exhibits other features, like removing the trailing slash from URLs, cross-checking that clean URLs are being implemented correctly, and checking permissions and access to nodes and URLs.

21. Content Optimizer

The Content Optimizer module improves your website's search engine ranking by refining on-page optimization factors and ensuring that your content meets the requisites listed under Drupal SEO best practices. It instantly audits your website content through an SEO analyzer and, based on the content stats obtained, guides you in improving search engine rankings.

22. Site Verification

Search engines rank your website when they are able to properly navigate through it and index it. To know whether your site is crawlable, you need to verify it. The Site Verification module helps with exactly that, either by uploading an HTML file or by adding meta tags. It supports search engines like Google, Yahoo, and Bing.

You can use this in combination with XML sitemap to let search engines index your up-to-date website content appropriately.

23. Link Checker

Broken things are considered unlucky, and so it is with broken links for your website's ranking. Broken links leave a bad impression on search engines. The Link Checker module can help identify the failing links so that you can rectify them easily.

24. Menu Attributes

The Menu Attributes module facilitates the admin in specifying particular menu item attributes, comprising id, name, class, style, and rel.

This module is helpful in your SEO strategy especially when you want to “nofollow” certain menu items to mold the flow of PageRank through your site.

Are You Offending Search Engines? Follow These 5 Tips To Avoid Common SEO Mistakes

“To err is human, to forgive is divine”- Alexander Pope

After all, we, being human, can also flub in achieving our goals. However, mistakes that are not rectified in time can result in heavy losses. And here, heavy loss means your site won't show up on the top pages of search engine results.


You can prevent this from happening by following these 5 simple tips and tricks:

  1. Include top-ranking keywords in your content

The primary task in being SEO-friendly is to focus on the keyword strategy for your website. Use keyword tools, like Google Keyword Planner or SEMrush, to find the high-ranking keywords that you can use in your website content (landing pages, blogs, and other information pages) to secure a top rank in search engine results.

2. Ensure that your URLs are search-engine friendly

Another important factor in our SEO checklist is URL structure. Yes, it does matter! Search engines like Google and Bing put a lot of stress on the use of user-friendly URLs. Such clean URLs make the content more readable and also give a clear picture of what the page is about. An example of an unambiguous URL for a service page would be something like www.example.com/services.

As discussed above, the Pathauto module is an excellent Drupal module which makes this process a whole lot easier by converting complicated URLs to clean and clear URLs.

3. Don’t underestimate the power of metatag

Meta tags are those micro-sized pieces of text that you (should) place in the header part of your website to make search engines aware of what the web page is about.

In case you don’t include metatags in your content, the search engines are forced to guess what the page contains and trust us, this could seriously piss off search engines, and eventually, your SEO ranking will suffer!

Fortunately, you don't need to dig into the code of your Drupal site's web pages to implement meta tags, since Drupal already has a solution (module) for it: the Metatag module.

With the metatag module, you can automate the process of placing meta tags and HTML title tags in the header of your webpage.

4. Indulge Users With Your Mobile-friendly Website

The advent of mobile phones has brought everyone together, especially millennials, who use their smartphones to access the internet for every small detail.

Having said that, Google now prefers websites that offer users a mobile-friendly interface. Therefore, it's high time that companies optimize their Drupal websites to adapt to the screen sizes of different devices. Enterprises can use content-as-a-service (CaaS) to push their content via APIs to their Drupal website; CaaS automatically adjusts the size and format of the content for each device.

That's why Drupal 8 is considered an out-of-the-box solution for driving the SEO compatibility of a website effortlessly.

5. Keep Drupal updated

Drupal, being open-source software managed by a huge community of developers, is regularly updated to incorporate new features and fix bugs and errors, keeping potential security risks at bay.

This has two benefits: first, your website will be safe from cyber-attacks, and second, search engines list sites higher when they keep vulnerabilities at bay.

Hence, keeping your website updated is an important factor in your Drupal SEO journey. Make sure that you keep on updating your Drupal site as soon as an update is rolled out to keep website ranking high on SERPs.

Final Words

Confining yourself to just creating a website and pushing content onto it is never enough to get your website into the top search results, especially when there is a slew of websites present on the world wide web.

Search engine optimization for Drupal is a continuous practice that evolves as you keep investing time in it. The more you practice, the more the visibility! Integrate these modules into your website and follow the Drupal SEO guide above religiously to witness a significant boost in your website's ranking on search engines. After all, it's worth a try!

Sep 26 2019

Although web accessibility begins on a foundation built by content strategists, designers, and engineers, the buck does not stop there (or at site launch). Content marketers play a huge role in maintaining web accessibility standards as they publish new content over time.

“Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web.” - W3C

Why Accessibility Standards are Important to Marketers

Web accessibility standards are often thought to assist audiences who are affected by common disabilities like low vision/blindness, deafness, or limited dexterity. In addition to these audiences, web accessibility also benefits those with a temporary or situational disability. This could include someone who is nursing an injury, someone who is working from a coffee shop with slow wifi, or someone who is in a public space and doesn’t want to become a nuisance to others by playing audio out loud.

Accessibility relies on empathy and an understanding of a wide range of user experiences. People perceive your content through different senses depending on their own needs and preferences. If someone isn't physically seeing the blog post you wrote, or can't hear the audio of the podcast you published, that doesn't mean you as a marketer don't care about providing that information to that audience; it just means you need to adapt the way you deliver that information to that audience.

10 Tips for Publishing Accessible Content

These tips have been curated and compiled from a handful of different resources including the WCAG standards set forth by W3C, and our team of accessibility gurus at Palantir. All of the informing resources are linked in a handy list at the end of this post. 

1. Consider the type of content and provide meaningful text alternatives.

Text alternatives should help your audience understand the content and context of each image, video, or audio file. It also makes that information accessible to technology that cannot see or hear your content, like search engines (which translates to better SEO).


Types of text alternatives you can provide:

  • Images - Provide alternative text.
  • Audio - Provide transcripts.
  • Video - Provide captions and audio descriptions of the action.

This tip affects those situational use cases mentioned above as well. Think about the last time you sent out an email newsletter. If someone has images turned off on their email to preserve cellular data, you want to make sure your email still makes sense. Providing a text alternative means your reader still has all of the context they need to understand your email, even without that image.

2. Write proper alt text.

Alternative text or alt text is a brief text description that can be attributed to the HTML tag for an image on a web page. Alt text enables users who cannot see the images on a page to better understand your content. Screen readers and other assistive technology can’t interpret the meaning of an image without alt text.

With the addition of required alternative text, Drupal 8 has made it easier to build accessibility into your publishing workflow. However, content creators still need to be able to write effective alt text. Below I’ve listed a handful of things to consider when writing alt text for your content.

  • Be as descriptive and accurate as possible. Provide context. Especially if your image is serving a specific function, people who don’t see the image should have the same understanding as if they had.
  • If you’re sharing a chart or other data visualization, include that data in the alt text so people have all of the important information.
  • Avoid using “image of,” “picture of,” or something similar. It’s already assumed that the alt text is referencing an image, and you are losing precious character space (most screen readers cut off alt text at around 125 characters). The caveat to this is if you are describing a work of art, like a painting or illustration.
  • No spammy keyword stuffing. Alt text does help with SEO, but that's not its primary purpose, so don't abuse it. Find that happy medium between including all of the vital information and also including maybe one or two of those keywords you're trying to target.
Illustration: a red car with flames shooting out of the back, flying over a line of cars on a sunny roadway.

Example of good alt text: “Red car in the sky.”
Example of better alt text: “Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.”
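In markup, alt text simply lives on the image element (the file path here is a hypothetical example):

```html
<img src="/images/red-car.png"
     alt="Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.">
```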

3. Establish a hierarchy.

Upside down pyramid split into three sections labeled high importance, medium importance, low importance

Accessibility is more than just making everything on a page available as text. It also affects the way you structure your content, and how you guide your users through a page. When drafting content, put the most important information first. Group similar content, and clearly separate different topics with headings. You want to make sure your ideas are organized in a logical way to improve scannability and encourage better understanding amongst your readers.

4. Use headings, lists, sections, and other structural elements to support your content hierarchy.

Users should be able to quickly assess what information is on a page and how it is organized. Using headings, subheadings and other structural elements helps establish hierarchy and makes web pages easily understandable by both the human eye and a screen reader. Also, when possible, opt for using lists over tables. Tables are ultimately more difficult for screen reader users to navigate.

If you’re curious to see how structured your content is, scan the URL using WAVE, an accessibility tool that allows you to see an outline of the structural elements on any web page. Using WAVE can help you better visualize how someone who is using assistive technologies might be viewing your page.
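As a rough sketch of what that structure looks like in markup (the headings are invented for illustration), a well-organized page uses one h1 and nests lower heading levels without skipping any:

```html
<h1>Publishing Accessible Content</h1>
  <h2>Why it matters</h2>
  <h2>How to write alt text</h2>
    <h3>Images that convey data</h3>
    <h3>Decorative images</h3>
  <h2>Further reading</h2>
<!-- The indentation above is only visual; the h1/h2/h3 levels carry the hierarchy. -->
```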

5. Write a descriptive title for every page.

This one is pretty straightforward. Users should be able to quickly assess the purpose of each page. Screen readers announce the page title when they load a web page, so writing a descriptive title helps those users make more informed page selections.

Page titles impact:

  • Users with low vision who need to be able to easily distinguish between pages.
  • Users with cognitive disabilities, limited short-term memory, and reading disabilities.

6. Be intentional with your link text.

Write link text that makes each link’s purpose clear to the user. Links should provide info on where you will end up or what will happen if you click on that link. If someone is using a screen reader to tab through 3 links on a page that all read “click here,” that doesn’t really help them figure out what each link’s purpose is and ultimately decide which link they should click on.

Additional tips:

  • Any contextual information should directly precede links.
  • Don't use URLs as link text; they aren't informative.
  • Avoid writing long paragraphs with multiple links. If you have multiple links to share on one topic, it's better to write a short piece of text followed by a list of bulleted links.

EX: Use "Learn more about our new Federated Search application" not "Learn more".
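In markup, the difference looks like this (the URL is a hypothetical example):

```html
<!-- Unhelpful: gives a screen reader user no context on its own. -->
<a href="/federated-search">Click here</a>

<!-- Better: the link's purpose is clear even out of context. -->
<a href="/federated-search">Learn more about our new Federated Search application</a>
```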

7. Avoid using images of text in place of actual text.

The exact guideline set forth by W3C here is “Make it easier for users to see and hear content including separating foreground from background.”

There are many reasons why this is a good practice that reach beyond accessibility implications. Using actual text helps with SEO, allows for on-page search ability for users, and creates the ability to highlight for copy/pasting. There are some exceptions that can be made if the image is essential to include (like a logo). Providing alt text also may be a solution for certain use cases.

8. Avoid idioms, jargon, abbreviations, and other nonliteral words.

The guideline set forth by W3C is to “make text content readable and understandable.” Accessibility aside, this is important for us marketers in the Drupal world, because it's really easy to include a plethora of jargon that your client audience might not be familiar with. So be accessible AND client-friendly, and if you have to use jargon or abbreviations, make sure you provide a definition of the word, link to the definition, or include an explanation of any abbreviations on first reference.

Think about it this way: if you are writing in terms people aren’t familiar with, how will they know to search for them? Plain language = better SEO.

9. Create clear content for your audience’s reading level.

For most Americans, the average reading level is a lower secondary (middle school) education level. Even if you are marketing to savvy individuals who are capable of understanding complicated material, the truth is most people are pressed for time and may become stressed if they have to read overly complicated marketing materials. This is also important to keep in mind for people with cognitive disabilities or reading disabilities, like dyslexia.

I know what you’re thinking, “but I am selling a complicated service.” If you need to include technical or complicated material to get your point across, then provide supplemental content such as an infographic or illustration, or a bulleted list of key points.

There are a number of tools online that you can use to determine the readability of your content, and WebAIM has a really great resource for guidelines on writing clearly.
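As a rough illustration of how those readability tools work, the widely used Flesch-Kincaid grade formula can be approximated in a few lines of Python. The syllable counter here is a crude vowel-group heuristic, so treat the output as an estimate, not a substitute for a real readability checker:

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels;
    # every word is counted as having at least one syllable.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text):
    """Approximate Flesch-Kincaid grade level for a block of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

Short sentences built from short words score a lower grade level, which is exactly what the guideline asks you to aim for.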

10. Clearly label form input elements.

If you are in content marketing, chances are you have built a form or two in your time. Whether you're creating those in Drupal or an external tool like HubSpot, you want to make sure you are labeling form fields clearly so that the user can understand how to complete the form. For example, expected data formats (such as day, month, year) are helpful, and required fields should be clearly marked. This is important for accessibility, and it also means you, as a marketer, end up with better data.
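A quick way to spot unlabeled fields in markup is to compare each input's `id` against the `for` attributes of the page's labels. This minimal sketch only covers explicit `label for=""` association, not wrapping labels or `aria-label`:

```python
from html.parser import HTMLParser

class FormLabelChecker(HTMLParser):
    """Records input ids and label 'for' targets so unlabeled
    form fields can be reported."""
    def __init__(self):
        super().__init__()
        self.input_ids = []
        self.label_for = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("input", "select", "textarea") and attrs.get("type") != "hidden":
            if "id" in attrs:
                self.input_ids.append(attrs["id"])
        elif tag == "label" and "for" in attrs:
            self.label_for.add(attrs["for"])

def unlabeled_fields(html):
    checker = FormLabelChecker()
    checker.feed(html)
    return [i for i in checker.input_ids if i not in checker.label_for]
```

Fields this reports are the ones a screen reader would announce with no name at all.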

Helpful Resources

Here are a few guides I've found useful in the quest to publish accessible content:

Accessibility Tools

Sep 25 2019

Yesterday the digital experience world and the Drupal community received the long-awaited answer to the question of what's going to happen with Acquia, when it was announced, first on Bloomberg, that Vista Equity Partners would be buying a majority stake in Acquia in a deal that values the company at $1B.

Many were caught off guard by the timing, but an event like this had been expected for a long time. After receiving nine rounds of venture funding totaling $173.5M, it was time. As the leader and largest company in the Drupal space, Acquia has a center of gravity that leaves many asking a new question: What Now for Drupal?

What Are the Angles?

Before I attempt to answer what I think this means for Drupal and the Drupal community, I think it is worthwhile to at least speculate on the strategy Acquia plans to pursue as part of Vista. Everyone I have heard from, both offline and online, since the announcement yesterday is speculating on the Vista angle (i.e., why did they want Acquia?). As TechCrunch led with, "Vista Equity Partners...likes to purchase undervalued tech companies and turn them around for a hefty profit…" Well, that's pretty much what a PE firm does, and to me it's less interesting than asking: what does Acquia want from Vista?

What I believe Acquia wanted to get out of this is a heavyweight partner with capital and connections that could help develop Acquia into a more formidable competitor to Adobe, Sitecore and other digital experience platforms ("DXP"). Just last week, Salesforce Ventures made a very sizeable $300M investment in Automattic, the parent company of WordPress. Things are heating up among the top digital experience platforms, and no one is going to survive, let alone stay at the front of the pack, without some serious capital behind them.

Who Wins?

I believe Acquia plans to use Vista's investment and resources to continue making targeted acquisitions and investments to become a more robust and powerful digital experience platform. I would expect them to grow their suite of products, invest even more heavily in sales and marketing to increase revenue, and grow their installed base of customers.

Vista will then have a more valuable asset from which to pursue either an IPO or a strategic acquisition. It is possible this will follow the pattern of Marketo, which Vista bought and then sold to Adobe for a $3B profit, or Ping, which they recently took public in an IPO.

So there are mutual interests being met and a fair valuation that gets the necessary attention, so both parties win. I also think customers win from increased product development, competition, and a more robust ecosystem.

What Does This Mean For Drupal?

I think this is the best of all possible scenarios for both Drupal (the product) and the Drupal community. While many will bemoan the intrusion of a large private equity firm into the sacred space of an open source community, change was inevitable and it comes with predictable tradeoffs that have to be measured in the context of a new reality for the space. The community needs the indirect investment that this deal provides and it far outweighs the alternatives. If you assume that there were only a few possible scenarios for Acquia that were going to play out sooner or later, they would be:

  1. Organic growth / status quo - In my opinion, the worst scenario due to the dynamics of the market converging. Without a huge infusion of capital like the Vista deal into Acquia, Drupal simply wouldn’t be able to compete fast enough to stay in the top DXP category against Adobe, Sitecore, Salesforce and WordPress. 

  2. IPO - As a liquidation event for VC investors, this could be perhaps the most lucrative, but the public markets are fickle and I believe that would be very hard on a large open source community and product like Drupal due to the dynamics of control for a public company. This may yet come to pass as the end game for Vista, but I think it is good it was not the immediate play. 

  3. Strategic Acquisition - Salesforce, Amazon, Google, IBM and others of this size would be likely acquirers. Again, this may yet come to pass, but it would not have been an ideal immediate play for Drupal because of the weight of influence it would add to the community and open source dynamic.

  4. PE - Obviously, what did happen. This deal brings the financial strength and strategic opportunities without the messiness of the public markets or a new giant controlling the ecosystem. 

As for the direct benefits to the Drupal project, I take Dries at his word in the personal statement he made on his blog that this strategy will allow Acquia to provide even more for Drupal and the community including: 

  • Sponsor more Drupal and Mautic community events and meetups.

  • Increase the amount of Open Source code [sic] contributed.

  • Fund initiatives to improve diversity in Drupal and Mautic; to enable people from underrepresented groups to contribute, attend community events, and more.

Those are all things that directly benefit the community and make open source Drupal better in addition to the opportunities that the deal affords Acquia to better compete against its rivals. 

How Things Line Up From Here…

Consolidation and funding in the digital experience platform (“DXP”) space are going to make for a wild ride as the top players continue to unveil pieces of their strategy.  

  • Adobe - With Magento and Marketo neatly tucked up, Adobe remains the most competitive player both in terms of market share and the comprehensiveness of the offering, though cost and proprietary lock-in into a single homogenous platform are continued weaknesses. 

  • Acquia / Drupal - Recent acquisitions of platform components like Mautic and Cohesion are likely to continue or increase after the Vista deal in an effort to bring an open and more heterogeneous alternative to bear against the others. 

  • Sitecore - The recent acquisition of a top service provider, Hedgehog, followed by the subsequent announcement that Sitecore was laying off 7% of its workforce, can't be interpreted as strong signs of health, but the enterprise market is full of Microsoft ecosystems that will be partial to Sitecore's underlying technology. 

  • Automattic / WordPress - I have less insight into the WordPress space than I do Drupal, but the Salesforce Ventures investment doesn't feel like an attempt to gain a CMS for its own offering (sidenote: Salesforce does have a "CMS," and its Ventures arm has invested in other CMSs like Contentful). Founder Matt Mullenweg told TechCrunch that Automattic doesn't want to change course: with the new influx of cash, there won't be any big departure from the current lineup of products and services. "The roadmap is the same. I just think we might be able to do it in five years instead of 10." Their recent acquisition of Tumblr is part of a strategy I don't fully understand, but it seems to be a continued volume-market move into the larger media space and less about competing with the other platform providers. However, $300M could go a long way in tooling the platform for lots of purposes. 

I also think there is a lot more to watch on the related martech front surrounding customer data. In April, Salesforce and Adobe announced (in the same week) that they were acquiring competing Customer Data Platform (CDP) products. So this is about the whole digital experience stack; where we are likely going to see more acquisitions and consolidation is beyond the CMS. 

What Does This Mean For our Clients?

Despite the race to create the killer platform, most of our clients have consciously, or organically, adopted heterogeneous digital experience platforms. This means they rely on many different components to "weave" together solutions that meet their unique requirements. As Forrester explains, DX is both a platform and a strategy, and despite the influence of these major software and cloud players, a "digital experience" needs to be created - that includes strategy, customer research, UX, design, content, brand, and the integration of custom and legacy software and data sources in addition to purchased software. Still, we believe our customers do need to be aware of the changing dynamics in the market, and in particular how consolidation will affect their platform investments. 

What Does This Mean For Phase2?

At Phase2, this news comes with much interest. We were one of the very first Acquia partners named after the company was founded in 2008. Over the last 10+ years, we have shared, and continue to share, numerous clients. We are also prolific contributors and implementers in the Drupal space who have been a part of some of the biggest and most impactful Drupal moments over the last ten years. We ourselves once invested heavily in creating many products that extended and enhanced the capabilities of Drupal because we believe it is a powerful platform for creating digital experiences. 

Over time, as our agency grew and moved "up market," we have diversified our expertise: we have become Salesforce partners, developed commerce experience, and enhanced our design, UX and creative capabilities. We also use WordPress, JavaScript frameworks for decoupled sites, and static site generators in conjunction with a wide variety of marketing technologies to create digital experience platforms that go beyond websites and CMS.

We will continue to monitor the trends and prepare and enable ourselves to create digital experiences that advance our clients' goals, and we fully expect Drupal will remain a key component of building those experiences well into the future. 

Sep 25 2019

Securing your website is not a one-time goal but an ongoing process that needs a lot of your attention. Preventing a disaster is always a better option. With a Drupal 8 website, you can be assured that some of the top security risks are being taken care of by the Drupal security team.

Drupal has powered millions of websites, many of which handle extremely critical data. Unsurprisingly, Drupal has been the CMS of choice for websites that handle critical information, like government websites, banking and financial institutions, and e-commerce stores. Drupal's security features address all top 10 security risks of OWASP (the Open Web Application Security Project).

Drupal 8 is considered one of the most secure versions to date because of its forward-thinking and continuous innovation approach. The Drupal security team also ran a security bounty program six months before the release of Drupal 8. Through this program, users were invited to test run Drupal 8 and find (and report) bugs. And they even got paid for it!

Drupal Security Vulnerabilities

It goes without saying that the Drupal community takes Drupal security issues very seriously and keeps releasing Drupal security updates and patches. The Drupal security team is proactive and ready with patches even before a vulnerability goes public. For example, the team released the security advisory SA-CORE-2018-002 days before the vulnerability (Drupalgeddon2) was actually exploited. Patches and security upgrades were soon released, advising Drupal site admins to update their websites.

Quoting Dries from one of his blogs on the vulnerability: "The Drupal Security Team follows a 'coordinated disclosure policy': issues remain private until there is a published fix. A public announcement is made when the threat has been addressed and a secure version of Drupal core is also available. Even when a bug fix is made available, the Drupal Security Team is very thoughtful with its communication."

Some interesting insights on Drupal's vulnerability statistics by CVE Details:


1. Keep Calm and Stay Updated – Drupal Security Updates    

The Drupal security team is always on its toes looking out for vulnerabilities. As soon as they find one, a patch or security update is immediately released. Also, after Drupal 8 and the adoption of continuous innovation, minor releases are more frequent, which makes updating to a better, more secure version quick and easy.

Making sure your Drupal version and modules are up to date is really the least you can do to ensure the safety of your website. Drupal contributors are staying on top of things and are always looking for any security threats that could spell disaster. A Drupal security update doesn't just come with new features; it also comes with security patches and bug fixes. Drupal security updates and announcements are emailed to users, and site admins have to keep their Drupal version updated to safeguard the website.

2. Administer your inputs 

Most interactive websites gather inputs from users. As a website admin, unless you manage and handle these inputs appropriately, you are at high risk: attackers can inject SQL code that can cause great harm to your website's data.

Stopping your users from entering SQL-specific words like "SELECT," "DROP" or "DELETE" would harm the user experience of your website. Instead, use the escaping and filtering functions available in Drupal's database API to neutralize harmful SQL injections. Sanitizing your inputs is the most crucial step toward a secure Drupal website.
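Drupal's database API does this in PHP with named placeholders. The underlying principle, binding parameters instead of concatenating strings into the query, can be sketched in any language; here is a Python illustration using sqlite3:

```python
import sqlite3

# A throwaway in-memory database for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

def find_user(conn, name):
    # The driver binds `name` as data: SQL keywords inside it are never
    # executed, so input like "x' OR '1'='1" cannot alter the query.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()
```

Building the same query with string concatenation (`"... WHERE name = '" + name + "'"`) is exactly the pattern that makes injection possible.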

3. Drupal 8 Security

How is Drupal 8 helping in building a more robust and secure website? Here are a few Drupal 8 security features - 

  • Symfony – With Drupal 8 adopting the Symfony framework, it opened doors to many more developers other than limiting them to just core Drupal developers. Not only is Symfony a more secure framework, it also brought in more developers with different insights to fix bugs and create security patches.
  • Twig Templates – We just discussed sanitizing inputs; with Drupal 8, much of that has already been taken care of, thanks to Drupal 8's adoption of Twig as its templating engine. With Twig, you will not need additional filtering and escaping in templates, because output is automatically escaped. Additionally, Twig's enforcement of separation between logic and presentation makes it impossible to run SQL queries from, or otherwise misuse, the theme layer.
  • More Secure WYSIWYG - The WYSIWYG editor in Drupal is a great editing tool for users but it can also be misused to carry out attacks like XSS attacks. With Drupal 8 following Drupal security best practices, it now allows for using only filtered HTML formats. Also, to prevent users from misusing images and to prevent CSRF (cross-site request forgery), Drupal 8’s core text filtering allows users to use only local images.
  • The Configuration Management Initiative (CMI) – This Drupal 8 initiative works out great for site administrators and owners as it allows them to track configuration in code. Any site configuration changes will be tracked and audited, allowing strict control over website configuration.
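The autoescaping Twig performs converts markup characters in variables into HTML entities before output. The effect is the same as Python's html.escape, shown here as a language-agnostic illustration of the idea:

```python
from html import escape

# Untrusted input containing a script injection attempt.
user_input = '<script>alert("xss")</script>'

# Escaping turns markup into entities, so the browser displays the
# text as text instead of executing it.
safe = escape(user_input)
```

Because the template engine applies this to every variable by default, a developer has to explicitly opt out (Twig's `raw` filter) to print unescaped markup, which makes XSS an unusual, deliberate act rather than an easy mistake.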

4. Choose your Drupal modules wisely

Before you install a module, look at how active it is. Are the module's developers active? Do they release updates often? Has it been widely downloaded, or are you the first scapegoat? You will find all these details at the bottom of a module's download page. Also, keep your modules updated and uninstall the ones you no longer use.

5. Drupal Security Modules to the rescue

Just like layered clothing works better than one thick pullover to keep warm during winter, your website is best protected in a layered approach. Drupal security modules can give your website an extra layer of security around it. Some of the top Drupal 8 security modules that you must use for your website –

Drupal Login Security –

This module enables the site administrator to add various restrictions on user login. The Drupal login security module can restrict the number of invalid login attempts before blocking accounts. Access can be denied for IP addresses either temporarily or permanently. 

Two-factor Authentication –

With this Drupal security module, you can add an extra layer of authentication after your user logs in with a user ID and password, such as entering a code that's been sent to their mobile phone.

Password Policy –

This is a great Drupal security module that adds another layer of security to your login forms, thus preventing bots and other security breaches. It enforces certain restrictions on user passwords – like constraints on length, character types, case (uppercase/lowercase), punctuation, etc. It can also force users to change their passwords regularly (a password expiration feature).
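The kinds of constraints the module enforces can be sketched as a simple validation function. The specific rules and messages below are illustrative, not the module's actual configuration:

```python
import re

def violates_policy(password, min_length=10):
    """Return the list of policy rules a password violates.

    Mirrors the kinds of constraints a password policy can enforce:
    minimum length, mixed case, digits, and punctuation.
    """
    problems = []
    if len(password) < min_length:
        problems.append("too short")
    if not re.search(r"[A-Z]", password):
        problems.append("needs an uppercase letter")
    if not re.search(r"[a-z]", password):
        problems.append("needs a lowercase letter")
    if not re.search(r"[0-9]", password):
        problems.append("needs a digit")
    if not re.search(r"[^\w\s]", password):
        problems.append("needs punctuation")
    return problems
```

In the module itself, these rules are configured per role in the admin UI rather than hard-coded.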

Username Enumeration Prevention –

By default, Drupal lets you know whether the username you entered exists (when the other credentials are wrong) or does not exist. This is great for a hacker trying random usernames to find one that's actually valid. This Drupal security module prevents such an attack by changing the standard error message.

Content Access -

As the name suggests, this module lets you give more detailed access control to your content. Each content type can be specified with a custom view, edit or delete permissions. You can manage permissions for content types by role and author.

Coder -

Loopholes in your code can also make way for an attacker. The Coder module (a command line tool with IDE support) goes through your Drupal code and lets you know where you haven’t followed best coding practices.

Security Kit -

This Drupal security module offers many risk-handling features. Vulnerabilities like cross-site scripting, content sniffing, CSRF, clickjacking, eavesdropping attacks and more can be easily handled and mitigated with this Drupal 8 security module.

Captcha -

As much as we hate to prove our humanness, CAPTCHA is probably one of the best Drupal security modules out there to filter unwanted spambots. This Drupal module prevents automated script submissions from spambots and can be used in any web form of a Drupal website.

6. Check on your Permissions

Drupal allows you to have multiple roles and users, like administrators, authenticated users, anonymous users, editors, etc. To fine-tune your website's security, each of these roles should be permitted to perform only a certain type of work. For example, an anonymous user should be given the fewest permissions, such as viewing content only. Once you install Drupal and/or add more modules, do not forget to manually assign and grant access permissions to each role.

7. Get HTTPS

I bet you already knew that any traffic transmitted over plain HTTP can be snooped and recorded by almost anyone. Information like your login ID, password and other session data can be grabbed and exploited by an attacker. If you have an e-commerce website, this gets even more critical, as it deals with payment and personal details. Installing an SSL certificate on your server secures the connection between the user and the server by encrypting the data that's transferred. An HTTPS website can also increase your SEO ranking – which makes it totally worth the investment.

Sep 25 2019

TEN7-Podcast-Ep-071-Kevin-Thull-Drupal-Archivist.mp3


If you've ever watched a Drupal Camp or Con session from the comfort of your home, you likely have our guest Kevin Thull to thank. Thull has recorded almost 1700 Drupal sessions, and he keeps looking for more ways to contribute to the Drupal community. Bonus: If you're a gear nerd you'll love hearing about the evolution of the recording system!     


Kevin Thull, Drupal Developer and Archivist


  • Life in Chicago (Cubs fan forever!)
  • Almost artificial limbs
  • Yet another person that joined Drupal because of the helpful, welcoming community
  • How Kevin got started recording sessions
  • Failing is important!
  • Going on the road to record Camps
  • Giant red button debut
  • Iterating to find the perfect sound and audio recording setup
  • Ad hoc recording becomes the Drupal Recording Initiative
  • Midwest Open Source Alliance (MOSA) and fiscal sponsorship and insurance
  • Aaron Winborn Award
  • Contributing to community without coding



IVAN STEGIC: Hey everyone! You’re listening to the TEN7 podcast, where we get together every fortnight, and sometimes more often, to talk about technology, business and the humans in it. I am your host Ivan Stegic. My guest today is Kevin Thull, a freelance frontend developer and President of the Midwest Open Source Alliance. You may know him as the guy whose session recording kits are omnipresent at Drupal events across the globe. He’s also the 2018 recipient of the Aaron Winborn Award, an award that is presented annually to an individual who demonstrates personal integrity, kindness and above and beyond commitment to the Drupal community. Hey, Kevin. Welcome to the podcast. It’s a great pleasure to have you on.

KEVIN THULL: Thank you. It’s great to be here.

IVAN: I have so many questions. [laughing] I feel like there’s so much to explore. So maybe you’ll consider coming back if we don’t get to it all?

KEVIN: Definitely.

IVAN: Awesome. Okay, I thought we’d start with some background. So right now, you live in Chicago, and you went to the University of Illinois at Chicago. Are you a lifelong Chicagoan?

KEVIN: I am. Born and raised.

IVAN: Born and raised. So, where did life in school start for you?

KEVIN: I’m on the northwest side of Chicago. So, I went to schools in that area. As far as UIC I went there for Bioengineering. My dream was to create artificial limbs, and then I learned I’d pretty much be in school for the rest of my life. I said nope! [laughing]

IVAN: [laughing] Wow. Bioengineering. That’s amazing. What was the motivation for artificial limbs?

KEVIN: It just seemed an interesting and really useful career. At the time, I graduated college in 1989/90, so there wasn’t a whole lot of advancement at that point. So, it just seemed like a really interesting and rewarding career to go into.

IVAN: Yeah. I’ve seen those artificial limbs that are 3D printed these days.

KEVIN: Yeah, it’s incredible.

IVAN: It really is. Do they use Raspberry Pis, I think, in some cases? Or, I don’t know what it is, but it looks like there’s a really inexpensive way to get things done these days.

KEVIN: Yeah. There was an event at my old job, a sponsorship conference, and one of the speakers was one of the inventors of that 3D printing, or some of the innovators of it, and it’s just an incredible story.

IVAN: It really is. It’s kind of what technology and the internet, the original idea behind it was trying to accomplish. Right? Something that can bring the masses. Something that’s cheap and life-changing as a technology, whether it’s hardware or software, it doesn’t matter.

KEVIN: Right. Yeah.

IVAN: So, I have to ask you, since you’re in Chicago. White Sox or Cubs?

KEVIN: I grew up on the northwest side, so Cubs fan forever.

IVAN: Cubs fan. Yes.

KEVIN: I don’t really follow sports at this point anymore.

IVAN: No? Well recent World Champions, I have to give it to you that it’s probably okay to stop following it. Right?

KEVIN: Yeah. And I actually lived near Wrigley Field when that happened.

IVAN: What a beautiful ballpark.

KEVIN: Yeah. It’s wonderful. I hope they don’t change it.

IVAN: Yeah. I’m just so excited about baseball these days, given that the Twins are now number one in the league, and have the best average. We really needed that. I feel like it’s a good omen that that’s what happened to the Cubs who went to the World Series, and now maybe the Twins can do it.

KEVIN: Yeah, that’d be amazing.

IVAN: Definitely would be amazing. [laughing] So, let’s talk a little bit about Drupal. You’ve been in the Drupal ecosystem for more than 10 years with many different areas of interest and expertise from being a site builder to developer, to being involved in the community. Do you remember your first experience with Drupal?

KEVIN: Yeah. Vividly. [laughing] Drupal 6 was just shiny and new, and I was using a product to essentially build a static site, so I was using an early static site generator, just this Perl script that let me create both a car parts website and a couple different product websites, we’ll put it at that. Since it was UI-based the site kept timing out during the rebuilds for the owners of their sites, and telling them, “Oh, you can log in through SSH and run it there.” It was not an option. So, I started evaluating other systems and it was really down to between Joomla and Drupal.

IVAN: Ooh, Joomla.

KEVIN: Right. But feature set, similar, they looked equally capable on paper. So, I looked through support forums, because I’m not a coder by trade, I guess you can say. I can't code my way out of a paper bag, is how I’ll define my programming skills. I’m good with CSS and Sass, but in terms of the rest, even though I went to engineering school, you’d think I’d be better at it. I looked at the community forums for both, and Joomla’s answers were, “Sure we’ll help you for a bounty,” and Drupal’s answers were “Sure we’ll help you. Can I move in with you to help you build this thing?” Sort of that feel. So, at that point I went with Drupal.

IVAN: Yeah, it was the community that got you hooked, it sounds like.

KEVIN: Absolutely. Then I struggled because there were no migration scripts at that point, so I had to find some custom PHP to brute force it into the database, which worked, data tables were a whole lot easier in Drupal 6.

IVAN: Of course.

KEVIN: Yeah. Then I didn’t quite understand the whole contrib cycle. I was like Drupal 5 versus Drupal 6. Well Drupal 6 is new I’ll use that, and then realized I was sort of stuck waiting for contrib to follow-up. I ended up doing an Ubercart site, struggled with a make/model/year selector, and my first community event, because I had learned a lot through videos. You found videos on archive.org from past events, and that got me a long way. But then I was stuck. 

I was very, very introverted, very shy at the time. I still am a little bit. So, I committed to going to an in-person meetup. I was living in the suburbs at the time, and there was a meetup posted and it’s like, “Well ask your questions and we’ll have Jeff Eaton there because he wrote the book on building Drupal," or he’s one of the writers. I’d been listening to Lullabot podcasts. I was having celebrity anxiety. So, he showed up, and I asked my question, “How could you do this?” He’s like, “Oh, I’m so sorry, 'cause basically there is no solution for that right now.” At least I felt good that it wasn’t me. And if he can’t figure it out, then...

IVAN: Is that code still around that you wrote?

KEVIN: No. The site owner ended up migrating out to BigCommerce at some point. He had several different sites. But we had it going for a while, doing lots of imports and CSV files. So, it was a pretty intense project.

IVAN: So that interaction with Jeff Eaton, was that your first in-person involvement in some sort of a community event?

KEVIN: Yeah. It was the very first Drupal Fox Valley meetup.

IVAN: Drupal Fox Valley meetup. Is that still around?

KEVIN: They are. Yeah. I’m sad I don’t live in the suburbs. That’s one of the reasons I’m sad I don’t live in the suburbs, because it’s a pretty far west suburb but it was a great, great group. I met a lot of wonderful people there, and I count that as one of the reasons that I am where I am today, being part of that group in that community.

IVAN: So that was your first exposure to the community. Is it also the first time you started participating and organizing events as well?

KEVIN: More or less. I did some light volunteering at the Chicago Drupal Camp when it was around, but we ended up as a suburban group. It’s a decent commute. There’s a good community in the suburbs so we decided to have our own DrupalCamp Fox Valley. That was October, 2013. That’s also when I decided I was going to record the sessions, because at the time where I worked, we hosted a marketing conference where I basically was involved in recording sessions. So I’m like, well, a) I learned when I started Drupal from session recordings; b) I do this for work, so it was a no brainer in my mind to do that for events that I’m organizing.

IVAN: So, 2013, there’s the first set of sessions that you decide to record and it’s at a meetup? Or it’s at the Fox Valley Camp?

KEVIN: Yeah. We had our Fox Valley DrupalCamp, or DrupalCamp Fox Valley.

IVAN: So, did you go into that camp thinking, “Okay, I’m going to record every single session?” Or did you say, “Let’s iterate. Let’s choose one room and see how it goes?” [laughing]

KEVIN: No. I figured, I do this for work, so, we’re going to get them all, and the method was have a camcorder in the back of the room just to see when slides change, get the slide presentation from the presenter, make stills of each slide, and then kind of rebuild what would be a screen share. Because that was the process that I did at work, but it was for marketing conference.

IVAN: I see.

KEVIN: So, there were maybe 30 slides or so. At the work event, it was a union hotel, so we brought in AV to do the keynote as a live video production, but in the breakout rooms, to cut costs we just got an audio file from them. So, I would get their deck and any videos that they were playing and kind of rebuild it based on the audio and just what I call the "reference record" to see where those slides change.

IVAN: So, you actually had to rebuild every session there was on any live capture?

KEVIN: Correct.

IVAN: Alright. So that’s kind of version one?

KEVIN: Yeah. That was terrible.

IVAN: Was it? [laughing]

KEVIN: Well, there was one talk, it was like a 45-minute talk and over 100 slides, so it took like three hours to rebuild that.

IVAN: Boy, that was really time intensive.

KEVIN: Yeah, and you know, demos were lost. It’s just a completely different medium. It’s funny because friends of mine at the time were like, “Why are you investing so much time in this, in the post-production? You know, nobody’s going to watch these.” I’m like, “It’s important.”

IVAN: Yeah, and it really is important. Thank you for investing the time. It’s such an asset to the community now, I can’t even imagine what it would be without it.

KEVIN: Yeah. I never once imagined it would be what it is today. [laughing]

IVAN: I would love to know about what the next iteration was after you decide, “I can’t handle doing four hours and 100 slides for a 45-minute talk.” What’s the next iteration?

KEVIN: So, shortly after that event was the first MidCamp. That was March of 2014. We were fortunate enough to get the Drupal Association recording kits. So, the same laptops and splitters that they use at DrupalCon. Because apparently if your event falls in the window of when they’re not needed to be shipped or in shipping or en route, you can just pay the FedEx cost to borrow the equipment. So, you get this giant Pelican case, you could fit a body in it because it’s stuffed with laptops and equipment. And that was also a pretty horrible experience, because it was just a lot of setup. They didn’t work terribly well. Every once in a while, in the recording there would be a dropped frame, so you just see a one-second blue frame. So, of course, I edited those out. It was pretty low res and at the end of the event we were exhausted because it was our first one, it’s like, Oh now we have to drag this giant case and find a FedEx to send it home.

IVAN: So, were the laptops themselves doing the recording, and you had to have your presentation on the laptop?

KEVIN: No, there was a splitter. So, the presentation computer fed into a splitter that split to the projector and to the recording MacBook. So, the MacBook was basically running capture software. And to this day that’s the same type of equipment they use at DrupalCon. So, if you go into a session at DrupalCon you’ll see off to the side a table with a laptop on it that has a note saying, “Recording. Do not Touch.”

IVAN: Really.

KEVIN: And it’s just always on.

IVAN: I always wondered about that. Okay, so that’s version 2. So that’s 2014. So, what happens after that? You’ve reduced the amount of time you spend on post-production but you’re still not happy. You still want something better.

KEVIN: Yeah. So, I think it was, DrupalCon Austin was that year, so March is when we did the laptops. Went to DrupalCon Austin. We actually met with the people who produce the videos and I was brainstorming with a fellow organizer, there’s got to be some sort of solution that’s lightweight, inexpensive, device agnostic and no drivers. We kind of came up with this base requirements list and started looking, and it was really difficult to find stuff, because it turns out this is a very lucrative industry, recording events. They don’t want to give away their methods. Even the prosumer-level equipment is really expensive. So, I found this device whose intended market is recording your console gameplay, so it’s HDMI in and out, it records to a thumb drive and it’s got an audio mixer so it can pick up the gameplay audio from your console and then also your commentary through a headset. And it has a standalone mode, because a lot of those console gameplay systems require you to either hook to a console where there’s some sort of interface or attach to a PC so you can run it through software. But it’s the only one that had a standalone mode. So, I’m like, well let me try it. So, I bought one and it worked. So, the second DrupalCamp Fox Valley, which was then later in 2014, was where that kit first debuted.

IVAN: Wow. And what was the cost of the kit at the time? Do you remember?

KEVIN: At that time, they came with a lav mic. So, just the unit itself was, I want to say, $180.00 plus dongle, so maybe low $200 per kit, which is really cheap.

IVAN: That’s really reasonable, yeah. Plus, you have to supply the thumb drive, right?

KEVIN: Yeah. When I think of cost per kit, that’s equipment plus dongles plus recording gear. But we had issues where if you had multiple presenters, you’re handing around this lav mic, which is not a great way to deal with it, and every once in a while there was no audio on the recording. So, you’ve got the screen recording, no problems, but it’s silent. So those were lost. That’s when I decided to add in the Zoom voice recorder, which serves as the mic but also records to an SD card.

IVAN: So that’s the omnidirectional mic that’s hooked up right next to the console?

KEVIN: Yeah.

IVAN: Okay, so that’s the next version after the Fox Valley Camp?

KEVIN: Yeah. That was all exciting. It was promising. I got most of the recordings for Fox Valley. I was going to BADCamp that year, so that was September 2014, BADCamp was San Francisco, would’ve been late October. I just wanted to show off the kits and they’re like, “Well can you actually record some sessions?” I’m like, “Okay, sure.” I probably had two or three at the time. So, I brought them with me, they’re compact, and I recorded sessions and failed miserably.

IVAN: Really?

KEVIN: I think I caught maybe two out of 20 or 30 that I tried.

IVAN: What was the main issue?

KEVIN: So, this time they were bus powered. So, it would plug into the presenter's laptop.

IVAN: So, the assumption was there’s enough juice coming out of the laptop that’ll actually give you consistent power to power it.

KEVIN: Well there’s juice, but if that power gets interrupted before the file is written, then you get a zero K file.

IVAN: Oh, no!

KEVIN: Right. And generally, with equipment it loses the connection; it writes the file, powers down. Not so with this one. So, then that was wonderful and terrifying. It’s like, Okay, good, failing is important.

IVAN: Absolutely.

KEVIN: Right. Because if it’s working you don’t know how to break it, and therefore you don’t know how to fix it.

IVAN: Exactly.

KEVIN: So that was late 2014. March 2015 MidCamp No. 2 is coming up and I was a little scared, because it worked and then it failed, and here we go again. And, MidCamp was a success. So, it’s like, okay, great. What this tells me is I need to take these things on the road and just get more variables into the equation. So, shortly after MidCamp, I sent out a Tweet saying, “Hey Camps, if you’ll cover my airfare and hotel, I’ll record your Camp.” And St. Louis and Twin Cities took me up on it right away.

IVAN: Yeah, we did. It was like a no brainer for Twin Cities DrupalCamp. We were like, Oh, Kevin’s going to record it?” I think I remember voting yes on that request. I’m like, “Yeah, absolutely. Bring him. We’ll do it.”

KEVIN: Yeah, so that was also terrifying, because like, Oh, now this is someone else’s money. But by and large it worked really well. In St. Louis, I had 100% capture. So, like, great, this is good. But over time various variables helped me iterate on the kit, or on the documentation, because at BADCamp one year there’s no time between sessions, and you’ve got six rooms over four buildings, and you’re the only one doing it. It’s like, “Okay, I guess I’m going to make instructions and put them at the podium because I’m not going to be there.” And it worked mostly.

IVAN: The giant red button, when did that make its debut?

KEVIN: The red button was part of what I call the beta kit, the 2014 Fox Valley version. So, it was the camcorder, there was the Drupal laptops, the DA laptops and then the Big Red Button. So that was early on the process and it’s just been a matter of smoothing out the whole bit.

IVAN: So, you took it on the road, you got different variables for the kit. Did the kit stay very similar after your beta process or did you change anything major after you were done with Twin Cities and St. Louis?

KEVIN: The bulk of the changes were adding in redundancies and taking out other variables. So, I added in the digital voice recorder, but then I got a remote for it. So, you hit the red button on the video, you’d hit the button on the audio record, and then when you’re done you stop both. Then so many times people forget to do the audio record, and then I realized, why don’t I just leave this thing to record all day long and take that out of the equation. One less thing for presenters to think about. And that worked.

There have been times, and sort of been the bane of my existence for a bit, because invariably someone would bump the power, or they’d turn off the power strip that everything’s attached to. Well, the video recorder powers on automatically. The audio recorder has to be turned on once it’s got power. So, I would lose it that way. Now there’s batteries in there, so there’s a failover. I now discovered there’s a hold switch so you can’t accidentally stop the recording, which has happened before. So, audio has become pretty solid in terms of capturing it.

Then just accidents. I had a four-port USB power, because one of the AV guys, when BADCamp failed miserably, he’s like, “Maybe there’s not enough USB power for some of this stuff.” For the laptops we’ll get a separate powered hub. So, I did but I thought I had to plug it into the laptop, and so, if the laptop went to sleep, it turned off power to the recorder. In one session I accidentally forgot to plug it in, and I think I know who the person is who, what I call happy accidents, her laptop went to sleep, and her recording continued. I'm like, “Oh, it’s because it’s not plugged in.” Because it’s not plugged into the laptop. So, it didn’t perceive that as a signal loss. Yeah, so, just lots of documentation and happy accidents throughout the years.

IVAN: And are you now at a final version of the rig, or do you have additional changes you want to make for the future?

KEVIN: The issues are currently, for whatever reason, if it’s not Mac OS, even though there’s a voice mixer, an audio mixer in the unit, there’s still no audio on the screen record. So, I don’t know if somehow rather than dubbing the non-audio from the presentation plus the spoken audio from the presenter, rather than mixing them, it’s completely wiped out. So, I had some time before someone’s session who historically had no audio, so I’m like, “Let’s look at your audio system and pick whatever is not chosen.” I assumed that it was choosing HDMI, and we would have to set it to headset. But it was set to headset. I’m like, “Let’s choose HDMI.” Then it worked. So, it’s like, “Oh, that’s cool.” But it’s still not 100%. Either you choose it, there will still be no audio, or it’ll be bad audio that has to be replaced. But it’s improved.

IVAN: I’m just amazed at the speed at which you get these sessions turned around and available online. What’s the secret to doing that?

KEVIN: The unit records to an .mp4 file on the thumb drive that’s attached to it. So, assuming you’ve got good audio, you already have a compressed file to upload to YouTube. So, as long as I go in, like any large break I’ll swap out media, see what I’ve got, fix anything that needs fixing and upload it, assuming then it has decent internet.

IVAN: So, really, you’re not doing any postproduction. That rig does it all for you.

KEVIN: Ideally, right. Yeah. When it works, it works really well. There are some small fixes. I’ve got enough experience where I’ve gotten quick at it. I think my most challenging was last year at DrupalCamp Montreal, it was a completely French spoken session that had no audio. So, trying to time that was tough.

IVAN: Oh, yeah. [laughing]

KEVIN: [laughing] I don’t speak French. So, eventually I figured it out.

IVAN: And what do you think the Achilles heel is with the whole system?

KEVIN: People. And that’s my next focus. When I’m doing it, I get close to 100% capture, pretty consistently. When others are doing it, it’s generally 80% or less, which I’ve learned is still okay. Because it means there are other people doing it, I’m not the blocker. But also, it’s just a matter of presence and making sure that everything’s being checked and rechecked. That a) you’re connecting the presenter laptops; and b) when sessions start, you’re verifying that the recording is recording. You still may lose the one or two that way, but it’s really just a matter of finding the people who care enough to make sure that it’s as successful as possible for any event that they’re managing equipment.

IVAN: How many sessions do you think you’ve recorded since you started in Fox Valley?

KEVIN: I do keep track.

IVAN: You do? So, this is not a guess. Okay. How many. What are we up to?

KEVIN: 1,646 total.

IVAN: Wow.

KEVIN: Although there’s more than that, because I don’t have numbers from Chattanooga. That includes sessions I’ve captured plus sessions—I call them proxy captures. So I now will send equipment to Camps through FedEx. And with instruction documentation. If needed I’ll do a video call with them to kind of go over how the kit works, troubleshooting stuff.

IVAN: And are the kits still around $250? Or has that changed?

KEVIN: So, by adding in the voice recorder that all totals about $450 per setup, which is still relatively affordable. I can get eight of them into a carryon-size Pelican case.

IVAN: That’s great.

KEVIN: They’re portable. They’re lightweight.

IVAN: Yeah. Wow. The quality of the recordings is nothing to be ashamed of. They’re all HD, the audio’s great. I don’t know how you get such great audio. You even get the questions from the auditorium as well.

KEVIN: That’s the audio recorder. I just have it set to multichannel, I think the auto gain and meeting is the setup. The preset. So, it does a good job of it.

IVAN: Yeah. I’m just so proud and amazed and you should be commended at every chance you can get, because this is such an amazing service and such high quality. It’s just amazing to see.

Am I right in saying that you started something called the Drupal Recording Initiative?


IVAN: Tell me about that. What is that?

KEVIN: Yeah, this is a funny story. DrupalCorn Camp in Iowa last year, I was very happy to be able to record it. It was one of the first non-Chicago Camps I went to in 2014-2015, but they always had a way to record their sessions. So, I was never going to record theirs, even though I wanted to. This last year they reached out, I guess they didn’t have their typical contact and they wanted me to record it. So, I’m like, “Yes, absolutely.”

Then Matt Westgate of Lullabot, he gave a keynote and either right before or after that, he just nonchalantly asked me, “So, how’s the recording initiative going?” In my mind I’m like, “Oh wow, you just named this thing.” Because forever I was just like, “I’m just recording stuff.” So, it immediately got an upgrade. So, I had to kind of figure out what that was.

So I tried to do a year-end blog post to say how it’s gone for the year, do a little reporting, and the DA [Drupal Association] reached out to me after that, because this past year I realized that a) because this is bigger than just me, I need to start mentoring people, and so they offered to let me do a guest blog post on the DA’s blog so that it would amplify that. So, I’m like, great what am I going to write?

So, I came up with the initiative. Basically, broke it down into various buckets, like training and mentorship, expanded coverage, improved documentation, funding organization, content discoverability, and that was just basically December of last year. Now it’s just a matter of a three to five-year roadmap.

IVAN: So, this is quite recent. So, this is the end of last year you’re through about six months of it. How’s it going?

KEVIN: Surprisingly well. I think it just goes to show when you create a plan, you’ll start…

IVAN: …you’ll start working on the plan.

KEVIN: Yeah. If you don’t have a plan, you’re not going to achieve results. If you have a plan, you have a roadmap and things to shoot for.

IVAN: And how do we find out more about the Drupal Recording Initiative?

KEVIN: One of the items was open accounting, and in order to do that I put it on Open Collective, whether it links through show notes or something. But if you search "Drupal Recording Initiative," you’ll pretty much find it on Open Collective, and I’ve got the entire initiative spelled out there.

IVAN: Excellent. We’ll link to it in the transcript and the show notes of this podcast episode, so keep it tracked there. But, it’s on opencollective.com, and as you said if you do a search for "Drupal Recording Initiative" it should be one of the first results. And I think it was for me.

KEVIN: Excellent. So, it’s working. [laughing]

IVAN: [laughing] So, it’s working. This is actually a really good segue into a question I had about the Midwest Open Source Alliance. It did say on the recording initiatives webpage that it’s hosted by the Midwest Open Source Alliance. What is MOSA? I’m sure that’s what you call it. Right?

KEVIN: It is what we call it.

IVAN: Okay. What is MOSA?

KEVIN: Yeah. MOSA was born out of the fact that the Drupal Association used to provide fiscal sponsorship for events, primarily in the U.S. They ended that program with the recommendation to transfer over to Open Collective, because they can be your fiscal sponsor. The DA took 10%, which went to the Drupal project. Great. Open Collective was going to take 10% and fund open source in general, also good, but they were going to take 10% on that initial deposit in addition.

So, as an event, we had already paid our 10% to the DA, so we were going to lose another 10% just to transfer. I wasn’t okay with that, especially because we didn’t know anything about Open Collective. So, that felt like a big jump to me. And there’s still issues like insurance, and getting sales tax exemption in Chicago is an issue.

So, some of the issues that we had when the DA was running this sponsorship program were not going to be fixed by moving to Open Collective. So, some of the MidCamp organizers got together, and we had been talking about this for a while, and that was the impetus to form our own nonprofit.

IVAN: And so, the Midwest Open Source Alliance is a federally recognized nonprofit, and you behave the same way that the Drupal Association did. You are fiscal sponsors for camps. I know that Twin Cities DrupalCamp uses you right now.

KEVIN: Yeah. It was primarily a solution for MidCamp, but we realized that if we could fix this for one, we could fix it for more. We tried to keep the scope smaller, geographically by Midwest, but also open the scope and just call it open source.

IVAN: And are you the fiscal sponsor and the insurance and everything else that a camp needs? Like the Open Collective and like the Drupal Association was to us?

KEVIN: That’s the intent. We’re still working on the insurance part. For any camp to be part of MOSA we have to designate an at large board member. So, in this case that was Dan Moriarty. So, he then is a representative of MOSA, so he can sign insurance using MOSA’s name. So, it’s not his name or his company. I didn’t even know that was a problem until I heard about event organizers being sued, because of something on their website.

IVAN: That’s awful.

KEVIN: Yeah. So, this is important. Even with the DA, at one point they provided insurance, and then they realized they couldn’t because it really wasn't part of their structure.

IVAN: The liability.

KEVIN: Yeah. So then here I am buying event insurance under my own name.

IVAN: Ouch.

KEVIN: Which is terrifying.

IVAN: Yeah, that is terrifying.

KEVIN: You do what you can to get your Camp out.

IVAN: Right. And how is MOSA funded? Is it also through a percentage that the members paid?

KEVIN: Yeah. So, we’re taking 5% from events and that’s been enough because it’s all volunteer run. We take 0% from initiatives, so donations. For the Recording Initiative I do pay a 5% platform fee to Open Collective, but no additional cost. Because Open Collective itself is not a fiscal sponsor. There are fiscal sponsors on Open Collective. MOSA is now one of those, and the fiscal sponsor decides what percent they’ll take. So, for camps we don’t organize that through Open Collective, so that way we can get 5% to help keep the lights on. But for initiatives, we don’t need to take anything.

IVAN: And you talked about a plan for the Drupal Recording Initiative. What kind of a plan is there for MOSA? What are you guys hoping to achieve in the next few years?

KEVIN: We’ve got a project board on GitHub mostly to sort of finish. We’re building the bike as we’re riding it. It’s like, oh we have to create a nonprofit and run an event. And, oh, Twin Cities is actually going to be our next event, so now we have to figure that out. So, we’re getting documentation and things of that nature, hashing out insurance. We want year-long insurance for Board members, but also how to cover all volunteers of an event during the phase of the event. So, in theory, events don’t need event insurance. MOSA’s insurance would cover it, in theory. There’s a lot of time to talk to a lot of people to get a lot of quotes.

IVAN: Well, you guys are doing a wonderful job, so I wish you all of the best of luck for MOSA. I know that it felt like TC Drupal was looking for something like MOSA, and I’m just glad that we’re in the Midwest, and we’re able to take advantage of the Open Source Alliance.

KEVIN: Yeah, I’m glad it worked out.

IVAN: So, I think the last thing I want to talk to you about is the Aaron Winborn Award. Last year in 2018 in recognition of this incredible service you’ve been providing to our community, you received the Aaron Winborn award. What an honor to receive that. How did that make you feel?

KEVIN: It was incredibly humbling. I’m definitely not here for anything but to give back. So, to have to stand up and thank people. I understand that people really appreciate what I’m doing, but I’m not here for that. I’m here to just make videos available. So, it’s hard to go up there. I’m a very much behind-the-cameras kind of guy. It was wonderful.

IVAN: It was wonderful to see you accept that.

KEVIN: Thanks. Yeah.

IVAN: Did it change your approach to how and what you’re doing? Did it make it more intense? Did it change anything about your approach?

KEVIN: I don’t think so. I think, if anything, more people know me. [laughing] So I’m now Drupal famous.

IVAN: [laughing]

KEVIN: But aside from that, I’d say no.

IVAN: Well, it’s just wonderful to see. You’re just such a great example of how you can contribute to the community without writing a single line of code. Right?

KEVIN: Well, that’s the whole point.

IVAN: You’re a front end developer. You’ve written code. You’ve got patches in there, but you get an award for not writing code. So, that’s just a testament. So, what do you think your advice would be to those who just joined the Drupal community, or even to any open source community who maybe are not developers or who are young developers, or who just started writing code, maybe they’re afraid to show what they’ve written? What would your advice be to them about wanting to contribute?

KEVIN: Just, if you’re passionate about giving back to a community that you’re getting benefit from, don’t let the fact that you’re not maybe working on core module development, don’t let that stop you. There are so many ways that are either technical-lite or non-technical to give back. Documentation would be a great example for Drupal, because it’s still a sticking point. Plenty of opportunity to contribute there. But, at events you always need "day of" volunteers. There are plenty of non-standard ways to get involved. And also especially to bring in any past experience you have. I did video work, that’s not at all Drupal related, but look how big of an impact it’s made.

IVAN: Kevin, thank you so much for spending your time with me today on the podcast. It’s been a pleasure talking to you.

KEVIN: Well, thank you for having me.

IVAN: Kevin Thull is a freelance frontend developer and President of the Midwest Open Source Alliance. You can find him on Twitter @kevinjthull and on Drupal.org @kthull. And we'll have those in the show notes and in the transcription on the website. You’ve been listening to the TEN7 Podcast. Find us online at ten7.com/podcast. And if you have a second, do send us a message. We love hearing from you. Our email address is [email protected]. And don’t forget, we’re also doing a survey of our listeners. So, if you’re able to, tell us about what you are and who you are, please take our survey as well at ten7.com/survey. Until next time, this is Ivan Stegic. Thank you for listening.

Sep 25 2019

One of our members recently asked this question in support:

Wonder if you have, or can suggest, a resource to learn how to access, authenticate (via OAuth preferably) and process JSON data from an external API?

In trying to answer the question I realized that I first needed to know more about what they were trying to accomplish. Like with most things Drupal, there's more than one right way to accomplish a task. Choosing a solution requires understanding the options available and the pros and cons of each. This got me thinking about the various ways one could consume data from an API and display it using Drupal 8.

The problem at a high level

You've got data in an external service, available via a REST API, that you need to display on one or more pages in a Drupal site. Perhaps accessing that data requires authentication via OAuth2 or an API token. There are numerous ways to go about it. Which one should you choose? And how should you get started?

Some questions to ask yourself before you start:

  • How much data are we talking about?
  • How frequently does the data you're consuming change, and how important is it that it's up-to-date? Are real-time updates required? Or is a short lag acceptable?
  • Does the data being consumed from the API need to be incorporated into the HTML output of Drupal-generated pages? How does it impact SEO?
  • How much control does a Drupal site administrator need to have over how the data is displayed?

While I'm certain this list is not exhaustive, here are some of the approaches I'm aware of:

  • Use the Migrate API
  • Create a Views Query Plugin
  • Write a custom service that uses Guzzle or similar PHP SDK via Composer
  • Use JavaScript

I'll explain each one a little more, and provide some ideas about what you'll need to learn in order to implement them.

Option 1: Use the Migrate API

Use the Migrate API combined with the HTTP Fetchers in the Migrate Plus module to ingest data from an API and turn it into Drupal nodes (or any entity type).

In this scenario you're dealing with a data set that doesn't change frequently (a few times per day, maybe), and/or it's okay for the data displayed on the site to lag a little behind what's in the external data service. This approach is somewhat analogous to using a static site generator like Gatsby, or Sculpin, that requires a build to occur in order for the site to get updated.

In this case that build step is running your migration(s). The result is you'll end up with a Drupal entity for each record imported that would be no different than if a user had created a new node by filling out a form on your Drupal site. In addition, you get the complete extract, transform, load pipeline of the Migrate API to manipulate the ingested data as necessary.
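For a concrete picture, here's a minimal sketch of what such a migration definition might look like. The endpoint URL, field names, and machine names are all hypothetical, and the exact keys depend on your Migrate Plus version:

```yaml
# Hypothetical config: migrate_plus.migration.example_articles.yml
id: example_articles
label: 'Import articles from a remote JSON API'
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - 'https://api.example.com/v1/articles'
  # Migrate Plus can add authentication here, e.g.:
  # authentication:
  #   plugin: oauth2
  #   grant_type: client_credentials
  item_selector: data
  fields:
    - name: remote_id
      selector: id
    - name: title
      selector: title
  ids:
    remote_id:
      type: integer
process:
  title: title
destination:
  plugin: 'entity:node'
  default_bundle: article
```

With this in place, running the migration (for example via `drush migrate:import example_articles`) is the "build step": it creates or updates one article node per API record.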

Pros:
  • If you've worked with Migrate API before, this path likely provides the least friction
  • Data is persisted into Drupal entities, which opens up the ability to use Views, Layout Builder, Field Formatters, and all the other powerful features of Drupal's Entity & Field APIs
  • You can use Migrate API process plugins to transform data before it's used by Drupal
  • Migrate Plus can handle common forms of authentication like OAuth 2 and HTTP Basic Auth

Cons:
  • Requires a build step to make new or updated data available
  • Data duplication; you've now got an entity in Drupal that is a clone of some other existing data
  • Probably not the best approach for really large data sets

Option 2: Create a Views Query Plugin

Write a Views Query Plugin that teaches Views how to access data from a remote API. Then use Views to create various displays of that data on your site.

The biggest advantage of this approach is that you get the power of Views for building displays without the need to persist the data into Drupal as entities. This approach is also well suited for scenarios where an existing module already integrates with the third-party API and provides a service you can use to communicate with it.
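To make that concrete, here is a stripped-down sketch of a Views query plugin. The module name, plugin ID, and endpoint are hypothetical; a real implementation would also inject an HTTP client service (rather than calling \Drupal statically) and describe its fields to Views via hook_views_data():

```php
<?php

namespace Drupal\example_remote\Plugin\views\query;

use Drupal\views\Plugin\views\query\QueryPluginBase;
use Drupal\views\ResultRow;
use Drupal\views\ViewExecutable;

/**
 * Views query plugin that reads rows from a remote API instead of SQL.
 *
 * @ViewsQuery(
 *   id = "example_remote_api",
 *   title = @Translation("Example remote API"),
 *   help = @Translation("Query a remote JSON API.")
 * )
 */
class RemoteApiQuery extends QueryPluginBase {

  /**
   * {@inheritdoc}
   */
  public function execute(ViewExecutable $view) {
    // Fetch records from the remote API and turn each one into a
    // Views result row.
    $response = \Drupal::httpClient()
      ->request('GET', 'https://api.example.com/v1/articles');
    $records = json_decode($response->getBody(), TRUE);

    $index = 0;
    foreach ($records as $record) {
      $row = new ResultRow($record);
      $row->index = $index++;
      $view->result[] = $row;
    }
  }

  // Views calls these on its query object while building the view; for a
  // non-SQL backend they can be no-ops.
  public function ensureTable($table, $relationship = NULL) {
    return '';
  }

  public function addField($table, $field, $alias = '', $params = []) {
    return $field;
  }

}
```

Once the plugin and its hook_views_data() entry exist, site builders can create views against the remote "table" in the normal Views UI.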

Pros:
  • You, or perhaps more importantly your editorial team, can use Views to build a UI for displaying and filtering the data
  • Displays built with Views integrate well with Drupal's Layout Builder and Blocks systems
  • Data is not persisted in Drupal and is queried fresh for each page view
  • Can use Views caching to help improve performance and reduce the need to make API calls for every page load

Cons:
  • Requires a lot of custom code that is very specific to this one use-case
  • Requires in-depth understanding of the underpinnings of the Views API
  • Doesn't allow you to take advantage of other tools that interact with the Entity API

Option 3: Write a Service using Guzzle (or similar)

Write a Guzzle client, or use an existing PHP SDK to consume API data.

Guzzle is included in Drupal 8 as a dependency, which makes it an attractive and accessible utility for module developers. But you could also use another similar low-level PHP HTTP client library, and add it to your project as a dependency via Composer.

Guzzle is a PHP HTTP client that makes it easy to send HTTP requests and trivial to integrate with web services. --Guzzle Documentation

If you want the most control over how the data is consumed, and how it's displayed, you can use Guzzle to consume data from an API and then write one or more Controllers or Plugins for displaying that data in Drupal. Perhaps a page controller that provides a full page view of the data, and a block plugin that provides a summary view.

This approach could be combined with the Views Query Plugin approach above, especially if there's not an existing module that provides a means to communicate with the API. In this scenario, you could create a service that is a wrapper around Guzzle for accessing the API, then use that service to retrieve the data to expose to views.

If you need to do anything other than GET (POST, PUT, etc.) from the API in question, you'll almost certainly need to use this approach. The above two methods deal only with consuming data from an API.
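As a sketch of that wrapper-service idea, here's a hypothetical class built on Drupal's bundled Guzzle client. The class name, endpoint, and token-based auth header are assumptions; the service would be registered in your module's services.yml with '@http_client' as an argument so Drupal injects its shared client:

```php
<?php

namespace Drupal\example_remote;

use GuzzleHttp\ClientInterface;

/**
 * Thin wrapper around Guzzle for a single external API.
 */
class ArticleFetcher {

  /**
   * The HTTP client (Drupal's shared Guzzle instance).
   *
   * @var \GuzzleHttp\ClientInterface
   */
  protected $httpClient;

  public function __construct(ClientInterface $http_client) {
    $this->httpClient = $http_client;
  }

  /**
   * Fetches and decodes a list of articles from the remote API.
   */
  public function fetchArticles($token) {
    $response = $this->httpClient->request('GET', 'https://api.example.com/v1/articles', [
      'headers' => [
        // Simple token auth for illustration; swap in a Guzzle OAuth
        // middleware if the API uses OAuth 2 instead.
        'Authorization' => 'Bearer ' . $token,
        'Accept' => 'application/json',
      ],
    ]);
    return json_decode((string) $response->getBody(), TRUE) ?: [];
  }

}
```

A page controller or block plugin can then have this service injected and hand the returned array to a render array or Twig template, giving you full control over the markup.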

Pros:
  • Able to leverage any existing PHP SDK available for the external API
  • Some of the custom code you write could be reused outside of Drupal
  • Greatest level of control over what is consumed, and how the consumed data is handled
  • Large ecosystem of Guzzle middleware for handling common tasks like OAuth authentication

Cons:
  • Little to no integration with Drupal's existing tools like Views and others that are tailored to work with Entities

Option 4: JavaScript

Use client-side JavaScript to query the API and display the returned data.

Another approach would be to write JavaScript that does the work of obtaining and displaying data from the API, then integrate that JavaScript into Drupal as an asset library. A common example of something like this is a weather widget that displays the current weather for a user, or a Twitter widget that displays a list of the most recent Tweets for a specific hashtag.

You could also create a corresponding Drupal module with an admin settings form that allows a user to configure various aspects of the JavaScript application, then expose those configuration values using Drupal's JavaScript settings API.

While it's the least Drupal-y way of solving this problem, in many cases this might also be the easiest -- especially if the content you're consuming from the API is for display purposes only and there is no reason that Drupal needs to be aware of it.
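Combining the two ideas above, an asset library plus exposed settings, a minimal sketch might look like the following. The `.example-api-widget` element, the `drupalSettings.exampleWidget.endpoint` value, and the response shape (an array of `{title, url}` records) are all assumptions for illustration:

```javascript
// Pure helper: turn an array of {title, url} records into list markup.
function renderArticleList(articles) {
  return '<ul>' + articles
    .map(function (a) {
      return '<li><a href="' + a.url + '">' + a.title + '</a></li>';
    })
    .join('') + '</ul>';
}

// When running inside a Drupal page, attach as a behavior; the endpoint
// would come from the module's settings form via drupalSettings.
if (typeof Drupal !== 'undefined' && typeof drupalSettings !== 'undefined') {
  Drupal.behaviors.exampleApiWidget = {
    attach: function (context) {
      var el = context.querySelector('.example-api-widget');
      if (!el || el.dataset.processed) {
        return;
      }
      el.dataset.processed = 'true';
      fetch(drupalSettings.exampleWidget.endpoint)
        .then(function (res) { return res.json(); })
        .then(function (articles) {
          el.innerHTML = renderArticleList(articles);
        });
    }
  };
}
```

Keeping `renderArticleList()` pure makes the markup generation testable outside the browser. A production version would use Drupal's `once()` utility instead of the `data-processed` guard, and should escape the API values before injecting them into the page.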


Pros:

  • Data is consumed and displayed entirely by the client, making it easier to keep up-to-date in real time.
  • Existing services often provide JavaScript widgets for displaying data from their system in real time that are virtually plug-and-play.
  • Code can be used independent of Drupal.


Cons:

  • No server-side rendering, so any part of the page populated with data from the external API will not be visible to clients that don't support JavaScript. This also has potential SEO ramifications.
  • You can't query the API directly if it requires an API key that you need to keep secret (e.g., because the key has access to POST/PUT/DELETE resources). In that case, you would need server-side code to act as a proxy between the API and the JavaScript frontend.
  • Drupal has no knowledge of the data that's being consumed.
  • Drupal has little control over how the data is consumed, or how it's displayed.

Learn more about this approach:

Honorable mention: Feeds module

The Feeds module is another popular method for consuming data from an API, and serves as an alternative to the Migrate API approach outlined above. I've not personally used it with Drupal 8 yet, and would likely choose the Migrate API simply because I have much more experience with it. Feeds is probably worth at least taking a look at, though.


There are a lot of different ways to approach the problem of consuming data from an API with Drupal. Picking the right one requires first understanding your specific use case, your data, and the level of control site administrators are going to need over how it's consumed and displayed. Remember to keep in mind that turning the data into Drupal entities can open up a whole bunch of possibilities for integration with other aspects of the Drupal ecosystem.

What other ways can you think of that someone might go about solving the problem of consuming data from an API with Drupal?

Sep 25 2019
Sep 25
“Drupal is here to stay, it's only getting bigger with the scale of engagements we are in, our wish is for India to Choose to Lead.” - Drupal India Association

“What is the most resilient parasite? Bacteria? A virus? An intestinal worm? An idea. Resilient... highly contagious. Once an idea has taken hold of the brain it's almost impossible to eradicate. An idea that is fully formed - fully understood - that sticks; right in there somewhere.” This is a dialogue from Christopher Nolan’s Inception (2010) that is congruous with different scenarios of life where you are looking forward to new beginnings and working towards that. An idea can make you ponder over a plethora of options to make something great happen. Drupal India Association (DIA) is also a result of the work of brilliant people and their visionary ideas.

Drupal India Association written on a wall and images of buildings on right hand side

Like the Drupal Association, which helps the Drupal community across the globe build, secure and promote Drupal in addition to providing funding, online collaboration, infrastructure and education, there was clearly great value in forming a national-level association in India. Channelising funds for events, acting as a bank of thought leaders, and preventing scheduling conflicts would all require a central body. This is exactly what led to the formation of Drupal India Association.

Floating an idea: How DIA came to fruition


The discussions on forming DIA were happening as early as 2012. The idea was to have a central organisation that has an India-wide presence and recognition. The key areas that such a central body would address are:

  • Promotion: Whether you need to organise Drupal-related events (DrupalCamps, DrupalCon, Drupal training etc.) in India or want to know where you should advertise those events, it can all be streamlined with the presence of a central organisation. You will have access to a wonderful group of thought leaders from the Drupal community of India who can answer your questions related to Drupal promotion. In short, this will be essential to engaging the open-source community within India and helping the Drupal community in India grow even bigger.
  • Funding: Such a central body can also help simplify the funding process that is imperative to organise large Drupal-related events.
  • Schedule: The window for different Drupal-related events to be scheduled can be easily decided. The question of two or more Drupal events happening concurrently is nullified.

It was only in 2018 that the resolve to plan a regional chapter strengthened. This was when the Drupal community in India came together to chalk out an action plan.

A woman's face on top and a long sentence below

The interest among the Drupal community members in India was palpable.


Efforts started bearing fruit in 2019, when everything fell into place. At DrupalCamp Delhi 2019, Drupal India Association was announced as the newly formed organisation.


Synergy has developed among the different thought leaders from various agencies, including Vidhatanand (Chief Engagement Officer at OpenSense Labs).

A group of men and women sitting around a huge table: Representatives of different agencies meeting at DrupalCamp Pune 2019 to discuss DIA

There is hope that Drupal India Association will inspire more such local chapters, and the Drupal community is already looking forward to many more associations along similar lines.


The Vision

After all the brainstorming and insightful discussions, DIA is finally here, and it is here with a mission. Be it marketers, agencies or developers, it has something to offer everyone.


The primary vision of Drupal India Association is to provide value for the member organisations and the Drupal community in India. DIA’s emphasis will be on boosting digital innovation using Drupal and enabling more agencies to innovate with Drupal. DIA will be steadfast in its goals of identifying tech events where it can participate and hiring a big booth where every member organisation can take part.


Popularising Drupal in India and setting an example for the rest of the world is one of the objectives of DIA. With the help of DIA, marketers will be able to change the way people look at India when it comes to Drupal development and its role in it. DIA will also pave the way for India to have a colossal influence over the Gulf and ASEAN (Association of Southeast Asian Nations) regions. Cities in India that were never on the radar of the Drupal community will now hold Drupal camps and meetups. DIA will be responsible for preparing a calendar of events with the aim of promoting Drupal across different cities in India.


Drupal India Association’s objective is to proliferate the Drupal contributions coming from India, and it will keep working towards that to make a huge impact.


From being just an idea in its incipient stages to being a central body, Drupal India Association has come a long way, and it still has a lot to look forward to. A massive country like India shows a lot of promise when it comes to increasing the adoption of Drupal by more agencies, making Drupal even stronger, and leading the way. Drupal India Association is committed to making it all happen.

Ping us at [email protected] to know more about Drupal, its remarkable merits and how you can make your invaluable contributions to the growth of Drupal.

Sep 25 2019
Sep 25

For e-commerce sites offering training courses or events, a particularly useful feature is letting visitors subscribe to the training course or event in question so that they are notified as soon as a new session or date becomes available. The benefit is twofold: the user receives a real-time notification as soon as a new session is available, and the e-commerce site learns how much interest its various events and training courses generate, which can encourage it to strengthen certain products rather than others, in other words to respond to a demand that is being expressed.

This is the main objective of the Commerce Product Reminder module, which we will discover here.

The configuration of the module is quite simple. Its main concept is to provide a subscription form on Drupal Commerce's Product entities, allowing users to subscribe with an email address, and then to notify all subscribers as soon as a new variation of the product in question is published (or an existing unpublished variation is published again).

Module configuration

The module offers us several general configuration options

Commerce Product Reminder general settings

So we can:

  • Disable the sending of notification emails if necessary
  • Use background tasks to send notification mails (recommended option)
  • Log the sending of each notification
  • And finally, select the different product types of Drupal Commerce on which to activate the subscription form.

The module configuration options also allow you to customize the various text elements of the subscription form as well as those of the notification emails sent.

Commerce Product Reminder form settings

The various configurable text elements of the subscription form are:

  • An introductory text for the subscription form
  • The label of the form submission button
  • The subscription confirmation message
  • And finally an introductory text on the page allowing anonymous visitors to manage their different subscriptions (The link to this page is automatically available on all notification emails sent to subscribers)

Commerce Product Reminder mail settings

The configurable text elements for notification emails are:

  • The sender email (leave blank to use the site's main email address by default)
  • The body of the mail (tokens associated with the Product and Product Variation entities are available)
  • The subject of the email

Once these different elements have been configured, all that remains is to activate the subscription form on the different product types, on the relevant view mode (generally the Full view mode).

Commerce Product Reminder Extra field

And now all anonymous visitors can subscribe to receive a notification as soon as a new variation is published related to the product in question.

Commerce Product Reminder subscription form

Extending the module's functional scope

The module's functional scope consists above all in notifying users of a new variation published on a Drupal Commerce product. Its main contribution is to provide storage for subscribers, give them access to a page where they can manage their subscriptions as anonymous visitors, and then handle the sending of notification emails.

The logic that determines when notification emails are sent ultimately represents a very small part of the module and can easily be modified by a Drupal 8 developer, for example if you want to trigger notifications based on any other property or field of a product variation.

It is enough to implement an EventSubscriber that reacts to the product variation events dispatched by Drupal Commerce itself, or to override the EventSubscriber provided by the module.

<?php

namespace Drupal\commerce_product_reminder\EventSubscriber;

use Drupal\commerce_product\Entity\ProductVariationInterface;
use Drupal\commerce_product\Event\ProductEvents;
use Drupal\commerce_product\Event\ProductVariationEvent;
use Drupal\commerce_product_reminder\HelperServiceInterface;
use Drupal\Core\Queue\QueueFactory;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Class ProductVariationSubscriber.
 */
class ProductVariationSubscriber implements EventSubscriberInterface {

  /**
   * Drupal\commerce_product_reminder\HelperServiceInterface definition.
   * @var \Drupal\commerce_product_reminder\HelperServiceInterface
   */
  protected $helper;

  /**
   * Queue factory.
   * @var \Drupal\Core\Queue\QueueFactory
   */
  protected $queueFactory;

  /**
   * ProductVariationSubscriber constructor.
   * @param \Drupal\commerce_product_reminder\HelperServiceInterface $helper
   * @param \Drupal\Core\Queue\QueueFactory $queue_factory
   */
  public function __construct(HelperServiceInterface $helper, QueueFactory $queue_factory) {
    $this->helper = $helper;
    $this->queueFactory = $queue_factory;
  }

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    $events[ProductEvents::PRODUCT_VARIATION_INSERT] = ['onProductVariationInsert'];
    $events[ProductEvents::PRODUCT_VARIATION_UPDATE] = ['onProductVariationUpdate'];
    return $events;
  }

  /**
   * This method is called when the product_variation_insert event is dispatched.
   * @param \Drupal\commerce_product\Event\ProductVariationEvent $event
   *   The dispatched event.
   */
  public function onProductVariationInsert(ProductVariationEvent $event) {
    $product_variation = $event->getProductVariation();
    if ($product_variation->isPublished()) {
      $this->sendMailForReminderRelated($product_variation);
    }
  }

  /**
   * This method is called when the product_variation_update event is dispatched.
   * @param \Drupal\commerce_product\Event\ProductVariationEvent $event
   *   The dispatched event.
   */
  public function onProductVariationUpdate(ProductVariationEvent $event) {
    $product_variation = $event->getProductVariation();
    $product_variation_original = $product_variation->original;
    if (!$product_variation_original instanceof ProductVariationInterface) {
      return;
    }
    // Notify only when a previously unpublished variation becomes published.
    if ($product_variation->isPublished() && !$product_variation_original->isPublished()) {
      $this->sendMailForReminderRelated($product_variation);
    }
  }

  /**
   * Send reminder mails or queue them.
   * @param \Drupal\commerce_product\Entity\ProductVariationInterface $product_variation
   */
  protected function sendMailForReminderRelated(ProductVariationInterface $product_variation) {
    if (!$this->helper->isEnabled()) {
      return;
    }
    $data = [];
    $data['product_variation_id'] = $product_variation->id();
    $reminders = $this->helper->getRemindersFromVariation($product_variation);
    foreach ($reminders as $reminder) {
      $data['reminder_id'] = $reminder->id();
      if ($this->helper->useCron()) {
        // Queue the notification to be sent on cron (queue name assumed here).
        $this->queueFactory->get('commerce_product_reminder')->createItem($data);
      }
      else {
        $this->helper->sendMail($product_variation, $reminder);
      }
    }
  }

}
You can then modify the methods in charge of reacting to product variation insert or update events and introduce your own business logic. The module takes care of all subscription management and sends notification emails according to your needs.

A very likely evolution of the module will be to allow easier modification of the notification-triggering logic, whether via a plugin system or additional configuration options, or even to support other contributed modules, such as those related to inventory management.

Sep 25 2019
Sep 25

Our dedicated Global Maintenance Team works diligently with our clients to keep their sites updated, secure, and fresh. In this blog, we’ll outline three common maintenance practices we use to keep our clients happy and their sites running smoothly.

Quick Response Times

Regular maintenance can prevent many common issues, but even properly updated sites can have problems. And, when they inevitably do, clients need a quick response – that’s precisely where our team excels. Whether a site is down or clients need help editing content, we’re ready to help. 
We use the same channels of communication both internally and externally, and this is one of the reasons we have such quick response times. All of our client conversations about projects take place in Slack, so clients can raise an issue at any time and get a quick reply from anybody on the team. This means we can take action immediately, whether it's by troubleshooting over chat, creating a ticket for support, or escalating the task for immediate action.

In all cases, we’re able to deliver swift and transparent solutions for our clients because we are able to communicate directly with them.


Accidents happen, and when they do we can help. In one recent situation, a client deleted a user and subsequently deleted all the content associated with that user as well. It wasn’t immediately clear what had happened, but the site’s performance was suffering. We jumped in to investigate and found the root cause. From there, it was only a matter of contacting amazee.io (our favourite hosting provider) to restore the backup on production. After that, everything came back to life and returned to normal. We were able to investigate and solve the client's issue in a transparent and timely manner.

amazee.io Drupal Example


We have maintenance and support clients that come to us after building their sites with our Amazee Labs Development Team, as well as clients who hire us to take care of sites built elsewhere. Maintaining existing sites built by other agencies means the code may or may not be in great shape. Every time we onboard a new client, we audit their site. During this process, we check if the modules or core are hacked, patched or up to date. We also check the caching settings and any custom code. Once we’ve done this, and fixed potential issues, we fully onboard new clients into our systems and tools (Github, Lagoon, Jira, etc). 

Site Audit Example

Drupal updates (... and patch parties)

We help our clients understand the importance of frequently updating their sites. In most cases, updates are lightweight and come with instant benefits, like performance and security. Updating the core and modules that make up a site is a common task for our team. For clients that prefer to update their sites less frequently, these updates can be done periodically or in batches. But critical security updates are a different story. 

Security Advisories Example

Every now and then, critical security updates are released for Drupal core or specific modules. These updates need to be pushed immediately because neglecting them can make a site vulnerable to hacking, a loss of data, or both. For critical security updates to be implemented quickly, the maintenance team holds patch parties. 

During a patch party, we get team members from all around the world to focus on making sure all our clients’ sites are secure. For some sites, we have automation scripts, for others, we need to do things manually. Either way, we get all hands on deck to monitor everything and keep our clients updated.

During these concerted efforts, it’s great to have a globally distributed team so we can work continuously to make sure every site is updated, functional, and secure. 

Important events

One of the benefits of keeping our client communication in Slack is that it’s visible to everyone on the team. That way, someone is always available to help and the client is able to monitor progress.

For important events on certain sites (newsletters, leads exports, etc) we use Slack integration to make sure everything runs smoothly and everyone knows what’s happening and when. You can read more on the subject in this blog post.

With the right tools and our dedicated team of experts, we make sure our clients' sites stay secure and up to date. Stay tuned for more blog posts in this series. 

If you’d like to learn more about the benefits of keeping your website well maintained and ahead of the competition, drop us a line. We’d love to hear from you. 

Sep 25 2019
Sep 25

Video of Drupal 8 Override Node Options Module - Daily Dose of Drupal Episode 235

The Drupal 8 Override Node Options module is a simple module that allows you to control who can edit specific node options when creating or editing nodes. This includes things such as the published checkbox, sticky checkbox, promoted to front page checkbox, revision information, and authoring information. This is a useful module for building out a more complex content workflow, or for simplifying the content editing experience on your Drupal 8 site by hiding unneeded node options.

Download and install the Override Node Options module just like you would any other module.

composer require drupal/override_node_options

After installing the module, you can configure the module to turn on Global permissions across all node types and specific permissions for each node type. These checkboxes just add additional permissions options on the permissions page.

If you go to the permissions page and search for the Override Node Options section, you will see the available permission options. Here you can set the permission for who can view the various authoring fields that show up on the Node edit forms. You can easily use this to configure specific roles to be able to edit only the authoring information you want them to be able to access.

Sep 24 2019
Sep 24

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week.

You'll find that the meetings, while also providing updates of completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide assistance on, we encourage you to get involved.

Drupal 9 Readiness Meeting

September 16, 2019

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know to keep up-to-date for the easiest Drupal 9 upgrade of their sites are also welcome.

  • It usually happens every other Monday at 18:00 UTC.
  • It is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • The transcript will be exported and posted to the agenda issue.

Guzzle, Diactoros, symfony/http-client, and PSR-7, PSR-17, and PSR-18

Drupal 9/8 PHP version requirements

MySQL 5.7 and MariaDB 10.1 will officially end support in Oct 2020

Chx suggested splitting the MySQL and MariaDB drivers eventually as they continue to diverge.

Stable upgrade status, but missing features

Gábor Hojtsy announced Upgrade Status went stable a few days ago. There are various missing features: 

Documenting deprecated code / triggering deprecation errors properly

Drupal 8.8 is the deadline for:

Remove core's own uses of deprecated APIs

Drupal core's own deprecation testing results are really close to done.

Drupal Module Upgrader 7 => 9 directly

There’s been lots of work recently by Amit Goyal, Rohit Joshi, and Pranit Jha to add Drush 9 support and make the transformations produce Drupal 9 compatible results. They also made the test suite green and are looking into the possibility of writing new transformations with Rector. Unfortunately, due to dependency conflicts, Rector cannot be added to a Drupal instance until "Support PHPUnit 7 optionally in Drupal 8, while keeping support for 6.5" is resolved.

Deprecations in contrib

Admin UI Meeting

September 18, 2019

  • Meetings are for core and contributed project developers as well as people who have integrations and services related to core. 
  • Usually happens every other Wednesday at 2:30pm UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • There are roughly 5-10 minutes between topics for those who are multitasking to follow along.
  • The Admin UI Meeting agenda is public and anyone can add new topics in the document.

Design Revision 1

The breakpoint for Cards was set to 85rem, and the Vertical Tabs style was updated.

Design revision 2: heading spacings

We have several options for defining the default:

  • Equal top and bottom space: 1em or 0.75em (margin: 1em 0; or margin: 0.75em 0;)
  • Only top: 1em or 0.75em.
  • Only bottom: 1em or 0.75em.
  • Different spacings for top and bottom: margin: 1em 0 0.75em; (the option chosen)

UX meeting

We did a Claro demo and found some bugs. The first one concerns messages without an icon and title; it is being worked on in a follow-up.

Drupal Core Cross-Initiative Meeting

September 19, 2019

UX Update

Cristina Chumillas talked about UX with the following update:

  • Claro is on track, design components need to be reviewed and blockers resolved.
    • Several issues are nearly complete.
    • Several that still need work from the blocker's list.
  • Issue submitted to add Claro to Drupal Core:
    • Close to getting a green light to add it as an alpha module.
  • Next steps:
    • Need additional accessibility maintainer reviews.
    • Need additional RM support to understand what the level of sign off should be from each of the maintainers.

Workspaces Update

Andrei Mateescu talked about Workspaces with the following update:


  • List of stable blockers obtained after meeting with maintainers/rm’s.
  • Before marking stable, we need a conversion of path aliases to entities (big patch), which adds some risk.
  • 2 other major asks:
    • Compatible w/ content moderation.
    • Ability to add subworkspaces.
  • On track right now for getting into core, pending final reviews of those changes.


  • None right now.

Next steps:

  • Work on the 3 major things identified, get final reviews.

Multilingual Migration

Gábor Hojtsy, Alex Pott, and Michael Lutz talked about Multilingual Migration with the following update:


  • Most issues to get it committed progressing, Alex Pott is working with V Spagnolo on final ones.
  • Hard to get data migrated correctly and still grab old migrations to fit them into new formats.
    • For revisions + translations, one big overhaul of node revision table is the solution landed on to maintain backwards compatibility.
  • Solution is actually working (yay!) just need to do some cleanup, testing & validation.


  • Testing is required to validate the solution will work for people who are expecting granular controls.

Next steps:

  • Testing / Validating the solution to manage both revisions and translations.
  • Later reviews from entity subsystem maintainers, framework mgmt, PM, and RM will need to happen.
    • Meet with Lucas Hedding working with V Spagnolo to review the potential breakdown-scenarios.

Drupal 9

Gábor Hojtsy provided a Drupal 9 update:


  • Deprecations => making a ton of progress, some hard ones left.
  • Symfony 4 => several mysterious issues found, working to resolve those issues.
  • Upgrade status => Stable release hurraayyy!! So far no issues found apart from one person who had no UI showing.
  • Drupal Module Upgrader => working w/ community to make it produce D9 compatible code. 1.5 tag released yesterday, no feedback yet.
  • Rector => converts your d8 code to d9, looking into merging this into drupal module upgrader.
  • Contrib deprecation errors, Ryan Aslett at the DA is helping to resolve the results of deprecations across contrib! 
  • Core deprecation => requirements recently redefined, D9 branch should be opened with D8.9 branch in less than a month! Whoo-hoo!


Next steps:

  • Hey initiative owners: make your stuff Drupal 9 ready, please.
  • Resolve final deprecations.
  • Keep working on Symfony 4 issues identified.
  • Test modules / deprecated API’s and hold a D9 sprint in Amsterdam.
  • Keep looking into merging rector.


Keith Jay provided the following update on Demo: 


  • Progressing for working with layout builder for recipe type in the basic format.
  • Working on expanding features.
  • Working on making a front-page based on layout builder.
  • Creating more tooltips.
  • Great new content coming from a UK-based chef.
  • Layout switcher also in progress.
  • 3081587 => may have a core-related issue, to be continued.


Next steps:

  • Keep working through the above issues, nothing needed.

Auto Updates

Lucas Hedding gave the following update on Auto Updates:


  • 3 parts:
    1. PSA tells you the update is coming.
    2. Readiness checks are preflight checks to confirm your site can be updated.
    3. In-place updates do the fix.
  • The first two parts have been released in alpha.
  • Video podcast prepared, will be live first weekend of October.
  • DA blog post to promote will follow.
  • In-place updates are also progressing.


  • One issue that needs further discussion to get to RTBC, could use core committer review on this so they know what the blockers are.

Next steps:

  • Work through final issues related to part 3 => in-place updates.
  • Testing and validation to get to RTBC => beta release with all features.
  • Work through the issues identified => stable contrib release.
  • Core release will happen later, not to be rushed at this point.


Ryan Aslett gave the following update on Composer:


  • Down to the last couple of items.
  • Made tons of progress.


Next steps:

  • Final reviews / Remediations from core committers and Alex Pott.
  • Write documentation / enablement supports / marketing & promotion of improvements
  • Commit to 8.8!
Sep 24 2019
Sep 24

"Disability is an avoidable condition caused by poor design.”

This is a sentence that I found in a Certified Professional in Accessibility Core Competencies (CPACC) study guide, and it has given me a new perspective on accessibility. 

For those of us who are steeped in the world of web accessibility, an openness to constantly broadening perspectives is at the heart of our effectiveness -- whether we’re thinking about a wide range of experiences when designing sites or paying close attention to the language surrounding disabilities. 

Words matter, and how we talk to and about people with disabilities factors into the bigger picture of our effectiveness as accessibility evangelists. 

Here are some of the essential lessons that I have learned recently, both as a web accessibility developer and as a person devoted to understanding a wide range of perspectives.

Put People First

People-first language puts the person before their disability -- sending a subtle, but powerful, signal that the person is not defined by their disability.
Some examples of person-first language: 

  • A person who is blind 
  • A person with a hearing impairment
  • A person who uses a wheelchair

Notice that when the word “person” comes before any mention of a disability, we are literally putting the person first.

Identity-First Dilemma

While it may appear to be a direct contradiction to person-first language, identity-first language places the disability before the person. Individuals who prefer this form of speech argue that having a disability has had a major influence on their lives and who they are as a person. Their disability is nothing that they need to hide or be ashamed of.  

Some examples of identity-first language:

  • A blind person
  • A hearing-impaired person
  • A disabled person

Given that these two types of language can appear to be in direct contradiction to each other, determining what’s the preferred form and when to use it can be confusing.

In general, it is fair to assume that person-first language does not offend anyone. It is a benign form of speech and if the individual does have a preference, they will usually inform you of such. However, if there is any doubt or discomfort about which form of speech a person with a disability prefers, it’s okay to just ask.

More Insights

Best intentions don’t guard against oversights, nor do they grant the ability to view the world from another person’s perspective. So let’s take a closer look at some of the terms that many of us use in our day-to-day language, as well as some outdated language and some terms you should absolutely never use.

“Accessible” vs. “Disabled” or “Handicapped”

When talking about places with accommodations for people with disabilities, use the term "accessible" rather than "disabled" or "handicapped." 

This is how the importance of this distinction was explained to me. I was asked as a mother of a small child whether I ever used the “handicapped” bathroom stall in a public restroom. I said, “sure.” I was then asked why. 

“Well,” I said, “because it is large and easier to maneuver in with my child.” The person then said, “so you use the bathroom because it is more accessible.”

"Uses a Wheelchair" vs. "Wheelchair Bound"

“Wheelchair bound” is a term that many of us use in our daily language and we should avoid it. It has a restrictive connotation, and implies that the wheelchair is a negative thing, instead of something that broadens possibilities and makes a person’s life more manageable. The wheelchair is a tool that helps to provide access, not a punishment that the individual is bound to. Instead of “wheelchair bound,” try saying “uses a wheelchair” or “wheelchair enabled.” 

Strike from the Vocabulary!

There are certain terms we all should remove from our vocabulary entirely. They include retarded, retard, handicapable, cripple, crippled, victim, stricken, and special needs. All of these terms are negative, and the first two on the list are absolutely unacceptable. In every case, they imply that people with disabilities are not “normal.”

Some additional “don’ts”

  • Don’t ask a person with a disability how they became disabled.
  • Don’t assume that all disabilities are easily observed. The fact that a person using an accessible parking spot is not using a walking aid does not mean that they are lazy or disrespecting the needs of legitimate users of the space. They could have a pain condition or some other issue preventing them from walking long distances. Often, there is more to a situation than can be detected from a casual observation. 
  • When working remotely, don’t presume that you know everyone’s story. There is much that you may not know concerning a team member or client on the other end of a call -- even if it is a video call. Making language sensitivity a habit, in all circumstances, is not just the right thing to do. It’s good business. 

At Promet Source, we’re actually a lot more interested in the “do’s” of accessible web experiences than the “don’ts.” So if you’re looking for an empowering web design that’s excellent and accessible, contact us today.

Sep 24 2019
Acquia partners with Vista Equity Partners

Today, we announced that Acquia has agreed to receive a substantial majority investment from Vista Equity Partners. This means that Acquia has a new investor that owns more than 50 percent of the company, and who is invested in our future success. Attracting a well-known partner like Vista is a tremendous validation of what we have been able to achieve. I'm incredibly proud of that, as so many Acquians worked so hard to get to this milestone.

Our mission remains the same

Our mission at Acquia is to help our customers and partners build amazing digital experiences by offering them the best digital experience platform.

This mission to build a digital experience platform is a giant one. Vista specializes in growing software companies, for example, by providing capital to do acquisitions. The Vista ecosystem consists of more than 60 companies and more than 70,000 employees globally. By partnering with Vista and leveraging their scale, network and expertise, we can greatly accelerate our mission and our ability to compete in the market.

For years, people have speculated about Acquia going public. An IPO remains a great option for Acquia, but I'm also happy that we will stay a private and independent company for the foreseeable future.

We will continue to direct all of our energy to what we have done for so long: provide our customers and partners with leading solutions to build, operate and optimize digital experiences. We have a lot of work to do to help more businesses see and understand the power of Open Source, cloud delivery and data-driven customer experiences.

We'll keep giving back to Open Source

This investment should be great news for the Drupal and Mautic communities as we'll have the right resources to compete against other solutions, and our deep commitment to Drupal, Mautic and Open Source will be unchanged. In fact, we will continue to increase our current level of investment in Open Source as we grow our business.

In talking with Vista, who has a long history of promoting diversity and equality and giving back to its communities, we will jointly invest even more in Drupal and Mautic. We will:

  • Improve the "learnability of Drupal" to help us attract less technical and more diverse people to Drupal.
  • Sponsor more Drupal and Mautic community events and meetups.
  • Increase the amount of Open Source code we contribute.
  • Fund initiatives to improve diversity in Drupal and Mautic; to enable people from underrepresented groups to contribute, attend community events, and more.

We will provide more details soon.

I continue in my role

I've been at Acquia for 12 years, most of my professional career.

During that time, I've been focused on making Acquia a special company, with a unique innovation and delivery model, all optimized for a new world. A world where a lot of software is becoming Open Source, and where businesses are moving most applications into the cloud, where IT infrastructure is becoming a metered utility, and where data-driven customer experiences make or break business results.

It is why we invest in Open Source (e.g. Drupal, Mautic), cloud infrastructure (e.g. Acquia Cloud and Site Factory), and data-centric business tools (e.g. Acquia Lift, Mautic).

We have a lot of work left to do to help businesses see and understand the power of Open Source. I also believe Acquia is an example for how other Open Source companies can do Open Source right, in harmony with their communities.

The work we do at Acquia is interesting, impactful, and, in a positive way, challenging. Working at Acquia means I have a chance to change the world in a way that impacts hundreds of thousands of people. There is nowhere else I'd want to work.

Thank you to our early investors

As part of this transaction, Vista will buy out our initial investors. I want to give a special shoutout to Michael Skok (North Bridge Venture Partners + Underscore) and John Mandile (Sigma Prime Ventures). I fondly remember Jay Batson and me raising money from Michael and John in 2007. They made a big bet on me — at the time, a college student living in Belgium, when Open Source was anything but mainstream.

I'm grateful for the belief and trust they had in me and the support and mentorship they provided the past 12 years. The opportunity they gave me will forever define my professional career. I'm thankful for their support in building Acquia to what it is today, and I am thrilled about what is yet to come.

Stay tuned for great things ahead! It's a great time to be an Acquia customer and Drupal or Mautic user.


Sep 24 2019

Published: 24 Sep 2019 | Author: Colan Schwartz
Tags: Drupal Planet, Aegir, DevOps

Aegir is often seen as a stand-alone application lifecycle management (ALM) system for hosting and managing Drupal sites. In the enterprise context, however, it’s necessary to provide multiple deployment environments for quality assurance (QA), development, or other purposes. Aegir makes this process trivial by allowing sites to be easily copied from one environment to another in point-and-click fashion from the Web front-end, eliminating the need for command-line DevOps tasks, which is exactly what it was designed to do.

Setting up the environments

An Aegir instance needs to be installed in each environment. We would typically have three of them:

  • Development (Dev): While generally reserved for integration testing, it is sometimes also used for development (e.g. when local environments cannot be used by developers or there are a small number of them).
  • Staging: Used for QA purposes. Designed to be a virtual clone of Production to ensure that tagged releases operate the same way as they would there, before being made live.
  • Production (Prod): The live environment visible to the public or the target audience, and the authoritative source for data.

(While outside the scope of this article, local development environments can be set up as well. See Try Aegir now with the new Dev VM for details.)

To install Aegir in each of these, follow the installation instructions. For larger deployments, common architectures for Staging and Prod would include features such as:

  • Separate Web and database servers
  • Multiple Web and database servers
  • Load balancers
  • Caching/HTTPS proxies
  • Separate partitions for (external) storage of:
    • The Aegir file system (/var/aegir)
    • Site backups (/var/aegir/backups)
    • Database storage (/var/lib/mysql)
  • etc.

As these are all out of scope for the purposes of this article, I’ll save these discussions for the future, and assume we’re working with default installations.

Allowing the environments to communicate

To enable inter-environment communication, we must perform the following series of tasks on each Aegir VM as part of the initial set-up, which only needs to be done once.

Back-end set-up

The back-ends of each instance must be able to communicate. For that we use the secure SSH protocol. As stated on Wikipedia:

SSH is important in cloud computing to solve connectivity problems, avoiding the security issues of exposing a cloud-based virtual machine directly on the Internet. An SSH tunnel can provide a secure path over the Internet, through a firewall to a virtual machine.

Steps to enable SSH communication:

  1. SSH into the VM.
    • ssh ENVIRONMENT.aegir.example.com
  2. Become the Aegir user.
    • sudo -sHu aegir
  3. Generate an SSH key. (If you’ve done this already to access a private Git repository, you can skip this step.)
    • ssh-keygen -t rsa -b 4096 -C "ORGANIZATION Aegir ENVIRONMENT"
  4. For every other environment from where you’d like to fetch sites:
    1. Add the generated public key (~/.ssh/id_rsa.pub) to the whitelist for the Aegir user on the other VM so that the original instance can connect to this target.
      • ssh OTHER_ENVIRONMENT.aegir.example.com
      • sudo -sHu aegir
      • vi ~/.ssh/authorized_keys
      • exit
    2. Back on the original VM, allow connections to the target VM.
      • sudo -sHu aegir
      • ssh OTHER_ENVIRONMENT.aegir.example.com
      • Answer affirmatively when asked to confirm the host (after verifying the fingerprint, etc.).
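The key-generation half of steps 3 and 4 can be condensed into a short sketch. The key comment is hypothetical, and a throwaway key path is used here so the example is self-contained; on a real Aegir VM you would run this as the aegir user and let the key land in ~/.ssh:

```shell
# Sketch only: generate a 4096-bit RSA key (no passphrase, quiet mode)
# at a throwaway path instead of the aegir user's real ~/.ssh/id_rsa.
ssh-keygen -t rsa -b 4096 -C "Example Aegir Dev" -f /tmp/aegir_example_key -N "" -q

# The printed public key is what step 4.1 appends to
# ~/.ssh/authorized_keys for the aegir user on the other VM:
cat /tmp/aegir_example_key.pub
```

Once the public key is in place on the target and the host fingerprint has been accepted, the two back-ends can talk without further interaction.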

Front-end set-up

These steps will tell Aegir about the other Aegir servers whose sites can be imported.

  1. On Aegir’s front-end Web UI, the “hostmaster” site, enable remote site imports by navigating to Administration » Hosting » Advanced, and check the Remote import box. Save the form. (This enables the Aegir Hosting Remote Import module.)
  2. For every other server you’d like to add, do the following:
    1. Navigate to the Servers tab, and click on the Add server link.
    2. For the Server hostname, enter the hostname of the other Aegir server (e.g. staging.aegir.example.com)
    3. Click the Remote import vertical tab, check Remote hostmaster, and then enter aegir for the Remote user.
    4. For the Human-readable name, you can enter something like Foo's Staging Aegir (assuming the Staging instance).
    5. You can generally ignore the IP addresses section.
    6. Hit the Save button.
    7. Wait for the server verification to complete successfully.

All of the one-time command-line tasks are now done. You or your users can now use the Web UI to shuffle site data between environments.

Select remote site to import

Deploying sites from one environment to another

Whenever necessary, this point-and-click process can be used to deploy sites from one Aegir environment to another. It’s actually a pull method as the destination Aegir instance imports a site from the source.

Reasons to do this include:

  • The initial deployment of a development site from Dev to Prod.
  • Refreshing Dev and Staging sites from Prod.

The import process is as follows:

  1. If you’d like to install the site onto a new platform that’s not yet available, create the platform first.
  2. Navigate to the Servers tab.
  3. Click on the server hosting the site you’d like to import.
  4. Click on the Import remote sites link.
  5. Follow the prompts.
  6. Wait for the batch job, Import and Verify tasks to complete.
  7. Enable the imported site by hitting the Run button on the Enable task.
  8. The imported site is now ready for use!

The article Aegir DevOps: Deployment Workflows for Drupal Sites first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Sep 24 2019
Sep 24

The European edition of the 2019 DrupalCon likely features a more diverse and exciting palette of possible sessions to attend than any previous European ‘Con. There are so many of them that it’s not an easy task picking the ones you absolutely don’t want to miss. 

We at Agiledrop are especially excited by the Business + Marketing track. Since it’s practically impossible to cover all the tracks without missing most of the great sessions, we decided to focus on this track, as well as the more general Industry track. 

Without further ado, here are our picks for the must-see business, marketing and industry sessions at next month’s DrupalCon. Hope to catch you at some of them!

Business + Marketing track

The Art of Mentorship

Monday, October 28, 16:25 - 16:45 @ G104

Maria Totova, Drupal developer, trio-group communication & marketing gmbh, Coding Girls
Todor Nikolov, Drupal developer, Tech Family Ventures, Coding Girls

This session will dive into the importance of mentorship and how the relationship benefits both mentee and mentor. Being mentors themselves, Maria and Todor will share their experiences with teaching and give some tips on effective mentorship. 

If you’re thinking about becoming a mentor, but have some hesitations, or if you’re already mentoring someone, but feel like you could use some improvements, this is definitely a session you’ll want to attend.

The Good, The Bad and The Data: Marketing Strategies for Open Source Companies

Monday, October 28, 17:15 - 17:35 @ G102

Felix Morgan, Content Manager, Amazee Group

This is the perfect session for companies working with open source software that are struggling with marketing. Amazee’s Felix Morgan will present some marketing best practices for such companies by covering three different topics: personas and stakeholders; community and narrative; and data.

Winning and retaining long term clients

Tuesday, October 29, 17:15 - 17:55 @ G103

Owen Lansbury, Co-founder, PreviousNext

Acquiring clients is already a major challenge agencies have to deal with. Retaining these clients, then, and turning them into long-term clients is an even greater challenge. Owen’s session will provide insights on spotting and winning over the types of clients with whom you can forge a long-term relationship, as well as then cultivating that relationship.

Women on top: How to get (and keep) women in your leadership roles

Wednesday, October 30, 9:00 - 9:40 @ G109

Shannon Vettes, Factories Program Manager, Acquia
Lindsey Catlett, Drupal architect, Newell Brands
Jenn Sramek, Director of Learning Services, Acquia

It’s no secret that there’s quite a scarcity of women in technology, especially in positions of leadership. But this lack of diversity is actually harmful to business itself; teams with a greater percentage of women and with women as leaders are generally more productive and successful.

This session will talk about the bias towards women in IT and illustrate the challenges they face in this field, while also providing tips to combat this and attract and retain a diverse range of talent.

Industry track

How to start contributing to Drupal without Code

Monday, October 28, 15:25 - 15:45 @ G102

Paul Johnson, Drupal Director, CTI Digital

Non-code contributions to open source are just as welcome as code contributions, and often even more needed. Too often, however, they go underappreciated.

Fortunately, Paul Johnson is remedying this in the Drupal community and encouraging contribution of any kind. His session will serve as a stepping stone for non-developers working in Drupal to get involved and start contributing.

Drupal’s place in an evolving landscape - Modernising your Commerce architecture

Tuesday, October 29, 10:30 - 11:10 @ G106

Richard Jones, CTO, Inviqa

One of the big buzzwords in Drupal right now is “headless” or “decoupled”. Alongside Drupal, another area where the “headless” approach is gaining ground is ecommerce. In his session, Richard will take a look at the evolution of commerce websites, as well as how Drupal can be used in the commerce ecosystem as the content and experience layer.

In Their Own Words: Stories of Web Accessibility

Wednesday, October 30, 15:25 - 15:45 @ G103

Helena McCabe, Technical Account Manager, Lullabot

Even though the situation is improving, accessibility is still much too often considered of secondary importance when setting up a website. During her session, Helena McCabe will share first-person stories of people with disabilities, with the aim of inspiring attendees to adopt a more inclusive and accessible mindset when designing experiences for the web.

4 Keys to a Successful Globalization Strategy and CMS Platform Architecture

Wednesday, October 30, 15:00 - 15:40 @ Auditorium

Ann-Marie Shepard, Domain Architect, IBM
Tina Williams, Digital and Content Strategist, IBM

For a business operating in international markets, it’s no easy task to keep producing relevant content and maintain web platforms for all the different audiences it’s trying to reach. A well thought-out globalization strategy is needed for this. 

In this session, you’ll learn both the business requirements and the technical solution behind IBM’s optimization of Drupal 8’s translation capabilities to support a successful globalization strategy. 

This was our selection of some of the most interesting sessions from the upcoming DrupalCon. Of course, with so many different tracks, there are many more great ones to attend - you can check out the whole day-by-day and track-by-track program here. See you in Amsterdam next month!

Sep 24 2019

Drupal 8.8.0 will be released in December 2019, and the upcoming changes to the JSON:API module's codebase introduce huge performance benefits.

Here are three things that prove it:

1. Recent patches committed to JSON:API in Drupal 8.8

https://www.drupal.org/project/drupal/issues/3039730 is a simple issue that ensures that when you request information about related entities, the resource type information for each relationship is statically cached, so that when multiple entities of the same entity type and bundle are requested, that information doesn't have to be collected over and over again.

https://www.drupal.org/project/drupal/issues/2819335 adds a cache layer to store normalized entities, so that when we need the normalized version of an entity we can fetch it from the cache instead of normalizing the whole entity again, which can be a very expensive process.

https://www.drupal.org/project/drupal/issues/3018287 introduces a new cache backend to store the JSON:API resource type information that was previously held in a static cache. This means that instead of creating JSON:API resource types on every request, we create them just once after a cache clear.
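For context, each profiling case below boils down to a single collection URL whose query string carries the filters, sort, and page size. The host and field names in this sketch are hypothetical; only the query-parameter shape (filter, sort, page[limit]) follows the JSON:API specification that the Drupal module implements:

```shell
# Hypothetical host and fields; a Case-I-style collection request:
# 50 nodes, two filters, sorted by title.
BASE="https://example.com/jsonapi/node/article"
QUERY='filter[status]=1&filter[field_category.name]=News&sort=title&page[limit]=50'
echo "${BASE}?${QUERY}"
```

Every entity returned by such a request previously had to be normalized from scratch on every hit; with the caches above, repeat requests can skip most of that work.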

2. Profiling using blackfire.io

I was able to do some profiling to compare the JSON:API core module in Drupal 8.7 versus 8.8. Here are the initial conditions:

  • PHP 7.3
  • JSON:API version 8.7
  • No JSON:API Extras
  • Page Cache module disabled.
  • Dynamic Page Cache module is set to cache.backend.null, which forces a 100% cache miss rate.
  • Cleared all caches.
  • Visit user login page to rebuild the container and essential services.
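The cache.backend.null condition above is a standard one-line settings.php override. A sketch of the line involved is shown below, written to a temporary file here rather than a real sites/default/settings.php so the example is self-contained:

```shell
# Sketch: the settings.php line that forces 100% Dynamic Page Cache
# misses by pointing the bin at the null backend. On a real site this
# goes into sites/default/settings.php.
cat >> /tmp/example.settings.php <<'EOF'
$settings['cache']['bins']['dynamic_page_cache'] = 'cache.backend.null';
EOF
cat /tmp/example.settings.php
```

Nulling out this bin ensures the profiles measure the full render and normalization cost on every request, not a cache hit.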

Case I

Visit the first JSON:API endpoint, which loads 50 nodes with 8 fields, 2 computed fields, and 2 filters, sorted by title.

JSON:API 8.7 - URL 1

Case II

Visit the second JSON:API endpoint, which loads 2 nodes with 45 paragraph fields; each paragraph field has 6 fields and 2 computed fields, with 1 filter.

JSON:API 8.7 - URL 2

Then update JSON:API to 8.8; all other initial conditions were the same as before.

Case I

Visit the first JSON:API endpoint, which loads 50 nodes with 8 fields, 2 computed fields, and 2 filters, sorted by title.

JSON:API 8.8 - URL 1

Case II

Visit the second JSON:API endpoint, which loads 2 nodes with 45 paragraph fields; each paragraph field has 6 fields and 2 computed fields, with 1 filter.

JSON:API 8.8 - URL 2


Case I

The comparison shows 79% improvement in response time.

URL1 comparison from JSON:API 8.7 to JSON:API 8.8

There are 39 more SQL queries on JSON:API in Drupal 8.8.

A detailed look at those queries shows additional calls to the new cache bin added by JSON:API, but most importantly, there are 50 fewer queries to the url_alias table.

URL1 query comparison from JSON:API 8.7 to JSON:API 8.8

The function-call profile also shows fewer calls to the Entity API and the normalizers.

URL1 function comparison from JSON:API 8.7 to JSON:API 8.8

Case II

The comparison shows 66% improvement in response time.

URL2 comparison from JSON:API 8.7 to JSON:API 8.8

There are 35 more SQL queries on JSON:API in Drupal 8.8.

These are the same additional calls to the new cache bin.

URL2 query comparison from JSON:API 8.7 to JSON:API 8.8

The function-call profile also shows fewer calls to the Entity API and the normalizers, same as before.

URL2 function comparison from JSON:API 8.7 to JSON:API 8.8

I ran the same scenarios with Redis cache backends instead of the default database backends. The results show the same kind of improvements.

3. Raw response comparison

What matters is how this all plays out on the website.

JSON:API 8.7 first page load on cold cache

JSON:API 8.8 first page load on cold cache

          Before     After     Improvement
  URL1    2.6 sec    1.3 sec   2x faster
  URL2    4.5 sec    1.8 sec   2.7x faster
  URL3    7.7 sec    2.5 sec   3.1x faster
  URL4    7.5 sec    2.4 sec   3.1x faster
  URL5    7.2 sec    2.5 sec   2.9x faster
  Overall 10.3 sec   3.8 sec   2.7x faster


In short, JSON:API in Drupal 8.8 is going to be significantly faster than its predecessor!

Improving performance like this takes enormous effort, and this was a community accomplishment. Special thanks to @ndobromirov, @kristiaanvandeneynde, @itsekhmistro, and, last but not least, the hardworking maintainers of the JSON:API module: @e0ipso, @gabesullice, and @Wim Leers. Without their work, support, and guidance this would not have been possible. Please give them a shoutout on Twitter or come say ‘hi’ in the Drupal Slack #contenta channel. If you are interested in JSON:API and its performance, please feel free to help out at https://www.drupal.org/project/issues/search?status%5B%5D=Open&issue_tags_op=all+of&issue_tags=Performance%2C+API-First+Initiative.

Thanks to @Wim Leers for feedback on this post!


Posted by Jibran Ijaz
Senior Drupal Developer

Dated 24 September 2019


Sep 24 2019


The e-learning boom is gaining momentum. This can be explained not only by its convenience but also by its low cost. Many businesses interested in developing their employees can save a lot of money by implementing e-learning instead of offline learning. Besides, according to SHIFT, 42% of companies say that implementing e-learning has led to a revenue increase. 

This article will show you why you should use Drupal for creating an e-learning platform. 

Drupal Learning Management System (LMS)

Take a quick read about Drupal's advantages. 

Remember, online learning platforms for businesses should be tailored to the company’s goals and should fulfill employees' needs. For example, you need to increase sales, but some sales employees don’t possess the necessary knowledge. That means you need to provide a course where the employees will learn this information. It’s also possible to create a course that not only provides knowledge but also trains existing skills.

Here are some tips the company can use while providing an educational course to the employees:

  • Use blended learning. Don’t concentrate on only one way of producing study materials. Drupal allows mixing learning tools such as video, texts, and flashcards, because it supports video-based modules, gaming and gamification content, scenario-based learning, microlearning modules, multilingual courses, images, and infographics.

  • Create learning paths. Tracking how different participants learn helps you identify problems that need to be cut out and improvements that need to be made. For example, if your employee doesn’t know how to prioritize everyday tasks to achieve a company goal, the e-learning platform should train the employee to do it. 

  • Reward and recognize training achievements. Even on e-learning platforms, rewards motivate studying. Here’s a tip: build a micro-community of students where they can act as experts and students at the same time. Some students can help others by correcting their answers and receive nominations for these corrections, like “the best correction”, “the best mentor”, or “the best-provided information”. Such gamification increases engagement.

  • Ask your employees what they need. This is the easiest way to engage employees in corporate learning and make the learning effective. Notably, the revenue generated per employee is 26% higher for companies that offer e-learning, according to Topyx.

Drupal modules and profiles for e-learning

First, define the company’s e-learning goal, then choose a realization tool. It can be a profile or a combination of modules. We made a review of Drupal modules and profiles that can be used for e-learning goals. Take a look and choose the best fit for your company.

  1. Opingo
    The Opingo profile includes a flexible access system based on roles such as student, teacher, coach, and administrator. It has the Theory and Quiz modules for creating engaging and interactive content. Opingo offers adaptive learning path management, where training materials can be adapted to each student according to his/her previous achievements and conditional rules. There is also a module that manages virtual classroom sessions and allows you to implement instructor-led sessions.
    One of Opingo's advantages is its built-in certificate module. If a student successfully completes the training, a PDF certificate is automatically generated. An e-learning platform is more valuable when it provides students with certificates.

  2. Open Academy
    This is a Drupal distribution that brings the best of web publishing for higher education to a customizable Drupal platform. It’s designed to provide academic departments in higher education with a ready-to-use website. It has sections for courses, news, people, events, and publications, with recent news, events, and publications listed on the main page. Open Academy is easy to use thanks to its simple design, and it covers most academic departments' needs well.

  3. Julio
    Julio is a distribution targeted at schools, school districts, small colleges, and academic departments within universities. There are sections about faculty and staff, student life, academics, admissions, parents and guardians, and sports. Authorized users can create announcements, events, galleries, and group posts in all sections except faculty and staff.

  4. Course
    The Application Programming Interface (API) in the Course module helps define goals that can be added to the employee’s workflow.
    Course objects can be marked as graded or ungraded. The module also provides a framework for integrating external learning applications.

  5. Room Reservations
    This module is built for managing the study rooms used while learning.

  6. Quiz
    The Quiz module in Drupal allows building graded quizzes with analytics; the results can be displayed both during and after the quiz.
    It’s an effective way to track a student’s progress because it’s easy to analyze. Students can see their improvement and also receive feedback from an administrator. This module also includes the certificate module. Read about 10 Drupal modules for quizzes.

  7. User Progress API
    The User Progress API module was sponsored by Pennsylvania State University. It was developed for charting students’ progression through a system. 

  8. Badges
    The digital badging Drupal module helps provide a visual demonstration of an achievement, enhancing the accomplishment in the eyes of the student as well as college admission officers.
    Digital images such as badges help recognize a learner’s skills and achievements. That type of visualization gives students an opportunity to actually feel their improvement while studying.  

  9. Certificate
    The Certificate module creates and awards PDF certificates using tokenized HTML templates.

  10. H5P - Create and Share Rich Content and Applications
    This module is an authoring tool for rich content - you can create interactive videos, flashcards, board games, etc. Besides, H5P also enables your site to import and export H5P files. H5P files are package files for HTML5 content and applications. You can easily upload them to your Drupal site and publish rich Internet content and applications. 

  11. Social learning/Messageboards
    Communication between students matters, just like at a university. The Social login and Social share Drupal modules allow users to log in via social network sites, which also helps with sharing content within the student’s network.
    You can also read about 10 Drupal modules for communication with users and 10 free Drupal modules for integration with Social Media.

While providing an e-learning platform to your employees, don’t forget that a company learning management system should be well-designed; otherwise, all good intentions will sink in the sea of failed user experience.  

Also, guarantee a consistent instructor presence on the platform. This solves two types of problems. First, it becomes easy to ask technical questions as they arise, so studying goes faster. Second, when a lack of motivation captures some students, they have an opportunity to turn to the instructor for encouragement.

Sep 23 2019

The friendliness of Drupal 8 for content editors and website administrators grows every day. New handy features come thick and fast — updated Media Library interface, built-in drag-and-drop Layout Builder, media embed button in CKEditor, and so much more. 

Today, we are happy to announce another exciting improvement: a new and modern administration theme, Claro, is expected to come to the D8.8 core! 

Why Drupal 8 needed a new administration theme

The idea of a new administration theme arose as part of the Drupal team’s drive to make Drupal more competitive in every respect. 

Drupal creator Dries Buytaert wrote that he had talked in 2018 to almost a hundred Drupal agency owners to discover their key stumbling blocks in selling Drupal. One of their common replies was about the admin interface having an outdated look. 

The admin UI theme, Seven, had been created back in 2008-2009 in the time of D7, with a few updates in D8. Since then, a lot of water has passed under the bridge and plenty of modern UI design trends have appeared.

There was also an admin UX study performed in 2018, in which content editors were asked in detail about their impressions of working with Drupal, and they suggested many improvements. According to Suzanne Dergacheva, a well-known contributor who worked on this study, Drupal 8 is “notoriously intimidating” for content editors who are newbies. 

So the great minds of the Drupal community agreed that the admin UI really needed a good brush-up and a new, clean, and modern theme.

The Claro theme in Drupal 8: a good core candidate

One of the results of the above-described decisions was the appearance of the Claro theme. It is a clean, concise, responsive theme with a modern look and an enhanced level of web accessibility. It is being built on top of the Seven theme. 

New Claro admin theme in Drupal 8.8

It is now a contributed project, but there is a proposal to add the Claro theme to Drupal 8.8.0 core as an experimental theme.

The development is in full swing, with the project’s new version, alpha 5, released in September 2019. The maintainers actively welcome any feedback and bug reports about the new theme to brush it up. 

Claro theme principles and features

Here are some features of Claro, both completed and planned:

  • a new, colder color scheme
  • higher contrasts
  • touchscreen readiness
  • the Quick Edit, Toolbar, and Contextual Links components
  • redesign of content pages and all their components
  • redesign of the file and image upload widgets

The Claro theme is built in accordance with the Admin UI & JavaScript Modernisation Initiative. It strictly follows the new Drupal 8 admin theme design guidelines:

  • precise shapes and good contrasts
  • clear hierarchy and relations between elements
  • the clear purpose of each element
  • rational use of white space
  • optimal readability
  • emphasis on what matters
  • visual clues
  • friendly and cheerful colors

and more.

New admin theme Claro expected in Drupal 8.8

Claro at the Drupal Usability meeting

The theme is receiving special attention at the Drupal Usability meeting, according to Gábor Hojtsy’s tweet. The meeting’s experts are asking for more feedback, so it looks like the new project is going to be polished to perfection.

Gábor Hojtsy about the new Claro admin theme

Claro as part of the top Drupal 8 distribution

Claro is already having a good test drive because it is part of the most popular D8 distribution, or installation kit — Lightning. 

By the way, speaking about Drupal 8 for content editors, we must admit Lightning is a distribution well-tailored to their needs and great for media and publishing websites. It is actively used by almost 3,000 websites. By itself, Claro is installed on 500+ sites today.

New admin theme Claro is coming to Drupal 8.8

Use the benefits of Drupal 8 for content editors with us!

As you see, the present and future features of Drupal 8 for content editors are very attractive. To keep up with them, you can contact our development and support team, who will smoothly update your website to all new releases. 

Sep 23 2019
Sep 23

It's a Monday morning and you push your first bit of code for the week. Suddenly all your JavaScript tests start failing with crazy errors you've never seen before! And it's not just happening on one project! This post will hopefully help you track down the fix to the Bad Message 400 errors plaguing WebDriver.

Here at PreviousNext, we have automated processes to ensure our PHP images are updated on a weekly basis. On September 22nd 2019, that update included a version bump of the curl library from 7.65.1 to 7.66.0. This had a cascading effect which resulted in builds across all of our projects failing JavaScript tests run against selenium/standalone-chrome containers.

The errors looked something like this:

WebDriver\Exception\CurlExec: Webdriver http error: 400, payload :<h1>Bad Message 400</h1><pre>reason: Bad Content-Length</pre>

We were able to compare an old version of the PHP image (from a week earlier) and track down that version change in curl. But why was it failing? We didn't want to just pin curl back to the old version and dust our hands off.

Let's dive into the void (stacktrace)

WebDriver\Exception\CurlExec: Webdriver http error: 400, payload :<h1>Bad Message 400</h1><pre>reason: Bad Content-Length</pre>


When inspecting the code through the above trace, we found that the instaclick/php-webdriver library was responsible for issuing the actual cURL command and was throwing the exception.

Op to the Rescue

When looking through the recent commits of the library, Nick Schuch noticed a suspicious commit that sounded a bit fishy. Sure enough, manually applying those changes got all the tests green again!

But how do we fix it for good? That's where it gets a bit tricky due to some composer constraints (as per usual).

Unfortunately the instaclick/php-webdriver library's HEAD is quite far aHEAD of the latest stable release (1.4.5), and we aren't able to simply bump to dev in our composer file due to behat/mink-selenium2-driver (a Drupal core dev requirement) constraining us to 1.x.

The Fix

The easiest approach for now is to commit a patch file locally to your repository and manually patch the library until the maintainer tags a new stable release.

First download a custom patch file I've prepared against 1.4.5:

wget https://gist.githubusercontent.com/acbramley/c2809699c4dbf1774a14d89722743395/raw/e2c1479a73e7e9faff200802671bf982b6f3ac56/gistfile1.txt -O instaclick-curl-fix.patch

Then patch the library (using cweagans/composer-patches) with the new patch file by adding the following to the patches key in your composer.json:

"instaclick/php-webdriver": { "fix cURL POST": "instaclick-curl-fix.patch" }

Then simply run composer update instaclick/php-webdriver

Photo of Adam Bramley

Posted by Adam Bramley
Senior Drupal Developer

Dated 23 September 2019

Add new comment

Sep 22 2019
Sep 22

Though big data, AI, cloud, blockchain, and open banking have been around for quite a long time, transforming the way financial services are designed and rendered, there is still a roadblock in the way ahead: core banking infrastructure.

Outdated architecture, costly licenses, specialized consultants: all of these hinder accessible FinTech services from keeping pace with current trends, even in this era of smartphone ubiquity where everything gets done with one click!

Open-source banking can level the playing field and enable incumbent players to take advantage of these powerful trends and transformative technologies.

By leveraging common technology infrastructure, it can analyze the customer data and deliver a seamless banking experience via the mobile phone, leverage the power of the cloud, connect into a distributed ledger and digital payments, and more.

As per the PwC report, over 80% of financial institutions believe their business is at risk from innovators; 56% say they have put transformation at the core of their strategy; 82% expect to increase FinTech partnerships in the next three to five years; and 77% expect to adopt blockchain as part of an in-production system or process.

Drupal is the perfect website content management framework for creating an open-source banking platform: it will not only reduce costs significantly and free up IT teams to focus on innovation, but also enable greater security and extensibility to new devices and delivery channels.

Challenges Faced by FinTech Ecosystem

Though FinTech solutions have been doing the rounds in the market for quite some time now, a few constraints are still stonewalling the industry’s growth. Some of these are outlined below:

  1. Market regulators

    1. Balancing data privacy needs with the industry’s requirement for open data
      Market regulators are having a hard time striking a balance between consumers’ need for data security & privacy and the industry’s need for open data for insight generation. Data privacy is critical to safeguarding consumers’ trust in the FS space; however, stringent practices on data sharing can hamper the free flow of data crucial for creating innovative solutions.
    2. Aligning with the anticipated risk associated with advanced technologies
      Market regulators need to keep pace with the fast-changing technology landscape to fully understand the evolving risks to the wider ecosystem. For example, cryptocurrencies could be used for money laundering, and AI-driven algorithmic trading could lead to system-wide risks by increasing market volatility.

      Also, AI-led models for credit assessment and underwriting could lead to a segment of one and end up pricing certain customer segments out of the market for good.
      Source: Mastercard

    3. Ensuring stability in the FinTech sector in this closely networked world
      It’s evident that FinTech players have created a diversified FS ecosystem which has led to stronger interconnectivity, but it has also brought forth new systemic risks by launching disruptive models.

      For instance, local regulators are grappling to supervise global technology firms who operate across multiple jurisdictions, leading to regulatory arbitrage.

  2. FS Incumbents

    1. Reskilling people for the modern digital world
      One of the key challenges the industry is facing right now is how to adopt workforce re-skilling strategies to endure the technology-led revolution.

    2. Regular monitoring of advanced technologies
      The FinTech industry’s concern for consumer security necessitates that advanced models employed in sensitive areas such as lending pass the test of explainability to protect consumer interests. Such regulations have clipped conventional players’ ability to experiment with advanced analytical models in areas directly influencing customers.

  3. FinTech players

    1. Tackle the cyber-security concerns to gain consumers’ trust
      The advancement of technology has its pros and cons, and one of the cons is increased cybercrime! It is now the responsibility of FinTech players and their partners to ensure that appropriate digital control measures are taken to secure customers’ trust and assets.
    2. Lack of early-stage funding
      Despite the FinTech space attracting sustained investment over the past few years, many smaller startups still struggle to secure early-stage capital, limiting their potential to scale up.
    3. Managing regulatory uncertainty
      Although Indian FinTechs have worked in an enabling regulatory environment, they have not been immune to regulatory uncertainty. Many FinTechs who had built their business models around Aadhaar-enabled services for customer onboarding had to pull them out due to physical-verification mandates, leading to disruption of their operations.

How Drupal Modules can Power FinTech?

Organizations planning to or delivering FinTech solutions need to maintain a robust online presence. Drupal has been powering the landscape of FinTech with its extraordinary capabilities.

However, unlike the media publishing, education, or government verticals, which have dedicated distributions, FinTech has no such distribution.

The modules mentioned below nonetheless satisfy consumers’ needs by providing the related features with ease:

  1. Commerce PayPal
    Commerce PayPal incorporates PayPal into the Drupal Commerce payment and checkout systems. Currently, it supports:
    1. Off-site payment via PayPal Payments Standard (WPS) & PayPal Express Checkout (EC),
    2. Off-site or on-site payment via PayPal Payments Advanced (PPA),
    3. Payflow Link (PFL), and
    4. On-site credit card payment via PayPal Payments Pro (WPP).
      The PayPal WPS / EC Integration supports PayPal’s Instant Payment Notifications (IPNs) to respond to authorizations, captures, voids, and refunds with full logging for testing and debugging.
  2. Currency
    The Currency module takes on the overwhelming task of currency conversion with its inbuilt conversion and price-display input filter.
  3. Commerce Paybox
    Paybox is integrated with the Drupal Commerce payment and checkout system. It offers two mechanisms, the Paybox service and Paybox Direct (PPPS): the former offers a payment solution on its own server and redirects customers to paybox.com during the payment process, while the latter supports on-site payments, meaning payments are completed on the Drupal site.
    Enabling HTTPS before implementing this payment method is considered good practice to ensure security.
  4. The Google Currency Converter
    The Google Currency Converter module integrates the Google Finance calculator to convert currency on the website. It also lets you set your default currency and default currency-conversion format.
  5. Budget
    Users can set up a budget with this module to manage their finances. The list of requirements goes like this:
    1. Data Structure- Data will be broken down into four main taxonomy terms: income, expenses, debt, and savings. From there, sub-terms can be added by the site administrator to further classify data items. Main terms can also have sub-terms, where the user can enter their description.
    2. User-interface- The data entry will be a multi-part question and answer session, with help pop-ups to help users enter data and select sub-terms from a drop-down menu to manage their finances.
    3. Security- Appropriate measures will be taken to ensure the privacy and security of the user and their data. Only the user, system administrator, and financial adviser role will be able to view the individual user’s data and report.
    4. Recommendations- The finance recommendations will be based on the user data’s deviation from normal as a percentage of net income for his/her income group. Additionally, the site administrator will be able to set thresholds where red flags will be raised along with the description for the user to understand the reasons behind it.
    5. Aggregate Reporting- The module will produce aggregate reports in spreadsheets with 6 months cost projections. These reports will be exportable in excel spreadsheet format.
    6. Open Source- The module will be licensed under the GPL and contributed to the Drupal community.
  6. UC OmniKassa
    Integrates Rabobank OmniKassa as a default checkout method for Ubercart.

    This module offers different payment configuration methods (iDEAL, credit card, transfer) and uses SHA-1 hashing for secure payment-status verification. All settings are adjustable in the admin form.

  7. Ubercart Affirm
    Affirm is an off-site payment method and a financing alternative to credit cards and other credit payment products. This project integrates Affirm Credit Payment Gateway into the Drupal Ubercart payment and checkout systems.

    Watch this video to understand further how technology is changing the Finance sector-

    [embedded content]


  8. Commerce Lending Works
    Lending Works directly aligns investors with borrowers who want to spread the cost of their purchase. It offers flexible finance on purchases from £50 to £25,000, without any hidden fees.

    This module is useful for a retailer in:

    1. Boosting sales- Finance services can shoot up retailers’ sales by 17% and order value by 15% on average.
    2. Refined customer experience- Customers enjoy a hassle-free process, whether it’s online, in-store, or over the phone.
    3. Rocket science made simple- The integration process is super-fast, with round-the-clock support; retailers can analyze sales in one easy-to-use online account or connect by API.
    4. Flexible finance- Split small purchases into 3 interest-free payments, or finance from 6 to 60 months on purchases from £250.00.
  9. Drupal Finance
    Drupal Finance aims to provide a complete business accounting and finance solution. However, don’t use it in production: it is at a very early stage as of now, and the entity schema will likely change without any prior notice.

The following features are either currently available or are in development:

  • Organizations
  • Financial Documents Entity Type with Bundles
  • Supplier Entity Type
  • Financial Field Type to store the monetary value of a particular currency, along with performing currency conversion based on the primary currency of the organization.
  • Formula Field Type (experimental) which can be used to dynamically perform calculations based on mathematical equations and can contain Tokens to include values from other fields.

    It comes in handy where a value is based on the values of other fields, such as adding together an invoice total amount and tax.

    Integration with the Currency module, along with an Exchange Rates Plugin which provides real-time and historical exchange rates powered by ExchangeRatesAPI.io.


  1. Guardr
    Guardr is a Drupal distribution combining modules and settings to enhance Drupal’s application security and availability to meet enterprise security requirements.

    Sufficient information must be fed into the system so that it can be stored and processed to prevent service disruptions caused by power outages, hardware failures, and system upgrades.

  2. Droopler
    Droopler is a Drupal 8 distribution that offers pre-built websites with complete functionality, so you can tweak them to your requirements and get a good-looking website ready swiftly.

    Droopler is great for:

    1. Website factories - Used to build various microsites with editors having the power to edit content. Pick a theme to match your brand colors and get your website ready instantly.
    2. Corporate websites - Having a site is essential to stay in business, but not all companies have an extravagant budget. Droopler is a great start for creating websites in a pocket-friendly manner.

      Custom Bootstrap 4 theme

      SCSS included and all variables & settings can be customized to match your needs.

      Built on Paragraphs

      Source: Drupal.org

      It uses the Paragraphs module to create the pages. During installation, you get one content type with various paragraphs (banner, feature list, text with an image, headline text with background image), all themed and working exceptionally well.

      Multi-language support

      Two languages are set by default for a demo with options to remove them/add more as in the case with any multilingual Drupal site.

  3. Seeds - Drupal Starter Kit
    Seeds is a light distribution that SMEs can use to kickstart and speedily complete projects of any scale.
  4. Panopoly

    A base distribution of Drupal powered by lots of Chaos Tools and Panels magic, Panopoly acts as both a general framework for site building and a base foundation upon which other Drupal distributions can be built. 

Final Words

Consumer demands are taking a paradigmatic shift, and FinTechs are iterating on their products quickly to get ahead of those demands by offering alternative financing sources, branchless banking, and more. However, enterprises need not reinvent the wheel to achieve their objectives: the tools and technologies they need to deploy (Drupal, blockchain, cloud, AI, and big data) are all commercially available, and they can leverage them to scale a comprehensive data ecosystem using APIs while mitigating risk.

They will either demonstrate significant improvements in automation, digitalization, analytics, quality, security, and compliance, or they will fall behind their peer group.

Here is to the hopes of using better technology and getting great business outcomes in the year ahead!

Sep 22 2019
Sep 22

Our testing approach was two-fold, with one underlying question to answer: what is the most intuitive site structure for users?

Test #1: Top Task survey

During the Top Task survey, we had users rank a list of tasks we think they are trying to complete on the site, so that we have visibility into their priorities. The results from this survey informed a revised version of the navigation labels and structure, which we then tested in the following tree test. The survey was conducted via Google forms with existing Center audiences, aiming for 75+ completions.

We then used these audience-defined “top tasks” to inform the new information architecture, which we tested in our second test.

Test #2: IA tree test

During the tree testing of the Information Architecture, we stripped out any visuals and tested the outline of the menu structure. We began with a mailing list of about 2,500 people, split the list into two segments, and A/B tested the new proposed structure (Variant) vs. the current structure (Benchmark). Both trees were tested with the same tasks but with different labels and structure, to see with which tree people could complete the tasks more quickly and successfully.
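A rough sketch of how such an A/B comparison can be scored (the counts and helper function below are invented for illustration, not the Center's real data): given how many users completed a task on each tree, a two-proportion z-test tells you whether the Variant's success rate beats the Benchmark's by more than chance.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the Variant's task-success rate
    significantly different from the Benchmark's?"""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled success rate under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 60 of 100 Benchmark users completed the task,
# versus 80 of 100 Variant users.
z = two_proportion_z(60, 100, 80, 100)
significant = z > 1.96  # 1.96 is the two-sided 5% critical value
```

With these made-up numbers the Variant's improvement is well beyond chance; real studies would run this per task and also compare completion times.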

Sep 21 2019
Sep 21

Content has proven itself king time and again. The enthusiasm with which B2B and B2C companies are investing in content production has, however, brought up a significant question: are users able to find the relevant content?

Finding relevant web content has been one of the biggest issues that enterprises and users both face, and it needs to be addressed without further delay to avoid poor user experiences and negative sentiment. 

Additionally, big brands and companies lose out on opportunities due to content searchability issues, like failing to quickly come up in the search results for a given phrase or set of keywords, which can dent the company’s revenue.

The problem can be fixed with little effort for Drupal-powered enterprises. They just have to adopt content tagging when classifying their content. 

Having said that, this blog will provide insights on content tagging, taxonomy, and how implementing these factors on your Drupal website can take your marketing efforts to a whole other level.

Why Does Strategic Taxonomy Matter?

Content tagging can be a hugely resource-demanding and tedious task, especially when done manually, leaving companies wondering whether it’s even worth the effort. So, before building your tagging taxonomy, it’s better to understand why it matters.

  • Searchability: For your targeted audience to find it

Enterprises have diverse and in-depth categories of resources available, but if users visit the site and cannot find the desired content, they are of course going to leave. 

All your efforts put into procuring and producing content will go to waste if there is no one to read it. Thus, it’s better to help your targeted audience access your content in the simplest and most convenient way through a proper tagging structure.
Source: Curata

  • Usability: For your internal team to leverage it

Content tags benefit not only users but also the company’s internal teams, especially sales. Including tags such as buying stage, persona, industry, product line, and geographical region helps the sales team leverage and share relevant content with potential customers that align with those components. 

This strategic content planning and execution in line with business goals will increase the shareability of your content for users and will also facilitate key stakeholders in disseminating content in their networks to move users through the preferred funnel.

  • Data Insights: For your analytics team to gain insights from it

Another key benefit of a proper tagging taxonomy is leveraged through the tag structure itself. It helps build custom segments of data for your analytics teams to extract insights on the content framework, promotion calendar, content production cadence, and audience preferences, so that you can tailor your content accordingly.

  • Sales Acceleration: For your readers to navigate, curate and refer to it

The agenda has always been enhancing user experience. If users have engaged with a piece of content that resonated with them, they are more likely to read more content on the same topic, category, or style. A tagging taxonomy with simple tag filtering shows users the next article they should read, in line with your preferred funnel structure.

All this can be achieved by simply organizing content through a strict tagging taxonomy. A better content tagging structure creates more business efficiencies for users and internal teams: it can tangibly impact the bottom line!

Content Tagging and Drupal Taxonomy

In Drupal, taxonomy is the core module used for categorizing or classifying content being built on the website with the CMS. It is critical to the website’s information architecture, on both the back and front ends.

Taxonomies in Drupal consist of vocabularies associated with them. These vocabulary lists help the CMS determine which items belong with which types of content. Vocabularies, in turn, contain terms, and the list of terms defines the contents of the vocabulary. 

These can be a part of a hierarchy or simply a compilation of tags. Tags group nodes (elements in Drupal sites that contain content; eg. articles and basic pages) together. These can then be referenced with the search on the website. 

Sites built on Drupal can have an unlimited number of vocabularies and terms, so complex sites can be built using the framework. These two elements associated with your website can serve several purposes, especially for displaying content and managing content assets. It can also be important for reference as well.

Content tags, on the other hand, are a great way to navigate websites. In fact, this type of tag often appears as a hyperlink that users can click on to view other content in the system that contains the tag. These are used within the content management system, say, Drupal, to organize, filter, and relate content for end-users.

These tags can be applied in a few different ways, depending on the system that is using them. Some systems will allow for the creation of highly controlled tagging lists that content providers can choose terms from. Other systems may supply a free-tagging method, where users just type in terms.  Some systems allow for both methods.
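The vocabulary → terms → nodes relationship described above can be sketched in a few lines (a hypothetical model for illustration only, not Drupal's actual API):

```python
# Minimal model of Drupal taxonomy concepts: a vocabulary holds terms,
# and nodes (articles, basic pages) reference those terms as tags.
class Vocabulary:
    def __init__(self, name, terms=()):
        self.name = name
        self.terms = set(terms)

class Node:
    def __init__(self, title, tags=()):
        self.title = title
        self.tags = set(tags)

def nodes_with_term(nodes, term):
    """Group nodes sharing a tag -- the basis of tag pages and
    tag-driven search on the site."""
    return [n for n in nodes if term in n.tags]

topics = Vocabulary("Topics", terms={"fintech", "security", "drupal"})

nodes = [
    Node("Open banking 101", tags={"fintech"}),
    Node("Hardening your site", tags={"security"}),
    Node("Secure payments", tags={"fintech", "security"}),
]

fintech_nodes = nodes_with_term(nodes, "fintech")
```

Here `nodes_with_term` plays the role a tag page's query plays on a real site: every node tagged "fintech" is grouped and can be listed together.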

How to Use Taxonomy in Tagging Content?

Although you can optimize your content based on its type, you might also want to view content based on what it is about. Taxonomy allows you to link terms with the content which you can put to use in organizing and presenting content on your website. 

You can refer to the blog Adding Tags with Drupal Taxonomy in 9 Steps to learn how to use taxonomy in tagging content. It walks through nine steps you can follow to classify your content hassle-free and increase your site’s visibility.

How Can Drupal-powered Enterprises Benefit From Using Taxonomy?

Taxonomy plays an important role in content strategy as it can make sense of your organization’s content by supporting the following activities-

  • Search and Discovery

This is the most common and useful benefit of taxonomy: it facilitates search and discovery in knowledge-driven, Drupal-powered organizations, leading to improved discovery layers including search, related content, and personalization that work across various content repositories and even multiple organizations. In the end, the objective is to empower users and knowledge workers so that they can quickly find what they need. Search is essential for their productivity, and taxonomy can ensure it to a great extent.

Search engines like Google and Bing will be able to easily determine the site’s content, architecture, design, and organization of the website files, thereby improving its ranking on SERPs.

  • Permissions or Visibility

The goals of a Drupal-powered organization determine how best to use taxonomy, permissions, and metadata to share information (public, confidential, semi-confidential, etc.) with various parties within and beyond the organization. There are many nodes and specific pieces of content that only certain members of the organization are allowed to edit.

Developers can use the permissions administration page within Drupal to assign permissions and roles to registered users of the site. This gives developers high flexibility, as they can also control which content the public can view.

  • Repurposing

Re-using existing content wherever it’s relevant instead of starting from scratch, or simply recombining it through the taxonomy into new, useful information sets (provided one is in place to allow items to be found), can save the organization plenty of time and effort. It can also help in reaching new audiences and reinforcing your message.

  • Future-proofing knowledge held in the business

Taxonomies are “knowledge insurance” that stores and shares classified information assets, retaining knowledge accessibility even as people move on. A taxonomy can also keep pace by continuously evolving in line with the needs of the organization.

6 Best Practices To Tag Content Well

  • Consistency is the key

Use clear, consistent tagging throughout your organization to provide a uniform experience to customers. Marketing and sales teams should use the same taxonomy terms to tag content. 

If teams don’t share tags from the same taxonomy, you are going to end up with a fractured website.

  • Strike the right balance

Users find it informative when tags are applied appropriately. Too many tags oversaturate search results, and too few tags fail to provide enough personalized content. So, it’s important to find the right balance.

  • Focus on the user

The right balance can be achieved by focusing on the user’s experience. Before adding a tag, ask yourself: will it add value to the user’s experience, or is it just because you want the asset to get more attention on the site?

Take yourself out of the tagging equation and focus on the experience, and you’ll get a much clearer picture of which tags are and aren’t appropriate.

  • Prioritize your time

Tagging consumes a lot of time, so it’s good practice to prioritize and maximize your time by evaluating the content. Find out which assets need more detailed tagging (say, only those that will live on the site for a long time) and which can have more general tagging because they change often (e.g. industry reports), so that you can save your time and sanity.

  • Fill your content containers well

A tag associated with a topic that has a plethora of information within it is likely to keep a reader more engaged than a tag with only one or two pieces of content. So, before creating a new tag, ensure that you have an ample amount of content that could be tagged the same way.

  • Consider SEO while selecting tags

Use a keyword planner tool to check SERPs and find out which keywords users actually use to search for the content: the acronym, the plural construction, or the spelled-out version. 

Drupal Modules

Here are some modules that work around the principles of taxonomy and content tagging:

The Power Tagging module links a thesaurus or taxonomy to interpret content and its concepts in Drupal. Users can easily curate all suggested tags in one place, or can even compile collections of Drupal content nodes to create a semantic index. This makes search easier than ever before.

It also allows you to customize your entity’s tags with manual tags and perform multilingual tagging. 


  • Tweak your entity's tags with manual tags combined with an auto-completion of already used tags.
  • Supports multilingual tagging
  • All content can be tagged automatically at once via bulk tagging

Available for - Drupal 8 | Not covered by Security Advisory

This Drupal module provides context for content items by displaying a view block with links to other similar content. Similarity is defined by the taxonomy terms assigned to content. Views are available based on similarity within each of the defined vocabularies for a site, as well as similarity within all vocabularies.

Simply put, you can use this module by creating a free tagging vocabulary called “Tags” assigned to the content types on which you would like to display a similar view block.

Available for - Drupal 8 | Covered by Security Advisory

Good search engine optimization practices bring organic traffic to a website, and this module helps by updating the heading tag at the top of the taxonomy term page so that it performs better in SERPs. This is the only module that lets you control the title individually for every term.

Enterprises should add more user-friendly, keyword-rich, and descriptive words to this heading element.

Available for - Drupal 7 | Covered by Security Advisory

The Azure Cognitive Services API module seamlessly incorporates intelligent features into Drupal applications, like machine learning, artificial intelligence, and natural language processing, for speech detection, facial and vision recognition, and emotion detection.

Among the four features it provides, the text analysis API module is the one helpful for tagging:

  • Azure Text Analytics API Module

Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text and comprises three main functions: sentiment analysis, key phrase extraction, and language detection.

Available for - Drupal 8 | Covered by Security Advisory

This module helps in optimizing content around keywords in a fast, natural, and non-spammy manner. It also keeps a check on other SEO factors, such as the length of the post, the use of focus keywords in the meta description, and subheadings within the post.

This real-time page analysis ensures that your content is easily searchable and liked by users.

Available for - Drupal 8 | Covered by Security Advisory

It is a great SEO module since it takes a boring and laborious task off your hands. It automatically defines URLs that are both user-friendly and relevant to the category and page title. This clarity in classification helps users find information with ease, and you can also earn brownie points from search engines.

Available for - Drupal 8 | Covered by Security Advisory

The Metatag module lets enterprises provide more metadata on their website, including tags, page titles, descriptions, etc. As a result, it helps Google rank the website in SERPs.

Available for - Drupal 8 | Covered by Security Advisory

It is a semantic vocabulary of tags that you can add to your HTML to improve the way search engines read and represent your page in SERPs.

Available for - Drupal 8 | Covered by Security Advisory

Final Words

Embracing taxonomy and tags to classify content is a great way to solve the problems of organizations that possess a gigantic amount of data or anticipate creating a huge amount of it in the future.

Also, since Drupal is open-source software, there is an outstanding opportunity for enterprises to learn from a community of developers and users. However, the foremost task for organizations is to start thinking hard about what they are doing with their content once it is classified, and how they intend to deliver it so that it best serves customers' manifold needs.

Sep 20 2019
Sep 20

Compelling content marketing taxonomy in Drupal isn't just about enhancing the searchability of your content; it also helps ascertain content priorities based on what's in an item. Yet, per this source, 56% of B2C marketers and 64% of B2B marketers didn't even have a documented content marketing strategy.

A content marketing taxonomy aids content strategizing by organizing content in an easy-to-understand way, so marketers can analyze data and spot gaps.

Let's dive in and learn how to use taxonomy for tagging content.

How to Use Taxonomy in Drupal to Tag Content?

Each set of taxonomy terms is part of a category set that you define, called a vocabulary. Terms in vocabularies can be broken down further and can contain sub-terms.
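
Conceptually, a vocabulary is just a named tree of terms. As a mental model only (this is not how Drupal stores taxonomy internally), a small sketch in Python using the Ingredients example from this guide:

```python
# A vocabulary is a named set of terms; terms can nest into sub-terms.
vocabulary = {
    "name": "Ingredients",
    "terms": {
        "Dairy": ["Butter", "Milk"],   # sub-terms under a parent term
        "Eggs": [],                    # a term with no sub-terms
    },
}

def all_terms(vocab: dict) -> list:
    """Flatten parent terms and their sub-terms into one list."""
    terms = []
    for parent, children in vocab["terms"].items():
        terms.append(parent)
        terms.extend(children)
    return terms

print(all_terms(vocabulary))  # ['Dairy', 'Butter', 'Milk', 'Eggs']
```

Any term in this tree, parent or child, can be attached to a content item as a tag.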

Therefore, it is important to first understand how to create a vocabulary:

1. Go to Manage >> Structure >> Taxonomy. By default, the Tags vocabulary is already here. Add a new vocabulary by clicking + Add vocabulary.


Enter the name of the new vocabulary along with a short description.


2. Click the Save button. You will be redirected to the Ingredients page, which shows a list of all the terms that you have added to this vocabulary.

3. Now click Add term. Enter "Butter" in the Name field. Once done, click the Save button.


4. You will receive a confirmation about the term you created. You can also add more terms, like "Eggs" and "Milk".

5. In the Manage administrative menu, navigate to Structure > Content types (admin/structure/types). Click Manage fields for your Recipe content type.

6. Click Add field, and enter values from the table below. Once done, click Save and continue.

  • Add a new field (select the field type): Reference > Taxonomy term
  • Label (the title to give the field): Ingredients


7. On the field storage settings screen, enter the values from the table below.

  • Type of item to reference (the type of entity that is referenced by the field): Taxonomy term
  • Allowed number of values (the number of values a user can enter): Unlimited


8. On the following configuration screen, enter the values from the table below. Click Save Settings.

  • Help text (help shown to users creating content): Enter ingredients that site visitors might want to search for
  • Reference type > Reference method (the method used to choose allowed values): Default
  • Reference type > Vocabulary (the vocabulary to choose allowed values from): Ingredients
  • Reference type > Create referenced entities if they don't already exist (whether new ingredient terms can be created from the content editing form)



Click Save Settings. You will be taken back to the Manage Fields page. A message will pop up stating that the configuration for Ingredients is complete.


And you’re done!

You can also watch the video below to learn more about setting up taxonomy.

Video: Drupal 8 User Guide, 6.6. Setting Up a Taxonomy

   Video Courtesy: Drupalize.me


Following these steps will help you implement a clear and concise content marketing taxonomy in Drupal, which will improve the readability of your editorial calendar. It will also let all stakeholders and team members know, at a glance, what kind of content you're creating.

Happy Tagging!

Sep 20 2019
Sep 20

Drupal is well known for its stability and security out of the box. However, we all know how dangerous the internet can be, with all the risks of having your site attacked. There are particular situations in which extra security measures are needed. This tutorial covers specific ways to secure your Drupal site, and the modules involved in the process.

Let’s get started!

The Out-of-the-Box Features

The Drupal theming system was improved in Drupal 8 with the Twig templating language. Twig has many advantages over PHP templating. One is that it is much easier to read. Another is that all variables passed to a template are escaped automatically, which minimizes the risk of a variable containing output that could break the HTML, and therefore your site. The risk is also reduced because you no longer have to write custom PHP code in your templates.

The PHP input filter module was part of Drupal core in Drupal 7, but in Drupal 8 it is a contributed module. This measure eliminates a common source of vulnerabilities.

Drupal 8 implemented the trusted hosts configuration. This allows you to associate the Drupal codebase with specific domains, to prevent HTTP Host header spoofing attacks.
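
In practice this is a setting in settings.php. A minimal example, where the domains are placeholders you should replace with your own:

```php
// In sites/default/settings.php. Each pattern is a regular expression
// matched against the incoming Host header; example.com is illustrative.
$settings['trusted_host_patterns'] = [
  '^www\.example\.com$',
  '^example\.com$',
];
```

Requests whose Host header matches none of the patterns are rejected with an error instead of being served.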

Due to the new programmatic approach of Drupal, modules can be enhanced in their functionalities by adding plugins. These plugins add behaviors to the module. The code stays clean and easy to analyze.

The use of Composer as package manager opens the door to new development possibilities for Drupal Open Source software, and it also helps to maintain all modules and keep their dependencies up-to-date and working properly. This is a key factor for the stability of Drupal systems.

How to Enhance Security on Access - Contrib Modules


There are two alternatives:

  • Lock attack vectors within the system
  • Limit vulnerabilities by restricting or changing system functions and operations, taking the responsibility away from the user

Here are some modules that provide this kind of functionality:

Automated Logout

Link: https://www.drupal.org/project/autologout

This module allows you to configure how long users stay logged in on your site. You can configure different time periods for different user roles, on the assumption that a user with a higher role, i.e. with access to more valuable information or resources on the site, should need to log in more frequently than a user with a lower role.

Session Limit

Link: https://www.drupal.org/project/session_limit

With the Session Limit module, you can limit the number of simultaneous sessions per user. For example, if you set the session limit to 1 and open a new session in another browser, the session in the first browser will automatically expire.
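
The underlying bookkeeping can be sketched like this (an illustrative Python sketch of the idea, not the module's actual code):

```python
from collections import defaultdict, deque

class SessionLimiter:
    """Keep at most `limit` active sessions per user; evict the oldest."""

    def __init__(self, limit: int = 1):
        self.limit = limit
        self.sessions = defaultdict(deque)  # user -> session ids, oldest first

    def open_session(self, user: str, session_id: str) -> list:
        """Register a new session; return the session ids that were expired."""
        queue = self.sessions[user]
        queue.append(session_id)
        expired = []
        while len(queue) > self.limit:
            expired.append(queue.popleft())  # oldest session is logged out
        return expired

limiter = SessionLimiter(limit=1)
limiter.open_session("alice", "browser-1")          # no session expires yet
print(limiter.open_session("alice", "browser-2"))   # ['browser-1']
```

Opening the second browser session pushes the per-user count past the limit, so the first session is expired, which mirrors the behavior described above.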

Login Security

Link: https://www.drupal.org/project/login_security

Login Security allows site administrators to authenticate users only from valid IP addresses; that way, it is possible to grant or restrict access to your site for whole countries if needed, just by configuring the right IP ranges. Login Security also restricts the number of login attempts, so brute force attacks can be avoided. An extra feature of this module is that it can hide the Drupal login error message, so an attacker has a harder time finding out whether the account they want to access even exists.

Security Kit    

Link: https://www.drupal.org/project/seckit

Security Kit allows developers to change response headers. This is useful for very specific security needs on sites with a high probability of cross-site scripting, cross-site request forgery, and origin-driven attacks.


Honeypot

Link: https://www.drupal.org/project/honeypot

Honeypot prevents robots from filling out the forms on your site by determining whether the user is human. It uses two detection methods. One is a hidden form field that is not visible to humans: if the field has been filled out, Honeypot detects this and blocks the submission. The other is measuring the time, in seconds, in which the form was filled out. A very low value (2-3 seconds) suggests a robot, so the module blocks that submission too.
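
The logic behind those two checks is roughly this (a minimal Python sketch of the technique, not the module's code; the field name and time threshold are illustrative):

```python
def is_probably_bot(form_data: dict, seconds_to_submit: float,
                    honeypot_field: str = "url", time_limit: float = 5.0) -> bool:
    """Flag a form submission as a bot using the two honeypot checks."""
    # 1. A hidden field that humans never see: if it was filled in, it's a bot.
    if form_data.get(honeypot_field):
        return True
    # 2. Humans need some time to fill out a form; instant submissions are bots.
    if seconds_to_submit < time_limit:
        return True
    return False

# A human leaves the hidden field empty and takes a while:
print(is_probably_bot({"name": "Ada"}, seconds_to_submit=42.0))   # False
# A bot fills every field it finds, including the hidden one:
print(is_probably_bot({"name": "x", "url": "spam"}, 42.0))        # True
```

Neither check inconveniences a real visitor, which is why this approach is popular as a CAPTCHA alternative.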


CAPTCHA

Link: https://www.drupal.org/project/captcha

A captcha is a challenge-response test to determine whether the user is human. This blocks fake form submissions, since robots are not able to decipher the captcha text or image.

Auditing - Checking Procedures


It is always important to regularly review the system logs. This is even more important if your site has already been compromised. Analyzing all this data will also help you track transactions within your system and perform checks on its ongoing state. A rule of thumb: log as much data as possible - always!

Some modules that provide this type of functionality are listed below:

Site Audit

Link: https://www.drupal.org/project/site_audit

Site Audit performs a static analysis of the whole system against a set of recommended configurations. The module also stores a report of every audit. By performing this check, you as a developer can be confident that your site meets the required security standards.

Security Review

Link: https://www.drupal.org/project/security_review

Security Review, like Site Audit, also analyzes the system, but this time against a set of potential security issues on your site, like file permissions, text formats, and potentially malicious PHP or JS code on the frontend. It also stores reports.

Login History

Link: https://www.drupal.org/project/login_history 

Login history adds a report to the database with a log of every user login, including timestamp, IP address, and user agent information. As stated before, it is always good to log as much information as possible.

Authentication Measures


Often, protecting important information implies a reduction in usability: users accept the extra hassle of additional authentication procedures in exchange for better protection.

The modules that you can use for this purpose are listed below: 

Two-factor Authentication  

Link: https://www.drupal.org/project/tfa

Two-factor Authentication provides an additional verification step, which verifies the identity of the authenticating user. It also provides an API to support different plugins (provided by modules) that integrate various authentication services, like the Google Authenticator module.

simpleSAMLphp Authentication 

Link: https://www.drupal.org/project/simplesamlphp_auth

This module allows you to replace the default Drupal login with a single sign-on implementation. The module communicates with identity providers to authenticate users. That way you can validate the identity of a user through a service like Twitter, Facebook, or Google.

Password Policy

Link: https://www.drupal.org/project/password_policy

The Password Policy module defines a set of rules to force users to have strong passwords. It also forces users to change their password from time to time, depending on the configured options. Password Policy provides an API to define your own set of rules. 

Password Strength

Link: https://www.drupal.org/project/password_strength

This module provides a star rating widget for users to test the strength of their passwords. You can leverage the API of Password Policy to force users to enter a password with a high (minimum) rating.
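
As a toy illustration of the kind of scoring such a widget performs (this is not the module's actual algorithm; real strength meters use far more sophisticated analysis, e.g. dictionary and pattern matching):

```python
import re

def password_stars(password: str) -> int:
    """Rate a password from 0 to 5 stars using simple heuristics."""
    score = 0
    if len(password) >= 8:                 # minimum reasonable length
        score += 1
    if len(password) >= 12:                # longer is meaningfully better
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1                         # mixed case
    if re.search(r"\d", password):
        score += 1                         # contains a digit
    if re.search(r"[^a-zA-Z0-9]", password):
        score += 1                         # contains a special character
    return score

print(password_stars("password"))           # 1
print(password_stars("C0rrect-Horse-42!"))  # 5
```

A policy built on top of this would simply reject any password scoring below the configured minimum rating.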


Encryption

Data at rest (data that is not actively being used) should be encrypted to prevent attacks of all types. Encryption can be applied at several levels:

  • Hosting
  • Server
  • CDN
  • Drupal system

As a rule of thumb, and to guarantee the highest security, it is best to perform encryption at a low level of this stack.

You should always use HTTPS and SSL for all your web traffic, and you should also ask your hosting or cloud provider about full disk encryption.

Some useful modules are:


Key

Link: https://www.drupal.org/project/key

The Key module manages the system and API keys. It provides an API with options to store and retrieve sensitive keys like API keys or encryption keys. The site admin can decide where to store these keys in the system. Examples of API keys are the public keys for services like AWS, PayPal, or MailChimp.


Encrypt

Link: https://www.drupal.org/project/encrypt

The Encrypt module is an API that provides a common utility for encrypting and decrypting Drupal application data. Its API is leveraged by many modules: modules that supply encryption/decryption algorithms (Real AES, Diffuse), modules that encrypt data, like Field Encrypt and File Encrypt, and other modules for more specific use cases, like Pubkey Encrypt.

File Encrypt

Link: https://www.drupal.org/project/file_encrypt

This module focuses on the file system; it handles the encryption of all files.

Field Encryption

Link: https://www.drupal.org/project/field_encrypt

This module encrypts data at the field level; that is, it encrypts field values.

DevOps (Development and Operations)


The development process is critical to proactively maintaining security. You should always use code repositories and pull requests for all your files. Other measures include performing regular code reviews, tagging each one of your code releases, keeping the site code up-to-date (this includes JS libraries), and avoiding manual procedures, because these invite mistakes. Automate your scripts and use a tool like Drush.

Some of the relevant modules in this category are:


Coder

Link: https://www.drupal.org/project/coder

This module has PHP CodeSniffer extensions to test the code on your site against the coding standards of Drupal.org. Coder does nothing in your UI; it is a command-line tool.


Hacked!

Link: https://www.drupal.org/project/hacked

The Hacked module scans the code of your core and contrib folders and compares it against the code hosted at Drupal.org. It shows the differences between both codebases, so you can take the proper measures regarding your code. 

Backup and Migrate

Link: https://www.drupal.org/project/backup_migrate

This is a classic among Drupal modules. Backup and Migrate performs regular backups of the codebase and the database, so you can restore them, for example on a fresh installation. This is very useful if your system has been compromised and you want to restore it.


Securing the infrastructure on which the system is hosted is as important as securing Drupal itself. Always try to mitigate attacks before they happen. This module list should help you with that purpose.

  1. Use a coding workflow - making sure that the best code ends at the production environment. 
  2. Make a very detailed analysis of the logs - there are very useful tools for this matter, like Splunk or the ELK stack.
  3. If it is possible, try to use cloud-based environments - these are more secure than hosted environments. 
  4. Try to use CDNs whenever you can - they act as a firewall, preventing malicious attacks early in the process.
  5. Make sure you have an up-to-date failover environment and test what would happen in case of a failover. 

Please, leave us your comments and questions below.  Thanks for reading!

About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Sep 19 2019
Sep 19

You will notice that, along with thousands of websites around the world, Drupal.org posted a banner message this week declaring we are opting in to a global Digital Climate Strike on 20th September.

Will @drupal website add the #DigitalClimateStrike banner? @baluertl requested it and mocked up a visual... https://t.co/XcIj9Gf173 pic.twitter.com/Zl0ctyc7G6

— ClimateAction.tech (@climateActTech) September 18, 2019

Of course, because Drupal.org is an essential service to over a million websites around the world, we have to be sure that we still allow them all to continue to access resources here. As such, the full page banner that will appear on websites on the 20th September will be configured to allow visitors to cancel it, should they need to.

Fundamentally, the Drupal Association wants to be a good steward of the environment and recognizes the impact that technology has on environmental issues. We are committed to exploring ways for the Drupal project to reduce its carbon footprint and to become a more eco-friendly platform. Today, we stand with others in the technology industry to educate and inform the general public about some of the ways that the tech industry can support environmental causes.

If the environmental sustainability of Drupal websites is a subject as close to your hearts as it is to ours, you might like to know that recently a #sustainable Slack channel was created for discussion on the topic.

Sep 19 2019
Sep 19

The Drupal 8 View Unpublished module is a simple module that provides a permission allowing specific roles to view unpublished content. It's a useful module to help you build out content editor workflows on your Drupal 8 website.

Download and install the View Unpublished module just like you would any other module.

composer require drupal/view_unpublished

After installing the module, go to the permissions page and search for the View Unpublished section. Here you can set the permission for who can view unpublished content on the site. After setting the permissions you will likely need to clear the cache and rebuild the access permissions before your users will be able to see the unpublished content.

That’s all there is to this module!

Sep 19 2019
Sep 19

Ubercart, once the go-to commerce option for Drupal and the precursor to Drupal Commerce, is slowly fading away. Its usage has been declining for years, and a stable Drupal 8 release will never happen. Even one of the original creators has moved on to support a new Drupal ecommerce solution instead of continuing on with Ubercart. If you're running an ecommerce site that uses Ubercart, this post is for you. Our goal is to show you why you should consider moving off of Ubercart now instead of waiting until it finally reaches end of life.

The decline of Ubercart today

As mentioned in the introduction, Ubercart usage has been declining for years. The Drupal 7 version of the module is where it saw most of its success, with usage peaking in 2014/2015, but it has been dropping continuously since then. The following graph is a snapshot of Ubercart's usage history as recorded on Drupal.org.

Ubercart usage history (source)

Ryan Szrama, one of the original creators of Ubercart, moved away from it and started the Commerce module for Drupal as a replacement. Since then, the majority of the ecommerce community around Drupal has moved along with him, making Drupal Commerce the new go-to option for ecommerce built on Drupal. Not only does Commerce now have more installs for both Drupal 7 and Drupal 8, it also has a much more active development community.

Commerce usage history (source)

Ubercart and Drupal 8

The Ubercart module never made it to a proper Drupal 8 release. Development is stuck in alpha, and with no new release in over three years, a stable Drupal 8 release is never going to happen.

What “alpha” means

In software development, alpha is a term given to a software release that is still very much in development and not ready for production. Here’s the description of alpha from Drupal.org.

alpha: Most reported errors are resolved, but there may still be serious outstanding known issues, including security issues. Project is not thoroughly tested, so there may also be many unknown bugs. There is a README.txt/README.md that documents the project and its API (if any). The API and DB schema may be unstable, but all changes to these are reported in the release notes, and hook_update_N is implemented to preserve data through schema changes, but no other upgrade/update path. Not suitable for production sites. Target audience is developers who wants to participate in testing, debugging and development of the project.

In contrast, the Drupal Commerce module has had many full production-ready releases for Drupal 8 and follows a release schedule for bug fixes and new features. The group behind Drupal Commerce is actively developing the core software and the wider community is also active in supporting the project.

Ubercart and Drupal 7

What Ubercart development still happens is focused solely on maintaining the Drupal 7 version. The catch is that Drupal 7 reaches end of life in November 2021, which will likely spell the effective end of Ubercart as well. If you're using Ubercart and Drupal 7 together and you want new features and active development, that realistically ended years ago when the majority of the contributor community moved away from the project.

Here are a couple of snapshots of the commit history for both the core Ubercart module and the core Drupal Commerce module. A commit is a term for code changes that have been added to the module. Commits are typically code improvements, new features, bug fixes, and security updates that have been written, tested, and approved for release.

Ubercart commit history

Commerce commit history

When looking at the graphs above, it's important to know that it's common to see the number of commits trail off over time. This is because the majority of the core software is built early on, so fewer commits are made as development of the core ramps down. What is important to see is that development of Drupal Commerce, unlike Ubercart, is still continuing, meaning new features and code improvements are actively being made to the core Commerce software but not to Ubercart.

Another point to note is that when commits to the core software ramp down, development effort has likely moved to community-built extensions. This data isn't reflected in the graphs above. Community-built extensions are the ecosystem of add-ons and features that aren't found in the core software. In the case of Ubercart, this community development is very small and limited, whereas the Drupal Commerce community is very active and engaged.

Where to go from Ubercart?

You’ve probably guessed this already, but the clear path moving away from Ubercart is to Drupal Commerce. Commerce is the Ubercart replacement and it’s capable of so much more. It’s also Drupal 8 ready and will provide a painless transition to Drupal 9, when that happens.

Commerce improvements over Ubercart

The following is a list of improvements Commerce for Drupal 8 has over Ubercart:

Drupal 8 improvements over Drupal 7 include:

  • Robust caching and performance for authenticated or unique users, very important for any ecommerce site
  • Drupal's new rolling release schedule: no more large updates between versions, which makes updating easier
  • Modern object-oriented design, which makes testing, extension, and use of 3rd-party libraries easier. Commerce follows all of the architectural improvements of Drupal 8 and has, in some cases, led the way by innovating first.

Commerce improvements over Ubercart include:

  • More secure payment architecture. Commerce encourages the lowest level of PCI risk possible and enforces good practices with its payment API, compared to Ubercart's primarily DIY payment model.
  • Proper variation based product model with unique SKUs for each variation
  • Robust and accurate promotions, discounts and pricing adjustments. If you’ve struggled with pricing accuracy in Ubercart you’ll understand.
  • Multi-store and multi-currency support is robust and built in.
  • And the list goes on…

Why move now instead of later?

While you could wait until Drupal 7's end of life to move your ecommerce site off of Ubercart and onto Drupal Commerce, this is not something we would ever recommend. The truth is that by waiting until the very end, you're taking on a lot of unnecessary risk for both your business and your customers. You don't want to be scrambling to make it happen quickly when suddenly you're no longer getting security updates for both Drupal 7 AND Ubercart. That is a worst-case scenario, and you would be wise to avoid it.

Right now is an ideal time to consider making the switch. Both Drupal 8 and Commerce have been used in the wild for years now, and the software is very stable. Most likely, all of the features and functionality that you currently use have already been ported over to the new versions. The tools that help migrate Drupal 7 and Ubercart over to Drupal 8 and Commerce have been created to assist with the move. Really, from a technical standpoint, there's no reason not to make the move now.

Of course, it can’t be denied that completing a migration to the latest and greatest does take time and effort to do, and there will be a cost involved. All the more reason to start the process now. Right now you have the time to find the help you need and to properly budget and plan how your migration will be executed. Right now it’s not a hassle, it’s an opportunity to make your business better for both you and your customers while at the same time correcting any of the little things that bother you about your site now.

Acro Media has been helping ecommerce owners and operators with consultation and development for well over 10 years. We’re intimate with both Ubercart and Drupal Commerce, and we even staff some of the talented people who built Commerce and the migration tools everyone uses to make the move. If you want to learn more about how your migration would happen, we would love to talk. Click the link below to get started.


Sep 19 2019
Sep 19

We're back!  Our normally scheduled call to chat about all things Drupal and nonprofits will happen TODAY, September 19, at 1pm ET / 10am PT. (Convert to your local time zone.)

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

We have an hour to chat so bring your best Drupal topics and let's do this thing!

Some examples to get your mind firing: how do I recreate [feature] on my Drupal 7 site in Drupal 8? I need to explain [complicated thing] to a non-technical stakeholder -- any advice? How can I get Drupal and my CRM to play nicely?

This free call is sponsored by NTEN.org but open to everyone.

View notes of previous months' calls.

Sep 19 2019
Sep 19

Attending DrupalCon is an investment in your skills, professional development, and in building community connections. 

A lot of attendees don't buy their own tickets—most need to convince someone else (their boss) of the value.

Sep 19 2019
Sep 19

To scale and sustain Open Source ecosystems in a more efficient and fair manner, Open Source projects need to embrace new governance, coordination and incentive models.

A scale that is in balance

In many ways, Open Source has won. Most people know that Open Source provides better quality software, at a lower cost, without vendor lock-in. But despite Open Source being widely adopted and more than 30 years old, scaling and sustaining Open Source projects remains challenging.

Not a week goes by that I don't get asked a question about Open Source sustainability. How do you get others to contribute? How do you get funding for Open Source work? But also, how do you protect against others monetizing your Open Source work without contributing back? And what do you think of MongoDB, Cockroach Labs or Elastic changing their license away from Open Source?

This blog post talks about how we can make it easier to scale and sustain Open Source projects, Open Source companies and Open Source ecosystems. I will show that:

  • Small Open Source communities can rely on volunteers and self-governance, but as Open Source communities grow, their governance model most likely needs to be reformed so the project can be maintained more easily.
  • There are three models for scaling and sustaining Open Source projects: self-governance, privatization, and centralization. All three models aim to reduce coordination failures, but require Open Source communities to embrace forms of monitoring, rewards and sanctions. While this thinking is controversial, it is supported by decades of research in adjacent fields.
  • Open Source communities would benefit from experimenting with new governance models, coordination systems, license innovation, and incentive models.

Some personal background

Scaling and sustaining Open Source projects and Open Source businesses has been the focus of most of my professional career.

Drupal, the Open Source project I founded 18 years ago, is used by more than one million websites and reaches pretty much everyone on the internet.

With over 8,500 individuals and about 1,100 organizations contributing to Drupal annually, Drupal is one of the healthiest and contributor-rich Open Source communities in the world.

For the past 12 years, I've also helped build Acquia, an Open Source company that heavily depends on Drupal. With almost 1,000 employees, Acquia is the largest contributor to Drupal, yet responsible for less than 5% of all contributions.

This article is not about Drupal or Acquia; it's about scaling Open Source projects more broadly.

I'm interested in how to make Open Source production more sustainable, more fair, more egalitarian, and more cooperative. I'm interested in doing so by redefining the relationship between end users, producers and monetizers of Open Source software through a combination of technology, market principles and behavioral science.

Why it must be easier to scale and sustain Open Source

We need to make it easier to scale and sustain both Open Source projects and Open Source businesses:

  1. Making it easier to scale and sustain Open Source projects might be the only way to solve some of the world's most important problems. For example, I believe Open Source to be the only way to build a pro-privacy, anti-monopoly, open web. It requires Open Source communities to be long-term sustainable — possibly for hundreds of years.
  2. Making it easier to grow and sustain Open Source businesses is the last hurdle that prevents Open Source from taking over the world. I'd like to see every technology company become an Open Source company. Today, Open Source companies are still extremely rare.

The alternative is that we are stuck in the world we live in today, where proprietary software dominates most facets of our lives.


This article is focused on Open Source governance models, but there is more to growing and sustaining Open Source projects. Top of mind is the need for Open Source projects to become more diverse and inclusive of underrepresented groups.

Second, I understand that the idea of systematizing Open Source contributions won't appeal to everyone. Some may argue that the suggestions I'm making go against the altruistic nature of Open Source. I agree. However, I'm also looking at Open Source sustainability challenges from the vantage point of running both an Open Source project (Drupal) and an Open Source business (Acquia). I'm not implying that every community needs to change their governance model, but simply offering suggestions for communities that operate with some level of commercial sponsorship, or communities that struggle with issues of long-term sustainability.

Lastly, this post is long and dense. I'm 700 words in, and I haven't started yet. Given that this is a complicated topic, there is an important role for more considered writing and deeper thinking.

Defining Open Source Makers and Takers


Some companies are born out of Open Source, and as a result believe deeply and invest significantly in their respective communities. With their help, Open Source has revolutionized software for the benefit of many. Let's call these types of companies Makers.

As the name implies, Makers help make Open Source projects; from investing in code, to helping with marketing, growing the community of contributors, and much more. There are usually one or more Makers behind the success of large Open Source projects. For example, MongoDB helps make MongoDB, Red Hat helps make Linux, and Acquia (along with many other companies) helps make Drupal.

Our definition of a Maker assumes intentional and meaningful contributions and excludes those whose only contributions are unintentional or sporadic. For example, a public cloud company like Amazon can provide a lot of credibility to an Open Source project by offering it as-a-service. The resulting value of this contribution can be substantial, however that doesn't make Amazon a Maker in our definition.

I use the term Makers to refer to anyone who purposely and meaningfully invests in the maintenance of Open Source software, i.e. by making engineering investments, writing documentation, fixing bugs, organizing events, and more.


Now that Open Source adoption is widespread, lots of companies, from technology startups to technology giants, monetize Open Source projects without contributing back to those projects. Let's call them Takers.

I understand and respect that some companies can give more than others, and that many might not be able to give back at all. Maybe one day, when they can, they'll contribute. We limit the label of Takers to companies that have the means to give back, but choose not to.

The difference between Makers and Takers is not always 100% clear, but as a rule of thumb, Makers directly invest in growing both their business and the Open Source project. Takers are solely focused on growing their business and let others take care of the Open Source project they rely on.

Organizations can be both Takers and Makers at the same time. For example, Acquia, my company, is a Maker of Drupal, but a Taker of Varnish Cache. We use Varnish Cache extensively but we don't contribute to its development.

A scale that is not in balance

Takers hurt Makers

To be financially successful, many Makers mix Open Source contributions with commercial offerings. Their commercial offerings usually take the form of proprietary or closed source IP, which may include a combination of premium features and hosted services that offer performance, scalability, availability, productivity, and security assurances. This is known as the Open Core business model. Some Makers offer professional services, including maintenance and support assurances.

When Makers start to grow and demonstrate financial success, the Open Source project that they are associated with begins to attract Takers. Takers will usually enter the ecosystem with a commercial offering comparable to the Makers', but without making a similar investment in Open Source contribution. Because Takers don't contribute back meaningfully to the Open Source project that they take from, they can focus disproportionately on their own commercial growth.

Let's look at a theoretical example.

When a Maker has $1 million to invest in R&D, they might choose to invest $500k in Open Source and $500k in the proprietary IP behind their commercial offering. The Maker intentionally balances growing the Open Source project they are connected to with making money. To be clear, the investment in Open Source is not charity; it helps make the Open Source project competitive in the market, and the Maker stands to benefit from that.

When a Taker has $1 million to invest in R&D, nearly all of their resources go to the development of proprietary IP behind their commercial offerings. They might invest $950k in their commercial offerings that compete with the Maker's, and $50k towards Open Source contribution. Furthermore, the $50k is usually focused on self-promotion rather than being directed at improving the Open Source project itself.

A visualization of the Maker and Taker math

Effectively, the Taker has put itself at a competitive advantage compared to the Maker:

  • The Taker takes advantage of the Maker's $500k investment in Open Source contribution while only investing $50k themselves. Important improvements happen "for free" without the Taker's involvement.
  • The Taker can out-innovate the Maker in building proprietary offerings. When a Taker invests $950k in closed-source products compared to the Maker's $500k, the Taker can innovate 90% faster. The Taker can also use the delta to disrupt the Maker on price.
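The R&D arithmetic above can be sketched in a few lines. This is a minimal illustration using only the dollar figures from the text; the `rd_split` helper and variable names are invented for the example.

```python
# A sketch of the Maker vs. Taker R&D math from the example above.
# The figures ($1M budgets, 50% vs. 5% Open Source shares) come from
# the text; the helper name is illustrative, not from any real system.

def rd_split(budget, open_source_share):
    """Split an R&D budget between Open Source and proprietary work."""
    open_source = budget * open_source_share
    proprietary = budget - open_source
    return open_source, proprietary

maker_os, maker_prop = rd_split(1_000_000, 0.50)   # $500k / $500k
taker_os, taker_prop = rd_split(1_000_000, 0.05)   # $50k  / $950k

# The Taker's proprietary spend relative to the Maker's:
advantage = (taker_prop - maker_prop) / maker_prop
print(f"Taker spends {advantage:.0%} more on proprietary IP")  # 90% more
```

The 90% delta is what lets the Taker out-innovate on proprietary features or undercut on price.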

In other words, Takers reap the benefits of the Makers' Open Source contribution while simultaneously having a more aggressive monetization strategy. The Taker is likely to disrupt the Maker. On an equal playing field, the only way the Maker can defend itself is by investing more in its proprietary offering and less in the Open Source project. To survive, it has to behave like the Taker to the detriment of the larger Open Source community.

Takers harm Open Source projects. An aggressive Taker can induce Makers to behave in a more selfish manner and reduce or stop their contributions to Open Source altogether. Takers can turn Makers into Takers.

Open Source contribution and the Prisoner's Dilemma

The example above can be described as a Prisoner's Dilemma. The Prisoner's Dilemma is a standard example of game theory, which allows the study of strategic interaction and decision-making using mathematical models. I won't go into detail here, but for the purpose of this article, it helps me simplify the above problem statement. I'll use this simplified example throughout the article.

Imagine an Open Source project with only two companies supporting it. The rules of the game are as follows:

  • If both companies contribute to the Open Source project (both are Makers), the total reward is $100. The reward is split evenly and each company makes $50.
  • If one company contributes while the other company doesn't (one Maker, one Taker), the Open Source project won't be as competitive in the market, and the total reward will only be $80. The Taker gets $60 as they have the more aggressive monetization strategy, while the Maker gets $20.
  • If both players choose not to contribute (both are Takers), the Open Source project will eventually become irrelevant. Both walk away with just $10.

This can be summarized in a pay-off matrix:

                               Company A contributes      Company A doesn't contribute
Company B contributes          A makes $50                A makes $60
                               B makes $50                B makes $20
Company B doesn't contribute   A makes $20                A makes $10
                               B makes $60                B makes $10

In the game, each company needs to decide whether or not to contribute, but Company A doesn't know what Company B decides, and vice versa.

The Prisoner's Dilemma states that each company will optimize its own profit and not contribute. Because both companies are rational, both will make that same decision. In other words, when both companies use their "best individual strategy" (be a Taker, not a Maker), they produce an equilibrium that yields the worst possible result for the group: the Open Source project will suffer and as a result they only make $10 each.
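The rules of the game can be encoded as a small pay-off matrix. This is a sketch using only the dollar figures stated above; the dictionary layout is just illustration.

```python
# The pay-off rules above, encoded as a small matrix.
# payoffs[(a_contributes, b_contributes)] -> (A's payoff, B's payoff)
payoffs = {
    (True,  True):  (50, 50),   # both Makers: $100 split evenly
    (True,  False): (20, 60),   # A is the Maker, B is the Taker
    (False, True):  (60, 20),   # A is the Taker, B is the Maker
    (False, False): (10, 10),   # both Takers: the project withers
}

# The temptation to defect: if B contributes, A earns more by taking.
gain_from_defecting = payoffs[(False, True)][0] - payoffs[(True, True)][0]
print(f"A gains ${gain_from_defecting} by defecting on a contributing B")

# The collective cost: mutual defection destroys most of the group reward.
print(f"Both contribute: ${sum(payoffs[(True, True)])} total")    # $100
print(f"Neither does:    ${sum(payoffs[(False, False)])} total")  # $20
```

Individually, defecting against a contributor pays $10 more; collectively, mutual defection shrinks the total reward from $100 to $20.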

A real-life example of the Prisoner's Dilemma that many people can relate to is washing the dishes in a shared house. By not washing dishes, an individual can save time (individually rational), but if that behavior is adopted by every person in the house, there will be no clean plates for anyone (collectively irrational). How many of us have tried to get away with not washing the dishes? I know I have.

Fortunately, the problem of individually rational actions leading to collectively adverse outcomes is not new or unique to Open Source. Before I look at potential models to better sustain Open Source projects, I will take a step back and look at how this problem has been solved elsewhere.

Open Source: a public good or a common good?

In economics, the concepts of public goods and common goods are decades old, and have similarities to Open Source.

Examples of common goods (fishing grounds, oceans, parks) and public goods (lighthouses, radio, street lighting)

Public goods and common goods are what economists call non-excludable, meaning it's hard to exclude people from using them. For example, everyone can benefit from fishing grounds, whether they contribute to their maintenance or not. Simply put, public goods and common goods have open access.

Common goods are rivalrous; if one individual catches a fish and eats it, the other individual can't. In contrast, public goods are non-rivalrous; someone listening to the radio doesn't prevent others from listening to the radio.

I've long believed that Open Source projects are public goods: everyone can use Open Source software (non-excludable) and someone using an Open Source project doesn't prevent someone else from using it (non-rivalrous).

However, through the lens of Open Source companies, Open Source projects are also common goods; everyone can use Open Source software (non-excludable), but when an Open Source end user becomes a customer of Company A, that same end user is unlikely to become a customer of Company B (rivalrous).

For end users, Open Source projects are public goods; the shared resource is the software. But for Open Source companies, Open Source projects are common goods; the shared resource is the (potential) customer.

Next, I'd like to extend the distinction between "Open Source software being a public good" and "Open Source customers being a common good" to the free-rider problem: we define software free-riders as those who use the software without ever contributing back, and customer free-riders (or Takers) as those who sign up customers without giving back.

All Open Source communities should encourage software free-riders. Because the software is a public good (non-rivalrous), a software free-rider doesn't exclude others from using the software. Hence, it's better to have a user for your Open Source project, than having that person use your competitor's software. Furthermore, a software free-rider makes it more likely that other people will use your Open Source project (by word of mouth or otherwise). When some portion of those other users contribute back, the Open Source project benefits. Software free-riders can have positive network effects on a project.

However, when the success of an Open Source project depends largely on one or more corporate sponsors, the Open Source community should not forget or ignore that customers are a common good. Because a customer can't be shared among companies, it matters a great deal for the Open Source project where that customer ends up. When the customer signs up with a Maker, we know that a certain percentage of the revenue associated with that customer will be invested back into the Open Source project. When a customer signs up with a customer free-rider or Taker, the project doesn't stand to benefit. In other words, Open Source communities should find ways to route customers to Makers.

Both volunteer-driven and sponsorship-driven Open Source communities should encourage software free-riders, but sponsorship-driven Open Source communities should discourage customer free-riders.

Lessons from decades of Common Goods management

Hundreds of research papers and books have been written on public good and common good governance. Over the years, I have read many of them to figure out what Open Source communities can learn from successfully managed public goods and common goods.

Some of the most instrumental research was Garrett Hardin's Tragedy of the Commons and Mancur Olson's work on Collective Action. Both Hardin and Olson concluded that groups don't self-organize to maintain the common goods they depend on.

As Olson writes in the beginning of his book, The Logic of Collective Action: "Unless the number of individuals is quite small, or unless there is coercion or some other special device to make individuals act in their common interest, rational, self-interested individuals will not act to achieve their common or group interest."

Consistent with the Prisoner's Dilemma, Hardin and Olson show that groups don't act on their shared interests. Members are disincentivized from contributing when other members can't be excluded from the benefits. It is individually rational for a group's members to free-ride on the contributions of others.

Dozens of academics, Hardin and Olson included, argued that an external agent is required to solve the free-rider problem. The two most common approaches are (1) centralization and (2) privatization:

  1. When a common good is centralized, the government takes over the maintenance of the common good. The government or state is the external agent.
  2. When a common good is privatized, one or more members of the group receive selective benefits or exclusive rights to harvest from the common good in exchange for its ongoing maintenance. In this case, one or more corporations act as the external agent.

The widespread advice to centralize and privatize common goods has been followed extensively in most countries; today, natural resources are typically managed either by governments or by commercial companies, and no longer directly by their users. Examples include public transport, water utilities, fishing grounds, parks, and much more.

Overall, the privatization and centralization of common goods has been very successful; in many countries, public transport, water utilities and parks are maintained better than volunteer contributors could have managed on their own. I certainly value that I don't have to help maintain the train tracks before my daily commute to work, or that I don't have to help mow the lawn in our public park before I can play soccer with my kids.

For years, it was a long-held belief that centralization and privatization were the only way to solve the free-rider problem. It was Elinor Ostrom who observed that a third solution existed.

Ostrom found hundreds of cases where common goods are successfully managed by their communities, without the oversight of an external agent. From the management of irrigation systems in Spain to the maintenance of mountain forests in Japan — all have been successfully self-managed and self-governed by their users. Many have been long-enduring as well; the youngest examples she studied were more than 100 years old, and the oldest exceed 1,000 years.

Ostrom studied why some efforts to self-govern commons have failed and why others have succeeded. She summarized the conditions for success in the form of core design principles. Her work led her to win the Nobel Prize in Economics in 2009.

Interestingly, all successfully managed commons studied by Ostrom switched at some point from open access to closed access. As Ostrom writes in her book, Governing the Commons: "For any appropriator to have a minimal interest in coordinating patterns of appropriation and provision, some set of appropriators must be able to exclude others from access and appropriation rights." Ostrom uses the term appropriator to refer to those who use or withdraw from a resource. Examples would be fishers, irrigators, herders, etc — or companies trying to turn Open Source users into paying customers. In other words, the shared resource must be made exclusive (to some degree) in order to incentivize members to manage it. Put differently, Takers will be Takers until they have an incentive to become Makers.

Once access is closed, explicit rules need to be established to determine how resources are shared, who is responsible for maintenance, and how self-serving behaviors are suppressed. In all successfully managed commons, the regulations specify (1) who has access to the resource, (2) how the resource is shared, (3) how maintenance responsibilities are shared, (4) who inspects that rules are followed, (5) what fines are levied against anyone who breaks the rules, (6) how conflicts are resolved and (7) a process for collectively evolving these rules.

Three patterns for long-term sustainable Open Source

Studying the work of Garrett Hardin (Tragedy of the Commons), the Prisoner's Dilemma, Mancur Olson (Collective Action) and Elinor Ostrom's core design principles for self-governance, a number of shared patterns emerge. When applied to Open Source, I'd summarize them as follows:

  1. Common goods fail because of a failure to coordinate collective action. To scale and sustain an Open Source project, Open Source communities need to transition from individual, uncoordinated action to cooperative, coordinated action.
  2. Cooperative, coordinated action can be accomplished through privatization, centralization, or self-governance. All three work — and can even be mixed.
  3. Successful privatization, centralization, and self-governance all require clear rules around membership, appropriation rights, and contribution duties. In turn, this requires monitoring and enforcement, either by an external agent (in the case of centralization and privatization) or by the members of the group itself (in the case of self-governance).

Next, let's see how these three concepts — centralization, privatization and self-governance — could apply to Open Source.

Model 1: Self-governance in Open Source

For small Open Source communities, self-governance is very common; it's easy for its members to communicate, learn who they can trust, share norms, agree on how to collaborate, etc.

As an Open Source project grows, contribution becomes more complex and cooperation more difficult: it becomes harder to communicate, build trust, agree on how to cooperate, and suppress self-serving behaviors. The incentive to free-ride grows.

You can scale successful cooperation by having strong norms that encourage other members to do their fair share and by having face-to-face events, but eventually, that becomes hard to scale as well.

As Ostrom writes in Governing the Commons: "Even in repeated settings where reputation is important and where individuals share the norm of keeping agreements, reputation and shared norms are insufficient by themselves to produce stable cooperative behavior over the long run," and "In all of the long-enduring cases, active investments in monitoring and sanctioning activities are quite apparent."

To the best of my knowledge, no Open Source project currently implements Ostrom's design principles for successful self-governance. To understand how Open Source communities might, let's go back to our running example.

Our two companies would negotiate rules for how to share the rewards of the Open Source project, and what level of contribution would be required in exchange. They would set up a contract where they both agree on how much each company can earn and how much each company has to invest. During the negotiations, various strategies can be proposed for how to cooperate. However, both parties need to agree on a strategy before they can proceed. Because they are negotiating this contract among themselves, no external agent is required.

These negotiations are non-trivial. As you can imagine, any proposal that does not involve splitting the $100 fifty-fifty is likely rejected. The most likely equilibrium is for both companies to contribute equally and to split the reward equally. Furthermore, to arrive at this equilibrium, one of the two companies would likely have to go backwards in revenue, which might not be agreeable.

Needless to say, this gets even more difficult in a scenario where there are more than two companies involved. Today, it's hard to fathom how such a self-governance system can successfully be established in an Open Source project. In the future, Blockchain-based coordination systems might offer technical solutions for this problem.

Large groups are less able to act in their common interest than small ones because (1) the complexity increases and (2) the benefits diminish. Until we have better community coordination systems, it's easier for large groups to transition from self-governance to privatization or centralization than to scale self-governance.

The concept of major projects growing out of self-governed volunteer communities is not new to the world. The first trade routes were ancient trackways which citizens later developed on their own into roads suited for wheeled vehicles. Privatization of roads improved transportation for all citizens. Today, we certainly appreciate that our governments maintain the roads.

The road system evolving from self-governance to privatization, and from privatization to centralization

Model 2: Privatization of Open Source governance

In this model, Makers are rewarded unique benefits not available to Takers. These exclusive rights provide Makers a commercial advantage over Takers, while simultaneously creating a positive social benefit for all the users of the Open Source project, Takers included.

For example, Mozilla has the exclusive right to use the Firefox trademark and to set up paid search deals with search engines like Google, Yandex and Baidu. In 2017 alone, Mozilla made $542 million from searches conducted using Firefox. As a result, Mozilla can make continued engineering investments in Firefox. Millions of people and organizations benefit from that every day.

Another example is Automattic, the company behind WordPress. Automattic is the only company that can use WordPress.com, and is in the unique position to make hundreds of millions of dollars from WordPress' official SaaS service. In exchange, Automattic invests millions of dollars in the Open Source WordPress each year.

Recently, there have been examples of Open Source companies like MongoDB, Redis, Cockroach Labs and others adopting stricter licenses because of perceived (and sometimes real) threats from public cloud companies that behave as Takers. The ability to change the license of an Open Source project is a form of privatization.

Model 3: Centralization of Open Source governance

Let's assume a government-like central authority can monitor Open Source companies A and B, with the goal to reward and penalize them for contribution or lack thereof. When a company follows a cooperative strategy (being a Maker), they are rewarded $25 and when they follow a defect strategy (being a Taker), they are charged a $25 penalty. We can update the pay-off matrix introduced above as follows:

                               Company A contributes      Company A doesn't contribute
Company B contributes          A makes $75 ($50 + $25)    A makes $35 ($60 - $25)
                               B makes $75 ($50 + $25)    B makes $45 ($20 + $25)
Company B doesn't contribute   A makes $45 ($20 + $25)    A makes -$15 ($10 - $25)
                               B makes $35 ($60 - $25)    B makes -$15 ($10 - $25)

We took the values from the pay-off matrix above and applied the rewards and penalties. The result is that both companies are incentivized to contribute and the optimal equilibrium (both become Makers) can be achieved.
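The reward-and-penalty transformation can be sketched directly. This is a minimal illustration using the $25 reward/penalty and base pay-offs from the text; the function and variable names are invented for the example.

```python
# A sketch of how a central authority's rewards and penalties reshape
# the base pay-off matrix. Figures are from the text's example.

REWARD, PENALTY = 25, 25

# base[(a_contributes, b_contributes)] -> (A's payoff, B's payoff)
base = {
    (True,  True):  (50, 50),
    (True,  False): (20, 60),
    (False, True):  (60, 20),
    (False, False): (10, 10),
}

def regulate(payoff, contributes):
    """Add the reward for contributing, or subtract the penalty."""
    return payoff + REWARD if contributes else payoff - PENALTY

regulated = {
    (a, b): (regulate(pa, a), regulate(pb, b))
    for (a, b), (pa, pb) in base.items()
}

# With regulation, contributing is the better choice for A no matter
# what B does -- so both companies are pushed toward being Makers.
assert regulated[(True, True)][0] > regulated[(False, True)][0]
assert regulated[(True, False)][0] > regulated[(False, False)][0]
print(regulated[(True, True)])  # (75, 75) -- the optimal equilibrium
```

Note that contributing now dominates: $75 beats $35 when the other company contributes, and $45 beats the penalized outcome when it doesn't.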

The money for rewards could come from various fundraising efforts, including membership programs or advertising, to name just a few examples. More likely, however, is the use of indirect monetary rewards.

One way to implement this is Drupal's credit system. Drupal's non-profit organization, the Drupal Association monitors who contributes what. Each contribution earns you credits and the credits are used to provide visibility to Makers. The more you contribute, the more visibility you get on Drupal.org (visited by 2 million people each month) or at Drupal conferences (called DrupalCons, visited by thousands of people each year).

A screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.
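A credit system like the one described above can be sketched as a simple weighted tally. This is a toy illustration only: the contribution types, weights, and field names are invented, and Drupal's real credit algorithm is considerably more nuanced.

```python
# A toy sketch of a credit-based visibility ranking, loosely inspired
# by the Drupal credit system described above. Weights are hypothetical.

from collections import Counter

contributions = [
    {"org": "Org A", "kind": "patch"},
    {"org": "Org A", "kind": "patch"},
    {"org": "Org B", "kind": "event"},
    {"org": "Org A", "kind": "docs"},
]

# Hypothetical weights per contribution type.
weights = {"patch": 3, "docs": 2, "event": 1}

credits = Counter()
for c in contributions:
    credits[c["org"]] += weights[c["kind"]]

# Rank organizations by earned credits; more credits -> more visibility
# (e.g. on a project website or at conferences).
ranking = [org for org, _ in credits.most_common()]
print(ranking)  # ['Org A', 'Org B']
```

The hard part, as the next paragraph notes, is not the tallying but deciding what counts as a contribution and how much each kind is worth.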

While there is a lot more the Drupal Association could and should do to balance its Makers and Takers and achieve a more optimal equilibrium for the Drupal project, it's an emerging example of how an Open Source non-profit organization can act as a regulator that monitors and maintains the balance of Makers and Takers.

The big challenge with this approach is the accuracy of the monitoring and the reliability of the rewarding (and sanctioning). Because Open Source contribution comes in different forms, tracking and valuing Open Source contribution is a very difficult and expensive process, not to mention full of conflict. Running this centralized government-like organization also needs to be paid for, and that can be its own challenge.

Concrete suggestions for scaling and sustaining Open Source

Suggestion 1: Don't just appeal to organizations' self-interest, but also to their fairness principles

If, like most economic theorists, you believe that organizations act in their own self-interest, we should appeal to that self-interest and better explain the benefits of contributing to Open Source.

Despite the fact that hundreds of articles have been written about the benefits of contributing to Open Source — highlighting speed of innovation, recruiting advantages, market credibility, and more — many organizations still miss these larger points.

It's important to keep sharing Open Source success stories. One thing that we have not done enough is appeal to organizations' fairness principles.

While a lot of economic theories correctly assume that most organizations are self-interested, I believe some organizations are also driven by fairness considerations.

Despite the term "Takers" having a negative connotation, it does not assume malice. For many organizations, it is not apparent if an Open Source project needs help with maintenance, or how one's actions, or lack thereof, might negatively affect an Open Source project.

As mentioned, Acquia is a heavy user of Varnish Cache. But as Acquia's Chief Technology Officer, I don't know if Varnish needs maintenance help, or how our lack of contribution negatively affects Makers in the Varnish community.

It can be difficult to understand the consequences of our own actions within Open Source. Open Source communities should help others understand where contribution is needed, what the impact of not contributing is, and why certain behaviors are not fair. Some organizations will resist unfair outcomes and behave more cooperatively if they understand the impact of their behaviors and the fairness of certain outcomes.

Make no mistake though: most organizations won't care about fairness principles; they will only contribute when they have to. For example, most people would not voluntarily redistribute 25-50% of their income to those who need it. However, most of us agree to redistribute money by paying taxes, but only so long as all others have to do so as well.

Suggestion 2: Encourage end users to offer selective benefits to Makers

We talked about Open Source projects giving selective benefits to Makers (e.g. Automattic, Mozilla, etc), but end users can give selective benefits as well. For example, end users can mandate Open Source contributions from their partners, and we have seen successful examples of this in the Drupal community.

If more end users of Open Source took this stance, it could have a very big impact on Open Source sustainability. For governments, in particular, this seems like a very logical thing to do. Why would a government not want to put every dollar of IT spending back in the public domain? For Drupal alone, the impact would be measured in tens of millions of dollars each year.

Suggestion 3: Experiment with new licenses

I believe we can create licenses that support the creation of Open Source projects with sustainable communities and sustainable businesses to support it.

For a directional example, look at what MariaDB did with their Business Source License (BSL). The BSL gives users complete access to the source code so users can modify, distribute and enhance it. Only when you use more than x of the software do you have to pay for a license. Furthermore, the BSL guarantees that the software becomes Open Source over time; after y years, the license automatically converts from BSL to General Public License (GPL), for example.

A second example is the Community Compact, a license proposed by Adam Jacob. It mixes together a modern understanding of social contracts, copyright licensing, software licensing, and distribution licensing to create a sustainable and harmonious Open Source project.

We can create licenses that better support the creation, growth and sustainability of Open Source projects and that are designed so that both users and the commercial ecosystem can co-exist and cooperate in harmony.

I'd love to see new licenses that encourage software free-riding (sharing and giving), but discourage customer free-riding (unfair competition). I'd also love to see these licenses support many Makers, with built-in equity and fairness principles for smaller Makers or those not able to give back.

If, like me, you believe there could be future licenses that are more "Open Source"-friendly, not less, it would be smart to implement a contributor license agreement for your Open Source project; it allows Open Source projects to relicense if/when better licenses arrive. At some point, current Open Source licenses will be at a disadvantage compared to future Open Source licenses.

Conclusion

As Open Source communities grow, volunteer-driven, self-organized communities become harder to scale. Large Open Source projects should find ways to balance Makers and Takers, or they risk not innovating enough under the weight of Takers.

Fortunately, we don't have to accept that future. However, this means that Open Source communities potentially have to get comfortable experimenting with how to monitor, reward and penalize members in their communities, particularly if they rely on a commercial ecosystem for a large portion of their contributions. Today, that goes against the values of most Open Source communities, but I believe we need to keep an open mind about how we can grow and scale Open Source.

Making it easier to scale Open Source projects in a sustainable and fair way is one of the most important things we can work on. If we succeed, Open Source can truly take over the world — it will pave the path for every technology company to become an Open Source business, and also solve some of the world's most important problems in an open, transparent and cooperative way.

September 19, 2019
