Feb 09 2021

On January 15, 2021, Drupal celebrated its 20th birthday - a major accomplishment for an open source project! As we look to the future, how can we propel Drupal forward to be the top digital experience platform?

Vision. Contribution. Innovation.

With Drupal 10 on the horizon for summer of 2022, the Drupal Association is helping to rally the community around Drupal’s key strategic initiatives, critical to meeting this milestone. We are collaborating with the Core Mentoring Coordinators, the Core Initiative Coordinators, and the DrupalCon Program Committee to evolve the contribution experience at DrupalCon to support key initiatives, illustrate how to get involved, and, ultimately, accelerate project innovation.

DrupalCon North America will incorporate Drupal's strategic goals and contribution opportunities into the main event programming. As part of your DrupalCon experience, you will have the ability to deep dive into a specific initiative or participate in more general conference programming. 

Each day, we focus

Each day, in addition to the traditional DrupalCon content, we will have a focus on one of the Drupal Project’s Strategic Initiatives and will begin with an Initiative Keynote. Key members of the Initiative will take us all on a journey through the goals of that initiative, and how it will help create a better product for us all to build future digital experiences. Co-speakers will join the keynote to place the initiative in the wider context of the value this feature brings to Drupal. This content will not just be for developers, but for everyone who has a stake in the strategic future of Drupal.

At the conclusion of the session, all the keynote speakers will be available for discussion with attendees.

Throughout the day, the program will contain select, highlighted content that helps participants of all backgrounds and skill sets to learn in detail how the initiative works, how it will enable them to build more exciting experiences, how to sell those experiences more effectively, and how they can participate in contributing to the initiative’s success.

Each day, we contribute

In the latter part of the day, participants will have the option to engage in interactive and supported contribution activities. Contribution opportunities will be available for both technical and non-technical audiences at all experience levels.

The day’s initiative keynote team will be available to coordinate contributions, alongside our mentoring team. On other days, we will all give the initiative team the space to build upon the successes coming out of their day in the spotlight.

Contribution is not limited to the focused initiatives. There will be space and opportunity for individuals who already have their own contribution teams and goals. You’re encouraged to work on them at DrupalCon and share your stories via #DrupalContributions for us to share with the wider community.

The transformation of contribution opportunities at DrupalCon will only make an impact with your support. Accelerate the project by contributing at DrupalCon and elevate the future of Drupal.

We hope you'll join us! Register for DrupalCon today.

More information about the specific schedule, content, and areas of focus for contribution will be coming soon. 

Editor's note: DrupalCon contribution activities will be available for free for non-ticket-holders as always.

About DrupalCon North America 2021

At DrupalCon North America 2021, the people who create the digital experiences the world relies on every day—developers, marketers, agency leaders, and business decision makers—come together to learn about the latest Drupal developments and contribute to the future of the platform.

This year’s all-virtual event, April 12-16, is designed to fit into your schedule, so you can still participate in the conference while balancing work and life at home. Live sessions will be held across five days, Monday - Friday from 11:00 AM - 3:00 PM Eastern, with these contribution activities in the latter part of the day.

In events ranging from keynote presentations to hands-on skill-building workshops, you’ll:

  • Learn Drupal best practices, new skills, and innovative strategies you can use in your own projects.
  • Connect with others in the Drupal community, growing your network and collaborating on ways to make the platform even better.
  • Build your career or business, with opportunities to showcase your own work and meet potential employers, clients, talent, and partners.

Whether you attend DrupalCon to hear the vision from founder Dries Buytaert, to learn from mainstage speakers about open source and beyond, to develop your own talent or recruit talent of others, or to build relationships with new contributors, collaborators, and friends - there is a place for you at DrupalCon. 

Feb 09 2021

Call for content closes February 14

Submit your proposal to host a short, interactive session, and share what you know with the Drupal community. 

All sessions will be virtual this year, and facilitators and presenters will get free registration for the entire conference. We strongly encourage submissions from facilitators and presenters who have been marginalized due to racism, misogyny, transphobia, and other forms of discrimination.

We’re looking for ideas related to, but not limited to, the following topic areas:

  • Drupal Community Health
  • Content & Marketing
  • Core Contributions
  • Development & Coding
  • DevOps
  • Drupal and Open Source 101
  • Drupal Showcase
  • Leadership, Management, & Business
  • Site Building
  • User Experience, Accessibility & Design

Learn more about session topics and submission guidelines.

If you’d like to lead a deeper dive into topics related to your industry, submit a presentation for one of our industry summits, held virtually throughout the month of April:

  • Tuesday, April 6: Drupal Community Summit
  • Tuesday, April 20: Higher Education Summit
  • Thursday, April 22: Healthcare Summit
  • Tuesday, April 27: Nonprofit Summit
  • Thursday, April 29: Government Summit

Learn, Connect, Build

DrupalCon North America is the largest DrupalCon event of the year, held virtually this year from April 12–16, 11:00 AM - 3:00 PM Eastern (UTC-4).

In keynote presentations, hands-on skill-building workshops, breakout discussions, and interactive sessions, you’ll:

  • Learn Drupal best practices, new skills, and innovative strategies you can use in your own projects.
  • Connect with others in the Drupal community, growing your network and collaborating on ways to make the platform even better.
  • Build your career or business, with opportunities to showcase your own work and meet potential employers, clients, talent, and partners.

Gain visibility with targeted engagement opportunities 

Become a DrupalCon sponsor to reach thousands of the world's top Drupal contributors, influencers, and organizations. Sponsorship packages start at $4,050. 
Contact us today to learn about DrupalCon sponsorship opportunities.

Our Diamond Sponsor

Acquia
Have a question?

We’re here for you. Find answers to common DrupalCon questions in our FAQ.
Feb 09 2021

For one of our projects, we wanted to let site visitors navigate more effectively via individual assistants that inform them and direct them to the relevant topics depending on the context. The goal behind this is to give end users the information they need quickly and easily, at the time they need it.

Such assistants can be found more and more across the web. However, for our Drupal project we did not find the right thing for our needs - neither an existing Drupal module nor a service provider offering such a solution. So it was clear: a new module was needed, and it should be available as a contribution for the whole Drupal community. Voilà - a few days later we had developed the Site Assistant module.

View of a site assistant in action.

Requirements for the assistant

The overall requirement for the assistant is that it must be easy to use and easy to adapt. Editorial teams should be able to create their own assistants with customized content for different areas. In addition, we defined the following requirements for the new module:

  • The content is created and maintained editorially.
  • There can be multiple assistants with different content.
  • The assistants are displayed depending on conditions (condition plugins).
  • The design of the assistants is customizable.
  • The assistants are sticky and usable across different viewports.
  • Editors create assistants themselves, without a site builder and without the need for deployment.

Inside the new Drupal module

From a technical perspective, the condition handling for the context-dependent display of the assistant is certainly the most exciting part. The Site Assistant module uses the existing condition plugin system and builds on it.

The display of our assistants is controlled by the conditions. A big advantage of the condition plugin system is that the Drupal community continuously provides new condition plugins, steadily expanding the selection. For example, there are already condition plugins for the Drupal Commerce module for online stores. So if your Drupal site already has a store that uses this module and its condition plugin(s), you only need to activate the Site Assistant module and you can directly reuse the store's existing conditions for the assistant as well.
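To make this concrete, here is a sketch of what a custom condition plugin looks like in Drupal 8/9. The module name, class and "office hours" logic are hypothetical and not part of the Site Assistant module; the pattern itself - an annotated class extending ConditionPluginBase - is standard Drupal core API:

```php
<?php

namespace Drupal\my_module\Plugin\Condition;

use Drupal\Core\Condition\ConditionPluginBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * A hypothetical condition: true during configured office hours.
 *
 * @Condition(
 *   id = "my_module_office_hours",
 *   label = @Translation("Office hours")
 * )
 */
class OfficeHours extends ConditionPluginBase {

  public function defaultConfiguration() {
    return ['start' => 9, 'end' => 17] + parent::defaultConfiguration();
  }

  public function buildConfigurationForm(array $form, FormStateInterface $form_state) {
    // Editors configure the hours in the condition's settings form.
    $form['start'] = [
      '#type' => 'number',
      '#title' => $this->t('Start hour'),
      '#default_value' => $this->configuration['start'],
    ];
    $form['end'] = [
      '#type' => 'number',
      '#title' => $this->t('End hour'),
      '#default_value' => $this->configuration['end'],
    ];
    return parent::buildConfigurationForm($form, $form_state);
  }

  public function submitConfigurationForm(array &$form, FormStateInterface $form_state) {
    $this->configuration['start'] = $form_state->getValue('start');
    $this->configuration['end'] = $form_state->getValue('end');
    parent::submitConfigurationForm($form, $form_state);
  }

  // Returns TRUE when the element using this condition should be shown.
  public function evaluate() {
    $hour = (int) date('G');
    return $hour >= $this->configuration['start'] && $hour < $this->configuration['end'];
  }

  public function summary() {
    return $this->t('Between @start:00 and @end:00', [
      '@start' => $this->configuration['start'],
      '@end' => $this->configuration['end'],
    ]);
  }

}
```

Once such a plugin exists, it appears alongside core's conditions in any UI that consumes the condition plugin system, including an assistant's display options.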

The Site Assistant module itself is divided into three parts: the assistants, the assistant list items, and the assistant library items.

  • The assistants themselves are composed of a content field and a display options field. In the content field, the content to be displayed is placed and arranged using multiple list entries as needed. The display field is used to configure the conditions (condition plugins) that determine when the assistant is displayed.
  • The assistant list items are the contents displayed in the assistant. The module already ships with a set of predefined list item types: headline, link, link list, subpage, library content, and free text (WYSIWYG). Site builders can create their own list item types, giving editors more choices in content design.
  • Assistant library items are content stored separately for reuse, and can be used in different assistants. An assistant library item consists of any number of list items. For example, for recurring contact details, an assistant library item can be created with the heading "Contact" and a WYSIWYG field with address, phone number and email, which is then used in different assistants.

The design of the assistant can be customized at different levels. Since each item in the assistant is a separate entity, they can be customized and formatted as usual via Drupal templates. The general design of the assistant can be customized via CSS.

Entry form for the fields of a site assistant.

And what's more?

With our new module it is now quick and easy to introduce context-dependent assistants in Drupal to better support end users. The Site Assistant module provides a good base, but of course there is still a lot of potential for extensions and customizations. With a little development effort, you can create your own custom conditions (a tutorial can be found on Drupal.org) and tailor the functionality exactly to your needs.

We would like to see more assistant list item types shared by and for the Drupal community so that the possibilities of the assistant keep growing. For example, I can well imagine chatbots (possibly integrated directly via service providers) becoming part of an assistant.

Feb 09 2021

A myth is a pretty powerful thing, especially if you end up believing it. Even if you don’t, it does dampen the prospects of whatever it is associated with. It’s like if someone told you that the latest smartphone, which is exorbitantly expensive, is not worth the price. In such a scenario, even if you had the means of buying it, you would end up thinking twice. And you don’t even know whether the myth is true or not. That is how powerful a myth can be. 
A myth, a misconception, a false belief - whatever you call it - attaches to as many things as you can imagine. It follows you and me, and it also follows inanimate objects, making their abilities seem weaker than they actually are. 
And it is one such thing that we are going to discuss, trying to debunk the false claims that have followed it for a long time. The thing I will be talking about is a piece of software: a content management system by the name of Drupal.
Drupal is used to build websites - feature-packed, high-performing websites - yet many claims have been made that try to show Drupal in a bad light.
Today, you and I will get into all of these and make sure the myths associated with the name Drupal are busted so thoroughly that no one can ever claim them again. So, let's start with the most common misconceptions about Drupal.

# Drupal tends to be difficult to use

With any software or system you end up using, the foremost aspect you look for is usability. It might provide you with a ton of features and functionality, but if you cannot figure out how to use it, all of that would be a waste. So, on that note, let me tell you the first Drupal myth. 

The most common myth about Drupal is that it is very difficult for all the parties involved to use. Developers find it tedious to work with, marketers can't get the hang of it, and the content authors and editors, well, they feel they are way out of their element. 
This myth is not really true. Drupal is a little more complex to use than some of its competitors, but it requires that level of complexity to do its job properly. For instance, Drupal provides an immense number of modules, which between them can provide any kind of functionality you may be looking for. Selecting from these can be a daunting task, but it is also a necessary one. 

Let us look at some of Drupal's complexities to understand why an easy-to-use CMS is so often misjudged.

  • Drupal’s large codebase seems intimidating, but the system only loads what you need, so the point is basically moot here. 
  • Drupal uses more memory, so it is perceived that if there is an out-of-memory error, it would become very difficult to solve, making the experience pretty complicated. 
  • Then, there is the fact that Drupal is built on PHP. Since that is a language many developers lack experience with, the myth of difficulty in use prevails all the more.
  • The last one would be the supposedly lousy UX Drupal provides to non-technical users. Being a content editor myself, I would say that this isn't true at all. I have been able to use Drupal with ease to edit and publish my content, and so far the experience hasn't been lousy in the least. 

That is my take on Drupal's ease of use. There is an acclimation period required, but that is true of any new technology. More on the misconception of Drupal being difficult here.

# Drupal migration and upgrade tend to be an insurmountable task

Migrations and upgrades are an inevitable part of website development. There will come a point when the version you have built your site on is going to become so basic and on the verge of being obsolete that you would need to upgrade or migrate to something new and better. 
With Drupal, the story is the same. From the first version of Drupal that was launched almost two decades ago to Drupal 9, which is the current version, there is a stark difference that is too obvious to ignore. 
The myth going on is that these migrations and upgrades of Drupal are very difficult, complex and will most likely give you a headache. The truth behind it is quite the opposite. 
I would be wrong to say that Drupal migrations do not require any work, because they do. Imagine moving from your home in India to the US: there would be work required, and lots of it too. However, the work won't be so much that it makes you rethink the move, because the other side is too special to give up. 
Coming back to Drupal, upgrades usually go from Drupal 7 to 8, from 8 to 9, or directly from Drupal 7 to 9, all of which are possible scenarios for your site and its enhanced functionality. 
Drupal 7 users can first upgrade to Drupal 8 and then move on to Drupal 9. Or they may choose the best route and go directly to Drupal 9 to ensure that the upgraded site has the maximum expected life.
For the Drupal 8 to Drupal 9 upgrade, there are six steps involved - only six, not a bazillion.

  • First, ensure that your hosting environment meets the platform requirements mandated by Drupal 9.
  • Then update to one of Drupal 8's most recent minor versions, 8.8.x or 8.9.x. 
  • Once that is done, ensure that all your contributed projects are compatible with Drupal 9 by simply updating them. 
  • When that is out of the way, update your custom code so that it, too, is compatible with Drupal 9. 
  • In the penultimate step, update the core codebase to Drupal 9. 
  • And as the final step, all you have to do is run update.php, and that is it. 
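Assuming a Composer-managed Drupal 8 codebase with Drush installed (the exact version constraints below are illustrative, not prescriptive), the steps above roughly translate to a command sequence like this:

```shell
# Report Drupal 9 readiness of contributed and custom code
# (Upgrade Status is a contributed module on Drupal.org).
composer require drupal/upgrade_status
drush pm:enable upgrade_status

# Update contributed projects to Drupal 9-compatible releases.
composer update

# Update the core codebase itself to Drupal 9.
composer require drupal/core-recommended:^9 \
  drupal/core-composer-scaffold:^9 --update-with-all-dependencies

# Run the pending database updates (the update.php step).
drush updatedb
```

As always, run the whole sequence on a copy of the site first and keep a database backup handy.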

There is also a step-by-step guide in the upgrading section provided by Drupal to help with the transition process, with all the details you may be looking for. 
It is true that upgrading from Drupal 7 to 8 or 9 can be pretty intricate. The Drupal community took cognisance of this matter and made sure upgrading to Drupal 9 would be as easy as it gets. As a matter of fact, the upgrade to Drupal 9 has been deemed the easiest upgrade of the decade. This in itself should be enough to pop this myth there and then. Access this ultimate guide to Drupal 9 to know more, and browse our complete list of Drupal 9 FAQs, which answers every burning question you might have regarding Drupal 9.

# Now that Drupal 9 has rolled out, Drupal 7 and 8 tend to be less efficient

Since we just talked about upgrading to Drupal 9, I felt this myth, which has been around for a while now, needed some straightening too. 

The myth is that since Drupal 9 has launched and is the most advanced and feature-packed version of Drupal, the earlier editions of Drupal, namely 7 and 8, are simply no longer viable. 

This is not by any means true. It will become true at some point, but that point is still some distance in the future - not according to me; the makers themselves say so. 

  • If I talk about Drupal 8, which relies on Symfony 3, it will be supported until the 2nd of November, 2021, since that is how long the life of Symfony 3 is expected to be.
  • Talking about Drupal 7, its community support was earlier marked to end by November 2021. However, with COVID and the consequent crisis, that has been extended by a year. So, Drupal 7 is expected to be supported by the Drupal community until 28 November, 2022.

This support is proof that Drupal 7 and 8 will remain fully functional for a couple of years, and their efficiency is not going to be marred in any way. The sites and projects relying on them will continue to bask in all the glorified features of Drupal. Don't believe anyone who tells you otherwise. Dries Buytaert, the founder and project lead of Drupal, takes pride in continuing to care for old software. Drupal 7 was released almost a decade ago and still gets the sort of care and attention from the Drupal community it needs to function well. He wishes more software were as well-maintained as Drupal is.

Another thing I want to add is that even if community support dies down, there is still vendor support that keeps a project efficient. For Drupal 7 sites, that support is extended to 2025. Let me also tell you that there are still many Drupal 6 sites performing efficiently through vendor support. 

Yes, the end of life would come for the older versions sooner than the later versions, but that is how life in general works, don’t you agree?

# Drupal tends to be heavy on the pocket

Financial considerations are one of the major factors that pivot someone's intentions towards taking up a project or piece of software. The same is true for Drupal as well, taking us to the next myth.

It is perceived that Drupal is extremely expensive, making it hard for smaller organisations to take it up as the software for building their websites.

Drupal is open source software, which means it is free of cost, so this myth is a little funny to say the least. Yes, open source software is not entirely free; rather, its implementation and maintenance are not free. It will definitely cost you to hire Drupal developers to make that happen. Everything from maintaining Drupal modules and updating your digital properties to migrating Drupal to the most current version requires skilled developers, who are not always economical. However, these costs are not exorbitant, at least not in comparison to proprietary software, with its licensing fees and other perpetual expenses. 

Then there is the support of the Drupal community, which is always there to help you out of any dilemma you find yourself in. Any and all of your questions will be answered. This also means you get to take advantage of the abundance of experience found throughout the community; you can simply build on solutions that have already been created. 

Apart from this, your non-technical staff, especially your content authors, are not reliant on developers to post and edit content. This frees up the developers, leading to savings. Finally, migrating from one Drupal version to the next is not expensive at all either. So, if you are planning to move from Drupal 8 to 9, remember the switch from Drupal 8.0 to 8.1; the migration will be that simple. 

Now, you tell me, is this not cost effective?

# Drupal tends to lack in security

For any web application, it is extremely important to be secure. Security issues make you vulnerable and prone to hackers, and that outcome is never going to be favourable. 

So, this Drupal myth states that the CMS is not secure at all: it is open source software, and that is supposedly reason enough to doubt all of its claimed security features. 

Let me start by telling you that open source security cannot be taken lightly. Also known as Software Composition Analysis, open source security gives users more visibility into their applications. Everything from examining binary fingerprints to using professional and proprietary research, corroborated with scans, goes into building elements and tools that help developers create safer applications. 

Focusing on Drupal, it is deemed one of the most secure CMSs in the market - not just in the open source market, but in the proprietary one as well.

A pie chart depicting the results of a sample group survey on the security of various CMSs. Source: Acunetix

The above image clearly shows Drupal leading the way in terms of security, being the CMS with the fewest issues in the sample group's findings.

Let’s find out why. 

  • Drupal’s security team works with the community to tackle any security issue as soon as it arises. 
  • Drupal’s APIs and default configuration are equipped to handle security issues like XSS, injection and forgeries with standard solutions. 
  • Drupal provides a lot of out-of-the-box security features like secure access, granular user access control and database encryption to make it all the more secure. 
  • Then there is the fact that many prominent government agencies use Drupal to build and manage their online projects. This speaks to its security measures. 
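One concrete example of Drupal's out-of-the-box hardening is the trusted host mechanism: a small settings.php fragment (the hostnames below are placeholders) that makes Drupal reject any request whose HTTP Host header does not match an allowed pattern, blocking a class of cache-poisoning and spoofing tricks:

```php
// In sites/default/settings.php: only serve requests whose Host header
// matches one of these regular expressions.
$settings['trusted_host_patterns'] = [
  '^www\.example\.com$',
  '^example\.com$',
];
```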

So, no, Drupal does not by any means lack in terms of security; rather, its security is almost impeccable and really hard to breach.

# Drupal tends to be unscalable and gives a meagre performance

Despite how great your site is now, there will come a point when its present state no longer aligns with your business goals. That is when you will need to scale your site accordingly and boost its performance. This leads us to the next myth. 

It is often presumed that Drupal is not very scalable, and that its performance under higher traffic loads and growing content is lacklustre at best. 

As an answer to this preposterous myth, I just want to say: if that were the case, why would sites like The Weather Company and NBC, which have daily audiences in the hundreds of thousands, use Drupal? The reality is the exact opposite of the myth. 

Drupal can handle traffic spikes, it can handle content growth, it can handle an incredibly elevated user count, and it can do it all like a breeze. All you have to do is optimise Drupal to the best of its abilities. It provides a number of features and modules for you to manage your site's performance and scalability. Be it the Blazy module to lazy-load your images or a Content Delivery Network to offload your site's delivery, Drupal has you covered.

I think this misconception is cleared up now. These guides to Drupal performance optimisation techniques and scalability offerings will clear the air further.

# Drupal tends to be inaccessible

Web accessibility refers to a website or application being built in a manner that anyone can access with ease, with a special focus on people with disabilities. The World Wide Web Consortium has set guidelines that web developers have to follow for their sites to be universally accessible.
According to this Drupal myth, Drupal is not universally accessible: it is not meant to be used by people with disabilities and could even cause them harm if they were to use Drupal sites. 
There is absolutely no truth in this misconception. Drupal stringently follows the WCAG 2.0 guidelines and has built its features accordingly.

  • The Olivero theme for the front end in Drupal 9 is a prime example of Drupal’s accessibility. With its focus on colour, contrast and fonts in accordance with the WCAG 2.0 guidelines, it is universally accessible. 
  • The use of HTML5 and WAI-ARIA has led to better semantics of purpose and behaviour of the web pages for the screen readers. 
  • The use of alt text in images helps in making them accessible to the visually impaired. 

These are simply a few examples of Drupal’s accessibility features and values. To understand web accessibility completely, read our blog Design Considerations for Accessibility and know that Drupal follows each one of them.

# Drupal tends to be unequipped to handle large sites

There are two kinds of websites: those for small businesses, and those that fall under the big business umbrella. Taking these categories into consideration, we come to the next false claim. 

Many believe that Drupal is only competent enough to handle smaller sites, and that when it comes to larger businesses and their web needs, Drupal falls short. 

To refute this bizarre claim, let's just look at some of Drupal's clientele. Tesla, Oxford University, the European Commission, the NBA and the French Government are just a few names that need no elaboration; people already know them. With such an elite clientele, is it justified to say that Drupal cannot handle large sites? I think not. 

Drupal is well equipped to provide enterprise-grade services and features that include: 

  • Impeccable user management; 
  • Impeccable content management; 
  • An impeccable admin interface; 
  • Impeccably easy coding; 
  • An impeccable technology stack; 
  • And an infrastructure that houses all these impeccable innovations. 

That is a whole lot of impeccable, but that is Drupal for you.

# Drupal tends to be incompatible with mobile devices and unsuitable for mobile solutions

Today, it is the responsive sites that reign over the internet. If you have a site that is compatible with the computer and the mobile phone, you can consider yourself amongst the rulers, metaphorically speaking of course. 
Drupal is often thought of as a CMS that is not mobile-friendly, meaning the sites and applications built on it supposedly cannot support responsive designs. 
Again, this isn't the case. Rather, Drupal works on the ideology of building responsive sites and web applications that provide an enthralling visitor experience regardless of the device being used. Drupal is compatible with mobile devices as well as desktops, since it has the ability to offer a seamless content experience to every user, every time. So, the myth is debunked. 

# Drupal tends not to integrate with third-party tools

Confining a website to just one tool or piece of software has become a thing of the past. With more technological innovation come more third-party integrations. And where does Drupal stand in all of that?
According to the myth, Drupal is not at the top of the integrations list. Rather, it is assumed that Drupal does not work well with other tools and has an isolated digital marketing philosophy as well. 
There is very little truth in this assumption. In fact, Drupal can integrate with a massive ecosystem of digital marketing technologies and other business applications. It gives you the chance to tap into the most popular tools of the present, and to do the same with those of the future. 
Drupal also follows an API-first approach, which essentially means that your content can easily be connected to other sites and applications. Your words can therefore resonate with a much wider audience, making them all the more powerful.
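That API-first claim is quite concrete: with the core JSON:API module enabled, every content entity is exposed at a predictable endpoint without any custom code. A quick sketch (the domain and UUID below are placeholders):

```shell
# List article nodes as JSON:API documents.
curl -H 'Accept: application/vnd.api+json' \
  'https://example.com/jsonapi/node/article'

# Fetch one article by UUID, requesting only selected fields
# (JSON:API "sparse fieldsets").
curl -H 'Accept: application/vnd.api+json' \
  'https://example.com/jsonapi/node/article/0a1b2c3d-...?fields[node--article]=title,body'
```

Any frontend, app or third-party service that speaks HTTP and JSON can consume this, which is exactly what makes the integration story credible.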

# Drupal tends to have an inflexible and uneasy content workflow

Content is basically the voice of your site, so it is wise to choose a platform that makes your voice the loudest and clearest. Does Drupal do that?
The myth related to content workflow says it doesn't. As per the misconception, Drupal is quite inflexible in terms of content, and using it for creating, editing and publishing content is not easy at all.
Let me start debunking this Drupal myth with Drupal's admin interface. It helps you create the exact content architecture you want, displaying only the content suitable for each context. Drupal's efficient display mode tools and Views make this task all the more fun. You can add any media type you want, be it images, videos or PDFs. And you can customise your menus to align them with the user's device. 
And there is more. 

  • You can create and edit in-place. You can simply browse to a page, click on the content and start editing then and there. 
  • You can edit from any mobile device - iPads, smartphones or tablets, Android or iOS, your pick. 
  • You can make revisions multiple times and keep track of all of them even months after. 

Is this what you call inflexible and difficult? 
Modules like Layout Builder and Paragraphs are renowned for the ease they provide to editors and content authors.

# Drupal tends to be unfriendly with SEO

SEO and all that it encompasses is essential, even life-saving, for your website and its visibility on the web. Since that is what brings in the numbers, you know how important it is. 
So, the myth doing the rounds is that Drupal is not SEO-friendly: that it lacks the features to heighten the visibility of your site on Google or any of the other search engines.
Do you think that could be true?
I certainly don't, and neither should you. Drupal has dedicated features and modules that help you get the best out of SEO. Take the SEO Checklist module, for instance. It keeps you on top of all your SEO-related tasks and makes sure you are reminded of them, and it is continually updated with the latest SEO guidelines so that you are aware of them and ready to act.
From modules that optimise your URLs to ones that handle tags, communication and editing, Drupal has you covered for every SEO dimension you can think of. Our blog, The ultimate Drupal SEO guide, gives an even more elaborate explanation of Drupal's SEO capabilities, which are by no means lacking.
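Clean, human-readable URLs are a big part of that; in Drupal, the contributed Pathauto module generates path aliases from titles automatically. As a rough, hypothetical sketch of the kind of transformation such a module performs (this is not Pathauto's actual code, which is configured through Drupal's admin UI):

```javascript
// Hypothetical sketch of a Pathauto-style alias generator; real
// Pathauto is configured in Drupal's admin UI, not hand-written.
function toAlias(title, prefix = '/blog') {
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-') // collapse spaces and punctuation
    .replace(/^-+|-+$/g, '');    // trim leading/trailing dashes
  return `${prefix}/${slug}`;
}

console.log(toAlias('The Ultimate Drupal SEO Guide!'));
// → /blog/the-ultimate-drupal-seo-guide
```

Search engines and readers alike prefer such aliases to raw node IDs like /node/42.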

# Drupal tends to be incompetent as a headless CMS

Going headless, or decoupling, has become a trend as it allows developers to use the different technologies available in the site-building process and make the result all the more impressive. 
With Drupal, it is often assumed that decoupling means more work and fewer benefits: you would have a lot on your plate when you decouple Drupal, and the result would be a dysfunctional, mismanaged site. 
This is nowhere close to the truth. When you decouple, the frontend and backend are developed and managed separately, connected to each other through an API. Yes, you would most definitely have to part with some of Drupal's out-of-the-box features, but that isn't necessarily bad. 
With decoupling, you can build a frontend the way you want, with whatever technology you want. Fancy React? Go for it. Have a liking for Angular? Go for that. You would be using Drupal as a content repository, and since Drupal knows its contextual ABCs pretty well, you will be in great hands. You can publish your content across varying channels and manage it all from one place. 
When you go headless, you will get to choose from the best frontend technologies and get the best at the backend layer with Drupal. The best of both worlds for you. 
Now, do these features portray incompetence to you? 
Read about everything you would want to know about Decoupled Drupal, Decoupled Drupal Architecture, how to decouple Drupal and some of the success stories to get an understanding of how competent decoupled Drupal can be.
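To make the content-repository idea concrete, here is a hedged sketch of how a decoupled front end might flatten a Drupal JSON:API response into plain objects for rendering; the sample data and field names are hypothetical, though the data/attributes shape follows the JSON:API format Drupal emits:

```javascript
// Hypothetical sketch: flatten a Drupal JSON:API response into plain
// objects a React or Angular front end can render directly.
function flattenArticles(doc) {
  return (doc.data || []).map((item) => ({
    id: item.id,
    title: item.attributes.title,
    body: item.attributes.body ? item.attributes.body.value : '',
  }));
}

// A trimmed-down example of the shape JSON:API returns:
const sample = {
  data: [
    {
      id: 'abc123',
      type: 'node--article',
      attributes: { title: 'Hello', body: { value: '<p>Hi there</p>' } },
    },
  ],
};
console.log(flattenArticles(sample));
// → [ { id: 'abc123', title: 'Hello', body: '<p>Hi there</p>' } ]
```

Whatever frontend you choose only ever sees these plain objects; Drupal remains the single place where the content is authored and managed.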

# Drupal tends to be inefficient with multisites

Many organisations have subsidiary businesses for which they need to build multiple sites. They might want these sites to be replicas of each other, offering the same features and functionality, yet still distinct from one another. 
A false claim being made is that Drupal cannot handle multisites well; that it cannot give the separate individual sites their own database, configuration or even URL/domain names.
In reality, Drupal offers a multisite setup that is efficient and well equipped to handle all these requirements.

  • You can manage all your Drupal sites running on the same version of Drupal core, which ultimately saves you time. 
  • You can update all your sites simultaneously when there is a new release, because all of them share one codebase. 
  • The multisite setup does have some drawbacks, but by using the Aegir hosting system you can easily overcome them.

Inefficient isn’t a term that should go with Drupal, since it is anything but that. So, manage your sites from across the globe from your laptop while sipping coffee on your kitchen island with Drupal. I consider this pretty efficient.

# Drupal tends to be hostile to multilingual sites

Like I said in the previous point, sites today aren't confined to a region; they are almost universal. An American brand can become famous in India precisely because it resonates with the Indian audience at a personal level, in their language and dialect. 
Everybody knows that Drupal can handle multilingual sites, but many believe that Drupal isn't great at it; that its translations and other multilingual features are below average. And that is just a myth.
Translations are the most crucial part of multilingual sites, and Drupal 8 offers not one or two, but four translation modules. From content to configuration and interface, everything can be translated into the local language with ease. You can install Drupal 8 in as many as 94 languages without installing any extra components. Moreover, custom translations can be packaged and deployed on several properties, so developers have fewer language-related headaches. Everything on Drupal's multilingual capabilities can be accessed here.
I would not call Drupal hostile to multilingual sites, would you?

# Drupal tends to only be suitable for a few industries

A CMS builds and manages websites. Since these websites can be for any business and field, a CMS should be able to cater to every industry type. 
There is a Drupal myth going around that states that the CMS isn't meant for every industry; that it only caters to a few, though I am not even sure which ones count among those few. 
This one is probably the most ludicrous misconception of them all. I can understand that people may be skeptical about open source security, but this is just nonsensical. If a CMS can build a site for a retail business, what is stopping it from building one for a blogger? Kind of bizarre, isn't it?

[Image: A list of the many sectors Drupal caters to. Source: Drupal.org]

This is a list of industries that have Drupal imprinted on their web services, and the extensiveness of the list is refutation enough of the myth. From publishing houses and educational institutions to government agencies and charitable organisations, Drupal serves the majority of industries.

# Drupal tends to become a pain when it comes to support, maintenance, hiring and partnering with digital agencies

Working with Drupal on your own can be challenging. You need the support and expertise of someone who has worked with the CMS and knows its ins and outs with clarity, and that is a Drupal agency. 
This Drupal myth states that hiring a Drupal agency is a draining, expensive affair. Add to this the support and maintenance of Drupal, and you might just give up on site building altogether. 
Let me start with Drupal agencies: there are a lot of them for you to choose from. The good thing about having that many agencies is that they try to outdo each other in terms of the services they offer, so you end up with everything you desire. Being a part of OpenSense Labs, I can proudly say that we rank amongst the top 5 Drupal agencies in the global Drupal marketplace.

[Image: A list of Drupal agencies with their marketplace rankings. Source: Drupal.org]

Talking about support and maintenance, whichever agency you choose, you are bound to get some very convenient services in this regard. 

These are only a few of the support and maintenance services available, and they won't leave you in any kind of pain.

# Finally, Drupal tends to be incompetent with the emerging technologies

We live in a dynamic world, where everything is transitory, from human life and thoughts to the technologies we have become so dependent on. These changes are basically advancements that aim to enhance our quality of life and all of the experiences in it. So, how does Drupal come into the picture and what is the myth?
This Drupal myth states that the software cannot work well with all the new technologies emerging every day and that integrating it with them is almost impossible. 
Do I even have to say that this is untrue? I'm sure you know that by now. Drupal's abundance of content-heavy sites demands that it utilise the latest technologies to make the user experience even more delightful.
The use of artificial intelligence in the form of chatbots, cognitive search and digital voice assistants like Alexa on Drupal sites is probably the clearest refutation of this bizarrely unjustified claim. Along with these, the streamlined incorporation of Virtual Reality, IoT and Blockchain into Drupal sites is further proof that the myth is a colossal misconception. Our blog, From conception to reality: Drupal for futuristic websites, will shed further light on this notion. 


Drupal is one of the very best content management systems on the market. Its features and abilities are truly astounding. Believing some false claim that says Drupal is anything but one of the finest would be a mistake you do not want to make. Yes, Drupal isn't great at everything and, yes, it may even have some flaws; being perfect is almost impossible, after all. But Drupal's imperfections are not enough to dampen its overall appeal. 
So, if you have chosen Drupal to provide your site's groundwork, rest assured that you have made the right choice. We, at OpenSense Labs, have clients from across the globe asking us to build their sites using Drupal, and to this day not a single one of them has been disappointed. 
Finally, the moral of the story is: don't believe everything you hear, at least not until you have proof of its truth. I think I have managed to tell you enough of Drupal's truths for you to shun all of Drupal's myths. Debunking Drupal myths was fun. 

Feb 09 2021
Feb 09


It is not uncommon for links to be styled like buttons when building websites.  Although this is not frowned upon, it is important to understand which HTML element to use depending on the purpose of that element.  For example, a <button> element is frequently used to perform an action, such as submitting a form or interacting with an element to change its behavior, whereas an <a> element is often used to direct the user to another page, site, or section of a page.  Visually it may not make much difference how links or buttons are styled, but choosing the right element matters from a semantic and even an accessibility perspective.  Let's go over how to dynamically render the right HTML element depending on the element's purpose using Twig on a Drupal website.

Before we get into the details of the code, let's understand how the two elements differ from each other, which will help us choose the right element to use.  Probably the easiest thing that makes the <a> and <button> different from each other is that an <a> contains a href attribute and the <button> does not.  This alone can help us determine whether we should use an <a> or <button> element on our site.  The href attribute will always be present when using an anchor element.  So let's use the href attribute to our advantage.
  1. Let's start by defining some key data elements we usually have at our disposal when building a button or anchor element in Twig or React:  url, text, class
  2. It's possible we won't always have url or class, but text will most likely be available so we can label the element (i.e. Read more, Submit, etc.)
  3. We know that an <a> will always need a url.  We also know a <button> will never have a url value

Using the logic above, we could start writing some code.  Let’s start with Twig and then we will repeat the process with JavaScript if you are working on a React or Gatsby project.

{% if url %}
  <a href="{{ url }}" class="{{ class }}">
    {{ text }}
  </a>
{% else %}
  <button class="{{ class }}">
    {{ text }}
  </button>
{% endif %}

JavaScript and/or Gatsby

The logic to follow in any other language will be the same.  The syntax may vary, but ultimately we are checking for a URL and, if there is one, printing an <a> element; otherwise, printing a <button> element.  However, in Gatsby there is one more scenario to consider: is the URL internal, or does it go to an external page/site?  One of Gatsby's powerful core components is <Link>.  It is used to prefetch pages in a Gatsby site, which results in incredible performance improvements.  So our logic just got a little more complex, because not only do we need to check whether there is a URL, we also need to determine if the URL is internal or external.  If it is internal, we will use a <Link> component; otherwise, we will use a traditional <a> element.  Let's see how we can make this work.  In an effort to keep this example short and to the point, I am excluding some pieces you normally would see in a real Gatsby component.  This would typically be done in a button.js file.

import React from 'react';
import { Link } from 'gatsby';

const Button = ({ to, href, children, ...props }) => {
  // If the `to` prop exists, return a <Link> component.
  if (to) {
    return (
      <Link to={to} {...props}>
        {children}
      </Link>
    );
  }

  // If the `to` prop does not exist but `href` does, return an <a> element.
  if (href) {
    return (
      <a href={href} target="_blank" rel="noopener noreferrer" {...props}>
        {children}
      </a>
    );
  }

  // If neither `to` nor `href` exists, return a <button> element.
  return <button {...props}>{children}</button>;
};

export default Button;

If a to prop is identified, we return a Gatsby <Link> component. As we can see, this is a more complex piece of code, but it ultimately does the same thing as when we wrote it in Twig.  The difference here is that we are detecting three scenarios to determine which element to return:

  • If an href prop is detected instead of to, we return a traditional <a> element with target="_blank" and rel="noopener noreferrer". According to the Gatsby docs, routing to target="_blank" is the equivalent of an external page and Gatsby's <Link> cannot be used. Docs here: https://www.gatsbyjs.com/docs/linking-between-pages/#using-a-for-external-links
  • Finally, if neither to nor href is detected, we return a <button> element.
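The internal-versus-external decision above can also be isolated into a small helper. This is a sketch using the single-leading-slash test suggested in Gatsby's documentation; the helper name is hypothetical:

```javascript
// Hypothetical helper: treat a single leading slash as an internal
// route (render Gatsby's <Link>); anything else ("//", "https://",
// "mailto:" and so on) gets a plain <a> element.
function isInternal(url) {
  return /^\/(?!\/)/.test(url);
}

console.log(isInternal('/about'));              // → true
console.log(isInternal('//cdn.example.com'));   // → false
console.log(isInternal('https://example.com')); // → false
```

Keeping this check in one place means the Button component and any other link-rendering code agree on what counts as internal.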

In Closing

Some may not see the need to go to this extreme simply to decide whether we need a button or an anchor element.  However, there are many reasons why you should consider it, as this choice can impact accessibility, UX, and, in the case of Gatsby, even your site's performance.

Feb 09 2021
Feb 09

Drupal, being an open source project, relies significantly on its community for contributions, promotion and overall expansion. The user base adds to Drupal by coding, testing and fixing bugs in the software, and by regularly contributing themes, modules and distributions. Acknowledging the important role that the community plays, as well as its requirements, Drupal comes up with initiatives to streamline and channel everyone's efforts towards a common high-priority goal.

Drupal initiatives are common upcoming goals in the Drupal sphere, prioritised according to commercial interests and their effect on the community. Initiatives are formulated around a feature, a new module, or an alteration to some element of the software. Usually, 

  • user research, 
  • proposals by Drupal Core Maintainers,
  • advice of key community members, and
  • inputs from the board of directors

are taken into account before launching an initiative. 

How do core development and strategic initiatives come about in Drupal?

To understand the functioning of initiatives better, let’s skim through the life cycle of Drupal initiatives.


To start with, the first step is to file a proposal. Although absolutely anyone can propose an initiative in Drupal, certain factors should be taken into consideration beforehand. The proposal needs proper backing of relevant data, along with a farsighted vision of what the end result would look like. The proposed initiative should have a positive market impact on Drupal and add sufficient value to both the software and the principles it stands for. Discussion with the community must also take place before bringing an idea to the floor, and subsequent revisions must be made in response to the feedback received. Lastly, an individual wanting to get a proposal approved needs to be patient with the whole process, as in a large community like Drupal's, things move forward in a fairly organic fashion.  


The sites supported by Drupal are diverse and cater to a plethora of end users, so prioritising initiatives in such a manner that every user persona benefits appropriately is of utmost importance. Hence, several Drupal initiatives might be proposed, but only a few get highlighted, in accordance with the Drupal.org Prioritization Criteria. Prioritising can be a tricky business: one initiative might affect every Drupal user directly, while another might influence only a small segment of users yet be essential for Drupal's core. Keeping all these things in mind, propositions are 'accepted', 'rejected' or 'postponed', and the accepted proposals are then prioritized.


The association's staff are tasked with weighing the impact of each initiative against the spectrum of affected audience. Once a set of initiatives is finalised, they are listed in the Current section of the roadmap. An architect is assigned to supervise each initiative, also acting as its lead, whose job is to ensure accessibility to open discussions and related information. Weekly or biweekly meetings are held between the staff liaison and the community working on the initiative throughout its execution, and the completed draft is then submitted for review.

The time taken to complete an initiative is highly variable: a smaller one might take a few days, while a lengthier initiative can span several months. 

For core initiatives, the process looks a little different. Initiatives identified as being of utmost importance to Drupal and the community are designated strategic initiatives by Dries Buytaert himself, the founder and project lead of Drupal.

For an initiative to be included in this category, it must have the following -

  • Surveys or statistical figures substantiating the validity of the initiative.
  • An underlying vision that could result in a breakthrough for Drupal.
  • A requirement for channelised resources, as it is high priority.
  • Due to its larger scope and influence, a need for multiple stakeholders to come together and collaborate on it.

Major Ongoing Community Initiatives

Drupal Community initiatives are projects where the user base comes together to work on several components of Drupal like contributed modules and themes or even Drupal Core. These are some major ongoing Drupal initiatives -

Composer Support in Core Initiative

Drupal core does not have official, dedicated support for Composer, PHP's dependency manager, although several documented ways exist to use it wherever necessary. Owing to Composer's widespread adoption by a number of contributed modules, an initiative has been set up to create official Composer support in Drupal core. 

Bug Smash Initiative

As the name suggests, the Bug Smash initiative is meant to contribute towards bug resolution. It is a core initiative focusing on bugs in recently released versions of Drupal. Issues resolved under this initiative are tagged accordingly under the same name.

Documentation and Help Initiatives

The goal here is to improve, with the help of extensive documentation, the user experience of Drupal evaluators, developers and site builders. The initiative also focuses on achieving a better built-in Help system by adding high-utility features like topic-based help.

Workflow in Core Initiative

This initiative is working towards bringing upgrades in Drupal’s content workflow, preview and staging features with the aim to provide content authors and editors sufficient tools to review and deliberate upon the content before it is rolled out.

Drupal Open Curriculum

The Open Curriculum initiative is self-explanatory: the aim is to provide systematic, structured training to people new to Drupal. Features like in-class sessions and self-learning, along with training exercises, are to be included, culminating in on-the-job learning. It is targeted at Drupal agencies with new hires learning Drupal, as well as individuals curious about the software.

Accessibility contribution

Another really important initiative is ensuring compliance with the web accessibility norms set by various governments and organisations. Greater awareness of the needs of users with disabilities has led to widespread adoption of these norms, and the Drupal community aims to pitch in every idea and suggestion to improve accessibility for its users.

Major Ongoing Strategic Drupal Initiatives

Some major ongoing strategic Drupal initiatives are as follows -

Automatic Updates

This initiative aims to solve the inconvenience of having to manually update a Drupal site after updates are released, since running behind the deadline can compromise the standard and security of a website. Hence, the community is working on automatic updates to provide a safe and secure environment for scheduled updates, and also to cut down the maintenance costs of a Drupal website.

Decoupled Menus

An initiative for decoupled menus is also ongoing, aiming to make it easier for front-end JavaScript to consume and render Drupal menus.

[Image: Decoupled Menus, one of the ongoing Drupal initiatives. Source: Drupal.org]

Drupal 10 Readiness

The Drupal 10 Readiness initiative provides a timeline for things that need to be wrapped up in 2022 to facilitate the release of Drupal 10 in June, or at the latest by December of that year (two years after the release of Drupal 9). Milestones to hit include updating dependencies before Drupal 10's release, removing deprecated APIs, and helping contributors keep their modules up to date, consequently speeding up the launch of Drupal 10. The primary components of this initiative are -

[Image: Drupal upgrade timeline. Source: Drupal.org]

Easy out of the box

Ease of use has become a matter of prominence in recent years, which is why Drupal 9 focused largely on making updates easier and the software more convenient to use. Another area of work now is enabling the new Media Library, Layout Builder and administration theme by default in Drupal for added user satisfaction.

New Olivero Front-End Theme

Olivero is set to become the new default frontend theme, replacing Bartik. As design is what truly manifests changing technologies and advancements for the end user, upgrades in terms of modernity and functionality are essential for proper representation. The new theme is also meant to include support for Drupal's second-level navigation and embedded media. Staying laser-focused on accessibility, the theme is also supposed to be fully WCAG (Web Content Accessibility Guidelines) AA (mid-range) compliant.


Drupal initiatives are a good way to both bring the community together and stay productive. Some major changes in the past have been triggered by initiatives, for example, several design changes, changing Drupal's default markup to conform to HTML5 standards, and making Drupal mobile-first. Initiatives have thus brought together people of diverse backgrounds and experiences to create something global, affecting both individuals personally and the community at large. 

Feb 09 2021
Feb 09

Are you among those who are still wondering if your web application really demands a front-end framework? With names like React, Angular, Vue, Backbonejs, Emberjs and jQuery hitting the tech market, choosing the best front-end framework for your Drupal website is another added conundrum. Read on to find answers to these popular questions that can help you make a better decision in choosing the best front-end framework for your Drupal website.

Front End Frameworks for Drupal

Why do you need a front-end framework?

Today's world is a close reality to something I dreamt of as a child. A world run by devices, the technology they use and their potential to change the future. New interfaces and devices have brought in sweeping changes to transform the web as we know it. Technologies like Artificial Intelligence and IOT have started to establish and make an impact in the digital world. This impact has changed the way we perceive a future with seamless, feature rich websites.

However, while newly web-enabled devices continue to dominate, we have evolved the way we develop for the web. Though the content remains the same, delivering this content differs from one to another based on requirements and complexities. The next generation of user-experience is here and websites are expected to function seamlessly and instantaneously.

With such requirements, it is tough to stick with the old solutions. While it isn't impossible to build a complex yet awesome UI/UX with plain HTML and CSS, implementing a front-end framework does take some bulk and messiness off your hands as your front-end grows. Not surprisingly, most of the bigwigs like Airbnb, GitHub, Forbes, Netflix, Pinterest and PayPal, who offer outstanding user experiences, implement popular front-end frameworks like React, Angular and Vue.

Headless Drupal and Front-end frameworks

While Drupal can handle the backend beautifully, it isn’t as flexible in terms of its front-end capabilities. The need for modern, intricate, dynamic and application-like front-end interfaces gave rise to headless Drupal or decoupled Drupal as we know it. In a headless Drupal architecture, developers have the flexibility to build the front-end on their own without using Drupal. While Drupal still serves as a backend repository, front-end frameworks can talk to the database via API calls.

But how do you pick the right front-end framework for your Drupal website? While every framework has its own set of pros and cons, the choice largely depends on the needs and business requirements for the project. Let us discuss in detail.


AngularJS

The most preferred front-end framework on the list, AngularJS is a developer's favorite when it comes to interfacing with Drupal. It lets you create feature-rich dynamic web applications and allows Drupal to work more efficiently, resulting in a dynamic, secure and gripping Drupal website. Backed by Google itself, this open-source framework allows you to handle much of the work in your user's browser without having to fetch data from your server.

Things Developers Love about Angular

  • Extremely light weight and extensible with a wide scope of features.

  • An interactive framework, a result of which is great functionality like the two-way binding, which allows user actions to immediately trigger application objects.

  • Developers love HTML and the fact that AngularJS uses plain HTML templates that can be easily re-used, modified or extended, allows them to build interactive feature-rich web applications.
  • With a client-side nature, AngularJS does a great job in handling cyber-attacks as any data looking to breach the security cannot get anywhere near the server.
  • Immense community support which provides answers, tutorials and used cases, with well-developed documentation.

Drupal & AngularJS

With user expectations growing with each passing day, decoupled Drupal, or headless Drupal as it is commonly known, is gaining more popularity these days. The idea is to take advantage of Drupal's flexibility and powerful back-end capabilities while using a front-end framework to handle the client-facing interface. What better option than the interactive AngularJS to do the talking to the browser while Drupal takes care of the feature-filled back-end?

Also, AngularJS does a great job of offloading some logic from Drupal and helping it function effectively at the back-end. Moving display logic to the client side and streamlining the back end results in a site that performs better and faster.


VueJS

Created by former Google employee Evan You, this easily adoptable JavaScript framework has quickly gained recognition among developers. A JavaScript library for building modern web interfaces, it provides data-reactive components with a simple and flexible API.

Things Developers Love about Vue

  • With a subtle learning curve and a component model, Vue stands on the shoulders of giants to provide benefits of reactive data binding and composable view components with a simple API.
  • A combination of React's best - Virtual DOM and Angular's best - two-way binding, allows VueJS to perform efficiently and improve the performance of the websites.
  • Real-time monitoring of the progress in development with a built-in state management is an added advantage.
  • Vue JS follows a component-oriented development style with modern tooling and supporting libraries. With a simple-to-use syntax, people who are using it for the first time find it easy to adopt.
  • VueJS is one of the top trending JS frameworks on Github.
  • It is highly supported by an awesome community and adoption within the PHP community which does a great job of maintaining good documentation.

Drupal & VueJS

Vue allows developers to request and store Drupal content as data objects using the official Vue-Resource plugin.

In combination with Vue, Drupal can exhibit its magic at the back end while the compelling features of the JS handle the client side. The component system in Vue is one of its most powerful features that allows large-scale application building which comprises small and self-contained reusable components.


ReactJS

ReactJS is more of a library than a framework, used to build user interfaces that work on a concept of reusable components, and it aims to solve the issues created by the slowness of the DOM by replacing it with a virtual DOM structure. An open-source project maintained by Facebook, ReactJS is the go-to option for some of the biggest corporations for a fast and seamless client-side user experience.

Things Developers Love about React

  • By nature, ReactJS is very readable, making it easy to understand how components render from their source files.
  • ReactJS does a great job in combining HTML and JavaScript into JSX, which is a great asset for developers as the complexity between HTML and JS is eliminated.
  • With Virtual DOM, React can easily process large amounts of data in an efficient manner by monitoring the lightweight virtual DOMs.
  • Rendered extremely fast, ReactJS is a great option to build speedy public facing apps and sites that are smooth and offer a best in class UI experience.
  • A ton of proper documentation, invaluable tools, add-ons and more which are available to developers, courtesy of constant contribution by Facebook towards the development of React.

Drupal & ReactJS

A hybrid approach to use React for dealing with the UX complexities while relying on Drupal for handling the content can be an added advantage which easily allows consistent mapping of Drupal and React components.

One of Drupal's major weaknesses is the way it consumes and displays structured content to the end user. This gets out of hand when user interactions are complex and even the combination of Twig with jQuery is not good enough to match the complexity. However, integration with a modern library such as React provides all the necessary modern mechanisms for building seamless, rich user experiences.
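One common pattern in this hybrid approach is a registry that maps Drupal bundle names to the front-end components that render them. A minimal, hypothetical sketch follows; the bundle names are invented, and component names are strings here for brevity, whereas a real React app would map to the component functions themselves:

```javascript
// Hypothetical sketch: map Drupal paragraph bundle names to the
// front-end components that render them, with a safe fallback.
const registry = {
  'paragraph--hero': 'Hero',
  'paragraph--text': 'RichText',
  'paragraph--gallery': 'ImageGallery',
};

function componentFor(bundle) {
  // Unknown bundles fall back to a generic component instead of crashing.
  return registry[bundle] || 'Fallback';
}

console.log(componentFor('paragraph--hero'));  // → Hero
console.log(componentFor('paragraph--embed')); // → Fallback
```

This keeps the mapping of Drupal content to React components consistent and in one place, which is what makes the hybrid approach manageable as a site grows.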

Feb 09 2021
Feb 09

Now on Drupal 9, the community isn't slowing down. This month, we continue our interview with Angie Byron, a.k.a. webchick, a Drupal core committer and product manager, Drupal Association board member, author, speaker, mentor, Mom, and so much more. Currently, she works at Acquia on the Drupal acceleration team, where her primary role is to "Make Drupal awesome." We talk about Drupal, coding, family, and her journey throughout the years.

This article was originally published in the February 2021 issue of php[architect] magazine. You can read the complete article at the following links. To see the full issue, please subscribe or purchase the complete issue.

Feb 08 2021
Feb 08

In our blog post about Innovating Healthcare with Drupal, we talked about using Drupal to deliver an application that improves the healthcare experience for palliative care patients.  Our application was a resounding success.  Then the global COVID-19 pandemic hit, and the need to keep people out of emergency rooms to stop the spread of the coronavirus suddenly became urgent.  Moving the Drupal application out of tightly controlled pilots to a more widely distributed application required adherence to HIPAA (USA) and PIPEDA (Canada) guidelines to safeguard patient information.  Unfortunately, the tried and tested Drupal DevOps and hosting environments we’ve become accustomed to don’t come close to providing the level of security required to become compliant with HIPAA or PIPEDA.  This is where the MedStack hosting service comes in to save the day.

MedStack is an application hosting platform that provides ISO 27001 compliance for the environment in which your application resides, but not for the application itself.  The interesting feature of MedStack is that its environment can spin up any Docker image, producing a hosting platform that conforms to privacy requirements while giving you the freedom to write your application in any language that can run in a Docker image.  It is up to you, the application developer, to adhere to security best practices within your application to keep it secure.  Among the application security items to consider are password policies, two-factor authentication, private vs. public files, permissions, and keeping up with Drupal security patches. Privacy Impact Assessments (PIAs) and Threat and Risk Assessments (TRAs) will still have to be done on your application to ensure it meets the requirements for your healthcare application and to determine what steps are required to remedy any deficiencies.

Docker-based solutions such as Drupal VM, DDEV or Lando are widely used in the Drupal development community.  These solutions are excellent for spinning up a feature-rich development environment, eliminating the need for developers to use specific operating systems or to create locally running LAMP development stacks.  Unfortunately, you can’t use Drupal VM out of the box on MedStack.  MedStack uses its own MySQL database service to provide the proper HIPAA/PIPEDA compliance, and you should streamline your Docker images into production-configured environments.

The following screenshots should give you some insight into what MedStack provides.

Showing MedStack Control - Container management

With some identifying information removed, shown here is MedStack Control, which allows you to set up new clusters, manage existing Docker services, create new nodes and manage your database servers. Note the details shown in this screenshot: encryption on the network, encryption at rest and encryption in transit. Safeguarding patient data is paramount, and encryption of data at rest and on the network is mandatory. Because this particular application is for a Canadian healthcare network, we have to run in the Central Canada region; we are equally able to spin up a new cluster in the US or EU, thus satisfying other in-country hosting requirements.

MedStack Control - Container properties

Drilling into the Docker service, you’re able to update the service’s configuration, run shell or exec commands in your container, and see the history of events and tasks performed on your environment.  Need Drush?  No problem.  You can execute Drush commands in the shell to manage your environment.  Just configure Drush in your Docker image.

Coupling a properly configured Drupal application with MedStack has allowed us to move Drupal into a HIPAA and PIPEDA compliant environment, satisfying the underlying privacy requirements demanded by our healthcare institutions.  We can now focus on the application and leave MedStack to worry about compliance issues.  Working with our healthcare partners, we continue to evolve our Drupal application in the healthcare space.

Feb 08 2021
Feb 08

At first, I thought I would learn .NET, but when I started working with a senior .NET developer, I found them to be unsupportive, not collaborative, and almost prohibitively protective of their knowledge. My past experiences with lead developers, who should act as mentors, were discouraging. As a front-end developer who knew just HTML, I remember being made fun of for not knowing what a server-side include was and for accidentally breaking an application. I’m sharing this experience because it helps explain how I discovered Drupal and, ultimately, why I am so active in the Drupal community and enjoy helping people in this community.

Feb 08 2021
Feb 08

This may be the quickest quicktip we've ever written - if your site doesn't require the "Request new password" functionality, the No Request New Password module makes it pretty easy to remove it. 


No Request New Password screenshot - before


No Request New Password screenshot - after

Also - the module doesn't just hide the "Request new password" link, it removes the functionality completely, so if a user navigates directly to /user/password, they'll be redirected back to the login page. 

Feb 07 2021
Feb 07


Edit: After publishing, and based on community feedback, we've modified our naming slightly. "Community Day" is now "Community Onboarding Day".

At every pre-pandemic MidCamp, attendees were welcomed by a team of volunteers with shirts and badges and stickers and funny hats and answers to every question possible. Our 2020 event had a few warm human moments, like when a room moderator asked 100+ people to come off mute simultaneously before our opening remarks, but ultimately recreating the human-ness of our prior events in a virtual setting proved challenging. This was not only draining for prior attendees but also challenging for new community members.

This year, our new “Community Onboarding Day” sets out to provide a more human on-ramp by:

  1. introducing new community members to the product, the community, and the kinds of conversations that will go on during the event, and 
  2. engaging all attendees in planning the event itself.

In our prior post, we began to define our audience. We’ll begin Community Onboarding Day with a morning of short talks and discussions split up into three audience-specific tracks:

  • I’m new to Drupal: these discussions could include a review of the tools on Drupal.org, learning opportunities around the community, or a panel of folks discussing why they’ve stuck around the community.
  • I do Drupal, I’m new to the community: Many folks do Drupal, but not every Drupaler knows how to leverage our incredible community. Here we’ll talk about issue queues, documentation, and novice contribution opportunities.
  • I do Drupal, I’m involved in the community: This group could discuss Drupal core initiatives, triage Contribution Day tasks, or review mentoring opportunities.

If you’re interested in presenting or have a request for a topic, please comment on this Drupal.org issue.

In the second half of Community Onboarding Day, we’ll open the “Call for Activities/Topics” for our Thursday and Friday Unconference. 

  • Thursday will focus on building relationships in the community through social events, games, lightning talks, and other activities that are welcoming to all.
  • Friday will take more of a traditional Unconference schedule, with Birds of a Feather (BoF) discussions and more.

To encourage diversity in discussions, we’ll also be holding a workshop for marginalized, underrepresented, and historically excluded speakers on Wednesday afternoon. 

So, that’s Community Onboarding Day. With it, we hope to:

  • give new attendees the confidence to bring a topic or activity to the table,
  • give everyone a plan of what they can expect for the next two days,
  • provide opportunities for connection and mentorship.

In the end, this MidCamp we’re encouraging all attendees to:

  • come as you are, 
  • bring your excitement and ideas, 
  • plan for lots of opportunities to engage, and 
  • feel free to step away as you need.
Feb 07 2021
Feb 07

Google keeps changing its algorithms and the weight of the ranking factors that determine how your website and its pages score on search engine result pages, and it has now introduced Core Web Vitals as a key ranking factor.

Ranking higher on search engines is an ongoing effort to ensure that your website has the best possible chance to attract the highest amount of traffic and relevant visitors.

What Are Core Web Vitals and Why Do They Matter?

Google announced that it would prioritize user experience when assessing the ranking of each webpage and the site's overall ranking score. It refers to this new approach to ranking as Page Experience signals.

Page Experience signals include two subsets: Core Web Vitals and Search Signals.

While Search Signals focus on mobile friendliness, security, safe browsing, and non-intrusive UX components of a website, Core Web Vitals are a set of metrics that measure real-world user experience for loading performance, interactivity, and visual stability of the page.

Source: Google Search Central

Page Experience is a set of signals that measure how users perceive the experience of interacting with a web page.

The metrics that Core Web Vitals include are:

  1. Largest Contentful Paint (LCP), which measures a website's loading performance,
  2. First Input Delay (FID), which measures interactivity, and
  3. Cumulative Layout Shift (CLS), which measures the visual stability of the website.
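Google publishes thresholds for each of these metrics (covered in detail below: 2.5 s for LCP, 100 ms for FID, 0.1 for CLS). As an illustrative sketch, not an official API, those ratings can be expressed as a tiny classifier:

```javascript
// Classify raw Core Web Vitals measurements against Google's
// published "good" / "poor" thresholds.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100, poor: 300 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless layout-shift score
};

function rate(metric, value) {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return 'good';
  if (value <= t.poor) return 'needs improvement';
  return 'poor';
}

console.log(rate('LCP', 1800)); // "good"
console.log(rate('FID', 250));  // "needs improvement"
console.log(rate('CLS', 0.3));  // "poor"
```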


How To Rank Higher with Core Web Vitals?

Ranking higher on search engines means you have to optimize your website based on the performance reports of the aforementioned Core Web Vital metrics. Let's break them down one by one:

Largest Contentful Paint (LCP)

According to Google's best practices, your website must load the most meaningful piece of content on each webpage for the user within the first 2.5 seconds.

The precursor to LCP as a metric was First Contentful Paint (FCP) which measured how long it took the website to load the first feature on any webpage visible in the user's viewport.

However, after careful analysis of actual user behavior, Google realized that users don't really care about the "first" webpage element, which could very well be the website logo. The focus shifted to LCP instead, because Google determined that the element most relevant to the user is usually also the largest on the page.

The elements considered when measuring any webpage's LCP score include:

  • <img> elements
  • <image> elements inside an <svg> element
  • <video> elements (using the poster image)
  • An element with a background image loaded via the url() function
  • Block-level elements containing text nodes or other inline-level text element children.

More webpage elements will be introduced into the mix by Google as they attempt to update and optimize the LCP measurement process.

LCP performance scores for any webpage keep updating based on the scrolling activity of the user. The largest piece of content on any webpage may very well be visible only after scrolling.

How To Improve Your Website's LCP Score?

Identify the Largest Contentful Paint (LCP) element on your Drupal website by auditing the performance of your webpages with Drupal Audit. Drupal Audit utilizes Lighthouse and PageSpeed Insights tailored to Drupal websites and projects.

The most common methods that will help you improve your LCP score are:

  1. Remove any unnecessary third-party scripts: Studies show that each third-party script slowed a page down by 34 ms.

  2. Upgrade your web host: Better hosting = faster load times overall (including LCP).

  3. Set up lazy loading: Lazy loading makes images load only when someone scrolls down to them on your page.

  4. Remove large page elements: Google PageSpeed Insights will tell you if your page has an element that’s slowing down your page’s LCP.

  5. Minify your CSS: Bulky CSS can significantly delay LCP times.

First Input Delay (FID)

FID measures the interactivity and responsiveness of your website during load.

FID focuses only on input events from discrete actions like clicks, taps, and key presses. Scrolling and zooming don't count towards the measurement of your webpage's FID score.

FID only measures the "delay" in event processing.


Why "First" Input Delay in Particular?

Because it's the website visitor's first impression of your website, and because the biggest interactivity issues we see on the web today occur during page load.


When Is First Input Delay (FID) Important?

Blogs and content-heavy websites don't need to worry much about FID because interactions there are limited.

However, FID is massively important for websites that rely upon conversions (e.g. newsletter sign-ups, account information forms, logging in, etc.).

How soon do you think users will begin attempting to fill in the sign-in form in the screenshot below?

Source: Reddit

If, for example, I attempt to fill in the username field before the whole page loads, the FID score is how long it takes the field to respond to my input while the page is loading.

According to Google, the ideal delay would be no more than 100ms.

Website visitors typically aren't patient enough to wait until the whole page loads and will start clicking on various features of your website before loading is complete.
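Under the hood, the delay is simply the gap between when the user interacted and when the browser's main thread could begin processing the event. A sketch of that arithmetic, using a hand-built object shaped like the browser's first-input performance entry:

```javascript
// FID = processingStart - startTime on the first-input entry: the
// time the browser began handling the event minus the time the user
// actually interacted. This entry is hand-built for illustration;
// real code would collect it via a PerformanceObserver.
function firstInputDelay(entry) {
  return entry.processingStart - entry.startTime;
}

const entry = { startTime: 1200, processingStart: 1350 }; // values in ms
const fid = firstInputDelay(entry);
console.log(fid);        // 150
console.log(fid <= 100); // false: above Google's 100 ms target
```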


How To Improve Your Website's FID Score?

You can gain insight into how your Drupal website performs when it comes to First Input Delay (FID) by using Drupal Audit. The usual suspects impacting FID in any website are:

  1. Minimize (or defer) JavaScript:  It’s almost impossible for users to interact with a page while the browser is loading up JS. So minimizing or deferring JS on your page is key for FID.
  2. Remove any non-critical third-party scripts:  Just like with FCP and LCP, third-party scripts (like Google Analytics, heatmaps etc.) can negatively impact FID.
  3. Use a browser cache: This helps load content on your page faster, which helps your user's browser blast through JS loading tasks even faster.

Important: Testing FID needs to be contextual and requires a real user because results may vary based on each user's behavior.

Cumulative Layout Shift (CLS)

Does your website's layout stay stable, or does it keep shifting while the webpage loads?

CLS measures the sum total of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page.

A layout shift occurs any time a visible element, such as an image or Call-To-Action button, changes its position from one rendered frame to the next.
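Conceptually, the score is a running sum of layout-shift entry values, skipping shifts that happen right after user input (those are expected). A hand-rolled sketch with made-up entries; real measurement would use a PerformanceObserver or the web-vitals library:

```javascript
// Sum layout-shift values, ignoring shifts triggered by recent
// user input. Entry objects mimic the LayoutShift entry shape.
function cumulativeLayoutShift(entries) {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((sum, e) => sum + e.value, 0);
}

const shifts = [
  { value: 0.04, hadRecentInput: false },
  { value: 0.2,  hadRecentInput: true },  // user-initiated: not counted
  { value: 0.03, hadRecentInput: false },
];

console.log(cumulativeLayoutShift(shifts)); // ~0.07, under the 0.1 "good" limit
```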

Why Is CLS Important?

Frequent visitors will be used to a standard and norm when navigating and interacting with your website; if elements on your webpages move around as the page loads, this will cause frustration and possibly unwelcome consequences.

Users shouldn't have to re-learn where links, images and fields are located once the page is fully loaded, or click on something by mistake.

Imagine you wanted to click on a link next to the "Checkout" button on an e-commerce website and ended up making a purchase unintentionally because the button suddenly shifted its place.

Based on Google's metrics for CLS scoring, your website's overall CLS score should not exceed 0.1.

How To Improve Your Website's CLS Score?

Conduct an assessment through Drupal Audit for your Drupal website to identify how to improve your CLS score for each webpage.

  1. Set size attributes (dimensions) for any media (video, images, GIFs, infographics etc.): That way, the user's browser knows exactly how much space that element will take up on the page and won't change it on the fly as the page fully loads.
  2. Make sure ads elements have a reserved space: Otherwise they can suddenly appear on the page, pushing content down, up or to the side.
  3. Add new UI elements below the fold: That way, they don’t push content down that the user “expects” to stay where it is.


Core Web Vitals and SEO in 2021

Google is prioritizing the end user's needs when it comes to indexing the best possible search results, so Core Web Vitals and Page Experience are here to stay. This will force website development and design agencies to rethink their project delivery, and website owners to become genuinely user-centric.

  • Eliminate Silos: Agencies that design and develop new websites will need to eliminate the siloed mentality and work closer than ever to create the optimal UX (code and design) that delivers the actual experience that the website's end user demands.
  • Product Thinking: Websites are dead. Digital Experiences are the future. Much like the human body, a website has vital signs that must be monitored, gauged, and optimized. Treat your website the way you would develop a product: remaining static is not an option if you are looking to grow.
  • Comprehensive QA: Don't just focus on typical bugs and errors. Implement comprehensive and qualitative testing scenarios for your website's UX.

Although hundreds of ranking factors are considered when Google ranks each website, the Core Web Vitals signals above will contribute 45% towards how high your website ranks, effective May 2021.

You should begin optimizing your website's Core Web Vitals sooner rather than later to avoid being penalized for poor user experience.

The main tools to rely upon when assessing your website's Core Web Vitals performance are: Drupal Audit (tailored to Drupal websites and projects), Chrome User Experience Report, PageSpeed Insights, Search Console (Core Web Vitals report), the web-vitals JavaScript library, Chrome DevTools, Lighthouse, and WebPageTest.

Download SEO Guide

On-Site SEO Guide 2020

Identify all the best practices you need to implement in order to ensure that your website is optimized for search engines!

Feb 05 2021
Feb 05

At Ny Media we’re quality-oriented, and our main objective is to provide secure and reliable solutions, giving our clients all the tools they need to run successful online projects. To ensure the quality of our solutions and meet each client’s acceptance criteria, we develop test cases to cover all critical parts of their business logic. In order to achieve this, we use several different testing frameworks, to name a few:

  • PHPUnit - for unit testing
  • PHPStan - for static code analysis
  • Behat - for user-story testing and acceptance criteria

This is not a complete list (that’s material for a separate blog post), but it should at least provide some perspective on our testing stack, which varies between projects.

Travis CI

For many years we were using travis-ci.com, the paid version of travis-ci.org, the continuous integration platform popular among FOSS (free and open-source software) maintainers that allows running customized test cases on your private GitHub repositories. For us, one of the reasons to become a paying customer of Travis was to support a company that promoted FOSS. Many of us developers here at Ny Media were using Travis daily for our own open-source side projects, so integrating Travis into our company workflow was the only logical choice at the time.
Unfortunately, Travis has lately become highly unreliable, both for its customers and its own employees. They have also made it impossible for FOSS maintainers to keep using their services. Therefore we decided it was time for a change.

Github Actions

Since we’re using GitHub as our main remote git repository, the natural choice for us was to explore the capabilities of GitHub CI (aka GitHub Actions). GitHub has been working hard in the couple of years since the initial announcement of GitHub Actions to provide everyone with access to their product.

I’d like to name a couple of features that immediately caught our eye:

  1. GitHub-hosted runners
    You don’t need custom infrastructure to run your tests: GitHub can provide it for you at a reasonable price. All our tests run on Linux-based platforms, and the price, at the time this blog post was created, is 0.008 USD per minute. A package of minutes is included in your GitHub plan. Moreover, GitHub Actions is free for all public repositories!
  2. Self-hosted runners
    You can run your tests on GitHub's infrastructure or use your own, either physical or virtual. And the best part is, you don’t need to pay extra for utilizing that infrastructure as a GitHub test runner.
  3. “Unlimited” concurrency
    The number varies depending on your plan, but it is quite generous even for Free accounts (20 concurrent runners). At the time of writing this blog post, Travis charges 249 USD for 5 concurrent runners.
  4. Powerful infrastructure
    A few years back GitHub was acquired by Microsoft and, as a side effect of this acquisition, got access to Microsoft's Azure cloud infrastructure. They can therefore provide quite powerful infrastructure at reasonable prices, making them hard to beat.
  5. Exceptional documentation and support
    GitHub is great at documenting all the features of the new platform. Even in moments of doubt, I’ve been able to reach out to the support team and get my answers within the same day.
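As a back-of-the-envelope sketch of what the per-minute pricing means in practice (the 0.008 USD/minute Linux rate is quoted above; the included-minutes and usage figures are made up for illustration):

```javascript
// Rough monthly cost for GitHub-hosted Linux runners: minutes
// beyond your plan's included package are billed per minute.
// The 3,000 included minutes and 5,000 used minutes are
// illustrative figures, not actual plan numbers.
function monthlyCost(buildMinutes, includedMinutes, ratePerMinute = 0.008) {
  const billable = Math.max(0, buildMinutes - includedMinutes);
  return billable * ratePerMinute; // USD
}

console.log(monthlyCost(5000, 3000)); // ~16 USD for 2000 billable minutes
console.log(monthlyCost(2500, 3000)); // 0: fully covered by included minutes
```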


Since our tests on Travis were already running within a dockerized environment and not directly on the worker instance, we were able to finish the migration within one day. It required only a few steps.

Create .github/workflows/test.yaml file

Here’s the example file you can use as a template:

name: Run tests

on:
  push:
    branches:
      - develop
      - master
  pull_request:
    branches:
      - develop
      - master

jobs:
  tests:
    name: Run tests
    runs-on: ubuntu-latest
    timeout-minutes: 20
    steps:
      - uses: actions/checkout@v2
      - run: /bin/bash run-tests.sh

In the example above, the run-tests.sh script represents all the steps your test suite requires to execute. It may need to build a dockerized environment, download all dependencies, run test scripts; it's up to you. In our case, it's all of the above.

I want to highlight two things in the template above. First, this workflow will only be triggered when you push to the master or develop branches, or when you open a Pull Request against those two branches and push commits to the branch associated with that Pull Request.
Here's what you'll see in your Pull Requests:

GitHub status check

You can see two steps being run as part of this job. Please mind the difference between the keywords used for each of those steps. The run keyword marks what script or sequence of commands should be run on your runner host.  The uses keyword pulls in one of the pre-built actions from the whole GitHub marketplace. The one used in the template, actions/checkout@v2, is provided directly by the GitHub team. It clones your project into a certain directory (by default the current one) and checks out the commit which triggered the build. Very simple, yet powerful; more about it in our future blog posts.

Make sure checks passed before merging the Pull Request

Do that by setting up a protected branch in the repository's Settings -> Branches

Protected branches require status checks

Disable Travis

Now that your tests are running on GitHub CI, you can safely remove all references to Travis in three easy steps:

  1. Cancel your plan. If you’re paying for Travis, just go to https://travis-ci.com/plans and switch to the Free plan.
  2. Go to https://github.com/organizations/yourorganization/settings/installations and remove the Travis App.
  3. Delete .travis.yml from all repositories that you used to test with Travis. You don’t need it anymore.


Are we happy with the migration? Yes, very much so. Without making any change to how we run tests (initially, we started with a 1:1 test migration from Travis to GitHub CI), our test times decreased by ~20% thanks to the much more powerful test runners on GitHub vs. Travis. In addition, Travis limited us to a certain number of runners that could run concurrently (depending on the pricing plan). That limit increased by an order of magnitude when we migrated to GitHub CI, giving our developers much quicker feedback. On top of that, despite the increased performance, we’re actually paying less for GitHub CI than we did for Travis.

But this is not the end of the story. We have barely scratched the surface of what's possible with the GitHub CI API. In the next blog post, we’ll talk about what you can do to further improve your workflow and optimize time spent on testing. Stay tuned.

Feb 04 2021
Feb 04

This January was a particularly exciting time for the Drupal community, with the open-source project celebrating its 20th birthday. Read on to discover some of the top Drupal reads from last month!

Drupal celebrates 20 years!

It makes perfect sense to start with Dries Buytaert’s post celebrating 20 years of Drupal and taking a look back at its development over these two decades. Here’s a fun fact: Wikipedia, Creative Commons and some other notable organisations were also launched on the same day, January 15th 2001.

Dries dedicates the main part of his post to three birthday wishes for Drupal: first, that it continues to evolve alongside other technologies and innovations; second, that the software’s ease of use keeps improving for all users; and third, the sustainability and scalability of open-source software by rewarding and promoting Makers.

Read Dries’s post on 20 years of Drupal

Celebrating twenty years of Drupal

Moving on, we have another post concerning Drupal’s 20th birthday, this one by Baddý Breidert of 1xINTERNET. Like Dries’s post, this one also has a personal angle, truly reflecting the company’s commitment to Drupal and its community.

As Baddý states, all their back-end tools are based on Drupal, and all members of their team are strong supporters of the project and heavy contributors in the form of code, events and connections.

This year, 1xINTERNET is showcasing 20 of their most successful Drupal projects to commemorate the 20th birthday. The first one they’re highlighting is Transgourmet, a major European food retailer, for whom they’ve built an efficient multi-site platform.

Read more about 1xINTERNET’s celebration of Drupal

Guide: How to integrate JavaScript in Drupal 8-9

Next up, we have an in-depth guide by David Rodríguez for the Russian Lullaby on integrating JavaScript in Drupal 8 and 9. David warns right away that this isn’t a guide on the implementation of “decoupled” Drupal, using a JavaScript-based front-end technology, but rather a guide for back-end developers having to integrate JavaScript when writing modules.

He starts off with some basic concepts in both Drupal and JavaScript, then continues with sections on the integration, jQuery, Drupal behaviors, JavaScript without JavaScript, and troubleshooting, before concluding with a list of links and extra resources. Each of the practical sections also includes examples and exercises.

Read more about integrating JavaScript in Drupal 8 & 9

Cookie Services: How to Handle Cookies in Drupal & Symfony

In the fourth post on this month’s recap, Jonathan Daggerhart shows how he solved cookie data management for a Drupal 9 project, namely, by using a cookie service to address some of the main issues such as the incompatibility of Drupal’s legacy cookie functions with Symfony’s cookie management.

Jonathan first builds a simple cookie service, but there are still some special considerations, which he addresses in the second part of the article. He also provides an example of using the newly created service in a different module, then finishes with some possible next steps, such as making the service more reusable and having it work with multiple cookies.

Read more about cookie services in Drupal & Symfony

A Non-Code Community Contribution Opportunity: Become a Site Moderator

We continue with a more community-oriented post, namely, AmyJune Hineline’s CTA for community members to become drupal.org site moderators. The post starts off by enumerating the responsibilities of site moderators, which mainly include responding to issues in the drupal.org issue queue and onboarding new community members.

AmyJune continues the post with the steps necessary to become a moderator and some extra considerations. As she points out, contributing to Drupal is highly beneficial, no matter whether or not you’re a developer. For any questions or help getting started, she recommends joining the Drupal Slack channel and becoming involved there.

Read more about becoming a drupal.org moderator

Using Drupal For Digital Experiences

Another very interesting post from December is Gabe Sullice’s consideration of using Drupal for large-scale multichannel digital experiences. He builds off the idea of user experiences being directed graphs, and since Drupal is a graph builder, it could actually be viewed as a user experience builder - but one that’s currently optimized for website building.

The third key idea is the need for Drupal to evolve, not just in the sense of catching up to other innovations in the digital, but rather by actually embracing them and becoming a tool for powering any kind of digital experience. Gabe is optimistic about this possibility, and the first step is already being worked on by him and others - the decoupled menus initiative.

Read more about using Drupal for digital experiences

Drupal 10 is Coming in 2022: Are You Ready?

Don’t let the recent release date of Drupal 9 fool you; the next major version, Drupal 10, is slated for release in mid-2022, which is next year. In the final post on this month’s recap, Acquia’s Gábor Hojtsy takes a look at what you need to know to prepare for the upgrade.

One major part is the third-party dependencies that will be updated: CKEditor 5, Symfony 5/6, Composer 2 and PHP 8. No need to worry, though, as the upgrade process will be as pain-free as the one between versions 8 and 9.

Drupal 10 will be even more user-friendly, with a new default front-end theme, automated updates and the introduction of JavaScript components.

Read more about Drupal 10


We hope you enjoyed this month's recap. Tune in next month when we'll be sharing our selection of Drupal articles from February.

Feb 04 2021
Feb 04

One more year? Sure. Why not!?

When we originally announced that we'd be providing Drupal 6 Long-Term Support, we committed to supporting our customers until at least February 2017.

We've made pretty regular announcements in the past extending things far beyond that original end-date.

Today, we're announcing that we'll be extending our Drupal 6 Long-Term Support (D6LTS) until at least February 2023!

Why February 24th, 2023?

Well, we've been using the February 24th date because Drupal 6 originally reached its End-of-Life on February 24th, 2016, and we've been taking it one year at a time.

Drupal 7's End-of-Life will be in November 2021. We know that D7ES will have a long life ahead of it as there are still hundreds of thousands of important sites running it. Similarly, we see no reason to stop our Drupal 6 support any time soon.

There's still TONS to do for Drupal 6

While it can be a little hard to predict the challenges that Drupal 6 site owners will face in the future, don't worry. If the past is any indicator, I'm sure there will be plenty to do!

What are your plans for Drupal 7?

Recently, things have been coming into focus for Drupal 7's End-of-Life and the Drupal 7 Extended Support (D7ES) program.

We still don't know the full details of our offering, but we can say this:

  • It will be very similar to our D6LTS offer
  • We'll be providing Drupal 7 support until at least November 2025.

More details will come as we get closer to Drupal 7's End-of-Life!

Will you support Drupal 6 forever?

While some of our customers would love it if we'd support Drupal 6 forever, the answer is "no."

Our service is billed at a relatively low fixed monthly fee, so it depends on a certain amount of scale and overlap between our customers' needs in order to be profitable.

This is great for our customers because, by "sharing the load" with other customers who have similar needs, they pay less than they'd probably pay hourly for individual services just for their site! But it also means that when enough of our customers quit or upgrade to become Drupal 7 or 8 maintenance and support customers, providing Drupal 6 LTS will become a loss for us.

When that happens (and it inevitably will), then we'll have to either (a) charge higher prices to make up the difference or (b) stop providing Drupal 6 LTS.

But don't worry - we'll let you know long in advance of when that is coming!

In the first half of 2022, we'll be announcing any changes to our Drupal 6 LTS offering, including:

  • Whether or not we'll be extending Drupal 6 support,
  • If there will be any changes to the price or service offered,
  • And if we have any special offers to help upgrade the remaining Drupal 6 sites

But for the time-being, you can expect our Drupal 6 LTS to last until February 24th, 2023!

Feb 03 2021
Feb 03

Thanks to all the hard work of the organizers, speakers, attendees and, of course, contributors from the open source community. Without a doubt, DrupalCon Europe 2020 was a great time for everyone who was able to attend and an excellent opportunity to share and sharpen our experience design, web development, and web maintenance skills. Of course, the ongoing mission to improve all things open source and Drupal 9 continues, so if you weren't able to attend the event, missed a few sessions you couldn't fit into your schedule, or just want to refresh your memory — we've got you covered. In this article, we've pulled together the replays of our best sessions for you to watch — all in one easy place!

Catch Nick O'Sullivan's “Decoupled Translations with Drupal” and learn how to utilise Drupal’s exceptional multilingual features to produce a fully localised Gatsby site. 

Learn more about our philosophy and practices from the "Sustainable Practices for Building and Maintaining the Open Web Panel" with Amazee Labs Lead Engineer – Philipp Melab. 

Learn how to add end-to-end tests to your Drupal site in just one hour using CypressIO under the cool tutelage of Senior Developer – Fran Garcia-Linares.  

Dan Lemon tells the story of how CelebrateDrupal.org came to be, in his talk “Building a platform to bring people together to Celebrate Drupal” and joins the conversation in the Workshop and Retrospective: How did the COVID-19 crisis affect client relationships and what can we take out of it? 


Want to know more about how we can help with your project? Drop us a line and talk to one of our web consulting experts today!

Feb 03 2021
Feb 03

I was poking around the Drupal.org project usage page over the holidays, checking out some trends and making sure there weren't any up-and-coming contrib projects that hadn't been on my radar. Since Drupal 8 was released (over 5 years ago!), I've been bothered by the fact that this page can't be filtered by Drupal core version.

Along the way I fell into a bit of a rabbit hole and decided to dig much deeper into Drupal.org statistics. But first, let's take a look at contrib projects.

Most installed contrib projects

An incomplete workaround to finding the most installed contrib projects by Drupal core version is to use the module search page and filter it by Drupal core version and sort by "most installed." While this provides a list of modules, it doesn't provide historical trends as the project usage page does. Regardless, here's some data:

Top 5 installed Drupal 8 and 9 contrib modules 

Top 5 installed Drupal 7 contrib modules

Top 5 installed Drupal 9 contrib themes 

Top 5 installed Drupal 8 contrib themes 

Top 5 installed Drupal 7 contrib themes

While this data is somewhat interesting, there really aren't any surprises.

Drupal.org usage statistics

To that end, I recently requested, and received, access to the Drupal.org analytics from the Drupal Association, with the goal of looking at some usage statistics from 2020 and digging a little deeper into what Drupal developers were up to over the past 12 months.

I wasn't interested in doing a complete statistical analysis of the data and comparing it with historical data; rather, I was just looking for things I thought were cool. Plain and simple.

All data below is for the time period of January 1, 2020 - December 31, 2020.

First off, I'm not a data scientist - I'm just a nerd who likes to look at data sometimes, so some of the assumptions I make below may be off-the-mark. If so, feel free to correct me.

Let's start off with some basic stats - in 2020, there were about 50,000 users on Drupal.org on any given weekday. Anecdotally, the daily December average was around 5,000 users/day higher when compared with January.

Who are we and how are we accessing Drupal.org?

Where are the users coming from? Not surprisingly, the largest percentage came from the United States, but India and China combined sent more visitors. Clearly, we need to do a better job of recruiting people from these areas to be more involved in Drupal leadership.


  • 20% US
  • 12% India
  • 11% China
  • 5% Sweden
  • 3% United Kingdom

Interestingly enough, when looking at the top 10 cities where Drupal.org traffic originates, 6 of the top 10 are in China and India. In order, they are: Beijing, unspecified, Stockholm, Chicago, Bengaluru, Chennai, London, Mumbai, Hyderabad, Pune.

The majority of visitors' browsers report their language as English, with Chinese the next largest share. This seems to imply that many of the visitors from India speak English well enough to have their browsers set to use English.


  • 49% English (US)
  • 12% Chinese
  • 9% English (GB)
  • 3% English (unspecified)
  • 3% Spanish

Some other interesting statistics about who is visiting Drupal.org:

  • 85% are new visitors - this seems (very) high to me, and I'm going to attribute (at least a portion of it) to folks with privacy controls that make it seem like they're a new visitor.
  • Over 60% of visitors use Chrome, with almost 50% on some version of Windows, and 80% using a desktop browser.
  • Over 60% of users arrive via an organic search - this is not surprising to me at all, as I routinely (multiple times a day) use Duck Duck Go to search for content on Drupal.org rather than use Drupal.org's search tool.

What are we looking at?

Now for the data I was really interested in finding - which topics, issues, and projects we were actually looking at in 2020. To do this, I focused mainly on the top 100 most visited pages on Drupal.org.

Hash-tagged numbers indicate the page's position in Drupal.org's overall most visited pages ranking.

Most visited contributed projects

No huge surprises here, except I'm still amazed by the popularity of Bootstrap (keep in mind there are multiple Bootstrap-based base themes as well!) I was also a bit surprised at the popularity of Commerce, not because it isn't an amazing tool, but because I would've guessed other projects would be above it (Redirect module, for example, is #46).

It's also interesting that Webform was the most visited contributed project page, but didn't appear in any of the "most installed" lists above.

Most visited topic-specific documentation pages

Unsurprisingly, the top 2 most visited documentation pages (that aren't landing pages) were related to Composer. 

Most visited contrib project issues

This was probably the most unexpected and unexplainable data I found. No way I would've ever guessed that the most visited contrib project issue would be for the Commerce Braintree module. Luckily, it is marked as "fixed"; I can only imagine that the traffic was primarily to access the patch.

Then, the second most visited issue is related to the Image module for Drupal core version 5.x? Its traffic was pretty consistent for all of 2020. The only thing I can think of to possibly explain this is that the thread has a magic combination of keywords that puts it high in organic search results (well over 90% of the traffic to this page originates from search engines).

Most visited forum post

How to login to Drupal admin panel (from 2017!) #84

Yes, the Drupal.org forum is still alive and people are still accidentally getting locked out of their sites.

Most visited page without a path alias

How to fix "The following module is missing from the file system..." warning messages (from the "Reference" section) #43

Oh yeah - I've definitely been someone who has searched for, and landed on this page.

Most visited Drupal core issues 

I didn't know what to expect when I went looking for the most visited Drupal core issue, but as soon as I found #216, it made perfect sense. I feel like you are in the minority of users if your Drupal core 8.8 update went smoothly.

Most visited Drupal core release pages 

I'm at a loss to explain why these core release pages were visited more than any others. Overall, there were 6 Drupal core release pages in the top 200 most visited pages.

Most visited pages overall

After the home page, the top visited pages were the project search, the download page, Drupal core, user Dashboards, the theme search, the User Guide, and Try Drupal.

None of these are all that surprising or interesting, at least to me, but included here for completeness. 

Finally, I was curious as to how many of the top 100 most visited pages were documentation pages (12) and contributed project pages (46).


A few things that I took away from this exercise:

  • Traffic to Drupal.org increased during 2020.
  • Composer continues to be a pain point in the community.
  • The contributed module ecosystem continues to be one of the crown-jewels of the Drupal community.
  • Pathauto should be in core (try to convince me otherwise)
  • Should the community consider tweaking metadata for pages related to older versions of Drupal core so they don't rank as high in search engines?
  • There seems to be an opportunity for new contributors to be provided with a list of the most visited forum, reference, and issue pages and convert those that make sense into documentation pages.
  • Why aren't a commensurate percentage of people from places with high numbers of users community leaders, and what can we do collectively to remedy it?

Thanks to Tim Lehnen from the Drupal Association for providing me with temporary access to the Drupal.org analytics.

Feb 03 2021
Feb 03

It’s this time of the year again! Late January is the Global Contribution Weekend in the Drupal community. This year it was from the 29th till the 31st of January. It is a great occasion to take a step back from the daily routine to reflect how we all profit from Drupal and how we give back. This year again, MD Systems was a big part of the Swiss event.

Meeting community collaborators in person at this event and sharing ideas is usually one of the highlights of the year. With everyone permanently working from home, it is difficult to motivate people for yet another virtual meeting when they deserved their weekend offline time. Nevertheless, MD Systems decided to dedicate Friday to the event and was able to make great progress.

Milos Bovan (mbovan) did his now-famous first-contribution workshop, welcoming newcomers, while others reflected on making Open Source sustainable.

The Drupal Switzerland Association's leadership shared ideas about the future of the association, in particular concerning the creation of memberships specifically tailored for companies.

MD Systems thanks in particular Josef Krukenberg (dasjo) for organizing the Swiss event.

We're glad to see #Drupal contributors from all over Switzerland and beyond join the #ContributionWeekend #OpenSource https://t.co/eYwTkCM1Jb

— Liip (@liip) January 29, 2021

This year, 103 issues were worked on by the whole community (tagged #ContributionWeekend2021 on drupal.org), with more than 10 Swiss participants on Friday alone. Additionally, the community contributes and collaborates in many ways that are not represented in the issue queue.

Feb 03 2021
Feb 03

More so than ever before, government and public sector websites are called upon to multi-task, functioning as the digital town square -- a central spot for connecting, conducting business, keeping informed, showcasing top attractions, and a lot more.

Among government officials who are responsible for managing and making decisions concerning the right content management system (CMS), the margin for error is low and the stakes are high. Government websites need to be secure, scalable, engaging, flexible, accessible, dependable, and easy to navigate. As budgets get squeezed, websites also need to demonstrate cost effectiveness and all that factors into good governance.

Factoring in all that’s riding on getting it right, combined with our in-the-trenches perspective from hundreds of conversations and engagements with government clients, we at Promet Source rank Drupal as the CMS that best stands up to the demands of public sector websites.

We’re in good company concerning this assessment.

Drupal’s share of the government and public sector CMS market is built upon a solid foundation that includes these eight factors:


1. Drupal Integrates with other solutions and services.

Drupal plays well with others, and this means a lot. Ease of integration with other services -- even proprietary solutions -- ensures that a government website will have the flexibility to accommodate both current and future needs. Drupal is a modular CMS, which serves as a foundation for easily integrating with other solutions. The value of this feature is compounded by the fact that Drupal is Open Source. As such, there is no controlling authority determining or limiting integration activity. Open Source also means that there are no additional costs or licensing fees associated with multiple integrations.

2. Security is both transparent and robust.

Security is a paramount concern for government websites, and Drupal’s track record of superior security is a key factor contributing to its popularity among public sector clients. The fact that Drupal is Open Source means that government IT officials have access to the code for their sites and can exercise whatever level of due diligence they feel they need in order to be assured that the site can stand up to hacking and cyber threats.

The 1.3 million member strong Drupal community is collectively committed to eliminating potential threats with contributions of several security modules. Among them: Login Security, Password Policy, Captcha, and Security Kit. The Drupal Security Working Group is focused on ensuring that both Drupal core and the entire ecosystem of contributed modules deliver world-class security for Drupal sites.

Another factor worth noting is that Drupal is not a SaaS solution, which means that the site’s code is not commingled in a shared database.

3. Drupal accommodates multisites.

Any CMS for the government or public sector will need to accommodate a wide range of sites, and Drupal’s multisite feature streamlines the creation of multisites by enabling developers to copy the main site's code base and create as many offshoot sites as needed that leverage the same functionality. This represents a significant savings in both development costs and ongoing maintenance.

While the ability to accommodate multisites is now a standard feature among most CMS platforms, the factor that sets Drupal apart from the proprietary options is cost -- as in the absence of cost. As an Open Source CMS, there are no additional fees associated with the addition of multiple sites. 

Scalability of Drupal is intertwined with multisite functionality, enabling brand guidelines to be centrally maintained while individual agencies and departments independently manage their content.

4. Drupal sites can handle millions of hits.

Drupal’s inherent scalability is especially important in a multisite setup, since a high influx of traffic on one site can affect all of the sites sharing the same Drupal code base.

Government websites need to be ready at any moment for a surge in traffic. Whether due to dangerous weather warnings, civic upheaval, or even a celebratory event, a government website is never more vital than when a critical mass of citizens flock to it at the same time. Drupal supports some of the most highly trafficked government websites in the world, and is built to handle both sudden surges and millions of visitors a month without crashing or breaking. 

5. Hosting options can fit specific requirements.

With Drupal, site owners can select the hosting vendor that best fits their needs, they can change hosting vendors whenever they feel the need to, and they can opt to host the website internally. This is not always the case with a proprietary CMS solution.

6. Multilingual support is built in.

Drupal supports more than 100 languages out of the box. Although all major CMS solutions offer multilingual support at this point, Drupal offers extra features that facilitate translation capabilities. 

  • The Content Translation module in Drupal allows pages and individual content to be translated by creating a duplicate set in the translated language. 
  • The Entity Translation module allows particular fields to be translated.

7. Drupal sites are accessible out of the box.

Drupal is fundamentally committed to compliance with web accessibility standards, which is an essential consideration for all government and public sector websites. Clients can count on the fact that Drupal is compliant with the most recent Web Content Accessibility Guidelines (WCAG 2.1). This is a key advantage of Drupal, as CMS solutions that rely more heavily on external plug-ins cannot be counted on to be in compliance with web accessibility standards to anywhere near the same degree. 

8. The Drupal Community is 1.3 million members strong.

The Drupal community is a huge advantage to both developers and public sector clients. As a longtime member of the Drupal community, I’ve experienced on many occasions the power of a 1.3 million member community invested in its success. There are no secrets in Drupal. It is built on a common open source dev stack, which means that Drupal developer talent tends to be more widely available than for proprietary CMS solutions. Help and support for whatever issue or question may arise is freely and generously available within the community.

At Promet Source, Drupal is in our DNA. We serve as engaged contributors to the Drupal community and embrace the spirit of open sharing of expertise and solutions, along with a strong track record of designing and developing Drupal sites for government and public sector clients. Let us know what we can do for you!

Subscribe to Promet Insights

Feb 02 2021
Feb 02


Don’t miss your last chance to register for DrupalCon at the early bird rate

Tomorrow, February 3, is the final opportunity to register for DrupalCon North America 2021 at the discounted rate.

Whether you’re new to Drupal or have been attending DrupalCon for years, you won’t want to miss this year’s all-virtual event. In keynote presentations, hands-on workshops, breakout discussion groups, and interactive sessions, you’ll connect with a global community of developers, marketers, and leaders — people who want to learn about what you’re doing with Drupal and have their own expertise to share. 

Dive deeper into issues related to your industry and share insights with peers in your field at a DrupalCon summit. Participation in industry summits is included this year with your DrupalCon ticket. Dates are spread throughout the month of April so you don't have to miss a thing. 

Feb 02 2021
Feb 02

The robots.txt file is a very underrated on-page SEO factor. Not everybody realizes the value it brings to the table. The robots.txt file is like an access control system that tells crawler bots which pages should be crawled and which ones shouldn't. It is a rule book for your website that the various web spiders read before attempting to crawl your website.

There are tons of amazing Drupal SEO modules in versions 9 and 8 that help make our jobs easier and boost SEO rankings. One of them is the RobotsTxt module. The RobotsTxt module in Drupal 9 (and 8) is a handy feature that enables easy control of the robots.txt file in a multisite Drupal environment. You can dynamically create and edit the robots.txt file for each site via the UI. Let’s learn more about this utility module and how to implement it in Drupal 9.

RobotsTxt module for Drupal SEO

But how does robots.txt help with SEO?

So, robots.txt files restrict crawlers from crawling some pages. But why wouldn't you want all your pages and files to be crawled? Why do you need any restrictions at all? Well, in this case, more isn't always merrier.

  • Without a robots.txt file, you are allowing web spiders to crawl all your webpages, sections and files. This uses up your crawl budget (yes, that's a thing) – which can affect your SEO.
  • A crawl budget is the number of your pages crawled by web spiders (Googlebot, Yahoo, Bing, etc.) in a given timeframe. Too many pages to crawl could decrease your chances of being indexed quickly. Not only that, you might also lose out on indexing the important pages!
  • Not all your pages need to be crawled. For example, I'm sure you wouldn't want Google to crawl your development/staging environment web pages or your internal login pages.
  • You might want to restrict media files (images, videos or other documents) from being crawled.
  • If you have a reasonable number of duplicate content pages, it is a better idea to add them to the robots.txt file instead of using canonical links on each of those pages.
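Putting those points together, a minimal set of rules along these lines (the paths are purely illustrative) keeps crawlers away from login and admin paths and private files while leaving the rest of the site crawlable:

```
User-agent: *
Disallow: /user/login
Disallow: /admin/
Disallow: /sites/default/files/private/
```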

How to Install and Implement the RobotsTxt Module in Drupal 9

The RobotsTxt Drupal 9 module is great when you want to dynamically generate a robots.txt file for each of your websites when you are running many sites from one codebase (multisite environment). 

Step 1: Install the RobotsTxt Module for Drupal 9

Using composer: 

composer require 'drupal/robotstxt:^1.4'

Step 2: Enable the module

Go to Home > Administration > Extend (/admin/modules) and enable RobotsTxt module.

Generates Robots.txt File


Step 3: Remove the existing Robots.txt file

Once the module is installed, make sure to delete (or rename) the robots.txt file in the root of your Drupal installation for this module to display its own robots.txt file(s). Otherwise, the module cannot intercept requests for the /robots.txt path.

Remove Robots.txt file

Step 4: Configure

Navigate to Home -> Administration -> Configuration -> Search and metadata -> RobotsTxt (/admin/config/search/robotstxt), where you can add in your changes to “Contents of robots.txt” region. Save the configuration.

Contents of Robots.txt

Step 5: Verify

To verify your changes please visit https://yoursitename.com/robots.txt

Verify Robots.txt File
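If you would rather check the rules programmatically than eyeball the file in a browser, Python's standard-library robotparser shows how a well-behaved crawler will interpret them. The rules and URLs below are illustrative, using the placeholder domain from Step 5:

```python
import urllib.robotparser

# Illustrative rules -- the same kind of Disallow lines you would enter
# in the "Contents of robots.txt" region.
rules = """\
User-agent: *
Disallow: /foo
Disallow: /bar
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler must skip /foo but may fetch other paths.
print(parser.can_fetch("*", "https://yoursitename.com/foo"))     # False
print(parser.can_fetch("*", "https://yoursitename.com/node/1"))  # True
```

The same check works against a live site by pointing `set_url()` at https://yoursitename.com/robots.txt and calling `read()` instead of `parse()`.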

RobotsTxt API

If you want to implement a shared list of directives across your multisite environment, you can use the RobotsTxt API. The module has a single hook, hook_robotstxt(), which allows you to define extra directives via code.

The example below will add a Disallow for /foo and /bar to the bottom of the robots.txt file without having to add them manually to the “Contents of robots.txt” region in the UI.

/**
 * Add additional lines to the site's robots.txt file.
 *
 * @return array
 *   An array of strings to add to the robots.txt.
 */
function hook_robotstxt() {
  return [
    'Disallow: /foo',
    'Disallow: /bar',
  ];
}

Feb 01 2021
Feb 01

The key points that need to be considered when testing mobile applications can be addressed with the following questions:

  • Which mobile devices and operating system versions will this application support?
  • How do we test applications to make sure they run on those platforms?
  • What modifications must be made to accommodate the differences among platforms?
  • How will industry innovations be supported going forward, since new mobile devices, technologies, and applications are constantly being introduced?
  • How do we know how much testing is enough?

Mobile Application Testing Matrix

The mobile application testing matrix becomes exponentially more complex with the addition of each factor to be considered. 

Hardware Diversity Compounds Complexity

In the PC test environment, testers have essentially only one central processing unit platform (x86-compatible microprocessors) on which they need to test applications. Most of the other hardware components that go into a PC or Mac, such as the disk drives, graphics processor and network adapters, are usually thoroughly tested for compatibility with those operating systems and pose a relatively minor risk of problems. Their display formats also fall within a relatively narrow range of choices, and the input devices (mostly keyboards and mice) are well-known and familiar.

But mobile voice and data service carriers differentiate themselves by offering a wide range of handsets, each with unique configurations and form factors that can have unpredictable effects on the performance, security, and usability of applications.

Multiple handset options are built around a wide variety of processors, running at various speeds with widely varying amounts of memory, as well as screens of different sizes operating at different resolutions and in different orientations (landscape, portrait or both).

Many handheld devices rely on multiple digital signal processors (one to handle voice communications, the other to process the audio, video, and images associated with applications), as well as multiple input devices, such as a touch-screen and a keypad. Each combination of components interacts in different ways with each other, and with the operating system, and this creates potential compatibility and performance issues that must be addressed in testing.


PLUS Complexities from Software Platform Diversity

In addition to these hardware-based concerns, the tester must cope with the complexity of the software environment. To ensure that an application will work on most customers' desktops, a tester need only test it on the most popular current versions of the Windows, Apple Macintosh, and Linux operating systems.

To ensure performance on the same range of mobile devices, a tester must address all current versions of the iPhone, Windows Mobile 7, Symbian, Android, and RIM BlackBerry OSes, as well as the MeeGo platform developed by Nokia.

Applications must be tested for their compatibility with any of the networks on which any given device might run. The networks operated by different carriers provide various levels of bandwidth. Different carriers use different methods to tunnel their own traffic into the TCP/IP protocol used by the Web, changing how applications transmit and receive data. They also use different Web proxies to determine which websites their users can access, and how those sites will be displayed on their devices. 

All of these differences can affect the stability, performance, or security of a mobile application and must be tested to ensure that the end-user experience works as intended.

Testing Mobile Apps on Device Emulators

Mobile application testing needs to take into account a wide range of target mobile devices to ensure that every possible interaction among hardware and software elements, as well as with the wireless carrier's network, is covered. However, acquiring every possible target device and performing manual testing on it is too complex, costly, and time-consuming to be feasible during every stage of testing.

Device emulators, software programs that simulate the performance and behavior of the physical device, are far easier to obtain and less expensive than samples of the physical devices. While they can be less accurate test platforms than the actual hardware, they can be a cost-effective alternative to testing on the physical device when used appropriately.

Emulators can be used to test Web applications using the software development kit for a browser or by packaging the application as a .jar, .apk or .sis (platform-specific) file, installing the application on the emulated device, and testing the application.

Since speeding time-to-market is often a critical consideration in app development, many mobile applications are developed using RAD (rapid application development) in which multiple versions of the software are quickly developed, assessed by end-users, and tweaked accordingly. This rapid-fire cycle of coding and re-coding makes it almost impossible to assess how each change affects the application's performance, stability, or security.

At Promet Source, we are adept at mobile application development, as well as the full range of processes and procedures required to ensure that apps perform as intended. While complexities continue to increase, the margin for error remains zero, and our commitment is to get it right the first time. 

Interested in help or consultation with application development or quality assurance of a mobile application? Contact us today.

Feb 01 2021
Feb 01


Join us for an all-virtual DrupalCon 2021

Join us for DrupalCon 2021, the world’s premier gathering of Drupal enthusiasts and experts. Build your skills, learn more about what Drupal can do, celebrate successes and collaborate to improve and grow the open source platform the world relies on every single day. 

This year’s all-virtual event, April 12-16, is designed to fit into your schedule, so you can still participate in the conference while balancing work and life at home. 

Buy your ticket before 11:59 p.m. ET (UTC -5) on February 3 to get the early bird rate and save $50 on admission.

Your ticket includes access to:

  • Keynote presentations, hands-on workshops, breakout discussion groups, and interactive sessions April 12-16, with live programming held from 11 a.m. through 3 p.m. ET (UTC -5) each day
  • DrupalCon industry summits, held throughout the month of April, with deeper dives into the following areas:
    • Drupal Community, April 6
    • Libraries, April 8
    • Higher Education, April 20
    • Healthcare, April 22
    • Nonprofit, April 27
    • Government, April 29

Volunteers help make DrupalCon possible, contributing their time, talent and skills. Explore all the different ways you can contribute — even if you’re not a developer.

Download the DrupalCon Contribution Opportunities Guide to learn more.

Call for content: Share your experience and expertise with the Drupal community

Lead a fireside chat, skill-sharing session, birds of a feather discussion group or interactive workshop. We’re looking for engaging program content that helps everyone in the Drupal community learn, grow and feel inspired to do more with Drupal.

Possible content topics include:

  • Community Health
  • Content and Marketing
  • Development and Coding
  • DevOps
  • Drupal and Open Source 101
  • Drupal Showcase
  • Leadership, Management and Business
  • Site Building
  • User Experience, Accessibility and Design
And these are just a starting point. If you’ve got a creative idea for a session topic or format, we’d love to hear it. We’re especially interested in ideas that help everyone break out of their daily Zoom fatigue, e.g., sessions focused on creative exercises, art, collaboration or movement.

Content submissions are due by February 14. Learn more about submission guidelines, content focus areas and session formats.

Our Diamond Sponsor

  Acquia logo

Our Platinum Sponsors

ImageX Media logo        Centarro logo        Palantir logo

 Mediacurrent logo            Elevated Third logo            Tag1 logo

Have a question?

We’re here for you. Find answers to common DrupalCon questions in our FAQ.
Feb 01 2021
Feb 01

As part of the 10 year anniversary of Drupal Career Online, we're continuing a blog post theme to start off the year: posts that involve lists of 10. 

As an organization that trains aspiring Drupal developers, evaluates individuals' Drupal skills, and provides skill assessments to potential employers, we’ve developed what we feel is some key insight into what makes a good Drupal job posting.

Over the past few years, as I've reviewed job postings for Drupal jobs on jobs.drupal.org and other job-related web sites, there are (more than) a few things that always make me cringe…

  1. Jobs advertised as junior and intermediate and advanced skill level. Which is it? All of the above? Job postings like this especially scare away junior developers (for fear they will be in over their head) and advanced developers (for fear they will not be challenged). If you're writing a job posting, be specific. If you're hiring for multiple skill levels, then post multiple listings.
  2. Not clearly stating the "minimum skills required". This is always really perplexing, especially when reviewing expert-level job postings. The list of requirements for a single job is often virtually unattainable by most applicants. I've been developing Drupal sites for more than 14 years and I often see advanced-level job postings that I'm not qualified for. If you're looking for someone with years of experience in Drupal, WordPress, Laravel, Symfony, Magento, administering servers, Behat, site performance, SEO, React, Angular, then prepare to either be disappointed or pay top-dollar (if you can even find someone who meets your criteria). I recommend splitting up your requirements into a few groups: "absolute minimum", "willing-to-pay-a-bit-more-for", and "bonus".  If you're not willing to hire someone who only has the absolute minimum, then maybe you need to rethink the posting.
  3. Not clearly stating if the position is customer-facing or not. Some developers do not want to interface directly with customers. In some cases, interfacing with customers directly isn't in someone's skill set. By making it clear whether or not the developer will need to interface with a client (online via a ticketing system, for example) you can help avoid unwanted situations. 
  4. Junior-level positions that do not mention on-the-job training and/or mentorship. Here's a secret, junior level developers don't want to be junior level - they are usually hungry to learn and advance. If you want to hire a junior level developer, then your organization must be willing to invest in them to help advance them. 
  5. Not specifying if the position includes design as well as development. In this case, "design" may or may not include visual design as well as software design. There are some developers that absolutely love the design aspect of building sites (information architecture, class hierarchy of custom modules, etc…) and some who do not. Be specific in what is required.
  6. Junior-level positions that include front-end, back-end, site-building, project management, multiple Javascript frameworks, etc.. (you get the idea). There's a reason that junior-level developers exist - because they don't have all the skills and experience yet. Job descriptions like this do one thing very well - scare off talented junior-developers that don't want to be put in a no-win situation. 
  7. Advanced skill level positions that don't pay market rate. If you're looking for an expert developer then you need to be willing to pay for it. Drupal is (and has been for a long time) a seller's market - if you manage to find someone willing to fill an expert position at a far-below-market-value rate, you're going to be disappointed one way or another.
  8. Junior-level positions that require more than 1 year of experience. If you're looking for a junior developer with more than a year of experience, then you're not actually looking for a junior developer. More than likely, you're looking for at least an intermediate developer.
  9. Not providing benefits other than salary. As mentioned above, Drupal is a seller's market. Want to attract top Drupal talent, regardless of skill level? Then beef up your offering to make it stand out. Most developers enjoy professional development - provide them with a budget and time to learn new skills that will benefit your organization - and don't double-book them with work while they are learning new skills. Another HUGE benefit to offer is to allow developers to spend company time making contributions back to the Drupal community. This is a form of professional development, as well as often a very healthy thing for remote workers to participate in. Finally, send your developers to Drupal events - nothing will accelerate your developers' skills more than interacting with other developers. 
  10. Labelling a position as junior-level because it doesn't pay very well. Don't. Please just don't. 

Do you have a junior Drupal developer that you'd like to move into a more intermediate developer role? Then consider sending them to Drupal Career Online - it meets only 2-3 times/week for 12 weeks, and you can be confident that they'll be learning best practices around Drupal development. 

Feb 01 2021
Feb 01

Sending an email involves defining an email template (subject, text and possibly email headers) and the replacement values to use in the appropriate places in the template. The composed email message is requested via hook_mail() in the module sending the email. Any module can then modify the composed email message array using hook_mail_alter(). Finally, \Drupal::service('plugin.manager.mail')->mail() sends the email; the same composed message can be reused if it is to be sent to multiple recipients.

The triggerMail() function below can be called from any submit handler or post-action in Drupal.

function triggerMail() {
  $module = "x_module";
  $key = "x_mail";
  $to = "[email protected]";
  $langcode = 'en';
  \Drupal::service('plugin.manager.mail')->mail($module, $key, $to, $langcode);
}
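
If the template needs per-message replacement values, they can be passed through the mail manager's optional $params argument, which arrives as $params in hook_mail(). A sketch (the keys used here are hypothetical):

```php
<?php
// Values passed here arrive as $params in x_module_mail().
$params = [
  'dynamic_key1' => 'value1',
  'dynamic_key2' => 'value2',
];
\Drupal::service('plugin.manager.mail')
  ->mail('x_module', 'x_mail', '[email protected]', 'en', $params);
```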

Below is the hook_mail() usage, written in x_module.module file.

function x_module_mail($key, &$message, $params) {
  switch ($key) {
    // Trigger email on the x_mail key.
    case 'x_mail':
      $message['headers'] = [
        'content-type' => 'text/html',
      ];
      $message['subject'] = t('X Mail subject');
      $message['body'][] = (string) getXMailMessageBody();
      break;
  }
}

Below is the helper function getXMailMessageBody(), which can pull dynamic data from the system and pass it through to the template.

function getXMailMessageBody() {
  // Get dynamic values, then pass them to the twig file.
  return twig_render_template(
    drupal_get_path('module', 'x_module') . '/templates/x-mail.html.twig',
    [
      'theme_hook_original' => 'not-applicable',
      'dynamic_key1' => "{value1}",
      'dynamic_key2' => "{value2}",
    ]
  );
}

Create a twig template file x-mail.html.twig in the templates folder within the module folder.

Email Body and with dynamic values here {{ dynamic_key1 }} & {{ dynamic_key2 }}.

So, with the Drupal mail service and a custom template, you can send emails with customized messages from your Drupal application.

Thank you :)

Original Article: https://peoplesblog.co.in/blogs/Sci-Tech/Drupal/2021-01-20-Send-Mail-with-Custom-Email-Template-and-with-Dynamic-values-via-Drupal-Mail-Service.html

Jan 30 2021
Jan 30

When you install a Drupal site, a settings.php file is created (either by you, or by the installer) to contain various settings specific to your site (such as database configuration, trusted hostnames, etc.). This is done by taking a copy of a file provided by Drupal core, default.settings.php, and adding or modifying the required lines.

As Drupal develops, additional features mean new things going into default.settings.php.

For example, after a long discussion, a new entry was added to default.settings.php with effect from Drupal 9.2:

# $settings['update_fetch_with_http_fallback'] = TRUE;

This mitigates a potential man-in-the-middle attack when checking for updates to core and contributed modules. The point here is not to discuss that issue; it merely serves to illustrate something: any Drupal site created before 9.2 will have its settings.php file based off the earlier default.settings.php file, and so won't have this entry with its associated documentation comments.

This is going to become increasingly important. Before Drupal 8, major new Drupal releases would often involve creating a new settings.php file. Now, this file could persist as Drupal moves through 8.x, 9.x, 10.x, etc.

There needs to be some way to keep track of changes to default.settings.php between releases of Drupal core, so that any individual site's settings.php file can keep pace. Expecting site maintainers to comb the extensive release notes for every minor core release is not going to work; apart from the chance something might be missed, there is also the fact that sometimes smaller changes in a major release are documented in the release notes of a beta or release candidate.

To work, we need a solution that

  • Keeps track of the default.settings.php file off which the current settings.php file is based
  • Allows a (semi-)automated way to update settings.php files to incorporate changes
  • Alerts the site maintainer when the settings.php file has become stale.

This post will offer a solution, running through those three requirements.

Keeping track of the defaults for the current settings.php

When you install your Drupal site, take a copy of default.settings.php. I'll call it last.default.settings.php.

Put it in web/sites/default, the same place as default.settings.php.

If your site is tracked with some kind of version control, make sure that file is included.

You now have a file that matches exactly the default.settings.php file used to create your site's settings.php file for the first time.
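
The whole setup can be sanity-checked end to end in a throwaway directory (using the same web/sites/default layout this post assumes; the file contents here are just a stand-in):

```shell
set -e
# Work in a temporary directory so no real site is touched.
cd "$(mktemp -d)"
mkdir -p web/sites/default
printf '# default settings\n' > web/sites/default/default.settings.php

# Snapshot the defaults the site's settings.php was based on.
cp -a web/sites/default/default.settings.php web/sites/default/last.default.settings.php

# Identical files: diff prints nothing and exits 0.
diff -q web/sites/default/last.default.settings.php web/sites/default/default.settings.php \
  && echo "in sync"
```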

Checking when something has changed

We'll get to automating this in a bit.

But, for now, after any core update, you can run the following to check to see if default.settings.php has changed since you created settings.php.

diff -u web/sites/default/last.default.settings.php web/sites/default/default.settings.php

That command will return nothing if nothing has changed, or a diff of the changes if something has.
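
For instance, on a site created before 9.2, the diff would contain a hunk roughly like the following, showing the new entry discussed above (the hunk header and surrounding context lines will vary with your core version; this is illustrative only):

```diff
--- web/sites/default/last.default.settings.php
+++ web/sites/default/default.settings.php
@@ ... @@
+# $settings['update_fetch_with_http_fallback'] = TRUE;
```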

Incorporating changes (semi-)automatically

We can use that diff to change settings.php to incorporate any changes

cd web/sites/default
diff -u last.default.settings.php default.settings.php > settings-merge.patch
patch --dry-run settings.php < settings-merge.patch

As long as you're happy with what the dry-run says will happen, follow this with:

patch settings.php < settings-merge.patch
rm settings-merge.patch

Lastly, we need to copy the (changed) default.settings.php file. Your settings.php file is now based off the updated version, so we need to update the copy we're keeping to track this

cp -a default.settings.php last.default.settings.php

Take care: if patch creates a file called settings.php.orig containing the unaltered file, remove that before checking the changed version back into version control. If you're using version control, you don't need a separate copy of the old file anyway.

This is all only semi-automated, because it's possible that the patch won't apply cleanly (for example, if the changed portion of default.settings.php is too close to site-specific modifications you had made), in which case you'd have to make the changes manually.

Alerting the site maintainer

You could run that diff command manually after each core update. But it would be nice to automate that.

Fortunately, this is easily done.

I have a directory inside my composer root folder named hook-scripts. Your mileage may vary as to where you choose to put files like this, but working with my directory structure, create a file in hook-scripts named check-default-settings.

#!/bin/bash
DIFF=$(diff -q web/sites/default/last.default.settings.php web/sites/default/default.settings.php)
if [ ! -z "$DIFF" ]; then
  echo -e "\e[31mdefault.settings.php file has changed\e[0m"
fi

(Make sure you set the file to be executable, chmod +x)

If you execute that script, it will return nothing if default.settings.php is unchanged. But if your last.default.settings.php file no longer matches default.settings.php, it will print a message to that effect in red letters.

Now all we need to do is tell composer to call this script after each installation or update. I'll assume here that you know how to put pre / post update / install scripts into composer.json. But you want something like this:

    "scripts": {
        "post-update-cmd" : [
            "hook-scripts/check-default-settings"
        ],
        "post-install-cmd" : [
            "hook-scripts/check-default-settings"
        ]
    }

Now, every time Drupal core is updated (indeed, anything is updated) your script will run. If default.settings.php has changed, you'll be prompted in red lettering. You can then go and run the diff / patch commands above to make sure those changes are included for your site.

Jan 29 2021
Jan 29

For the second time, Acquia has been recognized as a leader in the Gartner MQ for Digital Experience Platforms.

For the second year, Acquia was named a Leader in Gartner's Magic Quadrant for Digital Experience Platforms (DXPs).

Gartner magic quadrant for digital experience platforms

Our leadership position improved compared to last year. Acquia is now the clear number two behind Adobe. Market validation from Gartner on our vision is exciting and encouraging.

In the report, the analysts note the Drupal community as a powerful entity that sets Acquia apart from closed monoliths. Closed monolithic stacks and martech silos are quickly becoming a thing of the past. Drupal's scale, modularity and openness are a real differentiator.

Mandatory disclaimer from Gartner

Gartner, Magic Quadrant for Digital Experience Platforms, Irina Guseva, Mick MacComascaigh, Mike Lowndes, January 27, 2021.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Acquia.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Jan 29 2021
Jan 29

What is hook_update_N()?

Let’s say you are developing a Drupal module (custom or contrib) which tracks how many visitors land on specific node pages. Version 1 of your code might track visitors by nid (node ID) in the database using a table like this:

nid | visitors
----|---------
1   | 4
13  | 22

Let’s set aside the debate over whether the above is a good idea or not, but once your code has been deployed live to production sites, that’s what the data will look like.

This module might work very well for a long time, and then you might have the need to track not only nodes but also, say, taxonomy term pages. You might reengineer it to look like this:

type | id | visitors
-----|----|---------
node | 1  | 4
node | 13 | 22
term | 4  | 16

To achieve this change when the first version of your database is already out in the wild, you need to tell target environments to update the database schema. This is done using hook_update_N(), where you replace the N with incremental numbers, something like this:

/**
 * Update database schema to allow for terms, not only nodes.
 */
function hook_update_9001() {
  // Schema-changing code goes here.
}

In this case, 9 is the major version (Drupal 9) and 001 indicates that this is the first update to your code.

Each module tracks which schema version it’s on, so that if the code introduces new hook_update_N() functions, Drupal will know to run them only once. You can tell which schema version any installed module is at by running, for example:

drush ev "print(drupal_get_installed_schema_version('webform'))"

This might tell you, for example, that Webform’s current schema is 8621. This means that the latest update hook that was run is Webform’s hook_update_8621(). If the codebase introduces hook_update_8622(), say, or hook_update_8640() (you can skip numbers if you need to), then the database will be marked as out of date and running drush updb will run the new hook and update the installed schema version.

If you ever need to re-run an update hook (which happens rather rarely), you can roll back the recorded schema version, like this:

drush ev "drupal_set_installed_schema_version('webform', 8620)"

So what’s wrong with this?

This works well almost all the time, and you can automate your deployment process to update the database, making sure your schemas are always in sync. However, as developers and site users, it is important to be aware of certain drawbacks of hook_update_N(), which I’ll get to in detail:

  • hook_update_N() tightly couples the database to the code version;
  • it makes gradual-deployment on multi-container setups such as Kubernetes fragile (or impossible);
  • rollbacks are not possible;
  • it can add considerable complexity to the deployment of configuration.

The shaky foundation of database-driven websites

The idea of version control is paramount to how we conceive of computer code. If you’re following the precepts of continuous deployment, then every version of your code needs to “work” (that is, tests need to pass, or, at the very least, it needs to be installable).

For example, let’s assume a bug makes it to your production environment in version 5 of your code, and you know this bug was not present in version 4. You should theoretically be able to check out version 4 and confirm it was working, then figure out what the difference is between versions 4 and 5.

In fact, this is exactly how things work on static sites such as Jekyll: all your data and your functionality (JavaScript) are in your codebase. Each version of your code will be internally coherent, and not rely on an external unversioned database to do something useful.

On database-driven projects based on Drupal or WordPress, if you check out version 4 of your codebase, it will probably not do anything useful without a database dump that was created using version 4 of your code.

Therefore, although we all use version control for our code, in a way we are fooling ourselves, because critical parts of our project are not version-controlled: the database dump, the ./sites/default/files folder, and the private files folder.

Although it makes sense for certain elements to live in the database or in ./sites/default/files (for example, an encrypted user account password or a user’s avatar), for other elements, such as your “About page” text, it would really make a lot more sense to be under version control.

In fact, the blog post you are reading right now is a file under version control on Jekyll, which you can see using this link, and not some collection of opaque, unversioned, entries in database tables with names like node__body, node__field_tags, node_field_revision, which can be changed at a moment’s notice by any module’s hook_update_N() functions.

Oh, did I mention that I love Drupal?

Tight code-database coupling

Let’s imagine a world where the database schema never changed. A world where hook_update_N() does not even exist.

In such a world, you could take any version of your code, and any version of your database dump (say, the latest version), combine the two on a test environment, and debug errors at will.

In the real world, every time any module updates the database schema, it makes the database more tightly coupled to the current version of the codebase.

Let’s take our “number of visitors per entity” code we had earlier: if I use an old codebase which expects my table to contain fields “nid” and “visitors”, but my only available database dump has fields “type”, “id”, “visitors”, the history of my carefully version-controlled codebase will be useless, and old versions will fail with an error such as:

ERROR 1054 (42S22): Unknown column 'id' in 'field list'.

Gradual deployments

Mostly we think of Drupal sites as being on a server with one copy of the codebase, and one copy of the database. So the concept of keeping the database and code “in sync” makes sense.

But as more and more teams use containers and Kubernetes-type container-orchestration systems, high-traffic sites might have, say, one performance-optimized database, and then 5, 10 or 20 load-balanced copies of your PHP code.

Acquia uses such a setup behind the scenes for its cloud hosting, so it’s good to develop with this in mind. On Acquia’s setup, all PHP containers use a single, shared database, as well as shared private and public files directories.

But the PHP containers do not share the /tmp directory. This means that every time you perform a web request on a server, the load balancer might direct you to a container with its own /tmp, whose contents differ from other containers’ /tmp.

It’s important to realize this if your code does things such as building large files over several web requests, as it can lead to hard-to-diagnose bugs.

But in addition to providing you with headaches such as these, multiple containers also allow you to do gradual deployments of new code, reducing the cost of potential failure.

For example, let’s say you have 20 Drupal containers with 20 copies of your codebase, and each Drupal container is connected to a shared database, and shared files and private files directories. If you are deploying a risky update to your code, you might want to start by deploying it to 25% of the containers (5). Then if there are no adverse effects, scale up to 10 the next day, then the entire 20 the day after.

Code that uses hook_update_N() can break this workflow: because all containers share the database, if container 1 has the new version of your code and updates the database accordingly (so that the new database fields are “type”, “id”, “visitors”), then container 10 (which uses the old version of your code) will fail when it looks up the database field “nid”.

Rollbacks

Let’s forget about fancy container orchestration and just look at a typical Drupal website. A simple real-world site might have a “contact us” webform and some pages, plus some custom functionality.

Let’s say you are deploying a change to your codebase which triggers a hook_update_N(). No matter the amount of unit tests and testing on stage, there is always the possibility that a deployment to production might trigger unforeseen issues. Let’s assume this is the case here.

A typical deployment-to-production scenario would be:

  • You backup your production database.
  • You install your new code.
  • You run drush updb which updates the database schema based on your hook_update_N().
  • A few hours pass. Several people fill in your contact form, which means now your database backup from step 1 is out of date.
  • You realize your newly-deployed code breaks something which was not caught by your stage testing or your automated tests.

In a situation like this, if you did not have hook_update_N()s in your code, you could simply roll back your codebase on production to the previous version.

However, this is no longer an option because your database will not work with previous versions of your codebase: there is no hook_downgrade_N(). You are now forced to live with the latest version of your code, and all the benefits of version-controlling your code are for naught.

Config management

Let us recall the elements which make up a Drupal website:

  • Versioned code.
  • Unversioned database and file directories.

If you are using configuration management and a dev-stage-production workflow, there is a third category:

  • Configuration, including the list of enabled modules and the defined node types and fields, which exists both in the (unversioned) database and, once exported, in versioned code.

It is worth recalling a typical workflow:

  • add field_new_field to the article node type on your local machine.
  • the field is now in your local development database but not in your codebase
  • drush config-export
  • the field is now in your local development database and also in your codebase
  • do all your testing and push your code to production

At this point your field is in your production codebase but not your production database.

You probably have a deployment script which includes a “drush updb” step. The question is: do you run “drush config-import” before or after “drush updb”?

It turns out this is not that easy a question to answer. (Drush also provides a drush deploy command which combines configuration import and database updates.)

Regardless of your deployment process, however, we need to take into account a more troubling possibility:

In addition to relatively benign database schema updates, hook_update_N() can modify configuration as well.

In such a case, if you are not careful to run hook_update_N() first on your development environment, then export the resulting configuration, then run your deployment, you may run into the following problem:

#3110362 If an update hook modifies configuration, then old configuration is imported, the changes made by the update hook are forever lost.

Let’s look at a real-world example using the Webform module. Let’s install a new Drupal 8 site with Webform 5.23, then export our configuration, then upgrade to Webform 6.x and import our old configuration. We’ll look at the kind of headache this can lead to (note to beginners: do not do this on a production site, it will completely erase your database).

composer require drupal/webform:5.23
drush site-install -y
drush en webform_ui -y
drush config-export -y

This puts your current site configuration into code. Among said configuration, let’s focus on a single piece of configuration from Webform:

drush config:get webform.settings settings.default_page_base_path
# 'webform.settings:settings.default_page_base_path': form

The base path for webforms is “form”. This tells Webform to build URLs with a structure such as https://example.com/form/whatever.

Let’s now update webform, and our database.

composer require drupal/webform:6
drush updb -y
drush config-import -y

In Webform’s webform_update_8602(), the config item webform.settings:settings.default_page_base_path is changed from “form” to “/form”.

But we are re-importing old config, which overwrites this change and reverts webform.settings:settings.default_page_base_path to “form”, not “/form”.

To see the type of hard-to-diagnose error to which this might lead, you can now log into your Drupal site, visit /admin/structure/webform, create a webform named “test”, and click on the “View” tab.

Because the base path lacks the expected leading slash, you now get the “not found” URL /admin/structure/webform/manage/form/test, instead of the expected /form/test – a critical bug if you are on a production site.

In addition, this has a number of cascading effects including the creation of badly-formatted URL aliases which you can see at /admin/config/search/path.

If you find yourself in this situation on production, you need to revert your Webform schema version on your development environment, export your config, reimport it on production, and resave your forms, and potentially fix all your paths starting with “form” on /admin/config/search/path so that they start with “/form”.

To be fair, this is not the fault of the Webform maintainers. In my opinion it shows a fundamental frailty in hook_update_N() combined with lack of documentation on deployment best practices. However, if we strive for Drupal to be a robust framework, there should not be a single point of failure (in this case not strictly adhering to fickle, badly-documented deployment procedures) which can lead to major instability on production.

How do we fix hook_update_N()?

Here are a few approaches to avoid the potential damage done by hook_update_N():

Approach 1: don’t use hook_update_N()

When possible, you might consider not using hook_update_N() at all. Consider our “number of visitors per node” module from earlier.

Instead of a hook_update_N(), your code could do something like this:

  • Do not change the field name from “nid” to “id”. Even though “id” makes more sense, the field is called “nid”, just leave it at that.
  • Do not expect there to be a “type” field. If your code needs it, for example if creating an entry for the first visitor to a non-node entity, your code can create it.
  • Assume an empty “type” means you are dealing with a node.

The above approach adds complexity to your code, which you can add to a “storage” abstraction class. Although not ideal, this does away with the need to use hook_update_N().
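
A minimal sketch of such a storage abstraction (plain PHP, with the database swapped out for an in-memory array purely for illustration; the class and method names are hypothetical):

```php
<?php

/**
 * Visitor-count storage that tolerates the old schema.
 *
 * Rows may or may not carry a 'type' value; an absent or empty
 * 'type' is treated as 'node', so legacy data keeps working.
 */
class VisitorStorage {

  /** @var array[] In-memory stand-in for the database table. */
  private array $rows = [];

  public function record(int $id, int $visitors, string $type = ''): void {
    $this->rows[] = ['nid' => $id, 'type' => $type, 'visitors' => $visitors];
  }

  public function visitorsFor(int $id, string $type = 'node'): int {
    foreach ($this->rows as $row) {
      // An empty 'type' means a legacy row: assume it is a node.
      $rowType = ($row['type'] ?? '') ?: 'node';
      if ($row['nid'] === $id && $rowType === $type) {
        return $row['visitors'];
      }
    }
    return 0;
  }

}

$storage = new VisitorStorage();
$storage->record(1, 4);           // Legacy row, no type: treated as a node.
$storage->record(4, 16, 'term');  // New-style row with an explicit type.
echo $storage->visitorsFor(1), "\n";          // 4
echo $storage->visitorsFor(4, 'term'), "\n";  // 16
```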

Approach 2: Don’t use hook_update_N() to update configuration

Updating configuration, as seen previously, is even more dangerous than updating non-configuration database tables. So if at all possible, avoid it.

In the Webform example given above, it might have been reasonable to consider keeping with the old non-leading-slash format for path prefixes, rather than update configuration.

When you absolutely must update configuration, you could consider the possibility that certain users might have reimported old configuration, and provide error-checking and hook_requirements() (displaying error messages on the /admin/reports/status page) accordingly.

Approach 3: Robust exception handling

Do not assume that your database schema, or your configuration structure, is up-to-date. If you decide to provide a hook_update_N() to update the schema, for example from “nid” and “visitors” to “type”, “id” and “visitors”, then when querying the database you should account for the possibility that, for whatever reason, the database is not up-to-date. Here is some pseudo-code:

public function num_visitors_for_entity($id, $type = 'node') : int {
  try {
    return $this->query_database($type, $id);
  }
  catch (\Exception $e) {
    // Log the exception and fail gracefully.
    watchdog_exception('my_module', $e);
    return 0;
  }
}
That way, if your database and code are not in sync, it’s not going to break your entire site, but rather log an exception and fail gracefully.

Approach 4: keep config changing logic idempotent and separate from update hooks

Let’s look again at Webform’s webform_update_8602(), in which the config item webform.settings:settings.default_page_base_path is changed from “form” to “/form”.

I would recommend having a separate function to update config, and calling that function from the update hook. That way, if a development team makes the mistake of not updating their configuration before importing it into production, it becomes easy to recover by running, say, my_module_update_configuration().

Then, your hook_requirements() might perform some sanity checks to make sure your configuration is as expected (in this example, that the “webform.settings:settings.default_page_base_path” config item has a leading slash). If this smoke test fails, developers can be directed to run my_module_update_configuration() which will update all configuration to the required state.

In addition, my_module_update_configuration() can be made idempotent, meaning: no matter how often you run it, you will always end up with the desired state, and never get an error.
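A minimal sketch of this pattern might look like the following (my_module and the update number are placeholders; the config key is the real Webform one discussed above):

```php
<?php

/**
 * Idempotent: running this any number of times yields the same state.
 */
function my_module_update_configuration(): void {
  $config = \Drupal::configFactory()->getEditable('webform.settings');
  $path = $config->get('settings.default_page_base_path');
  // Only add the leading slash if it is actually missing.
  if (is_string($path) && $path !== '' && $path[0] !== '/') {
    $config->set('settings.default_page_base_path', '/' . $path)->save();
  }
}

/**
 * The update hook simply delegates to the idempotent function.
 */
function my_module_update_9001() {
  my_module_update_configuration();
}

/**
 * Implements hook_requirements(): smoke test shown on /admin/reports/status.
 */
function my_module_requirements($phase) {
  $requirements = [];
  if ($phase === 'runtime') {
    $path = \Drupal::config('webform.settings')
      ->get('settings.default_page_base_path');
    if (is_string($path) && $path !== '' && $path[0] !== '/') {
      $requirements['my_module_config'] = [
        'title' => t('Webform base path'),
        'description' => t('The default page base path is missing its leading slash; run my_module_update_configuration().'),
        'severity' => REQUIREMENT_ERROR,
      ];
    }
  }
  return $requirements;
}
```

Because the fix lives outside the update hook, it can be re-run safely even on a site where configuration was reimported out of order.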


Please enable JavaScript to view the comments powered by Disqus.
Jan 28 2021
Jan 28

Drupal's editing interface can be unclear to novice editors, and the number of options may seem overwhelming to many. The Simplify module comes in handy here, as it allows you to limit the available functionalities to the necessary minimum, thus making website editing easier.

Later in the article, I will present the module's capabilities and the effects of its operation.


The first version of the module was released in December 2010. Since then, it has been slowly but steadily developed. At the end of 2015, a stable 1.0 version for Drupal 8 was released.

Module's popularity

According to official statistics, Simplify is used by over 15 thousand websites; however, only 20% of these are Drupal 8-based projects. The 7.x-3.x branch is currently by far the most popular.


Module's creators

The module has four maintainers.

So far, 72 commits have been made to the code repository. In addition to the maintainers mentioned above, several other people have been involved in the project.

What is the module used for?

The module is used to simplify administration forms by hiding selected functionalities from the user. This makes it easier for those less proficient in the use of CMSs to edit pages and blocks.


You can download the module at Drupal.org / Simplify.

After launching Simplify, select the elements that should be hidden from the user. Go to Configuration -> User interface -> Simplify in the administration menu and check out the list of options there.

Module's use

The functionalities that can be disabled are divided into several categories according to the module that supports them. For example, try to disable all the "Nodes" type elements:


As a result, the users visiting the page editing form will see its significantly simplified version - without the author, text type selection, promotion and versioning.


As the administrator, you have the View hidden fields permission, which bypasses the changes made by Simplify. If you want to put administrators on an equal footing with all other registered users, check the Hide fields from admin users option.

By default, the Simplify module allows you to hide the editing elements of nodes, blocks, comments and taxonomies.

Hooks and integrations

The module provides two hooks for adding your own exclusions to any form. Their use is very simple:

  • hook_simplify_get_fields_alter() - is used to change the list of exclusions; here you can add your own options or remove the existing ones
  • hook_simplify_hide_field_alter() - in this hook you hide a given field by modifying the array containing the form
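To illustrate, here is a hedged sketch of what implementations might look like. The exact parameter shapes are defined in the module's simplify.api.php file, so treat the array structures and the 'mymodule_notes' field name below as assumptions:

```php
<?php

/**
 * Implements hook_simplify_get_fields_alter().
 *
 * Adds a custom exclusion option to the list (array shape assumed).
 */
function mymodule_simplify_get_fields_alter(&$fields) {
  $fields['nodes'][] = ['mymodule_notes', t('Internal notes field')];
}

/**
 * Implements hook_simplify_hide_field_alter().
 *
 * Hides the given field by modifying the form array.
 */
function mymodule_simplify_hide_field_alter(&$form, $field) {
  if ($field === 'mymodule_notes' && isset($form['mymodule_notes'])) {
    $form['mymodule_notes']['#access'] = FALSE;
  }
}
```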


Simplify is a module that is simple by design but is a great help for novice editors. It is worth considering on corporate websites, where the user may be overwhelmed by the number of options available. Drawing on our Drupal agency's experience with clients, I recommend it, especially if you put a lot of emphasis on editing forms.

If you are interested in CMS optimisation, check out the article on 6 ways to improve editors' productivity.

Jan 28 2021
Jan 28

As part of the interview process for Drupal Career Online, we provide potential students with some background information about Drupal so that they can make a more informed decision about whether or not the program suits them. One of the things we communicate is the scope of the Drupal project and its pervasiveness in the web development industry.

As anyone who has tried to find reliable numbers to answer the "how popular is Drupal?" question knows, finding reliable data is often a difficult process. While working to update our recruitment materials this year, we thought we'd share the data we found and the conclusions we've made.


To answer this question, in the past we have relied on W3Techs (Web Technology Surveys), a subsidiary of Q-Success, an Austria-based organization. Their methodology seems sound (they monitor the top 10 million websites), but as their disclaimer states, "This information may be incomplete and inaccurate due to the vastness and complexity of the matter in hand".

As of January 2021, they report that about 1.5% of all websites they monitor are Drupal-based. This represents 2.4% of all websites using a content management system.

W3Techs does provide a breakdown of usage among the top 1 million, 100,000, etc… sites, but only as part of one of their products (999 €). 

W3Techs reports that Drupal 7 is used by 66.4% of all the websites who use Drupal.

So, how do these numbers compare with other data sources?

Built With

Built With is an Australian company that provides lead generation lists in addition to web development trends among a plethora of technologies. 

They currently report that ~618,000 sites are using Drupal, which equates to 0.97% of all web sites. This is 4th among content management systems after WordPress, WooCommerce Checkout, and Joomla. Their sample size is over 35 million sites.

Their data for the top 1 million sites has Drupal with a 3.28% content management system market share (3rd place), 7.73% in the top 100,000 sites (2nd place), and 12.56% in the top 10,000 sites (2nd place).

Built With also reports that of the ~618k site running Drupal, the United States is responsible for ~250k, with Russia next on the list with ~41k sites.


SimilarTech appears to be primarily a lead-generation company with offices in Israel and the United States. They state that their "proprietary technology scans more than 30 billion web pages per month."

For Drupal, they report that ~236,000 unique domains run Drupal, which equates to ~337,000 sites (I'm assuming that subdomains account for the significant difference in these numbers). 

One potential major problem with their reports is that they often separate Drupal into multiple categories, including "Drupal", "Drupal 7", and "Drupal 8".

While they report that Drupal is used on 0.483% of all sites, it is not clear if this refers to just their "Drupal" category or includes their "Drupal", "Drupal 7", and "Drupal 8" categories. They appear to also break up other content management systems in a similar manner, making it difficult to draw reliable conclusions.

In their list of the top 1 million sites, they report that "Drupal" is used on 2.55% and "Drupal 8" on 0.81% (they only provide the top 8 positions). 

SimilarTech reports that after the United States, Russian sites account for the most Drupal sites with ~32,000. 


So, which data can we trust? What is the real answer? As you might imagine, it is difficult to say…

Removing the SimilarTech data from the equation for the reasons stated above, let's look at a summary of what we've found:

Metric                          W3Techs   Built With
All sites (CMS + non-CMS)       -         0.97%
All sites (CMS)                 -         -
Top 10m sites (CMS + non-CMS)   1.5%      -
Top 10m sites (CMS)             2.4%      -
Top 1m sites (CMS)              -         3.28%
Top 100k sites (CMS)            -         7.73%
Top 10k sites (CMS)             -         12.56%

Clearly, using no-cost data from W3Techs and Built With, it isn't possible for a direct apples-to-apples comparison. But, combined, the data does seem to make sense. If you make the assumption that higher ranked sites have greater complexity and budget, along with the fact that Drupal 8 tends to be focused more on enterprise solutions, then it makes sense that Drupal has a higher percentage of usage among more popular sites.

Can we answer the question "how popular is Drupal?" - well, the answer is "sort of." Often answers to questions like this need caveats, as the table above illustrates. 

So what's the answer? Based on these sources, we're going to extrapolate and go with: Drupal runs about 3% of the top 1 million sites that use a content management system, as this seems well supported by two independent sources.

Jan 27 2021
Jan 27
  • 28 January 2021
  • Benoît Pointet

You are facilitating a group process. Post-its gather and start overlapping each other. Time to make sense of all that. You just launched the group into some clustering exercise when someone drops the bomb: "Wait! It all relates!".

Discover more about the services UX Design and Agile teams and processes our digital agency has to offer for you.

"But… it all relates!" A reaction so often heard while facilitating (or participating in) group reflection processes (brainstorming, agile retrospectives, …).

"You ask us to group things … but everything is connected!"

It often comes with a contrived smile ("things are complex, you know!"). Sometimes also with a counterproposal "let us make a single group around the central thing here which is X, since obviously all things relate to X."

A very human reaction, which if you’re unprepared as facilitator, can take you aback. Keeping the following arguments in your mind can help.

  1. That it all relates does not mean that it all ought to conflate. It makes sense to distinguish the different aspects of a situation or a problem, the different knots of its web of complexity. Some seem to think that seeing the big picture implies refusing to distinguish the whole from its parts. Yet if we can see the links, the relationships, it is because we have identified the parts.

  2. Although a holistic view provides a definite advantage when facing a complex situation, it is good to remind ourselves that action cannot be holistic. You cannot act on the system as a whole. You may only act on precise points of the system.

Two simple arguments to help us facilitate these "everything is connected" moments and realize that in a (group) reflection process, taking things apart is the first step towards deciding on meaningful action.

Photo: Ruvande fjällripa (a brooding rock ptarmigan)

Benoît Pointet

Holacracy Coach

Jan 27 2021
Jan 27

Have you heard about the Drupal decoupled menus initiative? If not, I'll explain more in a moment. But first, if you've got any experience creating JavaScript front-ends for a decoupled CMS (Drupal or other) the initiative is looking for input through this survey: https://www.surveymonkey.com/r/N2JZFLD

Take the survey

It only took me about 10 minutes to fill out, and it's an easy contribution to the Drupal community with a big impact. Fill it out, then come back, and read the rest of this post. (I'll wait.)

What is the decoupled menus initiative?

The decoupled menus initiative (DMI) was introduced in Dries' keynote from DrupalCon Europe 2020, and this video by Gabe Sullice (embedded below) does a great job of explaining what it's all about.


Video credit: Gabe Sullice

The goal of the decoupled menus initiative is to:

"Provide the best way for JavaScript front-ends to consume configurable menus managed in Drupal"

This includes creating official, community-supported components (e.g. React and Vue) that you can use in your own project or as a reference implementation, along with everything required to support them, including docs, packaging, security, etc. At the same time, the scope stays small and attainable: ship a single menu component rather than a complete overhaul of Drupal's admin UI.

Credit: Dries Buytaert, DrupalCon Europe 2020

While on the surface this might sound like we're building a React component that displays links, I think it's the work that needs to happen to ensure that component can be effectively managed and maintained by the Drupal community that is the real value of this initiative. Some of the problems that need to be solved include:

  • Updating the Drupal.org infrastructure to handle any requirements for bundling, testing, and shipping JavaScript packages via GitLab etc.
  • Defining policies and practices for handling security issues with JavaScript packages
  • Defining tooling, and processes, for creating best-in-class documentation for how to consume menu data from Drupal
  • Developing an ideal data structure for consuming menu data, and then updating Drupal core to facilitate providing that data
  • Allowing content creators to configure, and turn on/off, menus served via JSON:API through an intuitive UI
  • And of course writing the code for the different reference implementations in React, Vue, etc.


Looking at that list, most of those problems, once solved, will reduce the barriers to creating more awesome JavaScript integrations with Drupal's web services API. This, in itself, is a huge win. And hopefully it results in a bunch of additional initiatives tackling things like authentication, content editor-generated layouts, image styles, routing, and other traditionally hard problems of decoupled architectures.

Think of the decoupled menus initiative as laying the groundwork for future innovations.

This is important because, as Dries pointed out in his keynote introducing the initiative, in order for Drupal to continue to grow, and to remain relevant for the next 20 years, it has to be better positioned to compete with the current class of decoupled content management systems. Drupal is already the best option from the perspective of content architecture, editorial workflows, and a deep feature set. But it lacks a developer experience that is attractive to JavaScript developers, and it gets overlooked as a result. Since these devs are often influential in deciding which CMS to use, it's important that they view Drupal as an awesome choice.

The experience of integrating with Drupal has to be as good as, or better than, that of the competitors. This means meeting JavaScript developers where they are, and not making them jump through hoops to integrate with Drupal. Because more often than not, we developers will prefer the path of least resistance. And speaking from my own experience, npm install --save @contentful/app-sdk is a lot less friction than writing my own JavaScript library to integrate with Drupal's back-end. While there have been numerous attempts to create reusable libraries, they tend to lack the visibility required to make them truly useful.

Assuming this initiative is successful, I would love to see something similar for dealing with authentication: a set of community supported components that deal with the complex OAuth workflow, specifically designed to integrate with Drupal and the Simple OAuth module. This would get us closer to the experience of using solutions like Auth0.

Want to know more? Or get involved?

Did I mention there's a survey?

Take the survey

Jan 27 2021
Jan 27

RainU logo

On the heels of our recent Drupal 9 release of Rain CMS, we are excited to officially announce the beta release of our Rain University platform, RainU. RainU CMS is a Drupal-based development platform made just for higher education. Colleges and universities can now launch new sites faster with full, flexible control over content.

The RainU CMS Theme

New RainU CMS theme homepage (hero image showing a group of students in graduation caps)

The RainU theme is based on the main Rain base theme but adds additional features that are relevant for university websites. The navigation and content have been staged to help content authors get started quickly. Some of the new features added to RainU are the event and quote carousels, as well as more components for highlighting content.

Building pages

Screenshots: the Rain CMS admin content edit page (here, a frequently asked questions field) and the paragraph browser dialog used when authoring content.

With Rain University, we give content authors the freedom and flexibility to build robust pages using a library of pre-stocked components (called “paragraphs” in Drupal). The Rain Admin UX offers many improvements over the stock Drupal admin which makes the overall experience more intuitive for editors.

Find out more

Currently, Mediacurrent’s Rain University CMS is available to new and existing clients. Our team of strategists, designers and developers can work with your organization to migrate your website from a legacy CMS onto an enterprise, open source platform.

For more information on the benefits of Rain University CMS and to schedule a free demo, please visit the RainU page or chat with us right now (see bottom right corner of the page). We would be happy to talk more about your project or schedule a demonstration.

Jan 27 2021
Jan 27

When you try to uninstall a module that has a field that you have used, it can throw the following error:

The following reasons prevent the modules from being uninstalled: Fields pending deletion

This is an issue that can happen in both Drupal 8 and Drupal 9. It occurs because Drupal doesn't actually delete a field's data when you delete the field; the data is purged during cron runs. If cron hasn't run enough times since you deleted the field, Drupal won't let you uninstall the module.

To force Drupal to purge the data, you can run the following command:

drush php-eval 'field_purge_batch(1000);'

Increase 1000 to a number high enough to wipe out the data, or run the command a few times. After this has completed, you should be able to uninstall the module.

Module uninstall dependencies (drupal stackexchange)
The message "Required by Drupal (Fields Pending Deletion)" baffles users
Can’t uninstall YAML because of the following reason: Fields pending deletion

Jan 27 2021
Jan 27

11 minute read. Published: 27 Jan 2021. Author: Seonaid Lee
DevOps, Drupal Planet

A lot of potential clients come to us with straightforward and small projects and ask, “Well, can you do Kubernetes?” And we say, “Well, we can, but you don’t need it.”

But they’re afraid that they’ll be missing out on something if we don’t add Kubernetes to the stack. So this is a post to tell you why we probably won’t be recommending Kubernetes.

This post is going to look at three perspectives on this question… First, I’ll consider the technical aspects, specifically what problems Kubernetes is really good for, compared with what problems most smaller software projects actually have.

Then I’ll talk about the psychology of “shiny problems.” (Yes, I’m looking at you, Ace Developer. I promise there are other shiny problems in the project you’re working on.)

And last but not least, we’ll consider the business problem of over-engineering, and what gets lost in the process.

Kubernetes solves specific problems

First off (I’m sorry to draw your attention to this): you probably don’t have the problems that Kubernetes solves. Kubernetes lives at the container orchestration level. It shines in its ability to spin stateless servers up and down as needed for load balancing unpredictable or pulsed loads, especially from a large user base. Large, by the way, is not 10,000… it is millions or hundreds of millions.

Especially for the kind of internal custom services that a lot of our clients require, it is overkill. Many purpose-built sites are unlikely to have more than dozens or hundreds of users at a time, and traditional monolithic architectures will be responsive enough.

Kubernetes is designed to solve the problem of horizontal scalability, by making multiple copies of whichever services are most stressed, routing requests to minimize latency, and then being able to turn those machines back off when they are no longer needed. Even if you hope to someday have those problems, we suggest that you should hold off on adding Kubernetes to your stack until you get there, because the added technical overhead of container orchestration is expensive, in both time and dollars.

(It costs more to build, which delays your time to market, which delays your time to revenue, even if you aren’t paying yourself to build it.)

Which does lead to the question, “Why does everybody want to use this technology, anyway?” For that, we’ll have to take a step back and look at…

The Rise of the Twelve-Factor App

With the shift to the cloud and the desire for highly scalable applications, a new software architecture has arisen that has a strong separation between a system’s code and its data.

This approach treats processes as stateless and independent, and externalizes the database as a separate “backing service.” The stateless processes are isolated as microservices, which are each maintained, tested, and deployed as separate code bases.

This microservices approach decomposes the software into a group of related but separate apps, each of which is responsible for one particular part of the application.

Designing according to this architectural approach is non-trivial, and the overhead associated with maintaining the separate code bases, and particularly with coordinating among them, is significant. Additionally, each app requires its own separate datastore, and maintaining synchronization in production introduces another level of complexity. Furthermore, extracting relevant data from distributed data stores is more challenging than simply writing a well-crafted SQL statement.

Each of these layers of complexity adds to the cost of not only the initial development, but also the difficulty of maintenance. Even Chris Richardson, in Microservices Patterns, recommends starting with a monolithic architecture for new software to allow rapid iteration in the early stages. (https://livebook.manning.com/book/microservices-patterns/chapter-1/174)

For many of the same reasons, you probably don’t need complex layers of data handling either. Redis, for example, is for persisting rapidly changing data in a quickly accessible form. It’s not suitable as a long-standing database with well-established relations; it costs more to keep data in memory than to store it on disk, and it’s more difficult to build against.

When you are getting started, a SQL back end with a single codebase will probably solve most of your problems, and without the overhead of Kubernetes (or any of the more exotic data stores.) If you’re still not convinced, let’s take a brief detour and consider the lifecycle of a typical application.

Development, CI, and Upgrades (Oh My!)

Most applications have the following characteristics:

  • Predictable load
  • Few (fewer than millions of) users
  • Predictable hours of use (open hours of the business, daily batch processing cron job at 3 AM, etc.)
  • Clear options for maintenance windows
  • Tight connection between the content layer and the presentation layer

Contrast this with the primary assumptions in the twelve-factor app approach.


The goal of moving to stateless servers is focused on different things:

  • Zero downtime
  • Rapid development
  • Scalability

This approach arose from the needs of large consumer-facing applications like Flickr, Twitter, Netflix, and Instagram. These need to be always-on for hundreds of millions or billions of users, and have no option for things like maintenance mode.

Development and Operations have different goals

When we apply the Dev-Ops calculus to smaller projects, though, there is an emphasis on Dev that comes at the expense of Ops.

Even though we may include continuous integration, automated testing and continuous deployment (and we strongly recommend these be included!), the design and implementation of the codebase and dependency management often focuses on “getting new developers up and running” with a simple “bundle install” (or “build install” etc.)

This is explicitly stated as a goal in the twelve-factor list.

This brings in several tradeoffs and issues in the long-term stability of the system; in particular, the focus on rapid development comes at a cost for operations and upgrades. The goal is to ship things and get them standing up from a cold start quickly… which is the easy part. The more difficult part of operations – the part you probably can’t escape, because you probably aren’t Netflix or Flickr or Instagram – is the maintenance of long-standing systems with live data.

Version Upgrades

Upgrades of conventional implementations proceed thusly:

  1. Copy everything to a staging server
  2. Perform the upgrade on the staging server
  3. If everything works, port it over to the production environment
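As a rough sketch, the conventional process above might look like this with Drush (the @stage and @prod site aliases, and the use of maintenance mode and config import, are assumptions about your setup):

```shell
# 1. Copy production to staging (database dump; files would be synced similarly).
drush @prod sql:dump > prod.sql
drush @stage sql:query --file=prod.sql

# 2. Perform the upgrade on staging and test it.
drush @stage updatedb -y
drush @stage config:import -y

# 3. If everything works, repeat on production inside a maintenance window.
drush @prod state:set system.maintenance_mode 1 --input-format=integer
drush @prod updatedb -y
drush @prod config:import -y
drush @prod state:set system.maintenance_mode 0 --input-format=integer
```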

There are time delays in this process: for large sites it can take hours to replicate a production database to staging, and if you want a safe upgrade, you need to put the site into maintenance mode to prevent the databases from diverging. The staging environment, no matter how carefully you set it up, is rarely an exact mirror of production; the connections to external services, passwords, and private keys for example, should not be shared. Generally, after the testing is complete in the staging environment, the same sequence of scripts is deployed in production. Even after extensive testing, it may prove necessary to roll back the production environment, database and all. Without the use of a maintenance freeze, this can result in data loss.

This sort of upgrade between versions is significantly easier in monolithic environments.

But isn’t Kubernetes supposed to make that easier?

It’s tempting to point to Kubernetes’ rolling updates and the ability to connect multiple microservices to different pods of the database running the different versions… but in content-focused environments, the trade-off for zero downtime is an additional layer of complexity required to protect against potential data loss.

Kubernetes and other 12-factor systems resolve the issue of data protection by sharding and mirroring the data across multiple stores. The database is separate from the application, and upgrades and rollbacks proceed separately. This is a strength for continuous delivery, but it comes at a cost: data that is produced in a blue environment during a blue-green deployment may simply be lost if it proves necessary to roll back schema changes. Additionally, if there are breaking changes to the schema and the microservices wind up attached to a non-backward compatible version, they can throw errors to the end-user (this is probably preferable to data loss.)

For data persistence, the data needs to be stored in volumes external to the K8s cluster, and orchestrating multiple versions of the code base and database simultaneously requires significant knowledge and organization.

A deployment plan for such a system will need to include plans for having multiple versions of the code live on different servers at the same time, each of which connects to its associated database until the upgrade is complete and determined to be stable. It can be done, but even Kubernetes experts point out that this process is challenging to oversee.

When we are moving things into production, we need to have an operations team that knows how to respond when something fails. No matter how much testing you have done, sometimes a Big Hairy Bug gets into production, and you need to have enough control of your system to be able to fix it. Kubernetes, sad to say, makes this harder instead of easier for stateful applications.

So let’s consider what it means to have a stateful application.

When the Data is Intrinsic to the Application

A content management system by its nature is stateful. A stateful application has a lot of data that makes up a large fraction of “what it is.” State can also include cache information, which is volatile, but the data is part and parcel of what we are doing. Databases and the application layer are frequently tightly integrated, and it’s not meaningful to ship a new build without simultaneously applying schema updates. The data itself is the point of the application.

Drupal (for example) contains both content and configuration in the database, but there is additional information contained in the file structure. These, in combination, make up the state of the system… and the application is essentially meaningless without it. Also, as in most enterprise-focused applications, this data is not flat but is highly structured. The relationships are defined by both the database schema and the application code. It is not the kind of system that lends itself to scaling through the use of stateless containers.

In other words: by their very nature, Drupal applications lack the strict separation between database and code that makes Kubernetes an appropriate solution.

Shiny Problems

One of the things that (we) engineers fall into is a desire to solve interesting problems. Kubernetes, as one of the newest and most current technologies, is the “Shiny” technology towards which our minds bend.

But it is complex, has a steep learning curve, and is not the first choice when deploying stateful applications. This means that a lot of the problems you’re going to have to solve are going to be related to the containers and Kubernetes/deployment layer of the application, which will reduce the amount of time and energy you have to solve the problems at the data model and the application layer. We’ve never built a piece of software that didn’t have some interesting challenges; we promise they are available where you are working.

Also, those problems are probably what your company’s revenues rely on, so you should solve them first.

To Sum Up: Over-engineering isn’t free

As I hope I’ve convinced you, the use of heavier technologies than you need burns through your resources and has the potential to jeopardize your project. The desire to architect for the application you hope to have (rather than the one you do) can get your business into trouble. You will need more specialized developers, more complex deployment plans, additional architectural meetings and more coordination among the components.

When you choose technologies that are overpowered (in case you need them at some undefined point in the future), you front-load your costs and increase the risk that you won’t make it to revenue/profitability.

We get it. We love good tech as much as the next person.

But wow! Superfast!

The fact is, though, most projects don’t need response times measured in the millisecond range. They just need to be fast enough to keep users from wandering away from the keyboard while their query loads. (Or they need a reasonable queuing system, batch processing, and notification options.)

And even if you do need millisecond response times but you don’t have millions of users, Kubernetes will still introduce more problems than it solves.

Performance challenges like these are tough, but generally need to be solved by painstaking, time-consuming, unpredictable trial and error, and the more subcomponents your application is distributed or sharded into, the harder (more time-consuming, more unpredictable, by orders of magnitude!) that trial and error gets.

But what if we’re the Next Big Thing?

Most sites are relatively small and relatively stable and will do quite well on a properly-sized VM with a well-maintained code base and a standard SQL server. Minimizing your technological requirements to those that are necessary to solve the problems at hand allows you to focus on your business priorities, leaving the complexity associated with containerization and the maintenance of external stateful information to a future iteration.

Leave the “how are we going to scale?” problem until you actually get there, and you increase the chances that it will eventually be the problem you have.

The article Kubernetes Won’t Save You first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Jan 27 2021

It has become increasingly common to find a link to a Statement of Accessibility in the footer of many websites. In a few cases you will find sites that link to both a VPAT (Voluntary Product Accessibility Template) and a Statement of Accessibility. What are these documents? What is their purpose, and should your site have one or both?

VPAT vs. Statement of Accessibility

A Statement of Accessibility is a document that defines the current state of accessibility for a website. It provides an area where the site owner can let a user know they are working on the accessibility of their site and provide a method for the user to contact the site owner regarding accessibility issues.

A VPAT explains how a website, service or product meets the Revised 508 Standards, which refers to the law that requires that the federal government procure, create and maintain technology that is accessible, regardless of whether a particular site is actually a federal government site.

A Statement of Accessibility is a general statement on a site's accessibility and a declaration that the owner of the site is working to remediate any identified inaccessible features. A VPAT specifically notes any accessibility issues within a site as they relate to WCAG, Section 508 or even European accessibility guidelines. 

The Statement of Accessibility basically says "we are working on our accessibility and here is a way to contact us with questions," whereas the VPAT lists all of the Web Content Accessibility Guidelines and notes, point by point, whether the site is in compliance.

Often organizations are asked for their VPAT if they are receiving funds from or working with the federal government in any capacity. It is a Federal Government requirement to have a VPAT as part of the accessibility process. Creating this document can be time consuming and requires a full accessibility audit.

Demonstration of Due Diligence

While these documents are similar in subject matter, they are different in purpose. The Statement of Accessibility demonstrates to users that you care about accessibility and the needs of those who require assistive technologies to access your content. It helps provide the user with information about the accessibility of content and demonstrates a commitment to accessibility and to the community the website serves.

The purpose of a VPAT can actually be more of a requirement than a voluntary statement. The VPAT is required for any business or service that falls under Section 508 of the Rehabilitation Act, primarily those that do business with the federal government or receive government funds. The VPAT is intended to communicate to the wider procurement community the accessibility level, or degree of conformance, of the website, service, or product.

So should you add a Statement of Accessibility and a VPAT to your site? While neither of these documents is guaranteed to protect your organization from legal action, they do help show that your organization is aware of any accessibility issues and is working to resolve them. It is appropriate to add a VPAT if there is any chance your organization will be subject to Section 508 regulations.

VPATs can take considerable time to create and if the need is there, it is advisable to have one in place. 

Looking for clarification concerning whether your website requires a Statement of Accessibility and/or a VPAT? We’re happy to help and if necessary, move forward with pursuing this documentation. Contact us today.

Jan 27 2021

I recently recorded a video tutorial series about progressive Drupal decoupling. In this series I take two of the official React app examples and turn them into widgets. Your Drupal editorial team can then embed those React applications (a calculator and an emoji selector) as blocks in a page, as a field in a content type, as an embedded entity in the body field using the WYSIWYG, …

#1 Embed Any JavaScript Application

In this first video in the series we take one of the official examples from React and turn it into a widget ready to be embedded in Drupal (or anywhere else).


  1. Create new repository from template.
  2. Migrate source to new repo.
    • Copy new source files.
    • Adapt index.js (including render function).
    • Combine package.json.
    • Find & replace «widget-example».
    • Remove / add specific features.
  3. Reformat and execute tests.
  4. Execute locally.
  5. Deploy application.
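The "adapt index.js" step above is the key one: instead of mounting itself on a hard-coded #root element at page load, the widget exposes a render function that the host CMS calls with the DOM element it owns. Here is a minimal sketch of that idea; the function name and settings shape are illustrative, not the registry's actual API.

```javascript
// Hypothetical sketch of the "adapt index.js" step: the widget exports a
// render function the host calls with the element it controls, instead of
// mounting itself on a hard-coded #root element at load time.
function renderWidgetExample(element, settings = {}) {
  // A real widget would call ReactDOM.render(<App {...settings} />, element);
  // this stub just writes markup so the idea is visible outside a browser.
  element.innerHTML = `<div class="widget-example">${settings.title || "Widget"}</div>`;
  return element;
}

// Simulate the host handing over an element (a plain object stands in for a DOM node).
const host = { innerHTML: "" };
renderWidgetExample(host, { title: "Calculator" });
```

Because the host controls the element and the settings, the same bundle can be embedded as a block, a field formatter, or an entity embed without changes.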

#2 The Registry & the App Catalog

The widget registry is the place where you aggregate your widgets (and other people’s widgets you want to use) to make them discoverable to Drupal and other CMS integrations.

This piece plays a fundamental role in the governance of your project(s). You can choose to have a single registry for all your Drupal installations, or one registry per project. You can use pull requests to gatekeep which versions are added to the registry and who can publish them. The idea is that the owner of the widget-registry project has the authority to accept PRs that add or update widgets so they become available in the registry (and therefore in Drupal).


#3 Set up Progressive Decoupled Drupal

In this video we will learn how to connect Drupal and the widget registry to let editors embed JS applications all over Drupal (that includes support for i18n!).

You can, for instance, embed JS applications as blocks, as a field for a content type, in the body field as an entity embed, …


Photo by Shifaaz shamoon on Unsplash

Jan 26 2021

In this walk-through I show my preferred setup for SPAs with Svelte, TypeScript and Tailwind.


For the very impatient among us:

npx degit munxar/svelte-template my-svelte-project
cd my-svelte-project
npm i
npm run dev



In this article I'll give you some insights into how I set up Svelte with TypeScript and style components with Tailwind. There are plenty of articles around, but I found a lot of them overcomplicate things or don't fit my requirements.

So here are my goals for the setup:

  • stay as close to the default template as possible, to make updates easy
  • production build should only generate css that is used
  • use TypeScript wherever possible

What Do I Need?

You'll need at least some node version with npm on your machine. At the time of writing I have node version 15.6.0 and npm version 7.4.0 installed.

node -v && npm -v

Install the Svelte Default Template

To set up Svelte I open a terminal and use the command from the official Svelte homepage. TypeScript support has already been added to this template, so nothing special here.

npx degit sveltejs/template my-svelte-project
# or download and extract 
cd my-svelte-project

Enable TypeScript

# enable typescript support
node scripts/setupTypeScript.js

At this point I check that the setup works by installing all dependencies and starting the development server.

# install npm dependencies
npm i
# run dev server
npm run dev

If everything worked so far, pointing my browser at http://localhost:5000 displays a friendly HELLO WORLD. Let's stop the development server by hitting ctrl-c in the terminal.

Install Tailwind

Back in the Terminal I add Tailwind as described in their documentation.

npm install -D tailwindcss postcss

After this step I generate a default tailwind.config.js file with

npx tailwindcss init

If you prefer a full Tailwind config, use the --full argument:

npx tailwindcss init --full

See the Tailwind documentation for more info about this topic.

Configure Rollup to use Postcss

The default Svelte template uses Rollup as its bundler. When I ran scripts/setupTypeScript.js in the first setup step, the famous svelte-preprocess plugin was already integrated into the Rollup setup. The only thing left is to pass the postcss config as options to the svelte-preprocess plugin. Here are the changes I make in rollup.config.js:

// rollup.config.js (partial)
export default {
  plugins: [
    svelte({
      preprocess: sveltePreprocess({
        postcss: {
          plugins: [require("tailwindcss")],
        },
      }),
      // ...other svelte options from the template
    }),
    // ...
  ],
};
At this point Rollup should trigger postcss and therefore the Tailwind plugin. To enable it in my application, I still need one important step.

Adding a Tailwind Component to the App

Now it's time to create a Svelte component that contains the postcss to generate all the classes. I call mine Tailwind.svelte but the name doesn't really matter.

// src/Tailwind.svelte
<style global lang="postcss">
  @tailwind base;
  @tailwind components;
  @tailwind utilities;
</style>

Some things to note here:

  • The component only has a single style element with no markup.
  • The attribute global tells the svelte-preprocess plugin to not scope the css to this component. Remember, by default Svelte scopes all css to the component it was declared in; in this case I don't want that.
  • The lang="postcss" attribute is telling svelte-preprocess to use postcss for the content. As a goody, some IDE extensions now display the content with the correct syntax highlighting for postcss.

Now use the Tailwind component in src/App.svelte

// src/App.svelte
<script lang="ts">
  import Tailwind from "./Tailwind.svelte";
</script>

<Tailwind />

<div class="m-8 rounded bg-gray-200 p-8 text-2xl">Hello Tailwind!</div>

Now my browser displays a Tailwind styled div. Very nice!
Let's clean up public/index.html: remove the global.css link tag, and delete the corresponding public/global.css file since I don't use it.

Let's finish the setup for production builds. Right now it's perfect for development: I can use any Tailwind class, and except for the first start of the development server, where all the Tailwind classes get generated, rebuilds feel very snappy.

Production Builds


When it comes to production builds, I have not configured anything yet, so I'll get a bundle.css with all Tailwind classes. I don't want that for a production build, so I modify tailwind.config.js to use its integrated purgecss for that purpose.

// tailwind.config.js
module.exports = {
  purge: ["src/**/*.svelte", "public/index.html"],
  darkMode: false, // or 'media' or 'class'
  theme: {
    extend: {},
  },
  variants: {
    extend: {},
  },
  plugins: [],
};
With this modification Tailwind removes all classes that are not used in .svelte files or in public/index.html. I added public/index.html because sometimes I add containers or responsive design utilities directly on the <body> tag. If you don't need this, you can remove index.html from the purge list, or add additional files I haven't listed here. For example: if I use plugins that contain .js, .ts, .html, ... files that use Tailwind classes, I would add them to this purge array too.
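To build an intuition for what the purge step does, here is a minimal sketch of purge-style extraction; this is not Tailwind's exact implementation, just the general technique of scanning content for class-like tokens with a broad regex and keeping only the generated classes that actually appear.

```javascript
// Minimal sketch of purge-style extraction (not Tailwind's exact code):
// pull class-like tokens out of the source content with a broad regex.
const defaultExtractor = (content) =>
  content.match(/[^<>"'`\s]*[^<>"'`\s:]/g) || [];

const markup = `<div class="bg-gray-200 p-4">Hello</div>`;
const used = new Set(defaultExtractor(markup));

// Pretend these three utilities were generated; only two survive the purge.
const generated = ["bg-gray-200", "p-4", "text-red-500"];
const kept = generated.filter((cls) => used.has(cls));
```

This is also why dynamically concatenated class names (e.g. `"text-" + color`) get purged: the full class string never appears in the scanned content.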

There is one little detail about the Tailwind purge: it is only executed if NODE_ENV=production, which makes sense. I set this environment variable directly in my package.json scripts:

// package.json (partial)
  "scripts": {
      "build": "NODE_ENV=production rollup -c",

With these settings my bundle.css only contains the Tailwind classes I really use, plus the mandatory css reset code that Tailwind provides.


One last thing to add for production is vendor prefixes. I usually go with the defaults and just add autoprefixer as postcss plugin. If you need more control, add configuration as you please.

Install autoprefixer with npm:

npm i -D autoprefixer

Add it as postcss plugin in rollup.config.js:

// rollup.config.js (partial)
  preprocess: sveltePreprocess({
    postcss: {
      plugins: [require("tailwindcss"), require("autoprefixer")],
    },
  }),

That's it.

Features of this Setup

Tailwind Classes

I can apply every Tailwind class to every HTML element, even in the index.html template.

Tailwind @apply

Additionally I can use @apply inside a style tag of a Svelte component like this (the utility classes here are just an example):

<style lang="postcss">
  button {
    @apply rounded bg-blue-500 px-4 py-2 text-white;
  }
</style>

This will generate a class scoped to the button of this component. The important part here is the attribute lang="postcss"; without it, postcss would not process the content of the style tag.

Typesafe Components

Let's implement a simple logo component with an attribute name of type string and a default value of "Logo".

// src/Logo.svelte
<script lang="ts">
  export let name: string = "Logo";
</script>

<span>{name}</span>

When I use this component, the Svelte language service of my IDE (Visual Studio Code) will yell at me if I try to pass something as the name attribute that is not of type string.


If you have an IDE that supports the Svelte language service, you get all the intellisense you would expect inside your editor. I use Visual Studio Code with the very good svelte.svelte-vscode extension.


I demonstrated how easy it is to set up a Svelte project with the default template, enable TypeScript, and add production-ready Tailwind support.

I hope you find some helpful information and write some amazing apps!

The source code is available at: https://github.com/munxar/svelte-template


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
