Oct 10 2018

by David Snopek on October 10, 2018 - 12:40pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Critical security release for the Lightbox2 module to fix a Cross Site Scripting (XSS) vulnerability.

The Lightbox2 module enables you to overlay images on the current page.

The module did not sanitize some inputs when used in combination with a custom View, leading to potential XSS.

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch.

If you have a Drupal 6 site using the Lightbox2 module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Oct 10 2018

Everyone has their own preference when it comes to different genres of music, ranging from alternative rock and post-rock to trap rap and drone metal. Website traffic skyrockets when hordes of music lovers remain glued to their screens to watch their favourite artists receive awards at the annual Grammy Awards. As a scalable, high-performing platform, the Grammy website has coped with these huge spikes in traffic with ease.


Website scalability is a crucial aspect with a huge say in the performance of digital firms. Drupal 8 and its capabilities lend themselves to better web performance. Being highly scalable, Drupal 8 helps in building web applications that are unfazed by colossal spikes in internet traffic. Grammy, powered by Drupal, has exploited the scalability features of Drupal to a great extent. Let's find out how.

The Significance of Website Scalability

58% of application/site owners overestimate their capacity levels: State of Web Readiness Report by Load Impact

A report from Load Impact has some interesting statistics on web performance.

  • 68% of respondents have encountered performance or stability issues with their website, with the main reason being a lack of resources.
  • 39% of e-retailers claimed that they incurred a financial loss due to performance or stability problems, compared to just 24% of non-e-retailers.
  • 98% of e-retailers believed that a sub-two-second response time was essential. Yet the load time for the e-commerce sites analysed in this report surpassed that desirable threshold by a huge margin: the average was 7.9 seconds, higher than that of the non-e-commerce sites.

Typically, 90% of your website’s response time comes from frontend resources, but this shifts as your load and traffic grow.

A graph with two curves illustrating frontend and backend load time

Questions that you can raise while determining ways of scaling:

  • Is your website or application performing acceptably with a minimum number of users on the system?
  • Does the website or application still respond quickly with additional concurrent users? And if so, how many?
  • Are there hardware bottlenecks in the present system?
  • Is the website or application able to handle the increasing number of users/data?
  • What is the maximum amount of load the system is able to handle?

Is Drupal Scalable?

Drupal is a highly scalable CMS. If you want your site to grow and be amongst the busiest of online spaces, Drupal can scale with your needs. And even if you are already popular and offer a stupendous digital experience to your customers, Drupal can scale to cope with the gargantuan amount of traffic to your site's content.

Large enterprises love Drupal with the digital presence of big names like Grammy, Weather.com, Pfizer, Time Inc., Tesla, Puma, Princess Cruises, and many more powered by Drupal. Whether it is the extreme traffic spikes on certain occasions or the constant web traffic, Drupal handles all of that with utmost ease.

Drupal accommodates content growth and supports the most content-rich sites and experiences. It scales to govern more traffic, content, and users. So whether you have one content contributor or over a thousand, Drupal can efficaciously cope with a monumental number of visitors, pieces of content, and Drupal users.

Optimising Drupal Performance and Scalability

A superfast website significantly contributes to improved user experience, usability and engagement. Drupal 8 is one of the most efficient CMSes for enabling blistering page speed.


Module management

Outdated modules can undermine your efforts to speed up your website. Keeping every module enabled on your Drupal site up to date is beneficial.
It is also important to keep track of which modules are used and which are no longer needed. The number of Drupal modules installed on a site is directly proportional to the time taken for code execution, which in turn affects page load time. Hence, uninstalling unwanted modules can improve execution time.
Furthermore, modules that are merely disabled still add to the code's execution time, so completely removing unused modules by uninstalling them will make the site faster.

Cache optimisation

Caching is an important feature that you can configure for enhancing your website speed. For advanced caching, Drupal comes with a great set of modules:

  • The Internal Page Cache module assists in caching web pages for anonymous users to enhance the speed for subsequent visitors.
  • The Dynamic Page Cache module caches web pages for both anonymous and authenticated users.
  • The BigPipe module lets users swiftly see the unaltered, cacheable page elements first, while the personalised content is displayed next.
  • The Redis module helps in integrating with the Redis key-value store, thereby offering a tremendous cache system for static pages.
  • The Varnish module allows you to integrate Drupal sites with an advanced and fast reverse-proxy system, Varnish Cache, to serve static files and anonymous page views faster and at high volumes.
  • The Memcache API and Integration module integrates Drupal sites with Memcached, which stores your data in active memory for a limited period of time, making the site faster to access.
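Most of these backends are wired up in settings.php. As an illustrative sketch, assuming the contributed Redis module and the PhpRedis PHP extension are installed (the host and port values are placeholders for your environment):

```php
// settings.php - point Drupal 8's cache bins at Redis.
$settings['redis.connection']['interface'] = 'PhpRedis';   // client library
$settings['redis.connection']['host'] = '127.0.0.1';       // your Redis host
$settings['redis.connection']['port'] = 6379;              // default Redis port

// Use Redis as the default backend for all cache bins.
$settings['cache']['default'] = 'cache.backend.redis';

// Keep the form cache in the database, since form state should
// persist even if Redis is flushed.
$settings['cache']['bins']['form'] = 'cache.backend.database';
```

With this in place, cache bins such as render and page are stored in Redis's active memory rather than the database.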

Incorporation of Content Delivery Network (CDN)

The CDN Drupal module assists in integrating a Content Delivery Network with your website. It alters file URLs so that files like CSS, JavaScript, images, videos, and fonts are downloaded from the CDN instead of the web server. This helps reduce page load time and rapidly delivers web page components.

Usage of lazy loading

In traditional websites, all the images and content are preloaded into the web browser when someone accesses the site. Lazy loading instead loads these elements only when a user scrolls to view them. The Blazy Drupal module offers lazy-loading functionality and multi-serves images to save bandwidth and server requests.

Image optimisation

Drupal 8 comes with an image optimisation feature for setting the compression ratio of images and fine-tuning page performance. The size of the images served for different screen sizes can also be optimised in Drupal 8, which increases page load speed.

Bandwidth optimisation

Optimising bandwidth refers to aggregating all CSS and JavaScript files so that they load together, ensuring that all the page elements can be seen by users almost immediately.
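Both options live in core's system.performance configuration (Administration » Configuration » Development » Performance). As a sketch, they can also be forced on in settings.php so aggregation stays enabled across environments:

```php
// settings.php - force CSS and JavaScript aggregation on.
$config['system.performance']['css']['preprocess'] = TRUE;
$config['system.performance']['js']['preprocess'] = TRUE;
```

Overriding the values in settings.php prevents them from being accidentally switched off through the admin UI.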

404 error management

When something on the site breaks and causes a 404 error, the result can be sluggishness; a missing image, for instance, can hamper the performance of the whole site. The Fast 404 Drupal module makes better use of resources by whitelisting file patterns and checking problem paths, returning lightweight 404 responses.
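Drupal 8 core also includes a basic fast 404 option for missing static assets; default.settings.php ships with commented-out examples along these lines (a sketch - the regular expressions are illustrative and should be tuned to your site):

```php
// settings.php - serve a tiny 404 response for missing static assets
// instead of rendering a full Drupal 404 page for each miss.
$config['system.performance']['fast_404']['exclude_paths'] = '/\/(?:styles)|(?:system\/files)\//';
$config['system.performance']['fast_404']['paths'] = '/\.(?:txt|png|gif|jpe?g|css|js|ico|swf|flv|cgi|bat|pl|dll|exe|asp)$/i';
$config['system.performance']['fast_404']['html'] = '<!DOCTYPE html><html><head><title>404 Not Found</title></head><body><h1>Not Found</h1></body></html>';
```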

Management of use of CSS and JavaScript

Avoiding the overuse of CSS and JavaScript files and adopting a minimalistic approach that keeps code to a minimum can improve performance. The Advanced CSS/JS Aggregation Drupal module can help you keep tabs on front-end performance by aggregating CSS and JavaScript files to improve speed.

Web hosting

While implementing every possible way of utilising Drupal’s powerful capabilities, it is of the utmost significance that you select the best hosting provider, as hosting will decide your site’s ultimate speed, stability and security.

Scaling the server

If your server hardware is nearing its limits and you have already optimised the site as much as possible, or you need to scale faster than you can optimise, you can upgrade the server hardware in the following ways:

  • Scaling vertically: This is the simplest way of scaling the hardware. It refers to throwing more resources at the same server. In a cloud data centre, it may be as simple as upgrading the server size for more CPU cores, memory, etc.
  • Scaling horizontally: This is a more intricate process than scaling vertically. It refers to adding more servers to share the load. When done right, this can hugely reduce the load any single server receives.
  • Considering multiple servers: If you have multiple app servers for Drupal, you will need a method of deploying code to each server simultaneously. PaaS platforms like platform.sh and pantheon.io can handle the complete hosting setup for you, but if you are doing it yourself, you will need something like an rsync setup or a git push to each of your servers.

Up above the world so high: Drupal’s scalability for NASA

Flowchart showing the architecture of a website redesign with relevant icons (Source: Drupal.org)

A digital agency migrated the website of NASA to the AWS cloud and onto Drupal to create a fully responsive, user-centric experience. Several AWS-based Drupal CMS solutions were implemented for NASA, serving a plethora of needs: from nasa.gov to the Science Mission Directorate’s science.nasa.gov (in both English and Spanish) to a multisite platform and governance model for numerous Drupal applications serving groups across different NASA centres.
Nasa.gov and all of its subdomain components were migrated and relaunched, which involved replacing a closed-source system. The site comprised more than 250,000 pages and almost 3 TB of content.

Drupal in action

Drupal compressed the complete development timeline and also saved a lot of money in the process. Building with Drupal on the Amazon cloud ensured that NASA’s content is stored safely and scales with content growth. With user-driven APIs, dynamic host provisioning, infinite compute scalability and storage, and a well-architected security architecture, Drupal and AWS together were the right fit.


Project highlights

  • Migration from a proprietary, on-premise CMS to an open source CMS in the Amazon cloud was performed without any service interruptions.
  • Mobile-first approach was employed to the redesign of the site.
  • Headless Drupal in AWS cloud environment was built with security, performance and availability in mind.


Statistics shown on the right side in four blocks and a tab screen on the left with the NASA homepage and a red paper bird on it (Source: Drupal.org)

The site is deployed in multiple AWS availability zones for redundancy, handling approximately 500 content editors performing over 2,000 content updates every day. On average, it receives nearly one million page views a day, and it has handled peak loads of over 40,000,000 page views in a single day, with a staggering, record-breaking 2,000,000+ concurrent users during NASA’s 2017 Total Solar Eclipse coverage.


Website scalability and performance are significant aspects that determine how well a digital business does during the busiest of times. Whether internet users throng your website and cause a sudden spike in traffic, or you are managing a popular service that sees constant web traffic, Drupal scales with your needs.
We have been steadfast in our objective of offering an amazing digital experience through a suite of services.
Contact us at [email protected] to build a highly scalable Drupal site for your business.

Oct 10 2018

You have a piece of HTML code that needs to be included in multiple Twig templates. The code is identical in all of them, so if you need to change it, you currently have to change it everywhere it is used.

As an example, I recently had to include a piece of code from a social share service called RhythmOne. I want to include this code in some, but not all, of the node templates.

A solution

Twig has the ability to add partial templates to another template. To add a partial template, use the include statement and the filename of the template.

Step by step

  1. In your theme, create a directory inside the templates directory called includes
  2. In the includes directory, create a Twig file for your partial template. For example, rhythmone.html.twig
  3. In the file you created in step two, add any code you need
  4. Go to the Twig template where you want to include the partial template. For example, a node template
  5. Include the partial template with:
 {% include 'rhythmone.html.twig' %}

  6. Repeat for any other templates where you need the same partial template

Drupal will automatically look in the templates/includes directory in your theme.

Including templates from another theme or module

If you need to include a template from another theme or module, the syntax is slightly different.

If the partial template that you are including is stored anywhere other than the theme you are including it from, Drupal would not know where to find it with:

 {% include 'rhythmone.html.twig' %}

You could add the full path to the template, but that can be long winded.

To get around this, Twig has a concept called namespaces. This is very similar to PHP namespaces in modules. A namespace creates a shortcut to the relevant template directory.

Modules and themes in Drupal each have a Twig namespace that you can use. The namespace is a shortcut to the templates directory inside the relevant module or theme directory.

Let’s look at an example. If you have a partial template in a module called mypages, then you can include it in a template in a different module or theme with:

 {% include '@mypages/includes/rhythmone.html.twig' %}

The @mypages part of the above code is a Twig namespace. It is made up of:

  • @: the namespace prefix
  • mypages: the name of the theme or module

The path after the namespace (@mypages) does not need “templates” because Drupal assumes that that is where your templates are stored.

One of the big advantages of using partial templates like this is that if you need to update the code, you only need to change it in one place.

Oct 10 2018

The Main Menu on your website is the first thing that catches a visitor’s attention. Besides, it’s one of the most important elements for assisting the user in effortless and intuitive navigation. In this article we are going to talk about how to design a Drupal drop-down menu with Glazed Theme.

Before diving into customizing our Main Menu, learn How to Create Drop-down menus with Drupal 8 and Glazed Theme.

To start customizing our menu design we need to go to the Glazed Theme Settings page. Amongst the dozens of options that are directly responsible for every element of your website design and how the end user sees it, we are going to find Header & Main Menu - the place where we can change how our menu looks.

Header & Main Menu

When navigating to the Header & Main Menu, the first thing that will come to our attention is the “Top Header Options”.
Clicking it will open an entire new world of possibilities. Here we can choose 1 of the 7 layouts that come pre-installed with Glazed Builder - a layout for every taste! Depending on what you want for your website, you can pick between having your logo on the left and the menu on the right, which is the default layout that you are already familiar with (spoiler: the one on this website), having everything beautifully aligned in the center, placing the menu on the left, or several others.

The header style is probably one of the most important settings here and will determine the overall look and feel of your website (will it be more minimalistic, or will it have a “heavier” look to it?). We are able to choose between 3 general types of header styles: Normal, Overlay or Pull-down Navbar.

You can see examples of each header style and header layouts on the Glazed Theme Live Demo Page.

Each one of these options looks great when implemented, but they are going to need some further refinements to look exactly how you imagined. This brings us to our next element: Height. Choosing the height value will determine how tall or short our main menu will be. Once we’ve got our perfect height setting, we will move on to deciding whether or not we would like our menu to have a fixed position. A fixed header stays at the top of the browser window when a user scrolls.

Behavior of the Drupal Drop-Down Menu

If we decide that we want our menu to stick to the top of the browser when people scroll, another option will pop up asking whether we would also like a sticky header - which basically means that the menu will appear only after the user scrolls past a certain point on your page, and only then will it stick to the top of his or her window. This is determined by the scroll offset, height, and background opacity values that you decide on.
Congrats! You’re past all the technical aspects of your gorgeous menu bar. (Well, not really, but for now we are moving on to the fun part: choosing the colors.)

Customize Navigation Colors

Here is the place where it all comes together. You can choose colors from your website’s color scheme (a custom one you made or the Glazed default) or add individual values to each setting’s custom color. In this menu we can choose the colors for every aspect of the navigation menu: text color, drop-down background, menu hover text, etc.

After we are happy with the result we can move on to adding the final touches to our menu design.

Side Header & Mobile Menu Options

In this area you can choose how the layout looks on mobile devices and in your side header (if you chose to opt for one in the beginning). You can determine elements such as content alignment (left/center/right) and menu bar width.

The Main Menu Link section is for determining the font style and hover style of our Main Menu (for both the website and mobile versions).
Last but not least, by going to the Mobile Header menu we determine the mobile breakpoint and the height of our menu in order to make it even more mobile-friendly.

We have designed a brand new Main Menu for our website which will be enjoyed by the visitors and help them easily navigate to the information they are looking for!

We have also created some Main Menu designs along with this guide so you don’t get bored doing it all by yourself. We’ll list each setting’s values below. If you like any of them, feel free to recreate them or use them as inspiration for your future eye-catching Main Menu.


Oct 10 2018

Agiledrop is highlighting active Drupal community members through a series of interviews. Learn who the people behind Drupal projects are.

This week we talked with Jonathan Hedstrom. Read about what he thinks has been the biggest evolution for Drupal, which contribution he is proudest of, and what he thinks is most important about Drupal today.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I’m an active core contributor and the core maintainer of the Datetime module. I’ve been a member and sometimes a leader of the Portland Drupal User Group for the last decade. I maintain dozens of contributed modules (and contribute patches/fixes/features to many dozens more). I am a co-maintainer of Drush and the project lead of the Drupal Behat Extension. I am currently a Software Architect at Phase2.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I discovered Drupal in 2005 for my personal website. I converted it from a static HTML site to use Drupal 4.4. I started working professionally with Drupal when I moved to Portland in 2007 to be the lead developer at OpenSourcery. My first DrupalCon, in Hungary, really exposed me to the international community, and that continued engagement has kept me going. The warm welcome I was given by the local community made it simple to stay connected to the broader Drupal community too.

3. What impact has Drupal made on you? Is there a particular moment you remember?

Drupal has provided the basis for much of my career, and many of my life-long friends have been made through the community, both locally and internationally.

4. How do you explain what Drupal is to other, non-Drupal people?

I usually describe Drupal as Legos for non-Drupal folks. It has all the pieces to do just about anything one can imagine, but there are some steps between dumping the pile of pieces out on the floor to a complete and functional thing.

5. How have you seen Drupal evolve over the years? What do you think the future will bring?

I think the adoption of modern coding practices, and embracing third-party libraries have been the biggest evolution I’ve seen for Drupal. It has let Drupal focus on the things that it wants to do better, and not have to re-invent (and maintain) the little pieces that are already done.

6. What are some of the contributions to open source code or community that you are most proud of?

It’s always fun to get commits into a new project. I most recently had my first commit to Symfony accepted. One of my proudest commits to Drupal 8 would be the views integration for the Datetime module. Also, I was heavily involved in resolving many of the critical blockers for Drupal 8 to support an upgrade path from beta to beta, and am quite proud of helping that effort, which eventually resulted in the release of Drupal 8.0.0. I’m also proud of the time I spent as a leader of the Portland Drupal User Group. In that time I helped organize several camps, two Pacific Northwest Drupal Summits, and helped mentor quite a few folks.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

The current effort to more fully embrace Composer is really exciting and important, I think. Also, the move to completely get off of the legacy SimpleTest framework in the core is important, and probably less visible than some of the more exciting initiatives.

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor.

The increasing focus on decoupled architecture and all the technology (React, GraphQL, etc.) that goes along with that is really fun, and it’s bringing a lot of new faces into the community, I think.  

Oct 09 2018

This is the eighth and last installment in a series presenting work on shared configuration that comes out of the Drutopia initiative and related efforts. If you've stuck with it this far, wow! If you haven't seen previous installments, you might want to start with Part 1, Configuration Providers.

The series focus has been meeting the needs of distributions for creating and updating packages of shared configuration, a use case not addressed in Drupal 8 core.

We've seen how configuration can be packaged using the Features module (Part 6). We've covered the ins and outs of updating packages of configuration, including how to assemble configuration as provided by extensions (Part 1), make needed changes to configuration from another package (Part 4), capture a previous configuration state (Part 2), and merge in configuration updates (Part 5). Along the way we've examined the particular cases of preserving configuration customizations (Part 3), working with core configuration required by multiple features (Part 7), and (bonus!) managing blocks when subthemes are in the mix.

The fact we're on installment eight - not even including the sneak bonus episode - in itself suggests a general conclusion: addressing these needs is not simple!

Working with Drupal core

In this series we've seen challenges or regressions when it comes to managing shared configuration in Drupal 8. In Part 1 we noted an issue with core code related to optional configuration. In Part 4 we saw how previously well supported requirements for shared configuration such as adding permissions to site-wide user roles or adding fields to existing content types now require dauntingly complex configuration alters.

These regressions are closely linked to Drupal 8's focus on the configuration staging use case. Progress on these fronts was not helped by what I, at least, experienced as an often boosterist environment in the Drupal 8 cycle in which critical perspectives on use cases that were and were not being addressed were sometimes drowned out. In a parallel Drupal 8 universe in which the use case of shared configuration was fully on par with staging configuration on a single site, solutions to much or all of the problems reviewed in this series might have been part of Drupal core.

At the same time, we've seen various ways that Drupal 8 core, including the configuration management system, helps with the task of filling in missing configuration management requirements. In Part 2 we saw how the configuration snapshot used in core's configuration staging can be a model for extension configuration snapshots, and also how core's LanguageServiceProvider is a useful pointer for how to provide dynamic services for configuration snapshotting.

Then there are key developments in the contributed modules space, especially Config Filter. By modelling how to extend and customize core's configuration staging workflow, Config Filter also opened the way to a parallel workflow for merging in configuration updates from extensions. Again, core's configuration management provides an essential base here. The Config Distro module takes everything that core's configuration staging API provides and, with a few small changes, repurposes it for a shared configuration use case. True, the result is only a framework - hence the many other pieces covered in this series - but the work gets a leg up from core.

CMI 2.0

The Drupal core Configuration Management 2.0 Initiative (CMI 2.0) aims to address "common workflows [that] are not natively supported" and fill in "missing functionality" in core's APIs. For the work covered in this series, major additions to the configuration management API could be a big plus.

But they'll also have their tradeoffs. As a small hint of what may be coming, one of the first CMI 2.0 improvements led to breakage in the Config Distro module. A current big patch in the queue, Add configuration transformer API, looks like a great improvement, but since it would supersede the Config Filter module and our whole approach is built on Config Filter . . . . Yeah, a bit of work there. On the plus side, a compatibility layer between Config Filter and the configuration transformer API could mean it would all "just work" in the interim. Or, more likely, continue to break, but for other reasons ;)


Drutopia is a small initiative. In contributing to configuration management improvements we've been able to lean on contributions from many others including Config Filter and Config Distro maintainer Fabian Bircher, Features and Config Actions maintainer Mike Potter, Configuration Update Manager maintainer Jennifer Hodgdon, and the core configuration work by Alex Pott and many more.

Still, to get from where we are to a polished and solid solution set, we'll need a lot of help. Most of the modules we've contributed are still in alpha or beta. We hope that others will see value in the start we've made and continue to adopt and adapt the tools, making them their own. Recent help with automated testing is particularly welcome.

Beyond shared configuration

In the Drutopia initiative, configuration management is only a small part of our focus. Looking to emerging models of platform cooperativism, we're building a cooperative where members will own, shape, and use the software distributions that drive their sites. The first such distribution is Drutopia, nearing its beta release.

We welcome new participants and contributors. Interested? See our guide to contributing and the many ways to get in touch.

Next steps

This series has had a decidedly technical bent. But how do you actually use the stuff?

Good question. We're working on a follow-up on how to put it all together, including:

  • How to build a distribution using Config Distro and Configuration Synchronizer.
  • How to update sites using Configuration Synchronizer.

Watch for it.

Oct 09 2018

Helping content creators make data-driven decisions with custom data dashboards

Our analytics dashboards help Mass.gov content authors make data-driven decisions to improve their content. All content has a purpose, and these tools help make sure each page on Mass.gov fulfills its purpose.

Before the dashboards were developed, performance data was scattered among multiple tools and databases, including Google Analytics, Siteimprove, and Superset. These required additional logins, permissions, and advanced understanding of how to interpret what you were seeing. Our dashboards take all of this data and compile it into something that’s focused and easy to understand.

We made the decision to embed dashboards directly into our content management system (CMS), so authors can simply click a tab when they’re editing content.

GIF showing how a content author navigates to the analytics dashboard in the Mass.gov CMS.

How we got here

The content performance team spent more than 8 months diving into web data and analytics to develop and test data-driven indicators. Over the testing period, we looked at a dozen different indicators, from pageviews and exit rates to scroll-depth and reading grade levels. We tested as many potential indicators as we could to see what was most useful. Fortunately, our data team helped us content folks through the process and provided valuable insight.

Love data? Check out our 2017 data and machine learning recap.

We chose a sample set of more than 100 of the most visited pages on Mass.gov. We made predictions about what certain indicators said about performance, and then made content changes to see how it impacted data related to each indicator.

We reached out to 5 partner agencies to help us validate the indicators we thought would be effective. These partners worked to implement our suggestions and we monitored how these changes affected the indicators. This led us to discover the nuances of creating a custom, yet scalable, scoring system.

Line chart showing test results validating user feedback data as a performance indicator.

For example, we learned that a number of indicators we were testing behaved differently depending on the type of page we were analyzing. It’s easy to tell if somebody completed the desired action on a transactional page by tracking their click to an off-site application. It’s much more difficult to know if a user got the information they were looking for when there’s no action to take. This is why we’re planning to continually explore, iterate on, and test indicators until we find the right recipe.

How the dashboards work

Using the strategies developed with our partners, we watched, and over time, saw the metrics move. At that point, we knew we had a formula that would work.

We rolled indicators up into 4 simple categories:

  • Findability — Is it easy for users to find a page?
  • Outcomes — If the page is transactional, are users taking the intended action? If the page is focused on directing users to other pages, are they following the right links?
  • Content quality — Does the page have any broken links? Is the content written at an appropriate reading level?
  • User satisfaction — How many people didn’t find what they were looking for?

Screenshot of dashboard results as they appear in the Mass.gov CMS.

Each category receives a score on a scale of 0–4. These scores are then averaged to produce an overall score. Scoring a 4 means a page is checking all the boxes and performing as expected, while a 0 means there are some improvements to be made to increase the page’s overall performance.

All dashboards include general recommendations on how authors can improve pages by category. If these suggestions aren’t enough to produce the boost they were looking for, authors can meet with a content strategist from Digital Services to dive deeper into their content and create a more nuanced strategy.

GIF showing how a user navigates to the “Improve Your Content” tab in a Mass.gov analytics dashboard.

Looking ahead

We realize we can’t totally measure everything through quantitative data, so these scores aren’t the be-all, end-all when it comes to measuring content performance. We’re a long way off from automating the work a good editor or content strategist can do.

Also, it’s important to note these dashboards are still in the beta phase. We’re fortunate to work with partner organizations who understand the bumps in the proverbial development road. There are bugs to work out and usability enhancements to make. As we learn more, we’ll continue to refine them. We plan to add dashboards to more content types each quarter, eventually offering a dashboard and specific recommendations for the 20+ content types in our CMS.

Oct 09 2018

I recently worked with the Mass.gov team to transition its development environment from Vagrant to Docker. We went with “vanilla Docker,” as opposed to one of the fine tools like DDev, Drupal VM, Docker4Drupal, etc. We are thankful to those teams for educating and showing us how to do Docker right. A big benefit of vanilla Docker is that skills learned there are generally applicable to any stack, not just LAMP+Drupal. We are super happy with how this environment turned out. We are especially proud of our MySQL Content Sync image — read on for details!

Pretty docks at Boston Harbor.

Docker compose

The heart of our environment is the docker-compose.yml. Here it is, then read on for a discussion about it.

Developers use .env files to customize aspects of their containers (e.g. VOLUME_FLAGS, PRIVATE_KEY, etc.). This built-in feature of Docker is very convenient. See our .env.example file:

MySQL content sync image

The most innovative part of our stack is the mysql container. The Mass.gov Drupal database is gigantic. We have tens of thousands of nodes and 500,000 revisions, each with an unholy number of paragraphs, reference fields, etc. Developers used to drush sql:sync the database from Prod as needed. The transfer and import took many minutes, and had some security risk in the event that sanitization failed on the developer’s machine. The question soon became, “how can we distribute a mysql database that’s already imported and sanitized?” It turns out that Docker is a great way to do just this.

Today, our mysql container builds on CircleCI every night. The build fetches, imports, and sanitizes our Prod database. Next, the build does:

That is, we commit and push the refreshed image to a private repository on Docker Cloud. Our mysql image is 9GB uncompressed, but thanks to Docker, it compresses to 1GB. This image is really convenient to use. Developers fetch a newer image with docker-compose pull mysql. Developers can work on a PR and then, when switching to a new PR, do a simple ahoy down && ahoy up. This quickly restores the local Drupal database to a pristine state.
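The nightly refresh and the developer workflow described above can be sketched roughly as follows. Note that the image name, tag, and container name here are illustrative assumptions, not our actual configuration:

```shell
# Nightly CI (sketch): snapshot a container whose MySQL data directory
# already holds the imported, sanitized Prod database, then publish it.
docker commit mysql_import example/massgov-mysql:latest
docker push example/massgov-mysql:latest

# Developer machine: fetch the refreshed image...
docker-compose pull mysql

# ...and recreate the containers to restore a pristine local database.
ahoy down && ahoy up
```

Because the data lives inside the image rather than in a volume, pulling a new image and recreating the container is all it takes to reset the database.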

In order for this to work, you have to store MySQL data *inside* the container, instead of using a Docker Volume. Here is the Dockerfile for the mysql image.

Drupal image

Our Drupal container is open source — you can see exactly how it’s built. We start from the official PHP image, then add PHP extensions, Apache config, etc.

An interesting innovation in this container is the use of Docker Secrets in order to safely share an SSH key from host to the container. See this answer and mass_id_rsa in the docker-compose.yml above. Also note the two files below which are mounted into the container:

  • Configure SSH to use the secrets file as private key
  • Automatically run ssh-add when logging into the container


Traefik

Traefik is a “cloud edge router” that integrates really well with docker-compose. Just add one or two labels to a service and its web site is served through Traefik. We use Traefik to provide nice local URLs for each of our services (www.mass.local, portainer.mass.local, mailhog.mass.local, …). Without Traefik, all these services would usually live at the same URL with differing ports.
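As a sketch, the “one or two labels” might look like this in a docker-compose service definition. This uses Traefik 1.x label syntax; the service name and hostname are illustrative, not necessarily our exact configuration:

```yaml
services:
  drupal:
    labels:
      - "traefik.enable=true"
      - "traefik.frontend.rule=Host:www.mass.local"
      - "traefik.port=80"
```

Traefik watches the Docker socket, notices the labels, and starts routing www.mass.local to the container automatically.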

In the future, we hope to upgrade our local sites to SSL. Traefik makes this easy as it can terminate SSL. No web server fiddling required.

Ahoy aliases

Our repository features a .ahoy.yml file that defines helpful aliases (see below). In order to use these aliases, developers download Ahoy to their host machine. This helps us match one of the main attractions of tools like DDev/Lando — their brief and useful CLI commands. Ahoy is a convenience feature and developers who prefer to use docker-compose (or their own bash aliases) are free to do so.
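For illustration, a minimal .ahoy.yml might look like the following. These command names are hypothetical, not necessarily the ones in the Mass.gov repository:

```yaml
ahoyapi: v2
commands:
  up:
    usage: Start the local environment
    cmd: docker-compose up -d
  down:
    usage: Stop and remove the containers
    cmd: docker-compose down
  drush:
    usage: Run a drush command inside the Drupal container
    cmd: docker-compose exec drupal drush "$@"
```

With a file like this in place, developers type ahoy up or ahoy drush cr instead of the longer docker-compose equivalents.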

Bells and whistles

Our development environment comes with 3 fine extras:

  • Blackfire is ready to go — just run ahoy blackfire [URL|DrushCommand] and you’ll get back a URL for the profiling report
  • Xdebug is easily enabled by setting the XDEBUG_ENABLE environment variable in a developer’s .env file. Once that’s in place, the PHP in the container will automatically connect to the host’s PHPStorm or other Xdebug client
  • A chrome-headless container is used by our test suite, which incorporates Drupal Test Traits — a new open source project we published. We will blog about DTT soon

Wish list

Of course, we are never satisfied. Here are a couple issues to tackle:

Oct 09 2018

A straightforward mission doesn’t always mean there’s a simple path. When we kicked off the Mass.gov redesign, we knew what we wanted to create: a site where users could find what they needed without having to know which agency or bureaucratic process to navigate. At DrupalCon Baltimore in 2017, we shared our experience with the first nine months of the project: building a pilot website with Drupal 8, getting our feet wet with human-centered (AKA “constituent-centric”) design, and beginning to transform Mass.gov into a data-driven product.

Oct 09 2018

Structured content is central to the content strategy for our new Drupal-based Mass.gov. So is the idea of a platform-agnostic design that can be reused across our diverse ecosystem of web applications, to encourage a consistent look, feel, and user experience. We’re not the only ones with these priorities for our new digital platform. So we can’t be the only ones wrestling with the unsolved authoring experience implications either. This blog post is an attempt to articulate the basic problem and share our best, latest idea for one possible solution. As the saying goes, ‘If you want to go fast, go alone. If you want to go far, go together.’ We want to go far. If you like our idea, or if you have other possible solutions or insights, please share them in the comments below, by tweeting at us @massgov, or by contacting me at @bryanhirsch. Let’s go together.


Drupal has made major strides toward improving the content authoring experience in recent years: Acquia’s Spark project in Drupal 7, Quick Edit in Drupal 8, and Edward Faulkner’s integrations with Ember. But (1) our vendors for Mass.gov have unanimously dissuaded us from leveraging these innovations in our new authoring experience, because with complex data models, making in-place editing work through the front end is costly and requires heavy customization. (2) This challenge is compounded by the desire to keep our design portable across diverse web properties, which means we want to keep Drupal’s Quick Edit markup out of our Pattern Lab-based component library. Additionally, (3) if we get structured content right, our Mass.gov content will be reused in applications we don’t control (like Google’s knowledge graph) and in an increasing number of government websites via APIs. It’s both cost prohibitive and practically impossible to create frontend authoring UIs for each of these applications. That said, if we move authoring away from the front-end in-place editing experience, (4) authors still need detailed context to visualize how different pieces of content will be used in order to write good content.

Here’s one possible solution we’re interested to explore that seems like it should address the four issues outlined above: Move high-fidelity live previews to the backend authoring experience, displayed alongside the content editing form as visualized in the designs below.

Visual design of possible UI for live preview of different components from the Pattern Lab-based Mayflower component library, incorporated into the backend Mass.gov authoring experience. View additional designs here.

By showing authors examples of how different pieces of content might be used and reused, we can give writers enough context to write meaningful structured content. But maybe we only need to give authors enough examples to have reasonable interactions with all the different content form fields. If this works, then we don’t have to try to keep up with an infinitely growing and changing landscape of possible new frontends. Pattern Kit and Netlify CMS both offer working, open-source examples of how this authoring experience could work.

Mass.gov is not a “decoupled Drupal” ecosystem yet. Our API strategy is in its infancy. We know other people with much more experience are thinking about the same issues. Maybe some are even writing code and actively working to develop solutions. If this is you, we’d love to hear from you, to know how you’re approaching these problems, and to learn from anything you’ve tried here.

Oct 09 2018

This is a guest blog by Lijo Abraham and Ali Fathima N. A. to tell you about a recent Global Training Days event in Kerala.

Group photo on stage at GTD event
Group photo by Sumesh S (sumesh_sr)

A Drupal Global Training Day (GTD) was held in Thiruvananthapuram, Kerala, India on September 29, welcoming 60 participants representing diverse sectors, including students from engineering colleges, software professionals, and government officials. The event created momentum to form the Drupal Community in Kerala.

The International Centre for Free and Open Source Software (ICFOSS)—an autonomous institution under the Government of Kerala that is mandated with the propagation, promotion, and development of Free and Open Source Software—sponsored the event. Zyxware Technologies provided the technical partnership for the GTD.

The GTD had eminent personalities addressing the participants and sharing their experiences. In the inaugural address, M. Sivasankar IAS, Secretary of the E&IT Department for the Government of Kerala, stressed the role played by the Kerala Government in enabling the technology ecosystem in India. 

Dr. Rajeev R. R., Program Head of ICFOSS, welcomed the participants. Thomas P. Thomas, CEO of Zyxware Technologies, offered an Introduction to Drupal, the Drupal Community, and the GTD.

Vimal Joseph, Senior Manager of Technology at Zyxware Technologies, presented a session on 'Fueling the Digital Transformation with Drupal'—which was followed by open questions and answers regarding Drupal. 

A case study on 'Multi-site Platform for a Government Agency' was presented by Mathew T. Abraham, a Project Manager at Zyxware Technologies. Participants interacted directly with the speakers.

Vimal Joseph speaks with Umami demo showing on screen
Fuelling Digital Transformation with Drupal, photo by Sudheesh S. Babu.

Presentations were followed by hands-on Drupal workshops. Drupal developers from Zyxware Technologies, namely Abhinand Gokhala, Sumesh S, Jijo Joseph, Sudheesh S. Babu, Jithin Prabhakaran, Sahal V. A., Jeslin Shaji, and Ali Fathima N. A., provided individual attention to the participants. Workshops led by Krishna R. P., Technical Project Manager, and Ajish J. Pulikottil, Technical Consultant, offered an introduction to Drupal, installation, and building a simple Drupal 8 application. Nearly a dozen staff from ICFOSS and Zyxware Technologies volunteered at the event as well.

“The workshop on Drupal has been very inspiring. I am feeling delighted to have been a part of this and will try to continue with the wave approach on society with this positive technique,” stated Aishwarya Soman Nair, a student at Saintgits College of Engineering.

Overall, participants’ feedback stated that this was a new, helpful opportunity to learn more about Drupal in detail. Participants were awarded certificates of participation.

“One of the best workshops I have attended. The training was inspiring, informative, and its method of delivery was so easy to receive. I am interested in forthcoming open source training also,” said Raveena R. Marangattu, a student at Saintgits College of Engineering.

You can be part of Global Training Days

Get involved with Global Training Days! Join the group and host an event this November 30-December 1.

Oct 09 2018

It was recently announced that 2020 will be the year Drupal 9 is officially released into the wild. The exact date hasn’t been set, but we can now look forward to the 9.0 release that year. The announcement also gave us an official End of Life date of November 2021 for Drupal 7 AND Drupal 8. So, what does this mean if you’re currently running or developing a site on one of those versions? In this post, I’ll explain.

What does this mean for Drupal 8?

Drupal 8 is built around a concept of continuous innovation, which means new features and backwards-compatible changes are continuously added. When an old system or API is deprecated, it stays in the codebase instead of being removed. This ensures that custom code and contributed modules continue to work and have time to update. Eventually, there will be an excess of deprecated code and dependencies that needs to be removed. That is one of the reasons for the release of Drupal 9: all that old stuff gets removed and we start fresh with the latest and greatest technology.

The great thing about Drupal 8 is that by the time Drupal 9 is released, all of the modules and custom code in your site should be up to date. Therefore, updating from 8 to 9 is no different than updating from 8.5 to 8.6. Clean and painless!

And that’s the point. This method of building and releasing versions will continue for the foreseeable future which is why we like to say that a migration to the latest Drupal will be the last migration you ever need.

What does this mean for Drupal 7?

Unfortunately, Drupal 7 is a different story. When Drupal 7 reaches end of life in November 2021, it will no longer be supported by the community at large. There are plans to release a Drupal 7 version that uses the latest version of PHP, and a paid support program is planned (similar to Drupal 6 LTS) that will allow people and organizations unable or unwilling to migrate to keep their sites secure. But really, your best course of action is to plan a migration to Drupal 8 by 2020. This keeps your site current and guarantees its security moving forward.

The codebase between 7 and 8 is entirely different, so a migration to Drupal 8 is a pretty big undertaking. You could call it replatforming. Drupal 8 does, however, include a built-in data migration tool that will make the move easier. You might still need some help, though, depending on your site requirements and edge cases. Plus, data is one thing, but you would also need to move your theme. The silver lining is that migrating presents an opportunity to freshen up the look of your site and increase site speed with the latest software. For more information on what is involved in a migration, check out this post.

Like I mentioned earlier in this post, a migration to Drupal 8 may well be the last migration you ever need, since subsequent major version updates (i.e. from 8 to 9) should be quick and easy. Once you’ve made that initial investment migrating to Drupal 8, you can rest assured that you won't have to go through that process again, possibly ever.

Migration experts

Acro Media is a Drupal agency specializing in eCommerce. We help build and maintain successful eCommerce websites as well as the underlying Drupal Commerce platform. We are also heavily involved in the development of Drupal’s migration tools. If you want to discuss what a migration might look like for your business, talk to us! We’re happy to help.

Contact us to discuss your migration!

Oct 09 2018

Drupal Modules: The One Percent — Login History (video tutorial)

[embedded content]

Episode 47

Here is where we bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll check out Login History, a module that logs details of user logins.

Oct 09 2018

Last week, I had a chance to meet with Inrupt, a startup founded by Sir Tim Berners-Lee, who is best known as the inventor of the World Wide Web. Inrupt is based in Boston, so their team stopped by the Acquia office to talk about the new company.

To learn more about Inrupt's founding, I recommend reading Tim Berners-Lee's blog or Inrupt's CEO John Bruce's announcement.

Inrupt is on an important mission

Inrupt's mission is to give individuals control over their own data. Today, a handful of large platform companies (such as Facebook) control the media and flow of information for a majority of internet users. These companies have profited from centralizing the Open Web and lack transparent data privacy policies on top of that. Inrupt's goal is not only to improve privacy and data ownership, but to take back power from these large platform companies.

Inrupt will leverage Solid, an open source, decentralized web platform that Tim and others have been developing at MIT. Solid gives users a choice of where their personal data is stored, how specific people and groups can access select elements, and which applications can use it. Inrupt is building a commercial ecosystem around Solid to fuel its success. If Solid and/or Inrupt are widely adopted, it could radically change the way web sites and web applications work today.

As an advocate for the Open Web, I'm excited to see how Inrupt's mission continues to evolve. I've been writing about the importance of the Open Web for many years and even proposed a solution that mirrors Solid, which I called a Personal Information Broker. For me, this is an especially exciting and important mission, and I'll be rooting for Inrupt's success.

My unsolicited advice: disrupt the digital marketing world

It was really interesting to have the Inrupt team visit the Acquia office, because we had the opportunity to discuss how their technology could be applied. I shared a suggestion to develop a killer application centered on "user-controlled personalization".

Understanding visitors' interests and preferences to deliver personalized experiences is a big business. Companies spend a lot of time and effort trying to scrape together information about their websites' visitors. However, behavior-based personalization can be slow and inaccurate. Marketers have to guess a visitor's intentions by observing their behavior; it can take a long time to build an accurate profile.

By integrating with a "Personal Information Broker" (PIB), marketers could get instant user profiles that would be accurate. When a user visits a site, they could choose to programmatically share some personal information (using a standard protocol and standard data schema). After a simple confirmation screen, the PIB could programmatically share that information and the site would instantly be optimized for the user. Instead of getting "cold leads" and trying to learn what each visitor is after, marketers could effectively get more "qualified leads".

It's a win not only for marketers, but for the site visitor too. To understand how this could benefit site visitors, let's explore an example. I'm 6'5" tall, and using a commerce site to find a pair of pants that fit can be a cumbersome process. I wouldn't mind sharing some of my personal data (e.g. inseam, waist size, etc.) with a commerce site if that meant I would instantly be recommended pants that fit based on my preferences. Or if the store has no pants that would fit, it could just tell me: "Sorry, we currently have no pants long enough for you!". It would provide me a much better shopping experience, making it much more likely for me to come back and become a long-time customer.

It's a simple idea that provides a compelling win-win for both the consumer and retailer, and has the opportunity to disrupt the digital sales and marketing world. I've been thinking a lot about user-controlled personalization over the past few years. It's where I'd like to take Acquia Lift, Acquia's own personalization product.

Inrupt's success will depend on good execution

I love what Solid and Inrupt are building because I see a lot of potential in it. Disrupting the digital marketing world is just one way the technology could be applied. Whatever they decide to focus on, I believe they are onto something important that could be a foundational component of the future web.

However, it takes a lot more than a good idea to build a successful company. For startups, it's all about good execution, and Inrupt has a lot of work to do. Right now, Inrupt has prototype technology that needs to be turned into real solutions. The main challenge is not building the technology, but to have it widely adopted.

For an idea this big, Inrupt will have to develop a protocol (something Tim Berners-Lee obviously has a lot of experience with), build out a leading Open Source reference implementation, and foster a thriving community of developers that can help build integrations with Drupal, WordPress and hundreds of other web applications. Last but not least, Inrupt needs to look for a sustainable business model by means of value-added services.

The good news is that by launching their company now, Inrupt has put themselves on the map. With Tim Berners-Lee's involvement, Inrupt should be able to attract top talent and funding for years to come.

Long story short, I like what Inrupt is doing and believe it has a lot of potential. I'm not sure what specific problem and market they'll go after, but I think they should consider going after "user-controlled personalization" and disrupt the digital marketing world. Regardless, I'll be paying close attention, will be cheering for their success and hopefully find a way to integrate it in Acquia Lift!

Oct 09 2018

A website with a security hole can be a nightmare for your business, eroding the trust of your regular users. A security breach is not just about website resources; it can put the website's reputation at stake by letting attackers inject harmful data into the server and execute it. There are many ways this can happen. One of them is an automated script that scans your website, looks for sensitive areas, and tries to bypass web security with injected code.

I believe you might be thinking of your website now.

  • Is your website fully secured? 
  • How do you make sure everything that ships on your website is safe, and how do you protect it? 

As a Drupal developer, I’ve come across some contributed modules available on Drupal.org that can help your site deal with security issues. I can’t guarantee that applying these modules alone will safeguard your website, but it’s always recommended to follow the established guidelines and utilize these modules to minimize security breaches. 

Let’s take a look at those modules:

Secure Pages

We all know that moving an application from HTTP to HTTPS gives an additional layer of security that end users can trust. Unlike regular modules, a standard module installation isn’t enough here; your server must also be SSL enabled.

Currently, it is available for Drupal 7 only.
Ref URL: https://www.Drupal.org/project/securepages

Security Kit

The kit protects against multiple vulnerabilities, such as cross-site scripting, cross-site request forgery, clickjacking, and SSL/TLS issues. With the help of the Security Kit module, we can mitigate these common vulnerability risks. Some of them have already been addressed by Drupal core, like the clickjacking protection introduced in version 7.50.

Currently, it’s available for both Drupal 7 and Drupal 8.
Ref URL: https://www.Drupal.org/project/seckit

Password Policy

This module is used to enforce certain rules users must follow when setting up a password. A web application with a weak password implementation allows hackers to guess passwords easily. That’s the reason you get password policy instructions while setting up a password: the goal is not a fancy password, but one that is secure and difficult to guess.

# Password should include 1 capital letter
# Password should include 1 number
# Password should include 1 special character
# Password should respect minimum & maximum lengths

This module is currently available for both Drupal 7 and Drupal 8.
Ref URL: https://www.Drupal.org/project/password_policy


Paranoia

This module looks for places in the user interface where an end user could misuse an input area, and blocks them. A few features worth highlighting:

# Disable the "use PHP for block visibility" permission.
# Disable creating input formats that use the PHP filter.
# Disable editing of user #1.
# Prevent granting risky permissions.
# Disable disabling this module. 

Currently, it’s available for Drupal 7 and Drupal 8.
Ref URL: https://www.Drupal.org/project/paranoia

Flood Control

This module provides an administrative UI to manage users based on UID and user IP. Configuration is available to restrict users after the nth wrong attempt by user ID/IP. Drupal core already has a shield mechanism that blocks a user for a period of time after five unsuccessful login attempts. With the help of this contributed module, we can tune that behavior further.

Currently, it’s available for Drupal 7.
Ref URL: https://www.Drupal.org/project/flood_control

Automated logout

In terms of user safety, the site administrator can force users to log out if there is no activity from the user’s end. On top of that, it provides various other configurations like:

# Set timeout based on roles.
# Allow users to log in for a longer period of time.
# User has the ability to set their own time.

Currently, it’s available for Drupal 7 and Drupal 8.
Ref URL: https://www.Drupal.org/project/autologout

Security Review

This module checks for basic mistakes we make while setting up a Drupal website. Just untar the module and enable it; it will run an automated security check and produce a report. Remember, this won’t fix the errors; you need to fix them manually. Let's take a look at some of the security checks the module runs:

# PHP or Javascript in content
# Avoid information disclosure
# File system permissions/Secure private files/Only safe upload extensions
# Database errors
# Brute-force attack/protecting against XSS
# Protecting against access misconfiguration/phishing attempts.

Currently, it’s available for Drupal 7.
Ref URL: https://www.Drupal.org/project/security_review


Hacked!

This tool helps developers spot messy code added directly to contributed modules instead of being applied via patches or a new release. It works on very simple logic: it scans all the modules and themes available on your site, downloads fresh copies, and compares them with the existing ones to make sure your modules/themes are in their original shape. The result gives you information on which modules/themes have changed; the rest you are well aware of - what needs to be done.

Currently, it’s available for Drupal 7 and Drupal 8.
Ref URL: https://www.Drupal.org/project/hacked

All of the above modules are my recommendations for what a Drupal website should have. Some contributed modules will resolve security issues when given the correct configuration, and some are just informers: they will let you know about an issue, but you need to fix it manually.
Further, these contributed modules provide security tailored to the complexity of your site and the types of users it has. You can look into these security modules to protect your site against attackers.

We, at Valuebound - a Drupal CMS development company, help enterprises with Drupal migration, Drupal support, third-party integration, performance tuning, managed services, and others. Get in touch with our Drupal experts to find out how you can enhance user experience and increase engagement on your site.

Oct 08 2018

by David Snopek on October 8, 2018 - 4:45pm

If you haven't heard yet, PHP 5 will reach the end of its security support (from the upstream project) in December of this year.

During DrupalCon Baltimore we announced that we'd be updating Drupal 6 to work with PHP 7.2, and, in September, we announced that we'd be making a big push to get that live with a couple of our customers.

Finally, we have something to show for it! :-)

So far, we've only tested with a few sites, so I'm sure there's some additional issues and bugs we haven't encountered yet. But we have an initial release of Drupal core and some selected contrib modules that work with PHP 7.2 in our testing.

And all our work so far has been released back to the community!

Read more for the details :-)

Drupal core

The short version: We've released Drupal 6.45 with support for PHP 7.2

We've taken a particular approach with this:

  • We included a shim for the ereg() family of functions that were removed, rather than converting core to using preg_*() functions. This was done because contrib also uses those removed functions and this saves us from having to update many contrib modules.
  • In one or two cases, we modified Drupal core to maintain the PHP 5 behavior of its APIs if that behavior was depended on by "a lot" (subjective judgement) of contrib modules, again in order to update fewer contrib modules.
  • We made most of the updates recommended by the PHPCompatibility standard for phpcs
  • We tried to retain (and tested for) PHP 5.2+ compatibility, so that our Drupal core fork would continue to work for people who haven't updated yet. (If you're not aware of it, 3v4l.org is a great tool for trying PHP snippets in lots of versions of PHP at once, and, well, we have a bunch of different PHP versions via Docker too.)
  • But otherwise, we've based our changes on actual manual testing and confirmed bugs, and tried to make the smallest possible change to fix each problem.

Important security note!

Drupal adds a .htaccess file to the public files directory (i.e. sites/default/files/) and the temporary files directory to prevent PHP files that somehow end up there from being executed when using Apache.

However, this .htaccess file won't work with PHP 7 unless modified!

One way to do this is to delete the .htaccess files and then visit the "Status report" on your site, which will re-create the files with the changes necessary for PHP 7.

We've considered adding an update hook to do this, but we're worried about wiping out any added changes - see the issue on GitHub and leave your thoughts.
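For reference, on the sites we tested, the regenerated file differs from the PHP 5-era one mainly by repeating the "disable the PHP engine" block for the PHP 7 Apache module. A rough sketch of the relevant directives (your regenerated .htaccess may differ slightly):

```apacheconf
SetHandler Drupal_Security_Do_Not_Remove_See_SA_2006_006
Options None
Options +FollowSymLinks
# If we know how to do it safely, disable the PHP engine entirely.
<IfModule mod_php5.c>
  php_flag engine off
</IfModule>
# Without a block like this one, mod_php7 would still execute PHP files here.
<IfModule mod_php7.c>
  php_flag engine off
</IfModule>
```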


You need a patched Drush 8 in order to work with PHP 7. See drush-ops/drush#3706 and you can grab the patch here.

The Drush maintainers seem open to committing this patch, so hopefully, this will make it into a Drush 8 release at some point. :-)

Selected contrib modules

Of course, the true power of Drupal is in its contributed modules!

We're committed to updating the contrib modules used by our D6LTS customers to work with PHP 7.2.

That said, updating contrib (especially complex contrib) is a lot harder than Drupal core, so we expect this process to take us all the way to the end of the year.

Here are the contrib releases we've made so far:

There are also a number of contrib modules (generally the simpler ones) that work fine without any changes.

How to get involved!

If you're also working on getting your Drupal 6 site working on PHP 7, and you find any issues or bugs, you can write an issue on the project on GitHub or in the D6LTS queue on Drupal.org. We appreciate the help and a number of people have contributed already - thanks! :-)

Or if you're interested in us doing this for you...

Sign up for Drupal 6 Long-Term Support!

Have you updated your Drupal site to PHP 7 already? How'd that go? Please leave a comment below!

Oct 08 2018
Oct 08

Big enterprises have been marching down the path of digital transformation. Disney, an entertainment giant, has advanced its digital transformation strategy by acquiring companies. By acquiring BAMTech, a media technology service provider, it gains access to streaming technology instead of building it in-house. By spending billions of dollars on 21st Century Fox, whose assets include famous characters from Marvel comics, it can connect directly with consumers instead of going through distributors and advertisers.

Illustration showing people sitting around a desk and working

Social media, cloud computing, data analytics, and mobility are dramatically altering the way companies operate. On the one hand, their synergy has led to easy-to-use products and services for customers; on the other hand, it has resulted in richer dividends for companies. Witnessing a positive correlation between business performance and emerging technologies, organisations are fast-forwarding to digital transformation.

Drupal, as one of the pioneering CMSs in powering digital innovation, can be highly fruitful for enterprises in their digital transformation endeavours. But before looking at Drupal’s prowess, how did the digital transformation of business models start coming into the picture?

Global connectivity shifts and empowerment of customers

A graph showing a straight line and a curve depicting the evolution of digital transformation

The emphasis and impact of the internet and global connectivity have shifted from the late 1990s to the 2010s. In the 1990s, only enterprises in select industries like music, entertainment, and electronics were exploring digital products and services. Infrastructure providers led from the front in building out the information backbone for efficiency and better productivity across finance, supply chain, and human resources.

In the late 1990s, the internet hype wound up in the crash of 2000. But consumer demand for digital products and services kept evolving steadfastly. As more customers became empowered with pervasive access to online information, along with a plethora of choices and channels, their expectations rose even further.

Now, the customers have become the primary force behind the digital transformation in all industries.

Digital Transformation in the driver’s seat

A flowchart showing boxes explaining building blocks of digital transformation
Source: Hellosign

Digital transformation is the process of integrating digital technology into different aspects of the business, requiring fundamental technological change, a different approach towards culture, operational upgrades, and new forms of value delivery.

Organisational change is the foundation of digital business transformation - Global Centre for Digital Business Transformation

For leveraging futuristic technologies and their increasing expansion into human activities, a reinvention of business is required by dramatically metamorphosing the entire business process and model.

Digital transformation demands a shift of focus to the edge of the enterprise and numerous agile data centres for supporting that edge. It also requires an enterprise to shed legacy technology that may be price-heavy to maintain, thereby changing the company culture in order to support the acceleration that comes with digital transformation.

Gartner’s IT Market Clocks for 2016: Digital Transformation Demands Rapid IT Modernisation states that 66% of the organisations pursuing digital transformation expect to generate more financial gains from their operations, while 48% predict that more business will arrive through digital channels. 40% cited that it can empower employees with digital tools, and 39% that it minimises business costs. To reap the rewards down the line, there is no better time to start your own digital transformation journey.
Major benefits of digital transformation for an organisation include:

  • Improvement of processes: Newer technologies permit businesses to automate simpler processes and remove the intermediaries in more intricate processes.
  • Finding newer revenue streams: Emerging technologies can open the doors for new profitable avenues that may not have been available for an organisation when they were first established.
  • Building personalised customer experience: Customers expect businesses to meet their individual needs, and evolving technology has made it possible to fulfil this very purpose.

Strategising digital transformation

Resolving the tension among the leaders

A piechart showing statistics on digital transformation leadership

The natural starting point in an effort to build executive alignment is to resolve tensions in the leadership group and distribute the ownership tasks in a planned manner.

Leveraging the intelligent edge

Infographic showing statistics on IoT with relevant piccharts and icons

The hottest thing in computing right now is ‘the edge’, also referred to as ‘the intelligent edge’. Think of it as a cluster of sensors deployed across industry and society. Vast amounts of sensor data have to be processed locally, near the data source. The work must be done on-site if there is no time to send data to the cloud and wait for a response. That is why it is known as the intelligent edge.
Bringing the intelligent edge into businesses can make processes more efficient and effective and create a more pleasing environment.

Growing with the platform economy

Illustration showing logos of different companies to explain different types of platform economy
Source: Raconteur

Platform economy has transformed the way businesses think about innovation. Organisations can take advantage of this phenomenon where online marketplaces are storming market after market, connecting buyers and sellers and taking the friction out of commerce.

Signifying the security from the start

A bar graph showing statistics on top challenges impeding organisations from taking advantage of digital trends

When undergoing a digital transformation process in your organisation, it is always better to engage security people from the very beginning. It saves a lot of pain and backtracking later on.

Reimagining your business right now

Four piecharts showing percentages on top drivers of digital transformation

In spite of the economic and political uncertainty, enterprises must seize the initiative and entirely reimagine themselves and the role technology plays for them. Such a need for constant change to be ahead of others in the business means the ‘digital transformation’ may give way to ‘digital evolution’.

Enabling digital tax transformation

An illustration in the shape of hexagon depicting digital tax strategy

Digital transformation can allow tax professionals to rethink the way they work while offering exciting new opportunities. Digital technology in tax is quickly moving up the boardroom agenda as it affects enterprises both internally and externally. With more businesses going digital, governments and tax authorities can also adopt disruptive technology, including advanced data-driven auditing techniques.

Transforming a whole economy

Horizontal bar graphs showing statistics on mobile and internet penetration

Digital technology is not only automating finance functions but also offering the potential to grow business using artificial intelligence. For instance, the potential of a digital economy in India, the world’s second most populous country behind China, is gargantuan, with benefits for businesses and citizens alike.

How is Drupal 8 enabling digital transformation?

Drupal 8 has changed the web development landscape with its immense scope for enabling the digital transformation of an enterprise. Not only does Drupal 8 do an astounding job as a leading open source web content management system, but it also continues to evolve with altering customer expectations.

Drupal 8 is continuously evolving with customer expectations

Some of the highlights are mentioned below:

Content creation

Drupal 8's authoring and theming systems are designed for ease of use and standards compliance. For instance, Drupal 8.6 adds support for remote media types, thereby letting you easily embed YouTube or Vimeo videos in your content. Moreover, the addition of the experimental Workspaces module offers sophisticated content staging capabilities.

Easy installation

A table with rows and columns showing time to installation of different open source frameworks
Source: Dries Buytaert's blog

Time taken for installing Drupal has been reduced considerably. You can install Drupal with the new Umami demo profile which exhibits some of Drupal’s powerful capabilities by providing an awesome website filled with content straight out of the box.

[embedded content]

Developer’s paradise

  • Developers can upgrade a single-language Drupal 6 or Drupal 7 website to Drupal 8 with the built-in UI.
  • Everything is fieldable in Drupal 8 thereby improving data modelling.
  • Drupal 8’s core improvements and APIs make it easier to create custom functionality on the basis of bi-directional data connections to handle and exchange content.
  • With mobile-first displays and responsive layouts, content can be deployed across platforms prepackaged for mobile devices.
  • The Drupal 8 backend can handle content while being completely decoupled from the front end and web applications.

Interoperability with adjacent technologies

Drupal 8 has a top-of-the-line interoperability with adjacent technologies such as CRM, Digital asset management (DAM), web analytics, marketing resource management, and multichannel campaign management (MCCM).

Security by design

Drupal is secure by design: it is built in a way that addresses all of the top 10 security risks of the Open Web Application Security Project (OWASP).

A box with pointers showing list of OWASP 2017 security risks

With a proven record of being the most secure CMS, it beats the big players in the CMS market when it comes to resilience against critical internet vulnerabilities. The Drupal Security Team is steadfast in its objective of addressing security issues and offering timely fixes.

You can enable secure access to your Drupal site, as it has in-built support for salting and repeatedly hashing account passwords when they are stored in the database. It helps in enforcing firm password policies, industry-standard authentication mechanisms, session limits, and single sign-on systems.
It offers granular user access control to give administrators full control over who can see and modify different parts of the site. You can even configure Drupal for strong database encryption in top-notch security applications.
Its Form API ensures data validation, which helps prevent XSS, CSRF, and other malicious data entry. It also limits the number of login attempts from a single IP address to prevent brute-force password attacks. Its multi-layered cache architecture assists in mitigating Denial of Service (DoS) attacks.
Rightly, a report from Sucuri shows that Drupal is the most security-focused CMS among the leading players in the CMS market.

Bar graph showing the statistics on Drupal security
Source: Sucuri

Drupal as an e-commerce platform

Drupal stands tall in the age of platforms, offering a suite of Drupal Commerce modules which can be leveraged to build e-commerce websites and applications of all sizes. While e-commerce solutions, more often than not, are built with an application mindset, Drupal Commerce was built with a framework mindset, emphasising what you can develop with it.

Metamorphosis of the whole economy

Drupal can transform the whole economic landscape of a business. For instance, Drupal enabled the digital transformation endeavours of TPG Capital, one of the renowned enterprise-level fintech companies. Drupal turned out to be the best choice to tackle the financial industry’s stringent legal and regulatory requirements.

Homepage of TPG Capital with images of TPG products

The story of Digital transformation with Drupal

Drupal Europe 2018 in Darmstadt (Germany) had a dedicated section for ‘Digital Transformation + Enterprise’. One of the presentations focussed on the digital revolution of the Chatham House.
Chatham House, a not-for-profit, non-governmental organisation, has been a leading global independent policy institute. But its digital presence struggled to evolve quite so prosperously. Accessing key reports and information was an arduous task.

Homepage of Chatham House with an illustration showing a boat and Euro logo

Being a content-heavy site, it had a set of intricate requirements. Drupal was chosen for its editorial flexibility along with its open source ethos. It proved to be the ideal springboard for success, with seamless possibilities. And moving to Drupal was just the inception of their long-term digital transformation plans.
The new strategy focussed on improving the reputation of Chatham House, prioritising outputs, putting more effort into marketing, and leveraging insights from feedback. It was all to be underpinned by measuring success against KPIs and reporting.

Moving to Drupal was just the inception of the long-term digital transformation plans of Chatham House

Then, a full website redevelopment project was carried out with a user-centric design utilising the powerful capabilities of Drupal.
The presentation delineated the significance of collaborative efforts. Combining strategic partnerships with strong internal relationships has ratcheted up the monthly online users of Chatham House.


Organisational change involves a lot more than just adopting emerging technologies. There is a change in cultural setup, operational upgrades, and transformation of the whole value chain. And all these factors culminate to form the very essence of digital transformation.
Digital transformation has been vital for organisations to be relevant, keep churning out ways for tackling the changing needs of customers in the digital landscape and stay ahead of others.
Drupal, as an open source CMS, has an active community that has been working towards powering digital innovation. With such a community, Drupal has been able to evolve with altering needs and can be a great solution for building a marvellous web presence for the businesses.
Drupal experts at Opensense Labs have been continuously collaborating with partners to pursue the digital transformation endeavours with Drupal development.
Contact us at [email protected] to understand how we can digitally transform your business using Drupal.

Oct 08 2018
Oct 08

In this blog post, we'll have a look at how contributed Drupal modules can remove the core deprecation warnings and be compatible with both Drupal 8 and Drupal 9.

Ever since Drupal Europe, we know Drupal 9 will be released in 2020. As per @catch’s comment in 2608496-54

We already have the continuous upgrade path policy which should mean that any up-to-date Drupal 8 module should work with Drupal 9.0.0, either with zero or minimal changes.

Drupal core has a proper deprecation process so it can be continuously improved. Drupal core also has a continuous process of removing deprecated code usages: thanks to proper deprecation testing, code in core should not trigger deprecated code except in tests and during updates.

The big problem for contributed modules (aka contrib) is the removal of deprecated code usage. To allow contrib to keep up with core's removal of deprecated code, contrib needs proper deprecation testing, which is being discussed in support deprecation testing for contributed modules on Drupal.org.

However, the Drupal CI build process can be controlled by a drupalci.yml file found in the project. The documentation about it can be found at customizing DrupalCI Testing for Projects.

It is very easy for contributed modules to remove their usage of deprecated code. All you need to do is add the following drupalci.yml file to your contributed module and fix the fails.

# This is the DrupalCI testbot build file for Dynamic Entity Reference.
# Learn to make one for your own drupal.org project:
# https://www.drupal.org/drupalorg/docs/drupal-ci/customizing-drupalci-testing
build:
  assessment:
    validate_codebase:
      phpcs:
        # phpcs will use core's specified version of Coder.
        sniff-all-files: true
        halt-on-fail: true
    testing:
      # run_tests task is executed several times in order of performance speeds.
      # halt-on-fail can be set on the run_tests tasks in order to fail fast.
      # suppress-deprecations is false in order to be alerted to usages of
      # deprecated code.
      run_tests.unit:
        types: 'PHPUnit-Unit'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.kernel:
        types: 'PHPUnit-Kernel'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.functional:
        types: 'PHPUnit-Functional'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false
      run_tests.javascript:
        concurrency: 15
        types: 'PHPUnit-FunctionalJavascript'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false

This drupalci.yml will check all the Drupal core coding standards. This can be disabled by the following change:

      phpcs:
        # phpcs will use core's specified version of Coder.
        sniff-all-files: false
        halt-on-fail: false

This file also only runs PHPUnit tests; to run legacy Simpletest tests, you have to add the following block:

      run_tests.simpletest:
        types: 'Simpletest'
        testgroups: '--all'
        suppress-deprecations: false
        halt-on-fail: false

But if you still have Simpletest tests, you probably want to start there, because they won't be supported in Drupal 9.

Last but not least, if you think the module is not ready yet to fix all the deprecation warnings, you can set suppress-deprecations: true.

As a contrib module maintainer or a contrib module consumer, I encourage you to add this file to all the contrib modules you maintain or use, or at least create an issue in the module's issue queue, so that by the time of the Drupal 9 release all of your favourite modules will be ready. The JSONAPI module added this file in https://www.drupal.org/node/2982964, which inspired me to add this to DER in https://www.drupal.org/node/3001640.

Photo of Jibran Ijaz

Posted by Jibran Ijaz
Senior Drupal Developer

Dated 8 October 2018


What Alex said. I did this for the JSON API module because we want it to land in Drupal core. But we are running into the problems explained in the issue linked by Alex.

Despite Drupal 9 being announced, the Drupal Continuous Integration system is not yet ready for modules trying to keep current with all deprecations for Drupal 9 while remaining compatible with both the minors that have security team coverage (current + previous, at the moment 8.6 + 8.5) and the next minor (at the moment 8.7). Hopefully we soon will be :)

Thanks for writing about this though, I do think it's important that more module maintainers get in this mindset!

DrupalCI always runs contrib tests against the latest core branch. As a contrib module maintainer, if I have to make a compatibility change for a core minor version, then I create a new release and mention that in the release notes after the stable release of the core minor version, e.g. 8.x-2.0-alpha8. I have never had to create a new release for a core patch release, at least not so far; I don't know how I would name the new release if I ever had to, but then again that's a contrib semver issue.

DrupalCI runs against whichever core branch the maintainer has configured it to run against.

If a contributed module wants to remove usages of deprecations, it should probably never do that against the "Development" branch, as there isn't a way for a contrib module to both remove those deprecations *and* still be compatible with supported or security branches. The earliest that a contrib module should try to remove new deprecations is at the pre-release phase, as at that point we're unlikely to introduce new deprecations.


Add new comment

Oct 05 2018
Oct 05

Read our Roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community.

Drupal.org Updates

Drupal Europe 10-14 Sep 2018
Thank you to the Drupal Europe team and attendees!

Members of the Drupal Association team joined the community to attend Drupal Europe in September. It was a fantastic event, and we had many great conversations with local community leaders, Supporting Partners, and others about the challenges and opportunities of the European market.

The Drupal.org Engineering Team also met with a number of contributors at Drupal Europe to move forward initiatives like improving Composer support for core, automatic updates, and more.

Reminder: DrupalCon Seattle Early Bird Registration is open now
DrupalCon Seattle April 8-12 2019

DrupalCon Seattle general registration is open now. The programming has been transformed to address the needs of Builders, Content and Digital Marketers, Agency Leaders, and Executives, while preserving that feeling of homecoming for the community that is central to every DrupalCon.

Have questions about the next evolution of DrupalCon? Head of Events Amanda Gonser was recently interviewed by Lullabot for their podcast and explains what's new and what's staying the same.

A video prototype of our integration with GitLab

Are you as excited as we are about the upcoming migration to GitLab? Watch this video for a visual prototype of the integration we're planning.

[embedded content]

We should be announcing a window for our Phase 1 migration shortly.

Phase 1 of Improved Support for Composer begins

In September we moved forward with our multi-phase proposal for improving Drupal core's support for Composer workflows. There are still considerations under discussion, such as how to handle multi-site support, and the implementation details of the later phase. However, Phase 1 has now been broken into its own meta issue, with a goal of bringing these changes into Drupal as of release 8.7.

Seeking a Technical Program Manager

The Drupal Association seeks a Technical Program Manager (TPM) to join our Engineering team and shepherd key programs for Drupal.org that empower our global community to collaborate and build the Drupal project. A TPM is expected to be technically fluent, have excellent project management skills, and excel in internal and external communication. The Drupal Association serves one of the largest global open source communities — Drupal has pioneered open source for 17 years. Join our incredible, mission-driven team and make an important impact by building the tools that help our community build Drupal.

Join Promote Drupal

At Drupal Europe we kicked off the volunteer coordination for Promote Drupal. We've put together an introductory video that explains how to get your marketing teams involved.

[embedded content]

Just go to the Promote Drupal landing page to sign up!

Further improvements for inclusivity on Drupal.org

Thanks to the community contributed work of @justafish, among others, Drupal.org user profiles now include fields for pronouns, primary language, and location, to help give people cultural context as they interact with each other online. We’ll be adding options to show this information with comments throughout Drupal.org.


As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular we want to thank:

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Oct 05 2018
Oct 05

Drupal is an open-source platform that more than a million people across the globe find useful for their content management purposes. They choose Drupal for its flexibility, reliability, and security. However, not all of them know how to use it properly. Find out about the mistakes that Drupal beginners often make.
Let’s dive deeper and analyze the examples of developer’s activity that could make Drupal ineffective and see what to do to get better results right away!

Bad content structure

Without a proper plan in place, your content structure can end up as a messy, incoherent experience for site visitors. Determining a good structure from the very start increases your website's performance.

Try to minimize the number of content types, fields, and tables. Too many content types can confuse content creators, so you need to standardize them. For instance, you don’t need both “news” and “article” types, as these are almost the same - keep just one of them. Moreover, creating new fields for every content type is a waste of resources that also comes with worse performance. Avoid making similar fields, as it brings unnecessary complexity to your site.

This is why it’s good to design the system before you start to implement the elements. Take time to think about the structure and decide how the Drupal architecture should look to improve the website’s performance. Make it as simple as possible and use only the necessary elements.

New Drupal developers also have problems with the folder structure. To be more precise, it’s about setting up the wrong folder structure and putting themes and modules in a folder at the root level rather than in separate folders. This is one of the serious mistakes, as it affects the process of upgrading to the latest version and makes debugging way more difficult. You will lose a lot of time trying to fix it, so focus on creating a proper folder structure from the beginning.

Using unnecessary Drupal modules across the site

Inexperienced developers can be amazed by the plethora of modules available, which leads to installing too many of them. Even if Drupal developers don’t make use of all of the modules at the beginning, they think that they will need them later. If you’re one of these developers… stop doing this!
You need to realize that the more elements you have, the slower your website is, not to mention the mess you have in the code. With that being said, review all the modules you have again and get rid of the ones you don’t need. Also, too many modules can decrease website security, so think about that.

Not removing previous versions of the modules

Speaking of modules, Drupal developers often forget to remove older versions of downloaded modules. Some of them simply don’t know that even if the module is placed in a different directory, Drupal may decide to use the older version. This is because the folders are usually on the same level.
Sometimes, the software can switch between various module versions, so, as you can guess, it can cause some problems. And we all know you don’t need such unwanted surprises.
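A quick way to spot leftover copies is to look for module info files that share the same name in more than one place. A hedged sketch using a fabricated site layout (for Drupal 8, search for *.info.yml instead, and point find at your real site root):

```shell
# Fabricated example: the same module accidentally present in two locations.
mkdir -p /tmp/site/sites/all/modules/foo /tmp/site/sites/default/modules/foo
touch /tmp/site/sites/all/modules/foo/foo.info
touch /tmp/site/sites/default/modules/foo/foo.info

# Print any .info filenames that occur more than once under the site root.
find /tmp/site -name '*.info' -exec basename {} \; | sort | uniq -d
# → foo.info
```

Any name this prints is a module with duplicate copies you should clean up.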

Choosing unsupported modules in Drupal

A large number of different modules can be a challenge for a Drupal beginner, especially when there are ones that cover all the functionalities you look for. Unsupported modules can cause some problems in the long run, for instance, because of the compatibility issues or bugs that will never be fixed. 
Before downloading a new module, check when the last update was made and read the description provided by the author. Note that some projects are marked unsupported for security reasons so think twice before you decide to use one of these, as you risk data breach.

Ignoring code standards

When several people are working on the same website, it’s important to have code standards in place. Without them, you risk wasting time trying to understand another developer’s code. Don’t make this mistake: start by creating a guideline to improve the quality of the source code and the efficiency of the team.

Applying Drupal code standards is a good practice, even if you are a single developer working on your own project. Think about the situation when you want to expand the project that requires another developer’s involvement. With documented standards, it will be much easier to start, so… draw conclusions. 

Never make Drupal beginner mistakes again

It’s not surprising that newbies make mistakes; they simply need time to learn about all the opportunities the system gives them. Drupal is a complex tool, so there’s a huge chance of creating something that doesn’t work the way we wanted, especially when we don’t have proper knowledge and experience.

The good news is that you can always ask questions - the Drupal community has exceeded a million users who are willing to help. Someday, you can also help out by sharing your experience.

Oct 05 2018
Oct 05

Our back-office management solution is now running on the latest version of Drupal (8.6.1). An online demo has been updated to the latest version and showcases the application's features.

It has been a long run since the project was initiated, while Drupal 8 was still in alpha. And there is still plenty of work to do.

Installation code is also available for those familiar with Drupal. The installation process is partially covered in this article. Thus, if any Drupalists are enthusiastic about business process solutions and would like to contribute, they are welcome.

We focus first on moving an old in-house PHP application into Drupal 8 modules. This covers many simple but useful back-office functionalities like an address book, a products and services database, sales documents (invoices, purchases), projects, HR, logistics documents, cost tracking, journal records, and other collaborative tools. It is still a young project that will certainly need more of the integration provided by Drupal 8's capabilities as it grows. Some Drupal 8 features, like multilingual support and the tour guide, are already very useful in the business environment we operate in.

The solution is run by small businesses and start-ups. We provide paid support, a comprehensive cloud solution, and management expertise services as well. It is a very good solution for small businesses that need to organize their back office and data management.

On one hand, it gives us tremendous information and feedback about all the necessary improvements we need to implement and fixes to apply. On the other hand, Drupal 8 has proven to be very stable and efficient in running this solution, and we have yet to explore plenty of value-added features that are of high value in data processing, like RESTful Web Services or plugin development.

We encourage anyone to explore this solution, provide feedback, and even contribute to the project.

Oct 05 2018
Oct 05

October 05, 2018

I recently ran into a series of weird issues on my Acquia production environment which I traced back to some code I deployed which depended on my site being served securely using HTTPS.

Acquia staging environments don’t use HTTPS by default and require you to install SSL certificates through a tedious manual process, which in my opinion is outdated, because competitors such as Platform.sh and Pantheon (and even GitHub Pages) support lots of automation around HTTPS using Let’s Encrypt.

Anyhow, because staging did not have HTTPS, I could not test some code I deployed, which ended up costing me an evening debugging an outage on a production environment. (Any difference between environments will eventually result in an outage.)

I found a great blog post which explains how to set up Let’s Encrypt on Acquia environments, Installing (FREE) Let’s Encrypt SSL Certificates on Acquia, by Chris at Redfin solutions, May 2, 2017. Although the process is very well documented, I made some tweaks:

  • First, I prefer Docker-based solutions over installing software on my computer. So, instead of installing certbot on my Mac, I opted to use the Certbot Docker Image. This has two advantages for me: first, I don’t need to install certbot on every machine I use this script on; and second, I don’t need to worry about updating certbot, as the Docker image is updated automatically. Of course, this does require that you install Docker on your machine.
  • Second, I automated everything I could. The result is this gist (a “gist” is basically a single file hosted on GitHub), a script which you can install locally.
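To make the Docker approach above concrete, here is a minimal sketch (the domain, volume path, and certbot flags are illustrative assumptions, not values from the gist); it only assembles and prints the command rather than running it:

```shell
# Sketch only: assemble the certbot-in-Docker command without executing it.
# CERT_DOMAIN and LE_DIR are placeholders, not values from the gist.
CERT_DOMAIN="staging.example.com"
LE_DIR="$(pwd)/letsencrypt"

# The official certbot/certbot image runs the client; the host directory is
# mounted so issued certificates survive after the container exits.
CERTBOT_CMD="docker run -it --rm -v ${LE_DIR}:/etc/letsencrypt certbot/certbot certonly --manual -d ${CERT_DOMAIN}"

# Print the full docker command without executing it.
echo "$CERTBOT_CMD"
```

Running the printed command requires Docker installed locally; certonly --manual is just one of the challenge modes certbot supports.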

Running the script

When you put the script locally on your computer (I added it to my project code) at, say, ./scripts/set-up-letsencrypt-acquia-stage.sh, and run it:

  • the first time you run it, it will tell you where to put your environment information (in ./acquia-stage-letsencrypt-environments/environment-my-acquia-project-one.source, ./acquia-stage-letsencrypt-environments/environment-my-acquia-project-two.source, etc.), and what to put in those files.
  • the next time you run it, it will automate what it can and tell you exactly what you need to do manually.
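As a purely hypothetical illustration (the real variable names are whatever the script's first run tells you to use), such an environment file could look like:

```shell
# Hypothetical contents of
# ./acquia-stage-letsencrypt-environments/environment-my-acquia-project-one.source
# Every name and value below is a placeholder, not the gist's actual variables.
ACQUIA_PROJECT="my-acquia-project-one"   # Acquia project/subscription name
STAGE_DOMAIN="staging.example.com"       # staging domain to issue a cert for
LE_EMAIL="admin@example.com"             # Let's Encrypt contact address
```

The script sources one such file per environment, so adding another Acquia project is just a matter of dropping in another file.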

I tried this and it works for creating new certs, and should work for renewals as well!


Oct 04 2018
Oct 04
Though there was no DrupalCon Europe this year, the European Drupal community stepped up and organized their own conference, Drupal Europe, in Darmstadt, Germany last month. An incredibly successful gathering held in the Darmstadtium venue, a beautiful convention center in the center of this college town, Drupal Europe demonstrated the unique power that grassroots initiatives can have in our open-source community.
Oct 04 2018
Oct 04
Should I Re-use Existing Drupal Fields?

Sometimes we're able to give really clear advice: "Do this!" or "Don't do that!"

This is not going to be one of those blog posts.

Drupal gives you the ability to re-use fields. If you have an "Image" field, you could choose to use that same field on every content type on your site.

However, it's not always clear whether re-using fields is a good idea. Sometimes it is, sometimes it isn't.

Here's an overview of the advantages and disadvantages to consider before re-using Drupal fields.

General advice on re-using fields

You can choose the "Re-use existing field" feature whenever you go to "Structure", then "Content types" and click "Manage fields" for a content type.

reuse drupal fields

The Drupal.org documentation used to officially recommend that you not re-use fields:

It is recommended to create new fields, rather than reusing existing ones, unless you have a clear reason to do so.

However, that advice has become more nuanced in recent years, and Drupal now officially says there are both advantages and disadvantages.

The Drupal Field UI documentation has a detailed section called "Reusing Fields":

There are two main reasons for reusing fields. First, reusing fields can save you time over defining new fields. Second, reusing fields also allow you to display, filter, group, and sort content together by field across content types. For example, the contributed Views module allows you to create lists and tables of content. So if you use the same field on multiple content types, you can create a View containing all of those content types together displaying that field, sorted by that field, and/or filtered by that field. There is one main reason to not reuse a field: different permissions. For example, you may need different user roles to have different levels of access to a field, depending on the content type to which it has been added. This can be difficult if you reuse a field.

Advantage: re-using fields can make your site simpler

Yes, there can be a speed boost, but the time savings are very small. A more compelling advantage is that re-using fields can sometimes make site administration simpler. Web Initiative sums this up nicely:

Reuse of fields can also reduce the system’s complexity. Instead of creating and maintaining 10 different fields, Drupal admins maintain only two fields and their documentation. Database administrators only need to improve performance of two extra tables. KISS is always a good principle.

It definitely would be easier to apply permissions, setting and design elements to one re-used field rather than 10 unique fields.

Advantage: some content works well with re-used fields

Back to the Drupal Field UI documentation again:

reusing fields also allows you to display, filter, group, and sort content together by field across content types. For example, the contributed Views module allows you to create lists and tables of content. So if you use the same field on multiple content types, you can create a View containing all of those content types together displaying that field, sorted by that field, and/or filtered by that field.

One commenter on the Drupal.org documentation makes the same point about Views. They point out that Views can combine content in sophisticated ways: if you have multiple content types with different date fields, Views can combine them into a single view. However, they also point out that Views isn't as sophisticated with sorting, and will struggle to sort the content on all those different date fields.

Disadvantage: Re-used fields are inflexible

Brandon Williams on Twitter summed this up nicely: 

at first it's a good idea, but give it a few weeks, reqs change, you end up creating separate ones anyway

To a large degree, if you choose re-used fields, you are limiting the changes you can easily make to your data later.

Disadvantage: Re-used fields make data harder to export or migrate

Re-using fields could become an issue when you need to export your data or when you need to migrate to a new version of Drupal or another platform.

Each Drupal field has its own database table, as shown below. Extracting that data can be tough. The Features module (the most common way to export Drupal data) struggled for a long time with shared fields, although current versions can handle them more effectively.


This advice is similar to our thoughts on using multi-sites. Whenever you start to build dependencies between codebases or database tables, you add complexity to your site.

Advantage or disadvantage? Performance

The Drupal documentation outlines one possible benefit of re-using fields:

Reusing fields not only makes Drupal run faster, it also makes your project easier to maintain.

This thread on Stack Overflow has a very relevant discussion on performance. It includes this comment:

A real problem however is the number of fields you have. Because currently in Drupal 7, the complete field configuration of all fields, no matter if they're loaded or not, is fetched from the cache on every single request. I've seen sites with 250+ fields, where loading and unserializing the field configuration takes 13MB+ memory.

So, re-using fields could possibly give small performance improvements by letting us have a lower total number of fields.

However, those small improvements may be lost elsewhere. This is from Web Initiative again:

[Fields add] extra complexity to a Drupal system. When creating a new field, the field’s definition is added to the field class table and the field’s configuration is added to the field instance table; meanwhile, a new table is added to the Drupal database to store the field data. Database tables add complexity to the system. In addition, queries of nodes will incur JOIN expressions of tables to field data. Multiple JOINs will impact database performance since MySQL responds poorly to queries with multiple JOINs of tables if not properly configured.


Sorry that we don't have an easy answer to this question. This is a question where you will benefit from reading around the issue and understanding the pros and cons. If you're doing a real site build, it will be worth constructing the site in a test environment to learn more about how these pros and cons impact your site's needs.

About the author

Steve is the founder of OSTraining. Originally from the UK, he now lives in Sarasota in the USA. Steve's work straddles the line between teaching and web development.
Oct 04 2018
Oct 04

Now that we have automated our deployment, it wouldn't be too hard to wire it up with our code management setup. In this post, we will hook the Ansible scripts into our Git hosting setup so that a deployment gets triggered when you do a "git push". The idea is that deployment shouldn't be a chore: developers shouldn't even have to think about it, and should focus only on the business logic of their application.

source: xkcd.com

Also, if deployment is such a low-activation-energy task, you ship more frequently. Why ship frequently? Because speed of iteration beats quality of iteration. There is a special class of tools which act as a hub between source code management and deployment, called continuous integration services. There are many of them, but I picked Gitlab CI for this exercise.

Why Gitlab

My criteria for choosing a CI tool are that it should be free as in free beer and free as in freedom. I should be able to look under the hood and make changes (though I don't usually do this), and if things break, I should be able to see what's happening behind the scenes. I was torn between Jenkins and Gitlab CI, which share these traits. I went ahead with Gitlab CI as my code was hosted in Gitlab and adopting Gitlab CI was easier. Hint: if you are already using Gitlab, you're better off using Gitlab for CI.

Also, there is no need to host and maintain a separate service if you are using Gitlab. The CI comes built in, and you can create runner environments as your setup scales.

Bonus: Drupal is moving to Gitlab!

Gitlab CI overview

You check the Gitlab CI configuration file in as part of your codebase. It is called .gitlab-ci.yml and sits in the top-level directory. This is a declarative YAML file which describes how your CI process should execute. Each Gitlab CI pipeline consists of one or more "jobs", with each job belonging to a "stage".

A stage is one of "build", "test" or "deploy". Multiple jobs belonging to the same stage are executed in parallel. Our CI pipeline consists only of the "deploy" stage,

  stages:
    - deploy

with 2 jobs,

  deploy_dev:
    stage: deploy

  deploy_prod:
    stage: deploy

The job names are arbitrary, and each job runs only if its condition is met, e.g. do a dev deploy only if code is pushed to the develop branch.

  deploy_dev:
    stage: deploy
    image: python:3.6
    only:
      - develop

Also, there is an image directive which tells Gitlab CI to execute the job's tasks in a "python:3.6" container (and that's because we use Ansible).

Let's quickly walk through the deployment steps:

  script:
    - pip install ansible
    # install ssh-agent
    - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
    # run ssh-agent
    - mkdir -p ~/.ssh
    - eval $(ssh-agent -s)
    - ssh-add <(echo "$SSH_PRIVATE_KEY")
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config
    # deploy
    - ansible-playbook -i "$DEV_HOST," -u root ansible/playbook.yml --tags deploy --vault-password-file=./ansible/vault-env
  when: always

We install Ansible, and because Ansible requires an SSH key, we inject the key from an environment variable, $SSH_PRIVATE_KEY. Now, how can we securely inject an environment variable holding a private SSH key into the CI pipeline? Gitlab allows you to add these variables via the UI so that they are automatically inserted into every build.

The actual playbook execution also uses environment variables which are defined in the CI file itself.

  variables:
    DEV_HOST: staging.example.com
    PROD_HOST: www.example.com

Notice that we only run the tasks in the playbook with the "deploy" tag. Also, we inject the Ansible Vault decryption password the same way as we did with the ssh private key, i.e. we add it as a CI variable. This is a secure and recommended practice.

Gitlab CI variables

Here's how your pipeline looks after successful execution.

CI run

CI run 2

Notice that the deployment job ends with a when: always condition. This means that the job will execute irrespective of the result of the previous jobs. This can be removed to add a rule like "deploy only if the tests pass".

Where do all the jobs in our pipelines run? Gitlab provides a piece of infrastructure called a runner. Shared runners are provided by Gitlab if you host your code on Gitlab.com. You can also register your own runners to augment your CI infrastructure.

There is more we can do in Gitlab CI, which will be the scope of a later post. For the current context, the Ansible playbook suffices and does all the heavy lifting. Try doing a git push to your codebase and watch the CD pipeline get triggered automatically.

Where to go from here

Adding test steps in CI

You can add a test stage and run all your tests, which will be executed before the deployment. You can fire a deployment only if all tests pass.
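As a rough sketch of that idea (the job name, image, and commands here are assumptions for a PHP project, not part of the setup above), the pipeline could grow a test stage like this:

```yaml
# Hypothetical .gitlab-ci.yml fragment: a "test" stage that runs before "deploy".
stages:
  - test
  - deploy

run_tests:
  stage: test
  image: php:7.2          # placeholder image for the project's test tooling
  script:
    - composer install
    - ./vendor/bin/phpunit
```

Because a stage only starts after the previous stage succeeds, dropping when: always from the deploy jobs then gives you exactly "deploy only if the tests pass".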

Adding multiple environments

You can tweak the Ansible scripts to create per-branch environments, also called review apps, and manage their lifecycle using Gitlab CI and Ansible.

Traefik using docker swarm

The current Traefik setup assumes a single machine running Docker. Your setup can quickly outgrow this if you are running a lot of sites. To mitigate this, you can run your containers in a Docker Swarm cluster and configure Traefik against that cluster rather than a single-node Docker setup. More on this in a future post coming soon!

Adapting to a different CD pipeline

You can adapt this exact setup to a different CI/CD tool like Jenkins, Travis, or Drone.

Pre baked images

You can inject the source code and Composer dependencies and build a fresh Docker image for every build (on top of the existing PHP FPM image) during the build stage of the CD pipeline, then directly deploy this newly minted image during the deployment stage. I'll write a detailed post about this in the future.

Oct 04 2018
Oct 04

This is a public update on the work of the Governance Task Force.

The Governance Task Force has been working hard to prepare the proposal. We currently have a completed draft that we are actively refining for editorial improvement. As part of the review, we believe it is important to get initial feedback from some key stakeholders to ensure there are no major issues identified. We'll consider making changes to the proposal at our discretion. The proposal will then be delivered to the community and we’re very excited to soon share this. While things may change, we believe we are on time to deliver the proposal before the end of October.

Our team is actively discussing the handoff and next steps that follow from our work. We recognize that there may be ongoing support needed and want to do what we can to help follow-up efforts. It is imperative that momentum is maintained after our proposal is delivered.

We will be recommending a public commentary period before any recommendations move forward for the community to share their thoughts. This commentary period will likely outlast the task force. The task force will officially disband at the end of October, as we have stated in our charter. This does not mean that the work is complete, as there may be discussion and, most importantly, approved recommendations will need support to move forward. The task force wants to do what we can to enable the next steps and we are actively discussing how this might happen, even if we, as individuals, participate without an official charter.

We continue to be committed to serving the community and operating in a transparent way. If you wish to reach us, please fill out this Google form and we will respond as soon as we’re able.

Oct 03 2018
Oct 03

by David Snopek on October 3, 2018 - 4:30pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Critical security release for the Print module to fix a Remote Code Execution (RCE) vulnerability.

The Print module provides printer-friendly versions of content, including send by e-mail and PDF versions.

The module doesn't sufficiently sanitize the arguments passed to the wkhtmltopdf executable, or HTML passed to dompdf or other PDF generation tools.

See the security advisory for Drupal 7 for more information.

NOTE: This vulnerability has a lower risk in Drupal 6 than in Drupal 7 (where it's Highly Critical). This is because you can't pass shell commands to execute using the HTTP basic auth user/pass, like you can in Drupal 7.

Here you can download the Drupal 6 patch.

If you have a Drupal 6 site using the Print module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Oct 03 2018
Oct 03

drupal 8 logo in spotlights

Although it can sometimes be easy to forget about that little URL bar at the top of your browser, what it contains for each page can be surprisingly important. Creating good paths (the part of the URL after the domain name) for your site content can have numerous benefits, especially for SEO, but doing so can be a bit frustrating. Usually, you end up needing to re-type most of the page title into Drupal's URL alias field, which isn't necessarily too bad on its own, but it's still an extra step whenever you create content. And what about when you update the page and change the title? This is where it gets very easy to forget to change the path, which can lead not only to worse SEO but also to your site's visitors not ending up on a page with the content they expected it to have.

Fortunately, the Pathauto module makes all such worries a thing of the past.

This module adds a whole host of new options for configuring Drupal's paths, perhaps the most important of which is path "patterns". Patterns allow you to specify what you want your paths to look like for different pieces of content; for instance, if you have a site where each user can have a blog, you could set up a pattern for Blog nodes such as "/blog/[node:author]/[node:title]". Then, whenever a user creates a blog post, it will automatically be given an alias which includes their username and the title of the post, and the alias will even be updated if the user edits the post to change its title.
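In Drupal 8, such a pattern ends up as exportable configuration; here is a hedged sketch of what the exported file might contain (the id, label, and exact keys are illustrative assumptions, so check the module's own documentation):

```yaml
# Hypothetical pathauto.pattern.blog.yml illustrating the pattern above;
# field names here are assumptions, not a verbatim Pathauto export.
id: blog
label: 'Blog post paths'
type: 'canonical_entities:node'
pattern: '/blog/[node:author]/[node:title]'
```

Storing the pattern as configuration means it can be deployed between environments along with the rest of your site's config.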

If you already have a site which has inconsistent aliases, Pathauto can also be used to bulk generate new aliases for your content, which is also quite helpful if you ever decide you want to change the structure of your paths. The module also offers plenty of settings to configure exactly what the paths it generates look like, including the ability to use a wide range of tokens provided by the Tokens module.

Finally, Pathauto also integrates with the excellent Redirect module. If you have both modules installed, changing the title of your content won't just change the path, it will also create a redirect from the old path to the new one so that any links already out there to the old URL will continue to function as expected. This is a rather important bit of functionality, and thanks to Drupal's great selection of contrib modules, it's just a few clicks away from being set up and working on your site.


Oct 03 2018
Oct 03

This is a guest blog by Marina Paych (paych) with tips for coordinating Drupal training events.

Organizing a Drupal Global Training Days event is a great way to spread the word about Drupal, engage more novice developers, and increase adoption. If you haven’t heard about Drupal GTD, typically, these events are a one- or two-day training where experienced developers deliver sessions about Drupal and teach attendees how to create their first Drupal website or module.

Why is this event called global? Because it is held globally on or around a certain date every quarter. Organizers in North and South America, Europe, Africa, Asia, and Australia come together to create the GTD movement and show the power of community-organized events worldwide.

GTD event in Saudi Arabia
Introducing Drupal 8 in Riyadh. Photo by @EssamAlQaie.

However, Drupal event organizers face challenges while running trainings and conferences. The Drupal Global Training Days Working Group conducted a survey to find out common difficulties for organizers and how the working group can offer support. We came up with an online questionnaire first, and then held several in-depth interviews with organizers to validate gathered quantitative data.

In this article, I've collected the most widespread challenges of Drupal Global Training Days organizers and brainstormed possible ways to overcome them based on my personal event organizing experience and good practices of other organizers. Let's dive in.

GTD event in Russia - group photo
Group photo at GTD in Omsk. Photo by @ADCISolutions.

Issue 1: a lack of speaker resources (documentation/PPTs of other organizers)

What do we teach? A lack of ready-made resources is the most common concern of organizers worldwide. And I admit that it is an issue. There is no special folder with all the up-to-date and ready to deliver presentations for everyone to use.

There are two main reasons for this:

  1. Most organizers hold GTD events in their local language: Spanish, Russian, etc., which makes such PPTs hardly usable in other countries.

  2. All the organizers have different event programs due to different profiles of attendees: in some countries, the attendees are mostly beginners, in other countries - experts.

These barriers make it harder (if not impossible) to have PPTs for all types of sessions and all types of audiences.

However, there’s a way you can overcome this problem: ask other organizers for help! Join the group on Drupal.org. There, you can create a new discussion and ask people to share their documents or at least their event programs, so you can adapt the materials to your audience and goals. Also, there is a Slack channel where you can do the same thing: ask other organizers for help.

I did this when we decided to add a workshop to our GTD. Previously, we only had theoretical sessions, but then we wanted to deliver a practical workshop. Since we had never run a workshop before, I asked other organizers if anyone had already delivered one. Mauricio replied and shared lots of materials they had used in Nicaragua.

We discussed his materials with our team of speakers and adapted some of them to our agenda and goals. Also very useful was the technology stack Mauricio uses at his events. We didn’t know which tools to use in our workshop, so we went with the ones Mauricio suggested, and they were helpful.

Incorporating the experience of Nicaraguan GTD events allowed us to avoid starting from scratch, and we delivered a quality event.

Also, there is a global initiative aimed to translate the Drupal 8 User Guide to different languages. If your local language is one of the following: Català, Magyar, Español, Українська, 简体中文, Français, Deutsch, فارسی,  Bahasa Indonesia, 日本語 — you can use these materials for your training. If your language is not on the list yet, you can help translate it.

Issue 2: a lack of unified promotional materials

The second widespread problem is a lack of promotional materials. The situation is similar to the first issue: different languages and different attendee profiles make it difficult to have a single set of materials to promote all GTD events. We discussed what we could do during some of our GTD Working Group calls, and we realized that it is best for each organizer to create local promotional materials, because they know their audience better and can convey the GTD main goal -- Drupal adoption -- in plain, comprehensible language.

But again, feel free to ask for help or advice in the group or Slack channel from organizers who have had success in promoting their events.

Issue 3: a lack of speakers

Some organizers mentioned that they often don’t have people to deliver sessions due to different reasons: developers are busy with work, or are too shy to speak, or there is no one at all to deliver sessions.

Scenario #1

If your potential speakers agree to prepare and deliver sessions but are too busy to really engage, schedule the event well in advance and help them plan a preparation timeline that doesn't take many hours per week. Show them that if they start preparing early and take it slow, they will be able to manage everything alongside their work. But, of course, it will require your time to send reminders and check in for good results.

Scenario #2

If your potential speakers are too shy to speak, try to involve them gradually. Ask them to prepare a short internal report on something simple for the session topic. Then provide them with quality feedback and highlight the strong sides of their presentation. And then, when they are ready, invite them to a bigger event. It will take some time, but your efforts will pay off when your event achieves your goals.

Scenario #3

You are a solo organizer or there is no one available to speak in your company. There are two possible ways to overcome this problem:

  1. Find speakers from other Drupal companies in your region. They will get their company's promotional support and you will get speakers with experience. The other option can be applied if you have a strong community in your city or region. Then you can invite people from the community to speak at the event.

  2. If the first way can’t be implemented, find remote speakers who join via video call, or even make the event fully online. The global Drupal community is HUGE. I’m sure you will find amazing and motivated speakers for your event within the community.

Issue 4: low conversion rates (many people sign up, few attend)

If this is a one-time thing, it might be just random low attendance. For example, the weather suddenly became terrible and people didn’t want to go outside at all. That is not specific to your event.

But if this constantly happens, you need to carefully assess your promotional efforts. Possible reasons:

  • Promotional materials are misleading: the attendees' expectations were not met. Check that you deliver what you promise, and improve your promotional materials and texts accordingly. To identify the gap between expectations and reality, organize a couple of in-depth interviews with representatives of your target audience, or add a question about expectations to the application form.

  • Attendee servicing issues: usually, a couple of weeks pass between the date a person submits an event application form and the actual event date. A person can forget about the event, lose interest, or change their plans. To avoid that, plan some touch points with people who have already registered: send emails, describe the sessions and program, send reminders.

  • Not exactly your target audience: if your promotional materials are distributed to channels that are popular among a wider audience (not only tech students but all students, for instance), you might receive many sign-ups but only tech people will attend. Be mindful when choosing channels for promotion; make sure your target audience is there.

There may be other reasons why you have a low conversion rate: it's difficult to get to your venue, the price (if an event is paid) is too high, the agenda is not that interesting, etc. But you need to find the reasons for your particular case. Talk to attendees, ask them what would improve their experience, and your conversion rate will increase when you implement their feedback. Just don't forget to mention your improvements in your promotional materials so people know that you really listened to them.

Issue 5: a lack of money for a venue, coffee breaks

How great is it when you come to an event and there is free coffee there? To ensure your attendees will come to a comfortable venue and enjoy sessions after a welcome coffee break, you can take simple steps:

  • Organize the event in the office of an IT company: it is great if your office is good for events, but if it isn’t, you can approach bigger companies and offer a collaboration: you will organize the event and they provide a venue. Perhaps they can distribute their promotional materials during the event. Of course, it doesn't necessarily need to be an IT company office; you can find any comfortable and beautiful venue, but IT offices are often cool and prepared for IT events.

  • Organize the event at a university: you can try to negotiate with universities to let you organize your event there without a fee. Universities have two main advantages: they usually have all the necessary equipment for events, and they have tech students who might attend your event.

  • Find in-kind partners for your event: Drupal GTD is a globally supported event that attracts many people. You can use this to your benefit and find partners for your event who will provide you with goods or services in exchange for promotion. You can put their banner in the event venue. It won’t hurt your event to have a few banners, and you will be able to offer your attendees a more pleasant experience without any monetary investment.

Issue 6: no sponsors

Usually, most of the event's needs can be covered with the help of in-kind partners. It is much easier to find in-kind partners than sponsors. So, I would recommend you start with in-kind partners, especially if you have never sold event sponsorships before.

If, however, you need sponsorship:

  1. Create a pool of companies who might be interested in your event.

  2. Find contacts for decision-makers from these companies.

  3. Come up with a list of benefits you can offer to them (promotion, employer branding, speaking at the event, etc.) and set a price.

  4. Create a customized proposal for each company.

It’s best to set up a meeting with each company, because talking with a person lets you see which of your benefits are most relevant for them and provide more information.

If some companies support your local Drupal or IT community, it is likely that they could support your event.

Issue 7: a lack of awareness

The GTD events are not well-known in some regions, which makes it harder to find sponsors and speakers and to attract attendees. If this is your case, inform Drupal companies in your region about the benefits of participating in the GTD movement: promotion within the Drupal Community, the possibility to be featured in blog posts and tweets, a spot on the Drupal events map, and even credits on Drupal.org!

In order to promote your event and attract more attendees, you can focus on the fact that the same events are being organized on the same day all over the world. It usually inspires people and makes them curious to attend. Feel free to use the videos from GTD organizers (video #1, video #2) during your promotion or on-site at your event.

GTD event in India - speaker on stage
Mr. Thomas speaks at GTD in Kerala. Photo by @zyxware.


Event management is a complex and sometimes complicated process, but it is interesting and offers huge opportunities for improvement and experimentation. I’ve been organizing events of different scales for years, and I still find something new in the work.

That is why I wish all GTD organizers proactivity, creativity, and consistency. These qualities will help you run wonderful events and engage hundreds of people with Drupal!

And the Drupal GTD Working Group (paych, lizzjoy, dinarcon, rgs, rachit_gupta, pendashteh, solomonkitumba) is always here to lend you a hand and help with advice.

Feel free to contact me with any questions about this article or event management in general.

Join the Drupal GTD Group and Slack Channel, and follow us on Twitter :)

Happy GTD!

Oct 03 2018
Oct 03

Recently, The Guardian Insurance Company made the strategic business decision to start marketing and selling their products directly to consumers. While Guardian has been around for nearly 160 years (WOW!), most consumer experiences with their products stem from employer insurance coverage offerings. As the industry landscape evolves and the US workforce moves slowly towards distributed and independent employment, Guardian endeavored to differentiate their offerings not only from industry stalwarts but also from up-and-coming, startup-like products catering to the same demographics. Mediacurrent was proud to be chosen by Guardian Insurance Company as their Design and Strategy partner during product development of their new direct-to-consumer website.

Now that the site has launched, we’re happy to share some behind-the-scenes details of how this fresh, new experience came together. While our Case Study gives a broader overview of the project, this blog post will provide additional insight into our data-driven design process.

Part 1 of this 2-part series covers the early planning parts of our process including Strategic Design Discovery, Style Tiles, and Wireframing. Part 2 will cover Mockups & Visual Design, Custom Illustration, and Custom Iconography. Let’s dive in!

Strategic Design Discovery

A Discovery phase begins many of our projects at Mediacurrent. This phase is led by a cross-functional group of our world-class team to help frame the challenges ahead. Throughout this phase, we gathered comprehensive knowledge of the Guardian brand, its customers, its competitors, and its business. These strategic design discovery insights allowed us to understand the types of consumers the new product is being geared towards (user personas), how success is being measured (Key Performance Indicators or KPIs), the ways in which a person becomes a customer (conversion paths) and a boatload of other data that served as our “guiding light” throughout the process.


Wireframing

Next, we moved on to really digging into the user personas and conversion paths generated in the Discovery phase by creating wireframes for the different sections of the site. One of the most exciting parts about this project was that we were not only designing the marketing section of the site but also the entire customer experience from the point where the visitor is attracted, becomes a lead, and eventually converts to a customer. This means that in addition to top-level marketing pages – like the home, about, and product pages – user journey designs were also needed for getting a quote and enrolling in one or multiple products.

Throughout the wireframing process, we broke down the different types of pages and sections of the site that users will encounter when visiting and organized the placement of content and calls-to-action (CTAs) by mapping the layout structure to the user journeys and KPIs identified in Discovery. Working from the Mobile First perspective, we made sure to consider this hierarchy not only on desktop machines but also tablets and mobile phones. In this case, we decided to take a medium fidelity approach – meaning that we avoided color or imagery in order to maintain focus on content organization and the user journey. We did include the typographic and iconography styles defined in our approved Style Tile, and we simulated actual copy since the process included quoting and checkout workflows which would not have made sense with greek copy.

In the end, these blueprints ensured that the user experience was providing the clearest path for a visitor to learn about the products, understand the cost and coverage offered, then ultimately enroll in a Guardian Insurance plan.

The goals mapped through the top-level marketing page wireframes were not only to educate the consumer about the company and the products offered but, more importantly, to serve as a hassle-free gateway to the actions the business measures – namely generating a quote and enrolling. On the homepage, Guardian wanted to make sure a newly developed brand message – Life is full of surprises. The cost of paying for them shouldn’t be. – was clearly communicated so we even started to play with some interaction suggestions. 

medium fidelity wireframe focuses on content organization and the user journey

Desktop, tablet, and mobile homepage wireframes.

The goals mapped through the quoting experience provided a simple way to understand the cost and coverage offered for a variety of different types of consumers – single person, couple, children, adult dependents, etc.

wireframe shows progression toward getting a quote

Tablet-width wireframes of the Find a Dental Plan (or quoting) process.

The goals mapped through the enrollment/checkout process were to 

a) keep the experience as simple and logical as possible for all types of visitors; 
b) allow them to enroll in one or multiple products at the same time;
c) gather all legally required information and consent – which varies between products.

wireframes show user journey to find a price quote

Desktop-width wireframes of the Enrollment (or checkout) process.

Along with these broader page-level experiences, microinteractions – such as saving and retrieving quotes – were considered and wireframed to ensure that the experience was cohesive.

Style Tiles 

With these insights in hand, we began creating Style Tiles that took the brand’s visual guidelines, placed them in an interactive digital context, and expanded on them where we saw the need and/or opportunity. This process created a high level view of the visual tone of the new website. Adding a bit of complexity, Guardian was deep in the midst of a larger corporate rebrand when our project began. In this case, we had to consider the existing brand guidelines, be flexible enough to incorporate new brand elements as they were provided, and ensure that the digital experience was coherent, unique, and accessible to all users – a critical concern identified during Discovery. 

Three concepts were presented initially:

1. New Blue Suit

This concept expresses subtle sophistication through the use of color, typography, and whitespace. It relies heavily on the brand’s primary blue hue as a color that reinforces trust, loyalty, and integrity. An overall contemporary, minimalist approach is suggested as a means of reducing cognitive load. The icon style followed these principles as well by choosing an outline style with subtle monotone accents. Typographically, we included the fonts defined in their brand guidelines in order to maintain consistency with existing materials.

style tile includes blue and gray color palette and brandon grotesque typography

2. Happy Place

This concept is lively and pleasant using bright colors to create a friendly experience. We expanded the brand color palette to add cheerful, accessible hues able to be used in a variety of UI elements. Typographically we pushed the existing brand guidelines by incorporating a new font – Open Sans – as its wider variety of weights allows it to be used more expressively than the currently defined Arial family. Our type treatments utilized a lighter weight body font to balance the heavy use of color and maintain valuable whitespace. The icon style suggested takes a more fully-realized illustrative approach making use of the expanded color palette and adding dimension through highlights and shadows.

style tile with primary color palette

3. Gilded Skies

This concept reduces the color palette and adds trendy accents. Broader than New Blue Suit, but more restrained than Happy Place, this example’s color palette features a rich gray, trusted blue, and adds shades of gold to suggest value, elegance, and quality. Here we’ve suggested a hand-drawn icon style that personalizes the experience with a more genuine feel. We also included photography style suggestions that feature color and light effects to extend the color palette and maintain a clean and crisp look.

style tile features photography examples and a gray, blue, and gold palette

Through this exercise, we were able to understand just how far Guardian was willing to push their existing brand and create a final document that was approved by all stakeholders as the visual voice we wanted to achieve with the final product. In the end, and with a few iterations in between, we landed on what was essentially a blend of New Blue Suit and Gilded Skies as the path forward. This final Style Tile was chosen in order to maintain consistency with Guardian’s larger, existing brand guidelines while introducing elements that would help create a more casual tone appropriate for the demographic of their direct-to-consumer offerings.

final style tile includes blue, grey, gold and red tones, typography, buttons, and icons

Stay Tuned!

Our process does not end here but part 1 of this post does, unfortunately. :( Keep an eye out for part 2 where we’ll dig into the visual design details of the site! We’ll look at different visual concepts that were created, how the design was finalized with custom illustrations and iconography, and the component-driven approach we follow. For now, just direct your feet to the sunny side of the street!

Oct 03 2018
Oct 03

At Drupal Europe in September, the Association was thrilled to announce that DrupalCon Europe would be returning in 2019. During the gap year, we knew we wanted to transform the event to improve its fiscal sustainability, pilot a new more-scalable model that we may be able to bring to the rest of the world, and most of all ensure that the event still has the close care and attention of the local community. We believe our partnership with Kuoni Congress through the new licensing model we've established will accomplish each of those goals, and we're excited to see its first iteration in Amsterdam next year.

The Community's Homecoming

DrupalCon has always represented a homecoming for the community—and preserving that sense of belonging is critical to the future of the event. That's why we're pleased to announce that a DrupalCon Community Advisory Board has been chosen, chaired by Baddy Breidert and Leon Tong.

The community advisory committee consists of:

This membership represents a good cross-section of countries and roles within the community, with a focus on European representation to support the event. While Kuoni handles the operational and execution side of the event, this committee will help to inform the content and will bring more than a decade of historical knowledge about Drupal events. The charter of the committee is as follows:

  • Advise on programming
  • Create and oversee the content selection subcommittee
  • Create and oversee the volunteer subcommittee (room monitors, etc.)
  • Create any additional subcommittees as needed

In addition to this, members of the committee and the Drupal Association team will be traveling to Vienna in late November for a 2-day intensive kick-off with the Kuoni Congress team.

We want to thank everyone who showed interest in joining the committee, and we will contact each and every one of you to offer the opportunity to take on some tasks for DrupalCon Amsterdam 2019.

We'll see you in Amsterdam!

About Kuoni Congress

Kuoni is a professional event services organization with offices around the globe, including more than 20 in Europe. The team building DrupalCon Amsterdam is located in Europe, and attended Drupal Europe in September 2018, where they were incredibly impressed by the community spirit and professionalism of the Drupal community. The Kuoni team is proud to partner with the Drupal Association and the Drupal community to bring DrupalCon back to Europe in 2019.

Oct 03 2018
Oct 03

Sooperthemes Glazed Builder is a Drupal drag-and-drop page builder made to revolutionize the way professionals build websites with Drupal. Glazed Builder is fully integrated with both Drupal 7 and Drupal 8 and gives marketing staff, content teams, and site builders the tools to do a faster, more efficient job! And guess what? You can build a fully functional website with Glazed Builder without ever having to touch a line of code!
The builder provides 36 fully customizable elements to help you create and share beautiful, appealing content on your website without breaking a sweat. In today's blog article we are going to cover five Glazed Builder elements and show you how to use and personalize them:

1. Countdown Timer

The first Glazed Builder element we are going to talk about is the Countdown Timer. Using this element is a great way to create a sense of urgency around your campaign and persuade customers to act soon if they don't want to miss a limited-time offer or that amazing upcoming event your team worked so hard to put together.

Currently there are six countdown timer styles to choose from, with options to count down to a specific day, hour, minute, and second. It is also possible to set date and time limitations: a specific date only, date and time, time only (repeating every day), and a resetting counter (with an interval of up to 24 hours).

How to Use the Countdown Timer element

In order to create a new Countdown Timer, we are going to choose "Add Elements" by clicking the "+" icon. 

Oct 03 2018
Oct 03

Hopefully, content editors and website owners are ready to gasp in admiration, because we know they will when they discover the Gutenberg editor in Drupal 8. The eighth version of Drupal already boasts easy content publishing, custom layout creation, and, as of Drupal 8.6.0, improved media handling and remote video embedding. However, there is no limit to perfection! The Gutenberg editor is bringing unprecedented content creation features to Drupal 8.

Gutenberg editor in Drupal 8

Where the Gutenberg name comes from

In the 15th century, Johannes Gutenberg of Germany invented the printing press, which revolutionized book creation forever. The Gutenberg editor was named after him and is expected to bring equally revolutionary web content creation experiences. Well, every epoch needs its own Gutenberg!

The essence of Gutenberg editor’s work

Gutenberg is a modern, open-source user interface for creating rich pages. It is an app built in JavaScript — more precisely, React.js.

Content editors can play with Gutenberg UI to shape the exact look of their pages. They can achieve theming effects otherwise available only to CSS and HTML experts — but easy for anyone with this editor.

Any page elements can be added as blocks and formatted to your heart’s content. These can be image galleries, texts, tables, lists, shortcodes, buttons, columns, social media embeds, paragraphs, quotes, verses, files, and so much more.

Bringing Gutenberg editor to Drupal 8

Gutenberg.js was created with WordPress in mind. WordPress has already adopted the editor, which is promised to be fully ready with the release of WordPress 5. However, this JavaScript app was so appealing that the Drupal community decided to adopt it as well.

Drupal developers Per Andre Rønsen and Marco Fernandes introduced Gutenberg to everyone at Drupal Europe in Darmstadt and created the Gutenberg Drupal module to connect the editor to Drupal 8 websites.

This Drupal 8 module is in alpha state at the moment of writing, so it is not completely ready for use. The module has a number of open issues, and the same applies to the Gutenberg editor itself.

Considering the extensive works being performed for both the app and the module, this situation can change very quickly.

How the Gutenberg module in Drupal 8 works

With the Gutenberg Drupal 8 module installed and enabled, the Gutenberg editor needs to be switched on for the desired content types by checking the “Gutenberg experience” option.

Enable Gutenberg editor for content type in Drupal 8

The module requires a field of “long text” type with the Gutenberg text format selected.

Select Gutenberg text format for content type in Drupal 8

When it’s done, every node opens for editing in the Gutenberg UI. As we see, its interface is pretty much white space, with the menu options hidden until you need them.

By clicking a plus icon in the left-hand corner, you are offered the list of page elements that can be added as blocks. For your convenience, blocks are searchable via the search box.

They fall into six main categories:

  • Drupal blocks

Content body, title, image, user ID, picture, email, comments, and many more.

  • Common blocks

Paragraph, image, heading, gallery, list, quote, audio, cover image, file, video.

  • Formatting

Coding, preformatted, pullquote, table, verse.

  • Layout elements

Button, columns, “more,” page break, separator, spacer.

  • Widgets

Shortcode, archives, categories, latest comments, latest posts.

  • Embeds

Over 30 social networks and media providers to embed content from.

List of blocks in Drupal 8's Gutenberg editor interface

Once a block is added, you can open its detailed settings in the right-hand corner. For example, you can choose the background and text color, the size of the letters, the number of columns, and so on, depending on the block. Many blocks have an advanced option to add CSS styles.

Configuring a block in Drupal 8's Gutenberg editor

You can move the blocks up and down, align them as you wish, switch their block type, duplicate or remove them. In addition to the visual editor, Gutenberg also has the code editor.

To recap

Drupal itself works like the best of Legos. And now it is getting another interesting Lego box inside it — Gutenberg editor.

The time has come for exclusive content creation opportunities! If you wish to migrate to Drupal 8 to enjoy them, or if you need help with installing and configuring any modules like Gutenberg editor in Drupal 8 or other, contact our Drupal web development company.

Oct 03 2018
Oct 03

In the previous post, we created and booted a fully dockerized Drupal setup. We will be using Ansible to automate the whole deployment process from start to finish.

Why Ansible

Primarily because I'm a huge fan of Ansible. It is agentless, has a great ecosystem, and its YAML syntax is simple to read, understand, and maintain (honestly, sometimes it is tiring to figure out what exactly is happening). This could be automated with any other provisioning tool, like Chef or Puppet, as well.

The other decision we will take is to make the Ansible playbooks part of our codebase, living alongside our Drupal code. Having your infrastructure and deployment as part of your code is considered an industry-wide good practice. All in all, this will be a self-contained repository with both the code and the instructions on how to deploy it. It is still not technically a 100% infrastructure-as-code setup, as we only check in the provisioning scripts, not the code to spin up the actual servers. The playbooks assume that the servers are already there, with Docker and Docker Compose installed and SSH access available.
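To make this concrete, a self-contained repository along these lines might be laid out as follows. The names below are illustrative assumptions, not taken from the actual project:

```text
.
├── docker-compose.yml    # the dockerized Drupal stack from the previous post
├── composer.json
├── web/                  # Drupal docroot
├── playbook.yml          # Ansible playbook, with setup and deploy tags
├── templates/            # Jinja2 templates for env-specific files
│   ├── dotenv-prod.j2
│   ├── dotenv-dev.j2
│   └── robots.txt.j2
└── vault-env             # prints the vault password from the environment
```

Everything a teammate needs to deploy the site lives in one clone of this repository.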

This setup makes the deployment process consistent and repeatable. Any developer on your team with the necessary permissions can run the script and get the same results every time. Also, when the build fails, it fails loud and clear, so you know where it went wrong.

Some limitations

I'd like to point out the limitations of this setup before we dive in. For simplicity's sake, I don't guarantee a rollback for this process. If, for instance, a deployment fails and you want to roll back to the previous state, you have to do it manually. The setup has no provision to roll back, though it does store DB backups. That said, it wouldn't be too difficult to add a rollback mechanism with a rollback tag and some parameters, like which commit to roll back to, which DB backup to restore, etc.

There is a small downtime when the older containers are brought down and the new containers are built and brought up. This is not a big concern for a small-to-medium site. If you are traffic heavy, you need to take steps to prevent it. I'm working on alternative solutions for this one and am open to suggestions.

Drupal's nature makes 12-factor practices like rollbacks and zero-downtime deployments generally hairy to implement.

what steps to run

An important precursor to automating is to document and have a script for each step. Fortunately, we have most of them that way for our stack. We can divide our tasks into two broad categories:

  • stuff we do on a one-time basis when we setup the system, Ex: creating DB backup directories
  • stuff we do for every deploy, Ex: running DB updates via drush

Ansible has the concept of tags, which we will exploit here. We define two tags: one called setup, another called deploy.

List of setup only tasks:

  1. Create a directory for DB files to persist
  2. Create a directory for storing DB backups
  3. Create a directory for storing file backups
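The three setup-only tasks above map naturally onto a single Ansible `file` task with a loop. The directory paths below are illustrative assumptions, not from the original playbook:

```yaml
# Setup-only: create the directories that persist data across deployments.
- name: Create persistence and backup directories
  file:
    path: "{{ item }}"
    state: directory
    owner: deploy
    mode: "0750"
  loop:
    - "{{ project_path }}/data/db"        # DB files persisted outside the container
    - "{{ project_path }}/backups/db"     # DB dumps taken before each deploy
    - "{{ project_path }}/backups/files"  # Drupal files directory backups
  tags:
    - setup
```

Because the task only carries the setup tag, it runs when the server is first provisioned and is skipped on regular deploys.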

List of tasks for both setup and deployment:

  1. Create a backup of files and DB
  2. Clone the correct code, i.e. the specified branch or bleeding edge
  3. Create the .env file (this is not checked in, so it needs to be created)
  4. Build and boot the latest containers for all services
  5. Run composer install (Drupal specific)
  6. Run DB updates (Drupal specific)
  7. Import config from files (Drupal specific)
  8. Clear cache (inevitably Drupal specific)
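A sketch of how some of these steps could look as Ansible tasks. The repository URL and the php service name are illustrative assumptions, not from the original playbook:

```yaml
# Fetch the requested revision of the codebase.
- name: Clone the correct code
  git:
    repo: "git@example.com:myorg/mysite.git"  # hypothetical repository
    dest: "{{ project_path }}"
    version: "{{ git_branch | default('master') }}"
  tags:
    - setup
    - deploy

# Rebuild images and recreate containers; this is where the short downtime happens.
- name: Build and boot latest containers for all services
  command: docker-compose up -d --build
  args:
    chdir: "{{ project_path }}"
  tags:
    - setup
    - deploy

# Drupal-specific steps, run inside the (assumed) php container via drush.
- name: Run DB updates, import config and clear cache
  command: "docker-compose exec -T php drush {{ item }} -y"
  args:
    chdir: "{{ project_path }}"
  loop:
    - updatedb
    - config-import
    - cache-rebuild
  tags:
    - setup
    - deploy
```

Tagging each task with both setup and deploy means the same tasks run during initial provisioning and on every subsequent deploy.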

security considerations and playbooks

It is important to secure your servers before you deploy the application; this hardening is an Ansible playbook as well, and it can be used for any web application, not just this stack. Also, when running the playbook, you will be handling a lot of sensitive information, like DB credentials, the SSH key pair, and the server user credentials. If we follow the infrastructure-as-code paradigm, it is not practical to check these in as part of your code. Ansible helps us store them in an encrypted fashion, taking a password to encrypt or decrypt them. This password can be supplied via a user prompt or from environment variables. We will use the latter approach so that it is easier to fully automate in the future.

We will use the Ansible template module to create the nginx configuration file and .env files.

- name: Create .env file
  template:
    src: "templates/dotenv-{{ env }}.j2"
    dest: "{{ project_path }}/.env"
    owner: deploy
  tags:
    - deploy
    - setup

non prod environments

This setup allows you to easily create production replicas or other non-production environments. This is where the script shines. You have to make the following environment-specific changes:

  1. Where your codebase will live on the servers. Environments won't share code for obvious reasons.
  2. Environment-specific credentials, like those for the DB.
  3. The environment name
  4. The environment specific domain, like staging.example.com

I've written the script to support a prod environment and a non-prod dev/staging environment. You can extend it to as many environments as you want. This feature is handy if you want to:

  1. Showcase a new feature to a client
  2. Reproduce a production bug and fix it
  3. Test an unshipped feature

NOTE: You have to make sure to keep search engine crawlers away from your non-production sites. I thought of using .htaccess password protection to achieve this, but ditched that approach in favour of editing the robots.txt rules.

Here's how I do it in the playbook:

- name: Update robots.txt to disallow search engines for non prod site
  template:
    src: "templates/robots.txt.j2"
    dest: "{{ project_path }}/web/robots.txt"
    owner: deploy
  tags:
    - deploy
    - setup
  when: env != "prod"

And the contents of my non prod robots.txt,

User-agent: *
Disallow: /

Yeah, keep off my staging site, you greedy crawlers! (Well-behaved crawlers respect that.)
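Since robots.txt rules are easy to get subtly wrong, you can sanity-check the generated file with Python's standard-library parser. The hostnames here are illustrative:

```python
from urllib.robotparser import RobotFileParser

# The non-production robots.txt rules, as rendered from robots.txt.j2.
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# No path on the staging site should be crawlable by any well-behaved bot.
print(parser.can_fetch("Googlebot", "https://staging.example.com/"))  # False
print(parser.can_fetch("*", "https://staging.example.com/node/1"))    # False
```

A check like this could even run as a post-deploy assertion against the live staging URL.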

Running ansible

If you are running the deployment setup for the first time, run the setup tags first (assuming you've secured your servers and have Docker and friends installed).

First, set the vault password in your shell.

$ export ANSIBLE_VAULT_PASSWORD=supersecret

$ ansible-playbook -i "dev.example.com," playbook.yml  --vault-password-file=./vault-env --extra-vars "env=dev" --tags setup

Of course, you need Ansible installed on your local machine for this to work. The -i flag specifies the inventory, i.e. the machine the script will run against, and --vault-password-file points to where Ansible should find the vault decryption password (otherwise it will fail trying to decrypt the secrets). Notice that I inject the environment from the command line and run the setup-related tasks first. This will set up and deploy your site for the first time.

Once you make changes and want to deploy, you run,

$ ansible-playbook -i "dev.example.com," playbook.yml --vault-password-file=./vault-env --extra-vars "env=dev" --tags deploy

This will run only the deployment-related steps. You have successfully created a one-step build and deploy process for your Drupal site. Now if only the whole thing ran when I do a git push: push to master and a production deployment happens; push to a dev branch and it deploys to staging. We are talking about continuous delivery here, the holy grail of any agile team. That will be the subject of the next post!

Oct 03 2018
Oct 03

One of our customers asked how to tweak the fields of a table output by Views to give the table a cleaner look.

They were looking for a way to merge the fields of the first and second columns. They also wanted to display the file download link just with an icon.

There are a couple of ways to achieve this. One of them is to rewrite the output of Views’ fields.

This tutorial will explain how to rewrite the results of any Views field independently of the view's display type (e.g. table, list, grid).

Let’s start!

Step #1. Download the Required Module

In order to link the image field to the file, it’s necessary to install the Linked field module.

  • Type the following line in your terminal:

composer require drupal/linked_field

This will download and place the module in the modules’ folder of your Drupal installation.

  • Click Extend and enable the Linked Field module.
  • Click Install.

Click to enlarge the image:

Click Install

Step #2. Create the Content Type

  • Click Structure > Content types > Add content type.
  • Create a content type and give it a proper name, for example Issued Paper.
  • Click Save and manage fields.
  • Add the following fields:
    • Label: Authors / Field type: Text (plain).
    • Label: Pages / Field type: Text (plain).
    • Label: Reference / Field type: Number (integer).

Click to enlarge the image:

Add the following fields

  • Click Add field.
  • Select File from the dropdown list.
  • Give this field a proper label.
  • Click Save and Continue.

Click to enlarge the image:

Click Save and Continue

  • Leave the default values and click Save field settings.
  • Scroll down to the Allowed file extensions option, delete txt and replace it with pdf.
  • Click Save settings.

Click to enlarge the image:

Click Save settings

  • Click Add field.
  • Under the Reuse an existing field dropdown select Image.
  • Click Save and continue.

Click to enlarge the image:

Click Save and continue

  • Check this field as required.
  • Upload a default “pdf icon” image.
  • Write an alt text for this image.

Click to enlarge the image:

Write an alt text for this image

  • Change the file directory to pdf-files.

Click to enlarge the image:

Change the file directory to pdf-files

  • Scroll down and click Save settings.
  • Click Manage form display.

Click to enlarge the image:

Click Manage form display

  • Drag the Image field to the Disabled section and click Save.

Click to enlarge the image:

Drag the Image field to the Disabled section

By disabling the field in the form, giving it a default value and making it required, we make sure that the same image (in this case an icon) will be displayed on all nodes of the same content type.

  • Click Manage display.
  • Drag the File field to the disabled section.
  • Click the cogwheel besides the Image field.
  • Check Link this field.
  • The destination will be the files folder. From the Available tokens section, select the [node:field_file_download:entity:url] token.
  • Click Update.

Click to enlarge the image:

  • Rearrange the fields.
  • Click Save.

Click to enlarge the image:

Click Save

Step #3. Create the Content

  • Click Content > Add Content > Issued Paper.
  • Create three or more nodes.

Click to enlarge the image:

Create three or more nodes

Your Content listing screen should look more or less like this:

Click to enlarge the image:

Your Content listing screen should look more or less like this

Step #4. Create the View

  • Click Structure > Views > Add view.
  • View Content of type Issued paper.
  • Check the Create a page checkbox.
  • Display it as a Table of fields.
  • Click Save and edit.

Click to enlarge the image:

Click Save and edit

  • Click the Add button in the FIELDS area.
  • Type the name of your content type in the search box (e.g. “Issued paper”) - it works best with the machine name.
  • Select all fields except the body field.
  • Click Add and configure fields.

Click to enlarge the image:

Click Add and configure fields

  • Click Apply and continue five times for now.
  • Rearrange the fields like in the following image.
  • Click Apply.

Click to enlarge the image:

Click Apply

  • Click the Authors field and exclude it from the display.
  • Click Apply.

Click to enlarge the image:

Click Apply

  • Click the Title field, uncheck Link to the content and click Apply.
  • Click the File Download field, exclude it from display and set the Formatter to URL to file.

Click to enlarge the image:

Click the File Download

  • Select the Image field, and set the Formatter to Image, choose Thumbnail 100x100 pixels.

Click to enlarge the image:

Select the Image field

Your Fields section should look like in the following image (click to enlarge it):

Your Fields section should look like in the following image

Now it’s a good time to rewrite the output of these fields. Rewriting results means replacing a field’s raw output with custom text, which can include the values of other fields.

You achieve this through the use of tokens. It may sound complicated but it’s not.

Let’s take a look at that!

Step #5. Rewrite Results

  • Click the Title field.
  • Click Rewrite results.
  • Select Override the output of this field with custom text.
  • Click Replacement patterns.
  • Copy the {{ title }} token and paste it in the text area.
  • Wrap this token between the h2 tags.
  • Copy the {{ field_authors }} token and paste it below the {{ title }} token.
  • Click Apply.
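Put together, the override text from the steps above ends up looking like this in the custom text area:

```twig
<h2>{{ title }}</h2>
{{ field_authors }}
```

The `{{ title }}` and `{{ field_authors }}` tokens are replaced per row with that row's field values when the view renders.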

Notice that you can only rewrite the results of a field with the tokens of fields that appear above it in the Fields section of the Views UI.

  • Click the Image field.
  • This time check the Output this field as a custom link checkbox.
  • In the Link path textbox enter the {{ field_file_download }} token (available in the REPLACEMENT PATTERNS section).
  • Click Apply.

  • Save the view and take a look at the page of this view on your frontend.
  • Click one of the download icons to test if they work.

The Rewrite results feature in views allows you to rewrite a field in views with other fields of the same view.

Thanks for reading!

About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Oct 02 2018

On September 10, 2018, the Drupal Association Board met at the DrupalEurope event in Darmstadt, Germany.

You can find the official meeting minutes and board packet on the Board Meeting Minutes and Materials page of the Drupal.org website.

Thank you to our outgoing class of board members

We want to say a special thanks to our outgoing class of board members. Their service has helped define the course for Drupal Association in recent years, and their contributions to this community are immense.

Donna is a long-time advocate of Open Source in Australia, and has served on the Drupal Association Board since 2012. Donna was a consistent voice for inclusiveness, global representation, and community. At Drupal Europe, Donna led a panel about the past, present, and future of the DA. Thank you, Donna!

Sameer served two terms on the Drupal Association Board, bringing his background as a Professor of Information Systems at SF State to provide historical insight into the wider open source world, as well as deep knowledge of Drupal in Higher Ed. Sameer's knowledge of strategic frameworks helped to level up board conversations.

Steve Francia joined the Drupal Association Board to bring to bear his perspective from leading a wide variety of different open source projects, including MongoDB, Docker, Hugo, and Go. Steve helped provide an understanding of Drupal's context in the larger world, and most notably, he shared that in his keynote at DrupalCon Nashville. Thank you, Steve!

Shyamala was elected in 2016 by the community to serve a two-year term on the Drupal Association Board. Shyamala enhanced the global perspective of the board, and helped to tie the Asian (and especially Indian) community more closely with the DA. Shyamala is now deeply involved in creating the local Indian Drupal Association.

Welcome to our new board members

Suzanne was elected by the community earlier this year to serve a two-year term on the Drupal Association Board. Suzanne has regularly spoken at Drupal events, runs an in-depth Drupal training program, and has more than a decade of experience in Drupal.

Vishal's role as the head of Open Source Technology Solutions at Tata Consultancy Services gives him a wealth of experience with the open source landscape, both in Drupal and beyond. His experience at a major SI will help the board drive conversations about adoption of Drupal as the hub of a web solution for major service providers.

Luma brings her experience as a Managing Director of Charles Schwab to share her knowledge of the Financial Technology space, and Drupal's impact on these organizations and end-users. Luma manages a large Drupal installation, and can provide a powerful end-user perspective on Drupal's future.

Until next time

We hope you can join us for our future board meetings, which will be announced soon.

Oct 02 2018

The Media module made its way into Drupal core for the Drupal 8.4 release a while back. It gives Drupal users a standardized way for managing local media resources, including image, audio, video, and document files. We wanted to add using this module into our Drupal Commerce demo site to give an example of how this module could potentially be used in a Commerce setting.

In this Tech Talk video, I’ll quickly show you how we updated our digital download Commerce product example to use the Media module, giving us the flexibility to add audio samples to the product page and access to the full download after purchase.

[embedded content]


The product I wanted to update is the Epic Mix Tape by Urban Hipster digital download example product. This is a fake album featuring all of your favourites by artists you’ve never heard before. The idea is to showcase that you can add digital products to a Drupal Commerce based online store, not just physical products.

Originally we were using just a standard file field that, when checkout was completed, gave the customer access to download the file. This was done before the Media module made its way into core. Now that the Media module is in core, we figured it’s time to update it.

Setting up an Album media type

When the Media module is installed you get some new admin menu items. The first is a section called Media Types (under Structure) where you can configure your media entities like any other Drupal content entity. Here I created an ‘Album’ media type with two unlimited file fields, one for sample audio tracks and one for the full audio tracks. This is the basis for creating my downloadable albums.

The second admin menu is under Content. Here you get a new Media tab which is where you can add, edit and remove any media items. Since I already created the Album media type I can now add the Epic Mix Tape album files here. This completes the media side of the updated digital download product. All I need to do now is update the product configuration to use it.

Completing the digital download product configuration

Now that the media type has been added and I’ve uploaded an album, I need to set up a way to use it. It’s pretty easy to do. First, for the digital download Product Type, I add an entity reference field to give a way for selecting the album media entity to use for the product samples.

I then do the same thing for the Product Variation Type. This one, however, will be used to give access to the full files after purchase.

Finally, some template updates. The Drupal Commerce demo site has some pretty custom template files for the products. In the template, I access the media entity directly and loop through the items, printing each audio sample and track title onto the product page. I do the same thing for the checkout complete page but print out the full tracks instead.

Depending on your templates and display settings, you can get similar results without manually accessing the files in the template file. However, I wanted to print the file description alongside the audio player right on the page, and showing the description unfortunately isn't an option with the standard audio display widget.
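As a rough sketch of the kind of loop described above (the `media` variable, the field name, and the markup here are all hypothetical; the demo site's actual templates differ):

```twig
{# Hypothetical: field_sample_tracks is the unlimited file field holding
   sample audio on the Album media entity. #}
{% for item in media.field_sample_tracks %}
  <h4>{{ item.description }}</h4>
  <audio controls src="{{ file_url(item.entity.uri.value) }}"></audio>
{% endfor %}
```

The same pattern, pointed at the full-tracks field, would cover the checkout complete page.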

And that’s it! Check out the Urban Hipster Drupal Commerce demo site below to see it in action.

Demo Drupal Commerce today! View our demo site.

Oct 02 2018

In the previous post, we created a setup to run Drupal + Docker in local. With a skip and a jump, we can make the same setup run in production as well. We'll do a deep dive of the same in this post.

12-factor apps and other goodies

One thing I'll keep repeating is how effectively we can "12-factor"ize our app. This will make a host of other best practices, like one-step builds and backup-restore, a lot easier. Drupal is a stateful application, so it cannot be readily 12-factorized. For example, the sites/default/files directory resides inside the code base, so Drupal cannot be scaled horizontally as easily as a stateless app. We still strive to make Drupal as 12-factor as possible by storing configuration in files, using environment variables etc. Here's a quick rundown of the 12-factor tenets and how we apply them in our Docker-Drupal context:

one code base, many deploys

This holds true for us by default. We can use the same code base and deploy to production and staging environments separately.


Dependencies

We explicitly declare and manage dependencies using composer.json and composer.lock.


Config

Site config is stored in files as part of the code. Other config, like database credentials and keys, is stored in environment variables. Also, we store Docker-specific settings in a .env file.

Backing services

These are the other services needed to run the app, in our case MariaDB. The 12-factor tenet treats these as attached resources. We add MariaDB as a service in our docker-compose file.

Build, release and run

We don't have a formal build-release-run pipeline yet; we will come back to this in a future installment.


Processes

Every part of the stack is a single container running one process (PHP-FPM, Nginx, MariaDB etc.).

Port binding

This essentially means that each service in the web app is bound to a port and a URL which can be referred to in another app. We refer to the DB in our settings.php using this approach. The Nginx container refers to the PHP-FPM process running on a port using the same approach.
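As an illustration, here's a hypothetical settings.php database definition for a setup like this; 'mariadb' is assumed to be the compose service name, and the MYSQL_* variable names are assumptions matching the .env approach used in this series:

```php
// Sketch only: the service name 'mariadb' resolves to the DB container
// inside the compose network, so no IP addresses are hard-coded.
$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'mariadb',
  'port' => 3306,
  'database' => getenv('MYSQL_DATABASE'),
  'username' => getenv('MYSQL_USER'),
  'password' => getenv('MYSQL_PASSWORD'),
];
```

Because the host is a service name rather than an IP, the same settings file works wherever the stack runs.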


Concurrency

In a 12-factor setup, scaling happens via the process model. We'll see how to achieve this using Docker containers in this post.

Disposability

Docker Compose's up and down commands let us stop and start the app gracefully and build new images on demand.

Dev/Prod parity

That's the whole premise for adopting this setup! We can have a clone of the production setup in both local and staging.


Logs

Docker allows you to stream logs either to stdout or to a file.

Admin processes

The 12-factor site states this as "Run admin/management tasks as one-off processes". I think drush fits the bill perfectly here.

domain names & traefik

How would you manage domain names in a VM-based setup? You would map them to their IPs. What about when you run containers, with different processes on the same IP but on different ports? That's exactly what Traefik handles for you. It actually offers much more than that: Traefik is a modern reverse proxy which integrates easily with your Docker containers.

When I first heard about Traefik, it sounded too good to be true. The closest thing I'd worked with for similar functionality was nginx-proxy. After fiddling with nginx-proxy unsuccessfully for a few days, I gave Traefik a try and never looked back. Traefik plays well with docker-compose and consumes the label metadata associated with a Docker container to expose it as a route. I'd also like to mention that Traefik is extensively documented. We will cover Traefik installation and setup in the deployment stage towards the end of this post.

Free certificates

Traefik allows you to use Let's Encrypt to automatically generate and manage SSL certificates for your domains. All it requires is completing an ACME challenge and the associated configuration. A challenge is a task posed by the Let's Encrypt Certificate Authority to prove that you own and control the domain you've requested an SSL certificate for. There are many ways to do this. In our case, we will use the DNS challenge, where Let's Encrypt asks you to add TXT records to your domain and then verifies they are there. We can automate the whole thing by supplying a DigitalOcean read-write API token (as I manage my domain in DigitalOcean) to Traefik, so that it can add the TXT records via the API and complete the verification. We also state in Traefik's configuration that we use the DNS challenge to verify that the domain belongs to us.
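For reference, the TXT record created during a DNS-01 challenge looks roughly like this (the value shown is a made-up placeholder for the token Let's Encrypt issues):

```
_acme-challenge.example.com.  300  IN  TXT  "<challenge-token>"
```

Traefik creates and removes this record for you through the DigitalOcean API, so you never have to touch your DNS zone by hand.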

The next step is to add Traefik-related annotations to our containers. We can do this in the production.yml file by annotating each container appropriately. You might want to choose which containers you expose to the outside world; not all containers need to be exposed. For instance, the database will only be visible to the PHP container. We set these rules in the production.yml file under the networks section. There are two overlay networks available to a container: one specific to the compose file and its associated containers (in our case, the nginx, mariadb and php containers all belong to this network by default), and the external proxy network. If you want a container to be visible to Traefik, you should add it to the external proxy network.

In addition to the network info, you provide additional metadata specific to Traefik, like which port you're exposing and which domain you want to expose it under. Our updated production.yml looks like this:

services:
  nginx:
    extends:
      file: docker-compose.yml
      service: nginx
    volumes:
      - ./:/code
      - ./deploy/nginx/config/default:/etc/nginx/conf.d/default.conf
      - ./deploy/nginx/config/drupal:/etc/nginx/include/drupal
    labels:
      - traefik.backend=example
      - traefik.frontend.rule=Host:www.example.com
      - traefik.docker.network=proxy
      - traefik.port=80
    networks:
      - internal
      - proxy

You can see that nginx is part of both the internal and proxy networks.

We also refer to both the internal and the external proxy network in the docker compose file.

networks:
  proxy:
    external: true
  internal:
    external: false

checking your app logs

Once you have booted your Docker containers, it is easy to check the logs of each service, like Nginx, PHP-FPM and the database. For instance, you can check the Nginx logs by running:

$ docker-compose -f production.yml logs nginx

Just replace the service name in the above command to check the respective service's logs. You can also add a -f option after the logs command to stream the logs.

a sample deployment

Now that we have a production-ready setup, we can deploy it to our servers. If you don't have a server ready, I recommend spinning up a new one on DigitalOcean, and if you are running a production site, be sure to secure the server. Once you've done that, clone your Drupal code to a convenient location on the server. By now, you can check in the docker-compose.yml and its accompanying production.yml files at the top-level directory of your codebase.

Also, install docker and docker-compose on the production server. We will be running all the services, including Traefik, inside containers. Next, set up your Traefik service. For this, we first create a Docker network (let's call it proxy); this is a one-time task. Every dockerized web app we create will have containers which belong to the overlay network specific to that stack, or to both that network and the proxy network. We can choose which stacks to expose to Traefik and thus the outside world.

Let's create the proxy network.

$ docker network create proxy

Traefik comes with a web console as well and requires some basic configuration to run. Here's the Traefik configuration for our Drupal site(s),

defaultEntryPoints = ["http", "https"]

[web]
address = ":8080"
  [web.auth.basic]
  users = ["admin:$apr1$NpIqapqV$PReV1wDm6xXjvqpl7PYqN0"]

[entryPoints]
  [entryPoints.http]
  address = ":80"
    [entryPoints.http.redirect]
      entryPoint = "https"
  [entryPoints.https]
  address = ":443"
    [entryPoints.https.tls]

The Traefik web console requires username/password credentials, which are specified above. The weird-looking password hash can be obtained by running htpasswd.

$ htpasswd -nb admin <my-password>

To tell Traefik that I'm using the DNS challenge, I add this section to my traefik.toml file:

[acme]
email = "[email protected]"
storage = "acme.json"
entryPoint = "https"
onHostRule = true
  [acme.dnsChallenge]
  provider = "digitalocean"

We decided to run Traefik as a docker container. As it comes with a fair bit of configuration, it's better run via a docker-compose file of its own.

version: '2'

services:
  traefik:
    image: traefik
    restart: always
    command: --docker
    ports:
      - 80:80
      - 443:443
    networks:
      - proxy
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - $PWD/traefik.toml:/traefik.toml
      - $PWD/acme.json:/acme.json
    container_name: traefik
    labels:
      - traefik.frontend.rule=Host:monitor.example.com
      - traefik.port=8080

networks:
  proxy:
    external: true

A few things of note here.

  • Traefik consumes port 80 and 443, the HTTP and SSL ports of your system. Make sure you don't run any other process in those ports.
  • You can see in the above config that Traefik runs on the proxy network.
  • I inject the DigitalOcean API token into the Traefik container (I use DigitalOcean to manage my infrastructure and DNS). This is what lets Traefik add the TXT records for the DNS challenge.

Also, this will be totally separate from your app codebase; the reasoning behind this is to reuse the Traefik setup for different web apps running on the same machine.

Let's boot the Traefik setup we created:

$ docker-compose up -d

NOTE: for Traefik to start successfully, you might need to create a file called acme.json, with write permissions for Traefik, in the same directory alongside traefik.toml and docker-compose.yml.
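One way to do that (Traefik 1.x refuses to use an acme.json whose permissions are more open than mode 600, so we restrict it too):

```shell
# Create an empty certificate store for Traefik and lock down its permissions.
touch acme.json
chmod 600 acme.json
```

Traefik will then write the issued certificates into this file as they are obtained.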

You can check the Traefik web console by hitting monitor.example.com in your browser (the domain you gave in your docker-compose file above). The first time, it will prompt for the credentials you specified in the toml file.

The next step is to boot your Drupal setup. Go to the Drupal codebase and create a .env file. This file:

  • is NOT checked in to your code base
  • contains environment-specific details and some sensitive information related to your site, like MySQL credentials.

If you want to run a staging setup of your site, clone the codebase and create a different .env file. This file is read by Docker when booting all your containers. Here's a sample env file.
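It might look something like this (every value below is a placeholder; match the variable names to whatever your compose files actually read):

```
ENV=production
SITE_URL=www.example.com
MYSQL_DATABASE=drupal
MYSQL_USER=drupal
MYSQL_PASSWORD=change-me
MYSQL_ROOT_PASSWORD=change-me-too
```

Keeping these values out of version control is what lets the same codebase serve production and staging with different credentials.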


Finally, boot your docker containers.

$ docker-compose -f production.yml up

Some Drupal-specific steps you need to do:

  1. Create a DB dump file in the /mariadb-init2/${ENV} directory of your server (this is in your production yml file) if you are porting an existing site. Do this before booting the containers so that MariaDB picks it up when booting.

  2. Run composer to install dependencies. To run composer in a docker setup,

    $ docker-compose -f production.yml run php composer install

  3. To run drush,
    $ docker-compose -f production.yml run php ./vendor/bin/drush --root=/code/web cr

You can add drush to $PATH in the Dockerfile and create an alias so that you need not specify --root, but those are cosmetic changes. You get the idea :)
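For instance, a small shell function (a convenience of my own, not part of the setup) that wraps the full invocation:

```shell
# Hypothetical wrapper: forwards any drush command into the php container.
drush() {
  docker-compose -f production.yml run php ./vendor/bin/drush --root=/code/web "$@"
}
# Usage: drush cr, drush updb -y, etc.
```

Drop it in your shell profile on the server and you get a drush-like workflow against the containers.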

Once you have your setup running, you can hit the domain you specified in the .env file to view the site. Congratulations! You have successfully created a fully dockerized production setup of your Drupal site.

How do you deploy changes to this setup? It would be awesome if we could just do a git push and have the deployment happen automagically, right? We will walk through exactly that setup in the next post! Till then, adieu.

Oct 02 2018
On the 28th of March 2018 the Drupal Security Team announced SA-CORE-2018-002, a serious Remote Code Execution vulnerability, which came to be known by many as "Drupalgeddon 2". Here's what we learned defending against it.
Oct 02 2018

Drupal Modules: The One Percent — Access by Reference (video tutorial)

[embedded content]

Episode 46

Here is where we bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll consider Access by Reference, a module which lets content editors easily grant other users access to specific nodes.

