Oct 02 2018

[embedded content]


I was recently looking at all the default views that come with Drupal 8. For people who don’t know, the Views module is part of Drupal 8 core. In Drupal 7 and below, it’s the most installed module, so during Drupal 8’s development it was decided to move Views into core.

During my exploration into all of the default Views, I noticed that in the People (User) view there was a filter called “Combine fields filter”.

Want to learn about Views? Read Build a Blog in Drupal 8: Using Views or watch it as part of our FREE Drupal 8 Site Building course.

Now just a quick side note: if you’re new to Drupal and Views, I’d highly recommend spending time walking through all of the default views to see how they were configured. You can learn a lot just by seeing how things are set up.

The “Combine fields filter” does a pretty cool thing. It allows you to search across multiple fields, or, put another way, it allows you to combine fields and then filter by their combined value.

How to use “Combine Fields Filter”

Using this filter is relatively straightforward. Just click on Add in the Filter criteria field-set. Search for the filter by name or select Global from the Category drop-down.

When configuring the filter, you can select which fields you want to search from the “Choose fields to combine for filtering” drop-down.

If you want to see what the actual query looks like, turn on “Show the SQL query” from the Settings page (admin/structure/views/settings).

Then in the preview area, you should see the query that gets generated.

The above example is from the “People (User)” view.
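Conceptually, the generated query concatenates the chosen fields (roughly CONCAT_WS(' ', field_a, field_b) LIKE '%keyword%') and matches the keyword against the combined value. Below is a rough JavaScript sketch of that behavior; it's illustrative only, not Views API code:

```javascript
// Illustrative sketch only: mimics what the generated SQL does
// (roughly CONCAT_WS(' ', field_a, field_b, ...) LIKE '%keyword%').
// This is NOT Views API code.
function combineFieldsFilter(rows, fields, keyword) {
  const needle = keyword.toLowerCase();
  return rows.filter((row) => {
    // Concatenate the chosen fields with a space separator, like CONCAT_WS.
    const combined = fields.map((f) => row[f] ?? '').join(' ');
    return combined.toLowerCase().includes(needle);
  });
}

const users = [
  { name: 'llama', mail: 'llama@example.com' },
  { name: 'alpaca', mail: 'fuzzy@example.com' },
];

// Matches the keyword against either the name or the mail column.
combineFieldsFilter(users, ['name', 'mail'], 'fuzzy'); // → the alpaca row
```

The key point is that one keyword box searches several columns at once, which is exactly what makes this filter handy on admin listings.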


If you want to add basic filtering across fields to your views, then this is the way to go. It’s useful for those custom admin pages which we create to help editors manage content. If you’re looking for something more advanced such as keyword searching, then look at using Search API.

Ivan Zugec

About Ivan Zugec

Ivan is the founder of Web Wash and spends most of his time consulting and writing about Drupal. He's been working with Drupal for 10 years and has successfully completed several large Drupal projects in Australia.

Oct 02 2018

Drupal 8’s REST API reached the next level of maturity in 8.5. In 8.6, we matured it further, added features, and closed some gaps.

Drupal 8.6 was released with some significant API-First improvements!

The REST API made a big step forward with the 6th minor release of Drupal 8 — I hope you’ll like these improvements :)

Thanks to everyone who contributed!

  1. File uploads! #1927648

    No more crazy per-site custom REST resource plugins, complex work-arounds or base64-encoded hacks! Safe file uploads of any size are now natively supported!

    POST /file/upload/node/article/field_hero_image?_format=json HTTP/1.1
    Content-Type: application/octet-stream
    Content-Disposition: file; filename="filename.jpg"
    [… binary file data …]

    then, after receiving a response to the above request:

    POST /node?_format=json HTTP/1.1
    Content-Type: application/json

    {
      "type": [{"value": "article"}],
      "title": [{"value": "Dramallama"}],
      "field_hero_image": [
        {
          // Note that this is using the file ID we got back in the response to our previous request!
          "target_id": 345345,
          "description": "The most fascinating image ever!"
        }
      ]
    }
    If you’d like a more complete example, see the change record, which explains it in detail. And if you want to read about the design rationale, see the dedicated blog post.
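To see the two-step flow as code, here's a hedged sketch that builds the two requests above as plain objects (nothing is actually sent; the entity type, field name, and file ID come from the example above and would differ on your site):

```javascript
// Sketch: build the two requests from the file-upload flow as plain objects.
// The paths, field name, and file ID mirror the example above; adapt to your site.
function buildFileUploadRequest(entityType, bundle, fieldName, filename) {
  return {
    method: 'POST',
    url: `/file/upload/${entityType}/${bundle}/${fieldName}?_format=json`,
    headers: {
      'Content-Type': 'application/octet-stream',
      'Content-Disposition': `file; filename="${filename}"`,
    },
    // body: the raw binary file data goes here.
  };
}

function buildNodeCreateRequest(fileId, title) {
  return {
    method: 'POST',
    url: '/node?_format=json',
    headers: { 'Content-Type': 'application/json' },
    body: {
      type: [{ value: 'article' }],
      title: [{ value: title }],
      // Reference the file ID returned by the upload response.
      field_hero_image: [{ target_id: fileId, description: 'The most fascinating image ever!' }],
    },
  };
}

const upload = buildFileUploadRequest('node', 'article', 'field_hero_image', 'filename.jpg');
const create = buildNodeCreateRequest(345345, 'Dramallama');
```

The essential bit is the hand-off: the first response returns a file entity whose ID the second request references in target_id.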

  2. The parent field on Term is now a standard entity reference #2543726

    Before, a term’s parent was normalized to an empty list:

      "parent": []

    Now it is a standard entity reference:

      "parent": [
        {
          "target_id": 2,
          "target_type": "taxonomy_term",
          "target_uuid": "371d9486-1be8-4893-ab20-52cf5ae38e60",
          "url": "https://example.com/taxonomy/term/2"
        }
      ]

    We fixed this at the root, which means it not only helps core’s REST API, but also the contributed JSON API and GraphQL modules, as well as removing the need for its previously custom Views support!
  3. alt property on image field lost in denormalization #2935738

    Given this normalized field value:

      {
        "target_id": 2,
        "target_type": "file",
        "target_uuid": "be13c53e-7f95-4add-941a-fd3ef81de979",
        "alt": "Beautiful llama!"
      }

    after denormalizing, saving and then normalizing, this would result in:

      {
        "target_id": 2,
        "target_type": "file",
        "target_uuid": "be13c53e-7f95-4add-941a-fd3ef81de979",
        "alt": ""
      }

    Same thing for the description property on file and image fields, as well as the title, width and height properties on image fields. Denormalization was simply not taking into account any properties that specializations of the entity_reference field type were adding!
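To make the bug concrete, here's a simplified model (not Drupal's actual serializer code) of a denormalizer that drops field-type-specific properties versus one that carries them over:

```javascript
// Simplified model of the bug, not Drupal's actual serializer code.
const input = {
  target_id: 2,
  target_type: 'file',
  target_uuid: 'be13c53e-7f95-4add-941a-fd3ef81de979',
  alt: 'Beautiful llama!',
};

// Before the fix: only the hard-coded reference key survived,
// so "alt", "title", etc. were silently lost.
function buggyDenormalize(item) {
  return { target_id: item.target_id };
}

// After the fix: every property the field type defines is carried over.
function fixedDenormalize(item, properties) {
  const out = {};
  for (const key of properties) {
    if (key in item) out[key] = item[key];
  }
  return out;
}

buggyDenormalize(input);                                // alt is lost
fixedDenormalize(input, ['target_id', 'alt', 'title']); // alt survives
```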

  4. PATCHing a forbidden field → 403 response now includes the reason #2938035

    Before: {"message":"Access denied on updating field 'sticky'."}

    After: {"message":"Access denied on updating field 'sticky'. The 'administer nodes' permission is required."}

    Just like we improved PATCH support in Drupal 8.5 (see point 4 in the 8.5 blog post), we again improved it! Previously when you’d try to modify a field you’re not allowed to modify, you’d just get a 403 response … but that wouldn’t tell you why you weren’t allowed to do so. This of course was rather frustrating, and required a certain level of Drupal knowledge to solve. Now Drupal is far more helpful!
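On the client side, the richer message makes it possible to surface an actionable error to users. A hypothetical helper (the message format is an assumption based on the example above, not a documented contract):

```javascript
// Hypothetical client-side helper for surfacing the improved 403 reason.
// The body shape ({"message": "..."}) follows the example above.
function explainPatchFailure(status, body) {
  if (status === 403 && body && body.message) {
    return `Not allowed: ${body.message}`;
  }
  return `Request failed with status ${status}`;
}

explainPatchFailure(403, {
  message: "Access denied on updating field 'sticky'. The 'administer nodes' permission is required.",
});
```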

  5. 406 responses now list & link supported formats #2955383

    Imagine you’re doing an HTTP request like GET /entity/block/bartik_branding?_format=hal_json. The response is now more helpful.

    Before:

    Content-Type: application/hal+json
    {"message": "No route found for the specified format hal_json."}

    After:

    Content-Type: application/hal+json
    Link: <http://example.com/entity/block/bartik_branding?_format=json>; rel="alternate"; type="application/json", <http://example.com/entity/block/bartik_branding?_format=xml>; rel="alternate"; type="text/xml"
    {"message": "No route found for the specified format hal_json. Supported formats: json, xml."}
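A consumer can use that Link header to discover the supported formats automatically. Here's a small sketch of a parser for the header format shown above (a simplified parser tailored to this shape, not a full Web Linking implementation):

```javascript
// Sketch: extract the alternate representations from a 406 response's
// Link header. Simplified parser for the format shown above.
function parseAlternateFormats(linkHeader) {
  const formats = [];
  // Each entry looks like: <url>; rel="alternate"; type="application/json"
  const entryPattern = /<([^>]+)>;\s*rel="alternate";\s*type="([^"]+)"/g;
  let match;
  while ((match = entryPattern.exec(linkHeader)) !== null) {
    formats.push({ url: match[1], type: match[2] });
  }
  return formats;
}

const header =
  '<http://example.com/entity/block/bartik_branding?_format=json>; rel="alternate"; type="application/json", ' +
  '<http://example.com/entity/block/bartik_branding?_format=xml>; rel="alternate"; type="text/xml"';

parseAlternateFormats(header); // two entries: application/json and text/xml
```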
  6. Modules providing entity types now responsible for REST tests

    Just like we achieved comprehensive test coverage in Drupal 8.5 (see point 7 in the 8.5 blog post), we again improved it! Previously, the rest.module component in Drupal core provided test coverage for all core entity types. But if Drupal wants to be API-First, then we need every component to make HTTP API support a priority.
    That is why in Drupal 8.6, the module providing an entity type contains said test coverage (A). We also still have the shared test coverage in the rest.module component itself (B). Put A and B together, and we’ve effectively made HTTP API support a new gate for entity types being added to Drupal core. Also see the dedicated blog post.

  7. rest.module is now maintainable!

    I’m happy to be able to proudly declare that Drupal 8 core’s rest.module in Drupal 8.6 can for the first time be considered to be in a “maintainable” state, or put differently: in a well-maintained state. I already wrote about this in a dedicated blog post 4.5 months ago. Back then, for the first time, the number of open issues fit on “a single page” (fewer than 50). Today, several months later, this is still the case. Which means that my assessment has proven true :) Whew!

Want more nuance and detail? See the REST: top priorities for Drupal 8.6.x issue on drupal.org.

Are you curious what we’re working on for Drupal 8.7? Want to follow along? Click the follow button at REST: top priorities for Drupal 8.7.x — whenever things on the list are completed (or when the list gets longer), a comment gets posted. It’s the best way to follow along closely!

The other thing that we’re working on for 8.7 besides the REST API is getting the JSON API module polished to core-worthiness. All of the above improvements help JSON API either directly or indirectly! I also wrote about this in my State of JSON API blog post. Given that the REST API is now in a solid place, for most of 2018 the majority of our attention has actually gone to JSON API, not core’s REST API. I expect this to continue to be the case.

Was this helpful? Let me know in the comments!


Oct 02 2018

Whilst at Drupal Europe last month, I was privileged to be invited by Drupal’s founder, Dries Buytaert, to a round table discussion, aimed at further marketing the Drupal project.

Bringing together a number of leaders from the Drupal community, we all shared the same desire to boost the marketable assets of the open source platform. One of the ways we hope to achieve this publicity is by creating a comprehensive, customer-facing "Pitch Deck".

The session began as a workshop, facilitated by Adam Goodman. We were guided to identify opportunities for delivering the benefits of Drupal to the uninitiated. The ultimate objective is to encourage the adoption of the Drupal platform. Consensus was reached that we focus upon three separate initiatives.


We're not competing with one another, yet we’re not helping each other either. Our role as leaders is to activate the assets that already exist in the community.

-- Bert Boerland

One of these three initiatives is a plan to create a comprehensive, customer-facing marketing resource, or "Pitch Deck". The resource will present Drupal’s credentials in a persuasive manner, containing impressive exemplar case studies, to ease the process of convincing an organisation or client to choose Drupal.

The Team

I volunteered to take overall responsibility for the creation of the end result. Joining forces with Suzanne Dergacheva and Ricardo Amaro, who bring rich, varied perspectives and skill sets, I feel confident providing the basis for this universal toolkit. But we can only be truly successful if many others contribute to our initiative. We need sales people, marketers, and copywriters to join our cause.

Get Involved Today

Providing a single and persuasive resource, available for all Drupal promoters, to sell the powerful advantages of Drupal will benefit all who use it. With strong consistent messaging, and bolstered by the many Drupal success stories, the deck will position all advocates better to expand the Drupal market share across many scenarios.

With a core team of fellow Drupal professionals, we plan to cover as many topics as we can identify, from security, accessibility and performance functionality through to specific industry verticals, like Higher Education or Media. The key intention is to show how Drupal can adapt to fit projects of all shapes and sizes, across all industries.


The Benefits

Many of Drupal’s competitors (think WordPress, Squarespace, etc.) are widely publicised and, consequently, innately popular. In many cases, Drupal may well be the ideal platform for a project, but it risks losing out to competing CMS providers because the success and potential of Drupal is not easily demonstrated.

Our intended users are sophisticated purchasers. As they ask more and more questions, our responsibility grows to equip agencies with comprehensive information. By using this collaborative resource, agencies will be able to accurately sell the Drupal platform whilst spending more of their energy and resources on the services they deliver. Freed from writing and re-writing duplicated Drupal sales material, organisations will be able to focus on promoting their unique strengths.

The Plan

We plan to kick off the project by identifying the high-level requirements and the mechanism to create the slide deck. From there, we hope to crowdsource for support, and seek volunteers from the wider business community. By recruiting sales people, marketers, copywriters and subject matter experts, we hope to create a well-rounded resource, targeted at the varied stakeholders of a new Drupal development project.

Brainstorm notes from the Drupal Europe roundtable - Photo by Meike Jung

By working together, embracing open source ideals, we hope to rapidly achieve the first incarnation of the slide deck, ready for it to be built upon in the future. The sooner we create a draft, the sooner we can share the potential of Drupal with a wider audience. Projects like this prove that you needn’t be a web developer to be part of the welcoming Drupal community.

Get Involved!

If you’re interested in getting involved with this innovative project, please get in touch via our web form. Any contributions, big or small, will be gratefully received, as we strive to convert this idea into a reality.

Join the cause, let’s make Drupal better together!

Get Involved Today

Drupal.org Issue: Drupal "Pitch Deck" for Presenting to (Potential) Customers 

Oct 02 2018
Take a look at the bottom right corner of this blog post. See it? That Echidna-red “speech bubble”? Go ahead… click on it. I’ll wait! That’s right. A direct link to me. And legitimately me, not just a team of “me”s monitoring the account. Of…
Oct 02 2018

Have you ever thought about how people with visual or auditory impairments, or cognitive or physical disabilities, access websites? People with disabilities can access the web as we do only when a website and/or mobile app has been developed with that wider audience in mind. This is where web accessibility comes into play.

Making a site accessible isn’t as difficult to implement as it seems. You just need to understand the underlying issues that make a site difficult or impossible to use for people with disabilities.

So what is web accessibility?

Web accessibility is all about making sure that websites work for the widest possible audience.

According to the World Wide Web Consortium (W3C) - an international community - web accessibility means that websites, tools, and technologies are designed and developed so that people with disabilities can use them.

Importance and benefits of accessibility

Expands your audience base

An easily accessible web provides equal access and equal opportunity to people with disabilities thus resulting in an increased audience (more users) and increased effectiveness (more useful).

Good for SEO

Accessibility has a direct correlation to developing best practices for SEO. It increases the visibility of the web page.

Grow your Business

The most obvious benefit of building accessible websites is to help businesses and organizations widen their reach.

Build Positive Public Relations

Building an accessible website also builds your reputation. Why not stand out from your competition by being accessible to everyone and demonstrating social responsibility?

Web Content Accessibility Guidelines 2.0

Level A

The most basic web accessibility features are:

  • Provides text alternatives for non-text content
  • Provides an alternative to video-only and audio-only content
  • Content is accessible by keyboard alone

Level AA

Deals with the biggest and most common barriers for disabled users:

  • Text can be resized to 200% without loss of content or function
  • Do not use text over image
  • Suggest fixes when users make errors

Level AAA

Deals with the most complex level of web accessibility:

  • Provide detailed help and instructions
  • Do not use text over image

To get the full list, refer to the WCAG 2.0 checklists.

Ways to make your website accessible


Nested Heading

Headings are the outline of your webpage. Use headings to introduce content; they are labels, not statements.

Heading 1 is the most important heading on the page; use it as the page title. Heading 2 is a subheading of the H1; use it to divide content into scannable blocks. H3 is a subheading of the H2.

Below are the guidelines for headings:

  • Every page should have an h1 heading
  • Headings must be properly nested
  • Headings are used for structure, not formatting
  • Don’t hide headings
[Image: properly nested headings]
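The "properly nested" rule can be checked mechanically: a heading may not skip levels going deeper (h1 → h3 is a problem; h3 back up to h1 is fine). Here's a hypothetical checker, a sketch rather than a Drupal API:

```javascript
// Hypothetical checker for the "properly nested" rule: a heading may not
// skip levels going deeper. Input is the sequence of heading levels as
// they appear in the document, e.g. [1, 2, 3, 2].
function findHeadingSkips(levels) {
  const problems = [];
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      problems.push(`h${levels[i]} follows h${levels[i - 1]} and skips a level`);
    }
  }
  return problems;
}

findHeadingSkips([1, 2, 3, 2, 3]); // → [] (properly nested)
findHeadingSkips([1, 3]);          // → one problem: h3 follows h1
```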

Form Accessibility

  • Every form element must have a label that is announced before the input
  • Every field must be keyboard accessible
  • Use inline form errors
  • Always include a button for users to submit their form information
[Image: inline form error]

Add an Alt Text to All Images

Alt text is the replacement text shown when an image fails to load. It is also read aloud by screen readers to give their users a text equivalent of the image, and it helps improve SEO.

Screen Reader Output With ALT Tag

[Image: screen reader output with alt text]

Screen Reader Output Without ALT Tag

[Image: screen reader output without alt text]

Aural Alerts

Users with visual impairments cannot perceive visual updates to the page such as color changes, animations or text appended to the content. For these cases, Drupal provides a JavaScript method, Drupal.announce(), which creates an “aria-live” element on the page. Drupal.announce() accepts a string to be read by an audio UA. Below is an example of an aural alert:

  Drupal.announce(Drupal.t('You look beautiful today.'));
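Here's a minimal sketch of the aria-live pattern that Drupal.announce() builds on. This is illustrative only, not the actual core implementation; a plain object stands in for a DOM node so the sketch runs anywhere:

```javascript
// Minimal sketch of the aria-live pattern behind Drupal.announce().
// Illustrative only; a plain object stands in for a DOM element.
function makeLiveRegion() {
  return { attributes: {}, textContent: '' };
}

function announce(region, text, priority = 'polite') {
  // Screen readers watch elements with aria-live and read out changes
  // to their text content. 'assertive' interrupts; 'polite' waits.
  region.attributes['aria-live'] = priority;
  region.textContent = text;
  return region;
}

const region = makeLiveRegion();
announce(region, 'You look beautiful today.');
// region.textContent is now read aloud by assistive technology.
```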

Text over images

Text over an image needn’t be avoided completely, but if it’s required, make sure the text is both legible and readable. To handle this, maximize the contrast between graphics, fonts, and backgrounds on your pages.


Removing the blue background from the image gives maximum contrast, making the content clearer and easier to read.

CSS Display Options

Don’t hide content with the hidden attribute or display: none unless you want it hidden from everyone: users can’t see it, and screen readers and search engines can’t read it. To help with this, Drupal 8 has adopted three display classes: (1) hidden, (2) visually-hidden and (3) invisible.

Hidden: hides an element both visually and from screen readers
Visually-hidden: hides an element visually but exposes it to screen readers
Invisible: hides an element both visually and from screen readers, without affecting the layout

Constrain tabbing

Drupal 8 introduced a JavaScript feature called the tabbing manager to guide non-visual users to contextual links when the global edit mode is enabled. Use the code below to invoke the tabbing manager.

var tabbingContext = Drupal.tabbingManager.constrain($('.contextual-toolbar-tab, .contextual'));

A set of elements is passed to the constrain method. Pressing the Tab key will then only move between the tab-able elements in this set.

To remove the tabbing constraint, call the release method on the tabbing context object: tabbingContext.release();
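Conceptually, a tabbing constraint just cycles focus within the constrained set of elements, wrapping at both ends. Here's a tiny sketch of that wrap-around logic (a hypothetical helper, not the actual Drupal.tabbingManager code):

```javascript
// Hypothetical sketch of the wrap-around logic a tabbing constraint needs:
// Tab moves forward through the constrained set, Shift+Tab moves backward,
// and focus wraps at both ends instead of escaping the set.
function nextTabbableIndex(count, currentIndex, shiftKey) {
  const step = shiftKey ? -1 : 1;
  // Adding `count` before the modulo keeps the result non-negative.
  return (currentIndex + step + count) % count;
}

nextTabbableIndex(3, 2, false); // → 0 (Tab on the last element wraps to the first)
nextTabbableIndex(3, 0, true);  // → 2 (Shift+Tab on the first wraps to the last)
```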


Drupal 8 Contributed Modules For Web Accessibility

Automatic Alt text

When a user doesn’t provide alt text for an image, this module uses the Microsoft Azure Cognitive Services API to generate one.

Text Resize

This module provides a block that lets visitors quickly change the font size of text on your Drupal site.

CKEditor Accessibility Checker

This module provides the Accessibility Checker plugin in your WYSIWYG editor to inspect the accessibility level of content created in CKEditor and immediately solve any accessibility issues found.

Block ARIA Landmark Roles

This module adds elements to the block configuration forms that allow users to assign ARIA landmark roles and/or ARIA labels to a block.

To sum up, web accessibility is all about making your site available to as many users as possible. There shouldn’t be any hindrance for people with disabilities in the digital world. This post focuses primarily on the ways you can make your site accessible using Drupal 8 contributed modules.

We, at Valuebound - a Drupal web development company - help enterprises with Drupal migration, Drupal support, third-party integration, performance tuning, managed services and more. Get in touch with our Drupal experts to find out how you can make your website accessible to everyone.

Below is a presentation on “Web Accessibility in Drupal 8”.

Oct 01 2018

Way back in January 2005, I posted a proposal to improve the governance of the Drupal project and help make it "fully 'community-driven'". In response, one commenter wrote:

Yesterday evening on the #drupal channel there was a trial vote casting for "Leave Dries alone" and unamiously everyone voted +1 on this...

Looking back at this long ago exchange, it seems to me to say something about what were and were not considered acceptable ideas in the community, and also about socially expected behaviour when there was a perceived threat to authority.

The issue of Drupal project governance has been returned to many times in the 13+ years since, but when it comes to the parameters of debate I don't know that so much has changed.

Why? Context is important. The Drupal project is structured as a centralized hierarchy--a dictatorship. While Dries Buytaert is often referred to as the project "lead", his formal, self-appointed position as noted in the project's governance documentation is "Benevolent Dictator For Life". Those skeptical of the claim that any dictatorship is benevolent - I'm certainly one - may reasonably shorten that to "Dictator For Life".

Now yet another group is tasked with producing recommendations for improving governance in the project. I have to say, I feel for them.

Because I've been there. I served for years as an elected permanent member of the now defunct Belgium-based Drupal Association (DA), and stayed on as an advisory board member for the DA in its current, US-based incarnation. From all those hundreds (probably thousands) of hours of work, I can point to the occasional change that seemed at the time like progress, such as the process I co-led to design a community election system. But mostly in those years I learned the hard way just how difficult it is - futile might be more accurate - to try to fix a broken system from within.

The governance group itself, along with its assignment, is a product of the very power structure it's tasked with reworking. The task force was personally approved by the dictator for life. It reports to the dictator for life. Any decision on its recommendations will be made by the dictator for life.


There's the challenge of scope. They're tasked with coming up with recommendations for the governance of the Drupal "community". But, according to their charter, that community excludes all technical and code-related decision making and groups, as well as the Drupal Association.

So the assignment is to talk about open source governance without talking about open source governance--did I get that right?

There's the challenge of existing networks of influence, themselves deeply shaped by the project's centralized structure.

I've been particularly struck by contradictions baked into the project's supposed "Values & Principles". That document presents as the shared commitments of a very large community--and yet it's written in the first person singular? "Leadership is not something that is appointed", the document claims; but, um, isn't personal appointment by the dictator for life precisely how leadership positions are created and filled? Community members are to be treated with "dignity and respect"; and that's somehow perfectly consistent with a structure in which "The values and principles are maintained by me, Dries Buytaert"?

And so on.

Per the task force charter, a governance proposal must include "Implementing of the Drupal Values & Principles"--which themselves both reflect and encode a very specific form of governance, a dictatorship.

Anyone else got that "let me off this merry-go-round" feeling?

As far as I can tell, pretty much every previous governance exercise has produced some variation on the fantasy world in which (to quote again from the "Values & Principles") the ideals of "diversity, equity, and inclusion" magically turn out to be ideally met by a few tweaks to the cosy male dictatorship we already have.

But there's also a lot pushing in quite a different direction. Part of that is definitely the influence of the Me Too movement. Now is a time of deep scrutiny of structures of male, cis-gender dominance. Community members may be taking a long, hard look at the list of supposedly "benevolent" dictators for life and asking: is this a model we as a community would consciously choose to adopt? And if we're skeptical, what form of governance would reflect our shared values?

And part of the push for change is the state of the Drupal project itself.

Take the bold and transformative steps the project needs--but do so within the strictures of the current power structure? Sure, no problem.

But critique is easy. What about alternatives?

Well, for one, we could do a lot worse than look at a parallel process in the Backdrop CMS project (a conversation I participated in). How did that governance decision making process play out? What options were considered? What governance model did they choose, and why? How well does the current Backdrop leadership realize the values of diversity and inclusion?

What can we learn there about what it takes to achieve meaningful change?

Oct 01 2018

Who sponsors Drupal development?

We know who contributes

A few weeks ago, Dries Buytaert published his annual “Who sponsors Drupal development?” report. The report acknowledges individual and organizational contributions and the projects they support, providing a high-level overview of who is contributing in the Drupal community. There are some old names on the list and some new ones.

Asking why they contribute

Now that we know who is contributing to Drupal, the next and more difficult question is “Why are they contributing to Drupal?” Knowing the story behind why an individual or organization contributes to Drupal will inspire more people to get involved and give something back to Drupal and Open Source.

My contribution to Drupal

This year, I was the number three individual contributor to Drupal. The previous year, when I first appeared on the top contributor list, it was completely unexpected. I joked with my son, Ben, that “I won a race that I did not know I was running.” Being included on this list was an honor that I did not expect to achieve, partially because I’m always in awe of the ongoing work by all the core maintainers and contributors.

Since last year, I have not slowed down on my commitment to the Webform module for Drupal 8. So I was not surprised to be included in this year's list. Over the past year, I have had several interesting conversations with other developers on the top contributor list, and what resonated with me the most is that everyone on this list has a different history as to why he or she contributes to Drupal. Here is more of the story.

There are different types of contributions

I found that one of the biggest differences in our contributions and commitment to Drupal is whether our work is primarily sponsored or volunteer (aka unpaid).

Only 12% of the commit credits that we examined in 2017-2018 were "purely volunteer" credits (6,007 credits), in stark contrast to the 49% that were "purely sponsored". In other words, there were four times as many "purely sponsored" credits as "purely volunteer" credits.

-- https://dri.es/who-sponsors-drupal-development-2018

Right now, I would estimate that less than 10% of my contributions are sponsored (aka paid). The fact that I am doing all this work for free is not the norm for most contributors. It is essential to note that every single contributor I know on the top contributor list has at one time or another contributed a ridiculous amount of volunteer work to Drupal before finding an employer or organization to sponsor their work. Some people, including Dries, started contributing to Drupal in college; others do it as a hobby; and some just do it.

We all have different stories about how we discovered Drupal. Our stories begin with our first experience with Drupal, the software, which is quickly followed by our first experience with Drupal, the community. At some point, we contribute our first patch; it might take months or years before we start contributing regularly or maintaining a project. Finally, for me and other major contributors, something changed, and suddenly we were spending a significant amount of time contributing and helping maintain Drupal core or a contrib project.

Why am I contributing so much to Drupal?

If I had to pick one word to describe why I contribute so much to Drupal I would have to say "Brand." I am willing to bet that most people did not expect me to summarize my contribution to Open Source with a word generally associated with marketing.

A brand is a name, term, design, symbol, or other feature that distinguishes an organization or product from its rivals in the eyes of the customer.

-- https://en.wikipedia.org/wiki/Brand 

Personal brand

The concept of a personal “brand” never really crossed my mind until I started to work with Marco Salazar, who has been my career coach for the past two years. I was inspired to work with a coach by Michael Schmid’s (@Schnitzel) community keynote at DrupalCon Baltimore, “Your brain health is more important than your standing desk”. Michael’s first of many great suggestions was “get a coach.”

Marco introduced me to the notion that everyone in the digital/social media world has a story, and that story is communicated by one’s personal brand. The moment we create a Facebook page, a LinkedIn profile, a Twitter account, or a Drupal.org user profile, we have started to distinguish ourselves from others (aka rivals in the eyes of customers). Ironically, I never post to Facebook and rarely engage in social media, but I love to write and share code. You might say that the work I do, the writing and sharing of code, is how I define myself; my work is my content. That said, all that posting, writing, and creating on social media, and even coding, is content.

Code is content

At the heart of what I consider my personal brand is code, specifically the Webform module. Code alone is not really content.

Computer code is the set of instructions forming a computer program which is executed by a computer.

-- https://en.wikipedia.org/wiki/Computer_code

In Open Source, our shared and freely available code is still computer code, but everything around the code is content. Documentation is content, presentations are content, and even a response to a question is content.

For the past three years, I have generated a lot of content around the Webform module, beginning with my personal website, this blog, documentation, videos, presentations, and responses to support requests in the Webform issue queue and Drupal Answers. Ultimately, all this content has succeeded in creating a name for myself in the Drupal community. Yes, being the maintainer of something like the Webform module helps get me a job interview; more importantly, content like this blog post, and even how I respond to support requests, helps future employers and clients understand who I am and how I work.

I understand the value of people knowing about the work I do and how I do it because in the fast and changing tech industry, it is essential not to become obsolete. My favorite children’s story about overcoming the challenge of being obsolete is "Mike Mulligan and His Steam Shovel" by Virginia Lee Burton.

Mike Mulligan and His Steam Shovel


The story of Mike Mulligan and his steam shovel

In this story, Mike Mulligan, a steam shovel operator, and his steam shovel, Mary Anne, are being replaced by newer diesel and electric shovels because the industry and its tools are changing. Diligent, hard workers that they are, they get word of an upcoming job, and in an effort not only to do the job but to prove they can do it well and efficiently, Mike boasts that he and Mary Anne can dig as much in a day as 100 men can in a week. Mike gets the job, and he and Mary Anne succeed in digging the foundation for the town hall of Popperville in one day. It turns out to be their final digging job; however, the story’s ending takes a wonderful approach to addressing the shift in time and technology. Remarkably, the ending, which was suggested to the book’s author by a 12-year-old boy, is really special and should be saved for the first time you read this book to a child. I won’t totally give it away, but suffice it to say that Mike and Mary Anne are acknowledged, remembered, and valued for who they are. The town of Popperville and the world shift, but Mike and Mary Anne still have a place in society and are not lost in obscurity.

Everyone in the software industry can relate to the challenge of feeling obsolete. Even if we master the latest and greatest programming language or framework, there are dozens more that we should/could be learning. Mastering a programming language, even writing test coverage is challenging when our work is tied to deadlines and budgets. Open Source projects don't generally have budgets or deadlines; people are just sharing ideas and working to solve challenging problems.

Contributing to the Webform module provided me with a new professional challenge and community

The challenge of contributing

One of my first blog posts provides a history of and the process behind my work on the Webform module. That post gives a fairly complete overview of the actual work I am doing on the Webform module. In comparison, this post explores why I am doing all the work in the first place.

Three years ago, I was reviewing my resume and felt that working with the same institution, Memorial Sloan-Kettering, for so many years (18+) was potentially going to hurt my career prospects. I noted that my work/consulting experience was very independent and contained minimal speaking and project management experience. It is worth stating that there is nothing wrong with staying at a job for years, especially if the employer is one of the best in NYC.

Maintaining a large Open Source project like the Webform module is more of a software architecture and community coordination challenge than a coding challenge.

People are watching me code

In the story "Mike Mulligan and His Steam Shovel", Mike and Mary Anne get to work on what is to be their final digging job, with Mike noting that “we always work harder and faster as more people are watching us”. And there is undoubtedly something to this. It’s very rewarding when people appreciate the work I am doing on the Webform module. Watching people gradually move to Drupal 8 and start using my work is a great feeling, especially organizations and not-for-profits that I have a personal connection with, like the YMCA, which includes the Webform module in its OpenY distribution.

Now that you know the story behind why I contribute to Drupal, it is also worth discussing precisely what I am contributing to Drupal.

What am I contributing to Drupal?

Every presentation and screencast I record begins with…

Hi, my name is Jacob Rockowitz


My sole contribution to Drupal is the Webform module. This is a very personal and deliberate decision. I am a one-project kind of guy. I do my best work when I can focus on a concrete goal and purpose. Maintaining and working on a single, isolated project is not the norm for Open Source contribution. Open Source works better when people maintain some projects while contributing to others. But for me, I find I lose a lot of momentum when jumping from one project to another. I also feel that with subsystems like Webform, someone needs to be fully engaged and able to add features, respond to issues, and fix bugs in a timely and efficient manner.

Writing code

The Webform module is a feature-rich application. I generally add one or two new features each week and try to refactor some existing code on a bi-weekly basis. I try to break down most tasks into one to two hours of work, and I almost never put a feature or change that will take more than four hours into a single ticket.

While working, I am very bullish when it comes to committing code - I like to maintain a certain velocity as I do things. I find it really challenging to get people to review patches, so frequently I will post a patch, let it sit in the issue queue for a day, come back to the issue, do another code review (of my own code), and commit the patch.

Quality assurance

Everyone has different levels of availability and it's understandable that someone might have time to create an issue one day but not be able to come back to review the solution. I find maintaining a certain level of quality with peer review in Open Source incredibly challenging.

Drupal's core committers do an amazing job of requiring peer review and enforcing code quality. Drupal's contributed projects are a slightly different beast. Still, certain key projects like Webform need to define and continually review their process. When Webform 8.x-5.x is finally released on Christmas, I am going to review the process for maintaining a stable version of the Webform module.

For example, last year, when Webform moved into a release candidate, I started to maintain change records, which help developers and site builders get an overview on recent changes and improvements.

Knowing that my code is not always getting enough peer review and sometimes can cause regressions makes it crucial that I respond to issues and bugs.

Webform Issue Queue


Responding to issues and fixing bugs

Everything we post online is content, including how we respond to issues, which means our responses are part of our personal and professional profile. I do my best to respond to every request almost immediately, especially if researching and resolving an issue might only take a few minutes.

Over the past three years, I have responded to hundreds of issues and support requests. Sometimes it is incredibly challenging dealing with people who take for granted that I am generally working for free. Surprisingly, my biggest feeling of accomplishment sometimes comes from being able to help someone whose initial issue has "negative undertones". I always respond professionally and help them resolve their problem; they always say "thank you." Hearing gratitude from someone whose initial tone was difficult or agitated is a complete 180, and having that kind of effect on someone feels good.

Around 50% of my commit credits are earned through quick bug fixes and support requests that usually take less than an hour. I also get to decide when an issue is resolved and a commit credit is earned. I agree with Dries that…

Not all code credits are the same. We currently don't have a way to account for the complexity and quality of contributions; one person might have worked several weeks for just one credit, while another person might receive a credit for ten minutes of work. In the future, we should consider issuing credit data in conjunction with issue priority, patch size, etc. This could help incentivize people to work on larger and more important problems and save coding standards improvements for new contributor sprints. Implementing a scoring system that ranks the complexity of an issue would also allow us to develop more accurate reports of contributed work.

-- https://dri.es/who-sponsors-drupal-development-2018

It’s really hard to determine what is and is not commit credit worthy. Even though responding to a support request does not take long, the fact that I provide ongoing and immediate support contributes significantly to people’s success with using the Webform module and Drupal.

Showing a breakdown of how a commit credit is earned, whether it be from a core or contrib task, bug fix, support request, and documentation can help us understand how everyone's commit credits are earned. And there are layers of value in constantly evaluating, learning and discovering the time, effort, energy and attitude that goes into these things.

The Drupal Association has already done an amazing job of continually improving Drupal.org user profiles, which for me is as important as my resume. The Drupal Association has also improved the tools available for creating and maintaining documentation.

Creating documentation

As the Webform module's features and user base grew, I realized that I needed to start thinking about documentation. I code better than I write, but the Webform module needs documentation. I set up the Webform module's documentation section and have gradually revised different sections and pages. Before a stable release, the Webform Features page needs to be updated.

Producing help videos

I discovered the best medium for me to create documentation is screencasts. I found an amazing Creative Commons slides template and started creating and producing very simple screencasts. These screencasts are available within the Webform module and on Drupal.org. Putting a voice behind the Webform module has helped assure people that this is a supported project. Yes, these videos also help promote my 'personal brand.'

How am I able to make this level of contribution to Drupal?

The reported data shows that only 7% of the recorded contributions were made by contributors that do not identify as male, which continues to indicate a steep gender gap….The gender imbalance in Drupal is profound and underscores the need to continue fostering diversity and inclusion in our community.

-- https://dri.es/who-sponsors-drupal-development-2018

I am incredibly fortunate to have ongoing work as a consultant for a client like Memorial Sloan Kettering. This relationship gives me the ability to contribute to Drupal in between paid work. I can also write-off a lot of Open Source related expenses as professional development.

I am fortunate to have the means to contribute to Drupal.

It is important to acknowledge that gender and race play a role in how much people earn and how much (free) time they have available to contribute to Open Source. Without embarking on a much larger discussion, it’s essential to realize the gender and race inequality is more directly addressed when organizations and businesses get involved in changing things.

If more tech companies work to improve their diversity while also allowing their employees to contribute to Open Source this could tip the scales where gender and race imbalance in our community reside.

What is going to be my next contribution to Drupal?​

I am committed to maintaining the Webform module for Drupal 8 for the foreseeable future and…

There needs to be a stable release of the Webform module.

I am willing to admit that it is time. My goal is to tag a stable release before the New Year. The Webform module has fewer than 50 open issues and only 4 critical issues, so I think this goal is achievable.

In 2019, I would like to start exploring the next generation of Drupal Form API and Webform module. If I keep plugging away at the Webform module I can write a follow-up to this blog post in a year and maybe some of Drupal's top contributors can also share their story.

Oct 01 2018
Oct 01

One of the biggest arguments for using Docker to develop your app is having isolated environments for different setups (the classic case being two different versions of PHP for two projects). Even that is sometimes not convincing enough. I find Docker to be damn useful when building production parity locally (think trying to reproduce a production bug on your local machine). If you've faced this problem and want to solve it, read on.

I wanted to switch over to Docker when I first heard about it. It sounded like Docker would address all the problems posed by Vagrant, without compromising the benefits. I adopted Docker4Drupal, a comprehensive Docker-based setup open sourced by the nice folks at Wodby. I even use Lando these days. It's quite handy if you're working on non-Drupal projects as well. So why would I spin up my own Docker setup with all these wonderful tools around?

With the rationale to "reinvent the wheel" out of the way, let's build our custom setup.

First, a humble docker-compose file to get the setup up and running. As an aside, a docker-compose file is a declarative configuration (in YAML) of how your web stack should roll out as Docker containers. We shall use the v2 version of docker-compose as it's more widespread. I'll update this post for a v3 version sometime in the future.

We're building a LEMP stack, which involves a web app (PHP + Nginx) talking to a DB service (MariaDB). Docker Compose serves as a single "source of truth" specification for the stack. Also, it builds all the associated containers in the same Docker network so that it's easier for the services to discover each other.

As we are particular about having the same configuration in both local and production, we use the concept of docker-compose inheritance to keep the commonalities in one place and add environment-specific configuration in a respective file. Our setup consists of 3 containers: PHP, Nginx, and MariaDB.

Here's how the base compose file looks.
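A minimal base docker-compose.yml consistent with the three services described could look like this (the image tags and build path are illustrative assumptions; environment-specific details come later from the inheriting files):

```yaml
# docker-compose.yml - commonalities shared by local and production
version: '2'

services:
  php:
    # Built from the custom PHP Dockerfile shown below.
    build: ./deploy/php

  nginx:
    image: nginx:1.13
    depends_on:
      - php

  mariadb:
    image: mariadb:10.1
```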

We can add more optional containers like PHPMyAdmin if needed.

We have the option to use a pre-built Docker image or build our own. This is based on a few considerations:

  • PHP

    We start with the `php:7.1-fpm` image as the base. This strikes a good balance between bleeding edge and stability. Apart from installing the basic Drupal-related dependencies, we install some binaries like Git, Wget and Composer. We also configure a working directory, called /code, where the code gets injected. There is more room for improvement, like running the main processes (Nginx and PHP-FPM) as non-root users, but that's the topic of a later blog post.

    Our PHP docker image looks like this,

    FROM php:7.1-fpm

    # Install system packages and common tools (Git, Wget, MySQL client).
    RUN apt-get update \
        && apt-get install -y libfreetype6-dev libjpeg62-turbo-dev libpng-dev wget mysql-client git

    # Install and enable the PHP extensions Drupal needs.
    RUN docker-php-ext-configure gd --with-freetype-dir=/usr/include/ --with-jpeg-dir=/usr/include/ \
        && docker-php-ext-install gd pdo pdo_mysql opcache zip \
        && docker-php-ext-enable pdo pdo_mysql opcache zip

    # Install Composer.
    RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer

    WORKDIR /code

    Also, I'd prefer that search engines don't crawl my non-production sites. I'd go with a .htaccess-based approach for this, but it apparently incurs a performance penalty. I write some extra stuff in my Dockerfile to address this.

    # NON_PROD is supplied as a build argument for non-production images.
    ARG NON_PROD
    RUN if [ -n "$NON_PROD" ] ; then printf "User-agent: *\nDisallow: /\nNoindex: /\n" > ./web/robots.txt; fi
  • mariadb

    There are 2 things to take care of when building the database containers.

    • Injecting database credentials and exposing them to other containers that want to use them,
    • Persisting databases even if containers are killed and restarted.

    For the former, we use a .env file which Docker reads. This supplies all the environment variables needed to build our image. Here's how our .env will look:
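    For example (the variable names are the ones the official MariaDB image understands; the values are placeholders you should change):

```shell
MYSQL_ROOT_PASSWORD=root
MYSQL_DATABASE=drupal
MYSQL_USER=drupal
MYSQL_PASSWORD=drupal
```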


    Let's tweak our local docker compose file to pick up these variables.

      mariadb:
        extends:
          file: docker-compose.yml
          service: mariadb

    Note the use of the extends construct, which picks up everything else about the mariadb container from the base compose file and "extends" it. While we are at it, let's add persistence to our database container by mounting volumes. We map 2 volumes: one for storing the actual database data (which resides at /var/lib/mysql inside the container), and another to supply init scripts to MySQL. The official MariaDB containers ship with a way to initialize the database with sample data; this path is /docker-entrypoint-initdb.d. We will see that this is pretty useful for building production replicas of our Drupal site. Let's add those to the local compose file.

      mariadb:
        extends:
          file: docker-compose.yml
          service: mariadb
        volumes:
          - ./mariadb-init:/docker-entrypoint-initdb.d
          - ./mariadb-data:/var/lib/mysql
  • Nginx config

    The Nginx container requires 2 things,

    • where the code resides inside the container.
    • the nginx configuration for running our Drupal site.

    Both these inputs can be supplied by mounting volumes.

    I picked up the Nginx configuration from here. I separated it into 2 files (one more generic, and another, included by the first, that is specific to Drupal) to have a cleaner and more modular setup. This could be a single file as well.

    Here's how my Nginx container spec looks in the local compose file:

        nginx:
          extends:
            file: docker-compose.yml
            service: nginx
          volumes:
            - ./:/code
            - ./deploy/nginx/config/local:/etc/nginx/conf.d/default.conf
            - ./deploy/nginx/config/drupal:/etc/nginx/include/drupal
          ports:
            - '8000:80'

    Finally, I run Nginx on port 8000 and expose it from the container. Feel free to change 8000 to anything else you find appropriate.

  • First spin at local

    Before we boot our containers, we have to make a few small tweaks to our PHP setup. We mount the code directory inside the PHP container because the PHP-FPM process requires it. Also, as a 12-factor app best practice, we expose DB-specific details as environment variables.

    Here's how our PHP container spec looks in the local compose file:

        php:
          extends:
            file: docker-compose.yml
            service: php
          volumes:
            - ./:/code
          environment:
            PHP_SENDMAIL_PATH: /usr/sbin/sendmail -t -i -S mailhog:1025
            PHP_FPM_CLEAR_ENV: "no"
            DB_HOST: mariadb
            DB_NAME: ${MYSQL_DATABASE}
            DB_USER: ${MYSQL_USER}
            DB_PASSWORD: ${MYSQL_PASSWORD}
            DB_DRIVER: mysql
            NON_PROD: 1

    Let's update our settings.php to read DB credentials from environment variables.

    $databases['default']['default'] = array (
        'database' => getenv('DB_NAME'),
        'username' => getenv('DB_USER'),
        'password' => getenv('DB_PASSWORD'),
        'prefix' => '',
        'host' => getenv('DB_HOST'),
        'port' => '3306',
        'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
        'driver' => 'mysql',
    );
    NOTE: the settings.php file, by deliberate design, is blacklisted by Git from being checked in, as it might contain sensitive information about your environment like passwords or API keys. For this setup to work, you will have to check in the settings.php file and doubly ensure that it does not contain any sensitive information. If it does, inject those values into your app using environment variables, as we did for the DB credentials above.

  • Booting our app

    Let's boot our full docker stack.

    $ docker-compose -f local.yml up --build -d

    To check the logs, run,

    $ docker-compose -f local.yml logs <container-name>

    The app can be accessed at localhost:8000, or whatever port you supplied for Nginx in the local compose file. Make sure you run composer install before setting up Drupal.

    $ docker-compose -f local.yml run php composer install

    To run drush, you have to supply the full path of the drush executable and the root directory where Drupal is installed.

    $ docker-compose -f local.yml run php ./vendor/bin/drush --root=/code/web pm-list

    If you add /code/vendor/bin to the PATH when building the container and create a Drush alias with /code/web as the root directory, you can run Drush in a more elegant manner, but that's totally optional.
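    As a sketch of that optional convenience, the PATH tweak is a single line in the PHP Dockerfile (the file path is an assumption matching the setup above):

```dockerfile
# deploy/php/Dockerfile - put Composer-installed binaries on the PATH
ENV PATH="/code/vendor/bin:${PATH}"
```

    With that in place, a Drush 9 site-alias file such as drush/sites/self.site.yml containing `root: /code/web` and `uri: http://localhost:8000` under a `local:` key would let you run commands like `drush @self.local pm-list` without spelling out the full paths.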

    Finally, to stop the setup, we run

    $ docker-compose -f local.yml down

    That's pretty much how you run Drupal 8 on Docker locally. We shall see how to translate this into a production setup in the next installment.

  • Oct 01 2018
    Oct 01
    “Digital Experiences” are the next big thing someone at your company is almost certainly talking about. These include visionary technology that operates based on rich data that is timely and location-based, interactions between other services and products, and perhaps most importantly: content that is not reliant on a user manually driving the experience (as they usually might on a website or mobile application). This article discusses a unique digital experience, thousands of countdown clocks, developed in Drupal 8 by Acquia for New York's Metropolitan Transportation Authority (MTA). 
    Oct 01 2018
    Oct 01

    One of the most exciting additions to Drupal 8.6 is the new experimental Layout Builder. Many are focused on Layout Builder replacing Panels, Panelizer, Display Suite, and even Paragraphs. The clean and modular architecture of Layout Builder supports a multitude of different use cases. It can even be used to create a WYSIWYG Mega Menu experience.

    Note: Experimental

    While Layout Builder was first added as experimental in Drupal 8.5, it has changed significantly since then and is now considered more "beta" than "alpha". While still technically experimental and not officially recommended for production sites, the functionality and architecture have stabilized with Drupal 8.6, and it's time to start evaluating it more seriously.

    What is a Mega Menu?

    For the purposes of this discussion, I'll define a "Mega Menu" as simply a navigation structure where each item in the menu can expand to show a variety of different components beyond a simple list of links.

    In the above example, we see a three-column menu item with two submenus, a search form, and a piece of static content (or a reference to another node).

    Mega Menus present many challenges for a site including accessibility, mobile responsiveness, governance and revision moderation, etc. While I don't advocate the use of mega menus, sometimes they are an unavoidable requirement.

    Past Solutions

    I've seen many different implementations of Mega Menus over the years.

    • Modules such as we_megamenu (D8),  tb_megamenu (D7), etc.
    • Custom blocks (D8),
    • Hard-coded links, node references, and Form API rendered in theme,
    • MiniPanels rendered in the theme (D7)
    • Integrations with Javascript libraries such as Superfish
    • Custom site-specific code

    These solutions had many problems and often didn't provide any easy way for site owners to make changes. Often these solutions caused headaches when migrating the site or supporting it over a long life cycle. I've known many teams who simply groan when a client mentions "we want mega menus."

    Wouldn't it be nice if there was a consistent way in Drupal 8 to create and manage these menus with a component-based design architecture?

    Layout Builder

    The Layout Builder module can take control over the rendering of an entity view mode. Normally in Drupal, a view mode is just a list of fields you want to display, and in which order. These simplistic lists of fields are usually passed to a theme template responsible for taking the raw field data and rendering it into the designed page.

    With Layout Builder, a view mode consists of multiple "sections" that can contain multiple "blocks." A "Section" references a specific "Layout" (2 column, 3 column, etc). Each field of the entity can be displayed via a new field_block. Thus, a traditional view mode is just a single section with a one-column layout filled with a block for each field to be displayed.

    The core Layout Discovery module is used to locate the available "layouts" on the site that can be assigned to a Section. Core comes with one column, two column, and three column (33/33/33 and 25/50/25) layouts. Custom layout modules can be easily created to wrap a twig template for any layout needed within a section.
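    As a sketch of such a custom layout module (the module name mymodule, layout id, and region names here are hypothetical), the layout is declared in a *.layouts.yml file that points at a twig template:

```yaml
# mymodule.layouts.yml - registers a custom layout with Layout Discovery
mymodule_two_column:
  label: 'Two column (70/30)'
  category: 'My Layouts'
  # Template path is relative to the module, without the .html.twig suffix.
  template: templates/mymodule-two-column
  regions:
    main:
      label: Main content
    sidebar:
      label: Sidebar
```

    The matching templates/mymodule-two-column.html.twig then simply prints `{{ content.main }}` and `{{ content.sidebar }}` inside whatever markup the design calls for.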

    Blocks for each field can be added to a section, along with any other predefined or custom block on the site. Core also provides "inline blocks" that are instances of custom blocks referenced by the node but not shown in the global block layout admin view.

    When an inline block is edited, a new revision of the block is created and a new revision of the node entity is created to reference it, allowing layout changes to be handled with the same workflow as normal content changes.

    Section Storage

    Layout Builder uses a Section Storage plugin to determine how the list of block uuids referenced in a layout section are stored. The default layout for a content type is stored in the third_party_settings for the specific view mode configuration. If node-specific overrides are enabled for the bundle, the overriding list of blocks in the section are stored within a layout_builder__layout field added to the node.

    While the use of Layout Builder is focused on Nodes (such as Landing Pages), the Layout Builder architecture actually works with any entity type that supports the Section Storage. Specifically, any entity that is "fieldable" is supported.

    Fieldable Menu Items

    If Layout Builder works with any fieldable entity, how can we make a Menu Item entity fieldable? The answer is the menu_item_extras contrib module. This module allows you to add fields to a menu entity along with form and display view modes. For example, you can add an "icon" image field that will be displayed next to the menu link.

    The Menu Item Extras module has been used in Drupal 8 for a while to implement mega menus via additional fields. However, in Drupal 8.6 you don't need to add your own fields, you just need to enable Layout Builder for the default menu item view display mode:

    When you allow each menu link to have its layout customized, a layout_builder__layout field is added to the menu item to store the list of blocks in the sections. When you Add a Link to your menu, a new tab will appear for customizing the layout of the new menu link item:

    The Layout tab will show the same Layout Builder UI used to create node landing pages, except now you are selecting the blocks to be shown on the specific menu item. You can select "Add Section" to add a new layout, then "Add Block" to add blocks to that section.

    In the example above I have used the optional Menu Block module to add submenus of the Drupal admin menu (Structure and Configuration) to the first two columns (default core menu blocks do not allow the parent to be selected, but the contrib Menu Block module adds that). In the third column the Search Form block was added, and below that an "Inline Block" using the core Basic block type to add static text to the menu item.

    Theming the Menu

    The Menu Item Extras module provides twig templates for handling the display of the menu item. Each menu item has a "content" variable that contains the field data of the view mode, just like with any node view mode.

    Each theme will need to decide how best to render these menus. Using a subtheme of the Bootstrap theme I created the following menu-levels.html.twig template to render the example shown at the beginning of this article:

    <ul{{ attributes.addClass(['menu', 'menu--' ~ menu_name|clean_class, 'nav', 'navbar-nav']) }}>
      {% for item in items %}
        {% set item_classes = [
          item.is_expanded ? 'expanded',
          item.is_expanded and menu_level == 0 ? 'dropdown',
          item.in_active_trail ? 'active',
        ] %}
        <li{{ item.attributes.addClass(item_classes) }}>
          <a href="{{ item.url }}" class="dropdown-toggle" data-toggle="dropdown">{{ item.title }} <span class="caret"></span></a>
          <div class="dropdown-menu dropdown-fullwidth">
            {{ item.content }}
          </div>
        </li>
      {% endfor %}
    </ul>


    The combination of Layout Builder and Menu Item Extras provides a nearly WYSIWYG experience for site owners to create complex mega menus from existing block components. While this method still requires a contrib module, the concept of making a menu item entity fieldable is a clean approach that could easily find its way into core someday. Rather than creating yet another architecture and data model for another "mega menu module", this approach simply relies on the same entity, field, and view mode architecture used throughout Drupal 8.

    While Layout Builder is still technically "experimental", it is already very functional. I expect to see many sites start to use it in the coming months and other contrib modules to enhance the experience (such as Layout Builder Restrictions) once more developers embrace this exciting new functionality in Drupal core.

    My thanks to the entire team of developers who have worked on the Layout Initiative to make Layout Builder a reality. I look forward to it being officially stable in the near future.

    Oct 01 2018
    Oct 01

    While Drupal isn’t considered a slouch when it comes to performance out of the box, there are some factors which can slow it down and some basic practices which everyone should implement in order to squeeze more speed out of their Drupal sites.

    In this post, I’ll highlight some tips which can help to speed up your Drupal site. Without much further ado, let’s get to them:

    Use Caching

    Probably the most important factor in keeping your site working optimally is to cache as many pages and blocks as possible, for as many visitors as possible. Any time a visitor visits your website, some components of it are stored in an easily accessed location, ensuring faster delivery of those components the next time the same visitor returns to your website. There are multiple methods of caching that come with Drupal out of the box, as well as popular third-party caching technologies such as Redis, Varnish, etc.
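    For example, if you use the contrib Redis module, pointing Drupal's cache backend at a Redis server is a few lines in settings.php (the host and interface values are environment-specific assumptions):

```php
// settings.php - route Drupal's default cache backend to Redis.
// Requires the redis contrib module and the PhpRedis PECL extension.
$settings['redis.connection']['interface'] = 'PhpRedis';
$settings['redis.connection']['host'] = '127.0.0.1';
$settings['cache']['default'] = 'cache.backend.redis';
```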

    Keep site up to date

    Drupal is regularly updated to fix bugs and improve its performance. Keeping your site updated not only ensures its efficiency but also has many other advantages, key amongst them being the security of the platform. It's a win-win situation!

    Use CDN

    CDNs, or Content Delivery Networks, are a well-known resource for speeding up a website. While in the old days utilizing a CDN with Drupal was somewhat of a hassle, it has now gotten quite simple. CDNs can bring a huge speed boost to your site by localizing data for your audiences through third-party networks. Using a CDN with your Drupal site should be a no-brainer really!

    Bandwidth Optimization

    This is a built-in Drupal feature which isn't enabled by default. It's a simple matter of enabling it through Drupal's UI, and it brings a considerable boost to your site by aggregating your site's JavaScript and CSS files. Fewer HTTP requests are needed, which in turn speeds up your site's loading. This step targets the loading speed of your site's front end.
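    Once aggregation (and a page cache lifetime) is enabled, the exported system.performance configuration looks roughly like this (the max_age value is just an example):

```yaml
# config/sync/system.performance.yml
css:
  preprocess: true   # aggregate CSS files
js:
  preprocess: true   # aggregate JavaScript files
cache:
  page:
    max_age: 900     # cache pages for anonymous users for 15 minutes
```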

    Optimize Images

    Images are some of the heaviest elements of any website. They take the longest to load due to their (potentially) large size. Fortunately, Drupal gives you options to optimize images for faster loading. Two easy-to-implement methods provided by Drupal are the Image Styles and the Image Optimization features. The Image Style feature enables automatic resizing of images with respect to different screen sizes, and the Image Optimization feature enables setting compression ratios for images.

    Another method that ties into the optimization of images is lazy loading. This technique loads images only when they are visible in the browser window, eliminating the need to load all images as soon as the site is requested and helping, in turn, to reduce the time it takes for the site to become available to the visitor.

    Remove Unused Modules

    Over the course of a Drupal site’s lifetime, the number of additional modules on it can increase drastically. Over time, many of these modules might become obsolete, or you might not need the functionality provided by some of them anymore. In such cases, these modules might be loading with your website but not providing any value. It always helps to make sure you keep only the modules needed by your site and remove all unnecessary ones, both from the security standpoint and the speed of your site.

    Use a Good Hosting Provider

    The choice of hosting provider you use for your site is probably the single biggest factor for your site’s success. Make sure you do your research well and settle upon a good hosting provider!


    These are some basic tips that should enable any developer to get started on their site speed optimization, regardless of the skill level. Optimizing a Drupal site is an ongoing process, and it does go much deeper than the tips provided here, but these should be enough to get you started on making your site faster.

    Want to make your site even faster? Not sure where to go from here? Contact us at Agiledrop and hand over your site worries to us!  

    Oct 01 2018
    Oct 01

    Think of your best friend who keeps things to himself - a characteristic that can make it strenuous for you to understand when he is in distress. Juxtapose such a character with another friend who is an open book, making it easy to know what he is thinking and feeling. A similar contrast can be observed in the digital world, where the security of open source software and proprietary software is constantly debated.

    An open book, showing an image of ocean tides, kept on the table

    Dr. A.P.J Abdul Kalam, former President and renowned scientist of India, once reiterated that “open source codes can easily introduce the users to build security algorithms in the system without the dependence of proprietary platforms”. The security that open source software offers is unparalleled. Drupal, as an open source content management framework, is known for its provision of magnificent security for your online presence and is worth considering.

    Getting to know open source software

    It was in 1999 that Eric Raymond stipulated that, given enough eyeballs, all bugs are shallow. He coined the term “Linus’ Law” in honour of Linux creator Linus Torvalds. In the almost two decades since, Linus’ Law has been used by some as a doctrine to explain the security benefits of open source software.

    Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone - The Cathedral and the Bazaar by Eric S. Raymond (Lesson 8)

    Open source software consists of source code that is openly available for anyone to inspect, adjust, or improve. This code may contain bugs or issues that need to be flagged.

    Furthermore, public availability means attackers can also study and exploit the code, which underlines the importance of code-level security practices. Some common open source security practices include:

    • Maintaining an inventory of all software used. This data must include the version, hash value, and original source of the code.
    • Verifying the availability of security updates and bug fixes. This ensures that patch management processes are carried out regularly.
    • Testing and scanning the source code. This is performed using code analysers, auditing tools, or a community like Drupal’s.
    • Ensuring that open source applications comply with the existing network architecture to avoid violating any firewall or security policies.
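The inventory-and-hash practice above can be sketched with standard tools; the file names below are placeholders for a real downloaded component:

```shell
# Record the known-good checksum of a downloaded component...
printf 'example module code' > module.tar.gz
sha256sum module.tar.gz > module.sha256

# ...and later verify that the file has not been tampered with.
# A mismatch makes this command fail, which a script can act on.
sha256sum -c module.sha256
```

In practice you would compare against the checksum published by the upstream project rather than one you generated yourself.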

    Myth or fact: Is Open source software more secure than closed source software?

    Whether it is the Heartbleed incident of 2014, where a vulnerability was discovered in OpenSSL, or the Home Depot breach of the same year, when a Microsoft Windows vulnerability was exploited and the credit card information of millions of customers was compromised - both open source and closed source software have a history of encountering security threats. But which one is more secure?

    Closed source software, also known as proprietary software, is only distributed to authorized users, with restrictions on private modification and republishing. On the flip side, OSS is distributed under a licensing agreement that makes it available for the general public to use and modify at no cost.

    It is this ability to modify the code that forms the crux of the argument within the Linux community that open source is safer and less susceptible to security attacks than closed source software such as Microsoft Windows.

    OSS allows anyone to rectify the broken code. In contrast, closed source can only be fixed by the vendor. 

    So, with more people testing and fixing the code within the OSS community, open source gradually strengthens its security over time. Although attacks are still discovered, it has become a lot easier to identify and fix bugs. Open source enthusiasts believe that they experience fewer exploits and that their code receives patches more rapidly, as there is a plethora of developers contributing to the project.

    Digging deeper, the notion that open source platforms offer users the capability to keep software relevant amid new and changing requirements underpins the argument for open source over closed source. OSS does have a reputation of being more secure, as the University of Washington states in a report.

    Open source security in Drupal

    Infographic with roadmap showing the Drupal security process with relevant icons (Source: Acquia)

    Drupal Security Team in action

    The Drupal open source project has a dedicated team of volunteers who track security-related bugs and release updates. They help in:

    • Resolving security issues that are reported in a Security Advisory.
    • Offering help to contributed module maintainers in fixing security issues.
    • Offering documentation for writing secure code and safeguarding Drupal sites.
    • Providing assistance to the infrastructure team in keeping the Drupal.org infrastructure safe.

    Anyone who discovers or learns about a potential error, weakness, or security threat that could compromise the security of Drupal can submit it to the Drupal security team.

    Process cycle

    The process cycle of the Drupal security team involves:

    • Analysis of issues and evaluation of their potential impact on all supported releases of Drupal.
    • Notifying the maintainer and mobilizing a fix if the report is found to be a valid problem.
    • Creation, assessment, and testing of new versions.
    • Creation of new releases on Drupal.org.
    • Using available communication channels to inform users when issues are fixed.
    • Issuing an advisory if the maintainer does not resolve the issue within the deadline, recommending that the module be disabled, and marking the project as unsupported on Drupal.org.

    The security team keeps issues private until a fix is available, or until it becomes clear that the maintainer is not addressing the issue. Once the threat is addressed and a safer version is available, it is publicly announced.

    In addition, the security team coordinates security announcements in release cycles and works with Drupal core and module maintainers. For any concern with the management of security issues, you can also contact [email protected]

    Security features

    • You can enable secure access to your Drupal site, as it has out-of-the-box support for salting and repeatedly hashing account passwords before they are stored in the database.
    • It also lets you enforce strong password policies, industry-standard authentication practices, session limits, and single sign-on systems.
    • It provides granular access control for giving administrators full control over who gets to see and who gets to modify different parts of a site.
    • You can also configure Drupal for strong database encryption in high-security applications.
    • Its Form API helps in data validation and prevents XSS, CSRF, and other malicious data entry.
    • It limits the login attempts that can be made from a single IP address over a predefined time period. This helps in avoiding brute-force password attacks.
    • Its multilayered cache architecture helps mitigate Denial of Service (DoS) attacks, which is one reason it powers some of the world’s highest-traffic websites, such as those of NASA, the University of Oxford, the Grammys, and Pfizer.
    • Notably, Drupal addresses all of the top 10 security risks of the Open Web Application Security Project (OWASP).
    A box showing a list of OWASP Top 10 Most Critical Web Application Security Risks in 2017
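As one concrete illustration of the login-attempt limiting mentioned above, Drupal 8's flood control can be tightened from settings.php. The user.flood keys below are core configuration, but the values are purely illustrative - pick thresholds that suit your site:

```php
<?php

// Sketch: overriding Drupal 8's built-in flood control in settings.php.
// Failed logins beyond these thresholds are temporarily blocked.
$config['user.flood']['ip_limit'] = 20;       // failed attempts per IP...
$config['user.flood']['ip_window'] = 3600;    // ...within one hour
$config['user.flood']['user_limit'] = 5;      // failed attempts per account...
$config['user.flood']['user_window'] = 21600; // ...within six hours
```

Lower limits slow down brute-force attacks at the cost of occasionally locking out forgetful users.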

    Statistical reports

    In the 2017 Cloud Security Report by Alert Logic, among the open source frameworks assessed for content management and e-commerce, Drupal was reported to have the lowest number of web application attacks.

    Table with rows and columns showing web app attacks targeting top CMS and ecommerce assets (Source: Alert Logic)

    Sucuri’s Hacked Website Report also showed that Drupal was the most security-focused CMS, with fewer security vulnerabilities reported. It came out ahead of leading open source platforms like WordPress, Joomla, and Magento.

    A bar graph in red colour showing infected websites platform distribution in 2017 (Source: Sucuri)

    Challenges in open source security

    Open source software has its share of challenges as well. Equifax’s 2017 breach was notable because of the millions of US consumers who were affected. As digital transformation takes hold, developers are moving from perfect to fast, using open source components as vital assets for swiftly adding common functionality. To let developers move as swiftly as customers demand, security pros must address some fundamental challenges.


    The time between disclosure and exploit is shrinking. Today, the Common Vulnerability and Exposures (CVE) description of a vulnerability contains enough information - including affected software versions and how to execute an attack - for malicious hackers to shrink the time between disclosure and exploit, as was witnessed in the case of Equifax.

    Identification of open source component vulnerabilities listed in the National Vulnerability Database (NVD) increased by 10% from 2015 to 2016, with a similar increase in 2017. For instance, published vulnerabilities in components from Maven packages, Node.js packages, PyPi packages, and RubyGems doubled (see graph below).

    A bar graph in green colour showing the state of open source security in 2017

    Security pros must assume that prerelease vulnerability scans are not executed by open source developers. Software composition analysis (SCA) and frequent updates to open source components will therefore be the responsibility of the enterprise.


    Open source security does make a significant case for itself and can be a better option than closed source or proprietary software. That said, choosing between them for your digital business is ultimately a matter of preference, guided by organisational needs and project requirements.

    Drupal, as an open source content management framework, comes out as the most secure CMS in comparison to leading players in the market. At Opensense Labs, we have been perpetually offering digital transformation with our strong expertise in Drupal development.

    Contact us at [email protected] for amazing web development projects and leverage open source security in Drupal 8.

    Sep 28 2018
    Sep 28

    Previously, we covered some simple tips that allow you to get more out of Drupal and I think we covered some basics. This time we are going to go a bit deeper to see what Drupal can really do. In the right hands, Drupal can be a very powerful tool for more than just content management. The following tips will take you through a few different topics to get more out of Drupal than ever before. Some of these tips are a bit more on the advanced side, but they are very useful.

    Tip #1: Don't be afraid of caching

    If you are at all familiar with website caching, then you know at least two things about it. It is useful for getting your pages to load faster, and it can be very complex. Caching is meant to speed up your site by putting your page together just one time and then simply redisplaying that rather than needing to build it anew every time. However, because of this, when something about how that content should display changes, that particular entry in the cache needs to be "invalidated" so that it doesn't continue to be used, which could result in it showing content that is no longer current.

    Fortunately, Drupal 8 has a fantastic caching system, provided by two modules which are included in Core: Internal Page Cache and Internal Dynamic Page Cache. The former caches entire pages for users who aren't logged in, while the latter caches the individual components of pages (such as blocks and rendered nodes) for all users. On most Drupal sites, these should both be turned on. Drupal Core and most contributed modules are built with this caching already in mind, so it's easy enough to just turn on the modules and get a nice performance boost from doing so.

    If you have lots of custom code which deals with how content renders, this may not be quite so simple, but it is still something worth looking into, especially if your site sometimes feels a bit slow.
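For custom code that renders content, the key to playing nicely with these cache modules is declaring cacheability metadata on your render arrays. Below is a sketch (not a drop-in snippet; `$node` is assumed to be a loaded node entity and the build contents are illustrative) using Drupal 8's core Cache API:

```php
<?php

use Drupal\Core\Cache\Cache;

// Sketch: attaching cacheability metadata to a render array so the
// Dynamic Page Cache knows when the cached copy must be discarded.
$build = [
  '#markup' => $node->label(),
  '#cache' => [
    // Invalidated automatically whenever this node is saved.
    'tags' => $node->getCacheTags(),
    // Vary the cached copy by the viewing user's permissions.
    'contexts' => ['user.permissions'],
    // Cache indefinitely; the tags above handle invalidation.
    'max-age' => Cache::PERMANENT,
  ],
];
```

With accurate tags and contexts in place, you rarely need to clear caches by hand - Drupal invalidates exactly the entries affected by a change.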

    Ready to get the most out of Drupal?  Schedule a free consultation with an Ashday Drupal Expert. 

    Tip #2: Remove modules you don't need

    Drupal, by default, usually comes with a whole suite of modules installed, some of which not every site needs. These include, for example, the Tour module (for creating tutorial-style interfaces which highlight certain parts of your site) and the Search module (which provides Drupal's default searching mechanism and is useful only when your site doesn't warrant a different search solution). If you don't actually need a module, uninstall it, and if it is a contrib module rather than a core module, you can then remove it from your site's code entirely.

    Every unnecessary module you have on your site can add clutter to the admin UI which makes it harder to find the things you actually want, and since each module can have its own potential security risks, uninstalling the ones you don't need can even help improve your site's security and stability.

    Other core modules which are good candidates to consider removing are CKEditor (for sites which don't need WYSIWYG content), Color (for when you're using a custom theme and don't need to change its appearance through the UI), and Comment (if your site doesn't allow users to comment on content anyway). And that's just core modules in the C's!

    Just be careful not to uninstall modules such as the Internal Page Cache module, which may not be specifically required to provide the site's intended functionality but which are important for keeping your site working smoothly. Consider each enabled module individually (and then do the same with enabled themes!).
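If you use Drush (assumed available here), the uninstall-then-remove workflow looks roughly like this; the module names are the ones discussed above, and the Composer package name is a placeholder:

```shell
# Uninstall modules you have confirmed are unused.
drush pm:uninstall tour search

# For a contrib module, also remove its code from the project
# once it is uninstalled (package name is a placeholder).
composer remove drupal/example_contrib_module
```

Always uninstall through Drupal (or Drush) before deleting code - removing a module's files while it is still installed leaves the site in a broken state.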

    Tip #3: Use the latest version of PHP

    Drupal is written in PHP and is designed to take advantage of the new features and performance improvements provided by its latest versions. Although Drupal 8 can run on PHP versions as old as 5.5, it is now optimized for and fully compatible with PHP 7.2, and so a simple PHP version update can be a great benefit for your site's speed and reliability. You can check which PHP version your site is on from Drupal's Status report, and your hosting provider should provide a way to upgrade PHP if necessary.

    Important: Versions of Drupal prior to 8.5 are only compatible up to PHP 7.1. If you're on Drupal 8.4 or older, you should be sure to update Drupal (which is an important thing to do anyway to get all of its latest features, bug fixes, and security updates) prior to switching to the new version of PHP.

    Tip #4: Manage config the Drupal way

    We’ve spoken a bit before about configuration management in Drupal 8, but it’s important enough to be worth mentioning again. Back in Drupal 7 and earlier, deploying changes to a site typically involved recreating a whole bunch of "clicks" in the user interface, to arrange fields and blocks, API settings, user roles and permissions, and pretty much any other aspect of the site's configuration. Drupal 8 makes that a whole lot simpler with its configuration management system. Although managing config can get quite complex for some sites, for most it is simple. Once you've made a bunch of changes on your development site that you want to roll out to live, you can export those changes into a zipped collection of YAML files. You can then upload that zipped file directly to your live site to import the changes, or save the YAML files into your codebase and roll them out alongside your code changes. We prefer the latter method since it also has the benefit of keeping your config in your version control system, but either method works fine, and a direct upload of config can be a bit simpler to manage.
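The codebase-based workflow described above can be sketched with Drush (assumed installed); the sync directory location depends on your settings.php:

```shell
# On the development site: dump active configuration to YAML files.
drush config:export -y

# Commit the exported YAML alongside your code changes.
git add config/ && git commit -m "Export config changes"

# ...deploy the code to the live site, then on live:
drush config:import -y   # apply the YAML changes to the live site
```

Keeping config in Git gives you a reviewable diff of every settings change, which the zip-upload method does not.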

    Tip #5: Join the community

    Drupal is open source software and is built by a large community of developers and designers from across the world, and joining that community by signing up for an account at drupal.org can result in some tangible benefits for your site. One easy benefit of this is that if you come across a bug in Drupal or one of the contributed modules you are using, or even if you just find that some feature you'd like it to have is missing, you can post a message in the drupal.org issue queues to get a discussion started with the very people who can make the sort of improvements you want. Often, if you search the issue queues, you may even find that somebody else has thought of the same thing you have, and there might even be a patch already available to give you the functionality you want. 

    If you've had to create any custom modules or themes for your site, and they may be the sort of thing other people could find useful as well, it may be worth considering contributing them. If other people start looking at and using your custom modules, they may find ways to improve them and may even submit patches to fix bugs or add new features. Then it's easy to update your modules with their recommendations, making them even more useful both for you and for the rest of the Drupal community.

    There you have it. Five more tips to get the most out of Drupal. Some of these might seem obvious, but they are some big wins you can make for yourself and your website. We’ve been working with Drupal long enough that some of these seem like second nature and in time they may also be that way for you. Stay tuned for more Drupal tips in the future!

    Offer for a free one-hour consultation, make you next project a success

    Sep 28 2018
    Sep 28

    Drupal 8.6 was released a couple weeks ago and it’s probably the most exciting release since Drupal 8.0. As you might know, new features are added with each minor release of Drupal 8 (e.g. between 8.5 and 8.6). At first, I thought that this would just change how we test and update our sites. But it’s amazing to see how many new, valuable features are being added in minor versions. These are the features that allow Drupal to constantly evolve and innovate, and keep everyone excited about using Drupal.

    Also, minor releases that add features are a great reason to keep your Drupal site up-to-date with the latest minor version!

    I tried out Drupal 8.6 the other day and here are some of the highlights. Note that some of these features (Media management, Workspaces) are provided by experimental modules. They are not ready to use in production yet, but are ready to be tested out in development and sandbox environments:


    Media Management

    As a Drupal site builder, the media features are a huge step forward. I watch a lot of content editors use Drupal, and it’s clear that having media editing work smoothly greatly improves the content editing experience. From the Admin UX research I’ve worked on, better media management is one of the top things that content editors want.

    So, what does media in core provide? You can now add media (images, video, audio, etc) through the WYSIWYG editor and via a new media field. You can re-use media that’s already been added to the site, or upload new items. You can also manage the media via an overview page and add new media items directly without creating content.

    Screenshot Drupal media library


    Quickstart Command

    Drupal 8.6 comes with a Quickstart command that lets you install Drupal on your machine with a limited number of requirements. This makes it really easy to test out Drupal without installing other software, configuring a VM, or finding a vendor that provides cloud hosting.

    I think it’s great to have a feature like this out-of-the-box so that we can have a better experience for newcomers to Drupal. In fact, there’s already updated documentation on Drupal.org about how to install a quick version of Drupal.
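For reference, the command is run from the root of a Drupal 8.6+ codebase and needs only a recent PHP with SQLite support - no separate web server or database:

```shell
# Install and serve a throwaway Drupal site using PHP's built-in
# web server and SQLite.
php core/scripts/drupal quick-start

# Or pass an install profile, e.g. the Umami demo:
php core/scripts/drupal quick-start demo_umami
```

The command prints a one-time login URL and opens the site in your browser; the whole installation lives in a local SQLite file you can simply delete afterwards.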

    Thanks to Matt Grasmick for putting this together!

    Out-of-the-box Demo

    At DrupalCon Nashville, I tested out the new Umami install profile, which provides an out-of-the-box demo of Drupal. When you install Drupal, you’ll now see Umami as an option on the install profile step. Umami comes with content, content types, views, and a theme for a recipe website. I think this profile, along with the Quickstart feature, will allow developers and site builders new to Drupal to easily test out and demo its features.

    Screenshot of Umami. Vegetarian pasta bake


    Stable Migrate Module

    Migrate has been around since the first release of Drupal 8; it’s the module that allows you to pull content into Drupal 8 from previous versions of Drupal or from external sources. Migrate is now a stable module, which means that it will be easier for developers to create custom migrations without worrying about changes to the underlying code. This will also make it easier to write documentation and blog posts about how to do things with Migrate.

    There are some features around migrating multilingual content which have been set aside in a separate module (Migrate Drupal Multilingual). This module is an experimental module, as there is still some outstanding work to be done in this area.


    Workspaces

    You are probably wondering: what is “Workspaces”? This is a new, experimental module that allows a site administrator to create a new, parallel version of the site content - e.g. a Staging workspace - that can be deployed to the live site in one go. In Drupal 8.5, content moderation was introduced to Drupal, providing a workflow for content to be drafted, reviewed, and approved by different types of users. Workspaces takes this to the next level, allowing entire sections of content to be staged before publishing.

    More Under the Hood

    Besides new modules, there have been other improvements made to Drupal under the hood. There have been updates to the experimental Layout Builder module. It is now possible to create blocks via the layout builder interface, which will not show up in the global list of blocks. The process of porting tests from Simpletest to PHPUnit is almost done. Nightwatch.js was added to allow for automated javascript testing.

    What’s next?

    There are lots of new features planned for Drupal 8.7 including support for JSON API in core, potentially a refresh of the default Drupal admin theme (Seven) and work on features like automatic upgrades. Looking forward to seeing what’s next with Drupal in that release, which will come out early next year. Watch the latest DriesNote here, from Drupal Europe for an overview of the Drupal roadmap and new development in the works.

    You can get more information from the blog post on drupal.org and the Drupal 8.6 press release.

    Let us know in the comments what’s your favourite part of Drupal 8.6!

    Sep 28 2018
    Sep 28

    Just imagine... automatic updates in Drupal core.

    Such a feature would put an end to all those never-ending debates and ongoing discussions taking place in the Drupal community about the expectations and concerns with implementing such an auto-update system.

    Moreover, it would be a much-awaited upgrade for all those users who've been looking for (not to say “longing for”) ways to automate Drupal core and module updates for... years now. Who've been legitimately asking themselves:

    “Why doesn't Drupal offer an auto-update feature like WordPress?”

    And how did we get this far? From idea to a steady-growing initiative?

    1. first, it was the need to automate Drupal module and security updates
    2. then, issue queues filled with opinions grounded in skepticism, valid concerns, and high hopes started to “pile up” on Drupal.org,
    3. then, there was Dries' keynote presentation at Drupalcon Vienna in 2017, raising awareness around the need to re-structure Drupal core in order to support a secure auto-update system
    4. … which grew into the current Auto Update Initiative
    5. that echoed, recently, at Drupal Europe 2018, during the “Hackers Automate but the Drupal Community still Downloads Modules from Drupal.org” session

    Many concerns and issues have been pointed out. Many questions have been added to the long list.

    Yet, one thing's for sure:

    There still is a pressing, ever-growing need for an auto-update feature in Drupal...

    So, let me try my best to answer some of your questions regarding this much-awaited addition to Drupal core:

    • What's in it for you precisely? How will an auto-update pre-built feature benefit you? 
    • Does the user persona profile suit you, too? Is it exclusively low-end websites that such a feature would benefit? Or are enterprise-level, company websites targeted, as well?
    • What are the main concerns about this implementation?

    1. The Automatic Updates Initiative: Goal & Main Challenges 

    Let's shift focus for a moment and review the inconveniences of manually installing updates in Drupal:

    • it's time-consuming
    • it can get risky if you don't know what you're doing
    • it can be an intimidatingly complex process if you have no dedicated Drupal support & maintenance team to rely on
    • it can get quite expensive, especially for a small site or blog owner

    See where I'm heading?

    This initiative's main objective is to spare Drupal users all of these... inconveniences when it comes to updating and maintaining their websites. Inconveniences that can easily grow into reasons why some might be too discouraged to adopt Drupal in the first place.

    The goal is to develop an auto-update mechanism for Drupal core conceptually similar to those already implemented on other platforms (e.g. WordPress).

    And now, let's dig up and expose the key challenges in meeting this goal:

    • enabling update automation in Drupal core demands a complete re-engineering of the codebase; it calls for restructuring its architecture and code layout in order to support a perfectly secure auto-update system
    • such an implementation will have a major impact on the development cycle itself, causing unwanted disruption
    • such a built-in auto-update feature could get exploited for distributing and injecting malware into a whole mass of Drupal websites

    2. Automatic Updates in Drupal: Basic Implementation Requirements 

    What would be the ideal context for implementing such a perfectly secure auto-update system? 

    Well, its implementation would call for:

    • multiple (up to date) environments
    • released updates to be detected automatically and instantly
    • an update pipeline for quality assurance
    • existing automated tests with full coverage
    • a development team to review any changes applied during the update process 

    3. How Would These Auto-Updates Benefit You, the Drupal User?

    Well, let's see, maybe answering these key questions would help you identify the benefits that you'd reap (if any):

    • is your Drupal website currently maintained by a professional team?
    • has it been a... breeze for you so far to cope with Drupal 8's release cycle (one new patch each month and a new minor release every 6 months sure claim a lot of your time)?
    • have you ever got tangled up in Composer's complexities and a whole load of third-party libraries when trying to update your Drupal 8 website?
    • did you run the Drupalgeddon update fast enough?
    • have you been secretly wishing for a functionality that would just update Drupal core and modules, by default, right on the live server?

    To sum up: having automatic updates in Drupal core would keep your website secured and properly maintained without you having to invest time or money for this.

    4. Drupal Updating Itself: Main Concerns

    And concerns increase exponentially as the need for update automation in Drupal rises (along with the expectations).

    Now, let's outline some of the most frequently expressed ones:

    • there is no control over the update process, no quality assurance pipeline; basically, there's no time schedule system enabling you to test any given update, in a development environment, before pushing it live
    • there's no clearly defined policy on what updates (security updates only, all updates, highly critical updates etc.) should be pushed
    • with Drupal updating itself, rolling back changes would no longer be possible (or would be discouragingly difficult) without Git for version control
    • again: automatic updates in Drupal could turn into a vulnerability for hackers to exploit for a mass malware attack 
    • there's no clear policy regarding NodeJS, PHP and all the JS libraries in Drupal 8, all carrying their own vulnerabilities, too
    • it's too risky with all those core and module conflicts and bugs that could break through
    • such a feature should be disabled by default; thus, it would be every site owner's decision whether to turn it on or not
    • could this auto-update system cater to all the possible update workflows and specific behaviors out there? Could it meet all the different security requirements?

    So, you get the point: no control over the update pipeline and no policy for handling updates are the aspects that concern developers the most.

    5. Does It Cater for Both Small & Enterprise-Level Websites' Needs? 

    There is this shared consensus that implementing automatic updates in Drupal core would:

    1. not meet large company websites' security requirements; that it would not fit their specific update workflows
    2. benefit exclusively small, low-end websites that don't benefit from professional maintenance services

    Even the team behind the automatic updates initiative have prioritized low-end websites in their roadmap.

    But, is that really the case?

    Should this initiative target small websites, with simple needs and writable systems, that rarely update, and overlook enterprise-level websites by default?

    Or should this much-wanted functionality be adjusted so that it meets the latter's needs, as well? 

    In this case, the first step would be building an update pipeline that would ensure quality.

    What do you think?

    6. How About Now? "What Are My Options for Automating Updates in Drupal?"

    In other words: what are the currently available solutions if you want to automate the Drupal module and security updates? 

    6.1. You Can Use Custom Scripts to Automate Updates

    … ones executed by Jenkins or another CI platform. 

    Note: do bear in mind that properly maintaining a heavy load of scripts and keeping up with all the new libraries, tools, and DevOps changes won't exactly be child's play. Also, with no workflow and no integrated tools, ensuring quality is going to be a challenge to consider.
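As a rough illustration of such a CI-executed script (the steps, flags, and ordering below are a sketch for a Composer-managed site with Drush available, not a complete pipeline):

```shell
# Minimal update script a Jenkins (or other CI) job might run.
set -e                                           # abort on first failure

composer update drupal/core --with-dependencies  # fetch the new release
drush updatedb -y                                # apply pending update hooks
drush cache:rebuild                              # clear stale caches
drush status                                     # basic smoke check
```

A real pipeline would also run your automated tests against a staging copy before any of this touches production - which is exactly the quality-assurance gap the note above warns about.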

    6.2. You Can Opt for a Drupal Hosting Provider's Built-In Solution

    “Teaming up” with a Drupal hosting provider that offers automated update services is another option at hand.

    In this respect, solutions for auto-updating, such as those provided by Pantheon or Acquia, could fit your specific requirements. 

    Note: again, you'll need to consider that these built-in solutions do not integrate with your specific DevOps workflows and tools.

    And my monologue on automatic updates in Drupal ends here, but I do hope that it will grow into a discussion/debate in the comments here below:

    Would you turn it on, if such a feature already existed in Drupal core?

    1. Definitely yes
    2. No way
    3. It depends on whether...
    Sep 27 2018
    Sep 27

    As of Fall 2018 and the release of Drupal 8.6, the Migrate module in core is finally stabilizing! Hopefully Migrate documentation will continue to solidify, but there are plenty of gaps to fill.

    Recently, I ran into an issue migrating Paragraph Entities (Entity Reference Revisions) that had a few open core bugs and ended up being really simple to solve within prepareRow in the source plugin.


    In my destination D8 site, I had a content type with a Paragraph reference field. Each node contained one or more Paragraph entities. This was reflected by having a Node migration with a dependency on a Paragraph Entity migration.

    With single value entity references, the migration_lookup plugin makes it really easy to look up entity reference identifiers that were previously imported. As of September 2018, there is an open core issue to allow multiple values with migration_lookup. Migration lookup uses the migration map table created in the database to connect previously migrated data to Drupal data. The example below looks up the taxonomy term ID based on the source reference (topic_area_id) from a previously run migration. Note: You will need to add a migration dependency to your migration yml file to make sure migrations are run in the correct order.

       plugin: migration_lookup
       migration: nceo_migrate_resource_category
       no_stub: true
       source: topic_area_id


    Without a Drupal core patch, we need a way to do the migration lookup manually. Thankfully, prepareRow in your Migrate source plugin makes this pretty easy.

    Note: This is not a complete Migrate source plugin. All the methods are there, but I’m focusing on the prepareRow method for this post. The most important part of the code is manually querying the Migrate map database table created in the Paragraph Entity migration.

    namespace Drupal\your_module\Plugin\migrate\source;

    use Drupal\Core\Database\Database;
    use Drupal\migrate\Plugin\migrate\source\SqlBase;
    use Drupal\migrate\Row;

    /**
     * Source plugin for Sample migration.
     *
     * @MigrateSource(
     *   id = "sample"
     * )
     */
    class Sample extends SqlBase {

      /**
       * {@inheritdoc}
       */
      public function query() {
        // Query source data.
      }

      /**
       * {@inheritdoc}
       */
      public function fields() {
        // Add source fields.
      }

      /**
       * {@inheritdoc}
       */
      public function getIds() {
        return [
          'item_id' => [
            'type' => 'integer',
            'alias' => 'item_id',
          ],
        ];
      }

      /**
       * {@inheritdoc}
       */
      public function prepareRow(Row $row) {
        // In migrate source plugins, querying the migrate (source) database
        // is easy. Example: $this->select('your_table').
        // Getting to the Drupal 8 database requires a little more code.
        $drupalDb = Database::getConnection('default', 'default');
        $paragraphs = [];
        $results = $drupalDb->select('your_migrate_map_table', 'yt')
          ->fields('yt', ['destid1', 'destid2'])
          ->condition('yt.sourceid2', $row->getSourceProperty('item_id'), '=')
          ->execute();
        if (!empty($results)) {
          foreach ($results as $result) {
            // destid1 in the map table is the paragraph entity id.
            // destid2 in the map table is the paragraph revision id.
            $paragraphs[] = [
              'target_id' => $result->destid1,
              'target_revision_id' => $result->destid2,
            ];
          }
        }
        // Set a source property that can be referenced in yml.
        // Source properties can be named however you like.
        $row->setSourceProperty('prepare_multiple_paragraphs', $paragraphs);
        return parent::prepareRow($row);
      }

    }

    In your migration yml file, you can reference the prepare_multiple_paragraphs source property that was created in the migrate source plugin like this:

    id: sample
    label: 'Sample'
    source:
      plugin: sample
    process:
      type:
        plugin: default_value
        default_value: your_content_type
      # Your paragraph reference field on the destination content type.
      field_paragraphs:
        plugin: sub_process
        source: prepare_multiple_paragraphs
        process:
          target_id: target_id
          target_revision_id: target_revision_id

    The sub_process plugin was formerly the iterator plugin and allows you to loop over items. This will properly create references to multiple Paragraph entities. It will be nice when the migration_lookup plugin can properly handle this use case, but it’s good to understand how prepareRow can provide flexibility.

    Sep 27 2018
    Sep 27
    Mike and Matt talk with the Drupal Association's Senior Events Manager, Amanda Gonser, about upcoming changes to DrupalCon events.
    Sep 27 2018
    Sep 27

    This is the seventh and (promise!) penultimate installment in a series presenting work on shared configuration that comes out of the Drutopia initiative and related efforts, beginning with Part 1, Configuration Providers.

    In this series we've covered how to create and update reusable packages of configuration in Drupal, otherwise known as features.

    In Part 6, we saw how the Features module can be used to package configuration that will be used by multiple different features into a "core" feature. An example is when multiple fields use the same storage. A core feature might provide a field_tags field storage, allowing multiple features to add a field_tags field to different content types. All the features that provide a field would require the core feature.

    This approach helps to manage dependencies among different features, but it has at least two major shortcomings.

    • Any site that wants to install even a single feature that's dependent on the core feature will get all the core configuration--whether or not it's needed. For example, if the core feature provides five field storages but only one is required by the dependent feature, all five will still be created on the site.
    • Features from different sets or distributions will have conflicting dependencies. Say we have two different distributions, A and B. An event feature from distribution A requires the distribution A core feature, which provides the field_tags field storage. An article feature from distribution B requires the distribution B core feature, which provides an identical field_tags field storage. The event feature should theoretically be compatible with the article feature. But in practice they can't be installed on the same site, since an attempt to install both core features will raise an exception because configuration provided by the first-installed core feature will already exist on the site when the second is queued for installation.

    In this installment we'll look at options for managing shared configuration that's required across multiple features--or multiple distributions.

    Namespaced configuration

    One workaround that's sometimes taken to address the issues of core configuration is to ensure individual configuration items are unique by giving them a namespace prefix.

    There are two ways this could go. The first keeps the concept of a core feature but namespaces everything in it by feature set or distribution. In this approach, the core feature from distribution A provides a field storage called a_field_tags, while the B distribution's core feature provides b_field_tags.

    More radically, in the second approach it's every feature for itself. Even within the same feature set or distribution, no configuration items are shared between features. An event feature from distribution A would provide its own tags field storage, such as a_event_field_tags, while the article feature from distribution B would provide b_article_field_tags.

    Either of these approaches would work to avoid conflicting dependencies. But they raise problems that are just as vexing as the ones they attempt to address.

    On a given site, you don't really want eight field storages for tags, one per feature you've installed. Nor, of course, do you want eight different tags vocabularies, each with its own set of tags. You want one that will work site-wide. To take just one example, say I want to ship my distribution with a sitewide content search feature that includes the ability to filter results by tags (a taxonomy). I need all features on the site to use the same tags field referencing the same taxonomy--not eight different versions. See this issue on the Search API module for further details.

    Namespacing configuration entities is a workaround that avoids but doesn't successfully address the key challenges here.

    Configuration Share

    The Configuration Share module is written to address the two big shortcomings of core features.

    In Part 1 of this series we looked at the Configuration Provider module, which enables alternate configuration provision patterns beyond the required and optional directories supported by Drupal core.

    Configuration Share provides an additional plugin, shared. Commonly needed configuration items like user roles and field storages can be designated as shared by placing them in a module's config/shared directory. Shared configuration works similarly to optional configuration as supported by Drupal core in that it's not necessarily installed when a module providing it is installed. But while optional configuration is installed after its dependencies have been met, shared configuration is installed only on demand--when another piece of configuration that requires it is about to be installed.

    The basic logic:

    • When any extension is installed, the configuration to be installed is passed to all configuration provider plugins.
    • The shared plugin has a high weight and so runs after other plugins, including the ones for required and optional config.
    • The shared plugin analyzes all shared configuration with reference to the dependencies of the configuration queued for installation. If any shared configuration is required by queued configuration but is not yet installed, it is added to the list of configuration to be installed--along with any shared configuration it in turn requires.

    In this way:

    • A given site will have only the specific shared configuration that is required.
    • Multiple features can share the same required configuration items without creating conflicting dependencies, since shared configuration is only queued for installation if it's not already on the site.
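    As a concrete illustration, a module using Configuration Share might place a standard field storage export in its config/shared directory. The file below is a hypothetical sketch of what such an item could look like, not taken from an actual feature:

```yaml
# config/shared/field.storage.node.field_tags.yml (hypothetical example)
langcode: en
status: true
dependencies:
  module:
    - node
    - taxonomy
id: node.field_tags
field_name: field_tags
entity_type: node
type: entity_reference
settings:
  target_type: taxonomy_term
module: core
locked: false
cardinality: -1
translatable: true
```

    Any feature whose field_tags field depends on this storage would trigger its installation on demand; a site that installs no such feature never gets the storage at all.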


    While Configuration Share is one potential approach to the challenges of base configuration, it's at best an enabler.

    In theory, two different distributions could both provide, say, a field_body storage and each add body fields to their respective features. But how do we know the two body fields will work together? One distribution might use Drupal core's text_with_summary field type for a field_body, while another might use Paragraphs for its body field, meaning that field_body would be of type entity_reference_revisions. A shared name - in this case, field_body - is no guarantee of compatibility.

    The real work of interoperable sets of features would require collaboration among the developers of multiple distributions to co-maintain one or more curated sets of standardized core configuration. The Compatible module is a preliminary - and thus far undeveloped - step in that direction. It's intended as a repository for a standard set of configuration that can be shared among multiple extensions or distributions.

    Next up

    Stay tuned for the last installment in this blog series: Summary and Future Directions.

    Sep 27 2018
    Sep 27

    There are a couple of online tools and integration modules that add a sharing widget to your site. They rely on JavaScript, and the security of your users is questionable. This article will show you how to create a simple yet flexible and safer sharing widget without a line of JavaScript.


    The main reason not to use tools like AddToAny is security. This is often a requirement for government or other public-facing projects such as GovCMS. The sharing widgets of these services do not connect directly to the social service; the request is processed on their servers first, and they can track the user across the web thanks to the fingerprint they create. Another reason is that the JS code is often served from a CDN, so you don’t know when the code changes, or how. Have they slipped in some malicious script? I don’t want this on my site. And often clients don't either. :)

    Thankfully, each service provides a simple way to share content, and we will use that.

    Final example

    You can see the final result in action with different styling applied at our example GovCMS 8 demo page (scroll down to the bottom of page).

    Site build

    First we need to prepare the data structure. For our purpose we will need to create a custom block type, but it can be easily done as a paragraph too.

    Custom block name: Social Share
    Machine name: [social_share]

    And throw in a few Boolean fields, one for each service.

    Field label: [Human readable name] e.g. “Twitter”
    Machine name: [machine_name] e.g. “social_share_twitter” – this one is important and we will use it later.

    Go to the manage display screen of the block (/admin/structure/block/block-content/manage/social_share/display) and change the Output format to Custom. Then fill in the Custom output for TRUE with the text you like to see on the link e.g. "Share to twitter".

    Social Share labels

    Now we are able to create a new block of the Social Share type and check some of these checkboxes. Users will see only the labels as a result.


    The fun part is changing the output of the field from a simple label to an actual share link.
    First we need to know what the final link looks like.
    Link examples:

    mailto:?subject=Interesting page [PAGE_TITLE]&body=Check out this site I came across [PAGE_URL]
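    The other services follow the same pattern of a plain GET URL. The endpoints below are the commonly documented share URLs at the time of writing; verify them against each service's current documentation before relying on them:

```
https://twitter.com/intent/tweet?status=[PAGE_TITLE]+[PAGE_URL]
https://www.facebook.com/sharer/sharer.php?u=[PAGE_URL]
https://www.linkedin.com/shareArticle?mini=true&url=[PAGE_URL]&title=[PAGE_TITLE]
```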

    To get it to work we need the current page URL, the page title, and the base path. Only the page URL is directly accessible from the Twig template. The other two need to be prepared in preprocess. Let's add these in the theme_name.theme file.

    /**
     * Implements template_preprocess_field().
     */
    function theme_name_preprocess_field(&$variables, $hook) {
      switch ($variables['field_name']) {
        case 'field_social_share_twitter':
          $request = \Drupal::request();
          $route_match = \Drupal::routeMatch();
          $title = \Drupal::service('title_resolver')
            ->getTitle($request, $route_match->getRouteObject());
          if (is_array($title)) {
            $variables['node_title'] = $title['#markup'];
          }
          else {
            $variables['node_title'] = (string) $title;
          }
          $variables['base_path'] = base_path();
          break;
      }
    }

    As we will probably have more than one service, we should use the DRY approach here. So we create an extra function for the variable generation.

    /**
     * Preprocess field_social_share fields.
     */
    function _theme_name_preprocess_field__social_shares(&$variables) {
      $request = \Drupal::request();
      $route_match = \Drupal::routeMatch();
      $title = \Drupal::service('title_resolver')
        ->getTitle($request, $route_match->getRouteObject());
      if (is_array($title)) {
        $variables['node_title'] = $title['#markup'];
      }
      else {
        $variables['node_title'] = (string) $title;
      }
      $variables['base_path'] = base_path();
    }

    We then call it for the various cases. If some service needs more variables, it will be easy to add them in a separate function, so we don't process what's not required.

    /**
     * Implements template_preprocess_field().
     */
    function theme_name_preprocess_field(&$variables, $hook) {
      switch ($variables['field_name']) {
        case 'field_social_share_facebook':
        case 'field_social_share_twitter':
        case 'field_social_share_linkedin':
        case 'field_social_share_email':
          _theme_name_preprocess_field__social_shares($variables);
          break;
      }
    }

    Now we have the Node title and Base path prepared to be used in field templates.

    Enable Twig debugging and look in the markup for the checkbox. You will see a couple of suggestions; the one we are looking for is field--field-social-share-twitter.html.twig.

    As the output should be a single link item, it is safe to assume we can remove the labels condition and the single/multiple check as well. On the other hand, we need to ensure that if the checkbox is unchecked it will not output any value. That is particularly hard in Twig, as it doesn't have any universal information about the state of a checkbox; it only has access to the actual value. And since we don't know the value of the custom label, we cannot use it. However, there is a small workaround we can use. Remember that we have not set a custom output for the FALSE value.
    We can check if the field is outputting any #markup. The empty FALSE value will not produce anything, hence the condition will fail.

    {% if item.content['#markup'] %}

    Here is the full code for field template:

    {% set classes = [] %}{# Add the classes you need here. #}
    {% for item in items %}
      {% if item.content['#markup'] %}
        <a {{ attributes.addClass(classes) }} href="http://twitter.com/intent/tweet?status={{ node_title }}+{{ url('<current>') }}" title="Share to {{ item.content }}">{{ item.content }}</a>
      {% endif %}
    {% endfor %}

    For other services you need to adapt it. But it will still follow the same pattern.
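    As an illustration, a hypothetical field--field-social-share-facebook.html.twig could keep the same structure and swap in Facebook's sharer endpoint (the class list is a placeholder to fill in as needed):

```twig
{% set classes = [] %}{# Placeholder; add the classes you need. #}
{% for item in items %}
  {% if item.content['#markup'] %}
    <a {{ attributes.addClass(classes) }} href="https://www.facebook.com/sharer/sharer.php?u={{ url('<current>') }}" title="Share to {{ item.content }}">{{ item.content }}</a>
  {% endif %}
{% endfor %}
```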

    And we are done. Now your block should render links that share the current page to each service.

    Social Share links

    Pro tip:

    So far we have not used any contrib module. But obviously your client would like to have some fancy styling applied. You can add everything in the theme, but that will be only one hardcoded option. To make editors' lives easier, you can use the Entity Class formatter module to add classes to the block from a select list. You can provide multiple select lists for size, color, rounded corners, style, etc.


    At this point we have the simple social share widget ready. We can select which predefined services show in each instance and how they look. E.g. on a blog post you can have sharing for Twitter, Facebook, and Email styled as small rounded icons, while another instance of the block shows only a large squared LinkedIn icon + label on the Job offering content type.

    Social Share icons

    Further notes

    After I wrote the first draft of this article, a new module appeared which works in a very similar way: give Better Social Sharing Buttons a try. It will be quicker to get up and running as it has predefined styles and services, but that can be a drawback at the same time: if you need a different style or an extra service, it can be harder to add.

    Sep 27 2018
    Sep 27

    One of the reasons why The New York Times is able to keep up with its growing user base is its inclination towards technological advancements. That was evident when it leveraged the power of microservice architecture via a remodelled video publishing platform to scale with newsroom demands. It also moved its infrastructure to the cloud, which resulted in a stable and scalable email platform, powered by a suite of microservices, for sending emails to readers.

    Illustration showing a back office of news media company with several people working and paper being circulated in the machines

    Why are big enterprises like The New York Times leaning towards microservices? Microservices have grown exponentially and hold a promising future for digital businesses. It will be interesting to see how a traditional CMS like Drupal finds a place in the world of microservices. But before plunging into all that, one might wonder: where did this ‘microservices’ thing originate from?

    Tracing the roots in the UNIX world

    New Relic has compiled an interesting and brief timeline of the evolution of microservices. Microservices have their roots in the Unix world, which takes us back more than three decades.

    As a term, microservices was first documented in 2011 by Martin Fowler

    Service-oriented architecture (SOA), a design principle where services are offered to other components by application components via communication protocol over a network, was all the rage decades ago. Due to a superabundance of failures and costly implementations, the SOA earned a poor reputation and took a backseat. Martin Fowler, among others, has said that microservices are a new spin on SOA.

    As a term, it was first documented in 2011 by Fowler at a software architects’ workshop.

    In 2012, a presentation was given by James Lewis at the 33rd Degree in Krakow which was titled “Microservices - Java, the Unix Way”. This delineated microservices as a means of building software more rapidly by dividing and conquering and used Conway’s Law to structure teams.

    Since that time, the adoption of microservice architecture has grown and many organisations are going for microservices as their default style for building enterprise applications.

    Understanding the terminology

    Illustration showing four boxes with text explaining microservices. Source: LeanIX GmbH

    What are microservices? Microservices are an architectural style for splitting a monolithic application into smaller pieces. Each of those pieces offers a certain function through a well-defined and carefully managed API.

    The collection delivers the same overall business value as the monolithic application, the difference being that these individual pieces work independently. That means they can be updated swiftly without impacting the entire application.

    “The microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and independently deployable by fully automated deployment machinery. There is a bare minimum of centralized management of these services, which may be written in different programming languages and use different data storage technologies”. - Martin Fowler

    "A microservice architectural style is an approach to develop a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API".

    Netflix is an unsurpassed example of microservices adoption. It moved from a traditional development model with several engineers producing a monolithic DVD-rental application to a microservices architecture. Small teams could focus on the end-to-end development of hundreds of microservices that work together to serve digital entertainment to millions of Netflix customers every day.

    Flowchart with the logos of Amazon and Etsy showing the difference between monolithic and microservices architectures. Source: LeanIX GmbH

    The main difference between monolithic and microservices architecture, as can be seen in the depiction above, is that in a monolith all the features and functionalities are under a single umbrella: a single instance sharing a single database. With microservices, each feature is allotted to a different microservice that manages its own data and performs a different set of functionalities.

    How good or bad are microservices?

    Illustration with boxes on the left and right sides showing challenges and solutions of microservices. Source: Logentries

    The benefits of microservices are laid out below:

    • Autonomous deployments: You can update a service without having to redeploy the entire application and rollback or roll forward an update during mishaps. Fixing bugs and feature releases are much more manageable with fewer challenges.
    • Autonomous development: Building, testing, and deploying a service needs only a single development team, leading to continual innovation and a swift release cadence.
    • Small teams: Teams can lay their focus onto one service thereby simplifying the understanding of the codebase with the smaller scope for each service.
    • Isolation of faults: Downtime in one of the services won’t affect the overall application. This does not mean that you get resiliency for free.
    • Tech stack mixture: Technology that is deemed most fit for a service can be selected by the teams.
    • Scalability at granular levels: Independent scaling of services is possible.

    Some of the challenges are outlined below:

    • Intricacy: A microservices application has more moving parts than the equivalent monolithic application.
    • Development and testing: Developing against service dependencies would need a different approach and testing service dependencies is difficult particularly when the application is evolving rapidly.
    • Lack of governance: The decentralised approach to building microservices may lead to numerous languages and frameworks, making the system harder to manage.
    • Network congestion and latency: Usage of granular services can result in more inter-service communication. Chances are that if the chain of service dependencies gets too elongated, additional latency can be a challenge.
    • Data integrity: Data consistency can be a hurdle with each microservice responsible for its own data persistence.
    • Management: Correlated logging across services can become a formidable task.
    • Update issues: Without careful design, several services updating at a given time could result in backward or forward compatibility issues.
    • Team skill-set: As highly distributed systems, microservices require a team with the right mix of skills and experience.

    Taking Drupal into the context

    Drupal is a monolith. How can it survive this trend of microservices? Drupal, being an amazing content management framework, provides a great content editing experience and has been pioneering digital innovation. With that being said, microservices architecture can be used for development and deployment of applications using Drupal. Let’s see how Drupal can fit into the scheme of things.

    Demonstration at DrupalCon Vienna 2017

    A presentation held at DrupalCon Vienna 2017 demonstrated an effective way of integrating Drupal 8 in a microservices architecture.
    Drupal 8 proved to be a useful content management framework for implementing this microservices architecture because of:

    • Symfony components,
    • Composer to manage external dependencies,
    • and the magnificent results of the Web Services and Context Core Initiative (WSCCI).

    [embedded content]

    It exhibited the delegation of asynchronous work from Drupal to a set of very reactive applications written in Go with some assistance from RabbitMQ queues. Elasticsearch was leveraged as a common data storage between services, and REST endpoints were exposed so that the services could notify Drupal back.
    Furthermore, methods of connecting a websocket server to push and pull messages between services were shown. To run all these services in a controlled and replicable manner, Ansible and Docker were used.

    Demonstration at Drupal Developer Days Lisbon 2018

    Another session at Drupal Developer Days Lisbon 2018 delineated how the citizen portal of the city of Reykjavik (Iceland) was relaunched using Drupal and microservices, incorporating more than 100 web services ranging from simple ones, like registering a dog or renewing a driver’s license, to intricate ones, like the admission of children to school or updating a residential address.

    [embedded content]

    Powered by Drupal 8, this new portal integrates the services in a microservices architecture using JSON Schema as the communication protocol. The microservices architecture was chosen to allow centralised data collection and presentation in a single portal while integrating a heterogeneous landscape of services independently from one another.

    Predictions ahead

    Oracle’s Cloud Predictions 2018 report states that by 2020, the lion’s share of new applications will be powered by microservices architectures.

    Open source has given a whopping push to the microservices architecture. Its several components support continuous integration and delivery pipelines, microservices platforms, containers, container management and orchestration, container registry service, and serverless capability.

    Open source has given a whopping push to the microservices architecture

    Adoption of cross-cloud containers like Docker and Kubernetes is on an upward trajectory, and developers consider an open cloud stack to prevent vendor lock-in.

    A bar graph showing the CAGR of microservices from 2016 to 2023. Source: Market Research Future

    According to a report on Market Research Future, the microservices architecture market is expected to reach $32.01 billion by 2023 with a Compound Annual Growth Rate (CAGR) of around 16.17% during the forecast period.

    Another report on Research and Markets for the forecast period of 2017 to 2023 states that as far as the ‘Market Analysis’ is concerned, the rise in the cloud adoption is integral for microservices market. This is because the microservices architectures function on smaller and simpler services. Also, there is a high demand from North American companies as they have implemented it in e-commerce, financial, and travel services. This has helped in storing data and information cost-effectively and enhanced the efficacy, agility and scalability.

    The report on Research and Markets has an interesting ‘Countries and Vertical Analysis’ vis-à-vis microservices. Most of the major players are in the American region, with prominent vendors covered in the report including the likes of Cognizant, IBM Corporation, Datawire, Salesforce, Infosys Ltd., MuleSoft Inc., and Software AG. Japan, the US, and China are expected to witness tremendous growth in microservices adoption.


    Microservices architectures streamline the overall application development lifecycle leading to quicker testing, higher quality and more releases. Such an architecture can be hugely useful for efficient management of Drupal-based projects. Innovation has always been something Drupal is greatly supportive of. Adopting a microservice architecture for Drupal development is possible and is extremely fruitful.

    Organisations should be mindful of their digital business ecosystem and should understand the challenges that they might encounter during its adoption. Opensense Labs has been in the constant pursuit of bringing positive change for our valued partners with our expertise in Drupal.

    Contact us at [email protected] to know more about microservices architectures and its value to your organisational setup.

    Sep 27 2018
    Sep 27

    Pairing Composer template for Drupal Projects with Lando gives you a fully working Drupal environment with barely any setup.

    Lando is an open-source, cross-platform local development environment. It uses Docker to build containers for well-known frameworks and services written in simple recipes. If you haven’t started using Lando for your local development, we highly recommend it. It is easier, faster, and relatively pain-free compared to MAMP, WAMP, VirtualBox VMs, Vagrant or building your own Docker infrastructure.


    You’ll need to have Composer and Lando installed:

    Setting up Composer Template Drupal Project

    If you want details about what you get when you install drupal-project, you can view the repo. Otherwise, if you'd rather simply set up a Drupal template site, run the following command.

    composer create-project drupal-composer/drupal-project:8.x-dev [your-project] --stability dev --no-interaction

    Once that is done running, cd into the newly created directory. You’ll find that you now have more than a basic Drupal installation.

    Getting the site setup on Lando

    Next, run lando init, which prompts you with 3 simple questions:

    ? What recipe do you want to use? > drupal8
    ? Where is your webroot relative to the init destination? > web
    ? What do you want to call this app? > [your-project]
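Your answers are saved to a .lando.yml file in the project root. Based on the prompts above, it should look roughly like this (a sketch; the app name is whatever you chose):

```yaml
# .lando.yml (generated by `lando init`)
name: your-project
recipe: drupal8
config:
  webroot: web
```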

    Once that is done provisioning, run lando start, which downloads and spins up the necessary containers and provides you with a set of URLs that you can use to visit your site:


    Setup Drupal

    Visit any of the URLs to initialize the Drupal installation flow. Run lando info to get the database details:

    Database: drupal8
    Username: drupal8
    Password: drupal8
    Host: database

    Working with your new Site

    One of the useful benefits of using Lando is that your toolchain does not need to be installed on your local machine; it can be installed in the Docker container that Lando uses, meaning you can use commands provided by Lando without installing other packages. The commands that come with Lando include lando drush, lando drupal, and lando composer. Execute these commands in your terminal as usual; they’ll run inside the container.

    Once you commit your .lando.yml file, others can use the same Lando configuration on their machines. Having this shared configuration makes it easy to set up local environments that are configured identically.

    Sep 26 2018
    Sep 26

    Now we're ready to map our variables in Drupal and send them to be rendered in Pattern Lab. If you're not familiar with it, I suggest you start by learning more about Emulsify, which is Four Kitchens' Pattern Lab-based Drupal 8 theme. Their team is not only super-helpful, they're also very active on the DrupalTwig #pattern-lab channel. In this case, we're going to render the teasers from our view as card molecules that are part of a card grid organism. In order to do that, we can simply pass the view rows to the organism with a newly created view template (views-view--tagged-content.html.twig):

    {# Note that we can simply pass along the arguments we sent via twig_tweak  #} 
    {% set heading = view.args.3 %}
    {% include '@organisms/card-grid/card-grid.twig' with {
      grid_content: rows,
      grid_blockname: 'card',
      grid_label: heading
    } %}

    Since the view is set to render teasers, the final step is to create a Drupal theme template for node teasers that will be responsible for mapping the field values to the variables that the card template in Pattern Lab expects.  

    Generally speaking, for Pattern Lab projects I subscribe to the principle that the role of our Drupal theme templates is to be data mappers, whose responsibility it is to take Drupal field values and map them to Pattern Lab Twig variables for rendering. Therefore, we never output HTML in the theme template files. This helps us keep a clean separation of concerns between Drupal's theme and Pattern Lab, and gives us more predictable markup (note "more", since this only applies to templates that we're creating and adding to the theme; otherwise, Drupal's render pipeline is in effect). Here is the teaser template we use to map the values and send them for rendering in Pattern Lab (node--article--teaser.html.twig):

    {% set img_src = (img) ? img.uri|image_style('teaser') : null %}
    {% include "@molecules/card/01-card.twig" with {
      "card_modifiers": 'grid-item',
      "card_img_src": img_src,
      "card_title": label,
      "card_link_url": url,
    } %}

    If you're wondering about the img object above, that's related to another custom module that I wrote several years ago to make working with images from media more user friendly. It's definitely out of date, so if you're interested in better approaches to responsive images in Drupal and Pattern Lab, have a look at what Mark Conroy has to say on the topic. Now, if we clear the cache and refresh the page, we should see our teasers rendering as cards (see "Up Next" below for a working version).

    Congrats! At this point, you've reached the end of this tutorial. Before signing off, I'll just mention other useful "configuration" settings we've used, such as "any" vs. "all" filtering when using multiple tags, styled "variations" that we can leverage as BEM modifiers, and checkboxes that allow a content creator to specify which content types should be included. The degree of flexibility required will depend on the content strategy for the project, but the underlying methodology works similarly in each case. Also, stay tuned, as in the coming weeks I'll show you how we've chosen to extend this implementation in a way that is both predictable and reusable (blocks, anyone?).

    Sep 26 2018
    Sep 26

    We’ve had several great opportunities this summer to connect with the Drupal community and share our latest work on Drupal Commerce. We’ve been able to highlight specifically our efforts to progressively decouple Drupal Commerce on Drupal 8.

    Drupal Camp Asheville 2018
    Ryan Szrama gave a demo on Saturday, July 14, based on the Belgrade demo store that provided an overview of Commerce Cart API Flyout. We detailed this work in our recent blog post announcing the feature.

    A fully decoupled Drupal Commerce experience—including support for complex forms like checkout—is something that Commerce Guys is committed to delivering by the end of 2019. Until then, our strategy is to progressively decouple the product catalog and shopping cart to help sites scale in addition to opening new user interfaces. In Ryan’s words, “We started with the shopping cart because that’s the obvious way to help large websites avoid a common bottleneck for performance.”

    Watch Ryan’s session to learn more about the Commerce Cart API project and see the demo.

    Decoupled Drupal Days 2018
    Next, Matt Glaman presented his talk “The road to a headless Drupal Commerce future” at Decoupled Drupal Days in NYC.

    The session reviewed the development of the Commerce Cart API in greater depth, covering our research into the RESTful Web Services and contributed JSON API projects (potentially in core soon) as future dependencies that the Cart API can adopt. Matt demonstrated even more progress on the project since Ryan’s demo, including a fully decoupled React-based front-end.

    This talk put the progressively decoupled Drupal Commerce Add to Cart form and shopping cart on display for the community, with the expressed desire that Drupal-based merchants will have an out-of-the-box experience rivaling other major e-commerce software platforms.

    Drupal Europe 2018
    Matt’s session at Drupal Europe covered our latest developments in the Commerce Cart API and Flyout as part of the dedicated eCommerce track. This was an iteration of the Decoupled Drupal Days session, including the improvements and additions made in the time between Decoupled Drupal Days and Drupal Europe.

    If you’re interested in contributing to the roadmap for decoupling Drupal Commerce, connect with Matt to learn where to get involved or how to give us feedback from your implementations.

    Sep 26 2018
    Sep 26

    As the saying goes in computer science, only two things are extremely hard: naming things and invalidating the cache. Drupal 8 has an automatic caching system, activated by default, that is truly revolutionary: it provides a cache for anonymous visitors and, especially, for authenticated users without any configuration. This cache system is based on three basic concepts:

    • Cache tags
    • Cache contexts
    • Cache max-age

    Cache tags allow you to tag content, pages and page elements with very precise tags, making it easy to accurately invalidate all pages or page elements that carry those tags. Cache contexts allow you to specify the criteria by which the cache of a page can vary (by user, by path, by language, etc.), while the max-age property can be used to define a maximum duration of cache validity.
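To make these three concepts concrete, here is how they appear as cacheability metadata on a render array. This is a minimal sketch in plain PHP; the node ID, tag names and markup are illustrative examples, and -1 is the value of the Cache::PERMANENT constant:

```php
<?php

// Cacheability metadata as it appears on a Drupal render array.
// The node ID (42) and the markup are made-up examples.
$build = [
  '#markup' => '<p>Latest articles</p>',
  '#cache' => [
    // Cache tags: invalidated when node 42 or any node list changes.
    'tags' => ['node:42', 'node_list'],
    // Cache contexts: the output varies per user and per language.
    'contexts' => ['user', 'languages:language_interface'],
    // Max-age: -1 (Cache::PERMANENT) caches until a tag is invalidated.
    'max-age' => -1,
  ],
];
```

With this metadata, invalidating the node:42 tag (for example when that node is saved) automatically discards every cached page or fragment that carries it.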

    But the purpose of this post is not to go into the details of this cache system, but rather to illustrate the use of the Cache API to set up a custom cache for a specific use case. Imagine needing to build a hierarchical tree for a user, based on a very prolific taxonomy. Once the hierarchical tree is built, with all of its business data included, you can finally admire the fruit of your work... after a waiting time of 17s... Ouch.

    (Screenshot: a waiting time of 16s)

    In order to remedy this small inconvenience, we will set up a specific cache for our business needs and use the Cache API.

    Setting up a cache is relatively easy. We can either use the default cache provided by Drupal 8 (the cache.default service), or declare and use our own cache, which will then have a dedicated table and can be managed separately, which is especially useful if we want to put this cache on a third-party service such as Redis or Memcache.

    To declare our cache, let's create the my_module.services.yml file in the directory of our my_module module.

    services:
      cache.my_cache:
        class: Drupal\Core\Cache\CacheBackendInterface
        tags:
          - { name: cache.bin }
        factory: cache_factory:get
        arguments: [my_cache]

    This declares our cache bin, whose identifier is my_cache.

    Then, within our Controller, we have to condition the computation-intensive construction of our hierarchical tree on whether our cache entry exists. This translates into these additional lines:

    $cid = 'my_hierarchical_tree:' . $user->id();
    $data_cached = $this->cacheBackend->get($cid);
    if (!$data_cached) {
      // Expensive computation...
      // Store the tree into the cache.
      $this->cacheBackend->set($cid, $data, CacheBackendInterface::CACHE_PERMANENT, $tags);
    }
    else {
      $data = $data_cached->data;
      $tags = $data_cached->tags;
    }

    A complete Controller could then look like this example.

    namespace Drupal\my_module\Controller;

    use Drupal\Core\Cache\Cache;
    use Drupal\Core\Cache\CacheBackendInterface;
    use Drupal\Core\Controller\ControllerBase;
    use Symfony\Component\DependencyInjection\ContainerInterface;

    /**
     * Builds the cached hierarchical tree for the current user.
     */
    class MyModuleController extends ControllerBase {

      /**
       * The cache backend service.
       *
       * @var \Drupal\Core\Cache\CacheBackendInterface
       */
      protected $cacheBackend;

      /**
       * Constructs a new MyModuleController object.
       */
      public function __construct(CacheBackendInterface $cache_backend) {
        $this->cacheBackend = $cache_backend;
      }

      /**
       * {@inheritdoc}
       */
      public static function create(ContainerInterface $container) {
        return new static(
          $container->get('cache.my_cache')
        );
      }

      /**
       * Build the hierarchical tree.
       *
       * @return array
       *   Return the render array of the hierarchical tree.
       */
      public function buildTree() {
        $user = $this->userStorage->load($this->currentUser->id());
        $argument_id = $this->getDefaultArgument($user);
        $cid = 'my_hierarchical_tree:' . $user->id();
        $data_cached = $this->cacheBackend->get($cid);
        if (!$data_cached) {
          $data = $this->getTree('VOCABULARY', 0, 10, $argument_id);
          $this->addUserStatusToTree($data['response'], $user);
          $tags = isset($data['#cache']['tags']) ? $data['#cache']['tags'] : [];
          $tags = Cache::mergeTags($tags, [$cid]);
          // Store the tree into the cache.
          $this->cacheBackend->set($cid, $data, CacheBackendInterface::CACHE_PERMANENT, $tags);
        }
        else {
          $data = $data_cached->data;
          $tags = $data_cached->tags;
        }
        $build = [
          '#theme' => 'user--tree',
          '#user' => $user,
          '#data' => $data,
          '#cache' => [
            'tags' => $tags,
            'contexts' => ['user'],
          ],
        ];
        return $build;
      }

    }

    And of course, it remains to invalidate this cache only on the specific actions that, by their nature, modify the cached tree.

    Cache::invalidateTags(['my_hierarchical_tree:' . $user->id()]);
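Conceptually, tag-based invalidation just records which tags each item carries and marks tags as invalidated; reads then treat any item with an invalidated tag as a miss. Here is a self-contained plain-PHP sketch of that mechanism (the class and its methods are invented for illustration, not Drupal APIs):

```php
<?php

// Minimal illustration of tag-based cache invalidation: set()
// records each item's tags, invalidateTags() flags tags, and
// get() treats any item carrying a flagged tag as a cache miss.
class TagAwareCache {

  private array $items = [];        // cid => ['data' => ..., 'tags' => [...]]
  private array $invalidated = [];  // tag => TRUE

  public function set(string $cid, $data, array $tags = []): void {
    $this->items[$cid] = ['data' => $data, 'tags' => $tags];
  }

  public function get(string $cid) {
    if (!isset($this->items[$cid])) {
      return FALSE;
    }
    foreach ($this->items[$cid]['tags'] as $tag) {
      if (isset($this->invalidated[$tag])) {
        // One of the item's tags was invalidated: report a miss.
        return FALSE;
      }
    }
    return $this->items[$cid]['data'];
  }

  public function invalidateTags(array $tags): void {
    foreach ($tags as $tag) {
      $this->invalidated[$tag] = TRUE;
    }
  }

}

$cache = new TagAwareCache();
$cache->set('my_hierarchical_tree:7', ['tree' => '...'], ['my_hierarchical_tree:7']);
$cache->set('other:1', 'kept', ['other']);

// Invalidate only the tree of user 7; 'other:1' is untouched.
$cache->invalidateTags(['my_hierarchical_tree:7']);
```

The next get() on the tree is a miss, so the tree would be rebuilt, while unrelated entries survive.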

    We then get a far better result: the initial waiting time is divided by about 8 (the timings here are taken from a development instance).

    (Screenshot: a waiting time of 2s)

    The performance improvement is very clear here, but when the cache is invalidated, the user will have to wait while the data is rebuilt and re-cached.

    We can still improve this behavior, depending on the project's business constraints: is it acceptable, under certain conditions and for a certain duration, for a user to consult data that we know is no longer valid, and therefore to use an invalid cache? If the answer is yes, then we can further improve our cache system by making use of a cache entry that we know is invalid.

    The idea here is pretty simple: if the business data cache is invalid, we can decide to deliver the stale data anyway, while triggering, in the background and thus imperceptibly for the user, a process that will recalculate the data.
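Stripped of the Drupal plumbing, the pattern looks like this. A plain-PHP sketch in which an array stands in for the queue that a QueueWorker would drain on cron; all names are invented for illustration:

```php
<?php

// Serve-stale-while-revalidate, reduced to its essentials.
// Cached items carry a 'valid' flag, like Drupal cache objects;
// $queue stands in for the queue a QueueWorker drains on cron.
$cache = [
  'my_hierarchical_tree:7' => (object) [
    'data' => ['old' => 'tree'],
    'valid' => FALSE,  // The entry has been invalidated.
  ],
];
$queue = [];
$cid = 'my_hierarchical_tree:7';

// Equivalent of get($cid, TRUE): fetch the entry even if stale.
$item = $cache[$cid] ?? FALSE;

if ($item && !$item->valid) {
  // Schedule a background rebuild instead of blocking the user.
  $queue[] = ['cid' => $cid];
}

// Serve whatever we have: stale data beats a 17-second wait.
$data = $item ? $item->data : NULL;
```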

    $cid = 'my_hierarchical_tree:' . $user->id();
    // Pass TRUE to get the cached data even if it is invalid.
    $data_cached = $this->cacheBackend->get($cid, TRUE);
    if ($data_cached && !$data_cached->valid) {
      // Delegate the rebuild to a queue processed on the next cron run.
      $this->queue->createItem(['user' => $user->id()]);
    }
    if (!$data_cached) {
      // Expensive computation...
      // Store the tree into the cache.
      $this->cacheBackend->set($cid, $data, CacheBackendInterface::CACHE_PERMANENT, $tags);
    }
    else {
      $data = $data_cached->data;
      $tags = $data_cached->tags;
    }

    To do this, we pass a TRUE parameter to get() so that we can recover an invalid cache entry, and if the entry is invalid, we delegate the task of recalculating the data during the next cron run to a QueueWorker plugin.

    We could also handle this just as well at the moment we invalidate the cache, by delegating the reconstruction of the data in the background to a queue. This would even allow us to recalculate the data before the user next comes to consult his profile.

    In this way, the problem of an overly long computation time can be solved in a sustainable way, with some compromises. It's just a matter of finding a fair compromise depending on the project and its constraints, and a Drupal 8 developer can help you weigh the ins and outs of each solution, or even suggest other optimization paths, perhaps by going all the way back to the root cause of the issue encountered.

    Sep 26 2018
    Sep 26

    As part of our ongoing activities to ensure a safe and welcoming environment for collaboration in Open Source, we have updated the drupal.org Terms of Service, at drupal.org/terms

    This change has clarified which behaviors will be regarded as “harassment” and are, therefore, not acceptable whilst using the Drupal online services. The language is now in line with that already employed in the DrupalCon Code of Conduct.

    The updated text, from Section C - Activities, now reads as:

    • Harassment will not be tolerated in any form, including but not limited to: harassment based on gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, age or religion. Any report of harassment will be addressed immediately. Harassment includes, but is not limited to:

      • Comments or imagery that reinforce social structures of domination related to gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, age, or religion.

      • Unwelcome comments regarding a person’s lifestyle choices and practices, including those related to food, health, parenting, drugs, and employment.

      • Abusive, offensive, or degrading language or imagery

      • Language or imagery that encourages, glorifies, incites, or calls for violence, emotional, or physical harm against an individual or a group of people

      • Intimidation, stalking, or following

      • Sexual imagery. At a minimum, no images containing nudity or expressions of sexual relationships that might be deemed inappropriate for a business environment should be uploaded or linked to

      • Unwelcome sexual attention or advances

      • Advocating for, or encouraging, any of the above behavior

    You do not need to do anything to acknowledge this update.

    Whilst you are here…

    Are you receiving all the news and information you need? The Drupal Association publishes a number of news updates and you might be missing out. Check which news updates you are receiving by visiting our recently updated subscription page at http://eepurl.com/hWxwQ

    Sep 26 2018
    Sep 26

    Today, Agiledrop celebrates exactly 5 years from its official incorporation, so the company is in a festive mood. Obviously, we had a cake to celebrate the birthday, and we also organized a championship in Mario Kart on the PlayStation. For this article, we wanted to look at the past and tell the story about how it all started.

    The founding team

    It all started in 2008, when Iztok used Drupal for the first time. He immediately fell in love with Drupal as software and with the community. In 2010, he attended his first DrupalCon. The result of the conference was new clients from all around the world. Had he not attended that event, maybe there would be no Agiledrop as it is today.

    During this time, in 2009, Marko founded a company that was specialised in converting PSD design into HTML code and worked primarily for digital agencies. Marko had the vision to grow the company and he saw an opportunity in Drupal, where there was a huge demand for front-end developers.

    In 2011, Iztok and Marko joined forces on projects, where Marko covered the front end while Iztok developed the back end. In 2012, Agiledrop became a brand (or simply a website) under which they worked with some other developers. And finally, on 26 September 2013, the company AGILEDROP d.o.o. was officially registered.

    By 2014, the company counted 6 full-time developers, and Boštjan joined the founding team. As they say, three types of entrepreneurs are needed for a good start-up: the Hipster, the Hacker and the Hustler. Through their cooperation it became clear that Iztok excels in communication with clients, Marko in financial, administrative and management operations, and Boštjan in leading development. Since each of them covered one of these important areas, they were able to fulfil their vision of creating a company in which they themselves wanted to work. After about two years, they formally distributed these roles, creating the leadership structure the company retains to this day: Marko is the Managing Director, Iztok is the Commercial Director, and Boštjan is the Development Director.

    There were also changes in the business model. Initially, AGILEDROP offered complete website design and development, running projects from beginning to end. They soon realized they were best known for Drupal development, which convinced them to focus solely on that stage of website production. The decision proved correct, as the new business model increased their competitive edge in the market. As they say, it's better to do one thing perfectly than to do everything merely well.

    The team in Ljubljana office The team in Maribor office

    The demand for AGILEDROP services has grown from year to year, and the team also rapidly grew. Other non-development roles appeared in the company, like salespeople, talent manager and office manager. Today our team totals 33 people.

    We look proudly on the past and look forward to the future.   

    Sep 26 2018
    Sep 26

    It’s been two years since the premiere of Drupal 8. We have already gotten used to the differences between versions 7 and 8, and a lot of websites have been created on D8. Many Drupal 7-based websites are applications that use Drupal Commerce, an e-commerce module for Drupal, and many of them were set up with the Commerce Kickstart distribution, which was based on this add-on. What’s the way to do it with D8? For a long time, only an alpha version was available; then a beta version was released. On the 20th of September 2017, we saw the release of version 2.0, and as of today the current version is 2.3. We'll see what’s new in DC and how it works with D8. For testing purposes, we are going to use DC 2.3 and Drupal 8.4.3.

    Set-up and requirements

    According to the manual, it is recommended to install Drupal using Composer; set-up requires Drupal 8.4 or newer. We did not have any issues with the installation process. To install DC 2.3, we used the following command:

    composer require drupal/commerce:2.3

    DC requires several additional modules (Address, Entity, Inline Entity Form, Entity Reference Revisions, Profile, State Machine). When using Composer, you don’t have to worry about installing them manually, as they will be added automatically. After successful set-up, the list of modules will grow by 12 new entries.

    List of modules with new items sorted in alphabetical order

    Functionality, modules and innovations

    We enabled all the modules for testing purposes, and the Commerce entry appeared in our admin menu. At first glance, you can see a number of options that were not included in the standard version for D7. This includes:

    • Store types
    • Product attributes
    • Promotions
    • Order types
    • Order item types
    • Checkout flows
    • Product variation types.

    Store and Store types

    This allows you to define store types for your website; “online” is added by default. Adding more store types may be useful if you have a network of brick-and-mortar stores or branches in different countries. These stores may have a country-specific offer but use the same database of products kept in one place. You should remember that a product may belong to one or more stores; however, an order placed by a user is always assigned to a single store.

    Another interesting option is the possibility of letting users create stores on your website, allowing vendors to open their own online stores on your platform and to create and sell their own products, just like Etsy. You can find out more about that functionality in the Drupal Commerce documentation.

    Product attributes

    This option allows you to add attributes to your products. There are three display options available: select list, radio buttons and rendered attribute. These attributes can be assigned to product variation types and used when adding new products.

    Attributes - a selectable list with three available options


    Promotions

    DC 2.3 provides a sub-module that allows you to add bonuses and discounts. The discount can be set for specific products or for the entire order. It may be either a fixed amount or a percentage, and discounts can be assigned to a role, a shipping address or an e-mail address. You can also limit promotions and special offers to a maximum order value or currency. In addition, you have the option to add start and end dates, limit the number of uses and decide whether a discount can be combined with other promotions. Admittedly, this is a great convenience; compared to other e-commerce systems, these are things that should already be standard. In D7, it was not so obvious, and it could cause quite a headache.

    A view of adding a promotion in Drupal Commerce 2

    Order types

    Another new thing in DC is a new approach to orders. You can now create several order types with different shopping paths, or even display the shopping cart in a different way. This is quite an interesting solution and it will certainly be useful for more complex projects, where products require a different business approach. Each type can have its own unique fields and rendering methods.

    A view of order types page

    Checkout flows

    As mentioned above, in addition to the order types, you can set up many different shopping paths. They may vary depending on the type of order placed. The entire order process is displayed using plug-ins. By default, you can take advantage of Multistep; however, you can add your own plug-in and use it, for example, for one of several shopping paths. This is quite an interesting approach, thanks to which you won’t have to alter the single default path. You can check out how to create your own flow plug-in here: https://docs.drupalcommerce.org/commerce2/developer-guide/checkout/create-custom-checkout-flow

    Adding your own module to the list of available plugins

    Order item types

    One could say that this is something like the “Line item” from D7: an item that stores order data and products. You can also define your own fields for storing other information.

    Commerce 8 in action

    Let's see how commerce works in practice.

    Adding a product

    A subpage to add a product. Some options created before are available to choose

    As you can see, you can use the attributes and variations of the products that you have created earlier. Let’s add a product with several options to choose from.

    Product sheet

    Product card with the options filled according to previous choices

    Standard view of cart
    The standard cart is a Views block, which can be easily and freely configured, like in D7.


    Payments

    We have included the default test payments available in the module for the purpose of our tests. If you want to use a ready-made gateway, you can go with:

    PayPal - https://www.drupal.org/project/commerce_paypal - beta1

    Tpay - https://www.drupal.org/project/commerce_tpay - rc2


    Shipping

    You can take advantage of a shipping module: Commerce Shipping (beta4).


    It is integrated with the Physical module, thanks to which dimensions and weights can be automatically converted into the final shipping cost.

    A view of "adding the package type" page

    As for the shipment integration, I only managed to find an Alpha3 version for FedEx.


    In addition, DC developers used the Address module.

    Address fields are supported for more than 200 countries, including locations, regions, voivodeships, Länder and so on from most countries of the world. In addition, you can create custom “address zones” and assign them special properties such as shipping prices, taxes, etc.


    Drupal Commerce has a number of basic functions and many interesting innovations already operational and working, which is, of course, a huge advantage. In addition, the developers decided to provide more configuration options by default, compared to the D7 version. A typical site builder might still have issues building a store for a client, due to missing counterparts for some D7 modules and the fact that the majority of the existing ones are in alpha or beta versions, which means they may be unstable. If you can't develop advanced modules for D8 yourself, this can be a huge obstacle.

    What was great about DC was the fact that we were able to configure and set up the basic functions of our store in a short time without any problems. More and more modules are being released, and a large number of them are already available in stable versions. That is why the combination of Drupal 8 and Drupal Commerce is a tool that gives a lot of possibilities for implementing interesting projects.

    You might ask whether you should go for a proven Commerce with D7 duo or try the innovations introduced in D8.

    The answer is... It depends on the project, your development resources, as well as time and budget for developing elements that are not available or do not work properly with D8.

    However, as of now, the project looks promising and we keep our fingers crossed for the further development of Commerce. If you need more information about DC, feel free to visit https://docs.drupalcommerce.org/

    We also encourage you to read other articles on our blog!

    Sep 26 2018
    Sep 26

    If you are at least a little bit familiar with Drupal, then you have most likely heard about Drupalize.Me.

    Drupalize.Me is one of the most active players when it comes to updating tutorials to keep the content up-to-date with the new features that come to Drupal 8. The material they provide starts all the way from the Basics and Site Building, and moves on to cover more advanced topics such as Management and Strategy.

    With the largest collection of premium Drupal learning videos, Drupalize.Me has thousands of tutorials you can choose from. If you are interested in Learning to Build Drupal Websites, Working with Drupal Themes, Becoming a Drupal Developer, Learning Drupal 8 or an Introduction to the Drupal CMS, then Drupalize.Me is the right choice for you!

    Not sure whether or not it’s worth the investment? Check out their YouTube Playlists made in association with Lullabot, which cover topics such as: Drupal 8 User Guide, Configuration System, or How to Install Drupal for Local Development and see if the content provided by them meets your expectations.

    In order to recognize contributors' hard work, Drupalize.Me offers a free membership to drupal.org project maintainers and those listed in the Drupal 8 MAINTAINERS.txt file.

    Sep 26 2018
    Sep 26

    The Drupal Unconference is coming up in November and we can’t wait! Following the huge success of last year's event, we are once again proud to be Silver Sponsors of this alternative annual conference.

    As active members of the Drupal community, several of our team are already preparing lightning talks to pitch on the day. To secure attendance for the majority of our large Drupal team, we have just bought a batch of tickets. To avoid disappointment, we encourage you to do the same! 

    Unconference Tickets

    [embedded content]Co-organiser, Eli, on what to expect

    This year’s Unconference will be held on 3rd November at The Federation, Manchester. The annual unconference breaks the mould, with an informal, accessible programme. All talks are planned on the day by the attendees rather than organisers. Representing open source ideals, Unconference recognises that the best ideas can come from anyone, no matter their experience. First-time speakers and long-term contributors have equal opportunity to share their insights into the Drupal Content Management System.

    For the second year in a row, we are proudly sponsoring the event and attending en masse. Our developers are preparing talks on a wide range of topics: from front-end design using Pattern Lab, to a bold career change, swapping auto body repairs for Drupal development. The unplanned structure of the Unconference enables speeches that are reactive to recent topics and events. As such, we expect some competition for the most innovative talk this year!

    Not sure what to talk about?

    You can reach beyond Drupal core and open-source code: Unconference presentations address a wide range of digital topics, with talks and insights expected to cover UX, databases, frameworks, security and front-end design. Web developers, devops, project managers, designers and marketers can all expect relevant and actionable takeaways from the event. Website owners and end users, no matter their technical experience, are welcome to join the inclusive conversation.

    Unlike Drupal sprints, which focus on delivering working software and contributed modules, the Unconference is designed to be a rich learning environment. Offering real-world case studies and ideas, NWDUG invite anyone to share their digital experiences.

    Hosted at The Federation, the 2018 event will be bigger and better than ever. With more space comes more opportunities for different speakers and discussion groups.

    [embedded content]Explore the Venue

    Last year’s event was a huge success, so we are optimistic for Unconference 2018 to be the best yet. We are excited to see new faces and new innovations from the open source community.

    Join the welcoming Drupal community this November 3rd for a day that celebrates inclusivity, accessibility and open source software.


    Find out more and order your tickets on the Unconference website. We'll see you there!

    Unconference Tickets

    Sep 25 2018
    Sep 25

    This October, come to our Bay Area Drupal Camp (BADCamp) booth to give your elected representative a piece of your mind. We won’t be holding members of Congress captive at the Kalamuna booth (or will we?), but we’ll have plenty of other excitement there to keep you curious. And of course, BADCamp is a celebration of open-source software, so we’ll be giving plenty of Drupal talks. Read on to find out about those and our booth happenings.

BADCamp is a time for great Drupal talks, high-fiving friends, and enjoying a few drinks together. We’ll all have Drupal on our minds, but we’ll have the rest of the world on our minds, too. That’s partly why our agency was founded: while we love technology and design, we’ve always regarded them within the context of that “rest of the world.” This is why we help mission-driven organizations. It is also why, at last year’s BADCamp, we nodded to that rest of the world by asking camp attendees what nonprofits they supported and donated our swag budget to those orgs.

BADCamp 2017 attendee @ponies, who donated some of our swag money to PASH, holds up a square of our BADCamp 2017 booth wall noting his donation.

    This year, we want to focus our efforts on three organizations that exemplify what we believe in as a company. To that end, we’re using our BADCamp booth not only to let attendees divert our swag budget to one of three staff-chosen organizations, but also to enable BADCampers to write postcards to their representatives about our three causes.

    The three causes we’re focusing on this year are:

    1. reproductive health and education
    2. civil rights and
    3. digital freedom and privacy.

    Kalamuna has been concerned with these issues since our beginning six years ago, but it was our current staff that identified these issues to focus on for BADCamp.

    So, when you come to our booth, this is what we’d like you to do:

    1. Come say hi
    2. Decide to address one of three issues our staff is concerned with:
      1. reproductive health and education
      2. civil rights or
      3. digital freedom and privacy
    3. Find your elected representative from our list and write to them on a free, pre-stamped postcard
    4. Pin your postcard to our booth wall for all to see
    5. We’ll send $5 of our swag budget to the nonprofit representing the issue you addressed

    Our Beneficiary Organizations

    We chose to give our swag budget to the three organizations below. We selected these groups by surveying our staff to ask them which nonprofits were their favorites. While we identified about 15 favorite organizations, these are the ones that came up most often:

    Planned Parenthood Logo

    Planned Parenthood: Reproductive health and education

    Planned Parenthood is a trusted health care provider, an informed educator, a passionate advocate, and a global partner helping similar organizations around the world. Planned Parenthood delivers vital reproductive health care, sex education, and information to millions of people worldwide.


    ACLU Logo

    The American Civil Liberties Union: Civil rights

    For nearly 100 years, the ACLU has been our nation’s guardian of liberty, working in courts, legislatures, and communities to defend and preserve the individual rights and liberties that the Constitution and the laws of the United States guarantee everyone in this country. Whether it’s achieving full equality for LGBT people, establishing new privacy protections for our digital age of widespread government surveillance, ending mass incarceration, or preserving the right to vote or the right to have an abortion, the ACLU takes up the toughest civil liberties cases and issues to defend all people from government abuse and overreach.

    EFF Logo

    The Electronic Frontier Foundation: Digital freedom and privacy

    The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. EFF works to ensure that rights and freedoms are enhanced and protected as our use of technology grows.

    We hope you’ll come hang out with us, write a postcard, and help us support the causes that matter to us, and hopefully, to you.

    Also, Learn Drupal from Us at BADCamp

    We’re excited to be offering six sessions this year. Some of our speakers are Kalamuna-BADCamp vets, and some are new to our agency. We’re excited about the breadth of topics we’ll be addressing. Check out what we’ve got going on:


    Crispin Bailey, Director of Design & UX

    Have you ever had a project go exactly according to plan, without any hiccups or gotchas? More often than not, once a project gets going and the fuzzy warm glow from the kickoff meeting has faded, the reality of project unknowns and assumptions reveals a potentially frightening path ahead - or worse yet, there may be so many unknowns that it seems like there is no clear path ahead. Don’t worry, take a deep breath. This session will go over the many ways in which a project that seems to be in jeopardy can be tamed, and reveal the tools and techniques you can employ to forge ahead with confidence - even if you don’t know exactly where you’re headed.

    In this session we’ll cover:

    1. Building up your UX Toolkit to tackle any problems that might arise
    2. Handling wily stakeholders like an experienced lion-tamer
    3. Building up confidence to sail across the Seas of Doubt
    4. And other tips and tricks to help you find your way through the Cone of Uncertainty

October 26, 10:15 AM-11:00 AM, Stephens Lounge


    Rob Loach, Director of Technology

    In a world where everyone has the ability to post anything online, it becomes critical for site owners wanting to maintain their brand reputation to curate what is published. How can content editors keep up with reviews when there's a steady stream of content submissions? Enter the robots.

    Google Cloud Vision API provides image labelling, face, logo, and explicit content detection through Machine Learning. This takes the burden off of your content curators, frees them from their role as authoritarian gatekeepers, and allows you to focus on the business value and strategic goals.

    In this session, you'll learn how to:

    • Get set up with Google Cloud Vision API
    • Configure Google Cloud Vision API through its Drupal module
    • Automatically add metadata to uploaded media
    • Enable explicit content detection on image fields

    October 27, 10:15 AM-11:00 AM in Pauley East


    Andrew Mallis, CEO, Kalamuna with Michael Enslow, Director of Delivery, Exygy

    Are you a stakeholder? Do you work for an agency as a designer, engineer, or product manager? Are you a citizen wondering how your government’s website can serve you better?

    Exygy & Kalamuna are focused on solving health and civic challenges for communities around the globe. We believe resilient communities are ones that thrive when they have strong social systems and resources to support families and individuals in their day-to-day lives.

    In order to make these types of services more efficient and accessible, three design agencies partnered to work with a complex team of government agency stakeholders spanning 3 departments to help 16 programs from 9 locations serve over 250,000 people. Our goal was to migrate from a deprecated system to Drupal. This took roughly 5,000 human hours over 14 months!

The San Francisco Human Services Agency (HSA) is a lifeline for 23% of San Franciscans, serving over a quarter million unique persons seeking food assistance, health care, elder care, job training, daycare, and other essential social services. With more than 30,000 monthly unique visits to sfhsa.org, the strategy had to accommodate multilingual users, be accessible on all devices, and accommodate the technical ability of content authors using the CMS. To provide a Drupal solution for this diverse community, we employed a design practice deeply rooted in user and stakeholder empathy.

The new sfhsa.org launched in late 2017 as an ambitious new Drupal website designed and engineered by Exygy and Kalamuna. We’ll walk you through this project and share with you our successes and failures along the way. We’ll draw from our extensive experience in the civic and education tech sectors, grounded in our mission-driven work ethic. Once we’re done, you will…

    • Learn how to confidently negotiate a joint bidding process with a partner agency
    • Learn our technique for navigating a complex stakeholder map
    • Understand how to align methodologies between partners and stakeholders
    • Manage change requests with confidence
    • Understand the benefits of leveraging reusable web components for building your CMS using the Lego concept and pattern libraries

    October 26, 4:45 PM-5:30 PM in Tilden


    Alice Freda, Support Manager

    If you work on Drupal sites that include forms, varied user profiles, paywalls, eCommerce features, integrations or API codes stored in the database, you’re particularly concerned with security—and could benefit from a security audit. But what is it and how do you run one? And what do you do once you’ve performed your security audit? Whether you’re building a site and want to adhere to Drupal security best practices or are working with an already-existing site and need to secure it, this session is for you.

    You’ll get a security checklist as well as some tried-and-tested ways to respond to your findings. Some topics we’ll cover:

    • Core and contrib module updates: Why they’re important and how to keep on top of them
    • Making the most out of tools that come bundled into Drupal: e.g. how to respond to notices on the Site Status report page
    • Drupal configurations: User management, permissions, password management
    • Modules that can help protect your site as well as flag existing issues
    • Ways to restrict Administrative access and access to other configuration information
    • Beyond Drupal: Securing at the server level

    October 26, 2:30 PM-3:15 PM, in Toll


    Angelo Porretta, Senior Architect

    This session is about giving a practical example of how the CMS and Drupal community can put machine learning into practice by using a Drupal module, the taxonomy system, and Google's Natural Language Processing API. We will begin with an overview of what natural language processing is and some natural language processing concepts, including:

    • Sentiment analysis
    • Entity analysis
    • Topic segmentation
    • Language identification

Several different natural language processing API alternatives will be compared and contrasted to help the audience choose for themselves what would be best for their needs.

    We will then explore practical use cases through analyzing and automatically categorizing news articles using Drupal's taxonomy system and combining those categories with sentiment in order to make a recommendation system for a hypothetical news audience.

    October 27, 1:30 PM-2:15 PM, Toll

    Sep 25 2018

    This blog has been re-posted and edited with permission from OneShoe's blog. The following are results from the 2018 Drupal Business Survey conducted by One Shoe and Exove, in partnership with the Drupal Association.

    Drupal Business Survey 2018: hot topics are recruitment, changing Drupal playing field, and shift to Drupal 8

Over the last couple of months, Exove and One Shoe worked closely with the Drupal Association on the global Drupal business survey to assess current trends, adoption of emerging technologies and shifting perspectives on the Drupal landscape. The survey was open during July and August. In those two months, 136 Drupal agency leaders and decision makers worldwide were surveyed to learn where the Drupal industry is heading and how the Drupal community can chart its course for Drupal’s success in the years to come.

    According to the survey, the Drupal client landscape has been changing with the continuing adoption of Drupal 8. For many of the respondents, the sales pipeline and average deal size has grown – while a number of companies struggle with client acquisition and moving to Drupal 8. The surveyed companies are using various strategies to adapt to the changed situation. As in the previous surveys, the Drupal talent availability is seen as one of the major challenges.

    Survey participants were Drupal business leaders from around the world

Most surveyed companies and offices are based in Europe (63 %), followed by 40 % in North America and 7.4 % in Asia. Out of the total responses, most participants of the survey had the role of founder (65.9 %), CEO (50.4 %), CTO (18.5 %) or COO (1.5 %). A little over 30 % of the respondents stated that their company has existed for over 14 years, followed by almost 20 % whose companies have existed for between 10 and 11 years. 60 % of the companies who filled in the survey have just one office, and 19.3 % have two.

    World map showing locations of respondents by percentage

A little over half (54.8 %) of the companies stated that they are a digital agency, 14.8 % define themselves as a software company, and 10.4 % as a consulting agency.

    Almost all (94.8 %) of the respondents said that their company provides web development. A majority of the companies shared that they provide visual design (65.9 %), user experience (68.1 %), system integration (67.4 %) or support (59.3 %). These answers are very similar to the results of last year’s survey.

    The workfield of the Drupal agencies has become more industry specific

Drupal companies have clients in diverse industries. More than half (59.3 %) of the respondents reported having Drupal clients in Charities & Non-Profit organisations. Other industries are Government & Public Administration (54.8 %), Healthcare & Medicine (47.4 %), Arts & Culture (41.5 %) and IT (40.7 %). Based on the responses, it can be stated that Drupal companies are becoming more industry specific. The Drupal Business Survey responses of the last three years show that each year, fewer companies have clients in every industry. The outcome of the surveys shows that the Media and Banking & Insurance industries have had the biggest drop, while Healthcare & Medicine and Consulting have grown the most since the first survey.

    Top 10 industries in which Drupal clients operate in 2018 - bar graph

    Compared to 2016/2017/2018:

    Top 10 industries in which Drupal clients operate - bar graph

    Biggest challenges in recruitment, client acquisition and Drupal 8 adoption

    The outcome of the survey shows that in the last 12 months the Drupal agencies faced three main challenges, namely recruitment (24 %), client acquisition/pipeline (17 %) and conversion to Drupal 8 (14 %). These three challenges are analysed in the following parts of this article.

    Biggest challenges Drupal agencies faced in the last 12 months

    Recruitment – a war on Drupal talent

Drupal agencies wanting to grow know the importance of Drupal talent. For years, the demand for Drupal talent has exceeded the supply. According to this year’s survey, agency leaders see recruiting new employees as their biggest challenge. That’s nothing new; the lack of developers is a universally known challenge that applies not only to Drupal developers. According to research from The App Association, there are 223,000 job openings for software developers in the US alone, and in Finland alone there is a shortage of 10,000 developers (source: Code from Finland).

One of the respondents describes their challenge of the last 12 months as:

    A war on talent.

    But still: the demand for digital services is great and the stakes are high. Agencies simply need manpower to continue to grow their business (59 %): "We hit a productivity ceiling and need to expand if we were ever to have capacity to provide for further growth." The lack of Drupal talent can be a threat for new projects: "We lose out on opportunities because our capacity is too low."

The respondents’ answers pointed out that scarcity and financial compensation continue to be the main obstacles to attracting employees experienced and/or highly skilled in Drupal. Many respondents mention that senior developers are typically very expensive to hire, while junior developers match the budget.

    Every year we hear that Drupal agencies can't find talent. What they often mean is that they can't find talent at the rates they are willing to pay.

    Most of the Drupal talent is either completely new to Drupal or already skilled and working, requiring a strong incentive to change positions.

However, despite the difficulties, 80 % of the agency leaders did hire new employees in the last year and managed to meet their Drupal talent needs, mostly by actively prospecting and hunting Drupal specialists (51.5 %). According to the respondents, it also seems to be a good strategy to motivate and educate people who are not yet familiar with Drupal but are willing to learn: agencies hire graduates/juniors (47 %) or hire experienced developers (35.8 %) and train them in Drupal themselves.

    Opportunities in collaborating with education institutes

Respondents advise collaborating more with educational institutes and other organizations to prepare interested and motivated people to become the Drupal experts of tomorrow. As one suggests:

    We need further engagement between tertiary institutes and industry to ensure open-source platforms and industry standard development methodologies are taught to address the medium term skills shortage.

    One respondent told us they even started their own Academy in collaboration with tech universities.

    Changed Drupal playing field brings new challenges

Over the last couple of years, Drupal has undergone major changes. For one, Drupal 8 was released. Drupal is also playing a growing role as the backend for headless or decoupled sites, is evolving towards an API-first platform, and is competing head to head with proprietary platforms like Sitecore and Adobe Experience Manager.

    These changes inevitably impact the Drupal market. It’s therefore no surprise that the second biggest challenge (17 %) for agencies in the last 12 months had to do with generating leads for Drupal focused projects or the acquisition of new and suitable customers. This raises different reactions. It is clear that the changed playing field of Drupal benefits certain companies, while others struggle with the change:

    The landscape is changing seismically. We are seeing smaller competitors shrink whilst those delivering enterprise and business critical services are prospering. With Drupal 8 we are winning in many situations where platform decisions are open.

    The (other) challenge we've been facing is the perceived lack of interest in Drupal overall, specifically on the commerce side. We've been working hard to educate the market on the viability of open source for commerce using Drupal, but have a lot more work to do to get a foot in the door in that enterprise market.

One of the companies also seemed to notice slower growth in the Drupal market:

    Drupal is facing competition from several directions: WordPress is no longer a blog platform but equals Drupal. Increased demand for static site in combination with cloud CMS-es and developers losing interest in Drupal in favor of .JS and lightweight PHP frameworks.

    JS-based frameworks are more in demand and PHP is losing its appeal.

    Decoupled: We see a role for Drupal in the decoupled world, however we are still behind on what Drupal should deliver to be an API backend first choice.

    Average deal size of Drupal projects increased

    Average deal size of Drupal projects in a pie chart

It is striking that although client acquisition seemed to be a major challenge for the respondents, a little over half of the Drupal agencies (51.5 %) saw their average Drupal project deal size increase, while for 36.6 % the average deal size stayed roughly the same and 11.9 % experienced a decrease. This seems to indicate that Drupal projects are becoming bigger and bigger.

    As someone mentioned:

    We see larger and larger deals opening up in the Drupal space. The role played by Acquia is significant in the growth of Drupal in the Enterprise space.

    We are still seeing growing demand for Drupal, especially among large/ enterprise organisations.

    Drupal agencies seize new opportunities

    In response to the changes within the Drupal market, some agencies have found new opportunities with Drupal by developing new business models.

The survey results show that 34.1 % of the respondents did not change their business model in the last year. However, 28.9 % expanded their services beyond building Drupal sites, whilst 15.1 % of the agencies chose to become more specialized (focusing on a specific vertical or industry). The main reasons for changing their business model were to grow their pipeline better/faster (58.2 %), the identification of a better business model (51.6 %), or changing market conditions (50.4 %).

    Has your business model changed? Pie chart

    On the one hand, there are the Drupal companies who expand their business by offering new services like consultation or strategic work:

    We are helping more agency and merchant teams adopt Drupal Commerce specifically for Drupal 8 than ever before. They have a strong desire to do things "the right way", which means they're thinking more strategically long term.

And on the other hand, you have the Drupal agencies who believe that specialization, rather than offering a full-stack service, is the answer to keeping the pipeline full and attracting new clients.

More specialized expertise and strategy are valued more than full-stack development services.

But companies have more reasons to change their business model. The agencies who expanded their services also mentioned that they saw a shift in demand from their clients. In other words, the (Drupal) market has changed, and those who adapt have a good chance of succeeding:

    Clients are no longer looking just for software development services. They want the service provider to be deeply involved in the engagement and take responsibility for the business outcomes. They want the vendors to come higher up in the value chain.

    Even mid-market business leaders are realizing that digital is more than a website. They are seeking to use digital for new revenue streams or to reduce expenses. We have completely revamped our services to offer high level strategic consulting services that address the people, process and technology that affects our client organizations.

    Open source and recommendations help Drupal win in the CMS battle

    Top 5 reasons for choosing Drupal - bar graph

    The competition in the CMS business has become tough, and clients are more aware of the opportunities of different CMSs. This has led to many companies expanding their set of technologies and portfolios, as one of the respondents mentioned:

    There's no CMS we can use as a silver bullet.

The survey shows that Drupal has a lot of qualities that clients need and search for in a digital platform. The respondents shared that Drupal being open source is the main reason clients choose it (67.4 %), followed by 56.3 % who said that Drupal was chosen because of the agency’s recommendation. Other reasons are that clients are already familiar with Drupal (54.8 %), the CMS’s flexibility (49.6 %) or its reputation (42.2 %).

    The shift to Drupal 8 has been rocky but brought significant benefits to some companies

The third main challenge (14 %) for the Drupal companies was the conversion to Drupal 8. The upgrade from one major version of Drupal to the next (e.g. from Drupal 7 to Drupal 8) demanded much effort, with a steep learning curve and a difficult upgrade path. Introducing new technology in general – not just Drupal – will always carry the risk of facing some sort of challenge, whether it’s a delay in introducing new features, unexpected security risks or a more difficult learning curve. One of the agencies stated:

    Adopting Drupal 8 and Drupal Commerce 2 [were the biggest challenges]. There was a significant learning curve for our team and many of the modules (including the ones we were in control of) weren't ready to roll out complete commerce solutions to clients we were committed to.

    Another company told us:

    We have been working with Drupal 8 since beginning of 2016. Since our clients mostly fit in the small business category, we have struggled to push our project budgets high enough to be profitable on Drupal 8 projects, as we were on Drupal 7 projects. It's not easy to say what all the reasons are, but Composer is finicky, major modules weren't ready for the first year or more, security updates are more hassle because of more changes, and the increased bugs and missing features required work-arounds. Against our desires, economics are pushing many projects to Wordpress for its page builders and many plugins. On the bright side, the current Drupal initiatives are exciting!

    2018 has brought strong growth but we diversified due to slow adoption in 2016/2017. Drupal can learn from this to prevent the same from happening with the launch of Drupal 9 (more quickly available information / modules).

Dries Buytaert, founder and project lead of Drupal, states: ‘These kinds of growing pains are not unfamiliar, and one of the key reasons that Drupal has been successful is because we always made big, forward-looking changes. As a result, Drupal is one of very few CMSs that has stayed relevant for 15+ years. We see a way to keep innovating while providing a smooth upgrade path and learning curve from Drupal 8 to Drupal 9.’ Right now, the Drupal Association is working with united forces to make future Drupal upgrades smoother and much simpler than previous ones, with faster releases, easier upgrades and a gentler learning curve.


Competition in the digital market is and remains strong. New Drupal talent is needed to meet the demand for Drupal. The major changes that Drupal has undergone in the last few years have had an impact on client acquisition and the number of new Drupal projects for Drupal agencies.

The outcome of the survey shows that the Drupal business community is resourceful and capable of adapting to the continuously changing market by using different strategies. On the one hand, there are the Drupal companies who become full-stack agencies, while others believe that specialization is the answer.

    One thing is certain: clients want the best CMS for their company. ‘There’s no CMS we can use as a silver bullet’ one agency told us. And although that might be the case, we can still continue to aim for Drupal to become that silver bullet.


    See the 2017 survey results.

    For more information, please contact Janne Kalliola ([email protected]) or Michel van Velde ([email protected])

    About Exove

Exove delivers digital growth. We help our clients grow their digital business by designing and building solutions in an agile manner, using service design methodologies and open technologies. Our clients include Sanoma, Fiskars, Neste, Informa, Trimble, and Finnlines. We also serve start-up companies, unions and the public sector. Exove has offices in Helsinki, Oulu and Tampere, Finland; Tallinn, Estonia; and London, United Kingdom. For more information, please visit www.exove.com.

    About One Shoe

One Shoe is an integrated advertising and digital agency with more than 10 years of experience in Drupal. With more than 40 specialists, One Shoe combines strategy, UX, design, advertising, and web and mobile development to deliver unique results for international clients like DHL, Shell, Sanofi, LeasePlan, MedaPharma and many more. For more information, please visit www.oneshoe.com.

    About the Drupal Association

    The Drupal Association is dedicated to fostering and supporting the Drupal project, the community and its growth. The Drupal Association helps the Drupal community with funding, infrastructure, education, promotion, distribution and online collaboration at Drupal.org. For more information, please visit drupal.org/association.

    Sep 25 2018

The Urban Hipster Drupal Commerce demo site was built to showcase what Drupal 8 and Commerce-related modules can do. While the main focus has been Commerce, I recently started enhancing the content side of the site, mainly the blog. After all, Drupal is a content publishing platform at its core, so why not show how content and commerce can work together on the same platform. In the ecommerce world, that’s actually a pretty big deal!

    In this Tech Talk video, I’ll show you how the Drupal core Comments module is used for blog commenting and product reviews. I also go into detail on how you can configure a role based publishing workflow using core’s Workflows and Content Moderation modules.

    [embedded content]

    Comments and reviews

All of the blog posts and products on the demo site use the core Comments module for customer feedback. This allows any level of user (anonymous, authenticated, etc.) to add comments or reviews to these content items. The Comments module’s configuration and permissions control whether or not comments need to be approved by an administrator before they appear on the site. When logged in, an administrator with permission to manage comments can use both the frontend interface and a backend interface for deleting, approving, editing and replying to comments.

Like any content entity in Drupal, comments are fieldable. This means you can add fields to enable additional functionality on your comments. It’s not covered in this video, but it’s worth mentioning that this is how I was able to easily integrate a 5-star review system into the product comments.
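As a rough sketch of what such a rating field could look like as exported configuration (the field name, comment bundle, and file name here are illustrative assumptions, not the demo site’s actual export):

```yaml
# Illustrative field instance attaching a 1-5 integer rating to product
# review comments. Machine names (field_rating, product_review) are
# hypothetical; a real export also carries a dependencies section.
# File: field.field.comment.product_review.field_rating.yml
langcode: en
status: true
id: comment.product_review.field_rating
field_name: field_rating
entity_type: comment
bundle: product_review
label: Rating
required: true
field_type: integer
settings:
  min: 1
  max: 5
```

A star-rating widget or formatter can then be configured on top of a simple integer field like this, which is what makes the approach so low-effort.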

    Content moderation workflows

Drupal core also includes a couple of modules that let you define a process for adding specific types of content to your site. The Urban Hipster blog is now set up as an example of this.

The first aspect to configure is the workflow. A workflow determines what content will make use of it, the “states” that content will transition through, and the transitions that can happen at any given state. These all need to be configured before moving on to permissions.

The second aspect is assigning role-based permissions to use the workflow. Permissions for workflows are found on the usual backend permissions page where all other permissions are set. Each workflow transition has a permission attached to it, so you simply check the roles that can perform each transition. You can create new roles if you need to.
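For reference, a content moderation workflow like this lives in a single config entity. Here is a simplified sketch, assuming illustrative state and transition names (core’s actual export includes a few more keys, such as weights and the entity types the workflow applies to):

```yaml
# Simplified sketch of a Workflows config entity used by Content Moderation.
# File: workflows.workflow.blog_editorial.yml (name is illustrative)
id: blog_editorial
label: 'Blog editorial'
type: content_moderation
type_settings:
  states:
    draft:
      label: Draft
      published: false
      default_revision: false
    in_review:
      label: 'In review'
      published: false
      default_revision: false
    published:
      label: Published
      published: true
      default_revision: true
  transitions:
    send_to_review:
      label: 'Send to review'
      from:
        - draft
      to: in_review
    publish:
      label: Publish
      from:
        - in_review
      to: published
```

Each transition (send_to_review, publish) surfaces as its own permission, which is how author, reviewer, and publisher roles can be kept separate.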

    View the live example

As mentioned, the Urban Hipster Drupal Commerce demo site is an example of what can be done. Try it out yourself and see what you think. Here are some username/password combinations that will let you check out the workflows in action. The site refreshes every night, so you don’t need to worry about breaking anything.

    Role based workflow logins:

    • Blog author: blogauthor/blogauthor
    • Blog reviewer: blogreviewer/blogreviewer
    • Blog publisher: blogpublisher/blogpublisher

    Administrator login (for viewing the configuration):

    • Administrator: demoadmin/demoadmin
    Demo Drupal Commerce today! View our demo site.
    Sep 25 2018

    In Part 4 of a current series on managing shared configuration for Drupal distributions we looked at needs and options for altering configuration provided by extensions (modules, themes, or the site's installation profile). We covered common needs such as altering user roles to add permissions. But when it comes to altering configuration, blocks are a special case--hence this bonus installment!

    When you create a site based on a distribution, there may be a requirement to customize the look and feel. The usual solution is to create a custom subtheme for the site; see the drupal.org documentation on subtheming. That way you can get everything the distribution provides but give the site a custom presentation.

    Using a custom theme will work fine for most configuration. But it won't work for configuration that includes the theme itself as a dependency--like blocks.

    As explained in the drupal.org handbook page Working with Blocks:

    Blocks are boxes of content rendered into an area, or region, of a web page (such as "User Login" or "Who's online") that can be displayed in regions (such as footer or sidebar) on your page.

    Blocks are configured per theme. Typically a Drupal distribution will ship with a designated default theme--a core theme, a contributed theme, or, most commonly, one written especially for the distribution. Blocks are placed into this theme, ensuring sites get all the expected page elements.

    But this approach breaks down on sites using a custom theme. For example, if a site is installed with a distribution, sets a custom theme as the default theme, and then updates to a new version of the distribution and brings in configuration updates including new blocks, those new blocks won't show up on the site. That's because the new blocks were created for the distribution's theme--not the custom theme that's the default theme on the site.

    Even if a distribution-provided block did show somehow in a custom theme, it might be in the wrong position. Each Drupal theme supports a designated set of regions--page areas that blocks can be placed in. While in most cases a custom theme will be given a similar set of regions as the distribution's theme, this may not always be the case.

    Block Theme Sync

    In Drutopia we drafted a module to try to address these problems.

    Block Theme Sync allows a site admin to configure a "theme mapping": a relationship between two themes. As well as specifying a source and destination theme, a theme mapping includes a mapping of source to destination regions.

    Block Theme Sync uses an approach of cloning and then synchronizing. When a theme mapping is in place, all blocks created for the source theme are automatically cloned and created for the destination one. Any subsequent change to block configuration in the source is applied as well to the destination theme.

    Say for example that there's a theme mapping with Bartik selected as the source theme and Mytheme as the destination theme. Whenever a block is added to Bartik, a corresponding block will be added to Mytheme and assigned to the corresponding theme region as specified in the theme mapping.
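That relationship could be sketched as data like this (purely illustrative; this is not the module's actual configuration schema):

```yaml
# Hypothetical theme mapping: blocks in Bartik are cloned into Mytheme,
# with each source region mapped to a destination region.
source_theme: bartik
destination_theme: mytheme
region_map:
  sidebar_first: sidebar   # Bartik region -> Mytheme region
  footer_first: footer
```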

    This approach works, but has tradeoffs. The main one is inherent to cloning. Since we now have two distinct versions of the block configuration, we face the complexity of divergence between the two. For example, if a destination theme's block is edited, how do we handle that customization when the source theme's block is updated? That's a question we covered in the installment on Respecting Customizations and Updating from Extensions. There's a relevant issue open on Block Theme Sync: Use three-way merging when synchronizing.

    Better Sub-themes

    Not long after our work on Block Theme Sync, Stuart Clark (Deciphered on drupal.org) posted the Better Sub-themes module. While similar in many ways to Block Theme Sync, it takes a different basic approach.

    Like Block Theme Sync, Better Sub-themes provides a way to designate a source and destination relationship between themes and also to map regions. Rather than doing so through configuration, Better Sub-themes extends the theme info file format.

    But in the implementation, rather than cloning and synchronizing, for every block in the source theme, Better Sub-themes dynamically layers on a block in the destination theme. These destination theme blocks don't exist as separate configuration entities on the site. So they don't show up in the site's block admin screen for the destination theme, or when site configuration is exported. They can't be separately edited. But they show up on the site in their designated theme regions.

    If Block Theme Sync implements a cloning and synchronizing pattern, Better Sub-themes uses inheritance. This avoids the problems inherent in cloning. But it introduces challenges of its own. The fact that inherited blocks don't show up in the block administration UI is potentially confusing--how do you edit them? And it makes it difficult to position other blocks in relation to the inherited ones.

    Potential enhancements

    Both Block Theme Sync and Better Sub-themes are affected by a quirk of Drupal core's theme system.

    By default, when you install a theme, if the theme doesn't supply any blocks of its own, Drupal core copies over the blocks that are enabled at that time for the current default theme--see block_theme_initialize(). And a mea culpa: I wrote the first version of Drupal core's theme region system, and included that quirk in the implementation ;)

    Here's how this quirk affects each of the modules:

    • With Block Theme Sync, synchronization is done only when a block from the source theme is saved. If the destination theme doesn't provide any blocks, it will get an initial set of blocks created--but won't get any that were created after the theme was installed but before Block Theme Sync was in place. If it does provide blocks, it won't get any from another theme. Relevant issue: Initialize theme blocks on creation of theme mapping.
    • With Better Sub-themes, when we (a) install a distribution site, getting its standard default theme along with a set of blocks, then (b) switch the default theme to a custom subtheme with Better Sub-themes support, all the source theme blocks will appear twice on the site: first, because they were cloned by Drupal core; second, because they're inherited from the source theme via Better Sub-themes. Relevant issue: Prevent or delete duplicate blocks.

    Related core issue


    • For distributions that include block placement, subthemes present challenges.
    • There are two different modules available that attempt to address these issues: Block Theme Sync and Better Sub-themes.
    • Both modules are more proofs of concept than polished and proven solutions.
    • Each has its relative merits, and which if either is a fit will depend on the particular distribution use case.
    Sep 25 2018

    With phone in hand, laptop in bag and earbuds in place, the typical user quickly scans multiple sites. If your site takes too long to load, your visitor is gone. If your site isn’t mobile friendly, you’ve lost precious traffic. That’s why it’s essential to build well organized, mobile ready sites.

    But how do you get good results?

    • Understand whom you’re building for
    • Employ the right frameworks
    • Organize your codebase
    • Make your life a lot easier with a CSS preprocessor

    Let’s look at each of these points.

    Design For Mobile

    When you look at usage statistics, the trend is clear. This chart shows how mobile device usage has increased each year. 

    Mobile device usage graph (Source)

    A vast array of mobile devices accomplish a variety of tasks while running tons of applications. This plethora of device options means that you need to account for a wide assortment of display sizes in the design process.

    As a front end developer, it’s vital to consider all possible end users when creating a web experience. Keeping so many display sizes in mind can be a challenge, and responsive design methodologies are useful to tackle that problem.

    Frameworks that Work

    Bootstrap, Zurb, and Jeet are among the frameworks that developers use to give websites a responsive layout. Responsive web design aims to provide optimal viewing and interaction across many devices. Media queries are rules that developers write to adapt designs to specific screen widths or heights.

    Writing these from scratch can be time consuming and repetitive, so frameworks prepackage media queries using common screen size rules. They are worth a try even just as a starting point in a project.
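For reference, a hand-rolled media query looks like this (breakpoint and class names are illustrative); frameworks bundle collections of rules like it, keyed to common device widths:

```css
/* Two-column layout by default... */
.layout {
  display: flex;
}

/* ...stacked into a single column on narrow screens. */
@media (max-width: 768px) {
  .layout {
    flex-direction: column;
  }
}
```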

    Organizing A Large Code Base

    Depending on the size of a web project, just the front end code can be difficult to organize. Creating an organizational standard that all developers on a team should follow can be a challenge. Here at Zivtech, we are moving toward the atomic design methodology pioneered by Brad Frost. Taking cues from chemistry, this design paradigm suggests that developers organize code into 5 categories:

    1. Atoms
    2. Molecules
    3. Organisms
    4. Templates
    5. Pages

    Basic HTML tags like inputs, labels, and buttons would be considered atoms. Styling atoms can be done in one or more appropriate files. A search form, for example, is considered a molecule composed of a label atom, input atom, and button atom. The search form is styled around its atomic components, which can be tied in as partials or includes. The search form molecule is placed in the context of the header organism, which also contains the logo atom and the primary navigation molecule.
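One common way to mirror this hierarchy in the stylesheet codebase is a manifest that pulls in one partial per component, ordered from smallest unit to largest (the file layout is illustrative):

```scss
// main.scss -- one partial per component, smallest units first.
@import 'atoms/button';
@import 'atoms/input';
@import 'atoms/label';
@import 'molecules/search-form';
@import 'organisms/header';
@import 'templates/page';
```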

    Now Add CSS Preprocessors

    Although atomic design structure is a great start to organizing code, CSS preprocessors such as Sass are useful tools to streamline the development process. One cool feature of Sass is that it allows developers to define variables so that repetitive code can be defined once and reused throughout.

    Here’s an example. If a project uses a specific shade of mint blue (#37FDFC), it can be defined in a Sass file as $mint-blue: #37FDFC;. When styling, instead of typing the hex code every time, you can simply use $mint-blue. That makes the code easier for the team to read and understand.

    Let’s say the client rebrands and wants that blue changed to a slightly lighter shade (#97FFFF). Instead of manually finding every place $mint-blue is referenced across multiple files, a developer can simply update the variable to the new shade ($mint-blue: #97FFFF;). The change then automatically applies everywhere $mint-blue is used.
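In SCSS syntax, the two steps look like this (selector names are illustrative):

```scss
// Define the brand color once.
$mint-blue: #37FDFC;

.button--primary {
  background-color: $mint-blue;
}
.link--accent {
  border-color: $mint-blue;
}

// A rebrand is then a one-line change:
// $mint-blue: #97FFFF;
```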

    Another useful feature of Sass is the ability to nest style rules. With plain CSS, a developer has to repeat the parent selector to target each child component. With Sass, you can confidently nest styles within a parent selector, as shown below. The two examples are equivalent; the Sass version is a kind of shorthand that automates the repetition.

    Traditional CSS
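Both versions side by side, with illustrative selectors:

```css
/* Traditional CSS: the parent selector is repeated for each rule. */
.main-nav {
  background: #333;
}
.main-nav ul {
  margin: 0;
}
.main-nav ul li a {
  color: #fff;
}
```

And the same rules with Sass nesting:

```scss
// Sass: child selectors are nested once inside the parent.
.main-nav {
  background: #333;

  ul {
    margin: 0;

    li a {
      color: #fff;
    }
  }
}
```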


    Although there are a lot of challenges organizing code and designing for a wide variety of screen sizes, keep in mind that there are excellent tools available to automate the development process, gracefully solve all your front end problems and keep your site traffic healthy.

    This post was originally published on July 1, 2016 and has been updated for accuracy.

    Sep 25 2018

    Drupal Modules: The One Percent — Dynamic Entity Reference (video tutorial)

    [embedded content]

    Episode 45

    Here is where we bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll look at Dynamic Entity Reference, a module which lets you reference multiple types of entities from a single reference field.

    Sep 25 2018

    Drupal 8.6 has shipped with the Media Library! It’s just one part of the latest round of improvements from the Media Initiative, but what a great improvement! Being brand new it’s still in the “experimental” module state but we’ve set it up on this website to test it out and are feeling pretty comfortable with its stability.

    That said, I highly encourage you to test it thoroughly on your own site before enabling any experimental module on a production site. Don’t just take my word for it :)

    What it adds

    The Media Library has two main parts to it...

    Grid Listing

    There’s the Grid Listing at /admin/content/media, which takes precedence over the usual table of media items (which is still available under the “Table” tab). The grid renders a new Media Library view mode showing the thumbnail and compact title, as well as the bulk edit checkbox.

    The new media library grid listing page

    Field Widget

    Then there’s the field widget! The field widget can be set on the “Manage Form Display” page of any entity with a Media Reference Field. Once enabled, an editor can either browse existing media (by accessing the Grid Listing in a modal) or create a new media item (utilising the new Media Library form mode - which is easy to customise).

    Media reference field with the new Media Library form widget

    Media Library widget once media has been added, which shows a thumbnail of the media

    The widget is very similar to what the ‘Inline Entity Form’ module gave you, especially when paired with Entity Browser’s IEF submodule. But the final result is a much nicer display and in general feels like a nicer UX. Plus it’s in core, so you don’t need to add extra modules!

    The widget also supports bulk upload, which is fantastic. It respects the Media Reference Field’s cardinality: limit it to one, and only one file can be uploaded or selected from the browser; allow more than one, and you can upload or select up to that number. The field even tells you how many you can add and how many you have left. And yes, the field supports drag and drop :)

    What it doesn’t add

    WYSIWYG embedding

    WYSIWYG embed support is now being worked on for a future release of Drupal 8 core; you can follow this Meta issue to keep track of the progress. It sounds like some version of Entity Embed (possibly limited to Media) will make its way in, and some form of CKEditor plugin or button will be available to achieve something similar to what the Media Entity Browser, Entity Browser, Entity Embed and Embed module set provides currently.

    Until then though, we’ve been working on integrating the Media Library’s Grid Listing into a submodule of Media Entity Browser, to give editors the UX improvements that came with the Media Library while keeping the same WYSIWYG embed process (and the contrib modules behind it) they’re currently used to (assuming they’re already using Media Entity Browser, of course). More on this submodule below.

    This is essentially a temporary solution until the Media Initiative team and those who help out on their issue queue (all the way from UX through to dev) have the time and mental space to get it into core. It should hopefully have all the same bulk upload features the field widget has; it might even be able to support bulk embedding too!

    View mode or image style selectors for editors

    Site builders can set the view mode of the rendered media entity from the manage display page, which in turn allows you to set an image style for that view mode, but editors can’t change this per image (without needing multiple different Media reference fields).

    There is work on supporting this idea for images uploaded via CKEditor directly, which has nothing to do with Media, but I think it would be a nice feature for Media embedding via WYSIWYG as well. Potentially also for Media Reference Fields. But by no means a deal breaker.

    Advanced cropping

    From what I can gather there are no plans to add more advanced cropping capabilities into core. This is probably a good thing, since cropping requirements can differ greatly and we don’t want core to get too big. So contrib will still be your go-to for this. Image Widget Crop is my favourite, but there’s also the simpler Focal Point.

    You can test out the submodule from the patch on this issue and let us know what you think! Once the patch is added, enable the submodule then edit your existing Entity Browsers and swap the View widget over to the “Media Entity Browser (Media Library)” view.

    Form for changing the Entity Browser view widget

    It shouldn’t matter if you’ve customised your entity browser. If you’ve added something like Dropzone for drag-and-drop support it *should* still work (if not, check the Dropzone or Entity Browser issue queues). If you’ve customised the view it uses however, you might need to redo those customisations on the new view.

    I also like updating the Form Mode of Entity Browser’s IEF widget to use the new Media Library form display, which I always pare back to just the essential fields (who really needs to manually set the author and created time of uploaded media?).

    You still can’t embed more than one media item at a time. But at least now you also can’t select more than one item when browsing so that’s definitely an improvement.

    Modal of the Media Entity Browser showing the same Grid listing

    Plus editors will experience a fairly consistent UX between browsing and uploading media on fields as they do via the WYSIWYG.

    Once set up and tested (ensuring you’ve updated any Media Reference Fields to use the new Media Library widget too), you can safely disable the base Media Entity Browser module and delete any unused configuration - it should just be the old “Media Entity Browser” view.

    Please post any feedback on the issue itself so we can make sure it’s at its best before rolling another release of the module.

    Happy days!

    I hope you have as much fun setting up the Media Library as I did. If you want to contribute to the Media Initiative I’m sure they’ll be more than happy for the help! They’ve done a fantastic job so far but there’s still plenty left to do.


    Posted by Rikki Bochow
    Front end Developer

    Dated 25 September 2018


    A nice and useful article on using the core Media Library in Drupal 8 projects.
    Thank you!



    Sep 24 2018

    I moved over to DDEV for my local development stack back in February. One of my favorite things is the ease of using Xdebug. You can configure Xdebug to always be enabled, or turn it on and off as needed (my preferred method.) When you have Xdebug enabled, it also enables it for any PHP scripts executed over the command line. That means you can debug your Drush or Drupal Console scripts like a breeze!

    This article is based on using Xdebug within PhpStorm, as it is my primary IDE.


    When using PhpStorm with any non-local stack you will have to set up a mapping. This allows PhpStorm to understand how files in the server environment map to files in your PhpStorm project. For example, DDEV serves files from /var/www/html within its containers, but the project files actually live at /Users/myuser/Sites/awesomeproject on your machine.

    If you haven't yet, set up the configuration for this mapping. Generally, I never do this upfront and wait until the first time I end up using Xdebug over a web request and PhpStorm prompts me to. In this configuration, you will provide a server name. I set this as the DDEV domain for my project.


    Now, to get Xdebug over the CLI to work we need to ensure a specific environment variable is present: PHP_IDE_CONFIG. This contains the server name to be used, which PhpStorm maps to the configured servers for your project.

    An example value would be serverName=myproject.ddev.local, where the server name matches the one you configured in PhpStorm.
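Spelled out as a shell session (the server name here is illustrative; use whatever you named the server in PhpStorm):

```shell
# Point Xdebug CLI sessions at the PhpStorm server called
# "myproject.ddev.local" so path mappings resolve correctly.
export PHP_IDE_CONFIG="serverName=myproject.ddev.local"

# Confirm the variable is set for subsequent commands in this shell.
echo "$PHP_IDE_CONFIG"
```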


    Now, you could export this every time you want to debug a PHP script over the command line -- but that is tedious. To ensure this variable is always present, I use an environments file for DDEV. To do this, create a docker-compose.env.yaml file in your .ddev directory. DDEV will load this alongside its main Docker Compose file and merge it in.

    version: '3.6'
    services:
      web:
        environment:
          - PHP_IDE_CONFIG=serverName=myproject.ddev.local

    And now you can have PHP scripts over the command line trigger step debugging via Xdebug!

    Sep 24 2018

    This is the sixth installment in a series presenting work on shared configuration that comes out of the Drutopia initiative and related efforts, beginning with Part 1, Configuration Providers.

    Our main focus has been updating configuration from distributions--specifically, the question:

    How can I update my site so that I have all the latest configuration changes from a distribution--while still retaining any customizations I made?

    Updates are well and good. But before packages of configuration can be updated, they need to be produced and managed in the first place. In Drupal 8 as in previous major versions, that task is the domain of the Features module.

    The Drupal 8 version of Features is a complete rewrite with major improvements over previous versions. If you're familiar with previous versions but haven't used Features in Drupal 8 this backgrounder will bring you up to speed.

    Despite being a development-focused tool, Features is among the top 40 or so most installed contributed Drupal 8 modules. Features is used in building and maintaining several of the more-used Drupal 8 distributions including Open Social, Varbase, Open Church, and Quick Start. It's a key build tool for the GitLab-hosted Drutopia project.

    In this installment we'll cover Features in Drupal 8, including how to use it to produce a distribution.

    Analyzing the site

    With its default configuration, Features will automatically divide a site's configuration into a logically structured set of distinct features. It will even generate an installation profile to use with the set of features.

    Features is architected around three main components.

    • An assignment plugin influences the way that configuration on a site is assigned to distinct packages or features. Features ships with a large set of assignment plugins and other modules can provide their own. See documentation on Features assignment plugins and related developer documentation.
    • A generation plugin determines how features are generated. Features ships with two generation plugins, one that creates downloadable archives and the other that writes features directly to the file system. See documentation on Features generation and related developer documentation.
    • A features bundle is a configuration entity that defines a set of features such as those included in a specific distribution. Each bundle has its own assignment plugin configuration--so each distribution can customize the way its configuration is packaged. See documentation on Features bundles.

    The easiest way to get a feel for this functionality is to install a fresh Drupal site and then use Features to turn it into a distribution.

    Here's how.

    Install Drupal and enable the Features UI module

    One quick way is to use simplytest.me:

    • Bring up simplytest.me with the Features module preselected and click "Launch sandbox". This will spin up a site installed with Drupal core's "Standard" installation profile.
    • When the site loads, if you have not been automatically logged in, click the "Log in" link and log in with the user name "admin" and password "admin".

    Alternately, install a local development site using the "Standard" install profile. See Chapter 3 of the Drupal 8 user guide for details.

    When you have a site installed and are logged in as an administrator:

    • Click "Extend" and install the "Features UI" module.
    • Navigate to admin/config/development/features and click the "Configure bundles" tab to bring up the form for creating and editing feature bundles.
    • For "Bundle", select "--New--" and enter the bundle name "Exemplary".
    • Check the checkbox "Include install profile" and enter "exemplary" for "Profile name".
    • Click the "Features" tab to bring up the form for downloading features. You'll see that the site configuration installed by the "Standard" installation profile has been neatly divided into several distinct features. As well, an installation profile, "Exemplary", has been created.
    • Click the checkbox in the table header, left of "Features", to select all available features.
    • Click the "Download archive" button.

    What you get is an initial draft of a Drupal distribution. You can copy the resulting directory and all its subdirectories to the profiles folder of a Drupal codebase and install a new site using the "Exemplary" installation profile.

    Of course, you won't get anything that's not already in Drupal core's "Standard" installation profile. But there are two advantages you get from the exercise:

    • You can easily install just a subset of what Drupal core provides. This contrasts with the "Standard" profile, which installs everything all at once, whether or not you need it on the particular site.
    • You have a base to build on. As you continue to build out new elements on your site, following the naming conventions in the Features building workflow, Features will recognize new features as you build them and, again, automatically package them into feature modules for generation.

    Features and dependencies

    The "base" and "core" assignment plugins give a feel for how assignment plugins work together to assign configuration to features.

    The Base assignment plugin

    To automatically divide a site's configuration into distinct features, we have to know what features to create.

    One way is by analyzing the configuration itself. Certain types of configuration have a whole lot of other configuration that depends on them. Content types are a key example. A content type doesn't require any other piece of configuration, but typically has many pieces of configuration of different types that depend on it: fields, view modes, search indexes, and so on. For that reason, it's often a good design decision to create a feature module per content type on the site.

    That's what the Base assignment plugin does: take a particular type of configuration and create a feature for each item of that type. Concretely, on a site that has an event content type and an article content type, the Base assignment plugin will create an event and an article feature. Then it will assign to each feature every piece of configuration that's namespaced with the feature name. For example, a field called field_event_type would be assigned to an event feature.
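As a rough illustration (machine names are hypothetical), the resulting grouping might look like:

```yaml
# Illustrative only: config items grouped by the Base assignment plugin.
event:
  - node.type.event
  - field.field.node.event.field_event_type
article:
  - node.type.article
  - field.field.node.article.field_article_image
```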

    The Core assignment plugin

    The Core assignment plugin addresses the problem of configuration that's required across multiple features. How do we provide that configuration without creating a bunch of cumbersome inter-feature dependencies?

    Say we have a tags field we use on both an event content type (provided by our event feature) and an article content type (provided by our article feature). Like all fields, the tags field has an accompanying field storage. That means the tags field storage will be required by both the event and the article features.

    If we put it in the event feature, the article feature will require events--which doesn't make sense. The reverse case - that events require articles - isn't any better.

    The Core assignment plugin addresses this issue by pulling specified types of configuration into a core feature that can be required by other features.
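Continuing the tags example (names hypothetical), the shared field storage lands in the core feature while the per-content-type fields stay with their own features:

```yaml
# Illustrative only.
core:
  - field.storage.node.field_tags      # required by both features below
event:
  - field.field.node.event.field_tags
article:
  - field.field.node.article.field_tags
```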

    Further reading

    See the blog post How to use Features module in Drupal 8 to bundle functionality in reusable module for an introduction to the module. There's also a detailed Features 8 handbook on drupal.org.

    Potential enhancements

    While Features in Drupal 8 is mature and used on thousands of sites, there are loose ends and some significant missing pieces still to cover, and additional uses the module could be extended to.


    Drupal 8 supports multiple "collections" for configuration. So far, Features only works with the default collection. In practice, this means most configuration is handled fine, but language translations of configuration are not. Here's the relevant issue: Support non-default configuration collections, including language.


    A significant missing piece is integration for alters (covered in Part 4 of this series).

    There are two parts to integrating support for alters into Features. First, when exporting items to a feature, it should be possible to export specific changes as alters. Second, in the export of the original item, any alters should be reversed, so that the exported item is alter-free. Core's configuration override system includes the ability to load configuration free of overrides, but there is no equivalent as yet for the various approaches to altering configuration-provided configuration. Potential pieces to help address these requirements include:

    Additional use cases

    The plugin-based architecture of Features means it can potentially be applied beyond its original use case of managing feature modules. For example, it would be relatively simple to provide a features generation plugin to generate configuration packages as Configuration Split configuration entities.

    Related core issues

    Next up

    Stay tuned for the next installment in this series: Base Configuration.

    Sep 24 2018
    How does Drop Guard know what to do? About Update behaviors & Events
    A tool only performs as well as it's configured and handled. This post gives detailed insights into the important touch points of the Drop Guard actions you need to configure in order to benefit from a smooth and individual update pipeline.

    Step 0: You've created your project

    Let’s start with this scenario: you have successfully created your first project in Drop Guard via the “Projects” menu item. Within this process, you gave your project a name, provided Drop Guard access to your Git repository, and selected whether your project is managed by Composer or not (had problems at this stage of project creation? Contact us and we’ll help you create your project successfully!).
    So, now that you have the basis for a smooth, automated update process, it’s time to dig deeper and add the crucial details: how Drop Guard should behave when an update is available.
    After saving your project basic settings, you get forwarded to the project’s overview page and some notifications pop up at the top of this page: 
    • Your project is paused: nothing will happen until you unpause your project, so you can safely configure everything calmly
    • Add a subscription plan: without a subscription plan, you are only able to set up a project for update monitoring purpose (you need to install the Drop Guard module on your live site for this) 
    • Add new update behaviors now: YES, this is what we are looking for right now as a next step. 

    Step 1: What do you want?

    Get a clear picture of how your current update process (if you have one) is running, and agree with your team and/or customer on how Drop Guard should adapt to it (exactly alike, optimized, or even totally different to test a new update process? It’s your own free choice!).

    Step 2: Set up update behaviors

    Click on the provided link at the top of your project’s overview page (in the corresponding notification popup) or enter the configuration view (“Configure this project” in the right corner).
    We start with the “Update behaviors” tab. 

    Update behaviors tab overview

    1. ADVANCED and SIMPLE mode

    You can decide whether you want to configure all update types:
    • Highly Critical (scores 20 to 25)
    • Critical (risk level of 15 to 19)
    • Moderately Critical (10 to 14) 
    • Less Critical (scores 5 to 9)
    • Not Critical (risk level between 0 and 4)
    • Normal (all non-security related module updates)
    If you choose the “SIMPLE” mode, the “Security-related” update type merges 4 update types: Critical, Moderately Critical, Less Critical and Not Critical.
    Learn more about Security risk levels on drupal.org
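    To keep the score ranges straight, here is a minimal sketch of how an update could be classified by risk score. The function name and the `simple_mode` flag are illustrative, not part of Drop Guard’s actual API; the ranges come from the list above:

    ```python
    def classify_update(score, security_related=True, simple_mode=False):
        """Map a drupal.org security risk score (0-25) to the update
        types listed above. Non-security updates are always "Normal"."""
        if not security_related:
            return "Normal"
        if score >= 20:
            return "Highly Critical"
        if simple_mode:
            # SIMPLE mode merges the four lower security levels into one type
            return "Security-related"
        if score >= 15:
            return "Critical"
        if score >= 10:
            return "Moderately Critical"
        if score >= 5:
            return "Less Critical"
        return "Not Critical"
    ```

    Note that even in SIMPLE mode, Highly Critical stays its own type; only the four lower levels are merged.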

    2. Behavior settings for each update type

    By clicking on these Update types one after another, you can configure these 6 general behaviors for each type individually:
    1. Apply updates: decide which branch Drop Guard should ultimately deploy the update to. This checkbox includes the update type in the process, so if you uncheck it, this update type will be ignored. Specify the Git branch you want updates of this type to be committed to; the list of your branches will be pulled into the select list and the most appropriate branch will be suggested automatically.
    2. Create feature branches: you can let Drop Guard commit updates to a feature branch, created from the branch specified in the “Apply updates” setting above. If unchecked, all updates will be committed directly to the source branch you’ve set above. When you use feature branches, don’t forget to create the “Merge branch” action on the “Events” screen of the project edit form so your QA process is complete. A good recipe is to merge the feature branch when the task status is “Test passed” or “Closed”.
    3. Execute manual tests: if you (or your customers) want to test updates manually at a specific stage, check this box. When it's checked, Drop Guard won’t close the update task until the "Test passed" button is clicked by someone responsible for QA within the Drop Guard interface, or triggered via project management tools such as JIRA or Redmine.
    4. Commit updates automatically: if Drop Guard should start your configured process automatically, check this box. Otherwise, you have to kick-start every available update manually. This checkbox is essential for “Highly critical” updates: the website should be patched as soon as possible, otherwise you risk being hacked.
    5. Stop process if a patch could not be applied: yes, update automation and custom patches can work together! It’s Drop Guard’s goal to preserve the customized changes in your modules. Check this box if you want to intervene manually when Drop Guard couldn’t apply your custom patch due to a merge conflict. You will find a log file in the Error task to understand what failed.
    6. Create git tag: Drop Guard can add a tag to the update commit when pushing it to your repository.
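    These six settings can be pictured as one configuration record per update type. Below is a hedged sketch; the field names and the `commit_branch` helper are hypothetical, invented only to illustrate how the “Create feature branches” option changes where an update is first committed:

    ```python
    from dataclasses import dataclass

    @dataclass
    class UpdateBehavior:
        # Field names are illustrative; Drop Guard's real settings live in its UI.
        apply_updates: bool = True         # include this update type in the process
        target_branch: str = "develop"     # branch updates are deployed/merged to
        create_feature_branches: bool = False
        execute_manual_tests: bool = False
        commit_automatically: bool = False
        stop_if_patch_fails: bool = True
        create_git_tag: bool = False

        def commit_branch(self, task_id):
            """Where a given update task is committed first."""
            if self.create_feature_branches:
                # feature branch cut from the target branch, merged back later
                # via a "Merge branch" action on the Events screen
                return f"feature/update-{task_id}"
            return self.target_branch

    # Best-practice example from the text: Highly Critical updates go
    # straight to the production branch, automatically and untested.
    highly_critical = UpdateBehavior(commit_automatically=True,
                                     execute_manual_tests=False,
                                     target_branch="master")
    print(highly_critical.commit_branch(42))  # master
    ```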

    3. Please also configure how Drop Guard should handle its own updates, for the rare cases in which they occur.

    YES! - you’ve just set up how Drop Guard should react to each type of update out there in the wild! 
    Let’s get to the magic page, which needs some concentration.

    Thinking “Excuse me, I just don’t get it”? You can contact us, join our Slack, or start again with the “Reload best practices” button at the end of the Update behaviors tab (step 4 in the screenshot above). 

    Step 3: Actions, please! 

    After you've decided how Drop Guard should handle the available update types in general, you will now configure what EXACTLY needs to happen in this pipeline, giving Drop Guard the instructions for your individual process. 
    These actions are configured on the “Events” tab of the project configuration view. Between the “Update behaviors” and “Events” tabs, you can also integrate your project management/communication tools (JIRA, Redmine, Slack) and hosting providers (Pantheon, Acquia, and soon Platform.sh) if any are part of your update process. 

    Let’s start with an overview:

    Events tab overview


    1. Available actions

    Check out the essence of this page: the available actions. This list details all the actions that can be chosen in each update process below (blue fields). BUT: if you haven’t added a tool integration like JIRA, Slack or Redmine, the corresponding action won’t be displayed, so you can’t choose it until you enable the integration on the “Integrations” tab. 

    Available actions on Events tab

    2. Re-check your Update behaviors

    Click the button “Show your Update behaviors” to see a brief overview of the instructions you gave Drop Guard in the “Update behaviors” tab. Why do you need this reminder while configuring the actions? Because if, for example, you configured Drop Guard to apply Highly Critical updates to your master/production branch, you should now insert the link to that branch for all Highly Critical updates, not a link to another branch because you forgot your settings. Simply put, it’s a reminder so you don’t contradict your previous settings ;-) 

    3. Configure actions 

    We’re at the last step. Now you can call your desired update process into being by telling Drop Guard how to respond in certain update process situations (events). 
    1. New updates are available: Drop Guard has already created an update task at this point. You can configure, for example, that Drop Guard notifies you at this point via email or via a ticket.
    2. Task status is “Ready to test”: Drop Guard has applied the newly available update in your git repository. Now either you or your customer can test the website (check your Update behavior overview with the grey button above and make sure you enabled manual tests), an external testing tool can run, or Drop Guard tests automatically. 
    3. Task status is “Test passed”: a successful test, done manually, with tools, or by Drop Guard, results in a task marked “Test passed”. At this point, all tasks marked “Test passed” can be merged to a branch. 
    4. ATTENTION: if you follow our best practices, you let Drop Guard merge available Highly Critical updates directly without testing them (security beats functionality). Drop Guard then assumes these tasks are in the “Test passed” status anyway, so you have to add the “Merge a branch” action to the production or master branch here for Highly Critical updates.
    5. Task status is “Closed”: if a task was closed manually or via a project management tool, you can add an action to be informed about it. 
    6. Time schedule of actions: this lets you schedule each action, meaning you can control the day and time at which the configured actions run. 
    7. Task status is “Failed”: if an action failed, for example due to missing access permissions, let Drop Guard inform you!
    8. A patch could not be applied: get informed if a task failed because Drop Guard couldn’t re-apply a custom patch.
    9. Task status is “Test failed”: if a task was set to “Test failed”, for example via JIRA after your customer tested the update on their website, you can configure what Drop Guard should do.
    IMPORTANT: The Actions will be executed in the same order as you position them on the Events management screen. If any Action fails, all further Actions will not be executed and the update task status will be set to “Error”.
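    The ordering rule above can be sketched as a simple pipeline. The `run_actions` function and the action names here are illustrative, not Drop Guard’s actual API; they only model “run in order, stop on failure, mark the task as Error”:

    ```python
    def run_actions(task, actions):
        """Execute event actions in their configured order. A failure stops
        the pipeline: later actions are skipped and the task gets "Error"."""
        for name, action in actions:
            try:
                action(task)
            except Exception:
                task["status"] = "Error"
                task["failed_action"] = name
                return task
        return task

    def merge_conflict(task):
        # stand-in for an action that fails, e.g. a merge conflict
        raise RuntimeError("merge conflict")

    task = {"status": "Ready to test"}
    actions = [
        ("notify-slack", lambda t: None),                       # succeeds
        ("merge-branch", merge_conflict),                       # fails here
        ("close-ticket", lambda t: t.update(status="Closed")),  # never runs
    ]
    run_actions(task, actions)
    print(task["status"])  # Error
    ```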
    You’ll find the Drop Guard DOCS section about Events and actions here.
    PHEW, we made it!! 
    After this deep configuration, you can add a subscription to this project and unpause it to let Drop Guard do the work for you.
    An update process combines quite a few complex steps, so we know how satisfying it is to successfully map your own process, an optimized one, or the one your customer requested.

    If you need help to configure your individual process or if you don’t know how to align Drop Guard’s functions with your process, contact us, join our Slack or request a personal demo with our dev team.
    We’re always happy to make you and your project successful. 

    Sep 24 2018

    Culture. It's the only truly sustainable competitive advantage for a Drupal business.  But what does that look like in action? I've seen firsthand how that culture extends far beyond Mediacurrent's business and customer service approach, shaping the way we network. 

    We have all been to a party, lunch, or even coffee and cookies with a vendor trying to make a connection with you or your company. These events fall into two basic categories: those you walk into and have fun, and those you walk into defensively because you know the goal is to pitch you a sale.

    Hosting a networking event can be a costly endeavor for your company, and there is no guarantee of a high return on your investment. Between your time, activities, and potentially the cost of a venue, expenses can pile up quickly.

    Hitting that optimal zone where customers or potential clients will feel relaxed and are open to conversation is key to reaching your maximum potential for ROI for your event. There are several ways you can do this, but it all starts with one word.


    Passion for what you love is the difference between just hosting an event and connecting with the community in your field of business. The goal is to show your passion for what you do, and the community you are in -- in our case, the open source and Drupal communities.  

    DrupalCon 2018 conference aerial group shot

    Take the Dave and Paul approach, for example. At DrupalCon 2018, they threw an amazing after party hosted by Mediacurrent. Everything down to the invites was inclusive to all (not just those with purchasing power), with the message of “Hey, we are throwing a party, come hang out! Hope to see you there.” Every single person was treated like a friend.

    karaoke at Mediacurrent's DrupalCon Nashville after party

    While at the party, the sales team focused on just interacting, listening to people’s experience and thanking the community for showing up. This approach made people feel so comfortable that if they had a sales question, they would just ask.

    When a person feels welcomed, unpressured and a part of the group, then it's easy for them to make the leap from conference attendee to a potential client. Remember: you and everyone who attends your function is a part of the same community. If you view them as just potential sales, then this will be translated into your body language and verbiage.

    In closing, being a part of the Mediacurrent team has reaffirmed for me the value of networking with authenticity. Hosting your event with the passion you have for the community you are a part of will shine through to everyone who attends and solidify you in their mind as the right partner for their project.


    About Drupal Sun

    Drupal Sun is an Evolving Web project. It allows you to:

    • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
    • Facet based on tags, author, or feed
    • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
    • View the entire article text inline, or in the context of the site where it was created

    See the blog post at Evolving Web
