Jul 08 2020

Front-end development workflows have seen considerable innovation in recent years, with technologies like React popularizing revolutionary concepts like declarative components in JSX and more efficient document object model (DOM) diffing through virtual DOMs. This front-end revolution has significantly changed developer experiences across the JavaScript landscape and lent even more momentum to decoupled Drupal architectures in the Drupal community. Nonetheless, many traditional CMSs remain behind the curve when it comes to enabling true shared component ecosystems through developer experiences that facilitate shared development practices across the back and front end.

At DrupalCon Amsterdam 2019, Fabian Franz (Senior Technical Architect and Performance Lead at Tag1) delivered a session entitled "Components everywhere: Bridging the gap between back end and front end" that delved into his ideal vision for enabling such shared components in Drupal's own native rendering layer. Fabian joined Michael Meyers (Managing Director at Tag1), and me (Preston So, Editor in Chief at Tag1; Senior Director, Product Strategy at Oracle; and author of Decoupled Drupal in Practice) for a Tag1 Team Talks episode highlighting the progress other ecosystems have made in the face of this problem space and how a hypothetical future Drupal could permit rich front-end developer experiences seldom seen in the CMS world. In this two-part blog series, a sequel to Fabian's DrupalCon session, we dive into some of his new conclusions and their potential impact on Drupal's future.

Components everywhere in Drupal

At the outset of our conversation, Fabian offered a quick summary of his idea of components everywhere (i.e., components shared across both client and server) within the Drupal context. The main thrust of Fabian's vision is that Drupal developers ought to be able to implement a back-end application in a manner indistinguishable from how they would implement a front-end application. In other words, developers should not necessarily need to understand Drupal's application programming interfaces (APIs) or decoupled Drupal approaches. By decoupling Drupal within its own architecture (as I proposed with Lauri Eskola, and with Sally Young and Matt Grill before that), we can enable the implementation of purely data-driven Drupal applications.

But what does this truly mean from the standpoint of Drupal developers? For Fabian, components everywhere will have truly succeeded when the same component can be leveraged on the front end and back end without any distinction in how data is handled. One of the key means of doing this in a way that can be shared across client and server is through slots, which can contain additional data and provide the concept of component "children."

Because of how Drupal's front end was originally architected, there are significant gaps between how Drupal handles its "components" and how other technologies juggle theirs. For instance, while theme functions comprise an important foundation for how Drupal developers interact with the Drupal front end, there is no way to provide a slot for interior data or nested components. There is an analogous concept in the children of the render tree, but traversing it requires considerable knowledge of PHP. According to Fabian, though we have all of the elements needed for a component-based system available in Drupal, one of the primary challenges is that there are so many elements within Drupal that could lend themselves to such a system.

Looking to Laravel for inspiration

Adhering to the open-source philosophy of "proudly found elsewhere," Fabian turned to other projects for inspiration as he began to articulate what it would take to implement the vision he presented in Amsterdam. After all, reinventing the wheel is usually an ill-advised approach when open-source solutions are available to be leveraged. For instance, Laravel contains templates but needed to introduce component tags to their templating system in order to capture generic slots. In Drupal, on the other hand, both theme functions and Twig templates can morphologically be considered components, but they lack certain key attributes most components today contain. Slots are implementable in Twig, but that is solely because all data is already available to Twig templates in Drupal.
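For comparison, Twig can already approximate a slot today through its embed and block tags, though only as a workaround rather than as a first-class component API; the template and block names below are purely illustrative:

```twig
{# Illustrative only: the alert template and its "content" block stand in for
   a component and its slot. #}
{% embed 'alert.html.twig' with { type: 'warning' } %}
  {% block content %}
    This nested markup plays the role of the component's slot.
  {% endblock %}
{% endembed %}
```

This works only because, as noted above, all data is already available to Twig templates in Drupal; it is not a generic component contract.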

Laravel 7 introduced BladeX to the Laravel ecosystem. BladeX provides a highly enjoyable developer experience by serving as a component handler for Laravel components. As long as developers prefix all components with x- in their custom element names (i.e., <x-component>), they no longer need to use a regular expression to find all possible component names in the component system; they simply search for all components whose names are prefixed with x-. And if the React developer experience is any indication, many modern front-end developers strongly prefer declarative HTML like the following:

    <x-alert prop="value"></x-alert>

BladeX first began as a contributed plugin to Laravel. Later, it was added to Laravel core due to its usefulness in enabling not only a graceful component system but also pleasant-to-use syntax for working with those components. Livewire, discussed below, also includes graceful capabilities enabling interactivity; in Drupal, that role is currently played by the Ajax framework, which is difficult to use due to its tight coupling to Drupal's Form API.

More recently, Laravel introduced a tool known as Livewire, which makes it possible to implement server-side DOMs but lacks the full data input/output (I/O) necessary to enable state management and interactivity. As such, Fabian extended the concept of a store from his DrupalCon session to include a provider that allows data retrieval and use in components. Fortunately, Livewire has a partial implementation of this, and it is possible to implement a server-side method that increments a counter and then to retrieve that counter value gracefully from the client side. Livewire automatically understands that it needs to update the server-side render of that counter and serve that updated value to the client.

What about Web Components?

Fabian's thinking is by no means alone when it comes to enabling components everywhere in Drupal. Many other initiatives, including one that aimed to introduce Web Components into Drupal, have been down this road. But why are Web Components so compelling for this in the first place? By going a step further and introducing the Shadow DOM, Web Components can provide full encapsulation automatically, off the shelf.

And the Shadow DOM itself is a game changer because of the benefits provided by syntactic features like CSS scoping, in which styles contained in a Shadow DOM are unaffected by those that came previously. Another way to accomplish such CSS scoping is through stringent class-based selector nomenclature or utilities like TailwindCSS that dispense with the traditional CSS cascade altogether. Many in the JavaScript world are increasingly moving in this direction, according to Fabian, of considering the cascade in CSS a suboptimal feature.

In other user interface (UI) systems, particularly in the mobile application development landscape, ecosystems like React Native and Flutter exemplify two emerging approaches to styling mobile applications. Both allow you to assemble compelling layouts without any CSS cascade at all, whether through React-driven components that leverage CSS-in-JavaScript solutions or through styling attached directly to widgets. Increasingly, these developments point to a landscape where developers eschew the cascade, long essential to writing CSS, in favor of a more atomic approach to styling components.


Components are difficult even in the best of times, and not solely because of their conceptual complexity or the differing ways components are defined from system to system. In the case of JavaScript technologies, approaches like React's declarative component syntax and Virtual DOM portend a world in which components are increasingly shared between client and server, and data in components is decoupled from the component during all stages of the component life cycle, irrespective of whether it is rendered on the back end or front end. Complicating matters further is the fact that traditional content management systems like Drupal and WordPress have largely not kept pace with the dizzying innovation in the front-end development universe.

In this blog post, we examined some of the new conclusions Fabian has come to well after his DrupalCon presentation when it comes to enabling components everywhere in Drupal, taking particular inspiration from other ecosystems like Laravel, React, and Web Components. In the second installment of this two-part blog series, we'll dive into how to define components in Drupal, how to offer a more declarative experience when working with them, and some of the other ways in which we can enable shared components across client and server, as well as rich immutable data-driven state, in a setting where these novelties have long seemed anathema or worlds removed: the Drupal front-end ecosystem.

Special thanks to Fabian Franz and Michael Meyers for their feedback during the writing process.

Photo by Tim Johnson on Unsplash

Jul 01 2020

Many front-end technologies, especially React, now consider the notion of declarative components to be table stakes. Why haven't they arrived in environments like the Drupal CMS's own front end? Many native CMS presentation layers tend to obsolesce quickly and present a scattered or suboptimal developer experience, particularly against the backdrop of today's rapidly evolving front-end development workflows. But according to Fabian Franz, there is a solution that allows for that pleasant front-end developer experience within Drupal itself without jettisoning Drupal as a rendering layer.

The solution is a combination of Web Components support within Drupal and intelligent handling of immutable state in data that allows Drupal to become a more JavaScript-like rendering layer. Rather than working with endless render trees and an antiquated Ajax framework, and instead of reinventing Drupal's front-end wheel from scratch, Fabian recommends adopting the best of both worlds by incorporating key aspects of Web Components, the Shadow DOM, and particularly syntactic sugar for declarative components that not only competes readily with wildly popular JavaScript technologies like React and Vue but also matches up to the emerging approaches seen in ecosystems like Laravel.

In this Tag1 Team Talks episode, join Fabian Franz (Senior Technical Architect and Performance Lead at Tag1), Michael Meyers (Managing Director at Tag1), and your host and moderator Preston So (Editor in Chief at Tag1; Senior Director, Product Strategy at Oracle; and author of Decoupled Drupal in Practice) for a wide-ranging technical discussion about how to enable declarative components everywhere for Drupal's front end out of the box. If you were interested in Fabian's "Components Everywhere" talk at DrupalCon Amsterdam last year, this is a Tag1 Team Talks episode you won't want to miss!


Related Links

DrupalCon Amsterdam 2019: Components everywhere! - Bridging the gap between backend and frontend

Insider insights on rendering and security features

What the future holds for decoupled Drupal - part 2


Laravel Blade Templates


Mortenson's WebComponents server-side shim

AJAX API Guide on Drupal.org

Chat with the Drupal Community on Slack: https://www.drupal.org/slack



Other mentions:

Preston’s newsletter: Preston.so

Preact - Fast 3kB alternative to React with the same modern API https://preactjs.com/

Descript.com - Uses AI to transcribe audio (podcasts) and video into text, providing you with a transcript and closed captioning; edit the audio/video by editing the text!

Photo by Ren Ran on Unsplash.

Jun 24 2020

After four-and-a-half years of development, Drupal 9 was just released, a milestone in the evolution of the Drupal content management system. The Drupal Association has long played a critical role not only in supporting the advancement and releases of one of the world's largest and most active open-source software projects but also in contributing to the Drupal roadmap and driving its forward momentum in other important ways. Alongside maintenance releases for Drupal 7 and Drupal 8, the Drupal 9 release not only promises an easy upgrade for Drupal 8 users but also ushers in a new period of innovation for Drupal.

But that's not all. Drupal 9's release also means long-awaited upgrades to Drupal.org as well as some of the most essential infrastructure and services that underpin Drupal.org and its associated properties, like localize.drupal.org, groups.drupal.org, and api.drupal.org. Releases in Drupal have also garnered greater scrutiny from nefarious actors who target launch dates to seek security vulnerabilities. The Drupal Association works tirelessly to buttress all of these initiatives and responsibilities, with the support of Tag1 and other organizations.

In this Tag1 Team Talks episode, part of a special series with the engineering team at the Drupal Association, we discuss Drupal 9 and what it portends for Drupal's future with Tim Lehnen (Chief Technology Officer, Drupal Association), Neil Drumm (Senior Technologist, Drupal Association), Narayan Newton (Chief Technology Officer, Tag1 Consulting), Michael Meyers (Managing Director, Tag1 Consulting), and Preston So (Editor in Chief at Tag1 Consulting and author of Decoupled Drupal in Practice). We dive into some of the nitty-gritty and day-in-the-life of Drupal core committers and how Drupal is taking a uniquely new approach to tackling technical debt.




Photo by asoggetti on Unsplash

Jun 22 2020

Maintaining Drupal projects and managing Drupal modules can be challenging even for contributors with unlimited time. For nearly two decades, Drupal's ecosystem has cultivated a wide array of tools for contributors to create patches, report issues, collaborate on code, and perform continuous integration. But as many source control providers begin to release shiny new features like web IDEs and issue workspaces that aim to make open-source contributors' lives even easier, many are doubtless wondering how Drupal's own developer workflows figure in an emerging world of innovation in the space.

DrupalSpoons, created by Moshe Weitzman and recently released, is a special configuration of groups and projects in GitLab that provides a bevy of useful features and tools for Drupal contributors who are maintaining Drupal projects. A play on the word "fork," which refers to a separately maintained clone of a codebase that still retains a link to the prior repository, DrupalSpoons offers support for GitLab issues, merge requests (GitLab's analogue for GitHub's pull requests), and continuous integration on contributed Drupal projects in the ecosystem. It leverages zero custom code, apart from the issue migration process to aid DrupalSpoons newcomers, and outlines potential trajectories for Drupal contribution in the long term as well.

In this exciting episode of Tag1 Team Talks, Moshe Weitzman (Subject Matter Expert, Senior Architect, and Project Lead at Tag1) hopped on with Michael Meyers (Managing Director at Tag1) and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a deep dive into what makes DrupalSpoons so compelling for Drupal contributors and the origin story that inspired Moshe to build it. Join us to learn how you can replace your existing Drupal contribution workflows with DrupalSpoons to get the most out of Drupal's recent migration to GitLab and the most modern capabilities in Drupal code management today.




Photo by Richard Iwaki on Unsplash

Jun 15 2020

Part 1 | Part 2 | Part 3

For several years now, decoupled Drupal has been among the topics that have fixated members of the Drupal community. At present, there is no shortage of blog posts and tutorials about the subject, including my own articles, as well as a comprehensive book covering decoupled Drupal and an annual conference in New York City to boot. Now that JSON:API has been part of Drupal core for quite some time, some of the former obstacles to implementing decoupled Drupal architectures have been lowered.

However, though we have seen a large upswing in the number of decoupled Drupal projects now in the wild, some areas of the decoupled Drupal ecosystem have not yet seen the spotlight afforded projects like JSON:API and GraphQL. Nonetheless, many of these contributed projects are critical to adding to the possibilities of decoupled Drupal and can abbreviate the often lengthy period of time it takes to architect a decoupled Drupal build properly.

In April of last year, this author (Preston So, Editor in Chief at Tag1 Consulting and author of Decoupled Drupal in Practice) spoke to a packed auditorium at DrupalCon Amsterdam about some of the lesser-known portions of the decoupled Drupal landscape. In this multi-part blog series, we survey just a few of these intriguing projects that can serve to accelerate your decoupled Drupal implementations with little overhead but with outsized results. In this third and final installment, we cover several projects that encompass some of the most overlooked requirements in decoupled Drupal, namely JSON-RPC, Schemata, OpenAPI, and Contenta.js.

Running Drupal remotely with JSON-RPC

Before we continue, it's important that you have exposure to the other information provided in this series for the most complete possible perspective on these projects that make decoupled Drupal even more compelling. This third and final installment in the blog series presumes knowledge already presented in the first and second installments, in particular the summary of motivations behind JSON-RPC provided in the installment immediately preceding this post.

Maintained by Mateu Aguiló Bosch (e0ipso) and Gabe Sullice (gabesullice), the mission of JSON-RPC is to serve as a canonical foundation for Drupal administrative actions that go well beyond the limitations and possibilities of RESTful API modules like core REST, Hypertext Application Language (HAL), and JSON:API. The JSON-RPC module also exposes certain internals of Drupal, including permissions and the list of enabled modules on a site.

To install JSON-RPC, use the following commands, which also enable JSON-RPC submodules.

    $ composer require drupal/jsonrpc
    $ drush en -y jsonrpc jsonrpc_core jsonrpc_discovery

Executing Drupal actions

To rebuild the cache registry, you can issue a POST request to the /jsonrpc endpoint with the following request body, and JSON-RPC will respond with a 204 No Content response code (a success response that carries no body).

    {
      "jsonrpc": "2.0",
      "method": "cache.rebuild"
    }
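As a quick sketch, the request above can be issued with curl. The base URL below is hypothetical, and the live call is left commented out so the snippet only prints the payload:

```shell
# Hypothetical base URL; substitute your Drupal site's.
BASE="https://example.com"
PAYLOAD='{"jsonrpc": "2.0", "method": "cache.rebuild"}'

# Uncomment to issue the request (requires appropriate permissions);
# a successful call returns a 204 No Content status:
# curl -s -o /dev/null -w '%{http_code}\n' -X POST "$BASE/jsonrpc" \
#      -H 'Content-Type: application/json' -d "$PAYLOAD"

echo "$PAYLOAD"
```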

To retrieve a user's permissions, you can similarly issue a POST request to the same /jsonrpc endpoint, which will respond to your request with a 200 OK response code and a list of the user's permissions.

    {
      "jsonrpc": "2.0",
      "method": "user_permissions.list",
      "params": {
        "page": {
          "limit": 5,
          "offset": 0
        }
      },
      "id": 2
    }

All JSON-RPC methods

The table below shows some of the other common methods that you can execute by issuing requests to JSON-RPC. For a deeper explanation of JSON-RPC as well as a full account of what features JSON-RPC makes available to decoupled Drupal architectures, consult Chapter 23 of my book Decoupled Drupal in Practice.


Derived schemas and documentation with Schemata and OpenAPI

In API-first approaches, schemas are declarative descriptions that outline the shape of a JSON document, such as a typical entity response from a Drupal web service. In Drupal 8, the Schemata module, maintained by Adam Ross (grayside), is responsible for providing schemas that facilitate features that were previously impossible in Drupal such as generated API documentation and generated code, both of which we will examine shortly. To install the Schemata module, execute the following commands:

    $ composer require drupal/schemata
    $ drush en -y schemata schemata_json_schema

Navigating schemas

With Schemata, you can navigate a schema to learn more about how the API issues and handles data, either by using the browser or by issuing GET requests against endpoints that are prefixed with /schemata. Consider, for instance, the following format for Schemata requests:


Here are two examples of schema navigation with regard to the possible URLs against which you can issue requests. Note that in the first example, we are requesting a description of the resource according to the JSON:API module, whereas in the second we are requesting it in the HAL format found in Drupal 8's core REST module.
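As an illustration (the base URL is hypothetical, and the node entity type with the article bundle is only an example), the two requests might look like the following, with the _describes query parameter selecting which serialization the schema describes:

```shell
# Hypothetical base URL; the entity type (node) and bundle (article) are examples.
BASE="https://example.com"
JSONAPI_SCHEMA="$BASE/schemata/node/article?_format=schema_json&_describes=api_json"
HAL_SCHEMA="$BASE/schemata/node/article?_format=schema_json&_describes=hal_json"

# Uncomment to fetch the schemas from a real site:
# curl -s "$JSONAPI_SCHEMA"
# curl -s "$HAL_SCHEMA"

echo "$JSONAPI_SCHEMA"
echo "$HAL_SCHEMA"
```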



In the image below, you can see the result of a sample response from Schemata for the schema describing article data.

Schemata sample response


OpenAPI is a separate project, formerly known as the Swagger specification, which describes RESTful web services based on a schema. The OpenAPI module, maintained by Rich Gerdes (richgerdes) and Ted Bowman (tedbow), integrates with both core REST and JSON:API to document available entity routes in both web services modules.

The unique value proposition for OpenAPI for decoupled Drupal practitioners is that it offers a full explorer to traverse an API schema to understand what requests are possible and what responses are output when the API issues a response. To install OpenAPI, execute the following commands, depending on whether you prefer to use ReDoc or Swagger UI, both of which are libraries that integrate with OpenAPI to provide styles for API documentation.

    # Use ReDoc.
    $ composer require drupal/openapi
    $ composer require drupal/openapi_ui_redoc
    $ drush en -y openapi openapi_ui_redoc

    # Use Swagger UI.
    $ composer require drupal/openapi
    $ composer require drupal/openapi_ui_swagger
    $ drush en -y openapi openapi_ui_swagger

One of the more exciting use cases that OpenAPI makes possible is the idea of generated code, which is dependent on the notion that generated API documentation based on derived schemas means that APIs are predictable. This opens the door to possibilities such as generated CMS forms with built-in validation based on what these schemas provide. For more information about generated code based on the advantages of derived schemas and generated API documentation, consult Chapter 24 of my book Decoupled Drupal in Practice.

Revving up with proxies: Contenta.js

One final project that we would be remiss not to cover as part of this survey of hidden treasures of decoupled Drupal is Contenta.js, authored by Mateu Aguiló Bosch (e0ipso), which addresses the pressing need for a Node.js proxy that acts as middleware between a Drupal content API layer with web services and a JavaScript application. As many decoupled Drupal practitioners have seen in the wild, a Node.js proxy is often useful for decoupling Drupal due to its value in offloading responsibilities normally assigned to Drupal.

Contenta.js integrates seamlessly with any Contenta CMS installation that exposes APIs, as long as the URI of the site is provided in that site's configuration. Many developers working with decoupled Drupal are knowledgeable about Contenta CMS, an API-first distribution for Drupal that provides a content repository optimized for decoupled Drupal while still retaining many of the elements that make Drupal great such as the many contributed modules that add to Drupal's base functionality. (Another similar project, Reservoir, has since been deprecated.)

One of the compelling selling points of Contenta.js is that Contenta installations that already have modules like JSON:API, JSON-RPC, Subrequests (covered in Chapter 23 of Decoupled Drupal in Practice), and OpenAPI need no further configuration for Contenta.js to work with little customization out of the box. Contenta.js contains a multithreaded Node.js server, a Subrequests server facilitating request aggregation, a Redis integration, and a more user-friendly approach to cross-origin resource sharing (CORS). For more information about Contenta.js, consult Chapter 16 of my book Decoupled Drupal in Practice.


Decoupled Drupal is no longer theoretical or experimental. For many developers the world over, it is now not only a reality but also a bare minimum requirement for many client projects. But fortunately for decoupled Drupal practitioners who may be skittish about the fast-changing world of API-first Drupal approaches, there is a rapidly expanding and maturing ecosystem for decoupled Drupal that furnishes solutions for a variety of use cases. Most of these are described at length in my book Decoupled Drupal in Practice.

In this final installment, we covered some of the major modules—and one Node.js project—that you should take into consideration when architecting and building your next decoupled Drupal project, including JSON-RPC, Schemata, OpenAPI, and Contenta.js. And in this multi-part blog series, we summarized some of the most important projects in the contributed Drupal landscape that can help elevate your decoupled Drupal implementations to a new level of success, thanks to the accelerating innovation occurring in the Drupal community.

Special thanks to Michael Meyers for his feedback during the writing process.

Part 1 | Part 2 | Part 3

Photo by ASA Arts & Photography on Unsplash

Jun 08 2020

Though the biggest news this month is the release of Drupal 9, that doesn't mean big releases aren't happening on other versions of Drupal too. The milestone represented by Drupal 9 also welcomes new versions of both Drupal 7 and Drupal 8 to the Drupal ecosystem. It's been four-and-a-half years since Drupal 8 was released, and 54 months of development from scores of contributors around the world went into Drupal 9. And thanks to the indefatigable efforts of open-source contributors in the module ecosystem, there are already over 2,000 contributed modules ready to go, compatible with Drupal 9 out of the box.

Drupal 9 is a massive step for innovation in the Drupal community, thanks to the careful thought that went into how Drupal can continue to stay ahead of the curve. During the Drupal 9 development cycle, which was largely about deprecating and removing old code, the Drupal core committers laid the groundwork for the future and facilitated a more pleasant upgrade experience from Drupal 8 to Drupal 9 that should smooth over many of the hindrances that characterized the transition from Drupal 7 to Drupal 8. And there are already exciting plans ahead for Drupal 9, with coming releases consisting of even more refactoring and deprecations. With Drupal 9.1 in December, the focus will shift to new features and improvements, including user experience, accessibility, performance, security, privacy, and integrations.

In the second episode of our new monthly show Core Confidential, Fabian Franz (VP Software Engineering at Tag1) sat down with Michael Meyers (Managing Director at Tag1) and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a quick but comprehensive survey of how Drupal 9 will change Drupal for the better. Beyond discussing the technical improvements and ecosystem advancements, this Core Confidential episode also dives into the anxieties, challenges, and concerns that core committers have about Drupal 9 moving forward.



Two moderately critical advisories that you need to be aware of and address:

Photo by Jingda Chen on Unsplash

Jun 01 2020

Part 1 | Part 2 | Part 3

Decoupled Drupal has been a hot topic in the Drupal community for several years now, and there are many projects implementing decoupled Drupal architectures, as well as a bevy of content (including my own articles on the subject). Nowadays, decoupled Drupal practitioners can benefit from the first-ever comprehensive book about decoupled Drupal as well as a yearly decoupled Drupal conference. Especially with the JSON:API module now available as part of Drupal core, getting started with decoupled Drupal has never been more accessible.

Nevertheless, there are still hidden areas of decoupled Drupal that have seldom seen much attention in the Drupal community for a variety of reasons. Some of these contributed Drupal modules have been around for quite some time and can help to shorten the amount of time you spend implementing a decoupled Drupal architecture, whether it comes down to a differing API specification or extending existing functionality.

Recently, your correspondent (Preston So, Editor in Chief at Tag1 Consulting and author of Decoupled Drupal in Practice) delivered a DrupalCon Seattle session about some of these lesser-known parts of the decoupled Drupal ecosystem. In this multi-part blog series, we embark on a tour through some of these exciting areas and dive into how these projects can accelerate your builds. In this second installment, we cover how you can leverage the RELAXed Web Services module for your own purposes and how you can extend existing features in the JSON:API module now incorporated into core.

Working with RELAXed Web Services

Before we proceed, be sure to read the first installment in this series for a quick introduction to decoupled Drupal and a taxonomy of the architecture involved. The first installment in this blog series also introduces RELAXed Web Services and how to install and configure the module. From this point forward, it is presupposed that you have a working Drupal 8 site with RELAXed Web Services installed and configured.

To verify that RELAXed Web Services is working properly, we can issue the following GET request against the /relaxed endpoint (or whatever URL we configured in the previous installment of this blog series). The Drupal server should respond with a 200 response code and the following response body:

    {
      "couchdb": "Welcome",
      "uuid": "02286a1b231b68d89624d281cdfc0404",
      "vendor": {
        "name": "Drupal",
        "version": "8.5.6"
      },
      "version": "8.5.6"
    }
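A sketch of that check with curl, assuming a hypothetical base URL:

```shell
# Hypothetical base URL; the /relaxed prefix is the default API root and will
# differ if you changed it in configuration.
BASE="https://example.com"
ENDPOINT="$BASE/relaxed"

# Uncomment to run against a real site; a healthy installation answers
# 200 OK with the CouchDB-style "Welcome" document shown above:
# curl -s -w '\n%{http_code}\n' "$ENDPOINT"

echo "GET $ENDPOINT"
```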

Retrieving data with RELAXed Web Services

The following table describes all of the GET requests that you can issue against a variety of resources provided by RELAXed Web Services.


The screenshot below demonstrates a sample RELAXed Web Services response to a GET request targeted to retrieve a single Drupal entity.

RELAXed Web Services response

Creating entities with RELAXed Web Services

To create documents, which in RELAXed Web Services parlance are equivalent to Drupal entities, you can issue a POST request to the /relaxed/live endpoint (or prefixed with the custom API root you have configured) with the following request body. The server will respond with a 201 Created response code.

    {
      "@context": {
        "_id": "@id",
        "@language": "en"
      },
      "@type": "node",
      "_id": "b6cea743-ba86-49b0-81ac-03ec728f91c4",
      "en": {
        "@context": {
          "@language": "en"
        },
        "langcode": [{ "value": "en" }],
        "type": [{ "target_id": "article" }],
        "title": [{ "value": "REST and RELAXation" }],
        "body": [
          { "value": "This article brought to you by a request to RELAXed Web Services" }
        ]
      }
    }
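Hypothetically, the document above could be sent as follows; the host and credentials are placeholders, and creating documents requires an authenticated user with permission to create the target content type:

```shell
# Hypothetical host and credentials.
BASE="https://example.com"
ENDPOINT="$BASE/relaxed/live"

# Save the request body above as document.json, then uncomment to send it;
# the server should answer with a 201 Created status:
# curl -s -X POST "$ENDPOINT" \
#      -H 'Content-Type: application/json' \
#      -u admin:admin \
#      -d @document.json \
#      -w '\n%{http_code}\n'

echo "POST $ENDPOINT"
```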

Because a full description of RELAXed Web Services is well beyond the scope of this survey blog series, this section provided just a taste of some of the ways in which RELAXed Web Services differs from some of the other API approaches available in the decoupled Drupal ecosystem, including Drupal 8's native core REST and HAL (Hypertext Application Language).

Nonetheless, for developers looking for RESTful solutions that are more flexible than core REST and, in many cases, better suited to offline use than JSON:API, RELAXed Web Services provides a powerful alternative. For more information about RELAXed Web Services, including how to modify and delete individual documents remotely in Drupal, please consult Chapters 8 and 13 of my book Decoupled Drupal in Practice.

Extending JSON:API with Extras and Defaults

When using JSON:API, which is now part of Drupal 8 core, we often need to override the preconfigured defaults that accompany the module upon installation. Because JSON:API deliberately works out of the box as a zero-configuration module, it exposes few settings of its own. Luckily, two modules in Drupal's contributed ecosystem make this customization much easier.

The JSON:API Extras module provides interfaces to override default settings and configure new ones that the resulting API should follow in lieu of what the JSON:API module offers off the shelf. Its features include enabling and disabling individual resources, aliasing resource names and paths, disabling individual fields within entity responses, aliasing field names, and modifying field output through field enhancers.

You can install JSON:API Extras easily with Composer. JSON:API Defaults, which we cover later in this section, ships as a submodule of JSON:API Extras, so it only needs to be enabled.

    # Install JSON:API Extras.
    $ composer require drupal/jsonapi_extras
    $ drush en -y jsonapi_extras

    # Install JSON:API Defaults.
    $ drush en -y jsonapi_extras jsonapi_defaults

In the following image, you can see how we can override certain preconfigured settings in JSON:API such as disabling the resource altogether, changing the name of the resource type, and overriding the resource path that follows the /jsonapi prefix.

Add JSON:API resource

In the image below, field customization is displayed in JSON:API Extras, a feature that allows you to alias fields and perform other actions that permit you to customize the response output in a more granular way. As you can see, one of the most compelling motivations for JSON:API Extras on your own implementation is the notion of full customization of JSON:API's output not only at the resource level but at the field level as well.

Field customization

JSON:API Defaults

Formerly an entirely separate module maintained by Martin Kolar (mkolar), JSON:API Defaults allows you to set default includes and filters for resources in JSON:API. JSON:API Defaults is particularly useful when consumers prefer issuing slimmer requests without the parameters required to yield a response that includes relationships in the payload. In other words, you can issue a request without parameters and receive a response having predetermined defaults such as includes.
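To illustrate what JSON:API Defaults saves the consumer, the sketch below contrasts a request that spells out its includes with the bare request that suffices once defaults are configured on the resource. The helper and the field names are illustrative, not part of either module:

```javascript
// Build a JSON:API request URL with optional query parameters.
function jsonapiUrl(base, resource, params = {}) {
  const query = new URLSearchParams(params).toString();
  return query ? `${base}/${resource}?${query}` : `${base}/${resource}`;
}

// Without JSON:API Defaults, the consumer must spell out every include:
const verbose = jsonapiUrl('/jsonapi', 'node/article', {
  include: 'field_image,uid',
});
console.log(verbose); // "/jsonapi/node/article?include=field_image%2Cuid"

// With defaults (include: field_image,uid) stored in the resource
// configuration, the same enriched response comes back for the bare URL:
const slim = jsonapiUrl('/jsonapi', 'node/article');
console.log(slim); // "/jsonapi/node/article"
```

The slimmer request is what makes Defaults attractive to consumers that should not need to know the server's relationship graph.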

Though a full discussion of JSON:API Defaults is outside the scope of this rapid-fire survey of the lesser-known parts of the decoupled Drupal ecosystem, I highly encourage you to check out Chapter 23 in my book Decoupled Drupal in Practice, which engages in an in-depth discussion of JSON:API Extras and JSON:API Defaults.

Running Drupal remotely

Sometimes, merely interacting with Drupal content through APIs in decoupled Drupal is insufficient for the use cases and requirements our customers demand. Consumer applications often need remote access to deeper Drupal functionality, such as rebuilding caches or running cron jobs. These actions do not fit neatly into the normal API-driven approaches for Drupal entities, because they are not part of the RESTful paradigms in which Drupal generally operates out of the box.

In decoupled Drupal and other software ecosystems, remote procedure calls (RPCs) are calls that execute a procedure on another system, written as if they were local actions, without direct code written against that other system. In short, in the decoupled Drupal context, they are a convenient way for consumer applications to perform tasks remotely without their developers needing to understand the nuts and bolts of the upstream system. In the next installment of this blog series, we'll cover Drupal's RPC approach for decoupled Drupal and how you can leverage it for a variety of tasks you need in your client.
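As a taste of what such remote procedure calls look like on the wire, here is a hedged sketch of a JSON-RPC 2.0 request envelope. The helper and the method name are illustrative; consult the JSON-RPC module's documentation for the methods it actually exposes:

```javascript
// Build a JSON-RPC 2.0 request envelope of the kind a consumer would
// POST to a Drupal JSON-RPC endpoint. "cache.rebuild" is illustrative.
let nextId = 0;
function rpcRequest(method, params = {}) {
  return {
    jsonrpc: '2.0', // protocol version, required by the JSON-RPC 2.0 spec
    method,         // remote procedure to execute
    params,         // named parameters for the procedure
    id: ++nextId,   // correlates the response with this request
  };
}

// A consumer serializes the envelope and POSTs it; the response echoes
// the same id alongside a result or error member.
const body = JSON.stringify(rpcRequest('cache.rebuild'));
```

The key point is that the consumer names a procedure and parameters without knowing anything about how the upstream Drupal site implements it.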


In this blog post, we surveyed several of the major API-first solutions available for decoupled Drupal aficionados that have not received as much attention as of late, including RELAXed Web Services and supplementary modules that provide additional features like JSON:API Extras and JSON:API Defaults. Over the course of this post, we covered how to retrieve entities using RELAXed Web Services and how you can customize your JSON:API resources and fields to your heart's content.

In the following installment of this multi-part blog series, we dive into JSON-RPC, the RPC provider for decoupled Drupal, and discuss how to perform certain tasks using the JSON-RPC module. In addition, we'll cover derived schemas and API documentation, two of the most important concepts in the emerging API-first landscape that is beginning to gain significant attention in the headless CMS community.

Special thanks to Michael Meyers for his feedback during the writing process.


Photo by Stefan Steinbauer on Unsplash

May 26 2020

Now that decoupled Drupal has permeated the Drupal community, even to the furthest extremes, articles (including my own) introducing concepts and how-to tutorials describing how you can build your first decoupled Drupal architecture are ubiquitous. As a matter of fact, decoupled Drupal now also has a book on the subject as well as an annual conference dedicated to the topic. Particularly with the JSON:API module in Drupal core as of 8.7.0, decoupled Drupal out of the box has never been easier.

But despite the brilliant spotlight shining on decoupled Drupal from all corners of the CMS industry, there are lesser-known secrets and hidden treasures that reflect not only the innovative character of the Drupal community but also some true gems that can accelerate your decoupled Drupal implementation. Whether in the area of web services or the category of Drupal modules that extend those same web services, there are myriad components of the decoupled Drupal experience that you may not have heard of before.

In this multi-part blog series, we’ll delve into a few of these concepts in this companion piece to the recent session I (Preston So, Editor in Chief at Tag1 Consulting and author of Decoupled Drupal in Practice) gave entitled “Secrets of the decoupled Drupal practitioner” at DrupalCon Seattle in April. We’ll first venture through a rapid reintroduction to decoupled Drupal before moving progressively up the stack, starting with web services and ending with some of the consumer tooling available to help you maintain high velocity.

A quick introduction to decoupled Drupal

In short, monolithic Drupal consists of a contiguous Drupal architecture that cannot be separated into distinct services. In other words, the default Drupal front end is inextricable from the larger Drupal monolith because of all the linkages that require the front end to remain coupled to the back end, including data references in the theme layer and other tools like the Form API, which allows for the rendering of forms in the Drupal presentation layer according to certain back-end logic.

Defining decoupled Drupal

The simplest definition of decoupled Drupal is also one that adheres to the larger definition of decoupled CMS (and an exhaustive definition is also available in my book Decoupled Drupal in Practice). In short, a decoupled CMS is a content or data service that exposes data for consumption by other applications, whatever these applications are built in, including native mobile, native desktop, and single-page applications. Whereas a single Drupal site could serve a single consumer, in today’s landscape, many practitioners are implementing a single Drupal site that simultaneously acts as a repository for a wide variety of consumers.

The original iteration of decoupled Drupal came in the mid-2010s with the advent of progressively decoupled Drupal. In this paradigm, rather than separating out the front end into a separate implementation, a JavaScript framework could be interpolated into the existing Drupal front end and have access not only to ES2015 capabilities but also certain data Drupal makes available to its presentation layer.

Universal JavaScript

With the proliferation of Node.js and the enablement of server-side JavaScript, Drupal began to be relegated more to concerns surrounding API provisioning and structured content management, while JavaScript application libraries and frameworks like React and Vue.js could take over for all rendering concerns, not only on the client side but also on the server (for both progressive enhancement and search engine optimization purposes).

A typical architecture that implements decoupled Drupal in conjunction with a universal JavaScript application (shared JavaScript code for rendering across both client and server) would facilitate the following interactions: During the initial server-side render executed by Node.js, the application fetches all data synchronously from Drupal to flesh out the render that will be flushed to the browser. Then, when the client-side bundle of the application initializes, the initial render is rehydrated with further asynchronous client-side renders that retrieve updated data from Drupal as needed.
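The flow above can be sketched in a few lines of illustrative JavaScript; the component and data names are hypothetical, not part of any framework or Drupal API:

```javascript
// One component function shared by server and client: this is the
// essence of universal (isomorphic) JavaScript rendering.
const articleTeaser = ({ title, body }) =>
  `<article><h2>${title}</h2><p>${body}</p></article>`;

// 1. Server-side render: data fetched synchronously from Drupal and
//    flushed to the browser as the initial HTML response.
const initialData = { title: 'Hello', body: 'Rendered on the server.' };
const serverHtml = articleTeaser(initialData);

// 2. Client-side rehydration: the same component re-renders with
//    fresher data retrieved asynchronously from Drupal as needed.
const updatedData = { ...initialData, body: 'Refreshed on the client.' };
const clientHtml = articleTeaser(updatedData);
```

Because both renders run the identical component, the markup stays consistent across server and client without duplicated templates.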

There are many risks and rewards involved in implementing a decoupled architecture of this nature, especially in terms of architecture, developer experience, security and performance, and project management. For more information about these advantages and disadvantages as well as more detailed background on decoupled Drupal, consult my new book Decoupled Drupal in Practice (Apress, 2018).

An alternative API: RELAXed Web Services

While JSON:API and GraphQL have seemingly received all the airtime when it comes to web services available in Drupal, there is another web service implementation that not only adheres to a commonly understood specification, like JSON:API and GraphQL, but also enables a variety of new functionality related to content staging and offline-enabled website features. RELAXed Web Services, a module maintained by Tim Millwood and Andrei Jechiu, implements the Apache CouchDB specification and is part of the Drupal Deploy ecosystem, which provides modules that allow for rich content staging.

An implementation of CouchDB stores data within JSON documents (or resources) exposed through a RESTful API. And unlike Drupal’s own core REST API, now mostly superseded by the availability of JSON:API in core as of Drupal 8.7, CouchDB implementations accept not only the typical HTTP methods of GET, POST, and DELETE, but also PUT and COPY.
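One practical consequence of the CouchDB specification is that updating a document via PUT requires sending its current revision (_rev) so the server can detect conflicting writes. A minimal sketch, with an illustrative helper and field values:

```javascript
// Per the CouchDB specification that RELAXed Web Services implements,
// a document update must carry the current _rev; a stale revision
// yields a 409 Conflict from the server.
function updatePayload(doc, changes) {
  if (!doc._rev) throw new Error('An update must carry the current _rev');
  return { ...doc, ...changes }; // _id and _rev are preserved
}

const stored = {
  _id: 'b6cea743-ba86-49b0-81ac-03ec728f91c4',
  _rev: '1-abc',
  title: 'Old title',
};
const body = updatePayload(stored, { title: 'New title' });
// A consumer would PUT this body to /relaxed/live/<_id>; a 409 response
// means someone else updated the document since _rev was fetched.
```

This revision handshake is precisely what makes CouchDB-style replication and content staging possible.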


RELAXed Web Services occupies a relatively unique place in the Drupal web services ecosystem. The diagram above, which is not exhaustive, illustrates some of the ways in which Drupal’s major web services modules interact. Some depend on only the Serialization module, such as Drupal’s JSON:API implementation (prior to its entry into Drupal core), while others such as GraphQL rely on nothing at all. RELAXed Web Services relies on both REST and Serialization in order to provide its responses.

Thus, we can consider RELAXed Web Services part of the RESTful API segment of Drupal’s web services. The above Euler diagram illustrates how GraphQL, because it does not adhere to REST principles, remains uniquely distinct from other modules such as core REST, JSON:API, and RELAXed Web Services. While all RESTful APIs are web services, not all web services are RESTful APIs.

Installing and configuring RELAXed Web Services

To install RELAXed Web Services, you’ll need to use Composer to install both the relaxedws/replicator dependency and the module itself:

    $ composer require relaxedws/replicator:dev-master
    $ composer require drupal/relaxed
    $ drush en -y relaxed

Fortunately, RELAXed Web Services does not require you to use its content staging capabilities, but if you wish to do so, you will need to configure the Replicator user and install the separate Workspaces module. Without the Workspaces module enabled, the default workspace in RELAXed Web Services is live, and we will see in the next installment of this blog series why that name is so important.

The screenshot below displays the RELAXed Web Services settings page, where you can configure information such as the Replicator user and customize an API root if you wish to prefix references to your resources with something different.


While covering the full range of RELAXed Web Services' capabilities is beyond the scope of this first installment, I strongly encourage you to take a look at what is available with the help of the Apache CouchDB specification, as some of the use cases that this approach can enable are unprecedented when it comes to the future of user experiences leveraging decoupled Drupal.


In this blog post, we embarked on a rapid-fire reintroduction to decoupled Drupal for those unfamiliar with the topic as well as a deep dive into one of the most fascinating and seldom discovered modules utilized in the decoupled Drupal space, RELAXed Web Services, which implements the Apache CouchDB specification. In the process, we covered how to install the module before turning to how to use RELAXed Web Services to implement a variety of data requirements in decoupled Drupal architectures in the next installment.

In the next installment in this multi-part blog series, we'll cover how to employ RELAXed Web Services for common needs in decoupled Drupal architectures and some of the intriguing ways in which the module and its surrounding ecosystem enable not only content staging use cases but also offline-enabled features that satisfy the widening demands that many clients working with decoupled Drupal today have on a regular basis.

Special thanks to Michael Meyers for his feedback during the writing process.

Photo by Michael Dziedzic on Unsplash

May 18 2020

Of all the discussions in the Drupal community, few have generated such a proliferation of blog posts and conference sessions as decoupled Drupal, which is also the subject of a 2019 book and an annual New York conference—and has its share of risks and rewards. But one of the most pressing concerns for Drupal is how we can ensure a future for our open-source content management system (CMS) that doesn't relegate it to the status of a replaceable content repository. In short, we have to reinvent Drupal to provide not only the optimal back-end experience for developers, but also a front end that ensures Drupal's continued longevity for years to come.

A few months ago, Fabian Franz (Senior Technical Architect and Performance Lead at Tag1 Consulting) offered up an inspirational session that presents a potential vision for Drupal's front-end future that includes Web Components and reactivity in the mix. In Fabian's perspective, by adopting some of the key ideas that have made popular JavaScript frameworks famous among front-end developers, we can ensure Drupal's survival for years to come.

In this multi-part blog series that covers Fabian's session in detail from start to finish, we summarize some of the key ideas that could promise an exciting vision not only for the front-end developer experience of Drupal but also for the user experience all Drupal developers have to offer their customers. In this fifth installment in the series, we continue our analysis of some of the previous solutions we examined and consider some of the newfangled approaches made possible by this evolution in Drupal.

The "unicorn dream"

Before we get started, I strongly recommend referring back to the first, second, third, and fourth installments of this blog series if you have not already. They cover essential background information and insight into all of the key components that constitute the vision that Fabian describes. Key concepts to understand include Drupal's render pipeline, virtual DOMs in React and Vue, the growing Twig ecosystem, universal data stores, and how reactivity can be enabled in Drupal.

One of the final questions Fabian asks in his presentation is about the promise unleashed by the completion of work to enable shared rendering in Drupal, as well as reactivity and offline-enabled functionality. During his talk, Fabian recalls a discussion he had at DrupalCon Los Angeles with community members about what he calls the unicorn dream: an as-yet unfulfilled vision to enable the implementation of a Drupal site front end with nothing more than a single index.html file.


Fabian argues that the component-driven approach we have outlined in this blog series makes this unicorn dream achievable thanks to slots in Web Components. Because React, Vue, and Twig all support slots in their approaches to componentization, the dream is more attainable than ever before. Front-end developers can insert repeatable blocks with little overhead while still benefiting from configuration that editors set, which affects rendered output without their touching a single line of code. Developers can then extend such a block rather than overriding it.

Consider, for instance, the following example that illustrates leveraging an attribute to override the title of a block:

    <sidebar type="left">
      <block slot="header" id="views:recent_content">
        <h2 slot="title">I override the title</h2>
      </block>
    </sidebar>

When Fabian attempted to do this with pure Twig, he acknowledges, the complexity quickly became prohibitive, and the prototype never reached core-readiness. Thanks to the Web Components slots approach, however, one could create plugins for modern editors that simply use and configure custom elements. Among the editors that could support this hypothetical scenario are heavyweights like CKEditor 5, ProseMirror (which Tag1 recently selected as part of a recent evaluation of rich-text editors), and Quill.

Developer experience improvements

This means that we as developers no longer need to convert tokens for display through a variety of complex approaches. Instead, we can simply render HTML and directly output the configured component; Drupal will handle the rest:

    <drupal-image id="123" />

Moreover, leveraging BigPipe placeholders with default content finally becomes simple thanks to this approach, argues Fabian. We can simply place default content within the component, and once the content arrives, it becomes available for use:

    <block id="views:recent_content" drupal-placeholder="bigpipe">
      I am some default content!
    </block>

In this way, we can take advantage of our existing work implementing BigPipe in Drupal 8 rather than resorting to other JavaScript to resolve this problem for us.

Performance improvements

Finally, some of the most important advancements we can make come in the area of performance. For front-end developers who need to serve the ever-heightening demands of customers needing the most interactive and reactive user experience possible, performance is perennially a paramount consideration. When using a universal data store, performance can be improved drastically, particularly when the store is utilized for as many data requirements as possible.

We can simply update the real-time key-value store, even if it happens to live solely on Drupal. As Fabian argues, a data-driven mindset makes the problem of shared rendering and componentization in Drupal's front end much simpler to confront. Developers, he contends, can export both the data and the template to a service such as Amazon S3 and then load the component on an entirely different microsite, yielding benefits not only for a single site but for a collection of sites all relying on the same unified component, such as <my-company-nav />.

Such an approach would mean that this company-wide navigation component would always be active on all sites requiring that component, simplifying the codebase across a variety of disparate technologies.
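A universal key-value store of this kind can be sketched minimally as follows; this is a hypothetical API for illustration, not an existing Drupal module or JavaScript package:

```javascript
// Minimal publish/subscribe key-value store: components subscribe to
// keys, and a single update (e.g. pushed from Drupal) notifies every
// consumer, whichever site or framework renders it.
function createStore() {
  const data = new Map();
  const subscribers = new Map();
  return {
    subscribe(key, fn) {
      if (!subscribers.has(key)) subscribers.set(key, []);
      subscribers.get(key).push(fn);
    },
    set(key, value) {
      data.set(key, value);
      (subscribers.get(key) || []).forEach((fn) => fn(value));
    },
    get: (key) => data.get(key),
  };
}

const store = createStore();
store.subscribe('my-company-nav', (items) =>
  console.log(`re-render nav with ${items.length} items`)
);
store.set('my-company-nav', ['Home', 'Blog', 'Contact']);
// logs "re-render nav with 3 items"
```

In a real deployment, the `set` call would be driven by Drupal's data, and each subscribed component would re-render against the new value.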

Editorial experience improvements

Nonetheless, perhaps some of the most intriguing benefits come from improvements to the editorial experience and advancements in what becomes possible despite the separation of concerns. One of the chief complaints about decoupled Drupal architectures, and indeed one of their most formidable disadvantages, is the loss of crucial in-context functionality that editors rely on daily, such as contextual links and in-place editing.

With Fabian's approach, the formerly impossible-seeming dream of achieving contextual administrative interfaces within a decoupled Drupal front end becomes not only possible but realistic. We can keep key components of Drupal's contextual user interface, such as contextual links, as part of the data tree rather than admitting to our customers that such functionality must vanish in a scenario enabling greater reactivity and interactivity for users.

After all, one of the key critiques of decoupled Drupal and JavaScript approaches paired with Drupal, as I cover in my book Decoupled Drupal in Practice, is the lack of support for contextual interfaces and live preview, though I've presented on how Gatsby can mitigate some of these issues. Not only does this solution allow for contextual interfaces like contextual links to remain intact; it also means that solutions like progressive decoupling also become much more feasible.

Moreover, one of the key benefits of Fabian's approach is Drupal's capacity to remain agnostic to front-end technologies, which guarantees that Drupal is never coupled to a framework that could become obsolete in only a few years, without having to reinvent the wheel or create a Drupal-native JavaScript framework. And one of the key defenses of Fabian's vision is this rousing notion: We can potentially enable single-page applications with Drupal without having to write a single line of JavaScript.

Outstanding questions

Despite the rousing finish to Fabian's session, pertinent questions and concerns about the viability of his approach remain, raised during the Q&A that followed the session. One member of the audience cited the large number of examples written in Vue and asked whether other front-end technologies could truly be used to implement the pattern Fabian prescribes. Fabian responded that some work will be necessary to implement this in each framework's own virtual DOM, but the approach is generally possible as long as a customizable render() function is available.

Another member of the audience asked how Drupal core needs to evolve in order to enable the sort of future Fabian describes. Fabian answered by recommending that more areas in Drupal responsible for rendering should be converted to lazy builders. This is because once no dependencies in the render tree are present, conversion to a component tree would be much simpler. Fabian also cited the need for a hook that would traverse the DOM to discover custom components after each rendering of the Twig template. Thus, the main difference would be writing HTML in lieu of a declaration in Twig such as {% include menu-item %}.


In this fifth and final installment of our multi-part blog series about a visionary future for Drupal's front end, we examined Fabian's rousing DrupalCon Amsterdam session to discuss some of the benefits that reactivity and offline-first approaches could have in Drupal, as well as a framework-agnostic front-end vision for components that potentially extends Drupal's longevity for many years to come. For more information about these concepts, please watch Fabian's talk and follow our in-house Tag1 Team Talks for discussion about this fascinating subject.

Special thanks to Fabian Franz and Michael Meyers for their feedback during the writing process.

Photo by Stephen Leonardi on Unsplash

May 13 2020

What is the day-to-day life of a Drupal core committer like? Besides squashing bugs and shepherding the Drupal project, the maintainers responsible for Drupal core are also constantly thinking of ways to improve the developer experience and upgrade process for novice and veteran Drupal users alike. With Drupal 9 coming just around the corner, and with no extended support planned for Drupal 8 thanks to a more seamless transition to the next major release, Drupal's core developers are hard at work building tools, approving patches, and readying Drupal 9 for its day in the spotlight. But Drupal 9 isn't the only version that requires upkeep and support—other members of the Drupal core team also ensure the continued longevity of earlier versions of Drupal like Drupal 7 as well.

The impending release of Drupal 9 has many developers scrambling to prepare their Drupal implementations and many module maintainers working hard to ensure their contributed plugins are Drupal 9-ready. Thanks to Gábor Hojtsy's offer of #DrupalCares contributions in return for Drupal 9-ready modules, there has been a dizzying acceleration in the growth of modules available as soon as Drupal 9 lands. In addition, the new Rector module allows for Drupal contributors to have access to a low-level assessment of what needs to change in their code to be fully equipped for the Drupal 9 launch.

In this inaugural episode of Core Confidential, the insider guide to Drupal core development and Tag1's new series, we dive into the day-to-day life of a core committer and what you need to know about Drupal 9 readiness with the help of Fabian Franz (VP of Software Engineering at Tag1), Michael Meyers (Managing Director at Tag1), and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice). Learn more about how Drupal's core team continues to support the Drupal project as it gets ready for the latest and greatest in Drupal 9, due to be released this summer for eager CMS practitioners worldwide.



May 11 2020

Drupal is one of the largest and most active open-source software projects in the world. Behind the scenes is the Drupal Association, the non-profit organization responsible for enabling it to thrive by architecting and introducing new tooling and infrastructure to support the needs of the community and ecosystem. Many of us know the Drupal Association as the primary organizer of the global DrupalCon conference twice a year. But it's less common knowledge that the Drupal Association is actively engaged in Drupal development and maintains some of the most important elements of the Drupal project. This runs across the spectrum of software localizations, version updates, security advisories, dependency metadata, and other "cloud services" like the Drupal CI system that empower developers to keep building on Drupal.

With the ongoing coronavirus pandemic, the Drupal Association is in dire financial straits due to losses sustained from DrupalCon North America (one of the largest sources of funding) having to be held as a virtual event this year. As part of the #DrupalCares campaign, we at Tag1 Consulting implore organizations that use Drupal, companies that provide Drupal services, and even individuals who make their living off Drupal development to contribute in some shape or form to the Drupal Association in this time of need.

We are putting our money where our mouth is. For years we have donated at least eighty hours a month to support the DA and Drupal.org infrastructure and tooling. I'm proud to announce that we are expanding this commitment by 50%, to 120 hours a month of pro-bono work from our most senior resources, to help the DA offset some of its operating expenses. Furthermore, we contributed to help #DrupalCares reach its $100,000 matching goal, so that any donation you make is doubled in value.

To gain insights into building software communities at scale in open source, Michael Meyers (Managing Director at Tag1) and I (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) recently kicked off a Tag1 Team Talks miniseries with the Drupal Association's engineering team, represented by Tim Lehnen (Chief Technology Officer at the Drupal Association) and Narayan Newton (Chief Technology Officer at Tag1), to examine all the ways in which the DA keeps the Drupal community ticking.

Why Tag1 supports the Drupal Association

Here at Tag1, we work with a diverse range of technologies, but Drupal has been our passion for many years. It's been a critical part of our business since Tag1's inception, and we're grateful to the Drupal Association for sustaining such an essential part of our work today. By no means is it an understatement to characterize the Drupal Association as the lifeblood of the Drupal ecosystem. Because of our appreciation for what Drupal has given us, we're committed to doing our part to give back to Drupal, not only over the course of our many years working in concert with the Drupal Association but also right now during the #DrupalCares campaign.

How we contribute to Drupal

Though Tag1 is well-known for being the all-time number-two contributor to the Drupal project, with the largest concentration of core committers, branch managers, release managers, and core maintainers of any organization in the community, we're much less known for how we support the underlying foundations of the ecosystem. Beyond the more visible contributions of staff members like Moshe Weitzman, Nathaniel Catchpole (catch), Francesco Placella (plach), and Fabian Franz (fabianx), we also do much more than add our support to Drupal core development. After all, supporting Drupal requires more than just code; it also requires the tooling and infrastructure that keep the project's blood flowing.

During our Tag1 Team Talks episode with the Drupal Association, Tim Lehnen eloquently made the case for the non-profit that has driven Drupal's success for so many years: While the software makes up the bulk of open-source contributions, offering surrounding services that buttress the software's core is another key function that the Drupal Association performs. To that end, for many years, Tag1 has donated 80 hours of pro-bono work a month to ensure that Drupal.org and all the tooling the community relies on stays up and running. Tag1 is honored to increase our monthly contribution of pro-bono hours to the Drupal Association by 50% from 80 to 120 hours of expert work from our most senior resources. And now with our increased work hours and financial contributions, critical projects like the migration to GitLab can continue to move forward, even during a situation like the current pandemic.

Supporting Drupal's test infrastructure

In Drupal, a key aspect of code contribution is running tests that verify a patch will work against a massive variety of environments, be compatible with a spectrum of versions of Drupal, and not introduce any functional regressions in the code. One of the key questions many community members ask is why Drupal maintains its own testing infrastructure in lieu of a service such as TravisCI.

Unfortunately, whenever existing continuous integration solutions were tasked with running a Drupal core test for every Drupal patch, they would consistently time out, maxing out available resources. To solve the challenges associated with developing and testing at scale, the DA partnered with Tag1. We deployed our expertise in infrastructure, mission-critical application development, and performance and scalability to help run and maintain Drupal.org's servers and the DrupalCI test runner system. The CI system ensures that contributors have a reliable center for collaboration and a dependable test infrastructure for all of their patches and modules. Tag1's deep expertise has been critical to the success of the DrupalCI system, which we scaled dynamically to the extent that it is now concurrently running more than an entire decade's worth of testing in a single year.

The new testing infrastructure was an enormous undertaking for the Drupal Association due to its complexity. Narayan Newton opted early on to leverage standard Unix tools to build out the environments for testing targets. And rather than use Kubernetes for the orchestration of tests, the Drupal Association opted to use Jenkins and the EC2 Fleet plugin for DrupalCI. Jenkins manages the orchestration of virtual machines (VMs) and initializes them as test targets before running the tests themselves in a clean-room environment. As Narayan notes during our conversation, one of the most fascinating quirks of Drupal's infrastructure is that many of its core elements were installed before standardized tooling emerged to handle those use cases in a regimented way.

Supporting Drupal's migration to GitLab

In addition to our contributions to Drupal's underlying infrastructure, Tag1 also assists with key initiatives run by the Drupal Association, such as the ongoing migration from Drupal's homegrown Git system to GitLab, a source control provider. According to Narayan, the move to GitLab has been much more straightforward than earlier migrations, most notably the original switch from CVS, Drupal's previous source control system, to Git, which the project has used ever since. Code management in Drupal has long employed a bespoke approach, with a homegrown Git daemon written by the community and cgit as the web-based front end for Git repositories.

One of the key benefits GitLab provides to the Drupal Association is the fact that the DA is no longer responsible for building and supporting a source control system for Drupal at the scale at which it operates. After all, GitLab has a dedicated site reliability engineering (SRE) team focused on ensuring source availability even at high loads. And as Narayan notes, GitLab has been responsive to security issues, in addition to facilitating "one of the smoothest migrations I've been a part of." But this doesn't mean there weren't complications.

Because GitLab offers a superset of features that overlaps with existing Drupal.org functionality, the Drupal Association, supported by Tag1, worked closely with the GitLab team to ensure that certain features could be disabled for the Drupal project, avoiding many of the issues that have plagued the GitHub mirror of Drupal since its inception. Narayan contributed key features to ensure that GitLab's integration points could be toggled on and off to meet the unique needs and requirements of the Drupal community and ecosystem.

Tim adds that, in terms of downtime and disruption, forklifting the entire Git code management infrastructure without disrupting the development community was a rousing success, especially given that it had no impact on a minor version release. In the process, the Drupal community has gained a number of key features that will accelerate development and enable ever-richer conversation between contributors. In the coming months, the Drupal Association will also enable GitLab's merge requests feature, which will introduce yet more efficiencies for those making code contributions.

Why #DrupalCares is so important

For us, Drupal is a key reason we exist, and the Drupal Association has done wonders to ensure the longevity of an open-source software project we hold dear. That is why, in these troubling times for the Drupal Association, it could not be more important to uphold the ideals of open source and ensure the survival of our beloved community and ecosystem. Over the course of the past month, we've witnessed an incredible outpouring of support from all corners of the community, amplified by matching funds from community members, none other than project lead Dries Buytaert among them. We at Tag1 Consulting have contributed toward #DrupalCares' $100,000 goal in order to multiply the impact of community donations and reinforce our existing support.

Without your support, whether as a company or an individual, we may never see another DrupalCon grace our stages or celebrate yet another major version release that introduces innovative features to the Drupal milieu. And it's not just about the more visible elements of the Drupal experience like DrupalCon. It's also about the invisible yet essential work the Drupal Association does to keep the Drupal project rolling along. Thanks to the innumerable contributions the Drupal Association has made to maintain DrupalCI, the GitLab migration, Composer Façade, and a host of other improvements to Drupal's infrastructure and tooling, with the support of Tag1, the Drupal project remains one of the most impressive open-source projects in our industry.


Here at Tag1, we believe in the enduring value of open source and its ability to enrich our day-to-day lives in addition to the way we do business. We're dedicated to deepening our already extensive support for the Drupal Association in ways both financial and technological. And now it's your turn to return the favor. If you're an individual community member, we strongly encourage you to start or renew a membership. If you're an organization or company in the Drupal space, we encourage you to contribute what you can to ensure the continued success of Drupal. Together, we can keep Drupal alive for a new era of contribution and community.

Special thanks to Jeremy Andrews and Michael Meyers for their feedback during the writing process.

Photo by Jon Tyson on Unsplash

May 06 2020

In recent years, it seems as if open source has taken the software world by storm. Nonetheless, many enterprise organizations remain hesitant to adopt open-source technologies, whether due to vendor lock-in or a preference for proprietary solutions. But open source can in fact yield substantial fruit when it comes to advancing your business in today’s highly competitive landscape: by leveraging open-source software and contributing back to it, you can turn open source into a competitive advantage.

A few years back, Michael Meyers (Managing Director at Tag1 Consulting) presented a keynote at Texas Camp 2016 about the individual and business benefits of open source. As part of that talk, he highlighted some of the best motivations for open-source adoption and the outsized benefits that open source delivers to not only individual developers but also businesses that are seeking to get ahead in recruiting, sales, and other areas. In this two-part blog series (read the first part), we analyze the positive effects of open source on everyone from individual developers to the biggest enterprises in the world, all of whom are benefitting from their adoption of open-source software.

In this second installment, we dive into some of the ways in which open-source technologies like Drupal can improve your bottom line, with the help of a hypothetical tale of two companies and real-world case studies demonstrating that open source presents far more rewards than risks in the enterprise context.

A tale of two enterprises

As I wrote in the previous installment in this two-part series, individuals who participate in and contribute to Drupal garner immense benefits from open-source communities. Organizations can capture these benefits for themselves as well by encouraging their employees to attend open-source conferences and grow their expertise and knowledge.

Let’s consider a hypothetical scenario in which two enterprise organizations are attempting to outcompete others in their space. The first protagonist of our vignette is DIY Corporation (whose slogan is “reinventing the wheel since forever”), headquartered in the silos next to a nearby waterfall. The other main character is Collab Incorporated, which focuses on working with others.

Writing custom code vs. leveraging open source

In this hypothetical scenario, DIY Corporation downloads Drupal, one of the most widely used open-source content management systems (CMS) in the world. However, it soon discovers that it needs to extend existing functionality to solve problems unique to its business requirements. DIY Corporation chooses to write its own code rather than leveraging others’, a common occurrence among organizations unaccustomed to open-source software. Writing custom code makes perfect sense at first, as it resolves the business need, but the challenge arises when developers leave and additional support is required. When DIY Corporation gets stuck, it has no one to turn to, because its code sits in a private repository.

Meanwhile, Collab Inc. first checks whether a solution has already been contributed to the open-source ecosystem in the form of a Drupal module or experimental sandbox project. The key distinction is that only if no solution is already available does Collab Inc. decide to write its own—and it chooses to do so in public rather than in a silo. Too often, Drupal companies download software and write code in isolation rather than contributing that code back. If every organization opts to do this, we negate the value of the open-source community in the first place.

A real-world example: Fivestar module

The key lesson from this hypothetical scenario is that sharing code from the beginning translates into better results for everyone across the board. By being open to contributions and ideas from others, we can resolve shared problems when we hit a wall. After all, other organizations will have a vested interest in your contributed code, because they are dependent on it and appreciative of the outcomes they have been able to achieve as a result.

A real-world example of this situation is Drupal’s Fivestar module, which does exactly what its name suggests. Originally developed by Lullabot for Sony BMG, which needed to provide ratings on pages associated with the label’s musicians, it quickly found ubiquity across a variety of businesses leveraging Drupal. After the Fivestar module was released, Warner Music Group also contributed to the module’s codebase by upgrading it from Drupal 5 to Drupal 6. This illustrates an increasingly rare scenario in the hyper-competitive music landscape: two competitors helping each other for better results all around.

Thanks to Warner Music Group’s contributions, when Sony BMG eventually needed to update all of its artist websites to Drupal 6, it simply employed the existing Drupal 6 module. Through this strategic alliance, Sony BMG and Warner Music Group recognized that, even as direct competitors, they competed in music, not in technology, and worked together to realize mutual benefits. In the end, technology is a commodity, and every dime spent on duplicate code works against each organization’s best interest. Their five-star rating systems are not a differentiator; instead of building separate code in that arena, they can focus on creating good music.

Recruiting talent in open source

Consider another scenario. Organizations are always looking to recruit the best talent, particularly in the Drupal ecosystem. Our hypothetical DIY Corporation posts its opening to a job board and wonders why its recruiting pipeline is running dry, especially since it is not a well-known household name. Because DIY Corporation has not focused on recruiting open-source developers within an open-source community, it has not attracted the interest it desires.

This brings us to a crucial point: locking up your code is a guaranteed disincentive for developers to work for you. Developers who wish to grow their careers gravitate toward employers who have a vested interest in that growth as well. If an employer does not grant them opportunities to engage with open-source communities, those developers will look elsewhere. Thanks to open source, organizations can develop a bench of prospective hires for the future, thus expanding their recruiting pipelines.

The competitive advantage of open source

The benefits and advantages conferred by Drupal cannot be overstated. While the most overt benefit is Drupal’s cost-effectiveness, the subtler—and perhaps more substantial—benefit is that you can participate in a global community with common methodologies and best practices that expand your sphere of knowledge and influence. Open source has proven time and time again to be better, faster, and cheaper.

For agencies interested in getting involved in open source, there are huge opportunities. For instance, if a customer is looking to hire a consultancy to solve a particular problem, they have a clear choice between an agency that simply uses Drupal and one that actively and meaningfully contributes to the Drupal community.

Agencies can gain a significant advantage by contributing to open source. Granted, contributing to open source as a small agency can be difficult, and bench time can often be limited for developers not actively working on projects. However, organizations that do get involved and publicize their open-source contributions tend to get meaningfully more business as a result. For instance, prominent companies in the Drupal landscape such as Amazee Labs, Phase2, Chapter Three, and others with full-time Drupal contributors often have customers reaching out directly precisely because of their commitment to open source.


Getting involved in open source can yield substantial dividends for those who engage in it. Though there are thousands upon thousands of open-source projects in the wild, Drupal in particular has a highly developed ecosystem for organizational contribution, including user groups and Drupal conferences around the world that are looking for sponsors interested in supporting open source. As a case in point, I organize a non-profit open-source conference in New York City called Decoupled Days, about decoupled Drupal (also the subject of my book), and we’re currently looking for more sponsors!

For businesses interested in contributing to open source, there are also business summits and events, such as Drupal Business Days, that can help you connect with other organizations exploring open-source software like Drupal. And there’s no need to be a developer to contribute to open source. In fact, among the most critical needs the Drupal community perpetually has are marketing and event support. That brings us to perhaps the most important message of open-source contributions: You, too, can contribute.

Special thanks to Michael Meyers for his feedback during the writing process.

Photo by Vincent van Zalinge on Unsplash

May 05 2020

Drupal is one of the largest and most active open-source projects in the world, and the Drupal Association is responsible for enabling it to thrive by creating and maintaining tooling and other projects that keep Drupal humming. Though many in the Drupal community and outside it see the Drupal Association only as the organizer of the global DrupalCon conferences each year, the Drupal Association is, in fact, responsible for some of the most critical elements that sustain Drupal as a software product, including localizations, updates, security advisories, metadata, and infrastructure. All of the "cloud services" that we work with on a daily basis in the Drupal ecosystem represent fundamental functions of the Drupal Association.

In recent years, the Drupal Association has launched several features that reinvent the way developers interact with Drupal as a software system, including DrupalCI (Drupal's test infrastructure), Composer Façade (to support Drupal's adoption of Composer), and Drupal's ongoing migration to GitLab for enhanced source control. For many years, Tag1 Consulting has supported and contributed to the Drupal Association not only as a key partner in visible initiatives but also in the lesser-known aspects of the Drupal Association's work that keep Drupal.org and the ecosystem running. Though we've long provided 80 free hours of work a month to the Drupal Association, we're proud to announce we are expanding this commitment by 50% to 120 pro-bono hours per month (75% of an FTE). We have also made a donation toward #DrupalCares' $100,000 goal.
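As an aside, the Composer Façade mentioned above is what lets a Drupal site consume contributed projects as ordinary Composer packages. A minimal sketch of wiring it up follows; the endpoint is the publicly documented packages.drupal.org facade, while the repository alias and the module chosen are just illustrative examples:

```shell
# Register the Drupal.org Composer facade as a package repository
# ("drupal" here is an arbitrary repository alias).
composer config repositories.drupal composer https://packages.drupal.org/8

# Contributed projects then resolve like any other Composer package.
composer require drupal/pathauto
```

This is the kind of invisible plumbing the Drupal Association maintains: without the facade, every site builder would need bespoke tooling to map Drupal.org projects onto Composer's package model.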

In this special edition of the Tag1 Team Talks show, we introduce a miniseries with the engineering team at the Drupal Association, including Tim Lehnen (Chief Technology Officer, Drupal Association) and Narayan Newton (Chief Technology Officer, Tag1), along with Michael Meyers (Managing Director, Tag1) and Preston So (Editor in Chief at Tag1 and Senior Director, Product Strategy at Oracle). In this first installment, we dive into some of the mission-critical work the Drupal Association performs for the Drupal community with the support of Tag1 and other organizations, and how that work represents the lifeblood of the Drupal project and its continued longevity.

