Aug 19 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Low-code and no-code tools for the web are on a decade-long rise; they enable self-service for marketers, and allow developers to focus on innovation.

A version of this article was originally published on Devops.com.

Twelve years ago, I wrote a post called Drupal and Eliminating Middlemen. For years, it was one of the most-read pieces on my blog. Later, I followed that up with a blog post called The Assembled Web, which remains one of the most read posts to date.

The point of both blog posts was the same: I believed that the web would move toward a model where non-technical users could assemble their own sites with little to no coding experience of their own.

This idea isn't new; no-code and low-code tools on the web have been on a 25-year-long rise, starting with the first web content management systems in the early 1990s. Since then, no-code and low-code solutions have had an increasing impact on the web. Examples include:

While this has been a long-run trend, I believe we're only at the beginning.

Trends driving the low-code and no-code movements

According to the Forrester Wave: Low-Code Development Platforms for AD&D Professionals, Q1 2019 report, "In our survey of global developers, 23% reported using low-code platforms in 2018, and another 22% planned to do so within a year."

Major market forces driving this trend include a talent shortage among developers, with an estimated one million computer programming jobs expected to remain unfilled by 2020 in the United States alone.

What is more, the developers who are employed are often overloaded with work and struggle with how to prioritize it all. Some of this burden could be removed by low-code and no-code tools.

In addition, the fact that technology has permeated every aspect of our lives — from our smartphones to our smart homes — has driven a desire for more people to become creators. As the founder of Product Hunt, Ryan Hoover, said in a blog post: "As creating things on the internet becomes more accessible, more people will become makers."

But this does not only apply to individuals. Consider this: the typical large organization has to build and maintain hundreds of websites. They need to build, launch and customize these sites in days or weeks, not months. Today and in the future, marketers can embrace no-code and low-code tools to rapidly develop websites.

Abstraction drives innovation

As discussed in my middleman blog post, developers won't go away. Just as the role of the original webmaster (FTP hand-written HTML files, anyone?) has evolved with the advent of web content management systems, the role of web developers is changing with the rise of low-code and no-code tools.

Successful no-code approaches abstract away complexity for web development. This enables less technical people to do things that previously could only be done by developers. And when those abstractions happen, developers often move on to the next area of innovation.

When everyone is a builder, more good things will happen on the web. I was excited about this trend more than 12 years ago, and remain excited today. I'm eager to see the progress no-code and low-code solutions will bring to the web in the next decade.

Aug 13 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

A special bird flying in the spotlight while lots of identical birds sit on the ground (a metaphor for the lack of diversity).

At DrupalCon Seattle, I spoke about some of the challenges Open Source communities like Drupal often have with increasing contributor diversity. We want our contributor base to look like everyone in the world who uses Drupal's technology on the internet, and unfortunately, that is not quite the reality today.

One way to step up is to help more people from underrepresented groups speak at Drupal conferences and workshops. Seeing and hearing from a more diverse group of people can inspire new contributors from all races, ethnicities, gender identities, geographies, religious groups, and more.

To help with this effort, the Drupal Diversity and Inclusion group is hosting a speaker diversity training workshop on September 21 and 28 with Jill Binder, whose expertise has also driven major speaker diversity improvements within the WordPress community.

I'd encourage you to either sign up for this session yourself or send the information to someone in a marginalized group who has knowledge to share, but may be hesitant to speak up. Helping someone see that their expertise is valuable is the kind of support we need in order to drive meaningful change.

Aug 01 2019

For the sixth year in a row, Acquia has been recognized as a leader in the Gartner Magic Quadrant for Web Content Management.

For the sixth year in a row, Acquia has been recognized as a leader in the Gartner Magic Quadrant for Web Content Management. Acquia first entered the Web Content Management Magic Quadrant back in 2012 as a Visionary, and since then we've moved further than any other vendor to cement our leadership position.

As I've written before, analyst reports like the Gartner Magic Quadrant are important because they introduce organizations to Acquia and Drupal. As I've put it before: "If you want to find a good coffee place, you use Yelp. If you want to find a nice hotel in New York, you use TripAdvisor. Similarly, if a CIO or CMO wants to spend $250,000 or more on enterprise software, they often consult an analyst firm like Gartner."

In 2012, Gartner didn't fully understand the benefits of Acquia being the only WCM company that embraced both Open Source and cloud. Just seven years later, our unique approach has forever changed web content management. This year, Acquia moved up again in both of the dimensions that Gartner uses to rank vendors: Completeness of Vision and Ability to Execute. You'll see in the Magic Quadrant graphic that Acquia has tied Sitecore for the first time:

The 2019 Gartner Magic Quadrant for Web Content Management: Acquia recognized as a leader, next to Adobe, Sitecore and Episerver.

In mature markets like Web Content Management, there is almost always a single proprietary leader and a single Open Source leader. There is Oracle and MongoDB. Splunk and Elastic. VMware and Docker. GitLab and GitHub. That is why I believe that next year it will be Acquia and Adobe at the very top of the WCM Magic Quadrant. Sitecore and Episerver will continue to fight for third place among companies that prefer a Microsoft-centric approach. I was not surprised to see Sitecore move down this year as they work to overcome technical product debt and a cloud transition, leading to strange decisions like acquiring a services company.

You can read the complete report on Acquia.com. Thank you to everyone who contributed to this result!

Jul 23 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog:

Volunteering as a mentor at CoderDojo to teach young people, including my own kids, how to write software.

Last week, I published an opinion piece on CNN featuring my thoughts on what is wrong with the web and how we might fix it.

In short, I really miss some things about the original web, and don't want my kids to grow up being exploited by mega-corporations.

I am hopeful that increased regulation and decentralized web applications may fix some of the web's current problems. While some problems are really difficult to fix, at the very least, my kids will have more options to choose from when it comes to their data privacy and overall experience on the web.

You can read the first few paragraphs below, and view the whole article on CNN.

I still remember the feeling in the year 2000 when a group of five friends and I shared a modem connection at the University of Antwerp. I used it to create an online message board so we could chat back and forth about mostly mundane things. The modem was slow by today's standards, but the newness of it all was an adrenaline rush. Little did I know that message board would change my life.

In time, I turned this internal message board into a public news and discussion site, where I shared my own experiences using experimental web technologies. Soon, I started hearing from people all over the world who wanted to offer suggestions on how to improve my website, but who also wanted to use my site's technology to build their own websites and experiment with emerging web technologies.

Before long, I was connected to a network of strangers who would help me build Drupal.

Jun 28 2019

Acquia is growing its presence in the Asia-Pacific region with a new Pune, India office.

This week, Acquia announced the opening of its new office in Pune, India, which extends our presence in the Asia Pacific region. In addition to Pune, we already have offices in Australia and Japan.

I've made several trips to India in recent years, and have experienced not only Drupal's fast growth, but also the contagious excitement and passion for Drupal from the people I've met there.

While I wasn't able to personally attend the opening of our new office, I'm looking forward to visiting the Pune office soon.

For now, here are a few pictures from our grand opening celebration:

Acquians at the opening of the Pune, India office.

Jun 20 2019

Acquia Content Cloud, a new content-as-a-service solution for simplified content creation and syndication across multi-channel digital experiences, is now available in private beta.

Earlier this week at our Acquia Engage conference in London, Acquia announced a new product called "Content Cloud", a headless, SaaS-based content-as-a-service solution built on Drupal.

Years ago, we heard that organizations wanted to:

  • Create content that is easy to re-use across different channels, such as websites and mobile applications, email, digital screens, and more.

  • Use a content management system with a modern web service API that allows them to use their favorite front-end framework (e.g. React, Angular, Vue.js, etc.) to build websites and digital experiences (see the sketch below).
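
To make that second point concrete, here is a minimal sketch of what consuming such an API can look like, assuming a Drupal site with the core JSON:API module enabled and a standard article content type. The hostname is a placeholder, and this illustrates Drupal's web service API in general, not a documented Content Cloud endpoint.

  <?php

  // Fetch the five most recent articles from a Drupal site's JSON:API
  // endpoint and print their titles. Swap in your own site's hostname.
  $url = 'https://example.com/jsonapi/node/article?sort=-created&page[limit]=5';

  $response = file_get_contents($url, false, stream_context_create([
    'http' => ['header' => "Accept: application/vnd.api+json\r\n"],
  ]));

  $document = json_decode($response, TRUE);

  // JSON:API wraps each entity in a resource object; the fields we want
  // live under 'attributes'.
  foreach ($document['data'] as $resource) {
    echo $resource['attributes']['title'] . PHP_EOL;
  }

A front-end framework like React or Vue.js would make the same request with fetch() and render the result, which is what makes the decoupled model attractive: one content repository, many consumers.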

As a result, Acquia spent the last 5+ years helping to improve Drupal's web services capabilities and authoring experience.

But we also heard that organizations want to:

  • Use a single repository to manage all their organization's content.
  • Make it really easy to synchronize content between all their Drupal sites.
  • Manage all content editors from a central place to enable centralized content governance and workflows.
  • Automate the installation, maintenance, and upgrades of their Drupal-based content repository.

All of the above becomes even more important as organizations scale the number of content creators, websites and applications. Many large organizations have to build and maintain hundreds of sites and manage hundreds of content creators.

So this week, at our European customer conference, we lifted the curtain on Acquia Content Cloud, a new Acquia product built using Drupal. Acquia Content Cloud is a content-as-a-service solution that enables simplified, headless content creation and syndication across multi-channel digital experiences.

For now, we are launching an early access beta program. If you’re interested in being considered for the beta or want to learn more as Content Cloud moves toward general availability, you can sign up here.

In time, I plan to write more about Content Cloud, especially as we get closer to its initial release. Until then, you can watch the Acquia Content Cloud teaser video below:

Jun 18 2019

The new version of Acquia Lift makes it simpler for marketers to run website personalization campaigns themselves, without the need for writing code.

Today, we released a new version of Acquia Lift, our web personalization tool.

In today's world, personalization has become central to the most successful customer experiences. Most organizations know that personalization is no longer optional, but have put it off because it can be too difficult. The new Acquia Lift solves that problem.

While earlier versions of Acquia Lift required a degree of fine-tuning from a developer, the new version simplifies how marketers create and launch website personalization. With the new version, anyone can point, click and personalize content without any code.

We started working on the new version of Acquia Lift in early 2018, well over a year ago. In the process we interviewed over 50 customers, redesigned the user interface and workflows, and added various new capabilities to make it easier for marketers to run website personalization campaigns. And today, at our European customer conference, Acquia Engage London, we released the new Acquia Lift to the public.

You can see all of the new features in action in this 5-minute Acquia Lift demo video:

The new Acquia Lift offers the best web personalization solution in Acquia's history, and definitely the best tool for Drupal.

Jun 11 2019

Recently I was interviewed on RTL Z, the Dutch business news television network. In the interview, I talk about the growth and success of Drupal, and what is to come for the future of the web. Beware, the interview is in Dutch. If you speak Dutch and are subscribed to my blog (hi mom!), feel free to check it out!

Jun 11 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Recently, GitHub announced an initiative called GitHub Sponsors where open source software users can pay contributors for their work directly within GitHub.

There has been quite a bit of debate about whether initiatives like this are good or bad for Open Source.

On the one hand, there is the concern that the commercialization of Open Source could corrupt Open Source communities or harm contributors' intrinsic motivation and quest for purpose.

On the other hand, there is the recognition that commercial sponsorship is often a necessary condition for Open Source sustainability. Many communities have found that to support their growth, as a part of their natural evolution, they need to pay developers or embrace corporate sponsors.

Personally, I believe initiatives like GitHub Sponsors, and others like Open Collective, are a good thing.

Initiatives like these help not only with the long-term sustainability of Open Source communities, but also improve diversity in Open Source. Underrepresented groups, in particular, don't always have the privilege of free time to contribute to Open Source outside of work hours. Most software developers have to focus on making a living before they can focus on self-actualization. Without funding, Open Source communities risk losing or excluding valuable talent.

May 29 2019

I'm pleased to share that Francesco Placella (plach on Drupal.org) is moving from a "provisional" core committer to a full-fledged framework manager. (Read more about Drupal core's governance structure.)

Francesco has been a member of the Drupal community for over 11 years. He contributed an incredible amount to multilingual efforts, the Field and Entity API, and was a top contributor to the Drupal Association's D8 Accelerate program, so you can also thank him for Drupal 8 getting released. :)

This experience has given Francesco an extremely well-rounded knowledge of Drupal's API underpinnings, making him a perfect candidate for Framework Manager. He is also extremely meticulous in his patch reviews, and always willing to jump in on problems to help others.

The rest of the committer team all were extremely happy to recommend his promotion to full-fledged committer, so please join me in formally welcoming plach to the team!

May 24 2019

This blog has been edited and reposted with permission from Dries.
 

On the day that the Drupal-powered Justice.gov website released Special Counsel Robert Mueller's long-awaited report on Russian interference in the U.S. election, the site experienced a 7,000% increase in traffic.

The report was successfully delivered without interruption via Acquia, using Drupal. 

According to Federal Computer Week, by 5pm on April 18, there had already been 587 million site visits, with 247 million happening in the first hour the report was released. The site typically receives 8 million visits per day. 

There were no IT performance or availability issues during the release of the 142-MB report, which is ideal during these types of high-pressure events when the world is watching. Thus, no news is good news. 

Keeping sites like this up and available to the public is an important part of democracy and the freedom of information. 

I'm proud of Drupal and Acquia’s ability to deliver when it matters most!

May 08 2019

Acquia acquired Mautic, the open source marketing automation platform, to deliver the only Open Digital Experience Platform as an alternative to the expensive, closed, and stagnant marketing clouds.

Acquia joins forces with Mautic

I'm happy to announce today that Acquia acquired Mautic, an open source marketing automation and campaign management platform.

A couple of decades ago, I was convinced that every organization required a website — a thought that sounds rather obvious now. Today, I am convinced that every organization will need a Digital Experience Platform (DXP).

Having a website is no longer enough: customers expect to interact with brands through their websites, email, chat and more. They also expect these interactions to be relevant and personalized.

If you don't know Mautic, think of it as an alternative to Adobe's Marketo or Salesforce's Marketing Cloud. Just like these solutions, Mautic provides marketing automation and campaign management capabilities. It's differentiated in that it is easier to use, supports one-to-one customer experiences across many channels, integrates more easily with other tools, and is less expensive.

The flowchart-style visual campaign builder shown at the beginning of the Mautic demo video above is one of my favorite features. I love how it allows marketers to combine content, user profiles, events and a decision engine to deliver the best next action to customers.

Mautic is a relatively young company, but has quickly grown into the largest open source player in the marketing automation space, with more than 200,000 installations. Its ease of use, flexibility and feature completeness have won over many marketers in a very short time: the company's top line grew almost 400 percent year-over-year, its number of customers tripled, and Mautic won multiple awards for product innovation and customer service.

The acquisition of Mautic accelerates Acquia's product strategy to deliver the only Open Digital Experience Platform:

The pieces that make up a Digital Experience Platform, and how Mautic fits into Acquia's Open Digital Experience Platform: Acquia is strong in content management, personalization, user profile management and commerce (yellow blocks), while Mautic adds or improves Acquia's multi-channel delivery, campaign management and journey orchestration capabilities (purple blocks).

There are many reasons why we like Mautic, but here are my top 3:

Reason 1: Disrupting the market with "open"

Open Source will disrupt every component of the modern technology stack. It's not a matter of if, it's when.

Just as Drupal disrupted web content management with Open Source, we believe Mautic disrupts marketing automation.

With Mautic, Acquia is now the only open and open source alternative to the expensive, closed, and stagnant marketing clouds.

I'm both proud and excited that Acquia is doubling down on Open Source. Given our extensive open source experience, we believe we can help grow Mautic even faster.

Reason 2: Innovating through integrations

To build an optimal customer experience, marketers need to integrate with different data sources, customer technologies, and bespoke in-house platforms. Instead of buying a suite from a single vendor, most marketers want an open platform that allows for open innovation and unlimited integrations.

Only an open architecture can connect any technology in the marketing stack, and only an open source innovation model can evolve fast enough to offer integrations with thousands of marketing technologies (to date, there are 7,000 vendors in the martech landscape).

Because developers are largely responsible for creating and customizing marketing platforms, marketing technology should meet the needs of both business users and technology architects. Unlike other companies in the space, Mautic is loved by both marketers and developers. With Mautic, Acquia continues to focus on both personas.

Reason 3: The same technology stack and business model

Like Drupal, Mautic is built in PHP and Symfony, and like Drupal, Mautic uses the GNU GPL license. Having the same technology stack has many benefits.

Digital agencies or in-house teams need to deliver integrated marketing solutions. Because both Drupal and Mautic use the same technology stack, a single team of developers can work on both.

The similarities also make it possible for both open source communities to collaborate — while it is not something you can force to happen, it will be interesting to see how that dynamic naturally plays out over time.

Last but not least, our business models are also very aligned. Both Acquia and Mautic were "born in the cloud" and make money by offering subscription- and cloud-based delivery options. This means you pay for only what you need and that you can focus on using the products rather than running and maintaining them.

Mautic offers several commercial solutions:

  • Mautic Cloud, a fully managed SaaS version of Mautic with premium features not available in Open Source.
  • For larger organizations, Mautic has a proprietary product called Maestro. Large organizations operate in many regions or territories, and have teams dedicated to each territory. With Maestro, each territory can get its own Mautic instance, but they can still share campaign best practices, and repeat successful campaigns across territories. It's a unique capability, which is very aligned with the Acquia Cloud Site Factory.

Try Mautic

If you want to try Mautic, you can either install the community version yourself or check out the demo or sandbox environment of Mautic Open Marketing Cloud.

Conclusion

We're very excited to join forces with Mautic. It is such a strategic step for Acquia. Together we'll provide our customers with more freedom, faster innovation, and more flexibility. Open digital experiences are the way of the future.

I've got a lot more to share about the Mautic acquisition, how we plan to integrate Mautic in Acquia's solutions, how we could build bridges between the Drupal and Mautic community, how it impacts the marketplace, and more.

In time, I'll write more about these topics on this blog. In the meantime, please feel free to join DB Hurley, Mautic's founder and CTO, and me in a live Q&A session on Thursday, May 9 at 10am ET. We'll try to answer your questions about Acquia and Mautic.

Apr 30 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

The Drupal Association is announcing that Heather Rocker, former CEO of Girls, Inc. and Executive Director of Women in Technology, is joining as its next Executive Director.
 

The Drupal Association announced today that Heather Rocker has been selected as its next executive director.

This is exciting news because it concludes a seven-month search since Megan Sanicki left.

We looked long and hard for someone who could help us grow the global Drupal community by building on its diversity, working with developers and agency partners, and expanding our work with new audiences such as content creators and marketers.

The Drupal Association (including me) believes that Heather can do all of that, and is the best person to lead Drupal into its next phase of growth.

Heather earned her engineering degree from Georgia Tech. She has dedicated much of her career to working with women in technology, both as the CEO of Girls, Inc. of Greater Atlanta and the Executive Director of Women in Technology.

We were impressed not only with her valuable experience with volunteer organizations, but also her work in the private sector with large customers. Most recently, Heather was part of the management team at Systems Evolution, a team of 250 business consultants, where she specialized in sales operations and managed key client relationships.

She is also a robotics fanatic who organizes and judges competitions for children. So, maybe we’ll see some robots roaming around DrupalCon in the future!

As you can tell, Heather will bring a lot of great experience to the Drupal community and I look forward to partnering with her.

Last but not least, I want to thank Tim Lehnen for serving as our Interim Executive Director. He did a fantastic job leading the Drupal Association through this transition.

Apr 23 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

DrupalCon Seattle Driesnote presentation

Last week, many Drupalists gathered in Seattle for DrupalCon North America, for what was the largest DrupalCon in history.

As a matter of tradition, I presented my State of Drupal keynote. You can watch a recording of my keynote (starting at 32 minutes) or download a copy of my slides (153 MB).

Making Drupal more diverse and inclusive

DrupalCon Seattle was not only the largest, but also had the most diverse speakers. Nearly 50% of the DrupalCon speakers were from underrepresented groups. This number has been growing year over year, and is something to be proud of.

I actually started my keynote by talking about how we can make Drupal more diverse and inclusive. As one of the largest and most thriving Open Source communities, I believe that Drupal has an obligation to set a positive example.

Free time to contribute is a privilege

I talked about how Open Source communities often incorrectly believe that everyone can contribute. Unfortunately, not everyone has equal amounts of free time to contribute. In my keynote, I encouraged individuals and organizations in the Drupal community to strongly consider giving time to underrepresented groups.

Improving diversity is not only good for Drupal and its ecosystem, it's good for people, and it's the right thing to do. Because this topic is so important, I wrote a dedicated blog post about it.

Drupal 8 innovation update

I dedicated a significant portion of my keynote to Drupal 8. In the past year alone, there have been 35% more sites and 48% more stable modules in Drupal 8. Our pace of innovation is increasing, and we've seen important progress in several key areas.

With the release of Drupal 8.7, the Layout Builder will become stable. Drupal's new Layout Builder makes it much easier to build and change one-off page layouts, templated layouts and layout workflows. Best of all, the Layout Builder will be accessible.

Drupal 8.7 also brings a lot of improvements to the Media Library.

We also continue to innovate on headless or decoupled Drupal. The JSON:API module will ship with Drupal 8.7. I believe this not only advances Drupal's leadership in API-first, but sets Drupal up for long-term success.

These are just a few of the new capabilities that will ship with Drupal 8.7. For the complete list of new features, keep an eye out for the release announcement in a few weeks.

Drupal 7 end of life

If you're still on Drupal 7, there is no need to panic. The Drupal community will support Drupal 7 until November 2021 — two years and 10 months from today.

After the community support ends, there will be extended commercial support for a minimum of three additional years. This means that Drupal 7 will be supported for at least five more years, or until 2024.

Upgrading from Drupal 7 to Drupal 8

Upgrading from Drupal 7 to Drupal 8 can be a lot of work, especially for large sites, but the benefits outweigh the challenges.

For my keynote, I featured stories from two end-users who upgraded large sites from Drupal 7 to Drupal 8 — the State of Georgia and Pegasystems.

The keynote also featured quietone, one of the maintainers of the Migrate API. She talked about the readiness of Drupal 8 migration tools.

Preparing for Drupal 9

As announced a few months ago, Drupal 9 is targeted for June 2020. June 2020 is only 14 months away, so I dedicated a significant amount of my keynote to Drupal 9.

Making Drupal updates easier is a huge, ongoing priority for the community. Thanks to those efforts, the upgrade path to Drupal 9 will be radically easier than the upgrade path to Drupal 8.

In my keynote, I talked about how site owners, Drupal developers and Drupal module maintainers can start preparing for Drupal 9 today. I showed several tools that make Drupal 9 preparation easier. Check out my post on how to prepare for Drupal 9 for details.

A timeline with important dates and future milestones

Thank you

I'm grateful to be a part of a community that takes such pride in its work. At each DrupalCon, we get to see the tireless efforts of many volunteers that add up to one amazing event. It makes me proud to showcase the work of so many people and organizations in my presentations.

Thank you to all who have made this year's DrupalCon North America memorable. I look forward to celebrating our work and friendships at future events!

Apr 22 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Upgrading from Drupal 8 to Drupal 9 should be easy if you regularly check for and remove the use of deprecated code.

With Drupal 9 targeted to be released in June of 2020, many people are wondering what they need to do to prepare.

The good and important news is that upgrading from Drupal 8 to Drupal 9 should be really easy — radically easier than upgrading from Drupal 7 to Drupal 8.

The only caveat is that you need to manage "deprecated code" well.

If your site doesn't use deprecated code that is scheduled for removal in Drupal 9, your upgrade to Drupal 9 will be easy. In fact, it should be as easy as a minor version upgrade (like upgrading from Drupal 8.6 to Drupal 8.7).

What is deprecated code?

Code in Drupal is marked as "deprecated" when it should no longer be used. Typically, code is deprecated because there is a better alternative that should be used instead.

For example, in Drupal 8.0.0, we deprecated \Drupal::l($text, $url). Instead of using \Drupal::l(), you should use Link::fromTextAndUrl($text, $url). The \Drupal::l() function was marked for removal as part of some clean-up work; Drupal 8 had too many ways to generate links.
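
Here is a minimal sketch of that replacement in context; the route name and node ID are placeholders for the example:

  <?php

  use Drupal\Core\Link;
  use Drupal\Core\Url;

  $url = Url::fromRoute('entity.node.canonical', ['node' => 1]);

  // Deprecated since Drupal 8.0.0 and removed in Drupal 9:
  // $markup = \Drupal::l('Read more', $url);

  // The replacement. Call toString() where you previously relied on
  // \Drupal::l() returning rendered markup directly.
  $markup = Link::fromTextAndUrl('Read more', $url)->toString();

Note that Link::fromTextAndUrl() returns a Link object rather than rendered markup, hence the toString() call at the end.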

Deprecated code will continue to work for some time before it gets removed. For example, \Drupal::l() continues to work in Drupal 8.7 despite the fact that it was deprecated in Drupal 8.0.0 more than three years ago. This gives module maintainers ample time to update their code.

When we release Drupal 9, we will "drop" most deprecated code. In our example, this means that \Drupal::l() will not be available anymore in Drupal 9.

In other words:

  • Any Drupal 8 module that does not use deprecated code will continue to work with Drupal 9.
  • Any Drupal 8 module that uses deprecated code needs to be updated before Drupal 9 is released, or it will stop working with Drupal 9.

If you're interested, you can read more about Drupal's deprecation policy at https://www.drupal.org/core/deprecation.

How do I know if my site uses deprecated code?

There are a few ways to check if your site is using deprecated code.

If you work on a Drupal site as a developer, run drupal-check. Matt Glaman (Centarro) developed a static PHP analysis tool called drupal-check, which you can run against your codebase to check for deprecated code. I recommend running drupal-check in an automated fashion as part of your development workflow.

If you are a site owner, install the Upgrade Status module. Built by Acquia, the module provides a graphical user interface on top of drupal-check. The goal is to provide an easy-to-use readiness assessment for your site's migration to Drupal 9.

If you maintain a project on Drupal.org, enable Drupal.org's testing infrastructure to detect the use of deprecated code. There are two complementary ways to do so: you can run a static deprecation analysis and/or configure your existing tests to fail when calling deprecated code. Both can be set up in your drupalci.yml configuration file.

If you find deprecated code in a contributed module used on your site, consider filing an issue in the module's issue queue on Drupal.org (after having checked no issue has been created yet). If you can, provide a patch to fix the deprecation and engage with the maintainer to get it committed.

How hard is it to update my code?

While there are some deprecations that require more detailed refactoring, many are a simple matter of search-and-replace.

You can check the API documentation for instructions on how to remedy the deprecation.

When can I start updating my code?

I encourage you to start today. When you update your Drupal 8 code to use the latest and greatest APIs, you can benefit from those improvements immediately. There is no reason to wait until Drupal 9 is released.

Drupal 8.8.0 will be the last release to deprecate code for Drupal 9. Today, we don't know the full set of deprecations yet.

How much time do I have to update my code?

The current plan is to release Drupal 9 in June of 2020, and to end-of-life Drupal 8 in November of 2021.

Contributed module maintainers are encouraged to remove the use of deprecated code by June of 2020 so everyone can upgrade to Drupal 9 the day it is released.

A timeline with important dates and future milestones

Drupal.org project maintainers should keep the extended security coverage policy in mind, which means that Drupal 8.8 will still be supported until Drupal 9.1 is released. Contributed projects looking to support both Drupal 8.8 and Drupal 9.0 might need to use two branches.

How ready are the contributed modules?

Dwayne McDaniel (Pantheon) analyzed all 7,000 contributed modules for Drupal 8 using drupal-check.

As it stands today, 44% of the modules have no deprecation warnings. The remaining 56% of the modules need to be updated, but the majority have less than three deprecation warnings.

Apr 12 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Open Source communities often incorrectly believe that everyone can contribute. Unfortunately, not everyone has equal amounts of free time to contribute.

In Open Source, there is a long-held belief in meritocracy, or the idea that the best work rises to the top, regardless of who contributes it. The problem is that a meritocracy assumes an equal distribution of time for everyone in a community.

Open Source is not a meritocracy

I incorrectly made this assumption myself, saying: "The only real limitation [to Open Source contribution] is your willingness to learn."

Today, I've come to understand that inequality makes it difficult for underrepresented groups to have the "free time" it takes to contribute to Open Source.

For example, research shows that women still spend more than twice as much time as men doing unpaid domestic work, such as housework or childcare. I've heard from some of my colleagues that they need to optimize every minute of time they don't spend working, which makes it more difficult to contribute to Open Source on an unpaid, volunteer basis.

Or, in other cases, many people's economic conditions require them to work more hours or several jobs in order to support themselves or their families.

Systemic issues like racial and gender wage gaps continue to plague underrepresented groups, and it's both unfair and impractical to assume that these groups of people have the same amount of free time to contribute to Open Source projects, if they have any at all.

What this means is that Open Source is not a meritocracy.

Free time is a mark of privilege, rather than an equal right. Instead of chasing an unrealistic concept of meritocracy, we should be striving for equity. Rather than thinking, "everyone can contribute to open source", we should be thinking, "everyone deserves the opportunity to contribute".

Time inequality contributes to a lack of diversity in Open Source

This fallacy of "free time" makes Open Source communities suffer from a lack of diversity. The demographics are even worse than the technology industry overall: while 22.6% of professional computer programmers in the workforce identify as women (Bureau of Labor Statistics), less than 5% of contributors do in Open Source (GitHub). And while 34% of programmers identify as ethnic or national minorities (Bureau of Labor Statistics), only 16% do in Open Source (GitHub).

It's important to note that time isn't the only factor; sometimes a hostile culture or unconscious bias plays a part in limiting diversity. According to the same GitHub survey cited above, 21% of people who experienced negative behavior stopped contributing to Open Source projects altogether. Other recent research showed that women's pull requests were more likely to be accepted if they had a gender-neutral username. Unfortunately, examples like these are common.

Taking action: giving time to underrepresented groups

While it's impossible to fix decades of gender and racial inequality with any single action, we must do better. Those in a position to help have an obligation to improve the lives of others. We should not only invite underrepresented groups into our Open Source communities, but make sure that they are welcomed, supported and empowered. One way to help is with time:

  • As individuals, by making sure you are intentionally welcoming people from underrepresented groups, through both outreach and actions. If you're in a community organizing position, encourage and make space for people from underrepresented groups to give talks or lead sprints about the work they're interested in. Or if you're asked to, mentor an underrepresented contributor.
  • As organizations in the Open Source ecosystem, by giving people more paid time to contribute.

Taking the extra effort to help onboard new members or provide added detail when reviewing code changes can be invaluable to community members who don't have an abundance of free time. Overall, being kinder, more patient and more supportive to others could go a long way in welcoming more people to Open Source.

In addition, organizations within the Open Source ecosystem capable of giving back should consider financially sponsoring underrepresented groups to contribute to Open Source. Sponsorship can look like full- or part-time employment, an internship or giving to organizations like Girls Who Code, Code2040, Resilient Coders or one of the many others that support diversity in technology. Even a few hours of paid time during the workweek for underrepresented employees could help them contribute more to Open Source.

Applying the lessons to Drupal

Over the years, I've learned a lot from different people's perspectives. Learning out in the open is not always easy, but it's been an important part of my personal journey.

Knowing that Drupal is one of the largest and most influential Open Source projects, I find it important that we lead by example.

I encourage individuals and organizations in the Drupal community to strongly consider giving time and opportunities to underrepresented groups. You can start in places like:

When we have more diverse people contributing to Drupal, it will not only inject a spark of energy, but it will also help us make better, more accessible, inclusive software for everyone in the world.

Each of us needs to decide if and how we can help to create equity for everyone in Drupal. Not only is it good for business, it's good for people, and it's the right thing to do.

Special thanks to the Drupal Diversity and Inclusion group for discussing this topic with me.

 April 10, 2019

 3 min read time

 Permalink

Apr 11 2019
Apr 11

Upgrading from Drupal 8 to Drupal 9 should be easy if you regularly check for and remove the use of deprecated code.

With Drupal 9 targeted to be released in June of 2020, many people are wondering what they need to do to prepare.

The good and important news is that upgrading from Drupal 8 to Drupal 9 should be really easy — radically easier than upgrading from Drupal 7 to Drupal 8.

The only caveat is that you need to manage "deprecated code" well.

If your site doesn't use deprecated code that is scheduled for removal in Drupal 9, your upgrade to Drupal 9 will be easy. In fact, it should be as easy as a minor version upgrade (like upgrading from Drupal 8.6 to Drupal 8.7).

What is deprecated code?

Code in Drupal is marked as "deprecated" when it should no longer be used. Typically, code is deprecated because there is a better alternative that should be used instead.

For example, in Drupal 8.0.0, we deprecated \Drupal::l($text, $url). Instead of using \Drupal::l(), you should use Link::fromTextAndUrl($text, $url). The \Drupal::l() function was marked for removal as part of some clean-up work; Drupal 8 had too many ways to generate links.

Deprecated code will continue to work for some time before it gets removed. For example, \Drupal::l() continues to work in Drupal 8.7 despite the fact that it was deprecated in Drupal 8.0.0 more than three years ago. This gives module maintainers ample time to update their code.

When we release Drupal 9, we will "drop" most deprecated code. In our example, this means that \Drupal::l() will not be available anymore in Drupal 9.

In other words:

  • Any Drupal 8 module that does not use deprecated code will continue to work with Drupal 9.
  • Any Drupal 8 module that uses deprecated code needs to be updated before Drupal 9 is released, or it will stop working with Drupal 9.

If you're interested, you can read more about Drupal's deprecation policy at https://www.drupal.org/core/deprecation.

How do I know if my site uses deprecated code?

There are a few ways to check if your site is using deprecated code.

If you work on a Drupal site as a developer, run drupal-check. Matt Glaman (Centarro) developed a static PHP analysis tool called drupal-check, which you can run against your codebase to check for deprecated code. I recommend running drupal-check in an automated fashion as part of your development workflow.

If you are a site owner, install the Upgrade Status module. This module was built by Acquia. The module provides a graphical user interface on top of drupal-check. The goal is to provide an easy-to-use readiness assessment for your site's migration to Drupal 9.

If you maintain a project on Drupal.org, enable Drupal.org's testing infrastructure to detect the use of deprecated code. There are two complementary ways to do so: you can run a static deprecation analysis and/or configure your existing tests to fail when calling deprecated code. Both can be set up in your drupalci.yml configuration file.
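
As a rough sketch of what that configuration can look like (the key names follow the drupalci.yml documentation at the time of writing, so verify them against the current docs):

build:
  assessment:
    validate_codebase:
      # Static deprecation analysis of the project's code.
      phpstan:
        halt-on-fail: false
    testing:
      # Make the existing test suites fail when deprecated code is called.
      run_tests.standard:
        types: 'PHPUnit-Unit,PHPUnit-Kernel,PHPUnit-Functional'
        suppress-deprecations: false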

If you find deprecated code in a contributed module used on your site, consider filing an issue in the module's issue queue on Drupal.org (after having checked no issue has been created yet). If you can, provide a patch to fix the deprecation and engage with the maintainer to get it committed.

How hard is it to update my code?

While there are some deprecations that require more detailed refactoring, many are a simple matter of search-and-replace.

You can check the API documentation for instructions on how to remedy the deprecation.

When can I start updating my code?

I encourage you to start today. When you update your Drupal 8 code to use the latest and greatest APIs, you can benefit from those improvements immediately. There is no reason to wait until Drupal 9 is released.

Drupal 8.8.0 will be the last release to introduce new deprecations for Drupal 9. Today, we don't know the full set of deprecations yet.

How much time do I have to update my code?

The current plan is to release Drupal 9 in June of 2020, and to end-of-life Drupal 8 in November of 2021.

Contributed module maintainers are encouraged to remove the use of deprecated code by June of 2020 so everyone can upgrade to Drupal 9 the day it is released.

A timeline with important dates and future milestones

Drupal.org project maintainers should keep the extended security coverage policy in mind, which means that Drupal 8.8 will still be supported until Drupal 9.1 is released. Contributed projects looking to support both Drupal 8.8 and Drupal 9.0 might need to use two branches.

How ready are the contributed modules?

Dwayne McDaniel (Pantheon) analyzed all 7,000 contributed modules for Drupal 8 using drupal-check.

As it stands today, 44% of the modules have no deprecation warnings. The remaining 56% of the modules need to be updated, but the majority have fewer than three deprecation warnings.

April 11, 2019

3 min read time

Apr 09 2019
Apr 09

For most people, today marks the first day of DrupalCon Seattle.

Open Source communities create better, more inclusive software when diverse people come to the table. Unfortunately, there is still a huge gender gap in Open Source, and software more broadly. It's something I'll talk more about in my keynote tomorrow.

One way to help close the gender gap in the technology sector is to give to organizations that are actively working to solve this problem. During DrupalCon Seattle, Acquia will donate $5 to Girls Who Code for every person that visits our booth.

April 09, 2019

22 sec read time

Mar 13 2019
Mar 13

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Three stars will align and the Open Web will win.

Today, the world wide web celebrates its 30th birthday. In 1989, Sir Tim Berners-Lee invented the world wide web and changed the lives of millions of people around the globe, including mine.

Tim Berners-Lee sitting in front of a computer showing the first website

Tim Berners-Lee, inventor of the World Wide Web, in front of the early web.

Milestones like this get me thinking about the positive impact a free and Open Web has had on society. Without the web, billions of people would not have been able to connect with one another, be entertained, start businesses, exchange ideas, or even save lives. Open source communities like Drupal would not exist.

As optimistic as I am about the web's impact on society, there have been many recent events that have caused me to question the Open Web's future. Too much power has fallen into the hands of relatively few platform companies, resulting in widespread misinformation, privacy breaches, bullying, and more.

However, I'm optimistic that the Open Web has a chance to win in the future. I believe we'll see three important events happen in the next five years.

First, the day will come when regulators will implement a set of laws that govern the ownership and exchange of data online. It's already starting to happen with GDPR in the EU and various state data privacy laws taking shape in the US. These regulations will require platforms like Facebook to give users more control over their data, and when that finally happens, it will be a lot easier for users to move their data between services and for the Open Web to innovate on top of these data platforms.

Second, at some point, governments globally will disempower large platform companies. We can't leave it up to a handful of companies to judge what is false and true, or have them act as our censors. While I'm not recommending governments split up these companies, my hope is that they will institute some level of algorithmic oversight. This will offer an advantage to the Open Web and Open Source.

Third, I think we're on the verge of having a new set of building blocks that enable us to build a better, next-generation web. Thirty years into the web, our data architectures still use a client-server model; data is stored centrally on one computer, so to speak. The blockchain is turning that into a more decentralized web that operates on top of a distributed data layer and offers users control of their own data. Similar to building a traditional website, distributed applications (dApps) require file storage, payment systems, user data stores, etc. All of these components are being rebuilt on top of the blockchain. While we have a long way to go, it is only a matter of time before a tipping point is reached.

In the past, I've publicly asked the question: Can we save the Open Web? I believe we can. We can't win today, but we can keep innovating and get ready for these three events to unfold. The day will come!

With that motivation in mind, I want to wish a special happy birthday to the world wide web!

Feb 21 2019
Feb 21

Learn how to implement lazy loading images to improve the performance of your website.

Recently, I've been spending some time making performance improvements to my site. In my previous blog post on this topic, I described my progress optimizing the JavaScript and CSS usage on my site, and concluded that image optimization was the next step.

Last summer I published a blog post about my vacation in Acadia National Park. Included in that post are 13 photos with a combined size of about 4 MB.

When I benchmarked that post with https://webpagetest.org, it showed that it took 7.275 seconds (blue vertical line) to render the page.

The graph shows that the browser downloaded all 13 images to render the page. Why would a browser download all images if most of them are below the fold and not shown until a user starts scrolling? It makes very little sense.

As you can see from the graph, downloading all 13 images takes a very long time (purple horizontal bars). No matter how much you optimize your CSS and JavaScript, this particular blog post would remain slow until you optimize how images are loaded.

A https://webpagetest.org waterfall diagram that shows page load times for the blog post before optimizing images

"Lazy loading" images is one solution to this problem. Lazy loading means that the images aren't loaded until the user scrolls and the images come into the browser's viewport.

You might have seen lazy loading in action on websites like Facebook, Pinterest or Medium. It usually goes like this:

  • You visit a page as you normally would, scrolling through the content.
  • Instead of the actual image, you see a blurry placeholder image.
  • Then, the placeholder image gets swapped out with the final image as quickly as possible.
An animated GIF of a user scrolling a webpage, with a placeholder image being replaced by the final image

To support lazy loading images on my blog I do three things:

  1. Automatically generate lightweight yet useful placeholder images.
  2. Embed the placeholder images directly in the HTML to speed up performance.
  3. Replace the placeholder images with the real images when they become visible.

Generating lightweight placeholder images

To generate lightweight placeholder images, I implemented a technique used by Facebook: create a tiny image that is a downscaled version of the original image, strip out the image's metadata to optimize its size, and let the browser scale the image back up.

To create lightweight placeholder images, I resized the original images to be 5 pixels wide. Because I have about 10,000 images on my blog, my Drupal-based site automates this for me, but here is how you create one from the command line using ImageMagick's convert tool:

$ convert -resize 5x -strip original.jpg placeholder.jpg
  • -resize 5x resizes the image to be 5 pixels wide while maintaining its aspect ratio.
  • -strip removes all comments and redundant headers in the image. This helps make the image's file size as small as possible.
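
If you don't have a CMS automating this step, a simple shell loop covers a whole directory of images (the naming scheme is just an example):

$ for image in *.jpg; do convert -resize 5x -strip "$image" "placeholder-$image"; done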

The resulting placeholder images are tiny — often shy of 400 bytes.

The original image that we need to generate a placeholder for: large metal pots with wooden lids in which lobsters are boiled.
The generated placeholder, scaled up by a browser from a tiny image that is 5 pixels wide; it shows matching brown and black tones. The size of this placeholder image is only 395 bytes.

Here is another example to illustrate how the colors in the placeholders nicely match the original image:

A sunrise with beautiful reds and black silhouettes, next to its generated placeholder showing the same red and black tones.

Even though a placeholder image should only be shown for a fraction of a second, making it resemble the original is a nice touch: it suggests what is coming. It's also an important touch, because users are very impatient with load times on the web.

Embedding placeholder images directly in HTML

One not-so-well-known feature of the <img> element is that you can embed an image directly into the HTML document using the data URL scheme:

<img src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/data:image/jpg;base64,/9j/4AAQSkZJRgABAQEA8ADwAAD/2wB
  DAAEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQICAQECAQEB
  AgICAgICAgICAQICAgICAgICAgL/2wBDAQEBAQEBAQEBAQECAQEBAgICAgICA
  gICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgL/wA
  ARCAADAAUDAREAAhEBAxEB/8QAFAABAAAAAAAAAAAAAAAAAAAACf/EAB8QAAM
  AAQMFAAAAAAAAAAAAAAECAwcEBQYACAkTMf/EABUBAQEAAAAAAAAAAAAAAAAA
  AAIG/8QAJBEAAQIFAgcAAAAAAAAAAAAAAQURAAIDBDEGFBIVQUVRcYH/2gAMA
  wEAAhEDEQA/AAeyb5HO8o8lSLZd01Jz2nbKoK4yxDVvZqYl7uaV4CWdmZQSSS
  ST86FJBsafEJK15KD05ioNk4G6Yeg0V9bVCmZpXt08sB2hJ8DJ2Tn7H/2Q==" />

Data URLs are composed of four parts: the data: prefix, a media type indicating the type of data (image/jpg), an optional base64 token to indicate that the data is base64 encoded, and the base64 encoded image data itself.

data:[<media type>][;base64],<data>

To base64 encode an image from the command line, use:

$ base64 placeholder.jpg

To base64 encode an image in PHP, use:

$data = base64_encode(file_get_contents('placeholder.jpg'));

What is the advantage of embedding a base64 encoded image using a data URL? It eliminates HTTP requests as the browser doesn't have to set up new HTTP connections to download the images. Fewer HTTP requests usually means faster page load times.
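
Putting the pieces together, here is a small sketch of how the embedded src value could be assembled in PHP (the file names are hypothetical; the class and data-src attributes are explained in the next section):

// Read the tiny placeholder and embed it as a base64-encoded data URL.
$data = base64_encode(file_get_contents('placeholder.jpg'));
$src = 'data:image/jpg;base64,' . $data;

print '<img class="lazy" src="' . $src . '" data-src="original.jpg" />';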

Replacing placeholder images with real images

Next, I used JavaScript's IntersectionObserver to replace the placeholder image with the actual image when it comes into the browser's viewport. I followed Jeremy Wagner's approach from Google's Web Fundamentals guide on lazy loading images — with some adjustments.

It starts with the following HTML markup:

<img class="lazy" src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/placeholder.jpg" data-src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/original.jpg" />

The three relevant pieces are:

  1. The class="lazy" attribute, which is what you'll select the element with in JavaScript.
  2. The src attribute, which references the placeholder image that will appear when the page first loads. Instead of linking to placeholder.jpg I embed the image data using the data URL technique explained above.
  3. The data-src attribute, which contains the URL of the original image that will replace the placeholder when it comes into view.

Next, we use JavaScript's IntersectionObserver to replace the placeholder images with the actual images:

document.addEventListener('DOMContentLoaded', function() {
  var lazyImages = [].slice.call(document.querySelectorAll('img.lazy'));

  if ('IntersectionObserver' in window) {
    let lazyImageObserver = new IntersectionObserver(
      function(entries, observer) {
        entries.forEach(function(entry) {
          if (entry.isIntersecting) {
            let lazyImage = entry.target;
            lazyImage.src = lazyImage.dataset.src;
            lazyImageObserver.unobserve(lazyImage);
          }
        });
    });

    lazyImages.forEach(function(lazyImage) {
      lazyImageObserver.observe(lazyImage);
    });
  }
  else {
    // For browsers that don't support IntersectionObserver yet,
    // load all the images now:
    lazyImages.forEach(function(lazyImage) {
      lazyImage.src = lazyImage.dataset.src;
    });
  }
});

This JavaScript code queries the DOM for all <img> elements with the lazy class. The IntersectionObserver is used to replace the placeholder image with the original image when the img.lazy elements enter the viewport. When IntersectionObserver is not supported, the images are replaced on the DOMContentLoaded event.

By default, the IntersectionObserver's callback is triggered the moment a single pixel of the image enters the browser's viewport. However, using the rootMargin property, you can trigger the image swap before the image enters the viewport. This reduces or eliminates the visual or perceived lag time when swapping a placeholder image for the actual image.

I implemented that on my site as follows:

const config = {
    // If the image gets within 250px of the browser's viewport, 
    // start the download:
    rootMargin: '250px 0px',
  };

let lazyImageObserver = new IntersectionObserver(..., config);

Lazy loading images drastically improves performance

After making these changes to my site, I did a new https://webpagetest.org benchmark run:

A diagram that shows page load times for dri.es after making performance improvements

You can clearly see that the page became a lot faster to render:

  • The document is complete after 0.35 seconds (blue vertical line) instead of the original 7.275 seconds.
  • No images are loaded before the document is complete, compared to 13 images being loaded before.
  • After the document is complete, one image (purple horizontal bar) is downloaded. This is triggered by the JavaScript code as the result of one image being above the fold.

Lazy loading images improves web page performance by reducing the number of HTTP requests, and consequently reduces the amount of data that needs to be downloaded to render the initial page.

Is base64 encoding images bad for SEO?

Faster sites have an SEO advantage, as page speed is a ranking factor for search engines. But lazy loading might also be bad for SEO, as search engines have to be able to discover the original images.

To find out, I headed to Google Search Console. Google Search Console has a "URL inspection" feature that allows you to look at a webpage through the eyes of Googlebot.

I tested it out with my Acadia National Park blog post. As you can see in the screenshot, the first photo in the blog post was not loaded. Googlebot doesn't seem to support data URLs for images.

A screenshot that shows Googlebot doesn't render placeholder images that are embedded using data URLs

Is IntersectionObserver bad for SEO?

The fact that Googlebot doesn't appear to support data URLs does not have to be a problem. The real question is whether Googlebot will scroll the page, execute the JavaScript, replace the placeholders with the actual images, and index those. If it does, it doesn't matter that Googlebot doesn't understand data URLs.

To find out, I decided to conduct an experiment. Yesterday, I published a blog post about Matt Mullenweg and me visiting a museum together. The images in that blog post are lazy loaded and can only be discovered by Google if its crawler executes the JavaScript and scrolls the page. If those images show up in Google's index, we know there is no SEO impact.

I'm not sure how long it takes for Google to make new posts and images available in its index, but I'll keep an eye out for it.

If the images don't show up in Google's index, lazy loading might impact your SEO. My solution would be to selectively disable lazy loading for the most important images only. (Note: even if Google finds the images, there is no guarantee that it will decide to index them — short blog posts and images are often excluded from Google's index.)

Conclusions

Lazy loading images improves web page performance by reducing the number of HTTP requests and data needed to render the initial page.

Ideally, over time, browsers will support lazy loading images natively, and some of the SEO challenges will no longer be an issue. Until then, consider adding support for lazy loading yourself. For my own site, it took about 40 lines of JavaScript code and 20 lines of additional PHP/Drupal code.

I hope that by sharing my experience, more people are encouraged to run their own sites and to optimize their sites' performance.

February 21, 2019

6 min read time

Feb 20 2019
Feb 20

A month ago, Matt Mullenweg, co-founder of WordPress and founder of Automattic, visited me in Antwerp. While I currently live in Boston, I was born and raised in Antwerp, and also started Drupal there.

We spent the morning together walking around Antwerp and visited the Plantin Moretus Museum.

The museum is the old house of Christophe Plantin, where he lived and worked around 1575. At the time, Plantin had the largest printing shop in the world, with 56 employees and 16 printing presses. These presses printed 1,250 sheets per day.

Today, the museum hosts the two oldest printing presses in the world. In addition, the museum has original lead types of fonts such as Garamond and hundreds of ancient manuscripts that tell the story of how writing evolved into the art of printing.

The old house, printing business, presses and lead types are the earliest witnesses of a landmark moment in history: the invention of printing, and by extension, the democratization of publishing, long before our digital age. It was nice to visit that together with Matt as a break from our day-to-day focus on web publishing.

An old printing press at the Plantin Moretus Museum

Dries and Matt in front of the oldest printing presses in the world

An old globe at the Plantin Moretus Museum

February 20, 2019

47 sec read time

Feb 14 2019
Feb 14

I've been thinking about the performance of my site and how it affects the user experience. There are real, ethical concerns to poor web performance. These include accessibility, inclusion, waste and environmental concerns.

A faster site is more accessible, and therefore more inclusive for people visiting from a mobile device, or from areas in the world with slow or expensive internet.

For those reasons, I decided to see if I could improve the performance of my site. I used the excellent https://webpagetest.org to benchmark a simple blog post https://dri.es/relentlessly-eliminating-barriers-to-growth.

A diagram that shows page load times for dri.es before making performance improvements

The image above shows that it took a browser 0.722 seconds to download and render the page (see blue vertical line):

  • The first 210 milliseconds are used to set up the connection, which includes the DNS lookup, TCP handshake and the SSL negotiation.
  • The next 260 milliseconds (from 0.21 seconds to 0.47 seconds) are spent downloading the rendered HTML file, two CSS files and one JavaScript file.
  • After everything is downloaded, the final 330 milliseconds (from 0.475 seconds to 0.8 seconds) are used to lay out the page and execute the JavaScript code.

By most standards, 0.722 seconds is pretty fast. In fact, according to HTTP Archive, it takes more than 2.4 seconds to download and render the average web page on a laptop or desktop computer.

Regardless, I noticed that the length of the horizontal green bars and the horizontal yellow bar was relatively long compared to that of the blue bar. In other words, a lot of time is spent downloading JavaScript (yellow horizontal bar) and CSS (two green horizontal bars) instead of the HTML, including the actual content of the blog post (blue bar).

To fix this, I did two things:

  1. Use vanilla JavaScript. I replaced my jQuery-based JavaScript with vanilla JavaScript. Without impacting the functionality of my site, the amount of JavaScript went from almost 45 KB to 699 bytes, a reduction of more than 98 percent.
  2. Conditionally include CSS. For example, I use Prism.js for syntax highlighting code snippets in blog posts. prism.css was downloaded for every page request, even when there were no code snippets to highlight. Using Drupal's render system, it's easy to conditionally include CSS, as sketched below. By taking advantage of that, I was able to reduce the amount of CSS downloaded by 47 percent — from 4.7 KB to 2.5 KB.
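
As an illustration of that second point, a conditional attach can look roughly like this (the hook, library name and detection logic are made up for this sketch):

/**
 * Implements hook_preprocess_node().
 */
function mytheme_preprocess_node(array &$variables) {
  $body = $variables['node']->body->value ?? '';

  // Only attach the syntax-highlighting assets when the page
  // actually contains a code snippet.
  if (strpos($body, '<code') !== FALSE) {
    $variables['#attached']['library'][] = 'mytheme/prism';
  }
}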

According to the January 1st, 2019 run of HTTP Archive, the median page requires 396 KB of JavaScript and 60 KB of CSS. I'm proud that my site is well under these medians.

File type     Dri.es before   Dri.es after   World-wide median
JavaScript    45 KB           699 bytes      396 KB
CSS           4.7 KB          2.5 KB         60 KB

Because the new JavaScript and CSS files are significantly smaller, it takes the browser less time to download, parse and render them. As a result, the same blog post is now available in 0.465 seconds instead of 0.722 seconds, or 35% faster.

After a new https://webpagetest.org test run, you can clearly see that the bars for the CSS and JavaScript files became visually shorter:

A diagram that shows page load times for dri.es after making performance improvements

To optimize the user experience of my site, I want it to be fast. I hope that others will see that bloated websites can come at a great cost, and will consider using tools like https://webpagetest.org to make their sites more performant.

I'll keep working on making my website even faster. As a next step, I plan to make pages with images faster by using lazy image loading.

February 13, 2019

2 min read time

Feb 11 2019
Feb 11

We compare REST, JSON:API and GraphQL — three different web services implementations — based on request efficiency, operational simplicity, API discoverability, and more.

An abstract image of three boxes

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or follows a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below, but we discuss each of these four categories (or rows in the table) in more depth below. If you aggregate the ratings in the table, you see that we rank JSON:API above GraphQL and GraphQL above REST for Drupal core's needs.

Request efficiency
  • REST: Poor; multiple requests are needed to satisfy common needs, and responses are bloated.
  • JSON:API: Excellent; a single request is usually sufficient for most needs, and responses can be tailored to return only what is required.
  • GraphQL: Excellent; a single request is usually sufficient for most needs, and responses include exactly what was requested.

Documentation, API explorability and schema
  • REST: Poor; no schema, not explorable.
  • JSON:API: Acceptable; generic schema only; links and error messages are self-documenting.
  • GraphQL: Excellent; precise schema and excellent tooling for exploration and documentation.

Operational simplicity
  • REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
  • JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
  • GraphQL: Poor; extra infrastructure is often necessary, client-side libraries are a practical necessity, and specific patterns are required to benefit from CDNs and browser caches.

Writing data
  • REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
  • JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification.
  • GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it's possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article requests (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.
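
For example, a GraphQL query spells out every field it wants returned, and nothing else appears in the response (the type and field names here are hypothetical, since Drupal's GraphQL module exposes its own schema):

{
  article(id: 42) {
    title
    author {
      name
    }
  }
}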

JSON:API solves this with the concept of sparse fieldsets or lists of desired resource fields. These behave in much the same fashion as GraphQL does, however, when they're omitted JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, sparse fieldsets can be omitted so that the request remains cacheable.
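
As a sketch of both features against Drupal's JSON:API module ({uuid} is a placeholder, and node--article and user--user are Drupal's default resource type names):

GET /jsonapi/node/article/{uuid}?include=uid&fields[node--article]=title,created&fields[user--user]=name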

Multiple data objects in a single response
  • REST: Usually, but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, and developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; available if using the OpenAPI standard.
  • JSON:API: Depends; available if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links rarely available.
  • JSON:API: Acceptable; observing the available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocompletion, instant results or compilation errors, and complete, contextual documentation.

Validatable and programmable schema
  • REST: Depends; available if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout, when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as with REST or JSON:API without persisted queries. Persisted queries are not part of the official GraphQL specification but they are a widely-adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and it also means the architecture is no longer fully decoupled — if a client wants to retrieve different data, server-side changes are required.

  • Scalability: additional infrastructure requirements. REST: Excellent; same as a regular website (Varnish, CDN, etc.). JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.). GraphQL: Usually poor; only the simplest queries can use GET requests, and to reap the full benefit of GraphQL, servers need their own tooling.
  • Tooling ecosystem. REST: Acceptable; lots of developer tools are available, but for the best experience they need to be customized for the implementation. JSON:API: Excellent; lots of developer tools are available, and they don't need to be implementation-specific. GraphQL: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.
  • Typical points of failure. REST: Fewer; server, client. JSON:API: Fewer; server, client. GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and for JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of the GET HTTP method, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better, generic tooling and less time spent on server-side details.
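
As a minimal sketch (authentication and required permissions omitted), creating an article over JSON:API is a single POST request whose body follows the same document structure you see when reading:

POST /jsonapi/node/article
Content-Type: application/vnd.api+json
Accept: application/vnd.api+json

{
  "data": {
    "type": "node--article",
    "attributes": {
      "title": "Hello world"
    }
  }
}

Updating an existing article works the same way, with a PATCH request to the resource's individual URL.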

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each one; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.
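
For illustration only, a GraphQL mutation might look like the sketch below; the createArticle field and its input shape are hypothetical, because each GraphQL server defines (and must implement) its own mutations:

mutation {
  createArticle(input: { title: "Hello world" }) {
    id
    title
  }
}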

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. In our running example, adding a new tag to an article would require two requests: one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

  • Writing data. REST: Acceptable; every implementation is different, and there is no bulk support. JSON:API: Excellent; the specification prescribes a complete solution for handling writes, and bulk operations are coming soon. GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement, and there are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and because these displays need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means that they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
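
For instance, with Drupal's JSON:API module, a consumer can assemble its own listing of the ten most recent published articles entirely in the URL, with no server-side configuration involved (example.com is a placeholder):

GET https://example.com/jsonapi/node/article?filter[status]=1&sort=-created&page[limit]=10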

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default, JSON:API ensures that these mechanisms are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to bypass access restrictions outright.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

  • Ease of installation and configuration. REST: Poor; requires the contributed REST UI module, and it is easy to break clients by changing configuration. JSON:API: Excellent; zero configuration! GraphQL: Poor; more complex to use, and may require additional permissions, configuration or custom code.
  • Automatically generated documentation. REST: Acceptable; requires the contributed OpenAPI module. JSON:API: Acceptable; requires the contributed OpenAPI module. GraphQL: Excellent; GraphQL Voyager is included.
  • Security: content-level access control (entity and field access). REST: Excellent; content-level access control is respected. JSON:API: Excellent; content-level access control is respected, even in queries. GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.
  • Decoupled filtering (client can craft queries without server-side intervention). REST: No. JSON:API: Yes. GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system, but it has since evolved for this API-first world, and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, for Drupal core's needs, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. That said, Drupal's GraphQL implementation is fantastic, especially when you have the developer capacity to build a bespoke API for your project.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

February 11, 2019

11 min read time

Feb 07 2019
Feb 07

I'm frequently sent examples of how Drupal has changed the lives of developers, business owners and end users. Recently, I received a very different story of how Drupal had helped in a rescue operation that saved a man's life.

The Snowdonia Ultra Marathon website

In early 2018, Race Director Mike Jones was looking to build a new website for the Ultra-Trail Snowdonia ultra marathon. He reached out to a good friend and developer, Rob Edwards, to lead the development of the website.

A photo of a runner at the Ultra-Trail Snowdonia ultra marathon. © Ultra-Trail Snowdonia and No Limits Photography

Rob chose Drupal for its flexibility and extensibility. As an organization supported heavily by volunteers, open source also fit the Snowdonia team's belief in community.

The resulting website, https://apexrunning.co/, included a custom-built timing module. This module allowed volunteers to register each runner and their time at every aid stop.

A runner goes missing

Rob attended the first day of Ultra-Trail Snowdonia to ensure the website ran smoothly. He also monitored the runners at the end of the race to confirm they were all accounted for.

Monitoring the system into the early hours of the morning, Rob noticed one runner, after successfully completing checkpoints one and two, hadn't passed through the third checkpoint.

A photo of a runner at the Ultra-Trail Snowdonia ultra marathon. © Ultra-Trail Snowdonia and No Limits Photography

Each runner carried a mobile phone with them for emergencies. Mike attempted to make contact with the runner via phone to ensure he was safe. However, this specific area was known for its poor signal and the connection was too weak to get through.

After some more time anxiously watching the live updates, it was clear the runner hadn't reached checkpoint four and, more likely, hadn't ever made it past checkpoint three. The Ogwen Mountain Rescue team was called into action.

Due to the terrain and temperature, searching for the lost runner on foot would be too slow. Instead, the mountain rescue volunteers used a helicopter to scan the area and locate the runner.

How Drupal came to the rescue

The area covered by runners in an ultra marathon like this one is vast. The custom-built timing module helped rescuers narrow down the search area; they knew the runner passed the second checkpoint but never made it to the third.

After following the fluorescent orange markers in the area pinpointed by the Drupal website, the team quickly found the individual. He had fallen and become too injured to carry on. A mild case of hypothermia had set in. The runner was airlifted to the hospital for appropriate care. The good news: the runner survived.

Without Drupal, it might have taken much longer to notice that a runner had gone missing, and there would have been no way to tell where along the route he had dropped off.

Organizers are now exploring NFC and GPS devices for these ultra marathon runners to carry, providing location data as an extra safety precaution. The Drupal system will be used alongside these devices for more accurate time readings, and Rob is looking into an API to pull this additional data into the Drupal website.

Stories about Drupal having an impact on organizations and individuals, or even helping out in emergencies, drive my sense of purpose. Feel free to keep sending them my way!

Special thanks to Rob Edwards, Poppy Heap (CTI Digital) and Paul Johnson (CTI Digital) for their help with this blog post.

February 06, 2019

2 min read time

Jan 31 2019
Jan 31

Since I was young, I've been an avid tennis player and fan. I still play to this day, though maybe not as much as I'd like to.

In my teens, Andre Agassi was my favorite player. I've even sported some of his infamous headbands. I also remember watching him win the Australian Open in 1995.

In 2012, I traveled to Melbourne for a Drupal event, the same week the Australian Open was going on. As a tennis fan, I was lucky enough to watch Belgium's Kim Clijsters play.

Last weekend, the Australian Open wrapped up. This year, their website, https://ausopen.com, ran on Acquia and Drupal, delivered by the team at Avanade.

In a two-week timeframe, the site successfully welcomed tens of millions of visitors and served hundreds of millions of page views.

I'm very proud of the fact that many of the world's largest sporting events and media organizations (such as NBC Sports, which broadcasts the Super Bowl and the Olympics in the US) trust Acquia and Drupal as their chosen digital platform.

When the world is watching an event, there is no room for error!

A photo of the team that worked on the Australian Open website in 2019. Team Tennis Australia, Acquia and Avanade after the men's singles final.

Many thanks to the round-the-clock efforts from Acquia's team in Asia Pacific, as well as our partners at Avanade!

January 30, 2019

50 sec read time


Jan 29 2019
Jan 29

In my 2018 Acquia retrospective, I reflect on our business momentum, the evolution of the Acquia Platform, and why the market continues to move in Acquia's direction.

Every year, I sit down to write my annual Acquia retrospective. It's a rewarding exercise, because it allows me to reflect on how much progress Acquia has made in the past 12 months.

Overall, Acquia had an excellent 2018. I believe we are a much stronger company than we were a year ago; not only because of our financial results, but because of our commitment to strengthen our product and engineering teams.

If you'd like to read my previous retrospectives, they can be found here: 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009. This year marks the publishing of my tenth retrospective. When read together, these posts provide a comprehensive overview of Acquia's growth and trajectory.

Updating our brand

Exiting 2017, we doubled down on our transition from website management to digital experience management. In 2018, we updated our product positioning and brand narrative to reflect this change. This included a new Acquia Experience Platform diagram:

An image of the Acquia Platform in 2018. The Acquia Platform is divided into two key parts: the Experience Factory and the Marketing Hub. Drupal and Acquia Lightning power every side of the experience. The Acquia Platform supports our customers throughout the entire life cycle of a digital experience, from building to operating and optimizing digital experiences.

In 2018, the Acquia marketing team also worked hard to update Acquia's brand. The result is a refreshed look and updated brand positioning that better reflects our vision, culture, and the value we offer our customers. This included updating our tagline to read: Experience Digital Freedom.

[embedded content]

I think Acquia's updated brand looks great, and it's been exciting to see it come to life. From highway billboards to Acquia Engage in Austin, our updated brand has been very well received.

A photo of an Acquia billboard at the Austin airport. When Acquia Engage attendees arrived at the Austin-Bergstrom International Airport for Acquia Engage 2018, they were greeted by an Acquia display.

Business momentum

This year, Acquia surpassed $200 million in annualized revenue. Overall new subscription bookings grew 33 percent year over year, and we ended the year with nearly 900 employees.

Mike Sullivan completed his first year as Acquia's CEO, and demonstrated a strong focus on improving Acquia's business fundamentals across operational efficiency, gross margins and cost optimization. The results have been tangible, as Acquia has realized unprecedented financial growth in 2018:

  • Channel-partner bookings grew 52 percent
  • EMEA-based bookings grew 103 percent
  • Gross profit grew 39 percent
  • Adjusted EBITDA grew 78 percent
  • Free cash flow grew 84 percent
A photo that summarizes Acquia's 2018 business results mentioned throughout this blog post. 2018 was a record year for Acquia; year-over-year highlights include new subscription bookings, EMEA-based bookings, free cash flow, and more.

International growth and expansion

In 2018, Acquia also witnessed unprecedented success in Europe and Asia, as new bookings in EMEA were up more than 100 percent. This included expanding our European headquarters to a new and larger space, marked by a ribbon-cutting ceremony with the mayor of Reading in the U.K.

Acquia also expanded its presence in Asia, and opened Tokyo-based operations in 2018. Over the past few years I visited Japan twice, and I'm excited for the opportunities that doing business in Japan offers.

We selected Pune as the location for our new India office, and we are in the process of hiring our first Pune-based engineers.

Acquia now has four offices in the Asia Pacific region serving customers like Astellas Pharmaceuticals, Muji, Mediacorp, and Brisbane City Council.

A photo of an Acquia marketing one-pager in Japanese. Acquia product information, translated into Japanese.

Acquia Engage

In 2018, we welcomed more than 650 attendees to Austin, Texas, for our annual customer conference, Acquia Engage. In June, we also held our first Acquia Engage Europe and welcomed 300 attendees.

Our Engage conferences included presentations from customers like Paychex, NBC Sports, Wendy's, West Corporation, General Electric, Charles Schwab, Pac-12 Networks, Blue Cross Blue Shield, Bayer, Virgin Sport, and more. We also featured keynote presentations from our partner network, including VMLY&R, Accenture Interactive, IBM iX and MRM//McCann.

Both customers and partners continue to be the most important driver of Acquia's product strategy, and it's always rewarding to hear about this success first hand. In fact, 2018 customer satisfaction levels remain extremely high at 94 percent.

Partner program

Finally, Acquia's partner network continues to become more sophisticated. In the second half of 2018, we right-sized our partner community from 2,270 firms to 226. This was a bold move, but our goal was to place a renewed focus on the partners who were both committed to Acquia and highly capable. As a result, we saw almost 52 percent year-over-year growth in partner-sourced ACV bookings. This is meaningful because for every $1 Acquia books in collaboration with a partner, our partner makes about $5 in services revenue.

Analyst recognition

In 2018, the top industry analysts published very positive reviews about Acquia. I'm proud that Acquia was recognized by Forrester Research as the leader for strategy and vision in The Forrester Wave: Web Content Management Systems, Q4 2018. Acquia was also named a leader in the 2018 Gartner Magic Quadrant for Web Content Management, marking our placement as a leader for the fifth year in a row.

Product milestones

An image showing a timeline of Acquia's product history and evolution. Acquia's product evolution between 2008 and 2018: when Acquia was founded, our mission was to provide commercial support for Drupal and to be the "Red Hat for Drupal"; today, the Acquia Platform helps organizations build, operate and optimize Drupal-based experiences.

2018 was one of the busiest years I have experienced; it was full of non-stop action every day. My biggest focus was working with Acquia's product and engineering team. We focused on growing and improving our R&D organization, modernizing Acquia Cloud, becoming user-experience first, redesigning the Acquia Lift user experience, working on headless Drupal, making Drupal easier to use, and expanding our commerce strategy.

Hiring, hiring, hiring

In partnership with Mike, we decided to increase the capacity of our research and development team by 60 percent. At the close of 2018, we had increased that capacity by 45 percent. We will continue to invest in growing our R&D team in 2019.

I spent a lot of time restructuring, improving and scaling the product organization to make sure we could handle the increased capacity and build out a world-class R&D organization.

As the year progressed, R&D capacity came online and our ability to innovate not only improved but accelerated significantly. We entered 2019 in a much better position as we now have a lot more capacity to innovate.

Acquia Cloud

Acquia Cloud and Acquia Cloud Site Factory support some of the largest and most mission-critical websites in the world. The scope and complexity that Acquia Cloud and Acquia Cloud Site Factory manages is enormous. We easily deliver more than 30 billion page views a month (excluding CDN).

Over the course of 10 years, the Acquia Cloud codebase had grown very large. Updating, testing and launching new releases took a long time because we had one large, monolithic codebase. This was something we needed to change in order to add new features faster.

Over the course of 2018, the engineering team broke the monolithic codebase down into discrete components that can be tested and released independently. We launched our component-based architecture in June. Since then, the engineering team has released changes to production 650 times, compared to our historic pace of doing one release per quarter.

A graph that shows how we moved Acquia Cloud from a monolithic codebase to a component-based codebase. Each color on the graph represents a component. The graph shows how releases of Acquia Cloud (and of the individual components in particular) accelerated in the second half of the year.

Planning and designing for all of these services took a lot of time and focus, and was a large priority for the entire engineering team (including me). The fruits of these efforts will start to become more publicly visible in 2019. I'm excited to share more with you in future blog posts.

Acquia Cloud also remains the most secure and compliant cloud for Drupal. As we were componentizing the Acquia Cloud platform, the requirements to maintain our FedRAMP compliance became much more stringent. In April, the GDPR deadline was also nearing. Executing on hundreds of FedRAMP- and GDPR-related tasks emerged as another critical priority for many of our product and engineering teams. I'm proud that the team succeeded in accomplishing this amid all the other changes we were making.

Customer experience first

Over the years, I've felt Acquia lacked a focus on user experience (UX) for both developers and marketers. As a result, increasing the capacity of our R&D team included doubling the size of the UX team.

We've stepped up our UX research to better understand the needs and challenges of those who use Acquia products. We've begun to employ design-first methodologies, such as design sprints and a lean-UX approach. We've also created roles for customer experience designers, so that we're looking at the full customer journey rather than just our product interfaces.

With the extra capacity and data-driven changes in place, we've been working hard on updating the user experience for the entire Acquia Experience Platform. For example, you can see a preview of our new Acquia Lift product in this video, which has an increased focus on UX:

[embedded content]

Drupal

In 2018, Drupal 8 adoption kept growing and Drupal also saw an increase in the number of community contributions and contributors, both from individuals and from organizations.

Acquia remains very committed to Drupal, and was the largest contributor to the project in 2018. We now have more than 15 employees who contribute to Drupal full-time, in addition to many others that contribute periodically. In 2018, the Drupal team's main areas of focus have been Layout Builder and the API-first initiative:

  • Layout Builder: Layout Builder offers content authors an easy-to-use page building experience. It's shaping up to be one of the most useful and pervasive features ever added to Drupal, because it redefines how editors control the appearance of their content without having to rely on a developer.
  • API First: This initiative has given Drupal a true best-in-class web services API for using Drupal as a headless content management system. Headless Drupal is one of the fastest growing segments of Drupal implementations.
A photo of Acquia engineers, designers and product managers at Acquia Build Week 2018. Our R&D team gathered in Boston for our annual Build Week in June 2018.

Content and Commerce

Adobe's acquisition of Magento has been very positive for us; we're now the largest commerce-agnostic content management company to partner with. As a result, we decided to extend our investments in headless commerce and set up partnerships with Elastic Path and BigCommerce. The momentum we've seen from these partnerships in a short amount of time is promising for 2019.

The market continues to move in Acquia's direction

In 2019, I believe Acquia will continue to be positioned for long-term growth. Here are a few reasons why:

  • The current markets for content and digital experience management continue to grow rapidly, at approximately 20 percent per year.
  • Digital transformation is top-of-mind for all organizations, and impacts all elements of their business and value chain.
  • Open source adoption continues to grow at a furious pace and has seen tremendous business success in 2018.
  • Cloud adoption continues to grow. Unlike most of our CMS competitors, Acquia was born in the cloud.
  • Drupal and Acquia are leaders in headless and decoupled content management, which is a fast growing segment of our market.
  • Conversational interfaces and augmented reality continue to grow, and we embraced these channels a few years ago. Acquia Labs, our research and innovation lab, explored how organizations can use conversational UIs to develop beyond-the-browser experiences, like cooking with Alexa, and voice-enabled search for customers like Purina.

Although we hold a leadership position in our market, our relative market share is small. These trends mean that we should have plenty of opportunity to grow in 2019 and beyond.

Thank you

While 2018 was an incredibly busy year, it was also very rewarding. I have a strong sense of gratitude, and admire every Acquian's relentless determination and commitment to improve. As always, none of these results and milestones would be possible without the hard work of the Acquia team, our customers, partners, the Drupal community, and our many friends.

I've always been pretty transparent about our trajectory (e.g. Acquia 2009 roadmap and Acquia 2017 strategy) and will continue to do so in 2019. We have some big plans for 2019, and I'm excited to share them with you. If you want to get notified about what we have in store, you can subscribe to my blog at https://dri.es/subscribe.

Thank you for your support in 2018!

[embedded content]

January 29, 2019

7 min read time

Jan 28 2019
Jan 28

The European Commission worked with the Drupal Security Team to set aside 89,000€ (or roughly $100,000 USD) for a Drupal bug bounty.

An image of a shield with the Drupal mascot

The European Commission made an exciting announcement; it will be awarding bug bounties to the security teams of Open Source software projects that the European Commission relies on.

If you are not familiar with the term, a bug bounty is a monetary prize awarded to people who discover and correctly report security issues.

Julia Reda — an internet activist, Member of the European Parliament (MEP) and co-founder of the Free and Open Source Software Audit (FOSSA) project — wrote the following on her blog:

Like many other organizations, institutions like the European Parliament, the Council and the Commission build upon Free Software to run their websites and many other things. But the Internet is not only crucial to our economy and our administration, it is the infrastructure that runs our everyday lives.

With over 150 Drupal sites, the European Commission is a big Drupal user, and has a large internal Drupal community. The European Commission set aside 89,000€ (or roughly $100,000 USD) for a Drupal bug bounty. They worked closely with Drupal's Security Team to set this up. To participate in the Drupal bug bounty, read the guidelines provided by Drupal's Security Team.

Over the years I've had many meetings with the European Commission, presented keynotes at some of its events, and more. During that time, I've seen the European Commission evolve from being hesitant about Open Source to recognizing the many benefits that Open Source provides for its key ICT services, to truly embracing Open Source.

In many ways, the European Commission followed classic Open Source adoption patterns; adoption went from being technology-led (bottom-up or grassroots) to policy-led (top-down and institutionalized), and now the EU is an active participant and contributor.

Today, the European Commission is a shining example and role model for how governments and other large organizations can contribute to Open Source (just like how the White House used to be).

The European Commission is actually investing in Drupal in a variety of ways — the bug bounty is just one example of that — but more about that in a future blog post.

January 28, 2019

1 min read time

Jan 24 2019
Jan 24

Do I build my website with Drupal's built-in templating layer or do I use Drupal's decoupled or headless capabilities in combination with a JavaScript framework?

The pace of innovation in content management has been accelerating — driven by both the number of channels that content management systems need to support (web, mobile, social, chat) as well as the need to support JavaScript frameworks in the traditional web channel. As a result, we've seen headless or decoupled architectures emerge.

Decoupled Drupal has seen adoption from all corners of the Drupal community. In response to the trend towards decoupled architectures, I wrote blog posts in 2016 and 2018 for architects and developers about how and when to decouple Drupal. In the time since my last post, the surrounding landscape has evolved, Drupal's web services have only gotten better, and new paradigms such as static site generators and the JAMstack are emerging.

Time to update my recommendations for 2019! As we did a year ago, let's start with the 2019 version of the flowchart in full. (At the end of this post, there is also an accessible version of this flowchart described in words.)

A flowchart of how to decouple Drupal in 2019

Different ways to decouple Drupal

I want to revisit some of the established ways to decouple Drupal as well as discuss new paradigms that are seeing growing adoption. As I've written previously, the three most common approaches to Drupal architecture from a decoupled standpoint are traditional (or coupled), progressively decoupled, and fully decoupled. The different flavors of decoupling Drupal exist due to varying preferences and requirements.

In traditional Drupal, all of Drupal's usual responsibilities stay intact, as Drupal is a monolithic system and therefore maintains complete control over the presentation and data layers. Traditional Drupal remains an excellent choice for editors who need full control over the visual elements on the page, with access to features such as in-place editing and layout management. This is Drupal as we have known it all along. Because the benefits are real, this is still how most new content management projects are built.

Sometimes, JavaScript is required to deliver a highly interactive end-user experience. In this case, a decoupled approach becomes required. In progressively decoupled Drupal, a JavaScript framework is layered on top of the existing Drupal front end. This JavaScript might be responsible for nothing more than rendering a single block or component on a page, or it may render everything within the page body. The progressive decoupling paradigm lies on a spectrum; the less of the page dedicated to JavaScript, the more editors can control the page through Drupal's administrative capabilities.
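
As a minimal sketch of the progressively decoupled approach, a JavaScript behavior can hydrate a single placeholder element that the Drupal theme layer rendered, while the rest of the page remains server-rendered and editor-controlled. The .js-live-price selector and the widget itself are hypothetical:

(function (Drupal) {
  'use strict';
  // Drupal.behaviors run on initial page load and whenever new
  // content is added to the page (e.g. via AJAX).
  Drupal.behaviors.livePriceWidget = {
    attach: function (context) {
      var el = context.querySelector('.js-live-price');
      if (el && !el.dataset.initialized) {
        el.dataset.initialized = 'true';
        // A client-side fetch could hydrate this element here.
        el.textContent = 'Loading latest price...';
      }
    }
  };
})(Drupal);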

Up until this year, fully decoupled Drupal was a single category of decoupled Drupal architecture that reflects a full separation of concerns between the presentation layer and all other aspects of the CMS. In this scenario, the CMS becomes a data provider, and a JavaScript application with server-side rendering becomes responsible for all rendering and markup, communicating with Drupal via web service APIs. Though key functionality like in-place editing and layout management are unavailable, fully decoupled Drupal is appealing for developers who want greater control over the front end and who are already experienced with building applications in frameworks like Angular, React, Vue.js, etc.

Over the last year, fully decoupled Drupal has branched into two separate paradigms due to the increasing complexity of JavaScript development. The so-called JAMstack (JavaScript, APIs, Markup) introduces a new approach: fully decoupled static sites. The primary reasons for choosing static sites are improved performance, stronger security, and reduced complexity for developers. A static site generator like Gatsby will retrieve content from Drupal, generate a static website, and deploy that static site to a CDN, usually through a specialized cloud provider such as Netlify.
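
As a small sketch of this workflow, assuming the gatsby-source-drupal plugin and a Drupal site with JSON:API enabled at the placeholder address example.com, the Gatsby configuration for sourcing content from Drupal can be as short as this:

// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-source-drupal',
      options: {
        // Base URL of the Drupal site exposing JSON:API.
        baseUrl: 'https://example.com/',
      },
    },
  ],
};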

What do you intend to build?

The top section of the flowchart showing how to decouple Drupal in 2019

The essential question, as always, is what you're trying to build. Here is updated advice for architects exploring decoupled Drupal in 2019:

  1. If your intention is to build a single standalone website or web application, choosing decoupled Drupal may or may not be the right choice, depending on the features your developers and editors see as must-haves.
  2. If your intention is to build multiple web experiences (websites or web applications), you can use a decoupled Drupal instance either as a) a content repository without its own public-facing front end or b) a traditional website that acts simultaneously as a content repository. Depending on how dynamic your application needs to be, you can choose a JavaScript framework for highly interactive applications or a static site generator for mostly static websites.
  3. If your intention is to build multiple non-web experiences (native mobile or IoT applications), you can leverage decoupled Drupal to expose web service APIs and consume that Drupal site as a content repository without its own public-facing front end.

What makes Drupal so powerful is that it supports all of these use cases. Drupal makes decoupled implementations simple thanks to widely recognized standards such as JSON:API, GraphQL, OpenAPI, and CouchDB. In the end, it is your technical requirements that will decide whether decoupled Drupal should be your next architecture.

In addition to technical requirements, organizational factors often come into play as well. For instance, if it is proving difficult to find talented front-end Drupal developers with Twig knowledge, it may make more sense to hire more affordable JavaScript developers instead and build a fully decoupled implementation.

Are there things you can't live without?

The middle section of the flowchart showing how to decouple Drupal in 2019

As I wrote last year, the most important aspect of any decision when it comes to decoupling Drupal is the list of features your project requires; the needs of editors and developers have to be carefully considered. It is a critical step in your evaluation process to weigh the different advantages and disadvantages. Every project should embark on a clear-eyed assessment of its organization-wide needs.

Many editorial and marketing teams select a particular CMS because of its layout capabilities and rich editing functionality. Drupal, for example, gives editors the ability to build layouts in the browser and drag-and-drop components into them, all without needing a developer to do it for them. Although it is possible to rebuild many of the features available in a CMS on a consumer application, this can be a time-consuming and expensive process.

In recent years, the developer experience has also become an important consideration, but not in the ways that we might expect. While the many changes in the JavaScript landscape are one of the motivations for developers to prefer decoupled Drupal, the fact that there are now multiple ways to write front ends for Drupal makes it easier to find people to work on decoupled Drupal projects. As an example, many organizations are finding it difficult to find affordable front-end Drupal developers experienced in Twig. Moving to a JavaScript-driven front end can resolve some of these resourcing challenges.

This balancing act between the requirements that developers prioritize and those that editors prioritize will guide you to the correct approach for your needs. If you are part of an organization that is mostly editorial, decoupled Drupal could be problematic, because it reduces the amount of control editors have over the presentation of their content. By the same token, if you are part of an organization with more developer resources, fully decoupled Drupal could potentially accelerate progress, with the warning that many mission-critical editorial features disappear.

Current and future trends to consider

A diagram showing a spectrum of site building solutions: low-code solutions on the left and high-code solutions on the right. Over the past year, JavaScript frameworks have become more complex, while static site generators have become less complex.

One of the common complaints I have heard about the JavaScript landscape is that it shows fragmentation and a lack of cohesion due to increasing complexity. This has been a driving force for static site generators. Whereas two years ago, most JavaScript developers would have chosen a fully functional framework like Angular or Ember to create even simple websites, today they might choose a static site generator instead. A static site generator still allows them to use JavaScript, but it is simpler because performance considerations and build processes are offloaded to hosted services rather than the responsibility of developers.

I predict that static site generators will gain momentum in the coming year due to the positive developer experience they provide. Static site generators are also attracting a middle ground of both more experienced and less experienced developers.

Conclusion

Drupal continues to be an ideal choice for decoupled CMS architectures, and it is only getting better. The API-first initiative is making good progress on preparing the JSON:API module for inclusion in Drupal core, and the Admin UI and JavaScript Modernization initiative is working to dogfood Drupal's web services with a reinvented administrative interface. Drupal's support for GraphQL continues to improve, and now there is even a book on the subject of decoupled Drupal. It's clear that developers today have a wide range of ways to work with the rich features Drupal has to offer for decoupled architectures.

With the introduction of fully decoupled static sites as another architectural paradigm that developers can select, there is an even wider variety of architectural possibilities than before. It means that the spectrum of decoupled Drupal approaches I defined last year has become even more extensive. This flexibility continues to define Drupal as an excellent CMS for both traditional and decoupled approaches, with features that go well beyond Drupal's competitors, including WordPress, Sitecore and Adobe. Regardless of the makeup of your team or the needs of your organization, Drupal has a solution for you.

Special thanks to Preston So for co-authoring this blog post and to Angie Byron, Chris Hamper, Gabe Sullice, Lauri Eskola, Ted Bowman, and Wim Leers for their feedback during the writing process.

Accessible version of flowchart

This is an accessible and described version of the flowchart images earlier in this blog post. First, let us list the available architectural choices:

  • Coupled. Use Drupal as is without additional JavaScript (and as a content repository for other consumers).
  • Progressively decoupled. Use Drupal for initial rendering with JavaScript on top (and as a content repository for other consumers).
  • Fully decoupled static site. Use Drupal as a data source for a static site generator and, if needed, deploy to a JAMstack hosting platform.
  • Fully decoupled app. Use Drupal as a content repository accessed by other consumers (if JavaScript, use Node.js for server-side rendering).

Second, ask the question "What do you intend to build?" and choose among the answers "One experience" or "Multiple experiences".

If you are building one experience, ask the question "Is it a website or web application?" and choose among the answers "Yes, a single website or web application" or "No, Drupal as a repository for non-web applications only".

If you are building multiple experiences instead, ask the question "Is it a website or web application?" with the answers "Yes, Drupal as website and repository" or "No, Drupal as a repository for non-web applications only".

If your answer to the previous question was "No", then you should build a fully decoupled application, and your decision is complete. If your answer to the previous question was "Yes", then ask the question "Are there things the project cannot live without?"

Both editorial and developer needs are things that projects cannot live without, and here are the questions you need to ask about your project:

Editorial needs

  • Do editors need to manipulate page content and layout without a developer?
  • Do editors need in-context tools like in-place editing, contextual links, and toolbar?
  • Do editors need to preview unpublished content without custom development?
  • Do editors need content to be accessible by default like in Drupal's HTML?

Developer needs

  • Do developers need to have control over visual presentation instead of editors?
  • Do developers need server-side rendering or Node.js build features?
  • Do developers need JSON from APIs and to write JavaScript for the front end?
  • Do developers need data security driven by a publicly inaccessible CMS?

If, after asking all of these questions about things your project cannot live without, your answers show that your requirements reflect a mix of both editorial and developer needs, you should consider a progressively decoupled implementation, and your decision is complete.

If your answers to the questions about things your project cannot live without show that your requirements reflect purely developer needs, then ask the question "Is it a static website or a dynamic web application?" and choose among the answers "Static" or "Dynamic." If your answer to the previous question was "Static", you should build a fully decoupled static site, and your decision is complete. If your answer to the previous question was "Dynamic", you should build a fully decoupled app, and your decision is complete.

If your answers to the questions about things your project cannot live without show that your requirements reflect purely editorial needs, then ask two questions: first, "Are there parts of the page that need JavaScript-driven interactions?", and second, "Do you need to access multiple data sources via API?" If your answer to either question is "Yes", then you should consider a progressively decoupled implementation, and your decision is complete. If your answer to both questions is "No", then you should build a coupled Drupal site, and your decision is complete.

January 24, 2019

8 min read time

Jan 15 2019
Jan 15

[embedded content]

Eighteen years ago today, I released Drupal 1.0.0. What started from humble beginnings has grown into one of the largest Open Source communities in the world. Today, Drupal exists because of its people and the collective effort of thousands of community members. Thank you to everyone who has been and continues to contribute to Drupal.

Eighteen years is also the voting age in the US, and the legal drinking age in Europe. I'm not sure which one is better. :) Joking aside, welcome to adulthood, Drupal. May your day be bug free and filled with fresh patches!

January 15, 2019

23 sec read time

Jan 08 2019
Jan 08

A modern and fresh look is coming to the Drupal administration interface soon. Take a look!

Last year, I talked to nearly one hundred Drupal agency owners to understand what is preventing them from selling Drupal. One of the most common concerns raised is that Drupal's administration UI looks outdated.

This critique is not wrong. Drupal's current administration UI was originally designed almost ten years ago, when we were working on Drupal 7. In those ten years, the world did not stand still: design trends changed, user interfaces became more dynamic, and end-user expectations changed with them.

To be fair, Drupal's administration UI has received numerous improvements in the past ten years; Drupal 8 shipped with a new toolbar, an updated content creation experience, more WYSIWYG functionality, and even some design updates.

A visual comparison of Drupal 7 and Drupal 8's administration UI. A comparison of the Drupal 7 and Drupal 8 content creation screens, highlighting some of the improvements in Drupal 8.

While we made important improvements between Drupal 7 and Drupal 8, the feedback from the Drupal agency owners doesn't lie: we have not done enough to keep Drupal's administration UI modern and up-to-date.

This is something we need to address.

We are introducing a new design system that defines a complete set of principles, patterns, and tools for updating Drupal's administration UI.

In the short term, we plan on updating the existing administration UI with the new design system. Longer term, we are working on creating a completely new JavaScript-based administration UI.

A screenshot of the content administration screen using Drupal 8's Claro theme. The content administration screen with the new design system.

As you can see on Drupal.org, community feedback on the proposal is overwhelmingly positive, with comments like "Wow! Such an improvement!" and "Well done! High contrast and modern look."

A screenshot of the spacing guidelines of Drupal 8's Claro theme. Sample space sizing guidelines from the new design system.

I also ran the new design system by a few people who spend their days selling Drupal and they described it as "clean" with "good use of space" and a design they would be confident showing to prospective customers.

Whether you are a Drupal end-user, or in the business of selling Drupal, I recommend you check out the new design system and provide your feedback on Drupal.org.

Special thanks to Cristina Chumillas, Sascha Eggenberger, Roy Scholten, Archita Arora, Dennis Cohn, Ricardo Marcelino, Balazs Kantor, Lewis Nyman, and Antonella Severo for all the work on the new design system so far!

We have started implementing the new design system as a contributed theme with the name Claro. We are aiming to release a beta version for testing in the spring of 2019 and to include it in Drupal core as an experimental theme by Drupal 8.8.0 in December 2019. With more help, we might be able to get it done faster.

Throughout the development of the refreshed administration theme, we will run usability studies to ensure that the new theme indeed is an improvement over the current experience, and we can iteratively improve it along the way.

Administration themes must meet large and varied use cases. For example, accessibility is critical for the administration experience, and I'm happy to see that this initiative has connected with the accessibility team and taken its feedback into account.

Acquia has committed to being an early adopter of the theme through the Acquia Lightning distribution, broadening the potential base of projects that can test and provide feedback on the refresh. Hopefully other organizations and projects will do the same.

How can I help?

The team is looking for more designers and frontend developers to get involved. You can attend the weekly meetings on #javascript on Drupal Slack Mondays at 16:30 UTC and on #admin-ui on Drupal Slack Wednesdays at 14:30 UTC.

Thanks to Lauri Eskola, Gábor Hojtsy and Jeff Beeman for their help with this post.

January 08, 2019

2 min read time

Dec 30 2018
Dec 30

Have you ever wanted to preview your new Drupal theme in a production environment without making it the default yet?

I did when I was working on my redesign of dri.es earlier in the year. I wanted the ability to add ?preview to the end of any URL on dri.es and have that URL render in my upcoming theme.

It allowed me to easily share my new design with a few friends and ask for their feedback. I would send them a quick message like this: "Hi Matt, check out an early preview of my site's new design: https://dri.es?preview. Please let me know what you think!"

Because I use Drupal for my site, I created a custom Drupal 8 module to add this functionality. The module is probably too simple to share on Drupal.org so I figured I'd start with sharing it on my blog instead.

Like all Drupal modules, my module has a *.info.yml file. The purpose of the *.info.yml file is to let Drupal know about the existence of my module and to share some basic information about the module. My theme preview module is called Previewer so it has a *.info.yml file called previewer.info.yml:

name: Previewer
description: Allows previewing of a theme by adding '?preview' to URLs.
package: Custom
type: module
core: 8.x

The module has only one PHP class, Previewer, that implements Drupal's ThemeNegotiatorInterface interface:

<?php

namespace Drupal\previewer\Theme;

use Drupal\Core\Routing\RouteMatchInterface;
use Drupal\Core\Theme\ThemeNegotiatorInterface;

class Previewer implements ThemeNegotiatorInterface {

  /**
   * The function applies() determines if it wants to set the 
   * active theme. If the ?preview query string is part of the
   * URL, return TRUE to denote that Previewer wants to set
   * the theme. determineActiveTheme() will be called to
   * ask for the theme's name.
   */
  public function applies(RouteMatchInterface $route_match) {
    if (isset($_GET['preview'])) {
      return TRUE;
    }
    return FALSE;
  }

  /**
   * The function determineActiveTheme() is responsible
   * for returning the name of the theme that is to be used.
   */
  public function determineActiveTheme(RouteMatchInterface $route_match) {
    return 'dries'; // Yes, the name of my theme is 'dries'.
  }
}


The function applies() checks if '?preview' is set as part of the current URL. If so, applies() returns TRUE to tell Drupal that it would like to specify what theme to use. If Previewer is allowed to specify the theme, its determineActiveTheme() function will be called. determineActiveTheme() returns the name of the theme. Drupal uses the specified theme to render the current page request.

Next, we have to tell Drupal about our theme negotiator class Previewer. This is done by registering it as a service in previewer.services.yml:

services:
  theme.negotiator.previewer:
    class: Drupal\previewer\Theme\Previewer
    tags:
      - { name: theme_negotiator, priority: 10 }

previewer.services.yml tells Drupal to call our class Drupal\previewer\Theme\Previewer when it has to decide what theme to load.

A service is a common concept in Drupal (inherited from Symfony). Many of Drupal's features are separated into services, and each service does just one job. Structuring your application around a set of independent and reusable service classes is an object-oriented programming best practice. To some it might feel complex, but it actually promotes reusable and decoupled code.

Note that Drupal 8 adheres to PSR-4 namespaces and autoloading. This means that files must be named in specific ways and placed in specific directories in order to be recognized and loaded. Here is what my directory structure looks like:

$ tree previewer
previewer
├── previewer.info.yml
├── previewer.services.yml
└── src
    └── Theme
        └── Previewer.php
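
Once the files are in place, enabling the module activates the preview behavior. For example, assuming you use Drush:

$ drush en previewer

After that, appending ?preview to any URL on the site renders that page in the new theme.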

And that's it!

December 30, 2018

2 min read time
