September 24, 2019
Acquia partners with Vista Equity Partners

Today, we announced that Acquia has agreed to receive a substantial majority investment from Vista Equity Partners. This means that Acquia has a new investor that owns more than 50 percent of the company and is invested in our future success. Attracting a well-known partner like Vista is a tremendous validation of what we have been able to achieve. I'm incredibly proud of that, as so many Acquians worked so hard to reach this milestone.

Our mission remains the same

Our mission at Acquia is to help our customers and partners build amazing digital experiences by offering them the best digital experience platform.

This mission to build a digital experience platform is a giant one. Vista specializes in growing software companies, for example, by providing capital to do acquisitions. The Vista ecosystem consists of more than 60 companies and more than 70,000 employees globally. By partnering with Vista and leveraging their scale, network and expertise, we can greatly accelerate our mission and our ability to compete in the market.

For years, people have speculated about Acquia going public. An IPO remains a great option for Acquia, but I'm also happy that we will stay a private and independent company for the foreseeable future.

We will continue to direct all of our energy to what we have done for so long: provide our customers and partners with leading solutions to build, operate and optimize digital experiences. We have a lot of work to do to help more businesses see and understand the power of Open Source, cloud delivery and data-driven customer experiences.

We'll keep giving back to Open Source

This investment should be great news for the Drupal and Mautic communities as we'll have the right resources to compete against other solutions, and our deep commitment to Drupal, Mautic and Open Source will be unchanged. In fact, we will continue to increase our current level of investment in Open Source as we grow our business.

In talking with Vista, which has a long history of promoting diversity and equality and of giving back to its communities, we agreed to jointly invest even more in Drupal and Mautic. We will:

  • Improve the "learnability of Drupal" to help us attract less technical and more diverse people to Drupal.
  • Sponsor more Drupal and Mautic community events and meetups.
  • Increase the amount of Open Source code we contribute.
  • Fund initiatives to improve diversity in Drupal and Mautic, enabling people from underrepresented groups to contribute, attend community events, and more.

We will provide more details soon.

I continue in my role

I've been at Acquia for 12 years, most of my professional career.

During that time, I've been focused on making Acquia a special company, with a unique innovation and delivery model, all optimized for a new world. A world where a lot of software is becoming Open Source, where businesses are moving most applications into the cloud, where IT infrastructure is becoming a metered utility, and where data-driven customer experiences make or break business results.

It is why we invest in Open Source (e.g. Drupal, Mautic), cloud infrastructure (e.g. Acquia Cloud and Site Factory), and data-centric business tools (e.g. Acquia Lift, Mautic).

We have a lot of work left to do to help businesses see and understand the power of Open Source. I also believe Acquia is an example for how other Open Source companies can do Open Source right, in harmony with their communities.

The work we do at Acquia is interesting, impactful, and, in a positive way, challenging. Working at Acquia means I have a chance to change the world in a way that impacts hundreds of thousands of people. There is nowhere else I'd want to work.

Thank you to our early investors

As part of this transaction, Vista will buy out our initial investors. I want to provide a special shoutout to Michael Skok (North Bridge Venture Partners + Underscore) and John Mandile (Sigma Prime Ventures). I fondly remember how Jay Batson and I raised money from Michael and John in 2007. They made a big bet on me: at the time, a college student living in Belgium, when Open Source was anything but mainstream.

I'm grateful for the belief and trust they had in me and the support and mentorship they provided the past 12 years. The opportunity they gave me will forever define my professional career. I'm thankful for their support in building Acquia to what it is today, and I am thrilled about what is yet to come.

Stay tuned for great things ahead! It's a great time to be an Acquia customer and Drupal or Mautic user.


September 19, 2019

To scale and sustain Open Source ecosystems in a more efficient and fair manner, Open Source projects need to embrace new governance, coordination and incentive models.

A scale that is in balance

In many ways, Open Source has won. Most people know that Open Source provides better quality software, at a lower cost, without vendor lock-in. But despite Open Source being widely adopted and more than 30 years old, scaling and sustaining Open Source projects remains challenging.

Not a week goes by that I don't get asked a question about Open Source sustainability. How do you get others to contribute? How do you get funding for Open Source work? But also, how do you protect against others monetizing your Open Source work without contributing back? And what do you think of MongoDB, Cockroach Labs or Elastic changing their license away from Open Source?

This blog post talks about how we can make it easier to scale and sustain Open Source projects, Open Source companies and Open Source ecosystems. I will show that:

  • Small Open Source communities can rely on volunteers and self-governance, but as Open Source communities grow, their governance model most likely needs to be reformed so the project can be maintained more easily.
  • There are three models for scaling and sustaining Open Source projects: self-governance, privatization, and centralization. All three models aim to reduce coordination failures, but require Open Source communities to embrace forms of monitoring, rewards and sanctions. While this thinking is controversial, it is supported by decades of research in adjacent fields.
  • Open Source communities would benefit from experimenting with new governance models, coordination systems, license innovation, and incentive models.

Some personal background

Scaling and sustaining Open Source projects and Open Source businesses has been the focus of most of my professional career.

Drupal, the Open Source project I founded 18 years ago, is used by more than one million websites and reaches pretty much everyone on the internet.

With over 8,500 individuals and about 1,100 organizations contributing to Drupal annually, Drupal is one of the healthiest and contributor-rich Open Source communities in the world.

For the past 12 years, I've also helped build Acquia, an Open Source company that heavily depends on Drupal. With almost 1,000 employees, Acquia is the largest contributor to Drupal, yet responsible for less than 5% of all contributions.

This article is not about Drupal or Acquia; it's about scaling Open Source projects more broadly.

I'm interested in how to make Open Source production more sustainable, more fair, more egalitarian, and more cooperative. I'm interested in doing so by redefining the relationship between end users, producers and monetizers of Open Source software through a combination of technology, market principles and behavioral science.

Why it must be easier to scale and sustain Open Source

We need to make it easier to scale and sustain both Open Source projects and Open Source businesses:

  1. Making it easier to scale and sustain Open Source projects might be the only way to solve some of the world's most important problems. For example, I believe Open Source to be the only way to build a pro-privacy, anti-monopoly, open web. It requires Open Source communities to be long-term sustainable — possibly for hundreds of years.
  2. Making it easier to grow and sustain Open Source businesses is the last hurdle that prevents Open Source from taking over the world. I'd like to see every technology company become an Open Source company. Today, Open Source companies are still extremely rare.

The alternative is that we are stuck in the world we live in today, where proprietary software dominates most facets of our lives.

Disclaimers

This article is focused on Open Source governance models, but there is more to growing and sustaining Open Source projects. Top of mind is the need for Open Source projects to become more diverse and inclusive of underrepresented groups.

Second, I understand that the idea of systematizing Open Source contributions won't appeal to everyone. Some may argue that the suggestions I'm making go against the altruistic nature of Open Source. I agree. However, I'm also looking at Open Source sustainability challenges from the vantage point of running both an Open Source project (Drupal) and an Open Source business (Acquia). I'm not implying that every community needs to change their governance model, but simply offering suggestions for communities that operate with some level of commercial sponsorship, or communities that struggle with issues of long-term sustainability.

Lastly, this post is long and dense. I'm 700 words in, and I haven't started yet. Given that this is a complicated topic, there is an important role for more considered writing and deeper thinking.

Defining Open Source Makers and Takers

Makers

Some companies are born out of Open Source, and as a result believe deeply and invest significantly in their respective communities. With their help, Open Source has revolutionized software for the benefit of many. Let's call these types of companies Makers.

As the name implies, Makers help make Open Source projects; from investing in code, to helping with marketing, growing the community of contributors, and much more. There are usually one or more Makers behind the success of large Open Source projects. For example, MongoDB helps make MongoDB, Red Hat helps make Linux, and Acquia (along with many other companies) helps make Drupal.

Our definition of a Maker assumes intentional and meaningful contributions and excludes those whose only contributions are unintentional or sporadic. For example, a public cloud company like Amazon can provide a lot of credibility to an Open Source project by offering it as-a-service. The resulting value of this contribution can be substantial, however that doesn't make Amazon a Maker in our definition.

I use the term Makers to refer to anyone who purposely and meaningfully invests in the maintenance of Open Source software, e.g. by making engineering investments, writing documentation, fixing bugs, organizing events, and more.

Takers

Now that Open Source adoption is widespread, lots of companies, from technology startups to technology giants, monetize Open Source projects without contributing back to those projects. Let's call them Takers.

I understand and respect that some companies can give more than others, and that many might not be able to give back at all. Maybe one day, when they can, they'll contribute. We limit the label of Takers to companies that have the means to give back, but choose not to.

The difference between Makers and Takers is not always 100% clear, but as a rule of thumb, Makers directly invest in growing both their business and the Open Source project. Takers are solely focused on growing their business and let others take care of the Open Source project they rely on.

Organizations can be both Takers and Makers at the same time. For example, Acquia, my company, is a Maker of Drupal, but a Taker of Varnish Cache. We use Varnish Cache extensively but we don't contribute to its development.

A scale that is not in balance

Takers hurt Makers

To be financially successful, many Makers mix Open Source contributions with commercial offerings. Their commercial offerings usually take the form of proprietary or closed source IP, which may include a combination of premium features and hosted services that offer performance, scalability, availability, productivity, and security assurances. This is known as the Open Core business model. Some Makers offer professional services, including maintenance and support assurances.

When Makers start to grow and demonstrate financial success, the Open Source project that they are associated with begins to attract Takers. Takers will usually enter the ecosystem with a commercial offering comparable to the Makers', but without making a similar investment in Open Source contribution. Because Takers don't contribute back meaningfully to the Open Source project that they take from, they can focus disproportionately on their own commercial growth.

Let's look at a theoretical example.

When a Maker has $1 million to invest in R&D, they might choose to invest $500k in Open Source and $500k in the proprietary IP behind their commercial offering. The Maker intentionally balances growing the Open Source project they are connected to with making money. To be clear, the investment in Open Source is not charity; it helps make the Open Source project competitive in the market, and the Maker stands to benefit from that.

When a Taker has $1 million to invest in R&D, nearly all of their resources go to the development of proprietary IP behind their commercial offerings. They might invest $950k in their commercial offerings that compete with the Maker's, and $50k towards Open Source contribution. Furthermore, the $50k is usually focused on self-promotion rather than being directed at improving the Open Source project itself.

A visualization of the Maker and Taker math

Effectively, the Taker has put itself at a competitive advantage compared to the Maker:

  • The Taker takes advantage of the Maker's $500k investment in Open Source contribution while only investing $50k themselves. Important improvements happen "for free" without the Taker's involvement.
  • The Taker can out-innovate the Maker in building proprietary offerings. When a Taker invests $950k in closed-source products compared to the Maker's $500k, the Taker can innovate 90% faster. The Taker can also use the delta to disrupt the Maker on price. The sketch after this list makes the math concrete.
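
Here is a minimal sketch of the arithmetic above; the dollar figures are the illustrative numbers from this section, not real data:

    # Illustrative R&D budgets from the example above (not real data).
    maker_oss, maker_proprietary = 500_000, 500_000
    taker_oss, taker_proprietary = 50_000, 950_000

    # The Taker outspends the Maker on proprietary IP by 90%:
    advantage = taker_proprietary / maker_proprietary - 1
    print(f"Proprietary R&D advantage for the Taker: {advantage:.0%}")  # 90%

    # Yet the Maker funds over 90% of all the Open Source work both rely on:
    total_oss = maker_oss + taker_oss
    print(f"Share of Open Source work funded by the Maker: {maker_oss / total_oss:.0%}")  # 91%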

In other words, Takers reap the benefits of the Makers' Open Source contribution while simultaneously having a more aggressive monetization strategy. The Taker is likely to disrupt the Maker. On an equal playing field, the only way the Maker can defend itself is by investing more in its proprietary offering and less in the Open Source project. To survive, it has to behave like the Taker to the detriment of the larger Open Source community.

Takers harm Open Source projects. An aggressive Taker can induce Makers to behave in a more selfish manner and reduce or stop their contributions to Open Source altogether. Takers can turn Makers into Takers.

Open Source contribution and the Prisoner's Dilemma

The example above can be described as a Prisoner's Dilemma. The Prisoner's Dilemma is a standard example in game theory, the study of strategic interaction and decision-making using mathematical models. I won't go into detail here, but for the purpose of this article, it helps me simplify the above problem statement. I'll use this simplified example throughout the article.

Imagine an Open Source project with only two companies supporting it. The rules of the game are as follows:

  • If both companies contribute to the Open Source project (both are Makers), the total reward is $100. The reward is split evenly and each company makes $50.
  • If one company contributes while the other company doesn't (one Maker, one Taker), the Open Source project won't be as competitive in the market, and the total reward will only be $80. The Taker gets $60 as they have the more aggressive monetization strategy, while the Maker gets $20.
  • If both players choose not to contribute (both are Takers), the Open Source project will eventually become irrelevant. Both walk away with just $10.

This can be summarized in a pay-off matrix:

                                   Company A contributes       Company A doesn't contribute
    Company B contributes          A makes $50, B makes $50    A makes $60, B makes $20
    Company B doesn't contribute   A makes $20, B makes $60    A makes $10, B makes $10

In the game, each company needs to decide whether or not to contribute, but Company A doesn't know what Company B decides, and vice versa.

The Prisoner's Dilemma states that each company will optimize its own profit and not contribute. Because both companies are rational, both will make that same decision. In other words, when both companies use their "best individual strategy" (be a Taker, not a Maker), they produce an equilibrium that yields the worst possible result for the group: the Open Source project will suffer and as a result they only make $10 each.
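
For readers who prefer code to prose, here is a minimal sketch (using the illustrative rewards above) showing why mutual contribution unravels: whichever company unilaterally stops contributing raises its own pay-off.

    # Pay-off matrix from the rules above.
    # Key: (A contributes, B contributes) -> (A's pay-off, B's pay-off)
    payoffs = {
        (True, True): (50, 50),
        (True, False): (20, 60),
        (False, True): (60, 20),
        (False, False): (10, 10),
    }

    # Starting from mutual contribution, Company A tests a unilateral defection:
    stay_a_maker = payoffs[(True, True)][0]     # $50
    become_a_taker = payoffs[(False, True)][0]  # $60
    assert become_a_taker > stay_a_maker  # defecting pays more, so cooperation unravels

    # When both companies follow that individually rational logic, they end up at:
    print(payoffs[(False, False)])  # (10, 10): the worst collective outcome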

A real-life example of the Prisoner's Dilemma that many people can relate to is washing the dishes in a shared house. By not washing dishes, an individual can save time (individually rational), but if that behavior is adopted by every person in the house, there will be no clean plates for anyone (collectively irrational). How many of us have tried to get away with not washing the dishes? I know I have.

Fortunately, the problem of individually rational actions leading to collectively adverse outcomes is not new or unique to Open Source. Before I look at potential models to better sustain Open Source projects, I will take a step back and look at how this problem has been solved elsewhere.

Open Source: a public good or a common good?

In economics, the concepts of public goods and common goods are decades old, and have similarities to Open Source.

Examples of common goods (fishing grounds, oceans, parks) and public goods (lighthouses, radio, street lighting)

Public goods and common goods are what economists call non-excludable, meaning it's hard to exclude people from using them. For example, everyone can benefit from fishing grounds, whether they contribute to their maintenance or not. Simply put, public goods and common goods have open access.

Common goods are rivalrous; if one individual catches a fish and eats it, the other individual can't. In contrast, public goods are non-rivalrous; someone listening to the radio doesn't prevent others from listening to the radio.

I've long believed that Open Source projects are public goods: everyone can use Open Source software (non-excludable) and someone using an Open Source project doesn't prevent someone else from using it (non-rivalrous).

However, through the lens of Open Source companies, Open Source projects are also common goods; everyone can use Open Source software (non-excludable), but when an Open Source end user becomes a customer of Company A, that same end user is unlikely to become a customer of Company B (rivalrous).

For end users, Open Source projects are public goods; the shared resource is the software. But for Open Source companies, Open Source projects are common goods; the shared resource is the (potential) customer.

Next, I'd like to extend the distinction between "Open Source software being a public good" and "Open Source customers being a common good" to the free-rider problem: we define software free-riders as those who use the software without ever contributing back, and customer free-riders (or Takers) as those who sign up customers without giving back.

All Open Source communities should encourage software free-riders. Because the software is a public good (non-rivalrous), a software free-rider doesn't exclude others from using the software. Hence, it's better to have a user for your Open Source project, than having that person use your competitor's software. Furthermore, a software free-rider makes it more likely that other people will use your Open Source project (by word of mouth or otherwise). When some portion of those other users contribute back, the Open Source project benefits. Software free-riders can have positive network effects on a project.

However, when the success of an Open Source project depends largely on one or more corporate sponsors, the Open Source community should not forget or ignore that customers are a common good. Because a customer can't be shared among companies, it matters a great deal for the Open Source project where that customer ends up. When the customer signs up with a Maker, we know that a certain percentage of the revenue associated with that customer will be invested back into the Open Source project. When a customer signs up with a customer free-rider or Taker, the project doesn't stand to benefit. In other words, Open Source communities should find ways to route customers to Makers.

Both volunteer-driven and sponsorship-driven Open Source communities should encourage software free-riders, but sponsorship-driven Open Source communities should discourage customer free-riders.

Lessons from decades of Common Goods management

Hundreds of research papers and books have been written on public good and common good governance. Over the years, I have read many of them to figure out what Open Source communities can learn from successfully managed public goods and common goods.

Some of the most influential research includes Garrett Hardin's Tragedy of the Commons and Mancur Olson's work on Collective Action. Both Hardin and Olson concluded that groups don't self-organize to maintain the common goods they depend on.

As Olson writes in the beginning of his book, The Logic of Collective Action: "Unless the number of individuals is quite small, or unless there is coercion or some other special device to make individuals act in their common interest, rational, self-interested individuals will not act to achieve their common or group interest."

Consistent with the Prisoner's Dilemma, Hardin and Olson show that groups don't act on their shared interests. Members are disincentivized from contributing when other members can't be excluded from the benefits. It is individually rational for a group's members to free-ride on the contributions of others.

Dozens of academics, Hardin and Olson included, argued that an external agent is required to solve the free-rider problem. The two most common approaches are (1) centralization and (2) privatization:

  1. When a common good is centralized, the government takes over the maintenance of the common good. The government or state is the external agent.
  2. When a common good is privatized, one or more members of the group receive selective benefits or exclusive rights to harvest from the common good in exchange for the ongoing maintenance of the common good. In this case, one or more corporations act as the external agent.

The widespread advice to centralize and privatize common goods has been followed extensively in most countries; today, natural resources are typically managed by either governments or commercial companies, and no longer directly by their users. Examples include public transport, water utilities, fishing grounds, parks, and much more.

Overall, the privatization and centralization of common goods has been very successful; in many countries, public transport, water utilities and parks are maintained better than volunteers would maintain them on their own. I certainly value that I don't have to help maintain the train tracks before my daily commute to work, or that I don't have to help mow the lawn in our public park before I can play soccer with my kids.

For years, it was a long-held belief that centralization and privatization were the only way to solve the free-rider problem. It was Elinor Ostrom who observed that a third solution existed.

Ostrom found hundreds of cases where common goods are successfully managed by their communities, without the oversight of an external agent. From the management of irrigation systems in Spain to the maintenance of mountain forests in Japan — all have been successfully self-managed and self-governed by their users. Many have been long-enduring as well; the youngest examples she studied were more than 100 years old, and the oldest exceed 1,000 years.

Ostrom studied why some efforts to self-govern commons have failed and why others have succeeded. She summarized the conditions for success in the form of core design principles. Her work led her to win the Nobel Prize in Economics in 2009.

Interestingly, all successfully managed commons studied by Ostrom switched at some point from open access to closed access. As Ostrom writes in her book, Governing the Commons: "For any appropriator to have a minimal interest in coordinating patterns of appropriation and provision, some set of appropriators must be able to exclude others from access and appropriation rights." Ostrom uses the term appropriator to refer to those who use or withdraw from a resource. Examples would be fishers, irrigators, herders, etc — or companies trying to turn Open Source users into paying customers. In other words, the shared resource must be made exclusive (to some degree) in order to incentivize members to manage it. Put differently, Takers will be Takers until they have an incentive to become Makers.

Once access is closed, explicit rules need to be established to determine how resources are shared, who is responsible for maintenance, and how self-serving behaviors are suppressed. In all successfully managed commons, the regulations specify (1) who has access to the resource, (2) how the resource is shared, (3) how maintenance responsibilities are shared, (4) who inspects that rules are followed, (5) what fines are levied against anyone who breaks the rules, (6) how conflicts are resolved and (7) a process for collectively evolving these rules.

Three patterns for long-term sustainable Open Source

Studying the work of Garrett Hardin (Tragedy of the Commons), the Prisoner's Dilemma, Mancur Olson (Collective Action) and Elinor Ostrom's core design principles for self-governance, a number of shared patterns emerge. When applied to Open Source, I'd summarize them as follows:

  1. Common goods fail because of a failure to coordinate collective action. To scale and sustain an Open Source project, Open Source communities need to transition from individual, uncoordinated action to cooperative, coordinated action.
  2. Cooperative, coordinated action can be accomplished through privatization, centralization, or self-governance. All three work — and can even be mixed.
  3. Successful privatization, centralization, and self-governance all require clear rules around membership, appropriation rights, and contribution duties. In turn, this requires monitoring and enforcement, either by an external agent (centralization), a private agent (privatization), or by the members of the group themselves (self-governance).

Next, let's see how these three concepts — centralization, privatization and self-governance — could apply to Open Source.

Model 1: Self-governance in Open Source

For small Open Source communities, self-governance is very common; it's easy for its members to communicate, learn who they can trust, share norms, agree on how to collaborate, etc.

As an Open Source project grows, contribution becomes more complex and cooperation more difficult: it becomes harder to communicate, build trust, agree on how to cooperate, and suppress self-serving behaviors. The incentive to free-ride grows.

You can scale successful cooperation by having strong norms that encourage members to do their fair share and by organizing face-to-face events, but eventually even that becomes hard to scale.

As Ostrom writes in Governing the Commons: "Even in repeated settings where reputation is important and where individuals share the norm of keeping agreements, reputation and shared norms are insufficient by themselves to produce stable cooperative behavior over the long run." and "In all of the long-enduring cases, active investments in monitoring and sanctioning activities are quite apparent."

To the best of my knowledge, no Open Source project currently implements Ostrom's design principles for successful self-governance. To understand how Open Source communities might, let's go back to our running example.

Our two companies would negotiate rules for how to share the rewards of the Open Source project, and what level of contribution would be required in exchange. They would set up a contract where they both agree on how much each company can earn and how much each company has to invest. During the negotiations, various strategies can be proposed for how to cooperate. However, both parties need to agree on a strategy before they can proceed. Because they are negotiating this contract among themselves, no external agent is required.

These negotiations are non-trivial. As you can imagine, any proposal that does not involve splitting the $100 fifty-fifty is likely rejected. The most likely equilibrium is for both companies to contribute equally and to split the reward equally. Furthermore, to arrive at this equilibrium, one of the two companies would likely have to go backwards in revenue, which might not be agreeable.

Needless to say, this gets even more difficult in a scenario where there are more than two companies involved. Today, it's hard to fathom how such a self-governance system can successfully be established in an Open Source project. In the future, Blockchain-based coordination systems might offer technical solutions for this problem.

Large groups are less able to act in their common interest than small ones because (1) the complexity increases and (2) the benefits diminish. Until we have better community coordination systems, it's easier for large groups to transition from self-governance to privatization or centralization than to scale self-governance.

The concept of major projects growing out of self-governed volunteer communities is not new to the world. The first trade routes were ancient trackways which citizens later developed on their own into roads suited for wheeled vehicles. Privatization of roads improved transportation for all citizens. Today, we certainly appreciate that our governments maintain the roads.

The road system evolving from self-governance to privatization, and from privatization to centralization

Model 2: Privatization of Open Source governance

In this model, Makers are rewarded with unique benefits not available to Takers. These exclusive rights provide Makers a commercial advantage over Takers, while simultaneously creating a positive social benefit for all the users of the Open Source project, Takers included.

For example, Mozilla has the exclusive right to use the Firefox trademark and to set up paid search deals with search engines like Google, Yandex and Baidu. In 2017 alone, Mozilla made $542 million from searches conducted using Firefox. As a result, Mozilla can make continued engineering investments in Firefox. Millions of people and organizations benefit from that every day.

Another example is Automattic, the company behind WordPress. Automattic is the only company that can use WordPress.com, and is in the unique position to make hundreds of millions of dollars from WordPress' official SaaS service. In exchange, Automattic invests millions of dollars in Open Source WordPress each year.

Recently, there have been examples of Open Source companies like MongoDB, Redis, Cockroach Labs and others adopting stricter licenses because of perceived (and sometimes real) threats from public cloud companies that behave as Takers. The ability to change the license of an Open Source project is a form of privatization.

Model 3: Centralization of Open Source governance

Let's assume a government-like central authority can monitor Open Source companies A and B, with the goal of rewarding and penalizing them for contribution or lack thereof. When a company follows a cooperative strategy (being a Maker), it is rewarded $25, and when it follows a defect strategy (being a Taker), it is charged a $25 penalty. We can update the pay-off matrix introduced above as follows:

                                   Company A contributes        Company A doesn't contribute
    Company B contributes          A makes $75 ($50 + $25),     A makes $35 ($60 - $25),
                                   B makes $75 ($50 + $25)      B makes $45 ($20 + $25)
    Company B doesn't contribute   A makes $45 ($20 + $25),     A makes -$15 ($10 - $25),
                                   B makes $35 ($60 - $25)      B makes -$15 ($10 - $25)

We took the values from the pay-off matrix above and applied the rewards and penalties. The result is that both companies are incentivized to contribute and the optimal equilibrium (both become Makers) can be achieved.
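
Continuing the earlier sketch, the central authority can be modeled as a transform over the same pay-off matrix: add $25 for contributing, subtract $25 for defecting. After the transform, contributing is each company's best response regardless of what the other does.

    # Pay-off matrix from the Prisoner's Dilemma example earlier in this post.
    payoffs = {
        (True, True): (50, 50),
        (True, False): (20, 60),
        (False, True): (60, 20),
        (False, False): (10, 10),
    }

    REWARD, PENALTY = 25, 25

    def regulate(payoffs):
        """Apply the central authority's reward/penalty to each company's pay-off."""
        adjust = lambda contributes, pay: pay + REWARD if contributes else pay - PENALTY
        return {
            (a, b): (adjust(a, pay_a), adjust(b, pay_b))
            for (a, b), (pay_a, pay_b) in payoffs.items()
        }

    regulated = regulate(payoffs)
    # If B contributes, A earns $75 by contributing versus $35 by defecting;
    # if B defects, A earns $45 by contributing versus -$15 by defecting.
    assert regulated[(True, True)][0] > regulated[(False, True)][0]    # 75 > 35
    assert regulated[(True, False)][0] > regulated[(False, False)][0]  # 45 > -15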

The money for rewards could come from various fundraising efforts, including membership programs or advertising (just as a few examples). However, more likely is the use of indirect monetary rewards.

One way to implement this is Drupal's credit system. Drupal's non-profit organization, the Drupal Association, monitors who contributes what. Each contribution earns credits, and the credits are used to provide visibility to Makers. The more you contribute, the more visibility you get on Drupal.org (visited by 2 million people each month) or at Drupal conferences (called DrupalCons, visited by thousands of people each year).

A screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.
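
Mechanically, the heart of such a credit system is attribution plus aggregation. A minimal sketch with made-up data (the field names and numbers are hypothetical, not Drupal.org's actual schema):

    from collections import Counter

    # Hypothetical issue credits; each credit attributes work to an organization.
    issue_credits = [
        {"issue": 101, "user": "jamadar", "org": "TATA Consultancy Services"},
        {"issue": 101, "user": "alice", "org": "Agency A"},
        {"issue": 102, "user": "jamadar", "org": "TATA Consultancy Services"},
    ]

    # Aggregate credits per organization; the resulting ranking is what drives
    # visibility on the project's website and at its conferences.
    ranking = Counter(credit["org"] for credit in issue_credits)
    for org, credit_count in ranking.most_common():
        print(org, credit_count)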

While there is a lot more the Drupal Association could and should do to balance its Makers and Takers and achieve a more optimal equilibrium for the Drupal project, it's an emerging example of how an Open Source non-profit organization can act as a regulator that monitors and maintains the balance of Makers and Takers.

The big challenge with this approach is the accuracy of the monitoring and the reliability of the rewarding (and sanctioning). Because Open Source contribution comes in different forms, tracking and valuing Open Source contribution is a very difficult and expensive process, not to mention full of conflict. Running this centralized government-like organization also needs to be paid for, and that can be its own challenge.

Concrete suggestions for scaling and sustaining Open Source

Suggestion 1: Don't just appeal to organizations' self-interest, but also to their fairness principles

If, like most economic theorists, you believe that organizations act in their own self-interest, we should appeal to that self-interest and better explain the benefits of contributing to Open Source.

Despite the fact that hundreds of articles have been written about the benefits of contributing to Open Source — highlighting speed of innovation, recruiting advantages, market credibility, and more — many organizations still miss these larger points.

It's important to keep sharing Open Source success stories. One thing that we have not done enough is appeal to organizations' fairness principles.

While a lot of economic theories correctly assume that most organizations are self-interested, I believe some organizations are also driven by fairness considerations.

Despite the term "Takers" having a negative connotation, it does not assume malice. For many organizations, it is not apparent if an Open Source project needs help with maintenance, or how one's actions, or lack thereof, might negatively affect an Open Source project.

As mentioned, Acquia is a heavy user of Varnish Cache. But as Acquia's Chief Technology Officer, I don't know if Varnish needs maintenance help, or how our lack of contribution negatively affects Makers in the Varnish community.

It can be difficult to understand the consequences of our own actions within Open Source. Open Source communities should help others understand where contribution is needed, what the impact of not contributing is, and why certain behaviors are not fair. Some organizations will resist unfair outcomes and behave more cooperatively if they understand the impact of their behaviors and the fairness of certain outcomes.

Make no mistake though: most organizations won't care about fairness principles; they will only contribute when they have to. For example, most people would not voluntarily redistribute 25-50% of their income to those who need it. However, most of us agree to redistribute money by paying taxes, but only so long as all others have to do so as well.

Suggestion 2: Encourage end users to offer selective benefits to Makers

We talked about Open Source projects giving selective benefits to Makers (e.g. Automattic, Mozilla, etc), but end users can give selective benefits as well. For example, end users can mandate Open Source contributions from their partners. We have some successful examples of this in the Drupal community: Pfizer only works with agencies that contribute back to Drupal, and the State of Georgia made Open Source contribution a vendor selection criterion.

If more end users of Open Source took this stance, it could have a very big impact on Open Source sustainability. For governments, in particular, this seems like a very logical thing to do. Why would a government not want to put every dollar of IT spending back in the public domain? For Drupal alone, the impact would be measured in tens of millions of dollars each year.

Suggestion 3: Experiment with new licenses

I believe we can create licenses that support the creation of Open Source projects with sustainable communities and sustainable businesses to support them.

For a directional example, look at what MariaDB did with their Business Source License (BSL). The BSL gives users complete access to the source code so users can modify, distribute and enhance it. Only when you use more than x of the software do you have to pay for a license. Furthermore, the BSL guarantees that the software becomes Open Source over time; after y years, the license automatically converts from BSL to General Public License (GPL), for example.
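
As a toy illustration of the BSL's time-based conversion (the parameters here are hypothetical; real BSL terms set their own usage threshold x and conversion period y):

    from datetime import date

    def effective_license(release_date, conversion_years, today):
        """A BSL-style license automatically becomes Open Source after y years."""
        conversion_date = release_date.replace(year=release_date.year + conversion_years)
        return "GPL" if today >= conversion_date else "BSL"

    # A hypothetical release from 2016 with a 3-year conversion period:
    print(effective_license(date(2016, 8, 1), 3, date(2019, 9, 19)))  # GPL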

A second example is the Community Compact, a license proposed by Adam Jacob. It mixes together a modern understanding of social contracts, copyright licensing, software licensing, and distribution licensing to create a sustainable and harmonious Open Source project.

We can create licenses that better support the creation, growth and sustainability of Open Source projects and that are designed so that both users and the commercial ecosystem can co-exist and cooperate in harmony.

I'd love to see new licenses that encourage software free-riding (sharing and giving), but discourage customer free-riding (unfair competition). I'd also love to see these licenses support many Makers, with built-in equity and fairness principles for smaller Makers or those not able to give back.

If, like me, you believe there could be future licenses that are more "Open Source"-friendly, not less, it would be smart to implement a contributor license agreement for your Open Source project; it allows Open Source projects to relicense if/when better licenses arrive. At some point, current Open Source licenses will be at a disadvantage compared to future Open Source licenses.

Conclusions

As Open Source communities grow, volunteer-driven, self-organized communities become harder to scale. Large Open Source projects should find ways to balance Makers and Takers or the Open Source project risks not innovating enough under the weight of Takers.

Fortunately, we don't have to accept that future. However, this means that Open Source communities potentially have to get comfortable experimenting with how to monitor, reward and penalize members in their communities, particularly if they rely on a commercial ecosystem for a large portion of their contributions. Today, that goes against the values of most Open Source communities, but I believe we need to keep an open mind about how we can grow and scale Open Source.

Making it easier to scale Open Source projects in a sustainable and fair way is one of the most important things we can work on. If we succeed, Open Source can truly take over the world — it will pave the path for every technology company to become an Open Source business, and also solve some of the world's most important problems in an open, transparent and cooperative way.


September 12, 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

An in-depth analysis of how Drupal's development was sponsored between July 1, 2018 and June 30, 2019.

For the past few years, I've examined Drupal.org's contribution data to understand who develops Drupal, how diverse the Drupal community is, how much of Drupal's maintenance and innovation is sponsored, and where that sponsorship comes from.

You can look at the 2016 report, the 2017 report, and the 2018 report. Each report looks at data collected in the 12-month period between July 1st and June 30th.

This year's report shows that:

  • Both the recorded number of contributors and contributions have increased.
  • Most contributions are sponsored, but volunteer contributions remain very important to Drupal's success.
  • Drupal's maintenance and innovation depends mostly on smaller Drupal agencies and Acquia. Hosting companies, multi-platform digital marketing agencies, large system integrators and end users make fewer contributions to Drupal.
  • Drupal's contributors have become more diverse, but are still not diverse enough.

Methodology

What are Drupal.org issues?

"Issues" are pages on Drupal.org. Each issue tracks an idea, feature request, bug report, task, or more. See https://www.drupal.org/project/issues for the list of all issues.

For this report, we looked at all Drupal.org issues marked "closed" or "fixed" in the 12-month period from July 1, 2018 to June 30, 2019. The issues analyzed in this report span Drupal core and thousands of contributed projects, across all major versions of Drupal.
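
In code terms, the selection criterion is a simple filter. A sketch with hypothetical field names (not Drupal.org's actual API):

    from datetime import date

    def in_report_window(issue):
        """Keep issues marked 'closed' or 'fixed' during the reporting period."""
        return (
            issue["status"] in {"closed", "fixed"}
            and date(2018, 7, 1) <= issue["closed_on"] <= date(2019, 6, 30)
        )

    sample_issues = [
        {"id": 1, "status": "fixed", "closed_on": date(2019, 3, 14)},
        {"id": 2, "status": "active", "closed_on": None},
    ]
    report_issues = [i for i in sample_issues if i["closed_on"] and in_report_window(i)]
    print(len(report_issues))  # 1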

What are Drupal.org credits?

In the spring of 2015, after proposing initial ideas for giving credit, Drupal.org added the ability for people to attribute their work in the Drupal.org issues to an organization or customer, or mark it the result of volunteer efforts.


A screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.

Drupal.org's credit system is truly unique and groundbreaking in Open Source and provides unprecedented insights into the inner workings of a large Open Source project. There are a few limitations with this approach, which we'll address at the end of this report.

What is the Drupal community working on?

In the 12-month period between July 1, 2018 and June 30, 2019, 27,522 issues were marked "closed" or "fixed", a 13% increase from the 24,447 issues in the 2017-2018 period.

In total, the Drupal community worked on 3,474 different Drupal.org projects this year compared to 3,229 projects in the 2017-2018 period — an 8% year over year increase.

The majority of the credits are the result of work on contributed modules:

A pie chart showing contributions by project type: most contributions are to contributed modules.

Compared to the previous period, contribution credits increased across all project types:

A graph showing the year over year growth of contributions per project type.

The most notable change is the large jump in "non-product credits": more and more members in the community started tracking credits for non-product activities such as organizing Drupal events (e.g. the DrupalCamp Delhi project, Drupal Developer Days, Drupal Europe and DrupalCon Europe), promoting Drupal (e.g. the Drupal pitch deck), or participating in community working groups (e.g. the Drupal Diversity and Inclusion Working Group and the Governance Working Group).

While some of these increases reflect new contributions, others are existing contributions that are newly reported. All contributions are valuable, whether they're code contributions, or non-product and community-oriented contributions such as organizing events, giving talks, leading sprints, etc. The fact that the credit system is becoming more accurate in recognizing more types of Open Source contribution is both important and positive.

Who is working on Drupal?

For this report's time period, Drupal.org's credit system received contributions from 8,513 different individuals and 1,137 different organizations — a meaningful increase from last year's report.

A graph showing that the number of individual and organizational contributors increased year over year.

Consistent with previous years, approximately 51% of the individual contributors received just one credit. Meanwhile, the top 30 contributors (the top 0.4%) account for 19% of the total credits. In other words, a relatively small number of individuals do the majority of the work. These individuals put an incredible amount of time and effort into developing Drupal and its contributed projects:

Rank  Username            Issues
   1  kiamlaluno            1610
   2  jrockowitz             756
   3  alexpott               642
   4  RajabNatshah           616
   5  volkswagenchick        519
   6  bojanz                 504
   7  alonaoneill            489
   8  thalles                488
   9  Wim Leers              437
  10  DamienMcKenna          431
  11  Berdir                 424
  12  chipway                356
  13  larowlan               324
  14  pifagor                320
  15  catch                  313
  16  mglaman                277
  17  adci_contributor       274
  18  quietone               266
  19  tim.plunkett           265
  20  gaurav.kapoor          253
  21  RenatoG                246
  22  heddn                  243
  23  chr.fritsch            241
  24  xjm                    238
  25  phenaproxima           238
  26  mkalkbrenner           235
  27  gvso                   232
  28  dawehner               219
  29  e0ipso                 218
  30  drumm                  205

Out of the top 30 contributors featured this year, 28 were active contributors in the 2017-2018 period as well. These Drupalists' dedication and continued contribution to the project has been crucial to Drupal's development.

It's also important to recognize that most of the top 30 contributors are sponsored by an organization. Their sponsorship details are provided later in this article. We value the organizations that sponsor these remarkable individuals, because without their support, it could be more challenging for these individuals to be in the top 30.

It's also nice to see two new contributors make the top 30 this year — Alona O'Neill with sponsorship from Hook 42 and Thalles Ferreira with sponsorship from CI&T. Most of their credits were the result of smaller patches (e.g. removing deprecated code, fixing coding style issues, etc.) or in some cases non-product credits rather than new feature development or fixing complex bugs. These types of contributions are valuable and often a stepping stone towards more in-depth contribution.

How much of the work is sponsored?

Issue credits can be marked as "volunteer" and "sponsored" simultaneously (shown in jamadar's screenshot near the top of this post). This could be the case when a contributor does the necessary work to satisfy the customer's need, in addition to using their spare time to add extra functionality.

For those credits with attribution details, 18% were "purely volunteer" credits (8,433 credits), in stark contrast to the 65% that were "purely sponsored" (29,802 credits). While there are almost four times as many "purely sponsored" credits as "purely volunteer" credits, volunteer contribution remains very important to Drupal.

Contributions by volunteer vs sponsored
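
A sketch of how those attribution categories fall out of the raw credit flags (hypothetical records; note that a single credit can carry both flags, as in jamadar's screenshot):

    from collections import Counter

    def classify(credit):
        """Bucket a credit by its volunteer/sponsored attribution flags."""
        if credit["volunteer"] and credit["sponsored"]:
            return "volunteer + sponsored"
        if credit["volunteer"]:
            return "purely volunteer"
        if credit["sponsored"]:
            return "purely sponsored"
        return "not specified"

    sample_credits = [
        {"volunteer": True, "sponsored": True},   # paid work plus spare-time extras
        {"volunteer": True, "sponsored": False},
        {"volunteer": False, "sponsored": True},
        {"volunteer": False, "sponsored": True},
    ]
    print(Counter(classify(c) for c in sample_credits))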

Both "purely volunteer" and "purely sponsored" credits grew — "purely sponsored" credits grew faster in absolute numbers, but for the first time in four years "purely volunteer" credits grew faster in relative numbers.

The large jump in volunteer credits can be explained by the community capturing more non-product contributions. As can be seen on the graph below, these non-product contributions are more volunteer-centric.

A graph showing how much of the contributions are volunteered vs sponsored.

Who is sponsoring the work?

Now that we've established that the majority of contributions to Drupal are sponsored, let's study which organizations contribute to Drupal. While 1,137 different organizations contributed to Drupal, approximately 50% of them received four credits or less. The top 30 organizations (roughly the top 3%) account for approximately 25% of the total credits, which implies that the top 30 companies play a crucial role in the health of the Drupal project.


Top contributing organizations based on the number of issue credits.

While not immediately obvious from the graph above, a variety of different types of companies are active in Drupal's ecosystem:

  • Traditional Drupal businesses: Small-to-medium-sized professional services companies that primarily make money using Drupal. They typically employ fewer than 100 employees, and because they specialize in Drupal, many of these professional services companies contribute frequently and are a huge part of our community. Examples are Hook 42, Centarro, The Big Blue House and Vardot.
  • Digital marketing agencies: Larger full-service agencies that have marketing-led practices using a variety of tools, typically including Drupal, Adobe Experience Manager, Sitecore and WordPress. They tend to be larger, with many of the larger agencies employing thousands of people. Examples are Wunderman, Possible and Mirum.
  • System integrators: Larger companies that specialize in bringing together different technologies into one solution. Examples are Accenture, TATA Consultancy Services, Capgemini and CI&T.
  • Hosting companies: Examples are Acquia, Rackspace, Pantheon and Platform.sh.
  • End users: Examples are Pfizer or bio.logis Genetic Information Management GmbH.

A few observations:

  • Almost all of the sponsors in the top 30 are traditional Drupal businesses with fewer than 50 employees. Only five companies in the top 30 — Pfizer, Google, CI&T, bio.logis and Acquia — are not traditional Drupal businesses. The traditional Drupal businesses are responsible for almost 80% of all the credits in the top 30. This percentage goes up if you extend beyond the top 30. It's fair to say that Drupal's maintenance and innovation largely depends on these traditional Drupal businesses.
  • The larger, multi-platform digital marketing agencies are barely contributing to Drupal. While more and more large digital agencies are building out Drupal practices, no digital marketing agencies show up in the top 30, and hardly any appear in the entire list of contributing organizations. While they are not required to contribute, I'm frustrated that we have not yet found the right way to communicate the value of contribution to these companies. We need to incentivize each of these firms to contribute back with the same commitment that we see from traditional Drupal businesses.
  • The only system integrator in the top 30 is CI&T, which ranked 4th with 795 credits. As far as system integrators are concerned, CI&T is a smaller player with approximately 2,500 employees. However, we do see various system integrators outside of the top 30, including Globant, Capgemini, Sapient and TATA Consultancy Services. In the past year, Capgemini almost quadrupled their credits from 46 to 196, TATA doubled its credits from 85 to 194, Sapient doubled its credits from 28 to 65, and Globant kept more or less steady with 41 credits. Accenture and Wipro do not appear to contribute despite doing a fair amount of Drupal work in the field.
  • Hosting companies also play an important role in our community, yet only Acquia appears in the top 30. Rackspace has 68 credits, Pantheon has 43, and Platform.sh has 23. I looked for other hosting companies in the data, but couldn't find any. In general, there is a persistent problem with hosting companies that make a lot of money with Drupal not contributing back. The contribution gap between Acquia and other hosting companies has increased, not decreased.
  • We also saw three end users in the top 30 as corporate sponsors: Pfizer (453 credits), Thunder (659 credits, up from 432 credits the year before), and the German company, bio.logis (330 credits). A notable end user is Johnson & Johnson, who was just outside of the top 30, with 221 credits, up from 29 credits the year before. Other end users outside of the top 30 include the European Commission (189 credits), Workday (112 credits), Paypal (80 credits), NBCUniversal (48 credits), Wolters Kluwer (20 credits), and Burda Media (24 credits). We also saw contributions from many universities, including the University of British Columbia (148 credits), University of Waterloo (129 credits), Princeton University (73 credits), University of Texas at Austin (57 credits), Charles Darwin University (24 credits), University of Edinburgh (23 credits), University of Minnesota (19 credits) and many more.

A graph showing that Acquia is by far the number one contributing hosting company.

Contributions by system integrators

It would be interesting to see what would happen if more end users mandated contributions from their partners. Pfizer, for example, only works with agencies that contribute back to Drupal, and uses Drupal's credit system to verify their vendors' claims. The State of Georgia started doing the same; they also made Open Source contribution a vendor selection criteria. If more end users took this stance, it could have a big impact on the number of digital agencies, hosting companies and system integrators that contribute to Drupal.

While we should encourage more organizations to sponsor Drupal contributions, we should also understand and respect that some organizations can give more than others and that some might not be able to give back at all. Our goal is not to foster an environment that demands what and how others should give back. Instead, we need to help foster an environment worthy of contribution. This is clearly laid out in Drupal's Values and Principles.

How diverse is Drupal?

Supporting diversity and inclusion within Drupal is essential to the health and success of the project. The people who work on Drupal should reflect the diversity of people who use and work with the web.

I looked at both the gender and geographic diversity of Drupal.org contributors. While these are only two examples of diversity, they are the only diversity characteristics we currently have sufficient data for. Drupal.org recently rolled out support for Big 8/Big 10, so next year we should have more demographic information.

Gender diversity

The data shows that only 8% of the recorded contributions were made by contributors who do not identify as male, which continues to indicate a wide gender gap. This is a one percent increase compared to last year. The gender imbalance in Drupal is profound and underscores the need to continue fostering diversity and inclusion in our community.

A graph showing contributions by gender: 75% of the contributions come from people who identify as male.

Last year I wrote a post about the privilege of free time in Open Source. It made the case that Open Source is not a meritocracy, because not everyone has equal amounts of free time to contribute. For example, research shows that women still spend more than twice as much time as men doing unpaid domestic work, such as housework or childcare. This makes it more difficult for women to contribute to Open Source on an unpaid, volunteer basis. It's one of the reasons why Open Source projects suffer from a lack of diversity, among others including hostile environments and unconscious biases. Drupal.org's credit data unfortunately still shows a big gender disparity in contributions:

A graph that shows that compared to males, female contributors do more sponsored work, and less volunteer work.

Ideally, over time, we can collect more data on non-binary gender designations, as well as segment some of the trends behind contributions by gender. We can also do better at collecting data on other systemic issues beyond gender alone. Knowing more about these trends can help us close existing gaps. In the meantime, organizations capable of giving back should consider financially sponsoring individuals from underrepresented groups to contribute to Open Source. Each of us needs to decide if and how we can help give time and opportunities to underrepresented groups and how we can create equity for everyone in Drupal.

Geographic diversity

When measuring geographic diversity, we saw individual contributors from six continents and 114 countries:

A graph that shows most contributions come from Europe and North America.

Contributions per capita

Contribution credits per capita, calculated as the number of contributions per continent divided by the population of that continent. 0.001% means that one in 100,000 people contribute to Drupal. In North America, 5 in 100,000 people contributed to Drupal in the last year.

Contributions from Europe and North America are both on the rise. In absolute terms, Europe contributes more than North America, but North America contributes more per capita.

Asia, South America and Africa remain big opportunities for Drupal, as their combined population accounts for 6.3 billion out of 7.5 billion people in the world. Unfortunately, the reported contributions from Asia are declining year over year. For example, compared to last year's report, there was a 17% drop in contribution from India. Despite that drop, India remains the second largest contributor behind the United States:

A graph showing the top 20 contributing countries.

The top 20 countries from which contributions originate. The data is compiled by aggregating the countries of all individual contributors behind each issue. Note that the geographical location of contributors doesn't always correspond with the origin of their sponsorship. Wim Leers, for example, works from Belgium, but his funding comes from Acquia, which has the majority of its customers in North America.

Top contributor details

To create more awareness of which organizations are sponsoring the top individual contributors, I included a more detailed overview of the top 50 contributors and their sponsors. If you are a Drupal developer looking for work, these are some of the companies I'd apply to first. If you are an end user looking for a company to work with, these are some of the companies I'd consider working with first. Not only do they know Drupal well, they also help improve your investment in Drupal.

Rank Username Issues Volunteer Sponsored Not specified Sponsors 1 kiamlaluno 1610 99% 0% 1% 2 jrockowitz 756 98% 99% 0% The Big Blue House (750), Memorial Sloan Kettering Cancer Center (5), Rosewood Marketing (1) 3 alexpott 642 6% 80% 19% Thunder (336), Acro Media Inc (100), Chapter Three (77) 4 RajabNatshah 616 1% 100% 0% Vardot (730), Webship (2) 5 volkswagenchick 519 2% 99% 0% Hook 42 (341), Kanopi Studios (171) 6 bojanz 504 0% 98% 2% Centarro (492), Ny Media AS (28), Torchbox (5), Liip (2), Adapt (2) 7 alonaoneill 489 9% 99% 0% Hook 42 (484) 8 thalles 488 0% 100% 0% CI&T (488), Janrain (3), Johnson & Johnson (2) 9 Wim Leers 437 8% 97% 0% Acquia (421), Government of Flanders (3) 10 DamienMcKenna 431 0% 97% 3% Mediacurrent (420) 11 Berdir 424 0% 92% 8% MD Systems (390) 12 chipway 356 0% 100% 0% Chipway (356) 13 larowlan 324 16% 94% 2% PreviousNext (304), Charles Darwin University (22), University of Technology, Sydney (3), Service NSW (2), Department of Justice & Regulation, Victoria (1) 14 pifagor 320 52% 100% 0% GOLEMS GABB (618), EPAM Systems (16), Drupal Ukraine Community (6) 15 catch 313 1% 95% 4% Third & Grove (286), Tag1 Consulting (11), Drupal Association (6), Acquia (4) 16 mglaman 277 2% 98% 1% Centarro (271), Oomph, Inc. (16), E.C. Barton & Co (3), Gaggle.net, Inc. (1), Bluespark (1), Thinkbean (1), LivePerson, Inc (1), Impactiv, Inc. (1), Rosewood Marketing (1), Acro Media Inc (1) 17 adci_contributor 274 0% 100% 0% ADCI Solutions (273) 18 quietone 266 41% 75% 1% Acro Media Inc (200) 19 tim.plunkett 265 3% 89% 9% Acquia (235) 20 gaurav.kapoor 253 0% 51% 49% OpenSense Labs (129), DrupalFit (111) 21 RenatoG 246 0% 100% 0% CI&T (246), Johnson & Johnson (85) 22 heddn 243 2% 98% 2% MTech, LLC (202), Tag1 Consulting (32), European Commission (22), North Studio (3), Acro Media Inc (2) 23 chr.fritsch 241 0% 99% 1% Thunder (239) 24 xjm 238 0% 85% 15% Acquia (202) 25 phenaproxima 238 0% 100% 0% Acquia (238) 26 mkalkbrenner 235 0% 100% 0% bio.logis Genetic Information Management GmbH (234), OSCE: Organization for Security and Co-operation in Europe (41), Welsh Government (4) 27 gvso 232 0% 100% 0% Google Summer of Code (214), Google Code-In (16), Zivtech (1) 28 dawehner 219 39% 84% 8% Chapter Three (176), Drupal Association (5), Tag1 Consulting (3), TES Global (1) 29 e0ipso 218 99% 100% 0% Lullabot (217), IBM (23) 30 drumm 205 0% 98% 1% Drupal Association (201) 31 gabesullice 199 0% 100% 0% Acquia (198), Aten Design Group (1) 32 amateescu 194 0% 97% 3% Pfizer, Inc. (186), Drupal Association (1), Chapter Three (1) 33 klausi 193 2% 59% 40% jobiqo - job board technology (113) 34 samuel.mortenson 187 42% 42% 17% Acquia (79) 35 joelpittet 187 28% 78% 14% The University of British Columbia (146) 36 borisson_ 185 83% 50% 3% Calibrate (79), Dazzle (13), Intracto digital agency (1) 37 Gábor Hojtsy 184 0% 97% 3% Acquia (178) 38 adriancid 182 91% 22% 2% Drupiter (40) 39 eiriksm 182 0% 100% 0% Violinist (178), Ny Media AS (4) 40 yas 179 12% 80% 10% DOCOMO Innovations, Inc. 
(143) 41 TR 177 0% 0% 100% 42 hass 173 1% 0% 99% 43 Joachim Namyslo 172 69% 0% 31% 44 alex_optim 171 0% 99% 1% GOLEMS GABB (338) 45 flocondetoile 168 0% 99% 1% Flocon de toile (167) 46 Lendude 168 52% 99% 0% Dx Experts (91), ezCompany (67), Noctilaris (9) 47 paulvandenburg 167 11% 72% 21% ezCompany (120) 48 voleger 165 98% 98% 2% GOLEMS GABB (286), Lemberg Solutions Limited (36), Drupal Ukraine Community (1) 49 lauriii 164 3% 98% 1% Acquia (153), Druid (8), Lääkärikeskus Aava Oy (2) 50 idebr 162 0% 99% 1% ezCompany (156), One Shoe (5)

Limitations of the credit system

It is important to note a few of the current limitations of Drupal.org's credit system:

  • The credit system doesn't capture all code contributions. Parts of Drupal are developed on GitHub rather than Drupal.org, and often aren't fully credited on Drupal.org. For example, Drush is maintained on GitHub instead of Drupal.org, and companies like Pantheon don't get credit for that work. The Drupal Association is working to integrate GitLab with Drupal.org. GitLab will provide support for "merge requests", which means contributing to Drupal will feel more familiar to the broader audience of Open Source contributors who learned their skills in the post-patch era. Some of GitLab's tools, such as in-line editing and web-based code review will also lower the barrier to contribution, and should help us grow both the number of contributions and contributors on Drupal.org.
  • The credit system is not used by everyone. There are many ways to contribute to Drupal that are still not captured in the credit system, including things like event organizing or providing support. Technically, that work could be captured as demonstrated by the various non-product initiatives highlighted in this post. Because using the credit system is optional, many contributors don't. As a result, contributions often have incomplete or no contribution credits. We need to encourage all Drupal contributors to use the credit system, and raise awareness of its benefits to both individuals and organizations. Where possible, we should automatically capture credits. For example, translation efforts on https://localize.drupal.org are not currently captured in the credit system but could be automatically.
  • The credit system disincentives work on complex issues. We currently don't have a way to account for the complexity and quality of contributions; one person might have worked several weeks for just one credit, while another person might receive a credit for 10 minutes of work. We certainly see a few individuals and organizations trying to game the credit system. In the future, we should consider issuing credit data in conjunction with issue priority, patch size, number of reviews, etc. This could help incentivize people to work on larger and more important problems and save smaller issues such as coding standards improvements for new contributor sprints. Implementing a scoring system that ranks the complexity of an issue would also allow us to develop more accurate reports of contributed work.

All of this means that the actual number of contributions and contributors could be significantly higher than what we report.

Like Drupal itself, the Drupal.org credit system needs to continue to evolve. Ultimately, the credit system will only be useful when the community uses it, understands its shortcomings, and suggests constructive improvements.

A first experiment with weighing credits

As a simple experiment, I decided to weigh each credit based on the adoption of the project the credit is attributed to. For example, each contribution credit to Drupal core is given a weight of 11 because Drupal core has about 1,1 million active installations. Credits to the Webform module, which has over 400,000 installations, get a weight of 4. And credits to Drupal's Commerce project gets just 1 point as it is installed on fewer than 100,000 sites.

The idea is that these weights capture the end user impact of each contribution, but also act as a proxy for the effort required to get a change committed. Getting a change accepted in Drupal core is both more difficult and more impactful than getting a change accepted to Commerce project.

This weighting is far from perfect as it undervalues non-product contributions, and it still doesn't recognize all types of product contributions (e.g. product strategy work, product management work, release management work, etc). That said, for code contributions, it may be more accurate than a purely unweighted approach.

Top contributing individuals based on weighted credits.

The top 30 contributing individuals based on weighted Drupal.org issue credits.

Top contributing organizations based on weighted credits.

The top 30 contributing organizations based on weighted Drupal.org issue credits.

Conclusions

Our data confirms that Drupal is a vibrant community full of contributors who are constantly evolving and improving the software. It's amazing to see that just in the last year, Drupal welcomed more than 8,000 individuals contributors and over 1,100 corporate contributors. It's especially nice to see the number of reported contributions, individual contributors and organizational contributors increase year over year.

To grow and sustain Drupal, we should support those that contribute to Drupal and find ways to get those that are not contributing involved in our community. Improving diversity within Drupal is critical, and we should welcome any suggestions that encourage participation from a broader range of individuals and organizations.

Sep 11 2019
Sep 11

An in-depth analysis of how Drupal's development was sponsored between July 1, 2018 and June 30, 2019.

For the past few years, I've examined Drupal.org's contribution data to understand who develops Drupal, how diverse the Drupal community is, how much of Drupal's maintenance and innovation is sponsored, and where that sponsorship comes from.

You can look at the 2016 report, the 2017 report, and the 2018 report. Each report looks at data collected in the 12-month period between July 1st and June 30th.

This year's report shows that:

  • Both the recorded number of contributors and contributions have increased.
  • Most contributions are sponsored, but volunteer contributions remain very important to Drupal's success.
  • Drupal's maintenance and innovation depends mostly on smaller Drupal agencies and Acquia. Hosting companies, multi-platform digital marketing agencies, large system integrators and end users make fewer contributions to Drupal.
  • Drupal's contributors have become more diverse, but are still not diverse enough.

Methodology

What are Drupal.org issues?

"Issues" are pages on Drupal.org. Each issue tracks an idea, feature request, bug report, task, or more. See https://www.drupal.org/project/issues for the list of all issues.

For this report, we looked at all Drupal.org issues marked "closed" or "fixed" in the 12-month period from July 1, 2018 to June 30, 2019. The issues analyzed in this report span Drupal core and thousands of contributed projects, across all major versions of Drupal.
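
To make the selection criteria concrete, here is a minimal sketch of the filtering step. It assumes a hypothetical CSV export of issues with status and closed_date columns; Drupal.org does not provide such a file out of the box, so treat the file and field names as illustrative:

```python
import csv
from datetime import date

# Reporting window used throughout this report.
START, END = date(2018, 7, 1), date(2019, 6, 30)

def in_report_window(row):
    """True if an issue was marked "closed" or "fixed" inside the window."""
    closed = date.fromisoformat(row["closed_date"])  # e.g. "2019-03-14"
    return row["status"] in ("closed", "fixed") and START <= closed <= END

# issues.csv is a hypothetical export with columns: status, closed_date, ...
with open("issues.csv", newline="") as f:
    issues = [row for row in csv.DictReader(f) if in_report_window(row)]

print(f"{len(issues)} issues were closed or fixed in the reporting window")
```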

What are Drupal.org credits?

In the spring of 2015, after proposing initial ideas for giving credit, Drupal.org added the ability for people to attribute their work in the Drupal.org issues to an organization or customer, or mark it the result of volunteer efforts.

A screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.

Drupal.org's credit system is truly unique and groundbreaking in Open Source and provides unprecedented insights into the inner workings of a large Open Source project. There are a few limitations with this approach, which we'll address at the end of this report.

What is the Drupal community working on?

In the 12-month period between July 1, 2018 and June 30, 2019, 27,522 issues were marked "closed" or "fixed", a 13% increase from the 24,447 issues in the 2017-2018 period.

In total, the Drupal community worked on 3,474 different Drupal.org projects this year compared to 3,229 projects in the 2017-2018 period — an 8% year over year increase.

The majority of the credits are the result of work on contributed modules:

A pie chart showing contributions by project type: most contributions are to contributed modules.

Compared to the previous period, contribution credits increased across all project types:

A graph showing the year over year growth of contributions per project type.

The most notable change is the large jump in "non-product credits": more and more members of the community started tracking credits for non-product activities such as organizing Drupal events (e.g. the DrupalCamp Delhi project, Drupal Developer Days, Drupal Europe and DrupalCon Europe), promoting Drupal (e.g. the Drupal pitch deck), and participating in community working groups (e.g. the Drupal Diversity and Inclusion Working Group and the Governance Working Group).

While some of these increases reflect new contributions, others are existing contributions that are newly reported. All contributions are valuable, whether they're code contributions, or non-product and community-oriented contributions such as organizing events, giving talks, leading sprints, etc. The fact that the credit system is becoming more accurate in recognizing more types of Open Source contribution is both important and positive.

Who is working on Drupal?

For this report's time period, Drupal.org's credit system received contributions from 8,513 different individuals and 1,137 different organizations — a meaningful increase from last year's report.

A graph showing that the number of individual and organizational contributors increased year over year.

Consistent with previous years, approximately 51% of the individual contributors received just one credit. Meanwhile, the top 30 contributors (the top 0.4%) account for 19% of the total credits. In other words, a relatively small number of individuals do the majority of the work. These individuals put an incredible amount of time and effort into developing Drupal and its contributed projects.
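
To illustrate how concentrated this distribution is, the share held by the top contributors can be computed directly from a per-user credit count. The sketch below uses toy numbers, not the real Drupal.org data:

```python
def top_n_share(credits_by_user, n=30):
    """Fraction of all credits held by the n most-credited contributors."""
    counts = sorted(credits_by_user.values(), reverse=True)
    return sum(counts[:n]) / sum(counts)

# Toy data: three heavy contributors plus 100 one-credit contributors.
credits = {f"user{i}": 1 for i in range(100)}
credits.update({"alice": 300, "bob": 200, "carol": 100})

print(f"Top 3 share: {top_n_share(credits, n=3):.0%}")  # -> Top 3 share: 86%
```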

Out of the top 30 contributors featured this year, 28 were active contributors in the 2017-2018 period as well. These Drupalists' dedication and continued contribution to the project has been crucial to Drupal's development.

It's also important to recognize that most of the top 30 contributors are sponsored by an organization. Their sponsorship details are provided later in this article. We value the organizations that sponsor these remarkable individuals, because without their support, it would be much harder for them to stay in the top 30.

It's also nice to see two new contributors make the top 30 this year: Alona O'Neill, sponsored by Hook 42, and Thalles Ferreira, sponsored by CI&T. Most of their credits were the result of smaller patches (e.g. removing deprecated code or fixing coding style issues) or, in some cases, non-product credits rather than new feature development or fixing complex bugs. These types of contributions are valuable and often a stepping stone toward more in-depth contribution.

How much of the work is sponsored?

Issue credits can be marked as "volunteer" and "sponsored" simultaneously (shown in jamadar's screenshot near the top of this post). This could be the case when a contributor does the necessary work to satisfy the customer's need, in addition to using their spare time to add extra functionality.

For those credits with attribution details, 18% were "purely volunteer" credits (8,433 credits), in stark contrast to the 65% that were "purely sponsored" (29,802 credits). While there are almost four times as many "purely sponsored" credits as "purely volunteer" credits, volunteer contribution remains very important to Drupal.
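
Because a single credit can carry both flags at once, the "purely volunteer" and "purely sponsored" buckets have to be computed per credit. A minimal sketch with toy flag data; the 18% and 65% mirror the figures above, while the 17% "both" remainder is an assumed illustration, not a reported number:

```python
# Toy credit records; the real figures come from Drupal.org's credit system.
credits = (
    [{"volunteer": True, "sponsored": False}] * 18   # purely volunteer
    + [{"volunteer": False, "sponsored": True}] * 65  # purely sponsored
    + [{"volunteer": True, "sponsored": True}] * 17   # both at once
)

purely_volunteer = sum(c["volunteer"] and not c["sponsored"] for c in credits)
purely_sponsored = sum(c["sponsored"] and not c["volunteer"] for c in credits)
both = sum(c["volunteer"] and c["sponsored"] for c in credits)

total = len(credits)
print(f"purely volunteer: {purely_volunteer / total:.0%}")  # 18%
print(f"purely sponsored: {purely_sponsored / total:.0%}")  # 65%
print(f"both: {both / total:.0%}")                          # 17%
```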

Contributions by volunteer vs sponsored

Both "purely volunteer" and "purely sponsored" credits grew — "purely sponsored" credits grew faster in absolute numbers, but for the first time in four years "purely volunteer" credits grew faster in relative numbers.

The large jump in volunteer credits can be explained by the community capturing more non-product contributions. As can be seen on the graph below, these non-product contributions are more volunteer-centric.

A graph showing how much of the contributions are volunteered vs sponsored.

Who is sponsoring the work?

Now that we've established that the majority of contributions to Drupal are sponsored, let's study which organizations contribute to Drupal. While 1,137 different organizations contributed to Drupal, approximately 50% of them received four credits or less. The top 30 organizations (roughly the top 3%) account for approximately 25% of the total credits, which implies that the top 30 companies play a crucial role in the health of the Drupal project.

Top contributing organizations based on the number of issue credits.

While not immediately obvious from the graph above, a variety of different types of companies are active in Drupal's ecosystem:

  • Traditional Drupal businesses: Small-to-medium-sized professional services companies that primarily make money using Drupal. They typically employ fewer than 100 people, and because they specialize in Drupal, many of these companies contribute frequently and are a huge part of our community. Examples are Hook 42, Centarro, The Big Blue House and Vardot.
  • Digital marketing agencies: Larger full-service agencies that have marketing-led practices using a variety of tools, typically including Drupal, Adobe Experience Manager, Sitecore and WordPress. They tend to be larger, with many of the largest agencies employing thousands of people. Examples are Wunderman, Possible and Mirum.
  • System integrators: Larger companies that specialize in bringing together different technologies into one solution. Examples are Accenture, TATA Consultancy Services, Capgemini and CI&T.
  • Hosting companies: Companies that host and operate Drupal sites. Examples are Acquia, Rackspace, Pantheon and Platform.sh.
  • End users: Organizations that use Drupal for their own digital experiences. Examples are Pfizer and bio.logis Genetic Information Management GmbH.

A few observations:

  • Almost all of the sponsors in the top 30 are traditional Drupal businesses with fewer than 50 employees. Only five companies in the top 30 — Pfizer, Google, CI&T, bio.logis and Acquia — are not traditional Drupal businesses. The traditional Drupal businesses are responsible for almost 80% of all the credits in the top 30. This percentage goes up if you extend beyond the top 30. It's fair to say that Drupal's maintenance and innovation largely depends on these traditional Drupal businesses.
  • The larger, multi-platform digital marketing agencies are barely contributing to Drupal. While more and more large digital agencies are building out Drupal practices, no digital marketing agencies show up in the top 30, and hardly any appear in the entire list of contributing organizations. While they are not required to contribute, I'm frustrated that we have not yet found the right way to communicate the value of contribution to these companies. We need to incentivize each of these firms to contribute back with the same commitment that we see from traditional Drupal businesses.
  • The only system integrator in the top 30 is CI&T, which ranked 4th with 795 credits. As far as system integrators are concerned, CI&T is a smaller player with approximately 2,500 employees. However, we do see various system integrators outside of the top 30, including Globant, Capgemini, Sapient and TATA Consultancy Services. In the past year, Capgemini almost quadrupled their credits from 46 to 196, TATA doubled its credits from 85 to 194, Sapient doubled its credits from 28 to 65, and Globant kept more or less steady with 41 credits. Accenture and Wipro do not appear to contribute despite doing a fair amount of Drupal work in the field.
  • Hosting companies also play an important role in our community, yet only Acquia appears in the top 30. Rackspace has 68 credits, Pantheon has 43, and Platform.sh has 23. I looked for other hosting companies in the data, but couldn't find any. In general, there is a persistent problem with hosting companies that make a lot of money with Drupal not contributing back. The contribution gap between Acquia and other hosting companies has increased, not decreased.
  • We also saw three end users in the top 30 as corporate sponsors: Pfizer (453 credits), Thunder (659 credits, up from 432 credits the year before), and the German company bio.logis (330 credits). A notable end user is Johnson & Johnson, who was just outside of the top 30 with 221 credits, up from 29 credits the year before. Other end users outside of the top 30 include the European Commission (189 credits), Workday (112 credits), PayPal (80 credits), NBCUniversal (48 credits), Wolters Kluwer (20 credits) and Burda Media (24 credits). We also saw contributions from many universities, including the University of British Columbia (148 credits), University of Waterloo (129 credits), Princeton University (73 credits), University of Texas at Austin (57 credits), Charles Darwin University (24 credits), University of Edinburgh (23 credits), University of Minnesota (19 credits) and many more.
A graph showing that Acquia is by far the number one contributing hosting company.

Contributions by system integrators

It would be interesting to see what would happen if more end users mandated contributions from their partners. Pfizer, for example, only works with agencies that contribute back to Drupal, and uses Drupal's credit system to verify their vendors' claims. The State of Georgia started doing the same; they also made Open Source contribution a vendor selection criterion. If more end users took this stance, it could have a big impact on the number of digital agencies, hosting companies and system integrators that contribute to Drupal.

While we should encourage more organizations to sponsor Drupal contributions, we should also understand and respect that some organizations can give more than others and that some might not be able to give back at all. Our goal is not to foster an environment that demands what and how others should give back. Instead, we need to help foster an environment worthy of contribution. This is clearly laid out in Drupal's Values and Principles.

How diverse is Drupal?

Supporting diversity and inclusion within Drupal is essential to the health and success of the project. The people who work on Drupal should reflect the diversity of people who use and work with the web.

I looked at both the gender and geographic diversity of Drupal.org contributors. While these are only two examples of diversity, they are the only diversity characteristics we currently have sufficient data for. Drupal.org recently rolled out support for Big 8/Big 10 demographic categories, so next year we should have more demographic information.

Gender diversity

The data shows that only 8% of the recorded contributions were made by contributors who do not identify as male, which continues to indicate a wide gender gap. This is a one percent increase compared to last year. The gender imbalance in Drupal is profound and underscores the need to continue fostering diversity and inclusion in our community.

A graph showing contributions by gender: 75% of the contributions come from people who identify as male.

Last year I wrote a post about the privilege of free time in Open Source. It made the case that Open Source is not a meritocracy, because not everyone has equal amounts of free time to contribute. For example, research shows that women still spend more than twice as much time as men on unpaid domestic work, such as housework or childcare. This makes it more difficult for women to contribute to Open Source on an unpaid, volunteer basis. It's one of the reasons why Open Source projects suffer from a lack of diversity, among others including hostile environments and unconscious biases. Drupal.org's credit data unfortunately still shows a big gender disparity in contributions:

A graph that shows that compared to males, female contributors do more sponsored work, and less volunteer work.

Ideally, over time, we can collect more data on non-binary gender designations, as well as segment some of the trends behind contributions by gender. We can also do better at collecting data on other systemic issues beyond gender alone. Knowing more about these trends can help us close existing gaps. In the meantime, organizations capable of giving back should consider financially sponsoring individuals from underrepresented groups to contribute to Open Source. Each of us needs to decide if and how we can help give time and opportunities to underrepresented groups and how we can create equity for everyone in Drupal.

Geographic diversity

When measuring geographic diversity, we saw individual contributors from six continents and 114 countries:

A graph that shows most contributions come from Europe and North America.

Contributions per capita

Contribution credits per capita, calculated as the number of contributions per continent divided by the population of that continent. 0.001% means that one in 100,000 people contributes to Drupal. In North America, 5 in 100,000 people contributed to Drupal over the last year.

Contributions from Europe and North America are both on the rise. In absolute terms, Europe contributes more than North America, but North America contributes more per capita.
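
The per-capita figure in the caption above is a simple ratio. A small helper makes the arithmetic explicit; the numbers in the example are made up for illustration and are not the report's actual data:

```python
def per_capita(credits, population):
    """Contribution credits per 100,000 inhabitants of a region."""
    return credits / population * 100_000

# Illustrative only: ~29,000 credits from a population of ~580 million
# works out to 5 credits per 100,000 people.
print(per_capita(credits=29_000, population=580_000_000))  # -> 5.0
```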

Asia, South America and Africa remain big opportunities for Drupal, as their combined population accounts for 6.3 billion out of 7.5 billion people in the world. Unfortunately, the reported contributions from Asia are declining year over year. For example, compared to last year's report, there was a 17% drop in contributions from India. Despite that drop, India remains the second largest contributor behind the United States:

A graph showing the top 20 contributing countries.

The top 20 countries from which contributions originate. The data is compiled by aggregating the countries of all individual contributors behind each issue. Note that the geographical location of contributors doesn't always correspond with the origin of their sponsorship. Wim Leers, for example, works from Belgium, but his funding comes from Acquia, which has the majority of its customers in North America.

Top contributor details

To create more awareness of which organizations are sponsoring the top individual contributors, I included a more detailed overview of the top 50 contributors and their sponsors. If you are a Drupal developer looking for work, these are some of the companies I'd apply to first. If you are an end user looking for a company to work with, these are some of the companies I'd consider working with first. Not only do they know Drupal well; they also help improve your investment in Drupal.

Rank | Username | Issues | Volunteer | Sponsored | Not specified | Sponsors
1 | kiamlaluno | 1610 | 99% | 0% | 1% |
2 | jrockowitz | 756 | 98% | 99% | 0% | The Big Blue House (750), Memorial Sloan Kettering Cancer Center (5), Rosewood Marketing (1)
3 | alexpott | 642 | 6% | 80% | 19% | Thunder (336), Acro Media Inc (100), Chapter Three (77)
4 | RajabNatshah | 616 | 1% | 100% | 0% | Vardot (730), Webship (2)
5 | volkswagenchick | 519 | 2% | 99% | 0% | Hook 42 (341), Kanopi Studios (171)
6 | bojanz | 504 | 0% | 98% | 2% | Centarro (492), Ny Media AS (28), Torchbox (5), Liip (2), Adapt (2)
7 | alonaoneill | 489 | 9% | 99% | 0% | Hook 42 (484)
8 | thalles | 488 | 0% | 100% | 0% | CI&T (488), Janrain (3), Johnson & Johnson (2)
9 | Wim Leers | 437 | 8% | 97% | 0% | Acquia (421), Government of Flanders (3)
10 | DamienMcKenna | 431 | 0% | 97% | 3% | Mediacurrent (420)
11 | Berdir | 424 | 0% | 92% | 8% | MD Systems (390)
12 | chipway | 356 | 0% | 100% | 0% | Chipway (356)
13 | larowlan | 324 | 16% | 94% | 2% | PreviousNext (304), Charles Darwin University (22), University of Technology, Sydney (3), Service NSW (2), Department of Justice & Regulation, Victoria (1)
14 | pifagor | 320 | 52% | 100% | 0% | GOLEMS GABB (618), EPAM Systems (16), Drupal Ukraine Community (6)
15 | catch | 313 | 1% | 95% | 4% | Third & Grove (286), Tag1 Consulting (11), Drupal Association (6), Acquia (4)
16 | mglaman | 277 | 2% | 98% | 1% | Centarro (271), Oomph, Inc. (16), E.C. Barton & Co (3), Gaggle.net, Inc. (1), Bluespark (1), Thinkbean (1), LivePerson, Inc (1), Impactiv, Inc. (1), Rosewood Marketing (1), Acro Media Inc (1)
17 | adci_contributor | 274 | 0% | 100% | 0% | ADCI Solutions (273)
18 | quietone | 266 | 41% | 75% | 1% | Acro Media Inc (200)
19 | tim.plunkett | 265 | 3% | 89% | 9% | Acquia (235)
20 | gaurav.kapoor | 253 | 0% | 51% | 49% | OpenSense Labs (129), DrupalFit (111)
21 | RenatoG | 246 | 0% | 100% | 0% | CI&T (246), Johnson & Johnson (85)
22 | heddn | 243 | 2% | 98% | 2% | MTech, LLC (202), Tag1 Consulting (32), European Commission (22), North Studio (3), Acro Media Inc (2)
23 | chr.fritsch | 241 | 0% | 99% | 1% | Thunder (239)
24 | xjm | 238 | 0% | 85% | 15% | Acquia (202)
25 | phenaproxima | 238 | 0% | 100% | 0% | Acquia (238)
26 | mkalkbrenner | 235 | 0% | 100% | 0% | bio.logis Genetic Information Management GmbH (234), OSCE: Organization for Security and Co-operation in Europe (41), Welsh Government (4)
27 | gvso | 232 | 0% | 100% | 0% | Google Summer of Code (214), Google Code-In (16), Zivtech (1)
28 | dawehner | 219 | 39% | 84% | 8% | Chapter Three (176), Drupal Association (5), Tag1 Consulting (3), TES Global (1)
29 | e0ipso | 218 | 99% | 100% | 0% | Lullabot (217), IBM (23)
30 | drumm | 205 | 0% | 98% | 1% | Drupal Association (201)
31 | gabesullice | 199 | 0% | 100% | 0% | Acquia (198), Aten Design Group (1)
32 | amateescu | 194 | 0% | 97% | 3% | Pfizer, Inc. (186), Drupal Association (1), Chapter Three (1)
33 | klausi | 193 | 2% | 59% | 40% | jobiqo - job board technology (113)
34 | samuel.mortenson | 187 | 42% | 42% | 17% | Acquia (79)
35 | joelpittet | 187 | 28% | 78% | 14% | The University of British Columbia (146)
36 | borisson_ | 185 | 83% | 50% | 3% | Calibrate (79), Dazzle (13), Intracto digital agency (1)
37 | Gábor Hojtsy | 184 | 0% | 97% | 3% | Acquia (178)
38 | adriancid | 182 | 91% | 22% | 2% | Drupiter (40)
39 | eiriksm | 182 | 0% | 100% | 0% | Violinist (178), Ny Media AS (4)
40 | yas | 179 | 12% | 80% | 10% | DOCOMO Innovations, Inc. (143)
41 | TR | 177 | 0% | 0% | 100% |
42 | hass | 173 | 1% | 0% | 99% |
43 | Joachim Namyslo | 172 | 69% | 0% | 31% |
44 | alex_optim | 171 | 0% | 99% | 1% | GOLEMS GABB (338)
45 | flocondetoile | 168 | 0% | 99% | 1% | Flocon de toile (167)
46 | Lendude | 168 | 52% | 99% | 0% | Dx Experts (91), ezCompany (67), Noctilaris (9)
47 | paulvandenburg | 167 | 11% | 72% | 21% | ezCompany (120)
48 | voleger | 165 | 98% | 98% | 2% | GOLEMS GABB (286), Lemberg Solutions Limited (36), Drupal Ukraine Community (1)
49 | lauriii | 164 | 3% | 98% | 1% | Acquia (153), Druid (8), Lääkärikeskus Aava Oy (2)
50 | idebr | 162 | 0% | 99% | 1% | ezCompany (156), One Shoe (5)

Limitations of the credit system

It is important to note a few of the current limitations of Drupal.org's credit system:

  • The credit system doesn't capture all code contributions. Parts of Drupal are developed on GitHub rather than Drupal.org, and often aren't fully credited on Drupal.org. For example, Drush is maintained on GitHub instead of Drupal.org, and companies like Pantheon don't get credit for that work. The Drupal Association is working to integrate GitLab with Drupal.org. GitLab will provide support for "merge requests", which means contributing to Drupal will feel more familiar to the broader audience of Open Source contributors who learned their skills in the post-patch era. Some of GitLab's tools, such as in-line editing and web-based code review, will also lower the barrier to contribution, and should help us grow both the number of contributions and contributors on Drupal.org.
  • The credit system is not used by everyone. There are many ways to contribute to Drupal that are still not captured in the credit system, including things like event organizing or providing support. Technically, that work could be captured, as demonstrated by the various non-product initiatives highlighted in this post. Because using the credit system is optional, many contributors don't use it. As a result, contributions often have incomplete or no contribution credits. We need to encourage all Drupal contributors to use the credit system, and raise awareness of its benefits to both individuals and organizations. Where possible, we should capture credits automatically. For example, translation efforts on https://localize.drupal.org are not currently captured in the credit system, but could be credited automatically.
  • The credit system disincentivizes work on complex issues. We currently don't have a way to account for the complexity and quality of contributions; one person might have worked several weeks for just one credit, while another person might receive a credit for 10 minutes of work. We certainly see a few individuals and organizations trying to game the credit system. In the future, we should consider issuing credit data in conjunction with issue priority, patch size, number of reviews, etc. This could help incentivize people to work on larger and more important problems and save smaller issues, such as coding standards improvements, for new contributor sprints. Implementing a scoring system that ranks the complexity of an issue would also allow us to develop more accurate reports of contributed work; a rough sketch of what such a score could look like follows this list.
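
The sketch below shows one possible shape for such a complexity-aware score. The factors and weights are hypothetical and not an existing Drupal.org feature:

```python
# Hypothetical complexity-aware credit score: weight by issue priority,
# patch size and amount of review. None of this exists on Drupal.org today.
PRIORITY_WEIGHT = {"critical": 3.0, "major": 2.0, "normal": 1.0, "minor": 0.5}

def credit_score(priority, lines_changed, review_count):
    size_factor = min(lines_changed / 100, 3.0)  # cap the reward for huge patches
    review_factor = 1.0 + 0.1 * review_count     # reward well-reviewed work
    return PRIORITY_WEIGHT.get(priority, 1.0) * (1.0 + size_factor) * review_factor

# A critical, heavily reviewed patch scores far above a small style fix:
print(credit_score("critical", lines_changed=400, review_count=5))  # 18.0
print(credit_score("minor", lines_changed=5, review_count=0))       # 0.525
```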

All of this means that the actual number of contributions and contributors could be significantly higher than what we report.

Like Drupal itself, the Drupal.org credit system needs to continue to evolve. Ultimately, the credit system will only be useful when the community uses it, understands its shortcomings, and suggests constructive improvements.

A first experiment with weighing credits

As a simple experiment, I decided to weight each credit based on the adoption of the project the credit is attributed to. For example, each contribution credit to Drupal core is given a weight of 11 because Drupal core has about 1.1 million active installations. Credits to the Webform module, which has over 400,000 installations, get a weight of 4. And credits to Drupal's Commerce project get just 1 point, as it is installed on fewer than 100,000 sites.
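
In code, this weighting amounts to roughly one point per 100,000 active installations, with a floor of one point. A minimal sketch, using the approximate install counts from the text (the Commerce count below is an assumed example under the "fewer than 100,000" figure):

```python
def credit_weight(active_installs):
    """Weight of one credit: ~1 point per 100,000 active installs, minimum 1."""
    return max(1, round(active_installs / 100_000))

print(credit_weight(1_100_000))  # Drupal core -> 11
print(credit_weight(400_000))    # Webform -> 4
print(credit_weight(90_000))     # Commerce (assumed install count) -> 1
```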

The idea is that these weights capture the end user impact of each contribution, but also act as a proxy for the effort required to get a change committed. Getting a change accepted into Drupal core is both more difficult and more impactful than getting a change accepted into the Commerce project.

This weighting is far from perfect as it undervalues non-product contributions, and it still doesn't recognize all types of product contributions (e.g. product strategy work, product management work, release management work, etc). That said, for code contributions, it may be more accurate than a purely unweighted approach.

Top contributing individuals based on weighted credits.

The top 30 contributing individuals based on weighted Drupal.org issue credits.

Top contributing organizations based on weighted credits.

The top 30 contributing organizations based on weighted Drupal.org issue credits.

Conclusions

Our data confirms that Drupal is a vibrant community full of contributors who are constantly evolving and improving the software. It's amazing to see that just in the last year, Drupal welcomed more than 8,000 individual contributors and over 1,100 corporate contributors. It's especially nice to see the number of reported contributions, individual contributors and organizational contributors increase year over year.

To grow and sustain Drupal, we should support those that contribute to Drupal and find ways to get those that are not contributing involved in our community. Improving diversity within Drupal is critical, and we should welcome any suggestions that encourage participation from a broader range of individuals and organizations.

September 11, 2019

13 min read time


Sep 06 2019
Sep 06

I'm excited to share that when Drupal 8.8 drops in December, Drupal's WYSIWYG editor will allow media embedding.

You may wonder: Why is that worth announcing on your blog? It's just one new button in my WYSIWYG editor.

It's a big deal because Drupal's media management has been going through a decade-long transformation. The addition of WYSIWYG integration completes the final milestone. You can read more about it in Wim's blog post.

Drupal 8.8 should ship with complete media management, which is fantastic news for site builders and content authors who have long wanted a simpler way to embed media in Drupal.

Congratulations to the Media Initiative team for this significant achievement!

September 06, 2019

27 sec read time

Sep 04 2019
Sep 04

I'm excited to announce that Acquia has acquired Cohesion, the creator of DX8, a software-as-a-service (SaaS) visual Drupal website builder made for marketers and designers. With Cohesion DX8, users can create and design Drupal websites without having to write PHP, HTML or CSS, or know how a Drupal theme works. Instead, they can create designs, layouts and pages using a drag-and-drop user interface.

Amazon founder and CEO Jeff Bezos is often asked to predict what the future will be like in 10 years. One time, he famously answered that predictions are the wrong way to go about business strategy. Bezos said that the secret to business success is to focus on the things that will not change. By focusing on those things that won't change, you know that all the time, effort and money you invest today is still going to be paying you dividends 10 years from now. For Amazon's e-commerce business, he knows that in the next decade people will still want faster shipping and lower shipping costs.

As I wrote in a recent blog post, no-code and low-code website building solutions have had an increasing impact on the web since the early 1990s. While the no-code and low-code trend has been a 25-year long trend, I believe we're only at the beginning. There is no doubt in my mind that 10 years from today, we'll still be working on making website building faster and easier.

Acquia's acquisition of Cohesion is a direct response to this trend, empowering marketers, content authors and designers to build Drupal websites faster and cheaper than ever. This is big news for Drupal as it will lower the cost of ownership and accelerate the pace of website development. For example, if you are still on Drupal 7, and are looking to migrate to Drupal 8, I'd take a close look at Cohesion DX8. It could accelerate your Drupal 8 migration and reduce its cost.

Here is a quick look at some of my favorite features:

An animated GIF showing how to edit styles with Cohesion. An easy-to-use “style builder” enables designers to create templates from within the browser. The image illustrates how easy it is to modify styles, in this case a button design.

An animated GIF showing how to edit a page with Cohesion. In-context editing makes it really easy to modify content on the page and even change the layout from one column to two columns and see the results immediately.

I'm personally excited to work with the Cohesion team on unlocking the power of Drupal for more organizations worldwide. I'll share more about Cohesion DX8's progress in the coming months. In the meantime, welcome to the team, Cohesion!

September 04, 2019

1 min read time


Aug 19 2019
Aug 19

Low-code and no-code tools for the web are on a decade-long rise; they enable self-service for marketers, and allow developers to focus on innovation.

Low code no code

A version of this article was originally published on Devops.com.

Twelve years ago, I wrote a post called Drupal and Eliminating Middlemen. For years, it was one of the most-read pieces on my blog. Later, I followed that up with a blog post called The Assembled Web, which remains one of the most read posts to date.

The point of both blog posts was the same: I believed that the web would move toward a model where non-technical users could assemble their own sites with little to no coding experience of their own.

This idea isn't new; no-code and low-code tools on the web have been on a 25-year long rise, starting with the first web content management systems in the early 1990s. Since then no-code and low-code solutions have had an increasing impact on the web. Examples include:

While this has been a long-run trend, I believe we're only at the beginning.

Trends driving the low-code and no-code movements

According to the Forrester Wave: Low-Code Development Platforms for AD&D Professionals, Q1 2019 report: "In our survey of global developers, 23% reported using low-code platforms in 2018, and another 22% planned to do so within a year."

Major market forces driving this trend include a talent shortage among developers, with an estimated one million computer programming jobs expected to remain unfilled by 2020 in the United States alone.

What is more, the developers who are employed are often overloaded with work and struggle with how to prioritize it all. Some of this burden could be removed by low-code and no-code tools.

In addition, the fact that technology has permeated every aspect of our lives — from our smartphones to our smart homes — has driven a desire for more people to become creators. As the founder of Product Hunt, Ryan Hoover, said in a blog post: "As creating things on the internet becomes more accessible, more people will become makers."

But this does not only apply to individuals. Consider this: the typical large organization has to build and maintain hundreds of websites. They need to build, launch and customize these sites in days or weeks, not months. Today and in the future, marketers can embrace no-code and low-code tools to rapidly develop websites.

Abstraction drives innovation

As discussed in my middleman blog post, developers won't go away. Just as the role of the original webmaster (FTP hand-written HTML files, anyone?) has evolved with the advent of web content management systems, the role of web developers is changing with the rise of low-code and no-code tools.

Successful no-code approaches abstract away complexity for web development. This enables less technical people to do things that previously could only be done by developers. And when those abstractions happen, developers often move on to the next area of innovation.

When everyone is a builder, more good things will happen on the web. I was excited about this trend more than 12 years ago, and remain excited today. I'm eager to see the progress no-code and low-code solutions will bring to the web in the next decade.

August 19, 2019

2 min read time


Aug 12 2019
Aug 12
A special bird flying in space has the spotlight while lots of identical birds sit on the ground (lack of diversity)

At DrupalCon Seattle, I spoke about some of the challenges Open Source communities like Drupal often have with increasing contributor diversity. We want our contributor base to look like everyone in the world who uses Drupal's technology on the internet, and unfortunately, that is not quite the reality today.

One way to step up is to help more people from underrepresented groups speak at Drupal conferences and workshops. Seeing and hearing from a more diverse group of people can inspire new contributors from all races, ethnicities, gender identities, geographies, religious groups, and more.

To help with this effort, the Drupal Diversity and Inclusion group is hosting a speaker diversity training workshop on September 21 and 28 with Jill Binder, whose expertise has also driven major speaker diversity improvements within the WordPress community.

I'd encourage you to either sign up for this session yourself or send the information to someone in a marginalized group who has knowledge to share, but may be hesitant to speak up. Helping someone see that their expertise is valuable is the kind of support we need to drive meaningful change.

August 12, 2019

44 sec read time

Aug 01 2019
Aug 01

For the sixth year in a row, Acquia has been recognized as a leader in the Gartner Magic Quadrant for Web Content Management.

For the sixth year in a row, Acquia has been recognized as a leader in the Gartner Magic Quadrant for Web Content Management. Acquia first entered the Web Content Management Magic Quadrant back in 2012 as a Visionary, and since then we've moved further than any other vendor to cement our leadership position.

As I've written before, analyst reports like the Gartner Magic Quadrant are important because they introduce organizations to Acquia and Drupal. As I've put it before: "If you want to find a good coffee place, you use Yelp. If you want to find a nice hotel in New York, you use TripAdvisor. Similarly, if a CIO or CMO wants to spend $250,000 or more on enterprise software, they often consult an analyst firm like Gartner."

In 2012, Gartner didn't fully understand the benefits of Acquia being the only WCM company that embraced both Open Source and cloud. Just seven years later, our unique approach has forever changed web content management. This year, Acquia moved up again in both of the dimensions that Gartner uses to rank vendors: Completeness of Vision and Ability to Execute. You'll see in the Magic Quadrant graphic that Acquia has tied Sitecore for the first time:

Acquia recognized as a leader, next to Adobe, Sitecore and Episerver, in the 2019 Gartner Magic Quadrant for Web Content Management.

In mature markets like Web Content Management, there is almost always a single proprietary leader and a single Open Source leader: Oracle and MongoDB, Splunk and Elastic, VMware and Docker, GitLab and GitHub. That is why I believe that next year it will be Acquia and Adobe at the very top of the WCM Magic Quadrant. Sitecore and Episerver will continue to fight for third place among companies that prefer a Microsoft-centric approach. I was not surprised to see Sitecore move down this year as they work to overcome technical product debt and a cloud transition, which has led to strange decisions like acquiring a services company.

You can read the complete report on Acquia.com. Thank you to everyone who contributed to this result!

August 01, 2019

1 min read time


Jul 22 2019
Jul 22

An opinion piece featuring my thoughts on what is wrong with the current web, and how we might fix it, ran on CNN last week.

Volunteering as a mentor at CoderDojo to teach young people, including my own kids, how to write software.

Last week, I published an opinion piece on CNN featuring my thoughts on what is wrong with the web and how we might fix it.

In short, I really miss some things about the original web, and don't want my kids to grow up being exploited by mega-corporations.

I am hopeful that increased regulation and decentralized web applications may fix some of the web's current problems. While some problems are really difficult to fix, at the very least, my kids will have more options to choose from when it comes to their data privacy and overall experience on the web.

You can read the first few paragraphs below, and view the whole article on CNN.

I still remember the feeling in the year 2000 when a group of five friends and I shared a modem connection at the University of Antwerp. I used it to create an online message board so we could chat back and forth about mostly mundane things. The modem was slow by today's standards, but the newness of it all was an adrenaline rush. Little did I know that message board would change my life.

In time, I turned this internal message board into a public news and discussion site, where I shared my own experiences using experimental web technologies. Soon, I started hearing from people all over the world who wanted to offer suggestions on how to improve my website, but who also wanted to use my site's technology to build their own websites and experiment with emerging web technologies.

Before long, I was connected to a network of strangers who would help me build Drupal.

July 22, 2019

1 min read time

Jun 28 2019
Jun 28

Acquia is growing its presence in the Asia-Pacific region with a new Pune, India office.

This week, Acquia announced the opening of its new office in Pune, India, which extends our presence in the Asia-Pacific region. In addition to Pune, we already have offices in Australia and Japan.

I've made several trips to India in recent years, and have experienced not only Drupal's fast growth, but also the contagious excitement and passion for Drupal from the people I've met there.

While I wasn't able to personally attend the opening of our new office, I'm looking forward to visiting the Pune office soon.

For now, here are a few pictures from our grand opening celebration:

Acquians at the opening of the Pune, India office

June 28, 2019

27 sec read time

Jun 20 2019
Jun 20

Acquia Content Cloud, a new content-as-a-service solution for simplified content creation and syndication across multi-channel digital experiences, is now available in private beta.

Earlier this week at our Acquia Engage conference in London, Acquia announced a new product called "Content Cloud", a headless, SaaS-based content-as-a-service solution built on Drupal.

Years ago, we heard that organizations wanted to:

  • Create content that is easy to re-use across different channels, such as websites and mobile applications, email, digital screens, and more.

  • Use a content management system with a modern web service API that allows them to use their favorite front-end framework (e.g. React, Angular or Vue.js) to build websites and digital experiences.

As a result, Acquia spent the last 5+ years helping to improve Drupal's web services capabilities and authoring experience.

But we also heard that organizations want to:

  • Use a single repository to manage all their organization's content.
  • Make it really easy to synchronize content between all their Drupal sites.
  • Manage all content editors from a central place to enable centralized content governance and workflows.
  • Automate the installation, maintenance, and upgrades of their Drupal-based content repository.

All of the above becomes even more important as organizations scale the number of content creators, websites and applications. Many large organizations have to build and maintain hundreds of sites and manage hundreds of content creators.

So this week, at our European customer conference, we lifted the curtain on Acquia Content Cloud, a new Acquia product built using Drupal. Acquia Content Cloud is a content-as-a-service solution that enables simplified, headless content creation and syndication across multi-channel digital experiences.

For now, we are launching an early access beta program. If you’re interested in being considered for the beta or want to learn more as Content Cloud moves toward general availability, you can sign up here.

In time, I plan to write more about Content Cloud, especially as we get closer to its initial release. Until then, you can watch the Acquia Content Cloud teaser video below:

June 20, 2019

1 min read time

Jun 18 2019
Jun 18

The new version of Acquia Lift makes it simpler for marketers to run website personalization campaigns themselves, without the need for writing code.

Today, we released a new version of Acquia Lift, our web personalization tool.

In today's world, personalization has become central to the most successful customer experiences. Most organizations know that personalization is no longer optional, but have put it off because it can be too difficult. The new Acquia Lift solves that problem.

Where earlier versions of Acquia Lift required a degree of fine-tuning by a developer, the new version simplifies how marketers create and launch website personalization. Now anyone can point, click and personalize content without writing any code.

We started working on the new version of Acquia Lift in early 2018, well over a year ago. In the process we interviewed over 50 customers, redesigned the user interface and workflows, and added various new capabilities to make it easier for marketers to run website personalization campaigns. And today, at our European customer conference, Acquia Engage London, we released the new Acquia Lift to the public.

You can see all of the new features in action in this 5-minute Acquia Lift demo video:

The new Acquia Lift offers the best web personalization solution in Acquia's history, and definitely the best tool for Drupal.

June 18, 2019

47 sec read time

Jun 11 2019
Jun 11

Recently I was interviewed on RTL Z, the Dutch business news television network. In the interview, I talk about the growth and success of Drupal, and what is to come for the future of the web. Beware, the interview is in Dutch. If you speak Dutch and are subscribed to my blog (hi mom!), feel free to check it out!

June 11, 2019

15 sec read time

Jun 11 2019
Jun 11

Recently, GitHub announced an initiative called GitHub Sponsors where open source software users can pay contributors for their work directly within GitHub.

There has been quite a bit of debate about whether initiatives like this are good or bad for Open Source.

On the one hand, there is the concern that the commercialization of Open Source could corrupt Open Source communities or harm contributors' intrinsic motivation and quest for purpose.

On the other hand, there is the recognition that commercial sponsorship is often a necessary condition for Open Source sustainability. Many communities have found that to support their growth, as a part of their natural evolution, they need to pay developers or embrace corporate sponsors.

Personally, I believe initiatives like GitHub Sponsors, and others like Open Collective, are a good thing.

They help not only with the long-term sustainability of Open Source communities, but also with diversity in Open Source. Underrepresented groups, in particular, don't always have the privilege of free time to contribute to Open Source outside of work hours. Most software developers have to focus on making a living before they can focus on self-actualization. Without funding, Open Source communities risk losing or excluding valuable talent.

May 29 2019
May 29

I'm pleased to share that Francesco Placella (plach on Drupal.org) is moving from a "provisional" core committer to a full-fledged framework manager. (Read more about Drupal core's governance structure.)

Francesco has been a member of the Drupal community for over 11 years. He contributed an incredible amount to multilingual efforts, the Field and Entity API, and was a top contributor to the Drupal Association's D8 Accelerate program, so you can also thank him for Drupal 8 getting released. :)

This experience has given Francesco an extremely well-rounded knowledge of Drupal's API underpinnings, making him a perfect candidate for Framework Manager. He is also extremely meticulous in his patch reviews, and always willing to jump in on problems to help others.

The rest of the committer team were all extremely happy to recommend his promotion to full-fledged committer, so please join me in formally welcoming plach to the team!

May 24 2019
May 24

On the day that the Drupal-powered Justice.gov website released Special Counsel Robert Mueller's long-awaited report on Russian interference in the U.S. election, the site experienced a 7,000% increase in traffic.

The report was successfully delivered without interruption via Acquia, using Drupal. 

According to Federal Computer Week, by 5pm on April 18, there had already been 587 million site visits, with 247 million happening in the first hour the report was released. The site typically receives 8 million visits per day. 

There were no IT performance or availability issues during the release of the 142-MB report, which is ideal during these types of high-pressure events when the world is watching. Thus, no news is good news. 

Keeping sites like this up and available to the public is an important part of democracy and the freedom of information. 

I'm proud of Drupal and Acquia’s ability to deliver when it matters most!

May 23 2019
May 23

Last month, Special Counsel Robert Mueller's long-awaited report on Russian interference in the U.S. election was released on the Justice.gov website.

With the help of Acquia and Drupal, the report was successfully delivered without interruption, despite a 7,000% increase in traffic on its release date, according to the Ottawa Business Journal.

According to Federal Computer Week, by 5pm on the day of the report's release, there had already been 587 million site visits, with 247 million happening within the first hour.

During these types of high-pressure events when the world is watching, no news is good news. Keeping sites like this up and available to the public is an important part of democracy and the freedom of information. I'm proud of Acquia's and Drupal's ability to deliver when it matters most!

May 23, 2019

31 sec read time

May 08 2019
May 08

Acquia acquired Mautic, the open source marketing automation platform, to deliver the only Open Digital Experience Platform as an alternative to the expensive, closed, and stagnant marketing clouds.

Acquia joins forces with Mautic

I'm happy to announce today that Acquia acquired Mautic, an open source marketing automation and campaign management platform.

A couple of decades ago, I was convinced that every organization required a website — a thought that sounds rather obvious now. Today, I am convinced that every organization will need a Digital Experience Platform (DXP).

Having a website is no longer enough: customers expect to interact with brands through their websites, email, chat and more. They also expect these interactions to be relevant and personalized.

If you don't know Mautic, think of it as an alternative to Adobe's Marketo or Salesforce's Marketing Cloud. Just like these solutions, Mautic provides marketing automation and campaign management capabilities. It's differentiated in that it is easier to use, supports one-to-one customer experiences across many channels, integrates more easily with other tools, and is less expensive.

The flowchart-style visual campaign builder shown at the beginning of the Mautic demo video above is one of my favorite features. I love how it allows marketers to combine content, user profiles, events and a decision engine to deliver the next best action to customers.

Mautic is a relatively young company, but it has quickly grown into the largest open source player in the marketing automation space, with more than 200,000 installations. Its ease of use, flexibility and feature completeness have won over many marketers in a very short time: the company's top line grew almost 400 percent year-over-year, its number of customers tripled, and Mautic won multiple awards for product innovation and customer service.

The acquisition of Mautic accelerates Acquia's product strategy to deliver the only Open Digital Experience Platform:

The pieces that make up a Digital Experience Platform, and how Mautic fits into Acquia's Open Digital Experience Platform. Acquia is strong in content management, personalization, user profile management and commerce (yellow blocks). Mautic adds or improves Acquia's multi-channel delivery, campaign management and journey orchestration capabilities (purple blocks).

There are many reasons why we like Mautic, but here are my top 3:

Reason 1: Disrupting the market with "open"

Open Source will disrupt every component of the modern technology stack. It's not a matter of if, it's when.

Just as Drupal disrupted web content management with Open Source, we believe Mautic disrupts marketing automation.

With Mautic, Acquia is now the only open and open source alternative to the expensive, closed, and stagnant marketing clouds.

I'm both proud and excited that Acquia is doubling down on Open Source. Given our extensive open source experience, we believe we can help grow Mautic even faster.

Reason 2: Innovating through integrations

To build an optimal customer experience, marketers need to integrate with different data sources, customer technologies, and bespoke in-house platforms. Instead of buying a suite from a single vendor, most marketers want an open platform that allows for open innovation and unlimited integrations.

Only an open architecture can connect any technology in the marketing stack, and only an open source innovation model can evolve fast enough to offer integrations with thousands of marketing technologies (to date, there are 7,000 vendors in the martech landscape).

Because developers are largely responsible for creating and customizing marketing platforms, marketing technology should meet the needs of both business users and technology architects. Unlike other companies in the space, Mautic is loved by both marketers and developers. With Mautic, Acquia continues to focus on both personas.

Reason 3: The same technology stack and business model

Like Drupal, Mautic is built in PHP and Symfony, and like Drupal, Mautic uses the GNU GPL license. Having the same technology stack has many benefits.

Digital agencies or in-house teams need to deliver integrated marketing solutions. Because both Drupal and Mautic use the same technology stack, a single team of developers can work on both.

The similarities also make it possible for both open source communities to collaborate — while it is not something you can force to happen, it will be interesting to see how that dynamic naturally plays out over time.

Last but not least, our business models are also very aligned. Both Acquia and Mautic were "born in the cloud" and make money by offering subscription- and cloud-based delivery options. This means you pay for only what you need and that you can focus on using the products rather than running and maintaining them.

Mautic offers several commercial solutions:

  • Mautic Cloud, a fully managed SaaS version of Mautic with premium features not available in Open Source.
  • For larger organizations, Mautic has a proprietary product called Maestro. Large organizations operate in many regions or territories, and have teams dedicated to each territory. With Maestro, each territory can get its own Mautic instance, but they can still share campaign best-practices, and repeat successful campaigns across territories. It's a unique capability, which is very aligned with the Acquia Cloud Site Factory.

Try Mautic

If you want to try Mautic, you can either install the community version yourself or check out the demo or sandbox environment of Mautic Open Marketing Cloud.

Conclusion

We're very excited to join forces with Mautic. It is such a strategic step for Acquia. Together we'll provide our customers with more freedom, faster innovation, and more flexibility. Open digital experiences are the way of the future.

I've got a lot more to share about the Mautic acquisition, how we plan to integrate Mautic in Acquia's solutions, how we could build bridges between the Drupal and Mautic community, how it impacts the marketplace, and more.

In time, I'll write more about these topics on this blog. In the meantime, please feel free to join DB Hurley, Mautic's founder and CTO, and me in a live Q&A session on Thursday, May 9 at 10am ET. We'll try to answer your questions about Acquia and Mautic.

May 08, 2019

3 min read time

Apr 30 2019
Apr 30

The Drupal Association is announcing that Heather Rocker, former CEO of Girls, Inc. and Executive Director of Women in Technology, is joining as its next Executive Director.

The Drupal Association announced today that Heather Rocker has been selected as its next executive director.

This is exciting news because it concludes a seven-month search since Megan Sanicki left.

We looked long and hard for someone who could help us grow the global Drupal community by building on its diversity, working with developers and agency partners, and expanding our work with new audiences such as content creators and marketers.

The Drupal Association (including me) believes that Heather can do all of that, and is the best person to lead Drupal into its next phase of growth.

Heather earned her engineering degree from Georgia Tech. She has dedicated much of her career to working with women in technology, both as the CEO of Girls, Inc. of Greater Atlanta and the Executive Director of Women in Technology.

We were impressed not only with her valuable experience with volunteer organizations, but also her work in the private sector with large customers. Most recently, Heather was part of the management team at Systems Evolution, a team of 250 business consultants, where she specialized in sales operations and managed key client relationships.

She is also a robotics fanatic who organizes and judges competitions for children. So, maybe we’ll see some robots roaming around DrupalCon in the future!

As you can tell, Heather will bring a lot of great experience to the Drupal community and I look forward to partnering with her.

Last but not least, I want to thank Tim Lehnen for serving as our Interim Executive Director. He did a fantastic job leading the Drupal Association through this transition.

April 30, 2019

1 min read time

Apr 15 2019
Apr 15

DrupalCon Seattle Driesnote presentation

Last week, many Drupalists gathered in Seattle for DrupalCon North America, for what was the largest DrupalCon in history.

As a matter of tradition, I presented my State of Drupal keynote. You can watch a recording of my keynote (starting at 32 minutes) or download a copy of my slides (153 MB).

Making Drupal more diverse and inclusive

DrupalCon Seattle was not only the largest, but also had the most diverse speakers. Nearly 50% of the DrupalCon speakers were from underrepresented groups. This number has been growing year over year, and is something to be proud of.

I actually started my keynote by talking about how we can make Drupal more diverse and inclusive. As one of the largest and most thriving Open Source communities, I believe that Drupal has an obligation to set a positive example.

Free time to contribute is a privilege

I talked about how Open Source communities often incorrectly believe that everyone can contribute. Unfortunately, not everyone has equal amounts of free time to contribute. In my keynote, I encouraged individuals and organizations in the Drupal community to strongly consider giving time to underrepresented groups.

Improving diversity is not only good for Drupal and its ecosystem, it's good for people, and it's the right thing to do. Because this topic is so important, I wrote a dedicated blog post about it.

Drupal 8 innovation update

I dedicated a significant portion of my keynote to Drupal 8. In the past year alone, there have been 35% more sites and 48% more stable modules in Drupal 8. Our pace of innovation is increasing, and we've seen important progress in several key areas.

With the release of Drupal 8.7, the Layout Builder will become stable. Drupal's new Layout Builder makes it much easier to build and change one-off page layouts, templated layouts and layout workflows. Best of all, the Layout Builder will be accessible.

Drupal 8.7 also brings a lot of improvements to the Media Library.

We also continue to innovate on headless or decoupled Drupal. The JSON:API module will ship with Drupal 8.7. I believe this not only advances Drupal's leadership in API-first, but sets Drupal up for long-term success.

These are just a few of the new capabilities that will ship with Drupal 8.7. For the complete list of new features, keep an eye out for the release announcement in a few weeks.

Drupal 7 end of life

If you're still on Drupal 7, there is no need to panic. The Drupal community will support Drupal 7 until November 2021 — two years and seven months from today.

After the community support ends, there will be extended commercial support for a minimum of three additional years. This means that Drupal 7 will be supported for at least five more years, or until 2024.

Upgrading from Drupal 7 to Drupal 8

Upgrading from Drupal 7 to Drupal 8 can be a lot of work, especially for large sites, but the benefits outweigh the challenges.

For my keynote, I featured stories from two end-users who upgraded large sites from Drupal 7 to Drupal 8 — the State of Georgia and Pegasystems.

The keynote also featured quietone, one of the maintainers of the Migrate API. She talked about the readiness of Drupal 8 migration tools.

Preparing for Drupal 9

As announced a few months ago, Drupal 9 is targeted for June 2020. June 2020 is only 14 months away, so I dedicated a significant amount of my keynote to Drupal 9.

Making Drupal updates easier is a huge, ongoing priority for the community. Thanks to those efforts, the upgrade path to Drupal 9 will be radically easier than the upgrade path to Drupal 8.

In my keynote, I talked about how site owners, Drupal developers and Drupal module maintainers can start preparing for Drupal 9 today. I showed several tools that make Drupal 9 preparation easier. Check out my post on how to prepare for Drupal 9 for details.

A timeline with important dates and future milestones

Thank you

I'm grateful to be a part of a community that takes such pride in its work. At each DrupalCon, we get to see the tireless efforts of many volunteers that add up to one amazing event. It makes me proud to showcase the work of so many people and organizations in my presentations.

Thank you to all who have made this year's DrupalCon North America memorable. I look forward to celebrating our work and friendships at future events!

April 15, 2019

2 min read time

Apr 11 2019
Apr 11

Upgrading from Drupal 8 to Drupal 9 should be easy if you regularly check for and remove the use of deprecated code.

With Drupal 9 targeted to be released in June of 2020, many people are wondering what they need to do to prepare.

The good and important news is that upgrading from Drupal 8 to Drupal 9 should be really easy — radically easier than upgrading from Drupal 7 to Drupal 8.

The only caveat is that you need to manage "deprecated code" well.

If your site doesn't use deprecated code that is scheduled for removal in Drupal 9, your upgrade to Drupal 9 will be easy. In fact, it should be as easy as a minor version upgrade (like upgrading from Drupal 8.6 to Drupal 8.7).

What is deprecated code?

Code in Drupal is marked as "deprecated" when it should no longer be used. Typically, code is deprecated because there is a better alternative that should be used instead.

For example, in Drupal 8.0.0, we deprecated \Drupal::l($text, $url). Instead of using \Drupal::l(), you should use Link::fromTextAndUrl($text, $url). The \Drupal::l() function was marked for removal as part of some clean-up work; Drupal 8 had too many ways to generate links.
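
To make this concrete, here is a minimal sketch of what the update looks like in module code. The route name is illustrative; the APIs are the ones named above:

use Drupal\Core\Link;
use Drupal\Core\Url;

// Deprecated since Drupal 8.0.0; removed in Drupal 9:
// $link = \Drupal::l('Home', Url::fromRoute('<front>'));

// The replacement, which renders an equivalent link:
$link = Link::fromTextAndUrl('Home', Url::fromRoute('<front>'))->toString();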

Deprecated code will continue to work for some time before it gets removed. For example, \Drupal::l() continues to work in Drupal 8.7 despite the fact that it was deprecated in Drupal 8.0.0 more than three years ago. This gives module maintainers ample time to update their code.

When we release Drupal 9, we will "drop" most deprecated code. In our example, this means that \Drupal::l() will not be available anymore in Drupal 9.

In other words:

  • Any Drupal 8 module that does not use deprecated code will continue to work with Drupal 9.
  • Any Drupal 8 module that uses deprecated code needs to be updated before Drupal 9 is released, or it will stop working with Drupal 9.

If you're interested, you can read more about Drupal's deprecation policy at https://www.drupal.org/core/deprecation.

How do I know if my site uses deprecated code?

There are a few ways to check if your site is using deprecated code.

If you work on a Drupal site as a developer, run drupal-check. Matt Glaman (Centarro) developed a static PHP analysis tool called drupal-check, which you can run against your codebase to check for deprecated code. I recommend running drupal-check in an automated fashion as part of your development workflow.
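
A minimal sketch of what that looks like from the command line; the install method and module path are illustrative:

$ composer global require mglaman/drupal-check
$ drupal-check modules/custom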

If you are a site owner, install the Upgrade Status module. This module was built by Acquia. The module provides a graphical user interface on top of drupal-check. The goal is to provide an easy-to-use readiness assessment for your site's migration to Drupal 9.

If you maintain a project on Drupal.org, enable Drupal.org's testing infrastructure to detect the use of deprecated code. There are two complementary ways to do so: you can run a static deprecation analysis and/or configure your existing tests to fail when calling deprecated code. Both can be set up in your drupalci.yml configuration file.

If you find deprecated code in a contributed module used on your site, consider filing an issue in the module's issue queue on Drupal.org (after having checked no issue has been created yet). If you can, provide a patch to fix the deprecation and engage with the maintainer to get it committed.

How hard is it to update my code?

While there are some deprecations that require more detailed refactoring, many are a simple matter of search-and-replace.

You can check the API documentation for instructions on how to remedy the deprecation.

When can I start updating my code?

I encourage you to start today. When you update your Drupal 8 code to use the latest and greatest APIs, you can benefit from those improvements immediately. There is no reason to wait until Drupal 9 is released.

Drupal 8.8.0 will be the last release to deprecate code for Drupal 9. Today, we don't know the full set of deprecations yet.

How much time do I have to update my code?

The current plan is to release Drupal 9 in June of 2020, and to end-of-life Drupal 8 in November of 2021.

Contributed module maintainers are encouraged to remove the use of deprecated code by June of 2020 so everyone can upgrade to Drupal 9 the day it is released.

A timeline with important dates and future milestones

Drupal.org project maintainers should keep the extended security coverage policy in mind, which means that Drupal 8.8 will still be supported until Drupal 9.1 is released. Contributed projects looking to support both Drupal 8.8 and Drupal 9.0 might need to use two branches.

How ready are the contributed modules?

Dwayne McDaniel (Pantheon) analyzed all 7,000 contributed modules for Drupal 8 using drupal-check.

As it stands today, 44% of the modules have no deprecation warnings. The remaining 56% of the modules need to be updated, but the majority have fewer than three deprecation warnings.

April 11, 2019

3 min read time

Apr 10 2019
Apr 10

Open Source communities often incorrectly believe that everyone can contribute. Unfortunately, not everyone has equal amounts of free time to contribute.

In Open Source, there is a long-held belief in meritocracy, or the idea that the best work rises to the top, regardless of who contributes it. The problem is that a meritocracy assumes an equal distribution of time for everyone in a community.

Open Source is not a meritocracy

Free time to contribute is a privilege

I incorrectly made this assumption myself, saying: "The only real limitation [to Open Source contribution] is your willingness to learn."

Today, I've come to understand that inequality makes it difficult for underrepresented groups to have the "free time" it takes to contribute to Open Source.

For example, research shows that women still spend more than twice as much time as men doing unpaid domestic work, such as housework or childcare. I've heard from some of my colleagues that they need to optimize every minute of time they don't spend working, which makes it more difficult to contribute to Open Source on an unpaid, volunteer basis.

Or, in other cases, many people's economic conditions require them to work more hours or several jobs in order to support themselves or their families. Systemic issues like racial and gender wage gaps continue to plague underrepresented groups, and it's both unfair and impractical to assume that these groups of people have the same amount of free time to contribute to Open Source projects, if they have any at all.

Underrepresented groups don't have the same amount of free time

These are just a few examples of free time not being equally distributed. What this means is that Open Source is not a meritocracy.

Free time is a mark of privilege, rather than an equal right. Instead of chasing an unrealistic concept of meritocracy, we should be striving for equity. Rather than thinking, "everyone can contribute to open source", we should be thinking, "everyone deserves the opportunity to contribute".

Time inequality contributes to a lack of diversity in Open Source

This fallacy of "free time" makes Open Source communities suffer from a lack of diversity. The demographics are even worse than the technology industry overall: while 22.6% of professional computer programmers in the workforce identify as women (Bureau of Labor Statistics), less than 5% of contributors do in Open Source (GitHub). And while 34% of programmers identify as ethnic or national minorities (Bureau of Labor Statistics), only 16% do in Open Source (GitHub).

Diversity in data

It's important to note that time isn't the only factor; sometimes a hostile culture or unconscious bias play a part in limiting diversity. According to the same GitHub survey cited above, 21% of people who experienced negative behavior stopped contributing to Open Source projects altogether. Other recent research showed that women's pull requests were more likely to get accepted if they had a gender-neutral username. Unfortunately, examples like these are common.

Taking action: giving time to underrepresented groups

A person being ignored

While it's impossible to fix decades of gender and racial inequality with any single action, we must do better. Those in a position to help have an obligation to improve the lives of others. We should not only invite underrepresented groups into our Open Source communities, but make sure that they are welcomed, supported and empowered. One way to help is with time:

  • As individuals, by making sure you are intentionally welcoming people from underrepresented groups, through both outreach and actions. If you're in a community organizing position, encourage and make space for people from underrepresented groups to give talks or lead sprints about the work they're interested in. Or if you're asked to, mentor an underrepresented contributor.
  • As organizations in the Open Source ecosystem, by giving people more paid time to contribute.

Taking the extra effort to help onboard new members or provide added detail when reviewing code changes can be invaluable to community members who don't have an abundance of free time. Overall, being kinder, more patient and more supportive to others could go a long way in welcoming more people to Open Source.

In addition, organizations within the Open Source ecosystem capable of giving back should consider financially sponsoring underrepresented groups to contribute to Open Source. Sponsorship can look like full or part-time employment, an internship or giving to organizations like Girls Who Code, Code2040, Resilient Coders or one of the many others that support diversity in technology. Even a few hours of paid time during the workweek for underrepresented employees could help them contribute more to Open Source.

Applying the lessons to Drupal

Over the years, I've learned a lot from different people's perspectives. Learning out in the open is not always easy, but it's been an important part of my personal journey.

Knowing that Drupal is one of the largest and most influential Open Source projects, I find it important that we lead by example.

I encourage individuals and organizations in the Drupal community to strongly consider giving time and opportunities to underrepresented groups. A natural place to start is with the groups already doing this work, such as the Drupal Diversity and Inclusion group.

When we have more diverse people contributing to Drupal, it will not only inject a spark of energy, but it will also help us make better, more accessible, inclusive software for everyone in the world.

Each of us needs to decide if and how we can help to create equity for everyone in Drupal. Not only is it good for business, it's good for people, and it's the right thing to do.

Special thanks to the Drupal Diversity and Inclusion group for discussing this topic with me.

April 10, 2019

3 min read time

Apr 09 2019
Apr 09

For most people, today marks the first day of DrupalCon Seattle.

Open Source communities create better, more inclusive software when diverse people come to the table. Unfortunately, there is still a huge gender gap in Open Source, and software more broadly. It's something I'll talk more about in my keynote tomorrow.

One way to help close the gender gap in the technology sector is to give to organizations that are actively working to solve this problem. During DrupalCon Seattle, Acquia will donate $5 to Girls Who Code for every person that visits our booth.

April 09, 2019

22 sec read time

Mar 13 2019
Mar 13

Three stars will align and the Open Web will win.

Today, the world wide web celebrates its 30th birthday. In 1989, Sir Tim Berners-Lee invented the world wide web and changed the lives of millions of people around the globe, including mine.

Tim Berners-Lee, inventor of the World Wide Web, in front of the early web.

Milestones like this get me thinking about the positive impact a free and Open Web has had on society. Without the web, billions of people would not have been able to connect with one another, be entertained, start businesses, exchange ideas, or even save lives. Open source communities like Drupal would not exist.

As optimistic as I am about the web's impact on society, many recent events have caused me to question the Open Web's future. Too much power has fallen into the hands of relatively few platform companies, resulting in widespread misinformation, privacy breaches, bullying, and more.

However, I'm optimistic that the Open Web has a chance to win in the future. I believe we'll see three important events happen in the next five years.

First, the day will come when regulators will implement a set of laws that govern the ownership and exchange of data online. It's already starting to happen with GDPR in the EU and various state data privacy laws taking shape in the US. These regulations will require platforms like Facebook to give users more control over their data, and when that finally happens, it will be a lot easier for users to move their data between services and for the Open Web to innovate on top of these data platforms.

Second, at some point, governments globally will disempower large platform companies. We can't leave it up to a handful of companies to judge what is false and true, or have them act as our censors. While I'm not recommending governments split up these companies, my hope is that they will institute some level of algorithmic oversight. This will offer an advantage to the Open Web and Open Source.

Third, I think we're on the verge of having a new set of building blocks that enable us to build a better, next-generation web. Thirty years into the web, our data architectures still use a client-server model; data is stored centrally on one computer, so to speak. The blockchain is turning that into a more decentralized web that operates on top of a distributed data layer and offers users control of their own data. Similar to building a traditional website, distributed applications (dApps) require file storage, payment systems, user data stores, etc. All of these components are being rebuilt on top of the blockchain. While we have a long way to go, it is only a matter of time before a tipping point is reached.

In the past, I've publicly asked the question: Can we save the Open Web? I believe we can. We can't win today, but we can keep innovating and get ready for these three events to unfold. The day will come!

With that motivation in mind, I want to wish a special happy birthday to the world wide web!

Feb 21 2019
Feb 21

Learn how to implement lazy loading images to improve the performance of your website.

Recently, I've been spending some time making performance improvements to my site. In my previous blog post on this topic, I described my progress optimizing the JavaScript and CSS usage on my site, and concluded that image optimization was the next step.

Last summer I published a blog post about my vacation in Acadia National Park. Included in that post are 13 photos with a combined size of about 4 MB.

When I benchmarked that post with https://webpagetest.org, it showed that it took 7.275 seconds (blue vertical line) to render the page.

The graph shows that the browser downloaded all 13 images to render the page. Why would a browser download all images if most of them are below the fold and not shown until a user starts scrolling? It makes very little sense.

As you can see from the graph, downloading all 13 images takes a very long time (purple horizontal bars). No matter how much you optimize your CSS and JavaScript, this particular blog post will remain slow until you optimize how images are loaded.

The WebPageTest graph for the blog post, before image loading was optimized.

"Lazy loading" images is one solution to this problem. Lazy loading means that the images aren't loaded until the user scrolls and the images come into the browser's viewport.

You might have seen lazy loading in action on websites like Facebook, Pinterest or Medium. It usually goes like this:

  • You visit a page as you normally would, scrolling through the content.
  • Instead of the actual image, you see a blurry placeholder image.
  • Then, the placeholder image gets swapped out with the final image as quickly as possible.

An animated GIF of a user scrolling a webpage, with placeholder images being swapped for the final images.

To support lazy loading images on my blog I do three things:

  1. Automatically generate lightweight yet useful placeholder images.
  2. Embed the placeholder images directly in the HTML to speed up performance.
  3. Replace the placeholder images with the real images when they become visible.

Generating lightweight placeholder images

To generate lightweight placeholder images, I implemented a technique used by Facebook: create a tiny image that is a downscaled version of the original image, strip out the image's metadata to optimize its size, and let the browser scale the image back up.

To create lightweight placeholder images, I resized the original images to be 5 pixels wide. Because I have about 10,000 images on my blog, my Drupal-based site automates this for me, but here is how you create one from the command line using ImageMagick's convert tool:

$ convert -resize 5x -strip original.jpg placeholder.jpg

  • -resize 5x resizes the image to be 5 pixels wide while maintaining its aspect ratio.
  • -strip removes all comments and redundant headers in the image. This helps make the image's file size as small as possible.
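
Because the command needs no per-image tuning, it is easy to script. A hypothetical batch version for a directory of JPEGs could look like this:

$ mkdir -p placeholders
$ for img in *.jpg; do convert -resize 5x -strip "$img" "placeholders/$img"; done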

The resulting placeholder images are tiny — often shy of 400 bytes.

The original image that we need to generate a placeholder for: large metal pots with wooden lids in which lobsters are boiled.
The generated placeholder, scaled up by a browser from a tiny image that is 5 pixels wide; its file size is only 395 bytes.

Here is another example to illustrate how the colors in the placeholders nicely match the original image:

A sunrise with beautiful reds and black silhouettes, and its generated placeholder showing the same red and black tones.

Even though the placeholder image should only be shown for a fraction of a second, making it relevant is a nice touch, as it suggests what is coming. It also matters because users are very impatient with load times on the web.

Embedding placeholder images directly in HTML

One not-so-well-known feature of the <img> element is that you can embed an image directly into the HTML document using the data URL scheme:

<img src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/data:image/jpg;base64,/9j/4AAQSkZJRgABAQEA8ADwAAD/2wB
  DAAEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQICAQECAQEB
  AgICAgICAgICAQICAgICAgICAgL/2wBDAQEBAQEBAQEBAQECAQEBAgICAgICA
  gICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgL/wA
  ARCAADAAUDAREAAhEBAxEB/8QAFAABAAAAAAAAAAAAAAAAAAAACf/EAB8QAAM
  AAQMFAAAAAAAAAAAAAAECAwcEBQYACAkTMf/EABUBAQEAAAAAAAAAAAAAAAAA
  AAIG/8QAJBEAAQIFAgcAAAAAAAAAAAAAAQURAAIDBDEGFBIVQUVRcYH/2gAMA
  wEAAhEDEQA/AAeyb5HO8o8lSLZd01Jz2nbKoK4yxDVvZqYl7uaV4CWdmZQSSS
  ST86FJBsafEJK15KD05ioNk4G6Yeg0V9bVCmZpXt08sB2hJ8DJ2Tn7H/2Q==" />

Data URLs are composed of four parts: the data: prefix, a media type indicating the type of data (image/jpg), an optional base64 token to indicate that the data is base64 encoded, and the base64 encoded image data itself.

data:[<media type>][;base64],<data>

To base64 encode an image from the command line, use:

$ base64 placeholder.jpg

To base64 encode an image in PHP, use:

$data = base64_encode(file_get_contents('placeholder.jpg'));

What is the advantage of embedding a base64 encoded image using a data URL? It eliminates HTTP requests as the browser doesn't have to set up new HTTP connections to download the images. Fewer HTTP requests usually means faster page load times.
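The PHP snippet above returns only the raw base64 data; to embed it, you prepend the data URL prefix. Here is a minimal end-to-end sketch in Node.js (the helper name is hypothetical, and it assumes placeholder.jpg is on disk):

// Read the placeholder image and turn it into a data URL that can be
// embedded directly in an <img> element's src attribute.
const fs = require('fs');

function placeholderDataUrl(path) {
  const base64 = fs.readFileSync(path).toString('base64');
  // 'image/jpg' matches the media type used elsewhere in this post.
  return 'data:image/jpg;base64,' + base64;
}

console.log(placeholderDataUrl('placeholder.jpg'));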

Replacing placeholder images with real images

Next, I used JavaScript's IntersectionObserver to replace the placeholder image with the actual image when it comes into the browser's viewport. I followed Jeremy Wagner's approach shared on Google Web Fundamentals Guide on lazy loading images — with some adjustments.

It starts with the following HTML markup:

<img class="lazy" src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/placeholder.jpg" data-src="https://dri.es/optimizing-site-performance-by-lazy-loading-images/original.jpg" />

The three relevant pieces are:

  1. The class="lazy" attribute, which is what you'll select the element with in JavaScript.
  2. The src attribute, which references the placeholder image that will appear when the page first loads. Instead of linking to placeholder.jpg I embed the image data using the data URL technique explained above.
  3. The data-src attribute, which contains the URL of the original image that will replace the placeholder when it comes into the viewport.

Next, we use JavaScript's IntersectionObserver to replace the placeholder images with the actual images:

document.addEventListener('DOMContentLoaded', function() {
  var lazyImages = [].slice.call(document.querySelectorAll('img.lazy'));

  if ('IntersectionObserver' in window) {
    let lazyImageObserver = new IntersectionObserver(
      function(entries, observer) {
        entries.forEach(function(entry) {
          if (entry.isIntersecting) {
            let lazyImage = entry.target;
            lazyImage.src = lazyImage.dataset.src;
            lazyImageObserver.unobserve(lazyImage);
          }
        });
      }
    );

    lazyImages.forEach(function(lazyImage) {
      lazyImageObserver.observe(lazyImage);
    });
  }
  else {
    // For browsers that don't support IntersectionObserver yet,
    // load all the images now:
    lazyImages.forEach(function(lazyImage) {
      lazyImage.src = lazyImage.dataset.src;
    });
  }
});

This JavaScript code queries the DOM for all <img> elements with the lazy class. The IntersectionObserver is used to replace the placeholder image with the original image when the img.lazy elements enter the viewport. When IntersectionObserver is not supported, the images are replaced on the DOMContentLoaded event.

By default, the IntersectionObserver's callback is triggered the moment a single pixel of the image enters the browser's viewport. However, using the rootMargin property, you can trigger the image swap before the image enters the viewport. This reduces or eliminates the visual or perceived lag time when swapping a placeholder image for the actual image.

I implemented that on my site as follows:

const config = {
  // If the image gets within 250px of the browser's viewport,
  // start the download:
  rootMargin: '250px 0px',
};

let lazyImageObserver = new IntersectionObserver(..., config);

Lazy loading images drastically improves performance

After making these changes to my site, I did a new https://webpagetest.org benchmark run:

A diagram that shows page load times for dri.es after adding lazy image loading

You can clearly see that the page became a lot faster to render:

  • The document is complete after 0.35 seconds (blue vertical line) instead of the original 7.275 seconds.
  • No images are loaded before the document is complete, compared to 13 images being loaded before.
  • After the document is complete, one image (purple horizontal bar) is downloaded. This is triggered by the JavaScript code as the result of one image being above the fold.

Lazy loading images improves web page performance by reducing the number of HTTP requests, and consequently reduces the amount of data that needs to be downloaded to render the initial page.

Is base64 encoding images bad for SEO?

Faster sites have an SEO advantage, as page speed is a ranking factor for search engines. But lazy loading might also be bad for SEO, as search engines have to be able to discover the original images.

To find out, I headed to Google Search Console. Google Search Console has a "URL inspection" feature that allows you to look at a webpage through the eyes of Googlebot.

I tested it out with my Acadia National Park blog post. As you can see in the screenshot, the first photo in the blog post was not loaded. Googlebot doesn't seem to support data URLs for images.

A screenshot that shows Googlebot doesn't render placeholder images that are embedded using data URLs

Is IntersectionObserver bad for SEO?

The fact that Googlebot doesn't appear to support data URLs does not have to be a problem. The real question is whether Googlebot will scroll the page, execute the JavaScript, replace the placeholders with the actual images, and index those. If it does, it doesn't matter that Googlebot doesn't understand data URLs.

To find out, I decided to conduct an experiment. Yesterday, I published a blog post about Matt Mullenweg and me visiting a museum together. The images in that blog post are lazy loaded and can only be discovered by Google if its crawler executes the JavaScript and scrolls the page. If those images show up in Google's index, we know there is no SEO impact.

I'm not sure how long it takes for Google to make new posts and images available in its index, but I'll keep an eye out for it.

If the images don't show up in Google's index, lazy loading might impact your SEO. My solution would be to selectively disable lazy loading for the most important images only. (Note: even if Google finds the images, there is no guarantee that it will decide to index them — short blog posts and images are often excluded from Google's index.)

Conclusions

Lazy loading images improves web page performance by reducing the number of HTTP requests and data needed to render the initial page.

Ideally, over time, browsers will support lazy loading images natively, and some of the SEO challenges will no longer be an issue. Until then, consider adding support for lazy loading yourself. For my own site, it took about 40 lines of JavaScript code and 20 lines of additional PHP/Drupal code.

I hope that by sharing my experience, more people are encouraged to run their own sites and to optimize their sites' performance.

February 21, 2019

6 min read time

Feb 20 2019
Feb 20

A month ago, Matt Mullenweg, co-founder of WordPress and founder of Automattic, visited me in Antwerp. While I currently live in Boston, I was born and raised in Antwerp, and also started Drupal there.

We spent the morning together walking around Antwerp and visited the Plantin Moretus Museum.

The museum is the old house of Christophe Plantin, where he lived and worked around 1575. At the time, Plantin had the largest printing shop in the world, with 56 employees and 16 printing presses. These presses printed 1,250 sheets per day.

Today, the museum hosts the two oldest printing presses in the world. In addition, the museum has original lead types of fonts such as Garamond and hundreds of ancient manuscripts that tell the story of how writing evolved into the art of printing.

The old house, printing business, presses and lead types are the earliest witnesses of a landmark moment in history: the invention of printing, and by extension, the democratization of publishing, long before our digital age. It was nice to visit that together with Matt as a break from our day-to-day focus on web publishing.

An old printing press at the Plantin Moretus Museum

Dries and Matt in front of the oldest printing presses in the world

An old globe at the Plantin Moretus Museum

February 20, 2019

47 sec read time

Feb 14 2019
Feb 14

I've been thinking about the performance of my site and how it affects the user experience. There are real, ethical concerns to poor web performance. These include accessibility, inclusion, waste and environmental concerns.

A faster site is more accessible, and therefore more inclusive for people visiting from a mobile device, or from areas in the world with slow or expensive internet.

For those reasons, I decided to see if I could improve the performance of my site. I used the excellent https://webpagetest.org to benchmark a simple blog post https://dri.es/relentlessly-eliminating-barriers-to-growth.

A diagram that shows page load times for dri.es before making performance improvements

The image above shows that it took a browser 0.722 seconds to download and render the page (see blue vertical line):

  • The first 210 milliseconds are used to set up the connection, which includes the DNS lookup, TCP handshake and the SSL negotiation.
  • The next 260 milliseconds (from 0.21 seconds to 0.47 seconds) are spent downloading the rendered HTML file, two CSS files and one JavaScript file.
  • After everything is downloaded, the final 330 milliseconds (from 0.475 seconds to 0.8 seconds) are used to layout the page and execute the JavaScript code.

By most standards, 0.722 seconds is pretty fast. In fact, according to HTTP Archive, it takes more than 2.4 seconds to download and render the average web page on a laptop or desktop computer.

Regardless, I noticed that the length of the horizontal green bars and the horizontal yellow bar was relatively long compared to that of the blue bar. In other words, a lot of time is spent downloading JavaScript (yellow horizontal bar) and CSS (two green horizontal bars) instead of the HTML, including the actual content of the blog post (blue bar).

To fix this, I did two things:

  1. Use vanilla JavaScript. I replaced my jQuery-based JavaScript with vanilla JavaScript (a before-and-after sketch follows this list). Without impacting the functionality of my site, the amount of JavaScript went from almost 45 KB to 699 bytes, a reduction of more than 98 percent.
  2. Conditionally include CSS. For example, I use Prism.js for syntax highlighting code snippets in blog posts. prism.css was downloaded for every page request, even when there were no code snippets to highlight. Using Drupal's render system, it's easy to conditionally include CSS. By taking advantage of that, I was able to reduce the amount of CSS downloaded by 47 percent, from 4.7 KB to 2.5 KB.
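To make the first change concrete, here is the flavor of rewrite involved. This is a hypothetical example, since the post doesn't show the code that was actually replaced:

// Hypothetical jQuery original:
//   $('.nav-toggle').on('click', function() {
//     $('.nav').toggleClass('open');
//   });
//
// Vanilla JavaScript equivalent; no 30+ KB library required:
document.querySelector('.nav-toggle').addEventListener('click', function() {
  document.querySelector('.nav').classList.toggle('open');
});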

According to the January 1st, 2019 run of HTTP Archive, the median page requires 396 KB of JavaScript and 60 KB of CSS. I'm proud that my site is well under these medians.

File type    Dri.es before    Dri.es after    World-wide median
JavaScript   45 KB            699 bytes       396 KB
CSS          4.7 KB           2.5 KB          60 KB

Because the new JavaScript and CSS files are significantly smaller, it takes the browser less time to download, parse and render them. As a result, the same blog post is now available in 0.465 seconds instead of 0.722 seconds, or 35% faster.

After a new https://webpagetest.org test run, you can clearly see that the bars for the CSS and JavaScript files became visually shorter:

A diagram that shows page load times for dri.es after making performance improvements

To optimize the user experience of my site, I want it to be fast. I hope that others will see that bloated websites can come at a great cost, and will consider using tools like https://webpagetest.org to make their sites more performant.

I'll keep working on making my website even faster. As a next step, I plan to make pages with images faster by using lazy image loading.

February 13, 2019

2 min read time

Feb 11 2019
Feb 11

We compare REST, JSON:API and GraphQL — three different web services implementations — based on request efficiency, operational simplicity, API discoverability, and more.

An abstract image of three boxes

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures, a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or following a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below, but we discuss each of these four categories (or rows in the table) in more depth below. If you aggregate the colors in the table, you see that we rank JSON:API above GraphQL and GraphQL above REST for Drupal core's needs.

Request efficiency
  • REST: Poor; multiple requests are needed to satisfy common needs, and responses are bloated.
  • JSON:API: Excellent; a single request is usually sufficient for most needs, and responses can be tailored to return only what is required.
  • GraphQL: Excellent; a single request is usually sufficient for most needs, and responses include exactly what was requested.

Documentation, API explorability and schema
  • REST: Poor; no schema, not explorable.
  • JSON:API: Acceptable; generic schema only; links and error messages are self-documenting.
  • GraphQL: Excellent; precise schema, and excellent tooling for exploration and documentation.

Operational simplicity
  • REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
  • JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
  • GraphQL: Poor; extra infrastructure is often necessary, client-side libraries are a practical necessity, and specific patterns are required to benefit from CDNs and browser caches.

Writing data
  • REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
  • JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification.
  • GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it's possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article requests (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.
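To make that concrete, here is a minimal sketch of both styles (the endpoints and the GraphQL schema are hypothetical; JSON:API's include parameter, however, is defined by its specification):

// JSON:API: one GET request; "include=author" embeds the related author
// resource in the same response as a compound document.
fetch('https://example.com/articles/42?include=author')
  .then(response => response.json())
  .then(doc => {
    console.log(doc.data);      // the article
    console.log(doc.included);  // the embedded author
  });

// GraphQL: one POST request that asks for the article and its author's
// name together (field and type names depend entirely on the server).
const query = '{ article(id: 42) { title author { name } } }';
fetch('https://example.com/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: query }),
})
  .then(response => response.json())
  .then(result => console.log(result.data.article));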

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's required field lists; however, when sparse fieldsets are omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, the sparse fieldsets can be omitted so that the request remains cacheable.
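On the wire, a sparse fieldset is just a query parameter. A minimal sketch (the resource type and field names are hypothetical, while the fields[...] parameter itself comes from the JSON:API specification):

// Ask the server to return only the title and body fields of the
// "articles" resource type, instead of every field it has.
fetch('https://example.com/articles/42?fields[articles]=title,body')
  .then(response => response.json())
  .then(doc => console.log(doc.data.attributes));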

Multiple data objects in a single response
  • REST: Usually; but every implementation is different (for Drupal: a custom "REST Export" view or a custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, and developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; only if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links are rarely available.
  • JSON:API: Acceptable; observing the available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocompletion, instant results or compilation errors, and complete, contextual documentation.

Validatable and programmable schema
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout, when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above), a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching the way you can with REST or JSON:API unless you use persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add operational complexity, and they also mean the architecture is no longer fully decoupled: if a client wants to retrieve different data, server-side changes are required.
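Because persisted queries are a convention rather than part of the specification, the request shape varies between servers; one common pattern looks roughly like this (the query ID and parameter names are hypothetical):

// The full GraphQL query is stored server-side under an ID; the client
// sends only that ID and any variables in a small, cacheable GET request.
const variables = encodeURIComponent(JSON.stringify({ id: 42 }));
fetch('https://example.com/graphql?queryId=articleWithAuthor&variables=' + variables)
  .then(response => response.json())
  .then(result => console.log(result.data));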

Scalability: additional infrastructure requirements
  • REST: Excellent; same as a regular website (Varnish, CDN, etc.).
  • JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
  • GraphQL: Usually poor; only the simplest queries can use GET requests, and to reap the full benefit of GraphQL, servers need their own tooling.

Tooling ecosystem
  • REST: Acceptable; lots of developer tools are available, but for the best experience they need to be customized for the implementation.
  • JSON:API: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.
  • GraphQL: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.

Typical points of failure
  • REST: Fewer; server, client.
  • JSON:API: Fewer; server, client.
  • GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and for JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of the GET HTTP method, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better, generic tooling and less time spent on server-side details.
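For example, creating a resource with JSON:API follows a document structure prescribed by the specification. A minimal sketch (the endpoint and resource type are hypothetical):

// A JSON:API create: POST a "data" object carrying the resource's type
// and attributes, using the content type the specification mandates.
fetch('https://example.com/articles', {
  method: 'POST',
  headers: { 'Content-Type': 'application/vnd.api+json' },
  body: JSON.stringify({
    data: {
      type: 'articles',
      attributes: { title: 'Hello world' },
    },
  }),
})
  .then(response => response.json())
  .then(doc => console.log(doc.data.id));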

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.
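For contrast, here is what a GraphQL write might look like. Everything in this sketch (the mutation name, its input shape and its return fields) only exists because a server developer designed and implemented it:

// A hypothetical "createTag" mutation; its design is entirely up to the
// server, which is why competing conventions exist for GraphQL writes.
const mutation =
  'mutation { createTag(input: { name: "drupal" }) { tag { id name } } }';
fetch('https://example.com/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: mutation }),
})
  .then(response => response.json())
  .then(result => console.log(result.data.createTag.tag));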

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests; one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

Writing data
  • REST: Acceptable; every implementation is different. No bulk support.
  • JSON:API: Excellent; JSON:API prescribes a complete solution for handling writes. Bulk operations are coming soon.
  • GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement. There are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means that they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
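As an illustration, a client-generated collection query against Drupal's JSON:API module might look like the sketch below. The filter, sort and page parameter families exist in Drupal's implementation; the specific values here are made up:

// Fetch the ten most recently created published articles; the client
// controls filtering, sorting and paging with no server-side changes.
fetch('https://example.com/jsonapi/node/article' +
      '?filter[status]=1&sort=-created&page[limit]=10')
  .then(response => response.json())
  .then(doc => console.log(doc.data));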

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

Ease of installation and configuration
  • REST: Poor; requires the contributed REST UI module, and it is easy to break clients by changing configuration.
  • JSON:API: Excellent; zero configuration!
  • GraphQL: Poor; more complex to use, and may require additional permissions, configuration or custom code.

Automatically generated documentation
  • REST: Acceptable; requires the contributed OpenAPI module.
  • JSON:API: Acceptable; requires the contributed OpenAPI module.
  • GraphQL: Excellent; GraphQL Voyager is included.

Security: content-level access control (entity and field access)
  • REST: Excellent; content-level access control is respected.
  • JSON:API: Excellent; content-level access control is respected, even in queries.
  • GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.

Decoupled filtering (client can craft queries without server-side intervention)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world, and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, for Drupal core's needs, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. That said, Drupal's GraphQL implementation is fantastic, especially when you have the developer capacity to build a bespoke API for your project.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

February 11, 2019

11 min read time

Feb 07 2019
Feb 07

I'm frequently sent examples of how Drupal has changed the lives of developers, business owners and end users. Recently, I received a very different story of how Drupal had helped in a rescue operation that saved a man's life.

The Snowdonia Ultra Marathon website

In early 2018, Race Director Mike Jones was looking to build a new website for the Ultra-Trail Snowdonia ultra marathon. He reached out to a good friend and developer, Rob Edwards, to lead the development of the website.

A photo of a runner at the Ultra-trail Snowdonia ultramarathon© Ultra-trail Snowdonia and No Limits Photography

Rob chose Drupal for its flexibility and extensibility. As an organization supported heavily by volunteers, open source also fit the Snowdonia team's belief in community.

The resulting website, https://apexrunning.co/, included a custom-built timing module. This module allowed volunteers to register each runner and their time at every aid stop.

A runner goes missing

Rob attended the first day of Ultra-Trail Snowdonia to ensure the website ran smoothly. He also monitored the runners at the end of the race to certify they were all accounted for.

Monitoring the system into the early hours of the morning, Rob noticed one runner, after successfully completing checkpoints one and two, hadn't passed through the third checkpoint.

A photo of a runner at the Ultra-trail Snowdonia ultramarathon© Ultra-trail Snowdonia and No Limits Photography

Each runner carried a mobile phone with them for emergencies. Mike attempted to make contact with the runner via phone to ensure he was safe. However, this specific area was known for its poor signal and the connection was too weak to get through.

After some more time eagerly watching the live updates, it was clear the runner hadn't reached checkpoint four and more likely hadn't ever made it past checkpoint three. The Ogwen Mountain Rescue were called to action.

Due to the terrain and temperature, searching for the lost runner on foot would be too slow. Instead, the mountain rescue volunteers used a helicopter to scan the area and locate the runner.

How Drupal came to the rescue

The area covered by runners in an ultra marathon like this one is vast. The custom-built timing module helped rescuers narrow down the search area; they knew the runner passed the second checkpoint but never made it to the third.

After following the fluorescent orange markers in the area pinpointed by the Drupal website, the team quickly found the individual. He had fallen and become too injured to carry on. A mild case of hypothermia had set in. The runner was airlifted to the hospital for appropriate care. The good news: the runner survived.

Without Drupal, it might have taken much longer to notify anyone that a runner had gone missing, and there would have been no way to tell when he had dropped off.

NFC and GPS devices are now being explored for these ultra marathon runners to carry with them to provide location data as an extra safety precaution. The Drupal system will be used alongside these devices for more accurate time readings, and Rob is looking into an API to pull this additional data into the Drupal website.

Stories about Drupal having an impact on organizations and individuals, or even helping out in emergencies, drive my sense of purpose. Feel free to keep sending them my way!

Special thanks to Rob Edwards, Poppy Heap (CTI Digital) and Paul Johnson (CTI Digital) for their help with this blog post.

February 06, 2019

2 min read time

Jan 31 2019
Jan 31

Since I was young, I've been an avid tennis player and fan. I still play to this day, though maybe not as much as I'd like to.

In my teens, Andre Agassi was my favorite player. I've even sported some of his infamous headbands. I also remember watching him win the Australian Open in 1995.

In 2012, I traveled to Melbourne for a Drupal event, the same week the Australian Open was going on. As a tennis fan, I was lucky enough to watch Belgium's Kim Clijsters play.

Last weekend, the Australian Open wrapped up. This year, their website, https://ausopen.com, ran on Acquia and Drupal, delivered by the team at Avanade.

In a two-week timeframe, the site successfully welcomed tens of millions of visitors and served hundreds of millions of page views.

I'm very proud of the fact that many of the world's largest sporting events and media organizations (such as NBC Sports, who host the Super Bowl and the Olympics in the US) trust Acquia and Drupal as their chosen digital platform.

When the world is watching an event, there is no room for error!

A photo of the team that worked on the Australian Open website in 2019Team Tennis Australia, Acquia and Avanade after the men's singles final.

Many thanks to the round-the-clock efforts from Acquia's team in Asia Pacific, as well as our partners at Avanade!

January 30, 2019

50 sec read time
