Oct 21 2021

In a previous article on Drupal personalization I covered trends in search frequency for the term "content personalization". We saw that interest in the term has been on the rise since around 2015, possibly peaking around 2020. Whilst it is too early to say that interest in the subject has collapsed, it has certainly taken a breather. In its place, interest in the "digital experience platform" (DXP) has picked up over the last couple of years. The chart below (Google Trends) shows the relative rise of DXP as a search term in contrast to personalization.

Red: DXP, Blue: Personalization

The rise in interest in DXPs can be seen as an extension of the interest initially shown in content personalization. Attention is now moving from what personalization is to how it might be achieved. Personalization cannot be achieved by an isolated system. It requires the coordination of a number of processes, including data gathering, management and processing, to determine the next best piece of content to deliver to the user. Traditional content management systems (CMS) have been good at managing and delivering content but not so good at orchestrating the delivery of the experience. Campaign management software has been effective at pushing content to users but has suffered from not having a view into what their interests are. And finally, CRMs store data on known users but miss out on insights into anonymous users, who probably make up the bulk of traffic to the website and other platforms. Digital Experience Platforms offer the promise of bringing the pieces of the puzzle together.

Customer Data Platforms (CDPs) potentially play a role in this ecosystem. They offer capabilities to form unified user profiles based on behaviour and the aggregation of data from a number of sources. They also offer user segmentation tools which can help drive the delivery of personalized content. As with DXPs, interest in CDPs has also risen over the last few years.

Red: DXP, Blue: Personalization, Yellow: CDP

In conclusion, there is increased interest in technology to better understand users and to deliver personalized content to them across the various touchpoints.

But what is a DXP?

It has been said that there is no such thing as a DXP. By its very nature a DXP is a collection of different tools acting in unison with the ultimate aim of delivering digital experiences to users. As we know, the marketing technology landscape is vast, with countless options available for almost any problem. Buyers will look for solutions which fit their needs, capabilities and budgets. This will inevitably lead to buyers working with a mix of technologies which may not be well integrated. The problem then becomes how to coordinate these tools to act in unison.

Gartner identifies a number of different capabilities for a DXP:

  • Content management
  • Account services
  • Personalization and context awareness
  • Analytics and optimization
  • Customer journey mapping
  • Customer data management
  • Presentation, delivery and orchestration
  • Search, navigation and insight
  • Collaboration and knowledge sharing
  • Security and access control
  • Artificial intelligence (AI)
  • Cloud capabilities
  • Architecture and platform design
  • Integration, interoperability and extensibility
  • Multiexperience support

The list is long and covers a wide range of requirements. In the DXP space vendors have been scrambling to build out platforms which tick the various boxes for what makes a DXP. The individual pieces of these solutions will have their strengths and weaknesses and may not be right for every customer. Tightly integrated solutions offer the advantage of enabling the flow of data and easier management; however, they may well offer the wrong type of solution for the customer. In all practicality, it is not feasible to buy a single solution off the shelf and have success.

The rise of the composable DXP

Whilst DXP vendors have been making acquisitions to build out their monolithic platforms, a counter idea has emerged: that of the "composable" DXP. The main idea behind the composable DXP is the recognition that a monolithic solution is not going to be right for everyone. With the monolith comes vendor lock-in and the inability to pick and choose "best of breed" solutions. In a fast-moving world, businesses need to be able to respond in an agile manner to the changing landscape and demands.

In the Gartner report on the matter, the following recommendations were made:

  • Modernization of the DXP technology stack around the principles of composable business.
  • Improvements in operational agility by replacing the DXP monolith with a composable architecture.
  • Implementation of task-oriented packaged business capabilities.

The idea of the composable DXP has taken a strong hold, with many vendors now using the term in their marketing. This is particularly the case for smaller vendors who may not have developed their own monolith. Composable DXPs require coordination and integration. The focus is now on how well the tools work together and what the effort and cost is in making this happen. For buyers considering their platform, the concept of a composable DXP offers a way to reconceptualise the problem, focusing on business needs and processes rather than a single silver bullet.

The open DXP

The rise of the composable DXP, with its emphasis on integration, has necessarily led to an evaluation of the capabilities of the components to interoperate. Open systems will fare better because they are outward looking and better able to work with other systems. Open source systems, with their collaborative nature, are at an advantage as they are more likely to have a breadth of solutions available. Where there is a gap, new functionality can be easily incorporated. The architectures of open systems will also make extensions easier to implement and incorporate.

"Openness" can be seen in two ways. The first refers to the licensing of the code. Open source systems will necessarily be open in this sense. However, in a wider sense, openness also refers to the ability for the system to be easily implemented and extended. Is the system open to being extended and improved. Solutions claiming to be open need to be evaluated in this light. 

Drupal's place in the DXP world

Drupal has earned its place as a leading enterprise CMS which excels at content management and integration. These have always been its strong points and as such it is extremely well placed to form a key part of a DXP. The Drupal ecosystem (hosting, code, support, know-how) is capable of ticking many of the boxes for what makes up a DXP.

Looking back at the Gartner list, it is apparent that there are key pieces missing:

  • Personalization and context awareness
  • Customer journey mapping
  • Customer data management
  • Orchestration
  • Architecture and platform design

These missing pieces should be seen as an opportunity for the Drupal community, as they point the way to how Drupal, as a product, can be improved to remain future-proof.

Interestingly, Drupal is namechecked in the composable DXP article:

Currently, most DXP products are too large and not task-oriented. This prevents organizations from managing them easily, or changing them quickly. DXP capabilities are not easily decomposable or replaceable. However, several DXPs do have extensibility built in, via robust APIs, extension frameworks (such as Drupal’s module framework) and/or app marketplaces. These can be the starting point on the journey toward composability.

Drupal is therefore seen as being well placed to take a central role in the composable DXP ecosystem due to its APIs and extensibility.

The future

There will no doubt continue to be increased interest in DXPs from businesses looking to build integrated platforms. The rise of the composable DXP as a concept should see many smaller vendors with more open platforms starting to make an impact in the space. For Drupal, improvements in the areas of personalization and experience orchestration should see it build on the strong foundations it already has. This should see a range of DXP options becoming available as activity increases in the area.


Oct 11 2021

This has seen a drive to add personalization features to automated marketing campaigns as well as to the content management system (CMS). CMSs have therefore evolved to accommodate this need, moving from platforms which manage and serve content to platforms which also serve personalised digital experiences. This has seen the emergence of digital experience platforms (DXP) which need to orchestrate the personalization process. Related buzzwords in this space include "Content as a Service" (CaaS) and "headless" or "decoupled".

Personalization trends

A look at Google Trends shows that "content personalization" emerged from a low base around 2013 to hit a peak around 2020. Interest in the term is still elevated today; however, it does seem to have reached its peak. This perhaps represents attention moving to other, hotter topics such as DXP and CDP.

This trend has also been seen in the Drupal community. The concept of "Web Experience Management" first appeared in the Drupal community around 2011 and was popularised by Dries in his keynote at DrupalCon Portland in 2013.

This marked the beginnings of the shift in the community. One particularly visionary presentation was given by Dave Ingram, Delivering Hyper Personal Digital Experiences Within Drupal, where he set out a recipe for personalization which covers all of the bases for personalization techniques, as well as a sensible approach for iterating and improving over time.

The concepts of segmentation, engagement and personalization were beginning to take hold. However, prior to 2015 there was only the occasional article or presentation on the specific subject of personalization. From 2015 interest certainly picked up, with more Drupal personalization presentations being given at Drupal conferences and camps. In 2020 it was a very hot topic at DrupalCon, with many presentations given by a variety of Drupal agencies.

Early personalization efforts were based around tracking a logged in user and serving a personalized experience to them based on their properties. Users could signal their interests by explicitly selecting them on their user profile or signing up to various groups. CRM systems could also be integrated with Drupal to sync user profiles of known users. In this light personalization is relatively easy, as the user is known and already a customer. Recent developments have been around personalizing for anonymous users, and this has led to the need for a more decentralised approach to tracking and the delivery of the personalised experience. This need to personalize for unknown users could be considered the main driver of activity in the space since 2015.

A lot of this interest has been driven by the emergence of Acquia Lift (now Acquia Personalization). This product saw a lot of promotional activity from Acquia which, whilst being targeted at potential customers, was also helpful in educating the Drupal community about the need for personalization and some of the approaches that could be taken to achieve it. This led to the development of modules based around user context and the conditional serving of content.

That is not to say the rest of the community has sat idly by. There are a number of Drupal agencies which have picked the concept up and integrated it into their practices, driving the way they communicate with clients about the needs for personalization and how it can be achieved. In the presentations below there is a fairly even split between strategy and technology. It is refreshing to see that the problem of personalization has been seen as multifaceted with a strong emphasis on research, planning and design, as well as the technology to achieve it.

This has led to the development of a number of open source projects in the Drupal space which can be used for personalization projects. Some of these are simple client-side solutions, sometimes adopting a decoupled approach to serving the personalized experiences. We have also seen increased integrations with CRMs, marketing tools and CDPs to serve the personalized content.

Conclusion

This article is a roundup of most of the events and presentations which have helped shape the way Drupal practitioners approach the conceptual and technical solutions required for Drupal personalization. If you have any items you think I have missed in the roundup below, please add them to the comments and I will look to incorporate them in. 

What is next for personalization in Drupal? Time will tell.

Oct 06 2021

One of our clients had the need to display calendar links in an email which was being sent out after the user signed up for an event. The event registration was being handled as a Webform submission, and an email was being sent out to the user after they had submitted the form. We had no easy way to add the calendar links, as the markup required was quite complex.

One possible alternative was the Calendar Link module. It provides Twig functions for converting a link into an invite. This was not of use to us because we wanted to convert the link to an invite in an email, rather than on a web page. The natural solution was to extend the token system to do this for us.

The Calendar Links Token module is our simple solution. It makes use of the Calendar Links library to implement a token which can be used in Drupal. We have used it in emails; however, it could be used in other situations.

The token has quite a number of options to handle the various fields which can be used when creating links for the various calendar services.

[calendar_links:parameters:nid|date_start_field|date_end_field|title_field|description_field|location_field]

The module page has a few examples of how this can be used.
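As a rough illustration, the token can also be expanded programmatically via Drupal's core token service. The node ID and field machine names below are hypothetical examples only; substitute the fields used by your own event content type.

// Hypothetical usage sketch: expand the calendar links token inside a
// piece of email body text. Node ID 123 and the field names are examples.
$text = 'Add the event to your calendar: [calendar_links:parameters:123|field_date_start|field_date_end|title|body|field_location]';

// The core token service replaces any tokens found in the string.
$output = \Drupal::token()->replace($text);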

The output of the token shows links to the common calendar services:

This tiny little module has allowed us to build a simple event registration system which provides a better user experience, allowing users to easily add the event to their calendars once they have signed up.

We hope you find this module of use in one of your projects. 

Oct 04 2021

On a recent client project we had a demanding requirement. Content needed to be managed with the Group module as well as support workflow via the Scheduled Transitions module. The problem was that these two modules did not work together. When content is in a Group, it operates in a way that is separate from the usual Drupal way of doing things. This goes for permissions, content management and workflow as well. This meant that Group and Scheduled Transitions did not play nicely together. The Group Scheduled Transitions module has been released to overcome this shortcoming.

Firstly, it is important to understand that Drupal core comes with Content Moderation. Content Moderation allows content to be moved between a number of different states. It can be used to move content from draft to review to finally being published. Content can also be archived. It is a flexible system which can accommodate many different workflows depending on the client needs.

The Scheduled Transitions module works with Content Moderation, allowing the transition from one state to another to be scheduled at a certain time in the future. When that time arrives the content is moved on to the next state. This is really handy for publishing content at a certain time, and it can also be used to archive or unpublish content.

Unfortunately (for us), the Scheduled Transitions module did not work with the Group module, which we were using to manage content permissions and access. You can find out more about this issue in the drupal.org issue queue, where the advice was to build a new module. And so the Group Scheduled Transitions module was born.

Behind the scenes the module adds Group support by adding view scheduled transitions and add scheduled transitions permissions. These can be assigned to group roles, allowing group members to view and schedule transitions, as shown in the sketch below.
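To illustrate how group-level permissions differ from global ones, here is a minimal sketch using the Group module's entity API. The group ID is hypothetical; the permission name is one of those added by the module.

use Drupal\group\Entity\Group;

// Load a group (ID 1 is a hypothetical example) and the current user.
$group = Group::load(1);
$account = \Drupal::currentUser();

// The Group module answers permission checks per group, based on the
// member's group roles rather than their global Drupal roles.
if ($group->hasPermission('add scheduled transitions', $account)) {
  // The user may schedule moderation transitions for content in this group.
}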

We hope you enjoy using the module.

Oct 02 2021

The Personified module provides a flexible way to utilise the strengths of Drupal to deliver a personalized experience for anonymous users of Drupal websites. It is able to take user profile context stored in localStorage and use this to query a JSON endpoint for data. That data is then transformed using a client-side templating language, such as Handlebars, to present the content to the user.

In non-technical language, Personified provides a way to personalise Drupal web pages for anonymous users. It is very much akin to the Smart Content module, except that here we query for data rather than showing individual blocks.

The requirement

At Morpht, we developed Personified as part of our efforts to uplift the personalization capabilities of Drupal. We were impressed with the design of the Smart Content module, allowing for client-side context to be used to drive the user experience. However, we wanted the option of being able to query Drupal (or elsewhere) to retrieve the data. We wanted to use Drupal as a content hub for serving personalized experiences and to deliver those experiences based on a query rather than 'if, then, else' logic. A sketch of the kind of endpoint we had in mind follows.
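To make the idea concrete, here is a minimal, hypothetical sketch of such an endpoint: a custom controller that returns promo content as JSON, filtered by a context value passed in the query string. The module, content type and field names are illustrative assumptions and are not part of Personified itself.

use Drupal\Core\Controller\ControllerBase;
use Drupal\node\Entity\Node;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;

// Hypothetical controller in a custom module: returns promo content as
// JSON, filtered by a "season" context value sent by the client.
class PromoController extends ControllerBase {

  public function promos(Request $request) {
    // Fall back to a default promo set when no context was sent.
    $season = $request->query->get('season', 'default');

    // Find promo nodes tagged with the given season ("promo" content type
    // and "field_season" are assumed names for this sketch).
    $nids = \Drupal::entityQuery('node')
      ->condition('type', 'promo')
      ->condition('field_season', $season)
      ->accessCheck(TRUE)
      ->execute();

    $items = [];
    foreach (Node::loadMultiple($nids) as $node) {
      $items[] = [
        'title' => $node->getTitle(),
        'url' => $node->toUrl()->setAbsolute()->toString(),
      ];
    }

    return new JsonResponse(['items' => $items]);
  }

}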

The solution

In order to make this work the following ingredients were needed:

  1. The storage of user profile data in localstorage
    • The generation of that data (not covered in this article)
  2. A client side solution for:
    • retrieving the context
    • querying an endpoint with the context (with a default fallback)
    • transforming the data with Javascript
    • displaying the data back to the user.

We created Personified to handle the second part of this problem, handling the state, query, transform and display.

Along the way, we needed to solve the transformation part of the problem with a client-side solution. In order to find a pluggable solution, we developed the JSON Template module which is able to transform JSON to HTML using a variety of backends. We implemented a Handlebars transformer as the initial plugin as this provided us with a simple and well-known solution.

How does it work

The best way to demonstrate this is with a screenshot of what the block edit screen looks like:

The screenshot demonstrates the various ingredients which were mentioned above:

  • Data is gathered from localstorage
  • An endpoint is defined with a querystring parameter
  • A default fallback option is available for when localstorage is empty
  • A JSON Template is selected to transform the data.

In this example, the user's 'season' is used to deliver a seasonal promo from the Drupal CMS.

You can see all of this in action on our Convivial Demo site. I encourage you to explore that site to see how Drupal can be personalized for anonymous users. (We built this site with Convivial CMS.)

Oct 01 2021

The VantaJS library provides a range of animated backgrounds for use in websites. The animations are organic and mesmerizing to look at. They also respond to mouse hover events, providing a playful addition to a site.

The Modifiers Vanta module is a Drupal module which allows presentational modifications to be added to content on a page. Typical Modifiers implementations include background colours, images, parallax images and videos. VantaJS is of a similar ilk, providing a background for your content.

The VantaJS modifiers module has been released as a contributed open source module on Drupal.org. It has Modifiers as a dependency and implements itself as a Modifier plugin.

To see it in action you can take a look at the header of this blog post. At Morpht we have also implemented it on our individual profile pages.

May 16 2021

A little while back, almost two years ago, Dries Buytaert wrote an interesting thought piece on the sustainability of open source projects such as Drupal. He reviewed the ways different actors engage with open source projects, dividing them into two camps, the Makers and the Takers. The makers build and create, providing benefit to the wider community and society. The takers are able to benefit from this creative process whilst not suffering any of the costs of the creative process, allowing them to gain a competitive advantage.

The difference between Makers and Takers is not always 100% clear, but as a rule of thumb, Makers directly invest in growing both their business and the Open Source project. Takers are solely focused on growing their business and let others take care of the Open Source project they rely on.

In order to demonstrate the difference in outcomes for makers and takers, a payoff matrix was provided. It shows that if everyone contributes there are shared advantages; however, if "takers" decide not to contribute, they will win out as they do not bear the costs of contributing to the project.

                                 Company A contributes     Company A doesn't contribute
  Company B contributes          A makes $50               A makes $60
                                 B makes $50               B makes $20
  Company B doesn't contribute   A makes $20               A makes $10
                                 B makes $60               B makes $10


At the time the article did have an impact on me as it was an attempt by Dries to bring outside concepts to help understand the Drupal project and where it might be heading. Thinking such as this leads to informed ways of conceiving a future for projects such as Drupal and how they might shape themselves to thrive. 

The article examined concepts such as the Prisoners' dilemma, public goods, common goods, free riders, and the tragedy of the commons. Following on from conclusions by Hardin and Olson, the core problem for Dries was that "groups don't act on their shared interests". How can a group organise to avoid the 'free rider' problem? Dries focused on a conclusion from Ostrom, who writes "For any appropriator to have a minimal interest in coordinating patterns of appropriation and provision, some set of appropriators must be able to exclude others from access and appropriation rights." The conclusion was that "Takers will be Takers until they have an incentive to become Makers." These ideas have driven some changes being implemented at the Drupal Association, such as contribution credits and organisation accreditation and membership.

These thoughts have also been influential at Morpht, the Drupal agency where I work. We have adopted a set of foundation principles for the company. One of the key concepts is that we are Makers and Creators and value contributing to the Drupal project and the community. In practice, we have built internal systems to incentivise and reward everyone in the company to contribute back where they can. Outcomes of the process include a rise in commits to the project and a more open approach to how we share our code. We are also financial contributors to the project, supporting the Drupal Association as a supporting partner.


Green bearded altruism

Recently an intriguing video popped up in my stream “Simulating Green Beard Altruism” by Justin Helps. It is expertly researched, explained and visualized. It really is worth watching, so go on, I’ll give you a few minutes to take a look.

[embedded content]

For the uninitiated, myself included: Richard Dawkins coined the concept of green beard altruism in his book The Selfish Gene. It represents a way for one actor to signal to another that they are altruistic. If altruists can recognise other altruists, they are able to direct their altruism at them and increase their chances of survival.

"I have a green beard and I will be altruistic to anyone else with green beard". 
Richard Dawkins, The Selfish Gene

The video took this concept and ran some simulations on how Altruists and Cowards fare under different conditions and rules. This setup relates closely to Dries' post around the sustainability of open source ecosystems.

The Makers (Altruists) build the system and positively benefit the whole. The Takers (Cowards) benefit more because they do not suffer the costs of maintaining the system. In this scenario, the Takers (Cowards) win out and thrive.

But what happens when the Makers (Altruists) only share the benefit with other Makers? They succeed and the Takers (Cowards) are less successful. The simulation, as set up, provides a possible way forward for open source projects such as Dupal. If you are a good actor and only reward other good actors, positive results will flow and will continue to flow.
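As a rough sketch of that dynamic (the parameters and pairing rules here are my own assumptions, not those used in the video), a toy simulation might look like this:

// Toy green beard simulation: altruists pay a cost to give a benefit,
// but only to other altruists they can recognise. Cowards never help.
// All parameter values are illustrative assumptions.
$benefit = 3;
$cost = 1;
$rounds = 100000;

// 50 altruists (TRUE) and 50 cowards (FALSE), all starting at zero fitness.
$population = array_merge(array_fill(0, 50, TRUE), array_fill(0, 50, FALSE));
$fitness = array_fill(0, 100, 0);

for ($i = 0; $i < $rounds; $i++) {
  // Pair two distinct random individuals.
  $a = mt_rand(0, 99);
  do {
    $b = mt_rand(0, 99);
  } while ($b === $a);

  // Help flows only between two recognised altruists.
  if ($population[$a] && $population[$b]) {
    $fitness[$a] -= $cost;
    $fitness[$b] += $benefit;
  }
}

// Average fitness per strategy. Because help is directed only at other
// altruists, each altruist gains more than they give away on average,
// while cowards gain nothing.
$altruists = $cowards = [];
foreach ($population as $index => $is_altruist) {
  if ($is_altruist) {
    $altruists[] = $fitness[$index];
  }
  else {
    $cowards[] = $fitness[$index];
  }
}
printf("Altruists: %.2f, Cowards: %.2f\n", array_sum($altruists) / count($altruists), array_sum($cowards) / count($cowards));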

The final simulation in the video throws in the curveball of actors being dishonest. Sometimes a Coward will pretend to be an Altruist, tricking Altruists into helping them. What do we find?

  • Altruists who do not signal their altruism tend to die out in a system where Altruists are rewarded. They are labelled as Suckers - doing useful things but, to their detriment, not being recognised.
  • Cowards who masquerade as Altruists are successful, reaping the benefits and suffering none of the costs. 

And most concerningly, in a world where actors can hide their true identity, even the Cowards, acting as Cowards, have success. The Altruists in their various forms cannot compete. This final outcome is depressing. What is the point of being altruistic in a world where others can just take advantage? Being altruistic is not enough if actors are gaming the system against you.

In a wider context what could we learn from these simulations? If an individual or organisation is indeed a Maker, they should signal this to others and be rewarded or recognised. This runs against the desire to be modest, but it does appear to be a sensible thing to do. Conversely, those gaming the system should be called out and somehow excluded.

A 2021 update

Dries has recently returned to the Makers and Takers concept in the Q&A following the Driesnote at DrupalCon Global 2021. The video is yet to be released to the public but will be added here once available. Those of you with access to the video on Vimeo can take a look at 7:12 - 9:06. Dries says:

"Open source is a public good, meaning that everyone can use it. If I use a public good it doesn’t exclude you from using it either… One interesting thing is that leads which are essential to business are not a public good. It is actually a common good, meaning that there is a scarcity to it. If I win a deal, you can’t win that deal. So one of the things that we can do is make sure that the leads, the potential customers, …  go to those who contribute. 

There is something there that I really believe strongly. If we can route leads to organisations that contribute we will maximise the number of contributions Drupal will get and I believe that these organisations are often better serviced too because they get help from those organisations that actually help build Drupal. Its kind of a win win.

Sometimes I feel that we are afraid to talk about these topics because they may be somewhat controversial, but there is so much more that we can do."

These comments take the thinking to the next step. There is a recognition that if the payoff for altruistic behaviour is financial, this will lead to further contributions. The way to achieve this is through the "routing of leads".

Applying it all to Drupal

The main takeaways from the above appear to be:

  • We should be encouraging altruistic behaviour because it benefits the project.
  • Altruists can still benefit if they receive the benefit from other altruists.
  • Real financial benefits need to flow to the altruists if they are to be motivated.

So what is happening in the Drupal space?

The Marketplace

In recent times the Drupal project has reorganized itself to encourage more good actors. This has largely been done through the mechanism of recording contributions and promoting those who have contributed the most. Drupal agencies have been encouraged to support staff in contributing. The results are reflected in the Drupal Marketplace. The system gamifies contributions, motivating Drupal service providers to contribute more to move up the leaderboard.

It appears that the following aspects have been valued:

  • Commit credits to core, contrib and other issues.
  • Publishing of case studies.
  • Financial support of the Drupal Association.

There are ongoing efforts to broaden this out and to further incentivise contributions from individuals and organisations.

Increase the exposure

The marketplace system does represent a huge step forward in demonstrating the contributions made by the various providers. It is like an X-ray into who is doing what in the community. It does suffer from a number of shortcomings:

  • What exposure does the marketplace have to potential clients?
  • When a client is looking to engage a Drupal agency, are they referring to the marketplace? If they are not, then the real financial benefits may not flow to the agencies and the system is just a game between the players. It benefits Drupal for sure, but does it benefit the players?

In order to be effective, the marketplace needs more exposure to end clients so that the 'routing of leads' can be improved.

The little guy

The Drupal Marketplace currently ranks organisations according to their credits in absolute terms. To my knowledge the rankings are not normalised by organisation size.

It would be interesting to see what the results would be if the rankings were normalised by employee count. We would then be able to see who the biggest altruists were, dollar for dollar. This would give more incentive to smaller organisations to contribute so they could better signal their altruism. 

Advertisements on drupal.org

It is possible to advertise on drupal.org in a variety of positions. You may be familiar with the prominently displayed ads for private companies which are displayed on the bottom of pages. These ads are visible across the whole of drupal.org and benefit from more exposure than being on the marketplace.

Open up the ad space

This form of promotion is obviously much more 'private' in nature, designed to promote the interests of the advertiser rather than that of the project. It is a way for actors to promote their own self interests against others and the interests of the project as a whole.

The advertising space is currently a vehicle to promote the interests of a selected few, rather than all of the altruists in the system. It creates a feeling of 'them vs. us' in the community to have certain players promoted in this way and not others. The Drupal Association should consider how this space could support all contributors. I would suggest that revenue for this space could still be maintained whilst opening it up proportionally based on contributions.

Sponsorship and visibility

Supporters of the Drupal Association receive promotion at Drupal conferences and in other ways. A sponsorship entitles a service provider to a variety of advantages, the main one of which is the promotion through badges and logo display at conferences. 

Continue the drive for more members

Supporting the project in this way can be more appealing than one-off sponsorship at individual conferences. Supporters do get quite a good level of exposure at conferences and this is a good way to signal altruism to other members of the community.

This system appears to be working reasonably well. Supporting the project is a good thing and sponsorship is a direct way to do it. The big challenge here is to encourage the non-subscribers to jump on board. If all individuals and companies did this in just a small way, the financial security of the Drupal Association would be assured. This has always been the case. As a community we should be encouraging this where we can. So if you are not yet a member or sponsor, you know what to do :).

Burnout and community funding

It is not uncommon for certain prolific or influential contributors to leave the community. A common reason would be burnout because of the stress of sustaining an important codebase in their spare time. It is not sustainable for them to do so, especially when there are many demands for support of features.

Most recently we have seen Jacob Rockowitz, maintainer of the Webform module, post several articles discussing this situation. The result was the decision to move to a sponsored approach using the Open Collective platform. This model encourages users of the Webform module, the Takers, to contribute and become Makers by supporting the continued development of the module.

Closing the altruism loop

The way this message has been communicated has, in my opinion, been done in a very positive way. If you look at the Webform module page, there is a call for support through code, patches and reviews. And for those who cannot do that, financial options exist. This is a direct move to increase the altruism in the community and to close the loop between altruists helping other altruists.

Who sponsors Drupal?

In his blog post: Who Sponsors Drupal, Dries makes the point that companies or smaller agencies support most Drupal development. The bulk of the codebase is supported by actors who no doubt have an active interest in open source and Drupal, as well as the financial and technical ability to help support the codebase.

Deep pockets and shallow expertise

What can be done to broaden this out further, so that we can get more Makers and fewer Takers?

Larger entities with lower technical capability should have an easy way to fund development of code. Once the altruism feedback loop is strengthened, there should be a bigger drive for Takers to become Makers. We need a path for this to take place.

This is not the same as sponsoring the Drupal Association, i.e. infrastructure, promotion and governance. This is about funding code development, strengthening the code and functionality of the project and making it more attractive as a technological proposition.

In order for this to take place, two things need to happen:

  1. The Drupal Association, or some other body, needs to be ready to take on this responsibility.
  2. A method of dividing the resources needs to be determined.

At the moment, the efforts in this area have been ad hoc. There are notable examples of large companies contributing to initiatives which push the project forward. However, in order for this to be scalable, it needs to be done in a more systemic manner. It may be that the Drupal Association doesn't want to take this responsibility on - that is fair enough. If so, how might it be done?

The Webform module has turned to Open Collective. A recent article from Rachel Norfolk makes a similar suggestion. Maybe the DA is taking a look at what is going on with Webform?

Shallow pockets and deep expertise

And what of the individual developer? The one who loves Drupal, loves open source and dedicates their time to improving the project. What if they are not supported financially by a larger organisation? These people are the lifeblood of the project and they need to be supported. It makes a heap of sense to harness their creative energies and support them financially to progress the project. 

If we can find a way to put these Makers together with those with the funds and the desire, great strides will be taken.

I would therefore suggest that this seems to be the most practical approach for closing the altruism loop and progressing the project.

Conclusion

While I make some suggestions, I believe Drupal is in an excellent position. The community is broad and deep and there is a lot of desire to keep the momentum going. There are also some excellent systems in place to help reinforce this such as recognition for contribution and the marketplace.

I have argued for the following:

  • Continuing with the credit system and the marketplace;
  • Increasing the prominence of the Drupal Marketplace to outsiders;
  • Promoting organisations which are punching above their weight in terms of contributions per employee;
  • Continuing the sponsorship approach, with the community encouraging membership of the Drupal Association;
  • Reconsidering the advertising space on the Drupal Association as something for Makers rather than a private channel for a select few;
  • Developing a system to bring Makers together with Takers who have deep pockets.

The biggest challenge, where we can make the most gains, lies in bringing the big Takers into the fold and supporting talented individuals who do not otherwise have support, and with that closing the altruism feedback loop and increasing the chances for the project to grow and improve.
 

Apr 06 2021

Morpht is located on the traditional lands of the Gadigal people of the Eora Nation, the traditional custodians of this place we now call Sydney. We pay our respects to Elders both past and present and recognise Aboriginal and Torres Strait Islander people as the Traditional Custodians of the land.

Apr 06 2021

Drupal 8 is built on PHP, but using new architecture paradigms that can be difficult to grasp for developers coming from a Drupal 7 background. The Typed Data API lies at the core of Drupal 8, and provides building blocks used throughout the Drupal 8 architecture. In this presentation, Jay Friendly, Morpht's Technical Director, dives into the Typed Data API, what it is, how it works, and why it is so awesome!
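For readers who have not encountered it, below is a minimal sketch of what working with the Typed Data API looks like, assuming standard Drupal 8+ APIs: it creates a standalone typed value and validates it against the constraints of its data type (the 'email' type is just one example).

use Drupal\Core\TypedData\DataDefinition;

// Define a piece of typed data: an email value with no attached entity.
$definition = DataDefinition::create('email');

// Instantiate it through the typed data manager with an invalid value.
$email = \Drupal::typedDataManager()->create($definition, 'not-an-email');

// Validation runs the constraints registered for the 'email' data type.
$violations = $email->validate();
foreach ($violations as $violation) {
  // e.g. "This value is not a valid email address."
  $message = $violation->getMessage();
}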

Apr 06 2021

Guzzle makes HTTP requests easy. When they work, it's like magic. However, as with all coding, getting something to work requires debugging, and this is where the Drupal implementation of Guzzle has a major usability problem - any returned messages are truncated, meaning that with the default settings, error messages that can help debug an issue are not accessible to the developer. This article will show developers how they can re-structure their Guzzle queries to log the full error to the Drupal log, instead of a truncated error that does not help fix the issue.

Standard Methodology

Generally, when making a Guzzle request, it is made using a try/catch paradigm, so that the site does not crash in the case of an error. When not using try/catch, a Guzzle error will result in a WSOD, which is as bad as it gets for usability. So let's take a look at an example of how Guzzle would request a page using a standard try/catch:

try {
  // Make a GET request for the page using Drupal's HTTP client (Guzzle).
  $client = \Drupal::httpClient();
  $result = $client->request('GET', 'https://www.google.com');
}
catch (\Exception $error) {
  // Log the exception message (truncated by Guzzle) to the Drupal log.
  $logger = \Drupal::logger('HTTP Client error');
  $logger->error($error->getMessage());
}

This code will request the results of www.google.com, and place them in the $result variable. In the case that the request failed for some reason, the system logs the result of $error->getMessage() to the Drupal log.

The problem, as mentioned in the intro, is that the value returned from $error->getMessage() contains a truncated version of the response returned from the remote website. If the developer is lucky, the text shown will contain enough information to debug the problem, but rarely is that the case. Often the error message will look something along the lines of:

Client error: `POST https://example.com/3.0/users` resulted in a `400 Bad Request` response: {"type":"http://developer.example.com/documentation/guides/error-glossary/","title":"Invalid Resource","stat (truncated...)

As can be seen, the full response is not shown. The actual details of the problem, and any suggestions as to a solution are not able to be seen. What we want to happen is that the full response details are logged, so we can get some accurate information as to what happened with the request.

Debugging Guzzle Errors

In the code shown above, we used the catch statement to catch \Exception. Generally developers will create a class that extends \Exception, allowing users to catch specific errors, finally catching \Exception as a generic default fallback.

When Guzzle hits an error, it throws the exception GuzzleHttp\Exception\GuzzleException. This allows us to catch this exception first to create our own log that contains the full response from the remote server.

We can do this, because GuzzleException provides the response object from the original request, which we can use to get the actual response body the remote server sent with the error. We then log that response body to the Drupal log.

use Drupal\Component\Render\FormattableMarkup;
use GuzzleHttp\Exception\GuzzleException;

try {
  $response = $client->request($method, $endpoint, $options);
}
// First try to catch the GuzzleException. This indicates a failed response from the remote API.
catch (GuzzleException $error) {
  // Get the original response.
  $response = $error->getResponse();
  // Get the info returned from the remote server.
  $response_info = $response->getBody()->getContents();
  // Using FormattableMarkup allows for the use of <pre> tags, giving a more readable log item.
  $message = new FormattableMarkup('API connection error. Error details are as follows:<pre>@response</pre>', ['@response' => print_r(json_decode($response_info), TRUE)]);
  // Log the error.
  watchdog_exception('Remote API Connection', $error, $message);
}
// A non-Guzzle error occurred. The type of exception is unknown, so a generic log item is created.
catch (\Exception $error) {
  // Log the error.
  watchdog_exception('Remote API Connection', $error, t('An unknown error occurred while trying to connect to the remote API. This is not a Guzzle error, nor an error in the remote API, rather a generic local error occurred. The reported error was @error', ['@error' => $error->getMessage()]));
}

With this code, we have caught the Guzzle exception, and logged the actual content of the response from the remote server to the Drupal log. If the exception thrown was any other kind of exception than GuzzleException, we are catching the generic \Exception class, and logging the given error message.

By logging the response details, our log entry will now look something like this:

Remote API connection error. Error details are as follows:

stdClass Object (
  [title] => Invalid Resource
  [status] => 400
  [detail] => The resource submitted could not be validated. For field-specific details, see the 'errors' array.
  [errors] => Array (
    [0] => stdClass Object (
      [field] => some_field
      [message] => Data presented is not one of the accepted values: 'Something', 'something else', or 'another thing'
    )
  )
)

* Note that this is just an example, and that each API will give its own response structure.

This is a much more valuable debug message than the original truncated message, which left us understanding that there had been an error, but without the information required to fix it.

Summary

Drupal 8 ships with Guzzle, an excellent HTTP client for making requests to other servers. However, the standard debugging method doesn't provide a helpful log message from Guzzle. This article shows how to catch Guzzle errors, so that the full response can be logged, making debugging of connection to remote servers and APIs much easier.

Happy Drupaling!

Apr 06 2021

Background

We live in an age of Drupal complexity. In the early days of Drupal, many developers would have a single Drupal instance/environment (aka copy) that was their production site, where they would test out new modules and develop new functionality. Developing on the live website however sometimes met with disastrous consequences when things went wrong! Over time, technology on the web grew, and nowadays it's fairly standard to have a Drupal project running on multiple environments to allow site development to be run in parallel to a live website without causing disruptions. New functionality is developed first in isolated private copies of the website, put into a testing environment where it is approved by clients, and eventually merged into the live production site.

While multiple environments allow for site development without causing disruptions on the live production website, they introduce a new problem: how to ensure consistency between site copies so that they are all working with the correct code.

This series of articles will explore the Configuration API, how it enables functionality to be migrated between multiple environments (sites), and ways of using the Configuration API with contributed modules to effectively manage the configuration of a project. This series will consist of the following posts:

This article will focus specifically on how developers can manage, declare, and debug configuration in their custom modules.

Configuration Schema

Configuration schema describes the type of configuration a module introduces into the system. Schema definitions are used for things like translating configuration and its values, typecasting configuration values into their correct data types, and migrating configuration between systems. Configuration in the system is not very helpful without metadata that describes what the configuration is. Configuration schemas define the configuration items.

Any module that introduces any configuration into the system MUST define the schema for the configuration the module introduces.

Configuration schema definitions are declared in [MODULE ROOT]/config/schema/[MODULE NAME].schema.yml, where [MODULE NAME] is the machine name of the module. Schema definitions may define one or multiple configuration objects. Let's look at the configuration schema for the Restrict IP module for an example. This module defines a single configuration object, restrict_ip.settings:

restrict_ip.settings:
  type: config_object
  label: 'Restrict IP settings'
  mapping:
    enable:
      type: boolean
      label: 'Enable module'
    mail_address:
      type: string
      label: 'Contact mail address to show to blocked users'
    dblog:
      type: boolean
      label: 'Log blocked access attempts'
    allow_role_bypass:
      type: boolean
      label: 'Allow IP blocking to be bypassed by roles'
    bypass_action:
      type: string
      label: 'Action to perform for blocked users when bypassing by role is enabled'
    white_black_list:
      type: integer
      label: 'Whether to use a path whitelist, blacklist, or check all pages'
    country_white_black_list:
      type: integer
      label: 'Whether to use a whitelist, blacklist, or neither for countries'
    country_list:
      type: string
      label: 'A colon separated list of countries that should be white/black listed'

The above schema defines the config object restrict_ip.settings which is of type config_object (defined in core.data_types.schema.yml).

When this module is enabled, and the configuration is exported, the filename of the configuration will be restrict_ip.settings.yml. This object has the keys enable, mail_address, dblog etc. The schema tells what type of value is to be stored for each of these keys, as well as the label of each key. Note that this label is automatically provided to Drupal for translation.

The values can be retrieved from the restrict_ip.settings object as follows:

$enable_module = \Drupal::config('restrict_ip.settings')->get('enable');
$mail_address = \Drupal::config('restrict_ip.settings')->get('mail_address');
$log = \Drupal::config('restrict_ip.settings')->get('dblog');

Note that modules defining custom fields, widgets, and/or formatters must define the schema for those plugins. See this page to understand how the schema definitions for these various plugins should be defined.

Default configuration values

If configuration needs to have default values, the default values can be defined in [MODULE ROOT]/config/install/[CONFIG KEY].yml where [CONFIG KEY] is the configuration object name. Each item of configuration defined in the module schema requires its own YML file to set defaults. In the case of the Restrict IP module, there is only one config key, restrict_ip.settings, so there can only be one file to define the default configuration, restrict_ip/config/install/restrict_ip.settings.yml. This file will then list the keys of the configuration object, and the default values. In the case of the Restrict IP module, the default values look like this:

enable: false
mail_address: ''
dblog: false
allow_role_bypass: false
bypass_action: 'provide_link_login_page'
white_black_list: 0
country_white_black_list: 0
country_list: ''

 

As can be seen, each of the mapped keys of the restrict_ip.settings config_object in the schema definition is added to this file, with a default value provided for each key. If a key does not have a default value, it can be left out of this file. When the module is enabled, these are the values that will be imported into active configuration as defaults.

Debugging Configuration

When developing a module, it is important to ensure that the configuration schema accurately describes the configuration used in the module. Configuration can be inspected using the Configuration Inspector module. After enabling your custom module, visit the reports page for the Configuration Inspector at /admin/reports/config-inspector, and it will list any errors in configuration.

The Configuration Inspector module errors in configuration schema definitions

Clicking on 'List' for items with errors will give more details as to the error.

The 'enable' key has an error in schema. The stored value is a boolean, but the configuration definition defines a string

Using the Configuration Inspector module, you can find where you have errors in your configuration schema definitions. Cleaning up these errors will correctly integrate your module with the Configuration API. In the above screenshot, the type of data in the active configuration is a boolean, yet the configuration schema defines it as a string. The solution is to change the schema definition to boolean.
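
In this example the fix is a one-line change to the schema file, declaring the key with the type actually stored:

enable:
  type: boolean
  label: 'Enable module'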

Summary

In this final article of this series on the Drupal 8 Configuration API, we looked at configuration schema, how developers can define this schema in their modules and provide defaults, as well as how to debug configuration schema errors. Hopefully this series will give you a fuller understanding of what the Configuration API is, how it can be managed, and how you can use it effectively in your Drupal projects. Happy Drupaling!

Apr 06 2021
Apr 06

Background

We live in an age of Drupal complexity. In the early days of Drupal, many developers would have a single Drupal instance/environment (aka copy) that was their production site, where they would test out new modules and develop new functionality. Developing on the live website, however, sometimes met with disastrous consequences when things went wrong! Over time, technology on the web grew, and nowadays it's fairly standard to have a Drupal project running on multiple environments to allow site development to be run in parallel to a live website without causing disruptions. New functionality is developed first in isolated private copies of the website, put into a testing environment where it is approved by clients, and eventually merged into the live production site.

While multiple environments allow for site development without causing disruptions on the live production website, they introduce a new problem: how to ensure consistency between site copies so that they are all working with the correct code and configuration.

This series of articles will explore the Configuration API, how it enables functionality to be migrated between multiple environments (sites), and ways of using the Configuration API with contributed modules to effectively manage the configuration of a project. This series will consist of the following posts:

  • Part 1 gives the background of the Configuration API, and discusses the terminology used within the series.
  • Part 2 describes how the API works.
  • Part 3 explains how to use the functionality provided by core.
  • Part 4 (this article) covers contributed modules that extend the Configuration API.
  • Part 5 looks at configuration management for module developers.

Parts 1 to 3 are worth a read before beginning this article.

Read-only configuration

In some situations, site builders may want to prevent any configuration changes from being made on the production environment, to avoid changes that could cause unexpected issues. For example, clients with admin access could log into the production server and make what they think is an innocent configuration change that results in unexpected and drastic consequences. Some site builders consider it a best practice to prevent configuration changes on the production server altogether, on the basis that only content should be editable on production; configuration changes should only be made in development and/or staging environments, where they can be tested before being pushed to production.

The Config Readonly module allows configuration changes through the UI to be disabled on a given environment. It does this by disabling the submit buttons on configuration pages. The module also disables configuration changes made using Drush and Drupal Console.

A configuration form that has been disabled with the Configuration Readonly module

Note: some configuration forms may still be enabled when using this module. Module developers must build their forms by extending ConfigFormBase for the Config Readonly module to do its magic. If a developer has built a form using other means, the form will not be disabled, and the configuration for that form can still be changed through the admin UI.
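
For reference, here is a minimal sketch of what such a form might look like. The module name example, the form class, and the example.settings config object are all hypothetical; only the ConfigFormBase pattern itself is the point:

<?php

namespace Drupal\example\Form;

use Drupal\Core\Form\ConfigFormBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * A settings form built on ConfigFormBase, which Config Readonly can disable.
 */
class ExampleSettingsForm extends ConfigFormBase {

  public function getFormId() {
    return 'example_settings_form';
  }

  protected function getEditableConfigNames() {
    // The config objects this form is allowed to edit.
    return ['example.settings'];
  }

  public function buildForm(array $form, FormStateInterface $form_state) {
    $form['enable'] = [
      '#type' => 'checkbox',
      '#title' => $this->t('Enable feature'),
      '#default_value' => $this->config('example.settings')->get('enable'),
    ];
    // parent::buildForm() adds the submit button that Config Readonly disables.
    return parent::buildForm($form, $form_state);
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {
    $this->config('example.settings')
      ->set('enable', $form_state->getValue('enable'))
      ->save();
    parent::submitForm($form, $form_state);
  }

}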

To set up an environment as read-only, add the following line to settings.php, then enable the module:

$settings['config_readonly'] = TRUE;

After an environment is set as read-only, changes to configuration can only be made on other environments, then migrated and imported into the active configuration on the read-only environment.

Complete split (blacklist) configuration

Sometimes configuration needs to exist in some environments but not in others. For example, development modules like the Devel module, or UI modules like Views UI and Menu UI (both Drupal core), should not be enabled on production environments: they add overhead to the server while being unnecessary, since the production server should not be used for development.

A problem arises when configuration is exported from one environment, and imported into the production environment. All the configuration from the source environment is now the active configuration on the production environment. So any development modules that were enabled on the source environment are now enabled on the production environment. In the case of development modules like Devel, this may only add some overhead to the server, but imagine a module like the Shield module, which sets up environments to require a username and password before even accessing the site. If this module is accidentally enabled upon import on production, it will block the site from public access - a disaster!

The solution to this situation is to blacklist configuration: blacklisted configuration is removed from the export. This functionality is provided by the Configuration Split module, which allows configuration to be blacklisted by module, by individual configuration key(s), or by wildcard.

Note that more detailed directions for creating blacklists can be found on the module's documentation page. The following is meant to give an overview of how blacklists work.

Blacklists are created as part of a configuration profile. Configuration profiles allow for 'splitting' (a divergence in) configuration between environments. Profiles may be created for environment types such as development, staging and production, allowing for configuration specific to those types of environments. Or profiles could be set up for public non-production environments that need the Shield module enabled and configured; while a development profile may apply to all development environments, not all development environments are on publicly accessible URLs, and therefore not all of them need the Shield module enabled.

When setting up a configuration profile, note that the folder name must be the same as the machine_name of the profile.

Configuration split profile settings

Note that you must manually create the folder specified above, and that folder can and should be tracked using Git, so it can be used on any environment that enables the profile.

Configuration can then be set up to be blacklisted either by module, by configuration key, or by wildcard:

Complete split (blacklist) can be set by module, configuration key, or by wildcard

Finally, environments need to be set up to use a given profile. This is handled by adding the following line to settings.php on the environment:

$config['config_split.config_split.PROFILEMACHINENAME']['status'] = TRUE;

Where PROFILEMACHINENAME is the machine_name from the profile you created.
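
Since settings.php differs per environment anyway, the active profile can also be derived dynamically. A sketch, assuming the hosting platform sets an ENVIRONMENT variable and that split profiles are named to match (both of these are assumptions, not part of the module):

// settings.php: activate the configuration split for this environment.
// The ENVIRONMENT variable and the profile names are assumptions; adapt
// them to your hosting setup.
$environment = getenv('ENVIRONMENT') ?: 'local';
$config['config_split.config_split.' . $environment]['status'] = TRUE;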

Although blacklisted configuration does not become part of the exported archive, it is not ignored altogether. When an environment has the profile enabled, upon export, blacklisted configuration is extracted, then written to the folder specified in the profile. The remaining configuration is written to the default configuration directory. When importing configuration, environments with the profile enabled will first retrieve the configuration from the default configuration directory, then apply any configuration from the folder specified in the profile. Environments not set up to use the profile ignore the configuration in the blacklisted directory altogether on both import and export.

This means that a developer can enable the Devel module on their local environment, blacklist it, then export their configuration. The blacklisted configuration never becomes part of the default configuration, and therefore the module will not accidentally be installed on environments with the configuration profile enabled.

Conditional split (grey list) configuration

Grey lists, also provided by the Configuration Split module, allow for configuration to differ by environment. With a blacklist (previous section), the configuration only exists in the active database configuration for environments that are set up to use the configuration profile containing the blacklisted configuration. With a grey list, the configuration exists in the active configuration in all environments, but the configuration profiles can be set up to allow environments to use differing values for the configuration.

Imagine an integration with a remote API requiring a username, password, and endpoint URL. The production server needs to integrate with the remote API's production instance, while other environments integrate with the remote API's sandbox instance. As such, the values to be used will differ by environment:

Production Environment:

remoteapi.username: ProductionUsername
remoteapi.password: ProductionPassword
remoteapi.endpoint: https://example.com/api

Other Environments:

remoteapi.username: SandboxUsername
remoteapi.password: SandboxPassword
remoteapi.endpoint: https://sandbox.example.com/api

A grey list allows for the setup of these values by configuration profile.

You may remember that Part 3 of this series discussed overriding configuration in settings.php, and be thinking that a grey list sounds like the same thing. After all, the default values for the sandbox instance of the API could be set up as the configuration values, and the production values could be overridden in settings.php on the production environment, with the same end result.

The difference is that with a grey list, the remote API values are saved to the configuration profile folder, which is tracked by Git, and therefore can be tracked and migrated between environments. When grey listed configuration is exported, the grey listed configuration is written to the configuration profile folder, in the same manner as blacklisted configuration. When configuration is imported, the default values are retrieved, and the grey list values are used to override the default values, after which the configuration is imported into active configuration.

With the configuration override method using settings.php, site builders need to store the various configuration values somewhere outside the project, and communicate environment-specific values to each other through some other channel, to be manually entered on the relevant environment(s). With a grey list, the configuration values are managed with Git, meaning site builders do not need to record them outside the project, nor communicate them to each other through some other means. Site builders simply need to enable the relevant configuration profile in settings.php, and the environment-specific values can then be imported into active configuration from the configuration profile directory. This means that the sandbox API values can be set up as the defaults used on all environments, and a production configuration profile can be enabled on the production environment, carrying the values to connect to the production instance of the remote API.

Conditional split items can be selected either from a list, or by manually entering them into the configuration profile:

Conditional split (grey list) settings can be selected or manually entered

Finally, note that grey lists can actually be used in conjunction with configuration overrides in settings.php. Grey lists are applied during import and export of configuration from the database. Values in settings.php are used at runtime, overriding any active configuration. So a developer could choose to set up their local instance of the system to connect to an entirely different instance of the remote API altogether by overriding the values in settings.php.
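
As a sketch, assuming the remote API values above live in a configuration object named remoteapi.settings (a hypothetical name for this example), a local runtime override in settings.php might look like:

// settings.php: runtime override, applied on top of whatever the active
// configuration (including any grey-listed values) contains.
// 'remoteapi.settings' and the endpoint URL are illustrative assumptions.
$config['remoteapi.settings']['endpoint'] = 'https://localhost:8443/api';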

Ignoring configuration (overwrite protection)

Sometimes developers will want to protect certain configuration items in the database from ever being overwritten. For example, imagine a site named Awesome Site, with a module named awesome_core that supplies the core functionality of the site. Since this module provides the site's core functionality, it should never be disabled under any circumstances. In this case, the configuration for this module can be set to be 'ignored': any attempt to import ignored configuration from the file system into the active configuration in the database will be skipped.

Configuration can be ignored using the Config Ignore module. The functionality this module provides is similar to the functionality provided by the Config Readonly module discussed earlier, however the Config Readonly module covers the entire configuration of an environment, while the Config Ignore module allows for choosing configuration that should be protected. This configuration is protected by ignoring it altogether on import.

Configuration can be ignored as follows:

  1. Enable Config Ignore module on all environments.
  2. Navigate to the Config Ignore UI page, and add the configuration item to be ignored. In the case of preventing the awesome_core module from being disabled, the following would be added:

    core.extension:module.awesome_core

    Configuration to be ignored is entered one item per line. Wildcards can be used.

This setting will ensure that any attempts to change or remove core.extension:module.awesome_core upon configuration import will be ignored. So if the module is enabled on production, and a developer pushes configuration changes that would uninstall this module, those changes will be ignored, and the module will still be set as enabled after import.
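
For illustration, an ignore list protecting the awesome_core entry and, hypothetically, all webform configuration might contain entries like the following (the webform line is an invented example of a wildcard):

core.extension:module.awesome_core
webform.webform.*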

Summary

In this article, we looked at various modules that extend the Configuration API, the use cases behind these modules, and how they work. We looked at the Config Readonly module, the Configuration Split module, and the Config Ignore module, and how to use these modules to manage configuration differences between environments. In the fifth and final part of this series, we will look at configuration management for module developers, and how developers can define the schema for the configuration in modules they develop.

Apr 06 2021
Apr 06

Background

We live in an age of Drupal complexity. In the early days of Drupal, many developers would have a single Drupal instance/environment (aka copy) that was their production site, where they would test out new modules and develop new functionality. Developing on the live website, however, sometimes met with disastrous consequences when things went wrong! Over time, technology on the web grew, and nowadays it's fairly standard to have a Drupal project running on multiple environments to allow site development to be run in parallel to a live website without causing disruptions. New functionality is developed first in isolated private copies of the website, put into a testing environment where it is approved by clients, and eventually merged into the live production site.

While multiple environments allow for site development without causing disruptions on the live production website, they introduce a new problem: how to ensure consistency between site copies so that they are all working with the correct code and configuration.

This series of articles will explore the Configuration API, how it enables functionality to be migrated between multiple environments (sites), and ways of using the Configuration API with contributed modules to effectively manage the configuration of a project. This series will consist of the following posts:

Part 1 gives the background of the Configuration API and discusses some terminology used within the series, so it's worth a read before beginning this article.

Active configuration is in the database

In Drupal 8, configuration used at runtime is stored in the database. The values in the database are known as active configuration. In Drupal 7, configuration was known as settings, and stored in the {variable} table. In Drupal 8, configuration is stored in the {config} table. The active configuration is used at runtime by Drupal when preparing responses.

Configuration is backed up to files

The Configuration API makes it possible to export the active database configuration into a series of YML files. These files can also be imported into the database. This means that a developer can create a new Field API field on their local development environment, export the configuration for the new field to files, push those files to the production environment, then import the configuration into the production environment's active configuration in the database.

The configuration values in the database are the live/active values, used by Drupal when responding to requests. The YML files that represent configuration are not required, and are never used at runtime. In fact, in a new system the configuration files don't even exist until/unless someone exports the active configuration from the database. The configuration files are simply a means to back up and/or migrate configuration between environments.
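
In practice this export/import round trip is typically done with Drush; the commands below assume Drush 9 or later (the older cex/cim aliases also work):

# Export the active configuration from the database to the sync directory:
drush config:export

# Import the configuration files back into the active database configuration:
drush config:import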

Configuration architecture

Let's look at the Configuration API on a more technical level, using a real-world example. The Restrict IP module allows users to set a list of rules that whitelist or blacklist users based on their IP address. Upon visiting the module settings page, users are presented with a checkbox that allows them to enable/disable the module functionality.

From a data standpoint, checkboxes are booleans; they represent either a true or false value. When exporting the configuration of a site with the Restrict IP module enabled, the relevant configuration key will be saved with a value of either true or false to a .yml file. Modules are required to define the schema for any configuration the module creates. Developers can look at the configuration schema declarations to understand what file(s) will be created, and what values are accepted.

Modules declare the schema for their configuration in the [MODULE ROOT]/config/schema directory. In the case of the Restrict IP module, the schema file is restrict_ip/config/schema/restrict_ip.schema.yml. This file contains the following declaration:

restrict_ip.settings:
  type: config_object
  label: 'Restrict IP settings'
  mapping:
    enable:
      type: boolean
      label: 'Enable module'

Schema declarations tell the system what the configuration looks like. In this case, the base configuration object is restrict_ip.settings, from the first line. When this configuration is exported to file, the file name will be restrict_ip.settings.yml. In that file will be a declaration of either:

enable: true

Or:

enable: false

When the file restrict_ip.settings.yml is imported into the active configuration in another environment's database, the value for the enable key will be imported as defined in the file.

On top of this, enabled modules are listed in core.extension.yml, which is the configuration that tracks which modules are enabled in a given environment. When the Restrict IP module is enabled in one environment, and configuration files exported from that environment are imported into a different Drupal environment, the Restrict IP module will be enabled due to its existence in core.extension.yml, and the setting enable will have a value of either true or false, depending on what the value was exported from the original environment.
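
For reference, an abridged core.extension.yml might look like the following; the module and theme names shown besides restrict_ip are illustrative, and the numeric values are module weights:

module:
  node: 0
  restrict_ip: 0
  system: 0
theme:
  bartik: 0
profile: standard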

Note that if you were to try to import the configuration without having the Restrict IP module in the codebase, an error will be thrown and the configuration import will fail with an error about the Restrict IP module not existing.

Summary

In this article, we looked at how the Drupal 8 Configuration API works on a technical level. We looked at how active configuration lives in the database, and can be exported to files which can then be imported back into the database, or migrated and imported to other Drupal environments. In part 3 of the series, Using the API, we will look at how to actually use the Configuration API, as well as some contributed modules that extend the functionality of the Configuration API, allowing for more effective management of Drupal 8 projects.

Mar 26 2021
Mar 26

It seems that with each passing year there is a new paradigm for how content can be arranged and organised in Drupal. Over the years a number of approaches have moved in and out of vogue: Panels, Display Suite, IPE, Bricks and Paragraphs, to name a few. Some change has been positive, providing leaps forward in flexibility or control. Other developments have not lived up to their promise.

In February 2021 I presented a new module, Layout Paragraphs, to the Sydney Meetup. The slides and video have been provided below. The presentation demonstrates Layout Paragraphs in action and shows how it offers some advanced layout options for Paragraphs. Conceptually it is similar to Layout Builder in many respects; however, it performs its magic on the node edit page, integrating with the natural content editing environment for site editors.

Layout Paragraphs offers a new way forward for the following reasons:

  • Editing happens on node edit, rather than the layouts page. Better for editors.
  • Paragraphs can be placed into Layout regions to bring more flexibility to Paragraphs. This is similar to what Bricks was doing.
  • Nicer UI for Paragraph selection.
  • Nicer UI for Paragraph display - no need for Preview view mode any more.

A bit of history

It is worth reviewing a little history to see where Layout Paragraphs fits in. The presentation takes a look at some of the popular combinations over the years and gives them overall scores, weighted by functionality and editor experience. Here is a spoiler of what is covered in the video:

Recipe | Year | Score
Pure template | 2010 | 65
Display Suite | 2010 | 58
Panelizer | 2012 | 69
Panelizer and Paragraphs | 2014 | 73
Panelizer and IPE | 2016 | 39
Panelizer, Bricks and Paragraphs | 2017 | 63
Layout Builder and Blocks | 2018 | 70
Layout Builder and Paragraphs | 2019 | 78
Layout Builder, Layout Paragraphs, Paragraphs | 2021 | 81

The scores were calculated from a weighted average of various aspects of the techniques: flexibility, control, editor experience, etc. Watch the video for the details.

Conclusion

You can see that Layout Paragraphs is the latest in the line of approaches, and that it scores quite well. A recipe based around Layout Builder, Layout Paragraphs and Paragraphs seems to work quite well. Layout Builder remains the domain of the site builder, who uses it to define the basic layouts for the page. With Layout Paragraphs, a new set of simpler layouts can be used by the editor for their paragraphs.

I think that the approach holds a lot of promise moving forward and it is good enough for Morpht to be considering it as a standard part of our editor toolkit. All up we have found the module to be usable and a definite improvement on editor experience. We are adopting it into projects where we can.

Watch the video and let us know what you think in the comments below.

Watch the video

Feb 26 2021
Feb 26

How does it stack up

Those of you who work with Drupal are probably familiar with the combination of Search API with a search backend such as MySQL or Solr. A pluggable architecture makes Search API a good choice for indexing content in Drupal.

For a long time MySQL and Solr were the popular choices. MySQL was an easy choice as performance was good and results were OK. For those working with large datasets and many concurrent facets, Solr made more sense. Most Drupal hosting companies provide it as a service for this reason. As the search market has matured, other backends have become available, including one for Sajari.

The table below compares these three options and highlights the strengths and weaknesses of each.

Feature | Database | Solr | Sajari
Separate service | No. Built into Drupal. | Yes. Drupal hosting companies provide Solr as SaaS. | Yes. Sajari is available as a SaaS.
Full text search | Yes | Yes | Yes
Facets | Yes | Yes | Yes
More like this | No | Yes. A useful feature for providing item recommendations based on similarity. | No
Result quality | OK | Good | Very good
Performant | Partial. Slow with many filters over large datasets with facets. | Yes | Yes
Easy install | No. Requires a module such as Search API Database to store the index in the database. | No. Requires a module such as Search API Solr to push data across to Solr. | Yes. Sajari can be configured in the Sajari UI to run from metadata on the page, and provides an embeddable widget. We recommend the Search API Sajari module approach.
Search API Integration | Yes. Search API Database module. | Yes. Search API Solr module. | Yes. Search API Sajari module.
Federation | No | No | Yes. A site parameter can be passed into the index for easy filtering.
ReactJS components | No | No | Yes. The interface is faster than Search API as server round trips are not needed.
Result tracking | No | No | Yes. Built-in metrics surface page trends and poorly performing keywords, helping you see which searches led users to individual pages, and which content visitors are searching for but can't find.
Reporting | No. Reports can be set up in analytics software. | No. Reports can be set up in analytics software. | Yes. Sajari provides logs and charts of search requests.
Autocomplete - suggestions | Yes. An extra module can be installed. | Yes. An extra module can be installed. | Yes
Synonyms | No | No | Yes. Libraries of synonyms can be uploaded via the Sajari UI.
Typos | No | No | Yes. Support for misspelled words.
Boosting | Limited | Limited | Yes. Advanced rules can be defined on certain plans.
Machine learning | No | No | Yes. Sajari learns which results are more or less relevant, promoting the best results to the top.
Pricing | Free. The database comes with Drupal hosting. | Included. A Solr server comes built in with typical Drupal hosting. | Free and up. Starts free for smaller sites and then increases: https://www.sajari.com/pricing
Summary | An easy, low cost search solution. | A more scalable solution with handy features such as "more like this". | A fast system with smart results, helpful for those looking for synonyms, results boosting, tracking and reporting.

Sajari is a viable alternative for clients who are looking for more insights into how their audience use the search on their site and more control over the delivery of the results. This is the case for content driven sites as well as for ecommerce configurations where preferences play a big role.

Integrating Sajari with Drupal

The Sajari Widgets

It is possible to implement Sajari search on any website without adding modules or custom code in the backend. Sajari provides a set of widgets which allow search to operate without much technical knowledge.

Firstly, a JavaScript tracking code allows for "instant indexing". When a user visits a page, the code fires and tells Sajari about the page. Sajari can then visit and index the page to update its index. This approach is simple to set up but has its downsides: freshly updated or deleted content will not make it into the index immediately. If this is a concern, then using Search API Sajari, described below, would be an alternative.

Secondly, Sajari offers a tool in the admin UI to define a search form and results. It covers things such as the search query, filters, tabs, result counts and result display. It is very easy to configure. The result is a snippet which can be embedded onto your search page. A set of ReactJS components drives the search and returns results at lightning speed, leading to a good experience for users.

Drupal Module: Search API Sajari

For those looking for a tighter integration between their Drupal site and Sajari, it is possible to use the API to push updated content across. The Search API Sajari module, authored by the developers at Morpht, provides a backend to the venerable Search API module. This will update Sajari whenever content is updated on your Drupal site.

The main advantages of this approach are:

  • Content is indexed instantly, even when no one views it;
  • Deleted content is removed from the index immediately;
  • The tools within Search API allow for the fine tuning of the various fields;
  • There is support for sending a site name across in the result, allowing for federation of results.

Drupal Module: Sajari

The widgets provided by Sajari offer a quick way to get up and running with a search page. However, there are some limitations in the way they work. At the time of writing (early 2021) the widgets did not support the definition of facets.

In order to overcome this shortcoming, Morpht developed a ReactJS library which sits on top of the components provided by Sajari. It has quite a number of configuration options for queries, result counts, filters, tabs and facets. It even has the ability to customise the results through the provision of a callback function which can convert the JSON result to HTML. This code is available at Sajari Configurator.

The Sajari module makes use of the Sajari Configurator to power the way search is implemented. The module provides a block for defining how the search will operate. The configuration is then passed through to the Sajari Configurator, and the UI and results are then shown.

The Sajari module also makes use of the JSON Template module, which allows for different Handlebars templates to be defined by the themer. These templates can then be selected by an editor during the block creation process. The selected template then forms the basis for the callback which is passed into the Sajari Configurator. The result is that editors can select how to show results. There is no need to alter the ReactJS templates which are in the library.

A recipe

If you are looking to get up and running with Sajari, we recommend this process:

  • Sign up for a free account at Sajari;
  • Set up an initial collection in Sajari, but add no fields;
  • Install JSON Template, Sajari and Search API Sajari;
  • Configure Search API Sajari with your collection details in a new Server;
  • Define your Node Index and assign it to the Sajari server you have just created. The schema will be updated automatically in Sajari with the changes you make in Drupal;
  • Confirm that content is being indexed properly;
  • Add a Sajari search block to your search page and configure it. Be sure to use the correct pipeline and get the field names right;
  • Test the search and confirm it is working.

Conclusion

Sajari is an up-and-coming search provider offering a new breed of search which can utilise human behaviour to improve the results it shows. It's useful for content heavy and ecommerce sites which have a strong need for good search results. There are now integration modules for Drupal to get you up and running with Sajari easily.

Is Sajari right for you?

If you currently have a Drupal site based on a different engine and are interested in what Sajari can offer you, please get in touch with us to discuss it further.

Feb 26 2021
Feb 26

Bright vivid colours

While this trend might be counteracted by another - brutalism - you’ll find a glut of sites brandishing a set of bright colours emboldened further with soft gradients. This of course poses some accessibility challenges when text overlaps such backgrounds. Check out Stripe.

Animated background

The more courageous of the brands push their vivid colours even further with background gradient animations. Check out Qoals.

Glassmorphism

Peering through glass-like interface elements might hark back to Windows Vista times, and more recently with the latest builds of iOS. UI designers are taking it further with 'glassy' overlays to help text become a bit more accessible over gradients and bright backgrounds. Check out DesignCode.

Other trends to get you inspired

Stay tuned

We’ll keep you updated as we release some of these and more on Convivial.

Nov 06 2020
Nov 06

A case study on how we configured the GovCMS8 SaaS platform to handle bulk uploads and the assignment of metadata.

The Attorney-General’s Department supports the technical and content management for several Royal Commissions. A common request is for the timely publishing and management of numerous documents tendered before or at public hearings - 200 documents or more in some instances. With a series of hearings taking place all around Australia, the pressure on the publishing team to manage these loads presents challenges.

The Bulk Upload solution addresses some key requirements:
    

  • Classify and sort documents in one bulk upload process,
  • Manage different file formats of the same document,
  • Run a publishing workflow to gain legal sign off before publishing,
  • Handle the versioning of files when a legal update is required,
  • Display those documents in different views on different pages: hearings, document library, and
  • Make the complex simple and work around limitations.
     
Aug 10 2020
Aug 10

At Morpht, we have been busy experimenting and building proof-of-concept personalisation features on Drupal 8. We started off with content recommendations as one of the cogs in a personalisation machine. Recombee stood out as a great AI-powered content recommendations engine and we decided to develop some contributed modules to integrate it with Drupal.

We have implemented three modules which work together:

  • Search API Recombee indexes content and pushes it across to Recombee.
  • Recombee module tracks users and displays recommendations.
  • JSON Template module defines a plugin system for transformers and templates and controls how JSON can be transformed client side.

The video below is a demonstration of these three modules and how they combine to deliver a powerful recommendations system capable of providing content to anonymous and logged-in users alike.

[embedded content]

Download the slides from the presentation
(PDF 785KB)

Let's talk

Find out how personalisation can help you increase audience engagement and lift user experience.

Contact us
