Jan 24 2024

Introduction

A staff engineer is an engineering architect within Axelerant. 

They plan and create the path for others to follow for project deliveries.

They ensure everyone and everything related to the project remains on the same page.

One might say, if a project team were a band, they would be the lead lyricist, musician, and sometimes, the singer, too. They are the rockstars of engineering.

Since learning that Axelerant was taking on a big Python project—first of its kind in India, or possibly, South Asia—I wanted to learn more about the Python staff engineer role.

And I wanted to learn it from the person responsible for hiring the next Python staff engineer at Axelerant—Bassam Ismail.

Meet Bassam—The Director of Digital Engineering

Bassam lives with his family in the majestic city of Srinagar, in Jammu and Kashmir, India.

Bassam, the Director of Digital Engineering at Axelerant

As the rest of India began enjoying the cool pleasant winds that follow Indian summers, Bassam’s hometown was already covered in snow. In fact, a snowstorm pushed our conversation back by a day.

Bassam's home covered in snow in Kashmir

Bassam joined Axelerant in 2012, after working as a freelance designer in Pixoto for a few years.

“Ankur Gupta (Axelerant’s CEO) reached out to me one day about an opportunity. I didn’t know anything about Axelerant at the time,” he said.

He met Aliya Khanam during the hiring process—a fellow Kashmiri who vouched for Ankur and their work.

Little did he know that this was the beginning of two relationships—with Axelerant, that would take his career to the next level, and with Aliya, who would eventually become his life partner. 

Bassam with his family

After Four Years, He Left Axelerant In 2016

“As an engineer, I always look forward to solving different problems. Applying and expanding my skill set. I had to see how other companies were doing things,” said Bassam.

It was this drive and curiosity that led him to a rather strange conversation with Ankur Gupta.

“I was honest about my intention. I told him that I wanted to try out different roles, different projects in the industry,” he said.

As a fellow software engineer, Ankur understood what Bassam needed at the time. 

“We both understood that I was coming back,” he shared. 

I started the interview with the most pressing question.

Why did you rejoin Axelerant?

Surprisingly, I found the work very similar, almost the same wherever I went.

And there was a strict hierarchy, along with a disregard for work-life balance, innovation, or people pushing beyond what was expected of them. 

Bassam early on in his career with Axelerant

There was also the cultural difference—complete remote work, a people-centric atmosphere, and empowering policies. 

It made me realize Axelerant always thought ahead of time.

Moreover, as I said, every time I left Axelerant, I always intended to learn things from others. 

Ankur and I were clear on this: I was going to return. 

It was either this or taking a sabbatical. 

I was spoiled with the flexibility to experiment with different jobs, not something you see every day.

Can you tell me about the Python project?

Basically, the project is a betting application. 

And in a betting application, you can bet on many things. Take Dream11, for example, which is focused explicitly on cricket. 

But what we are building includes sports, casinos, racing, and more.  

Firstly, it allows for betting across several different sports and races. 

And on top of that, what we are building is not a single application. 

It's a codebase that will allow you to spin up multiple such sites. For reference, we can take Shopify.

You go to Shopify, sign up, and have your own store. It's very similar to that.

It's a massive project with an equally massive scope. 

Click here to apply for the Python staff engineer role at Axelerant

Does this project allow customers to start their betting websites?

Yeah, and these betting websites will be configurable. 

Customers can choose which games and genres to build their betting websites on.

It's up to them whether they want just sports (like cricket, soccer, or rugby), casinos, or a combination of genres. They will be charged a fee for it. 

But on our end, we need to build a scalable platform.

Let's say we are hosting ten websites, each with 20,000 active visitors.

That's around one-fifth of a million.

Tomorrow, instead of ten, we may have 100 websites. That would take the number of active users to 2 million (or 20 lakhs).

What regions are we targeting at first for the application?

At first, we are going to target South Asia.

This type of application hasn't been done here yet, so we'll have significant leverage in the market.

What type of skill set are we looking for in the staff engineer?

What we are looking for is experience building highly scalable systems.

Systems that can easily grow based on traffic—that already have a lot of traffic and a lot of active users. 

These are the primary metrics for us.

The application will first start attracting customers—or, as we call them, white labels.

Each white label will bring in its active users. 

So, with each white label, the application must grow to accommodate thousands of active users.

What type of technologies would we use in this project?

Since this is a staff-level role, the staff engineer is expected to architect the whole system—a continuous, evolving process.

Building large, scalable systems that can handle a lot of traffic should be their core area of interest.

For this project, we’re using Python along with FastAPI to get maximum performance out of the application. A microservices architecture is valued for its fault tolerance and ease of deployment, and we’re leveraging that architecture to shape this app.

So the project involves deploying multiple codebases, as opposed to a single-codebase architecture.

All these deployments happen on AWS, and to manage and orchestrate these services, we use Kubernetes on AWS EKS.

Also, engineers don’t have to provision the infrastructure manually, as we use Terraform for automation.

Another primary piece is the database—we're using Postgres in a DB-per-service fashion. We're using Redis as the caching system, and AWS ElastiCache to manage Redis.

Since the project is still in its early phase, we’ll be leveraging AWS services such as SNS, SQS, or Kafka, along with a time-series database from AWS. 

I do understand that not all professionals have expertise in the exact technologies we use.

If potential engineers don't have FastAPI experience, anyone with a Python background is still relevant, as long as they have worked with Postgres, since it is the go-to DBMS in many cases.

Also, if the engineers do not have experience in AWS, familiarity with Azure or GCP works. Other than that, they should have experience with either Flask or Django as they are the leading Python frameworks used worldwide.

What type of responsibilities would the Python staff engineer have?

The staff engineer's primary focus would be to understand the business requirements. 

Not just a specific business requirement, but the whole business requirement. 

What is the problem that the business is solving? 

It's building a platform that allows users, or rather proprietors, to launch white labels for betting, where traffic will be significant, especially around international sporting events. 

They should have this high-level picture, because based on this picture they have to architect the system. 

What does it look like on a granular level?

On a day-to-day level, the staff engineer will work with a product manager and a project manager. 

  • They will come up with two weeks' worth of work, which we call a sprint. They need to ensure that the work done in these two weeks is demonstrable to the client, solves a very specific feature of the product, and is done in the right way. 
  • Every time somebody makes an increment in their work, they usually raise a pull request on GitHub; the staff engineer has to review the code and give them feedback. 
  • If required, they have to work with the developers to figure out whether certain problems are solvable, or whether a different approach is needed. 
  • And finally, collaborating with the end client. 

The staff engineer has to go through the client requirements, understand them, clarify doubts, gather all the references, and then come up with the architecture based on those requirements.

And then talk with the developers to confirm whether everything is feasible at the development level.

What type of compensation are we looking at?

We actually started with INR 4 million (40 lakhs) per annum. 

But given the current market rate, we are willing to go up to INR 6 million (60 lakhs). Even this might be insufficient to hire the kind of person we want for the role. 

We are keeping our stance flexible in this case, because this project will be highly beneficial for Axelerant.

That's the case with digital engineering projects: the profit margins are higher than with other types of projects.

And the staff engineer's role won't be limited to this project only. 

What would be their impact at Axelerant?

The role of a staff engineer is unique. 

They typically don't work as individual contributors and write code daily.

Their workload can vary depending on the project—some might require their presence for an extended period, while others, an hour or so a day.

And their impact will be across multiple projects, not just one.

The staff engineer within the Digital Engineering team will help us work on many projects, which means a lot of profitability for Axelerant because digital engineering projects have greater profit margins than our other projects.

Their efforts will help Axelerant grow, and we will be able to try out new and exciting things down the line.

How do you ensure work-life flexibility in the team? 

There isn't a unique answer to this. 

At Axelerant, work-life balance and flexibility are given significant importance. Much more than I've seen in most other places.

Bassam out with his team members on a fine winter day

Even now, though we're running late on the project timelines, especially from the backend side, we don't push people to work more than eight hours. 

And we have never done that. Never. 

We only like people to be accountable for their work. And that's it. 

The expectation is that you will be working 40 hours a week. 

You're working more only if that is what you want to do. 

But we'll still recommend that you just work 40 hours a week. And we're bringing this number down to 35. 

We want people to take breaks. Go on vacation. 

We only ask that you give us a decent heads-up so that we can plan our work accordingly.

Bassam and the Kashmir team

Why should someone join Axelerant as a staff engineer?

From a technical standpoint, you will be solving a lot of challenging, complex problems that have yet to be solved within India. 

We don't have a betting platform that does something like this. 

This application is like a SaaS (Software as a Service) for betting. This would be one unique problem set for someone to solve. And that would give them serious leverage in their career. 

Then there is Axelerant's culture. 

Look at the thought behind our empowering benefits for team members

The benefits might change, but the thought behind them would be the same—because that people-centric attitude is what matters.

And one has a lot of growth prospects here.

You get to define your own career based on the efforts you put in.

Click here to apply for the Python staff engineer role at Axelerant
Jan 24 2024

You are probably interested in setting up a working environment for Drupal-based projects, or maybe you have new members on your development team. Either way, you are right to think that configuring the correct development environment is a fundamental part of working with Drupal. By reading this how-to guide, you will implement a complete, ready-to-go Drupal working environment for versions 8, 9, and 10 of our favorite CMS/framework. Do you want to start?…

Picture from Unsplash, user Mathyas Kurmann, @mathyaskurmann.

This content has been constructed as a “how-to” guide, based on the Diátaxis approach for How-to guides, described by Daniele Procida.


Introduction

A local development environment (or “LDE” for short) refers to the combination of software and hardware configurations necessary to develop software comfortably and productively. This includes operating systems, software for programming (IDE), programming languages, frameworks, and versioning systems.

The configuration of an appropriate local development environment is closely tied to a developer’s programming experience, influencing processes such as onboarding or context switching (when you go from programming in one language to working with another technology). As you can imagine, properly configuring an LDE is necessary and very important.

This is even more critical when working with tools that already have a complex learning curve, as is the case with Drupal: providing a ready-made local working environment is key to getting started. Following this article, you will set up a complete LDE for Drupal, ready to use, and start your work. Happy Hacking!

What you will accomplish

Through the implementation of the steps recommended in this article, you will achieve the following goals:

  • You will build a ready-to-go work environment.
  • You will deploy Drupal projects based on Docker containers in LDE.
  • You will commit code that meets quality standards from your IDE to a remote repository.

Specifically, you will have correctly configured the following environments:

  • A lightweight local environment.
  • A heavyweight local environment based on software virtualization (containers).
  • An Integrated Development Environment (IDE) with all the necessary configurations to develop good quality code.

Software requirements

Although there are no software requirements, there are operating system requirements. This how-to guide works on Ubuntu systems, specifically 20.04.6 and 22.04.1 (both LTS) and WSL, the Windows subsystem for Linux.

To find out your current version of Linux / Ubuntu, access the terminal and run:

lsb_release -d 

This will return the description of your current version:

Getting Ubuntu versions from prompt

As you can see in the image above:

  • lsb_release -d, getting Ubuntu 20.04.6
  • lsb_release -d, getting Ubuntu 22.04.1

Set up a lightweight local environment for PHP

While nearly all Drupal development nowadays relies on software virtualization environments (containers), some organizations require at least a basic installation of PHP resources for complementary tasks, such as file validation or running certain functions from the terminal outside of containers.

This implies a minimal installation on the host system. Specifically, you will install only PHP CLI, the command line tool that allows you to execute PHP scripts.

Drupal 10 requires at least PHP 8.1, so you will have to execute different steps on Ubuntu 22.04 and Ubuntu 20.04.6.

Get started

To install PHP CLI on Ubuntu 22.04, follow these steps:

  • Update system dependencies:

    sudo apt update
    
    sudo apt upgrade
    
  • Install the available package for PHP 8.1, avoiding dependencies such as Apache and other unwanted default packages:

    sudo apt install --no-install-recommends php8.1
    
  • Install some basic PHP extensions:

    sudo apt-get install -y php8.1-cli php8.1-common php8.1-zip php8.1-gd php8.1-mbstring php8.1-curl php8.1-xml php8.1-bcmath
    

Now, it’s time for the basic installation of PHP on an older LTS version of Ubuntu. In this case, the PHP version available in the official repositories is still PHP 7.4.3, so to align it with the versions required by Drupal 10, we will have to make some adjustments.

To install PHP CLI in Ubuntu 20.04.6, follow these steps:

  • Update system dependencies:

    sudo apt update
    
    sudo apt upgrade
    
  • You will use the reference repository of Ondřej Surý for PHP versions, so add a new “Personal Package Archive” (PPA) as a new available repository in your system:

    sudo add-apt-repository ppa:ondrej/php
    
    sudo apt update
    

    And press ENTER when prompted.

  • Now install the required PHP versions:

    sudo apt install php8.1
    
  • Finally, install some basic PHP extensions:

    sudo apt install -y php8.1-cli php8.1-common php8.1-zip php8.1-gd php8.1-mbstring php8.1-curl php8.1-xml php8.1-bcmath php8.1-sqlite3
    

The purpose of these lightweight installations of PHP resources in the local environment is to serve as an “extra tool” for working with PHP files.

Check your installation

Now, you must perform some basic checks to confirm that everything is working well. To test your local PHP installation, follow these steps:

  • Check your PHP modules installation:

    php -m
    

    This will return a complete list of PHP and Zend modules installed on your system, including basic resources such as gd (image graphics library), mbstring (multibyte encoding), or Zend OPcache (object cache).

    Getting a list of installed PHP modules

  • Create a simple PHP file:

    cat > phpinfo.php
    <?php
    phpinfo();

    Then exit from the text editor by typing CTRL+D at the prompt.

  • Execute the PHP file using the PHP built-in web server:

    php -S localhost:8000 phpinfo.php 
    
  • Open the URL in your favourite browser and get data from your PHP local installation:

    Getting info from PHP installation

  • Prepare an on-the-fly Drupal installation following the steps recommended in the Quick Start documentation:

    curl -sSL https://www.drupal.org/download-latest/tar.gz | tar -xz --strip-components=1
    

    You may encounter permission issues from the execution of the tar command when running the command recommended above. In that case, try this instead:

    wget -c https://www.drupal.org/download-latest/tar.gz -O - | sudo tar -xz
    

    But you will need to change owner and permissions for the new folder:

    sudo chown -R $USER:$USER drupal-*
    
    sudo chmod -R 775 drupal-*
    
  • Launch a Drupal installation:

    cd drupal-*
    
    php -d memory_limit=256M ./core/scripts/drupal quick-start standard --site-name QuickInstall --host localhost --port 8080
    
  • Check out the new Drupal site automatically created and available in a tab of your preferred web browser:

    Enabling a Drupal site in your local environment

For more information about the PHP built-in web server, read the PHP documentation page.

Set up a heavyweight local environment based on software containers

Now, it is time to prepare a local working environment that follows the trend established in recent years: container-based development (Docker and related resources). For this, we will install Docker as the base system for container management, and on top of it, we will install DDEV.

As for Docker, in 2023-2024 there is little left to say: it is the de facto standard for software virtualization based on the concept of “containers” and the foundation for other DevOps technology stacks currently in extensive use.

We have already talked about DDEV in other articles, posts, tutorials, and how-to guides: it is a solution running on Docker for PHP-based web platforms that facilitates the execution of projects (both existing and new ones). Why DDEV? Compared to other container-based tools such as Docker4Drupal, Lando, or Docksal, DDEV has recently gained significant support from the Drupal community, which has made it practically the default option for development.

Read more about DDEV as a solution:

Get started

To install Docker and DDEV in your local environment, follow these steps:

  • Install Docker and its resources (if the docker-ce package cannot be found, see the note after these steps):

    sudo apt update
    
    sudo apt -y install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
    
  • Add DDEV’s GPG key to your keyring:

    curl -fsSL https://pkg.ddev.com/apt/gpg.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/ddev.gpg > /dev/null
    
  • Add DDEV releases to your package repository:

    echo "deb [signed-by=/etc/apt/trusted.gpg.d/ddev.gpg] https://pkg.ddev.com/apt/ * *" | sudo tee /etc/apt/sources.list.d/ddev.list
    
  • Now update info and install DDEV:

    sudo apt update
    
    sudo apt install -y ddev
    
  • Confirm the installation of the software by checking version:

    ddev -v
    

    The prompt will return something like:

    ddev version v1.22.1
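
Note: the docker-ce packages above come from Docker's own apt repository. If apt reports that docker-ce cannot be found, you will first need to register that repository. A minimal sketch, based on Docker's standard installation steps for Ubuntu (keyring paths and file names may vary on your system):

    # Install prerequisites and create the keyring directory.
    sudo apt install -y ca-certificates curl gnupg
    sudo install -m 0755 -d /etc/apt/keyrings

    # Download Docker's GPG key and register the repository for your Ubuntu release.
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
    sudo chmod a+r /etc/apt/keyrings/docker.gpg
    echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo $VERSION_CODENAME) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

    # Refresh the package index so docker-ce becomes available.
    sudo apt update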
    

Check your installation

To test your local DDEV installation, follow these steps:

  • Create a new Drupal 10 site:

    Prepare the main folder:

    mkdir drupal10-site && cd drupal10-site
    
  • Enable the basic configuration for the site:

    ddev config --project-type=drupal10 --create-docroot --docroot=web
    
  • Init the DDEV container ecosystem:

    ddev start
    
  • Build the new site by downloading the basic resources and executing the installation:

    ddev composer create drupal/recommended-project && \
    ddev composer require drush/drush && \
    ddev drush site:install --account-name=admin --account-pass=admin -y
    
  • Finally, launch the new Drupal site:

    ddev drush uli | xargs xdg-open
    
  • You can run a DDEV command to show the site:

    ddev launch
    

    Enabling a Drupal site in your local environment using DDEV

Set up an IDE for Drupal development

As a third step, you will install an integrated development environment (IDE). An IDE is a fundamental tool for software development, and in the case of Drupal, you will have to make some adaptations to work with its code.

In this scenario you will work with VSCode, Microsoft’s IDE released as open source, which also has a 100% free alternative build (VSCodium, which does not include Microsoft’s usage telemetry).

You will install the IDE on Linux (Ubuntu / Debian-based) and then you will make the necessary configurations and custom changes. In order to prepare an IDE in your local environment, follow the steps below:

Get started

To have a fully functional environment, follow the steps below.

Install VSCode

Install XDebug

  • Open your VSCode installation and launch VSCode Quick Open (Ctrl+P).

  • Install the PHP Debug extension for VSCode by typing the following command in the box and pressing Enter:

    ext install xdebug.php-debug
    
    Installing the PHP Debug extension in VSCode
    
  • Prepare a launch.json file in each project folder, including the lines for Xdebug connections:

    {
      "version": "0.2.0",
      "configurations": [
         {
            "name": "Listen for Xdebug",
            "type": "php",
            "request": "launch",
            "hostname": "0.0.0.0",
            "port": 9003,
            "pathMappings": {
               "/var/www/html": "${workspaceFolder}"
            }
         }
      ]
    }
    
  • Enable Xdebug in DDEV; just run:

    ddev xdebug on
    
  • Now you can enable debug mode by clicking on the “Run and Debug” option, then set some breakpoints within the source code and try to run the site. You are ready for debugging.

Install PHP Codesniffer (PHPCS)

PHP CodeSniffer (PHPCS) provides a pair of scripts (phpcs and phpcbf) to detect coding standard violations and perform automatic repairs. This integration requires a few tasks:

  • Install PHPCS as a resource in DDEV containers:

    ddev composer require --dev drupal/coder dealerdirect/phpcodesniffer-composer-installer
    
  • Verify PHPCS has been installed in your DDEV-based Drupal site:

    ddev exec vendor/bin/phpcs -i
    

    You should see something like:

    The installed coding standards are MySource, PEAR, PSR1, PSR2, PSR12, Squiz, Zend, Drupal, DrupalPractice, VariableAnalysis and SlevomatCodingStandard
    

    This is the list of coding standards enabled for code sniffing.

  • Enable version control in the project root (if not already done) and create a new folder for scripting:

    cd $PROJECT_ROOT
    git init
    mkdir -p scripts/git
    
  • Create a new pair of scripts in that folder, pre-commit and pre-commit-phpcs.php, with the following content:

    #!/bin/sh
    # Run pre-commit check PHP script inside ddev when committing from host.
    if [ "$IS_DDEV_PROJECT" != true ]; then
      ddev exec /usr/bin/php scripts/git/pre-commit-phpcs.php
    else
      /usr/bin/php scripts/git/pre-commit-phpcs.php
    fi
    

    and:

    <?php

    // Determine the git reference to diff against. If HEAD does not exist yet
    // (first commit), fall back to the hash of the empty tree.
    $return = 0;
    $output = [];
    exec('git rev-parse --verify HEAD 2> /dev/null', $output, $return);
    $against = ($return == 0) ? 'HEAD' : '4b825dc642cb6eb9a060e54bf8d69288fbee4904';

    // Identify changed files staged for commit.
    $files = [];
    exec("git diff-index --cached --name-only $against", $files);

    print "\nPrecommit PHPCS\n\n";

    $exit_code = 0;

    foreach ($files as $file) {

      if (file_exists($file) && !is_dir($file)) {

        // Perform PHP syntax check (lint).
        $return = 0;
        $lint_cmd = "php -l {$file}";
        $lint_output = [];
        exec($lint_cmd, $lint_output, $return);
        if ($return !== 0) {
          // Print error messages and set exit code.
          echo implode("\n", $lint_output), "\n";
          $exit_code = 1;
        }

        // Perform phpcs test.
        $return = 0;
        $phpcs_cmd = 'phpcs ' . $file;
        $phpcs_output = [];
        exec($phpcs_cmd, $phpcs_output, $return);
        if ($return !== 0) {
          // Print error messages and set exit code.
          echo implode("\n", $phpcs_output), "\n";
          $exit_code = 1;
        }
      }
    }

    exit($exit_code);
    
  • Connect PHPCS with git commits to perform code reviews before submitting to the repository.

    chmod +x scripts/git/pre-commit
    cd .git/hooks && ln -s ../../scripts/git/pre-commit
    

    Now, every time you commit a new change, git will identify the newly modified files and if applicable (within the PHPCS configuration rules), it will perform a code review, giving you feedback.

  • Add new configuration rules for PHPCS:
    Create a new PHPCS config file in the root folder; the new phpcs.xml.dist will contain something like:

     <?xml version="1.0" encoding="UTF-8"?>
     <ruleset name="drupal_custom_code">
       <description>PHP CodeSniffer configuration for Drupal website development.</description>

       <!-- Paths to review. -->
       <file>RoboFile.php</file>
       <file>web/modules/custom</file>
       <file>web/themes/custom</file>

       <!-- Paths to ignore. -->
       <exclude-pattern>./.ddev</exclude-pattern>
       <exclude-pattern>./vendor</exclude-pattern>
       <exclude-pattern>./web/core</exclude-pattern>
       <exclude-pattern>./web/libraries</exclude-pattern>
       <exclude-pattern>./web/modules/contrib</exclude-pattern>
       <exclude-pattern>./web/themes/contrib</exclude-pattern>
       <exclude-pattern>./web/sites</exclude-pattern>
       <exclude-pattern>./config</exclude-pattern>

       <!-- Coding standards to apply (installed above via drupal/coder). -->
       <rule ref="Drupal"/>
       <rule ref="DrupalPractice"/>
     </ruleset>

The new file will provide the basic enabled rules for PHPCS. You can find more inspiration and examples in other phpcs.xml.dist files, such as the one in Drupal core, at the path web/core/phpcs.xml.dist. Don’t forget to put this new file under git control and commit it to the repository.
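
With the config file in place, you can also run a review manually at any time, outside of the git hook. A minimal sketch using the DDEV wrapper installed above (phpcs picks up phpcs.xml.dist automatically from the project root):

    # Report coding standard violations in the paths declared in phpcs.xml.dist.
    ddev exec vendor/bin/phpcs

    # Automatically fix the violations that phpcbf knows how to repair.
    ddev exec vendor/bin/phpcbf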

Tip: Create Drupal 10 sites on the fly in your environment

Create bash functions to launch Drupal 10 websites on the fly from your terminal. This lets you reuse common steps and avoid repetitive tasks, for example when creating new Drupal 10 sites to test features.

To create Drupal 10 websites in an automated way, follow the steps below:

  • Stop Apache in your environment; this will free port 80:

    /etc/init.d/apache2 stop
    
  • Create (if it does not exist) a .bash_functions file in your home directory:

    vim  ~/.bash_functions
    
  • Add a specific block with some bash commands, gathering all the related DDEV commands to create a new Drupal 10 site:

    ## Creating Drupal projects by using DDEV. 
    d10ddev () {
      # If you don't provide a name, the script will pick a random one for the site.
      if [ -z "$1" ]
      then
          check=$(shuf -n1  /usr/share/dict/words)
          shortened=${check::-2}
          varkeyname=${shortened,,}
      else
          varkeyname=$1
      fi
      # Create main project folder.
      mkdir $varkeyname && cd $varkeyname
      # Prepare basic configuration.
      ddev config --project-type=drupal10 --docroot=web --create-docroot
      yes | ddev composer create "drupal/recommended-project:^10"
      # Require some extra Drupal resources.
      ddev composer require drush/drush drupal/admin_toolbar drupal/devel drupal/coffee
      ddev composer update --lock
      # Execute site install.
      ddev exec drush si --site-name=$varkeyname --account-name=admin --account-pass=admin -y
      # Enable modules and clean cache.
      ddev drush en -y admin_toolbar admin_toolbar_tools admin_toolbar_search admin_toolbar_links_access_filter devel devel_generate coffee
      ddev drush cr
      # Start the new site and open it in browser.
      ddev start && ddev launch
    }
    
  • Edit your main .bashrc file and make sure you have a block like this (if not, add the lines):

    # Alias definitions.
    # You may want to put all your additions into a separate file like
    # ~/.bash_aliases, instead of adding them here directly.
    # See /usr/share/doc/bash-doc/examples in the bash-doc package.
    
    if [ -f ~/.bash_aliases ]; then
        . ~/.bash_aliases
    fi
    
    if [ -f ~/.bash_functions ]; then
        . ~/.bash_functions
    fi
    
  • Source the .bashrc file to make the changes take effect:

    source ~/.bashrc
    
  • Now you can create new Drupal 10 sites on the fly, just run:

    d10ddev
    
  • To delete dummy Drupal sites, just add a new function in the .bash_functions file:

    ## Destroy a DDEV-based Drupal site, by name, from its project folder.
    ddevdestroy () {
      varkeyname=${PWD##*/}
      ddev stop
      yes | ddev delete -O
      cd ..
      rm -rf $varkeyname
    }
    

    This will stop the containers network, destroy the DDEV containers and finally delete the source code from the project folder.

  • Get some examples of bash scripting for day-to-day use in Drupal / DDEV-based projects here on GitHub.

  • Read more about how to customize bashrc files.

:wq!

That’s it! Congratulations: if you have followed all the steps in this how-to guide, you now have a complete local development environment for Drupal. I leave you with a final song, which you can find in the Spotify playlist “The Russian Lullaby”.

See you!


Jan 23 2024

2023 was the year that Artificial Intelligence emerged from the futuristic shadows and into the spotlight, sparking transformative new levels of innovation, efficiency, and productivity. 

Here at Promet Source, we’ve leaned into the tipping point for AI, and the reality that in 2024, AI is serving as the driver of transformative leaps forward. 

That’s what’s inspired this riff on Time magazine’s annual Person of the Year. AI as an influencer of modern life is truly in a class of technology unto itself. 

AI has huge possibilities within Drupal, and the Drupal artificial intelligence (AI) community initiative is pooling powerful innovative energies into multiple projects. At this point, there are essentially three systems being built. 

  1. Open AI / ChatGPT Integration,
  2. Augmentor AI, and 
  3. AI Interpolator.

Here’s an overview of each of the three projects. 
 

1. Open AI / ChatGPT Integrations

The OpenAI module is amazingly powerful and is designed to provide a suite of modules and an API foundation for OpenAI integration in Drupal, covering text content generation, image generation, and content analysis.

OpenAI is the company behind the artificial intelligence products that power applications such as ChatGPT, GPT-4, GPT-3, DALL-E, and GitHub Copilot. The goal of this initiative is to find ways of augmenting and adding assistive AI tech leveraging OpenAI API services in Drupal.

While the newest GPT-4 models support a context window of up to 128,000 tokens, it’s important to note that the number of tokens that can be sent and received depends on the limits of your OpenAI account. This module cannot override OpenAI tier limitations.

There is a vast range of submodules included in the Open AI / ChatGPT Integration (a quick install sketch follows this list). Among them:

  • openai_audio: Adds capability to interact with the OpenAI audio (speech to text) endpoints.
  • openai_chatgpt: Enables interaction with the Chat endpoint via ChatGPT API
  • openai_ckeditor: Provides a button for CKEditor 5 to send a prompt to OpenAI and get generated text back.
  • openai_content: Adds assistive tools for different areas of the content editing process. It adds functionality to adjust the tone of the content, summarize body text, suggest taxonomy terms for nodes, and check content for moderation violations.
  • openai_dalle: Adds capability to interact with the OpenAI DALL·E (image generation) endpoint, using either the new DALL·E 3 model or DALL·E 2 model.
  • openai_eca: Adds capability to build your own custom workflows with the ECA module. With this, you can create and combine your own custom functionality with ChatGPT.
  • openai_devel: Adds GPT content generate capability to Devel Generate. This provides Devel a way of generating realistic content (not lorem ipsum) using GPT and ChatGPT models. Users can generate sample content from the Drupal UI or via Drush. This is useful if you want to fill out your site with realistic sounding content for client demonstration, layout, theming or QA.
  • openai_dblog: This module demonstrates log analysis using OpenAI to find potential solutions/explanations for error logs. Responses from OpenAI are saved and will persist for common error messages so you can review them.
  • openai_prompt: Adds an area in the admin where you can explore OpenAI text generation capability and ask it (prompt) whatever you'd like.
  • openai_embeddings: This module analyzes nodes and generates vectors and text embeddings of your nodes, taxonomy, media, and paragraph entities using OpenAI. Responses from OpenAI are saved and could augment search, ranking, automatically suggest taxonomy terms for content, and improve search relevancy without expensive search backends. Content personalization and recommendation may also be possible with this approach.
  • openai_tts: Adds capability to interact with the OpenAI TTS (text to speech) endpoints.
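
If you want to experiment with the suite, a minimal sketch of adding it to a Composer-managed Drupal site (assuming Drush is available, you enable only the submodules you need, and you configure your own OpenAI API key afterwards):

    # Download the OpenAI module suite from drupal.org.
    composer require drupal/openai

    # Enable the base module plus an example selection of submodules named above.
    drush en -y openai openai_content openai_ckeditor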
     

2. Augmentor AI

Augmentor is an artificial intelligence (AI) integration module which allows content to be augmented in Drupal via connections with external services. The module is designed as an AI framework which allows for the easy integration of disparate AI systems. It provides a pluggable ecosystem for managing a variety of AI services such as GPT3, ChatGPT, NLP Cloud, and Google Cloud Vision.

Important features include CKEditor 4/5 integration and multiple field widgets such as:

  • Augmentor Default Widget: designed for a simple augmentor field in which only a single-value response needs to be managed, namely the best choice returned by the connection API from the augmentor execution. This widget can be compared to Google's "I'm feeling lucky" button.
  • Augmentor Select Widget: intended for receiving multiple result options in a select box, allowing for the choice of preferred options.
  • Augmentor Tags Widget: helps when planning to target a reference field (tag style).


3. AI Interpolator

The idea of the AI Interpolator module is this: take one field and interpolate a result using AI or other modifiers into another type of field. One simple example is to summarize a long text or create an image from a text area with a description.


This module exposes an API that allows other modules to do the following:

  • Attach to a field config and allow any AI Interpolator field plugin to be choosable with custom configs.
  • Add a processors plugin that allows other modules to decide how to process the fields. Everything is chainable, meaning that nearly complete entity generations can be completed using very little input.

A few examples of AI Interpolator possibilities: generating podcasts from URLs, researching and summarizing Google search terms, and setting taxonomies. The website Workflows of AI offers more examples of possible workflows. 


Promet's AI-Powered Initiatives

Here at Promet, we’re channeling resources and sharpening AI-related expertise and focus from multiple angles that include:

 

Metatag AI

Developed and maintained by Promet Source, Metatag AI is an open-source Drupal module that leverages OpenAI to enhance SEO. Automatically generating meta descriptions for blogs and other content based on headers and descriptions, Metatag AI provides content creators with new levels of efficiency and expertise by drawing upon SEO best practices and streamlining SEO-related tasks.  

Metatag AI is dependent on the Metatag module and requires a paid OpenAI account.  
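
As a rough sketch, adding it to a Composer-managed site might look like the following (assuming the project's machine name is metatag_ai and that Drush is available; check the project page for the exact package name):

    # Hypothetical package/module name; verify against the project page.
    composer require drupal/metatag_ai
    drush en -y metatag_ai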
 

AI Modules in Development

We’re currently working on transformative, AI-powered enhancements for search and chat functionality. 

To review: traditional search matches key words in a query and returns a long list of links on a search engine results page (SERP). While answers to specific search queries are sometimes pulled from what appears to be the most relevant site, and highlighted on the SERP, AI-powered search takes it to the next level, leveraging large language models to construct specific answers to specific questions. 

Utilizing AWS technology, we are working through questions pertaining to security, with the expectation of releasing AI-powered search capabilities in the relatively near term.

Once intelligent search becomes available, traditional chat and search functionality will begin to appear cumbersome and tedious. We’re excited about the profound advantages that AI-powered search will have in store for our clients and their users.

Interested in exploring AI-powered possibilities to take your website to the next level? It’s one of our favorite topics these days! Let’s talk

Jan 23 2024

In one of our previous posts, we wrote about a method of creating pages for a Drupal site that gives site administrators and content managers more freedom and independence. We are talking about the Layout Builder module, on which a lot of hopes are pinned. In this post, we will take you through the steps necessary to create a beautiful site page with basic content management features.

Our post is intended for beginning Drupal developers and Drupal site owners who want to learn how to make their content managers’ jobs easier.

What is Layout Builder?

Let us recap what our first post was about. Layout Builder is a module whose flexibility allows you to create layouts for any type of content and use them to develop unique pages.

These pages are built from such bricks as sections and layouts, on which pages are based, and blocks filled with headers, descriptions, images, and other content entities.

The module developers took into account the page creation techniques using the Blocks, Paragraphs and Views modules, but eliminated the need to modify them with custom code and added some WYSIWYG (What You See Is What You Get) and drag-and-drop features. This makes it easier to fill pages with content, but does not replace site developers who are still needed to create the page structure, content type, its fields, and the final appearance of the page. Layout Builder is the big step Drupal has taken towards healthy competition with Tilda and WordPress.

Preparing to create pages with Layout Builder

It is probably more correct to call Layout Builder a project rather than a module, since its work is maintained by two modules: Layout Builder and Layout Discovery. The former connects to other modules via API, defines the layout purpose and helps to draw the layouts, while the latter is an add-on to the former that provides an interface to manage the content type within blocks and blocks within sections. 

To add Layout Builder and Layout Discovery, place the cursor on the Extend tab in the administration panel, select ‘Install new module’ and check the boxes next to the module names.

Enabling the Layout Builder and Layout Discovery modules
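
If you prefer the command line, the same two core modules can be enabled with Drush (assuming Drush is installed on the site):

    drush en -y layout_builder layout_discovery
    drush cr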

Next, we need to create the content type using the ‘Add content’ menu. Let it be an article named ‘test’.

Content type creation

Then, on the Structure tab, point to the Content Types menu item, select Article, and open the Manage Display page. You will see the Use Layout Builder checkbox. Click it to disable the standard field formatter display, which usually sets the way the content of the page is displayed, and the Manage Layout option will become available. Clicking on this button will open the screen where you can create the layout of the pages we are going to develop.

Selection of the Use Layout Builder option

Setting the page structure in Layout mode

Clicking on the Manage Layout button opens the default page where you can add sections and blocks. You need to set up the required structure for this page.

Default layout creation page

Click on Add section to see several layouts that differ in the number of columns. A layout includes blocks such as content fields, views, user fields, etc.

Layout sections with unfilled blocks

Content types have a number of default fields that are stored in the database and are not accessible to developers for modification (for example, ID). There are also fields that developers create themselves: headers, descriptions, images, etc. Each field stores information of the type set when the field was created. Sometimes fields are created to make it easier to present certain information. You could simply create an HTML field and stuff the whole page into it (just theoretically, as such code means sheer trouble in the future), but then every time we edit the node, we would have to edit the entire HTML code. Therefore, multiple fields are created so that you can combine them in different ways.

Adding a field

The developer creates the blocks and sets their position on the Structure/Block page: click the Structure tab and select the Block layout option. 

Adding the block in the Drupal admin panel

Return to the page you are working on, click on the Add Block button within the section, and on the side panel you will see the blocks necessary to model the content components that will be displayed on the page: text, images, time of the post, author of the post, call to action, form, and so on. In our example, we are adding the Title field. Before it appears in the block (for example, the field with the author’s name, as on the screenshot), you will be prompted to configure it in the panel on the right.

Adding the Title block to the layout

If you need to edit the content in the block, click the pencil icon in the upper right corner of the section, and select Configure.

Editing the block content

To save your changes to the layout, click the Save Layout button in the upper left corner of the page. We recommend doing this after each series of major changes. After you save your changes, you can preview the layout with all configured blocks and fields on the page of the content type you are working with.

Default layout override

What if the default template you created for the content type does not apply to certain pages of the configured content type? For example, you have an online store and want to highlight a particular product among the others. To do this, open the Manage Display page and tick the ‘Allow each content item to have its layout customized’ checkbox. Now you can create a custom layout for each individual node.
 

Enabling the Layout Override option

Notice that the Layout tab has appeared among the Local Tasks tabs at the top of the node. Now, if you create and save a custom layout for this node, it will no longer be affected by changes made to the default layout applied to that type of content.

Layout tab in the Local Tasks

Use the Discard Changes button to undo any changes made to this node, and click on the Revert to Defaults button to disable the reassigned node layout and apply the default layout to it.

Discard changes and Revert to defaults buttons

Modules that complement Layout Builder

The Drupal community has created many modules that can improve and simplify the process of creating layouts with the Layout Builder. We use only some of them:

  1. Layout Builder Browser creates the categories by which you can group the blocks that, by default, appear as a single list in the formatter.
  2. Layout Builder Modal displays the newly added block as a pop-up window on the same page. There is no need to reload the page.
  3. Layout Builder Component Attributes allows developers to add HTML attributes to the blocks.
  4. Layout Builder Styles lets you select a style from the list to apply to blocks and sections.

An ADCI Solutions developer:

In some projects, we created blocks and templates for Layout Builder ourselves using standard tools. This is because what’s available on drupal.org either doesn’t quite fit our needs (and so it’s easier to write from scratch than to modify) or contains too much excessive code and too many dependencies that we don’t want to bring into the site for the sake of, say, a template. In this case, again, it’s easier to write it yourself. Statistically, it always took longer and was more difficult to modify scripts and styles in the existing code than to write your own.

Layout Builder is the most advanced tool for designing pages in Drupal 

Like any other technology, Layout Builder has bug fixes in prospect, but we have not heard any radical voices against the module. It does not advocate abandoning the classic Paragraphs and Field Group modules, which are used to create components that are further grouped by the Layout Builder. This tool is not so much for head-on development of pages for a Drupal site, but rather helps content managers and site administrators fill the site with content and change the page structure without involving the developer. And it works: in one of the projects where we used the Layout Builder, the content managers are already working without our help, and the site owner is not spending money on small tasks.

Jan 23 2024

Happy New Year, everyone! Even with the hectic festive season, the final month of 2023 brought us a ton of fantastic Drupal-related content. Check out our top picks from December below!

Planning a Better "Hello, World" for Drupal

This time we’re kicking off our monthly selection with an article from Joe Shindelar of Drupalize.Me which highlights the need for a lower barrier of entry for newcomers to the project, namely with the introduction of a streamlined “Hello world” guide for new developers, along the lines of React’s Quick Start, for example.

Joe’s suggestion is perfectly in line with the overall mission of Drupalize.Me: providing quality learning resources for users of the Drupal platform. He also includes his current working draft of a proposed Drupal Module Developer Guide and invites other community members to share their suggestions and ideas with him.

Read more about the need for a better “Hello world” guide for Drupal

Drupal 10.2 is now available

Next up, we have an announcement of the Drupal 10.2 minor release by Gábor Hojtsy. With the new approach to Drupal releases, major versions have become compatibility releases, while minor versions are now reserved for new features – and the same holds true for Drupal 10.2.

This minor release of Drupal brings new features such as easier content management, more flexible block placements, built-in file name sanitization options, faster permission management and other performance improvements, as well as developer experience optimizations. In addition, Drupal 10.2 is already compatible with the recently released PHP 8.3, with Drupal Core also starting to adopt the new PHP attributes.

Read more about Drupal 10.2

Using DrupalPod for core and contrib development

For the third article from December, we have Michael Anello (ultimike) from DrupalEasy showing how to use DrupalPod, a browser extension and GitPod configuration, for core and contrib Drupal development. One of its main benefits is that it allows developers to easily spin up a personal development environment in their browser without the need to pre-install anything else.

Mike’s article covers all the typical FAQs and steps, from getting started and launching your personal development environment, to the types of cases and users that DrupalPod is ideal for, and more. He concludes with a few next steps for the project.

Read more about DrupalPod

Now’s the time to plan your migration to Drupal 10

Fourth on this month’s list is an article from Four Kitchens’ Laura Johnson about how to successfully upgrade to Drupal 10 from Drupal 7, which is nearing its final end-of-life extension next January. So, after 5th January 2025, any websites still on Drupal 7 will no longer get feature updates, bug fixes or security releases from the community.

On the other hand, moving to the modern Drupal 10 will bring many advantages, in addition to regular new feature releases and security coverage, of course. Some of the main ones are improved website speed & SEO, improved security with reduced maintenance costs, and an optimized content editing experience.

Read more about upgrading from Drupal 7 to Drupal 10

Managing Software Updates for Hundreds of Websites

We continue with an article by David Burns of Lullabot about Renovate, a tool that can automate your Drupal updates and thus help you to very efficiently manage software updates for all your websites.

David starts off by explaining why automating updates is so beneficial, with the key advantage of the amount of time saved as opposed to doing these updates manually. As he states: “Having Renovate (properly configured) is like having an extra developer on your team.”

He also breaks down the tool and its setup in a bit more detail, as well as two other similar options that have not worked as well for Lullabot. Finally, he shares their preferences and configuration for Renovate.

Read more about automating Drupal updates with Renovate

Drupal 10 Single Directory Components + Storybook

Another interesting article from December that we wanted to feature comes from freelance Drupal developer Albert Skibinski who wrote about integrating Drupal 10 single directory components (SDC) with Storybook, a front-end workshop for UI development. This article is a follow-up to one from 2019, when Drupal 8 was the current version and SDC had not yet been a thing in Drupal.

Single Directory Components have been available since Drupal 10.1 as an experimental module and can be integrated with Storybook by using Vite as a builder. Albert’s article covers the entire process, from setting up Storybook with Vite, to the Twig setup, the configurations for both Vite and Storybook, and to finally creating a component.

Read more about integrating Drupal 10 SDC with Storybook

Driven by Community; Not by VC Funds: Andrew Berry on What Differentiates Drupal from SaaS Offerings

Moving on, we have an interview with Lullabot’s Director of Technology Andrew Berry, from Kazima Abbas of The Drop Times. In the interview, Andrew recalls his first experience with Drupal and how it was Drupal’s open-source nature that drew him in, as that forces it to constantly compete with itself, in contrast with proprietary SaaS platforms with vendor lock-in.

Andrew also talks about Evolve Drupal and last year’s event in Toronto, his own contributions to the Drupal project, and shares some of Lullabot’s work and success stories. He closes with his thoughts on the current state of Drupal and the directions in which the platform will evolve.

Read the full interview with Andrew Berry

The new old: Jamstack and MACH's journey towards traditional CMS concepts

We conclude our selection for December with a great article from Dries about the Jamstack and how it has evolved since its original inception alongside other important web architecture trends.

As Dries highlights, one of the most notable trends in Jamstack’s evolution is how it has begun to embrace more traditional CMS features, with the leading Jamstack platform Netlify now moving towards the term “composable web platform".

Another related concept that the article covers is MACH architecture and how it, too, could benefit from traditional CMS features. We really like Dries’s conclusion to all this: while initially, the paths of these different web architecture approaches seemed to diverge, they are now starting to converge.

Read more about the evolution of Jamstack & MACH

Four skyscrapers converging in the sky

We hope you enjoyed revisiting our top picks from last month. We have a lot more content on Drupal, leadership and more coming out soon – stay tuned!

Jan 22 2024

Today we are talking about the show itself. We’ll also cover Autosave Form as our module of the week.

For show notes visit:
www.talkingDrupal.com/434

Topics

  • Update on the show
    • Guest hosts
    • MOTW Correspondent
    • Newsletter
  • Sponsorship
  • Open Collective
  • Content
  • New content in 2024
  • Expanding team

Resources

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Stephen Cross - stephencross.com stephencross

MOTW

Correspondent

Martin Anderson-Clutz - mandclu

  • Brief description:
    • Have you ever wanted an autosave feature on your Drupal site’s forms, so content creators won’t lose their work if they accidentally close the window or lose power? There’s a module for that.
  • Module name/project name: Autosave Form (see the install sketch after this list)
  • Brief history
    • How old: created in Nov 2016 by Hristo Chonov of 1x Internet, who is also one of the organizers of Drupal Dev Days 2024 in Burgas
    • Versions available: 8.x-1.4 which works with Drupal 9 and 10
  • Maintainership
    • Actively maintained, most recent comment less than 3 months ago
    • Test coverage
    • 38 open issues, 20 of which are bugs
  • Usage stats:
    • 6,414 sites
  • Module features and usage
    • Works by automatically saving the content of the current form every 60 seconds, though the time period is configurable
    • When a user opens a form, if an autosaved state exists for that form a dialog will be shown asking if they want to resume editing or discard any autosaved states
    • Once a form is submitted, any saved states will be automatically deleted
    • Notionally it should work with both content entity forms and config forms, but the majority of development and testing has been with entity forms in mind
    • The project page also mentions an issue with nested entity reference inline forms, and has links to relevant Drupal core issues
    • Worth noting that this module uses AJAX to save the states to the Drupal database, separate from entity revisions
  • If you want a solution that saves form states into the browser’s localStorage instead, you can check out the Save Form State module, which uses the jQuery Sisyphus plugin
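
A minimal sketch for trying Autosave Form on a Composer-managed site (assuming Drush is available and the project's machine name is autosave_form):

    composer require drupal/autosave_form
    drush en -y autosave_form
    drush cr
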
Jan 22 2024

2023 has been an eventful year, full of ideas, discussions and plans regarding innovation, where Drupal is heading, and, in our case, how the Drupal Association can best support it. On top of that, as you may have already heard, innovation is a key goal for the Drupal Association.

Drupal is, at its heart, a big, decentralized community. Before we can even think about how we can innovate, we need to understand how contribution actually happens and evolves in our ecosystem. One of the things we agreed on early was that, without numbers, we don’t even know where we are going. 

For that reason, in 2024 we want to share part of the work we’ve been doing during the last part of 2023 to make sure that we know where we are coming from, that we understand where we are going, and that we can see how the changes we make affect (or don’t affect) the whole contribution ecosystem. I want to introduce you to the Contribution Health Dashboards (CHD).

The CHD should help identify what stops or blocks people from contributing, uncover any friction, and, if problems are found, help us investigate and apply adequate remedies while also measuring the effect of those changes.

One thing to note is that the numbers we are showing next are based on the contribution credit system. The credit system has been very successful in standardizing and measuring contributions to Drupal.  It also provides incentives to contribute to Drupal, and has raised interest from individuals and organizations.

Using the credit system to evaluate contribution is not 100% perfect, and it can show some flaws and imperfections, but we are committed to reviewing and improving those indicators regularly, and we think it’s the most accurate way to measure how contribution happens in Drupal.

It must be noted as well that this data sits deep in the Drupal.org database. Extracting it has proved a tedious task, and there are numbers and statistics that we would love to extract in the near future to further validate the steps we are taking. Future reviews of this work will happen over the next months while we continue helping contributors to innovate.

You can find the dashboards here, in the Contribution Health Dashboards, but keep reading next to understand the numbers better.

Unique individuals and organisations

Jumping to what matters here, the numbers: one of the most important metrics to understand in the Drupal ecosystem is the number of contributions from both individuals and organisations.

Unique individual credits and credits year over year

As you can see, the number of individuals has stayed relatively stable, while their contributions have become more and more significant over the years (except for a dip in the first year of the pandemic). In a way, this tells us that once a user becomes a contributor, they stay for the long run. And, in my opinion, the numbers say that they stay very committed.

The number of organisations, on the other hand, shows a healthy growth trend. This shows that organisations are an important partner for Drupal and the Drupal Association, bringing a lot of value, not least in the form of contributors.

Unique organizational contributors and their credits year over year

It definitely means that we need to continue supporting and listening to them. It’s a symbiotic relationship: these companies help move forward not just Drupal, but the whole concept of the Open Web. And their involvement doesn’t end there, as their daily role in expanding Drupal’s reach, and the number of sites and customers of every size using it, is just as key.

In practical terms, in 2023 we met with different companies and organisations, and the plan is to continue listening and finding new ways to help with their needs in 2024 and beyond. One of the things we are releasing soon is the list of priorities and strategic initiatives where your contributions, as individuals as well as organisations, are most meaningful. This is something I have been consistently asked for in those meetings, and I think it’s going to make a big difference in unleashing innovation in Drupal. I also recommend having a look at the blog post about the bounty program.

First year contributors

The next value we should be tracking is how first time users are interacting with our ecosystem.

The previous numbers are encouraging: we have a healthy ecosystem of companies and a crowd of loyal individuals contributing to the project. But making sure that we onboard new contributors, and make it easier and more attractive for new generations to contribute, is the only way to ensure that this continues to be the case for many years to come.

That’s why we are looking at first-time contributions, or said differently, how many users make a first contribution within 12 months of joining the project. During 2024 I would like to dig deeper into this data and look at contribution further out in time, for example after 24 and 36 months. For now, this will be a good lighthouse that we can use to improve the contribution process.

New users with a contribution in the first 12 months

Although last year's numbers give us a nice feeling of success, we want to be cautious about them and try to make sure that the slight decline seen in previous years does not continue.

That is the reason why my first priority during the first months of 2024 is to review the registration process and the next steps new users take on their contribution journey: from the form they are presented with, to the documentation we provide, to the messages we send them in the weeks and months after.

The changes we make should be guided as well by the next important graph, which is the Time To First Contribution. In other words, the amount of time a new user has taken to make their first contribution to Drupal.

Average time to first contribution by registration year

You’ll see that the Contribution Health Dashboards include other data that I have not mentioned in this post. That doesn’t mean it is less important, but given the Drupal Association has a finite amount of resources, we consider this the data we need to track most closely to get a grasp of the health of our contribution system.

For now, have a look at the Contribution Health Dashboards to get a grasp of the rest of the information that we have collected. If you are curious about the numbers and maybe would like to give us a hand, please do not hesitate to send me a message at [email protected]

Jan 22 2024
Jan 22

What makes Drupal 9 so good as compared to Drupal 7?

Ideological differences between the Drupal versions can be traced back to Drupal 8, released in 2015. At the time, developers appreciated that Drupal was now built on components of the PHP framework Symfony and, consequently, that programming in Drupal was now based on the principles of object-oriented programming. In this respect, Drupal 9 is so close to its predecessor that it’s conventionally called the next version of Drupal 8.

Benefits for the developer:

  • State-of-the-art technology stack and popular solutions that lower the entry barrier for novices and for PHP developers who work with Drupal for the first time;
  • Some of the most popular modules are included in the core, which removes the need to update them separately from the core;
  • The package manager Composer is used to control dependencies and files in the core and to add modules and themes; 
  • A built-in system for configuration export and import;
  • Development can follow the headless Drupal approach, with Drupal used as a RESTful server. 

Benefits for the admin and content manager:

  • Out-of-the-box functionality and modules which were not available in Drupal 7 or which were available as contrib modules: 
    • Views is the most popular multipurpose module in Drupal 7 without which no project can start;
    • Tour shows tips on the interface. Many modules include such tips in their functionality;
    • Layout Builder allows convenient management of page layout and content, for example making it possible to display content in two columns instead of one;
    • CKEditor allows the content manager to work with the content using the WYSIWYG technology;
    • In-place editor lets you edit the content (e.g. correct misprints) directly on the page without opening the admin panel;
    • Settings tray again makes it possible to manage some settings directly on the page with the content;
    • Media allows managing files and images;
    • Web services (such as JSON:API);
    • The editorial workflow previously organized by Workbench Moderation or Workflow modules.
  • New tools are built on top of these innovations. For example, Acquia’s Site Studio lets you build your website without programming;
  • Dependencies and updates are managed using a couple of Composer commands;
  • The polished Claro admin theme can be enabled immediately after installation.

Benefits for the business owner:

  • Long-term support reduces the risk of intrusion and, consequently, the cost of patches;
  • The site becomes faster thanks to the newer PHP version, dropping heavy JavaScript libraries, and using new caching modules. A faster site retains users and improves ranking in search engines;
  • The low entry barrier makes it easier to search for developers;
  • New business expansion opportunities. 

Let’s explain the last advantage with an example. You have an online store based on Drupal 7. In Drupal 9, Drupal can act as a RESTful server out of the box. When moving the site, you can set up REST on top of the existing store and then build a mobile application or IoT solutions, e.g. self-checkout kiosks like those in McDonald’s.
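
For instance, a decoupled client could read content straight from core’s JSON:API module. The sketch below is illustrative only: the domain and the machine name of the content type ("article") are placeholders for your own setup.

# Enable the core JSON:API module (requires Drush); the URL and bundle below are examples.
drush en jsonapi -y

# Fetch nodes of a given type as JSON, which a mobile app or kiosk could consume:
curl -s "https://example.com/jsonapi/node/article" -H "Accept: application/vnd.api+json"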

Artem Zenkovets, Drupal developer at ADCI Solutions says:

Jan 22 2024
Jan 22

Drupal 10 was released in late 2022. The developers promised that the new version would be safer and more convenient for users, but you won’t be able to see that for yourself until you upgrade the old version of Drupal that your website is currently running on.

In this guide, you will learn how to upgrade your site from Drupal 7 or Drupal 8 to Drupal 10.

*Important note: All recommendations in this post apply only if you are upgrading from Drupal 7 or 8 to Drupal 10. If your site is already running on Drupal 9, follow the instructions given in the Upgrade to Drupal 10 Twice as Fast post.

Migration to Drupal

Generally, migration is the process of moving data from one source to another. The source can be a database (DB) or files, such as CSV or XML.

drupal migration

In the case of Drupal, there is a source database (Drupal 7 database) and the database to which this data must be transferred (Drupal 10 database). As you can see from the diagram, there are several operations you need to perform during this transfer.

  1. Data acquisition. A query system allows data to be retrieved from the source database.
  2. Mapping. At this stage, you need to define which field of the source base needs to be transferred to which field of the target base.
  3. Processing. This operation allows us to modify the data retrieved from the source database. For example, we may need to change the format of the data we receive so that it is in line with the new structures in Drupal 10.
  4. Setting. The destination part of the migration inserts the processed data into the structures of the Drupal 10 database.
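
In practice, Drupal’s Migrate API performs these four operations for you, based on migration definitions that declare a source, a process (mapping) section, and a destination. Below is a rough command-line sketch, assuming the contrib Migrate Upgrade and Migrate Tools modules are installed on the new site; the database URL and migration ID are placeholders.

# Generate migrations from the legacy Drupal 7 database (placeholder credentials):
drush migrate:upgrade --legacy-db-url=mysql://dbuser:dbpass@localhost/drupal7 --configure-only

# Each generated migration bundles the four operations: source query, field mapping,
# processing, and insertion into the Drupal 10 database.
drush migrate:status

# Run a single migration (the ID shown is illustrative):
drush migrate:import upgrade_d7_node_article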

Why move to Drupal 10?

  • Upgrading your Drupal version is critical to the site's security. Here is an example from ADCI Solutions’ experience: We were approached by the managers of a clinic that was planning to work with the government. To do so, they must pass the site security certification and the portal rate certification. This is impossible with a site running on an outdated CMS version that has a lot of vulnerabilities.
  • Newer versions of Drupal run on PHP 8 and higher. Later versions of PHP provide the speed and security required of a modern resource — the two parameters users will appreciate the most. Also, most hosting providers only support PHP 8 and higher.
  • The longer you wait to upgrade, the more difficult it will be to upgrade your site in the future. There is a gulf between Drupal 7 and other versions of the CMS, as version 7 has a completely different code base. You will have to do the site migration in any case, but it will be harder to find a specialist who can do the job.
  • The updated version of the system is more convenient for both developers and users. Site visitors get the modern front-end theme, Olivero, while content managers and administrators get the new admin panel theme, Claro.
  • The Drupal community is most active in developing contrib modules for Drupal 9 and 10. This means that if you wait to upgrade, you will miss out on new features and modules that are not compatible with older versions.
  • And finally, the new version of the system always implies better performance of the resource and support for the latest web development standards, which means a stable and up-to-date basis for your project.

What are the benefits of Drupal 10?

Automatic updates of the core and modules

Developers no longer need to write special scripts to check for new releases and install updates. Drupal 10 implements a ‘set up and forget’ principle, which simplifies maintenance and is suitable for people without programming experience. This functionality is especially useful for small projects with a limited number of dependencies.

CKEditor 5

Integrating the new version of the WYSIWYG editor, CKEditor, was one of the most difficult tasks for the core developers. Now administrators have a tool that looks a lot like the familiar Microsoft Word and Google Docs. A contextual balloon that pops up when you click on selected text or add an image now allows you to edit the content in real time.

Decoupled Menus

Decoupled Menus make it easier for front-end developers to turn Drupal-managed menus into navigation elements instead of hardcoding them. This also means that non-developers will be able to edit the menu.

Olivero Theme

The Olivero theme was introduced in Drupal 9.1 as an experimental theme and has since become the out-of-the-box front-end theme. The previous theme, Bartik, in use since 2011, had become antiquated and did not meet modern requirements. Olivero brings a contemporary design to the site, including improved typography, animation, and color palette. It is also more accessible to users with disabilities.

Starterkit Theme Generator

Developers of Drupal themes now have a stable Starterkit theme generator. Previously, each new theme had to be based on the Classy theme, inheriting its CSS classes and markup. Now the Starterkit will help developers save time. They will be able to use it to copy default themes into a new folder and inherit the CSS and HTML only from them. This will allow them to safely change the base theme without worrying about affecting the default themes.

How difficult is it to upgrade to Drupal 10?

Migrating from Drupal 7 to Drupal 10

Difficulty: 5 drops out of 5

Drupal 7 has a code base that is critically different from that of Drupal 8, 9, and 10, which is why it is the most difficult to migrate to the latest version. When Drupal 8 was created, the developers completely overhauled Drupal’s internal system: the Symfony components were integrated, and since then the release of new CMS versions has been tied to the framework's releases.

Why do you need to upgrade Drupal 7?

  • Drupal 7 is slow. D7 uses outdated versions of PHP that no longer provide the speed and security required of a modern resource.
  • If a site is built on D7, it means that it was created a long time ago and has accumulated some technical legacy. Migrating to another version of Drupal is a chance to get rid of this legacy.
  • Many contrib modules created for version 7 are no longer being developed. The main focus of the community is Drupal 9 and 10.
  • As the number of sites built on Drupal 7 decreases, so does the number of developers who can work with Drupal 7. Over time, it will become increasingly difficult for you to get support or help from the community.

Step-by-step guide to upgrading from Drupal 7 to Drupal 10

Since Drupal 7 has little in common with Drupal 8, 9, and 10 from a technical perspective, migrating to the ninth or tenth version means completely rebuilding the project. This is a good time to schedule a site redesign, get rid of the technical legacy, perform a resource audit, and review poorly built algorithms.

*Important note: Assess how many contrib modules are used on the site. Developers who support the modules may not have updated them to be compatible with D10. If they provide important functionality to the site and you are not ready to patch them, you can upgrade to D9, which has all the modules already ported. After that, upgrading to D10 will be easy.

Step 1. Backing up
Create a backup of the database using MySQL:

mysqldump -u yourusername -p yourdatabase > backup.sql

Also create a backup of the site files. You can run the `rsync` command from the command line, or simply copy the contents of the site root directory to another folder.
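
For example, a possible `rsync` invocation (the paths are placeholders for your own docroot and backup location):

rsync -avz /var/www/drupal7/ /backups/drupal7-files/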

Step 2. Server and system preparation
– Make sure the server meets Drupal 10 requirements. For example, install PHP 8.1 or higher, enable PHP extensions like `mbstring`, `pdo`, `json` that you may need.
– Upgrade Composer to the latest version:

composer self-update
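
A few quick checks can confirm the environment is ready; the extension names are the ones mentioned above:

php -v
php -m | grep -Ei 'mbstring|pdo|json'
composer --version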

Step 3. Check the main modules
You can check whether Drupal 10 compatible versions of your modules are available using the Upgrade Status module. It will also show whether a module is due for an upgrade. Determine which functions you use the deprecated modules for and whether there are alternative modules that could replace them. 
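
As a rough sketch, on a Drupal 7 site this is usually done with the Drush 8 that accompanies it (the module’s machine name is upgrade_status); the exact location of the report may vary by module version:

drush dl upgrade_status -y
drush en upgrade_status -y
# Then review the report under Reports in the admin UI to see which projects have Drupal 9/10-ready releases.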

Step 4. Creating a site on Drupal 10
Create a new site on the latest version of Drupal 10. Add the modules you selected in the previous step. You will also need to recreate the types of content, blocks, multimedia files, menu, etc. There is no point in copying every bit of the old site's architecture, as you now have the opportunity to improve the page structure and use new tools such as the Layout Builder.

Step 5. Custom module upgrade
We recommend minimizing the use of custom modules. Typically, upgrading them means rewriting the code and using new APIs provided by Drupal 10.

Step 6. Data migration
Conduct a thorough content audit before migration. Make sure you have created all the necessary structures to transfer the content to your new site (Step 4). If the site is small, you can transfer the data manually. For large sites, we recommend using automatic content migration.

Step 7. Theme update
At this stage, you can do a rebuild of the old theme, create your own theme from scratch based on a new design, or use a theme that has already been created and adapted for Drupal 10. 

Step 8. Debugging and troubleshooting
– Using Drush, you can get information about the status of the site, check if the necessary modules are enabled, etc. The drush status command prints out the most important information about the Drupal installation and its parameters.

– Drush allows you to view event logs and other logs that can help identify the issues. The drush watchdog-show command prints log entries on the screen.
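
For reference, the two commands look like this (in recent Drush releases the log command is spelled watchdog:show; the --count option is optional):

drush status
drush watchdog:show --count=20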

Step 9. Testing
– Automated browser testing tools such as Selenium WebDriver or Cypress can help test the site interface in real browsers. You can set up test scenarios that mimic user actions such as mouse clicks, typing text, and reviewing page content.

– Verify site performance using load testing tools such as Apache JMeter or Gatling. This allows you to determine how the site responds to heavy load and assess its performance.

Step 10. Relocation to production server
– Once you are satisfied that everything is working correctly on the test server, move the site to the production server using tools such as `rsync` or `git`.

Migrating from Drupal 8 to Drupal 10

Difficulty: 3 drops out of 5

Support for D8 ended on November 30, 2021, and updates for D9 stopped being released after November 2023. This means that it is no longer safe to keep using these versions of the system. But do not panic: it is relatively easy to migrate from version 8 to version 10.

Migration from Drupal 8 to Drupal 10 is a switch to a different version within the same code base, so it is not as difficult as dealing with D7. Even for a complex site, this is only a matter of a few weeks.

Step-by-step guide to upgrading from Drupal 8 to Drupal 10

*Important note: Assess how many contrib modules are used on the site. Their keepers may not have updated them to be compatible with D10. If they provide important functionality to the site and you are not ready to patch them, you can upgrade to D9, which has all the modules already ported. After that, upgrading to D10 will be easy.

Step 1. Backing up
Before starting the upgrade, make a full backup of your current site, including files and database. You can use tools such as phpMyAdmin to back up the database.

Step 2. Server and system preparation
– Make sure the server meets Drupal 10 requirements. Check your PHP version (PHP 8.1 or higher is recommended) and enable all necessary PHP extensions like `mbstring`, `pdo`, `json` and others.
– Upgrade Composer to the latest version:

composer self-update

Step 3. Upgrade of Drupal Core
Upgrade the Drupal core to at least version 9.4 using Composer. In the root directory of the project run:

composer update "drupal/core-*" --with-all-dependencies

Step 4. Upgrading contrib modules and themes
You need to update all contrib modules and themes to Drupal 10 compatible versions while you are still using Drupal 9. You can perform the check using the Upgrade Status module.

Step 5. Custom code compatibility check
Check if custom modules and themes are compatible with Drupal 10 using the same Upgrade Status module. The module analyzes the code and provides recommendations for updating the code.

Step 6. Upgrade of Drupal Core
Edit the composer.json file and update the version of the drupal/core-* packages to ^10:

"require" : {   
"composer/installers" : "^2.0"   
"drupal/core-composer-scaffold" : "^10"    
"drupal/core-project-message" : "^10"    
"drupal/core-recommend" : "^10"

In addition to the components mentioned above, composer.json can also include the drupal/core-dev and drupal/core components. 

As soon as you run `composer update`, the upgrade will begin.

Step 7. Database update
drush updatedb
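
In practice this is usually run non-interactively and followed by a cache rebuild:

drush updatedb -y
drush cache:rebuild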

Step 8. Debugging and troubleshooting
– Using Drush, you can get information about the status of the site, check if the necessary modules are enabled, etc. The drush status command prints out the most important information about the Drupal installation and its parameters.

– Drush allows you to view event logs and other logs that can help identify the issues. The drush watchdog-show command prints log entries on the screen.

Step 9. Testing
– Automated browser testing tools such as Selenium WebDriver or Cypress can help test the site interface in real browsers. You can set up test scenarios that mimic user actions such as mouse clicks, typing text, and reviewing page content.

– Verify site performance using load testing tools such as Apache JMeter or Gatling. This allows you to determine how the site responds to heavy load and assess its performance.

Updating from Drupal 9 to Drupal 10

Difficulty: 2 drops out of 5

The core of Drupal 10 itself has a cool built-in upgrade system that helps you move smoothly to the latest version of the system. In addition, version 10 offers a ready-made set of modules that can save a tremendous amount of developer time.

The move to the next version is not as difficult as the leap across several versions, which is why it is called an update, not a migration. You can find the detailed instructions in this guide: Guide to Quick Site Upgrade from Drupal 9 to Drupal 10.

Migrating a Drupal site with the help of a Drupal team

Difficulty: 1 drop out of 5

None of the above upgrades will turn into a headache if it is not you doing the work, but a team consisting of Drupal developers, a project manager, and a tester.

Conclusion

Upgrading Drupal is like going to the doctor — tedious, but necessary for normal life. As the new version is prepared, the Drupal development team identifies new vulnerabilities and develops patches to fix them. The updated Drupal ensures that the site is compatible with the latest standards in web development, so that the site is displayed correctly on different devices and browsers. In addition, each upgrade of the system brings new features and improves site performance.

No one is likely to object to such an update. And even if some of this sounds like a scene from a movie about the future, it is available today: upgrade your version of Drupal.

Jan 19 2024
Jan 19

For over a decade, Drupal has been using Symfony Components. In 2015, with the release of Drupal 8, these components became a part of Drupal's core software. It's possible to build complex Drupal sites without worrying about what these components do. But learning about the system we're using will make us better developers of Drupal sites and other PHP applications.

In this first part of a 4-part series, we'll explore how Symfony helps Drupal with its HttpKernel component. We'll look at the component itself and how Drupal uses it to coordinate the request/response cycle.

Spotlight on Symfony in Drupal

  1. Part 1: HttpKernel in Drupal
  2. Part 2: EventDispatcher in Drupal
  3. Part 3: Routing in Drupal
  4. Part 4: Utility Components in Drupal

Symfony's HttpKernel component in Drupal

One key Symfony component in Drupal is the HttpKernel Component. Its first job is to take the incoming HTTP request from the server, and turn it into a PHP Request object. This format allows other code to interact with the request without worrying about parsing HTTP. Then, the kernel coordinates the steps to create an HTML response for the browser to display.

To do this, it parses the request and dispatches events whose listeners convert the Request object into a Response.

Drupal's StackedHttpKernel class

Drupal decorates the HttpKernel in its StackedHttpKernel class (/core/lib/Drupal/Core/StackMiddleware/StackedHttpKernel.php). As you read the code, notice that the StackedHttpKernel class has two main methods: handle and terminate.

  • handle deals with incoming requests.
  • terminate sends the final response to the user.

Here's the truncated class:

    /**
    ...
    */
    class StackedHttpKernel implements HttpKernelInterface, TerminableInterface {
    // ...
    /**
       * Constructs a stacked HTTP kernel.
       *
       * @param \Symfony\Component\HttpKernel\HttpKernelInterface $kernel
       *   The decorated kernel.
       * @param array $middlewares
       *   An array of previous middleware services.
       */
      public function __construct(HttpKernelInterface $kernel, array $middlewares) {
        $this->kernel = $kernel;
        $this->middlewares = $middlewares;
      }
    
      /**
       * {@inheritdoc}
       */
      public function handle(Request $request, $type = HttpKernelInterface::MAIN_REQUEST, $catch = TRUE): Response {
        return $this->kernel->handle($request, $type, $catch);
      }
    
      /**
       * {@inheritdoc}
       */
      public function terminate(Request $request, Response $response) {
        $previous = NULL;
        foreach ($this->middlewares as $kernel) {
          // If the previous kernel was terminable we can assume this middleware
          // has already been called.
          if (!$previous instanceof TerminableInterface && $kernel instanceof TerminableInterface) {
            $kernel->terminate($request, $response);
          }
          $previous = $kernel;
        }
      }
    }

Symfony's HttpKernel class

Let's examine the handle method in the HttpKernel class.

You can find the code for this in vendor/symfony/http-kernel/HttpKernel.php in your Drupal setup. Here's the truncated class:

    /**
    ...
    */
    class HttpKernel implements HttpKernelInterface, TerminableInterface
    // ...
    
        public function handle(Request $request, int $type = HttpKernelInterface::MAIN_REQUEST, bool $catch = true): Response
        {
          // ...
          $this->requestStack->push($request);
            $response = null;
            try {
                return $response = $this->handleRaw($request, $type);
            }
    
          // ...
        private function handleRaw(Request $request, int $type = self::MAIN_REQUEST): Response
        {
            // request
            $event = new RequestEvent($this, $request, $type);
            $this->dispatcher->dispatch($event, KernelEvents::REQUEST);
    
    
            if ($event->hasResponse()) {
              return $this->filterResponse($event->getResponse(), $request, $type);
            }
    
            // ...
        }

The handle method uses a RequestStack object, created by another Symfony component, HttpFoundation, to keep track of requests: it pushes the current request onto that stack. Then the handleRaw method creates a RequestEvent object and dispatches it through the EventDispatcher, giving event listeners an opportunity to respond to the request. If a listener sets a response, the kernel filters and returns it; terminate is called later, once the response has been sent.

Learn more

Dig into the details in our tutorial, How Drupal Turns a Request into a Response, part of our course, Routes and Controllers in Drupal.

Infographic shows the request to response workflow that the HttpKernel coordinates.

Next up

Stay tuned for the next post in our Spotlight on Symfony in Drupal series, where we'll look at the EventDispatcher component and its role in handling requests.

Jan 18 2024
Jan 18

Note: This blog was originally published on November 2, 2022 and has been updated to reflect new information and insights.

Search Engine Optimization or SEO is what we do to ensure our website is as visible as possible to our target markets when they search for topics related to our website on Google, Bing, and other search engines. We do this by optimizing our website for users while following search engine guidelines.

As any SEO professional knows, SEO advice is always changing due to search engine algorithm updates and discoveries by other SEO professionals. 2023 in particular has been a challenging year for SEO due to updates being released one after another (we’ll talk about these in a bit).

However, there are a couple of best practices that we have found to be consistent.

This is your ultimate guide to Drupal SEO. Here you will find answers to common questions on SEO specifically for Drupal, consistent best practices, where to look for information when performing an audit, and more.

How was 2023 for SEO?

Google Search Status

Google released nine official updates last year, one less than in 2022 and 2021. However, we can see here that the updates were released in succession (February-April, August-November).

And these are the updates that were officially announced—there were also instances of SERP volatility outside of these, leading to speculation of continuous adjustments due to the September 2023 helpful content update and unconfirmed Google updates.

Is Drupal or WordPress better for SEO?

So your website is on Drupal (or you’re thinking of migrating), and you want to know whether ranking is possible with this CMS. Although WordPress remains the most popular content management system, Drupal is used by 6.42% of the top 10,000 websites.

Since Drupal is considered to be a heavy lifter—taking in large amounts of data and traffic easily—it’s no surprise that it’s used by some of the most famous companies and organizations in the world.

Based on Aaron’s comparison of Drupal and WordPress, Drupal is second to WordPress not because of SEO capabilities, but because WordPress is easier to use, has more theme options, and is more affordable.

If you don’t need a complex website, WordPress is probably a better option. But if you’re worried about Drupal sites not ranking, there’s no evidence that they don’t.

Does Drupal enable search engine friendly URLs?

It does. Taking a look at one of our recent blog posts, How to Optimize Digital Experiences in Drupal, you will see that there is an option to automatically generate a URL alias:

URL Alias

This is dependent on the H1 you give your blog, which means changing your H1 automatically changes the URL and puts a 301 redirect on it.

You can also choose not to generate the URL alias. For example, we wanted this post to focus on auditing accessibility issues, so we opted to set the URL slug manually.

How to Fix Common Web Accessibility Barriers in Drupal

If we decide that we want to change the URL to reflect the H1, I can just click Generate automatic URL alias and it will do the redirect for me.

Drupal modules for SEO

If anyone tells you that you just need to install a module or plugin and it will solve your SEO issues, I advise you to run the other way.

SEO cannot be done simply through the use of plugins or modules, but I do have to say that they are great for helping out with some technical and on-page SEO tasks.

Here are some modules we have found useful for the Promet site and for clients such as Frank Lloyd Wright Trust:

  • Metatag module
  • Metatag AI
  • Schema.org Metatag
  • Simple XML Sitemap
  • Redirect
  • Pathauto

And here are some modules I have seen recommended that look good to me:

  • SEO Checklist
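
For reference, contrib modules like these are typically added with Composer and enabled with Drush. The project machine names below are the ones used on Drupal.org (Metatag AI is omitted because its package name may differ):

composer require drupal/metatag drupal/schema_metatag drupal/simple_sitemap drupal/redirect drupal/pathauto drupal/seo_checklist
drush en metatag schema_metatag simple_sitemap redirect pathauto seo_checklist -y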

Metatag module

For example, Drupal has a Metatag module that automatically puts in the metatag for you (you can also edit these as you deem fit). You can add basic tags, referrer policies, open graphs, and Twitter cards.

Metatag AI

The Metatag AI module is one that our team developed (yay), and it generates metatags based on the information you provide.

Note from our team: If you’re using a paid account, even better. The free OpenAI account can return erroneous information.

Schema.org Metatag

The Schema.org Metatag module extends the Metatag module above to give you the ability to implement structured data to your site.

Simple XML Sitemap

This sitemap module generates an XML sitemap for your website. When you go to your website[com]/sitemap.xml, you will see something like this that you then can submit to Google Search Console for crawling:

Sitemap

Redirect

Redirects—especially 301 redirects—are essential to ensuring that people are getting to the most relevant pages they need. For example, this blog is the destination URL for a couple of earlier blogs that had thinner content.

With the Redirect module, it’s so much easier to apply and keep track of the redirects we have on our website.

Pathauto

As mentioned earlier, Pathauto is a powerful module that allows you to have search engine-friendly URLs and redirects depending on your H1.

SEO Checklist

I’ve seen the SEO Checklist module recommended a few times but I haven’t tried it yet. This module creates an SEO task list for you based on best practices—complete with links to the settings and module recommendations—that you can check off to help you keep track of your progress.

I’m not sure if the task list is configurable. If it is, that would be great!

But again, SEO is so much more than installing modules and calling it a day. So, let’s move on to our checklist.

Drupal SEO checklist

Here’s a quick primer on SEO. SEO is divided primarily into three categories, whether you’re using Drupal or another CMS:

1. Technical factors

Optimizing a website to make it easier for search engines to crawl and index (speed, redirects, directives).

2. On-page factors

Optimizing factors on the web page for search engines and users to easily make sense of their content (titles, meta descriptions, search intent).

3. Off-page factors

Optimizing for how search engines and users perceive your site’s experience, expertise, trustworthiness, and authority (E-E-A-T, incoming links, socials).

Google has over 200 ranking factors, divided into these three groups. Feel free to dive into the 200 here (some of these are speculative, so use your own judgment).

Yes, it can get tedious and confusing. All in all, what I can say is that if you have these three principles as your guiding light, everything will fall into place:

  1. Relevant, informative content that fulfills search intent;
  2. Smooth and fast user experiences; and
  3. High site authority and reputation.

Let’s begin.

Drupal SEO checklist

How to improve Drupal SEO

In this portion, I will walk you through the process of how I do SEO. This has gone through some improvements in the past year, so I’m excited to share.

Get these tools ready for the entire process:

  • Google Analytics
  • Google Search Console
  • PageSpeed Insights
  • Screaming Frog
  • Semrush, SE Ranking, or your preferred tool for keyword research
  • A spreadsheet to record your changes

Preliminaries

  1. Start with a Google Search Console audit.
  2. Audit your site.
  3. Prioritize your tasks.

1. Start with a Google Search Console audit.

If you haven’t looked at your GSC data in a while, now is a good time. GSC tells you what Google is seeing on your site. This is valuable information and must not be overlooked.

Action Plan:

  • Set your timeframe to 16 months to see more data.
  • Check click data for branded queries, non-branded queries, branded pages, and non-branded pages.
  • Analyze the data. How many queries/pages are there in total for each segment? How many are zero-click? How many are generating clicks?
  • Make sure to record your data in a spreadsheet.

2. Audit your site.

We will be using Screaming Frog for this one.

Tip: Make sure to add your Google Analytics, Google Search Console, PageSpeed Insights, and other tools you’d like to pull data from. Don’t forget to add your sitemap as well!

Screaming Frog API Access

Action Plan:

  • Find the optimal settings for you. I use this Screaming Frog settings guide with a few modifications based on the features of our site. It explains what the options are for.
  • Perform your audit and note the sitewide issues.
  • Head over to the Site Structure tab and check which subfolders (ones that actually contain content that shows up on the site) make up most of your site structure.
  • Identify what subfolders hold the most important pieces of your website. Note the number of URLs.

Site Structure

3. Prioritize your tasks.

Now that you have data, it’s time to work on prioritization. Prioritization comes down to three things:

  1. Your capacity
  2. Developers' capacity
  3. Impact on site performance

For example, if Screaming Frog is showing a High Priority issue that affects only three URLs, you should probably deprioritize it (unless it’s a crawling/indexing issue for an important URL) in favor of an Opportunity with 1,500 URLs that can be fixed by a developer in one go.

Action Plan:

  • Check what issues affect the most number of URLs and prioritize accordingly.
  • Cross-check your subfolder URLs with your Google Search Console data to identify important URLs losing traction.
  • Check issues for these important URLs and prioritize accordingly.
  • Ticket what you can to developers and QA your tickets!

Now, onto the issues we would normally find in an SEO audit.

Technical SEO

We're working on technical SEO issues first because we want to make sure the site is crawlable, indexable, and usable.

  1. Review your site structure.
  2. Secure your website.
  3. Submit an XML sitemap.
  4. Upload a robots.txt file to the root directory of your website.
  5. Check your robots meta tags.
  6. Check what is being rendered.
  7. Optimize URL slugs.
  8. Optimize site speed.
  9. Fix broken pages and content issues.
  10. Implement structured data.
  11. Prioritize mobile-friendliness.

1. Review your site structure.

A poorly designed site structure can confuse your site builders, leaving behind orphan pages. It takes extra time and effort to find these orphan pages and link to them when this happens.

Another issue is click depth. Without a properly planned site structure, important pages can be buried 3 clicks or more, making them difficult to reach for your target audience.

If you see issues with your sitemap, you can solve this by doing some reorganization. Here’s a helpful article by Yoast on the topic.

You can also begin looking for orphan pages so you can check if they need to be linked to, deleted, or left alone.

2. Secure your website.

We will prioritize three things here:

  • HTTPS
  • Referrer policy
  • Mixed content

If you haven’t yet, switch to HTTPS by using a Secure Sockets Layer (SSL). You usually can get this for free depending on your hosting provider. If not, you can speak to your web developers to purchase and activate it.

When you go to Overview > Security in Screaming Frog, you should not be seeing any HTTP URLs.

Another important way to secure your website is to use a referrer policy. For WordPress, this is automatically applied, but for Drupal you will have to set it.

We will be using strict-origin-when-cross-origin as it offers more privacy.

You can activate this globally on your site by going to Configuration > Search and metadata > Metatag. From there, click Global and scroll down until you see the Referrer policy option, then select Strict Origin When Cross-Origin.

referrer policy

Lastly, check for Mixed Content. These are resources that are being loaded over HTTP even when the actual page itself is HTTPS. You can easily fix these by looking for the bad resources and replacing them with resources loaded over HTTPS.
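
A quick way to spot-check a page for insecure resources from the command line (the URL is a placeholder):

curl -s https://example.com/some-page/ | grep -oE '(src|href)="http://[^"]+"'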

3. Submit an XML sitemap.

If you haven’t yet, make sure to go to Google Search Console and submit your sitemap there.

GSC sitemap

4. Upload a robots.txt file to the root directory of your website.

According to Google, "A robots.txt file tells search engine crawlers which URLs the crawler can access on your site." However, it does not prevent Google from indexing your pages. Use a noindex tag for that case.

To verify your robots.txt, simply go to website[com]/robots.txt. If you don’t have one yet, you can follow the instructions here on how to create and upload the file to your website.
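
As a quick illustration (the domain is a placeholder, and Drupal core actually ships a much fuller default robots.txt), a minimal file and a quick check look something like this:

# Inspect what your site currently serves:
curl -s https://example.com/robots.txt

# A minimal robots.txt, for illustration only:
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
EOF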

5. Check your robots meta tags.

Robots meta tags are the directives that give specific instructions to robots on crawling and indexing your web pages. For example, if I don’t want Google or other search engines to index an archived blog post, I can add a noindex directive on the page.

You can find the pages that have this directive by going to Overview > Directives. Make sure to review the URLs so you know the pages you want indexed do not have the noindex directive.

noindex

6. Check what is being rendered.

I first understood how important it is to check rendering when I did an audit and found that Google could be seeing something different (or worse, not seeing your site at all) when it renders your website.

For example, JavaScript could be editing your titles, so what you had originally written out and what Google is seeing could be two completely different things. Sitebulb has a great explanation as to when this is or isn't an issue.

Some development teams have chosen to use Drupal as a headless or decoupled CMS. Let's clarify what this means, and discuss whether there are any SEO considerations.

Using Drupal as a headless CMS means that you are using the CMS functionality of Drupal to store your content, then another platform to pull through this information and display it for users.

So, for example, you would log into Drupal as normal and edit/publish your content, but then a platform like Next.js serves as the front-end framework, which is what users connect to.

This would typically be done via Drupal's JSON:API or Restful Web Services module.

So what are the SEO considerations for this?

The first step is to test what is actually being pulled through to the site. For example, are title tags, SEO meta tags, Open Graph tags and similar being displayed as you would expect?

You should also check if your content is being rendered correctly to ensure that Google can easily access your content.

—Colin McDermott, Head of SEO at Whop.com
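
A simple command-line spot check along those lines is to fetch the rendered front end and grep for the tags you expect (the URL is a placeholder):

curl -s https://example.com/ | grep -iE '<title>|rel="canonical"|og:title|name="description"'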

7. Optimize URL slugs.

As we discussed earlier, you can automatically generate URL slugs based on your H1. If your H1 is too long, make sure your URL slug is short, clear, yet descriptive.

For example, instead of:

/how-to-perform-seo-to-your-drupal-website-for-2023

You can choose:

/how-optimize-drupal-2023

It’s much cleaner while giving the same information to your users and the search engines.

But again, don’t go changing your URL slugs if there’s no need. For example, if you refresh a blog, you don’t have to change the URL if it’s already set. Just go to GSC and ask Google to recrawl the page.

8. Optimize site speed.

Some tips to improve site speed for Drupal include reducing the size of images, using a caching module, and optimizing code. PageSpeed Insights can help identify specific ways to improve performance.

PageSpeed Insights

Images are often the largest files on a web page, and can significantly slow down loading times. Try to reduce the size of images without compromising quality using Photoshop or an image resizer tool.

You can also check your caching configuration at Configuration > Development > Performance to help speed up loading times by storing frequently accessed files, so they don’t need to be retrieved from the server every single time a user visits your site.

Finally, take a look at the code on your site and see if there are any ways to optimize it for faster loading. This may involve minifying CSS and JavaScript files. Go again to Performance and you’ll see an option to aggregate CSS and JavaScript files under Bandwidth Optimization.
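
If you prefer the command line, the same Bandwidth Optimization toggles live in core's system.performance configuration and can be set with Drush:

drush config:set system.performance css.preprocess 1 -y
drush config:set system.performance js.preprocess 1 -y
drush cache:rebuild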

By following these tips, you can help improve the speed of your Drupal site and provide a better experience for your users.

9. Fix broken pages and content issues.

First, let’s talk about broken pages. Check for your URLs that have an HTTP response code of 404 then consider if these should be redirected to a page that is more relevant to your users (if you have them) or if they should be left alone.

If you don’t have a more relevant page on a similar topic, do not redirect to the homepage. You won’t be getting any value from it and your users will only be confused.

John Mueller also said that if the link has been 404 for years, you can just leave it alone instead of redirecting it. I suggest just replacing the 404 link with another one if that’s the case.

For content issues, you can have thin and duplicate content.

If the content of the page does not give value to the users, it needs to be fixed. You don’t necessarily have to pump out 5,000 words—but you do need to make sure that the content is relevant, valuable, and informative.

For duplicate content, make sure to have your canonical links set up. One culprit we find (if we aren’t using the same titles/meta descriptions for different pieces of content) is pagination. Here is an incredibly helpful guide on the topic.

10. Implement structured data.

According to Google, "Structured data is a standardized format for providing information about a page and classifying the page content." For example, you can check this blog which uses the Article schema.

Structured data is a great signal to get the SERP to display your results in a more attractive way to your users. And with the rise of zero-click searches, it's important to capture those enhanced positions as much as possible.

This is what the structured data looks like:

structured data

To check what information is required for the schema to work, I suggest downloading the Schema Builder for Structured Data extension. When you click on the extension on your toolbar, you’ll see what schema is detected and view their markup, or you can choose a schema type and see its requirements.

schema builder extension

To apply the schema, install the Schema.org Metatag module and fill in the necessary details you’ve checked out using the Schema Builder extension. To check the validity of your schema, you can click the extension again and it will show you if you missed any of the required items. 

Alternatively, you can insert your code or URL on the Schema Validator and also check it using the Rich Results Test.

11. Prioritize mobile-friendliness.

Having a mobile-friendly website matters to SEO since:

  • Websites behave differently on mobile.
  • Users engage with websites on mobile differently than they do on desktop.

And since 63% of US searches happen on mobile, it’s essential to ensure your website is mobile-friendly so you can rank both on desktop and mobile.

Since Google deprecated the Mobile-Friendly Test tool, the Mobile-Friendly Test API, and the Mobile Usability report in December 2023, it's recommended to run these tests using Lighthouse in Chrome.
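
Lighthouse can also be run from the command line via its npm package, which is handy for scripted checks (requires Node.js; the URL is a placeholder):

npx lighthouse https://example.com --only-categories=performance,accessibility,seo --view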

I would also suggest talking to the QAs on your development team. Here at Promet Source, any change made to our website or our client websites always has to go through QA, and they perform mobile testing for us to make sure everything is good to go before and after deployment to the live environment.

On-page SEO

  1. Research your keywords and topics.
  2. Place keywords in your titles.
  3. Ensure headers are correct.
  4. Don't force your keywords.
  5. Add short, descriptive alt texts to images.
  6. Ensure links are live and have descriptive anchor texts.
  7. Write clear and succinct meta descriptions.
  8. Ensure colors pass the contrast test.
  9. Perform a content refresh.
  10. Write helpful content.

1. Research your keywords and topics.

Writing starts with a lot of research—including keyword research. Some people prefer to just use keyword research tools and others believe just Google is good enough. I prefer to use both.

Semrush search intent

You can use your keyword research tool to help you determine what the ranking sites are for the keyword you're vying for and you can find associated keywords. But don't depend on just the tool (and God forbid, don't depend on AI for your keyword research).

See, keyword research tools aren’t foolproof—here’s a great interview with Mark Williams-Cook, the founder of AlsoAsked, on why zero-volume keywords should not be overlooked.

Simply put, use Google for long-tail keywords and questions people ask about your topic.

Keyword research tools are great for short to medium-tail keywords. It’s also easier to see variations of the keyword you have in mind.

2. Place keywords in your titles.

This is not a hard and fast rule—but I prefer putting keywords at the beginning of the title since it makes it easier for users to understand and remember what your article is about. But if that makes the title awkward, then just make sure the keyword is in your title.

Remember, your title is the first thing your users see when you come up in the search results. If you have a title that tells your reader absolutely nothing about the page, why would they click?

And ultimately, why should Google rank you in the SERPs if users aren't clicking?

Lastly, make sure your titles are different from each other. You can check your Screaming Frog audit results for duplicate titles and meta descriptions.

3. Ensure headers are correct (H1, H2, H3).

The heading tags on your pages (H1, H2, H3, etc.) play an important role in helping search engines and users understand the structure and content of your page.

Well-structured and descriptive headers are helpful context clues. Imagine picking up a book for your research paper—wouldn’t you skim the table of contents to see if its contents are useful to you?

And not just that, proper headers are great for accessibility.

See, screen readers tell the users what headers they’re reading. For example, they would indicate if a header is H2 or H3. So, your disabled users will have a difficult time if your headers are all over the place or improperly formatted.
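As a quick sanity check on heading structure, a small script can flag pages with more than one H1 or with skipped heading levels. A hedged sketch using BeautifulSoup; the sample markup is invented for illustration:

```python
from bs4 import BeautifulSoup

# Flags common heading problems on a rendered page: more than one H1,
# or a heading level that jumps (e.g. H2 straight to H4).
def audit_headings(html: str) -> list[str]:
    soup = BeautifulSoup(html, "html.parser")
    headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
    levels = [int(tag.name[1]) for tag in headings]
    problems = []
    if levels.count(1) != 1:
        problems.append(f"Expected exactly one H1, found {levels.count(1)}.")
    for previous, current in zip(levels, levels[1:]):
        if current > previous + 1:
            problems.append(f"Heading jumps from H{previous} to H{current}.")
    return problems

sample = "<h1>Guide</h1><h2>Basics</h2><h4>Details</h4>"
print(audit_headings(sample))  # flags the H2 -> H4 jump
```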

4. Don't force your keywords.

When optimizing your on-page factors for SEO, it's important to make sure that you're using keywords in the right places. This means using them in your title tags, meta descriptions, headings, and throughout the body of your content.

Placing keywords strategically will help search engines understand what your page is about. This is going to happen naturally as you discuss your content thoroughly.

The problem comes when people treat keywords like they should be peppered on content to the point that it’s just incredibly obvious that the writer is trying to rank the page instead of being useful to the reader. Google (and your reader) hates that.

Keyword stuffing definition

If you have done proper keyword research, your content will flow.

5. Add short, descriptive alt texts to images.

Images can be a great way to break up your content and make your content look more alive. 

They can also help improve your SEO. Images are great for showing examples and extra information to your users without making them go through extra blocks of text. Plus, images can also rank on search engines—which means your content can rank on search engines.

When adding images to your pages, make sure to include descriptive alt texts. This will help people find your images more easily on Image Search.

Another reason to put descriptive alt text to your images is that it will help screen readers "read" the image to the users. So if you have an image with an alt text that’s clearly used to add more keywords to the page instead of helping users, it will ruin their user experience.

You can check your Screaming Frog results for images missing their alt texts so you can see if they are decorative images or not.

missing alt text
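If you'd rather script this check than re-run a crawl, a short sketch along these lines lists images whose alt attribute is missing or empty (sample markup invented for illustration):

```python
from bs4 import BeautifulSoup

# Lists <img> tags with missing or empty alt attributes, so you can decide
# whether each one is decorative or needs a description.
def images_missing_alt(html: str) -> list[str]:
    soup = BeautifulSoup(html, "html.parser")
    return [
        img.get("src", "(no src)")
        for img in soup.find_all("img")
        if not (img.get("alt") or "").strip()
    ]

sample = '<img src="/chart.png"><img src="/team.jpg" alt="Our QA team at work">'
print(images_missing_alt(sample))  # ['/chart.png']
```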

6. Ensure links are live and have descriptive anchor texts.

Links are an important part of both technical and on-page SEO. They help search engines understand the structure of your site and can be used to pass along authority and value.

They also help users navigate your site better and find information you want them to find.

When adding links to your pages, make sure that they're live and have descriptive anchor text. This will help search engines and users understand better what they're pointing to.

And yes, this is also good for user experience and accessibility. Imagine clicking on a link and finding out that it’s a 404. Isn’t that annoying?

Or you’re using a screen reader and you completely miss a link because the anchor text just says "here." It just doesn’t help anyone.
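A small script can surface both problems at once: vague anchor text and links that return errors. A rough sketch, assuming requests and BeautifulSoup are available; a real audit (or Screaming Frog) would also resolve relative URLs and handle network errors:

```python
import requests
from bs4 import BeautifulSoup

# Checks each link on a page for two of the issues mentioned above:
# a non-200 response and a vague anchor like "here".
VAGUE_ANCHORS = {"here", "click here", "read more", "link"}

def audit_links(page_url: str) -> None:
    html = requests.get(page_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        href, text = a["href"], a.get_text(strip=True).lower()
        if text in VAGUE_ANCHORS:
            print(f"Vague anchor text '{text}' for {href}")
        if href.startswith("http"):
            status = requests.head(href, allow_redirects=True, timeout=15).status_code
            if status >= 400:
                print(f"Broken link ({status}): {href}")

# audit_links("https://www.example.com/blog/seo-guide")
```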

7. Write clear and succinct meta descriptions.

When writing your meta descriptions, it's important to be clear and concise. You want to make sure that your meta description accurately reflects the content on your page and includes relevant keywords.

But you also don't want to stuff your meta description with too many keywords. The point is to make it easier for users to understand what your page is about while they’re still browsing the SERPs.

Lastly, a good rule of thumb is to keep your meta descriptions around 155 characters just to be sure it’s descriptive enough but not too long that it gets truncated by Google.

Do meta descriptions help with ranking? No, and Google does rewrite meta descriptions if they aren't very helpful. But I still like to work on meta descriptions because that snippet of information can make the difference between a click and no click.

truncated meta description

To find which pages on your website to pay attention to for meta descriptions, check out the Meta Description tab on Screaming Frog and filter by results.

Screaming Frog meta description
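The same kind of check is easy to script if you want to spot-check individual pages. A sketch that applies the ~155-character rule of thumb mentioned above; the sample markup is invented:

```python
from bs4 import BeautifulSoup

# Flags a missing meta description or one that is likely to be truncated
# in the SERPs. The ~155-character limit is this article's rule of thumb,
# not an official cutoff.
MAX_LENGTH = 155

def check_meta_description(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not (tag.get("content") or "").strip():
        return "Missing meta description."
    content = tag["content"].strip()
    if len(content) > MAX_LENGTH:
        return f"Likely truncated: {len(content)} characters."
    return f"Looks fine: {len(content)} characters."

sample = '<meta name="description" content="A practical Drupal SEO checklist.">'
print(check_meta_description(sample))
```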

8. Ensure colors pass the contrast test.

You need to make sure your page passes the color contrast test. What that means is your foreground color and background color shouldn’t be too similar or shouldn’t hurt the eyes of your readers.

By having a good color contrast (e.g., black text on white background), you help your users—especially colorblind or low vision readers—access your content without giving them a hard time.

And yes, this includes links. Make sure your links are of a different color from the rest of your text, or else your users could pass on potentially important information (or pass on going to another page on your website) just because the link wasn’t obvious.
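For reference, contrast checkers implement the WCAG 2.x contrast-ratio formula, which you can reproduce in a few lines. A sketch (WCAG AA asks for at least 4.5:1 for normal body text):

```python
# The WCAG 2.x relative-luminance and contrast-ratio calculation, which is
# what contrast checkers implement.
def relative_luminance(hex_color: str) -> float:
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def linearize(channel: float) -> float:
        return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4
    r, g, b = linearize(r), linearize(g), linearize(b)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground: str, background: str) -> float:
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # 21.0 -> passes
print(round(contrast_ratio("#777777", "#ffffff"), 2))  # ~4.48 -> fails AA for body text
```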

9. Perform a content refresh.

Remember the URLs you cross-checked from your site structure and GSC audit? If you're seeing a downward trend in their clicks and there isn't much technical work to do on them, I suggest you review the content.

Content decays for a multitude of reasons, so it's good to audit your content and see if they're still relevant and helpful to your target audience.

You can also do this on the top URLs losing clicks on your site. For example:

  • Content that used to perform well then began losing traction.
  • Content that hasn’t performed in a while but is still getting high impressions.

These are worth looking into and refreshing.

10. Write helpful content.

Not all your posts need to be an ultimate guide, but it’s still important that your content is well-thought-out. That’s also why keyword research is important—it doesn’t just tell you what you want to rank for, but what questions people have that you can answer.

Then make sure you answer those questions as clearly and effectively as you can. Here are other things you can do:

  • Ensure your content serves your target users.
  • Fulfill your users' search intent.
  • Give examples and proof.
  • Explain jargon and use clear language.
  • Add your personal experience on the topic.
  • Add other SMEs' experiences on the topic.
  • Proofread your work religiously.

Focus on excellent UX and site speed, but more importantly cited sources and actually answering all the user questions. Find these using the SERP PAA (People Also Ask), customer service, and tools like AlsoAsked. If you nail this you have a great chance of doing excellent in the SERPs. Can you use AI to generate this? Yes, but it's risky. I would only do it with human checks built in.

—Arnout Hellemans, Online Markethink

Some people argue that you cannot leverage AI at all to write helpful content, but Google (and a lot of SEO experts) disagrees. Cassey Bowden, our Director of Marketing, suggests using AI as an assistive tool instead of depending on it for everything.

Some use cases for AI in writing:

  • Simplifying your content.
  • Proofreading your work.
  • Summarizing your sections.
  • Suggesting improvements.

Off-page SEO

Building links is perhaps the most important activity for off-page SEO. The more high-quality links you have pointing to your website, the higher your site will rank in search results.

There are a number of ways to build links (this comprehensive guide from Semrush is great) but it’s important to note that off-page SEO is highly dependent on the quality of your onsite content.

You can have the most populated social media calendar and all the journalist contacts in the world, but if your content is poorly done, it will not matter.

Why? Because other people would not share or give precious links to your content if they don’t think it’s valuable or relevant enough to share or link to.

  1. Optimize your posts for easier sharing.
  2. Create infographics and other easily shareable visuals.
  3. Plan your content distribution ahead of time.
  4. Use tools like HARO and Featured.
  5. Don’t be a cold-caller.

1. Optimize your posts for easier sharing.

You can optimize your content for sharing in three ways:

  1. Writing relevant, informative content (as mentioned earlier).
  2. Writing descriptions that persuade users to click and share.
  3. Adding an eye-catching image.

You can use the Twitter Cards meta tags to choose what your blog post will look like when you share it on Twitter, for example.

Twitter meta tag

I suggest selecting the Summary Card with large image so it looks like this:

Blog post on Twitter with summary card

2. Create infographics and other easily shareable visuals.

Another thing people love sharing is infographics and other easily shareable visuals. Visuals are such a powerful tool for giving just the right amount of information in a pleasing way.

An image search of "Drupal history infographic" shows us a bunch of beautifully done visuals that are bound to catch our attention. You would want yours to be the same way—not just informative, but eye-catching.

3. Plan your content distribution ahead of time.

The best performing content is often the best distributed content.

—Melanie Deziel, The Content Fuel Framework

Content distribution tends to be an afterthought and is usually not included in SEO plans. I say it should be—since you need to understand how and why you will be distributing your content the way you want to before even conceptualizing the kind of content you will be creating.

For example, would you create YouTube videos of a massive statistics post? Probably not, but you can send that to a journalist who does articles that would need one.

I suggest reaching out to whoever is handling your socials and other members of your team so you can plan out your distribution.

4. Use tools like HARO and Featured.

Using these tools and platforms is the easiest way I've gotten backlinks. We got links by answering queries from American Marketing Association - Colorado about A/B testing in marketing, GoDaddy about Google's helpful content update, and Lightkey.io about SEO to boost discoverability.

You can pitch on behalf of other SMEs in your organization as well, so you don't have to be the one answering all the time.

5. Don’t be a cold-caller.

Lastly, make friends. This, I think, is the most difficult one for off-page SEO.

I firmly believe that no SEO specialist is an island. Aside from ensuring our content is ridiculously relevant and useful, we also need to build relationships with other folks in our community.

Thankfully, there are more and more communities popping up now online (shout-out to the Women in Tech SEO group and the Neurodivergents in SEO group). Joining these communities makes it easier to get your SEO questions answered by people who are supportive, and to form friendships with amazing SEO folks.

As the example goes, would you help the unknown number who called you and gave you a sales talk? Probably not. But you would help your friend who asked you a favor.

Off-page SEO can be a long game. So, ensure your content is awesome and make some friends.

Attract your target audience and boost relevant traffic through SEO

SEO is an investment worth having, especially if your audience uses search engines to look for answers to their problems. Optimizing your website means working to be the authoritative source in your field, so that when the time comes that your audience is ready for your product or service, you're at the top of their mind.

SEO is a long game. It typically takes months before you see the fruits of your hard work, and some even see the effects after an official update! But just because it takes months, it doesn't mean you aren't helping yourself now. So be patient, stick with it, and keep optimizing.

Good luck!

Want to hire us to optimize your website instead? Contact us for an audit.

Jan 18 2024
Jan 18

Ever needed to give a Drupal content editor access to edit site sections instead of the whole site? Introducing the Content Access by Path module.

Here's a module I created, funded by Essex County Council and based on a spec developed by Will Callaghan, that allows you to create taxonomy terms, then set paths (via a text field) on those taxonomy terms, and then attach those taxonomy terms to users. 

Now, when a user has a taxonomy term attached to them, they can only edit content whose path falls under one of that term's paths.
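To make the behaviour concrete, here is a small sketch of the path-prefix idea in Python — not the module's actual code, and the paths are invented examples:

```python
# Not the module's actual code -- just a language-neutral sketch of the
# path-prefix idea described above: a user may edit a node only if the node's
# path falls under one of the paths attached to their taxonomy terms.
def user_can_edit(allowed_paths: list[str], node_path: str) -> bool:
    node_path = node_path.rstrip("/") + "/"
    return any(
        node_path.startswith(prefix.rstrip("/") + "/") for prefix in allowed_paths
    )

editor_paths = ["/services", "/news/essex"]  # hypothetical paths from the user's taxonomy terms
print(user_can_edit(editor_paths, "/news/essex/roadworks-update"))  # True
print(user_can_edit(editor_paths, "/about-us"))                     # False
```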

I'm surprised this module didn't already exist.

Jan 18 2024
Jan 18

Today, something clicked for me. At some level, I guess I knew this, but apparently I didn't know this. Config translation and interface translation in Drupal are connected. This is the part I basically knew; when you do translation imports, config will often get updated translations. I didn't really know how they were connected, though. 

I ran into this while trying out Message Subscribe, a module intended to connect the Flag and Message modules to allow users to subscribe to content updates. It came with a flag named Content, which I wished to rename, because it did apply to content, but apart from that, the label isn't particularly descriptive. Since the site is in Dutch, I gave it a Dutch label, translating to "Subscribe to content". As soon as I saved it, the label of the top-level admin menu entry, which would usually read the Dutch translation of simply "Content", now also became "Subscribe to content". What? Surely there was something wrong here. I clicked around a bit, changed the label back and forth. I went to look in Interface Translation. Sure enough, the translation for "Content" had changed to "Subscribe to content". I changed it back. And the label of the flag changed too. How annoying, what is going on? So, I broke out XDebug.

I found that when saving the translated config, Drupal will check if there is default configuration anywhere matching the config ID, and compare any translatable strings to that. When a match is found, Drupal will then find the interface translation matching the original, and update the translation. Which I suppose makes perfect sense at some level, and it actually means my idea for a "config translation module" doesn't really make sense; as long as there is default configuration that matches the configuration you want to allow your users to translate, they can use the standard interface translation.

In case you don't know what I mean by default configuration: it's the configuration in a module's config/install and config/optional directories. The config in install will get installed no matter what (and presumably will complain if some dependency is not available), while the config in optional will only get installed when all its dependencies are available.

I did create an issue for the Message Subscribe module. I already thought the flag label was too generic, but now I found it is actively causing issues for translated sites. 

Jan 18 2024
Jan 18

The Drupal Association is excited to announce the DrupalCon Portland 2024 t-shirt design contest! For this year's DrupalCon North America, we want to see the Drupal community's design ideas for the official DrupalCon Portland t-shirt. Do you have a fantastic idea in mind? Let’s see your creativity!

The winner will get THEIR design on the front of the official t-shirt for DrupalCon Portland 2024!

Make your mark at DrupalCon Portland! Enter our t-shirt design contest.

Now, for the finer details…

Your design must include the DrupalCon Logo and will only be featured on the front of the t-shirt. Sponsor logos will be added to the t-shirt's sleeves after the design is finalized. Specs: PNG or PDF preferred, 16 inches tall, and graphics need to be 300 dpi. All designs must be submitted by 12 February 2024 at 24:00 UTC, after which the submission form will close.

The top four designs as chosen by the Drupal Association will then be voted upon by the public, with voting closing on 28 February. The winning design will be on the front of the official DrupalCon Portland 2024 t-shirt and will be announced during the Driesnote at the conference! The winner will receive a complimentary ticket to their choice of either DrupalCon Portland 2024 or DrupalCon North America 2025.

How do I enter?

To enter: Simply create your design, then fill out our submission form by 12 February to submit your final design. We also ask that you include a sentence or two describing why you chose your design and how it represents the Drupal community.

So, what are you waiting for? Submit your design now, and please help us spread the word throughout the Drupal community!

Good luck to all of our participants!

** Drupal Association staff will not be permitted to enter this contest.

Jan 18 2024
Jan 18

Introduction

We are expanding our global operations to provide nearshore coverage for customers requiring extended support by inviting contractors and outsourcing partners in Latin America to join The Axelerant Network.

While we continue to hire internationally, including within the United States and Canada, our approach to expansion at Axelerant is informed by how we see partnership as a means of organic growth.

After successfully expanding our customers' capabilities using our teams, we are working to enhance and extend our teams through collaboration with vendor partners and contractors. The goal is to scale.

“We have always believed—and we’ve said it over and over—that ‘partnership is the new normal,’” said Ankur Gupta, CEO at Axelerant. “And it’s time for us to forge strategic partnerships with agencies and others to help support some of our most important engagements. This will help us in Axelerant’s ongoing mission: to accelerate digital outcomes and scale with our customers as their partners of record.”

This move underscores our commitment to global collaboration. It opens doors for mature, partnership-focused digital experience outsourcing agencies and contractors inspired by long-term, symbiotic growth built on trust.

“We’re really excited about what this could mean for support services. We can go beyond Indian Standard Time (IST) coverage with a multi-region approach,” said Hetal Mistry, Director of Delivery at Axelerant.

Beginning The Axelerant Network

Axelerant’s Vendor Partner Management function will accept applications from interested contractors and agencies to create an Axelerant Partner Network. Ultimately, this network will serve as a community of agency contacts and contractors interested in submitting responses to Requests for Proposals (RFPs) / Invitations to Tender (ITT). 

There are several benefits to joining this network, including:

  • High-Impact Opportunities: Expand your service portfolio, reduce reliance on a single client base, and gain access to new markets and customer segments, significantly expanding your reach and influence. Within the Axelerant Network, engage in transformative collaborations with change-making organizations and recognizable institutions. Axelerant delivers work for clients like Stanford University, Doctors Without Borders, and agencies within the United Nations.
  • Achieve More Together: Collaborating as a vendor with Axelerant enables agencies to tackle larger projects by combining resources and expertise. This partnership also offers the advantage of leveraging Axelerant's global presence to provide coverage in different time zones for your diverse client needs. With this arrangement, we can mutually optimize and utilize our resources, serving a broader and more diverse client base.
  • Transparency & Fair Treatment: Axelerant does not subcontract work without explicit permission from our customers. We believe in transparency and mutual benefit. As a vendor partner or contractor with Axelerant, you can expect an open approach, and we are eager to help you publish case studies that showcase the value and success of our collaborations.
  • Global Connectivity & Community: Join a global network of industry leaders and professionals. Benefit from meaningful connections with external agency peers. We aim to create a space where professionals from different backgrounds can openly engage, ask questions, and share perspectives.
  • Tailored Excellence: Axelerant offers flexible partnership models, recognizing the unique strengths of each agency and contractor arrangement. We will work with you to customize our engagement for mutual alignment. 

Join The Axelerant Network 

Please fill out this form to explore partnership possibilities. 

Jan 18 2024
Jan 18

Introduction

According to the latest GlobeNewswire report, the Digital Experience Platform (DXP) market is expected to grow to $18.92 billion in 2027 at a Compound Annual Growth Rate (CAGR) of 11.7%. This shows that DXPs have become increasingly popular.

When implemented correctly, DXPs enable organizations to optimize their digital transformation, climb the ladder of digital maturity, and address customers' growing needs. They play an instrumental role in enhancing customer experience across industries.

What Is A DXP?

According to Gartner,

DXP is a cohesive and integrated technology designed to enable the management, composition, optimization, and delivery of digital experiences through multiple customer journey touchpoints.

DXPs enhance customer experiences by getting rid of technology silos and providing users with access to a central hub through which one can create, manage, deliver, and optimize experiences. They can help collect data and gather insights into customer behavior that can be used to create, manage, and deliver high-quality personalized content.

DXPs pack a lot of functionality into a single system. They combine popular technologies like Customer Relationship Management (CRM), campaign management, Digital Asset Management (DAM), personalization tools, and Customer Data Platforms (CDP). Other capabilities of a DXP include:

  • Media storage and content management
  • Development of portals, websites, apps, and landing pages
  • Collecting customer data and content
  • Utilizing data to personalize the experience
  • Analytical insights

What Is A Composable DXP?

Composable DXP (Digital Experience Platform) refers to a modular approach to building a DXP by combining individual, self-contained services or capabilities. It allows organizations to select and integrate specific features, such as content management, personalization, or analytics, based on their unique requirements.

This composable architecture provides flexibility, scalability, and adaptability, enabling businesses to create tailored digital experiences for their customers. By leveraging APIs and microservices, composable DXP empowers organizations to easily assemble and reconfigure components, streamlining development processes and enhancing overall agility.

What Is A True Open DXP?

Though most DXPs are composable, the fact remains that they are also highly opinionated and do not integrate properly with other DXPs or other components of different DXPs. Prateek Jain and Dominique De Cooman in DXP Deconstructed Episode 3 also highlighted similar views.


In a true open DXP, users can switch and upgrade any components without affecting the rest of the stack. This will allow businesses to exercise complete freedom in providing their customers with the best-in-class experience without worrying about integration complexities.

How You Can Use A Composable DXP To Enhance Your Customer Experience

According to a report by PwC, 49% of consumers would walk away from a brand that delivers a bad customer experience. Another study by Salesforce found that 76% of customers expect organizations to understand their needs and expectations.

Importance of personalization

Both these reports indicate that it’s easy for customers to be dissatisfied, and personalization is the key to providing meaningful digital experiences. A composable DXP can help achieve this by:

Delivering Compelling Omnichannel Content

Organizations need to reach out to their customers across channels. A recent study by Omnisend highlighted that the purchase rate of campaigns using three or more channels is 494% higher than single-channel marketing.

Personalizing the customer experience for all these channels individually can pose a challenge for organizations using traditional marketing approaches. Composable DXPs can solve this problem by optimizing marketing efforts and controlling the experience customers get on all channels.

Ensuring Cutting-Edge, Interactive Customer Experiences

A Composable Digital Experience Platform (DXP) gathers information on challenges and obstacles in processes across various channels and devices, creating a comprehensive dataset. This data can be analyzed and utilized to enhance the digital customer experience.

Utilizing Targeted Content

Consumers consistently express their desire for content and offers that are tailored to their individual needs. They have high expectations for personalized experiences. But traditional monolithic CMS frameworks can limit the ability to deliver content specifically targeted to different customer segments.

In contrast, composable platforms provide marketers with the capability to precisely target the most responsive audiences, thereby optimizing customer acquisition and retention.

For instance, composable DXPs enable localization efforts such as translating content into local languages or automatically adapting it for different contexts and delivery channels. This empowers marketers to design campaigns that revolve around a highly personalized customer experience.

Maximizing Content Usage and Value

When data is isolated within specific divisions or verticals, its value is limited. This value multiplies when the data is accessible and utilized across all aspects of a business, enabling widespread content distribution.

A composable platform leverages APIs to distribute both legacy and new content extensively, without incurring significant expenses to adapt the content for various platforms. This enhances the value of your enduring content.

Delivering Content At Speed

A composable platform's APIs are valuable for accelerating content distribution. Unlike monolithic systems that often demand programming work or time-consuming adjustments for deployment, a composable solution eliminates this requirement for marketers.

In today's fast-paced content landscape, the speed of distribution is crucial. Additionally, organizations can provide the most relevant customer experiences by swiftly updating marketing materials to reflect new developments or changes at the product level.

Personalizing Interactions

A digital experience platform offers the ability to customize a customer's experience, which is facilitated by centralizing customer data on a single platform. This platform can integrate with other systems, like CRMs, to provide a comprehensive view of the customer.

This centralized data allows for the delivery of personalized content and functionality based on contextual factors such as job role. It also enables the creation of user segments based on key audiences and behaviors. 

Industries Adopting DXPs

Digital experience platforms are changing how businesses interact with their customers across industries.

Nonprofit: Transforming Digital Experience And Content Publishing For OHCHR.org

The Office of the United Nations High Commissioner for Human Rights (OHCHR) is an international organization that promotes and protects human rights globally. They needed to transform their digital experience and content publishing to reach a wider audience and improve user engagement.

Axelerant partnered with OHCHR to design and develop a new website using Drupal, a robust content management system. The experts implemented a user-centric design, improved content organization, and integrated multimedia features.

The transformation resulted in a more user-friendly website that allowed OHCHR to communicate its human rights mission effectively and engage with its global audience. The new website saw increased traffic, engagement, and positive user feedback.

OHCHR

Higher Ed: Engineering An Intuitive, Accessible, Secure, And Personalized DXP For UEL

The University of East London faced challenges managing content, collaboration, and personalized user experiences. They needed a solution to deliver relevant and engaging content to students, staff, and other stakeholders.

The experts at Axelerant integrated different technologies and platforms to create a centralized system for managing content, enhancing collaboration, and personalizing user experiences. This DXP solution gave UEL the tools to deliver engaging content to its stakeholders. This helped UEL successfully transform its digital presence and improve overall user experience.

UEL

Retail: Consolidating DX Across 90+ Brands For Retail Megalith

The customer is an enterprise-level retail megalith comprising some of the world's most recognized brands and household names. The franchise currently operates 90+ brands and 4000+ stores.

The client faced challenges in frontend development, specifically managing their franchise websites effectively. The experts at Axelerant provided a solution by implementing a robust frontend development strategy. They focused on improving the franchise websites' performance, user experience, and overall design.

The solution helped enhance website performance, better user engagement, and increased customer satisfaction.

Consolidating_DX

How Axelerant Can Help Implement A Composable DXP

By understanding your organization's goals, customer journeys, and technical landscape, we can help craft a holistic DXP strategy that integrates various components into a cohesive digital ecosystem. Our team ensures a flexible, scalable, and future-ready DXP solution by leveraging agile methodologies and cutting-edge technologies.

Schedule a call to learn how we can help your organization enhance customer experiences, drive operational efficiencies, and achieve sustainable growth through a Composable DXP.

Jan 17 2024
Jan 17
Project: Drupal core
Date: 2024-January-17
Security risk: Moderately critical 11∕25 AC:None/A:None/CI:None/II:None/E:Theoretical/TD:Default
Vulnerability: Denial of Service
Affected versions: >=8.0 <10.1.8 || >=10.2 <10.2.2
Description: 

The Comment module allows users to reply to comments. In certain cases, an attacker could make comment reply requests that would trigger a denial of service (DOS).

Sites that do not use the Comment module are not affected.

Solution: 

Install the latest version:

  • If you are using Drupal 10.2, update to Drupal 10.2.2.
  • If you are using Drupal 10.1, update to Drupal 10.1.8.

All versions of Drupal 10 prior to 10.1 are end-of-life and do not receive security coverage. (Drupal 8 and Drupal 9 have both reached end-of-life.)

Drupal 7 is not affected.

Jan 17 2024
Jan 17

This post is brought to you from our partners at Skynet Technologies.

Uplifting the digital experience of your Drupal website by making it accessible is essential.

Digital technology has evolved because it is meant to be easily available to all. But unfortunately, the web is still full of inaccessible experiences, which are a hindrance for users with any sort of disability. That is why Drupal has incorporated various accessibility features over time to ensure website accessibility.

Along with accessibility features, Drupal also has accessibility modules contributed by its active community. These modules improve Drupal website accessibility without requiring much coding effort.

Let’s look at the modules that enhance Drupal website accessibility.

Top Drupal web accessibility modules!

#1 All in One Accessibility

Drupal All in One Accessibility is an AI-based accessibility module that makes Drupal websites accessible to people with hearing or vision impairments, motor impairments, color blindness, dyslexia, cognitive and learning impairments, seizure and epileptic conditions, and ADHD. It handles UI and design-related adjustments through an accessibility interface.

The Drupal All in One Accessibility module installs in just 2 minutes. The PRO version reduces the risk of time-consuming accessibility lawsuits.

This module improves accessibility compliance for the standards WCAG 2.0, WCAG 2.1, WCAG 2.2, ADA, Section 508, European EAA EN 301 549, Canada ACA, California Unruh, Israeli Standard 5568, Australian DDA, UK Equality Act, Ontario AODA, France RGAA, German BITV, Brazilian Inclusion law LBI 13.146/2015, Spain UNE 139803:2012, JIS X 8341, Italian Stanca Act, and Switzerland DDA.

Its ease of use makes it a cornerstone of improving web accessibility for companies of all sizes. Top features of the module:

  1. Accessibility statement
  2. Accessibility interface for UI design fixes
  3. Dashboard Automatic accessibility score
  4. AI based Image Alternative Text remediation
  5. AI based Text to Speech Screen Reader
  6. Keyboard navigation adjustments
  7. Content, Color, Contrast, and Orientation Adjustments
  8. Supports 53 languages
  9. PDF / Document Remediation Add-On
  10. White Label Subscription
  11. Live site translation add-on
  12. Custom widget color, position, icon size, and type
  13. Dedicated email support

#2 Monsido Tools

The Monsido tool helps optimize Drupal websites easily and swiftly. It validates the website against WCAG 2.1, the de facto international standard, so that the website is accessible to everyone in every region.

Monsido scans your Drupal website to identify all persisting accessibility issues and gives you suggestions on how to address and rectify them. It also finds SEO errors and helps you optimize every page of your website.

#3 Editoria11y Accessibility Checker

Editoria11y (editorial accessibility ally) is supported by Princeton University. It focuses on content quality and accessibility.

The module checks content automatically; authors do not need any training to use it. It detects issues that appear after Drupal assembles the pages by testing rendered content.

Editoria11y prioritizes content issues by inserting alerts and tooltips to help authors fix the problems without troubling them with complex code. It mainly flags accessibility issues and does not replace page elements.

#4 Civic Accessibility Toolbar

The Civic Accessibility Toolbar provides a block of accessibility utilities that helps end users switch to a higher-contrast version of the theme and adjust text font sizes.

The module lets you create a block with one or both of these utilities to make your Drupal website accessible for visually impaired users. It is tested with the Garland, Bartik, Zen Starterkit, Stark, and Olivero themes.

It uses colourContrast and fontSize cookies to remember user selections. The cookies only store functional or necessary details and don’t keep the user’s personal information.

#5 Accessibility toolkit

Basically, the Accessibility Toolkit provides Drupal developers with reusable tools to meet the needs of people with disabilities by making websites compatible with assistive technologies. It is tested for Drupal 7, 8, and 9. It works through aggressive CSS additions and remembers the settings using Drupal's built-in jQuery Cookie support.

It provides a block with settings that allow for:

  • High contrast mode
  • Dyslexic font support
  • Text scaling
  • Inverted colors mode
  • Keyboard navigation (only for D8/D9)

#6 Fluidproject UI Options

The module is maintained by Ukrainian developers. It helps users modify a web page’s line height, font size and style, contrast, and link style. All changes are retained across visits using cookies. Fluidproject UI Options integrates Drupal libraries into non-admin pages.

To use this module, you need Grunt and NPM installed to compile the Infusion library, and jQuery 1.7 is required.

However, the module cannot handle internationalization through the Drupal interface; JSON files within the module folder perform this function instead. This Drupal accessibility module has been tested successfully with the most popular themes. Note that some themes require additional CSS to adjust font sizes and line heights. Also, contrast settings don’t work properly for website elements that use CSS gradients.

#7 High contrast

High contrast provides a quick solution for users to switch between an active theme and its high-contrast version.

Simply install it, press the Tab key, and then click the ‘Toggle high contrast’ link. You will find yourself in high contrast mode; return to the normal view by following the same steps.

#8 Style Switcher

This Drupal website accessibility module enables every website visitor to select the stylesheet they want to view the site content with. Visitors only need to click its link to get the new look of the website.

Style Switcher reduces duplicated work, since developers don’t need to create separate themes for alternative stylesheets. A themer can provide a theme with alternate stylesheets, and a site builder can add alternate stylesheets in the admin section.

The module gathers and presents all the styles as a list of links in a block, so site visitors can easily choose their preferred style. And because the module uses cookies, a returning user gets their chosen style again.

#9 Text Resize

The Text Resize accessibility module offers end users a block that helps change the font size of text on Drupal websites. The block includes a button to increase or decrease the text size, which is an aid for visually impaired users. Text Resize uses JavaScript with jQuery and jQuery Cookie to provide this functionality.

#10 Automatic Alternative Text

The Automatic Alternative Text accessibility module uses the Microsoft Azure Cognitive Services API or Alttext.ai to generate alternative texts for images if the alt text is missing.

The module uses image-processing algorithms to determine whether an image has relevant content. It also has features like categorizing the content of images, describing images in human-readable language, and estimating an image's dominant and accent colors.
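For illustration, the general approach such services take looks roughly like the sketch below: send the image to a vision API and use the returned caption as draft alt text. The Azure endpoint path, parameters, and response keys are assumptions based on the v3.2 Image Analysis API, and the resource name and key are placeholders — check the current documentation (or the module itself) before relying on this.

```python
import requests

# A rough sketch of generating draft alt text from an image URL via a vision
# API. Endpoint version, parameters, and response shape are assumptions.
AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # hypothetical resource
AZURE_KEY = "<subscription-key>"  # placeholder

def draft_alt_text(image_url: str) -> str:
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Description"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    return captions[0]["text"] if captions else ""

# print(draft_alt_text("https://example.com/team-photo.jpg"))
```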

P.S. All the above-mentioned modules have free and premium versions available. You can select the best-suited version.

Some more contributed modules to fine-tune the Drupal website’s accessibility!

  • CKEditor Abbreviation
  • HTML Purifier
  • Siteimprove
  • htmLawed
  • Block ARIA Landmark Roles

Read more for detailed information.

Wrapping up

Having an accessible website is crucial and the need of the hour. All in One Accessibility is a quick and comprehensive solution with AI-based features to take your website's accessibility compliance to the next level. The cherry on top is its 2-minute installation and 10-day free trial. Beyond that, dashboard add-ons and upgrades like PDF/document accessibility remediation, white label subscription, and live site translation help increase digital accessibility.

Jan 17 2024
Jan 17

Join us TOMORROW, January 18 at 1pm ET / 10am PT, for our regularly scheduled call to chat about all things Drupal and nonprofits. (Convert to your local time zone.)

This month we'll be discussing the return of the Nonprofit Summit to DrupalCon Portland 2024!  We're currently looking for breakout discussion leaders, and we'll be answering questions about what that involves, as well as throwing around ideas for potential topics. 

And we'll of course also have time to discuss anything else that's on our minds at the intersection of Drupal and nonprofits -- including our plans for NTC in March.  Got something specific you want to talk about? Feel free to share ahead of time in our collaborative Google doc: https://nten.org/drupal/notes!

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

This free call is sponsored by NTEN.org and open to everyone. 

  • Join the call: https://us02web.zoom.us/j/81817469653

    • Meeting ID: 818 1746 9653
      Passcode: 551681

    • One tap mobile:
      +16699006833,,81817469653# US (San Jose)
      +13462487799,,81817469653# US (Houston)

    • Dial by your location:
      +1 669 900 6833 US (San Jose)
      +1 346 248 7799 US (Houston)
      +1 253 215 8782 US (Tacoma)
      +1 929 205 6099 US (New York)
      +1 301 715 8592 US (Washington DC)
      +1 312 626 6799 US (Chicago)

    • Find your local number: https://us02web.zoom.us/u/kpV1o65N

  • Follow along on Google Docs: https://nten.org/drupal/notes

View notes of previous months' calls.

Jan 17 2024
Jan 17

Michael E. Meyers joined Tag1 Consulting as Managing Director in 2017 after a decade of working with the team on many projects.

Prior to joining Tag1, Michael was VP of Developer Relations at Acquia, responsible for developer marketing & events, developer relations, and helped launch the developer products group. Michael co-founded & was CTO of NowPublic.com, the first venture-backed Drupal based startup that pioneered citizen journalism. With the help of Tag1, he grew NowPublic into a top 500 website. As CTO of The Clarity Digital Group, which acquired NowPublic, Michael and the Tag1 team rebuilt Examiner.com, the first Drupal-based top 100 Internet site, and the leading contributor to Drupal 7.

Michael has made major contributions to and helped establish Drupal as one of the most successful open source platforms. He’s an advisor to the Drupal Association board and several startup companies.

Jan 16 2024
Jan 16

Authored by: Nadiia Nykolaichuk.

Upon entering the new year, many of us have started working towards resolutions or aspiring achievements. In the digital space, Drupal websites, too, can aspire to be on the cutting edge of trends and provide the most compelling online journeys to users.

With Drupal, there are plenty of opportunities for innovation, fresh ideas, and great functionalities, because the CMS constantly moves forward. So what is expected to be trending for Drupal websites in 2024? Let’s take a look.

Unveiling Drupal Trends for 2024

New releases ahead: Drupal 10.3 and Drupal 11

As ambitious as it may sound, this year is set to bring us a new major Drupal core version — Drupal 11. This is based on the new major release schedule where major versions roll out every other year. Drupal 11 will start yet another chapter in the exciting story of Drupal’s evolution.

There are three scenarios as to when exactly we might be able to see Drupal 11. Depending on how fast the beta requirements for Drupal 11 are going to be fulfilled, this exciting historic moment might occur during the following times:

  • week of June 17, 2024
  • week of July 29, 2024
  • week of December 9, 2024

When it comes to minor releases, with the closing of 2023, we welcomed Drupal 10.2. This year, we should see yet another minor release of Drupal core — Drupal 10.3. Based on the above-mentioned possible scenarios page, Drupal 10.3 might be out in June or July, either together with Drupal 11 or six months earlier.

It is worth mentioning again that staying abreast of the Drupal release cycle and keeping your website updated is vital for fresh and modern features, improved security, faster performance, and bug-free work.

Drupal maintenance to be a lot easier and updates maximally automated

It looks like Drupal core magicians like to make fairy tales come true. During DrupalCon Lille 2023, Drupal creator Dries Buytaert made his keynote in the form of a fairy tale about the Drupal Village. He mentioned that one of the villagers’ aspirations was to reduce the manual upkeep and maintenance.

The year 2024 promises to bring a breakthrough in this area because Automatic Updates are scheduled to eventually appear in Drupal 10.3. With this, it will be possible to run Drupal core updates with a couple of clicks or even automate them entirely.

Primarily, the functionality is meant for security and patch updates (like from Drupal 10.1.1 to 10.1.2). Updates between minor core versions (like from Drupal 10.2 to 10.3) might also be possible, but in the attended mode only. Updates for contributed modules and themes should be available with the help of an additional contributed module, said project leads Adam G-H (phenaproxima) and Ted Bowman (tedbow) at their DrupalCon Pittsburgh 2023 session.

When speaking about what helps simplify website maintenance in his fairy tale keynote, in addition to the upcoming Automatic Updates, Dries also mentioned two useful things that had been around for a while:

  • the new release and innovation model in Drupal
  • the automatic code fixes, especially for deprecated APIs, with tools like Drupal Rector 
Drupal features to reduce the manual upkeep and maintenance presented at DrupalCon Lille 2023’s Driesnote.

Modules and themes to be installable via the UI

Another great trend for the year 2024 is extending websites with new modules effortlessly. Much has been said about Project Browser — the expected innovation that should enable users to browse for modules and themes and install them directly via the Drupal admin dashboard. During the year 2024, Project Browser will eventually become available in Drupal core. This is planned for Drupal 10.3, so we can expect to see it in summer.

There is currently a contributed module Project Browser for those who would like to try it out. There are also plenty of contribution opportunities because all contributed modules need to be represented in Project Browser with a brief summary, logo, and more. 

A demo of finding modules with Drupal’s Project Browser.

More AI power to be harnessed in 2024

While the digital world was cautiously getting used to the boom of artificial intelligence tools last year, they are set to rock the year 2024, creating the next remarkable trend. Just one look at the DrupalCon Lille 2023 program shows that AI was one of the hottest topics with half a dozen sessions dedicated to it. They were related to AI-assisted content editing with CKEditor 5, discussions of whether OpenAI in Drupal 10 is a friend or foe, and so on.

On our blog, we overviewed the available Drupal modules for OpenAI/ChatGPT integration and walked you through the steps of using the OpenAI module for content workflows on your Drupal website. The available tools are able to generate text, translate it, change its tone of voice, suggest taxonomy tags for it, and more. To further boost Drupal content workflows, the existing modules are expected to improve their stability and get more features, and new modules should appear in the course of 2024.

Using the OpenAI module in Drupal’s content workflows.

CKEditor 4 is end of life — long live CKEditor 5!

As yet another trend for the upcoming year, more Drupal websites are expected to upgrade to CKEditor 5 — the new version of the rich-content WYSIWYG editor incorporated into Drupal core. That’s because CKEditor 4 has reached its end of life on January 1, 2024.

CKEditor 4 has been around since 2012, assisting content creators with rich editing experiences, but the time has come to say goodbye to it. CKEditor 4 stopped receiving open-source security updates in the Drupal ecosystem on the first day of the year.

However, it has been published under commercial terms of the Extended Support Model contract. This means that, as an alternative to an immediate upgrade, there is an option for long-term commercial support for CKEditor 4 that will last until December 2026. A special CKEditor 4 LTS (“Long Term Support”) module has been created for this mission. LTS will give Drupal websites another three years if, for whatever reason, they cannot upgrade to CKEditor 5 right now.

Starting with Drupal 9.5, the Drupal core already has CKEditor 5, hence the common question — do Drupal 9.5 or newer websites need an upgrade to CKEditor 5? They don’t if they are newly-built Drupal installations and they do if they have been upgraded from older versions. That’s because the upgrade procedure requires checking all of your website’s components for compatibility with CKEditor 5. For more information, you might want to view our compilation of five can’t-miss articles on CKEditor 5 that covers everything from its innovative features to upgrade tips.

An example of content creation with CKEditor 5 containing the call to upgrade.

More editorial teams to rely on Layout Builder in 2024

Drupal’s built-in tool for creating page layouts is going to become more editor-friendly in the upcoming year, opening new opportunities for editorial teams to embrace it. A whole bunch of editorial improvements for the Layout Builder are currently in the works.

Among other things, this year’s plans include making the list of available blocks less overwhelming, making the block options better understood by editors, improving the current UIs used for selecting and editing blocks, making it easier to move or change sections, and much more.

In addition, the ecosystem of extra modules to extend Layout Builder keeps growing with each day, making it easy to fulfill any special needs or requirements of editorial teams. This might be choosing different layouts and blocks for different languages, adding granular permissions, creating a library of sections, and more — the sky’s the limit. It’s great to know that one of the most famous modules in this area is the Bootstrap Layout Builder, co-maintained by our team’s developers.

Modelling content structure to be a lot easier in 2024

A system of fields in Drupal is amazing because it enables site builders and website administrators to shape content structure as desired. An image, a text field, a date, an email address, and lots of other field types are available to be added in the needed order to content types.

However, until recently, the field adding process wasn’t very straightforward and intuitive and involved multiple steps split between different pages. Luckily, this is going to be totally different in 2024. After thorough user research, the Drupal team came up with a grand plan to improve the field creation experience.

The new Field UI is going to present available field types as a user-friendly grid. The field types will be grouped and accompanied with brief explanations and icons for better clarity. Once the field type is selected, there’s only one configuration step left to make. There will also be a convenient UI to re-use existing fields. 

Many of those features have gradually been added to Drupal in Drupal 10.1 and Drupal 10.2.

The new field creation interface already included with Drupal 10.2.

There are some other improvements awaiting in 2024 such as moving the field selection into a modal window, providing keyboard accessibility, and more. We can take a sneak peek thanks to Tim Plunkett (tim.plunkett), Drupal core contributor and initiative coordinator, who presented a demo at DrupalCon Lille opening keynote of how the Field UI is going to look in the near future. 

A demo of the future modal functionality for the field creation.

Administrative navigation to be considerably improved

It’s very likely users will be able to move more seamlessly through the Drupal administrative interface with the new toolbar in 2024, with no more need to customize it with contributed modules. There is a plan to improve administration navigation that should increase user satisfaction thanks to a reduced number of clicks or page visits needed to move through the UI, and give users more confidence about where to find the desired pages.

As Cristina Chumillas (ckrina), Drupal core UX Maintainer, shared at DrupalCon Lille 2023 opening keynote, it’s planned to move most of the main admin navigation functionality into a left vertical sidebar and a possible top bar with contextual tools based on where a user currently is.

A demo of Drupal’s new admin toolbar in the works.

The creators of the new toolbar also strive to improve the wording used in the administrative menu items so it’s not so full of “drupalisms” and is clear even for beginners. Another great plan among a whole bunch of new ideas is to revamp the content creation menu by adding direct content creation links, which was also demoed by Cristina.

A demo of direct content creation links in the new admin toolbar.

The expected success of this design work is supported by extensive usability testing. The new toolbar is targeted for the Drupal 10.3 release in 2024.

Need a “bolder” theme? Consider Gin!

Of course, the new Drupal default administration theme Claro is amazing. It’s super clean, modern, user-friendly, and accessible. However, there is another option that has been creating a lot of buzz in the Drupal circles. One of the lead designers of Claro and Drupal Design System, Sascha Eggenberger (saschaeggi), is also leading the works on a new, even more modern admin theme that is currently a contributed project — Gin.

The Gin theme is based on Claro but offers a radically new UI layout. A Darkmode is available for the theme. There are also interesting features such as the Published state toggle and the Preview and Save buttons staying sticky at the top of the node editing form. The creators of Gin are experimenting with bold and progressive ideas, making Gin an active topic for discussion in 2024 and, of course, a trending theme option to install on websites. 

The node editing form in the Gin admin theme for Drupal.
The Darkmode for the node editing form in the Gin theme.

Sascha Eggenberger is also involved with creating the above-mentioned new administration toolbar, and there is some successful exchange of ideas and best practices between the toolbar and the Gin theme. He and Cristina Chumillas gave a session “Next Drupal admin UI improvements” at DrupalCon Lille 2023 where Sascha mentioned that the new admin toolbar was already implemented in the Gin administration theme, so anyone could check it out. However, he said that it’s also possible to test the toolbar if you are using Drupal core Claro theme — with the help of the Navigation module.

Improved approaches to creating events and calendars

The year 2024 should provide smarter options for teams to create commonly used content formats such as events and calendars. Useful modules for this are being actively updated with the most modern features, getting multiple new releases one after another.

Among other tools, this applies to the lightweight Calendar View module that enables you to display the results of Drupal Views as monthly or weekly calendars with Ajax navigation and lots of other modern features.

Another great example is the Smart Date module that provides advanced date and time functionality, including recurring dates, auto-population of end time, default duration, and lots of other options that make it a treasure for events or other time-specific content. 

Final thoughts

We have covered at least some of the most prominent trends to be expected in the world of Drupal websites in 2024. How to make sure your website is trendy? For many features, it might be enough to keep it up-to-date with the Drupal releases, because they are going to be built into Drupal core. For some, you’d also need to install or update an additional contributed module. In any case, we wish you and your team the most productive year 2024 with your Drupal website as a super efficient assistant!

Jan 16 2024
Jan 16

As part of my role in the Drupal Association, we are trying to find new ways to unleash innovation. Innovation, as it happens, is a key goal for the Drupal Association. What surprised me when I started with the Drupal Association was meeting companies that were contributors (some of them known as long-time contributors), or that were very interested in contributing, but that didn't know how to maximize their contributions or even where they should be contributing.

I don’t think that these are a few isolated cases, as it’s not the first time I've seen this trend. Back when I was working for a 100+ developer consultancy firm there was a big corporate push to increase our contribution to open source. And contribute we did. We started “Pizza Fridays”, which meant we were spending Fridays contributing, doing presentations between us, and having pizza for lunch. We had fun, but we lacked structure, purpose, and higher goals (and a healthy diet on Fridays). Our plan was not aligned with anything other than our own appetite to experiment or learn something.

If we had had a structure that aligned us with the project we were contributing to, our contributions would have been more impactful, the business would have benefited in a more meaningful way, and the whole team would probably have been allowed to contribute even further and for longer. We did amazing things, don’t get me wrong, but their impact could have been much bigger.

That’s why, today, we are introducing the credit bounty program. The idea is to do an initial experiment, and if it has an impact on Drupal moving forward, we’ll tweak it if needed and continue with new iterations.

I expect that the issues and projects that we are promoting will change over time, so we’ll share soon how you can get updated information.

If you are a maintainer and you would like us to include your issues in this pilot program, that may be a possibility as well, so please send me an email: [email protected]. Depending on how this first phase goes, we may also start promoting contributed module issues, based on the modules' popularity, usage on sites, complexity, how innovative they are, and so on.

For now, this is the list of issues where (core, for now) maintainers need your help. The reward will be a boost to marketplace rank equivalent to 5 times the normal amount for these issues. Sounds good?

Maintainers will grant credit as normal on these issues, and the contributing organizations that the maintainers credit will receive the full bounty.

Make sure to read Drupal Core's Issue Etiquette for core contribution, and the Contributor Guide. 

Have questions or ideas? Please ping me: [email protected]
 

Jan 16 2024
Jan 16

What is a views display extender

A display extender plugin allows you to add additional options or configuration to a view regardless of the type of display (e.g. page, block, and so on).

For example, if you wanted to allow site users to add certain metadata to the rendered output of every view display regardless of display type, you could provide this option as a display extender.

What we can do with it

We will see how to implement such a plugin. As an example, we will add some metadata (deliberately useless meta tags) to the document head when the view is displayed.

We will call the display extender plugin HeadMetadata (id: head_metadata) and we will implement it in a module called views_head_metadata.

The implementation

Make our plugin discoverable

Views does not discover display extender plugins through an info hook as usual. For this particular plugin type, Views keeps a list in its views.settings configuration object.

You need to add your plugin ID to views.settings.display_extenders (which is a list).

To do so, I recommend implementing hook_install() (and hook_uninstall()) in the module's .install file, as sketched below. To manipulate configuration objects, you can look at my previous notes on CMI.
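
A minimal sketch of such a views_head_metadata.install file, assuming the module name and the head_metadata plugin ID named above:

<?php

/**
 * Implements hook_install().
 */
function views_head_metadata_install() {
  // Register the display extender plugin ID with Views.
  $config = \Drupal::configFactory()->getEditable('views.settings');
  $extenders = $config->get('display_extenders') ?: [];
  $extenders[] = 'head_metadata';
  $config->set('display_extenders', array_unique($extenders))->save();
}

/**
 * Implements hook_uninstall().
 */
function views_head_metadata_uninstall() {
  // Remove the plugin ID again so Views does not look for a missing plugin.
  $config = \Drupal::configFactory()->getEditable('views.settings');
  $extenders = array_diff($config->get('display_extenders') ?: [], ['head_metadata']);
  $config->set('display_extenders', array_values($extenders))->save();
}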

Make the plugin class

As seen in the previous post on Drupal 8 plugins, you need to implement the class in the plugin type namespace, extend the base class for this type of plugin, and add the metadata annotation.

In the case of the display extender plugin, the namespace is Drupal\views_head_metadata\Plugin\views\display_extender, the base class is DisplayExtenderPluginBase, and the metadata annotation is defined in \Drupal\views\Annotation\ViewsDisplayExtender.

The display extender plugin's methods are nearly the same as the display plugin's; you can think of them as a set of methods to alter the display plugin.

The important methods to understand are (see the sketch after this list):

  • defineOptionsAlter(&$options): defines the options your plugin will store and save, a sort of schema for your plugin.
  • optionsSummary(&$categories, &$options): adds a category (a section of the Views admin interface) if you want one, and defines your options' settings (which category they belong to and the value to display as the summary).
  • buildOptionsForm(&$form, FormStateInterface $form_state): where you build the form(s) for your plugin, linked, of course, with validate and submit methods.
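
Here is a condensed sketch of what such a class could look like, in src/Plugin/views/display_extender/HeadMetadata.php. The single head_metadata text option and its form are illustrative assumptions, not the only way to do it:

<?php

namespace Drupal\views_head_metadata\Plugin\views\display_extender;

use Drupal\Core\Form\FormStateInterface;
use Drupal\views\Plugin\views\display_extender\DisplayExtenderPluginBase;

/**
 * Adds head metadata options to every Views display.
 *
 * @ViewsDisplayExtender(
 *   id = "head_metadata",
 *   title = @Translation("Head metadata")
 * )
 */
class HeadMetadata extends DisplayExtenderPluginBase {

  /**
   * {@inheritdoc}
   */
  public function defineOptionsAlter(&$options) {
    // The "schema" of our plugin: a single text value.
    $options['head_metadata'] = ['default' => ''];
  }

  /**
   * {@inheritdoc}
   */
  public function optionsSummary(&$categories, &$options) {
    // Our own category (section) in the Views admin UI.
    $categories['head_metadata'] = [
      'title' => $this->t('Head metadata'),
      'column' => 'second',
    ];
    $options['head_metadata'] = [
      'category' => 'head_metadata',
      'title' => $this->t('Metadata'),
      'value' => $this->options['head_metadata'] ? $this->t('Custom') : $this->t('None'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function buildOptionsForm(&$form, FormStateInterface $form_state) {
    if ($form_state->get('section') === 'head_metadata') {
      $form['head_metadata'] = [
        '#type' => 'textfield',
        '#title' => $this->t('Head metadata value'),
        '#default_value' => $this->options['head_metadata'],
      ];
    }
  }

  /**
   * {@inheritdoc}
   */
  public function submitOptionsForm(&$form, FormStateInterface $form_state) {
    if ($form_state->get('section') === 'head_metadata') {
      $this->options['head_metadata'] = $form_state->getValue('head_metadata');
    }
  }

}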

Generate the metadata tags in the document head

Now that we have our settings added to every views display, we need to use those to generate the tags in the document head as promised.

To act on the view's render output, we will use the dedicated hook, hook_views_pre_render($view), together with the render array property #attached.

Implement that hook in the .module file of our views_head_metadata module; here is a sketch:
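
A minimal sketch of that hook, assuming the head_metadata option defined by the plugin above (the meta tag name used here is just an example):

<?php

use Drupal\views\ViewExecutable;

/**
 * Implements hook_views_pre_render().
 */
function views_head_metadata_views_pre_render(ViewExecutable $view) {
  // Grab the options saved by our display extender, if it is enabled.
  $extenders = $view->display_handler->getExtenders();
  if (empty($extenders['head_metadata'])) {
    return;
  }
  $value = $extenders['head_metadata']->options['head_metadata'] ?? '';
  if ($value === '') {
    return;
  }
  // Attach a meta tag to the document head via #attached.
  $view->element['#attached']['html_head'][] = [
    [
      '#tag' => 'meta',
      '#attributes' => [
        'name' => 'views-head-metadata',
        'content' => $value,
      ],
    ],
    'views_head_metadata_' . $view->id(),
  ];
}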

Jan 15 2024
Jan 15

Today we are talking about the Drupal 10 Masterclass book, how it’s different from other Drupal books, and why you need it on your bookshelf, with author Adam Bergstein. We’ll also cover Dashboards as our module of the week.

For show notes visit:
www.talkingDrupal.com/433

Topics

  • What is Drupal 10 Masterclass about
  • Who is this book for
  • Why did you write the book
  • Can you explain the subtitle a bit
  • How does this differ from other recent Drupal books
  • Can you tell us about the authoring experience
  • What can our listeners do to make this book a success
  • Do you think you’ll write another book
  • Simplytest.me update

Resources

Guests

Adam Bergstein - @n3rdstein

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi

MOTW

Correspondent

Martin Anderson-Clutz - mandclu

  • Brief description:
    • Have you ever wanted to add a dashboard to your Drupal site, to provide at-a-glance information to editors? There’s a module for that.
  • Module name/project name:
  • Brief history
    • How old: created in Nov 2019 by Erik Seifert of 1x Internet
    • Versions available: 2.0.8 and 2.1.6, the latter of which works with Drupal 9 and 10
  • Maintainership
    • Actively maintained
  • Test coverage
    • 13 open issues, 5 of which are bugs on the 2.1.x branch
  • Usage stats:
    • 1,878 sites
  • Module features and usage
    • Allows for the creation of dashboards as exportable config entities, using Layout Builder to define the layout and placement of blocks
    • It’s possible to create unique dashboards per user
    • Out of the box you get a number of dashboards components to embed views, show recent errors, display content from RSS feeds, and more
    • Dashboard components are defined using a new plugin base, so you can also create custom components to meet the unique needs of your site
    • The dashboards are also optimized for use with Gin, which isn’t a surprise because 1x Internet is also a sponsor of the Gin admin theme. If your site is using Gin, this module will provide a central dashboard that seamlessly integrates with the back-end UI
    • If you’re looking to implement dashboards on your site, you can also look at Moderation Dashboard and Homebox as other options. The latter of those is even more widely used, but mostly by D7 sites. That said, drupal.org is one of those sites, so if your team is active on drupal.org then the interface will be very familiar
    • There is also a Dashboard Initiative that has been started by some core maintainers, so using one of these modules can set you up to weigh in on what the ideal state for the initiative might look like
Jan 15 2024
Jan 15

Introduction

In 2023, DoorDash added a pop-up in its app, warning customers that orders with no tip can take longer to deliver. 

This feature has become a source of disagreement among the company, drivers, and customers, dividing the app's user base.

This DoorDash feature is an example of a Dark Pattern, a design choice that leads to a deceptive user experience.

The term was coined in 2010 by Dr. Harry Brignull, founder of The Deceptive Patterns Initiative. Dark Patterns take undue advantage of users’ habits, nudging them into actions they didn’t intend to take.

Understanding Design Ethics and Dark Patterns empowers designers to create effective and trustworthy user experiences.

Ethics In The Design Process

Design Ethics Vs. Dark Patterns 

The correlation between Dark Patterns and Design Ethics is an interesting one, representing two sides of the same coin. 

Design Ethics: Design Ethics emphasizes informed consent, promotes autonomy, and builds trust by leveraging user psychology. These are principles that help create positive customer experiences.

Dark Patterns: Dark Patterns exploit psychological biases and cognitive limitations to manipulate user behavior. These patterns emphasize short-term gains over sustained relationships.

Examples Of Design Ethics

  • User-Centric Design: Places the needs and wants of the user at the core of every design decision.
  • Transparency & Honesty: Clearly communicates information to users.
  • Providing Autonomy To Users: Gives users control over their data and decisions, avoiding manipulation and coercion.
  • Accessibility & Inclusivity: Products and services designed to be used by everyone without bias.
  • Social Responsibility: Considers the broader impact of design decisions on society and the environment.

Examples Of Dark Patterns

  • Using Ambiguous Language: Misleads users and compels them to act on a call to action.
  • Bait & Switch: Displays one thing and delivers another, such as clickbait ads with no clear terms and conditions.
  • Hidden Or Obscure Opt-Outs: Makes it hard for users to avoid unwanted services or communications, such as newsletter signups.
  • Creating A Sense Of Urgency: Pushes users to take immediate action, such as opting for insurance while booking a flight ticket.

The Seven Principles Of Design Ethics

Design ethics are a system of moral principles that guide actions and decisions, helping designers navigate complex situations. Humane By Design aptly covers some of the core principles that can help designers stay true to ethics. These are:

Resilient Design

Resilient design focuses on the well-being of the most vulnerable and anticipates the potential for abuse.

Empowering design

Empowering design ensures products center on the value they provide to people over the revenue they can generate.

Finite design

Finite design maximizes the overall quality of time spent by bounding the experience and prioritizing meaningful and relevant content.

Inclusive design

Inclusive design is a methodology that enables and draws on the full range of human diversity.

Intentional design

Intentional design is about understanding the needs of the users, focusing on the problem and the corresponding solution.

Respectful design

Respectful design prioritizes people’s time, attention, and overall digital well-being.

Transparent design

Transparent design is clear about intentions, honest in actions, and free of dark patterns.

How Can Designers Stay Ethical 

Design is a powerful tool, and designers need to use it responsibly and apply design ethics in their work. Here are some of the ways to implement ethics as a designer.

Design With A Conscience

  • Prioritize Social Responsibility: Designers are a part of a larger community and their work impacts others. They should choose to design for positive change.
  • Leave A Positive Legacy: Designers sometimes create solutions that stay the same for years to come. They need to design with intention, considering how their work will have an impact for a long time. 

Veja is a sneaker brand that uses organic cotton, recycled plastic bottles, wild rubber, and various other environmentally friendly materials in its products. This enables Veja to reduce its environmental impact and craft products that prioritize sustainability.

The Veja Project

Design For Impact, Not Just Aesthetics

  • Focus On Ethical Outcomes: Clever design doesn't excuse harmful consequences. Designers need to focus on ethics and driving meaningful impact.
  • Evaluate The Bigger Picture: Design exists within a complex system and designers need to consider the social, economic, and ecological impact of their work.

Patagonia's environmental activism and Worn Wear program go beyond visually appealing clothes to promote sustainability and reduce garment waste.

Patagonia Worn Wear Stories

Embrace Responsibility, Not Just Creativity

  • Accountability: Designers need to own their work and the consequences.
  • Guide Clients Towards Ethical Choices: Designers need to offer expertise beyond just execution. They need to inform clients of potential negative impacts and suggest alternatives.
  • Say No: Refusing unethical projects is also a crucial design skill. Designers should question the purpose and impact of their work.

Atom is a meditation app that gamifies the user experience, encouraging users to plant virtual trees for milestones accomplished. This demonstrates that creativity can be harnessed to steer users towards a greater good.

Atom Screen

Welcome Constructive Criticism

  • Receptive To Feedback: Designers should learn and improve from feedback, regardless of its source. They should seek criticism early and often as this helps identify flaws before the work reaches the public.
  • Embrace Diverse Perspectives: Designers should value feedback from all backgrounds to create inclusive solutions.

GitLab uses public issue trackers and encourages discussion, transparently welcoming bug reports and feature requests from its community.

Design With Empathy, Not Exclusion

  • Understand Audience Personas: Designers should understand the pain points of their user personas to design effective solutions.
  • Build Diverse Teams: Designers should create a team that reflects their audience, leading to varied perspectives and better problem-solving.
  • Understand That Every User Matters: Designers should prioritize their design to solve problems for everyone, not just a select group.

Google Translate’s inclusive interface and emphasis on accurate translation for lesser-known languages demonstrate a commitment to breaking down language barriers and fostering connection.

Google Translate Design

Prioritize Ethical Introspection

No designer becomes unethical overnight. It's a gradual shift, a series of seemingly harmless decisions that can lead to unintended consequences. 

Before designing an interface, one must ask questions like:

  • Am I staying true to my values?
  • Are small compromises eroding my ethical boundaries?
  • Am I prioritizing financial gain over ethical considerations?

Regularly reflect on your work and decisions. Are you aligning with your values? Are you contributing to the world you want to see? This critical self-reflection is vital for maintaining ethical integrity in a design practice. 

Ethical considerations in design

Conclusion

By incorporating ethical design principles, designers can contribute to a sustainable and harmonious relationship between technology, the environment, humanity, and their own well-being.

At Axelerant, we follow a set of ethical design principles, which is reflected in our collaboration with our clients. Connect with us to learn more about how we put design ethics into practice.

Jan 15 2024
Jan 15

Introduction

Accelerated Mobile Pages (AMP) transform traditional, static emails into a dynamic, interactive, and engaging communication channel. AMP provides several benefits, such as higher engagement, reduced drop-offs, and an enhanced user experience.

static vs dynamic email

AMPScript allows users to create such dynamic emails within Salesforce Marketing Cloud Engagement. This scripting language can also be used across landing pages, SMS, and push notifications.

What Can AMPScript Do?

AMPScript expands Marketing Cloud functionality to create personalized, data-driven email campaigns. It offers dynamic content and scripting capabilities that enable the creation of dynamic subject lines, personalized greetings, and real-time content updates.

By utilizing the potential of AMPScript, marketers can build hyper-targeted and engaging campaigns by:

  • Personalizing emails using subscriber or contact data
  • Creating complex, highly dynamic emails using conditional logic
  • Cleaning and formatting data
  • Adding real-time information to emails, such as date or time
  • Tracking impressions

Other capabilities of AMPScript are mentioned below.

  • API: Create SOAP API interactions
  • Contacts: Modify Marketing Cloud contact information in the all-subscribers database
  • Content: Modify Marketing Cloud content, such as text and images in emails
  • Data Extension: Modify data in data extensions
  • Date Time: Modify date and time information in the Marketing Cloud
  • Encryption: Encrypt and decrypt Marketing Cloud data
  • HTTP: Get, post, and modify HTTP information in the Marketing Cloud
  • Math: Perform basic math functions
  • Microsoft Dynamics CRM: Interact with Microsoft Dynamics CRM data
  • Salesforce: Interact with Sales and Service Cloud data in Marketing Cloud
  • Sites: Interact with CloudPages sites
  • Social: Interact with Social Forward functionality in Email Studio
  • String: Modify string information in the Marketing Cloud
  • Utilities: Return and evaluate types of Marketing Cloud data

Best Practices For Using AMPScript

There are a few best practices that one can follow while using AMPScript.

1. Handle Inaccurate Data

During any campaign, there are always chances of getting bad data that both developers and marketers cannot control. The following steps can be used to handle inaccurate data.

Step 1: Check if a value is empty or null and assign default values for variables

%%[
IF Empty(@Firstname) OR @Firstname == "Unknown" THEN
  SET @defaultValue = "Customer"
ELSE
  SET @defaultValue = @Firstname
ENDIF
]%%

Step 2: Validate Lookup calls with RowCount and check values with the Empty() or IsNull() functions.

For example:

%%[
IF RowCount(@rows) > 0 THEN
  /* code here */
ENDIF
]%%

2. Specify Default Content

In dynamic content, specify content in the Default Content field that is shown to a subscriber who doesn’t meet a personalization rule.

3. Cancel A Sent Email

Use the RaiseError function to suppress or cancel an email send. It is also useful for handling exceptions in the code when an unexpected result occurs.

%%[
IF Empty(@couponRow) THEN
   RaiseError("No coupons available", false)
ELSE
   SET @Code = Field(@couponRow, "CouponCode")
ENDIF
]%%
%%=v(@Code)=%%

4. Set A Specific Date Format

Use FormatDate() to display dates in the preferred format.

For example:

%%=FormatDate('2023-09-01 09:00','ddddd dd MMMM yyyy','HH:MM','en-US')=%%

5. Save A Default Time Format

By default, Salesforce Marketing Cloud uses Central Standard Time (CST) for dates. To convert a system date to your account's local time, use SystemDateToLocalDate().

For example:

%%=SystemDateToLocalDate('2014-11-06 05:00')=%%

6. Secure Subscriber Information

Use MicrositeURL or CloudPagesURL to secure subscriber information. These functions return the landing page URL appended with an encrypted query string containing the subscriber data.

7. Replace Variable Values

Use the Replace() function to replace one variable value with another. This function can be used to replace special characters in the email text.

8. Prevent Validation Errors

Functions like AttributeValue() prevent validation errors that would terminate an email send when a field is unavailable in a data extension.

For example:

%%[ var @myname 
set @myname = AttributeValue('FirstNameColumnName')
]%%

9. Unify Different Formats In Databases

Data might be in different formats within different databases. This can be resolved by using the ProperCase() function for properly capitalizing text or a variable.

For example:

Dear %%=ProperCase('john')=%%

10. Ensure Proper Code Maintenance

Debugging AMPScript in the Salesforce Marketing Cloud is not easy due to a lack of built-in features for showing script errors. But there are a few Trailblazer community ideas that developers can explore.

  • Try to wrap your code in a server-side JavaScript try/catch block.


%%[ your AMPscript block goes here  ]%%

  • Align the code with proper opening and closing script block tags.
  • Align if, else, and endif elements.
  • Wherever possible, use the variable names that match the send context variable. This helps in ensuring readability in complex scripting scenarios.
  • Add comments wherever required to make the code more understandable.

11. Maintain Code Performance

Salesforce Marketing Cloud follows a multitenant-based architecture. This means that any increase in the database usage can affect all users. Because of this, it is crucial to maintain the performance of the overall code.

  • Reuse the code by writing it in separate blocks.
  • Limit the number of lookup functions to ensure better performance in case of large data extensions.
  • Merge data extensions into one to improve send performance.
  • Avoid overusing functions that impact the send speed like:
    • HTTPGet (Or use cache option)
    • CreateSalesforceObject
    • RetrieveSalesforceObjects
    • UpdateSingleSalesforceObject
    • EncryptSymmetric
    • Lookup[Rows]
    • Update/Insert/UpsertDE
    • TreatAsContent

Conclusion

AMPscript unlocks the ability to craft intricate and highly responsive messages that aren't limited by pre-designed templates or user interface guides. By implementing AMPscript, you can enhance the capabilities of Salesforce Marketing Cloud and create personalized communications that appeal to your target audience.

Adhering to recommended best practices will not only improve the quality of the code but also accelerate development. Additionally, it facilitates seamless collaboration within your team.

Schedule a call to learn more about how you can utilize AMPScript to launch personalized email marketing campaigns.

Jan 15 2024
Jan 15

Discover the power of custom Drupal blocks as we delve into website customization in this article. Custom Drupal blocks offer personalized functionality, design, and content that set your website apart. Investing time in creating these blocks unlocks many benefits for Drupal website development.

To inspire your creativity, we will present real-world use cases demonstrating how custom Drupal blocks can elevate user experiences and deliver tangible results.

Are you ready to unlock the full potential of your Drupal website? Join us as we share the secrets of creating custom Drupal blocks, explore best practices, and walk through compelling use cases.

Importance of Drupal blocks in website customization and content management

Drupal blocks are essential components of the Drupal content management system, allowing website owners and developers to customize and organize the layout and content of their sites. These versatile building blocks play a crucial role in enhancing the user experience and functionality of Drupal websites.
With Drupal blocks, you can effortlessly create and display various types of content, such as:

  • navigation menus
  • promotional banners
  • social media feeds
  • contact forms


They provide a flexible framework for organizing and arranging information, enabling you to design dynamic and engaging pages.

The beauty of Drupal blocks lies in their adaptability. You can easily configure their visibility, control their placement, and assign specific permissions to determine who can view or manage them. This level of customization empowers you to tailor the user experience, ensuring that visitors see the right content at the right time and place.

Drupal blocks are not limited to built-in functionality; they can be extended by creating custom blocks. By leveraging the Drupal Block API, developers can design and implement blocks that align perfectly with their requirements. This opens possibilities for incorporating unique features, integrating third-party services, or implementing advanced functionality into your Drupal website.

Drupal blocks serve as the building blocks of a Drupal website. They allow creating, organizing, and displaying content in a visually appealing and user-friendly manner. Their role extends beyond simple content placement, empowering you to personalize the user experience, enhance website navigation, and create interactive elements.

Overview of Drupal Blocks Role

Drupal blocks allow administrators to customize the layout and presentation of content on a website. Blocks can be created and managed through the user interface or programmatically using code (see the short sketch after the list below). They can contain various types of content, such as text, images, videos, forms, menus, or custom code.

Blocks play several important roles in the Drupal CMS ecosystem:

  • Blocks help administrators organize and structure content within a Drupal site, improving navigation and user experience.
  • Blocks allow personalized content delivery by displaying targeted content or advertisements to different sections or user groups.
  • Drupal's modular architecture allows for creating custom blocks or integrating contributed modules, extending Drupal's capabilities and adding advanced features to blocks.
  • Blocks offer a flexible approach to theming and design, enabling developers to define regions within a theme's layout and administrators to place blocks accordingly.
  • Blocks can be displayed conditionally based on page URLs, content types, user roles, languages, or date ranges, allowing precise content targeting.
  • Drupal supports multilingual blocks, allowing administrators to create translations and display content in the user's preferred language.
  • Drupal includes a caching mechanism that stores rendered blocks in memory, improving website performance and scalability.
  • With blocks, administrators have granular control over layout, visibility, and contextual display, enhancing the user experience on Drupal-powered websites.
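
As a small illustration of the "programmatically using code" part mentioned above, here is a minimal sketch that renders an existing block plugin from code. The plugin ID system_powered_by_block is a core example used purely for illustration; any block plugin ID would work:

// Render a block plugin programmatically, e.g. from a controller or service.
$block_manager = \Drupal::service('plugin.manager.block');
$plugin_block = $block_manager->createInstance('system_powered_by_block', []);
// Respect the block's own access rules before rendering it.
if ($plugin_block->access(\Drupal::currentUser())) {
  $build = $plugin_block->build();
  // $build is a render array that can be returned from a controller
  // or placed inside a larger render array.
}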

Creating custom Drupal blocks: 8 Benefits

Creating custom Drupal blocks offers many benefits that enhance website functionality, design, and user experience. Here are the key advantages of utilizing custom Drupal blocks:

1. Tailored Functionality

Custom Drupal blocks allow you to add unique functionalities and features to your website tailored to your requirements. You can create blocks for features like contact forms, search filters, social media integrations, or any other custom functionality you need. This empowers you to provide a seamless user experience and fulfill specific business needs.

2. Design Flexibility and Branding

Custom Drupal blocks give you complete control over the design and layout of your website. You can create visually appealing blocks that align with your brand identity using custom styles, fonts, colors, and forms. This design flexibility ensures a cohesive and captivating visual experience for your visitors.

3. Personalized Content Presentation

With custom Drupal blocks, you can personalize the presentation of content based on user preferences, demographics, or other contextual factors. This enables you to deliver targeted content that resonates with your audience, increasing engagement and conversion rates. 

4. Improved User Experience

Custom Drupal blocks enhance the user experience by providing intuitive and interactive elements. You can create blocks that facilitate easy navigation, highlight important information, or incorporate interactive features like sliders, accordions, or tooltips. These elements engage users and make their journey on your website more enjoyable and effortless.

5. Content Reusability

Custom Drupal blocks follow a modular approach, allowing you to reuse blocks across multiple pages or sections of your website. This saves time and effort in content management, as you only need to update the block once for the changes to reflect across all instances. This promotes consistency, reduces maintenance overhead, and ensures efficient content management.

6. Lightning-Fast Performance

You can control and optimize the code for improved performance by creating custom Drupal blocks. You can streamline the block's code, minimize dependencies, and optimize assets to ensure faster loading times and a smooth browsing experience for your users.

7. Seamless Integration

Custom Drupal blocks seamlessly integrate with the broader Drupal ecosystem. You can leverage the vast library of modules and themes available to enhance the functionality and appearance of your custom blocks. This integration enables you to tap into a wealth of resources and expand the capabilities of your website.

8. Future-Proof Your Website

Custom Drupal blocks future-proof your website by allowing for scalability and adaptability. As your business grows and evolves, you can easily modify and extend your custom blocks to meet changing needs. Whether incorporating new technologies or embracing emerging trends, your website remains agile and ready to adapt, keeping you ahead of the competition.

The benefits are boundless, from tailored functionality and design freedom to an enhanced user experience and seamless integrations. 

What are the different types of blocks available in Drupal?

Here are some of the most commonly used block types in Drupal:

  1. Basic Blocks: Basic blocks allow you to add simple content, such as text, images, or HTML, and display them in designated regions of your theme.
  2. Views Blocks: Views blocks enable you to display dynamic lists or grids built with Views as blocks in various regions of your site. This lets you showcase recent articles, featured products, or customized content based on specific filters or sorting options.
  3. Menu Blocks: Menu blocks enable you to display menus as blocks in different regions of your site. This is useful when you want custom menu placements or multiple menus in other areas of your theme.
  4. Custom Blocks: Drupal provides the flexibility to create custom blocks using the administrative interface. Custom blocks allow you to add any content, including text, images, videos, forms, or custom code, and display them as blocks on your site.
  5. Plugin Blocks: Drupal's architecture allows for integrating contributed modules that provide additional block functionality. Examples include social media integration blocks, search blocks, or e-commerce-related blocks.
  6. Related Content Blocks: Some modules or themes in Drupal offer related content blocks that display related or similar content based on the currently viewed page or article. 

It's important to note that Drupal's modular architecture allows developers to create custom block types tailored to specific site requirements.

Best Practices for creating custom Drupal Blocks

Check out these best practices for creating custom Drupal blocks:

  1. Utilize Drupal's Block API for creating custom blocks rather than directly manipulating the database or hard-coding blocks in templates. The Block API provides a standardized and reliable way to create, configure, and manage blocks (a minimal block plugin sketch follows this list).
  2. In Drupal 7, custom blocks were defined and rendered through hooks such as hook_block_info() and hook_block_view(); in modern Drupal versions, blocks are implemented as plugins. Either way, use these mechanisms so that your custom blocks integrate seamlessly into Drupal's block management system.
  3. Create a custom module to encapsulate your custom block functionality. This ensures your code is organized, reusable, and easily maintainable. It also allows easy deployment and version control.
  4. Provide configuration options for your custom blocks. These could include settings for content, appearance, or other configurable parameters. With configuration options in place, site administrators can customize the behavior and appearance of blocks.
  5. Consider caching for your custom blocks to improve performance. Drupal provides caching mechanisms that cache block content for a specified period or until certain conditions change. Caching can significantly reduce server load and improve site performance.
  6. Adhere to Drupal's coding standards while developing custom blocks. This includes using proper naming conventions, indentation, commenting, and following appropriate code organization and structure practices. Following coding standards ensures consistency, readability, and ease of maintenance.
  7. Thoroughly test your custom blocks to ensure they function as intended and do not introduce conflicts or errors. Test them for various scenarios, including configurations, permissions, and user roles. Validate your code to ensure it follows Drupal's coding standards and has no security vulnerabilities.
  8. Document your custom blocks and their functionality for future maintenance and collaboration. Include comments and documentation within your code, explaining the purpose, usage, and any important considerations for your custom blocks.
  9. Design your custom blocks with accessibility in mind. Ensure that the blocks' content and functionality are accessible to users with disabilities, following accessibility guidelines and best practices.
  10. Keep your custom blocks and associated modules updated with the latest Drupal core and contributed module releases. Regularly check for updates and security advisories to ensure your custom blocks are secure and compatible with the latest Drupal ecosystem.
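
To make the Block API point above concrete, here is a minimal sketch of a custom block plugin. The module name mymodule and the class name HelloBlock are placeholders; the file would live at src/Plugin/Block/HelloBlock.php inside the custom module:

<?php

namespace Drupal\mymodule\Plugin\Block;

use Drupal\Core\Block\BlockBase;
use Drupal\Core\Cache\Cache;

/**
 * Provides a simple example block.
 *
 * @Block(
 *   id = "mymodule_hello",
 *   admin_label = @Translation("Hello block")
 * )
 */
class HelloBlock extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    return [
      '#markup' => $this->t('Hello from a custom block.'),
      // Cache the rendered block until something invalidates it, in line
      // with the caching best practice above.
      '#cache' => ['max-age' => Cache::PERMANENT],
    ];
  }

}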

Real-World Use Cases of Custom Drupal Blocks

Real-world use cases highlight the practicality and effectiveness of custom Drupal blocks in enhancing website functionality, personalization, and user experience. Here are the most useful real-world use cases of custom Drupal blocks and the types of websites that can benefit from them:

Content Publishing Websites

Websites that focus on publishing articles, blog posts, news, or other forms of content can use custom Drupal blocks for content promotion, related content recommendations, multimedia galleries, and customized menus.

E-commerce Websites

E-commerce sites can leverage custom Drupal blocks for product promotions, related product recommendations, customized forms for order processing or product inquiries, and personalized user dashboards.

Corporate Websites

Corporate websites can use custom Drupal blocks to showcase testimonials and reviews, display news and announcements, integrate social media feeds, and create customized forms for lead generation or contact purposes.

Community and Social Networking Websites

Community-driven or social networking websites can benefit from custom Drupal blocks for user profiles and social interactions, integration with external APIs for social media feeds, and customized menus for community navigation.

Event Websites

Websites focused on events, conferences, or meetups can utilize custom Drupal blocks to display event listings, registration forms, event calendars, and personalized user dashboards to manage registrations and event participation.

Educational Websites

Educational platforms can use custom Drupal blocks to showcase course recommendations, related learning materials, multimedia integration for educational content, and customized forms for student enrollment or feedback.

Nonprofit and Charity Websites

Nonprofit organizations and charities can leverage custom Drupal blocks to display donation campaigns, beneficiaries' testimonials, news and updates, and customized forms for volunteer sign-ups or donation processing.

Media and Entertainment Websites

Media and entertainment websites can benefit from custom Drupal blocks for multimedia galleries, related content recommendations, customized menus for navigation between different media categories, and advertisement placements.

Government Websites

Government websites can use custom Drupal blocks to display news and announcements, integrate with external APIs for weather updates or map data, and customize forms for citizen feedback or service requests.

Membership and Association Websites

Membership-based websites and associations can use custom Drupal blocks for personalized user dashboards, member directories, customized forms for membership applications or event registrations, and related content recommendations.

Have we successfully convinced you of the benefits that Custom Drupal blocks can bring to your website? We hope that you answered "yes" in your mind! Our skilled Drupal web developers can assist you with tailored customization and implementation. Don't hesitate to reach out to us for assistance.

Jan 12 2024
Jan 12

Understanding your users is crucial for organizational success. A robust feedback system gathers data (through reviews, surveys, and social media) to unlock valuable insights. This identifies key areas for product improvement and informs data-driven decisions that optimize future offerings.

The redesign of Airbnb, motivated by customer feedback, is an example of how well this principle works.

On May 3, 2023, Brian Chesky, the CEO of Airbnb, announced a significant platform redesign featuring over 50 improvements to enhance its customer experience. This initiative was driven by extensive user feedback gathered from diverse sources, such as:

  • Guest reviews
  • Host input
  • Surveys
  • Social media
  • Direct channels

Acting upon it enabled Airbnb to 

  • Understand user needs and trends
  • Improve functionality, usability, and platform performance
  • Prevent negative experiences and reduce operational costs
  • Increase host satisfaction and reduce guest complaints
  • Build trust and strengthen the brand's reputation

Airbnb's success hinges on its strong feedback system. To replicate these results, organizations need to solve challenges in setting up a strong feedback system. 

Quality Engineering (QE) offers valuable assistance in overcoming these hurdles.

Challenges Setting Up A Strong Feedback System

Data Processing

Data processing in feedback systems is about turning raw data into valuable insights by cleaning, classifying, and analyzing it. It's sorting and refining messy input to reveal hidden patterns and trends, a key ingredient for continuous improvement.

Lack Of Qualitative Data (Non-Numerical Information & Concepts)

Usability testing and user research can help build context by sharing insights on user motivations, behaviors, and pain points. 

Poor Data Quality

The QE team automates data quality tests integrated into the CI/CD pipeline. During product implementation, the team identifies anomalies in historical data through close collaboration and implements relevant scenarios into the test suite, ensuring comprehensive testing.

According to Gartner, each year, poor data quality causes organizations to lose an average of $12.9 million. Besides the direct impact on revenue, it complicates data systems and results in bad decisions over time.

Usability

Simple interfaces, clear instructions, and quick processes encourage more people to share their thoughts, providing valuable insights for improvements and better outcomes. The usability of a feedback system has to be top-notch.

Creating a Good Experience

Data about user actions and emotions can be used to create an empathy map. These maps guide design, ensuring each iteration resonates with real users, ultimately leading to delightful experiences. 

Feedback System Design

A good A/B test helps determine the preferred interface. Testers can assess user preferences by comparing variations and making data-driven decisions on visually appealing and impactful changes.

Netflix maximizes user engagement and revenue growth in the competitive streaming industry through a data-driven strategy. At the core of this approach is the extensive use of A/B testing. 

Netflix optimizes content recommendations and personalizes the user experience by continuously iterating and testing features and algorithms. 

This emphasis on A/B testing allows Netflix to refine its offerings based on real-time insights into customer preferences.

Scalability

Scalability in a customer feedback system refers to its ability to efficiently handle growing volumes of feedback data, users, and interactions. It involves seamlessly adapting and expanding the system to accommodate increased demand without compromising performance or the user experience.

Feedback Volume Surge

Conducting load testing by mimicking high user loads reveals performance bottlenecks within a system. This process aids in identifying areas for improvement, allowing optimization of server capacity to enhance overall performance and the user experience.

Security Concerns

A growing user base attracts more security risks. To protect the system, perform security audits, conduct penetration testing, and enforce strict access controls, ensuring robust defenses against potential vulnerabilities.

User Interface Responsiveness

Data surge poses the risk of slowing down the user interface. Frontend performance testing enhances code efficiency, reduces rendering delays, and ensures a responsive user experience.

Meta's AI training experienced bottlenecks due to the huge amount of data that slowed down their CPUs' data processing, which lagged behind the lightning-fast GPUs used for AI-model training. 

This and limited power budgets restricted the number of models they could run.

Meta implemented a comprehensive strategy to address these challenges, including building a new data ingestion infrastructure and last-mile transformation pipelines. A key component is the Disaggregated Data PreProcessing Tier (DPP), responsible for fetching, decoding, and transforming data for AI training. The DPP allows scalable data ingestion and independent scaling.

The optimizations led to a 35–45% improvement in the power budget required for data ingestion, enabling support for a growing number of AI models within power constraints.

Organizational Challenges

Organizational challenges in setting up a feedback system include securing stakeholder buy-in, defining clear objectives aligned with business goals, addressing potential resistance to change, and establishing a culture that values data-driven decision-making.

Stakeholder Buy-In

Generating a comprehensive report showcasing tangible benefits can positively influence stakeholder buy-in. Utilizing QE metrics to define SMART (Specific, Measurable, Achievable, Relevant, and Time-bound) goals for the feedback system is also convincing.

Fostering A Data-Driven Culture

QE values data as a learning tool for guiding informed decisions and ensuring inclusive products. With clear goals, shared celebrations, and stakeholder involvement, QE sets the stage for a data-driven culture. 

A Short Tale:

When it comes to the Cost Per Ticket (CPT), IT companies rank among the most expensive. The CPT value typically ranges from $25 to $35, but it can reach up to $100 depending on operational factors.

In a scenario with 500 monthly support tickets at $35 each, a 20% reduction through efficient QE results in 400 tickets.

Monthly cost savings amount to $3,500 ($17,500 - $14,000).

Data Privacy, Security, & Compliance

Establishing a feedback system involves navigating data privacy, security, and compliance challenges. Safeguarding user information, implementing robust security measures, and ensuring compliance with relevant regulations are critical aspects of creating a trustworthy and reliable feedback platform.

Identifying Vulnerabilities 

Security testing identifies vulnerabilities and establishes robust encryption protocols for secure customer data handling. Regular audits validate compliance with data protection regulations, ensuring the integrity of the feedback system.

Protecting Users' Privacy

Collecting only essential information, implementing secure storage with encryption, employing access controls, being transparent with users, complying with privacy regulations, and conducting regular audits contribute to privacy.

A 2023 survey by Deloitte claims that 50% of customers think that the benefits they get from online services are not worth worrying about data privacy. This highlights the importance of organizations showcasing their commitment to robust data privacy measures.

Businesses must clearly communicate their data practices, secure handling, regular audit details, and other security measures through various mediums. 

How Canva Made Their Customers and Team Happy with QE Approach 

Canva strategically enhanced its quality model despite resource constraints and tight deadlines. 

Their new Quality Assistance model embraces a proactive "shift-left" approach that involves addressing potential challenges at the early stages of development to prevent them from escalating. 

They leveraged data-driven decisions to optimize their quality process, which included:

  • Prioritizing testing for the most-used features, which minimized critical bug encounters and boosted overall satisfaction.
  • Focusing on code coverage improved test efficiency, reduced reworks, and increased release confidence.
  • Balancing technical debt with new features, preventing future issues while preserving intuitive user experiences.
  • Understanding user behavior helped test high-traffic areas, reducing user friction points.

As a result, their team achieved better test coverage, reduced code rework, and increased confidence in feature releases, resulting in zero incidents in the past three months.

If you're eager to delve deeper into how QE can elevate your systems and processes, why not schedule a call with one of our QE experts?

Jan 12 2024
Jan 12

LocalGov Drupal is a CMS created by councils for councils. It is the perfect example of how, by pooling their resources and sharing code, local governments can save money and have brilliant websites that better serve their communities.

It's a project that means a lot to us at Annertech, and we have worked with many councils, designing new websites, creating new features for them, and then experiencing the thrill of these new features being made available to other councils to use.

We've been working with LocalGov Drupal (LGD) for a few years now and are proud of our contributions to the project. The past few months have been really productive for the Annertech team's LGD offering – we've developed new features, created new tutorials, and more.

Open Digital Cooperative

Our Director of Development Mark Conroy has been involved with LocalGov Drupal since the project was in its beta phase. Mark is the project’s front-end lead, developing the theme and working with many councils to ensure that their websites look the way they want them to look.

Towards the end of 2023, Mark was elected to the board of Open Digital Cooperative, the body that oversees the LocalGov Drupal project. He is joined by:
 

  • Will Callaghan, Friendly Digital
  • Finn Lewis, Open Code
  • Maria Young, Agile Collective
  • Kate Hurr, Cumberland Council
  • Michael Brown, Newcastle Council
  • Jamie Dixon, Wirral Council
Jan 12 2024
Jan 12

AI is all around us. It's an undeniable part of our personal and professional lives. Isn't it time we put it to work for us? 

That’s exactly what our "AI in Action" panel session at Acquia Engage was set to explore. Every organization will implement AI in their own way, but the outcome is the same: empowering marketers and developers.

We were having a blast at the TAG booth, so no panel pic, but check out the Engage main stage. (Photo credit: Acquia.com)

About the panelists

Moderated by Matthew Gonnering, GM Content Cloud at Acquia, the expert panel featured:

Presentation highlights

A few key points that stood out from the presentation include:

Bring your own data, build your own model

One of the biggest use cases of AI is to improve internal sales and marketing processes. Recent feature releases from Open AI are making this possible. Justin Emond commented, “we're getting closer to finding good, easy ways for organizations to spin up custom versions of ChatGPT with internal data.” He predicted, “that's where we're going to start to see interesting value.” 

This is already a reality in Eric Williamson’s world. Every marketer knows all too well the pain of a long email thread requesting links to sales content. Eric shared how his team uses ChatGPT to tackle this challenge. They built a help bot for sales, taking the sting out of Seismic document searches. 

A CMO’s take on scaling content with AI

Eric also highlighted how generative AI can revolutionize content production, an often resource-intensive area. "I oversee the CallMiner marketing team. We're utilizing generative AI in sort of a test mode. Our content team has doubled their volume, using it as a starter kit for first drafts," said Eric. 

Creating engaging content at scale can be a daunting task. By using AI for routine content creation, marketing teams can focus their efforts on strategy and creative direction, ensuring that every piece of content serves a purpose and drives engagement.

Watch your competition 

“Don't worry about AI. Worry about what your competitors are doing with AI.”

Moderator Matthew Gonnering shared the above quote, sparking conversation about how AI can help marketers rise above the competition. 

Eric offered an insightful analogy on what it takes to win: “I think of artificial intelligence like an ingredient,” he said. “It’s ubiquitous. What makes the difference between McDonald's and a five-star restaurant is how those ingredients are used. The guardrails you put around it, the data that you have ChatGPT tapping into, and the workflows that you put into it – that's what's going to help you beat the competition.”

…but don’t overestimate AI’s competitive edge

“I think of AI as the next spreadsheet,” said Justin. “The first spreadsheet, VisiCalc, was introduced in 1979. Until then, no one in human history was analyzing more than a hundred rows of data. Think about that. They built and ran the Roman Empire and they couldn't analyze more than a hundred rows of data until 1979.” 

And now? There are even more accountants today than there were in 1979. “It’s not a strategic advantage if everybody has it,” Justin remarked.

Learn fast and leave things better than you found them

Keep exploring and learning, but choose your sources wisely. Justin shrugged off predictive reports: "If it’s from McKinsey or Accenture, I skip those. They’re great at predicting futures that never happen,” he said.

To fully capitalize on AI’s potential, marketing leaders must foster a culture of innovation within their teams. Deanna Ballew encouraged experimentation with AI tools: "Issue a challenge for your team. Put out a quarterly OKR to go find five applications on how you're going to do this. Go try it out and start bringing back ideas." 

The panel also addressed AI ethics, including the White House's recent Executive Order on AI. Deanna urged marketers to prepare for the future. She advised, "At this point, as a marketer, you are accountable to executive action. Start thinking about that now. Have your guidelines now. Because when that day comes, and maybe you'll have it in Europe, the United States, you're going to have to be considerate."  

Where to next?

We hope that the key highlights of the "AI in Action" panel will provide marketers with valuable insights to help them decide whether or not to integrate AI into their upcoming work priorities.

Planning for the coming year involves more than just marveling at AI’s magic—it’s about using it wisely. As Justin reflected, “We’re getting lost in how magical AI seems…but it’s just a bit of interesting tech.” So, it’s time to get practical. It’s about making AI work for you strategically—nailing data use, refining workflows, and keeping things ethical. 

A special thanks to Deanna, Eric, and Matthew for sharing their time and expertise.

Jan 11 2024
Jan 11

We're thrilled to share that the Sovereign Tech Fund (STF), based in Germany, has generously entrusted the Drupal Association with a $300,000 USD service contract for work done to benefit the public. This funding is set to fuel two crucial projects that promise to strengthen security for Drupal and enhance the Drupal ecosystem.

The Sovereign Tech Fund (STF) supports the development, improvement, and maintenance of open digital infrastructure in the public interest. Its goal is to strengthen the open source ecosystem sustainably, focusing on security, resilience, technological diversity, and the people behind the code. STF is funded by the German Federal Ministry for Economic Affairs and Climate Action (BMWK) and hosted at and supported by the German Federal Agency for Disruptive Innovation GmbH (SPRIND).

The Drupal Association, along with the Drupal community, supports Drupal with core support, community support, flagship programs, and new innovation. The Drupal Association is a unicorn in the software sector in terms of structure and true community, and is a leader for open source collaboration and an open web.

Project 1: Developer Tools Acceleration

This project will optimize GitLab CI, streamline user authentication with Keycloak, migrate Drupal contribution credits from the old issue queue to a new GitLab integration, create a seamless opt-in process for Drupal.org hosted projects to transition to GitLab issues, and develop an accessible learning guide. The guide will be a valuable resource for project maintainers looking to shift from Drupal.org's custom tooling to GitLab.

Project 2: Community Supply Chain Security

This initiative aims to enhance the security of the Drupal ecosystem by securing the signing prototype, conducting a third-party security audit of the PHP-TUF client and Rugged server, and performing a third-party security audit of the Drupal integration code. Additionally, the project will deploy secure signing in a production environment, further bolstering the security measures in place.

This funding aligns perfectly with the Drupal Association's strategic priorities. It enables us to make significant strides towards our goals, particularly in terms of optimizing our workflows through GitLab and enhancing our security measures with secure signing. Both projects will conclude before 31 March 2024.

The partnership with STF allows us to make a positive difference in the Drupal community and advance the open source platform for all users. We are grateful to the Sovereign Tech Fund for their generous support. Their funding shows dedication to open source and their belief in the Drupal Association and the community's ability to innovate and ensure the future of web development.

Jan 11 2024
Jan 11

Hey nonprofit Drupal users! The DA is interested in supporting community-driven content that is specifically relevant to nonprofit organization staff and related agencies at DrupalCon North America in Portland, Oregon, at the Nonprofit Summit on May 9, 2024.

We are looking for volunteers who would be interested in giving back to the community by contributing some subject matter expertise via a day of informal breakout sessions or other group activities. We are open to ideas!

Who are we looking for?

Do you have some Drupal expertise or a recent experience with a Drupal project that you would like to share with others? Is there something about Drupal that you think is really cool that you would love to share with the nonprofit Drupal community?

What’s required?

You will not be required to make slides! You don’t need to have lots of (or any) speaking experience! All you need is a willingness to facilitate a discussion group or engaging activity around a particular topic, and some expertise or enthusiasm for that topic that you wish to share. 

How to Submit an Idea or Topic

Please fill out this form by February 13th and we will get back to you as soon as we are able. Thank you! https://forms.gle/MJthh68rsFeZsuVc8

Discussion leaders will be selected by the Nonprofit Summit Planning Committee and will be notified by the end of February. 

Questions? 

Email [email protected].

Jan 11 2024
Jan 11

Introduction

Adopting the cloud-native approach can be expensive when you manually scale up and down your resources. Users may also face frequent service failures due to a lack of resources for handling the load.

Monitoring Kubernetes workloads and utilizing an autoscaling option can help solve these challenges.

What Is Kubernetes Monitoring?

Kubernetes can be monitored by keeping tabs on several metrics. These metrics play a crucial role in configuring dashboard activities and alerts, providing valuable insights into both the Kubernetes system and the applications operating within it.

The Kubernetes monitoring metrics can be sourced from various providers, such as cAdvisor, Metrics Server, Kubernetes API Server, and Kube-state-metrics.

Types Of Kubernetes Monitoring

There are also different types of Kubernetes monitoring.

  1. Cluster Monitoring

    The Kubernetes cluster functions as the central host for all containers and the machinery executing applications. For effective container management, it’s important to oversee the environment and the health of all cluster components, like:

    • Cluster Nodes

    Within a cluster, nodes facilitate the execution of applications through several resources. It is crucial to monitor and observe the health of these resources. Worker nodes are responsible for hosting containers, while master nodes oversee the activities of the worker nodes.  

    • Cluster Pods

    A pod, the smallest unit within a cluster, is composed of one or more containers. The quantity of active pods directly influences the number of nodes required. Monitoring the health and resource utilization of pods is essential for effective Kubernetes oversight.

    • Resource Utilization

    Gaining insights into resource utilization metrics helps understand the capabilities and limitations of cluster nodes, aiding in the assessment of sufficiency and redundancy. Some essential resources to track are disk utilization, memory utilization, CPU utilization, and network bandwidth.

  2. Pod Monitoring

    Pods, composed of containers deployed on nodes, form a fundamental component of the Kubernetes ecosystem. It is necessary to monitor pods by evaluating the following metrics.

    1. Container Metrics

      Comprehend and manage the number of containers within a pod, along with their lifecycle. Strive to prevent pod overload and optimize for scalability.

    2. Application Metrics

      Performance metrics for applications gauge performance levels and furnish industry-specific data. These metrics provide valuable insights into traffic, the frequency of unsuccessful requests, request durations, and feature utilization.

    3. Kubernetes Scaling And Availability Metrics

      Comprehending Kubernetes' scaling and availability is important for configuring auto-scaling tools within clusters. The node requirements are influenced by the number of containers or pods present in a cluster.

    4. Load Average

      Load average reflects the count of programs in execution or waiting to be executed on the CPU. It's essential to keep it within the limit of the number of CPU cores. For effective troubleshooting, monitor load average in conjunction with sys CPU usage and I/O wait.

    5. Resource Requests And Limits

      Containers come with designated resource requests and limits for CPU and memory. It’s crucial to handle these efficiently to prevent either underutilization or overutilization. Strive for a target of approximately 80% actual usage at the 90th percentile for both resource requests and limits; a minimal manifest sketch follows this list.
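
As a minimal sketch of where these settings live (the pod name, image, and values below are placeholders, not recommendations), a pod manifest defines requests and limits per container:

apiVersion: v1
kind: Pod
metadata:
  name: demo-app                  # hypothetical pod name
spec:
  containers:
    - name: demo-container
      image: nginx:1.25           # example image only
      resources:
        requests:
          cpu: "250m"             # amount the scheduler reserves for the container
          memory: "256Mi"
        limits:
          cpu: "500m"             # hard ceiling enforced at runtime
          memory: "512Mi"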

Types Of Autoscaling Options In Kubernetes

In Kubernetes, a cluster is a group of machines that execute containerized applications. At its core, a cluster contains a set of nodes and a control plane. The control plane is responsible for preserving the desired state of the cluster, including the specific applications running and the associated images.

On the other hand, nodes represent the virtual or physical machines responsible for executing applications and workloads, also known as pods. These pods are composed of containers that request computational resources like CPU, Memory, or GPU.

Kubernetes Master and Worker Nodes

Pod-Based Scaling

Pod-based scaling is the ability of applications or services in a containerized environment to scale by modifying the number of pods. In Kubernetes, a pod is the smallest deployable unit that houses one or more containers running together on a node. Containers within a pod share the same network namespace and can communicate with each other using localhost.

  • Horizontal Pod Autoscaling

Horizontal scaling involves adjusting the computational resources within an existing cluster, such as adding new nodes or increasing pod counts. This can be done by using the Horizontal Pod Autoscaler (HPA) to raise the replica count.

Horizontal Pod Autoscaler

The Horizontal Pod Autoscaler (HPA) is a tool designed to dynamically adjust the number of pods in a cluster based on the current computational workload demands of an application. It assesses the required number of pods using user-defined metrics, typically CPU and RAM usage, but it also supports custom metrics.

The HPA continuously monitors CPU and memory metrics provided by the installed metrics server in the Kubernetes cluster. When a specified threshold is reached, the HPA initiates the creation or deletion of pods to maintain the desired number based on the set metrics. This involves updating the number of pod replicas within the deployment controller.

Consequently, the deployment controller scales the number of pods up or down until it aligns with the desired count. If custom metrics are preferred to dictate scaling rules for pods through the HPA, the cluster needs to be connected to a time-series database to store the relevant metrics. Horizontal Pod Autoscaling cannot be applied to objects that cannot be scaled, such as DaemonSets.
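
As a minimal sketch of an HPA definition (the names, replica bounds, and target utilization below are placeholder values, not recommendations):

apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: demo-app-hpa              # hypothetical name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: demo-app                # the workload whose replica count is adjusted
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70  # scale out when average CPU utilization exceeds 70%

Once applied, the HPA keeps the replica count between the configured bounds based on the observed metrics.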

  • Vertical Pod Autoscaling

Vertical scaling involves adjusting the inherent resources, such as CPU or RAM, for each node within the cluster. Typically, this entails creating a new node pool using machines with varying hardware configurations.

Vertical Pod Autoscaler

In the context of pods, vertical scaling dynamically tunes the resource requests and limits based on the present application's needs. This process is facilitated by the Vertical Pod Autoscaler (VPA).

The Vertical Pod Autoscaler (VPA) allocates the necessary CPU and memory resources to existing pods, modifying the available computational resources for an application. This functionality ensures effective monitoring and adaptation of each pod's allocated resources throughout its lifecycle.

Accompanied by a tool named VPA Recommender, the VPA assesses current and past resource consumption data to suggest optimal CPU and memory allocations for containers. The VPA doesn't directly update resource configurations for existing pods. Instead, it identifies pods with incorrect configurations, terminates them, and allows their controllers to recreate them with the recommended settings.
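
For illustration, a VPA object (which requires the separately installed VPA components, including the Recommender) might look like the following sketch; the names and update mode are example values:

apiVersion: autoscaling.k8s.io/v1
kind: VerticalPodAutoscaler
metadata:
  name: demo-app-vpa              # hypothetical name
spec:
  targetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: demo-app                # the workload whose pods get resized
  updatePolicy:
    updateMode: "Auto"            # evict pods and recreate them with recommended requests

With updateMode set to "Auto", the VPA evicts pods so their controller can recreate them with the recommended requests, matching the behaviour described above.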

In scenarios where both HPA and VPA are utilized simultaneously to manage container resources, conflicts may arise if they rely on the same metrics. This simultaneous attempt to address the situation can lead to incorrect resource allocations.

Coexistence is possible if HPA and VPA operate on different metrics. For example, if VPA utilizes CPU and memory consumption for precise resource allocation, HPA can be employed with custom metrics.

Node-Based Scaling

The Kubernetes Node Autoscaler complements the Horizontal and Vertical Pod Autoscalers by facilitating the scaling of cluster nodes in response to the number of pending pods. The Cluster Autoscaler (CA) regularly examines if there are pending pods and adjusts the cluster size accordingly.

It also efficiently deallocates idle nodes to maintain the cluster at its optimal size. In cases where resource limits are specified, the Node Autoscaler can initiate the deployment of new nodes directly into the pool, adhering to the defined resource constraints.

  • Cluster Upscaling

If pods are slated for execution and the Kubernetes Autoscaler identifies a potential resource shortage, it can dynamically increase the number of machines in the cluster. The diagram below provides a visual representation of how the cluster can undergo automatic upscaling:

Flow of Cluster Upscaling

The scenario depicted involves two pods scheduled for execution, but the current node's compute capacity has been reached. The cluster autoscaler systematically scans all nodes to assess the situation and triggers the provisioning of a new node under the following conditions:

  • Some pods have failed to schedule on existing nodes due to insufficient available resources.
  • The addition of a node with specifications identical to the current ones aids in redistributing the workload.
  • The cluster has not reached the user-defined maximum node count.

Once the new node is deployed and detected by the Kubernetes control plane, the scheduler assigns the pending pods to the newly added node. If pending pods remain, the autoscaler repeats these steps.

  • Cluster Downscaling

The Kubernetes Cluster Autoscaler reduces the count of nodes within a cluster when some nodes are deemed unnecessary for a predefined duration. A node is considered unnecessary if it exhibits low utilization, and all critical pods residing on it can be relocated to other nodes without causing a resource shortage.

The node scale-down evaluation considers the resource requests specified by the pods. If the Kubernetes scheduler determines that the pods can be relocated, it eliminates the node from the cluster to enhance resource utilization and minimize costs.

In cases where you have set a minimum threshold for the number of active nodes in the cluster, the autoscaler refrains from decreasing the node count below the specified threshold.
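
As a rough sketch of how these behaviours are tuned (flag names follow the upstream cluster-autoscaler project; the cloud provider, node-group name, and values are placeholders, and the exact setup varies by provider), an excerpt of a cluster-autoscaler container spec might look like:

containers:
  - name: cluster-autoscaler
    image: registry.k8s.io/autoscaling/cluster-autoscaler:v1.28.0   # example tag
    command:
      - ./cluster-autoscaler
      - --cloud-provider=aws                      # assumption: an AWS-backed cluster
      - --nodes=2:10:my-node-group                # min:max:name of the node group
      - --scale-down-utilization-threshold=0.5    # node counts as underutilized below 50%
      - --scale-down-unneeded-time=10m            # how long a node must stay unneeded before removal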

Autoscaling Best Practices In Kubernetes

There are some recommended practices for achieving seamless workload scaling in Kubernetes.

  1. Use An Up-To-Date Version Of The Autoscaler Object

    Kubernetes undergoes frequent updates with the addition of new features. It is advisable to use an autoscaler version that is compatible with your Kubernetes control plane version. This ensures that the cluster autoscaler effectively emulates the Kubernetes scheduler.

  2. Keep Requests Close To The Actual Usage

    Efficient scaling operations by the cluster autoscaler depend on accurate pod resource provisioning. Overprovisioning pod resources may lead to inefficient resource consumption or lower node utilization.

    To enhance performance, cluster administrators should align pod resource requests with historical consumption statistics. This helps ensure that each pod's resource requests closely match its actual usage trend.

  3. Retain Node Groups With Similar Capacity

    The cluster autoscaler assumes uniform memory and CPU resource capacity for every node within a node group. It creates a template node on which all cluster-wide scaling operations are performed. To ensure accurate performance, it is recommended to have node groups with nodes that share the same resource footprint.

  4. Define Resource Requests And Limits For Each Pod

    The autoscaler relies on node utilization and pod scheduling status for scaling decisions. Missing resource requests for pods can impact the calculation of node utilization, leading to suboptimal algorithm functioning. To ensure optimal scaling, administrators should define resource requests and limits for all pods running on a node.

  5. Specify Disruption Budgets For All Pods

    Kubernetes supports defining pod disruption budgets (PDBs) to limit voluntary disruptions of workload replicas, such as node drains during scale-down, and prevent losses. Administrators should define disruption budgets to maintain a minimum threshold of available pods. This ensures that the autoscaler manages cluster services without violating the defined budget; a minimal example follows this list.
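
A minimal PodDisruptionBudget sketch, assuming a hypothetical app label; the name, selector, and threshold are placeholders:

apiVersion: policy/v1
kind: PodDisruptionBudget
metadata:
  name: demo-app-pdb          # hypothetical name
spec:
  minAvailable: 2             # keep at least 2 replicas running during voluntary disruptions
  selector:
    matchLabels:
      app: demo-app           # must match the labels on the protected pods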

Conclusion

Navigating the complexities of Kubernetes autoscaling requires a strategic approach for optimal performance. Axelerant, with its expertise in cloud-native solutions, can help you navigate the intricacies of autoscaling in Kubernetes.

Whether it's Horizontal Pod Autoscaling (HPA), Vertical Pod Autoscaling (VPA), Node Autoscaler, or Cluster Autoscaler, our digital engineering experts ensure seamless integration tailored to your specific needs.

Schedule a meeting with our experts today and embark on a journey towards efficient, cost-effective, and automated Kubernetes scaling.

Jan 11 2024
Jan 11

The Gutenberg Editor is a powerful page building tool for Drupal that utilizes a block building system to create and edit content.

This comprehensive guide walks you through each step of setting up and using the Gutenberg Editor.

From downloading and installing the Gutenberg module, to enabling it on your content types, and finally using it to create your content using blocks, this guide has you covered.

To get the most out of this guide, follow the video above.

Jan 11 2024
Jan 11

Authored by: Nadiia Nykolaichuk.

Multimedia content, such as engaging videos, insightful podcasts, and vibrant images, is meant to captivate, inform, and entertain website users. However, while creating an immersive world of multimedia experiences, it’s necessary to be mindful of the people with a wide range of impairments who need alternative ways to perceive the content. 

When it comes to images, alt text can tell users what the specific image conveys. But for video and audio content, the mission passes on to subtitles, captions, transcripts, and audio descriptions. Using them is one of the best practices for creating accessible content, so they are up next as the main characters of Part 5 in our series on essential accessibility elements. You’ll note that we’ve also paid particular attention to how they can be used on Drupal websites. 

Introduction to subtitles, captions, transcripts, and audio descriptions

Subtitles, transcripts, captions, and audio descriptions ensure that multimedia content can be understood and enjoyed by a diverse audience. They provide textual support for those who have difficulty hearing sounds and, conversely, an audio version of visual content for those who have vision impairments. In addition, they can significantly improve comprehension, which might be helpful for users with cognitive issues.

While web accessibility is the primary goal of these elements, their unique ability to make content more understandable can also be super useful for educational purposes or multilingual experiences. That’s because the information can be perceived in various formats and provided in different languages.

Subtitles

Subtitles are a textual representation of spoken dialogue in a video file. They are usually displayed at the bottom of the screen (however, this may vary) and synchronized with the audio. Subtitles ensure that everyone, regardless of their hearing ability, can follow and comprehend the dialogue or narration.

Captions

Captions are similar to subtitles but also include information about non-verbal elements such as music, various sound effects, speaker identification, and more. This provides a more comprehensive representation of the sounds in a video. In the context of captioning platforms and tools, the terms “captions” and “subtitles” are often used interchangeably.

There are two main types of captions: open captions and closed captions. 

  • Open captions are permanently embedded in the video file. They cannot be turned off or customized by the user. They are part of the video itself and are always displayed when the video is played.
  • Closed captions are captions that can be turned on or off by the user based on their preferences. Closed captions offer more flexibility, enabling viewers to customize their experience.

Transcripts

Transcripts are a comprehensive written record of all spoken words and other relevant audio elements in a video or audio file. They offer a textual representation of the entire content, including dialogue, narration, and non-speech sounds. Unlike subtitles and captions, which are designed to run synchronously with the media playback, transcripts are often provided as a standalone document. Transcripts are beneficial for a broad range of users, including those with auditory impairments, those with cognitive disabilities, and people who prefer reading over listening (or reading while listening).

Audio descriptions

Audio descriptions, also known as video descriptions or described video, are narrated descriptions of the visual elements in a video. These descriptions provide information about actions, scenes, and other visual content that may not be apparent through dialogue or sound effects. Audio descriptions are typically provided as a separate audio track or file. Users can choose to enable or disable audio descriptions based on their preferences.

Examples of popular tools for creating accessible multimedia

  • YouTube Studio

YouTube Studio enables content creators to add subtitles and closed captions to their videos in various languages, or edit existing ones. You can enter the text manually or upload a file and sync it with the video. YouTube also offers an automatic captioning feature that uses machine learning algorithms.

Managing video captions in YouTube Studio.
  • CapScribe 3

CapScribe 3 is a tool for adding captions and audio descriptions for video content. The caption mode provides various editing features and controls. Voice recordings can be live or use various synthetic TTS (text-to-speech) voices.

Managing video captions in CapScribe 3.

  • Audacity

Audacity is an open-source audio editing app. While it doesn’t primarily specialize in creating audio descriptions, it can be used to record and edit audio files, including those containing audio descriptions.

  • Descript

Descript is a platform for creating and sharing videos and podcasts. It is equipped with an innovative tool that automatically transcribes spoken content into written text in 22 languages with flexible editing features. The platform also converts text to speech using AI. 

Managing video captions in Descript.
  • Amara Subtitling

Amara is a cloud-based platform that offers subtitle and caption creation services. It supports collaborative editing, providing customizable workflows for review and final approval of subtitles, which is available in premium plans.

  • Otter.ai

Otter.ai is an AI-powered transcription tool that provides real-time transcription during meetings. It can record audio, write notes, capture action items, and generate summaries. While primarily used for meetings, it also supports the upload and transcription of pre-recorded multimedia files.

  • Rev

Rev provides speech-to-text services such as the creation of captions, transcripts, and subtitles. The options include human transcription services from freelancers across the globe and AI-powered transcription. Rev’s speech-to-text APIs can be built into applications. 

  • Happy Scribe

Happy Scribe provides both human and AI-powered transcription for audio files and caption generation for video files. It supports 60+ languages.

  • Subtitle Edit 

Subtitle Edit is a free, open-source software that enables you to create, adjust, synchronize, and translate subtitle lines. The tool supports 300+ subtitle formats with possible conversion between them.

Managing video subtitles in Subtitle Edit.

  • Aegisub

Aegisub is another open-source tool specifically designed for creating and editing subtitles. It enables you to translate subtitle files from one language to another, sync a translated script to an audio, put subtitles on a video, correct a subtitle that’s not properly synchronized to the video, and more.

Multimedia accessibility in Drupal

If you have a website built with Drupal, it might be interesting for you to see some ideas and examples of how subtitles, captions, transcripts, and audio descriptions can be used in Drupal for creating accessible video and audio content.

A note on Drupal’s Media system

First off, Drupal core has an amazing Media system that provides consistent and user-friendly ways to use multimedia on your website. The Drupal Media system provides 5 built-in media types: Remote video, Video, Audio, Document, and Image. You can store multimedia in the Media Library to reuse at any time. 

Using the Media system, you can add video and audio items to your website in the following ways:

  • embed them directly in CKEditor with the help of the Media Library button
  • attach them to content via standalone Media fields

You can find more specific how-tos in our guide to using Drupal’s Media and Media Library.

Pulling remote multimedia with subtitles or captions

Perhaps the most seamless way to add videos with subtitles or captions to a Drupal website is by embedding them directly from third-party multimedia hosts. That’s what is provided by the Remote video media type in Drupal, which is different from the other media types because it doesn’t require file uploads. 

The Remote video media type relies on the modern oEmbed format, so third-party hosts that support it are referred to as oEmbed providers. All you need is to place a link to the multimedia item in Drupal and it gets embedded into your Drupal content pages either inline in CKEditor or via fields. Videos will play just like they do in their original hosts — together with the subtitles or captions they come packed with. 

As you can see in the example below, a video from YouTube shows captions and offers controls to turn the captions on and off, which is also important for accessibility. Namely, there is the “CC” (“Closed Captions”) button in the bottom right corner, provided by YouTube.

A YouTube video with captions embedded in Drupal’s CKEditor.

If you are embedding videos directly in CKEditor, like in the example, you can adjust the dimensions for the embedded video. These settings are available in Structure > Manage Display > Remote video. Just open the gearbox next to the field for the remote URL and set the maximum width and the maximum height in pixels.

Defining the dimensions for videos embedded in Drupal’s CKEditor.

Drupal has out-of-the-box support for YouTube and Vimeo as oEmbed providers, but it’s possible to add others. This is where the oEmbed Providers contributed module comes in handy. It provides a user-friendly interface for managing providers, including options to add custom ones.

Adding an oEmbed provider to Drupal.

A wide range of oEmbed providers can work with Drupal (Soundcloud, Spotify, Flickr, Dailymotion, TED, and many others). The oEmbed standard is designed to support various types of media, not just video. That’s why the Remote video media type in Drupal, despite its video-specific name, can also include audio or other types of multimedia. The embedded content inherits the multimedia accessibility options available on the original platforms, so it’s worth checking their specific features out.

Adding transcripts and audio descriptions to multimedia

As transcripts and audio descriptions are commonly provided as standalone items and don’t need to run synchronously with the multimedia playback, adding them is simpler. The most straightforward way is to add a field of the corresponding format to your content type or media type for attaching transcripts or audio description files.

Attaching a transcript via the file field on the content editing form.

Based on your website's design and user experience goals, you can use numerous other solutions for placing transcripts and audio descriptions. For example, you can use an accordion tab, a show/hide toggle, dynamic loading upon request, and so on.

Adding non-remote multimedia

As for multimedia of the Video and Audio media types in Drupal, these items are not pulled from third-party services but need to be uploaded to the website. Just like remote video, local video and audio can either be embedded in CKEditor or added to content via fields.

Local Video and Audio media items can be equipped with captions and subtitles as well. If the multimedia items already contain embedded captions or subtitles, they can be uploaded to Drupal with them. Otherwise, additional steps will be needed like adding a captions file and making sure it’s properly associated with the multimedia item.

The good news is that this might change in Drupal 11. The Drupal community understands the importance of multimedia accessibility, so special work is being done on providing authors with tools to manage transcripts and captions/subtitles for the Video and Audio media types. As part of these improvements, the audio and video file field formatters are planned to be updated to include track elements.

While the out-of-the-box support is being worked on, some of the below-described options might be interesting to check out if you are planning to add non-remote video or audio. 

Using HTML5 tags

It’s possible to apply some customization that involves the use of HTML5 tags. The HTML5 <video> element is used to display a video on a web page. It supports the special <track> tag, which is helpful for managing files with subtitles, captions, or other text that needs to be displayed while the media is playing. You can specify the kind of text track (subtitles, captions, etc.), the language, and the format (WebVTT).

Using HTML5 tags for video captioning.
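
As a minimal sketch of this approach (the file paths, language, and label below are placeholders, not taken from the original article), the markup could look like:

<video controls width="640">
  <source src="/sites/default/files/intro.mp4" type="video/mp4">
  <!-- WebVTT captions track; kind could also be "subtitles" or "descriptions" -->
  <track src="/sites/default/files/intro-captions-en.vtt" kind="captions" srclang="en" label="English" default>
  Your browser does not support the video element.
</video>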

In addition, it’s possible to enable editorial teams to add files with subtitles or captions for multimedia via the corresponding fields that can be added to content structure. 

Third-party integration

Another option is to integrate third-party media players or captioning/transcribing platforms directly with your Drupal website. This can be done either with the help of contributed media player modules or through custom integration.

Final thoughts

Mentioned above are just some key points on how captions, subtitles, transcripts, and audio descriptions can be used in Drupal. The choice of a specific implementation depends on the type of multimedia you have, how often it is added to your website, and many other factors. It’s great to know that Drupal’s multimedia accessibility keeps growing, so there’s surely an optimal solution for your website.

With this fifth article, we are wrapping up our series on accessibility elements, though, of course, we are aware that accessibility encompasses much more. Hopefully, this has inspired you to improve your website’s accessibility, and, with that, contribute to creating a more inclusive digital environment for everyone. As always, our development team will be happy to give you a hand.

Jan 10 2024
Jan 10

The greatest advantage of Symfony Messenger is arguably the ability to send and process messages in a different thread almost immediately. This post covers the worker that powers this functionality.

This post is part 3 in a series about Symfony Messenger.

  1. Introducing Symfony Messenger integrations with Drupal
  2. Symfony Messenger’s message and message handlers, and comparison with @QueueWorker
  3. Real-time: Symfony Messenger’s Consume command and prioritised messages
  4. Automatic message scheduling and replacing hook_cron
  5. Adding real-time processing to QueueWorker plugins
  6. Making Symfony Mailer asynchronous: integration with Symfony Messenger
  7. Displaying notifications when Symfony Messenger messages are processed
  8. Future of Symfony Messenger in Drupal

The Symfony Messenger integration, including the worker, is provided by the SM project. The worker is tasked with listening for messages ready to be dispatched from an asynchronous transport, such as the Doctrine database transport. The worker then re-dispatches the message onto the bus.

Some messages may be added to a bus with no particular execution time, in which case they are serialised by the original thread, then unserialised almost immediately by the consume command in a different thread.

Since Messenger has the concept of delaying messages until a particular date, the DelayStamp can be utilised. The consume command respects this stamp and will not redispatch a message until the time is right.
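
For illustration, dispatching a delayed message might look like the following sketch; the message class is hypothetical, and the DelayStamp value is expressed in milliseconds:

use Drupal\my_module\Message\SendReminder;            // hypothetical message class
use Symfony\Component\Messenger\MessageBusInterface;
use Symfony\Component\Messenger\Stamp\DelayStamp;

/** @var MessageBusInterface $bus */
// Ask the transport to hold the message for five minutes before the worker consumes it.
$bus->dispatch(new SendReminder($nodeId), [new DelayStamp(5 * 60 * 1000)]);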

The worker is found in the sm console application, rather than Drush. When SM is installed, Composer makes the application available in your bin directory, typically at /vendor/bin/sm.

The command takes one or more transports as the argument. For example if you’re using the Doctrine transport, the command would be:

sm messenger:consume doctrine

Multiple instances of the worker may be run simultaneously to improve throughput.

The worker. Messages output to stdout for demonstration purposes.

Prioritised messages

The worker allows you to prioritise the processing of messages based on which transport a message was dispatched to. Transport prioritisation is achieved by passing a space-separated list of transports as the command argument.

For example, given transports defined in a site-level services.yml file:

parameters:
  sm.transports:
    doctrine:
      dsn: 'doctrine://default?table_name=messenger_messages'
    highpriority:
      dsn: 'doctrine://default?table_name=messenger_messages_high'
    lowpriority:
      dsn: 'doctrine://default?table_name=messenger_messages_low'

In this case, the command would be sm messenger:consume highpriority doctrine lowpriority

Routing from messages to transports must also be configured appropriately. For example, you may decide Email messages are the highest priority. \Symfony\Component\Mailer\Messenger\SendEmailMessage would be mapped to highpriority:

parameters:
  sm.routing:
    Symfony\Component\Mailer\Messenger\SendEmailMessage: highpriority
    Drupal\my_module\LessImportantMessage: lowpriority
    '*': doctrine

More information on routing can be found in the previous post.

The transport a message is sent to may also be overridden on an individual message basis by utilising the Symfony\Component\Messenger\Stamp\TransportNamesStamp stamp. Though for simplicity I’d recommend sticking to standard routing.

Running the CLI application

The sm worker listens for and processes messages, and is designed to run forever. A variety of built-in flags are included, with the ability to quit when a memory or time limit is reached, or when a certain number of messages are processed or fail. Flags can be combined to process available messages and quit, much like drush queue:run.
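
For example, assuming the standard Symfony messenger:consume options are exposed by sm, a worker that exits after 100 messages, one hour of runtime, or 128 MB of memory usage could be started with:

sm messenger:consume doctrine --limit=100 --time-limit=3600 --memory-limit=128M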

Further information on how to use the worker in production can be found in the Consuming Messages (Running the Worker) documentation.

The next post covers Cron and Scheduled messages, a viable replacement to hook_cron.

Jan 10 2024
Jan 10

DrupalCon Portland 2024 is fast approaching, and you can’t wait to head to vibrant Portland. If this is you, you must also be stressed about persuading your boss to invest in your attendance at the Drupal event of the year. But don’t worry, we’ve got you covered! This article is your go-to resource, where you’ll find all the ammo you need to make your case. Let’s get started!

But First, Are You Convinced About Attending DrupalCon?

Naturally, your organization has various factors to weigh, with the primary concern being whether sending you to DrupalCon Portland is worth their investment. But the pivotal question is the value you see in it. Explore our list of strong reasons to attend DrupalCon 2024.

  • For everyone - DrupalCon, the biggest open-source event in North America, offers a unique experience for all Drupal enthusiasts—whether you're diving into Drupal for the first time or have been a community member for years. The benefits of attending are vast.
  • Training - Learn specific skills relevant to your role through targeted training at DrupalCon. Develop a deep knowledge of Drupal directly relevant to your career, ensuring a direct and positive return on investment.
  • Sessions - Dive into sessions led by the Drupal experts at DrupalCon. These are not just classes; they're conversations with the thought leaders who know their Drupal inside out. It's not just learning; it's getting hands-on wisdom from the best in the biz.
  • Keynotes - Want a front-row seat to the State of Drupal and the future of the web? Then you cannot miss out on DriesNote. Plus, there are other keynotes that'll fire up your imagination about what's possible in the digital world.
  • Networking - Imagine being in a room with thousands of Drupal enthusiasts at DrupalCon. It's a community buzzing with passion. Got Drupal questions? Tap into a wealth of knowledge and enthusiasm at one of the largest open-source communities. Hallway Tracks and Exhibition areas are the heart of networking at DrupalCon. Who knows, you might just score a selfie with Dries on your stroll!
  • Industry Summits - It's not just about networking—it's about conversing with peers who've been there, done that. Learn the nitty-gritty of industry best practices at industry summits like Higher Education Summit, Nonprofit Summit, Government Summit, and Community Summit. Discover how to tackle business challenges head-on with Drupal solutions, giving your job skills a serious boost.
  • Peer Connection - It’s a chance to connect with folks who share the same passion as you. Swap stories, share insights, and stay in the loop about the latest in Drupal. Learn firsthand from those who get your role and challenges.
  • Contribution Sprint - If you’re new to Drupal contributions, get hands-on guidance and sprint mentoring by experts. Whether you’re a coder or a non-coder, there are ways for everyone to participate in community contribution sprints and amplify the power of Drupal together.

Sample Letter to Your Boss

Here’s a sample letter to help you convince your boss about attending DrupalCon 2024. We guarantee they'll see the light!

Dear [Boss’s Name],

I am writing to express my strong interest in attending DrupalCon 2024 in Portland, Oregon, and to request your approval for participation in this significant event. I believe that attending DrupalCon will not only benefit my professional development but also contribute to the success of our team and the company as a whole.

Here are several compelling reasons why my attendance at DrupalCon is beneficial for us:

  • Industry Insights: Networking at Industry Summits will keep us updated on best practices and innovative solutions.

  • Strategic Vision: Keynotes, especially DriesNote, offer strategic insights vital for our long-term planning.

  • Community Engagement: Networking with thousands of community members ensures immediate answers and collaborations.

  • Role-Specific Learning: Connecting with peers in our specific roles provides insights into the latest in Drupal.

  • Contribution Sprint: Active participation contributes to Drupal's strength, enhancing our company's reputation.

I am seeking approval for the associated expenditures, which include:

EXPENSE                        AMOUNT
Airfare
Visa Fees (if required)
Ground Transportation
Hotel
Meals
Conference Ticket
TOTAL EXPENSE

[Add this line if you’re traveling from overseas] The Drupal Association can issue an official letter of invitation to obtain a visa for my travel to the United States.

The Drupal Association can also issue a Certificate of Attendance for the conference if required for our records.

Please accept this proposal to attend, as I'm confident in the significant return we will receive for the small investment. For more information on the event, please visit the conference website: https://events.drupal.org/portland2024.

I'm available to discuss this further at your earliest convenience.

Sincerely,
[Your Full Name]
[Your Position]
[Your Contact Information]

Jan 10 2024
Jan 10


Introduction

According to a report by Uptime Institute, application failures can result in a loss of at least $100,000. The chance of these failures is 88% higher in the case of on-premise infrastructure hosting than in the cloud.

Causes of publicly reported outages

This is why digital engineering developers looking to build, ship, and host full-stack, high-quality Next.js applications should choose a reliable cloud platform like AWS Amplify.

Why Choose AWS Amplify

AWS Amplify is a user-friendly, open-source framework empowering developers to create secure and scalable applications. It simplifies the intricacies of setting up backend infrastructure, enabling developers to concentrate on crafting features and providing value to users.

Amplify’s rapid prototyping feature enables faster time to market. It is compatible with well-known frontend frameworks like React, Angular, and Vue.js, ensuring accessibility for a diverse community of developers.

AWS Amplify facilitates organizations to move quickly without investing significant efforts in designing and implementing backend AWS services. It helps organizations build mobile and web applications on AWS Cloud with start-up speed and enterprise scale.

- Dr. Biswajit Mohapatra, 20+ CIO Awards Winner | AWS and DevOps Institute Global Ambassador

How To Deploy Next.js App With AWS Amplify

Developers can deploy a Next.js app with AWS Amplify in four steps without losing important features like Incremental Static Regeneration (ISR) and Server-Side Rendering (SSR).

Step 1: Create A Next.js App

Start by configuring the development environment. Navigate to the deployment settings to customize the app according to requirements.

Get started with building a Next.js application.

Run ‘npx create-next-app next-amplify’ in the terminal and accept the default options when prompted.

Use ‘cd next-amplify’ to navigate to the project directory.

Use ‘npm run dev’ to run the Next.js app with its default code.

Next, developers can choose to follow any deployment strategies for the Next.js apps.

Developers using Static Site Generation (SSG) have to modify the package.json file to deploy a Next.js application, as shown below.

"scripts": {
  "dev": "next dev",
  "build": "next build && next export",
  "start": "next start"
}

Use ‘next build && next export’ to support SSG pages. This script also exports the app to static HTML, making it work without a Node.js server. 

In the case of other deployment methods, no changes are needed in the package.json file.

Step 2: Add Authentication To Your Next.js App

Amplify makes it easy to streamline the frontend and backend for cloud-connected apps. Developers can use the Amplify CLI to add authentication or integrate APIs into the web app.

Run ‘amplify init’ to get started.

Set up authentication resources with ‘amplify add auth.’

After making code changes for authentication, use ‘amplify push -y’ to update the Amplify project.

Install the needed dependencies with the Amplify package to create a functional Authentication UI in the Next.js app.

npm install aws-amplify @aws-amplify/ui-react

Now, replace the existing code in the pages/_app.js file with the following:

import "@aws-amplify/ui-react/styles.css";
import { Amplify } from "aws-amplify";
import awsExports from "../src/aws-exports";
import '@/styles/globals.css'

Amplify.configure({ ...awsExports, ssr: true });

export default function App({ Component, pageProps }) {
 // Render the active page with its props; Amplify is configured above with SSR enabled.
 return <Component {...pageProps} />;
}

Replace the code inside the index.js file as well.

import { withAuthenticator } from "@aws-amplify/ui-react";

function Home({ signOut, user }) {
 return (
   <main>
     {/* Greet the signed-in user by their username */}
     <h1>Hello, {user.username}</h1>
     {/* Button wired to Amplify's signOut handler */}
     <button onClick={signOut}>Sign out</button>
   </main>
 );
}
export default withAuthenticator(Home);

In this example, the “withAuthenticator” function from ‘@aws-amplify/ui-react’ is used to add authentication functionality to the ‘Home’ component. The ‘Home’ component receives two props, “signOut” and “user”, which are used to display the username and render a button to sign out of the app.

The ‘withAuthenticator’ function is a higher-order component that wraps the ‘Home’ component and provides authentication features such as login, registration, and password reset. The ‘export default’ statement exports the ‘Home’ component wrapped in ‘withAuthenticator’, so anywhere the ‘Home’ component is imported and used, it automatically gets the authentication functionality.

Adding Authentication Functionalities to the Home Properties

Step 3: Push The Code On GitHub

When the web app is fully developed, push it to GitHub. Create a new repository for the project and publish the code as per the given instructions.

git init

git add .

git commit -m "first commit"

git branch -M main

git remote add origin https://github.com/youraccountname/repo.git

git push -u origin main

Step 4: Deploy The Next.js App

Navigate to Amplify from the AWS Console by using the search bar at the top.

Search for Amplify

Developers who are new to AWS Amplify will see the signup page. Click ‘Get Started’ to proceed.

Click on Get Started

Scroll down to the Amplify Hosting section and click ‘Get Started’ to proceed.

Host your web app

Connect Amplify to the source code from GitHub as the app code has already been pushed to GitHub.

Connect Amplify to the source code from GitHub

When connecting to GitHub for the first time, developers need to authorize the account to connect with AWS. Once authorized, they will see a new page.

GitHub Authorization

Now, choose a repository. For example, developers can click on a repository named “next-amplify”, read the branch name displayed below it, and click ‘Next’ to proceed.

Add repository branch

Go to the ‘Build Settings’ page to see the app's name and its recognized frontend and backend frameworks.

Build Settings

Review the app on this page before deployment. If everything looks good, click ‘Save and deploy.’

App review before deployment

Wait for the system to finish deploying. Click the link provided to see your Next.js app in action.

How We Built A Highly Performant News Aggregator For Internet Users In Kashmir

In Kashmir, where internet access is often slow due to government restrictions, we developed a special news app. This app is quick to load, can be used offline, and sends users updates.

Axelerant's experts made a simple app that works like a regular one but is light and fast. We used Next.js to create an app that is both speedy and good for search engines.

Our app lets users get update notifications without installing a big app. The website is powered by serverless functions (AWS), making it better for search engines. The whole system runs on AWS and uses serverless tech.

Because our app is small, it's easy to install on phones. It sends push notifications to keep users informed about the latest news, and it works well even without the internet.

Schedule a call with our experts to learn more about how we can help build similar solutions.
