May 27 2021

The process of migrating data into a Drupal database from a CSV file can be fulfilled through Drupal’s built-in Migrate API and three extra contributed modules (Migrate Source CSV, Migrate Plus and Migrate Tools). 

This is known as the ETL (Extract - Transform - Load) process, in which data is fetched from one source in the first step, transformed in the second step, and finally loaded to its destination on the Drupal database in the third step. 

This tutorial will explain the creation of 12 book nodes for a library database.  Keep reading to learn how!

Migrate Data from a CSV File in Drupal 8/9

 Step # 1 - Install Drush and the Required Modules

To execute migrations in Drupal, we need Drush. Drush is not a Drupal module, but a command-line interface to execute Drupal commands. To install the latest version of Drush,  

  • Open the terminal application of your system
  • Navigate to your project’s root directory (the directory above /web).
  • Type:  composer require drush/drush

This will install the Drush executable at vendor/bin/drush inside your Drupal installation. However, it is cumbersome to type vendor/bin/drush instead of drush each time you want to execute a Drush command.

Drush launcher makes it possible to execute the specific Drush version of each Drupal installation on a per-project basis. 

Why does this make sense?

Each project has different requirements, so project-specific Drush versions help to avoid dependency issues. Some contrib modules may not work properly with the latest version of Drush.

The specific instructions for OSX and Windows systems can be found here: https://github.com/drush-ops/drush-launcher#installation---phar

For a Linux system:

  • Type the following to download the file named drush.phar from GitHub:

wget -O drush.phar https://github.com/drush-ops/drush-launcher/releases/latest/download/drush.phar

  • Type the following to make the file executable:  chmod +x drush.phar
  • Type the following to move the .phar file to your $PATH and run Drush as a global command: sudo mv drush.phar /usr/local/bin/drush
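  • Optionally, type the following from the project root to verify that the launcher finds your project’s Drush (drush status prints the detected Drupal and Drush versions):

drush status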

It is now time to install the required contributed modules to perform the migration.

  • Type the following:

composer require drupal/migrate_tools
composer require drupal/migrate_source_csv

Once Composer has finished downloading the modules,

  • Open the Drupal backend in your browser
  • Click Extend
  • Enable Migrate, Migrate Plus, Migrate Tools and Migrate Source CSV
  • Click Install
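
If you prefer the command line, the same modules can also be enabled with Drush (their machine names are migrate, migrate_plus, migrate_tools and migrate_source_csv):

drush en migrate migrate_plus migrate_tools migrate_source_csv -y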


Step # 2 - More About the ETL Process

The process of Extracting, Transforming, and Loading data can be achieved by defining the migration in a .yml file and then executing it with a Drush command, so that the Drupal database is populated correctly. 

There are some important facts to notice:

  1. Each one of the steps is performed through Drupal plugins.
  2. You are only allowed to use one plugin in the first step (source definition, i.e. Extract) and one plugin in the last step (destination definition, i.e. Load) of the process. 
    • In other words, you may only fetch data from one source (CSV file, JSON feed, etc.) and store it in Drupal under a single entity bundle, for example an Article, a Page, a custom content type, a user, or even a configuration entity.
  3. You are allowed to use as many plugins as necessary to model the data so that it matches the format expected by Drupal.
  4. Drupal has by default a list of source/process/destination plugins that can be used within the definition file.

To see a list of all source plugins, 

  • Open your terminal window.
  • Type the following:

drush ev "print_r(array_keys(\Drupal::service('plugin.manager.migrate.source')->getDefinitions()));"

Notice: the csv plugin is in that list because we have already enabled the Migrate Source CSV module.

To see a list of all process plugins,

  • Type the following:

drush ev "print_r(array_keys(\Drupal::service('plugin.manager.migrate.process')->getDefinitions()));"

That list is a little bit longer; remember that you can use as many plugins as you need in the process step.

To see a list of all destination plugins,

  • Type the following:

drush ev "print_r(array_keys(\Drupal::service('plugin.manager.migrate.destination')->getDefinitions()));"

Step # 3 - Create the Content Type

  • Click Structure > Content types > Add Content type
  • Create the ‘Book’ content type
  • Click Save and manage fields
  • Use the title row of the CSV file to create the fields

 I am going to concatenate the values of the columns edition_number and editor, so I need only one field in the database for this purpose. 

Notice: the field names (machine names) do not have to match the column names of the CSV file exactly, yet it makes sense to at least relate them with similar words; that eases the field mapping in the process step.


The title field is mandatory for every Drupal node, so there is no need to create a title field. You can leave the body field untouched or you can delete it; that depends on what you are planning to do with your particular content type.

Step # 4 - The Migration Definition File

Source Step

  • Open your preferred code editor
  • Type the following:

id: my_first_migration
label: Migrate books from a CSV source
source:
  plugin: csv
  path: public://csv/library.csv
  header_row_count: 1
  ids: [id]
  delimiter: ';'
  enclosure: "'"

The id of the migration matches the name of the .yml file. 

We are using the csv plugin of the contrib module Migrate Source CSV in the source section. 

The value 1 for the header_row_count option indicates that the column titles are all placed on the first row.

The ids definition provides the unique identifier for each record on the CSV file, in this case, the id column, which is of type integer. Don’t forget the brackets, since the module is expecting an array here.

delimiter and enclosure refer to the structure of the CSV file. In my particular case, fields are delimited by “;” characters, whereas strings are enclosed in single quotation marks (').
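
For reference, a source file matching this configuration could look like the following (hypothetical sample data; the column names mirror the fields created in Step 3):

id;title;author;editor;edition_number;location;availability
1;'Don Quixote';'Miguel de Cervantes';'Penguin Classics';3;'Aisle 4';'available'
2;'One Hundred Years of Solitude';'Gabriel Garcia Marquez';'Harper';1;'Aisle 2';'borrowed'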


Notice also the definition of a path. That is where you have to place the CSV file so that Drupal will be able to read the data from it. 

  • Open your terminal application.
  • Type: 

mkdir web/sites/default/files/csv
cp /home/path/to/library.csv web/sites/default/files/csv/
chmod -R 777 web/sites/default/files/csv/

This will:

  1. create a directory called csv inside the public folder of your Drupal installation.
  2. place a copy of the CSV file inside that directory.
  3. make the file accessible to everyone in the system (that includes the source plugin). 

Process Step

This is where we map each one of the columns of the CSV file with the fields in the content type:

process:
  title: title
  field_id: id
  field_author: author
  field_location: location
  field_availability: availability
  field_editor:
    plugin: concat
    source:
      - editor
      - edition_number
    delimiter: ' '
  type:
    plugin: default_value
    default_value: book

The first 5 key/value pairs, in the form drupal_machine_name: csv_column_name, map the CSV records to the database fields without performing any changes. 
The field_editor field will be the result of concatenating two strings (the values inside the editor and edition_number columns).

The delimiter option makes it possible to set a delimiter between both strings, in this case a blank space (for example, 'Harper' and '1' would become 'Harper 1').

The default_value plugin helps us to define the content type (the node bundle, in this case book) since this information is not available in the source data.

Destination Step

The final part of this process is the destination step.

destination:
  plugin: entity:node

We are migrating content and each record will be a node.

Step # 5 - Execute the Migration 

  • Click Configuration > Configuration synchronization > Import > Single item
  • Select Migration from the dropdown
  • Paste the code from the .yml file into the textarea
  • Click Import


  • Click Confirm to synchronize the configuration. You will get the message “The configuration was imported successfully”
  • Change to the terminal application.
  • Type the following:

drush migrate:import my_first_migration
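
Once the import has run, a couple of other commands provided by Migrate Tools are useful to check or repeat the migration:

drush migrate:status my_first_migration
drush migrate:rollback my_first_migration

migrate:status shows how many records were imported, and migrate:rollback deletes the imported nodes so you can run the import again.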



 You can now check the content on your site.


 You have learned the basic principles of migrating data from a CSV file to Drupal 8/9. 

As you have already seen, the migration process requires attention to detail, so make sure that you work on a staging server first, because one little mistake could break the whole site. I hope you liked this tutorial. 

Thanks for reading!


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other open source content management systems and technologies.
May 27 2021

Drupal 10 is targeted for release in June 2022, a little over a year from now. Here’s an update on the current state of development, what Drupal 10 means for your business and how you can get involved.

As Dries highlighted in the Driesnote at DrupalCon North America 2021, the Drupal 10 initiative is important to ensure that Drupal uses the latest and greatest components that we depend on; ensuring stability and security for all. Dries stressed the importance of maintaining the activity and momentum to push Drupal to be ready in time for the targeted release next year.


What will change?


Some of the third-party dependencies and system requirements will be upgraded to, ideally, their latest stable releases. Code deprecated throughout the Drupal 9 lifecycle will also be removed in the first release of Drupal 10.

  • CKEditor 4 will be replaced with CKEditor 5
  • Symfony 4 will be upgraded to Symfony 5 or Symfony 6
  • jQuery UI will be removed
  • PHP 8.0 will be required (up from PHP 7.3)
  • PostgreSQL 12 with pg_trgm will be required (up from PostgreSQL 10)

The Drupal stock themes “Seven” and “Bartik” will be replaced with “Claro” and “Olivero” respectively. A much needed and welcomed change is that Drupal core is dropping support for Internet Explorer 11 in Drupal 10. A high-level overview of all changes coming to Drupal 10 is listed on the drupal.org website.

We already know many of the deprecated APIs, some dating back to Drupal 8, which means module maintainers can start to update their code to rely on the newer APIs.


Current Tooling


To ease the process of upgrading from Drupal 9 to Drupal 10, there are a few tools in place. These tools were already developed to help aid in the upgrade of Drupal 8 to Drupal 9, so might be familiar to many.

  • The Drupal 10 Deprecation Status page shows Drupal 10 compatibility based on all currently confirmed deprecated API uses in Drupal 9 extensions.
  • The Upgrade Rector and the underlying drupal-rector project will be used to automate updates of deprecated code.
  • The Upgrade Status UI or CLI will check your Drupal 10 readiness on your Drupal 9 site (as much as already defined).

The best place to help with the upgrade path right now is updating drupal-rector. The rector 0.10 version update currently underway enables the tool to run on a modern setup. After that, adding support for Drupal 9 is key. Once that is in place, work on covering the top deprecated APIs can start.

The Drupal Project Update Bot also relies on Upgrade Status and drupal-rector to automatically post compatibility fixes to community projects. Automating as much as possible will streamline the upgrade process not only for Drupal core but for all contributed modules.


Get Involved


If you’re interested in helping to get Drupal ready for version 10, feel free to join the Drupal 10 Readiness initiative meetings, held every other Monday. The meetings are always open to all and held on the #d10readiness channel on Drupal Slack. The meetings are text-only, and transcripts of past meetings are available in the meeting archive.

While June 2022 might seem a way off and many website owners are still preparing for the shift to Drupal 9, version 10 will be here before you know it, so plan ahead and let our web development team help you make the move - get in touch with us today.

May 27 2021

Virtual business card, flyer, minisite – all these terms perfectly reflect the nature and essence of landing pages. They consist of several horizontal segments in which advertising content is presented, encouraging you to continue exploring a given topic on other dedicated and full-sized webpages. Let's check the possibilities Drupal gives us when creating these specific websites.

A page made of blocks

To create a one-page website composed of horizontal segments – clearly separated from each other, yet forming a consistent whole – we can use several tools or modules available in Drupal's core. The first option that comes to mind is using blocks. In Drupal, blocks work just like Lego bricks, elements containing any content. They can be displayed in different regions of the page, one above the other. Sounds like something we need!

While the idea of using simple blocks, mainly containing text, will work great in this case, achieving visual requirements or the desire to include slightly more complicated content (e.g. counters, multimedia, combination of text and multimedia, carousel, etc.) seems to be time- and labour-consuming. A high level of block personalization will require us to create several block types and/or content types and additional fields from scratch, along with additional classes for a purpose of a correct display.

For the proponents of clean code - Twig

One of the available solutions is also the option to create a collection of Twig templates, where – using HTML and PHP code – we are able to create a landing page with any level of complexity, using the available or new fields. This option, however, requires us to spend many hours in front of the screen and countless lines of code, not to mention the subsequent maintenance and content management. To implement this solution, one definitely needs a person with technical knowledge – also at the stage of introducing the content.

Creating a Drupal landing page - modules

As we mentioned earlier, a landing page in Drupal can also be built using modules. Let's check out which of them we can use and how.

Layout Builder module

A very useful tool that has been recently added to Drupal's core is the Layout Builder module, which allows you to create templates for the structure of displaying elements on a page. The user has the ability to define their own "regions" for different types of content, using a very helpful drag and drop interface. In simple terms, it replaces the default display management function, which defines how and which fields are to be displayed, and additionally gives the opportunity to put blocks in place. When creating a template, you can feel like a graphic designer juggling the elements, while all the tools they need (in our case – the options for editing blocks and sections) are at hand in the form of an edit bar. To use this module, you need to launch it manually, as it isn’t enabled by default when installing Drupal.
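
If you prefer the command line, the module (machine name layout_builder) can also be enabled with Drush:

drush en layout_builder -y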

When creating a new content type or wanting to use Layout Builder for an existing one, go to the display management tab. An additional "Layout options" section will appear below the list of fields, where we select the "use Layout Builder" option. Instead of a list of fields, a "manage display" button will be added, which will redirect us to the Layout Builder interface. We can freely manage the order of elements, add blocks that may contain fields from any entity, forms, plain text, links, views, and even an entire menu. Blocks are added within sections. Each section can be arranged into any number of columns between 1 and 4. The order of sections can't be changed or "dragged", so be careful when adding them.

Creating a Drupal landing page with the Layout Builder module

 

Unfortunately, Layout Builder won’t do everything for us. We need to put more effort into styling all the elements. There is, however, an additional module – Layout Builder Styles – dedicated to creating new classes for Layout Builder blocks and sections, with the ability to define restrictions concerning which blocks a given class will be assignable to. A beta version is currently available, and our Drupal installation can’t be older than version 8.7.7. However, to take full advantage of the style management capabilities, without having to interfere with the code, we need to install one more module - Asset Injector, where we'll define all the parameters of the previously created classes. Then, when editing every block or section, we'll be able to assign one class available on the list.
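
Both modules can be added with Composer in the usual way (their drupal.org project names are layout_builder_styles and asset_injector):

composer require drupal/layout_builder_styles
composer require drupal/asset_injector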

Styling the elements of the landing page using the Layout Builder Styles and Asset Injector Drupal modules

 

Paragraphs module

The second most frequently chosen option when creating a Drupal landing page is the use of the Paragraphs module along with Entity Reference Revisions. Paragraphs allows you to create templates that will later be sections (aka paragraphs of a page). One paragraph may consist of many fields of any type. Such a collection will constitute the type of paragraph to be used by any selected content type. For example, let's create a new content type named "Landing page". At this stage, in addition to the default "body" field, we add a few other fields that are important to us. This time, we only need one field of the paragraph type.

Adding a paragraph field to a content type during creation of a landing page using the Paragraphs module

Now we'll create two new types of paragraphs: “banner” and “image + text”. Under Structure -> Paragraph types -> Add paragraphs type, we create new types of paragraphs and manage the fields to be included in them. For a banner, it'll mean an image field. For the “image + text” type, it'll also mean an image field and formatted text. As is the case when creating content types, with paragraph types we can manage the display of fields and their formatters. You'll surely notice in the list that you can add a field of the paragraph type, this way we can create nested paragraphs and use the already existing ones.

Creating new types of paragraphs with the Paragraphs and Entity Reference Revisions Drupal modules

Having the types of paragraphs, let's check out what the landing page creation will look like. It's worth mentioning that without defining in the content type settings what specific paragraph is to be its content, we are presented with a list of all available paragraph types, which we can add multiple times thanks to setting “unlimited” when creating a paragraph field.

Creating a Drupal landing page with the Paragraphs module

 

Paragraphs is a powerful tool that allows you to maintain a high degree of flexibility when creating landing pages. The pitfall when using this module is managing the content later, especially if we create a complex structure with multiple nested fields. The person editing the content on the page may then feel lost and overwhelmed by the enormity of tabs and settings.

Tuned-up paragraphs. Creating a landing page in Droopler

If you like the option of using paragraphs the most, you'll surely find what Droopler has to offer for creating one-page websites to be even better! Droopler is our free Drupal distribution for creating webpages. It contains many ready-made templates and components.

For this tool, we used the idea of creating paragraphs and nested paragraphs to "assemble" a webpage. In the default version, along with the Droopler installation we get a number of predefined types of paragraphs at our disposal. These are the most commonly used kinds of paragraphs when creating webpages.

Types of paragraphs in Droopler, a free Drupal distribution

Types of paragraphs in Droopler

Then why should we use Droopler and how does it really differ from the previously presented possibilities offered by the Paragraphs module, apart from saving time when creating the most popular types of paragraphs?

Appearance

Already at first glance, we can see one fundamental difference – the added paragraphs look professional and, if we are satisfied with the used colour scheme, we don't have to do anything else with them! However, if we decide to change it, overwriting the default skin won't be a problem. You can find more information on this topic here.

Editing directly on the created page

I mentioned earlier that it can be confusing to navigate around editing a content type that consists of paragraphs. This problem has been solved in Droopler thanks to using the Geysir module and the possibility of editing paragraphs directly on the created page, in the form of modal windows. The interface also allows you to “cut” and “paste” paragraphs, that is, to reorder and remove them without having to go to a content type edit page. This way everything remains clear and we immediately see the effects of our changes.

Possibility of editing paragraphs directly on the created page in Droopler

 

Additional options

The Paragraphs module is field-based, allows you to choose the formatter and manage the display, but there is no room anywhere for additional options related to styling or quick reorganisation of the content within a single paragraph. If we want to have a paragraph with an image gallery, we need separate types for putting 4 or 8 thumbnails. The same happens when using different types of media – a separate type is needed for images and videos.

Droopler is highly flexible in this aspect. The banner paragraph type accepts both an image and a video file. What's more, for every type of paragraph in the "settings" tab in the adding paragraph window, it's possible to configure margins and padding, as well as to define additional classes that we determine in our skin, and there's even the option of choosing a predefined colour scheme.

While editing a paragraph in Droopler, you can configure margins and padding, as well as define additional classes

 

If we want to create a paragraph composed of tiles, we can choose which of them should be highlighted by increasing their size in relation to the rest, achieving the effect of a "masonry" gallery.

Do you have a prepared block that you'd like to use and place among the paragraphs? In Droopler, paragraphs can also consist of blocks, in the tab we can choose from among all those that currently exist on our page. The block with icons and links to social media - ready. A quick contact form? This has also already been done for us.

Summary

Landing pages or one-page websites are a very specific type of content that we can find around the web. It should be simple, transparent, look modern and encourage us to go to the further, target pages. It's a huge tool in the hands of marketing teams, so Drupal was also designed to give users, including non-programmers, the option to quickly create these kinds of pages. Some of the methods presented in the post show a low degree of difficulty and can be implemented by a layman (Layout Builder, Paragraphs). However, they have their limitations and at some point the "architecture" created with them becomes too complicated and difficult to maintain. Additionally, the intention to create something non-standard will require the help of a developer, especially for styling. By choosing Droopler, we can be sure that the landing page creation process will be simple and pleasant, as well as visually effective, without having to tamper with the code. Anyone, without exception, will be able to learn to manage paragraphs with ease, and in less time than expected.

May 26 2021

This year at DrupalCon North America Redfin Solutions’ CTO Chris Wells had the honor to speak for the first time at a DrupalCon. His presentation Migrating into Layout Builder had the privilege of being one of the most well-attended sessions of the conference.

The Client

Redfin Solutions has a longstanding relationship with the University of New England (UNE)--Maine's largest private university--and they were at a turning point where their previously cutting-edge website felt dated, especially the content editor experience. With Drupal 7's end-of-life on the horizon, we worked with them to come up with an upgrade plan to Drupal 8, so that we would have better access to a modern interface.

Previously, their Drupal website had been implementing responsive component-based design principles using WYSIWYG Templates. With more modern tools like Gutenberg and Layout Builder in core, we knew we had great options and opportunities to provide a better content editor experience.

The Transformation

We knew that we would have to find a strategy for migrating the older paradigm to the new paradigm, and for this we chose layout builder. With its core support and logical application of blocks as components, it was a natural choice. But, how would we get larger blocks of HTML into a place where all new pages were using the new paradigm of pages where each page is a Layout Builder override?

Luckily, Drupal has just such a way to transform data on input, which is the Migrate API. The Migrate API follows a common pattern in Computer Science called Extract, Transform, Load. In the parlance of our times (that is, Drupal), we use the phrases "source" (extract), "process" (transform), and "destination" (load). Each of these phases are represented by Plugins to the Migrate API.

Our Situation

In the case of UNE, we were migrating from (source) Drupal 7 nodes (the page body field) into (destination) "basic text" blocks. For the source, we used the live Drupal 7 database on Pantheon. The "basic text" block is the one that comes out of the box in Drupal 8 as a custom block type, and has a title and a body.

We did NOT go down the rabbit hole of regex'ing out each of the components, but rather we migrated the old body into the new paradigm, so that every page uses the same paradigm from the start, and content editors can expand into using layout builder overrides over time. We simply migrated in some legacy styles, which eventually we will discard. We had the staff and resources to clean up any egregious inaccuracy in the translation as needed, so this ended up being the most time-and-cost-efficient solution.

However, the real magic of this migration is really the process part, where we change the data into the format it needed for layout builder.

Layout Builder Storage

So first, we need to understand how Layout Builder actually stores things behind the scenes. Much like an entity reference field, layout builder is really storing a list of sections. When you build a page with Layout Builder, you are adding sections to it (a one-col, followed by a two-col, followed by another one-col, for example). Much like with regular field tables, it stores the entity ID, revision ID, delta (so it knows the right order!), and then some data value. For taxonomy term references, for example, it would store the "tid" for the term being referenced.

With Layout Builder, there's additional complication. Since each section may contain multiple components, there's an extra layer where we need to then store the components for a section each in their proper order.

For this, Drupal's Layout Builder is not nesting another set of entity references. Instead, it's actually storing a serialized Section object. One of the main parts of a Section object is an array of SectionComponent objects, which each store their own location and position within the section.

The actual table where this information is stored is the [entity]__layout_builder__layout table in the database. Depending on which entity you've enabled Layout Builder overrides for, this may be the node__layout_builder__layout table, or the user__layout_builder__layout table.

Most layout builder SectionComponents are just "blocks" in the traditional Drupal sense of that entity. With that said, there is one new concept that should be introduced, which is whether or not blocks are to be considered "re-usable." Re-usable blocks are the ones you normally create from Structure > Blocks > Custom Block Library, and you then place to be "re-used" across the website, for example on a sidebar on every page.

Non-re-usable blocks are those which are created when you insert block content into a Layout Builder layout. The difference between these two is really just a boolean (and hidden) field on the block, which helps filter blocks using the UI.

And, the very last piece of the storage puzzle to be aware of is the "inline_block_usage" table. This simply stores the block_content_id, the layout_entity_type (e.g. "node"), and the layout_entity_id (e.g. the node ID). It's a record of where the non-re-usable blocks are, in fact, used.

OK, so let's do this!

We need to transform Drupal 7 node bodies into blocks, and then migrate the pages into pages, where the "body" of the node is now the Layout Builder overrides.

To do this, we are going to:

  • migrate bodies into non-re-usable blocks

  • migrate the nodes into nodes

  • be sure and link up the previously migrated blocks as Layout Builder Sections/SectionComponents

To help demonstrate these concepts, I've created a fully-functional website repo on Drupal 9 using some CSVs as a source. I'm going to dissect some of the main parts of that for you.

Step 1: Import the Blocks

In many ways, this is a very standard block migration, but the special thing to call your attention to is the "reusable" field in the "process" section:

  # whether or not it's reusable
  reusable:
    plugin: default_value
    default_value: 0

View code on GitHub

This specifies that the blocks coming in are inline blocks. You may or may not want to use this, but we certainly did, and this is how you set it.

Step 2: Import the Nodes

In many ways, you are just migrating node fields in the way you normally would, mapping fields like title, uid, etc.

Where this one gets special is that we migrate into a field called layout_builder__layout, which is the field that stores the overrides. This field expects a Section object (or an array of Sections).

  # This is the layout_builder__layout field, which stores everything!
  layout_builder__layout:
    # Where do we get them from? This `components` field comes from us. We use prepareRow to set it.
    source: components
    # We need a custom plugin to correctly map this.
    plugin: layout_builder_sections_pages

The source for where to get the "body" (blocks / SectionComponents for our Section) is this "components" field. That's not a field in my CSV; it's one where I do a lookup to get all the blocks that were migrated in relative to this node. To do this, I use the prepareRow() method available to Migrate API source plugins to add a new source property.

# Basics about the source plugin - where to get the data,
# what kind it is, describe the columns in the csv.
source:
  plugin: my_pages

View code on GitHub

In this new prepareRow method, we can look up the migrated blocks and return them in the correct order; each will become a section component:

Source Plugin
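
The complete source plugin lives in the example repo linked above. As a rough sketch of the relevant prepareRow() logic (assuming the earlier block migration has the hypothetical ID my_blocks, and that the class imports Drupal\migrate\Row), it could look something like this:

// Inside the custom source plugin class (plugin ID: my_pages).
public function prepareRow(Row $row) {
  // Ask the Migrate API which block(s) the earlier block migration created
  // for this source page, keyed by this page's source ID.
  $block_ids = \Drupal::service('migrate.lookup')
    ->lookup('my_blocks', [$row->getSourceProperty('id')]);

  // Flatten to a simple list of block IDs and hand it to the process
  // pipeline as the 'components' source property used in the YAML above.
  $row->setSourceProperty('components', array_map('current', $block_ids));

  return parent::prepareRow($row);
}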

Now, the components source field is an array of (non-re-usable) block IDs.

Now, we can use that with our custom plugin which is a Migrate API Process Plugin.

Where the Magic Happens

The process plugin has a main entry point of transform(). This method is responsible for returning a value formatted in the way that the destination plugin expects it. In our case, we need to return a Section (or perhaps an array of Sections if you're feeling adventurous). Remember that SectionComponents are what primarily make up Sections, so we need to first build up the SectionComponents themselves.

To do this, we need access to the UUID generator service in Drupal, and to create a configuration array for the SectionComponent. The following array details the configuration.

  • id: the plugin and derivative you're using, specifically for us "inline_block" and then the bundle, yielding "inline_block:basic" (the type of block).

  • label: what the label of this block is (the block title). This is a required field, so set it to something.

  • provider: layout_builder - always the same in our case.

  • label_display: whether or not to show the label (boolean)

  • view_mode: which view mode to use when displaying this block

  • block_revision_id: the revision ID of the block to display

  • block_serialized: the serialized version of the block (you can probably leave this null and it will be serialized for you later)

  • context_mapping: to be perfectly honest I don't know what this is and maybe someone out there can explain it to me, but it works when it's an empty array :)

After creating your SectionComponents array, you can return a new Section object by specifying the layout you're using for that section, any settings for the Section, and the array of SectionComponents to put into it.
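
Putting it all together, a stripped-down sketch of such a process plugin (using the layout_builder_sections_pages plugin ID from the YAML above, a hypothetical my_module namespace, and core's one-column layout_onecol layout; the real plugin is in the example repo) might look roughly like this:

<?php

namespace Drupal\my_module\Plugin\migrate\process;

use Drupal\layout_builder\Section;
use Drupal\layout_builder\SectionComponent;
use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Wraps migrated inline blocks in a Layout Builder Section.
 *
 * @MigrateProcessPlugin(
 *   id = "layout_builder_sections_pages"
 * )
 */
class LayoutBuilderSectionsPages extends ProcessPluginBase {

  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    $components = [];
    foreach ((array) $value as $block_id) {
      // Load the non-reusable block created by the block migration so we can
      // point the component at its current revision.
      $block = \Drupal::entityTypeManager()->getStorage('block_content')->load($block_id);
      // Each block becomes one SectionComponent in the 'content' region of
      // the one-column layout.
      $components[] = new SectionComponent(\Drupal::service('uuid')->generate(), 'content', [
        'id' => 'inline_block:basic',
        'label' => 'Migrated body',
        'provider' => 'layout_builder',
        'label_display' => FALSE,
        'view_mode' => 'full',
        'block_revision_id' => $block->getRevisionId(),
        'block_serialized' => NULL,
        'context_mapping' => [],
      ]);
    }
    // A single one-column Section holding all of the components.
    return new Section('layout_onecol', [], $components);
  }

}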

Try it for Yourself!

If you download the example repo, you can restore the included DDEV database snapshot (if using DDEV) or use the .sql file to import the database. You may need to change the paths in your migrations depending on your setup.

As always feel free to be in touch if you would like to learn more!

May 26 2021

Update: 2021-06-11: Added CVE-2021-33829 identifier

Drupal core uses the third-party CKEditor library. This library has an error in parsing HTML that could lead to an XSS attack. CKEditor 4.16.1 and later include the fix.

Update: 2021-06-11: More details are available on CKEditor's blog.

Users of the CKEditor library via means other than Drupal core should update their 3rd party code (e.g. the WYSIWYG module for Drupal 7). The Drupal Security Team policy is not to alert for issues affecting 3rd party libraries unless those are shipped with Drupal core. See DRUPAL-SA-PSA-2016-004 for more details.

This issue is mitigated by the fact that it only affects sites with CKEditor enabled.

May 26 2021

Lynette has been part of the Drupal community since Drupalcon Brussels in 2006. She comes from a technical support background, from front-line to developer liaison, giving her a strong understanding of the user experience. She took the next step by writing the majority of Drupal's Building Blocks, focused on some of the most popular Drupal modules at the time. From there, she moved on to working as a professional technical writer, spending seven years at Acquia, working with nearly every product offering. As a writer, her mantra is "Make your documentation so good your users never need to call you."

Lynette lives in San Jose, California where she is a knitter, occasionally a brewer, a newly-minted 3D printing enthusiast, and has too many other hobbies. She also homeschools her two children, and has three house cats, two porch cats, and two rabbits.

May 26 2021


Enabling WebP images on your website can save millions of bytes per page load! That might sound like a bit of an exaggeration, or maybe a little tacky, but it’s true. On slower connections, that can be the difference between a visitor viewing your page or pressing the back button in frustration.

What is WebP?

WebP is a new(ish) image format that renders higher-quality images with drastically smaller file sizes. It also supports several cool features such as transparency (generally handled with PNG images) and animations (generally handled with animated GIFs or videos). 

Browser support

All modern browsers support WebP, including Chrome, Edge, Safari, and Firefox. However, Safari support is limited to machines running macOS 11 Big Sur (released November 2020) or newer. 

You can use WebP images just like a standard JPG, PNG, or GIF. 


Fallback support for older browsers

Do you still have to support older versions of Safari and/or Internet Explorer? You can automatically create JPG fallbacks using the HTML <picture> tag. If you’re not familiar with <picture>, it’s used for responsive images (serving different images based on your screen width), but you can also use it to serve different images based on what MIME type the user’s browser supports: a <picture> element wraps one or more <source> elements (each with a type attribute such as image/webp) plus a fallback <img> element.


Older browsers will look at the type attribute on each <source> element and fall back to the <img> element if the type isn’t supported.

Drupal core support

Drupal core 9.2 (due June 2021) supports WebP! You can use Drupal core’s built-in image styles to convert your image easily.

Unfortunately, the integration between core’s responsive images module and WebP is lacking. If you’re interested, follow along in the issue queue

Start using WebP today

However, you can use WebP today by using either the WebP or ImageAPI Optimize WebP modules. Both modules support Drupal 8 and 9, integrate with Drupal core’s Responsive Image module, and generate WebP derivative images for each image style. 

For the ImageAPI Optimize WebP module, note that only the dev version supports Drupal 9 integration with the Responsive Image module.

Support for the Stage File Proxy module

If you use the Stage File Proxy module to pull production images to your local environments automatically, you’ll need to download a patch to support WebP images. This issue was brought up at Lullabot’s weekly engineering roundtable, and we subsequently created and submitted the patch. 

WebP on Lullabot.com

We’re using the WebP module on Lullabot.com. You can see the difference yourself using Chrome Developer Tools.

On Lullabot’s Our Work page, the WebP module saves over 1MB of data at wide viewports. Note that these images were already fairly optimized JPGs. In situations where users are uploading PNGs, the savings are even more substantial!

WebP is only one piece of your site’s image serving strategy

Serving WebP images is not a panacea to speeding up images on your website, but it is a great start. 

Other important best practices for serving images include

  • Using Responsive Images
  • Lazy loading images (Drupal 9.1 and newer does this by default)
  • Ensuring that images contain width and height attributes to prevent content shifts

Remember, images are only one factor of your site’s overall performance. If you’re in doubt, defer to measurement tools rather than hard and fast rules.  

May 26 2021

Owing to the numerous aspects and applications of web personalization, it might be a confusing term for many despite its omnipresence across the virtual world today. Web personalization is best understood as individualization of a website customised in a manner to cater to an individual's unique requirements. The importance of this concept can be realised from the fact that the founder and CEO of Amazon, Jeff Bezos, prophetically mentioned something along the lines of personalization almost 2 decades ago.

"If we have 4.5 million customers, we shouldn't have one store, we should have 4.5 million stores."

Jeff Bezos in 1998


Why do you need website personalisation?

There are numerous statistics and numbers that point towards better business relationships with customers when personalisation is adopted, as when a user feels a personal connection with a piece of content or a layout, he is more likely to invest in the product or service. This is substantiated by the fact that Invesp's survey points out that around 56% of online shoppers are likely to return to a website when they see product recommendations customised according to their interests. Running a cross check from the other end as well, it has also been noted by MarketingProfs that 66% of marketers said that a chance of an improved business performance was their main reason for going forward with personalization. Listed below are some additional statistics that showcase the benefits of web personalization.

three blue circle with text enclosed talking about the benefits of web personalisation


On the other hand, if data and numbers don't speak to you, and you'd rather dig deeper into the human value of building relationships than just care about the turnover, web personalization is still your best friend. Apart from the very obvious benefits of increased conversions and sales, here are the ways in which web personalization helps a business.

  • Meeting customer expectations

From customers' point of view, in this age of integrated tools and technologies, web personalization is not something out of the blue - rather, they expect a degree of customisation from the businesses that they associate with. As backed by sufficient research and data, custom recommendations and suggestions work better in attracting and retaining customers in the longer run.

  • Building brand loyalty

As a subsequent side effect of having invested more into the research and the choices made by the customers, you would garner greater brand loyalty. Tailored marketing strategies, no doubt, speak more to the user than the generic ones. 

  • Engaging your customers

User relationships are nurtured and allowed to grow with your website creating relevant offers, content and CTAs (Call To Action). Once you have done the job of gaining their attention right, the purchase can be accelerated even further by using personalization.

  • Trustworthiness

When you give to the customer, you are bound to receive back. With a smooth user experience across the site and recommendations for products that they actually like and value, customers are bound to trust your website more than others and are likely to rely on your suggestions more.

Types of website personalisation 

User controlled

User-controlled personalization is where the entire website is controlled by custom criteria set by the users themselves. Users are usually explicitly asked about their specifics related to geography, gender, languages spoken, etc. These kinds of websites rely on the fact that users will take the time to say who they are in order to be directed towards personalized products and services based on their own inputs.

The most common method of initiating user-controlled personalization is by asking the consumer to create an account and hence provide a bunch of details about themselves in the process.

For the geographical location of the user, more often than not, their IP(Internet Protocol) addresses are tracked to see what part of the globe they belong to.

Behavioral

In the other kind of personalization, users are not explicitly asked about what they want and who they are. Personalization here is based more on the user's website browsing history and on their interactions with the website content. The data so gathered is then utilized to understand their deeper interests and gauge their persona. For example, Netflix first observes what shows one watches and then recommends newer shows based on that history. Considering how much data this process involves, the research can be a pretty daunting task, but it works in a magical way once conducted properly.

Contextual

Contextual personalisation is when the activities that you do in the day are analysed and used as a context to provide further recommendations to you. Perhaps the best example of this is a fitness tracking software - it recommends activities to you based on how much you’ve walked or exercised in a day. 

Personalisation can also be applied at varying degrees and levels depending on where you want your user to find you. 

Email personalisation

Consumers that identify themselves to a business by giving out their email address for some form of notifications or newsletters are sent targeted emails that form a part of essential communication based on the data that the consumer has provided. Such addresses are utilised for targeted email campaigns, which differ from other generic ones as they have a greater chance of lead-to-user conversion.

Campaigns

Custom landing pages might be created from time to time to support the cause of a campaign if it is relevant to the organisation's ethical cause. Usually CTA is placed in these landing pages to donate or contribute in some manner. 
Implementing web personalisation

Where to start?

With research, of course! Research regarding personalization will vary greatly depending on the kind of industry and business vertical you are looking to create content for. It should also align with your other goals and objectives and fall in line with the rest of your tools in use. While collecting data, ensure that it is factually correct and not outdated, that you are adhering to all regulations while collecting it, and that you have a virtual silo for storing all of this data and extracting it whenever it is required.

Identifying your audience

Once all the data is within arm's reach, use it to identify your target audience and also to create the personas of your visitors. These personas should be vividly detailed and should contain all relevant information needed to create custom experiences. Here's an example of a detailed user persona.

three columns with text outlining a user persona


Details like this will save essential time and effort while going through the data of thousands of users as you'll know exactly what to look for. This will help you immensely in the process of identifying your target audience. Learn more about the importance of audience segmentation in web personalisation here.

Setting your goals

Assuming that the research and segmentation part is done and dusted, it is now time to get realistic and think about what is achievable, how many resources are required to complete the exercise, and whether the allocation is in line with the bigger picture. Once the data is studied and understood, you can create an actionable plan that demonstrates how you intend to achieve your goals by using personalization and what your monthly or quarterly targets should be. 

Also always keep in mind that personalization is an incremental process and as you get more data with each interaction, the metrics need to be updated - that is, if you want to stay updated.

Mapping out the approach

But where does the plan of action start? Do you create a separate landing page based on the data, or should it be incorporated in the home page itself? Now is the time to lay out a plan for personalizing the navigation and examining the touch points where the data needs to be delivered. You should not lose focus of the main goal, which is to generate maximum returns from this exercise. The right or wrong placement of a single line has the capacity to make or break your efforts.

Content and Design

This is the step where all of the research done is put into practice. The actual elements of a page that a customer interacts with are its content and design, and these should be highly reflective of what the user wants. The teams dealing with these two elements should be given access to the data and should be briefed comprehensively about what their work should comprise. Everything from the colour palette, the layout, to the kinds of images used are dynamic - but should also be intrinsically connected to your brand image. You should not be struggling to maintain your identity in the face of all of this personalization.

  • The header is your hero. Rightly said, the first impression is extremely important and it is important to put forward a good one. If you can personalize the header using an IP address to create a personal greeting to the customer, there would be nothing like it. The sub heading underneath every header and the image in the backdrop can also be personalized for each category of consumer that you are targeting. 
  • Add elements like featured blogs and other content based on the interactions that the user has had - like recommending ebooks or product related educational pieces. 
  • Highlight specific features of a product or solutions that you offer which are sure to resonate with the kind of audience that you are trying to attract. You could also try experimenting with motion UI to ensure that attention is attracted towards that part of the page.
  • A huge driver can also be customer testimonials, hence be sure to include some at a prominent spot in the homepage. Logos of brands that are associated with you, are partners with you or have invested in your company also add to the trustworthiness factor.

To know more, read about the significance of UX in the age of personalisation.

But is it working?

Early trends in metrics should not be ignored. Keep some analytics tools handy for examining the early trends in data after everything has been implemented. Have a clear analysis of the 'before' so that you can compare the 'after'. Take into account things like

  • Time spent on the site
  • Frequency of visitors
  • Content interacted with
  • Volume of new and returning visitors in comparison to the previous volumes recorded

Dive into the statistics and use it for the initial A/B testing. Analyse the data from different perspectives by keeping into account any seasonal or contextual effects that might have impacted the user behaviour during the said period of time. 

For mid-stage metrics, data points should be drawn from

  • Number of actions taken on the site
  • User feedbacks, if any
  • Quality of visitors - are they a part of your target audience?
  • Trends in lead to user conversion

Lastly, these metrics should be utilized to provide better iterations in the future.

Web personalisation in the near future

Less physical, more digital.

A minuscule number of companies have been deploying personalisation beyond their digital presence. It has been predicted that all the places of interaction that have been largely physical till now will gradually be digitised, like food chains or clothing stores. Several clothing stores that have digitally transformed themselves to adhere to the present times allow 'online trials' using augmented reality to provide a personalized experience to the consumer. AI (Artificial Intelligence) enabled tools are utilised to improve the services - for example, it has been noted that brands like Macy's and Starbucks use GPS (Global Positioning System) to trigger relevant in-app push notifications to the consumer when they are near one of their stores. To know more, read how machine learning enhances personalisation at scale and in what different ways machine learning can be utilised for effective web personalisation.

Hence, in upcoming years we can expect the digitisation of presently predominantly physical spaces.

Scaling empathy

What makes AI stand out in this world surrounded by technology is its ability to incrementally improve after every interaction. Recently, McKinsey noted that smart speakers like Alexa and Google Home are getting smarter over time, adding more to their skill sets day by day.
 

bar graphs with black and blue bars talking about the smartness of home devices like alexa and their role in perosnalisation


The relationship of humans with technology and personalization is at a somewhat peculiar junction where we want the machines to understand us and not be completely mechanical - rather, a slight emotional, human touch is preferred. Thus, the concept of empathy will scale and devices will be expected to be more understanding of human behaviour in different times and contexts.

Formation of ecosystems

It is well known by now that a shopper's buying experience cycle - starting right from the introduction to a product till the end result of finally buying it is a cumulation of several touchpoints both offline and online that contributed to the overall buying experience. For example, an ad seen online, a promotional phone call, an offline retail store all might have contributed to the experience but at different timelines and regions. The next step towards personalization as it expands further would be to create a connection between all of these points and create an entire ecosystem to get personalized products and services delivered to the customer. It does not seem to be that far away, with smart home devices working towards integrating the entire place together and using shared experiences to render personalized offerings. For a complete guide on web personalisation, read here.

Taking the future as a reference

It is always better to live in the future than in the present. If you want to future-proof your strategies, it would be wise to roll out bigger investments in customer data and analytics. You can also make investments in finding and training translators so that you can personalize your brand's experience even further for the global crowd. The future of personalisation is agile and cross-disciplinary, and traditional marketing efforts could go futile. Therefore, it would be best to refer to the upcoming trends and strategize your efforts accordingly, in order to remain a step ahead of the others. 

May 26 2021

Those of us working in the digital industry have been lucky enough to continue working from home throughout the pandemic rather than lose our jobs or get furloughed.

Now, as the successfully carried out vaccination programs are promising a return to the office, many employers are considering whether to return to fully in-office work, go fully remote - or combine the best of both worlds in a hybrid working arrangement, typically with 2 or 3 days per week working on site.

Seeing how effective remote working has been, a return to full-time office work seems largely unnecessary; and yet, we know that collaboration is more easily enabled and team spirit more easily realized with in-person interactions.

Taking this into account, the hybrid approach to work seems the best way to move forward for a lot of employers. All the LinkedIn and Twitter polls seem to indicate that this is what employees would prefer as well.

So, in this article, we’ll be taking a closer look at 5 main things companies need to guarantee in order to ensure a successful transition to hybrid work. You can use these as a checklist to help you determine how much you still need to do, or if you’re all set to go hybrid once a return to the office is possible.

1. Choice

The number one thing you need to provide is choice. Some employees perform better when they’re in the office full-time, while others thrive in a fully remote environment.

You need to offer the choice of working remotely or fully in-office even when transitioning to hybrid work, and you need to offer it on a per employee basis.

Depending on what your hybrid arrangement looks like, you should also consider providing the choice of which days to come to the office, unless certain days are specifically dedicated to meetings and in-person collaboration.

2. Transparency and trust

A culture based on transparency is essential for effective remote working, and that remains true in a hybrid environment.

Openness and transparency will facilitate collaborating via asynchronous communication and distributed workplaces, prevent key information from getting siloed and help keep everyone aligned on the same goals.

Trust is also key, but it needs to be a two-way street; you can’t expect employees to trust you if you don’t first establish a culture of trust and show that you trust them. By making trust one of your core values, you’ll be able to invest less on-going effort into things like daily checkups and micromanagement.

3. People-first culture

The companies that already had an employee-centric company culture pre-Covid were much more easily able to transition into and thrive in a remote arrangement, and again, this will still hold true for hybrid.

The importance of a good employee experience has grown drastically throughout the pandemic, and for successful hybrid work, a key element will be flexibility. Why should your employees be subject to inefficient processes and schedules just because that’s how things had always been done before?

With colleagues no longer fully in office together and asynchronous communication on the rise, there’s really no need for super strict daily work schedules. You should have some time overlap outlined for easier collaboration, of course, but you shouldn’t just copy-paste the former in-house schedule.

Empathy and understanding are also crucial here - in fact, they’re what forms the basis of a truly people-first culture. All the other factors we’ve discussed so far are rooted in and enabled by empathy, and it’s become an essential element of business due to Covid and other societal shifts that are affecting work.

4. Well-set up processes and communication channels

Under the previous point we highlighted inefficient processes as a thing to avoid in hybrid work. Indeed, a top prerequisite for ensuring successful hybrid work is well-established processes and communication channels.

Right away at the start of the pandemic, effective video conferencing solutions became a must, and other tools that enable asynchronous collaboration have also been seeing more and more adoption. Messaging apps such as Slack were already important pre-Covid, but they’ve now become indispensable.

In order to truly succeed with hybrid work, however, you’ll need to think even bigger than just such tools. Efficient communication is all well and good, but if it’s bogged down by inefficiencies in business processes, its benefits are all but wasted.

At Agiledrop, we realized this early on into the pandemic and started working on an internal dashboard, accessible to all employees, which would drastically facilitate tasks such as time tracking or applying for vacation or sick leave, while providing managers with a highly capable and easy to use management tool.

We successfully implemented this in the second half of 2020 and have since streamlined it even more to include features like skills reviewing. And, while the dashboard is already super useful for employees, this usefulness can hardly compare to how much it facilitates the job of our project and resource managers.

This has definitely been a huge boon that’s allowed us to make the best use of working in a remote and distributed setting. If this sounds like something your company needs as well, reach out to us and we can help develop the right solution to optimize your processes.

5. Agility

Taking all the above points into account, one final success factor for hybrid work that we need to point out is embracing agility. This doesn’t necessarily mean you’ll pick and stick with a specific framework 100% of the time, but you’ll inevitably adopt some agile practices just by following the previous points of advice.

Flexibility and async communication are common agile elements, and as we’ve already pointed out, these are key for effective remote and hybrid work. To learn more about how agility can enhance collaboration in times of Covid, check out our recent podcast episode.

If you’re a heavily technology-oriented business, investing more fully in a framework like DevOps may be the right way to go; in any case, your teams will benefit from any lean and agile tactics you implement, either naturally or deliberately.

What’s more, agility will give you an additional competitive edge by facilitating innovation, reducing your time to market and enabling you to be more future-ready. Along with more flexible and hybrid work, these will all be essential building blocks of the future of work (which has in fact already begun).

Conclusion

Abstract female figure managing digital experiences

Before 2020, becoming a hybrid company would likely have demanded slow and painful overhauls, but thanks to the now over 14-month long remote work experiment, it now only requires some minor adjustments and refinements, which are themselves much less of a hassle with the whole world already more prepared to accept change.

Most of the points discussed in this article require very little organizationally, and even with the ones that do, the returns make any more substantial investments more than worthwhile. We’re seeing more and more focus on the employee experience, and giving employees the choice to work in the way that makes them perform best is definitely an integral part of it. 

We hope this article helps you make a smooth and successful transition to hybrid work. As said, if you could use some extra development capacity for a new tool or process optimization, drop us a line and let’s talk about how we can help you out.

May 25 2021
May 25

Credits & Thanks

Thank you to:

About Tokens and the Token System

To fully take advantage of patterns, you need to understand a little about the Token system.

Tokens are variables in Drupal. There are thousands of Tokens available for you to use. To get a UI in the admin area for browsing all available tokens (and have tokens that aren’t available in core), then you’ll want to have your developer install the Token module.

To see what they are and get a better understanding of how they work, you can go to Manage > Help and click on the Token link.

drupal token listing


Bear in mind that not all of the tokens are available in all areas. For now, we are going to focus on just a handful of critical Tokens that we’ll use to create our URL path patterns. They are:

[node:title] - The title of the piece of content being displayed.
[term:vocabulary] - The vocabulary (top level category, so to speak) of the current taxonomy page.
[term:name] - The name of the current term (bottom level category).

KEY CONCEPT:
 
Whenever Drupal sends a page to a visitor, it first replaces the tokens with the corresponding text. i.e. the “Today” token might be replaced with “February 22, 2021” or “August 26, 2021”.
 
Don’t worry if this doesn’t make sense yet. What you need to know right now is that we’re going to tell Drupal to create some paths for us, and we’re going to use Tokens to make it happen.
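
For developers who want to see this from code, here is a minimal sketch using Drupal's token service (it assumes a loaded $node object, for example from a custom module or a drush php session; it is an illustration only and not required for the steps below):

$token_service = \Drupal::token();
// Replace the [node:title] token, using the given node as context.
$title = $token_service->replace('[node:title]', ['node' => $node]);
// For a node titled "Cat Pictures", $title now contains "Cat Pictures".
// Pathauto later cleans this value (lowercase, dashes, stripped punctuation) before saving the alias.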

Create Pathauto Patterns

Drupal URL paths operate in patterns. Instead of creating a path to every single piece of content, it’s better to specify a pattern (using tokens) for groups of content. Drupal will follow the pattern to create the path for each new piece of content, ensuring consistency across your website.

You’re going to add a Pathauto pattern for each Content Type and taxonomy that you have.

  1. Go to the URL aliases page: Click Manage > Configuration > Search and metadata > URL aliases (Coffee: url aliases) or visit https:///admin/config/search/path/patterns.
     
  2. Click the Patterns tab.
     drupal path patterns in pathauto
  3. Click the + Add Pathauto pattern button.
     
  4. From the Pattern type drop-down, select Content. The form will update to reflect that content type’s settings.

    Note: Leave existing path patterns as they are unless you have a good reason to change them. Any changes to these settings will not change existing pages, only pages created moving forward, and could create issues with your content siloing efforts.

    drupal pathauto pattern type dropdown selection
  5. Fill out the fields as shown.
     
    • Path pattern: “[node:title]”
    • Content type: select “Article” and “Basic page”
    • Label: Anything goes. We use something descriptive such as “Content Types: Article & Basic”
    • Select the Enabled checkbox.

      drupal pathauto pattern field settings

  6. Click the Save button near the bottom of the page. The resulting page will look something like this:

    pathauto page results
     

  7. Repeat the above steps for each of your Content Types.
     

    Note: When you create new content types, you’ll want to go through this process for each of those at that time.

Now, when you create new pages of these content types, the [node:title] will be replaced with a normalized version of the title of the content. Drupal will change the letters to lowercase, replace spaces with dashes, and remove any odd characters.

For example, a page created with the title “Cat Pictures for the Cat Lover in All of Us” would get converted to something like this:

https:///cat-pictures-cat-lover-all-us

Going a step further with Pathauto

If you create your patterns like the example above, you will have a flat website with no hierarchy. If you created three basic pages called “Our Products”, “Our Team”, and “Our Customers” then you’d have three pages that look like this:

https://www.example.com/our-products
https://www.example.com/our-team
https://www.example.com/our-customers

Maybe that’s what you want, but maybe you want something a little deeper. You can edit the patterns you’ve created or delete them and create new ones. For example, let’s say you’ve created a new Content Type for your blog called “Blog Postings” and you want them to be under the /blog directory. You’d create a Pathauto pattern that looks like this:

pathauto patterns using tokens and static content

As you can see, you’d enter blog/[node:title] in the Path pattern field. Your blog URLs might look like this:

https://www.example.com/blog/my-happy-cat
https://www.example.com/blog/my-big-cat
https://www.example.com/blog/why-i-love-cats

Or, if your blog focuses on a single topic (cats, is it?) then you might put this:

cat-blog/[node:title]

Which produces this:

https://www.example.com/cat-blog/happy-cats
https://www.example.com/cat-blog/sad-cats
https://www.example.com/cat-blog/why-cats

Better! Now you’ve used the powerful key phrase “cat blog” which improves your SEO for that keyword. Now, when you create each new piece of content, it will be in the /cat-blog/ section of your website.
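
If you prefer to create the same kind of pattern from code (for example in an update or deploy hook), a rough sketch using the Pathauto API could look like this; the machine name "blog_postings" and the bundle "blog_posting" are made-up examples for a hypothetical Blog Postings content type:

use Drupal\pathauto\Entity\PathautoPattern;

// Create a "blog/[node:title]" pattern and limit it to the Blog Postings content type.
$pattern = PathautoPattern::create([
  'id' => 'blog_postings',
  'label' => 'Blog postings',
  'type' => 'canonical_entities:node',
  'pattern' => 'blog/[node:title]',
  'weight' => 0,
]);
$pattern->addSelectionCondition([
  'id' => 'entity_bundle:node',
  'bundles' => ['blog_posting' => 'blog_posting'],
  'negate' => FALSE,
  'context_mapping' => ['node' => 'node'],
]);
$pattern->save();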

Now let’s update your taxonomy terms. Let’s say you create a new Pathauto pattern that looks like this:

pathauto patterns for taxonomies in drupal

You might be wondering about the Path pattern for the Taxonomy term.

[term:vocabulary] is the top level category that the term belongs to. In this case, Tags.
[term:name] is the name of the tag (i.e., the tag itself).

In use, it might look like this:

https://www.example.com/tags/siamese
https://www.example.com/tags/persian
https://www.example.com/tags/abyssinian/

This is not an exhaustive look at tokens, but it should be a good start when conducting SEO for your website.

Did you like this walk through? Please tell your friends about it!


May 25 2021
May 25

Update: After some delays, the new estimate for this release is 20:00UTC on May 26th, 2021. Apologies for the inconvenience.

There will be a security release of Drupal Core 8.9.x, and 9.1.x on May 26th, 2021 between 16:00 - 18:00 UTC. This Public Service Advisory is to notify that the Drupal core release is outside of the regular schedule of security releases. For all security updates, the Drupal Security Team urges you to reserve time for core updates at that time because there is some risk that exploits might be developed within hours or days. Security release announcements will appear on the Drupal.org security advisory page.

The security risk of the advisory is currently rated as Moderately Critical.
This is not a mass-exploitable vulnerability as far as the security team is currently aware.


Given that this is a moderately critical vulnerability and is not believed to be mass exploitable, it is not covered by Drupal Steward partners.

May 25 2021
May 25

One of the ways to engage your visitors is by using various multimedia on your website, and Drupal’s Media module provides a fantastic way to achieve that. Support for easily embedding third-party videos from a URL into your site’s content, called the oEmbed feature, was added to the Media module in Drupal 8.6.0.

While having oEmbed support in Drupal core is great, there are a few key requirements that might still be missing. In this blog we shall try out the oEmbed Providers module, which extends the core media oEmbed functionality. The oEmbed Providers module is compatible with Drupal 8.8.4 and above (and also works with Drupal 9!).

oEmbed Providers Module

What is oEmbed Technology?

According to oembed.com, oEmbed is a format for allowing an embedded representation of a URL on third party sites. The simple API allows a website to display embedded content (such as photos or videos) when a user posts a link to that resource, without having to parse the resource directly.

Basically, oEmbed allows one website's content to be embedded into another web page. The oEmbed format is supported by the most popular resources with multimedia content - Facebook, Twitter, YouTube, Instagram, Flickr, Vimeo and many more.
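
To make that concrete, here is a rough sketch of the kind of request an oEmbed consumer such as Drupal's Media module makes behind the scenes (this is an illustration rather than the module's actual code; the YouTube oEmbed endpoint is real, while VIDEO_ID is a placeholder):

// Ask YouTube's oEmbed endpoint for the embed markup of a video URL.
$response = \Drupal::httpClient()->get('https://www.youtube.com/oembed', [
  'query' => [
    'url' => 'https://www.youtube.com/watch?v=VIDEO_ID',
    'format' => 'json',
  ],
]);
$data = json_decode((string) $response->getBody(), TRUE);
// $data['html'] now holds the markup (typically an iframe) that gets embedded in the page.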

Configuring oEmbed Providers on Drupal

Step 1: Enabling the Media and Media Library core modules

  • Enable the Media and Media Library core modules on your site.
Media Library

Step 2: Install and Enable the oEmbed Providers module

oEmbed Provider

Step 3: Configuring the oEmbed Providers module

  • Once enabled you can visit the configuration form at /admin/config/media/oembed-providers.
  • This module provides several features as listed below by extending core's oEmbed functionality:

         a. Add custom oEmbed providers

            ▪ Core Media provides no option for adding custom providers. The oEmbed providers module lets you do that by navigating to Configuration >  Media > oEmbed Providers > Custom Providers and clicking on Add oEmbed provider.

Custom oEmbed Provider

             • Fill in the necessary details to define a custom oEmbed provider and save the configuration form. Now you will be able to embed videos from this custom provider into your Drupal site. Here, I have given an example of a RadioPublic provider; you can fill in the details of a provider of your choice.

Add Custom oEmbed ProviderAdd oEmbed Provider

       

     b. Global enable/disable providers

      ▪ The oEmbed Providers module provides a user interface for enabling and disabling providers.

Allowed Provider

   

     c. Modify the providers list URL

        ▪ The oEmbed Providers module provides a user interface to alter the providers URL; by default, the core Media module fetches the list of providers from https://oembed.com/providers.json

oEmbed Provider URL

     d. Disable the fetching of providers list

        ▪ The oEmbed Providers module provides a user interface to disable the external fetching of the providers list, for instance, when the site uses only custom providers.

External Fetch

   

    e. Provides hook_oembed_providers_alter()

       ▪  The oEmbed Providers module provides a hook_oembed_providers_alter() that lets you alter provider definitions (a minimal example follows below).
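
A minimal sketch of such a hook implementation, placed in a custom module, is shown below ("mymodule" is a placeholder name, and the array keys are assumed to mirror the oembed.com providers.json format):

/**
 * Implements hook_oembed_providers_alter().
 */
function mymodule_oembed_providers_alter(array &$providers) {
  // Example: drop a provider you never want editors to embed from.
  foreach ($providers as $key => $provider) {
    if (isset($provider['provider_name']) && $provider['provider_name'] === 'Vimeo') {
      unset($providers[$key]);
    }
  }
}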

Step 4: Adding media to content

  • Once you're done adding a custom oEmbed provider, you can now embed the videos from that custom provider into your Drupal site.

  • Add a media field to any of your content types and choose ‘Remote Video’ media type.

  • Now provide a video link from your custom provider. Here we have created a RadioPublic provider and saved the content.
Add Media

Okay, that’s it! You can now see the videos from your custom provider being rendered on your Drupal site.

Video Testing

Please Note: Once you get your custom providers working, you can also contribute them to the public oEmbed repository. It contains configuration information (the registry) for oEmbed providers, as YAML files in the providers directory. Once your PR gets merged, it will be listed in https://oembed.com/providers.json and thus available for all Drupal developers to use with just core Media. A few months back, the YAML for the RadioPublic provider (as shown in the above example) got merged into the repository and it’s now available for us to use in Drupal. How cool is that?!

See the Drupal.org change record for more details on the oEmbed support and status.

May 25 2021
May 25

A little while back, almost two years ago, Dries Buytaert wrote an interesting thought piece on the sustainability of open source projects such as Drupal. He reviewed the ways different actors engage with open source projects, dividing them into two camps, the Makers and the Takers. The makers build and create, providing benefit to the wider community and society. The takers are able to benefit from this creative process whilst not suffering any of the costs of the creative process, allowing them to gain a competitive advantage.

The difference between Makers and Takers is not always 100% clear, but as a rule of thumb, Makers directly invest in growing both their business and the Open Source project. Takers are solely focused on growing their business and let others take care of the Open Source project they rely on.

In order to demonstrate the difference in outcomes for makers and takers, a payoff matrix was provided. It shows that if everyone contributes there are shared advantages; however, if "takers" decide not to contribute, they will win out, as they do not bear the costs of contributing to the project.

                                  Company A contributes        Company A doesn't contribute
  Company B contributes           A makes $50, B makes $50     A makes $60, B makes $20
  Company B doesn't contribute    A makes $20, B makes $60     A makes $10, B makes $10


At the time the article did have an impact on me as it was an attempt by Dries to bring outside concepts to help understand the Drupal project and where it might be heading. Thinking such as this leads to informed ways of conceiving a future for projects such as Drupal and how they might shape themselves to thrive. 

The article examined concepts such as the Prisoners’ dilemma, public goods, common goods, free riders, the tragedy of the commons. Following on from conclusions by Hardin and Olson, the core problem for Dries was that “groups don't act on their shared interests”. How can a group organise to avoid the 'free rider' problem? Dries focused on a conclusion from Ostrom who writes “For any appropriator to have a minimal interest in coordinating patterns of appropriation and provision, some set of appropriators must be able to exclude others from access and appropriation rights.” The conclusion was that “Takers will be Takers until they have an incentive to become Makers.” These ideas have driven some changes being implemented at the Drupal Association, such as contribution credits and organisation accreditation and membership.

These thoughts have also been influential at Morpht, the Drupal agency where I work. We have adopted a set of foundation principles for the company. One of the key concepts is that we are Makers and Creators and value contributing to the Drupal project and the community. In practice, we have built internal systems to incentivise and reward everyone in the company to contribute back where they can. Outcomes of the process include a rise in commits to the project and a more open approach to how we share our code. We are also financial contributors to the project, supporting the Drupal Association as a supporting partner.


Green bearded altruism

Recently an intriguing video popped up in my stream “Simulating Green Beard Altruism” by Justin Helps. It is expertly researched, explained and visualized. It really is worth watching, so go on, I’ll give you a few minutes to take a look.

[embedded content]

For the uninitiated, myself included, Richard Dawkins coined green bearded altruism in his book The Selfish Gene. It represents a way for one actor to signal to another that they are altruistic. If altruists can recognise other altruists, they are able to direct their altruism at them and increase their chances of survival.

"I have a green beard and I will be altruistic to anyone else with green beard". 
Richard Dawkins, The Selfish Gene

The video took this concept and ran some simulations on how Altruists and Cowards fare under different conditions and rules. This setup relates closely to Dries' post around the sustainability of open source ecosystems.

The Makers (Altruists) build the system and positively benefit the whole. The Takers (Cowards) benefit more because they do not suffer the costs of maintaining the system. In this scenario, the Takers (Cowards) win out and thrive.

But what happens when the Makers (Altruists) only share the benefit with other Makers? They succeed and the Takers (Cowards) are less successful. The simulation, as set up, provides a possible way forward for open source projects such as Drupal. If you are a good actor and only reward other good actors, positive results will flow and will continue to flow.

The final simulation in the video throws in the curveball of actors being dishonest. Sometimes a Coward will pretend to be an Altruist, tricking them into helping them. What do we find?

  • Altruists who do not signal their altruism tend to die out in a system where Altruists are rewarded. They are labelled as Suckers - doing useful things but not being recognised to their detriment.
  • Cowards who masquerade as Altruists are successful, reaping the benefits and suffering none of the costs. 

And most concerningly, in a world where actors can hide their true identity, even the Cowards, acting as Cowards, have success. The Altruists in their various forms cannot compete. This final outcome is depressing. What is the point of being altruistic in a world where others can just take advantage? Being altruistic is not enough if actors are gaming the system against you.

In a wider context what could we learn from these simulations? If an individual or organisation is indeed a Maker, they should signal this to others and be rewarded or recognised. This runs against the desire to be modest, but it does appear to be a sensible thing to do. Conversely, those gaming the system should be called out and somehow excluded.

A 2021 update

Dries has recently returned to the Makers and Takers concept in the Q&A following the Dries note at DrupalCon Global 2021. The video is yet to be released to the public but will be added here once available. Those of you with access to the video on Vimeo can take a look at 7:12 - 9:06. Dries says:

"Open source is a public good, meaning that everyone can use it. If I use a public good it doesn’t exclude you from using it either… One interesting thing is that leads which are essential to business are not a public good. It is actually a common good, meaning that there is a scarcity to it. If I win a deal, you can’t win that deal. So one of the things that we can do is make sure that the leads, the potential customers, …  go to those who contribute. 

There is something there that I really believe strongly. If we can route leads to organisations that contribute we will maximise the number of contributions Drupal will get and I believe that these organisations are often better serviced too because they get help from those organisations that actually help build Drupal. Its kind of a win win.

Sometimes I feel that we are afraid to talk about these topics because they may be somewhat controversial, but there is so much more that we can do."

These comments take the thinking to the next step. There is a recognition that if the payoff for altruistic behaviour is financial, this will lead to further contributions. The way to achieve this is through the “routing of leads”.

Applying it all to Drupal

The main takeaways from the above appear to be:

  • We should be encouraging altruistic behaviour because it benefits the project.
  • Altruists can still benefit if they receive the benefit from other altruists.
  • Real financial benefits need to flow to the altruists if they are to be motivated.

So what is happening in the Drupal space?

The Marketplace

In recent times the Drupal project has reorganized itself to encourage more good actors. This has largely been done through the mechanism of recording contributions and promoting those who have contributed the most. Drupal agencies have been encouraged to support staff in contributing. The results are reflected in the Drupal Marketplace. The system gamifies contributions, motivating Drupal service providers to contribute more to move up the leaderboard.

It appears that the following aspects have been valued:

  • Commit credits to core, contrib and other issues.
  • Publishing of case studies.
  • Financial support of the Drupal Association.

There are ongoing efforts to broaden this out and to further incentivise contributions from individuals and organisations.

Increase the exposure

The marketplace system does represent a huge step forward in demonstrating the contributions made by the various providers. It is like an X-ray into who is doing what in the community. It does suffer from a number of shortcomings:

  • What exposure does the marketplace have to potential clients?
  • When a client is looking to engage a Drupal agency, are they referring to the marketplace? If they are not, then the real financial benefits may not flow to the agencies. Then, the system is just a game between the players. It benefits Drupal for sure, but does it benefit the players?

In order to be effective, the marketplace needs more exposure to end clients so that the 'routing of leads' can be improved.

The little guy

The Drupal Marketplace currently ranks organisations according to their credits in absolute terms. To my knowledge the rankings are not normalised by organisation size.

It would be interesting to see what the results would be if the rankings were normalised by employee count. We would then be able to see who the biggest altruists were, dollar for dollar. This would give more incentive to smaller organisations to contribute so they could better signal their altruism. 

Advertisements on drupal.org

It is possible to advertise on drupal.org in a variety of positions. You may be familiar with the prominently displayed ads for private companies at the bottom of pages. These ads are visible across the whole of drupal.org and benefit from more exposure than a listing on the marketplace.

Open up the ad space

This form of promotion is obviously much more 'private' in nature, designed to promote the interests of the advertiser rather than that of the project. It is a way for actors to promote their own self interests against others and the interests of the project as a whole.

The advertising space is currently a vehicle to promote the interests of a selected few, rather than all of the altruists in the system. It creates a feeling of 'them vs. us' in the community to have certain players promoted in this way and not others. The Drupal Association should consider how this space could support all contributors. I would suggest that revenue for this space could still be maintained whilst opening it up proportionally based on contributions.

Sponsorship and visibility

Supporters of the Drupal Association receive promotion at Drupal conferences and in other ways. A sponsorship entitles a service provider to a variety of advantages, the main one of which is the promotion through badges and logo display at conferences. 

Continue the drive for more members

Supporting the project in this way can be more appealing than one-off sponsorship at individual conferences. Supporters do get quite a good level of exposure at conferences and this is a good way to signal altruism to other members of the community.

This system appears to be working reasonably well. Supporting the project is a good thing and sponsorship is a direct way to do it. The big challenge here is to encourage the non-subscribers to jump on board. If all individuals and companies did this in just a small way, the financial security of the Drupal Association would be assured. This has always been the case. As a community we should be encouraging this where we can. So if you are not yet a member or sponsor, you know what to do :).

Burnout and community funding

It is not uncommon for certain prolific or influential contributors to leave the community. A common reason would be burnout because of the stress of sustaining an important codebase in their spare time. It is not sustainable for them to do so, especially when there are many demands for support of features.

Most recently we have seen Jacob Rockowitz, maintainer of the Webform module, post several articles discussing this situation. The result was the decision to move to a sponsored approach using the Open Collective platform. This model encouraged users of the Webform module, the Takers, to help contribute and become Makers by continuing the development of the module.

Closing the altruism loop

The way this message has been communicated has, in my opinion, been done in a very positive way. If you look at the Webform module page, there is a call for support through code, patches and reviews. And for those who cannot do that, financial options exist. This is a direct move to increase the altruism in the community and to close the loop between altruists helping other altruists.

Who sponsors Drupal?

In his blog post: Who Sponsors Drupal, Dries makes the point that companies or smaller agencies support most Drupal development. The bulk of the codebase is supported by actors who no doubt have an active interest in open source and Drupal, as well as the financial and technical ability to help support the codebase.

Deep pockets and shallow expertise

What can be done to broaden this out further, so that we can get more Makers and fewer Takers?

Larger entities with lower technical capability should have an easy way to fund development of code. Once the altruism feedback loop is strengthened, there should be a bigger drive for Takers to become Makers. We need a path for this to take place.

This is not the same as sponsoring the Drupal Association, i.e. infrastructure, promotion and governance. This is about funding code development, strengthening the code and functionality of the project and making it more attractive as a technological proposition.

In order for this to take place, two things need to happen:

  1. The Drupal Association, or some other body, needs to be ready to take on this responsibility.
  2. A method of dividing the resources needs to be determined.

At the moment, the efforts in this area have been ad hoc. There are notable examples of large companies contributing to initiatives which push the project forward. However, in order for this to be scalable, it needs to be done in a more systemic manner. It may be that the Drupal Association doesn't want to take this responsibility on - that is fair enough. If so, how might it be done?

The Webform module has turned to Open Collective. A recent article from Rachel Norfolk makes a similar suggestion. Maybe the DA is taking a look at what is going on with Webform?

Shallow pockets and deep expertise

And what of the individual developer? The one who loves Drupal, loves open source and dedicates their time to improving the project. What if they are not supported financially by a larger organisation? These people are the lifeblood of the project and they need to be supported. It makes a heap of sense to harness their creative energies and support them financially to progress the project. 

If we can find a way to put these Makers together with those with the funds and the desire, great strides will be taken.

I would therefore suggest that this seems to be the most practical approach for closing the altruism loop and progressing the project.

Conclusion

While I make some suggestions, I believe Drupal is in an excellent position. The community is broad and deep and there is a lot of desire to keep the momentum going. There are also some excellent systems in place to help reinforce this such as recognition for contribution and the marketplace.

I have argued for the following:

  • Continuing with the credit system and the marketplace;
  • Increase the prominence of the Drupal Marketplace to outsiders;
  • Promote organisations who are punching above their weight in terms of contributions per employee;
  • Continue the sponsorship approach with the community encouraging membership to the Drupal Association;
  • Reconsider the advertising space on the Drupal Association as something for Makers rather than being private for a select few;
  • Develop a system to bring Makers together with Takers who have deep pockets.

The biggest challenge, where we can make the most gains, lies in bringing the big Takers into the fold and supporting talented individuals who do not otherwise have support, and with that closing the altruism feedback loop and increasing the chances for the project to grow and improve.
 

May 25 2021
May 25

Providing an optimum digital platform that is accessible, meaningful and communicative of your organisation’s plans and objectives is imperative to creating a healthy brand image and to being considerate of your customers’ demands and expectations of you. Most consumers today expect a widespread and uniform digital presence from the brand that they are associating with - a presence that lays everything out on the table and leaves close to nothing behind closed doors.

This is where the concept of a Digital Experience Platform (DXP) comes in. A DXP is an integrated software framework that is used to reach out to a wide horizon of audiences through an array of digital channels and touchpoints. The recent decades of the digital boom have brought it to many organisations’ notice to invest in DXPs to build, deploy and continually improve their websites, applications and digital experiences. 

There are two core principles that stand out in the explanation of a digital experience platform.

  • Multiple integrated technologies are needed to connect to a wide array of digital touchpoints.
  • Management of all of these expanded experiences is done from a central platform that acts as a single control centre.

Let's have a look at the visual definition of a DXP.

Visual representation of a DXP. Source: Xtevia

A recent report showcased that 47% of consumers would abandon a brand that doesn't provide relevant product suggestions and shop from a renowned organisation like Amazon instead. Impact found that companies have been realising the importance of an expanded digital presence, with over 44% of the surveyed companies presently working towards a digital-first approach, while 34% had already undergone a transformation. 

Owing to these statistics, it comes as no surprise that the Global Digital Experience Platform Market continues to keep expanding year after year.

Bar graph tracking the growth in popularity of DXPs. Source: Data Bridge Market Research

Kinds of DXPs

Two of the primary approaches to Digital Experience Platforms are:

Open DXP

An Open DXP is a platform that serves as the connective bridge of digital experiences by integrating multiple products from different vendors so that they can work and perform as one. Hence organisations that already have a wide digital presence and own multiple digital experience products will be able to utilize an Open DXP better.

Closed DXP

A closed DXP, on the other hand, is also a one-stop shop that has all the major DXP components - but the difference is that all of these components are maintained by a single provider and integration with other applications is fairly limited. 

When it comes to choosing the right approach out of these two, it can be very tricky for an organisation as it depends upon what a certain business might be looking for. For instance, an organisation can avoid closed DXP if it does not want to completely ditch different parts of its platform that are working well for it today. On the other hand, choosing open DXP can be fruitful as it lets a business maintain the part of its experience toolset that is working fine. An Open source DXP can lend flexibility to an organisation’s future. A digital business should still look at what it really wants based on all the different factors.

Based on an organisation’s specific use cases, DXPs have branched into three further categories:

CMS DXPs

For today’s creative agencies that require more management tools and creative freedom than what a CMS offers, a CMS (Content Management System) DXP makes the cut by focusing more on the needs of the marketing department as well as the user experience (UX) of the platform. These are particularly sought-after if the business is B2C - where the sales cycle is short and the audience is large - as CMS-heritage DXPs have strong offerings for web-based analytics, user segmentation, advertising and campaigns. Learn more about choosing the right CMS and why Drupal can be the way to go.

Portal DXPs

On the other hand if your business is one of those where long-term customer relationships are valued even after the transaction has been made, a Portal DXP is better suited for the purpose. These trace their origins back to providing customer portals and help businesses understand the factors that lead to better customer loyalty and retention. These DXPs can be especially useful for gathering inputs before providing customer service and assistance in issue resolution. 

Commerce DXPs

Commerce DXPs are used for setting up online e-commerce platforms. In addition to product related content delivery and online shopping platform style web interfaces, Commerce DXPs also provide the capabilities related to inventory management, shopping cart, payment integration and checkouts.

DXP vs CMS vs WEM

Now that we're done with defining and analysing the different kinds of DXPs, let’s address the most common query when it comes to these platforms - how is a DXP different from a CMS or a WEM (Web Experience Management)? As the diagram below shows, these similar sounding terms have different use cases and are suited to companies with different goals and aspirations. A DXP has grown out of the limitations posed by a traditional CMS, with regard to creating and managing customer experiences - letting the user dive deeper into the concepts and applications of UX. Forrester's Mark Grannan defines the difference between CMS and DXP as 

“Web CMS is critical for developing, managing and optimizing web, mobile, and other content-based experiences. API-first architecture and cloud deployments are reshaping the packaging of digital capabilities into more granular tools that can be assembled on demand.”
column consisting of three subheadings each talking about DXP, CMS and WEM


A Web Experience Management platform was also born out of newer customer needs, introducing cross-channel functionality for content and data delivery. A WEM enables a brand's business units to share information digitally across channels, and gives the business greater visibility into the user behaviours and personas according to their activities on these channels. 

But the ultimate solution for delivering multi-channel marketing and better user experiences emerges in the form of a DXP, as it brings out the best in both worlds. 

When do you need a DXP?

But of course, not everyone requires a DXP, as it completely depends on an organisation's present needs and future aspirations. One should look to onboard a DXP if 

  • Building relationships with your audience throughout the transaction is one of the chief goals of your organisation.
  • Your future plans include organising omni-channel experiences that provide you the ability to reach out and deliver content to multiple devices and channels like mobile phones, tablets, email inboxes and social media.
  • In a bid to enhance the User Experience of your digital presence, your organisation's pipeline mentions several integrations of new data and platforms to optimise the content and the layouts for your end users. 
  • Further elaborating on the UX, content personalization and differentiated customer experiences are rated high on importance to retain customers and create value. When you work with a DXP, it does the job of collecting consumer data, defining user personas, and serving custom content to specific audiences, also alongside connecting this data with other channels like social media and mobile applications. Hence, a DXP integrates with other systems and departments to create highly personalized experiences for your audience.
  • A platform that does most of the analytical thinking by itself is also great for comprehensive strategizing and planning your roadmaps across channels all while having a single, sturdy management. Not just in the planning, a DXP is also great for retrospection and management of the entire campaign and for incorporating the inputs so received into the workflow.

Choosing the right DXP

While it may seem like a daunting task to choose the perfect digital experience platform for your enterprise, the job becomes much easier when you go through the process step by step. Choosing the ideal DXP should consist of the following steps.

Map out your requirements

It is easy to get caught up in the flashy new features offered in the market, but you must keep in mind that your company has unique requirements and that the features and functionalities on your wishlist shouldn't just be there because they're an upcoming trend or a buzzword. Anything that you plan to invest in should add value to the organisation. Set realistic goals and lay out your requirements in a phase-by-phase manner, ranging from high priority to low priority.

Assess your readiness for transformation

There is quite a lot of change that happens when fundamental decisions like onboarding a DXP are taken. It is now time to analyse whether your content delivery channels and your diverse audience would benefit from this change - and if they would, would they benefit enough for you to go through the ordeal? Is your audience receptive to change and does it welcome new initiatives? Or is it a more laid-back user base that identifies with traditional systems and methodologies? More importantly, do you have enough resources to absorb the changes that come about - i.e., are you ready both in spirit and capacity to make this change?

Onboard some experts

Bigger projects require a broader perspective, so do not take it upon yourself to analyse who the leading technology vendors are and which one to go for, as there is expert help available for that at any point. Keep an eye out for analysts like Gartner and Forrester that regularly release rankings of the top players in digital experience, commerce, CMS and the like. All of these should be considered while the research for a new DXP is underway.

Evaluating surrounding ecosystems

Don’t trace the journey without conducting thorough research about what is going on in the market and with your competitors. Note which organisations used which parts of different technologies and tools, and how well these have helped the enterprise flourish over time. Also take the time to examine the goals or features of other businesses that overlap with your organization's and how they have been catalysing their workflow with the use of newer tech. Methods that are already tried and tested will help you out a great deal, as there is no better testimony than a practical example.

Also ask your vendor for plenty of references for you to study during the research process to be absolutely sure before finalising a DXP.

Plan a roadmap

Lastly, don’t think only about the present but also the future. There should be a pipeline of tasks that are to be done with regard to the platform in the next few months. Team discussions regarding the execution and usage of the DXP should take place beforehand so that everything works at its optimum. 

Benefits of using a DXP

There's a reason why many big organisations seem to be gravitating towards DXPs to take care of their present and future roadmaps. This is because a DXP comes with some pretty solid advantages.

Integrated Controls

DXPs are best suited for multi-channel deliveries and expanding into the far reaches of the web, all while keeping a single integrated control panel. A DXP is well equipped to create comprehensive strategies across different platforms while keeping operations seamless, enabling close collaboration among web page optimisation, content optimisation and email campaigns. Not only this, you can also rely on the platform for analytics tracking and A/B testing.

graphics of a phone, community, store and a watch showing the central integration facilitated by a DXP


Flexibility

DXPs can be pretty dynamic and flexible in application, and that is exactly what's needed in the fast-evolving world of today. DXPs, owing to their open, API-first (Application Programming Interface) architecture, are flexible enough to integrate the latest technologies so that you can serve them to your customers as soon as possible. Thus, a DXP is accommodative of changes in both your plans and the scope of the project, as scalability is one of its primary strengths.

Personalization

As DXPs are great at the analysing, tracking, and everything else that constitutes user research before rolling out a product, they are simultaneously great for creating personalized user experiences on these channels by taking into account the several inputs received over these touchpoints. If your business bases itself on creating unique experiences for your customers, a DXP is quite a godsend. 

Being future proof

Investing in a platform which is as moldable and customisable as a DXP also is a method of future proofing yourself against an impending technical debt. Apart from this, as organisations dive deeper into the user journeys, relationships that could sustain in the future are also nurtured as needs are understood as well as addressed in a better manner. The information that a DXP gathers for you can turn out to be invaluable in the time to come.

Exploring upcoming digital experience trends

DXP and AI

To study all the data flowing in from multiple sources, analyse it, and to bring forward better content and servicing after reviewing the inputs received and also to continuously maintain and improve the process, artificial intelligence is brought into the loop. The entire concept of a DXP is based on artificial intelligence - AI is interwoven intrinsically in DXP and has access to data from every tool and touchpoint. In turn, the DXP acts as the ultimate seat for the AI to understand and improve the experience from customer acquisition to loyalty.

Voice Interaction

'Smart speakers' constitute a space in the digital sphere that hasn't been mastered yet. But there is no denying that they have become an essential part in the digital experience journey of a customer, and brands that can offer these services seamlessly are sure to have competitive advantage over the others. More about conversational UI here.

Maintaining Privacy

Customer privacy is more important than ever. If you can give your customers the assurance that all data collected from them is used for a purpose that they know of, hence involving them in the brand's operations, it adds great value to your overall relationship. It shows the end user that you have nothing to hide, that you are transparent, and more importantly, that you can be trusted. 

The process matters

Consumers will care more about how you do things. Do you tap into the trends to deliver updated content to your user base? Or is your style more laid back and rigid? As business relationships get more intermingled, consumers are going to care about more than just the product that they're purchasing. They are also going to be interested in your method of operations, your digital presence, your accessibility on the various channels you're present in, your problem solving techniques, etc. 

Digital will bridge the gap

Digital has been the saving grace during these stressed times, and will continue to be the saviour later on. Companies will rely on their digital platforms to reach the far corners of the globe and their scattered audiences. Even within the organisation, a business that is digitally integrated is, in a way, future proof, as it delivers greater visibility into the work of each department and hence better insights into the working of the entire organisation as a whole. This enables one to correct mistakes faster and achieve greater efficiency.

It is inevitable that a business that is flexible, omnipresent, considerate and updated will win the competitive edge in the near future. Hence, it is recommended to start your digital transformation journey as early as possible!

May 25 2021
May 25

Last week we released a new version of OpenLucius: a lean and fast Drupal social collaboration distribution with features like: groups, social posts, messages, group chats, stories, file and folder management, notebooks, categories, activity streams, notifications, @-mentions, comments and likes.

OpenLucius 2.0 has been in production for the last ~5 months and is stable enough to go into beta! We also keep on improving; broadly, this is what we did since the last release:

  • Added new features;
  • Enhanced existing features;
  • Tweaked UI / Design.

All the work we did was based on feedback we got internally, from our customers and from trial users. And we plan to keep it this way, so if you have ideas for new or better features: let me know!

The fastest way to explore OpenLucius is by trying it via the product site. And since a lot has changed, I thought I'd make it easy on myself by just showing off the current main features, with the newest on top. Here you go:

Task / Kanban Board (*Sneak peek*)

You can already try this task board, but it needs work to get it to an open source release. We plan on releasing this as an add-on contrib module:

Screenshot kanban board

@-mentions (*new*)

What we really missed in the previous version were @-mentions, so those are now included in texts and chats, with autocomplete:

screenshot @mentions

@group mention: As you can see, you can also mention everyone in the current group.

Technical background: via Drupal core's CKEditor this was hard to accomplish, so we tested out other editors and came up with open source editor Summernote, a light-weight editor based on Bootstrap. It's extendable, able to facilitate inline-editing and it's also very nice that it automatically inherits theme styling (since it's not loaded via an iframe). 

Also, we are building a Kanban/Scrum board with highly interactive modals and for example: inline editing of card descriptions and comments. For that we also needed a lean editor.

And last but not least: we could tweak the editor UI, making it blend with the theme smoothly.

Summernote also facilitates drag-and-drop images & texts:

Drag/drop images (*new*)

So drag-and-drop images is now available in all text editors:

screenshot drag and drop images

It also has some great, user friendly, inline image options:

Screenshot image options summernote editor

General settings (*new*)

Set global colors, homepage tabs and homepage image:

screenshot General settings

Order book pages (*new*)

You can now order book pages easily, with unlimited depth:

Screenshot order pages

Groups

Screenshot groups home

  1. Group name, with drop down for group settings.
  2. Group sections, with activity badges that you can turn on/off per group.
  3. Group activity stream, bundled per day.

Activity streams (global and per group)

Homepage with activity stream example:

Screenshot homepage

  1. Home banner, configurable;
  2. Stories;
  3. Activity stream, personalised, bundled per day, per group;
  4. Your Groups, with link to group archive;
  5. Social posts, global, can also be turned off.

Social posts

Screenshot  social posts

Messages

Screenshot messages

Group chats

Screenshot group chats

Stories

Screenshot stories

File and folder management

Screenshot docs and files

Notebooks

Overview:

Screenshot notebooks

  1. Hierarchical book pages;
  2. Order pages modal;
  3. Like and comment;
  4. Add file attachments to comments and notebooks.

Order pages easily, with unlimited depth:

Screenshot order pages

Use notebooks for example for:

  • Project documentation
  • Manuals
  • Notes
  • Web links
  • Agreements
  • Minutes
  • Ideas
  • Brainstorm sessions
  • Onboarding information
  • House rules
  • Customer information
  • ...whatever needs text.

Notifications (non-disturbing)

Screenshot notifications

Comments

Screenshot comments

Likes

Screenshot Likes

Get it, got it, get that!

That's it for now, if you want to test OpenLucius this instant, that of course is possible via the product website. Or download and install OpenLucius yourself via the project page on Drupal.org

Planet Drupal Written by Joris Snoek | May 25, 2021


May 25 2021
May 25

While all open source thrives on collective inputs and collaboration, there is a need for a centralised, channelizing force every now and then to bring out the best in the existing resources. For any organisation that engages in open source, it is important to remain proactive and to constantly reflect on the direction in which its open source investments are heading. 

In any project leadership provides the much-needed vision pathway and blueprint for progressing in a certain direction. Even when something is collaborative, there needs to be a supervisor's perspective to solve problems and meet goals efficiently. 

More often than not, companies have open-source programs with objectives in place that they aim to achieve through the program. Assigning a lead not only improves the overall efficiency of the project but also helps in a number of other ways.

Some of the reasons why organisations care about diving into open source are as follows -

blue background with black text talking about the various benefits of using open source programs


The Seven Laws of Leadership in open source

Leading an open source program varies greatly from the same role in other dimensions in the sense that it is not as straightforward and upfront. And a lot of times people might mistakenly assume that leadership is not required in such a project but they could not be more wrong. Something as vast, spontaneous and continuously evolving needs to be managed and utilised well at every point and in turn needs more supervision than regular platforms.

Let's move ahead and explore some mutually agreed and understood laws of open-source leadership that make a good leader in these platforms.

Competence

Open source leaders are very different from conventional bosses. As can be seen from Linus Torvalds, founder of Linux, or Jim Whitehurst, CEO of Red Hat, their personalities spark liveliness and dialogue. They are highly approachable and remain open to the community that they are a part of, and can be seen acknowledging and crediting their success to the community regularly. 

It can hence be deduced that a successful open source leader rarely thinks just of himself; it is always the values of togetherness and teamwork that take the cake when it comes to allocating credit for the platform's success and growth. This is exactly the mentality that is required in an open source leader - somebody who derives their authority not from the designation that they hold but from the millions of people who toil every day to keep the community thriving.

Being a teamplayer

Conventional leadership is largely built around authority, charisma and holding onto the crown as long as possible. It is hierarchical and depends on the concept of formal leadership. 

On the contrary, an open-source leader isn't supposed to be perfect and utopian. He is more like the first among equals - he is only as good as his team members and is not required to stand head and shoulders above the others in the team. He is supposed to be on the ground and lead the team by example. An open-source leader, hence, is not an unattainable figure, but just one commoner who has the ability to manage and improvise. 

Being spontaneous

For a platform that is as fast moving and evolutionary at every second, a laid-back leader just won't make the cut. You should be someone who is capable of taking spontaneous decisions as the leader here should be someone who can formulate and execute ideas quickly. Swift decision making is the key to succeeding in open source, and it must be the central idea of all plans of action. 

The open-source crowd is not patient enough for leaders who slowly drop suggestions, complain about an issue, or expect their contemporaries to resolve the problem at hand. Spontaneity and team spirit are integral to an open source leader, and he is expected to pull up his socks and be the first one to start working.

Autonomy

In open source, autonomy replaces hegemony in the leadership role. The management does not exercise coercive control to get everything done; rather, all the tasks are worked on and pulled together as a team. The job of the leader here is to lay down the framework for all the initiatives to be taken within the community and establish related objectives to bring those initiatives into action.

This shows that authority and control are obsolete concepts in this dimension; creativity and autonomy are the desired qualities of an ideal open source leader instead. He is not only supposed to collaborate and lead but also to empower the team members by providing every tool and piece of knowledge that is needed to get the work done. 

Delegating it right

Leadership in open source focuses on the right delegation and decentralization of duties. As stated above, blending in well with the team and keeping everybody on the same page is essential; each team member should be empowered to take authority and fix any minor issues that arise from time to time, rather than reaching out to the leader for every minor inconvenience. 

Never ignore the soft skills

Sure, getting the technicalities right is pretty important, but soft skills are the ones that pay off in the long run. More than anyone else, a leader requires a 'human touch' and needs to be empathetic and engaging with the team to keep each member's spirits high and productivity at its maximum. A leader should be able to perform, communicate clearly, set explicit expectations and stay result-oriented. Humility is also a great soft skill and shines above everything else.

Say no to micromanaging

Needless to say, micromanaging can be detrimental to such a vast project, as you might be biting off more than you can chew. It is physically impossible to manage everything that goes on on the ground in an open source platform, and one needs to realise that they cannot manage every project from tip to toe. A safer bet is to teach your team members the required skills and delegate tasks to them regularly - while trusting them with what they're doing.

Understanding the Leadership Role
 

five coloured circles stating the different qualities of an open source leader


Gaining leadership in an open source project can lead an enterprise to great commercial success. This is because

  • A lot of products depend on open source software technologies
  • The path to innovation unfolds much faster in open source
  • The software ecosystem is more or less dominated by open source.
  • Software can make or break your online business and is one of the biggest differentiating factors in today's industries. Therefore, having the best software matters now more than ever.

Now that we have covered the basic tenets of open source leadership, let's dive deeper into the model and understand the leadership role better. As widely recognised, this role is said to embody an integration of the following interrelated characteristics.

  • The first trait is to be alert to any intolerance or discrimination experienced by the team. Nothing kills team spirit quicker than bias. With such things to worry about, it is a big hurdle for anyone to look beyond being undervalued or excluded and perform to their full potential. The primary goal is to empower and motivate everybody and take care of all of their insecurities and weaknesses.
  • Branching out from the first characteristic is the second one - trustworthiness. To act as the glue amidst the chaos, the first quality that you need to develop is being trustworthy. Honesty from team members is extremely important to bring out the best in each resource, and you need to give them an open slate to chalk out their aspirations, fears and problems. 
  • A trait exclusively important to open-source leadership is being agile and quick, as the world will not wait for you to formulate an idea and implement it; it will move on at its own pace, and if you are left behind there is nothing you can do about it. 
  • Lastly, and most importantly, be accessible! You do not want good ideas to be silenced or issues not to be brought to your notice until the last minute. An open-source leader must acknowledge that authority and responsibility go hand in hand, and it is imperative for him to stay proactive and respond to each and every query and argument of his team.

This is how a leadership role in an open source project is outlined. After these essential requirements have been met, the next step would be to break the project down into consecutive steps and work through them one at a time.

Defining the project structure

Most open source projects follow the rule book in having a common repository that defines all the major principles and objectives of the project's governance model, structure, rules and processes. An open source project may have any one of three common governance structures - BDFL (Benevolent Dictator For Life), meritocracy and liberal contribution.

Allocation of resources

Allocation of resources also includes assignment of roles to different team members. A member might be 

  • A contributor - someone who creates or follows up on an issue.
  • A committer - someone who has been given write access to the repository.
  • A maintainer - someone who sets priorities and directions for a project or a special interest group. 

Mapping out the requirements

This is actually one of the most difficult parts of the process. The requirements in conventional projects are already defined by the company's authorities, but open source requirements come from users and contributors. Because of this, work in open source has to be very agile and swift. Needless to say, the pipeline keeps changing and evolving with each passing day.

Defining coding standards

The team must have rules and standards for both the overall process and the code, explicitly documented and shared on a common platform that ensures easy day-to-day communication. A set of directions on how to proceed with the code avoids unnecessary conflicts and irregularities later in the project and saves crucial time. 

Mentoring 

While everybody is expected to perform at their individual level, the leader is also expected to go the extra mile and establish a homogenous flow to help everyone on the team. Regular interactions and stand-ups might help reinforce the common agenda. A simple pathway should be put in place for volunteers to start contributing to the project. Learn more about the perks of contributing to open source here.

How to make the most of open source?

Utilizing and making the most of an open source platform can become significantly easier when you have a strategy in place. 

  • Owing to the dynamic nature of open source, goals should be set and reset on a regular basis. Once the agenda has been finalized, it needs to be reviewed annually or biannually so that projects sustain themselves and do not become obsolete by the time you get them done. Therefore, even long-term plans in open source can only stretch up to a certain extent.
  • Needless to say, the planning needs to be proactive, and there is absolutely no place for neglect in an open source platform, especially from the leaders' end. Even when a project is complete, it needs to be maintained in order to adapt to the current software trends in your targeted ecosystem.
  • The tech team should remain in sync with every other stakeholder so that the product they are engineering is always close to the final idea in everyone's mind. To stay updated, try to get appropriate feedback after every milestone and incorporate it into the execution.

Lastly, one of the most important leadership roles is to foster a culture of psychological safety where contributors feel free and excited about contributing to a project and not worried about the response they will get. This is how Michael Cardy, Red Hat's Chief Technology Strategist, puts it:

“The culture has to treat failure as an opportunity for learning, and failure is only bad when lessons are not sought and shared with the rest of the organization.”
May 21 2021
May 21

The homepage is the most important element of a website. The first impression of the user depends on it – whether they'll stay on the website, whether they'll be interested in the company's offer and the company itself. In just a few simple steps, we'll show you how to create a homepage that will help you keep your visitors' attention.

 

What elements should a homepage contain?

A prospect enters the homepage. You have a few seconds to interest them and make them continue to check out your offer. How can you do that? You need to correctly convey the most important information that will immediately answer the customer's questions.

Banner with a call to action

First of all, you need to briefly describe what your company does. This type of information is often displayed in a banner at the top of the page. In this section, you can add a CTA (Call to Action). It's a link which, when clicked, redirects the user to the destination area, e.g. to the next section of a page or to a contact form.

Blocks with content

You should also put blocks that contain information about what your company can offer to a potential customer and why they should choose your offer. Various ways of presenting information are suitable for these purposes: from simple text blocks to advanced interactive carousels that allow you to fit more content into the fairly tight framework of a single block.

Once you know what elements to put on the home page, the matter of creating the page in a fairly quick and easy way arises. Website builders – one of which is Droopler – are suitable for solving such a problem.

 

Create a homepage in Droopler

Droopler is an open source website builder based on CMS/CMF Drupal. It contains ready-made components and tools that allow you to quickly create modern webpages. Droopler is simple to edit and easy to expand. Let's try to create the most important elements of a homepage with its help.

Banner paragraph

As we've already mentioned, one of the most important elements of a homepage is a banner with brief information about the company at the top of the page. In order to add a block of this type, press the “Paragraph overlay” button on the top right of the page.

The Paragraph overlay button is located on the top right of the page in Droopler

 

Next, you have the option to add a new block in the selected area. After pressing the "Add" button, select the "Banner" component in the pop-up window.

Adding a new block on the homepage
In the pop-up window, select the Banner component to add it to the homepage

 

In the next window, you need to add a banner background. Additionally, you can add a title, subtitle, icon that appears at the top of the block and some brief text. It’s also possible to add the already mentioned Call to Action button and configure its main attributes.

Configuring a new paragraph for a homepage in Droopler

 

In the "Settings" tab, you can choose the shape of the block: placing the text in the central part of the banner or on the left side with a translucent background. It’s also possible to set different colour versions of a paragraph and define your own colours to ensure good readability of the text. Additional options include configuring the margins and padding, as well as adding additional classes for the block.

In the Settings tab, we can configure more settings for a particular paragraph

 

After pressing the "Save" button, the first block of the main page will be put in the lower right corner of the window:

The block with a banner on a homepage in Droopler

 

Carousel paragraph

A carousel is a type of block that contains a looped slideshow that can be controlled by the user. In order to create such a block in Droopler, select "Carousel" in the component list window.

Selecting a Carousel component to add it to the homepage

 

As in the case of a banner, you can add a title, icon, description and CTA. In the "Settings" tab, it’s also possible to set the margins. In addition, you can define the number of columns - how many elements should be displayed in this block.

Configuring the settings for a Carousel paragraph

 

The carousel items are added in the "Items" tab. You can choose your own title, image, description and link for each of them. The same as for the entire paragraph, for every element, you can set separate graphic themes - change the colour of the background or text. The next picture shows a finished carousel.

A finished carousel with graphic and text elements in Droopler

 

About Us section

You can create a block with information about the company using the "Text page" paragraph type. It's a simple section where you can enter a title, subtitle, add an icon, text and link. The available settings are the same as for a banner and carousel. After all the fields are filled in, the section will look like this:

The About Us section on the homepage, created using Text page paragraph

 

Summary

In this article, we explained how to create the most important elements of a homepage in a fairly easy and quick way. We used Droopler – a modern website builder, which not only allows you to build websites without coding, but also facilitates Drupal development.

May 21 2021
May 21

1) Flexibility

Drupal is extremely flexible and versatile. Due to the modular structure, content and layout are separated from each other. Both can be edited or exchanged independently without affecting the site as a whole.

The page can be changed or extended at any time. For example, it is possible to add language variants, redesign the layout, add further user interaction, integrate new data sources, etc.

2) Tailored solutions

Drupal is very well customizable and adaptable to the actual goals and requirements of the web project. There are (almost) no limits to the ideas and possibilities.

As a CMS, Drupal offers a very good basis, which can be extended and customized via thousands of extensions (modules). The already large selection is also constantly being expanded with new modules. Thus, there is already a pre-built solution for an enormous number of use cases. The Drupal site is then assembled only from the building blocks that are needed for the particular use case. And of course, these can be extended at any time.

3) Usability & Accessibility

Drupal has a strong focus on usability for users and accessibility for visitors.

The backend for Drupal users is kept clear and focused. Users can work from anywhere via the browser and do not need any additional software. Via the integrated role and rights system, the view and editing options are adapted for the user groups. Thus, all users (e.g. marketing, editors, developers, etc.) can work clearly and efficiently in their work environment.

Accessibility standards are a basic requirement in Drupal. Drupal ensures that all features and content comply with the World Wide Web Consortium's WCAG and WAI-ARIA guidelines as well as the ADA guidelines. This supports a low-barrier website that allows easy access to information for all users.

4) Open interfaces

Drupal integrates really well with the tools and software in your existing business environment. Through robust APIs, the Drupal site can communicate and interoperate with many other systems. For example, your own CRM software, external media databases, marketing tools, etc. can be connected with Drupal.

5) Community & Security

Drupal is an open source solution and is (further) developed by an extremely dedicated community. Worldwide, many web professionals invest their time in documentation, testing, code review, further development and networking. Thus, they are constantly working on the improvement of Drupal, which becomes visible in the form of ever new features and modules.

An important role is played by the security team within the community. The experts continuously analyze and eliminate potential risks. Thus, every new module undergoes a strict quality test by the security team before release.

May 21 2021
May 21

This year, Pantheon set out to evaluate the digital landscape and recognize the world’s best WebOps leaders for creating web experiences that “drive deep, positive change.”

As a leading hosting platform for Drupal sites, and a top SaaS-based website operations platform for developers, designers, and marketers, Pantheon provides a distinct perspective and knowledge of the current scope of transformative solutions that are making a difference. That’s among the reasons why we believe that the Promet Provus platform’s receipt of Pantheon’s 2021 Lightning Award for Innovation represents such a profound achievement.

Pantheon selected Provus:

For creating an innovative user experience enabling content editors to quickly and easily customize, iterate and deploy websites on-brand and at scale for large, multisite networks.

Citing an example of what Provus is enabling Promet Source to achieve in service to our clients, Pantheon further noted that, “The [Promet] team successfully rolled out 53 websites for [a large county website], completing two to three sites every six weeks."

What is Provus?

A Drupal platform that provides marketers and content editors with a library of design components, along with drag-and-drop capabilities to quickly build and update pages as needed, Provus is paving the way for a new normal of no-code, content editor empowerment.


Inspired by the realization that nearly every website consists of various combinations of roughly 15-20 types of features or patterns, Provus organizes a library of high-quality components that can be repurposed for low-code, no-code site building, to create a foundation for:

  • Easier content editing capabilities with drag-and-drop functionality,
  • Greater design flexibility within defined brand standards, and
  • Streamlined development using Drupal’s proven content models.

How Provus Works

Combining the latest drag-and-drop page-building tools in Drupal with a curated library of components, Provus enables content editors to layer designs, add functionality, and rearrange layouts for more engaging and interactive web experiences. Provus is also accelerating development by combining commonly used content types and features with a library of off-the-shelf components.  

Provus on Pantheon pushes this accelerated development cycle even further by leveraging Pantheon’s WebOps tools, enabling integration in ways that include: 

  • Access to Terminus through the Docksal Docker orchestration tool.
  • Use of a single command, “fin pull,” to get the latest database and media assets. 
  • An automatic push of updated versions, to Pantheon from Travis, when pull requests are merged to the “develop” branch.  
  • Configuration import and database updates that happen automatically using Quicksilver.
  • Notification of Slack channels when new code is deployed using Quicksilver.

Under the Hood

Provus leverages Drupal’s Layout Builder module, enhanced by additional contributed modules that include Layout Builder Browser, Layout Builder Library, along with some custom configuration and theming. 

To create high quality, flexible, and accessible components, Provus uses the Emulsify theme, taking advantage of Emulsify’s Storybook integration to facilitate the development and reuse of components across projects. 

Provus’ most flexible and powerful component, the “Link Group,” allows marketers and content editors to choose any content type and apply any of the available cards or component layouts. For example, a content editor can choose events, filter by the event type, sort by title or date, and then place that on the page in a number of available carousels, content lists types, or in rows or columns.

An essential differentiator with Provus is the built-in brand governance that ensures marketers and content editors are working within a framework of brand and style guidelines to maintain a consistent aesthetic, brand-aligned user experience. 

See a demo of Provus in action!

 

Inherently Open Source

Provus is Open Source and is powered within an Open Source ecosystem that includes:

  • Docker
  • Cypress
  • Behat
  • Storybook
  • Drupal
  • MySQL / MariaDB
  • Solr
  • PHP
  • Redis

Signature Promet Source

I could not be more proud of the collective brainpower and the Promet team’s dedication to our clients’ success that has factored into the development of Provus. Provus serves as a powerful representation of our fervent dedication to our clients’ success, as we consistently ask the kinds of questions that begin with “Why Not?” and “What If?” 


It’s great to win awards and the Pantheon 2021 Lightning Award for Innovation is a big one. But truly, our greatest reward is the opportunity to serve as a trusted partner for our clients, while continuing to ignite digital possibilities.


Interested in a demo of Provus in action, or want to talk about new possibilities? Let us know what we can do for you! 
 

May 20 2021
May 20

When I ask people why they fell in love with Drupal, most often they talk about feeling empowered to build ambitious websites with little or no code. In fact, the journey of many Drupalists started with Drupal's low-code approach to site building.

With that in mind, I proposed a new Project Browser initiative in my DrupalCon North America keynote. A Project Browser makes it easy for site builders to find and install modules. You shouldn't need to use the command line!

Making module discovery and module installation easier is long overdue. It's time to kick off this initiative! I will host the first meeting on May 24th between 14:30 UTC and 15:15 UTC. We'll share a Zoom link on the Project Browser Slack channel before the meeting starts. Join our Slack channel and mark your calendars.

We'll start the meeting with high-level planning, and we need people with all kinds of skills. For example, we'll need help defining requirements, help designing and prototyping the user experience, and much more.

May 20 2021
May 20

Our normally scheduled call to chat about all things Drupal and nonprofits will happen TODAY, Thursday, May 20, at 1pm ET / 10am PT. (Convert to your local time zone.)

No set agenda this month, so we'll have plenty of time to discuss whatever Drupal-related thoughts are on your mind. 

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

This free call is sponsored by NTEN.org and open to everyone.

View notes of previous months' calls.

May 19 2021
May 19

In our final installment of our series on Laravel, Senior Software Engineer Laslo Horvath and Managing Director Michael Meyers talk about the future of application development, and Laravel’s role in that future.

Laravel’s strengths include a strong push towards reusable components. This methodology enables developers and designers to take advantage of using the things they need, and leave everything else behind. Laravel-based stacks enable website builders to more easily switch from developer to designer, and back.

Application development in Laravel is growing rapidly. Integrations with other software are common, and they’re bringing tools together that help developers work together more smoothly and efficiently. VILT (Vue.js, Inertia, Laravel, TailwindCSS) and TALL (TailwindCSS, Alpine.js, Laravel, Livewire) combinations really start to highlight the potential of these packages in concert. Learn more about how these stacks are being used, and where they're headed.

[embedded content]


Related content

For a transcript of this video, see Transcript: Laravel with Laslo - Part 3.

Photo by Nick Fewings on Unsplash

May 18 2021
May 18

The Drupal Community Working Group (CWG) is happy to announce the addition of Donna Bungard to the Conflict Resolution Team. Based in New Hampshire, United States, Donna has been contributing to the Drupal project and community for more than five years. She is the Accessibility Manager and a Senior Project Manager with Kanopi Studios, a skilled writer, and accessibility expert. 

Over the past year, the Drupal Community Working Group has expanded to include a Community Health Team, of which Donna has been a member since its inception. One of the goals of the Community Health Team was to provide an on-ramp to the Conflict Resolution Team.

In addition, we have formalized our on-boarding process for the Conflict Resolution Team, with the goal of making it easier for us to add members in the future and making participation on the team more sustainable for all members. As part of the on-boarding process, Donna will act as a provisional member of the Conflict Resolution Team with limited access to previous issues as well as mentoring on current issues.

As per the CWG charter, Donna has been approved by the CWG review panel.

We would like to thank the other 5 candidates we interviewed for the Conflict Resolution Team; we hope that all of them continue to remain active in Drupal Community health issues in the future. As we have decided to make the expansion a slow and deliberate process, we will be discussing additional options and future contributions with them.

The CWG is responsible for promoting and upholding the Drupal Code of Conduct and maintaining a friendly and welcoming community for the Drupal project. To learn more about the group and what we’ve been up to over the last year, check out our recently-published annual report
 

May 18 2021
May 18

php code on a laptop screen

To keep your organization at the forefront of open source security and innovation, now's the time for a Drupal upgrade or migration to Drupal 9. Drupal 7’s end-of-life is November 2022, but if you’re currently on Drupal 8, the end-of-life hits in November 2021.

In this guide, we’ll cover Mediacurrent’s tried and tested approach to upgrading sites of all sizes. Kick-off with a codebase audit. Then, tackle code and compatibility issues. In the final mile, run a first-attempt upgrade to find and fix any remaining issues. And finally, the actual Drupal 9 upgrade.

Introduction

As you think about your upgrade path (whether moving from Drupal 7 or 8), a good starting point in preparing for Drupal 9 is performing a readiness audit. An audit will assess the level of effort and provide recommendations and preparations for a smooth upgrade or migration path to Drupal 9.

At a high level, a Drupal 9 upgrade process would look like this:

  1. Audit the codebase for deprecated code

  2. Audit the codebase for composer compatibility

  3. Fix deprecated code issues

  4. Fix composer compatibility issues

  5. Attempt Drupal 9 upgrade and see what’s left

  6. Perform actual Drupal 9 upgrade

In the first blog post of this Drupal 9 Upgrade series, I will focus on the first two steps and show you how to audit a codebase for Drupal 9 readiness. By the end, you will have a better understanding of the process and level of effort required for a successful upgrade and be more prepared to estimate a budget and timeline.

Audit for Drupal 9 Readiness

Performing an initial audit of the codebase is straightforward. This process should result in tickets in your task management system for the updates to be performed, well before the actual Drupal 9 upgrade date.

Scan for deprecated code

Drupal-check is an invaluable tool for scanning files in a codebase to check their Drupal 9 readiness by looking for deprecated code -- basically code that was previously lingering around in Drupal 8 but is now removed from Drupal 9.

If drupal-check helps you during this process, and I’m sure it will, consider sponsoring the ongoing development and improvements of the project.

Install drupal-check

The most typical way to install drupal-check is to install it as part of the project via composer as a dev requirement. Be sure to review the drupal-check installation documentation.

composer require --dev mglaman/drupal-check

Run drupal-check

The drupal-check command can be run on any single file or directory. These steps assume it was installed in the project with composer, therefore the executable exists in the “vendor” folder and can be run as follows.

Here is an example of running the command against a contributed module that contains some deprecated code issues:

vendor/bin/drupal-check docroot/modules/contrib/allowed_formats

drupal-check error report

Now would be a good time to create a task in your task management system for addressing that deprecated code issue. Thankfully a solution already exists for the Allowed Formats contributed module that fixes this one particular issue.

That issue and fix were found by visiting the project page of the module you are working on making Drupal 9 ready, searching for “drupal 9” in the issue search box, and reviewing what Drupal 9 related issues exist.

Allowed Formats module drupal 9 issue search

There is typically an issue labeled “Drupal 9 Deprecated Code Report”, but it may be named something else, and there may also be multiple related issues.

Allowed Formats module Drupal 9 issue list

Here is another run of drupal-check against another contributed module, which, in this case, has no deprecated code issues.

vendor/bin/drupal-check docroot/modules/contrib/crop

drupal-check success report

While it appears at this time that this module is Drupal 9 ready per the drupal-check tests, it may not be completely Drupal 9 ready yet. In the next section, we’ll look at ways to check composer compatibility; once the module is both composer compatible and free of code deprecations, it will be in great shape for the Drupal 9 upgrade.

Good, but not perfect

The drupal-check tool is very helpful, but it is not perfect. Again, consider sponsoring the project to help continue future development and improvements!

One of the false positives drupal-check may report relates to class usage. For example, it may be validating a class that is used by a module that no longer exists in the codebase, e.g. old code from a Drupal 7 migration that’s no longer used, so there’s nothing to do about that.

Also, sometimes drupal-check does not catch certain issues. For example, the codebase had a custom module that still contained a call to the `path.alias_manager` service, but that service is no longer available in Drupal 9 and was moved to `path_alias.manager`. However, drupal-check did not report this as an issue - I only found out about this issue once the Drupal 9 site was built and I tried to access the page that was controlled by the code that contained that old, removed service class call.
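To illustrate the kind of change involved, here is a minimal sketch of that particular service rename (the '/node/1' path is only an example):

// Drupal 8 (deprecated): the alias manager was registered as 'path.alias_manager'.
$alias = \Drupal::service('path.alias_manager')->getAliasByPath('/node/1');

// Drupal 9: the service lives in the path_alias module as 'path_alias.manager'.
$alias = \Drupal::service('path_alias.manager')->getAliasByPath('/node/1');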

An alternative to drupal-check is to use the contributed module, Upgrade Status, to check on the Drupal 9 readiness of your existing site.

You should now have a good understanding of what custom and contributed packages need work to make them Drupal 9 ready. Be sure to keep notes or skip to the Issue Tracking section below.

Check composer compatibility

In addition to the deprecated code issues, the codebase also needs to be compatible with Drupal 9 from a composer perspective. Without packages being Drupal 9 supported in composer, composer will simply not allow the codebase to upgrade to Drupal 9.

What’s Not D9 Compatible?

A quick test you can do right away is to run this command, which will list all packages that are only supported by Drupal 8. These will need to be updated to also support Drupal 9.

composer why drupal/core | grep -Ev "\^9"

Essentially, this command tells us which other packages depend on Drupal core but do not yet declare Drupal 9 support in their composer metadata. Until they do, Composer will not allow a Drupal 9 upgrade. Read more about the Composer Why command.

The output of the command will look something like the following:

Composer compatibility check results

These values come from the composer.lock file and are basically the list of packages used in the codebase that depend on drupal/core, specifically packages that only work with Drupal 8. This will be pretty much all of the themes, modules, and profiles, unless they have been kept up to date on a regular basis, for example whenever security releases became available and necessary.

Just to be clear, the first lines that start with “drupal/core” can be ignored; the only ones to focus on are the other lines that reference contributed (or custom) modules.

For instance, in the example above the Facets module has a current version of “1.4.0”, or 8.x-1.4. This module will need to be updated to a later version, or a patch added that makes it Drupal 9 compliant; it might also be worth testing the current dev snapshot, as the necessary fixes might have been committed but not yet be available as a full release. It appears that, for the Facets module, the 8.x-1.5 release adds Drupal 9 support.
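If a release with Drupal 9 support already exists, updating it is typically a one-line Composer change. A minimal example for the Facets case above (assuming the standard drupal/facets package name and that nothing else pins the older version):

composer require 'drupal/facets:^1.5' --update-with-all-dependencies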

Each result in this output should be added as a new ticket in the ticket management system, tracking the packages that need updates to address the deprecated code and composer compatibility issues.

Issue Tracking

To track the status of packages and their Drupal 9 readiness, I recommend creating a spreadsheet that helps track the ongoing upgrade process and what needs to be done. The below is just a small subset of changes needed, but I do recommend having multiple rows per package, in the cases where there may be multiple Drupal.org issues to cover all the D9 related fixes you’ll need.

Without getting too deep into it right now (saving for the next blog post), I’ve seen deprecated code issues be addressed in one Drupal.org issue, and the fix was committed to the latest version. But simply updating to the latest version did not make it Drupal 9 ready, because there were still composer compatibility fixes, and in some more rare cases, even still some deprecated code issues that were missed in the first pass. So, those issues may be split among multiple issues on Drupal.org.

Tracking Drupal 9 issues in spreadsheet

This planning, tracking, and documentation will help you as you continue through the Drupal 9 Upgrade process and keep tabs on what is remaining along the way. It may also serve as a good starting point or baseline for the next Drupal 9 upgrade process you may be involved in.

Conclusion

This blog post focused on the up-front audit process and preparation work required to understand the amount of work required to get into Drupal 9 readiness status. In the next blog post in this Drupal 9 Upgrade series, we will work on fixing the deprecated code and composer compatibility issues we discovered and documented as a result of this audit process.

Mediacurrent is a top 10 contributor on Drupal.org and has created solutions to accelerate, ease, and enhance the upgrade process. We can help you prepare for a Drupal 9 upgrade. Reach out anytime if you want to discuss your Drupal upgrade path or are interested in our team performing a Drupal 9 audit on your site and how we can help you succeed.

May 18 2021
May 18


With Drupal’s new versioning and release planning came the promise of easy upgrades between major versions. No more major database overhauls. No more rewriting business logic just to keep things working. No more major investments in expensive migrations to maintain feature parity.

Upgrading to Drupal 9 from Drupal 8 provides the first test of these promises. Have the promises been fulfilled? 

Yes.

Is it as simple as flipping a switch?

No.

When moving to Drupal 9 from Drupal 8 (which reaches end-of-life on November 2, 2021), planning is required. The larger your codebase, the more you need to take into account. Here is a roadmap to get you started.

Create a module inventory

The Drupal 9 codebase is very similar to the Drupal 8.9 codebase, but it removes the code marked as deprecated. This deprecated code can range in impact depending on what you are using and what APIs you have taken advantage of. 

You need to look at three things: custom modules, contrib modules, and themes. Create a list in whatever format will work best for your team.

To get started, we suggest installing the Upgrade Status module. Enable it in a development environment. This will give you a good baseline of the information you want to track. To get this information into a spreadsheet, install this patch for the module.
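If your site is managed with Composer and Drush, a typical way to add Upgrade Status looks like this (the exact commands are an assumption; adjust them to your workflow):

composer require --dev drupal/upgrade_status
drush en upgrade_status -y

The report is then available under Reports in the admin UI.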

Organize the sheet to easily filter your contrib and custom modules because the work process will differ for each category.

Custom modules

Get drupal-check installed and running on a development environment or as part of your CI workflow. This will show you where the work is to get your custom codebase compatible with Drupal 9. 

The benefit of adding it to your CI workflow is to make sure you aren’t introducing any new incompatibilities. A VS Code extension is also available.
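As a rough sketch of what that CI step might look like, assuming your custom code lives in web/modules/custom, the job would simply run:

vendor/bin/drupal-check web/modules/custom

and fail the build whenever deprecated code is reported.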

For each module that has a Drupal 9 compatibility issue, create a ticket in your ticketing system. Enter the details. Some things will need minor changes, like a different function call. Others might need deeper refactoring. Identify those as early as possible. Now is also a great time to check the contrib space and see if you even need these custom modules anymore.

Contrib Modules

Organize the contrib modules on which you are dependent into groups that represent levels of effort. Ask the following questions:

  • Which have a D9 release, and which do not?
  • For modules with a D9 release, are they minor updates or major updates?
  • Which modules are we currently forking/patching that may need more hand-holding to get to their D9 release?
  • Will this contrib module even have a D9 release? Is there a different module that will take its place?

Your groups might look like this:

  • Group 1: D9 minor tagged release available
  • Group 2: D9 major tagged release available
  • Group 3: D9 minor dev release available, not forked
  • Group 4: D9 major dev release available, not forked
  • Group 5: D9 major or minor release available, forked modules
  • Group 6: D9 release is not yet available

Themes

If you use any contrib base or sub-themes, keep them on your list to track and treat them as modules. 

Twig has its own deprecations moving from Twig 1 to Twig 2. Make sure your templates are not using functions that are not supported in version 2. The Upgrade Status module should flag these for you. Many projects won’t require any changes, but it is best to check early so you are not surprised.

Add any flags as tickets to your ticketing system.

Updating modules and code

Fixing your custom codebase is all work you are going to have to do sooner or later. Start adding some of the tickets you created into your sprints or general workload. Pick away at them. There is no substitute for rolling up your sleeves and getting to work.

For contrib modules, the latest releases will often support both Drupal 8 and Drupal 9, so they can go ahead and be updated. That represents the lowest level of effort. This is where you want to start.
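For such modules, the update is usually a routine Composer operation. A minimal example, using a hypothetical drupal/module_name package:

composer update drupal/module_name --with-dependencies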

Focus on your custom modules and the easy contrib updates first. This allows time for some of the more difficult contrib work to be completed by the module maintainers and the wider community. By the time you get to them later, it may be a trivial update. Keep your contrib module inventory up to date during your transition period.

If you are done with all of your custom code and low-hanging fruit, dedicate some time inside the issue queues of contrib modules your website depends on. There are a lot of ways to help out to speed up the overall process. Review patches and leave feedback. Submit your own patches. The work you do not only helps your own organization but potentially countless others.

While doing this work, be sure and stay up to date with the latest Drupal 8 minor releases. When the time comes and all of your code is ready, you’ll be able to upgrade to Drupal 9 with minimum hassle.

What to avoid

As you go through this process with each custom module, you’ll inevitably find other things that need improvement—features that were left half-baked, technical debt, cleaner comments, newer solutions, etc.

It will be tempting to fix everything since you’ll be touching the code anyway. But take heed. Avoid. You will be stumbling down a rabbit hole that will delay the completion of your true goal. 

Take note of these things. Create tickets. But stay focused on Drupal 9 readiness. If each code change and pull request becomes huge, full of unrelated changes, it will slow things down.

Summary

When you compare the process for upgrading from Drupal 8 to Drupal 9 to the process for upgrading from Drupal 7 to Drupal 8, it looks like night and day. 

D7 to D8 required a host of dedicated resources and time. The amount of work required inevitably made it a big project. In contrast, D8 to D9 can be done alongside regular work. Your developer team might be able to slot it into their roadmap without additional overhead.

  • Create an inventory of your code
  • Use drupal-check and Upgrade Status to start updating your code
  • Keep Drupal 8 up-to-date during this transition period
  • Upgrade to Drupal 9

Drupal 8 reaches end-of-life on November 2, 2021, which means that Drupal 8.9 will no longer receive security updates. This will sneak up on you quickly if you aren’t tracking it. At this point, there is no value in delaying. Start your move to Drupal 9.

May 18 2021
May 18
easy file (field) paths module install and configuration

 

https://www.drupal.org/project/filefield_paths

Credits & Thanks

Thank you to:

About the File (Field) Paths Module

The File (Field) Paths module extends the default functionality of the Drupal core File module. That is, it allows you to automatically sort and rename your uploaded files using token based replacement patterns to maintain a nice clean filesystem.

We like to use it to make sure all uploaded files (especially images) get renamed to something similar to the node title in which it's used. Not only does this help with SEO, it makes it easier to find the files within the file system.

Install and Enable the File (Field) Paths Module

Install the File (Field) Paths module on your server. (Go here for more instructions on installing modules.)
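If you manage your site with Composer and Drush, a typical install sequence both downloads and enables the module (assuming the standard drupal/filefield_paths package name):

composer require drupal/filefield_paths
drush en filefield_paths -y

Otherwise, download the module and enable it through the admin UI as described in the steps below.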

  1. Go to the Extend page: Click Manage > Extend (Coffee: “extend”) or visit https:///admin/modules.
     file field paths installation
  2. Select the checkbox next to File (Field) Paths and click the Install button at the bottom of the page.

There are no separate permissions required for the File (Field) Paths module.

Configure the File (Field) Paths module

The File (Field) Paths module needs to be configured for each content type that uses file fields that you wish to use.

  1. Go to the Content types page by clicking on Manage > Structure > Content types (Coffee: “content types”) or go to https:///admin/structure/types.
     
  2. Next to the first content type, select Manage fields from the Operations drop down selection box. This will take you to the Manage fields page for that content type.

    file field paths manage fields
     

  3. Find the Image field and click the Edit button for that field. This will take you to the Image settings for that content type.

    file field paths image field type configuration
     

  4. Scroll down to the File (Field) Path Settings section, and expand it using the drop down arrow next to the heading.

    file field paths configuration settings
     

  5. Leave the File path field at its current default or, if empty, set it to [node:title].
     
  6. Expand the File Path Options section and make sure to select the following checkboxes:
    • Remove slashes (/) from tokens
    • Cleanup using Pathauto
    • Transliterate
       
  7. Leave the File name field at its current default.
     
  8. Expand the File Name Options section and make sure to select the following checkboxes:
    • Remove slashes (/) from tokens
    • Cleanup using Pathauto
    • Transliterate
       
  9. Select the Create Redirect checkbox.
     
  10. Select the Enable Alt field checkbox.
     
  11. [Optional] Select the Alt field required checkbox. This will remind content creators to add an alt field for this image.
     
  12. Scroll to the bottom of the page and click the Save settings button.
     
  13. If there are any additional Image Field types within this Content Type, complete steps 3-12 for them.
     
  14. Complete steps 1-13 for all content types. If a content type does not have an image field, then move to the next content type.

Did you like this walkthrough? Please tell your friends about it!


May 18 2021
May 18

Several of our clients manage multiple small sites for independent centers or groups within their organization. These sites often share an information architecture and key features like SSO integration and editorial workflows. They differ mainly in their content, and who manages the content for each site. They  are visually distinguished by logo and color, but share much of their page layout and components, meaning they can share a design system and theme as long as it provides a few options.

This group of sites can be run as Drupal "multisites", where multiple Drupal installations run on the same Drupal codebase and hosting infrastructure. A typical approach for setting up these sites would be either to install and configure each independently, using the same underlying code but re-creating the content types, taxonomies, and content management workflows for each -- or, to create one site and clone it. What this does is create multiple distinct Drupal sites that need to be updated independently when the content types or module configuration needs to evolve.

At this point you could consider building a Drupal "install profile", which lets you create identical new sites. However, the process for maintaining shared configuration after installation is to write update hooks, a tedious process compared to Drupal's config management.

Sharing base configuration

The approach we took instead was to share a base set of configuration -- the bulk of the Drupal config -- between all of the sites. This means that sites start from a complete set of features, including content type, taxonomy, and menu configuration as well as modules like SSO and workflow. A limited number of configuration items -- like the site name and 404 page, the logo and color theme settings, and others as needed -- are split out into site-specific config folders.

Not only do sites start from a complete shared set of features, but as features are added to the main configuration, each of the sites receive the new functionality at the same time.

Consider a case where you want to install a new module across all of the sites. For multisites set up with an install profile and independent configuration, you would need to enable and configure the module on each site independently. This would require careful attention to ensure the changes were applied on each of the sites, and verifying that you made exactly the same configuration decisions on each. When you discover that an option needs to be tweaked, you go back into each site and update it.

For multisites set up with shared config, you can install and configure the module once. The shared config will propagate the changes to each of the multisites. As you tune the module configuration, these changes too will go out to the multisites using straightforward config management processes -- no update hooks required.

Step by step

Here's how this works:

Set up

  1. Start by installing and configuring the first site
    1. Export the configuration to what will be the shared config directory
  2. Install Config Ignore, and configure it to at least ignore “system.site” (and probably “blocks.*”). Any other settings that vary across sites (ex: theme settings, colors, etc.) will also need to be added here.
    1. Configuration that is ignored will still be used when a new site is installed -- they act as "base config" -- but won't be imported again after installation
    2. If there's configuration that shouldn't be used as base configuration (often relevant for blocks), there are two options for making the Config Ignore workflow cleaner:
      1. Config Export Ignore will prevent certain configuration objects from being exported
      2. OR add a .gitignore in your config directory to exclude the files
  3. Install Config Split if there are modules that only need to be enabled on certain sites; when using Config Split, each site or group of sites will need their own split configuration directory.
  4. Now your config is ready for re-use.

Adding more sites

  1. Set up the scaffolding for a new multisite using the normal multisite set up process
  2. Make sure each multisite uses the shared config directory that you've created
    1. See Drupal's documentation on changing the storage location of the sync directory (a minimal settings.php sketch follows this list)
  3. Install the new site from config when running install.php
    1. Using the web interface, select "install from config"
    2. OR using the command line, run drush si --uri=MULTISITE_URI --existing-config
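As a minimal settings.php sketch for pointing each multisite at the shared sync directory (the directory path and multisite folder name are assumptions; adjust them to your repository layout):

// In sites/your-multisite/settings.php, reuse the shared config directory.
$settings['config_sync_directory'] = '../config/shared';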

Updating sites

  1. To make a change across all sites:
    1. Edit any site
    2. Export configuration -- this will update config in the shared config directory
    3. Check the changes into your repository
    4. Import config on the other sites (see the Drush sketch after this list)
  2. To make a change that will only affect one site (this change will live in the destination/prod environment only)
    1. Make the change on the one site, locally
    2. Add the config to Config Ignore and export the configuration
      1. Both the Config Ignore settings and the changed config will be exported!
      2. Commit the changes to Config Ignore, but NOT the config file that you're ignoring
      3. You may want to add the specific files to the .gitignore so they don't accidentally get checked in and used as base configuration on new sites
      4. Note that the Config Ignore rule won't prevent the new configuration from being imported unless it is present BEFORE the new config is present
    3. Deploy the Config Ignore changes
    4. Make the desired change on the prod site
    5. If you're ignoring an entire class of config (like all block configuration), you only need to do this the first time you set up the Ignore
  3. To update core or modules across all sites
    1. Managing updates across multisites follows the standard process -- update the shared codebase and then run the update hooks on each site individually
    2. Config that is changed by a module update only needs to be exported once
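In Drush terms, the export and import steps above typically map to commands like these (the site URIs are only examples):

drush --uri=site-one.example.com cex
drush --uri=site-two.example.com cim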

Making Multisites Easy

There's always a balancing act between the specific needs of one website among many, and the stability and maintainability of a single codebase and feature set. Creating a family of multisites is a great way to isolate editorial responsibilities, allowing divisions or centers independent control of their content and navigation; sharing the code AND configuration between these sites allows sustainable maintenance and iterative feature rollouts.

Airstream Ranch by Carlyle Ellis Photography/Human Quotient, licensed under CC BY-NC-ND 2.0.

May 18 2021
May 18

A computer is a marvellous creation: it can be used to build and execute almost any software and instructions we may want. What is even more marvellous is that it understands exactly what we want it to do. This extraordinary feat is possible because we have programming languages that speak the computer's language. 

One of them is PHP, the Hypertext Preprocessor. It is an open source programming language that runs on a web server and has been popular throughout its life cycle as a go-to language for web development.

PHP's server-side scripting, combined with HTML rendered in the browser, is immensely popular, especially when it comes to creating dynamic websites. The entirety of Drupal is written in PHP, which should be proof enough of the language's capabilities. 

A graph represents the market position of PHP. Source: W3Techs

The above image further clarifies the doubts about PHP’s capabilities and well, its popularity. 

There are numerous adjectives I could use to describe PHP's competencies; however, the one that best fits the language's situation right now is 'behind schedule.' Sadly, this adjective doesn't describe the language itself but the websites and developers using it, and the reason is the delay in updating it. 

Today we'll try to understand the language in terms of its versions and the capabilities each one brings, and then we'll get to the important questions of why PHP is not being updated and what you stand to lose if you keep delaying. 

Let’s start by understanding PHP’s Life Cycle

Like everything else in this world, PHP comes with an expiration date. Each PHP version has a fixed life cycle, beyond which using it can be detrimental. Usually this life cycle is around three years; once those three years are over, the version becomes unsupported. You can still use it, but at your own discretion.

When a PHP version is released, it is actively supported for the next two years. This active support entails the frequent fixing of bugs and security issues and their consequent patching. The third year, sometimes even fourth, will only entail releases pertaining to critical security issues. This is also the time when the next version becomes available. 

Once these three years have been completed for that particular version, it becomes unsupported. There won't be any fixes or patches, not even for security and using such a version would mean you are making yourself vulnerable.

As of now, 

Version 5.6;
Version 7.0;
Version 7.1;
Version 7.2;

have reached their end of life, meaning they are no longer supported, yet many sites are still on them. So, which versions are supported and should ideally be used? The answer is 7.3 and above. 

The image includes a table and a graph, both depicting the supported versions of PHP. Source: php.net

Only versions that are still actively supported are ideal for your web projects. For projects built on Drupal, those same supported versions are the only ones recommended by the CMS, which proves there is truth to what I am saying. This is also exactly why Drupal is able to avoid the misconception that it is difficult to use and has stayed relevant with changing times.

Versions of PHP supported by Drupal are listed in a table. Source: Drupal.org

Taking the supported PHP versions under the microscope

I have said a number of times that using a supported version is ideal; now let's understand why. Yes, supported versions get new updates, fixes, and patches quite frequently, but that is not all; they are also the ones worth using because they have features not found in previous versions. It's like staying on Android 9: you are missing out on a lot of features that Android 10 offers, and there is no advantage in that. 

Let’s look at each of the supported PHP versions to understand this. 

Version 7.3 

The third version of PHP 7, version 7.3, was released in December 2018 and is said to be faster than its predecessor. There weren't sweeping changes in this version, but the improvements that were made proved quite helpful.

  • More flexible heredoc and nowdoc syntaxes; 
  • Multibyte string functions; 
  • Argon2 password hash enhancements. 

All of these made this release better than the last.
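
As a quick illustration of the first item, PHP 7.3 lets the closing heredoc/nowdoc marker be indented, so the string can follow the indentation of the surrounding code (a tiny sketch):

<?php

// PHP 7.3: the closing marker may be indented; its indentation is stripped from every line.
$welcome = <<<EOT
    Welcome to the library!
    EOT;

echo $welcome; // Prints "Welcome to the library!" with no leading spaces.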

Version 7.4 

Released almost a year after the launch of 7.3, PHP 7.4 brought significant improvements in two key areas: 

  • Performance, by adding the preload functions to speed up loading significantly; 
  • And code readability, with features like typed properties and custom object serialisation. 

PHP 7.4 also marked the end of PHP 7 as this was the last version before 8 came along.
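
As a small, self-contained illustration of the readability gains (the class and property names here are made up), typed properties let a class declare exactly what each property may hold:

<?php

// PHP 7.4: typed properties make a class's expectations explicit.
class Book {
  public string $title = '';
  public ?int $year = NULL;
}

$book = new Book();
$book->title = '31 Days of Drupal Migrations';
$book->year = 2020;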

The Newest Kid on the Block: PHP Version 8 

PHP 8.0.0 is the latest version of the scripting language, released on 26 November 2020. This was a major release for PHP, coming almost three years after version 7 was released. A lot of new features were added in this release, and the speed optimisation introduced in PHP 7 continues to be improved upon as well. 

JIT Compiler, union types, type annotations and reflection signatures are some of the new features found in this release. Here is a video highlighting and explaining the prominent new additions in PHP 8.
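
To give a flavour of the new syntax (the function below is only an illustration), union types, the match expression, and named arguments look like this:

<?php

// PHP 8.0: union types, the match expression, and named arguments.
function formatPrice(int|float $amount, string $currency = 'USD'): string {
  $symbol = match ($currency) {
    'USD' => '$',
    'EUR' => '€',
    default => $currency . ' ',
  };
  return $symbol . number_format($amount, 2);
}

echo formatPrice(amount: 19.99); // Prints "$19.99".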


What will the update mean for you?

Up until now, you must have gotten a fair idea that updating PHP will be beneficial for your web project. Yes, sometimes updating can seem like a mistake, but that isn’t the case with PHP, not in the least. 

There are the additional features that were lacking in the preceding version, and you definitely benefit from those. Apart from that bonus, there are three discernible reasons to update PHP that should make the process all the more worthwhile.

Heightened Security 

The paramount benefit of always updating PHP is security. With every update, your site will be more secure because the running version is fully supported and frequently patched for any and all security vulnerabilities. 

As we’ve already discussed, as versions come to their end of life, they do not get patches and fixes, making your project unprotected from threats. 

A graph illustrates the security vulnerabilities of PHP from 2000 to 2019. Source: CVE Details

Looking at this graph, it is clear that PHP experiences security vulnerabilities, with 2016 being the year marking the highest level at 106. 

A line graph depicts the types of security vulnerabilities in PHP. Source: CVE Details

Looking at vulnerabilities by type, denial of service, overflow, and code execution top the ranks. An update is what keeps these vulnerabilities from becoming real threats; if you don't update, you'll have to be extra diligent about them yourself. Is that something you want to do instead of working towards your goals? I doubt it.

Perpetual Support

Then comes support, which becomes almost absent in older versions of PHP. As with every other advancement, whenever a new version rolls around, interest in the older ones starts diminishing, and that is understandable. For PHP, this diminished interest means the preceding versions no longer get the kind of community support they used to, and their compatibility lessens.

Think about it from the developers’ perspective, since they are the ones creating plugins and themes for every version. When they can devote their entire time and attention in creating something wonderful for a new version, do you think they’ll be interested in doing the same for something that is almost antiquated? I do not think so. 

There is also the concern of time, developers become occupied with the newer versions, so they do not get the time to oversee support for the older versions, making them somewhat incompatible for newer browsers and general development. 

So, when you keep updating PHP, you will continue to get support for it in the form of new plugins and themes and any issues you may face will be resolved quite quickly. You won’t ever see a 2000 comment thread for an issue in a new version, since there will be perpetual support to fix that.

Superior Performance

New technology will always outshine old in terms of performance, right? The same is true for PHP versions. When you update, security and support benefits would be there obviously, however, the more apparent benefit would be the one related to performance. You will be able to tell the difference almost immediately. 

Things that you had to work for diligently to achieve will come out of the box. From better latency rates to being able to handle more requests per second, everything would improve when you update.

Here are two images showing PHP Benchmarks results for versions 5.6 and 8.0; the performance improvements between the two are more than evident.

The performance analyses of PHP 5.6 and PHP 8.0 are shown. Source: PHP Benchmarks

Then, why are the older versions still active?

When an update brings along such benefits, it should be a no brainer. Yet there are umpteen sites that are still on the older versions of PHP. 

The usage statistics of PHP versions are shown in a bar graph. Source: W3Techs

The fact that more than one-third of PHP sites are still on version 5 is sad and hard to digest. Version 5.6 reached its end of life on 31 December 2018, almost two and a half years ago, yet it is still in active use without any support.

So, why is that? Why are these sites keeping themselves vulnerable when they can be secure? Let’s try to understand the reasoning. 

The obliviousness about the update 

The foremost reason for not updating is obliviousness about the update itself. The thing about web projects and websites in general is that, although they are built by developers, the people running them day to day may not know much about their technical side. All they are concerned about is that the site is running, functional, and looks good. 

For these non-technical site owners, updating PHP isn’t something to be concerned about. Half the time they are oblivious about the fact and even if they do know it, they’d not pay much heed to it. Many a time, it is the developers and web hosts, who have to push the site owners for the update and they don’t always win that battle based on the W3Techs statistics.

The investment of time

If we talk about updating PHP to version 8.0.0, it is fairly easy to do so. However, that ease is contingent upon the code being updated. If the code is up-to-date, you’ll have an easy-breezy update. If not, you’ll be looking at a huge investment of time and resources. And that is the second reason why sites don’t update.

  • Developers who have older plugins and themes will have to put in a lot of time and effort to update their code because the newer version would not be compatible with an outdated code. 
  • Then comes the testing that needs to be done to ensure the compatibility of the update, which can be quite extensive because there will be a great many plugins to test.

The apprehension surrounding the update’s effects

Lastly, it is the apprehension that comes with the thought of the update that makes site owners immensely hesitant about updating. 

What if the update breaks the site?
What if the update results in additional support tickets?
What if the update makes the site stop functioning altogether?

These what-ifs pose a massive roadblock even before the road to the update has begun. Fortunately, all of them are unwarranted and misguided. 

Remember the previous reason: it talked about updating the code before updating to a newer version. If you update without making the code compatible, your site will break; there isn't a shadow of doubt about it. 

Apart from that, the update won't affect your site negatively. Upon updating PHP on a site built on a CMS like Drupal, you'll instantly see an improvement in performance, and that isn't something to be apprehensive about; rather, you should be looking forward to it. Learn more about how keeping dependencies like PHP up to date has helped Drupal 9, the best version so far, make the upgrade from the previous version simpler.

How do you go about the update?

Now that we’ve discussed everything concerning the PHP update and have established that it is more than imperative to keep PHP updated, we have come to the final aspect of the process, which is how to update PHP.

The update is done in two parts, one is a prerequisite and the second is performing the actual update itself.

Checking PHP version compatibility

The first part of the PHP update is to check whether your site is even compatible with the version you are trying to update to; the themes and plugins should be updated to their latest versions. This is also one of the reasons I mentioned in the previous section as to why sites don't update PHP.

You would need your developers for this, since they’d be the ones who would add or fix support for the version to be updated. There is also the option of choosing alternate plugins, if that is what you prefer. 

Once your themes, plugins and current code is updated, you’ll be ready to perform the update. 
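
As a final sanity check once everything is updated, a small guard like the one below can flag environments that are still running an unsupported version (the threshold is only an example):

<?php

// Warn if the runtime PHP version is below the oldest supported release.
if (version_compare(PHP_VERSION, '7.3.0', '<')) {
  trigger_error(sprintf('PHP %s has reached end of life; please upgrade.', PHP_VERSION), E_USER_WARNING);
}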

A little side note: remember that updating the staging site before updating the production site is the better way to go. If you do encounter any mishaps during the update, you'll be able to fix them on the staging site without any effect on the live site.

Choosing the suitable way to update 

The actual update can be done in three ways depending on the build of your site and the kind of tools you have access to.

Through the cPanel 

If you have access to a cPanel or your host provides access to the same, you can simply go to the control panel, log in and change the version of PHP to the one you want; it’s as simple as that.

Through your own server 

You can update PHP using the migration guide provided by php.net, that is if you are the administrator of your own server. This guide covers the migration procedures from version 5.5 to 8.0 and clarifies all the details related to the update from new features to backward incompatible changes and deprecated features.

Through a web host 

If the aforementioned ways don’t work for you, that is you don’t have access to the cPanel and you are also not the administrator of your site, then you can simply ask your web host to perform the update for you. 

Conclusion 

There comes a point in life when you have to say goodbye to what has been normal for you because something else has taken its place. And it's almost never a bad thing. Has anyone ever felt sad after upgrading their iPhone? I don't think I know anybody who has; maybe a little upset about the hefty bill, but never about the product. 

The same story holds for PHP updates. It may seem like a daunting task to undertake, much like paying a thousand dollars for the iPhone, but in the end, it'll be you who's basking in the benefits of the update. Is that not something you'd want?

May 18 2021
May 18

Don’t forget to subscribe to our YouTube channel to stay up-to-date.

Creating a block using views is pretty straightforward. You could create a block to display a list of published articles or ones that have been promoted to the front page. Then you can add that block into any theme region.

But you may encounter a situation where you no longer have any articles which are published and then you end up with an empty block.

Views comes with a feature that allows you to hide a block if no results are returned and this is what will be covered in this tutorial.

Table of Contents

Getting Started with a Simple Example

We need to display a block that lists all articles that have been promoted to the front page. This can easily be achieved by using a custom views block.

The views configuration will look something similar to the following:

This is a block created from views, and we can place the block on the sidebar, like the following:

The criteria of the views setting is based on some articles having the ‘Promoted to front page’ option checked.

The Problem

However, sometimes articles may not be promoted at all.  As a result, there are no results returned from the views setup above, and the block will look like this:

How to Fix It

To avoid this situation, we can choose to Hide the Block when no results are returned.  In fact, there is an option of ‘Hide block if the view output is empty’ at the bottom corner of the views setting, which a lot of people might easily overlook.

Simply enable this option, and it is done.

With this option enabled, the block will disappear when the view returns no results, like the image below:

Summary

This ‘Hide block if the view output is empty’ option is very useful, but it can easily be overlooked.

Editorial Team

About Editorial Team

Web development experts producing the best tutorials on the web. Want to join our team? Get paid to write tutorials.

May 17 2021
May 17

Since January, as these funds increased, my day-to-day commitment to the Webform module's issue queue decreased. The estimated annual budget for the Webform module's Open Collective has grown to $10,082.34. I finally decided to see if and how I could use these funds to compensate myself for my time and allow me to start wrangling the Webform module's issue queue.

May 17 2021
May 17

Furthermore, there were community summits held for higher education, healthcare, nonprofit, and government. These were individual events focused entirely on their subject with panels, workshops, and discussions within breakout rooms.

The Forgotten Site-builder


During the Driesnote, Dries Buytaert mentioned five big initiatives to focus on for Drupal 10, all of which have made progress throughout the lifecycle of Drupal 9:

  1. Decoupled Menus
  2. Easy Out of the Box
  3. Automated updates
  4. Drupal 10 readiness
  5. New front end theme

Dries made us aware that with the evolution and growth of Drupal, we have left behind the “site-builder” role. Dries believes that we need to get back to our identity, to reinvigorate and re-focus on our roots by empowering site-builders to build ambitious websites with low code.

“Let’s make Drupal the go-to technology for site-builders experience.”

Finally, Dries mentioned a sixth big initiative, a project browser, to improve the site-building experience. One of the first things site-builders do when they start with Drupal is install a module, so a project browser will simplify the process of finding and installing modules.

DrupalCon North America 2021 - Driesnote by Dries Buytaert

Back to the sessions I attended, rather than recap them in timeline order, I decided to group them into different category themes.

The main themes for me during this DrupalCon were:

  • Contribution
  • Editor Experience
  • Decoupled Drupal
  • Accessibility
  • Diversity, Equity, and Inclusion

Contribution


The conference schedule was split into two parts: daily sessions during the day followed by an afternoon-evening spotlight on contribution. In the Driesnote, we saw a comparison of how approachable it is to contribute to the Symfony foundation compared to the almost painstaking way of contributing to Drupal. It is clear that we need to improve Drupal's contributor experience, which is why the Drupal Association is looking to give new Drupalists an easier path to start contributing by improving the integration with GitLab and enabling newer features in the near future.

Contribution was organised on OpenSocial, an online collaboration platform, which saw a healthy group of members attend each day to work on improving Drupal. There were five main focus groups: Decoupled Menus initiative, Easy out of the Box initiative, Automated Updates, Bug Smash, and Drupal 10 readiness. There was also the general contribution area and as always, the first timer’s contribution lounge.

Editor Experience


Throughout the conference, a recurring theme was a focus on improving the editor experience. I attended the following sessions related to this topic:

  • Easy out of the Box initiative Keynote
  • Editor Experience: Compare Drupal, WordPress, & Contentful
  • A Better Experience for Content Editors
  • Editor UX Matters: Gutenberg Can Help
  • Expand Building With Components Truly Achieve No-code Drupal
  • Reimagining the WYSIWYG: CKEditor 5 in Drupal Core

The Easy Out of the Box Initiative makes the editorial experience clear and empowering from the moment Drupal is installed by enabling Media, Layout Builder, and Claro in the default Drupal installation profile. This initiative is important because media, layouts, and modern administration design are fundamental to creating a delightful user experience for everyone building with Drupal.

Sascha Eggenberger, a Senior UX Designer at Unic and a core member of the Easy out of the Box initiative, shared his work on a better experience for content editors along with various modules that he recommends such as "admin toolbar", "media library", and "inline entity form". Sascha also gave an overview of quick wins and explained how a good user experience along with site-building is crucial in helping towards wider adoption of Drupal.

DrupalCon North America 2021. A Better Experience for Content Editors, by Sascha Eggenberger

At Amazee Labs, we actively seek to improve the editor experience for our clients’ needs. We build solutions involving the Gutenberg editor coupled with bespoke admin views, along with the implementation of an improved admin UI utilising the Gin admin module.

Decoupled


Following on from improving the editor experience, we explored the state of decoupled Drupal, with a heavy focus on Gatsby and Next.js. I attended the following sessions:

  • Decoupled Menus Initiative Keynote
  • An Iterative Approach: Decoupling Drupal Sites With Gatsby
  • JS Web Components (Demo)
  • Decoupled Translations with Drupal and Gatsby
  • Using Drupal's Layout Builder with Gatsby
  • Relaunch Blog of Unity.com with Headless Drupal 8, Next.js

The decoupled menus initiative is set up to provide the best way for JavaScript front ends to consume configurable menus managed in Drupal. Menus were chosen because the majority of digital experiences have a menu. They also touch on some of the most difficult problems facing decoupled sites today.

DrupalCon North America 2021 - An Iterative Approach to Decoupling Drupal Sites in Gatsby, by Brian Perry and Matthew Ramir

Recently, the Drupal Association made it possible to distribute JavaScript packages within the Drupal ecosystem, helping to push this initiative forward. In the evening, a hackathon was held where participants were invited to build their own custom menu component with their favourite JavaScript frameworks. This is an important initiative to help realise Drupal’s goal of being the best decoupled CMS.

Amazee Labs developer, Nick O’Sullivan, gave a talk about “Decoupled Translations with Drupal and Gatsby”. Gatsby is a popular static site generator that uses React, and it’s a perfect match for a decoupled Drupal architecture to create lightning-fast websites. Nick took us through how to create a multilingual Gatsby site with content sourced from Drupal, then gave an overview of the key considerations to take into account when developing a translation management workflow.

DrupalCon North America 2021 - “Decoupled Translations with Drupal and Gatsby” by Nick O’Sullivan

Accessibility 


There were many great sessions on accessibility, I was only able to attend the following:

  • Accessibility is a Moving Target
  • An Accessible Digital World
  • Accessibility for Deaf Beyond Video Captions & Sign Language
  • Bake Accessibility into Every Project
  • Inclusive and Accessible Co-Creation

AmyJune Hineline, an Open Source Community Ambassador and winner of this year’s Aaron Winborn Award, gave a presentation on how “Accessibility is a moving target”. AmyJune walked us through embracing accessibility and understanding the high-level principles, and shared tips on designing for accessibility.

  • Perceivable: A user can identify content and interface elements by means of the senses.
  • Operable: A user can successfully use necessary interactive elements.
  • Understandable: Users should be able to comprehend the content, and learn how to use the interface.
  • Robust: Users should be able to choose the technology they use to interact with digital assets.

DrupalCon North America 2021 - Accessibility is a Moving Target, by AmyJune Hineline

In “Accessibility for Deaf Beyond Video Captions & Sign Language”, Svetlana Kouznetsova, an experienced deaf professional providing consulting services to businesses, presented a unique session to demonstrate that not all deaf and hard of hearing people are the same. As the vast majority of deaf and hard of hearing people don't know sign language, they can’t benefit from interpreters, and not everyone can lip read or benefit from hearing devices.

Sadiyah Ali spoke about “An Accessible Digital World” by baking it in from the beginning of the development phase. By ensuring headings, links, images, and focus states are all accessible, you will cover a lot of the requirements laid out by the law. “If you forget to put butter in the batter of a cake, you can’t put it in at the end, similarly digital accessibility is simplest when it’s planned as part of the process.”
 

Diversity, Equity, and Inclusion


Heather mentioned at the beginning of the Driesnote that the Drupal Association has an increased focus on Drupal’s strategic initiatives. One strong Drupal value is to cultivate talent with an emphasis on Diversity, Equity, and Inclusion. Heather shared Drupal's Diversity, Equity, and Inclusion resources found on drupal.org.

There were many wonderful talks throughout the conference, however, the topic that focuses on diversity and inclusion is always more thought-provoking, inspiring and important to me.

  • Allyship - Key to Unlocking the Power of Diversity
  • Building Successful Mentorships for People of Color in Tech
  • Creating Systemic Change: Digital Rights For All
  • School Needs Open Source, Now More Than Ever

Sheree Atcheson, a global diversity, equity and inclusion leader, gave the keynote entitled “Allyship - the key to unlocking the power of diversity”. Sheree spoke about intersectionality, being an ally and allyship, along with the tools to aid you and your organisation to improve your DE&I processes. Sheree explained that an ally is any person that actively promotes and aspires to advance the culture of inclusion through intentional, positive and conscious efforts that benefit people as a whole. Allyship is a lifelong process of building relationships based on trust, consistency, and accountability with marginalized individuals and/or groups of people. It is a continuous and consistent process in which you need to educate yourself by following and learning from people different to you.

“Most importantly - listen, support, self-reflect, and change.”

DrupalCon North America 2021 - Allyship - Key to Unlocking the Power of Diversity, by Sheree Atcheson

Byron Woodfork, a senior software engineer, spoke about “Building Successful Mentorships for People of Color in tech”. Byron explained that we must change the way we mentor as traditional teaching methods will not work for minorities. Help build a mentor network to give a diverse and supportive community for your mentee to learn and grow from. Build trust with your mentee by providing quality feedback, coach and counsel your mentee by listening to them. Mentoring is really hard work, and you will get things wrong, but the important thing is to learn from your mistakes and improve upon them for the future.

Stu Keroff and his students of The Penguin Corps, Aspen Academy's Linux Club, gave the final session entitled “School needs Open Source, now more than ever”. Maya began the keynote by announcing the Penguin Corps pledge: What are we trying to do? Change the world! How do you change the world? Be crazy enough to think you can!

Stu then gave a brief history of how he set up the first two Linux clubs in Minnesota, and how the Penguin Corps has grown year after year, from 15 members to over 50. The equation is simple: open-source software + user computers + enthusiastic kids = more students learn! The talk was my favourite, and I look forward to learning more about the growth of the Penguin Corps.

DrupalCon North America 2021 - School Needs Open Source, Now More Than Ever, by Stu Keroff

A Final Word


Overall, the conference has proven that splitting the schedule to accommodate a shorter day of interesting sessions coupled with a half-day of contribution has been a healthy balance to avoid “Zoom fatigue” and to keep all attendees engaged and motivated.

As always there was a great deal to learn and share, and of particular value were the sessions on Accessibility. At Amazee Labs, we offer Accessibility audits so reach out to us if you want to create a better User Experience and ensure your site is compliant.

If this resonates with you and you’re as passionate about Drupal and the open-source community as we are, get in touch today!

May 17 2021
May 17

The year 2021 is special for Drupal because it is Drupal’s 20th Anniversary. In our second year of virtual events, DrupalCon North America and DrupalFest 2021 expanded on the theme #CelebrateDrupal for the entire month of April. 

DrupalCon North America 2021 was a major celebration of the community. We’re so grateful for each person and organization that contributed their time, talent, and treasure. DrupalCon North America hosted 2,330 registered participants. This would not have been possible without our generous sponsors and the 85 dedicated contributors who made up the committees and initiative teams. Over 50% of the contributors were new to contributing to DrupalCon. #DrupalThanks 

DC NA Infographic

Participants came from 80 countries across 6 continents, which made DrupalCon North America among the most geographically diverse DrupalCons on record. Most attendees came from the US, followed by Canada, UK, India, Germany, and Brazil.

In addition to global diversity, approximately 40% of all registrants self-identified with an underrepresented community.

The virtual conference allowed us to continue an expanded scholarship program. We awarded more than 150 scholarships with the support of 29 local Drupal associations around the world, Drupal Diversity & Inclusion, SheCodeAfrica, WomenTech Network, and Kennesaw State University. The scholarship mentorship program which paired recipients with seasoned Drupal community members received positive feedback. There is tremendous value in mentorship for new attendees and this is something we will continue to build on for future events.

DrupalCon North America was designed with flexibility in mind. Each day was structured with four hours of live content including sessions, round-table discussions, social, and networking events, followed by four hours of mentored contribution. On Tuesday, 13 April - Friday, 16 April, each day focused on a specific Drupal initiative. To accelerate rapid Drupal development, mentored contribution opportunities were extended to each day of the conference. 

DrupalCon North America’s Contribution Opportunities yielded fantastic results for the Drupal project. There were seven contribution groups including groups for first-time contributors and mentors and groups that focused on contributing to the same Drupal initiatives that were the focus of DrupalCon North America. 

Each day of initiative-related content began with an Initiative Keynote. Key members of the Initiative took us all on a journey through the goals of that initiative, and how it will help create a better product for us all to build future digital experiences. After those sessions, all the keynote speakers were available for discussion with participants.

Watch these brief video recaps of the DrupalCon North America Initiatives.

Additionally, the DrupalCon North America program contained content that helped participants of all backgrounds and skillsets to learn in detail how the initiative works, how it will enable them to build more exciting experiences, how to sell those experiences more effectively, and how they can participate in contributing to the initiative’s success.

  • 143 informative sessions
  • 206 speakers, 40% of whom identified with one or more historically underrepresented groups
  • 150 scholarships awarded

In addition to Drupal initiative-related content and contribution, DrupalCon also hosted an impressive slate of Main Stage speakers. 


DrupalCon North America was proudly supported by 51 sponsoring organizations and 17 individual sponsors. Most sponsors received virtual “booths” in the Expo Hall where attendees viewed lightning talks, demos and had group or individual interactions. Module sponsors were able to gain an audience by sponsoring popular DrupalCon elements such as the Hallway Track, Women in Drupal Workshop, and Drupal Trivia. 

DrupalCon North America was a wonderful celebration of 20 years of Drupal and the Drupal Community. If you did not have the opportunity to participate in DrupalCon North America, session videos will be available to the public at the end of May on the Drupal Association YouTube channel. Join the Drupal Community again for DrupalCon Europe. The virtual DrupalCon Europe conference will take place online, from 4-7 October 2021, in a CET Timezone.

May 14 2021
May 14

When you are used to working in Drupal 8 and beyond, having to make changes to custom Drupal 7 code can be unpleasant. If you’re lucky, the code was well written and maintained and you can make your changes without feeling like you’re adding to a house of cards. Too often, though, the code is in a state where it could really use refactoring.

One of the many nice changes Drupal 8 brought about was changing most of the code over to object-oriented programming, or OOP. There are many benefits to this and one of them is that it organizes most of the code into classes. In Drupal 7, all that code was dumped into the .module files because that’s how things were done back then. But it doesn’t need to be that way.

Why Add OOP?

Drupal 7 may not have the OOP framework that Drupal 8 built, and you still need the hooks, but there are benefits to adding OOP to your refactor:

  • Upgrades - It gets your custom code closer to Drupal 8/9. If there is any thought of an upgrade in the site’s future, having the custom modules as close to Drupal 8/9 as possible will make that upgrade easier.
  • Clean Code - It makes your Drupal 7 code cleaner and easier to maintain. Even if an upgrade is a long way off, having clean and organized code is much easier to maintain and reduces the chance of knocking down that house of cards when you need to make a change.
  • Easier Refactoring - It makes refactoring easier because you can work on a copy of the code that is moved to a new place rather than trying to do everything in place. As you move and refactor each function, you can easily see what code is refactored and what is still the old code.
  • Testing - It makes it easier to test your refactored code. There’s more on that in the “Testing the refactoring” section below. If you are following along with this blog post and making changes to your module as you go, you’ll want to read that section before you start.

An Example Module

For the purpose of examples, let’s say we are refactoring a custom module called XYZ Events. Our site is for company XYZ and this module is where we stored all the custom functionality related to our events.

This module has custom menu items, blocks, and a bunch of miscellaneous helper functions all sitting in the .module file and we want to clean that up.

I’ll reference this fictional module to provide examples along the way.

Make class loading simple

To make using classes much easier, start with a very handy contributed module: X Autoload. X Autoload lets you make use of the automagic class loading that Drupal 8 has just by putting your classes in the proper directories and setting the namespace. With that in place, you are ready to start moving your code into classes.

Whenever you add a new class, be sure to clear the cache before you try to use it.

Services for all the miscellaneous functions

While hooks and a few other things need to stay in the .module file, chances are there are a lot of miscellaneous functions that can be organized into one or more service classes. These won’t be true D8/9 services with the service.yml and all of that but they serve the same purpose. In theory, you could write your own service container if you wanted to push this even further but even just regular classes help.

Make the events service:

  • Add the directory “src” to the root of the xyz_events directory.
  • In there, add EventsService.php. It’s not required to have “Service” in the name but it helps make the purpose clear.
  • The basic outline of the class looks like this:
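
A minimal sketch of what that outline could look like, assuming the Drupal\xyz_events namespace (which X Autoload maps to the src directory) and the getEventList() method used in the later examples:

<?php

namespace Drupal\xyz_events;

/**
 * Holds the miscellaneous event functions moved out of the .module file.
 */
class EventsService {

  /**
   * Returns a list of events, e.g. 'upcoming' or 'featured'.
   */
  public function getEventList($type = 'upcoming') {
    // The refactored body of xyz_events_get_event_list() goes here.
    return [];
  }

}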

Move the code:

For each of the non-hook, non-callback functions in the .module file (and .inc files), copy it into the appropriate class. Some functions might make more sense in a site-wide utility class if they aren’t directly related to the module’s “theme”. Once it’s in its new home, do the cleanup and refactoring:

  • Change the function names to camel case and remove the module name (ie: xyz_events_get_event_list() becomes getEventList())
  • Add the public, protected, or private designation to the front. Since these used to be in a .module file, most of them are likely to be public. But, if any functions are only used by other functions that are now in the same class, those could be changed to protected or private.
  • Now is a good time to clean up any coding standards issues. I like to use Drupal 8/9’s standards if I know I’m on a PHP version that supports it such as using short array syntax. This gets it as close to D8/9 as possible.
  • Do whatever refactoring is needed to make the code easier to follow, improve performance, fix bugs, etc.

Update calls:

Using grep, check your whole site to find out all the places where that function was being called. For any place it’s being called that isn’t already in the class, change the call from the old function to the new method:

  • Add $events_service = new EventsService(); if that isn’t already in scope.
  • Change xyz_events_get_event_list() to $events_service->getEventList()

If the call is within the class already, change it to use “$this” instead of the service reference variable:

  • xyz_events_get_event_list() to $this->getEventList()
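
Put together, a typical call site outside the class changes like this (names taken from the example module above):

// Before: procedural call anywhere in the codebase.
$events = xyz_events_get_event_list();

// After: call the method on the service object.
$events_service = new EventsService();
$events = $events_service->getEventList();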

You now have all the miscellaneous custom code in a service (or multiple services if it makes sense to divvy them up). When moving to Drupal 8/9, all that’s needed is to update the class so that it’s a proper service container service and then change the calls to go through the container rather than using “new SomeClass()”.

Block classes

Drupal 8 introduced blocks as classes which is much cleaner than the old style that used multiple hooks. If you have multiple custom blocks each with a chunk of code, hook_block_view() can get quite long and hard to follow. While the hooks themselves are still needed, the actual code can be split off into classes. hook_block_info() stays the same but hook_block_view() becomes much simpler. 

  • If you haven’t already, add a directory “src” at the root of the module directory.
  • In “src” add a directory structure “Plugin/Block”.
  • For each block in hook_block_view():
    • Add a file in “Block” that is BlockNameBlock.php. Like services, the “Block” at the end isn’t required but makes it clearer what the class does. For our example module, we end up with UpcomingEventsBlock.php and FeaturedEventsBlock.php.
    • Take all the code for generating that block out of the hook and put it in the class.
    • Replace the content in the hook with a call to the class.
  • If your blocks have a lot of similar functionality, you can take advantage of inheritance and move the common functionality into a base block. In our case, since both blocks are listing events, we add EventListingBlockBase.php.

In the .module file we have:

/**
 * Implements hook_block_info().
 */
function xyz_events_block_info() {
  $blocks = [];

  $blocks['upcoming'] = [
    'info' => t('Show upcoming events'),
  ];

  $blocks['featured'] = [
    'info' => t('Show featured events'),
  ];

  return $blocks;
}

/**
 * Implements hook_block_view().
 */
function xyz_events_block_view($delta = '') {
  $block = [];

  switch ($delta) {
    case 'upcoming':
      $block_object = new UpcomingEventsBlock();
      $block = $block_object->build();
      break;

    case 'featured':
      $block_object = new FeaturedEventsBlock();
      $block = $block_object->build();
      break;
  }

  return $block;
}

And then our blocks:

EventListingBlockBase.php

<?php

namespace Drupal\xyz_events\Plugin\Block;

use Drupal\xyz_events\EventsService;

/**
 * Base class for the event listing blocks.
 */
abstract class EventListingBlockBase {

  /**
   * The events service used to load the listings.
   *
   * @var \Drupal\xyz_events\EventsService
   */
  protected $EventsService;

  /**
   * Sets up the service that the listing blocks rely on.
   */
  public function __construct() {
    $this->EventsService = new EventsService();
  }

  /**
   * Builds the content for the block.
   */
  abstract public function build();
  
  /**
   * Format the content into the array needed for the block.
   *
   * @param string $title
   *   The block title.
   * @param array $items
   *   The complete list of items.
   * @param string $empty
   *   The text to print if there are no items.
   * @param string $theme_hook
   *   The theme hook for the block content.
   *
   * @return array
   *   The block content array.
   */
  protected function formatContent(string $title, array $items, string $empty, string $theme_hook) {
    // Only keep the empty text if there are no items.
    $empty = (count($items) == 0) ? $empty : '';

    $variables = [
      'items' => $items,
      'empty' => $empty,
    ];

    $content = [
      'subject' => $title,
      'content' => theme($theme_hook, $variables),
    ];

    return $content;
  }

}

UpcomingEventsBlock.php and FeaturedEventsBlock.php both use the following code, just altering the “upcoming” to “featured” as appropriate.

<?php

namespace Drupal\xyz_events\Plugin\Block;

/**
 * Block listing upcoming events.
 */
class UpcomingEventsBlock extends EventListingBlockBase {

  public function build() {
    // The block title.
    $title = t('Upcoming events');

    // The items to list in the block.
    $items = $this->EventsService->getEventList('upcoming');

    // What it should print if there aren't any.
    $empty = t('There are no upcoming events.');

    // The theme hook to use to format the contents of the block.
    $theme_hook = 'xyz_events_upcoming_events';

    return $this->formatContent($title, $items, $empty, $theme_hook);
  }

}

Now all the content for building each block is encapsulated in the class for that block. When moving to Drupal 8/9, add the block annotation that it uses to identify blocks and remove the block-related hooks from the .module file.

If your blocks need configuration, this can be taken a step further by adding the form code and save code as methods on the block class and then referencing those from hook_block_configure() and hook_block_save().
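
A sketch of how those wrappers could look; configureForm() and saveConfiguration() are hypothetical method names you would add to the block class:

/**
 * Implements hook_block_configure().
 */
function xyz_events_block_configure($delta = '') {
  if ($delta == 'upcoming') {
    // Hypothetical method returning the block's settings form array.
    $block_object = new UpcomingEventsBlock();
    return $block_object->configureForm();
  }
  return [];
}

/**
 * Implements hook_block_save().
 */
function xyz_events_block_save($delta = '', $edit = []) {
  if ($delta == 'upcoming') {
    // Hypothetical method persisting the submitted settings.
    $block_object = new UpcomingEventsBlock();
    $block_object->saveConfiguration($edit);
  }
}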

Menu items to controllers

While hook_menu itself usually doesn’t get too overwhelming due to the actual code for the menu items being in separate functions, it does contribute to the .module file bloat. It’s also a lot nicer to have the menu items be in individual controllers like in Drupal 8/9.

To make this happen:

  • If you haven’t already, add a “src” directory at the root of your module.
  • Add a “Controller” directory under that.
  • For each menu item, add a file in there that is SomeController.php. Like services, “Controller” isn’t required but it makes it clearer. Another option is to use “Page” if the item corresponds to a viewable page rather than an API callback. For our example module, we end up with “UpcomingEventsController.php” and “FeaturedEventsController.php”.
  • As with blocks, a base controller can be used if the controllers have similar code.
  • Replace the hook code with a reference to the class (explained below).

There are two ways that the hook_menu() code can reference your class. Using a static function on the class and calling it directly or using a wrapper function to call the object.

Static method:

  • In hook_menu:
    'page callback' => 'UpcomingEventsController::build',
  • The build method on the class needs to be static.

Wrapper method:

  • In hook_menu:
    'page callback' => 'xyz_events_controller_callback',
    'page arguments' => ['controller_class' => 'UpcomingEventsController'],
  • function xyz_events_controller_callback() needs to be in the .module file. (see below)
  • The build method on the class does not need to be static as we are instantiating an object.

In the .module file:

/**
 * Implements hook_menu().
 */
function xyz_events_menu() {
  $items = [];

  $items['events/upcoming'] = [
    'title' => 'Upcoming events',
    'page callback' => 'xyz_events_controller_callback',
    'page arguments' => ['controller_class' => 'UpcomingEventsController'],
    'type' => MENU_NORMAL_ITEM,
  ];

  $items['events/featured'] = [
    'title' => 'Featured events',
    'page callback' => 'xyz_events_controller_callback',
    'page arguments' => ['controller_class' => 'FeaturedEventsController'],
    'type' => MENU_NORMAL_ITEM,
  ];

  return $items;
}

/**
 * Menu callback that wraps the controllers.
 */
function xyz_events_controller_callback($controller_class) {
  $controller_class = "\\Drupal\\xyz_events\\Controller\\$controller_class";
  $controller = new $controller_class();
  return $controller->build();
}

The classes:

EventListingControllerBase.php

<?php

namespace Drupal\xyz_events\Controller;

use Drupal\xyz_events\EventsService;

/**
 * Base class for the event listing page controllers.
 */
abstract class EventListingControllerBase {

  /**
   * The events service used to load the listings.
   *
   * @var \Drupal\xyz_events\EventsService
   */
  protected $eventsService;

  /**
   * Sets up the service that the listing pages rely on.
   */
  public function __construct() {
    $this->eventsService = new EventsService();
  }

  /**
   * Builds the content for the page.
   */
  abstract public function build();
  
}

UpcomingEventsController.php and FeaturedEventsController.php have the same code with “upcoming” changed to “featured” as needed.

<?php

namespace Drupal\xyz_events\Controller;

/**
 * Page controller listing upcoming events.
 */
class UpcomingEventsController extends EventListingControllerBase {

  public function build() {
    $items = $this->eventsService->getEventList('upcoming');

    $content = theme('xyz_events_upcoming_events', ['items' => $items]);
    return ['#markup' => $content];
  }

}

The rest of the hooks

While Drupal 7 relies on a lot of hooks and these need to be in a .module or .inc file so they can be found, there’s nothing requiring the actual code for the hooks to live in the functions. Like we did with blocks and menu items, the hook functions can serve as a wrapper around a call to a class where the actual code lives.
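
For example, a node hook can stay in the .module file as a thin pass-through (decorateEventNode() is a hypothetical method on the service class):

/**
 * Implements hook_node_view().
 */
function xyz_events_node_view($node, $view_mode, $langcode) {
  // All of the real logic lives on the service class.
  $events_service = new EventsService();
  $events_service->decorateEventNode($node, $view_mode);
}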

Testing the refactoring

While writing actual tests is the best way to test, that isn’t always possible with time and budget considerations. Still, you want to be sure your refactored code gets you the same results as the old code. Moving functions into classes while refactoring helps with that.

  • Add an include file to the root of the module. Ex: xyz_events_replaced_functions.inc
  • At the top of the .module file, add include_once 'xyz_events_replaced_functions.inc';
  • As you move functions into the classes, copy them into this file instead of deleting them from the .module file.

This keeps all the old functions active at the same time as the new ones which lets you test them in parallel within the same site.

Add this to the top of the module file:

/**
 * Implements hook_menu().
 */
function xyz_events_menu() {
  // Adds a testing page for dev use.
  $items['admin/code-test'] = array(
    'access arguments'  => array('administer content'),
    'description'       => 'Place to test code.',
    'page callback'     => 'xyz_events_test_code',
    'title'             => 'Code testing',
  );

  return $items;
}

/**
 * Place to put test code that is called from admin/code-test.
 */
function xyz_events_test_code() {
  print("Test completed.");
}


Within xyz_events_test_code() you can do something like this:

$events_service = new EventsService();
$old_list = xyz_events_get_event_list();
$new_list = $events_service->getEventList();

Set a breakpoint at the top of the function and then visit /admin/code-test on the site. You can then step through and compare what you get running the original function vs running your refactored function and make sure the result is either the same or has any differences that you intended with the refactor.

Once you have finished and are ready to commit your refactor, delete the include file, the include line, and the testing code.

Wrapping up

At this point, your .module file should be much smaller, consisting of just hooks and things like callback functions that can’t be moved into a class. Your directory structure should look a lot like a Drupal 8/9 module with all the code wrapped up into classes. There will still be work needed to move to Drupal 8/9 in dealing with the API changes but the actual structure of the code and where things are found shouldn’t need much changing. And maintaining the Drupal 7 code until that time should be a much more pleasant experience.

Further reference

This blog post came from actual client work but that work was inspired by others. These two were my biggest sources for reference:

I didn't get into forms above but here's an article that covers them: https://roomify.us/blog/object-oriented-forms-in-drupal-7/

May 14 2021
May 14

Perhaps of all the available Drupal APIs, the Migrate API is one of the areas of knowledge most hidden from the average Drupal user. It's even possible that, as you read this post, you are learning for the first time that there is a whole set of Drupal modules, both in core and contributed, fully dedicated to the design and execution of migration processes within Drupal. You can move data from an old Drupal site to the most recent version, but also from other databases, files, systems, frameworks, CMSs, DXPs… All these options are aggregated around the Drupal Migrate API, a set of resources that requires a lot of dedication, among other things due to the fragmentation of the available information, until now. The book I will discuss today brings order and unification to all this. Pay attention.

Picture from Unsplash, user Will Paterson, @willpat

Table of Contents

1- Introduction
2- The Book
3- Recommendations
4- Book Information
5- Fast Review
6- Ratings
7- :wq!

1- Introduction

Migration processes are a fairly frequent issue in project development: it is quite normal to build a new Drupal-based website for a client that already has its platform implemented in a different technology (not kidding, this happens). As part of these processes, a client or customer may need to move their data to the new platform using a new data model, as Drupal does with its tables and relations.

In these situations, several scenarios open up for these processes: the study of the origin of the data, the processing possibilities, the methodology to sanitize the information and how to store it in a stable way in our new environment. What I have just briefly described is what is involved in an ETL process: Extraction - Transformation and Load for the data migration, and this is the spirit of the book I am discussing today: the exhaustive study of the design and execution for ETL process in Drupal.

For those of us who at some point have had to learn to migrate data into Drupal the hard way, I'm sure two things have been recurrent: a) searching for information on Google, and b) reaching Mauricio Dinarte's blog, understanddrupal.com. For me, this website has been a very important (almost mandatory) reference for understanding how to perform Drupal migrations. In fact, he was one of the authors I considered fundamental for learning how to execute Drupal migrations when, more than a year ago, I started writing about this deep topic here in The Russian Lullaby, linking to his website and to the Agaric.coop platform with its shared related content. So the author and his work are not alien to me, but today I want to talk about a recently discovered novelty: the compilation of all the great work around this topic in his book about migration processes in Drupal, 31 Days of Drupal Migrations.

2- The Book

It's difficult for me to evaluate this book as it deserves. So I guess the first thing I want to say is that I think this is the first time I'm faced with such an integrated and compact body of knowledge and experience. This is very important, because in many cases Drupal documentation is not very extensive or not sufficiently updated. When it concerns a complex or advanced topic, this becomes a real and deep problem: it's necessary to consult outdated blogs, review contributed modules, review code, and read deprecated documentation… in order to build up, step by step, some idea of how you can do this or that.

Cover of 31 Days of Drupal Migrations, by Mauricio Dinarte

Maybe this is why initiatives like this one from Mauricio Dinarte (@dinarcon), from the Understand Drupal initiative, are so important and generate very high value (only surpassable if the same information becomes part of the official Drupal.org documentation, of course).

As I said at the beginning of this section, it's difficult for me to describe how much this resource has helped me, but I would like to write down some special points that you can learn from this book, some key points of interest, something like:

Key Points:

  • Perhaps the first fundamental learning here: knowing how a migration process works, what the workflows are, and what their possibilities are.

  • Learn how to migrate data into many entities and resources: files, images, paragraphs, fields and subfields… all full of a) theory and concepts, and b) examples, examples, and more examples.

  • Know how to perform migrations from diverse sources and origins: XML files, CSV, JSON… executing ad-hoc processing for the values.

  • Get the most complete list of the most valuable contributed modules related to migrations: Media Handler, Media Migration, Geofield, Migrate Devel… core modules, contrib modules, modules that include plugins for migrations… the set is long and pretty interesting. This book contains the most comprehensive and centralized list of them all.

3- Recommendations

Actually, I guess I could say, in a nutshell, that any team implementing Drupal-based projects should have a copy of this book available. And I wouldn't be exaggerating, I swear. I think this huge, extensive and thorough work compiled by Mauricio Dinarte is too important, and too interesting, to go unnoticed.

On the other hand, since "Migrations" is a topic that encompasses and relates to other Drupal APIs, the book can also provide quite a lot of knowledge to people in a technical team who want to learn more about Drupal internals: Plugins, Services, Entities, Drush, Debugging… these are cross-cutting topics in this book, and they are also very important issues. Maybe you want to learn more about those.

For everyone who has to go through ETL processes in Drupal, this is a must-have resource: you can't get through the maze without a good map of the territory, and this chart is the best compilation yet of all the key issues to weigh when defining a migration process. Do you know which criteria to use to decide how to define your migration? Code or configuration? Advantages and disadvantages?… These are just the initial questions. Along the way there are many more, and this is the best guide to deal with them. The fact that this e-book is available for only $10 makes it even more obvious: there are no excuses. It's a must. And then, if you want, you can become a supporter of the Understand Drupal initiative, sponsoring the production of tutorials: understanddrupal.com/supporters. Think about it.

4- Book Information

Title: 31 Days of Drupal Migrations.
Author: Mauricio Dinarte.
Publisher: Leanpub.
Date: October, 2020.
Pages: 193.
Overview: Complete guide to implement migrations and ETL processes in Drupal.
Keywords: Drupal, Migration, Migrate API, Plugins, Source, Process, Destination.
Price: $10.
Links: Gumroad.

5- Fast Review

1- Is this book progressive, iterative and incremental? Yes, it is. It's constructed in a very didactic and useful way.
2- Does it offer specific solutions to particular problems or concrete issues? Yes, it offers solutions. The book contains many (many) practical examples and errors.
3- Does it explain well the original problems or needs it aims to solve? Yes, the book is built from the needs that normally originate migration tasks.
4- Is this book rich in examples? Yes, for each situation, case, or resource, it offers a real and practical example, downloadable from a repository.
5- Is this book written in plain English, suitable for non-English speakers? Yes, this book is an easy read for non-English speakers, very pleasant.
6- Is it up to date? Yes, it seems to be aligned with the last big changes related to the Migrate API of Drupal, from November 2020.

6- Ratings

Table of Book Ratings for 31 Days of Drupal Migrations

7- :wq!


May 13 2021
May 13

Drupal currently supports IE11, and will do so until Drupal 10 in about a year's time. Here's my proposal for how to support IE11 in a distribution without holding back from modern CSS.

Drupal will support IE11 until about this time next year. Many clients have told us that they are happy to drop support for IE11 if it means they can have a better website. However, some organisations (government, etc.) still use IE11 internally as their main browser, so those users need a website that works and is a pleasure to use. That said, often the only IE11 users of these websites are internal users - editors updating content - rather than "visitors" to the website, who are mostly using mobile browsers on their phones or modern browsers like Firefox, Edge, and Chrome on their computers.

Within LocalGov Drupal, we aim to support IE11 for as long as Drupal does. However, I don't want us to be in a position this time next year where we need (or wish) to rewrite our frontend in modern CSS after we drop IE11 support. Here's my proposal: give IE11 users a really good experience via a default theme (like WordPress does for mobile sites hosted on WordPress.com), and use progressive enhancement to supplement how the site works for visitors using modern browsers.

If we take the approach in this video, all CSS properties become variables (fonts, line-heights, spacing, colours, etc.), and any council can use our base theme but brand it to their own brand guidelines simply by using a variables.css file to override the base CSS variables. Since variables are calculated at run time, we don't need to recompile the theme each time we make a simple change. Why is this so great/sustainable/scalable? It means sub-themes are very small, and it also means we can create theme settings so editors can set some of these variables themselves (e.g. setting the background colour via the CMS for Christmas, or when a notable person has passed away), all without needing to contact developers.
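As a minimal sketch of what this could look like (the property names and values here are illustrative, not taken from the actual localgov_base theme), the base theme declares plain values first so IE11 still renders correctly, then exposes the same values as custom properties for modern browsers, and a council sub-theme only needs a small variables.css to rebrand:

/* Base theme: plain declarations first, so IE11 (which ignores var()) uses these. */
body {
  font-family: Georgia, serif;
  color: #222;
  background-color: #fff;
}

/* Modern browsers: the same values exposed as custom properties. */
:root {
  --font-body: Georgia, serif;
  --color-text: #222;
  --color-background: #fff;
}

body {
  font-family: var(--font-body);
  color: var(--color-text);
  background-color: var(--color-background);
}

/* Council sub-theme, variables.css: override only the variables.
   No recompilation needed, because the values are resolved at run time. */
:root {
  --font-body: "Open Sans", sans-serif;
  --color-text: #1a1a2e;
  --color-background: #f7f7f7;
}

Because IE11 drops declarations whose values it cannot parse, it falls back to the plain base-theme values, while modern browsers pick up the branded variables: exactly the "base look for IE11, branded look for everyone else" behaviour described above.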

But won't the site look different in IE11 than in other browsers? If your site uses the base theme, it will look the same in all browsers. If your site uses the base theme but extends it, then it will look like the base theme in IE11 but like your branded theme in other browsers. Remember, sites already look different in different browsers - everything stacks in one column on phone browsers but sits side-by-side on desktop browsers, etc. Different does not mean bad.

I've started streaming some of my work, especially if it's contributions to open source, on twitch if you'd like to follow along/subscribe. As it turns out, Twitch deletes your videos after 14 days, so I have also been uploading them to YouTube. Feel free to subscribe.

May 13 2021
May 13

Reason #1: You Get Two Robust Systems In One.

Most e-commerce sites rely on integration with a third-party content management system. These integrations can have many inconsistencies and quirks, requiring site owners to spend precious time getting the two systems to work together. And even if they achieve a satisfactory result, they then have to maintain the two systems separately: upgrade them, back them up, fix security problems, and so on. That doubles the amount of time and effort required.

Drupal 8/9 helps you manage content: changing product descriptions, writing blog posts, creating landing pages, and doing other content marketing tasks. Drupal Commerce, on the other hand, handles all the e-commerce, including cart and order management, payment processing, tax estimates, and so on.

You can also easily link to your product descriptions from other parts of your website, such as a landing page or blog. This improves your search engine visibility, helping your website rank higher and attract more traffic.

Reason #2: You Get An Accessible Website.

Accessibility is critical for all websites, but it is especially essential for e-commerce sites. An online store that fails to consider the needs of people with disabilities is doomed from the start. Legal issues, lost sales, and search engine penalties are just a few of the reasons to take accessibility seriously.

When you develop an e-commerce website in Drupal 8/9, you get many accessibility features out of the box. For example, alt text is a must-have detail for all images, and the core Drupal 8/9 themes have contrast ratios appropriate for visitors with low vision.

Reason #3: You Can Create An E-commerce Website In Multiple Languages With Drupal 8/9.

As previously said, one of the most significant benefits of e-commerce is the ability to conduct business across borders. Host your online store on a server and make it accessible to people all over the world, regardless of distance or time. You can also make your content accessible to customers in different countries in appropriate regional languages. It will increase confidence and traffic to your store.

Also Read: Drupal 8 Module Development: A Comprehensive Step-By-Step Guide

When you build an e-commerce website in Drupal 8/9 or Drupal Commerce, you save a lot of time and effort in localization. If you want to create multilingual ecommerce websites, Drupal is most likely the best tool. It provides you with access to several language-related core modules without much effort.

The Interface Translation module, for example, allows you to easily translate your store's UI, while the Content Translation module does the same for your content. All you have to do is properly configure them.


Reason #4: Mobile Ready.

If your e-commerce site performs well on mobile devices, it will perform well on other devices as well! Creating user scenarios will help you determine what type of content users will enjoy on their mobile devices, which in turn will guide you in designing the essential elements of your website.

Mobile compatibility has evolved into an essential feature of every e-commerce platform; in today's world, everything must be mobile-ready. Drupal websites impress clients not only with their appearance but also with their mobile-responsive design, and they are simple to use on phones and tablets.

Reason #5: Multilingual.

The internet has taken over the world, and with so many people using the same channels and so many brands expanding internationally, multilingual websites are a must! Even though most internet users choose English as their primary language, ten other languages account for 90 percent of the top ten million websites. This means you need a website in multiple languages to thrive and grow in this highly competitive world.

Drupal is the best platform for a multilingual website. It offers many languages to choose from and four main modules explicitly designed for language and translation support. This capability has yielded excellent results, including increased conversions, improved SEO, unrivalled translation workflows, and a significant contribution to audience expansion. Drupal can also detect the user's preferred language from the user's IP address, session, browser settings, and so on.

Reason #6: A Quick Website That Is Simple To Extend And Scale.

An excellent online store eventually "grows out of its clothes" and must scale to meet increased traffic and add new features. Scalability is not an issue when building an e-commerce website with Drupal 8/9.

First, both the CMS and Drupal Commerce have an extensive range of core and contributed modules that can handle nearly any task imaginable, from creating custom forms to dealing with currencies and calculating discounts.

If this isn't enough, the system can be extended further using application programming interfaces (APIs). Drupal websites can use these to access a wide range of external resources, from social media to shipment tracking.

Drupal's REST API has also decoupled the backend from the frontend, so developers can now use any technology to create UIs for Drupal-based websites, including the popular JavaScript frameworks Vue.js, React.js, and Angular.js. As a result, websites are much quicker than they used to be. Speed is a decisive factor for conversions: nobody wants to wait more than a few seconds for a page to load.

Furthermore, the REST API has broadened the channels by which consumers can access an online store. Visitors use the same database when using a smartphone, tablet, desktop, Mac, or PC. It allows you to reach a much larger audience and therefore generate even more sales.


Reason #7: You Get A Website That No Hacker Can Exploit.

With the rise of e-commerce stores, hackers are also using more advanced techniques to attack e-commerce websites. Cybersecurity is especially important for e-commerce stores because they hold a large amount of customer data, payment details and more. A data breach can result in various issues, including consumer financial information falling into the wrong hands.

Also Read: Web Development Strategy You Must Follow for a Successful Drupal 9 Upgrade

If you are using Drupal 8/9 and Drupal Commerce to create an e-commerce website, you are well protected against security threats. The Drupal 8/9 software package is a model of protection in and of itself, and there is a dedicated community of Drupal experts devoted to keeping the system and its modules secure.

When a new security patch is released, site owners can apply it immediately, leaving attackers little opportunity to create havoc.

May 13 2021
May 13

We often end up with a lot of templates when theming, but we should also remember to create "sensible defaults", so that your site doesn't look broken before you start theming it specifically.

Writing sensible defaults means that the development work we do or the CSS we write for a new feature/component is additive: you do not need to reset libraries and/or write code to overwrite other code. Here's an outline of creating sensible defaults for our region template within the new localgov_base theme - region.html.twig.
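As an illustration only (the markup and class names below are a generic sketch, not the actual localgov_base template), a region template with sensible defaults might print nothing at all when the region is empty, and otherwise wrap the content in a predictably named class that the base CSS can hook into:

{# region.html.twig: a hypothetical sensible default. #}
{% if content %}
  <div class="region region--{{ region|clean_class }}">
    {{ content }}
  </div>
{% endif %}

The point is that the default output is harmless on its own, so any extra markup or CSS a sub-theme adds for a specific region is purely additive.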

I've started streaming some of my work, especially if it's contributions to open source, on twitch if you'd like to follow along/subscribe. As it turns out, Twitch deletes your videos after 14 days, so I have also been uploading them to YouTube. Feel free to subscribe.

May 12 2021
May 12

Open Collaborative Connections - Join us to experience the open web with a collaborative culture and a connection to DrupalCamps across Europe

There is no DrupalCon without an awesome program, and our program team is working on various activities to make this happen.

First of all - what are we going to learn about this year?

Last year, our program was centered around five areas, and we decided that we would like to continue with them again this year.

That said, the main topics for this year's edition of DrupalCon Europe will be:

  • Agency & Business
    Attendees looking for actionable advice, ideas and challenges from business owners, operators, executives and project managers on how to manage their business and projects successfully.
  • Clients & Industry Experiences
    Digital managers and executive decision-makers using Drupal or considering Drupal as their next digital experience management platform. Agency leaders, product owners and tech leads would also find the showcases insightful.
  • Makers & Builders
    This track is for everyone interested in Drupal development, across all disciplines and all levels of experience: both technical profiles and anyone interested in how Drupal works from the inside.
  • Open Web & Community
    This track is for everyone in the Drupal and Open Web community interested in the web being an inclusive, diverse, open and healthy place. 
  • Users & Editors
    If you are using Drupal extensively to reach your organizations' digital goals or to create content regularly, or if you are creating the editorial experience for a project, this track is for you.

We are aiming to give you high-quality sessions from seasoned speakers, where you can learn and discover new things. We are also adjusting the schedule, so as an attendee, you can make the most out of your DrupalCon experience, without the feeling of being overloaded.

Then - who is going to select the sessions?

The program team leads (Rouven Volk, Dan Lemon and Zsófi Major) have already started weekly meetings to figure out the details, and they are working on putting together a great program team for each track. Once complete, the team will start building the program and we will publish our call for papers.

And finally - how can you get involved?

There are many ways to help out around DrupalCon. Since we are currently forming our program team, we are looking for motivated volunteers who would be interested in joining a track and helping us find the best sessions of the season.

Do you have good contacts in one of the audience groups we have mentioned above? Are you interested in helping promote the program or selecting the right sessions for one of these tracks? The track team is committed to providing a diverse and inclusive program, which is why we welcome help, especially from underrepresented groups.

Your help as a track chair can be a very rewarding experience and will shape the program for the next DrupalCon.

Please complete our track chair form if you are interested in helping with the DrupalCon Europe 2021 program team!

Furthermore, please reach out to the organizer team, if you would like to participate in DrupalCon Europe 2021 in any other way, we have all kinds of opportunities for you!

Please take a look at our volunteer survey and submit your name - link: https://kuonicongress.eventsair.com/drupal21/volunteering-opportunities/Site/Register

And if you are the type of person who likes to share their knowledge with the world - polish your best DrupalCon talk material, and submit your session proposal today!

In the upcoming days, we will post further updates about our program.

The DrupalCon Europe 2021 Program Team

May 12 2021
May 12

Lynette has been part of the Drupal community since Drupalcon Brussels in 2006. She comes from a technical support background, from front-line to developer liaison, giving her a strong understanding of the user experience. She took the next step by writing the majority of Drupal's Building Blocks, focused on some of the most popular Drupal modules at the time. From there, she moved on to working as a professional technical writer, spending seven years at Acquia, working with nearly every product offering. As a writer, her mantra is "Make your documentation so good your users never need to call you."

Lynette lives in San Jose, California where she is a knitter, occasionally a brewer, a newly-minted 3D printing enthusiast, and has too many other hobbies. She also homeschools her two children, and has three house cats, two porch cats, and two rabbits.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
