Jun 29 2020

Drupal migrations, despite the linearity of their definitions, contain a lot of inherited complexity. The reason is very intuitive: although the Migrate API is an umbrella system that offers a very simple “interface” of interactions to the user-developer who wants to build migration processes, in reality several subsystems interact with each other throughout a migration process: Entities, Database, Plugins… There are a lot of classes involved in even the simplest migration process. If we add the irrefutable fact that a migration will, in many cases, tend to generate errors until it has been refined, it becomes clear that one of our first needs is to learn… how to debug migrations.

Picture from Unsplash, user Krzysztof Niewolny, @epan5

Table of Contents

1- Introduction
2- Basic Debugging: Keep an eye on your file
3- Average Debugging with Migrate Devel
4- :wq!

This article is part of a series of posts about Drupal Migrations:

1- Drupal Migrations (I): Basic Resources

2- Drupal Migrations (II): Examples

3- Drupal Migrations (III): Migrating from Google Spreadsheet

4- Drupal Migrations (IV): Debugging Migrations First Part

5- Drupal Migrations (V): Debugging Migrations-II

1- Introduction

In the wake of the latest articles, I wanted to continue expanding the information about migrations in Drupal. I was thinking about writing a sub-series about debugging migrations (inside the main series about Drupal Migrations), and I want to publish now the first part: just a set of basic steps in order to get all the available information from a migration process. All the examples in this post were taken from the migration_google_sheet example in my Gitlab account.

2- Basic Debugging (Keep an eye on your files)

First, we will start with a very primary approach to error detection during a migration. To begin with, it is essential to keep the focus on reducing the range of error possibilities as much as possible by approaching the migration in an iterative and incremental manner. In other words: we will go step by step and expand our migrated data.

2.1- Reviewing your Migration description file

First of all, we are going to comment on the most intuitive step of all those we will take, since some errors are visible at first sight and, being so frequent, end up being the most obvious ones.

The first steps in our process of debugging a migration will be a review of two fundamental issues that usually appear in many migrations. So before anything else, we’ll do a quick review of:

Whitespace: any extra whitespace may cause problems at the level of the migration description file, so we review all the lines of the file in a quick scan in order to detect stray whitespace.

Errors in the indentation: the migration description file is written in YAML, a data-serialization language based on a key: value scheme, structured in parent-child levels with an indentation of two spaces to the right at each level down in the hierarchy. It is very frequent that some indentation is not right, and this ends up producing an error in the processing of the file. As in the previous case, we will review all the indentation registered in the file.

You can rely on a YAML syntax validation service such as www.yamllint.com, but you will still have to double-check the result.
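To make the indentation point concrete, here is a small illustration (the process/plugin/source keys mirror the examples used later in this series; any migration file follows the same pattern). A single misaligned key is enough to make the whole file unparseable:

```yaml
# Broken: "source" is pushed three extra spaces to the right, so the
# parser fails ("mapping values are not allowed in this context").
process:
  name:
    plugin: get
       source: name
---
# Fixed: a consistent two-space indentation at each level.
process:
  name:
    plugin: get
    source: name
```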

2.2- Reviewing registers in database

In a basic Drupal installation (standard profile) there are seventy-three tables; after the activation of the basic modules related to migrations (migrate, migrate_plus, migrate_tools) and, in this case, the custom migration_google_sheet_wrong module, the number of tables in the database is seventy-five. Two more tables have been generated:

cache_migrate
cache_discovery_migration

But later, after executing the migration with ID taxonomy_google_sheet_wrong contained in our custom module, we see in the database that two new tables related to the executed migration have been generated:

  • migrate_map_taxonomy_google_sheet
    This table contains the information related to the movements of a row of data (migrations are operated ‘row’ by ‘row’). The Migrate API is in charge of storing in this data mapping table the source ID, the destination ID and a hash related to the ‘row’. Combinations of the source ID and the hash of the row operation then make it easier to track changes, progressively update information, and cross dependencies when performing a batch migration (see below for how they are articulated).
    The lookup processes for migrations are supported by this data: for example, to load a taxonomy term you must first lookup its “parent” term to maintain the hierarchy of terms. If we go to our database and we do not see recorded results after launching a migration, no data was stored and the migration requires debugging.

  • migrate_message_taxonomy_google_sheet
    In this table, messages associated to the executed migration will be stored, structured in the same way as the previous table (based on the processing of a ‘row’ of the migration), each message with its own identifier and an association to the id_hash of the ‘row’ of origin of the data:

Drupal Migration columns from table Messages

This information can be obtained through Drush, since the content of this table is what is shown on the screen when we execute the instruction:

drush migrate:messages id-migration

And this can be a useful way to start getting information about the errors that occurred during our migration.
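Conceptually, the hash stored in the map table is what lets the Migrate API decide whether a row needs to be re-imported when its source data changes. A minimal sketch of that idea in Python (the real implementation lives in Drupal's map/Row classes; the names and values here are only an illustration):

```python
import hashlib
import json

def row_hash(source_row):
    """Hash a full source row, like the hash column in a migrate_map_* table."""
    # Serialize with sorted keys so identical rows always produce the same hash.
    payload = json.dumps(source_row, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def needs_import(migrate_map, source_id, source_row):
    """A row must be (re)imported if it is new or its source data changed."""
    entry = migrate_map.get(source_id)
    return entry is None or entry["hash"] != row_hash(source_row)

# A tiny in-memory stand-in for the map table: source ID -> destination ID + hash.
row = {"name": "Lighthouses", "description": "Terms from a spreadsheet"}
migrate_map = {1: {"destid": 117, "hash": row_hash(row)}}

print(needs_import(migrate_map, 1, row))                         # → False
print(needs_import(migrate_map, 1, {**row, "name": "Beacons"}))  # → True
```

Tracking a hash per row, rather than comparing every field against the destination, is what makes incremental updates of large sources cheap.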

2.4- Reloading Configuration objects

Another issue we’ll need to address while debugging our migration is how to make it easier to update the changes made to the configuration object created from the migration description file included in the config/install path.

As we mentioned earlier, each time the module is installed a configuration object is generated that is available in our Drupal installation. In the middle of debugging, we’ll need to modify the file and reload it to check that our changes have been executed. How can we make this easier? Let’s take a look at some guidelines.

On the one hand, we must associate the life cycle of our migration-type configuration object with the installation of our module. To do so, as we noted in the section 2.3.2- Migration factors as configuration, we declare our own custom migration module as an enforced dependency:

dependencies:
  enforced:
    module:
      - migration_google_sheet

We can use both Drush and Drupal Console to perform specific imports of configuration files, taking advantage of the single import options of both tools:

Using Drupal Console

drupal config:import:single --directory="/modules/custom/migration_google_sheet/config/install" --file="migrate_plus.migration.taxonomy_google_sheet.yml"


Using Drush

drush cim --partial --source=/folder/

Similarly, we can also remove active configuration objects using either Drush or Drupal Console:

drush config-delete "migrate_plus.migration.taxonomy_google_sheet"
drupal config:delete active "migrate_plus.migration.taxonomy_google_sheet"

If we prefer to use the Drupal user interface, there are options such as the contributed Config Delete module (drupal.org/config_delete), which adds extra options to the internal configuration synchronization menu to allow the deletion of configuration items from our Drupal installation. It’s enough to download it through Composer and enable it through Drush or Drupal Console:

composer require drupal/config_delete
drush en config_delete -y

Drupal Config Delete actions

This way we can re-import configuration objects without colliding with the existing versions in the database. If you prefer to update and compare versions of your configuration, the contributed Configuration Update Manager module may be a good option: https://www.drupal.org/project/config_update.

3- Average Debugging with Migrate Devel

Well, we have looked closely at the data as we saw in the previous section, and yet our migration of taxonomy terms from a Google Spreadsheet still doesn’t seem to work.

We have to resort to intermediate techniques in order to obtain more information about the process. In this phase our allies will be some modules and plugins that can help us to better visualize the migration process.

3.1- Migrate Devel

Migrate Devel (https://www.drupal.org/project/migrate_devel) is a contributed module that adds some extra functionality to migration processes through new options for Drush. This module works with migrate_tools and migrate_run.

UPDATE (03/07/2020): Version 8.x-2.0-alpha2

Just as I published this article, Andrew Macpherson (new maintainer of the Migrate Devel module and one of the accessibility maintainers for Drupal core) left a comment, which you can see at the bottom of this post, with some important news. Since I started the first draft of this article, a new version has been published: released on June 28th, it is already compatible with Drush 9 (and I didn’t know…). So now you know there is a new version available to download, compatible with Drush 9, which avoids having to install the patch described below.

To install and enable the module, we proceed to download it through composer and activate it with drush: Migrate Devel 8.x-2.0-alpha2.

composer require drupal/migrate_devel
# To install the 2.0 branch of the module:
composer require drupal/migrate_devel:^2.0
drush en migrate_devel -y

Follow for versions prior to 8.x-2.0-alpha2:

If you’re working with versions prior to 8.x-2.0-alpha2, then you have to know some particularities. The first point is that the module was optimized for a previous version of Drush (8), and its portability to Drush 9 and higher does not seem to have been completed.

There’s a tag 8.x-1.4 from two weeks ago in the 8.x-1.x branch: migrate_devel/tree/8.x-1.4

There is a necessary patch in its Issues section to be able to use it with versions of Drush > 9, and if we make use of this module, this patch (https://www.drupal.org/node/2938677) will be almost mandatory. The patch does not seem to be in its final version either, but at least it allows a controlled and efficient execution of some features of the module. Below we will see some of its contributions.

And to apply the patch we can download it with wget and apply it with git apply:

cd /web/modules/contrib/migrate_devel/
wget https://www.drupal.org/files/issues/2018-10-08/migrate_devel-drush9-2938677-6.patch 
git apply migrate_devel-drush9-2938677-6.patch

Or place it directly in the patch area of our composer.json file if we have the patch management extension enabled: https://github.com/cweagans/composer-patches.

Using:

composer require cweagans/composer-patches 

And place the new patch inside the “extra” section of our composer.json file:

Drupal Debugging adding the patch

How it works:

The launch of a migration process with the parameters provided by Migrate Devel will generate an output of values on the console that we can easily check, for example using --migrate-debug:

Drupal Devel Output first part
Drupal Devel Output second part

This is a partial view of the processing of a single row of migrated data, showing the data source, the values associated with this row and the final destination ID, which is the identifier stored in the migration mapping table for process tracking:

Drupal Devel Output in database

Now we can see in the record that for the value 1 in origin (first array of values), the identifier 117 was assigned for the load in destination. This identifier will also be the internal id of the new entity (in this case taxonomy term) created within Drupal as a result of the migration. This way you can relate the id of the migration with the new entity created and stored.

What about event subscribers? Migrate Devel creates an event subscriber, a class that implements EventSubscriberInterface and keeps listening to events generated by the event system of Drupal’s Migrate API, present in the migrate module of Drupal core:

Called from +56 
/var/www/html/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php

The call is made from that class, where the events are listened to and acted upon. Many events are defined in modules/migrate/src/Event, but in particular two of them are listened to by Migrate Devel:

  1. MigratePostRowSaveEvent.php
  2. MigratePreRowSaveEvent.php

These are the two Drush options offered by Migrate Devel, and in both cases the result is a call to the Kint library dump() function provided by the Devel module to print messages. In fact, the call to Kint has changed in the latest version, 8.x-2.0-alpha2, where Kint is replaced by a series of calls to the dump method of the Symfony VarDumper component. Where we used to do:

  /**
   * Pre Row Save Function for --migrate-debug-pre.
   *
   * @param \Drupal\migrate\Event\MigratePreRowSaveEvent $event
   *    Pre-Row-Save Migrate Event.
   */
  public function debugRowPreSave(MigratePreRowSaveEvent $event) {
    $row = $event->getRow();

    $using_drush = function_exists('drush_get_option');
    if ($using_drush && drush_get_option('migrate-debug-pre')) {
      // Start with capital letter for variables since this is actually a label.
      $Source = $row->getSource();
      $Destination = $row->getDestination();

      // We use kint directly here since we want to support variable naming.
      kint_require();
      \Kint::dump($Source, $Destination);
    }
  }

Now we’re doing:

  /**
   * Pre Row Save Function for --migrate-debug-pre.
   *
   * @param \Drupal\migrate\Event\MigratePreRowSaveEvent $event
   *    Pre-Row-Save Migrate Event.
   */
  public function debugRowPreSave(MigratePreRowSaveEvent $event) {
    if (PHP_SAPI !== 'cli') {
      return;
    }

    $row = $event->getRow();

    if (in_array('migrate-debug-pre', \Drush\Drush::config()->get('runtime.options'))) {
      // Start with capital letter for variables since this is actually a label.
      $Source = $row->getSource();
      $Destination = $row->getDestination();

      // Uses Symfony VarDumper.
      // @todo Explore advanced usage of CLI dumper class for nicer output.
      // https://www.drupal.org/project/migrate_devel/issues/3151276
      dump(
        '---------------------------------------------------------------------',
        '|                             $Source                               |',
        '---------------------------------------------------------------------',
        $Source,
        '---------------------------------------------------------------------',
        '|                           $Destination                            |',
        '---------------------------------------------------------------------',
        $Destination
      );
    }
  }

You can see the update and changes in migrate_devel/8.x-2.0-alpha2/src/EventSubscriber/MigrationEventSubscriber.php.

And you can get more information about creating events and event subscribers in Drupal here in The Russian Lullaby: Building Symfony events for Drupal.

3.2- Debug Process Plugin

The contributed module Migrate Devel also brings a new processing plugin called “debug” and defined in the Debug.php class. This PHP class can be found in the module path: /web/modules/contrib/migrate_devel/src/Plugin/migrate/process/Debug.php and we can check its responsibility by reading its annotation section in the class header:

/**
 * Debug the process pipeline.
 *
 * Prints the input value, assuming that you are running the migration from the
 * command line, and sends it to the next step in the pipeline unaltered.
 *
 * Available configuration keys:
 * - label: (optional) a string to print before the debug output. Include any
 *   trailing punctuation or space characters.
 * - multiple: (optional) set to TRUE to ask the next step in the process
 *   pipeline to process array values individually, like the multiple_values
 *   plugin from the Migrate Plus module.

It consists basically of the transform() method, inherited from the ProcessPluginBase abstract class, where instead of applying transformation actions during processing it simply uses PHP’s print_r() function to display information on the console, printing both scalar values and arrays of values.

This plugin can be used autonomously, being included as part of the migration pipeline, so that it prints results throughout the processing of all value rows. In this case, we are going to modify the pipeline of the processing section of our taxonomy terms migration, with the idea of reviewing the values being migrated.

To begin with, we are going to modify the structure. We already know (from previous chapters) that this shorthand:

process:
 name: name
 description: description
 path: url
 status: published

is just an implicit way of using the Get.php plugin, and is equivalent to:

process:
 name:
   plugin: get
   source: name
 description:
   plugin: get
   source: description
 path:
   plugin: get
   source: url
 status:
   plugin: get
   source: published

Now we add to the pipeline the Debug plugin with an information label for processing:

process:
 name:
   - plugin: get
     source: name
   - plugin: debug
     label: 'Processing name field value: '
 description:
   - plugin: get
     source: description
   - plugin: debug
     label: 'Processing description field value: '
 path:
   - plugin: get
     source: url
   - plugin: debug
     label: 'Processing path field value: '
 status:
   - plugin: get
     source: published
   - plugin: debug
     label: 'Processing status field value: '

After this change we reload the migration configuration object by uninstalling and installing our module (as it is marked as a dependency, when uninstalled the migration configuration will be removed):

drush pmu migration_google_sheet && drush en migration_google_sheet -y 

So when we run the migration now we will get on screen information about the values:

Drupal Migrate Devel feedback

This way we get more elaborate feedback on the information to be migrated. If we want to complete this information, thinking about more advanced scenarios, we can combine various arguments and options to gather as much information as possible. Let’s think about reviewing the information related to only one element of the migration. We can run something like:

 drush migrate-import --migrate-debug taxonomy_google_sheet --limit="1 items"

This will combine the output after storage (unlike the --migrate-debug-pre option), showing together the output of the plugin, the values via Kint, and the final storage ID of the only processed entity.

In this case we only see basic values with little processing complexity (we only extract from the source and load into the destination), but in successive migrations we will be doing more complex processing treatments, and this will be information of much more value. Interesting? Think about the processing treatment for data values that must be adapted (concatenated, cut, appended, etc.)… if at each step we integrate feedback, we can better observe the transformation sequence.
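For example, in a hypothetical multi-step pipeline (the first_name/last_name source fields are invented purely for illustration), the debug plugin can be slotted between steps to watch each transformation as it happens:

```yaml
process:
 name:
   - plugin: concat
     source:
       - first_name
       - last_name
     delimiter: ' '
   - plugin: debug
     label: 'After concat: '
   - plugin: callback
     callable: mb_strtoupper
   - plugin: debug
     label: 'After uppercase: '
```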

Here you can check the Plugin code: migrate_devel/src/Plugin/migrate/process/Debug.php.

Here you can review the Drupal.org Issue where the idea of implementing this processing Plugin originated: https://www.drupal.org/node/3021648.

Well, with this approach to debugging migrations we start the series on debugging… more experiences soon!

4- :wq!


Jun 28 2020
All US lighthouses on a map.

Waaaaay back in 2013, I wrote a blog post about importing and mapping over 5,000 points of interest in 45 minutes using (mainly) the Feeds and Geofield modules. Before that, I had also done Drupal 6 demos of importing and displaying earthquake data. 

With the recent release of Drupal 9, I figured it was time for a modern take on the idea - this time using the Drupal migration system as well as (still!) Geofield. 

This time, for the source data, I found a .csv file of 814 lighthouses in the United States that I downloaded from POI Factory (which also appears to be a Drupal site). 

Starting point

First, start with a fresh Drupal 9.0.1 site installed using the drupal/recommended-project Composer template. Then, use Composer to require Drush and the following modules:

composer require drush/drush drupal/migrate_tools drupal/migrate_source_csv drupal/migrate_plus drupal/geofield drupal/geofield_map drupal/leaflet

Then, enable the modules using

drush en -y migrate_plus migrate_tools migrate_source_csv geofield geofield_map leaflet

Overview of approach

To achieve the goal of importing all 814 lighthouses and displaying them on a map, we're going to import the .csv file using the migration system into a new content type that includes a Geofield configured with a formatter that displays a map (powered by Leaflet).

The source data (.csv file) contains the following fields: 

  • Longitude
  • Latitude
  • Name
  • Description

So, our tasks will be:

  1. Create a new "lighthouse" content type with a "Location" field of type Geofield that has a map formatter (via Geofield map).
  2. Prepare the .csv file.
  3. Create a migration that reads the .csv file and creates new nodes of type "Lighthouse".

Create the Lighthouse content type

We will reuse the Drupal title and body field for the Lighthouse .csv's Name and Description fields. 

Then, all we need to add is a new Geofield location field for the longitude and latitude:

Geofield configuration

Next, we'll test out the new Lighthouse content type by manually creating a new node from the data in the .csv file. This will also be helpful as we configure the Geofield map field formatter (using Leaflet).

Mystic lighthouse
 

By default, a Geofield field uses the "Raw output" formatter. With Leaflet installed and enabled, we can utilize the "Leaflet map" formatter (with the default configuration options).

Leaflet formatter

With this minor change, our test Lighthouse node now displays a map!
 

Mystic lighthouse on a map!

Prepare the .csv file

Prior to writing a migration for any .csv file, it is advised to review the file to ensure it will be easy to migrate (and rollback). Two things are very important:

  • Column names
  • Unique identifier

Column names help in mapping .csv fields to Drupal fields while a unique identifier helps with migration rollbacks. While the unique identifier can be a combination of multiple fields, I find it easiest to add my own when it makes sense. 

The initial .csv file looks like this (opened in a spreadsheet):
 

CSV file before modifications

In the case of the lighthouse .csv file in this example, it has neither column names nor a unique identifier field. To rectify this, open the .csv as a spreadsheet and add both. For the unique identifier field, I prefer a simple integer field. 
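If you'd rather script this step than edit the file in a spreadsheet, a small sketch along these lines would do it (the file names and column order are assumptions based on the POI Factory download described above):

```python
import csv

def add_header_and_ids(src_path, dest_path):
    """Prepend a header row and a simple integer ID column to a POI .csv."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dest_path, "w", newline="", encoding="utf-8") as dest:
        reader = csv.reader(src)
        writer = csv.writer(dest)
        # The POI Factory columns are: Longitude, Latitude, Name, Description.
        writer.writerow(["ID", "Lon", "Lat", "Name", "Description"])
        # Number every data row, starting at 1, as the unique identifier.
        for i, row in enumerate(reader, start=1):
            writer.writerow([i] + row)

# e.g. add_header_and_ids("Lighthouses-USA.csv", "Lighthouses-USA-updated.csv")
```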

Once manually updated, it looks like this:

CSV file after modifications

Create the migration

If you've never used the Drupal 8/9 migration system before, it can be intimidating, but at its heart it is basically just a tool that:

  • Reads source data
  • Maps source data to the destination
  • Creates the destination
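Stripped of all the Drupal machinery, those three steps boil down to a read-map-create loop. Here is a purely conceptual sketch (the field names mimic this article's lighthouse data; none of this is the actual Migrate API):

```python
# A toy "migration" for one lighthouse row: read the source rows,
# map each one to destination fields, then create destination records.
source_rows = [
    {"Lon": "-71.96", "Lat": "41.32", "Name": "Mystic Lighthouse",
     "Description": "A lighthouse in Mystic."},
]

def process(row):
    """The 'process' step: map source fields onto destination fields."""
    return {
        "title": row["Name"],
        "body": row["Description"],
        "location": (float(row["Lat"]), float(row["Lon"])),
    }

# The 'destination' step: here we just collect dicts instead of saving nodes.
nodes = [process(row) for row in source_rows]
print(nodes[0]["title"])  # → Mystic Lighthouse
```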

Writing your first migration is a big step, so let's get started.

The first step is to create a new custom module to house the migration. First, create a new, empty web/modules/custom/ directory. Then, easily create the module's scaffolding with Drush's "generate" command:

$ drush generate module

 Welcome to module-standard generator!
–––––––––––––––––––––––––––––––––––––––

 Module name:
 ➤ Lighthouse importer

 Module machine name [lighthouse_importer]:
 ➤ 

 Module description [The description.]:
 ➤ Module for importing lighthouses from .csv file.

 Package [Custom]:
 ➤ DrupalEasy

 Dependencies (comma separated):
 ➤ migrate_plus, migrate_source_csv, geofield

 Would you like to create install file? [Yes]:
 ➤ No

 Would you like to create libraries.yml file? [Yes]:
 ➤ No

 Would you like to create permissions.yml file? [Yes]:
 ➤ No

 Would you like to create event subscriber? [Yes]:
 ➤ No

 Would you like to create block plugin? [Yes]:
 ➤ No

 Would you like to create a controller? [Yes]:
 ➤ No

 Would you like to create settings form? [Yes]:
 ➤ No

 The following directories and files have been created or updated:
–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
 • modules/lighthouse_importer/lighthouse_importer.info.yml
 • modules/lighthouse_importer/lighthouse_importer.module

Then, let's create a new web/modules/custom/lighthouse_importer/data/ directory and move the updated .csv file into it - in my case, I named it Lighthouses-USA-updated.csv.

Next, we need to create the lighthouse migration's configuration - this is done in a .yml file that will be located at web/modules/custom/lighthouse_importer/config/install/migrate_plus.migration.lighthouses.yml

The resulting module's file structure looks like this:

web/modules/custom/lighthouse_importer/
  config/
    install/
      migrate_plus.migration.lighthouses.yml
  data/
    Lighthouses-USA-updated.csv
  lighthouse_importer.info.yml
  lighthouse_importer.module

Note that the lighthouse_importer.module, created by Drush, is empty. 

While there are a couple of ways to create the migration configuration, we're going to leverage the Migrate Plus module. 

For more information about writing migrations using code or configurations, check out this blog post from UnderstandDrupal.com.

One of the big hurdles of learning to write Drupal migrations is figuring out where to start. It doesn't make much sense to write the migrate_plus.migration.lighthouses.yml from scratch; most experienced migrators start with an existing migration and tailor it to their needs. In this case, we'll start with the core Drupal 7 node migration (web/core/modules/node/migrations/d7_node.yml).

Let's break up the configuration of the new lighthouse migration into three parts: 

  • Everything before the "process" section.
  • Everything after the "process" section.
  • The "process" section.

Everything before the "process" section

Our starting point (d7_node.yml) looks like this:
 

id: d7_node
label: Nodes
audit: true
migration_tags:
  - Drupal 7
  - Content
deriver: Drupal\node\Plugin\migrate\D7NodeDeriver
source:
  plugin: d7_node

Let's update it to look like this:

id: lighthouses
label: Lighthouses
source:
  plugin: 'csv'
  path: '/var/www/html/web/modules/custom/lighthouse_importer/data/Lighthouses-USA-updated.csv'
  ids:
    - ID
  fields:
    0:
      name: ID
      label: 'Unique Id'
    1:
      name: Lon
      label: 'Longitude'
    2:
      name: Lat
      label: 'Latitude'
    3:
      name: Name
      label: 'Name'
    4:
      name: Description
      label: 'Description'

The main difference is the definition of the "source". In our case, since we're using a .csv as our source data, we have to fully define it for the migration. The Migrate Source CSV module documentation is very helpful in this situation.

Note that the "path" value is absolute. 

The "ids" section informs the migration system which field(s) is the unique identifier for each record.

The "fields" section lists all of the fields in the .csv file (in order) so that they are available (via their "name") to the migration. 
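Because rollbacks and updates key off the "ids" definition, it's worth sanity-checking that the ID column really is unique before importing. A quick helper sketch (the example file name is the one used in this article; the function itself works on any .csv with a header row):

```python
import csv
from collections import Counter

def duplicate_ids(csv_path, id_field="ID"):
    """Return the values in the ID column that appear more than once."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        counts = Counter(row[id_field] for row in csv.DictReader(f))
    return [value for value, n in counts.items() if n > 1]

# e.g. duplicate_ids("Lighthouses-USA-updated.csv") should return [].
```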

Everything after the "process" section

This is often the easiest part of the migration configuration system to write. Often, we just have to define what type of entity the migration will be creating as well as any dependencies. In this example, we'll be creating nodes and we don't have any dependencies. So, the entire section looks like this:

destination:
  plugin: entity:node

The "process" section

This is where the magic happens - in this section we map the source data to the destination fields. The format is destination_value: source_value.

As we aren't migrating data from another Drupal site, we don't need the nid or vid fields - we'll let Drupal create new node and revision identifiers as we go.

As we don't have much source data, we'll have to set several default values for some of the fields Drupal is expecting. Others we can just ignore and let Drupal set its own default values.

Starting with just the mapping from d7_node.yml, we can modify it to:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status: 
    plugin: default_value
    default_value: 1

Note that we set the default language to "und" (undefined) and the default author to UID=1 and status to 1 (published). The only actual source data we're mapping to the destination (so far) is the "Name", which we are mapping to the node title.

One thing that is definitely missing at this point is the "type" (content type) of node we want the migration to create. We'll add a "type" mapping to the "process" section with a default value of "lighthouse".  

We have three additional fields from the source data that we want to import into Drupal: longitude, latitude, and the description. Luckily, the Geofield module includes a migration processor, which allows us to provide it with the longitude and latitude values and it does the dirty work of preparing the data for the Geofield. For the Description, we'll just map it directly to the node's "body/value" field and let Drupal use the default "body/format" value ("Basic HTML"). 

So, the resulting process section looks like:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status: 
    plugin: default_value
    default_value: 1
  type:
    plugin: default_value
    default_value: lighthouse
  field_location:
    plugin: geofield_latlon
    source:
      - Lat
      - Lon
  body/value: Description

Once complete, enable the module using 

drush en -y lighthouse_importer

It is important to note that as we are creating this migration using a Migrate Plus configuration entity, the configuration in migrate_plus.migration.lighthouses.yml is only imported into the site's "active configuration" when the module is enabled. This is often less-than-ideal, as it means that every time you make a change to the migration's .yml, you need to uninstall and then re-enable the module for the updated migration to be imported.

The Config devel module is often used to automatically import config changes on every page load. Note that this module is normally for local use only - it should never be used in a production environment. As of the authoring of this blog post, the patch to make Config Devel compatible with Drupal 9 is RTBC. In the meantime, you can use the following to update the active config each time you make a change to your lighthouses migration configuration:

drush config-delete migrate_plus.migration.lighthouses -y  && drush pm-uninstall lighthouse_importer -y && drush en -y lighthouse_importer

Testing and running the migration

Use the migrate-status (ms) command (provided by the Migrate Tools module) to check the status of our migration:

$ drush ms lighthouses
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 
  Group               Migration ID   Status   Total   Imported   Unprocessed   Last Imported  
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 
  Default (default)   lighthouses    Idle     814     0          814                          
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 

If everything looks okay, then let's run the first 5 rows of the migration using the migrate-import (mim) command:

$ drush mim lighthouses --limit=5
 [notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'

Confirm the migration by viewing your new nodes of type "lighthouse"!

If all looks good, run the rest of the migration by leaving out the --limit=5 bit:

$ drush mim lighthouses          
 [notice] Processed 804 items (804 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'

If you don't like the results, then you can rollback the migration using "drush migrate-rollback lighthouses" (or "drush mr lighthouses"), make your changes, update the active config, and re-import. 
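The full edit-and-retry loop described above can be sketched as one sequence of the drush commands already introduced in this post:

```shell
# Roll back the imported nodes.
drush migrate-rollback lighthouses
# ...edit migrate_plus.migration.lighthouses.yml...
# Re-import the updated config by reinstalling the module.
drush config-delete migrate_plus.migration.lighthouses -y && drush pm-uninstall lighthouse_importer -y && drush en -y lighthouse_importer
# Run the migration again.
drush mim lighthouses
```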

Next steps

There's a lot more to the Drupal migration system, but hopefully this example will help instill some confidence in you for creating your own migrations. 

The "Leaflet Views" module (included with Leaflet) makes it easy to create a view that shows all imported lighthouses on a single map (see the image at the top of the article). Once you have the data imported, there's so much that you can do!
 

Jun 27 2020

Ashraf Abed, founder of Debug Academy and Drupal.tv talks with Ryan about the Debug Academy's long-form Drupal training. Also, Mike and Ryan take a trip around recent events in the Drupal Community.

URLs mentioned

DrupalEasy News

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Jun 26 2020

Drupal 9 was launched on June 3, 2020. Given this, enterprises will need to upgrade sooner or later to retain complete functionality and the ability to receive security updates within the bi-yearly release cycles.

In the past, migrating from one version to another has been similar to moving from another CMS to Drupal, demanding considerable time and effort.

However, the upgrade from D7/8 to D9 is much easier and far less painful. Let’s dive into more details and understand why moving on to Drupal 9 would be a better choice.

Why Should You Upgrade?

With the end of life approaching for Drupal 7 and 8 soon, operating the website on them securely and with complete functionality won’t be a feasible option.

At the same time, it might also be overwhelming for Drupal 7/8 site owners to learn that their website will need the upgrade, especially when their site is running absolutely fine; understandably, this causes confusion.

 

Here are 3 reasons why you should consider upgrading your site to Drupal 9:

 

  1. The Drupal security team will soon no longer provide support or security advisories, jeopardizing your website’s and its users’ security
  2. D7 and D8 releases on all project pages will be flagged as ‘not supported’. D7/8 may be flagged as insecure in 3rd-party scans, making integration with other third-party tools and systems challenging
  3. Leading hosting service providers like Acquia and Pantheon will also soon withdraw their support from D7, leaving you with little choice but to assume hosting responsibility for maintaining your application and server-level configurations

The good news for Drupal 7/8 site owners is that even when it goes out of official support in November 2022, remaining Drupal 7/8 sites won't stop working at that point.

Should an Existing Drupal 7 Site Be Upgraded to Drupal 8 or 9?

One of the major reasons that more than seven hundred thousand Drupal 7 sites still haven’t migrated to Drupal 8 is the known challenges in the migration process. And with the majority of people on Drupal 7, it is quite likely that most of them did not want to upgrade their CMS twice in the span of one year.

A safe bet seems to be migrating from Drupal 7 to Drupal 9. But will the site be secure? Let’s get to know a few facts.

Since D8 and D9 are similar except for the removal of deprecated code and third-party updates in D9, it would be a feasible option for enterprises to migrate straight to D9 instead of D8 - saving them from going through the same process twice and investing time, money, and effort unnecessarily.

What’s New in Drupal 9?

There are innumerable capabilities added in Drupal 9, which will be consistently updated biannually to help enterprises stay up-to-date.

Once you upgrade your system to D9, you won’t need to make major changes the next time you plan to update it to a newer version. 

Here are some of the new capabilities that are added to D9-

  1. Backward compatible

    Drupal 9 is backward compatible, i.e., it is compatible with its predecessor, Drupal 8. That means D9 will be able to use modules, configurations, and data created on D8, unlike the case between D7 and D8.
    Additionally, preserving this functionality won’t burden Drupal with historical baggage and so the performance of the system will remain unaffected. The Drupal community has also focused on breaking code and not the data.
    This way, Drupal will remain fast, clutter-free, and yet an up-to-date technology.

  2. Faster and Better Performance

    Drupal 9 has taken it further to extend its support for responsive images, wherein mobiles can display the best-sized images and hence consume less data.
    In a recent webinar, Dries mentioned that versions from Drupal 9.1 onwards will drive innovation and pave the way for faster and better-performing websites. The Drupal 9.1 update is due just six months after the release of Drupal 9. Meanwhile, here are some of the features of D9 that you can leverage for efficient workflows-

        A.  BigPipe increasing page view performance and supporting faster initial page loading

        B.  Content Workflow allowing you to define multiple workflows

        C.  Multilingual capabilities

        D.  Structured Content- Drupal 9 comes with an array of available fields, encompassing phone, email, date, and time.

  3. Cleaner code base

    Drupal 9 has removed support for the code deprecated in D8. This ensures that code marked as deprecated will no longer be supported or used in the Drupal ecosystem. 
    The motive behind this is to make D9 a cleaner version, so that D8 modules that want to become compatible with D9 must first eliminate the deprecated code. 
    Thus, the end result is clear- to make the code more nimble and improve the website’s performance.

  4. Newer Major Versions of Symfony and Twig

    Drupal 9 replaces Symfony 3 with Symfony 4.4, with Symfony 5 expected down the line, and the Drupal community has also introduced an upgrade to Twig 2.0. These upgrades result in enhanced performance, improved developer experience, and enhanced security.

  5. Panelizer will be removed and replaced 

    What’s new in Drupal 9? Well, Panelizer has been removed and replaced with Layout Builder, the “star” module of the moment.

  6. Headless CMS

    Drupal 8 and 9 both come with an API-first approach. Dries also mentioned in the webinar that the Drupal community is vigorously capitalizing on headless CMS to enhance users’ experience by pairing Drupal with a powerful JavaScript front-end framework like React or Angular. 

The essential features of Drupal Headless CMS are-

  • Front-End Freedom
  • Create Once, Publish Anywhere
  • API-First Approach
  • Easier Resourcing

Drupal 9 is more usable, accessible, inclusive, flexible and scalable than previous versions, with the following updated features-

  • It will be significantly easier for marketers to use D9
  • Simpler than ever to maintain and upgrade for developers
  • D9 is experimenting with its headless or decoupled capabilities

Additionally, you can also learn from our previous blog where we have explained how to find and fix the deprecated code - Site Owner’s Guide to a Smooth Drupal 9 Upgrade Experience.

Why Remove Deprecated Code in Drupal 9?

To ensure that D8 modules remain compatible with D9, it’s essential to remove deprecated code- 

  1. The all-new Drupal 9-ready code gets deployed on Drupal 8 sites, where issues can be tested.
  2. Drupal 9 is a continuation of the fully-tested and stable codebase of Drupal 8

With time, the effort is being made to make Drupal better. There are functions that have been around for a long time but are not a good fit for the latest release. Most were deprecated in Drupal 8.7.0 and are removed in Drupal 9.

To sum it all up, the key to achieving a smooth transition to Drupal 9 is to roll out your migration plan within deadlines and save yourself from unnecessary hassle later on.

Srijan is working with leading enterprises to help them migrate their digital web properties to Drupal 9 for better user experience. 

If you are also looking for a smooth upgrade/migration process for your enterprise’s system, we are all ears and excited to assist you. Contact Us!

Jun 26 2020

For our Drupal distribution we needed to redirect all anonymous users to the login page, as for now it's implemented as a closed platform for social collaboration. The Drupal 8 way didn't work anymore; we fixed it for Drupal 9 and published a working module. 

So if you're building a Drupal social intranet, collaboration tool or community, this might help you direct them the right way - so they don't get an unfriendly 'access denied'.

Keep in mind that you still have to build the correct access control into all your pages with help of permissions / access checks / advanced route access, this module doesn't provide for that.

Clone and run off

We published the Drupal module here on Github so you can copy it and run off with it to do whatever you need. At the moment it's not a published Drupal.org project with all kinds of configurable stuff.

A short explanation of the code

First you'll have to implement an 'Event Subscriber' in the .services.yml file.

drupal event subscriber

More info: Subscribe to and dispatch events.
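As a rough sketch of what that service definition can look like (the module and class names here are assumptions, not necessarily those in the Github repo):

```yaml
# mymodule.services.yml -- names are illustrative.
services:
  mymodule.anonymous_redirect_subscriber:
    class: Drupal\mymodule\EventSubscriber\AnonymousRedirectSubscriber
    arguments: ['@current_user']
    tags:
      - { name: event_subscriber }
```

The `arguments` line wires up the current-user dependency injection, and the `event_subscriber` tag is what makes Drupal register the class with the event dispatcher.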

Next, it took a while before we figured it out, but this code in /src/EventSubscriber/AnonymousRedirectSubscriber.php is all it takes to make it work:

Drupal anonymous redirect subscriber

  1. More info on responding to events in Drupal here on Drupalize.me.
  2. Get current user with dependency injection.
  3. Get current request with ->getRequest(), here's what differs from the Drupal 8 version: we couldn't get ->getRouteName() to work properly.
  4. Check if the current user is anonymous, but exclude some paths. Also, we had to facilitate user/reset/* with PHP's fnmatch, because that path contains a variable (the reset hash).
  5. Respond with the redirect to Drupal's login page.
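Putting those five points together, a subscriber along these lines would do the job. This is a sketch reconstructing the screenshot, with an assumed module name and an assumed list of excluded paths; the authoritative version is in the Github repo:

```php
<?php

namespace Drupal\mymodule\EventSubscriber;

use Drupal\Core\Session\AccountProxyInterface;
use Drupal\Core\Url;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\RedirectResponse;
use Symfony\Component\HttpKernel\Event\RequestEvent;
use Symfony\Component\HttpKernel\KernelEvents;

/**
 * Redirects anonymous users to the login page.
 */
class AnonymousRedirectSubscriber implements EventSubscriberInterface {

  /**
   * The current user, injected via the service definition.
   *
   * @var \Drupal\Core\Session\AccountProxyInterface
   */
  protected $currentUser;

  public function __construct(AccountProxyInterface $current_user) {
    $this->currentUser = $current_user;
  }

  public static function getSubscribedEvents() {
    return [KernelEvents::REQUEST => ['onRequest']];
  }

  public function onRequest(RequestEvent $event) {
    // Work from the request path, since getRouteName() was unreliable here.
    $path = $event->getRequest()->getPathInfo();
    // Paths that must stay reachable for anonymous users; user/reset/*
    // contains a variable hash, hence fnmatch().
    $allowed = ['/user/login', '/user/password', '/user/register'];
    if (!$this->currentUser->isAnonymous()
      || in_array($path, $allowed, TRUE)
      || fnmatch('/user/reset/*', $path)) {
      return;
    }
    // Respond with a redirect to Drupal's login page.
    $event->setResponse(new RedirectResponse(Url::fromRoute('user.login')->toString()));
  }

}
```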

Drupal 8 version

You can find the previous Drupal 8 code here, and we also found the Anonymous login module, but both didn't work in our Drupal 9 (then beta) install.

'quick-start' Drupal to test

To test this module fast in a fresh install, Drupal's quick-start might come in handy.
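From a checkout of Drupal core, quick-start installs and serves a disposable site (SQLite database, PHP's built-in web server) with a single command:

```shell
# Run from the Drupal document root.
php core/scripts/drupal quick-start
```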

Need this as a contrib Drupal module?

Please let me know in comments down below, if there is enough positive feedback on this we might make it configurable and contrib it!

Jun 25 2020

The fundamental building blocks of running an efficient and user-focused public transportation network and building a well-designed, effective, and user-centric website are actually pretty similar: You need talented people, quality data, and elite technology to get the most out of your investment.

That’s why the widespread adoption of open data standards combined with an effective and affordable technology like Drupal helps to ensure that public transit works for all users.

Ultimately, the key to great transit service is not about getting 100 percent of people to ride public transit for 100 percent of their trips. Success comes from giving people a viable choice of getting around without needing to drive -- a choice built on affordability, convenience, and quality.

Giving people viable choices to get around does not end with good urban planning, congestion management, and low fares. It includes giving people the information they need to plan trips, to plan their day, and to plan travel across, through, and around their cities using transportation solutions that meet their evolving mobility needs.

Where does most of that information come from? Open Data.

Open Source & Open Data: A Smooth Ride

Many cities have General Transit Feed Specification (GTFS) feeds available online. These are usually curated by regional public transit agencies. 

GTFS is just one example of many Open Data resources available and in use by transit agencies. But having access to that data is only part of the equation. The important question to answer is how are they to manage that data and repurpose it in a way that is responsive, accessible, meaningful, and convenient for people to consume? 

Transit authorities, including such mass transit hubs as Santa Clara Valley Transportation Authority, Bay Area Rapid Transit District, and New York City Metropolitan Transportation Authority, are turning to open source technologies, like Drupal. 

Why? Because it is possible to handle real-time data in Drupal and harness such entities as GTFS, Google APIs, and other APIs to fuel a great-looking, purpose-driven site, be it on a smartphone, an iPad, or pushed outward to something else entirely, like digital billboards and signage solutions.

Why Drupal for Transportation?

The Drupal open-source content management system (CMS) fits the unique needs of the transportation and transit industry. 

Drupal supports:

  • high-traffic websites with hundreds, thousands, or more registered users of varying privileges and access roles;
  • websites that require the ability for many users to act as contributors and publish content in the form of pages, articles, blog posts, and forum posts;
  • sites with complex structures that require a finely tuned architecture to serve relevant content to its end users;
  • organizations that demand very high security of their websites; and
  • websites that receive a high volume of traffic and require a solid backend in order to ensure functionality in spite of traffic spikes.

Drupal is non-proprietary and benefits from one of the largest open-source communities in the world. It has more than a million passionate developers, designers, trainers, strategists, coordinators, editors, and sponsors working together to constantly develop, iterate, update, refine, and improve its technology.

In addition, thousands of Drupal service providers benefit from support through digital experience platform Acquia’s forward-thinking, ever-expanding catalogue of enterprise-ready technology solutions and technical support. 

I encourage you to connect with us to learn more about Drupal solutions in transportation -- or any other large-scale industry. We’d also love to receive the opportunity to bid on your next project or complete an RFP. 

Jun 25 2020

Brevity and simplicity are great characteristics of web page URLs. No wonder that various URL shorteners are very popular. Our web agency that creates appealing, feature-rich, and user-friendly websites, shares a post on URL shorteners: what they are, why they are used, and how to integrate a URL shortener with a website.

If your website is built on Drupal, the second part of the post will be especially interesting for you because we will discuss URL shortener integration on Drupal websites.

What is a URL shortener?

A link shortener is a service that reduces the length of your URL. Even web addresses over a hundred characters long can turn into trimmed links that do not exceed 20 characters. The underlying technology is web page redirection: a user clicks on the short address and is navigated to the long one. Famous examples of link shorteners include, but are not limited to, Bitly, Google’s URL Shortener, TinyURL, DyingLinks, Bag.gy, Ow.ly, and Yourls.

Bitly link shortener

Reasons to use a link shortener

Initially used for Twitter, link shorteners became a common practice in a variety of marketing activities. Here are a few scenarios in which the link shortening technique is used:

  • There are restrictions in the allowed number of characters imposed by messaging apps — for example, link shorteners became especially popular due to Twitter’s rule of 140 characters in one tweet.
  • You need to make the URL easy to remember and to save from a printed source (banner, presentation, etc.)
  • Your goal is to beautify your links and make them more aesthetically appealing, which encourages more visitors to click.
  • You want to learn your customers’ behavior by using the monitoring features of popular URL shorteners.
  • You want to use the UTM parameters, or tags added to your URLs for the purpose of tracking the user behavior. These usually look long but link shorteners can make them much shorter.
  • You want to use the advanced features like changing destination addresses, link retargeting, traffic splitting, and more.
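To illustrate the UTM scenario: a tagged URL quickly becomes long and unwieldy, which is exactly what a shortener hides (the domain and parameter values below are made up):

```
https://example.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale&utm_content=cta_button
```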

Link shortener integration with a website built on Drupal

You can use link shorteners directly on their own websites where you just need to enter the long address and grab the short address.

However, it’s a more interesting idea to integrate a link shortener with a website. In particular, we will see how to integrate Drupal with a link shortener.

To set up URL shortening in Drupal, you can use a third-party service integration via a URL shortener API or just the Pathauto module to auto generate short web addresses. However, the most convenient way is to use a contributed module to perform URL shortener integration for Drupal.
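As a sketch of the API route: many services expose a trivial HTTP endpoint. TinyURL's public api-create.php, for example, returns the short URL as plain text (the long URL below is made up):

```php
<?php

// Shorten a URL via TinyURL's public create endpoint; no authentication required.
$long_url = 'https://example.com/some/very/long/path?with=many&query=parameters';
$short_url = file_get_contents('https://tinyurl.com/api-create.php?url=' . urlencode($long_url));
print $short_url;
```

Contributed modules wrap exactly this kind of call behind Drupal's UI and API.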

Drupal modules to integrate your website with a link shortener

Shorten URLs

Shorten URLs is an easy-to-use and lightweight module that provides an API to integrate Drupal with TinyURL, Bitly, and a dozen other link shortener services. It is used by other modules as well because it is primarily an API. The Shorten URLs module is ready for both Drupal 7 and Drupal 8 sites.

It offers an intuitive user interface for URL shortening as a page or a block. In addition, there is a block that displays the current page’s shortened address for quick copying.

The module comes packed with four submodules:

  1. The Shorten URLs as the main module provides an API and user interface for URL shortening using common services.
  2. The Record Shortened URLs submodule tracks and shows the statistics of shortened URLs to website admins.
  3. The Shorten URLs Custom Services submodule allows you to add custom URL shortening services via the UI.
  4. The Shorten URLs Input Filter submodule offers an input filter for automatic URL shortening in the text.
Shorten URLs Drupal module

ShURLy

ShURLy is a URL shortening service created as a Drupal module. It has a stable Drupal 7 version, but Drupal 8 websites will need to wait a little because ShURLy for Drupal 8 is in development. The ShURLy module has plenty of interesting features, some of which are as follows:

  • Clicks are tracked on short URLs.
  • Authenticated users can track their URLs activity.
  • You can auto-generate URLs or enter custom short URLs.
  • There are fine-grained permissions based on user roles.
  • There is a click-to-copy-to-clipboard feature.
  • URLs are case sensitive.
  • There is a web services API for shortening or expanding the URLs.

Bit.ly for Drupal

The Bit.ly for Drupal module for Drupal 7 helps you integrate Drupal with Bitly. It provides a rich API that enables other modules to use the Bitly features (shorten or expand links and view their statistics). With this module, users can link their Bitly account to their Drupal profiles. Site admins can enter their Bitly API and oAuth details. The module integrates with the Shorten module we have described above.

Bit.ly for Drupal module

Let us help you integrate your website with a link shortener

Hopefully, the basics of how to integrate with a link shortener will be helpful to you. Contact our web agency for Bitly integration with Drupal or if you want to use any other link shortener on any kind of website. We will install and configure the right modules or create custom functionality from scratch at very affordable anti-crisis prices.

Jun 25 2020

Gábor Hojtsy

An avid open source enthusiast and contributor. I am the Drupal 9 initiative coordinator, Drupal core product manager and initiative coordinator coordinator, working with and on the open source project itself at Acquia. I am a regular Drupal event speaker and organizer and do communication and social media for various initiatives.

I used to be the Drupal 8 multilingual initiative lead and the former release manager of Drupal 6.

You can also find me passionate about singing, music and amateur acting, especially when these are all combined, however I have little time for that alongside my two adorable kids.

Head to the contact page to send a mail.

Jun 25 2020
Our implementation of serverless, decoupled Drupal enabled a luxury resort client to consolidate their web properties and serve multilingual content to a worldwide audience.
Jun 25 2020
Choosing the right support vendor is a challenging process. Here's how you can tell if your Drupal services provider is right for you.
Jun 25 2020
Drupal 8 developers, making Drupal 8 PHP 7 ready is one of the best things to happen for us. This is why.
Jun 25 2020

One of the things that is remarkable with Drupal 8 (and Drupal 9) is the support for multiple languages. You can translate anything, so making a page completely Norwegian is something Drupal supports out of the box.

If you download Drupal and a couple of modules, you can with just a few clicks have almost all of the interface translated into Norwegian, since translation is a community effort. This means that most of the strings we and our users see during a normal work day, are already translated by us, or someone else in the community. A very special shout out here to svenryen, who almost by himself went through all the strings in Drupal 8.

Today I want to introduce another interesting challenge, and how we are able to solve that with Drupal. Namely deploying new features that should contain translations.

At Ny Media we develop tailored and highly customized solutions for our clients. This way we ensure that their result will make their job and their business easier and more profitable. To do this, we often end up writing custom code for use either between our own projects, or for a specific client.

Translating Drupal Core is a community effort. But brand new strings coming from features that we just created in custom code is not something we can rely on the community to get translated. So we have to do it ourselves. The challenge here lies in how we can get the strings translated on the production site, when the strings are only found in this specific project. Let’s look at a couple of theoretical solutions, and then look at the solution we use. Which we by coincidence think is the best, and now we want to share this with you.

Let’s imagine that we are developing a feature where we display some Copyright information on every page, and we want to translate that. Let’s just say that we call this module “copyright_info”. And maybe we have some code that looks like this:

/**
 * Implements hook_preprocess_HOOK().
 */
function copyright_info_preprocess_page(&$variables) {
  $variables['page']['content'][] = [
    '#markup' => t('Thank you for visiting this site, and we hope you like the content. However it is copyrighted so please do not steal it.'),
  ];
}

Now on every page we will have the whole interface in Norwegian. The only exception is the part with copyright, since that is something we just wrote ourselves.

Screenshot from translated frontpage. Note the untranslated custom string.

Option 1: Translating “by hand” when the new feature is deployed.

This is the easiest and most obvious way to do it. After you have deployed your new feature, you go into the administration form for your site. This part is located under Administration -> Configuration -> Regional and language -> User interface translation, or path admin/config/regional/translate. There we search for the string(s) we just made and translate them one by one.

String translation

This is a very transparent and easy solution. And that has its advantages:

  • Requires no additional build steps to import translations
  • Requires no additional modules to enable

However, it has some disadvantages as well:

  • Tedious to do for more than one translation.
  • Error prone (Copy-paste errors, or typos)
  • You end up having a deployed site with a temporarily untranslated string while you are manually editing the translation(s).

In this example case this might be a good solution, since we only have that one string. But once you get more than one string to translate, you really do not want to do that by hand. And if you have a high-traffic site, you probably do not want your site to be untranslated while you work.

Option 2: Exporting translation on a dev/staging site, and importing it on the live site

Another option that is available out of the box, is to export all the translated strings you have on one site, and then import them on another.

Export strings from development or staging site: first, export and download the strings from a copy of the site where your new strings are translated. Import on live site: then, upload and import the strings on the live site.

This is also fairly easy to do so it does have similar advantages:

  • Requires no additional build steps to import translations
  • Requires no additional modules to enable
  • Supports importing of huge amounts of text

It does however come with some disadvantages:

  • Still manual work, and might leave your site temporarily untranslated
  • Requires manual intervention
  • The translations batches are not version controlled

Option 3: Importing custom translations as a part of your build process.

As you probably have figured, this is the option I wanted to highlight in this blog post, and this is the method we are currently using in Ny Media. To achieve this we use a Drupal module called Custom translation deployments. This is a module we developed as part of a client project, which in turn ended up being our standard setup for deploying translations on our projects.

The module provides two ways of providing translations to easily deploy as part of the build process. We will look at the most simple here, and then briefly mention a slightly more advanced way.

The first step of being able to deploy translations is making sure your translations are committed to your version control system. A default Drupal installation will set the translation directory to sites/default/files/translations. This is typically a directory ignored by the version control system. So we start this process by changing it to somewhere we can check in to our version control system, like ../translations. To change it, we go to the File system page located under Administration -> Configuration -> Media, or at path admin/config/media/file-system.

File settings: since the project is located one level below the Drupal installation, we place the translations in a translations folder in the root of the directory.

Here in Norway we are usually looking for Norwegian translations for our Norwegian clients. The module Custom translation deployments comes with a predefined translation file pattern. Meaning if we place a file called project_specific-custom.LANGUAGE.po (or in the case of Norwegian, project_specific-custom.nb.po) in the newly defined and created translation folder, it will get imported. Let’s try that.

First let's make sure the file is in place.

$ ls ../translations
project_specific-custom.nb.po
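The .po file itself is plain Gettext: each source string paired with its translation. For the copyright example it could look like this (the Norwegian wording here is just illustrative):

```
msgid ""
msgstr ""
"Content-Type: text/plain; charset=UTF-8\n"

msgid "Thank you for visiting this site, and we hope you like the content. However it is copyrighted so please do not steal it."
msgstr "Takk for at du besøker siden, og vi håper du liker innholdet. Det er imidlertid opphavsrettsbeskyttet, så vennligst ikke stjel det."
```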

Now one can either use the interface (under Administration -> Reports -> Available translation updates - admin/reports/translations) or do this with drush.

The other convenient thing is to add this as part of your deployment script. Maybe you have some sort of Continuous Deployment tool for it, or maybe you just use a manual script and run it on your server. Whatever boat you are in, you can always do something like this:

$ drush sdel locale.translation_last_checked && drush locale-update && drush cr
>  [notice] Translation file not found: http://ftp.drupal.org/files/translations/8.x/project_specific/project_specific-custom.nb.po.
>  [notice] Checked nb translation for project_specific.
>  [notice] Imported nb translation for project_specific.
>  [notice] Translations imported: 1 added, 0 updated, 0 removed.
>  [notice] Message: En oversettelsesfil importert. /1/ oversettelser ble lagt til, /0/ 
> oversettelser ble oppdatert og /0/ oversettelser ble fjernet.

This is actually 3 commands. The first one forces Drush to think it is time to update the translations. The second one updates the translations, and the third one clears the cache. The third step is important if you have translated strings in your JavaScript.

But it is also possible to do this through the interface. Which would look something like this.

Interface for importing a local file

One other thing to note is that having the project defined as a locale project means you can have a translation file on a server, and Drupal will look for updates to it. As you can see from the drush output above, it will look for it in a specific pattern. This is defined as part of the hook implementation for custom_translation_deployments. So if you want a translation file for your organization, you can have a custom module implementing this hook, and thereby also make it available to be updated and downloaded automatically. Here is one such example:

/**
 * Implements hook_custom_translation_deployments_files().
 */
function my_org_custom_translation_deployments_files() {
  $items = [];
  $items[] = [
    'name' => 'my_org',
    'project_type' => 'module',
    'core' => '8.x',
    // We set the version to something static, but not to "dev".
    'version' => 'custom',
    'server_pattern' => 'http://my_org.com/files/translations/%core/%project/%project-%version.%language.po',
    'status' => 1,
  ];
  return $items;
}

This way we can now either ship a translation file called my_org-custom.nb.po or a file on http://my_org.com/files/translations/8.x/my_org/my_org-custom.nb.po.

The last part I would recommend is to look for translation updates only in the local filesystem on your production server, and only manually. This way, Drupal will never try to download new translations on your live site, creating conflicts with your version-controlled translations. The setting for this is at Administration -> Configuration -> Regional and language -> User interface translation -> Interface translation settings, or path admin/config/regional/translate/settings.

Admin settings for translations

You can also do this as part of the settings on the live site, and instead keep this setting on for development sites. To do that in settings.php you would do this:

$config['locale.settings']['translation']['use_source'] = 'local';

Flexible translation methods

However advanced your methods of deployment and translation are, Drupal has a way for you to handle them. And when Drupal out of the box cannot support your preferred method of deployment or automation, contributed projects like Custom translation deployments can help you the last steps of the way. The module is fully tested, used in production on many sites, and supports Drupal 9 out of the box!

If you are looking for a partner in developing your localized Drupal site, contact Ny Media!

Jun 25 2020

Website redesign decisions often percolate for months, if not years, before action is taken and the team dives in for a do-over. Prior to taking the plunge, there tends to be a lot of low-grade dissatisfaction that gains momentum, as more and more conversations focus on all the ways that operations would run smoother, marketing would work better, or all of the tasks that could be offloaded to the website if only navigation was simpler or the site was up to date. 

During my many years in the web design and digital strategy world, I’ve witnessed lots of discontent over existing websites. Here are eight indicators that it is truly time to take action.


1. Your information architecture is organized around your org chart and not user needs.

Too often, websites are architected from the perspective of insiders -- organized by department, without stepping back to ask: Who visits the site? What kinds of information are they seeking? How can we align the navigation of the site around what makes sense to users? The best web experiences leverage radical empathy, and begin with Human-Centered Design processes that dig deep and question any and all assumptions about how users interact with the website, so that it can be designed to make information easier to find and the experience of visiting the site engaging and value-added.
 

2. Your site isn’t mobile responsive or accessible.

Web designs in the current world need to work equally well on a wide range of screen sizes and different orientations -- from tiny hand-held screens to oversized desktop monitors. Responsive websites also tend to load faster, which is a critical component for a positive user experience, as well as a big factor in better SEO rankings. Your site also needs to adhere to current Web Content Accessibility Guidelines (WCAG 2.1) to ensure compliance for users with a wide range of disabilities. 
 

3. Your user interface (UI) is dated or dull

Expectations are high and patience is low in the current world of online engagements. Once web users get accustomed to well-designed, aesthetically pleasing sites, they have little patience for outdated design elements. Colors, typography, images, and layout need to align with your brand, make sense to users, and present a balanced, uncluttered flow of information. When this is not the case, it takes a toll on your business.
 

4. Your user experience (UX) is sub-par

More so than ever before, web experiences that are difficult to navigate can cut visits short, drive away customers, and reflect poorly on your overall brand. Don’t underestimate the degree to which a frustrating web experience can turn off -- or even enrage -- users. At the same time, a web experience that is built upon an understanding of pain points in order to create an engaging, streamlined and uncomplicated experience is a true delight that can significantly boost many aspects of your marketing objectives. 


5. Your content is outdated.

Ensuring that website content is up-to-date may sound like an obvious objective, but when web experiences start to lag behind what’s going on in the organization, the impact can start to snowball. Quickly, it’s not just the content that’s out of date, but the branding, the messaging, and the overall tone might all fall out of alignment -- not to mention lost SEO opportunities. While a redesign can get a website in sync with the current organizational vision, migrating to the most up-to-date CMS ensures a framework for more easily making changes that keep your content up to date.

6. You are increasingly suffering from “website envy.”

When online experiences consistently spark a sense of “Why can’t our site do that?” or “Why doesn’t our site look more like this?” it’s time. Your site can step up and serve as a strategic driver in an environment in which your web presence is an increasingly defining factor for your organization, your brand, and your values.

7. Your website uses Flash.

Once the dominant software platform for production of animations and embedded web browser video players, Flash has long since fallen from favor. In 2016, Adobe announced that it would be ending support for Flash at the end of 2020. If your website uses Flash, or other outdated technologies, that’s a good sign that you can benefit from a redesign and migration to up-to-date solutions.

8. Your CMS is no longer supported or is soon to lose support. 

Both Drupal 7 and Drupal 8 are heading toward end-of-life status, after which point, the Drupal community will no longer be maintaining either version. For websites that are currently on Drupal 7, this end date represents a much-needed incentive to migrate over to a far superior CMS and a vast array of new features. Websites that have already migrated to Drupal 8 can count on a seamless upgrade to Drupal 9 that’s more akin to a point release. The migration process is an absolute ideal time to also redesign a site. In fact, combining the two initiatives represents an opportunity to create alignment among functionality, look and feel, branding, UI, UX -- every objective that your website is intended to achieve.

As Vice President of Digital Experience for Promet Source, the most gratifying part of my work involves helping to shepherd the transformation of websites from a source of frustration or simply a functional presence, to a beautiful experiences that ignites new digital possibilities. We’d love to talk with you about challenges with your current site and all that you hope to achieve with a redesign. Contact us today!
 

Jun 24 2020
Jun 24
Date: 2020-June-24
Description:

Previously, Drupal 7's end-of-life was scheduled for November 2021. Given the impact of COVID-19 on budgets and businesses, we will be extending the end of life until November 28, 2022. The Drupal Security Team will continue to follow the Security Team processes for Drupal 7 core and contributed projects.

However, this means extra work from the Drupal community at large and the security team in particular to review security reports, create patches, and release security advisories for Drupal 7. This community effort will give site owners more time while budgets recover, but the organizations that sponsor security team members and the individual security team members who volunteer their time could use your support. If you can, please donate to support the end-of-life extension.

Drupal 8 will still be end-of-life on November 2, 2021, due to Symfony 3's end of life. However, since the upgrade path from Drupal 8 to Drupal 9 is much easier, we don't anticipate the same impact on end-users.

What does this mean for my Drupal 7 site?

You can continue to run the site and get security updates via the normal channels and processes. This will give you an extra year to work on converting your site to Drupal 9.

Do I need to upgrade to Drupal 8 before I upgrade to Drupal 9?

Migrating directly from Drupal 7 to Drupal 9 is supported with the core Migrate module. Read more on preparing a Drupal 7 site for Drupal 9.

How can I help?

Consider donating to support this effort. If you are a representative of a large end-user of Drupal, we'd love you to join the Drupal Association and the security team as a partner.

You can also consider getting more involved in fixing issues in the issue queue or joining the Security Team as a way to support the effort.

What about Drupal 7 Vendor Extended Support?

The extended support will now run from November 2022 until November 2025. You can read more about the Drupal 7 Vendor Extended Support program.

What about contributed projects?

The Security Team will continue to follow the Security Team processes for contributed projects. Contributed project maintainers are asked to consider supporting existing Drupal 7 releases if they are able.

Jun 24 2020
Jun 24

Layout Builder

Layout Builder, and the related ecosystem of modules, provides a set of powerful tools that allow content creators and site administrators to modify the layout of a page using a drag-and-drop interface. We've published 11 new tutorials to help you Learn Drupal's Layout Builder and create flexible layouts for your Drupal site.

Learn Drupal's Layout Builder

We're working on more tutorials on Layout Builder as well as new tutorials on managing media in Drupal and videos to accompany tutorials in the Views: Create Lists with Drupal series of tutorials.

Happy layout building!

P.S. Drupal 9 has launched! Learn more about the latest major release of Drupal and what it means for tutorial compatibility and your learning journey in our Guide to Drupal 9 video and resources page.

Jun 24 2020
Jun 24

The content management ecosystem at large and Drupal in particular have evolved tremendously over the last 20 years, and many developers who haven't engaged with the CMS market in a while may have an outdated conception of what a CMS can do. With Drupal 9's release at the cutting edge of the CMS world, now is a better time than ever to reintroduce developers at large to what Drupal can accomplish.

Stack Overflow recently invited Drupal Association CTO Tim Lehnen to write about the evolution of the CMS marketplace, and what Drupal has to offer:

For many people discussion of content management systems raises unpleasant specters of the early 2000s. But while CMS platforms may not feel like the shiniest new tech on the block, they still have a lot to offer, and they've evolved in ways that might surprise you. Let's talk about Drupal, a 20 year old open source project that still manages to be on the leading edge of the CMS world.
The Overflow

Read the article

Jun 24 2020
Jun 24

After four-and-a-half years of development, Drupal 9 was just released, a milestone in the evolution of the Drupal content management system. The Drupal Association has long played a critical role not only in supporting the advancement and releases of one of the world's largest and most active open-source software projects; it also contributes to the Drupal roadmap and drives its forward momentum in other important ways. In addition to maintenance releases for Drupal 7 and Drupal 8, the Drupal 9 release not only promises an easy upgrade for Drupal 8 users but also ushers in a new period of innovation for Drupal.

But that's not all. Drupal 9's release also means long-awaited upgrades to Drupal.org as well as some of the most essential infrastructure and services that underpin Drupal.org and its associated properties, like localize.drupal.org, groups.drupal.org, and api.drupal.org. Releases in Drupal have also garnered greater scrutiny from nefarious actors who target launch dates to seek security vulnerabilities. The Drupal Association works tirelessly to buttress all of these initiatives and responsibilities, with the support of Tag1 and other organizations.

In this Tag1 Team Talks episode, part of a special series with the engineering team at the Drupal Association, we discuss Drupal 9 and what it portends for Drupal's future with Tim Lehnen (Chief Technology Officer, Drupal Association), Neil Drumm (Senior Technologist, Drupal Association), Narayan Newton (Chief Technology Officer, Tag1 Consulting), Michael Meyers (Managing Director, Tag1 Consulting), and Preston So (Editor in Chief at Tag1 Consulting and author of Decoupled Drupal in Practice). We dove into some of the nitty-gritty, the day-in-the-life of Drupal core committers, and how Drupal is taking a uniquely new approach to tackling technical debt.

[embedded content]

---

Links

Photo by asoggetti on Unsplash

Jun 24 2020
Jun 24

Despite all the disruption and technological advances of the past few years, the structure of higher education institutions and the state of academic curricula remain closer to what they were during the early stages of industrialization.

In 2020, we are well into the digital era, and students face a set of challenges unique to their age. Technology plays a significant role, both as part of the challenge and as an opportunity to overcome it.

Can schools and universities afford to operate using the existing model? What technologies will be needed? How will higher education experiences change in the coming years?

One fact is undeniable; higher education should focus on the direct needs of students and the demands of an increasingly digital market in the future.


1. Flexible Learning Experiences

A flexible learning experience is essential these days; not just because of unexpected force majeure circumstances such as the global COVID-19 outbreak.

New challenges have arisen due to the shift in the digital economy's demand for new skill sets and capabilities. The job market has witnessed a sharp rise in demand for "digital", "innovation", "data science", and "information" in job descriptions and titles.

The current curriculum being taught is fast becoming irrelevant and seems rigid to those seeking to learn a new in-demand skill or enhance their professional status. As a result, learning centers that provide focused online workshops and certifications are increasingly becoming a popular and cheaper alternative thanks to the flexibility, focus, and accessibility they have on offer for learners.


“The learner, the learning provider and the employer all are speaking different languages that don’t interconnect,” said Michelle Weise, chief innovation officer at the Strada Institute for the Future of Work.

Universities and schools should consider the subscription-based model where students would pay monthly fee subscriptions to attend the modules and courses they prefer instead of attending 9 hour school days. Learners can complete their selected courses at their own pace and enjoy convenient accessibility.

Students should be able to access specific learning resources, materials, libraries, labs, and digital learning activities at any time from anywhere across all devices.

Flexibility and accessibility are standard requirements for any digital experience and should be at the forefront when considering an enterprise-level digital project. But it's easier said than done - identifying and investing in the right IT infrastructure will be essential to support a comprehensive digital learning experience for your students.

2. Remote and Distance Learning

The days of strict daily hours of attendance of pre-fixed classroom schedules are over. 

We were inevitably going to rely more heavily on remote and distance learning in the upcoming years; however, the recent outbreak of COVID-19 across the globe has accelerated this process.

Schools and universities have realized the importance of aligning their strategic business needs with the ideal mix of technology value-stack required to ensure their digital growth and future.

We specialize in providing higher education institutions with comprehensive digital transformation solutions based on Drupal and we witnessed a sharp increase in both verbal interest and demand from schools, universities, and even teachers for our services.

Schools and universities have immediately reallocated their resources and budgets to prioritize remote and distance learning. Unfortunately, not all schools and universities are technologically capable to embrace full-fledged remote learning as the norm.

Almost every industry has been disrupted by the outbreak of Coronavirus but schools and universities have been hit the hardest because they had to reckon with the fact that their entire business model and operations have to change. Do they even need a campus anymore?

CS: American School of Dubai

Drupal 8 Transforms the American School of Dubai

3. Immersive Learning Experiences

There is no doubt that flexibility, accessibility, and maximum immersion will be critical in appealing to students around the world. 

This has led over 70% of the top 100 universities around the world to adopt Drupal which enabled them to build a truly digital campus and immersive learning experience for their students and faculty. Drupal has enabled these institutions to seamlessly integrate essential cutting-edge solutions that complement and support their digital transformation.

Essential integrations such as VR.

VR implementation in learning is on the rise and enables any school or university around the world to offer classes to a global audience of interested pupils. For example, a student from Beijing can attend a French language class at a middle school based in San Francisco through VR.

Some universities have introduced A.I. into their learning experiences as well.

Through A.I., students can learn from a robot teacher. This teacher is capable of grading, checking spelling, grammar, etc., and can also monitor the performance of students to identify gaps in knowledge and weaknesses, which in turn can be analyzed by academic advisors to help their students improve their grades and performance.

Georgia Tech's Center for 21st Century Universities (a Drupal 8 platform) is running a number of interesting experiments to identify which solutions work and which don't, trying to find the right balance with regards to the involvement of A.I. in education and learning experiences.

Who Leads Digital Transformation?

Transformation is inevitable but it is often misguided due to lack of education (pun unintended) and awareness regarding the technologies that will be essential to elevate a university or school into a modern higher education experience.

Academics and higher education institutions must realize that technology alone is not the answer. It is only the tool.

Top management and leadership at academic institutions must drive the change and digital transformation based on facts and solid requirements in a bid to identify which technologies will be essential to build the foundation for an evolving interconnected web of digital learning assets and platforms.

Some have even adopted the wrong tools - technologies that do not match their digital transformation requirements or support their objectives.

Technology alone won’t be the answer to the existential challenges that schools and universities face. The culture of teaching and learning must also change. The real question that needs to be addressed is: what good or value is higher education supposed to deliver?

Addressing the needs of students already living in a digital environment and a digital marketplace will be paramount and Coronavirus might just be that disruptor that forces us to accelerate the digital transformation process.

One thing is for sure; it will force higher education institutions and academics to go back to the roots of education: storytelling, research, and development.

In the meantime, we will continue to provide our assistance to universities and schools in identifying the appropriate technologies needed to develop the ideal digital learning experience for their students.

Questions About Drupal 9

Questions About Drupal 9?

Jun 24 2020
Jun 24
Jun 24, 2020 Product

We collected user feedback, from which new functions were built; we also improved existing features and did a design update. The highlights:

Activity badges in main menu

As soon as there is activity in a group, team or project, you will now see this with badges in the main menu on the left:

Activity badges

Text documents / Notebooks

It is now possible to create text documents, you can also use this as a notebook. Some examples:

  • (Project) documentation
  • Ideas
  • Manuals
  • General business information
  • Onboarding information, for new employees
  • Brainstorms
  • Event preparation.
  • ... and basically everything you want to share and manage in a text document.

The documents can be added in folders, in order to provide structure for multiple documents.

Add new text document:

Text doc

Format text document:

Text doc 2

  1. Title
  2. Content
  3. Add attachments
  4. Send e-mail notifications

Organize text documents in folders:

Text doc

Group types

The generic term 'group' turned out to be too abstract, so we have now divided it into:

  • Company wide
  • Teams
  • Projects

group types

When you add new members, they are automatically placed in the 'company wide' groups and therefore have immediate access to general information such as company manual, onboarding information or house rules.

Stream items: chats vs other activity

A visual distinction has now been made between chat items and other activity:

Stream items

Design update

An upgrade has been made in the design: more modern color scheme, font and icons, which also improves readability and usability:

update design

Tuning existing features

To optimize existing features, we worked on ~30 minor issues.

Try it instantly 

Would you like to try Lucius right now? Click here to get started.

Install, host and customise it yourself, 100% open source

Please check this project page for detailed information.

Feedback, feature request

Please let us know your feedback or/and feature requests in the comments below, via our support form or via the open source issue queue.

Jun 23 2020
Jun 23

Agaric is excited to announce online training on Drupal migrations and upgrades. In July 2020, we will offer three trainings: Drupal 8/9 content migrations, Upgrading to Drupal 8/9 using the Migrate API, and Getting started with Drupal 9.

We have been providing training for years at Drupal events and privately for clients. At DrupalCon Seattle 2019, our migration training was sold out with 40+ attendees and received very positive feedback. We were scheduled to present two trainings at DrupalCon Minneapolis 2020: one on Drupal migrations and the other on Drupal upgrades. When the conference pivoted to an online event, all trainings were cancelled. To fill the void, we are moving the full training experience online for individuals and organizations who want to learn how to plan and execute successful Drupal migration/upgrade projects.

Up to date with Drupal 9

Drupal is always evolving and the Migrate API is no exception. New features and improvements are added all the time. We regularly update our curriculum to cover the latest changes in the API. This time, both trainings will use Drupal 9 for all the examples! If you are still using Drupal 8, don't worry as the example code is compatible with both major versions of Drupal. We will also cover the differences between Drupal 8 and 9.

Drupal 8/9 content migrations

In this training you will learn to move content into Drupal 8 and 9 using the Migrate API. An overview of the Extract-Transform-Load (ETL) pattern that migrate implements will be presented. Source, process, and destination plugins will be explained to show how each affects the migration process. By the end of the workshop, you will have a better understanding on how the migrate ecosystem works and the thought process required to plan and perform migrations. All examples will use YAML files to configure migrations. No PHP coding required.

Date: Tuesday, July 21, 2020
Time: 9 AM – 5 PM Eastern time
Cost: $500 USD

Click here to register

Upgrading to Drupal 8/9 using the Migrate API

In this training you will learn to use the Migrate API to upgrade your Drupal 6/7 site to Drupal 8/9. You will practice different migration strategies, accommodate changes in site architecture, get tips on troubleshooting issues, and much more. After the training, you will know how to plan and execute successful upgrade projects.

Date: Thursday, July 23, 2020
Time: 9 AM – 5 PM Eastern time
Cost: $500 USD

Click here to register

Getting started with Drupal 9

We are also offering a training for people who want to get a solid foundation in Drupal site building. Basic concepts will be explained and put into practice through various exercises. The objective is that someone, who might not even know about Drupal, can understand the different concepts and building blocks to create a website. A simple, fully functional website will be built over the course of the day-long class.

Date: Monday, July 13, 2020
Time: 9 AM – 5 PM Eastern time
Cost: $250 USD

Click here to register

Discounts and scholarships available

Anyone is eligible for a 15% discount on their second training. Additionally, if you are a member of an under-represented community who cannot afford the full price of the training, we have larger discounts and full scholarships available. Ask Agaric to learn more about them.

Customized training available

We also offer customized training for you or your team's specific needs. Site building, module development, theming, and data migration are some of the topics we cover. Check out our training page or ask Agaric for more details. Custom training can be delivered online or on-site in English or Spanish.

Meet your lead trainer

Mauricio Dinarte is a frequent speaker and trainer at conferences around the world. He is passionate about Drupal, teaching, and traveling. Over the last few years, he has presented 30+ sessions and full-day trainings at 20+ DrupalCamps and DrupalCons across America and Europe. In August 2019, he wrote an article every day to share his expertise on Drupal migrations.

We look forward to seeing you online in July at any or all of these trainings!

Jun 23 2020
Jun 23

The Twig Tweak module is a huge time saver for Drupal developers working with advanced twig templates. It offers several useful functions and filters that can ease the developer’s job, and it helps developers write well formatted code that is more comprehensible. I highly recommend using the Twig Tweak module in Drupal 8 for quick and easy Drupal development. What’s more, Twig Tweak is also supported in Drupal 9!

Twig tweak

How to Install the Twig Tweak Module

You can either download and install the module from Drupal.org or use Composer. If you download the archive, extract it and place it inside your project's modules directory. To download it with Composer instead, run the following command –

composer require drupal/twig_tweak

Go to Extend and enable the module, or enable it with Drush.
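A typical install-and-enable sequence, assuming Composer and Drush are available in your project, might look like this (run from the project root of an existing Drupal site):

```
# Download the module into your project
composer require drupal/twig_tweak

# Enable the module and rebuild caches
drush en twig_tweak -y
drush cr
```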

Implementing Twig Tweak Functions and Filters in Drupal 8

Views and views results

To render the views in twig template, we can use the twig tweak snippet as below.

Syntax

 {{ drupal_view(views_name, display_id, args...) }}
 {{ drupal_view_result(views_name, display_id) }}

Arguments

 views_name: Machine name of the view.
 display_id: Machine name of the view display.
 args: Optional arguments for contextual filters.


Examples

  • {{ drupal_view('who_s_new', 'block_1') }} places the who’s new view’s block_1 display in a template.
  • {{ drupal_view('who_s_new', 'block_1', arg_1, arg_2) }} extra arguments can be passed when contextual filters are configured on the view.
  • {% set view = drupal_view_result('related', 'block_1')|length %}
    {% if view > 0 %}
      {{ drupal_view('related', 'block_1') }}
    {% endif %}
    In this code the view is rendered in the template only if it returns results.

Blocks

To place the plugin blocks in template, we can use the twig tweak snippet as below.

Syntax

 {{ drupal_block('plugin_id', {label: 'Example'|t, some_setting: 'example'}) }}

Arguments

 plugin_id: Plugin id of the block, which can be found in the block class annotation.
 options: Optional second parameter providing configuration for the block.


Example
{{ drupal_block('system_breadcrumb_block') }} places the breadcrumbs block in a template.
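Configuration can also be passed in the second argument, as the syntax above shows; a sketch using the core system_powered_by_block (the available keys depend on the block plugin):

```twig
{# Override the block label and make it visible #}
{{ drupal_block('system_powered_by_block', {label: 'Built with Drupal'|t, label_display: 'visible'}) }}
```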

Region

To render the contents of a region from the default or a specific theme.

Syntax

 {{ drupal_region(region_name, theme_name) }}

Arguments

 region_name: Machine name of the region, as found in the theme's info.yml file.
 theme_name: Optional theme name, to render the region of a specific theme other than the default.


Example
{{ drupal_region('sidebar_first', 'bartik') }} places the first sidebar contents of the Bartik theme in a template.

Entity and Entity form

To render an entity or an entity form in a twig template, we can use the snippets as below.

Syntax

 {{ drupal_entity(entity_type, id, view_mode, language, access_check) }}

 {{ drupal_entity_form(entity_type, id, form_mode, values, access_check ) }}

Arguments

 entity_type: Type of the entity, e.g. node, block, block_content, webform, paragraph.
 id: Id of the entity.
 view_mode: View mode to render the entity with.
 language: Language to render in; defaults to the site language.
 access_check: Whether to check access; defaults to true.
 form_mode: Form mode to render the entity form with.
 values: Array of property values; defaults to [].

Examples

  • {{ drupal_entity('block_content', 1) }}  to display the content block whose id is equal to 1 in a twig template.
  • {{ drupal_entity('node', 123, 'teaser') }} to display the node teaser data of entity whose id is equal to 123.
  • {{ drupal_entity_form('node', 1) }} to display node edit entity form in a template.
  • {{ drupal_entity_form('node', values={type: 'article'}) }} to display node add form of article type.

Field

To display a field value of any entity in a twig template, use the following twig tweak snippet.

Syntax

 {{ drupal_field(field_name, entity_type, id, view_mode, language, access_check) }}

Arguments

 field_name: Machine name of the field.
 entity_type: Type of the entity, e.g. node, block_content, paragraph.
 id: Id of the entity.
 view_mode: View mode to render the field with.
 language: Language to render in; defaults to the site language.

Examples 

  • {{ drupal_field('field_image', 'node', 1, 'teaser') }} to display the field_image value of node 1 with teaser view mode.
  • {{ drupal_field('field_image', 'node', 1, {type: 'image_url', settings: {image_style: 'large'}}) }} to display the image field value with more detailed settings in a template.

Menu

To print menu items in a template, we can use the twig tweak snippet as below.

Syntax

 {{ drupal_menu(menu_name, level, depth, expand) }}

Arguments

 menu_name: Machine name of the menu.
 level: Initial menu level; defaults to 1.
 depth: Maximum number of menu levels to display; 0 (the default) means no limit.
 expand: Whether to render all menu links as expanded; defaults to false.

Example
{{ drupal_menu('main_navigation') }} renders the menu with its default values.
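The optional arguments can be combined to render only part of the menu tree; a sketch assuming a menu with machine name main:

```twig
{# Start at level 2 and render at most 3 levels #}
{{ drupal_menu('main', 2, 3) }}
```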

Form

To render a Drupal form in a template, we can implement the twig tweak as below.

Syntax

 {{ drupal_form(form_id, args...) }}

Arguments

 form_id: Fully qualified class name of the form you want to display in a template.
 args: Optional arguments passed to the form constructor.

Example
{{ drupal_form('Drupal\\system\\Form\\CronForm') }} to display the form in a twig template by providing the path to its class.

Image

There are several ways of rendering images in twig. Some easy ways using Twig Tweak are as follows.

Syntax

 {{ drupal_image(property_id, style, attributes, responsive, access_check) }}

Arguments

 property_id: Unique id of the image, i.e. fid, UUID, or file URI.
 style: Image style to apply to the image.
 attributes: Attributes for the image, such as alt and title.
 responsive: Whether the image style is a responsive image style.
 access_check: Whether to check file access.

Examples 

  • {{ drupal_image(123) }} this will render the original image whose fid is 123.
  • {{ drupal_image('9bb27144-e6b2-4847-bd24-adcc59613ec0') }} this will render the image using the unique uuid of the image.

  • {{ drupal_image('public://2020/05/ocean.jpg') }} to render the image using the file uri of the image.

  • {{ drupal_image('public://2020/05/ocean.jpg', 'thumbnail', {alt: 'The alternative text'|t, title: 'The title text'|t}) }} here is an example of displaying an image thumbnail by adding the alt and title for the image. Note : uri path will be based on the default files location since it is set to sites/default/files/2020/05/ocean.jpg.

Token

Tokens are very useful and huge time savers for Drupal developers. With Twig Tweak we can render tokens as below.

Syntax

 {{ drupal_token(token, data, options) }}

Arguments

 token: Name of the token to print.
 data: Array of values used for token replacement.
 options: Options to pass to the token replacement, also used as flags.

Examples
{{ drupal_token('site:name') }} displays the site name in a twig template.
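Tokens that need context take it through the data argument; a sketch assuming a node variable is available in the template:

```twig
{# Replace the node:title token using the current node as context #}
{{ drupal_token('node:title', {node: node}) }}
```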

Configurations

Printing configuration values in Twig is very useful on a site where you need dynamic data printed according to the configuration.

Syntax: {{ drupal_config(config_name, key) }}

Arguments:

 config_name: Name of the configuration object (yml file).
 key: Specific value to be printed from the configuration.

Example
{{ drupal_config('system.site', 'name') }} to print the name of the site from the system.site configuration in a template.
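Nested configuration values can be reached with dotted keys. For instance, the site's front-page path lives under page.front in the same system.site configuration:

```twig
{# Print the configured front page path (e.g. /node). #}
{{ drupal_config('system.site', 'page.front') }}
```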

Dump

Debugging is one of the most important parts of the development process. Here are some quick and cool methods to debug in Twig. They require the Symfony VarDumper component, which can be installed via Composer: composer require --dev symfony/var-dumper

Syntax: {{ drupal_dump(var) }}

Arguments:

 var: the variable to print from a template. If no variable is provided, all the variables available in the template will be printed. dd() is a shorthand for the method.

Example
{{ dd(var) }} this will dump the var data from a template.

Drupal Title

To get the title of the current route we can use Twig Tweak as below.

Example
{{ drupal_title() }} this will print the title of the current route.

Drupal URL

To generate a URL from an internal path inside Twig we can use the Twig Tweak snippet below.

Syntax: {{ drupal_url(input, options, access_check) }}

Arguments:

 input: Internal link path.
 options: Array of options, including query parameters.
 access_check: Whether to check access; defaults to false.

Example
{{ drupal_url('node/1', {query: {foo: 'bar'}, fragment: 'example', absolute: true}) }} this will return https://site_name/node/1?foo=bar#example as output.

Drupal Link

It works the same as drupal_url but generates a full link, with extra arguments for the link text and attributes.

Syntax: {{ drupal_link(text, input, options, access_check) }}

Arguments:

 text: Text to be displayed.
 input: Internal link path.
 options: Array of options, including query parameters.
 access_check: Whether to check access; defaults to false.

Example
{{ drupal_link('View'|t, 'node/1', {attributes: {target: '_blank'}}) }} this will create a link with the text 'View' that opens in a new tab.
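Setting the access_check argument to true is useful for links that only some users may follow. A sketch, assuming node/1/edit is a route the current user may or may not have access to:

```twig
{# Render the edit link only when the current user has access to the route. #}
{{ drupal_link('Edit'|t, 'node/1/edit', {}, true) }}
```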

Drupal Messages

To display the Drupal status messages in a template, find the example below.

Example
{{ drupal_messages() }} will display status messages based on user actions on the site.

Drupal Breadcrumbs

To display the breadcrumbs of the site in a template, find the example below.

Example
{{ drupal_breadcrumb() }} This will display the breadcrumb block in a template.

Drupal Breakpoints

To add a debug breakpoint to a twig template, find the example below.

Example
{{ drupal_breakpoint() }} This will add a breakpoint at the point in the template where you call it.

Token Replace Filter

Replaces all the tokens in a given string with their appropriate values.

Example
{{ '<h1>[site:name]</h1><div>[site:slogan]</div>'|token_replace }} this will replace the tokens available in the string with their appropriate values.

Preg replace filter

It will perform a regular expression search on the text and replace matches with the replacement.

Syntax: {{ text|preg_replace('(pattern)', replacement) }}

Arguments:

 text: Text in which the searching and replacing should happen.
 pattern: Regular expression pattern to search for.
 replacement: Replacement text for the matches.

Example
{{ 'foo' | preg_replace('(foo)', 'bar') }} in this example 'foo' is replaced with 'bar'.
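Because the pattern is passed straight to PCRE, capture groups and back-references work as usual. A sketch using a hypothetical price string:

```twig
{# Wrap every digit sequence with a currency suffix via the $1 back-reference.
   '\\d' in a Twig single-quoted string becomes the PCRE escape \d. #}
{{ 'Price: 100'|preg_replace('/(\\d+)/', '$1 USD') }}
{# Outputs: Price: 100 USD #}
```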

Image style filters

It will return the image URL with the given image style applied.

Syntax: {{ image_uri|image_style(style) }}

Arguments:

 image_uri: File URI of the image field.
 style: The image style you want to apply to the image.

Example
{{ 'public://images/ocean.jpg'|image_style('thumbnail') }} this will apply the thumbnail image style to the given image.

Transliterate filter

It will transliterate text from Unicode to US-ASCII and replace unknown characters with "?" by default.

Syntax: {{ text|transliterate(language, unknown_chars, maxlength) }}

Arguments:

 text: The text to transliterate.
 language: Language code for language-specific replacements; defaults to 'en' (English).
 unknown_chars: The character used to replace unknown characters; defaults to "?".
 maxlength: Maximum length of the transliterated output.

Example
{{ 'Привет!'|transliterate }} this will return the Latin transliteration "Privet!".

Check markup filter

Applies the filters of a given text format to the provided text.

Syntax: {{ text|check_markup(format_id, language, filters_to_skip) }}

Arguments:

 text: The text to which the filters should be applied.
 format_id: ID of the text format to be used.
 language: Language to which the filters should apply.
 filters_to_skip: Array of filters to skip; defaults to [].

Example
{{ '<b>bold</b> <strong>strong</strong>' | check_markup('restricted_html') }} This will apply the 'restricted_html' text format's filters to the given text.

Truncate filter

This is used to truncate a string to a specified number of characters.

Syntax: {{ text|truncate(max_length, word_safe, add_ellipsis, word_safe_len) }}

Arguments:

 text: Text to be truncated.
 max_length: Maximum number of characters to keep.
 word_safe: If true, truncate on a word boundary; defaults to false.
 add_ellipsis: If true, add "…" to the end of the truncated text; defaults to false.
 word_safe_len: If word_safe is true, the minimum acceptable length for truncation.

Example
{{ 'Some long text' | truncate(10, true) }} This will truncate the text to at most 10 characters, truncating on a word boundary.
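The word_safe and add_ellipsis flags can be combined; a sketch (exact output not shown, since the ellipsis itself counts toward max_length):

```twig
{# Truncate to at most 10 characters on a word boundary and append an ellipsis. #}
{{ 'Some long text'|truncate(10, true, true) }}
```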

With Filter

This will add new elements to a render array and return the modified array.

Example
{{ content.field_image | with('#title', 'Photo' | t) }} This will add a title to the image field and return the modified render array.

Children Filter

Filters out the child elements of a render array in a twig template, with an option to sort the elements by their weight. This is useful when processing individual fields.

Example
{{ node.field_example|children }} this will render all the child elements of field_example in a template.

File Uri and Url filter

Both of these filters are used when dealing with images in Twig. You can get the image URI and URL using these Twig Tweak filters.

Example

  • {{ node.field_image | file_url }} returns the absolute URL of the image file.
  • {{ node.field_image | file_uri }} returns the URI of the image file (e.g. public://…).
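The file_uri filter pairs naturally with the image_style filter described earlier; a sketch, assuming field_image is an image field on the node:

```twig
{# Extract the file URI from the field, then build the thumbnail derivative URL. #}
{{ node.field_image|file_uri|image_style('thumbnail') }}
```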
Jun 23 2020

With a simplified translation flow and a balanced use of both human and machine translation, you can reduce your costs and reach even more markets.

In this blog, we’ll further explore a few user stories (introduced in part 1 of this series) and how we resolved their translation needs with the help of the Translation Management (TMGMT) module.

We will see how to:

  • Simplify and accelerate the translation process to empower content editors

  • Delegate the translation task to machine and human translators to feed other translation systems (e.g. Trados or Memsource)

  • Prevent data loss or duplication by choosing the content model that fits your expectations, from the beginning

  • Identify possible translation processes in a publishing workflow

  • Set a deadline, word count, and allow non-Drupal users to receive translation files by mail


These user stories can be grouped into three topics: Content Moderation, Paragraphs translation, and UX experiments. The implementation described below is Drupal 9 ready.

Content moderation
 

The most basic requirement for a “translation flow” would be to mark the translation as outdated.
This works perfectly fine for non-moderated content, and there is also work in progress in Drupal 9.1 to bring this feature back for moderated content.

Translation-outdated

It could also be a valid use case to set a state for a translation while making use of a publication workflow (e.g. content moderation). So even if the source is published, that is not necessarily the case for the translation.

There is work in progress with TMGMT to support pending revisions and accept translation as a specific moderation state, so this option is available while doing a Job review.

Content-moderation-translation-state

Additionally, we are experimenting with the following features with the TMGMT Content Moderation module:

  • Display the current moderation state close to the published status in the content translate form

  • Enable translation operation only when the source content reaches a specific state (example: “Published” state)

  • Exclude states from the translated entity (example: “For translation” state)

  • Redirect on Job completion to the latest revision of the entity


Combined with the Moderation State Columns module, it can also produce this kind of view.

Content Dashboard

Paragraphs asymmetric translation


Paragraphs can be configured to have asymmetric translations. In this case, the structure of the source entity can differ from that of the translations. It also allows translations not to fall back to the source on the frontend.

Example:

  • English source

    • Paragraph text 1 EN

    • Paragraph text 2 EN

  • French translation

    • Paragraph text 1 FR


While working with this setup, we need to be aware of several possible issues.

  • Data “loss” and dangling references can occur

    With existing content, switching between symmetric and asymmetric setups can cause data to not appear in the backend or the frontend, and can produce dangling references, because the Paragraph entity ID may or may not be the same depending on the chosen setup. So this needs to be taken into account carefully before giving content editors access.



     
  • While using TMGMT, the flow might not be the expected one when doing several translations

    Let’s continue with our minimal example.



     
  • For our English source, the first edit is
    • Paragraph text 1 EN

    • Paragraph text 2 EN



  • A first French translation Job via TMGMT produces the expected result
    • Paragraph text 1 FR

    • Paragraph text 2 FR



  • A second edit of the English source adds a 3rd paragraph, so we have
    • Paragraph text 1 EN

    • Paragraph text 2 EN

    • Paragraph text 3 EN


We translate the same content again with a TMGMT Job: the 3rd paragraph appears in the Job review, but it will never appear on the frontend nor while editing the translation. This is probably not expected, and if it is still accepted in the flow, it consumes unnecessary translation resources.

If we manually edit the French translation, say by removing the 2 originally translated paragraphs and then adding 2 others, we end up with

  • Paragraph text 4 FR

  • Paragraph text 5 FR


If we translate the source again at this stage, the situation between the TMGMT Job and the frontend will become quite unclear.

  • Work in progress

    We are close to a solution but the integration of TMGMT with Paragraphs asymmetric translation (including content moderation) is still a work in progress.


Conclusion:
Paragraphs asymmetric integration with TMGMT is feasible, but if you plan to translate the content several times or update translations with TMGMT, it will most likely not be the right fit for content editors.

UX experiments


A client requested that we adapt the TMGMT translation flow for content editors. The assumption was that the Job review would be done right after the translation, which allowed us to propose a simplified UI and skip some steps. A similar request came from another client, so we decided to start a proof of concept and gather some of these requirements to see how they could be generalised.

Here are a few examples of alternate flows and simplified UX that could be used by roles that do not need the whole TMGMT stack.

Original TMGMT flow: Machine translation with DeepL

[embedded content]

The alternate flows described below are combined with the TMGMT Content Moderation features that were previously mentioned.

Alternate flow 1: Machine translation with DeepL

Once a source reaches the translatable state (Published in this case), the translation occurs in 3 steps: Create the Job > Review the translation > View the result as the Latest revision.
The second step could even be skipped by accepting translations without review.

[embedded content]

Alternate flow 2: Send a file by mail to a translation service

Here we presume that we can send the XLF file attached.

[embedded content]

 

 


Combined flow

 

In some cases, we can also combine the 2 flows and first do a machine translation via e.g. DeepL, and then export the result in the XLF output to populate the initial translation in another translation solution (like Trados or Memsource).

List of other features provided by the Simple TMGMT module:

  • Translate with the usual translate operations

  • Disable operations links when the user selects multiple languages

  • Limit operation links to the supported translators

  • Optionally disable translation once already translated (edit only and do not translate again via a Job)

  • Per content translation flag to keep track of automatic translation

  • Optionally add a delivery date. A default delivery date is calculated based on a certain number of open days

  • Integration with the Swift Mailer module (HTML mail templates with attachments)

  • Integration with the Disable Language module (filter languages with permissions)

  • If a Job is left as unprocessed allow to continue or delete it via the translation form

  • On Job delete or Job item delete, redirect to the entity translation form

  • Re-route machine translation error messages to a specific email address (e.g. [email protected]) and simplify the error messages for content editors


Next steps
 

  • With the deadline, word count, and non-Drupal users in the flow, it opens the door to integration with other information systems and stakeholders. For example, a CRM might contain your translator contact details and translation skills while an accounting platform will get the cost reporting once the translation Job is finished.

  • Introduce a generic way to deal with notifications, as suggested in this issue.


If you need help with any of the above including adding translation functionality to your site - get in touch with us today!

Jun 23 2020

The default Content overview in Drupal is a view. You can perform actions on multiple nodes by selecting one of them from the dropdown and applying it in bulk.

 How to Use The Views Bulk Operations Module in Drupal 8

The Views Bulk Operations module for Drupal 8 enhances this list of actions by adding some more actions to it. For example, you can change the author or delete the comments of one or multiple nodes with just one click.

How to Use The Views Bulk Operations Module in Drupal 8

Keep reading to learn how to use this module!

Note: This example will not work if you generate content with the Devel module.

Step #1. - Install the Required Modules

  • Open the terminal application of your PC and place the cursor in the root of your Drupal installation
  • Type: composer require drupal/views_bulk_operations

How to Use The Views Bulk Operations Module in Drupal 8

  • Click Extend
  • Scroll down and enable Views Bulk Operations and Actions Permissions
  • Click Install

How to Use The Views Bulk Operations Module in Drupal 8

The Actions Permissions submodule allows you as admin to allow or restrict access to the default (or custom) Views Bulk Operations actions to certain users, based on their role within the Drupal platform.

Step #2. - Create an Authenticated User

  • Click People > Add user

How to Use The Views Bulk Operations Module in Drupal 8

  • Enter the required data
  • Click Create new account

How to Use The Views Bulk Operations Module in Drupal 8

Step #3. - Create a View

Besides the 2 default content types in Drupal, I have created an extra content type called “Sponsored Article”. Furthermore, I have created 10 nodes between Articles and Pages as admin user.

  • Click Structure > Views > Add view
  • Give the view a proper name
  • Check Create a page
  • Change the number of items to display to 50
  • Click Save and edit


Let’s tweak the view a little bit.

  • Click Content under the FORMAT section and change the format to Table
  • Click Apply two times
  • Click the Add button under the FIELD section and add the Body field
  • Set the formatter to Trimmed with a limit of 150 characters
  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

  • Add the field Authored by (Content)
  • Click Add and configure fields
  • Uncheck Link label to the referenced entity
  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

  • Enable the Views bulk operations field
  • Click Add and configure fields

How to Use The Views Bulk Operations Module in Drupal 8

  • Scroll down under the Selected actions and select Change the author of content

Take a look at the other actions available. You might be wondering if there is a difference between Delete selected entities and Delete content item. For example, if you have a multilingual site, the entity would be the base article and the content items would be the translated articles. If you are editing an Italian translation, Delete selected entities would erase the whole entity from the database, whereas Delete content item would only eliminate the Italian article. This principle applies also in Drupal Commerce with Products and Product displays.

  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

  • Rearrange the fields like in the image below
  • Click Apply

How To Use The Views Bulk Operations Module in Drupal 8

  • Edit the Table settings
  • Click the checkbox to make the Authored by field sortable
  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

  • Save the view

Step #4. - Bulk Editing Nodes

You can head over to the view page now. 

  • Click Authored by to sort by the author field (I only have one author in my installation)

The upper checkbox with gray background allows you to select all results on the page (not in the whole view), in case there were more than 50 results. 

  • Select 4 - 5 nodes
  • Select Change the author of content from the dropdown

How to Use The Views Bulk Operations Module in Drupal 8

You will only have this option since we selected only this action when configuring the field on step #3.

  • Click Apply to selected items
  • Scroll down and select the name of the authenticated user from Step #2. 
  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

Those entities belong now to that particular user.

Let’s edit the view one more time. A blue-grey pencil will appear when you hover over the top right area of the view. Click that pencil to edit the view.

  • Add a filter
  • Search for and select the field Authored by
  • Click Add and configure filter criteria

How to Use The Views Bulk Operations Module in Drupal 8

  • Check Expose this filter to visitors, to allow them to change it
  • Click Apply

How to Use The Views Bulk Operations Module in Drupal 8

  • Save the view

Now you have a text box to filter out by the author of the content. 

How to Use The Views Bulk Operations Module in Drupal 8

In this tutorial, you learned how to sort content by column in a Views table, and how to filter it by a field with an exposed filter. You also learned the basic usage of the Views Bulk Operations module for Drupal 8.

It is possible to code custom actions to meet the specific needs of the content and/or the site. This is out of the scope of this tutorial, but you can check the documentation of the Views Bulk Operations Example submodule, which comes with the main module by default.  

Thanks for reading! 


About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jun 22 2020

Webform cards open up infinite possibilities that are beyond our current expectations of the Webform module.

Our expectations of the Webform module are directly tied to Drupal’s Form API, a robust and secure framework for building, validating, and submitting forms. Still, Drupal’s current Form API implementation is about ten years old. It is starting to show its age. When Drupal’s Form API was built 16 years ago, form rendering and validation were always executed server-side; Ajax was a new kid on the block. Now, with modern frontend frameworks, like React and VueJS, form rendering and validation happen clientside, in the web browser. This approach results in a faster, more flexible ‘modern’ user experience.

Users now expect a fast and responsive user experience.

Modernizing Drupal and Webform User Experience

The Drupal community is working on modernizing Drupal’s user interface and experience. In the meantime, Drupal and the Webform module should provide the best user experience possible. Any request to the server (aka the Drupal backend) is, by definition, slower than changes and behaviors happening in the client (aka web browser). The first step to improving Drupal and the Webform module’s user experience and performance is to move most data validation to the client’s web browser using clientside validation.

Clientside validation

Even though modern browsers support HTML5 clientside validation, the implementation is inconsistent and there are known accessibility issues. The Clientside Validation module, which uses the jQuery Validation plugin, solves some of these accessibility issues and offers the most consistent user experience. Installing the Clientside Validation module will immediately improve everyone’s single-page webform user experience because the server won’t have to keep validating and rebuilding the form.

There is still the problem of multistep ‘wizard’ webforms, because each wizard page must be requested and built by the backend server. In the best environment, each page request for a multistep wizard form page will take about a ½ second; on a more complex website, a multistep webform page request can take more than a second. One second is a lot of time, especially for a form with dozens of steps. For example, paging through a questionnaire with 30 questions can add an extra 30 seconds of waiting for pages to render, on top of having to fill out the form.

Users should be able to flip through a multistep wizard webform as fast as a carousel of images.

Clientside (aka JavaScript) pagination

Clientside pagination is not a new concept; it is how all image slideshows are built. All the assets are sent to the web browser, and JavaScript allows users to quickly page through the slides. Sending a few dozen images to a browser is not a big deal, so I decided to experiment with downloading a multistep webform’s entire HTML markup to the web browser and using JavaScript to handle the pagination. The multistep webform’s performance was exponentially faster. Once upon a time, sending a large packet of HTML to the web browser could have been a performance issue, but rendering performance in modern browsers is now ridiculously fast. Once I had a proof of concept, I decided to implement this new approach as an optional Webform sub-module, called Webform Cards.

Introducing Webform Cards

Webform Cards provides a ‘Card’ container element for fast client side multistep form pagination.

Saying something is fast is not the same as showing how fast it is, but before I demo the Webform Cards module, it is worth pointing out some key concepts. First, Webform Cards is an experimental module that takes your existing wizard configuration and presentation and moves them to JavaScript. To use cards, you need to convert your wizard pages to cards; the Webform Cards module includes a tool to make this conversion for you. To create the best user experience for end-users, the Webform Cards module requires Drupal core’s Inline Form Errors module and the contributed Clientside Validation module with the Webform Clientside Validation sub-module. I know this sounds like a lot of dependencies, but once you have all these modules downloaded and installed, the only thing left to do is build a webform using cards.

Without further ado, here is a demo of the Webform Cards module.

To infinity and beyond... well, kind of

When I completed building the Webform Cards module, I had a “to infinity and beyond” moment, because this new approach opens up so many new possibilities for people building webforms. The saying ”To infinity and beyond” comes from the movie “Toy Story,” which is about an old and maybe outdated toy cowboy, Woody, learning to work with and become close friends with the modern and sleek Buzz Lightyear, a space ranger. Drupal’s Form API is slowly becoming outdated like Woody, and it needs to work with more modern ‘buzz’ frameworks. The Webform Cards module, like Buzz Lightyear, is nudging our old toy, Form API, to explore new possibilities and gradually become more modern. There is some irony in this analogy because Buzz Lightyear’s wings and lasers don’t exactly work as expected, and Webform Cards is not exactly a ‘modern’ solution. Still, in the end, Buzz Lightyear and Woody survive and grow. Webform Cards is helping the Webform module survive and thrive, but there is still more work to be done.

There is still more work to be done

The Clientside Validation module needs your help with finding and fixing issues. Drupal’s Admin UI & JavaScript Modernisation needs everyone’s help. Drupal’s Form API needs some love and modernization. Meanwhile, the Drupal community now has multistep webforms that provide a modern feeling user experience.

The new Webform Cards sub-module is available in the latest release of Webform 8.x-5.x (for Drupal 8) and Webform 6.x (for Drupal 8 and 9).

Download the Webform module

Who sponsored this feature?

Memorial Sloan Kettering Cancer Center (MSKCC) has been my primary client for the past 20 years. Without MSKCC’s commitment to Drupal and their early adoption of Drupal 8, I would most likely not be maintaining the Webform module for Drupal 8. Most of my work on the Webform module is done in my free time. Occasionally, MSKCC needs a Webform-related enhancement that can be done using billable hours. In the case of “Webform Cards”, MSKCC needed to build faster and more modern screening forms for patients.

I am very fortunate to have an ongoing relationship with an institution like MSKCC. MSKCC appreciates the value that Drupal provides and the work that I am doing within the Drupal community.

If you want to sponsor a feature, please read my blog post and create a ticket in the Webform module’s issue queue.

Backing the Webform module

Open Collective is providing us, Drupal, and Open Source, with a platform to experiment and improve Open Source sustainability. If you appreciate and value what you are getting from the Webform module, please consider becoming a backer of the Webform module’s Open Collective.


Jun 22 2020

Introducing DXPR Builder 1.0.0

Inspired by you — our customers and supporters — we introduce DXPR Builder: the layout builder that showcases the result of our 5 years of listening, developing, and improving the Drupal experience for Drupal's most important audience: The Authors.

Authors, especially digital marketers, have been frustrated by the lack of a visual layout builder for Drupal. Unable to create mobile-friendly layouts themselves, digital marketers relied on IT support to create landing pages, or indeed anything more complicated than a Word document.

With DXPR Builder, you can upgrade any Drupal 7, 8, or 9 website with the following awesome no-code capabilities:

  • Immediate inline editing
  • Mobile-friendly layouts up to 12 columns
  • Access your local media or DAM
  • Add marketing elements including icons, countdown timers, and more
  • Supports Drupal views, blocks, translations, revisions, workflows, and other Drupal technologies.
Jun 22 2020

Drupal 9.0 was launched earlier this month as a continuation of Drupal 8. This time around, the core update was more about updating the technology underlying Drupal's codebase and eliminating dependencies than introducing brand-new features, but fear not: we'll be getting some of those soon enough.

Drupal releases features on a semi-annual basis, and version 9.1 is expected to be rolled out around December this year. Due to the decentralized nature of Drupal's development, the roadmap for 9.1 isn't necessarily set in stone; that said, the strategic initiatives and core objectives are well-documented, so we know what we can expect in the foreseeable future. Most importantly, gone are the days when Drupal was developer-first, editor-second: it's all about usability and accessibility for everyone moving forward.

Let's take a closer look at what that might entail. On the menu: a new front-end theme, automatic updates, and community-driven improvements collected from the 2020 Drupal Product Survey.

Olivero Front-End Theme

When I built my first-ever Drupal site during an Evolving Web training session, I remember thinking two things: "Wow, this is really flexible and fun to use once you get the hang of it", and "Wow, the default theme looks a bit dated". I'm a fan of Drupal, but the Bartik theme and its decade-old design just don't quite do it justice for first-time users.

Drupal has made it a major priority to completely overhaul its user experience and be friendlier for everyone, not just developers. Now that Claro, a new, accessible admin theme, is available in Drupal 9, contributors are focusing on Olivero, a modern front-end theme designed to showcase the CMS in its best light out of the box. Like Claro, Olivero follows a new-and-improved design system that prioritizes user experience and accessibility.

Screenshot of the Olivero front-end theme for Drupal sites

It'll be a good few months before we can officially say bye-bye to Bartik, so here's what we know about Olivero so far to tide you over in the meantime:

  • It looks really good. Olivero's sharp colour palette, modern typography, and judicious use of white space give Drupal sites a polished, state-of-the-art look straight out of the box.
  • It'll be WCAG AA-compliant from the ground up. Accessibility is a major focus in Olivero, which is slated to include a high-contrast mode among myriad other accessibility-first features and functionality.
  • It supports all the most recent features added to Drupal, including embedded media and the drag-and-drop layout builder.

To read more about Olivero's development (and see the prototype high-contrast mode in action), check out this blog post by Lullabot, one of the teams involved in building the theme. According to the post's author, a launch within 9.1 is the most likely release scenario, so stay tuned this December.

Automatic Updates

As it stands, updating a Drupal site isn't the most straightforward process. That's set to change in the foreseeable future, however, as automatic updates have been one of Drupal's main strategic initiatives for some time now.

Major features of the existing Automatic Updates module, which is planned to eventually become part of Drupal core, include:

  • Major update announcements to notify admins when a core update is on its way, what it entails, and how to prepare
  • Update readiness checks to automate the process of ensuring sites are compatible with the latest update
  • One-click updating to allow admins to trigger the database update directly via the Automatic Updates service

These features are currently being tested and refined by the community, and we can expect a core release as soon as they're ready. Get all the details about the Automatic Updates project in the Drupal docs.

The 2020 Drupal Product Survey

Drupal's project lead Dries Buytaert recently started collecting responses to the annual Drupal Product Survey (here's the related post on Dries' blog). The survey's goal is to prioritize upcoming initiatives according to the community's needs. The results will be unveiled this July during the global virtual DrupalCon.

Looking at the survey's contents can give us some clues as to what might be coming to Drupal in the mid- to long-term (but we'll have to wait till the results are out to get a clear picture of how the Survey will influence Drupal's strategic direction).

Target Audiences

When you take the survey, the first questions are about how you use Drupal. The rest of the survey is then tailored according to your response. Here's a glimpse of the different demographics you can choose from to give you an idea of who the questionnaire is intended for (short answer: anyone who has anything to do with Drupal in any capacity!).

The first question on the Drupal Product Survey shows the scope of its audience

Content Editor Experience

The content editing experience in Drupal has seen constant improvements over the last several releases, but how will it evolve in 9.1 and beyond? The survey's questions for content creators include a list of potential Drupal enhancements, which respondents are asked to prioritize. A few highlights:

  • More refined draft/publishing control. This has already been addressed in recent updates; Drupal 9 includes enhanced content moderation workflows that are well-suited to actual editorial processes. It'll be interesting to see how this will be improved upon even further.
  • Improved accessibility testing and control. Drupal core aims to adhere to stringent accessibility requirements out of the box, but it could definitely offer even more testing features for creators directly via the admin UI.
  • Improved contextual help and overall "how-to" guidance and Redesigned information architecture/simplified terminology for admin pages. A lot has been done in recent updates to make Drupal more user-friendly and approachable, so this survey question should be a good indicator of how successful those efforts have been, and what still needs to be done to further democratize the CMS.

Other noteworthy points relating to content creation workflows include:

  • Making more pre-built templates available
  • Autosaving
  • Real-time previewing of content being edited
  • Improvements to structured data and metadata management

Developer Experience

Site builders, theme builders, designers, and front- and back-end developers answering the survey also get questions about usability and accessibility, but those obviously look a bit different than the ones targeting content authors.

Discussion points aimed at developers and designers include these potential Drupal enhancements:

  • Improved configuration management
  • Additional front-end development tools, like NPM support and SDKs for common JavaScript frameworks
  • Drush-style out-of-the-box command-line tools integrated into Drupal Core (if you're currently looking for Drush commands to use during deployment, consider getting the Drush module, which adds several admin functionalities)
  • Improved data modeling tools
  • Better support for atomic content (i.e. reusable, channel-agnostic assets), in addition to a component-based theme system with reusable interactive theme elements like responsive tables
  • More modules added to Drupal Core, such as Feeds (to provide a migration UI), Rules (to provide a business logic UI), Admin toolbar, and Pathauto (for generating URL path aliases)
  • Privacy management support, such as user-managed identity access for GDPR

Help Shape Future Versions of Drupal

Of course, this is just the tip of the iceberg when it comes to future plans for Drupal. If you have an opinion on anything we just covered (or on Drupal in general), make sure to take the 2020 product survey (direct survey link) to have your voice heard. Drupal is, and always has been, a community effort, so by taking the time to fill out the questionnaire you'll be directly contributing to the future of a powerful open-source CMS that powers millions of experiences across the web.

Meanwhile, if you want the facts about the latest current edition of Drupal, sign up for our upcoming webinar What You Need to Know About Drupal 9.

Jun 22 2020

The Entity Export CSV module allows us to quickly set up CSV exports for any type of Drupal 8 content entity. Sometimes, we may need to customize the exports we perform, such as exporting two different pieces of information from the same Entity Reference field. Let's find out how to customize our CSV exports.

And because a good example is sometimes better than a long speech, we will cover a specific need here, by way of illustration. From a Drupal Commerce order entity, we want to be able to export the Phone field associated with the Billing Profile. Indeed, order entities have an Entity Reference Revisions field that references the Billing Profile filled in during the checkout process, and we want to be able to extract a particular field, the phone, from this Profile entity.

First, we will alter the fields retrieved from the Order entity. Indeed the Entity Export CSV module dispatches an event allowing us to easily alter the fields retrieved from a given entity.

In order to be able to extract the phone we will dynamically add a pseudo field on this entity.

We create an EventSubscriber service in our custom module my_module, and more precisely in the file my_module.services.yml.

services:
  my_module.entity_export_csv_billing_phone:
    class: Drupal\my_module\EventSubscriber\EntityExportCsvBillingPhoneEventSubscriber
    tags:
      - { name: event_subscriber }

Next, our class alters the fields available on the Order entity. This alteration allows us to add as many pseudo fields as necessary.

Below is our EntityExportCsvBillingPhoneEventSubscriber class.

namespace Drupal\my_module\EventSubscriber;

use Drupal\Core\StringTranslation\StringTranslationTrait;
use Drupal\entity_export_csv\Event\EntityExportCsvEvents;
use Drupal\entity_export_csv\Event\EntityExportCsvFieldsSupportedEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class EntityExportCsvBillingPhoneEventSubscriber implements EventSubscriberInterface {

  use StringTranslationTrait;

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    $events = [];
    $events[EntityExportCsvEvents::ENTITY_EXPORT_CSV_FIELDS_SUPPORTED] = ['alterFields'];
    return $events;
  }

  /**
   * We want to add a pseudo custom field to export the billing phone field.
   *
   * @param \Drupal\entity_export_csv\Event\EntityExportCsvFieldsSupportedEvent $event
   *   The event.
   */
  public function alterFields(EntityExportCsvFieldsSupportedEvent $event) {
    $entity_type_id = $event->getEntityTypeId();
    if ($entity_type_id === 'commerce_order') {
      $fields = $event->getFields();
      if (isset($fields['billing_profile'])) {
        $fields['billing_phone'] = $fields['billing_profile'];
        $event->setFields($fields);
      }
    }
  }

}

We duplicate here the billing_profile field available on the order entity to have a new billing_phone field for which we can set up a specific dedicated export.

Let's create our field export plugin dedicated to this field (see Export content to CSV with Drupal 8).

In the src/Plugin/FieldTypeExport directory of our module, we create our specific BillingPhoneExport Plugin.

namespace Drupal\my_module\Plugin\FieldTypeExport;

use Drupal\Core\Entity\ContentEntityInterface;
use Drupal\Core\Field\FieldDefinitionInterface;
use Drupal\entity_export_csv\Plugin\FieldTypeExportBase;
use Drupal\Core\Field\FieldItemInterface;

/**
 * Defines a billing phone field type export plugin.
 *
 * @FieldTypeExport(
 *   id = "billing_phone_export",
 *   label = @Translation("Billing phone export"),
 *   description = @Translation("Billing phone export"),
 *   weight = 100,
 *   field_type = {
 *     "entity_reference_revisions",
 *   },
 *   entity_type = {
 *    "commerce_order",
 *   },
 *   bundle = {},
 *   field_name = {
 *     "billing_phone",
 *   },
 *   exclusive = FALSE,
 * )
 */
class BillingPhoneExport extends FieldTypeExportBase {

  /**
   * {@inheritdoc}
   */
  public function getSummary() {
    return [
      'message' => [
        '#markup' => $this->t('Billing phone field type exporter.'),
      ],
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function massageExportPropertyValue(FieldItemInterface $field_item, $property_name, FieldDefinitionInterface $field_definition, $options = []) {
    if ($field_item->isEmpty()) {
      return NULL;
    }
    $configuration = $this->getConfiguration();
    if (empty($configuration['format'])) {
      return $field_item->get($property_name)->getValue();
    }

    $field = $configuration['format'];
    if ($field) {
      $entity = $field_item->get('entity')->getValue();
      if ($entity instanceof ContentEntityInterface) {
        if ($entity->hasField($field)) {
          return $entity->{$field}->value;
        }
      }
    }

    return $field_item->get($property_name)->getValue();
  }

  /**
   * {@inheritdoc}
   */
  protected function getFormatExportOptions(FieldDefinitionInterface $field_definition) {
    $options = parent::getFormatExportOptions($field_definition);
    $options['field_phone'] = $this->t('Phone');
    return $options;
  }

}

Our Field Export Plugin declares the field_name in its annotations so that it is only available for the pseudo field we created. We use the export formatting options to add the field we want to export (if necessary, we could for example export several fields from the Profile entity with the same Field Export Plugin). Then, based on the basic plugin the module provides for Entity Reference fields (EntityReferenceExport), we adapt the massageExportPropertyValue() method to export the desired field from the Profile entity.

In a few moments, we thus gain the ability to export the phone field from the Profile entity referenced by an order.

Billing phone export

We can then enable our new pseudo field, select the specific export format and extract all the data needed for the job.

These techniques can be used with the help of a Drupal 8 developer to enrich data exports and adapt them to the most specific business needs, without having to redevelop a whole specific export routine.

Jun 22 2020

Maintaining Drupal projects and managing Drupal modules can be challenging for even contributors who have unlimited time. For decades now, Drupal's ecosystem has cultivated a wide array of tools for contributors to create patches, report issues, collaborate on code, and perform continuous integration. But as many source control providers begin to release shiny new features like web IDEs and issue workspaces that aim to make open-source contributors' lives even easier, many are doubtlessly wondering how Drupal's own developer workflows figure in an emerging world of innovation in the space.

DrupalSpoons, created by Moshe Weitzman and recently released, is a special configuration of groups and projects in GitLab that provides a bevy of useful features and tools for Drupal contributors who are maintaining Drupal projects. A play on the word "fork," which refers to a separately maintained clone of a codebase that still retains a link to the prior repository, DrupalSpoons offers support for GitLab issues, merge requests (GitLab's analogue for GitHub's pull requests), and continuous integration on contributed Drupal projects in the ecosystem. It leverages zero custom code, apart from the issue migration process to aid DrupalSpoons newcomers, and outlines potential trajectories for Drupal contribution in the long term as well.

In this exciting episode of Tag1 Team Talks, Moshe Weitzman (Subject Matter Expert, Senior Architect, and Project Lead at Tag1) hopped on with Michael Meyers (Managing Director at Tag1) and your host Preston So (Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a deep dive into what makes DrupalSpoons so compelling for Drupal contributors and the origin story that inspired Moshe to build it. Join us to learn how you can replace your existing Drupal contribution workflows with DrupalSpoons to get the most out of Drupal's recent migration to GitLab and the most modern capabilities in Drupal code management today.


---


Photo by Richard Iwaki on Unsplash

Jun 22 2020

It’s nearly a decade since the release of Drupal 7. During this time, we have seen new legislation in web accessibility, privacy (GDPR), the rise of mobile internet, and the proliferation of high-performance devices. 

The way we interact through technology has changed too. Customer expectation has risen, and interaction has become automated, facilitated by the integration of CRMs and marketing tools. 

The case for 'Versions'

These social changes are why ‘versions’ of technology are released. When the world changes in such a fundamental way, it is illogical to make a historic version continue to fit. Instead, new versions are built with the way we communicate at their source.

Drupal 7 was created in an unrecognisable world by today’s standards, and by staying on D7, you remain in that past world.

If you wish to remain secure, keep pace with innovation and consumer expectations, and meet modern digital standards, it is necessary to migrate your website to a CMS version built for the new world. 

These new requirements are why Drupal 8 and most recently Drupal 9, which I will come onto later, have been released.

 

What happens at Drupal 7’s End of Life?

Previously, Drupal 7's end-of-life was scheduled for November 2021. Given the impact of COVID-19 on budgets and businesses, the Drupal project has extended the end of life until November 28, 2022. It is important to understand what this means to your organisation:

  • The Drupal Security Team will no longer provide support or Security Advisories for Drupal 7 core or contributed modules (additional components for your website), themes, or other projects.
  • Drupal 7 will no longer be supported by the community at large. The community at large will no longer create new projects, fix bugs in existing projects, write documentation, etc. around Drupal 7.
  • After November 2022, using Drupal 7 may be flagged as insecure in 3rd party scans as it no longer gets support.
  • Best practice is not to use unsupported software; it would not be advisable to continue building new Drupal 7 sites.

It is important to appreciate that your website does not suddenly become insecure come November 2022; rest assured, there are several options available to you, detailed below.

 

Drupal 8: What’s new?

Drupal 8 is a massive leap forward for the community and the organisations using it.

There are so many reasons why Drupal 8 (and 9) implementations appeal to Drupal 7 site owners. Here are a few which stand out:

  • Content authoring experience designed with marketers in mind
  • Drag and drop page builder
  • Flexible page layouts with components
  • Introduce modern headless front end
  • Mobile-first by default
  • Fast page load times, great for end-users and SEO alike
  • Media library simplifying work with video, images and documents
  • Social integration
  • Easily exchange data with CRM, marketing and back-office systems
  • WCAG 2.1 accessibility
  • Ability to introduce personalisation

 

How to decide your Drupal 7 strategy

Your path forward depends upon your organisation’s attitude towards the Drupal 7 site. Which category does your site fall into?

Category 1 Site Owners
  • Our website is critical to our business operation.
  • Our website needs a redesign.
  • To perform efficiently, we need additional features now or in the future.
  • Our website must comply with accessibility and/or GDPR legislation.
  • Our website is in active development.

Recommendation: Drupal 8 Re-platform (Click or keep scrolling)

Category 2 Site Owners
  • We have no plans to develop further features.
  • We will retire the site in the next 12-24 months.
  • Our site content and design will remain the same for a number of years.

Recommendation: Drupal 7 Long Term Support program (Click or keep scrolling)

Category 1 Site Owners:

Drupal 8 replatform 

To help you decide what approach to take, consider which of the next set of categories your site falls into. Each results in all the benefits Drupal 8 offers, but takes a different journey to get there.

Level 1/3: Your site is great as is.

Your site functions with minimal issues. You want to spend little time planning and you're migrating for security reasons.

Level 2/3: In need of a refresh.

You need a visual refresh and to evaluate some features, but on the whole, your site operates just fine.

Level 3/3: Time for a big rethink.

Your site doesn't meet your requirements or business goals. It's time for a big rethink.

Recommendation: Lift and shift £

Maintain the same functionality and look-and-feel, but with a new Drupal 8 CMS.

Steps: A Drupal 8 migration.

Recommendation: Minor upgrade ££

A solution similar to your existing site, with a design refresh.

Steps: Short planning phase to deliver new wireframes, creative design, and a Drupal 8 migration.

Recommendation: Major upgrade £££

A solution significantly different to your existing site with a totally new design.

Steps: A discovery, definition, and full design process before the Drupal 8 migration.

Category 2 Site Owners:

Drupal 7 long term support

Staying on Drupal 7 is an option only if you subscribe to extended support: commercially available security updates are provided via a subscription model. This will be available until 2024.

Additionally, patching of Symfony and PHP will be necessary. Over time this option becomes less attractive: innovation is not here, and the burden of maintaining a secure site will grow.

What about Drupal 9?

Drupal 9 was released June 3rd 2020, built from the final version of Drupal 8. It can be considered a housekeeping release. The release removes features that are no longer necessary, along with any “deprecated code”, to maintain compatibility with key underlying third-party systems like Symfony, which are themselves benefiting from security and performance updates. These changes are all centred around keeping pace with the modern web.

Moving to Drupal 8 means you are ready for Drupal 9. Only once a majority of modules are ported to Drupal 9 (many already are) should you update to 9.

If you migrate to Drupal 8, ensure your new site does not reference features deprecated in Drupal 9. If you do this, moving between Drupal 8 & 9 will be efficient and return great value.

“Moving between Drupal 8 & 9 is the easiest upgrade in a decade.”

Drupal 8 migration audit

When deciding and planning for a migration, you must audit and consider the following:

  • Integrations and 3rd Party features
  • Bespoke modules and design
  • Front end styling and customer experience
  • Live data systems
  • Data housing and quality
  • Page and content structure, volume, and quality
  • Back office processes
  • Workflows and approval systems
  • Security
  • Accessibility

It would be a missed opportunity not to tell you that we offer this service as both an initial review and an extensive audit. If you require these services, please share your concerns and website address via our contact page.

Building a business case for a Drupal upgrade

Once you have identified the risks of Drupal 7, you may need to convince your colleagues, superiors, and peers. We have developed business cases for Universities, the Public Sector, Membership bodies, Legal Professionals, and Not-For-Profit organisations.

The crux comes from the opportunity deficit. While the risks of security and accessibility are clear to most, the opportunity deficit is created first by your technical knowledge, and finally by your creative application. Having been in the depths of Drupal since the beginning, we know the hidden potential of Drupal, and as such, can help you identify the business-critical opportunity a migration can bring.

Useful links

University of West London D7 to D8 Migration

Drupal 7 Roadmap on Drupal.org

Founder of Drupal, Dries' Drupal 7, 8 and 9 Blog

Accelerated Drupal Migrations with CTI Digital

Jun 20 2020

As always-evolving Drupal developers, we have been in the process of moving towards a Composer-based workflow for managing our Drupal project codebases. While it is (normally) an easy jump from "drush dl" to "composer require" for Drupal contrib modules and themes, there's another significant opportunity we should consider taking advantage of during this evolution. 

We are all familiar with the concept that Drupal modules extend the functionality of a Drupal site; we should also embrace the fact that there is a whole class of Composer dependencies that extend Composer's functionality. These are appropriately called "Composer plugins".

Composer Plugins 101

If your Drupal project is using the drupal/recommended-project or drupal-composer/drupal-project Composer template (or similar), then you're already using Composer plugins. The second part of this article will surface some additional Composer plugins that you may want to consider adding to your Drupal codebase.

Adding a new Composer plugin is normally a two-step process. First, use "composer require" to add the plugin to your project. The second step usually (but not always) involves adding some plugin configuration to your project composer.json's "extra" section. 

For example, one of the dependencies of the drupal/core-recommended Composer template is the composer/installers plugin - this is how Composer knows to put Drupal modules in the modules/contrib/ directory and not in the (Composer default) vendor/ directory. 

If you were working on a project that didn't already use composer/installers, then you would need to add it to the project using

composer require composer/installers

Then, after reading a little bit of documentation, you would learn that in order for Composer to place dependencies of type "drupal-module" (as defined in the module's composer.json file) in your project's web/modules/contrib/ directory, you would need to add the following to your project composer.json's "extra" section:

"extra": {
    "installer-paths": {
        "web/modules/contrib/{$name}": ["type:drupal-module"],
    }
}

One other thing to be aware of when it comes to Composer plugins is that many (most?) plugins can be run directly via a Composer command. For example, the drupal/core-composer-scaffold plugin (details below) can be run at any time via "composer drupal:scaffold". 

Side note: if I have one complaint about Composer plugins it is that often the plugin's vendor/name are not consistent with the plugin's defined key in the "extra" section (composer/installers vs. installer-paths). I would much prefer if the key was defined based on the plugin's vendor/name. For example, something like "composer-installers-paths" - this would make it easier for new users to recognize the relationship between plugins and data in the "extra" section, IMHO.

Useful Composer Plugins for Drupal projects

Must haves

composer/installers

  • Description: Allows dependencies to be placed in directories other than /vendor/. 
  • Availability: Included as part of drupal/recommended-project Composer template (via drupal/core-recommended), part of the default drupal-composer/drupal-project template.
  • Configuration key: installer-paths

drupal/core-composer-scaffold

  • Description: Places default Drupal scaffolding files outside of the /core/ directory and allows for modifications to scaffolding files.
  • Availability: Included as part of drupal/recommended-project Composer template, part of the default drupal-composer/drupal-project template.
  • Configuration key: drupal-scaffold
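As a rough illustration (the key names follow the plugin's documentation, but the web-root location and the choice to skip robots.txt here are assumptions for this sketch), a project's "extra" section might configure scaffolding like this:

```json
"extra": {
    "drupal-scaffold": {
        "locations": {
            "web-root": "web/"
        },
        "file-mapping": {
            "[web-root]/robots.txt": false
        }
    }
}
```

Setting a file-mapping entry to false tells the plugin not to write that scaffold file, which is handy when you maintain your own version of it.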

cweagans/composer-patches

  • Description: Automatically applies patches (both local and remote) to dependencies. 
  • Availability: Included as part of the default drupal-composer/drupal-project template.
  • Configuration keys: patches, patches-file, enable-patching, patches-ignore.
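As a minimal sketch of the plugin's configuration (the module name and patch URL below are placeholders, not real issue links), the "patches" key maps each dependency to patch descriptions and locations:

```json
"extra": {
    "enable-patching": true,
    "patches": {
        "drupal/example_module": {
            "Placeholder description of the fix": "https://www.drupal.org/files/issues/example.patch"
        }
    }
}
```

Patches are then applied automatically the next time the dependency is installed or updated.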

Worth considering

zaporylie/composer-drupal-optimizations

  • Description: Provides a Composer performance boost for Drupal projects by ignoring legacy Symfony tags. 
  • Availability: Included as part of the default drupal-composer/drupal-project template.
  • Configuration keys: composer-drupal-optimizations.

topfloor/composer-cleanup-vcs-dirs

  • Description: Automatically removes .git directories for cloned dependencies. Only necessary when dependencies are committed to the project's Git repository. 
  • Availability: Install as you would any other Composer dependency.
  • Configuration keys: None.

szeidler/composer-patches-cli 

  • Description: Companion plugin to cweagans/composer-patches that allows for the addition of patches (and other functionality) from the command line.
  • Availability: Install as you would any other Composer dependency.
  • Configuration keys: None.

oomphinc/composer-installers-extender

  • Description: Companion plugin to composer/installers that allows for any arbitrary package type (such as npm packages) to be defined and then handled by the composer/installers plugin.
  • Availability: Install as you would any other Composer dependency.
  • Configuration keys: installer-types.
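For example (assuming a project that pulls front-end libraries published as npm-asset or bower-asset packages, which is an assumption of this sketch), you would declare the extra types and route them with the usual installer-paths:

```json
"extra": {
    "installer-types": ["npm-asset", "bower-asset"],
    "installer-paths": {
        "web/libraries/{$name}": ["type:drupal-library", "type:npm-asset", "type:bower-asset"]
    }
}
```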

joachim-n/composer-manifest

  • Description: Creates a simple "composer-manifest.yaml" file in the project root listing all dependencies (and their exact version numbers) currently used in a project.
  • Availability: Install as you would any other Composer dependency.
  • Configuration keys: None.

hirak/prestissimo

  • Description: Composer performance boost - enables parallel downloads of dependencies during Composer commands. Will not be necessary in Composer 2.x.
  • Availability: Install as you would any other Composer dependency.
  • Configuration keys: None.

Not necessary for experienced developers

drupal/core-project-message

  • Description: Allows text to be displayed after "composer create-project" and "composer install" commands have completed.
  • Availability: Included as part of drupal/recommended-project Composer template.
  • Configuration key: drupal-core-project-message

Am I missing any? If so, please let me know in the comments below. Thanks to everyone who responded to my Twitter post on this topic!

Want to learn more about Composer? Check out our Composer Basics for Drupal Developers workshop.

Jun 19 2020

Is technology by itself good, bad, or neutral? Does the application of technology as its own process carry any inherent viewpoint or judgment? Are technologists — the engineers, scientists, builders, makers, and creators who build, utilize, and share technology — cognizant of their role in the design and functioning of their own tools? If technology advances internal beliefs in a more efficient, widespread manner, is it important to examine those internal beliefs first? 

These types of questions, as well as questions adjacent to technology — such as around privacy, security,  and surveillance — have been part of an ongoing discussion inside Lullabot. As strategists, designers, and developers, we not only do our work, but we also help our clients reach their own audiences, shape their own business models, and spread their own values into the world.

Where there is great power there is great responsibility.

Spider-Man's "Uncle Ben" and Winston Churchill

Discussions about values, morals, and ethics happen daily in workplaces across the country, with tech-specific examples including: body cameras and police accountability nationwide; GitHub employees protesting the sale of the software they build to U.S. Immigration and Customs Enforcement; Salesforce employees protesting over continuing contracts with U.S. Customs and Border Protection; the Moral Machine, which gathers a human perspective on moral decisions made by machine intelligence (such as self-driving cars); and the National Institute of Standards and Technology (NIST) report on how face-recognition algorithms fare at identifying people from different demographics.

Values are implicit in those of us architecting, producing, and deploying technology, and we have been grappling for the past year with the creation of a living Engineering Values Statement.

During our 2019 team retreat, Matt Westgate mentioned "engineering values" in his presentation, and our Design & Strategy team led a company exercise inferring values from a hypothetical build of a podcast page. From this work, some wanted to clarify and publicly state the type of work that we engage in to help identify what would be a good fit versus what would not be a good fit for our work from a sales perspective, as well as to share this information with clients, staff, and new hires. 

The evolution of our engineering values came out of continued focus and clarification of our core values.  As the retreat drew to a close, some volunteers decided to start meeting on a regular basis to come up with the parameters of the project with the ultimate goal of encapsulating a list of succinct engineering values for circulation.

The idea was to create a reference list to help with various situations, such as:

  • The sales team outlining our default values when we begin an engagement
  • Account managers evaluating values with clients during ongoing engagements
  • Evaluating technology for ourselves or our clients
  • Arising conflicts from engineering problems with ourselves and our clients

The committee's call for volunteers was: 

  • "We’re looking for 4 to 7 members who can attend a 30-minute meeting each week, along with an hour or two of help with whatever tasks we come up with. Of course, client work will ebb and flow, so we don’t expect this to be 100% every week. And while we’ve called this 'engineering values,' that doesn’t mean this group is only for 'engineers.' We’re sure anyone who's interested will have something valuable to contribute."

Our volunteer group organized around a series of Paper docs, and met via Zoom every week for a half hour with the first task of brainstorming questions and ideas for what we'd like to ask ourselves and the team. Through the initial creation of a charter document, the group identified a list of "must-have" deliverables as well as some "nice-to-haves:"

Deliverables

  • A summary of interviews and feedback from the team, with any notable commonalities or conflicts in values.
  • An initial draft of an engineering values document, presented to leadership and the team.
  • A summary of research on what Drupal agencies and the broader industry communicate as their engineering values.
  • A final document based on a round of feedback from the team.

Nice to haves / to be determined

  • Whatever resources, if any, the sales team would like
  • A page on our company website
  • A marketing announcement (e.g., blog post, podcast, etc)

In terms of logistics, a weekly meeting for thirty minutes worked well, as not all participants could attend all meetings. Any action items identified were in manageable increments and all could asynchronously connect through a shared folder that included meeting notes and the actual list of values. The recurring calendar invite was held by one person and periodically circulated to the greater team as new hires joined, with a specific focus on making sure the group represented a cross-section of business roles, backgrounds, and geographical basis. Discussions centered around a shared wiki-style document. The facilitator role was held by different participants as availability and capacity changed over time. The working group also opened a Slack channel to post information as well as align around comments, next steps, and action items.

In terms of process, the group began with a first round of assessing personal, individual engineering values, documenting those, discussing them in small groups, and aggregating them into an initial group-wide list. We sent a Google Form to the entire staff asking, "What are your engineering values?" This triggered a follow-up activity: a roundtable on whether or not team members consider themselves engineers. We discovered that not everyone (even those who are developers) considered themselves "engineers." Through continuous feedback and reminders to participate approximately every six weeks, the group continued to identify who was not in the room and which voices needed to be added to the conversation.

Lessons learned

1. Proposing an initiative

The project, like any other, demanded that the working group align around timelines, success criteria, deliverables, and scope, and get leadership approval and feedback.

2. Including those not present

Leaning on other groups, such as our Diversity, Equity, and Inclusion and Accessibility groups, helped unblock us throughout the year. Also, inviting others to participate in ways like a one-off interview provided a nice balance of respecting time and getting unique feedback. 

3. Space for different perspectives

Including multiple voices helped form a more comprehensive discussion. Including different perspectives didn't end at the invitation: we followed up. We found we needed to be prepared for surprising disagreements and leaned on the idea that working group leaders should facilitate instead of participate.

4. It’s OK to opt out (or opt in)

Everyone in the working group also had client responsibilities and sometimes project scheduling changed, which meant participation by one or many in the group became difficult or impossible. Other times, a project ended so time was available to participate. We asked participants to commit to blocks of availability at a time (say 2-3 months) so the group could rely on individuals.

5. Delegate!

Sometimes, initiative leaders had to be direct to not allow their own absences to become blockers. We used dice to assign a facilitator out of available participants.

In terms of feedback, the team prepared a presentation for the 2020 Lullabot Team Retreat to share the first draft, a list of 13 values, with the greater team. By soliciting feedback in the form of comments on the document (comments are visibly tied to a specific username), as well as index cards (snapshots of cards were anonymous), the committee gathered a wide variety of responses. Incorporating retreat feedback, the group continued to narrow down the list to six core engineering values. Along the way, the working group continued to ask for feedback from operations, marketing, sales, and other departments. We published the values to an internal blog and released them to a private GitHub repository.

Our team is happy to have phased out the working group and successfully handed over the document, moving to a living document. While we’ve closed the loop on this first phase of the work, we now look forward to discussing these values with the community and engaging around these values with those working in technology who have similar conversations happening within their own organizations.

With many thanks to Andrew Berry, who facilitated the Engineering Values Committee, for his feedback and review.

Members of the Engineering Values Committee included: Andrew Berry, Brian Skowron, Darren Petersen, Hawkeye Tenderwolf, Helena McCabe, James Sansbury, Marcos Cano, Mateu Aguiló Bosch, Matt Oliveira, Matt Robison, and Monica Flores.

Jun 19 2020
Jun 19

This blog has been re-posted and edited with permission from Greg Boggs' blog.

With Drupal 9 here, the upgrade process from Drupal 8 to 9 is short and impressive. Upgrading is similar to a Drupal 8 minor upgrade. After some testing, your Drupal 9 website is ready to go. But, what about Drupal 8 modules? Thanks to the smooth upgrade path to Drupal 9, most large modules are ready to go! But, what about the smaller modules that haven't been maintained in a few years? Enter the Maintainers Project!

What is the Maintainers Project?

The Maintainers Project is a community-organized effort, started by Damien McKenna from Mediacurrent and me, to improve the support of Drupal contributed modules. Maintaining a large collection of modules is a big undertaking. So, I focus on issues marked "Reviewed and Tested by the Community" or "Needs Review" and on testing releases. Rather than trying to write new code, the goal is to ensure more people get responses to their contributions.

Do You Have a Module That Needs Support?

We've started with modules used by over 3,000 sites that haven't had support in a while and modules we've worked on before. But, we're expanding to as many modules as we can. The best way to get us involved in your module is to join us on Drupal Slack. Or, if Slack isn't for you, you can add Greg Boggs (careful of the space) as a maintainer on your project, or file an issue on the Maintainers Project, and I'll do my best to jump in to help.

What's the Long Term Plan?

Today, I'm focused on providing support to get Drupal Contrib ready for Drupal 9. To make the project sustainable, we'll need to grow the team. With just the small team we have, when I work on a module, the community springs to life and people from all over the world are already helping to get the work done.

Kudos to Mrinalini Kumari from @Srijan. She's doing great work helping getting patches polished for the release of several of the #Drupal 9 contrib modules I'm working on. It's super awesome opening an issue I'm about to finish and finding the work already done!

— Greg Boggs (@gregory_boggs) June 8, 2020

Jun 19 2020
Jun 19

With the global crisis establishing digital transformation as an unavoidable strategy, many businesses are now thrown into digitalizing their operations, or suddenly needing to dedicate everything they’ve got into streamlining their digital transformation.

Due to all this, we’re also seeing a growing demand for services provided by digital agencies through outsourcing or staff augmentation. Whether that be design, development or marketing, the ability to rely on the proven expertise of a digital agency sure is a welcome one. 

But with so many agencies now offering and advertising their services, how can you start looking for the right one for you? And how can you know you’ve found the right partner before actually working with the chosen agency?

This is exactly what this post will focus on: how to find a digital agency that will take care of your specific digital experience needs. We’ll define the three most important criteria for determining if you’ve found the right one, as well as let you in on some convenient places to look at in order to check that they meet those criteria. 

How do you find the right digital agency for you?

So, let’s say you’ve found a number of agencies that could meet your digital experience needs, and you’re now faced with choosing one among this (likely large) selection.

The first thing you need to determine is, well, their expertise, both industry- and technology-wise. If an agency doesn't specialize in the tool or technology you need, or doesn't follow your desired methodologies, you can safely eliminate it in this initial stage.

But things get a little more tricky when you only know what you need, but not which technology to use to achieve that goal. It certainly helps if you’re at least somewhat familiar with the capabilities and limitations of different tools and platforms.

The best partner agency would ideally also help you make the right technology selection in addition to helping you deliver the project. This also means that the partner will work closely with you, and not just execute the work on their own without any oversight and/or control from your side.

Second, you need to determine if your potential partner’s company culture aligns with your own. What are their mission and vision? How do they treat their employees? If they’re an agency that specializes in open-source software, how active are they in those respective communities?

With the globally distributed, remote nature of work having eliminated many of the cultural and physical barriers that used to hinder effective collaboration, local culture fit is much less important than company culture fit.

So, you’ll want to check out their company culture, their communications (both internal and external), stuff like that. The agency’s blog will typically be a great resource for all of these; blogs often contain posts on different topics, ranging from technology tutorials and industry insights to company-related posts and pieces of news.

At Agiledrop, we even have a Community section of our blog, which is mostly dedicated to interviews with members of different open-source communities and industry experts discussing a variety of topics. 

Another great way to ascertain community involvement is social media. In the tech communities, Twitter is probably the go-to platform, but other giants such as Facebook and Instagram also come into play as more company-oriented platforms, while LinkedIn is a great place to learn about individual team members. These are all places where you can learn a lot about a company through content about their culture, events and contributions.

Some open-source projects such as Drupal have a system set up that allows a company’s contributions to the ecosystem to be recognized and displayed on their official website. A bit of a brag here - over the past few months, our team has done a huge amount of work contributing to Drupal, racking up over 100 issue credits in this short time!

Okay, so, when you know that the agency you’re eyeing specializes in the technologies that you need, or offers to help you determine the right tech stack for your project, and also their values and culture align with yours, what’s the last thing you need to do?

Well, you need to find out whether they actually deliver on what they offer. And how do you do that?

A lot of companies will have a Clutch profile with additional information and/or client testimonials. Other platforms such as TopDevelopers also publish regular technology-specific lists of top companies in a certain field.

But perhaps the best bet would be to just look at the agency’s website, as most of them will feature case studies and client testimonials right there. 

You can learn about the types of companies they’ve worked with, types of projects they’ve worked on, and even more specific information such as details about developer diligence, for instance, or a testament to your potential partner’s timeliness and/or efficiency in communication.

It can be especially helpful if you discover that the agency has previously worked with companies and on projects that are similar to yours - e.g. you’re an educational non-profit from London, and find out they’ve successfully worked with a higher ed institution from, say, Bristol. 

If you manage to find a company that satisfies all three criteria - industry/technology expertise, company culture, and proof of success - you can rest assured that you've found the right partner, either for your next project where you just need to scale temporarily, or as a long-term digital partner that you know you'll be able to rely on for all your future digital needs.

Conclusion

Finding the right partner agency among a plethora of options can be a difficult task. We hope this blog post has armed you with the right set of tips to make the process of selection at least somewhat easier.

If you’ve checked out our blog and social media, and discovered that Agiledrop would be the best fit for your development needs, reach out to us and we can start talking about how we can help you out.

Jun 18 2020
Jun 18

Drupal 7 to 9 Upgrade

If you're one of the 70% of Drupal sites that are still on Drupal 7 at the time of this writing, you may be wondering what the upgrade path looks like to go from Drupal 7 to Drupal 9. What does the major lift look like to jump ahead two Drupal versions? How is this different than if you'd upgraded to Drupal 8 sometime in the last few years? And how long will it be before you have to do it again?

Before the release of Drupal 9, the best path for Drupal 7 sites to upgrade to Drupal 9 was to upgrade to Drupal 8. The big selling point in Drupal 9's evolution is that updating from a late version of Drupal 8 to Drupal 9.0 is more like an incremental upgrade than the massive replatforming effort that the older Drupal migrations used to entail. Sites that jumped on the Drupal 8 bandwagon before Drupal 9.0 was released could benefit from the simple upgrade path from Drupal 8 to Drupal 9.0 instead of another big migration project.

Migrating to Drupal 8 is still a good option for Drupal 7 sites, even though Drupal 9 is now out.

You might find that essential modules or themes you need are ready for Drupal 8 but not yet available for Drupal 9. The Drupal 8 to Drupal 9 upgrade path for many modules and themes should be relatively trivial, so many of them should be ready soon. But, there could be some outliers that will take more time. In the meantime, you can do the heavy lift of the Drupal 7 to Drupal 8 migration now, and the simpler Drupal 8 to Drupal 9 upgrade later, when everything you need is ready.

The Drupal 7 to Drupal 8 migration

The Drupal 7 to Drupal 8 upgrade involves some pretty significant changes. Some of the things you previously needed contributed modules for in Drupal 7 are now included in Drupal 8 core. However, the way you implement them may not be the same, and some refactoring might be required to get feature parity when you migrate to Drupal 8.

The migration itself isn't a straight database upgrade like it was from Drupal 6 to Drupal 7; instead, you migrate your site configuration and site content to Drupal 8. You can do this in one of two ways:

  1. Migrate everything, including content and configuration, into an empty Drupal 8 installation (the default method).
  2. Manually build a new Drupal 8 site, setting the content types and fields up as you want them, and then migrate your Drupal 7 content into it. 

For a deeper dive into what these migrations look like, check out An Overview for Migrating Drupal Sites to 8.

Planning migrations

The Migration Planner is a helpful tool to consider in your migration planning process. It queries a site's database to generate an Excel file that project managers or technical architects can use to help plan migrations, and the developers performing the migrations can then work from those spreadsheets.

Performing migrations

Core comes with some capability to migrate content automatically. If your site sticks to core and common contributed content types and fields, you may be able to use these automatic migrations. However, if your site relies heavily on contributed modules or custom code, an automatic migration might not be possible; you may need a custom migration approach.

The Migrate Upgrade, Migrate Plus, and Migrate Tools modules are good starting points for performing a custom migration. They add things like Drush support for migration tasks and migration support for some non-core field types. They also provide several custom migration process plugins that make fairly complex migrations possible with just a couple of lines in a YAML file, like an entity_lookup processor that takes text from Drupal 7 content and looks up the Drupal 8 entity that text refers to.
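As an illustration, here is a sketch of what such a process pipeline might look like in a migration YAML file. The field and source names (field_tags, tag_names) are invented for this example; the entity_lookup plugin and its configuration keys come from Migrate Plus:

```yaml
# Hypothetical process pipeline: map incoming tag names from the Drupal 7
# source onto existing taxonomy terms using Migrate Plus's entity_lookup.
process:
  field_tags:
    plugin: entity_lookup
    source: tag_names
    entity_type: taxonomy_term
    bundle: tags
    bundle_key: vid
    value_key: name
```

Each incoming value is matched against the name of an existing term in the tags vocabulary, and the resulting term ID is stored in the destination field.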

Drupal 7 works on older versions of PHP but recommends a minimum of 7.2. If you're migrating from an older Drupal 7 site, there may be several other platform requirements to investigate and implement. 

Tooling and paradigm shifts

With the change to Drupal 8, developers are also expected to use new tools. You now use Composer to add modules and their dependencies, rather than Drush. Twig has replaced PHPTemplate as the default templating engine. Some core paradigms have shifted; for instance, developers need to learn to think in terms of events, or extending objects, instead of the old system of hooks. Many hooks still work, but they will probably be deprecated over time, and the new methods are safer ways to write code. The changes aren't insurmountable, but your organization must invest in learning the new way of doing things. You'll need to account for this education overhead when coming from Drupal 7; development teams may need more time to complete tasks as they learn new tools and paradigms.
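To make the YAML-instead-of-hooks shift concrete: where Drupal 7 declared menu items in hook_menu(), Drupal 8 and 9 declare them in a *.links.menu.yml file. A minimal sketch, with hypothetical module and route names:

```yaml
# mymodule.links.menu.yml - declares an admin menu link, replacing the
# Drupal 7 hook_menu() approach. "mymodule" and its route are hypothetical.
mymodule.settings:
  title: 'My module settings'
  description: 'Configure My module.'
  route_name: mymodule.settings
  parent: system.admin_config_system
  weight: 10
```

The route itself lives in a separate mymodule.routing.yml file, another example of the declarative, file-based conventions replacing Drupal 7's hook system.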

Drupal 8's deprecation model

In addition to big changes in Drupal 8 core and implementation details, Drupal 8 also features a deprecation model that's familiar in the software world, but new in Drupal version upgrades. Instead of deprecating a bunch of code when there's a major version upgrade, Drupal 8 has introduced a gradual deprecation model. 

As features and improvements are made in Drupal 8's codebase, old methods and functions are marked as deprecated within the code. Then, a few versions later - or in Drupal 9 - that code is removed. This gives development teams a grace period of backward compatibility, during which they can see alerts that code is deprecated, giving organizations time to implement the new code before it's completely removed. 

The deprecated code also provides an easy hint about how to rework your code using new services and methods. Just look at what the deprecated function does, and do that directly in your code.
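As a sketch of what this looks like in practice, here is a hypothetical deprecated function; the function itself is invented, but the @deprecated docblock and @trigger_error() pattern follow Drupal core's deprecation conventions:

```php
/**
 * Loads nodes by ID.
 *
 * @deprecated in drupal:8.8.0 and is removed from drupal:9.0.0. Use the
 *   node storage handler via the entity_type.manager service instead.
 */
function mymodule_load_nodes(array $ids) {
  @trigger_error('mymodule_load_nodes() is deprecated in drupal:8.8.0 and is removed from drupal:9.0.0.', E_USER_DEPRECATED);
  // The docblock and deprecation message point straight at the replacement,
  // which the function body already uses internally:
  return \Drupal::entityTypeManager()->getStorage('node')->loadMultiple($ids);
}
```

Callers see a deprecation warning during the grace period, and the function body shows exactly what to inline before the old wrapper disappears.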

This gradual deprecation model is one of the core reasons that the Drupal 9 upgrade is more like a minor version release for Drupal 8 than a major replatforming effort.

With that said, can you jump ahead from Drupal 7 to Drupal 9? If you want to skip over Drupal 8 entirely, you can jump directly to Drupal 9. The Drupal 7 migration ecosystem is still available in Drupal 9. Drupal 9 contains the same migrate_drupal module you need to migrate to Drupal 8. There has been discussion around possibly moving this module to a contributed module by Drupal 10, although no decision has been made at the time of this writing.

If you intend to go this route, keep in mind that all of the considerations when upgrading from Drupal 7 to Drupal 8 apply if you jump straight to Drupal 9, as well. You'll still have to manage the migration planning, deal with tooling and paradigm shifts, and consider platform requirements.

Ultimately, however, jumping directly from Drupal 7 to Drupal 9 is a valid option for sites that haven't migrated to Drupal 8 now that Drupal 9 is released. 

Whichever route you choose, whether you're going to migrate via Drupal 8 or straight to Drupal 9, you should start the migration from Drupal 7 to Drupal 9 as soon as possible. Both Drupal 7 and Drupal 8 will reach end-of-life in November 2021, so you've got less than a year and a half to plan and execute a major platform migration before you'll face security implications related to the end of official Drupal security support. We'll cover that in more detail later in this series. 

For any site that's upgrading from Drupal 7, you'll need to do some information architecture work to prepare for the migration to Drupal 8 or Drupal 9. Once you're on Drupal 8, though, the lift to upgrade to Drupal 9 is minimal; you'll need to look at code deprecations, but there isn't a major content migration to worry about. Check out our Preparing for Drupal 9 guide for more details around what that planning process might look like.

But what about waiting for a later, more stable version of Drupal 9, you ask? This is a common strategy in the software world, but it doesn't apply to the Drupal 9 upgrade. Because Drupal 9 is being handled more like an incremental point-release upgrade to Drupal 8, there aren't any big surprises or massive swaths of new code in Drupal 9. The core code that powers Drupal 9 is already out in the world in Drupal 8. There are no new features in the Drupal 9.0 release; just the removal of code that has already been deprecated in minor versions of Drupal 8.

Going forward, the plan for Drupal 9 is to release new features every six months in minor releases. The intent is for these features to be backward compatible, and to bring Drupal into the current era of iterative development versus the major replatforming projects of olde. There aren't any big surprises or major reliability fixes on the horizon for Drupal 9; just continued iteration on a solid platform. So there's no need or benefit to waiting for a later version of Drupal 9!

Plan for migration

Planning for a Drupal 7 to Drupal 8 or Drupal 7 to Drupal 9 migration becomes a question of scope. Do you just want to migrate your existing site's content into a modern, secure platform? Or are you prepared to make a bigger investment to update your site by looking at information architecture, features, and design? Three factors that will likely shape this decision-making process include:

  • Time and budget
  • Developer skillset
  • Release window

Time and budget for a migration

How much time are you able to allocate for what is likely to be a major replatforming effort? What's your budget for the project? Do you need to launch before an important date for your organization, such as college registration or an important government deadline? Can your budget support additional work, such as a design refresh? 

For many organizations, getting the budget for a large project is easier as a one-time ask, so doing the design refresh as part of the migration project may be easier than migrating, and then planning a separate design project in six months. In other organizations, it may be difficult to get enough budget for all the work in one project, so it may be necessary to spread the project across multiple phases; one phase for the migration, and a separate phase for design.

When factoring in the time and budget for additional work, keep in mind that things like revisiting a site's information architecture could save you time and money in the migration process. Budgeting the time to do that work up-front can dramatically reduce time and cost later, by removing unnecessary complexity before migrating instead of writing custom migrations to bring over content and entity types that you don't use anymore. This also improves maintainability and saves time for developers and editors doing everyday work on the new site.

Consider developer skills when planning your migration

Ten years is a long time for developers to work with a specific framework. If you've been on Drupal 7 since 2011, your developers are likely very experienced with "the Drupal 7 way" of doing things. Many of those things change in Drupal 8. This is a big factor in developer resistance to upgrading to Drupal 8 and Drupal 9.

Composer, for example, is a huge change for the better when it comes to managing dependencies. However, developers who don't know how to use it will have to learn it. Another big difference is that a lot of Drupal 8 and Drupal 9's core code is built on top of Symfony, which has changed many of the mental paradigms experienced Drupal developers are accustomed to. While some things may seem unchanged - a Block is still a Block, for example - the way they're implemented is different. Some things don't look the same anymore; developers will encounter things like needing to use YAML files instead of hooks to create menu items. Even debugging has changed; simple debugging via print() statements doesn't always cut it in the new world, so many developers are using IDEs like PHPStorm, or a host of plugins with other editors, just to code effectively in newer versions of Drupal.

All of this change comes with overhead. Developers must learn new tools and new ways of doing things when switching from Drupal 7 to Drupal 9. That learning curve must be factored into time and budget not only for the migration itself but for ongoing development work and maintenance after the upgrade. Progress during sprints will likely slow, and developers may initially feel resistant or frustrated while they learn the new ways of doing things.

Bringing in outside help during the migration process can mitigate some of this learning overhead. Partnering with an experienced Drupal development firm means your migration can be planned and implemented more quickly. When selecting an outside partner, consider how closely they will collaborate with your internal team, and the value of experienced developers who can "teach" your team the new ways of doing things. This reduces the learning curve for a team that's heavily experienced with older Drupal versions and can help your team get up to speed more quickly - saving money during the first year of your new site.

Plan a release window

The other aspect of planning for the Drupal 7 to Drupal 9 upgrade is planning a release window. Plan to have your migration project complete before Drupal 7 is scheduled to reach end-of-life in November 2021. If you can't make that deadline, then start planning now for an Extended Support engagement to keep your site secure until you're able to complete the migration.

You'll want to plan the release window around key dates for your organization, and around other support windows in your stack. For example, if you're a retailer, you may want to have the migration completed before the end of Q3 so you're not upgrading during holiday initiatives. Education organizations may plan their release during slow periods in the school's calendar, or government websites may need to be ready for key legislation. 

When it comes to your stack, you'll want to plan around other important release windows, such as end-of-support for PHP versions, or upgrading to Symfony 4.4. This is particularly important if you need to upgrade dependencies to support your Drupal 7 to Drupal 9 migration. Check out Drupal 8 Release Planning in the Enterprise for more insights about release planning.

Revisit information architecture, features, and design

Because the jump from Drupal 7 to Drupal 9 is so substantial, this is a good time to revisit the information architecture of the site, do a feature audit, and consider whether you want to make design changes. 

Is it time to update your site's information architecture?

Before you jump into a Drupal 9 upgrade project, you should perform an audit of your existing Drupal 7 site to see what you want to carry forward and what you can lose along the way. Did you set up a content type that you only used once or twice, and never touched again? Maybe you can delete that instead of migrating it. Are you using a taxonomy that was set up years ago, but no longer makes sense? Now is a good time to refine that for the new version of your site.

Content migration is also a relatively easy time to manipulate your data. You can migrate Drupal 7 nodes or files into Drupal 9 media entities, for instance. Or migrate text fields into address fields or list fields into taxonomy terms. Or merge multiple Drupal 7 content types into a single Drupal 9 content type. Or migrate content from a deprecated Drupal 7 field type into a different, but supported, Drupal 9 field type. These kinds of things take a bit more work in the migration, but are completely possible with the Migration toolset, and are not difficult for developers with migration experience. The internet is full of articles about how to do these kinds of things.

In addition to the fine details, it's also a good time to take a look at some big-picture questions, like who is the site serving? How has this changed since the Drupal 7 version of the site was established, and should you make changes to the information architecture to better serve today's audience in the upcoming Drupal 9 site? 

Have your feature needs changed?

Drupal 7 was released in 2011. Nearly a decade later, in 2020, the features that seemed important at Drupal 7's inception have changed. How have the feature needs of your content editors changed? Has your site become media-heavy, and do your content editors need large searchable image archives? Do you want to deliver a dynamic front-end experience via a bespoke React app, while giving content editors a decoupled Drupal framework to work in? 

Many editors love the new Layout Builder experience for creating customized site pages. It's something that doesn't exist in Drupal 7 core and is arguably better than what you get even when you extend Drupal 7 with contributed modules. Drupal 8 and 9 have built-in media handling and a WYSIWYG editor, eliminating the need for dozens of Drupal 7 contributed modules that do not always cooperate with each other, and focusing developer attention on the editorial UX for a single canonical solution.

Revisit the needs of your content editors and site users to determine whether any existing features of the current site are no longer important and whether new feature needs warrant attention in the upgrade process. This could be particularly helpful if you find that current features being provided by contributed modules are no longer needed; then you don't have to worry about whether a version of those modules is available in Drupal 8/9, and can deprecate those modules.

Ready for a design update?

If your Drupal 7 site hasn't had a design refresh in years, the upgrade process is a good time for one; plan it for after the upgrade is complete. Drupal 9 will have a new default theme, Olivero, which features a modern, focused design that is flexible and conforms with WCAG AA accessibility guidelines. Olivero has not yet been added to Drupal core - it's targeted to be added in 9.1 - but it is available now as a contributed theme any Drupal 8 or Drupal 9 site can use. Olivero is a great starting point for sites that want an updated design.

If you're planning a custom design project, keep accessibility and simplicity at the forefront of your design process. You may want to engage in the design discovery process with a design firm before you plan your Drupal 9 release; a good design partner may make recommendations that affect how you proceed with your migration.

Perform the migration

The process of migrating from Drupal 7 to Drupal 8 has improved since Drupal 8's initial release, but it can still be an intricate and time-consuming process for complex sites. We wrote An Overview for Migrating Drupal Sites to 8 to provide some insight around this process, but upgrading sites must:

  • Plan the migration
  • Generate or hand-write migration files
  • Set up a Drupal 8 site to actually run migrations
  • Run the migrations
  • Confirm migration success
  • Do some migration cleanup, if applicable

Unlike prior Drupal upgrades, migrating to Drupal 8 isn't an automatic upgrade. A Drupal 7 site's configuration and content are migrated separately into a new Drupal 8 site. There are tools available to automate the creation of migration files, but if you've got a complex site that uses a lot of custom code or many contributed modules, you'll only go so far with automated tools. You'll need to revisit business logic and select new options to achieve similar results or deprecate the use of Drupal 7 contributed modules and custom code in your site to move forward to Drupal 8 and Drupal 9.

Whether you're going to upgrade to Drupal 8 and then Drupal 9, or migrate directly from Drupal 7 to Drupal 9, these migration considerations and the process itself will be the same. The only difference is whether the new site you migrate content into is a Drupal 8 site or a Drupal 9 site.

Upgrading from Drupal 8 to Drupal 9

If you choose to go through Drupal 8, then once you get there, finishing the migration to Drupal 9 is relatively easy. Upgrade to the latest version of Drupal 8; the upgrade to Drupal 9 requires Drupal 8.8.x or 8.9.x. Along the way, you'll be notified of any deprecated code or contributed modules you'll need to remove before upgrading to Drupal 9. Make sure any custom code is compatible with Drupal 9, and then update the core codebase to Drupal 9 and run update.php.
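For a Composer-managed site, that last step might look roughly like the following. This is a sketch, assuming Drush is installed and every contributed module already has a Drupal 9-compatible release:

```shell
# Check which contrib projects still need Drupal 9-ready releases.
composer outdated "drupal/*"

# Switch core to Drupal 9, letting Composer resolve dependencies.
composer require drupal/core-recommended:^9.0 \
  drupal/core-composer-scaffold:^9.0 --update-with-dependencies

# Run database updates (the Drush equivalent of visiting update.php),
# then rebuild caches.
drush updatedb
drush cache:rebuild
```

If Composer refuses the core update, that usually means a dependency or contrib module still pins Drupal 8, which is exactly the kind of blocker to clear before attempting the switch.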

Voila! The long upgrade process is complete. 

Jun 18 2020

Our normally scheduled call to chat about all things Drupal and nonprofits will happen TODAY, Thursday, June 18, at 1pm ET / 10am PT. (Convert to your local time zone.)

No set agenda this month -- we can discuss whatever Drupal related thoughts are on your mind. If you would like to contribute to the conversation, please join us. 

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

This free call is sponsored by NTEN.org but open to everyone.

REMINDER: New call-in information -- we're on Zoom now!

View notes of previous months' calls.

Jun 18 2020

Rahul Dewan sent us this story, documenting how the Drupal India Association was formed, and we wanted to reproduce it here to help it reach the greatest audience. As you will read, the Drupal India Association has significantly helped the Drupal community in India work together to grow the community and joins 28 other local associations around the world. The Drupal India Association is a great example of this, and we’re thrilled to see the momentum behind the community there. 

The formation of a formal ‘Drupal India Association’

For several years, a conversation about the need to form an India-centric Drupal Association had been doing the rounds among Drupal business owners. However, this idea finally found conviction, and the necessary determination to push it through, when Shyamala Rajaram signed off from her position on the Board of the Drupal Association after completing her two-year term.

Shyamala’s voice over the phone one morning was so full of enthusiasm and energy that, in spite of all my skepticism, my reservations of ‘why do we need a legal entity to do all the good work we want to do?’ melted away. “We Indians can make so much more impact in the world of Drupal!”, she said. Almost immediately, under Shyamala’s leadership, the foundation of the ‘Drupal India Association’ (DIA) was laid, just before Drupal Camp Delhi in June 2019.

In my experience, it’s inertia and, if I may, a bit of analysis-paralysis which holds us back from making a determined push to make things happen. Then began the process of forming a formal board, enrolling all the business owners and key community members.

The following companies came forward to contribute seed money of Rs.21,00,000 (approx $30,000) to form a kitty: 

  1. Ameex Technologies
  2. Axelerant Technologies 
  3. Innoraft Solutions
  4. Open Sense Labs 
  5. QED42
  6. Srijan Technologies
  7. Unimity Solutions 

The ambition of contribution and influence

By the time Drupal Camp Pune happened in September 2019, what we finally had in place was a functional yet loose governing body, with pretty much every leader from across six cities in India. In spite of the informality of the board, we started collaborating exceptionally well. On the sidelines of the Pune Camp we stated our ambition of creating a sphere of influence in the entire Indian Ocean Rim, and agreed not to limit our influence to India alone.


Board meeting in Pune.  

All of the members present were patting ourselves on the back about the high number of contributions that India makes to Drupal. Dipen Chaudhary, the pioneer of the Drupal community in India and our board member, burst our little bubble by reminding the group that it was a classic case of ‘quantity over quality’, and that contributions coming from India were much lower down the rung, while the heavy lifting, such as contributions to the Drupal Core, was done by the west, Americans in particular.

Meanwhile, with Prasad’s help we were able to launch initiatives like offering discounted Acquia Certifications at every camp.

Our ‘Big-Hairy-Audacious-Goal’ (BHAG)


After the Chennai Board Meeting

On the sidelines of the Drupal Camp Chennai in December 2019, Shyamala invited Shri Ramaraj, the founder of Sify and an advisor and board member of some of the largest Indian IT service companies. We took turns sharing what we thought was inspiring us to come together. Not impressed, Ramaraj prodded us to think bigger. He said that coming together is easy, but staying together requires a North Star, and nudged us to think of some big, ambitious goals.

Dipen’s reality check of the Indian contributions had been bothering me since our meeting in Pune. Drupal 9 was on the cards. For us, it wasn’t hard to put two and two together and suggest an audacious goal: becoming the largest contributor to Drupal 9 by June 2020, when D9 releases. A lot of anxious looks and a pin-drop silence later, a consensus was reached: we were going to go after this goal.

Personally, I felt like touching Ramaraj’s feet, true to the Indian tradition of showing respect to Gurus, or teachers.

Enter ‘COVID-19’ 

Under Mukesh’s leadership, planning for Drupal Camp Kolkata in March 2020 was in full swing. For the first time, all the funds were being raised under the aegis of the Drupal India Association. Excitement levels were high.

And then came the Covid crash. Drupal Camp Kolkata was cancelled. All the money from the various sponsors of the camp was returned. Down but not out!

Not letting the ball drop

Fortnightly recurring calls, with Shyamala showing up on them each and every time, ensured that the group did not fall back into inertia. Our event calendar for Drupal Camps, Meetups, and Contribution days, agreed to in Chennai, ensured that our activities and our North Star goal remained in sight.


A social promo prepared to build excitement for the release of Drupal 9

Meanwhile, Piyush Poddar led our social media charge and designed systems to ensure Drupal agencies get into a healthy competition of contributing content to be promoted under DIA. 

Drupal Cares
We joined hands as a group to run campaigns for the ‘Drupal Cares’ initiative asking Drupalers in our respective companies to sign up as members and also donate. 

Drupal 9 Porting Weekend
Surabhi Gokte worked with Gábor Hojtsy to help organise the Drupal 9 porting weekend on May 22–23. Led by 10 mentors, 45 Indian Drupalers worked on 165+ modules for porting over to D9.

Under Dipen and Rachit’s leadership, Surabhi is now pushing forward to put together a plan for an all-India online event — our next BHAG (phew!). Do look for news on Drupal Groups for this.

Well, what about our North Star?

When we set up our North Star goal, we had decided that our developers would of course continue to tag their respective companies, but would additionally add DIA as the client.

As I write this post, the Drupal India Association has risen from zilch to position #7 on the Drupal Marketplace in terms of contributions.

Drupal contributions listing for DIA

While we will not end up meeting our BHAG ‘by June 2020 when D9 releases’, we will continue to strive as a group to become ‘the largest contributor to Drupal 9’.

All credit to the contributing developers and community members

Our North Star BHAG would never have been possible without all the unnamed Indian developers working in several member companies who have been spending time on Drupal contributions, including on weekends and after office hours. Contributors like Prof. Sunthar, Prafful, Vidhatananda, Hussain, Rakhi, Vijay, Surabhi, and Sharmila cannot go without mention.

At the Drupal Association, we look forward to hearing more from Rahul about their BHAG.

Jun 18 2020

With this month’s release of Drupal 9, factors fueling migration have moved up a few notches, but as the November 2021 decommission date for Drupal 7 and 8 rapidly approaches, there’s one huge factor that’s superseding all others. Drupal 7 and 8 are both heading toward end-of-life status. 

Now is the time to accept the inevitability of a migration to Drupal 9.

Free Webinar! 6 Steps to Streamline Drupal Migration

While a commercial vendor might step up to provide support for websites that are on a Drupal 7 or Drupal 8 platform, the Drupal community won’t be maintaining Drupal 7 or 8 after Nov. 2021. 

Ultimately, that’s a good thing. If your site is currently on Drupal 8, migration to Drupal 9 will be refreshingly straightforward and seamless. If your site is currently on Drupal 7, a far superior CMS and a vast array of new features are in store. 

Among them:

  • A built-in visual layout builder
  • Enhanced testing and tracking to ensure accessibility compliance 
  • Full multi-lingual support  
  • Simplified content editing capabilities 
  • Integrated configuration management  
  • The ability to easily control how, when, what, and to whom content is displayed
  • Adept integration of complex functionality and content types
  • Tailored solutions to drive business or engage constituents
  • Excellent security

We covered the straightforward Drupal 8 to Drupal 9 migration process in a blog post earlier this month: Drupal 9 Has Dropped! What to Do Now

For the more than 700,000 websites that are still on Drupal 7, migration looms large on the to-do list. For all practical purposes, Drupal 8 and 9 are a different CMS than Drupal 7. With the complete overhaul of Drupal 7 that resulted in the enterprise-ready Drupal 8, version upgrades are now incremental. 
 

Migration Starting Point

The best and most efficient place to start the migration process is an audit of the current site with the objective of inventorying, categorizing, and rationalizing all content. In a blog post last fall, we compared the initial inventorying process to a household decluttering initiative: A Marie Kondo Inspired Guide to Content Migration.

Dividing the migration process into the following six sequential steps creates a framework for avoiding shortcuts that lead to pitfalls, while ensuring a consistent path forward.

  1. Audit the existing content
  2. Consider the design
  3. Plan the migration
  4. Build a New Content Model
  5. Map the Content
  6. Execute the Migration 

 

Content Audit Considerations

Often, there are tough choices surrounding what content to keep and what to let go of. 

Fact is, organizations are constantly evolving and the content on a site too often lags behind. Here are some questions to ask for evaluating the relative value of various pages and the content that’s on them:

  • How old is the content? One year? Five years? 
  • How does it rank for page visits?
  • How integrated is the content to your messaging?
  • Are there redundancies?
  • How does it rank on the spectrum of trivial to essential?
  • Does the content drive engagement? Revenue? SEO?
  • Are there regulatory considerations that require that the content remain?
  • Are there roles and permissions concerning content governance and workflows?
  • Is this documented?

 

Design Considerations

Migration is the ideal time to redesign a website, or at least to define and determine essential design issues. To cite a few:

  • Is structured content in place?
  • Is CSS used to indicate heading styles, fonts, the color palette, quote style, ordered and unordered lists, buttons, hover states, etc.?
  • Does the design reflect the brand?
  • Is the UX aligned with current expectations and best practices?
  • Would a modern refresh help to engage constituents and clients?

 

Join Us to Learn More!

Thorough time and attention to these initial steps will serve to lay the groundwork for a streamlined website migration that stays on course without pitfalls and unwanted surprises.

Never before has it been more essential for websites to ensure user experiences that offer a straightforward path to desired objectives, compelling content, engaging visuals, and added value. 

Looking for more insights into wrangling your content, redesigning your site, or mapping out a migration?

Register Now for a FREE Webinar! 
6 Steps to Streamline Drupal Migration 

When:
Wednesday, July 15, 2020
11 a.m. CST

Any Drupal-related questions or concerns in the meantime? Drupal is in our DNA and we are happy to help! Contact us.


 

Jun 17 2020

by David Snopek on June 17, 2020 - 2:15pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Moderately Critical security release for Drupal core to fix a Cross-Site Request Forgery (CSRF) vulnerability. You can learn more in the security advisory:

Drupal core - Moderately Critical - Cross Site Request Forgery - SA-CORE-2020-004

Here you can download the Drupal 6 patch that fixes it, or a full release as a ZIP or TAR.GZ.

If you have a Drupal 6 site, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

FYI, there were other Drupal core security advisories made today, but those don't affect Drupal 6.

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Jun 17 2020

This is the sixth post in our series about integrating Drupal.org with a 3rd party developer tooling provider:

In March of last year the team here at the Drupal Association migrated all of the Git repositories at Drupal.org from bespoke Git hosting to a self-hosted GitLab instance. This was a major milestone in modernizing the developer workflow, immediately providing better search within project codebases, a better code viewer, and direct code editing in the browser for project maintainers.

However, we know that the primary feature the community has been waiting for is merge requests. We're very excited to report that we're now very close to opening up a beta test of merge requests integrated with Drupal.org issues.

Stepping back for a moment, let's remember the ideal contribution flow that we defined when evaluating our Developer tool options: 

Drupal Flow

  • The issue itself should remain a single-threaded conversation for discussing the particular problem to be solved.
  • Every issue will be associated with one canonical issue fork (with multiple branches, as needed) which can be collaborated on by any contributor, and merged by a project maintainer.
  • Contributors will be able to modify the code in these issue forks by:
    • Checking out the code and committing/pushing changes to the workspace.
    • Inline editing files using GitLab's web interface.
    • Legacy: uploading a patch.
  • All types of contribution—whether merge requests or the legacy patch workflow—will continue to trigger DrupalCI testing.
  • Issue forks can be rebased (manually or automatically) when needed to resolve conflicts with upstream commits.
  • Contributors and project maintainers will be able to comment on individual lines of code.

The foundation for this work is the ability to create these issue forks from an issue on Drupal.org. This involves building an interface from Drupal.org that creates an issue fork in GitLab associated with the Drupal.org issue, after which any Drupal.org user can push to that fork and its branches. Maintainers may then merge this work from a branch on the issue fork into the project.

That foundational work to create the necessary git hooks and access control management is now complete.

The next steps are:

How can you get involved?

If you are a contributed module maintainer, and would like to be a part of the early beta test program for Drupal.org-integrated GitLab merge requests, please indicate your interest by posting a comment to this issue with a list of projects you would like to opt-in.

When will merge requests be available to the community at large?

Creation of issue forks and branches will be available to a limited subset of projects this week, the week of June 14th, 2020. Over the course of the next several weeks we hope to implement the additional UI features that will display relevant information about the issue forks back in the Drupal.org issue, and ultimately allow merge requests themselves.

We hope to have the beta of merge request functionality available to some projects no later than DrupalCon Global, in mid-July. From there, we'll work through feedback from our beta testers, and work with Drupal core maintainers, to enable issue forks for all projects in the coming months.

We hope that you're as excited about this update as we are. It represents a lot of thought on the part of many community contributors, and a lot of work on the part of the Drupal Association staff.

Post-Script - What about DrupalSpoons?

As you read this update, many of you may be wondering how this work fits together with an emerging community initiative called DrupalSpoons. DrupalSpoons is an effort to not only use GitLab for merge requests, code viewing, inline editing, etc… but also to replace Drupal.org issues and DrupalCI with their GitLab equivalents, unifying the developer experience in a single UI.

To be clear, neither we at the Drupal Association nor the DrupalSpoons initiative are recommending that projects at large begin migrating to the current experimental DrupalSpoons environment. Rather, DrupalSpoons is encouraging experienced maintainers who want to try the workflow as early adopters to manage their projects on DrupalSpoons. Their goal is that the initiative will prove out new workflows and ideas that can be brought back into the official community tools on git.drupalcode.org.

The DrupalSpoons contributors have done some helpful and innovative work so far. We appreciate seeing contributors pushing the boundaries of what can be done with 'off-the-shelf' tools, and there's potentially quite a lot we might learn from this work that will inform future tooling for the Drupal project. For example, it's likely that we'll look to DrupalSpoons' use of GitLabCI as we look towards the next generation of DrupalCI.

We are always watching and listening to see what the community wants and needs for developer tools. Over time, new tools emerge that add significant value. As the Drupal Association, being responsible for maintaining these tools, we would like to approach each component independently (i.e., look at merge requests, issues, or GitLabCI, each independently) as the DrupalSpoons project continues.

In the meantime, we are moving forward with integrating GitLab merge requests with the Drupal.org issue queues. Once that's complete, we'd like to see a gap analysis to see what would be gained and lost when comparing Drupal.org issues with GitLab issues.

Jun 17 2020
Project: Drupal core
Date: 2020-June-17
Security risk: Less critical 8/25 AC:Complex/A:User/CI:None/II:Some/E:Theoretical/TD:Uncommon
Vulnerability: Access bypass
CVE IDs: CVE-2020-13665

Description: 

JSON:API PATCH requests may bypass validation for certain fields.

By default, JSON:API works in a read-only mode, which makes it impossible to exploit the vulnerability. Only sites that have read_only set to FALSE in the jsonapi.settings configuration are vulnerable.
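As an illustrative check (assuming Drush is available; this sketch is not part of the advisory itself), a site owner can confirm the setting and restore the safe default:

```shell
# Verify whether JSON:API is in read-only mode; TRUE is the safe default.
drush config:get jsonapi.settings read_only

# Restore the default read-only mode if it was disabled.
drush config:set jsonapi.settings read_only 1
```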

Solution: 

Install the latest version:

Versions of Drupal 8 prior to 8.8.x are end-of-life and do not receive security coverage. Sites on 8.7.x or earlier should update to 8.8.8.

Reported By: Fixed By: 

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web