Aug 13 2019

Today we will learn how to migrate dates into Drupal. Depending on your field type and configuration, there are various possible combinations. You can store a single date or a date range. You can store only the date component or also include the time. You might have timezones to take into account. Importing the node creation date requires a slightly different configuration. In addition to the examples, a list of things to consider when migrating dates is also presented.

Example syntax for date migrations.

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD date, whose machine name is ud_migrations_date. The migration to execute is udm_date. Notice that this migration writes to a content type called UD Date and to three fields: field_ud_date, field_ud_date_range, and field_ud_datetime. This content type and these fields will be created when the module is installed. They will also be removed when the module is uninstalled. The module itself depends on the following modules provided by Drupal core: datetime, datetime_range, and migrate.

Note: Configuration placed in a module’s config/install directory will be copied to Drupal’s active configuration when the module is installed. If those files have a dependencies/enforced/module key, the configuration will be removed when the listed modules are uninstalled. That is how the content type and fields are automatically created and removed.
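For illustration, the content type’s config/install file could look something like this. This is a minimal sketch following the example module; treat the exact file name and values as illustrative:

# Illustrative excerpt of config/install/node.type.ud_date.yml.
langcode: en
status: true
dependencies:
  enforced:
    module:
      # Causes the content type to be deleted when the module is uninstalled.
      - ud_migrations_date
name: 'UD Date'
type: ud_date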

PHP date format characters

To migrate dates, you need to be familiar with the format characters of the date PHP function. Basically, you need to find patterns that match the date format you are migrating from and the one you are migrating to. For example, January 1, 2019 is described by the F j, Y pattern.

As mentioned in the previous post, you need to pay close attention to how you create the pattern. Upper and lowercase letters represent different things, like Y and y for the year with four digits versus two digits, respectively. Some date components have subtle variations, like d and j for the day with and without leading zeros, respectively. Also, take into account white spaces and date component separators. If you need to include a literal letter like T, it has to be escaped as \T. If the pattern is wrong, an error will be raised, and the migration will fail.

Date format conversions

For date conversions, you use the format_date plugin. You specify a from_format based on your source and a to_format based on what Drupal expects. In both cases, you will use the PHP date function's format characters to assemble the required patterns. Optionally, you can define the from_timezone and to_timezone configurations if timezone conversions are needed. Just like any other migration, you need to understand your source format. The following code snippet shows the source and destination sections:

source:
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      node_title: 'Date example 1'
      node_creation_date: 'January 1, 2019 19:15:30'
      src_date: '2019/12/1'
      src_date_end: '2019/12/31'
      src_datetime: '2019/12/24 19:15:30'
destination:
  plugin: 'entity:node'
  default_bundle: ud_date

Node creation time migration

The node creation time is migrated using the created entity property. The source column that contains the data is node_creation_date. An example value is January 1, 2019 19:15:30. Drupal expects a UNIX timestamp like 1546370130. The following snippet shows how to do the transformation:

created:
  plugin: format_date
  source: node_creation_date
  from_format: 'F j, Y H:i:s'
  to_format: 'U'
  from_timezone: 'UTC'
  to_timezone: 'UTC'

Following the documentation, F j, Y H:i:s is the from_format and U is the to_format. In the example, it is assumed that the source is provided in UTC. UNIX timestamps are expressed in UTC as well. Therefore, the from_timezone and to_timezone are both set to that value. Even though they are the same, it is important to specify both configuration keys. Otherwise, the from timezone might be picked up from your server’s configuration. Refer to the article on user migrations for more details on how to migrate when UNIX timestamps are expected.

Date only migration

The Date module provided by core offers two storage options. You can store the date only, or you can choose to store the date and time. First, let’s consider a date only field. The source column that contains the data is src_date. An example value is '2019/12/1'. Drupal expects date only fields to store data in Y-m-d format like '2019-12-01'. No timezones are involved in migrating this field. The following snippet shows how to do the transformation.

field_ud_date/value:
  plugin: format_date
  source: src_date
  from_format: 'Y/m/j'
  to_format: 'Y-m-d'

Date range migration

The Date Range module provided by Drupal core allows you to have a start and an end date in a single field. The src_date and src_date_end source columns contain the start and end date, respectively. This migration is very similar to date only fields. The difference is that you need to import an extra subfield to store the end date. The following snippet shows how to do the transformation:

field_ud_date_range/value: '@field_ud_date/value'
field_ud_date_range/end_value:
  plugin: format_date
  source: src_date_end
  from_format: 'Y/m/j'
  to_format: 'Y-m-d'

The value subfield stores the start date. The source column used in the example is the same one used for the field_ud_date field. Drupal uses the same format internally for date only and date range fields. Considering these two things, it is possible to reuse the field_ud_date mapping to set the start date of the field_ud_date_range field. To do it, you type the name of the previously mapped field in quotes (') and precede it with an at sign (@). Details on this syntax can be found in the blog post about the migrate process pipeline. One important detail is that when field_ud_date was mapped, the value subfield was specified: field_ud_date/value. Because of this, when reusing that mapping, you must also specify the subfield: '@field_ud_date/value'. The end_value subfield stores the end date. The mapping is similar to field_ud_date except that the source column is src_date_end.

Note: The Date Range module does not come enabled by default. To be able to use it in the example, it is set as a dependency of the demo migration module.
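For reference, that kind of dependency is declared in the module’s .info.yml file. A minimal sketch, assuming the core dependencies listed earlier for the example module (the exact contents of the real file may differ):

# Illustrative excerpt of ud_migrations_date.info.yml.
name: UD date
type: module
core: 8.x
dependencies:
  # Drupal core modules required by the demo, including Date Range.
  - drupal:datetime
  - drupal:datetime_range
  - drupal:migrate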

Datetime migration

A date and time field stores its value in Y-m-d\TH:i:s format. Note that it does not include a timezone. Instead, UTC is assumed by default. In the example, the source column that contains the data is src_datetime. An example value is 2019/12/24 19:15:30. Let’s assume that all dates are provided with a timezone value of America/Managua. The following snippet shows how to do the transformation:

field_ud_datetime/value:
  plugin: format_date
  source: src_datetime
  from_format: 'Y/m/j H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  from_timezone: 'America/Managua'
  to_timezone: 'UTC'

If you need the timezone to be dynamic, things get a bit harder. The from_timezone and to_timezone settings expect a literal value. It is not possible to read a source column to set these configurations. An alternative is that your source column includes timezone information like 2019/12/24 19:15:30 -07:00. In that case, you would need to tweak the from_format to include the timezone component and leave out the from_timezone configuration.
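A minimal sketch of that alternative, assuming source values like 2019/12/24 19:15:30 -07:00 (the P format character matches a timezone offset such as -07:00):

field_ud_datetime/value:
  plugin: format_date
  source: src_datetime
  # 'P' consumes the offset from the source value, so from_timezone is omitted.
  from_format: 'Y/m/j H:i:s P'
  to_format: 'Y-m-d\TH:i:s'
  to_timezone: 'UTC'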

Things to consider

Date migrations can be tricky because they can be affected by things outside of the Migrate API. Here is a non-exhaustive list of things to consider:

  • For date and time fields, the transformation might be affected by your server’s timezone if you do not manually set the from_timezone configuration.
  • People might see the date and time according to the preferences in their user profile. That is, two users might see a different value for the same migrated field if their preferred timezones are not the same.
  • For date only fields, the user might see a time depending on the format used to display them. A list of available formats can be found at /admin/config/regional/date-time.
  • A field can always be configured to be presented in a specific timezone. This would override the site’s timezone and the user’s preferred timezone.

What did you learn in today’s blog post? Did you know that entity properties and date fields expect different destination formats? Did you know how to do timezone conversions? What challenges have you found when migrating dates and times? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 13 2019

Drupal Tome is a static site generator distribution of Drupal 8. It provides mechanisms for taking an entire Drupal site and exporting all the content to HTML for direct service. As part of a recent competition at SCDUG to come up with the cheapest possible Drupal 8 hosting, I decided to do a proof-of-concept level implementation of Drupal 8 with Docksal for local content editing, and Netlify for hosting (total cost was just the domain registration).

The Tome project has directions for setup with Docker, and for setup with Netlify, but they don’t quite line up with each other (I followed the Docker instructions, then the Netlify ones, but had to chart my own course to get the site from the first project linked to the repo in the second). Since I’m getting used to using Docksal, when I had to fall back and do a bit of it myself I realized it was almost painfully easy to set up.

The first step was to go to the Tome documentation for Netlify and set up an account and a site from the template. There is a button in those directions to trigger the Netlify setup; I’ve added one here as well (but if this one fails, check to see if they updated theirs):

Deploy to Netlify

Log in with GitHub or a similar service, and let it create a repo for your project.

Follow Netlify’s directions for setting up DNS so you can have the domain you want, and HTTPS (through Let’s Encrypt). It took a couple of hours for that detail to run right, but it eventually worked. For this project I chose a subdomain of my main blog domain: tome-netlify.spinningcode.org

Next go to GitHub (or whatever service you used) and clone the repository to your local machine. There is a generated README on that project, but the directions aren’t 100% correct if you aren’t cloning onto a machine with a working PHP environment. This is when I switched over to Docksal and ran the following series of commands:

fin init
fin composer install
fin drush tome:install
fin drush uli

Then log into your local site using the domain from Docksal and the link from drush, and add some content.

Next we export the content from Drupal to send over to Netlify for deployment.

fin drush tome:static
git add .
git commit -m "Adding sample content"
git push

…now we wait while Netlify notices and builds the site…

If you look at the site a few minutes later the new content should be posted.

This is all well and good if I want to use the version of the site generated for the Netlify example, but I wanted to make sure I could do something more interesting. These days Drupal ships with an install profile called Umami that provides a more robust sample site than the more traditional Standard install.

So now let’s try to get Umami onto this site. Go back to the terminal and have Tome reset everything (it’ll warn you that you are about to nuke everything):

fin drush tome:init

…select Umami when it asks for a profile…and wait, because this takes a while…

Now just re-export the content and push it to your repo.

fin drush tome:static
git add .
git commit -m "Converting to Umami"
git push

And wait again, cause this also takes a few minutes…

The Umami home page on my subdomain hosted at Netlify.

That really was all that was involved for a simple site. You can see my repository on GitHub if you want to see all of what was generated along the way.

The whole process is pretty straightforward, but there are a few things that it helps to understand.

First, Netlify is actually regenerating the markup on their servers with this approach. The Drupal nodes, and other entities, are saved as JSON and then imported during the build. This makes the process reliable, but slow. Umami takes several minutes to deploy since Netlify is installing and configuring Drupal, loading the content, and generating the output. The build command provided in that template is clear enough to follow if you are familiar with Composer projects:

command = "composer install && ./vendor/bin/drush tome:install -y && ./vendor/bin/drush tome:static -l $DEPLOY_PRIME_URL" 

One upside of this is that you can use a totally unrelated domain for your local testing and have it adjust correctly to the production domain. When you are using Netlify’s branching workflow for managing dev, test, and production, it also protects your work that way.

My directions above load a standard Docksal container because that’s quick and easy. That container includes MySQL, but Tome falls back to using a SQLite database since it can be more confident it is there. Again, this is reliable but slow. If I were going to do this on a more complete project, I’d want a smaller Docksal setup or to switch to using MySQL locally.

A workflow based on this approach might also struggle with concurrent edits or complex configuration of large sites. It would probably make more sense to have the content created on a hidden, but traditional, server and then run through a different workflow. But for someone working on a series of small sites that are rarely updated, it works well: a totally temporary instance of the site can be rapidly deployed to a device, have its content updated, be pushed out to production, and then be deleted locally until needed again.

The final detail to note is that there is no support for forms built into this solution. Netlify has support for that, and Tome has a module that claims to connect to that service, but I wasn’t able to quickly determine how to get it connected. I am confident there are solutions to this problem, but it is something that would take a little additional work.

Aug 12 2019

Drupal has pretty good multilingual support out of the box. It's also fairly easy to create new entities and just add translation support through the annotation. These things are well documented elsewhere and a quick search will reveal how to do that. That is not what this post is about. This post is about the UX around selecting which fields are translatable.

On the Content Language page at http://example.com/admin/config/regional/content-language you can select which fields on your nodes, entities and various other translatable elements will be available on non-default language edit pages. The section at the top is the list of types of translatable things. Checking these boxen will reveal the related section. You can then go down to that section and start selecting fields to translate, save the form and they become available. All nice and easy.

I came into the current project late and this is my first exposure to this area of Drupal. We have a few content types and a lot of entities. I was ticking the box for the entity I wanted to add, jumping to the end of the form, and saving it. When the form came back, though, the box was not selected. I could not figure out why. It wasn't until a co-worker used the form differently to me that the issue was resolved. Greg ticked the entity, scrolled down the page to find its section, ticked some of the checkboxen in the entity itself, and then saved the page. The checkbox was still ticked.

The UX on this is pretty good once you know how it works. It could be fixed fairly easily with a system message pointing out that your checkbox was not saved because none of the items it exposed were selected.

I feel a patch coming on…

Aug 12 2019
A special bird flying in space has the spotlight while lots of identical birds sit on the ground (lack of diversity)

At DrupalCon Seattle, I spoke about some of the challenges Open Source communities like Drupal often have with increasing contributor diversity. We want our contributor base to look like everyone in the world who uses Drupal's technology on the internet, and unfortunately, that is not quite the reality today.

One way to step up is to help more people from underrepresented groups speak at Drupal conferences and workshops. Seeing and hearing from a more diverse group of people can inspire new contributors from all races, ethnicities, gender identities, geographies, religious groups, and more.

To help with this effort, the Drupal Diversity and Inclusion group is hosting a speaker diversity training workshop on September 21 and 28 with Jill Binder, whose expertise has also driven major speaker diversity improvements within the WordPress community.

I'd encourage you to either sign up for this session yourself or send the information to someone in a marginalized group who has knowledge to share, but may be hesitant to speak up. Helping someone see that their expertise is valuable is the kind of support we need to drive meaningful change.

Aug 12 2019

Like many companies in our technology-enabled, globally connected environment, Promet Source operates with clients and team members all over the world. This reality creates a challenge for communications. The truth is, the more we put into our interactions, the more we get out of them.

I’ve been working with distributed teams for a long time. Even though I got very accustomed to joining video calls, until recently, I had opted to not turn my camera on. I guess it started a while back when I had my first virtual interactions with teammates. Probably due to shyness or my lack of knowledge of virtual communications, I tended to avoid the camera component. That's changed.

Eye-Opening Experience

Lately I’ve started using more face-to-face open communications with clients, collaborators and internally. It brings a higher level of empathy, honesty and receptiveness to the conversations. It’s been like going from a gray-scale image to a sudden, colorful world, and has provided an important step toward building trust and strengthening ties.

A couple weeks ago, I was on a call with a collaborator and a team member with whom I’ve been working for more than two months.

That day, I turned on my camera, and the dynamic quickly changed. We started having a candid conversation between the three of us. 

As we were chatting, my co-worker felt encouraged and also turned on his camera. We were able to see each other's faces for the first time in more than two months of working together. It made a difference.

Next, our collaborator followed suit and turned his camera on. The conversation was instantly raised up to another level. Once one of us opened up, others felt empowered to do the same. It was like a chain reaction or “Domino Effect.”

A New Dimension

We could comment about our surroundings, our clothes, our hair, what was going on in our lives and in our parts of the world!  We were able to get talking and build rapport so much more easily.

The collaboration on the call came alive and we got more out of it than if we had not had the advantage of video.

Looking back on this conversation, it would be easy to say that it was video that made the difference, but that was only one aspect. It was empathy that drove the emotional connection. 

The cameras helped. We were also willing to open ourselves up to a more honest dialog, sharing something personal, becoming available and responsive to each other. 

The Key: Trust

Trust your teammates. Trust your clients. Trust your collaborators. Trust that there is value in what you have to share.

Here’s what I’ve concluded are the keys to successful interactions even when working across multiple time zones.

Lead by Example

Be confident and share honestly. Let other people see you and hear you. Let your emotions shine through your expressions (facial expressions, the tone of your voice, the words you choose, etc.).

Open up and people will trust you, and they will be more likely to open up too.  The Domino Effect can be very exciting.

Leverage Human Interaction

Promet Source is a leading practitioner of human-centered design. We know what it means to design for humans and we facilitate human-centered design workshops all over the country to enhance effectiveness and outcomes.

Just as we consistently emphasize that we are designing for humans, we are careful to not lose sight of the fact that we are designing by humans.

Strengthen Teams through Sharing

Too often, the left-brain, technology-driven environment in which we operate ignores the powerful impact of the human element in all of our engagements. Even when separated by borders and time zones, efforts to connect on a personal level pay off in ways that are often unanticipated.

Have you found this to be the case? Share your thoughts in the comment section below on why and how connecting on a human level can drive better outcomes.

Sharing your thoughts and experiences can go a long way toward a greater sense of connection and community in our dispersed, digital world.
 

Aug 12 2019
Image of the Rosetta Stone

In Mastering Drupal 8 Multilingual: Part 1 of 3, we focused on planning for Drupal 8 multilingual and its impact on a project's timeline and budget.

In Part 2 (below), we cover everything you need to know to have a functioning multilingual site with no custom code. Part 3 of the series covers more advanced techniques for site builders and front-end developers.

Aug 12 2019

Sean is a strong believer in the open source community at large, and that working collaboratively is best for creating awesome projects. His community work extends into maintaining and building the BADCamp website, as well as helping to maintain Docksal, a tool used for managing development environments.

Aug 11 2019

Today we complete the user migration example. In the previous post, we covered how to migrate email, timezone, username, password, and status. This time, we cover creation date, roles, and profile pictures. The source, destination, and dependencies configurations were explained already. Therefore, we are jumping straight to the process transformations in this entry.

Example field mapping for user migration

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD users, whose machine name is ud_migrations_users. The two migrations to execute are udm_user_pictures and udm_users. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. Particularly, we depend on a Picture (user_picture) image field attached to the user entity. The word in parentheses represents the machine name of the image field.

The explanation below is only for the user migration. It depends on a file migration to get the profile pictures. One motivation to have two migrations is for the images to be deleted if the file migration is rolled back. Note that other techniques exist for migrating images without having to create a separate migration. We have covered two of them in the articles about subfields and constants and pseudofields.

Migrating user creation date

Have a look at the previous post for details on the source values. For reference, the user creation time is provided by the member_since column, and one of the values is April 4, 2014. The following snippet shows how the various user date related properties are set:

created:
  plugin: format_date
  source: member_since
  from_format: 'F j, Y'
  to_format: 'U'
changed: '@created'
access: '@created'
login: '@created'

The created entity property stores a UNIX timestamp of when the user was added to Drupal. The value itself is an integer representing the number of seconds since the epoch. For example, 280299600 represents Sun, 19 Nov 1978 05:00:00 GMT. Kudos to the readers who knew this is Drupal's default expire HTTP header. Bonus points if you knew it was chosen in honor of someone’s birthdate. ;-)

Back to the migration, you need to transform the provided date from Month day, year format to a UNIX timestamp. To do this, you use the format_date plugin. The from_format is set to F j, Y which means your source date consists of:

  • The full textual representation of a month: April.
  • Followed by a space character.
  • Followed by the day of the month without leading zeros: 4.
  • Followed by a comma and another space character.
  • Followed by the full numeric representation of a year using four digits: 2014

If the value of from_format does not make sense, you are not alone. It is actually assembled from format characters of the date PHP function. When you need to specify the from and to formats, you basically need to look at the documentation and assemble a string that matches the desired date format. You need to pay close attention because upper and lowercase letters represent different things, like Y and y for the year with four digits versus two digits, respectively. Some date components have subtle variations, like d and j for the day with and without leading zeros, respectively. Also, take into account white spaces and date component separators. To finish the plugin configuration, you need to set the to_format configuration to something that produces a UNIX timestamp. If you look again at the documentation, you will see that U does the job.

The changed, access, and login entity properties are also dates in UNIX timestamp format. changed indicates when the user account was last updated. access indicates when the user last accessed the site. login indicates when the user last logged in. For brevity, the same value assigned to created is also assigned to these three entity properties. The at sign (@) means copy the value of a previous mapping in the process pipeline. If needed, each property can be set to a different value or left unassigned. None is actually required.

Migrating user roles

For reference, the roles are provided by the user_roles column, and one of the values is forum moderator, forum admin. It is a comma separated list of roles from the legacy system which need to be mapped to Drupal roles. It is possible that the user_roles column is not provided at all in the source. The following snippet shows how the roles are set:

roles:
  - plugin: skip_on_empty
    method: process
    source: user_roles
  - plugin: explode
    delimiter: ','
  - plugin: callback
    callable: trim
  - plugin: static_map
    map:
      'forum admin': administrator
      'webmaster': administrator
    default_value: null

First, the skip_on_empty plugin is used to skip the processing of the roles if the source column is missing. Then, the explode plugin is used to break the list into an array of strings representing the roles. Next, the callback plugin invokes the trim PHP function to remove any leading or trailing whitespace from the role names. Finally, the static_map plugin is used to manually map values from the legacy system to Drupal roles. All of these plugins have been explained previously. Refer to other articles in the series or the plugin documentation for details on how to use and configure them.

There are some things that are worth mentioning about migrating roles using this particular process pipeline. If the comma separated list includes spaces before or after the role name, you need to trim the value because the static map will perform an equality check. Having extraneous space characters will produce a mismatch.

Also, you do not need to map the anonymous or authenticated roles. Drupal users are assumed to be authenticated and cannot be anonymous. Any other role needs to be mapped manually to its machine name. You can find the machine name of any role on its edit page. In the example, only two out of four roles are mapped. Any role that is not found in the static map will be assigned the value null, as indicated in the default_value configuration. After processing, the null value will be ignored, and no role will be assigned. But you could use this feature to assign a default role in case the static map does not produce a match, as sketched below.
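For example, a hypothetical variation of the last plugin in the chain could fall back to a default role instead of null (the subscriber role is made up for illustration):

roles:
  # ...earlier plugins in the chain stay the same...
  - plugin: static_map
    map:
      'forum admin': administrator
      'webmaster': administrator
    # Hypothetical fallback: assigned when the map finds no match.
    default_value: subscriber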

Migrating profile pictures

For reference, the profile picture is provided by the user_photo column, and one of the values is P01. This value corresponds to the unique identifier of one record in the udm_user_pictures file migration, which is part of the same demo module. It is important to note that user_picture is not a user entity property. The field is created by the standard installation profile and attached to the user entity. You can find its configuration in the “Manage fields” tab of the “Account settings” configuration page at /admin/config/people/accounts. The following snippet shows how profile pictures are set:

user_picture/target_id:
  plugin: migration_lookup
  migration: udm_user_pictures
  source: user_photo

Image fields are entity references. Their target_id property needs to be an integer containing the file id (fid) of the image. This can be obtained using the migration_lookup plugin. Details on how to configure it can be found in this article. You could simply use user_picture as your field mapping because target_id is the default subfield and could be omitted. Also note that the alt subfield is not mapped. If present, its value will be used for the alternative text of the image. But if it is not specified, like in this example, Drupal will automatically generate an alternative text out of the username. An example value would be: Profile picture for user michele.
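If your source did provide alternative text, a sketch of the extra mapping could look like this (the src_photo_alt source column is hypothetical):

user_picture/target_id:
  plugin: migration_lookup
  migration: udm_user_pictures
  source: user_photo
# Hypothetical column containing the alternative text for the image.
user_picture/alt: src_photo_alt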

Technical note: The user entity contains other properties you can write to. For a list of available options, check the baseFieldDefinitions() method of the User class defining the entity. Note that more properties can be available up in the class hierarchy.

And with that, we wrap up the user migration example. We covered how to migrate a user’s mail, timezone, username, password, status, creation date, roles, and profile picture. Along the way, we presented various process plugins that had not been used previously in the series. We showed a couple of examples of process plugin chaining to make sure the migrated data is valid and in the format expected by Drupal.

What did you learn in today’s blog post? Did you know how to process dates for user entity properties? Have you migrated user roles before? Did you know how to import profile pictures? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series is made possible thanks to these generous sponsors. Contact us if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 10 2019

Today we are going to learn how to migrate users into Drupal. The example code will be explained in two blog posts. In this one, we cover the migration of email, timezone, username, password, and status. In the next one, we will cover creation date, roles, and profile pictures. Several techniques will be implemented to ensure that the migrated data is valid. For example, making sure that usernames are not duplicated.

Although the example is standalone, we will build on many of the concepts that had already been covered in the series. For instance, a file migration is included to import images used as profile pictures. This topic has been explained in detail in a previous post, and the example code is pretty similar. Therefore, no explanation is provided about the file migration to keep the focus on the user migration. Feel free to read other posts in the series if you need a refresher.

Example field mapping for user migration

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD users, whose machine name is ud_migrations_users. The two migrations to execute are udm_user_pictures and udm_users. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. Particularly, we depend on a Picture (user_picture) image field attached to the user entity. The word in parentheses represents the machine name of the image field.

The explanation below is only for the user migration. It depends on a file migration to get the profile pictures. One motivation to have two migrations is for the images to be deleted if the file migration is rolled back. Note that other techniques exist for migrating images without having to create a separate migration. We have covered two of them in the articles about subfields and constants and pseudofields.

Understanding the source

It is very important to understand the format of your source data. This will guide the transformation process required to produce the expected destination format. For this example, it is assumed that the legacy system from which users are being imported did not have unique usernames. Emails were used to uniquely identify users, but that is not desired in the new Drupal site. Instead, a username will be created from a public_name source column. Special measures will be taken to prevent duplication, as Drupal usernames must be unique. There are two more things to consider. First, source passwords are provided in plain text (never do this!). Second, some elements might be missing in the source, like roles and profile picture. The following snippet shows a sample record for the source section:

source:
  plugin: embedded_data
  data_rows:
    - legacy_id: 101
      public_name: 'Michele'
      user_email: '[email protected]'
      timezone: 'America/New_York'
      user_password: 'totally insecure password 1'
      user_status: 'active'
      member_since: 'January 1, 2011'
      user_roles: 'forum moderator, forum admin'
      user_photo: 'P01'
  ids:
    legacy_id:
      type: integer

Configuring the destination and dependencies

The destination section specifies that user is the target entity. When that is the case, you can set an optional md5_passwords configuration. If it is set to true, the system will take an MD5 hashed password and convert it to the encryption algorithm that Drupal uses. For more information on password migrations, refer to these articles for basic and advanced use cases. To migrate the profile pictures, a separate migration is created. The dependency of user on file is added explicitly. Refer to these articles for more information on migrating images and files and setting dependencies. The following code snippet shows how the destination and dependencies are set:

destination:
  plugin: 'entity:user'
  md5_passwords: true
migration_dependencies:
  required:
    - udm_user_pictures
  optional: []

Processing the fields

The interesting part of a user migration is the field mapping. The specific transformation will depend on your source, but some arguably complex cases will be addressed in the example. Let’s start with the basics: verbatim copies from source to destination. The following snippet shows three mappings:

mail: user_email
init: user_email
timezone: user_timezone

The mail, init, and timezone entity properties are copied directly from the source. Both mail and init are email addresses. The difference is that mail stores the current email, while init stores the one used when the account was first created. The former might change if the user updates their profile, while the latter will never change. The timezone needs to be a string taken from a specific set of values. Refer to this page for a list of supported timezones.

name:
  - plugin: machine_name
    source: public_name
  - plugin: make_unique_entity_field
    entity_type: user
    field: name
    postfix: _

The name entity property stores the username. This has to be unique in the system. If the source data contained a unique value for each record, it could be used to set the username. None of the unique source columns (e.g., legacy_id) is suitable to be used as the username. Therefore, extra processing is needed. The machine_name plugin converts the public_name source column into a transliterated string with some restrictions: any character that is not a number or letter will be converted to an underscore. The transformed value is sent to make_unique_entity_field. This plugin makes sure its input value is not repeated in the whole system for a particular entity field. In this example, the username will be unique. The plugin is configured indicating which entity type and field (property) you want to check. If an equal value already exists, a new one is created by appending what you define as postfix plus a number. In this example, there are two records with public_name set to Benjamin. Eventually, the usernames produced by running the process plugin chain will be: benjamin and benjamin_1.

process:
  pass:
    plugin: callback
    callable: md5
    source: user_password
destination:
  plugin: 'entity:user'
  md5_passwords: true

The pass entity property stores the user’s password. In this example, the source provides the passwords in plain text. Needless to say, that is a terrible idea. But let’s work with it for now. Drupal uses portable PHP password hashes implemented by PhpassHashedPassword. Understanding the details of how Drupal converts one algorithm to another will be left as an exercise for the curious reader. In this example, we are going to take advantage of a feature provided by the Migrate API to automatically convert MD5 hashes to the algorithm used by Drupal. The callback plugin is configured to use the md5 PHP function to convert the plain text password into a hashed version. The last part of the puzzle is to set, in the destination section, the md5_passwords configuration to true. This will take care of converting the already md5-hashed password to the value expected by Drupal.

Note: MD5-hashed passwords are insecure. In the example, the password is hashed with MD5 as an intermediate step only. Drupal uses other algorithms to store passwords securely.

status:
  plugin: static_map
  source: user_status
  map:
    inactive: 0
    active: 1

The status entity property stores whether a user is active or blocked from the system. The source user_status values are strings, but Drupal stores this data as a boolean. A value of zero (0) indicates that the user is blocked, while a value of one (1) indicates that it is active. The static_map plugin is used to manually map the values from source to destination. This plugin expects a map configuration containing an array of key-value mappings. The value from the source is on the left. The value expected by Drupal is on the right.

Technical note: Booleans are true or false values. Even though Drupal treats the status property as a boolean, it is internally stored as a tiny int in the database. That is why the numbers zero or one are used in the example. For this particular case, using a number or a boolean value on the right side of the mapping produces the same result.

In the next blog post, we will continue with the user migration. Particularly, we will explain how to migrate the user creation time, roles, and profile pictures.

What did you learn in today’s blog post? Have you migrated user passwords before, either in plain text or hashed? Did you know how to prevent duplicates for values that need to be unique in the system? Were you aware of the plugin that allows you to manually map values from source to destination? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Next: Migrating users into Drupal - Part 2

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 09 2019

At Promet Source, conversations with clients and among co-workers tend to revolve around various aspects of compliance, user experience, site navigation, and design clarity. We need a common nomenclature for referring to interface elements, but that leads to the question of who makes this stuff up and what makes these terms stick?
 
I asked that recently, during an afternoon of back-to-back meetings. In separate contexts, “cookies,” “breadcrumbs,” and “hamburgers” were all mentioned as they pertain to the sites we are building for clients. But I got to wondering: what is it about the evolving Web lexicon that seems inordinately slanted towards tasty snacks?
 

One Theory

As we all know, devs and designers work very hard, with incredible focus for long hours at a stretch. Are we trying to inject some fun language that evokes touch, taste, and smell into a web that can feel rather flat sometimes when we are in the trenches?


I couldn’t help but wonder about a potentially unifying theme to cookies, breadcrumbs and the buns that provide the top and bottom horizontal lines of the increasingly ubiquitous hamburger icon. That sparked my curiosity and a bit of research.

Data/Cookie Jar

Let’s start with cookies -- a term that refers to the extraction and storage of user data such as logins, previous searches, activity on a site, and items in a shopping cart.  Almost all Websites use and store cookies on Web browsers.

a stack of chocolate chip cookies

Generally speaking, cookies are designed to inform better and more personalized Web experiences, but they do, of course, give rise to all sorts of privacy and security concerns. 
 
Potential cookie constraints for Websites developed in the United States for a U.S. audience are moving in an uncertain direction. Up to this point, it’s essentially been the Wild West, with few restrictions governing their usage. 
 
In the European Union, it’s a different story. Assorted rules and regulations, collectively known as the “Cookie Law,” have been in place for nearly a decade -- forbidding the tracking of users’ Web activity without their consent. 
 
As is the case with U.S.-based Websites that need to ensure accessibility, compliance with the Cookie Law can be complicated -- requiring rewriting and reconfiguration of code, followed by careful testing to ensure that the site’s code, the server, and the user’s browser are aligned to prevent cookies from tracking user behavior and collecting information. Another issue that accessibility and cookies have in common: there’s more at stake than compliance. To an increasing degree, users avoid engaging with Websites when they believe that their activity is being tracked by the use of cookies, and there’s no question that overall levels of trust appear to be on the decline as privacy concerns increase. This is among the reasons why many websites are starting to give users the option of just saying no to cookies while still allowing them access to the site.

Connecting the Crumbs

Considerably newer to the Web lexicon than cookies, a breadcrumb or breadcrumb trail is a navigational aid in user interfaces designed to help users track their own activity within programs or websites, providing them with a sense of place within the bigger picture of the site. 
 
Breadcrumbs can take different forms. Generally speaking, a breadcrumb trail tracks the order of each page viewed, as a horizontal list below the top headers. This provides a guide for the user to navigate back to any point where they’ve previously been on the site. Think about Grimms’ story of Hansel and Gretel.
 
Breadcrumbs can be very helpful on complex, content-heavy sites. Who among us hasn’t found themselves frustrated in an attempt to navigate back to a page that seems to have temporarily disappeared?

On the Table

Unlike cookies, which for better or for worse, are stored behind the scenes and consumed in a manner that’s usually not known to the user, a breadcrumb trail is out in the open -- right upfront for the user to see and follow. Breadcrumbs are designed solely to enhance the user experience, functioning as a reverse GPS on complex Websites.  
 
As more and more users come to count on breadcrumbs as a navigational aid, we can expect that the demand for them will increase. At the same time, we can expect that usage of cookies will come under increased scrutiny along with a trend toward escalation of privacy concerns and a growing skittishness about how personal information is being shared. At Promet, we consider breadcrumbs to be a must-have on any site.

Time for Some Protein

As for the third item in our list of tasty Web terms, the hamburger is essentially all good. This three-line icon that’s started to appear at the top of screens serves as a mini-portal to additional options or pages.

Actual hamburger on the left. A web hamburger icon on the right.

What’s not to love about this feature that takes up so little space on the screen, but opens the door to a trove of additional navigation or features for apps and Websites? The fact is, UX/UI trends are constantly evolving, and users vary widely in the pace at which they pick up what’s new and next. The hamburger icon has a lot going for it and it’s not going away.

 

Meet the Search Sandwich

There’s a new item on the table, and we were just introduced to it by one of our UX-savvy clients. As far as I know, it doesn’t have an official name yet, so we affectionately refer to it as the “search sandwich.” It’s an evolved hamburger combined with a search icon to indicate to users that both the navigation menu and the search bar can be accessed from this icon. It looks a bit like a ham sandwich with an olive on top and might make an appearance on a website soon. Stay tuned.
 
So there you have it. Key factors in our Web design world -- possibly a reflection of a desire to take our high-tech conversations down a notch, with these playful metaphors for elements that we must all learn to identify with, whether as a designer, developer, or just a web user. They remind us that the Web is a rapidly evolving environment of UI/UX trends -- created and consumed by humans.
 
Interested in serving up a tasty web experience? Contact us today

Aug 09 2019

Today we continue the conversation about migration dependencies with a hierarchical taxonomy terms example. Along the way, we will present the process and syntax for migrating into multivalue fields. The example consists of two separate migrations: one to import taxonomy terms accounting for term hierarchy, and another to import nodes with a multivalue taxonomy term field. Following this approach, any node and taxonomy term created by the migration process will be removed from the system upon rollback.

Syntax for multivalue field migration.

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD multivalue taxonomy terms, whose machine name is ud_migrations_multivalue_terms. The two migrations to execute are udm_dependencies_multivalue_term and udm_dependencies_multivalue_node. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. Particularly, it expects a Tags (tags) taxonomy vocabulary, an Article (article) content type, and a Tags (field_tags) field that accepts multiple values. The words in parentheses represent the machine name of each element.

Migrating taxonomy terms and their hierarchy

The example data for the taxonomy terms migration is fruits and fruit varieties. Each row will contain the name and description of the fruit. Additionally, it is possible to define a parent term to establish hierarchy. For example, “Red grape” is a child of “Grape”. Note that no numerical identifier is provided. Instead, the value of the name column is used as a string identifier for the migration. If term names could change over time, it is recommended to have another column that does not change (e.g., an autoincrementing number). The following snippet shows how the source section is configured:

source:
  plugin: embedded_data
  data_rows:
    - fruit_name: 'Grape'
      fruit_description: 'Eat fresh or prepare some jelly.'
    - fruit_name: 'Red grape'
      fruit_description: 'Sweet grape'
      fruit_parent: 'Grape'
    - fruit_name: 'Pear'
      fruit_description: 'Eat fresh or prepare a jam.'
  ids:
    fruit_name:
      type: string

The destination is quite short. The target entity is set to taxonomy terms. Additionally, you indicate which vocabulary to migrate into. If you have terms that would be stored in different vocabularies, you can use the vid property in the process section to assign the target vocabulary, as sketched after the next snippet. If you write to a single one, the default_bundle key in the destination can be used instead. The following snippet shows how the destination section is configured:

destination:
  plugin: 'entity:taxonomy_term'
  default_bundle: tags
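As mentioned above, if the terms belonged to different vocabularies, a minimal sketch of the alternative would drop default_bundle and assign the vocabulary per record (the source_vocabulary column is hypothetical and would hold valid vocabulary machine names):

process:
  # Hypothetical column with the machine name of the target vocabulary.
  vid: source_vocabulary
destination:
  plugin: 'entity:taxonomy_term'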

For the process section, three entity properties are set: name, description, and parent. The first two are strings copied directly from the source. In the case of parent, it is an entity reference to another taxonomy term. It stores the taxonomy term id (tid) of the parent term. To assign its value, the migration_lookup plugin is configured similarly to the previous example. The difference is that, in this case, the migration to reference is the same one being defined. This raises an important consideration: parent terms should be migrated before their children so that they can be found by the lookup operation. Also note that the lookup value is the term name itself, because that is what this migration set as the unique identifier in the source section. The following snippet shows how the process section is configured:

process:
  name: fruit_name
  description: fruit_description
  parent:
    plugin: migration_lookup
    migration: udm_dependencies_multivalue_term
    source: fruit_parent

Technical note: The taxonomy term entity contains other properties you can write to. For a list of available options, check the baseFieldDefinitions() method of the Term class defining the entity. Note that more properties may be available further up the class hierarchy.
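To illustrate, here is a hedged sketch of what writing to additional term base fields could look like. The fruit_weight and fruit_language source columns are hypothetical and not part of today's example:

process:
  name: fruit_name
  description: fruit_description
  # 'weight' is a base field that controls ordering among sibling terms.
  weight: fruit_weight
  # 'langcode' is a base field that sets the language of the term.
  langcode: fruit_language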

Migrating multivalue taxonomy terms fields

The next step is to create a node migration that can write to a multivalue taxonomy term field. To stay on point, only one more field will be set: the title, which is required by the node entity. Read this change record for more information on how the Migrate API processes Entity API validation. The following snippet shows how the source section is configured for the node migration:

source:
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      thoughtful_title: 'Amazing recipe'
      fruit_list: 'Green apple, Banana, Pear'
    - unique_id: 2
      thoughtful_title: 'Fruit-less recipe'
  ids:
    unique_id:
      type: integer

The fruit_list column contains a comma-separated list of taxonomy term names to apply. Note that the values match the unique identifiers of the taxonomy term migration. If you had used numbers as migration identifiers there, you would have to use those numbers in this migration to refer to the terms. An example of that was presented in the previous post. Also note that there is one record with no terms associated. This will be considered during the field mapping. The following snippet shows how the process section is configured for the node migration:

process:
  title: thoughtful_title
  field_tags:
    - plugin: skip_on_empty
      source: fruit_list
      method: process
      message: 'No fruit_list listed.'
    - plugin: explode
      delimiter: ','
    - plugin: migration_lookup
      migration: udm_dependencies_multivalue_term

The title of the node is a verbatim copy of the thoughtful_title column. The Tags field, mapped using its machine name field_tags, uses three chained process plugins. The skip_on_empty plugin reads the value of the fruit_list column and skips the processing of this field if no value is provided. This is done to accommodate the fact that some records in the source do not specify tags. Note that the method configuration key is set to process. This indicates that only this field should be skipped and not the entire record. Ultimately, tags are optional in this context and nodes should still be imported even if no tag is associated.

The explode plugin allows you to break a string into an array, using a delimiter to determine where to make the cut. Later, this array is passed to the migration_lookup plugin, specifying the term migration as the one to use for the lookup operation. Again, the taxonomy term names are used here because they are the unique identifiers of the term migration. Note that neither of these plugins has a source configuration. This is because when process plugins are chained, the result of one plugin is sent as the source to be transformed by the next one in line. The end result is an array of taxonomy term IDs that will be assigned to field_tags. The migration_lookup plugin is able to process both single values and arrays.

The last part of the migration specifies the destination section and any migration dependencies. Refer to this article for more details on setting migration dependencies. The following snippet shows how both are configured for the node migration:

destination:
  plugin: 'entity:node'
  default_bundle: article
migration_dependencies:
  required:
    - udm_dependencies_multivalue_term
  optional: []

More syntactic sugar

One way to set multivalue fields in Drupal migrations is to assign their value as an array. Another option is to set each value manually using field deltas. Deltas are integer numbers starting at zero (0) and incrementing by one (1) for each element of a multivalue field. Although you could set any delta in the Migrate API, consider the field definition in Drupal. It is possible that limits have been set on the number of values a field can hold. You can specify deltas and subfields at the same time. The full syntax is field_name/field_delta/subfield. The following example shows the syntax for a multivalue image field:

process:
  field_photos/0/target_id: source_fid_first
  field_photos/0/alt: source_alt_first
  field_photos/1/target_id: source_fid_second
  field_photos/1/alt: source_alt_second
  field_photos/2/target_id: source_fid_third
  field_photos/2/alt: source_alt_third

Manually setting a multivalue field is less flexible and more error-prone. In today’s example, we showed how to accommodate the list of terms not being provided. Imagine having to do that for each delta and subfield combination. Still, the functionality is there in case you need it. In the end, Drupal offers more syntactic sugar so you can write shorter field mappings. Additionally, there are various process plugins that can handle arrays for setting multivalue fields, as shown in the sketch below.
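For instance, the core get plugin can assemble several source columns into a single array, which is then assigned to the field in one mapping. This is a minimal sketch reusing the file ID columns from the delta example above. Note that assigning plain file IDs this way only sets target_id; subfields like alt would still need separate handling:

process:
  field_photos:
    plugin: get
    # Each source column becomes one delta of the multivalue field.
    source:
      - source_fid_first
      - source_fid_second
      - source_fid_third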

Note: There are other ways to migrate multivalue fields. For example, when using the entity_generate plugin provided by Migrate Plus, there is no need to create a separate taxonomy term migration. This plugin is able to create the terms on the fly while running the import process. The caveat is that terms created this way are not deleted upon rollback.

What did you learn in today’s blog post? Have you ever done a taxonomy term migration before? Were you aware of how to migrate hierarchical entities? Did you know you can manually import multivalue fields using deltas? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 09 2019
Aug 09

One of Drupal’s biggest strengths is its data modeling capabilities. You can break the information that you need to store into individual fields and group them in content types. You can also take advantage of default behavior provided by entities like nodes, users, taxonomy terms, files, etc. Once the data has been modeled and saved into the system, Drupal will keep track of the relationship between them. Today we will learn about migration dependencies in Drupal.

As we have seen throughout the series, the Migrate API can be used to write to different entities. One restriction though is that each migration definition can only target one type of entity at a time. Sometimes, a piece of content has references to other elements. For example, a node that includes entity reference fields to users, taxonomy terms, and images. The recommended way to get them into Drupal is writing one migration definition for each. Then, you specify the relationships that exist among them.

Snippet of migration dependency definition

Breaking up migrations

When you break up your migration project into multiple, smaller migrations, they are easier to manage and you have more control over the process pipeline. Depending on how you write them, you can rest assured that imported data is properly deleted if you ever have to roll back the migration. You can also enforce that certain elements exist in the system before others that depend on them are created. In today’s example, we are going to leverage the example from the previous post to demonstrate this. The portraits imported in the file migration will be used in the image field of nodes of type article.

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD migration dependencies introduction whose machine name is ud_migrations_dependencies_intro. Last time the udm_dependencies_intro_image was imported. This time udm_dependencies_intro_node will be executed. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

Writing the source and destination definition

To keep things simple, the example will only write the node title and assign the image field. A constant will be provided to create the alternative text for the images. The following snippet shows how the source section is configured:

source:
  constants:
    PHOTO_DESCRIPTION_PREFIX: 'Photo of'
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      name: 'Michele Metts'
      photo_file: 'P01'
    - unique_id: 2
      name: 'David Valdez'
      photo_file: 'P03'
    - unique_id: 3
      name: 'Clayton Dewey'
      photo_file: 'P04'
  ids:
    unique_id:
      type: integer

Remember that in this migration you want to use files that have already been imported. Therefore, no URLs to the image files are provided. Instead, you need a reference to the other migration. Particularly, you need a reference to the unique identifiers for each element of the file migration. In the process section, this value will be used to look up the portrait that will be assigned to the image field.

The destination section is quite short. You only specify that the target is a node entity and the content type is article. Remember that you need to use the machine name of the content type. If you need a refresher on how this is set up, have a look at the articles in the series. It is recommended to read them in order as some examples expand on topics that had been previously covered. The following snippet shows how the destination section is configured:

destination:
  plugin: 'entity:node'
  default_bundle: article

Using previously imported files in image fields

To be able to reuse the previously imported files, the migration_lookup plugin is used. Additionally, an alternative text for the image is created using the concat plugin. The following snippet shows how the process section is configured:

process:
  title: name
  field_image/target_id:
    plugin: migration_lookup
    migration: udm_dependencies_intro_image
    source: photo_file
  field_image/alt:
    plugin: concat
    source:
      - constants/PHOTO_DESCRIPTION_PREFIX
      - name
    delimiter: ' '

In Drupal, files and images are entity reference fields. That means they only store a pointer to the file, not the file itself. The pointer is an integer number representing the file ID (fid) inside Drupal. The migration_lookup plugin allows you to query the file migration so imported elements can be reused in the node migration.

The migration option indicates which migration to query, specifying its migration id. Additionally, you indicate which columns in your source match the unique identifiers of the migration to query. In this case, the values of the photo_file column in udm_dependencies_intro_node match those of the photo_id column in udm_dependencies_intro_image. If a match is found, this plugin will return the file ID, which can be directly assigned to the target_id of the image field. That is how the relationship between the two migrations is established.

Note: The migration_lookup plugin allows you to query more than one migration at a time. Refer to the documentation for details on how to set that up and why you would do it. It also offers additional configuration options.

As a good accessibility practice, an alternative text is set for the image using the alt subfield. Other than that, only the node title is set. And with that, you have two migrations connected to each other. If you were to roll back both of them, no file or node would remain in the system.

Being explicit about migration dependencies

The node migration depends on the file migration. That is, the files need to be migrated before they can be used as images for the nodes. In fact, in the provided example, if you were to import the nodes before the files, the migration would fail and no node would be created. You can be explicit about migration dependencies. To do this, add a new configuration option to the node migration that lists which migrations it depends on. The following snippet shows how this is configured:

migration_dependencies:
  required:
    - udm_dependencies_intro_image
  optional: []

The migration_dependencies key goes at the root level of the YAML definition file. It accepts two configuration options: required and optional. Both accept an array of migration ids. The required migrations are hard prerequisites. They need to be executed in advance or the system will refuse to import the current one. The optional migrations do not have to be executed in advance. But if you were to execute multiple migrations at a time, the system will run them in the order suggested by the dependency hierarchy. Learn more about migration dependencies in this article. Also, check this comment on Drupal.org in case you have problems where the system reports that certain dependencies are not met.

Now that the dependency among migrations has been explicitly established, you have two options. Either import each migration manually in the expected order, or import the parent migration using the --execute-dependencies flag. When you do the latter, the system will take care of determining the order in which all migrations need to be imported. The following two snippets will produce the same result for the demo module:

$ drush migrate:import udm_dependencies_intro_image
$ drush migrate:import udm_dependencies_intro_node
$ drush migrate:import udm_dependencies_intro_node --execute-dependencies

In this example, there are only two migrations, but you can have as many as needed. For example, a node with references to users, taxonomy terms, paragraphs, etc. Also note that the parent entity does not have to be a node. Users, taxonomy terms, and paragraphs are all fieldable entities. They can contain references the same way nodes do. In future entries, we will talk again about migration dependencies and provide more examples.

Tagging migrations

The core Migrate API offers another mechanism to execute multiple migrations at a time: you can tag them. To do that, you add a migration_tags key at the root level of the YAML definition file. Its value is an array of arbitrary tag names to assign to the migration. Once set, you run them using the migrate import command with the --tag flag. You can also roll back migrations per tag. The first snippet shows how to set the tags and the second how to execute them:

migration_tags:
  - UD Articles
  - UD Example
$ drush migrate:import --tag='UD Articles,UD Example'
$ drush migrate:rollback --tag='UD Articles,UD Example'

It is important to note that tags and dependencies are different concepts, even though both allow you to run multiple migrations at a time. It is possible for a migration definition file to contain both, either, or neither. The tag system is used extensively in Drupal core for migrations related to upgrading to Drupal 8 from previous versions. For example, you might want to run all migrations tagged ‘Drupal 7’ if you are coming from that version. It is possible to specify more than one tag when running the migrate import command, separating each with a comma (,).
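As a sketch, that could look like the following command, with the tag quoted because it contains a space:

$ drush migrate:import --tag='Drupal 7'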

Note: The Migrate Plus module offers migration groups to organize migrations similarly to how tags work. This will be covered in a future entry. Just keep in mind that tags are provided out of the box by the Migrate API. On the other hand, migrations groups depend on a contributed module.

What did you learn in today’s blog post? Have you used the migration_lookup plugin to query imported elements from a separate migration? Did you know you can set required and optional dependencies? Have you used tags to organize your migrations? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 08 2019
Aug 08

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week. You'll find that the meetings, while also providing updates of completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide insights on, we encourage you to get involved.

Drupal 9 Readiness (08/05/19)

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know to keep up-to-date for the easiest Drupal 9 upgrade of their sites are also welcome.

  • Usually happens every other Monday at 18:00 UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.

Symfony 4/5 compatibility

The issue, Allow Symfony 4 to be installed in Drupal 8, has had a lot of work put into it by Michael Lutz.

Contrib deprecation testing on drupal.org

Gábor Hojtsy grabbed the data, did some analysis, and posted his findings on preparing for Drupal 9. The main takeaway of the post is to stop using drupal_set_message() now.

Examples module Drupal 9 compatibility

Andrey Postnikov posted about the Kharkov code sprint, where 23 issues were addressed. Follow the Examples for Developers project for more information. There are still a bunch of issues to review there if anyone is interested!

Module Upgrader Drupal 9 compatibility

Deprecation cleanup status - blockers to Drupal 9 branch opening

According to Drupal core's own deprecation testing results, there are currently 13 child issues open, most of which need review.

Twig 2 upgrade guide and automation and other frontend deprecations tooling

Semantic versioning for contrib projects

Now that we've come to a path forward for how we plan on supporting semver in core (core key using semver + composer.json for advanced features), the Drupal Association is planning on auditing our infrastructure to start implementing semver.

New Drupal 8.8 deprecations that may need to be backported

  • Ryan Aslett and Gábor Hojtsy did some analysis on deprecations in contrib that are new in Drupal 8.8. They found that 17% of all deprecated API use in contrib now comes from code deprecated in either 8.7 or 8.8. Ryan Aslett looked at the top list of those deprecations and categorized them based on whether the replacements were also introduced in 8.8 or earlier.
  • We're not backporting APIs; that carries far too much risk of breaking semver. If a contrib maintainer has usages of code that was deprecated in 8.8.x, and they want their module to be 9.x compatible on the day that 9.0 comes out, they can:
    • do internal version checking,
    • open another branch,
    • wait until 9.1 to be 100% compatible with all supported versions, or
    • drop support for 8.7.x.

Renaming core modules, eg. actions

  • There's a meta about renames.
  • Renaming modules can have impacts on other modules that declare a dependency on those modules. 
  • We need some way to prove that a rename doesn't break contrib modules.

Migration Initiative Meeting (08/08/19)

This meeting:

  • Usually happens every Thursday and alternates between 14:00 and 21:00 UTC.
  • Is for core migrate maintainers and developers and anybody else in the community with an interest in migrations.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public migration meeting agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.
  • For anonymous comments, start with a bust in silhouette emoji. To take a comment or thread off the record, start with a no entry sign emoji.

Some issues need review:

  1. Add test of D7 term localized source plugin
  2. Migrate D7 synchronized fields
  3. Ensure language is not Null in translation source queries 
  4. Language specific term (i18n_taxonomy) should not rely on entity translation in D7 taxonomy term migration
  5. Migrate D6 and D7 node revision translations to D8 
  6. Migrate D7 i18n taxonomy term language 
  7. Use the lock service for migration locks 
  8. Undeprecate Drupal\comment\Plugin\migrate\source\d6\Comment::prepareComment() and mark as internal
  9. Create Migration Lookup service 
  10. Validate Migration State should load test fixture
  11. Boolean Field On and Off Label not Migrating
  12. Assert plural labels exist on migrate upgrade form
  13. Migrate UI - review help text
Aug 08 2019
Aug 08
Text on the Rosetta Stone

The web is constantly growing, evolving and—thankfully—growing more accessible and inclusive.

It is becoming expected that a user can interact with a website solely via keyboard or have the option to browse in their native language. There are many ways to serve the needs of non-native-language users, but one of the more robust is Drupal Multilingual.

Unlike third-party translation plugins like Google Translate or browser translation tools, Drupal's suite of core Multilingual tools allows you to write accurate and accessible translated content in the same manner as you write your default-language content. With no limit on the number of languages, settings for right-to-left content, and the ability to translate any and all of your content, Drupal 8 can create a true multi-language experience like never before.

There is, however, a bit of planning and work involved.

Hopefully, this blog series will help smooth the path to truly inclusive content by highlighting some project management, design, site building, and development gotchas, as well as providing some tips and tricks to make the multilingual experience better for everyone. Part one will help you decide if you need multilingual as well as provide some tips on how to plan and budget for it.

Aug 08 2019
Aug 08

For most Drupal projects, patches are inevitable. It’s how we, in the Drupal community, share code. If that scares you, don’t worry: the community is working hard to move to a pull/merge request workflow. Due to the collaborative nature of Drupal as a thriving open source community and the ever-growing ecosystem of contrib modules, patches are the ever-evolving glue that can hold a site together.

Before Drupal 8, you may have seen projects use drush make, which is a Drupal-specific solution. As part of the “get off the island” movement, Drupal adopted the existing dependency manager Composer. Composer does a decent job alleviating the headaches of managing several sites with different dependencies. However, out of the box, Composer will revert patched core files and contrib modules, and it is for that reason that the composer-patches project was created. In this blog post, we are going to review how to set up composer-patches for a Composer-managed project and how to specify locally or remotely hosted patches.

The setup

In your favorite command line tool, you will want to add the composer-patches project:

composer require cweagans/composer-patches:~1.0 --update-with-dependencies

With this small change, your project is now set up for success because composer can manage your patches. 

Local patches

Sometimes you will find that you need to patch contrib or core specifically for your project, and therefore the patch exists locally. Composer-patches can apply that patch for you; we just need to tell it where it is. Let’s look at an example project that has a core patch applied and saved locally in the project root directory ‘patches/core-invalid-config-structures.patch’:
    ...
    "extra": {
      "patches": {
        "drupal/core": {
          "Core Invalid config structures ":"patches/core-invalid-config-structures.patch"
        }
      }
    }

In your composer.json, you will want to add an “extra” section if it doesn’t already exist.  Composer-patches will take the packages listed in “patches” and try to apply any listed patches. In our above example, the package we are patching is “drupal/core”. Patches are declared as follows:

“Patch description”: “path to patch file”

This information will be printed on the command line while Composer tries to update the package, which makes it important to summarize the patch’s purpose well. If you would like to see what this looks like in the wild, take a look at our distribution Rain, which leverages a couple of contrib patches.

After manually updating composer.json, it is always a good idea to run composer validate to confirm the JSON syntax is right. If you get the green success message, run composer update drupal/[projectname], e.g. composer update drupal/core, to have the patch applied.

You will know that the patch is applied based on the output:

patch output

As you can see, the package getting patched is removed, added back and the patch is applied. 

Note: Sometimes I feel like I have to give Composer a nudge. Always feel comfortable deleting /core, /vendor, or /modules/contrib, but if you delete composer.lock, know that your dependencies could update based on your constraints. Composer.json tracks our package dependencies at certain version constraints, while composer.lock is the recipe of computed versions based on those constraints. I have found myself running the following:

rm -rf core && rm -rf modules/contrib && rm -rf vendor
composer install

Remote Patches

When possible we should open issues on Drupal.org and post patches there. That way, the community can work together to solve a problem and usually you’ll get a more reliable, lasting solution. Think about it this way - would you rather only you or your team review a critical patch to your project or hundreds of developers?

To make composer-patches grab a remote patch make the following changes:
    ...
    "extra": {
      "patches": {
        "drupal/core": {

          "#2925890-10: Invalid config structures ":"https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }
      }
    } 

The only change here is that, rather than the path to the local patch, we have substituted in the URL of the patch. This will show a similar success message when applied correctly:

remote patches

Tips 

So far, I’ve shown you how to get going with the composer-patches project, but there are a lot of settings/plugins that can elevate your project. A feature I turn on for almost all sites is exiting on patch failure, because it is a big deal when a patch fails. If you too want to turn this feature on, add the following line to the “extra” section of your composer.json:

"composer-exit-on-patch-failure": true,

I have also found it helpful to add a link back to the original issue in the composer.json patch declaration. Imagine working on a release when one of your patches fails, but the only reference you have to the issue is the patch file URL. It is times like these that a link to the issue can make your day. If we made the same change to our earlier example, it would look like the following:

 "drupal/core": {
          "#2925890-10: Invalid config structures (https://www.drupal.org/project/drupal/issues/2925890)" : "https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }

Conclusion

Composer-patches is a critical package for any Drupal project managed by Composer. In this blog I showed you how to get started with the project and some of the tips and tricks I’ve learned along the way. How does your team use composer-patches? Do you have a favorite setting that I didn’t mention? Feel free to drop a comment and share what works for you and your team.

Aug 08 2019
Aug 08

If you have a local business — a restaurant, a bar, a dental clinic, a flower delivery service, a lawyer's office, and so on — you will benefit immensely from a strong online presence. In this post, we will discuss why Drupal 8 is a great choice to build a local business website. Read on to see how numerous Drupal 8’s benefits will play in favor of your local business.

Some stats about why your local business needs a website

Local businesses once used to rely on word-of-mouth marketing. But the new digital era has changed the game. Customers widely use local Google search to find places, services, or products, trust online customer reviews, and so on. 

So consider these stats about how things are going for local businesses in the digital world:

  • Users rely on search engines for finding local information. According to Google’s study, 4 in 5 people do so.
  • Local searches are also very goal-oriented. The same Google’s study says that 50% of users who performed a local search on their smartphones visited the store within the next 24 hours. 
  • Mobile local searches grow like crazy. According to Statista, mobile local searches are forecast to reach 141.9 billion in 2019 compared to 66.5 billion in 2014. At the same time, the desktop trend even slightly drops (62.3 and 66.5 billion, respectively).
  • Smartphone shoppers love local search. Statista also informs that 82% of smartphone shoppers in the US have used their device for local search with the “near me” keyword as of 7/2018.
  • Customers read reviews for local businesses. According to the study by BrightLocal, 86% of people do so.

Reasons to build a local business website on Drupal 8

SEO-friendliness with plenty of useful modules

First of all, local businesses shouldn’t miss their unique opportunity and make the best of SEO. Google has special approaches to local search. When users search by adding a city name or the “near me” keyword, Google lists the best results near the top of the SERPs in a variety of rich ways. Among them:

  • the Knowledge Panel
  • locations on the Google Map
  • carousels with images, news, reviews, etc.

Moreover, users are able to get all the necessary information like your business hours, get your contacts, read a review, get directions, book an appointment, and so on, without even clicking through to your website (zero-click SERPs).

Local Google search on a mobile phone

How to get to these results? Among the recommendations are:

  • provide detailed information in Google My Business listing
  • have an optimized Knowledge Graph for your website
  • optimize content using local keywords
  • optimize content so it fits Google’s rich snippets
  • and, of course, follow overall general best SEO practices 

Here is where the Drupal 8 CMS can be your very helpful assistant. In addition to being SEO-friendly out-of-box, Drupal 8 has a wealth of useful SEO modules for various purposes. They include:

and many more.

Content easy to manage

Unique and relevant content, regularly updated and optimized with local keywords, is one of your most important local SEO secrets. The richer the content is, the richer it looks in Google local search results. In addition, trimming your content to fit Google’s rich snippets is the key to optimizing your website for voice search.

In Drupal 8, it is easy to create, edit, and present content in attractive ways. Drupal 8 offers you:

  • quick edits directly on the page
  • the Media Library to easily enrich your content with images, videos, and audios
  • handy content previews
  • drag-and-drop page layouts with Layout Builder
  • Drupal Views grids, slideshows, and carousels for the attractive content presentation
  • content revision history
  • mobile-friendly admin interfaces
  • content moderation workflows

and much more.  
 

Media Library in Drupal 8

Mobile optimization out of box

In addition to the above mobile statistics, here is more. The mobile share of organic search engine visits has reached 59% in 2019 against 27% in 2013. So your successful local business absolutely needs a mobile-friendly website.

Here is where Drupal 8 wins totally. It has been built around a mobile-first approach. The CMS features built-in modules for creating responsive web design — the Responsive Image and the Breakpoint. 

The responsive web design technique allows your website pages to adapt to any user’s screen by showing a different layout. The page elements resize, change their position, or disappear to provide the smoothest viewing experiences for everyone. 

Multi-language to attract more guests

Let’s suppose you run a local business in your country in your local language. Consider adding English as an international language, or another language based on your touristic audience. See how you can attract your city's guests as they go ahead with Google search.

Drupal 8 is the best option for multilingual websites and allows you to easily add as many languages as you wish. Drupal 8 supports a hundred of them out-of-the-box with the interface translations included. 

Thanks to Drupal 8 Multilingual Initiative (D8MI), Drupal 8 has four powerful modules responsible for every aspect of translation. 

High accessibility standards

According to the CDC (Centers for Disease Control and Prevention), 26% (1 in 4) of adults in the US have some form of disability. This is a quarter of your potential customers. Moreover, they are the ones who may need your local services more than others — for example, local delivery.

To be accessible to all users without barriers, your website should adhere to accessibility standards. Drupal 8 has a focus on them and offers advanced accessibility features. They include the use of WAI-ARIA attributes, accessible inline form errors, aural alerts, obligatory ALT text for images, and much more.  

Presence in multiple channels

Local businesses often benefit from the digital presence in multiple channels — imagine, for example, a pizza delivery mobile app connected to your website. 

Drupal 8 offers amazing opportunities to exchange your website’s data with third-party applications. It has five powerful built-in modules for creating REST APIs and sharing Drupal data in the JSON, XML, or other formats needed by the apps. 

Easy social media integration

It’s no longer possible to successfully manage a business without a social media presence. Drupal 8 allows third-party integration with any systems, and social networks are not an exception. 

It is incredibly easy to add social media icons to your website pages, provide social share buttons for your content, embed social media feeds, and much more. Social media modules in Drupal 8 are very numerous and useful. 

Among them, Easy Social, AddToAny Share Buttons, Social media share, Social Media Links Block and Field, and many more. In addition, there are network-specific modules like Video Embed Instagram, Pinterest Hover button, Facebook Album, and plenty of others.

Social media posts can also be embedded in your content using the Drupal 8 core Media module as a basis. 

Build a local business website on Drupal 8 with us!

The above reasons to build a local business website on Drupal 8 are just a few of a thousand. Contact our Drupal team and let’s discuss in more detail how we can help your local business flourish!

Aug 08 2019
Aug 08

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

In our latest interview, Ricardo Amaro of Acquia reveals how his discovery of Drupal has enabled him to work on projects he enjoys and that make a meaningful impact. Read on to learn more about his contributions and what the Drupal community in Portugal is like. 

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My name is Ricardo Amaro. I live with my wife and 2 kids in Lisbon, Portugal. I’ve been working for Acquia since 2011 and was recently promoted to Principal Site Reliability Engineer, where we deal with all the challenges of helping ~55k Drupal production sites grow every day.

I’ve been contributing in several aspects to the Drupal Community and sometimes that effort goes beyond. An example of that is the published co-authoring of the “Seeking SRE” book (O’Reilly) with my chapter about Machine Learning for SRE, since that main idea came out of a presentation I did at DrupalCon Vienna 2017 explaining how automation and machine learning could help increase reliability on Drupal sites. 

Other projects I’ve initiated in the past within the Drupal community include:

On the local front I founded the Portuguese Drupal Association 8 years ago and I am its current elected president. That same year we organized our first DrupalCampLisbon2011. Nowadays we organize DrupalDays and Camps all over the country and meet regularly on Telegram and video-conferences. Last year we organized DrupalDevDays Lisbon 2018 which was a really good turn out for the entire community.

My main drivers are a passion for Free Software and Digital Rights. That started back in the 90’s when I found myself struggling with the proprietary/closed software available at the time, and installing Linux/Slackware in 1994 was an enlightening moment to my own question “isn’t there a better option?”. But I only switched all my machines to Linux in 2004 and that’s what I’ve used up to now. Because I think the GNU/Free Software ecosystem, where Drupal was able to grow, is fragile and needs to be nourished by all of us.

I have a degree in Arts and a second one in Computer Science & Engineering and I’m now taking a master in Enterprise Information Systems.

Before Acquia, I worked both in the public sector and in the private sector in Portugal, applying Agile techniques and encouraging the DevOps culture. I’ve managed teams, development projects and operations also in South Africa and around Europe. 

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I came across Drupal in 2008, when searching for an OpenSource CMS software in order to create some Media Publishing sites for the company I was working for back at that time. My role as an IT Director was not easy, since the company was struggling with funding, so Drupal 6 was an amazing tool that enabled us to grow several of the sites and particularly create a self service on our main classified advertisement sites.

I found the Drupal Portuguese community at that time struggling to have a legal entity and to be able to grow and organize events inside the country. Portugal has always been mostly monopolized by large corporations like Microsoft and Oracle, while Free software has always been seen as “experimental” solutions, at best.

I took upon myself the commitment to bring the local Drupal community the pride and success they all deserve. I’ve grown a friendship for each and every person in our community and now I couldn't imagine myself without them, as I couldn't imagine myself without Drupal.

3. What impact has Drupal made on you? Is there a particular moment you remember?

Putting it simply: Drupal changed my life! Drupal brought justification to my values and aspirations. I honestly couldn’t have imagined, in a world that is more and more inclined to monopolistic visions, being able to exercise and contribute to the Free Software community and make a living out of it.

The particular moment I felt this more strongly the first time was around 2011 when some decision makers from one of these large corporations asked me if I could bring my Drupal presentation to them at the time, because they wanted to know what this Drupal thing was all about. So I organized a few of my usual slides and took them with me.

This was in a very fancy villa in one of the most expensive areas near Lisbon. I did my pitch and by the end they seemed very impressed with what Drupal had to offer for free: so many powerful features, so much commitment. Naturally, one of their questions was how they could make their proprietary software, which had started to decline, embark on this positive wave of growth. My obvious answer was “release your code as open source”. They looked at me in disbelief, of course, and still invited me for a boat ride, which I declined politely.

I went back home and from time to time thought about that episode until it started to look like a mirage in the past. To my surprise, in the most recent years, that same corporation has started releasing open source code, created community projects and apparently changed their minds… 

4. How do you explain what Drupal is to other, non-Drupal people?

Drupal lets you turn big ideas into digital realities. An innovative web platform for creating engaging digital websites and experiences. Drupal is the world's most popular enterprise-class web content management system. It’s developed by more than 46,000 people that are part of the 1.3 million users registered on drupal.org.

Last year we had about 1,000 companies with 8,000 code contributions and this is reflected in millions of websites with 12% market share, plus an annual growth of 51%. If these people still had some more time I would present them the Drupal Pitch Deck. :)

5. How did you see Drupal evolving over the years? What do you think the future will bring?

From my perspective, Drupal has always been growing and even making positive bonds with other Free Software initiatives out there. One of the most interesting examples happened last year at Drupal Europe 2018 (11-14 Sept), where the founders of RocketChat and Nextcloud met and ended up announcing a partnership on the 17th of September…

We should follow that example and support more interaction and collaboration with other projects in our ecosystem. For starters we should make an effort to use tools like RocketChat (see https://drupalchat.me) and grow awareness that companies like Slack have 0, or even less, to do with our values and we don’t gain anything with crossing our arms and letting people be driven there. The future is open, the future is community and inclusion.

6. What are some of the contributions to open source code or to the community that you are most proud of?

For sure the ongoing effort that I do on the Drupal Portuguese Association to keep people motivated, things organized and events happening is the first one. The highlight of this was DrupalDevDays Lisbon 2018. The second one was the DrupalCI which was of major impact for Drupal8’s final release.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

I’m most excited about Containers and the power behind them. That is only possible because there is Gnu/Linux operating system supporting them. Kubernetes in particular is also of interest since it follows the reasoning of auto-scalability that we need for distributed systems. Drupal is flying to the future already with its headless/decoupled capabilities. I’m seeing containers even being applied to support machine learning algorithms and neural networks. 

Another thing that I’m particularly interested in is investigating better ways to make communities grow and ensure that they have the necessary tools to make that happen.  

My personal endeavor is, in the end, to see my kids grow in a healthy environment, rich in possibilities, and for that I need to keep information available for them and help the Free Software ecosystem stay alive. After all, what else is there that can guarantee our future human independence from “blackboxed” technology? If you can’t see, study or change the source, what role is left for you? 

 Drupal DevDays Lisbon 2018

Aug 08 2019
Aug 08

Back in early 2010, Jason Grigsby pointed out that simply setting a percentage width on images was not enough, and that you needed to resize these images as well for a better user experience. He showed that if you served the right-sized images on the original responsive demo site, more than 75% of the weight of those images could be shaved off on smaller screens.

Ever since, the debate on responsive images has evolved around what the best solution is for rendering perfect responsive images without any hassle.

We all know how Drupal 7 does a great job in handling responsive images with its modules. However, with Drupal 8, things are even better now!

Responsive Images in Drupal 8

The Responsive Image module in Drupal 8 provides an image formatter that maps breakpoints to image styles in order to render a flawless responsive image using the picture tag.

When we observe how Drupal 8 handles responsive images when compared to Drupal 7, some of the features to be noted are:

  • Drupal 7 relies on the contributed Picture module, which in the latest version of the CMS became the Responsive Image module.
  • The Responsive Image and Breakpoint modules are part of Drupal core in the latest version of the CMS.

The Problem

One of the major problems with images in web development is that browsers know nothing about an image, and are clueless about what size it will be rendered at in relation to the viewport of different screens, until the CSS and JavaScript are loaded.

However, the browser does know about the environment in which the images are rendered, including the size of the viewport and the resolution of the screen.

The Solution 

As we mentioned in previous sections, responsive images use the picture element, which basically has sizes and srcset attributes that play a major role in letting the browser choose the best image based on the image style selections.

So Drupal 8 has done a great job in providing the Responsive Image module in core. It makes the browser download smaller images for devices with lower screen resolutions, resulting in better website load time and improved performance.

Steps to reproduce

  1. Enable the Responsive Image and Breakpoint modules.
  2. Set up the breakpoints for your project's theme.
  3. Set up the image styles for responsive images.
  4. Create a responsive image style for your theme.
  5. Assign the responsive image style to an image field.

Enable Responsive images and breakpoint module

Since they are part of the Drupal 8 core, no extra module is required. All you have to do is enable the Responsive Image module; the Breakpoint module is installed with the standard profile. If it was not, enable the Breakpoint module as well.

To enable the modules, go to Admin -> Extend, select them, and click ‘Install’.

extend page
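If you prefer the command line, Drush can enable them as well. This is a minimal sketch, assuming a working Drush setup (the Breakpoint module is pulled in automatically as a dependency of Responsive Image):

$ drush en responsive_image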

Setup the breakpoints for your project's theme
 

breakpoints

Setting up the theme’s breakpoints is the most important part of making your site responsive.


If you are using a core theme like Bartik, Seven, Umami, or Claro, you will already have the breakpoints file and you don’t have to create any new ones.

However, if you are using a custom theme for your project, it is important that you define the breakpoints in "yourthemename.breakpoints.yml", which lives in your theme directory, usually "/themes/custom/yourthemename".

Each breakpoint assigns images to a media query. For example, images rendered on mobile might be smaller, i.e. have a width of less than 768px, whereas medium screens will have a width between 768px and 1024px.


Each breakpoint has:

label: A human-readable label for the breakpoint.
mediaQuery: The media query describing the viewport within which the images are rendered.
weight: The order of display.
multipliers: A measure of the viewport's device resolution; normally 1x is used for standard displays and 2x for retina displays.
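Putting these keys together, here is a minimal sketch of a custom theme's breakpoints file, assuming a hypothetical theme named mytheme (mytheme.breakpoints.yml):

mytheme.mobile:
  label: Mobile
  mediaQuery: ''
  weight: 0
  multipliers:
    - 1x
mytheme.medium:
  label: Medium
  mediaQuery: 'all and (min-width: 768px)'
  weight: 1
  multipliers:
    - 1x
mytheme.wide:
  label: Wide
  mediaQuery: 'all and (min-width: 1024px)'
  weight: 2
  multipliers:
    - 1x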

Setting up the image styles for responsive images

Now we have to create an image style for each of the breakpoints. You can configure your own Drupal 8 image styles at Admin -> Config -> Media -> Image styles.

Click ‘Add image style’. Give your image style a valid name and use the ‘Scale and crop’ effect, which will provide cropped images. If the images appear stretched, add multiple image styles for the different viewports.

add image style

Creating a responsive image style for your theme 

This is where you provide the multiple image style options to the browser and let the browser choose the best out of the lot. 

responsive-image-styleresponsive image


To create a new responsive image style, navigate to:
Home -> Admin -> Config -> Media -> Responsive image styles and click ‘Add responsive image style’.

Give a valid name for your responsive image style, select the breakpoint group (choose your theme), and assign the image styles to the breakpoints listed.

There are multiple options for the image style configuration:

  • Choose single image style: select the single image style that will be rendered on the particular screen.
  • Choose multiple image styles: select multiple image styles and also specify the viewport width for each image style.

Finally, there is an option to select a fallback image style. The fallback image style should only appear on the site if an error occurs.

fallback responsive image

Assigning the responsive image style to an image field 

  • Once all the configurations are done, move to the image field to assign the responsive image style.
  • To do that, go to the field’s ‘Manage display’ settings and select the responsive image style we created.
  • Add content and see the results on the page with the responsive image style applied.

assigning responsive image style to an image field

Final Results 

responsive image style to an image field

 The image at a minimum width of 1024px (For large Devices).

minimum width of 1024px

Image at minimum width of 768px (For Medium Devices).

Responsive image style

Image at maximum width 767px (For Small Devices).

Aug 08 2019
Aug 08

With Dries’ latest announcement on the launch of Drupal 9 in 2020, enterprises are in urgent need of upgrading from Drupal 7 and 8 to version 9.

Drupal 7 and 8 will reach their end of life in November 2021, and those who wish to stick to previous versions might face security challenges.

Eager but unsure what the process would be like? This comprehensive guide aims to simplify the entire Drupal migration process for easy implementation.

Getting Started with the Migration Process

When a site is upgraded to Drupal 7, the old database is upgraded in place to the Drupal 7 structure. However, a different approach is followed when a site is upgraded from Drupal 7 to Drupal 8: content and configuration are migrated into a new Drupal 8 site.

Upgrading D7 to D8

Step 1: Take back-up of your website

Start the migration process by making a local copy of your website. As making changes to the live site is not recommended, it is a best practice to keep all data safe by taking a backup locally on your machine.

Step 2: Install fresh new site

Install a new Drupal 8 site by downloading the latest version of Drupal 8 from drupal.org.

Drupal 8.7 is the latest release.

Install the latest release of Drupal 8 along with installing dependencies with Composer.
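As an illustrative sketch, a Composer-based install could use the community project template that was common at the time (the project name my_site is hypothetical):

composer create-project drupal-composer/drupal-project:8.x-dev my_site --no-interaction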

Step 3: Prepare your Drupal 8 website for the migration

Setup a local Drupal 8 website on your machine as a destination website for the migration process.

Step 4: Verifying the modules are in core and enabled

Ensure Migrate, Migrate Drupal and Migrate Drupal UI modules are enabled on your Drupal 8 site. This can be done by navigating to the ‘Extend’ tab of your website and ensuring all the above modules are present in the core.

Now, check the three modules and click the ‘Install’ button at the bottom of the page.


Step 5: Upgrade your website

Go to your website and append /upgrade to the website address (www.<yourwebsitename>.com/upgrade) and follow the instructions. Then click the ‘Continue’ button.


Step 6: Enter website details

After clicking ‘Continue’, a screen comes up asking for the source website's credentials, database location, and other details.


Step 7: Start the migration

If the database credentials to your source database are correct, the upgrade review page will appear on the Migrate UI. It will show the summary of the upgrade status for all installed modules on the old site.

As a site builder you should carefully review the modules that will not be upgraded and evaluate if your Drupal 8 site will work without the module.

Click the ‘Perform Upgrade’ button.

Tip: Don’t proceed and perform the actual upgrade without first installing the missing Drupal 8 module.

Tip: If you get ID conflict warnings

If you manually create a node on the Drupal 8 site before upgrading and the source Drupal 6/7 site has a node with the same ID, the migration system will overwrite the node that was manually created in Drupal 8.

If conflicting IDs are detected, a warning will be shown. You can either ignore it and risk losing data, or abort and take an alternative approach.

Depending on the size and types of content/configuration on the source site, the upgrade may take a very long time. Once the process is finished, you are directed to the site's front page with messages summarizing the results.

Upgrading D8 to D9

When it comes to migrating to Drupal 9 from Drupal 8, the process is much simpler. As D9 is an extended version of D8, it is much easier to upgrade. Read the complete guide to the Drupal 8 to Drupal 9 upgrade to understand the whole process.

Alternate Method: Migration using Drush Command

Upgrading to Drupal 8 using Drush is useful when migrating complex sites as it allows you to run migrations one by one and it allows rollbacks.

If you are using Composer to build your Drupal 8 site, then you may already have Drush installed. However, if not, then you can install Drush from command line as follows:

composer require drush/drush

To migrate using Drush you need to download and enable the contributed modules: Migrate Upgrade, Migrate Plus and Migrate Tools.

Ensure Drush is up to date (check with the command drush --version).

Now it’s time to start the migration through Drush with the following command:

drush migrate-upgrade --legacy-db-url=mysql://user:password@server/db --legacy-root=http://example.com --configure-only

Replace the below-mentioned values in the above command with your own:

  • ‘user’ is the username of the source database
  • ‘password’ is the source database user’s password
  • ‘server’ is the source database server
  • ‘db’ is the source database

Now check your migration status with the command drush migrate-status.

Import the data with the command drush migrate-import --all.

After a successful migration, go to Structure -> Migrations to check the status of each migration.


Click the ‘List migrations’ button next to the migration group ‘import from drupal 7’ to view all of the migrated data.

 


 

After clicking it, all the upgraded data will be visible. Click the ‘Execute’ button and the data will be imported.


 

Once you click the ‘Execute’ button, you will be redirected to a page with the options mentioned below.


The Import button imports all previously unprocessed records from the source into destination Drupal objects.

With this, we come to the end of our Drupal migration process. If the above steps are followed carefully, a website can be easily migrated to the latest version.

Srijan has more than 35 Acquia certified Drupal experts with expertise in migrating projects to newer versions of Drupal. Contact us to seamlessly get started with the latest Drupal version.

 

Aug 08 2019
Aug 08

We have already covered two of many ways to migrate images into Drupal. One example allows you to set the image subfields manually. The other example uses a process plugin that accomplishes the same result using plugin configuration options. Although valid ways to migrate images, these approaches have an important limitation: the files and images are not removed from the system upon rollback. In the previous blog post, we talked further about this topic. Today, we are going to perform an image migration that will clean up after itself when it is rolled back. Note that in Drupal, images are a special case of files. Even though the example migrates images, the same approach can be used to import any type of file. This migration will also serve as the basis for explaining migration dependencies in the next blog post.

Code snippet for file entity migration

File entity migrate destination

All the examples so far have been about creating nodes. The Migrate API is a full ETL framework able to write to different destinations. In the case of Drupal, the target can be other content entities like files, users, taxonomy terms, comments, etc. Writing to content entities is straightforward. For example, to migrate into files, the destination section is configured like this:

destination:
  plugin: 'entity:file'

You use a plugin whose name is entity: followed by the machine name of your target entity. Other possible values that could be used are user, taxonomy_term, and comment. Remember that each migration definition file can only write to one destination.
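
For example, a migration writing taxonomy terms instead of nodes only needs a different destination plugin. A minimal sketch, assuming a vocabulary whose machine name is tags:

destination:
  plugin: 'entity:taxonomy_term'
  default_bundle: tags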

Source section definition

The source of a migration is independent of its destination. The following code snippet shows the source definition for the image migration example:

source:
  constants:
    SOURCE_DOMAIN: 'https://agaric.coop'
    DRUPAL_FILE_DIRECTORY: 'public://portrait/'
  plugin: embedded_data
  data_rows:
    - photo_id: 'P01'
      photo_url: 'sites/default/files/2018-12/micky-cropped.jpg'
    - photo_id: 'P02'
      photo_url: ''
    - photo_id: 'P03'
      photo_url: 'sites/default/files/pictures/picture-94-1480090110.jpg'
    - photo_id: 'P04'
      photo_url: 'sites/default/files/2019-01/clayton-profile-medium.jpeg'
  ids:
    photo_id:
      type: string

Note that the source contains relative paths to the images. Eventually, we will need an absolute path to them. Therefore, the SOURCE_DOMAIN constant is created to assemble the absolute path in the process pipeline. Also, note that one of the rows contains an empty photo_url. No file can be created without a proper URL. In the process section, we will account for this. An alternative could be to filter out invalid data in a source clean up operation before executing the migration.

Another important thing to note is that the row identifier photo_id is of type string. You need to explicitly tell the system the name and type of the identifiers you want to use. The configuration for this varies slightly from one source plugin to another. For the embedded_data plugin, you do it using the ids configuration key. It is possible to have more than one source column as identifier, for example, if the combination of two columns (e.g., name and date of birth) is required to uniquely identify each element (e.g., person) in the source.
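
A sketch of what a composite identifier could look like with the embedded_data plugin, using hypothetical name and date_of_birth source columns:

ids:
  name:
    type: string
  date_of_birth:
    type: string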

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD migration dependencies introduction whose machine name is ud_migrations_dependencies_intro. The migration to run is udm_dependencies_intro_image. Refer to this article to learn where the module should be placed.

Process section definition

The fields to map in the process section will depend on the target. For files and images, only one entity property is required: uri. It has to be set as a reference to the file using stream wrappers. In this example, the public stream (public://) is used to store the images in a location that is publicly accessible by any visitor to the site. If the file were already in the system and we knew its path, the whole process section for this migration could be reduced to two lines:

process:
  uri: source_column_file_uri

That is rarely the case though. Fortunately, there are many process plugins that allow you to transform the available data. When combined with constants and pseudofields, you can come up with creative solutions to produce the format expected by your destination.

Skipping invalid records

The source for this migration contains one record that lacks the URL to the photo. No image can be imported without a valid path. Let’s account for this. In the same step, a pseudofield will be created to extract the name of the file out of its path.

psf_destination_filename:
  - plugin: callback
    callable: basename
    source: photo_url
  - plugin: skip_on_empty
    method: row
    message: 'Cannot import empty image filename.'

The psf_destination_filename pseudofield uses the callback plugin to derive the filename from the relative path to the image. This is accomplished using the basename PHP function. Also, taking advantage of plugin chaining, the system is instructed to skip processing the row if no filename could be obtained, for example, because an empty source value was provided. This is done by the skip_on_empty plugin, which is also configured to log a message indicating what happened. In this case, the message is hardcoded. You can make it dynamic to include the ID of the row that was skipped using other process plugins. This is left as an exercise for the curious reader. Feel free to share your answer in the comments below.

Tip: To read the messages logged during any migration, execute the following Drush command: drush migrate:messages [migration-id].

Creating the destination URI

The next step is to create the location where the file is going to be saved in the system. For this, the psf_destination_full_path pseudofield is used to concatenate the value of a constant defined in the source and the filename obtained in the previous step. As explained before, order is important when using pseudofields as part of the migrate process pipeline. The following snippet shows how to do it:

psf_destination_full_path:
  - plugin: concat
    source:
      - constants/DRUPAL_FILE_DIRECTORY
      - '@psf_destination_filename'
  - plugin: urlencode

The end result of this operation would be something like public://portrait/micky-cropped.jpg. The URI specifies that the image should be stored inside a portrait subdirectory inside Drupal’s public file system. Copying files to specific subdirectories is not required, but it helps with file organization. Also, some hosting providers might impose limitations on the number of files per directory. Specifying subdirectories for your file migrations is a recommended practice.

Also note that after the URI is created, it gets encoded using the urlencode plugin. This will replace special characters with an equivalent string literal. For example, é and ç will be converted to %C3%A9 and %C3%A7 respectively. Space characters will be changed to %20. The end result is an equivalent URI that can be used inside Drupal, as part of an email, or via another medium. Always encode any URI when working with Drupal migrations.

Creating the source URI

The next step is to assemble an absolute path for the source image. For this, you concatenate the domain stored in a source constant and the image’s relative path stored in a source column. The following snippet shows how to do it:

psf_source_image_path:
  - plugin: concat
    delimiter: '/'
    source:
      - constants/SOURCE_DOMAIN
      - photo_url
  - plugin: urlencode

The end result of this operation will be something like https://agaric.coop/sites/default/files/2018-12/micky-cropped.jpg. Note that the concat and urlencode plugins are used just like in the previous step. A subtle difference is that a delimiter is specified in the concatenation step. This is because, contrary to the DRUPAL_FILE_DIRECTORY constant, the SOURCE_DOMAIN constant does not end with a slash (/). This was done intentionally to highlight two things. First, it is important to understand your source data. Second, you can transform it as needed by using various process plugins.

Copying the image file to Drupal

Only two tasks remain to complete this image migration: download the image and assign the uri property of the file entity. Luckily, both steps can be accomplished at the same time using the file_copy plugin. The following snippet shows how to do it:

uri:
  plugin: file_copy
  source:
    - '@psf_source_image_path'
    - '@psf_destination_full_path'
  file_exists: 'rename'
  move: FALSE

The source configuration of the file_copy plugin expects an array of two values: the URI to copy the file from and the URI to copy the file to. Optionally, you can specify what happens if a file with the same name exists in the destination directory. In this case, we instruct the system to rename the file to prevent name clashes. This is done by appending the string _X to the filename, before the file extension. The X is a number starting at zero (0) that keeps incrementing until the filename is unique. The move flag is also optional. If set to TRUE, it tells the system that the file should be moved instead of copied. As you can guess, Drupal does not have access to the file system on the remote server. The configuration option is shown for completeness, but it does not have any effect in this example.

In addition to downloading the image and placing it inside Drupal’s file system, the file_copy plugin also returns the destination URI. That is why this plugin can be used to assign the uri destination property. And that’s it, you have successfully imported images into Drupal! Clever use of the process pipeline, isn’t it? ;-)

One important thing to note is that an image’s alternative text, title, width, and height are not associated with the file entity. That information is actually stored in a field of type image. This will be illustrated in the next article. To reiterate, the same approach used to migrate images can be used to migrate any file type.

Technical note: The file entity contains other properties you can write to. For a list of available options, check the baseFieldDefinitions() method of the File class defining the entity. Note that more properties can be available further up in the class hierarchy. Also, this entity does not have multiple bundles like the node entity does.
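
For instance, if the source provided extra data, a sketch of mapping a few of those properties could look like this (the source column names are hypothetical):

process:
  uri: source_column_file_uri
  filename: source_column_filename
  filemime: source_column_mime_type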

What did you learn in today’s blog post? Had you created file migrations before? If so, had you followed a different approach? Did you know that you can do complex data transformations using process plugins? Did you know you can skip the processing of a row if the required data is not available? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 07 2019
Aug 07

Decoupled Drupal 8 and GatsbyJS webinar

How did the City of Sandy Springs, GA improve information system efficiency with a unified platform? Join our webinar to see how we built this city on decoupled Drupal 8, GatsbyJS, and Netlify.

We'll explore how a “build-your-own” software approach gives Sandy Springs the formula for faster site speed and the ability to publish messages across multiple content channels — including new digital signage.

What You'll Learn

  • The City of Sandy Springs’ challenges and goals before adopting Drupal 8 

  • How Sandy Springs manages multi-channel publishing across the website, social media, and a network of digital signage devices.

  • Benefits gained from Drupal 8 and GatsbyJS, including a fast, reliable site, lower hosting costs, and ease of development for their team.

Speakers

Jason Green, Visual Communications Manager at City of Sandy Springs, and Mediacurrent Director of Front End Development Zack Hawkins share an inside look at the project.

Registration

Follow the City of Sandy Springs on the path to government digital innovation.  Save your seat today!

Aug 07 2019
Aug 07

Lately, you can often hear that Drupal 9 is coming. Drupal 8.7 was released in May, and Drupal 8.8 is planned for December 2019. At the same time, D9 is becoming a more and more hotly discussed topic in the Drupal world.

Drupal 9’s arrival perfectly fits into one of a thousand memes inspired by the Game of Thrones’ quote — “Brace yourself, winter is coming.” But is there a need to brace yourself because of D9? Well, it is promised to arrive easily and smoothly. Still, some important preparations are needed. Let’s review them in this post.

Drupal 9 is coming in June 2020

The year of the D9 release became known back in September 2018. Drupal creator Dries Buytaert announced it at Drupal Europe in Darmstadt. Later on, in December, the exact date was announced — Drupal 9 is coming on June 3, 2020!

What will happen to Drupal 7 and Drupal 8? Both D7 and D8 will reach their end-of-life in November 2021. This means the end of official support and no more updates in the functional and security areas. Some companies will come up with extended commercial support, but it’s way better to keep up with the times and upgrade. All the development ideas and innovations will be focused on “the great nine.”

The Drupal creator explained the planned release and end-of-life dates. In a nutshell, D8’s major dependency is the Symfony 3 framework that is reaching end-of-life in November 2021. Drupal 9 will ship with Symfony 4/5. So the Drupal team has to end-of-life Drupal 8 at that time, but they want to give website owners and developers enough time to prepare for Drupal 9 — hence the June 2020 release decision. 

Well, according to the timing, you need to be on Drupal 9 by November 2021. In the meantime, it is necessary to prepare.

Preparing for the coming of Drupal 9

1. How to prepare for Drupal 9 if you are on Drupal 8

Hearing that Drupal 9 is coming, many D8 website owners could wonder: “Hey, we have just had an epic upgrade from Drupal 7 to Drupal 8, and here we go again!”

Keep calm — everything is on the right track. Your upgrade from Drupal 8 to Drupal 9 should be instantaneous. D9 will look like the latest version of D8, but without deprecated code and with third-party dependencies updated (Symfony 4/5, Twig 2, and so on).

Dries Buytaert's quote: we are building Drupal 9 in Drupal 8

There are two rules of thumb regarding the Drupal 9 preparations:

1) Using the latest versions of everything

To have a quick upgrade from Drupal 8 to Drupal 9, you need to stick to the newest versions of the core, modules, and themes. According to Gábor Hojtsy, Drupal Initiative Coordinator, you are gradually becoming a D9 user by keeping your D8 website up-to-date.

Gabor Hojtsy's quote: you become a Drupal 9 user by keeping up to date with Drupal 8.

“The great eight” has adopted a continuous innovation model, which means a new minor version every half a year. Our Drupal team is ready to help you with regular and smooth updates.

2) Getting rid of deprecated code

It is also necessary to keep your website clean of deprecated code. Deprecated code means APIs and functions that have newer alternatives and are marked as deprecated, or obsolete.

Any module that does not use deprecated code will just continue working in Drupal 9, Dries said.

Dries Buytaert's quote: without deprecated code websites will be ready for Drupal 9

How to discover deprecated code? Here are a few tools that check everything including custom modules:

  • The command-line tool Drupal Check, which scans your code for deprecations (see the example below)
  • The Upgrade Status contributed module, which offers a graphical interface to check your modules and themes and get a summary
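
A minimal sketch of the Drupal Check workflow, assuming a Composer-managed project with custom code under web/modules/custom:

# Install Drupal Check as a development dependency.
composer require mglaman/drupal-check --dev

# Scan your custom modules for deprecated code.
./vendor/bin/drupal-check web/modules/custom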

Many deprecations are very easy to replace. You can always rely on our development team to do a thorough check and clean up any deprecations.

2. How to prepare for Drupal 9 if you are on Drupal 7

The best way to prepare for the coming of Drupal 9 is to upgrade to Drupal 8 now. Even if this might sound like a marketing mantra to you, it has very practical and pragmatic grounds.

There are plenty of reasons to upgrade and no reason to skip Drupal 8. These are words from Dries Buytaert’s presentation:

Dries Buytaert's presentation: there are many reasons to upgrade to Drupal 8 now

You will enjoy a wealth of Drupal 8’s benefits for business all the time before 2021. And when Drupal 9 arrives, you will just snap your fingers and move ahead to it!

Gabor Hojtsy's quote: skipping Drupal 8 does not bring benefits

Don’t worry, despite the immense difference between D7 and D8, the 7-to-8 upgrades are getting easier every day. Developers have studied the D7-to-D8 upgrade path well. In addition, very helpful migration modules have recently reached stability in D8 core.

Your upgrade to Drupal 8 will depend on your website’s custom functionality and overall complexity. In any case, our Drupal developers will take care of making it smooth. 

So make up your mind and upgrade now — welcome to the innovative path that will lead you further to “the great 9.”

Plan for Drupal 9 with us!

Yes, Drupal 9 is coming with a sure step. No matter which version of Drupal you are using now, we can help you make the right Drupal 9 preparation plan — and fulfill it, of course. Just contact our Drupal experts!

Aug 07 2019
Aug 07

Website Refresh: The Only Thing Missing is a Purring Sound

The Animal Humane Society (AHS), in Minneapolis, Minnesota is the leading animal welfare organization in the Upper Midwest, helping 25,000 dogs, cats and critters in need find loving homes each year, while providing a vast array of services to the community, from low-cost spay and neuter services to dog training to rescuing animals from neglectful and abusive situations. 

TEN7 has been working with AHS since 2008, making piecemeal updates to their website and finding creative solutions for desired changes with a limited budget. In 2016, the Animal Humane Society wanted to reimagine the animalhumanesociety.org website as not just an adoption source, but a resource, an authority, and an advocate for all things related to companion animals and the community that loves them. 

One of the main goals was to include even more information to support pet owners and animal lovers, including more photos, videos and shareable content. Other goals were to integrate the separate Kindest Cut website (a low-cost spay and neuter clinic) into the main site, and improve functionality of the Lost and Found bulletin boards.

“We wanted the user experience on the site to match the user experience when people come to the shelter. That it would be colorful and emotional and warm and inviting, and that it would give people that same wonderful feeling that they have when they walk in the door at the [shelter] and see the puppies and kittens.”—Paul Sorensen, Director of Brand and Communications, Animal Humane Society

To give AHS the increased functionality they desired (like the enhanced image and video capabilities), we embarked on building a complex Drupal 8 site from scratch. It was more than just a one-and-done update, however. Over a nine-year period, the site had evolved from a manually-updated custom CMS to a new Drupal 5 installation, and later Drupal 6. Additional functionality and one-off customizations to the codebase had created a great deal of technical debt, making the site difficult to maintain and support. 

Drupal 8 functionality allowed us to scrap some custom code, while in other cases we were able to replace custom code with contributed modules developed by the Drupal community. 

Integration with PetPoint (the animal information database) under Drupal 6 was challenging, requiring custom code from beginning to end. We were able to use Drupal 8’s built-in functionality to talk to PetPoint in a more standards-based way, which meant far less custom code.

As we were making these updates, we also followed best practices and implemented coding standards for the new site, which reduced the amount of technical debt that was created.

We launched the site in the summer of 2017, and although there were some hiccups, results were immediate: people LOVED the bold photos, video and shareable content. As a result of the site update, more Minnesotans are:

  • Visiting the website and staying longer. Traffic is up 8.5% from the previous year, and the average visit is over four minutes, up 8.6% from the previous year
  • Viewing animal profiles, with nearly 4 million views, leading to 10,751 animal adoptions
  • Sharing and responding to AHS content on social media, with double and triple-digit traffic increases on Twitter, Instagram, LinkedIn and Reddit
  • Donating online, with donations driven by site content up 18.2% from the previous year

We continue to support and collaborate with the Animal Humane Society, adding more functionality we couldn’t squeeze in during the big update, like setting up visitor accounts with the ability to “favorite” animals. And we still have to figure out how to make the site purr.

Aug 07 2019
Aug 07

We have presented several examples as part of this migration blog post series. They started very simple and have been increasing in complexity. Until now, we have been rather optimistic. Get the sample code, install any module dependency, enable the module that defines the migration, and execute it assuming everything works on the first try. But Drupal migrations often involve a bit of trial and error. At the very least, it is an iterative process. Today we are going to talk about what happens after import and rollback operations, how to recover from a failed migration, and some tips for writing definition files.

List of drush commands used in drupal migration workflows

Importing and rolling back migrations

When working on a migration project, it is common to write many migration definition files. Even if you were to have only one, it is very likely that your destination will require many field mappings. Running an import operation to get the data into Drupal is the first step. With so many moving parts, it is easy not to get the expected results on the first try. When that happens, you can run a rollback operation. This instructs the system to revert anything that was introduced when the migration was initially imported. After rolling back, you can make changes to the migration definition file and rebuild Drupal’s cache for the system to pick up your changes. Finally, you can do another import operation. Repeat this process until you get the results you expect. The following code snippet shows a basic Drupal migration workflow:

# 1) Run the migration.
$ drush migrate:import udm_subfields

# 2) Rollback migration because the expected results were not obtained.
$ drush migrate:rollback udm_subfields

# 3) Change the migration definition file.

# 4) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 5) Run the migration again
$ drush migrate:import udm_subfields

The example above assumes you are using Drush to run the migration commands, specifically the commands provided by Migrate Run or Migrate Tools. You pick one or the other, but not both, as the commands provided by the two modules are the same. If you had both enabled, they would conflict with each other and fail.

Another thing to note is that the example uses Drush 9. There were major refactorings between versions 8 and 9, which included changes to the names of the commands. Finally, udm_subfields is the id of the migration to run. You can find the full code in this article.

Tip: You can use Drush command aliases to write shorter commands. Type drush [command-name] --help for a list of the available aliases.
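
For instance, the basic workflow above could be shortened like this (a sketch; the exact aliases depend on the module providing the commands, so verify them with --help):

# Shorter alias for migrate:import.
$ drush mim udm_subfields

# Shorter alias for migrate:rollback.
$ drush mr udm_subfields

# Shorter alias for cache:rebuild.
$ drush cr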

Technical note: To pick up changes to the definition file, you need to rebuild Drupal’s caches. This is the procedure to follow when creating the YAML files using Migrate API core features and placing them under the migrations directory. It is also possible to define migrations as configuration entities using the Migrate Plus module. In those cases, the YAML files follow a different naming convention and are placed under the config/install directory. For picking up changes, in this case, you need to sync the YAML definition using configuration management workflows. This will be covered in a future entry.
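
As a sketch of that alternative workflow, assuming the migration is defined as a configuration entity inside a hypothetical custom module named my_module:

# Re-import only this module's configuration after editing the YAML file.
$ drush config:import --partial --source=modules/custom/my_module/config/install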

Stopping and resetting migrations

Sometimes, you do not get the expected results due to an oversight in setting a value. On other occasions, fatal PHP errors can occur when running the migration. The Migrate API might not be able to recover from such errors. For example, using a non-existent PHP function with the callback plugin. Give it a try by modifying the example in this article. When these errors happen, the migration is left in a state where no import or rollback operations could be performed.

You can check the state of any migration by running the drush migrate:status command. Ideally, you want them in Idle state. When something fails during import or rollback, you would get the Importing or Rolling back states. To get the migration back to Idle, you stop the migration and reset its status. The following snippet shows how to do it:

# 1) Run the migration.
$ drush migrate:import udm_process_intro

# 2) Some non recoverable error occurs. Check the status of the migration.
$ drush migrate:status udm_process_intro

# 3) Stop the migration.
$ drush migrate:stop udm_process_intro

# 4) Reset the status to idle.
$ drush migrate:reset-status udm_process_intro

# 5) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 6) Rollback migration because the expected results were not obtained.
$ drush migrate:rollback udm_process_intro

# 7) Change the migration definition file.

# 8) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 9) Run the migration again.
$ drush migrate:import udm_process_intro

Tip: The errors thrown by the Migrate API might not provide enough information to determine what went wrong. An excellent way to familiarize yourself with the possible errors is by intentionally breaking working migrations. In the example repository of this series, there are many migrations you can modify. Try anything that comes to mind: not leaving a space after a colon (:) in a key-value assignment; not using proper indentation; using wrong subfield names; using invalid values in property assignments; etc. You might be surprised by how the Migrate API deals with such errors. Also, note that many other Drupal APIs are involved. For example, you might get a YAML file parse error, or an Entity API save error. Once you have seen an error before, it is usually faster to identify the cause and fix it the next time.

What happens when you rollback a Drupal migration?

In an ideal scenario, when a migration is rolled back, it cleans after itself. That means it removes any entity that was created during the import operation: nodes, taxonomy terms, files, etc. Unfortunately, that is not always the case. It is very important to understand this when planning and executing migrations. For example, you might not want to leave taxonomy terms or files that are no longer in use. Whether any dependent entity is removed or not has to do with how plugins or entities work.

For example, when using the file_import or image_import plugins provided by Migrate File, the created files and images are not removed from the system upon rollback. When using the entity_generate plugin from Migrate Plus, the created entity also remains in the system after a rollback operation.

In the next blog post, we are going to start talking about migration dependencies. What happens with dependent migrations (e.g., files and paragraphs) when the migration for the host entity (e.g., node) is rolled back? In this case, the Migrate API will perform an entity delete operation on the node. When this happens, referenced files are kept in the system, but paragraphs are automatically deleted. For the curious, this behavior for paragraphs is actually determined by its module dependency: Entity Reference Revisions. We will talk more about paragraphs migrations in future blog posts.

The moral of the story is that the behavior of the migration system might be affected by other Drupal APIs. And in the case of rollback operations, make sure to read the documentation or test manually to find out when migrations clean after themselves and when they do not.

Note: The focus of this section was content entity migrations. The general idea can be applied to configuration entities or any custom target of the ETL process.

Re-import or update migrations

We just mentioned that the Migrate API issues an entity delete action when rolling back a migration. This has another important side effect. Entity IDs (nid, uid, tid, fid, etc.) are going to change every time you roll back and import again. Depending on auto-generated IDs is generally not a good idea. But keep it in mind in case your workflow might be affected. For example, if you are running migrations in a content staging environment, references to the migrated entities can break if their IDs change. Also, if you were to manually update the migrated entities to clean up edge cases, those changes would be lost if you rolled back and imported again. Finally, keep in mind that test data might remain in the system, as described in the previous section, which could find its way to production environments.

An alternative to rolling back a migration is to not execute this operation at all. Instead, you run an import operation again using the update flag. This tells the system that in addition to migrating unprocessed items from the source, you also want to update items that were previously imported using their current values. To do this, the Migrate API relies on source identifiers and map tables. You might want to consider this option when your source changes over time, when you have a large number of records to import, or when you want to execute the same migration many times on a schedule.
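
For example, re-running the earlier migration with the update flag could look like this (a sketch reusing the udm_subfields migration from the basic workflow):

$ drush migrate:import udm_subfields --update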

Note: On import operations, the Migrate API issues an entity save action.

Tips for writing Drupal migrations

When working on migration projects, you might end up with many migration definition files. They can set dependencies on each other. Each file might contain a significant number of field mappings. There are many things you can do to make Drupal migrations more straightforward. For example, practicing with different migration scenarios and studying working examples. As a reference to help you in the process of migrating into Drupal, consider these tips:

  • Start from an existing migration. Look for an example online that does something close to what you need and modify it to your requirements.
  • Pay close attention to the syntax of the YAML file. An extraneous space or wrong indentation level can break the whole migration.
  • Read the documentation to know which source, process, and destination plugins are available. One might exist already that does exactly what you need.
  • Make sure to read the documentation for the specific plugins you are using. Many times a plugin offers optional configurations. Understand the tools at your disposal and find creative ways to combine them.
  • Look for contributed modules that might offer more plugins or upgrade paths from previous versions of Drupal. The Migrate ecosystem is vibrant, and lots of people are contributing to it.
  • When writing the migration pipeline, map one field at a time. Problems are easier to isolate if there is only one thing that could break at a time.
  • When mapping a field, work on one subfield at a time if possible. Some field types like images and addresses offer many subfields. Again, try to isolate errors by introducing individual changes each time.
  • Commit to your code repository any and every change that produces the right results. That way, you can go back in time and recover a partially working migration.
  • Learn about debugging migrations. We will talk about this topic in a future blog post.
  • Seek help from the community. Migrate maintainers and enthusiasts are very active and responsive in the #migrate channel of the Drupal Slack workspace.
  • If you feel stuck, take a break from the computer and come back to it later. Resting can do wonders in finding solutions to hard problems.

What did you learn in today’s blog post? Did you know what happens upon importing and rolling back a migration? Did you know that in some cases, data might remain in the system even after rollback operations? Do you have a use case for running migrations with the update flag? Do you have any other advice on writing migrations? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 07 2019
Aug 07

Palantir is excited to return to Denver as a sponsor for DrupalCamp Colorado 2019, featuring a keynote from our CEO, Tiffany Farriss. Tiffany will be discussing the role of organizational culture and open source projects like Drupal in the success of tech companies. We hope to see you there!

  • Location: TBD
  • Date: August 3rd, 2019
  • Time: 9 AM - 10 AM MDT
Aug 07 2019
Aug 07

Identifying “Top Tasks”

The biggest negative factor of the previous ETF site’s user experience was its confusing menus. The site presented too many options and pathways for people to find information such as which health insurance plan they belong to or how to apply for retirement benefits, and the pathways often led to different pages about the same topic. Frequently, people would give up or call customer support, which is only open during typical business hours.

Palantir knew the redesign would have the most impact if the site was restructured to fit the needs of ETF’s customers. In order to guarantee we were addressing customers’ most important needs, we used the Top Task Identification methodology developed by customer experience researcher and advocate Gerry McGovern.

Through the use of this method, we organized ETF’s content by the tasks site users deemed most important, with multiple paths to get to content through their homepage, site and organic search, and related content.

Aug 07 2019
Aug 07

Open source looks very different now compared to 20 years ago, and with such a vast community of developers, it is difficult to define the exact role of a “good” open source citizen.

Palantir is thrilled to be participating in Keeping Open Source Open -- a panel including CEO Tiffany Farriss -- for a spirited discussion on open source strategy and the future of open source.

Other panelists include Zaheda Bhorat (Amazon Web Services) and Matt Asay (Adobe). The panel will air some of the strongest opinions on Twitter.

  • Time: 1:30 PM - 2:20 PM
  • Location: F150/151
Aug 07 2019
Aug 07

Our team is so enthusiastic to participate in the third iteration of Decoupled Days. Palantir is excited to sponsor this year’s event and continue to share our insights into Content Management Systems.

Join Senior Engineer and Technical Architect Dan Montgomery for a session on content modeling. He’ll break down:

  • How a master content model can enable scalable growth
  • How to create a standardized structure for content
  • How Drupal can function as a content repository that serves other products

You’ll walk away with an understanding of how to develop architecture and structures that are scalable for future, unknown endpoints.

  • Date: Thursday, July 18
  • Time: 9:00am
  • Location: Room 
Aug 07 2019
Aug 07

Design System artifacts go by many names - Living Style Guides, Pattern Libraries, UI Libraries, and just plain Design Systems. The core idea is to give digital teams greater flexibility and control over their website. Instead of having to decide exactly what all pages should look like in one big redesign and then sticking with those templates until the next redesign, a design system gives you a “lego box” of components the team can use to create consistent, beautiful interfaces. Component-based design is how you SCALE.

At Palantir we build content management systems, so we’ve named our design system artifact a “style guide” in a nod to the editorial space.

Our style guides are organized into three sections:

  1. 'Design Elements' which are the very basic building blocks for the website.
  2. 'Components' which combine design elements into working pieces of code that serve a defined purpose.
  3. 'Page Templates' which combine the elements and components into page templates that are used to display the content at destination URLs.

But how do we help our clients determine what the list of elements, components and page templates should be?

How to Identify Elements for Your Design System

In this post I’ll walk through how we worked with the University of Miami Health System to create a style guide that enabled the marketing team to build a consistent, branded experience for a system with 1,200 doctors and scientists, three primary locations, and multiple local clinics.

1. Start by generating a list of your most important types of content.

Why are people coming to your site? What content helps them complete the task they are there to do? This content list is ground zero for component ideation: how can design support and elevate the information your site delivers?

Table of content types

The list of content serving user needs is your starting point for components. In addition, we can use this list to identify a few page templates right off the bat:

  • Home page
  • Treatment landing page
  • Search page
  • Listing page: Search results, news, classes
  • Clinical trials landing page
  • Clinical trial detail page
  • Location landing page
  • Appointment landing page
  • Appointment detail page
  • Basic page (About us, contact us, general information)

This is just the start of the UHealth style guide; we ultimately created about 80 components and 17 page templates. But it gives you a sense of how we tackled the challenge!

2. Sort your list of important types of content into groups by similarities.

Visitors should be able to scan your website for the information they need, and distinctive component designs help them differentiate content without having to read every word. In addition, being rigorous about consistently using components for specific kinds of information creates predictable interfaces, and predictable interfaces are easy for your visitors to use.

In this step, you should audit the design and photo assets you have available now, and assess your capacity to create them going forward. If, for example, you have a limited photo library and no graphic artist on staff, you’ll want to choose a set of components that don’t heavily rely on photos and graphics.

Component example for UHealth site

In this example, we have three component types: News, Events/Classes, and a Simple Success story.

  1. News Component: This component has no images. This is largely about content management; UHealth publishes a lot of news, and they didn’t want to create a bottleneck in their publishing schedule by requiring each story to have a digital-ready photo.
  2. Events/Classes Component: This component has an option for images or a pattern. Because UHealth wants visitors to take action on this content by signing up, we wanted these to have an eye-catching image. Requiring a photo introduces a potential bottleneck in publishing, so we also gave them the option to make the image a pattern or graphic.
  3. Simple success story: This is the most visually complex component because successful health narratives are an important element of UHealth’s content strategy. We were able to create a complex component here because there’s a smaller number of success stories compared to news stories or classes and events. That means the marketing team can dedicate significant time and resources to making the content for this component as effective as possible.

3. Now that you’ve sorted your list by content, do a cross-check for functionality.

Unlike paper publications, websites are built to enable actions like searching, subscribing, and making appointments. Your component set should include interfaces for your functionality.

Some simple and common functions for the UHealth site included searching for a treatment by letter, map blocks, and step forms.

In a more complex example, the Sylvester Cancer Center included a dynamic “Find a lab” functionality that was powered by a database. We designed the template around the limitations of the data set powering the feature, rather than ideating the ideal interface. Search is another feature that benefits from planning during the design phase.

For example, these components for a side bar location search and a full screen location search require carefully structured databases to support them. The design and technical teams must be in alignment on the capacity and limits of the functionality underlying the interface.

4. Differentiate components by brand.

UHealth is an enormous health care system, and there are several centers of excellence within the system that have their own logos and distinct content strategies. As a result, we created several components that were differentiated by brand.

UHealth navigation bars

In this example, you see navigation interfaces that differ by brand and language. Incorporating the differentiated logos for the core UHealth system and the Centers of Excellence is fairly straightforward. But as you can see, the Sylvester Center also has three additional top nav options: Cancer treatments, Research, and For Healthcare Professionals.

That content change necessitated a different nav bar - you can see that it’s longer. We also created a component for the nav in Spanish, because in other languages you sometimes find that the menu labels are different lengths and need to be adjusted for. In this case, they didn’t, but we kept it as a reference for the site builders.

5. Review the list: can you combine any components?

Your overall goal should be creating the smallest possible set of components. Depending on the complexity and variety of your content and functionality, this might be a set of 100 components or it might be just 20. The UHealth Design System has about 80 components, and another 17 page templates.

The key is that each of the components does a specific job and is visually differentiated from components that do different jobs. You want clear visual differences that signal clear content differences to your audience, and you don’t want your web team spending time trying to parse minor differences - that’s not how you scale!

In my experience, the biggest stumbling block to creating a streamlined list of components is stakeholders asking for maximum flexibility and control. I’ve found the best way to manage this challenge is to provide stakeholders with the option to differentiate their fiefdoms through content rather than components.

UHealth component examples

In this example, we have the exact same component featuring different images, which allows for two widely different experiences. You can also enable minor differentiation within a component: maybe you can leave off a sub-head, or allow for two buttons instead of one.

6. Start building your design system and stay flexible.

The list you generated here will get you 80% of the way there, but as you proceed with designing and building your design system, you will almost certainly uncover new component needs. When you do, first double check that you can’t use an existing component. This can be a little tricky, because of course content can essentially be displayed any way you want.

At Palantir, we solve for this challenge by building our Style Guide components with real content. This approach solves for a few key challenges with building a design system:

  1. Showing the “why” of a component. Each component is designed for a specific type of content - news, classes, header, testimonial, directory, etc. This consistency is critical for scaling design: the goal is to create consistent interfaces to create ease of use for your visitors. By building our Style Guides with real content, we document the thought process behind creating a specific component.
  2. Consistency. Digital teams change and grow. We use content in our Style Guide to show your digital team how each component should be used, even if they weren’t a part of the original design process.
  3. Capturing User Testing. Some of our components, like menus, are heavily user-tested to ensure that we’re creating intuitive interfaces. By building the components with the tested content in place, we’re capturing that research and ensuring it goes forward in the design.
  4. Identifying gaps. If you’ve got a piece of content or functionality that you think needs a new component, you can check your assumptions against the Style Guide. Does the content you’re working with actually fit within an existing pattern, or is it really new? If it is, add it to the project backlog!

Outcomes

The most important takeaway here is that design systems let your web team scale. Through the use of design systems, your digital team can generate gorgeous, consistent and branded pages as new needs arise.

But don’t take our word for it! Tauffyt Aguilar, the Executive Director of Digital Solutions for Miller School of Medicine and UHealth, describes the impact of their new design system:

“One of the major improvements is Marketing’s ability to maintain and grow their site moving forward. Previously each page was designed and developed individually. The ability to create or edit pages using various elements and components of the Design System is a significant improvement in the turnaround time and efficiency for the Marketing department.”

My favorite example of a new page constructed with the UHealth design system is this gorgeous interface for the Sports Medicine Institute.

Sports Medicine homepage

The Sports Medicine audience has unique needs and interests: they are professional and amateur athletes who need to get back in the game. The UHealth team used basic components plus an attention-grabbing image to create this interface for finding experts by issue.

And ultimately, that’s Palantir’s goal: your digital team should have the tools to create gorgeous, effective websites.

Aug 07 2019
Aug 07

Palantir recently partnered with a patient engagement solutions company that specializes in delivering patient and physician education to deliver improved health outcomes and an enhanced patient experience. They have an extensive library of patient education content that they use to build education playlists which are delivered to more than 51,000 physician offices, 1,000 hospitals, and 140,000 healthcare providers - and they are still growing.

The company is in the process of completely overhauling their technical stack so that they can rapidly scale up the number of products they use to deliver their patient education library. Currently, every piece of content needs to be entered separately for each product it can be delivered on, which forces the content teams to work in silos. In addition, because they use a dozen different taxonomies and tagging content correctly requires a high level of context and nuance, tagging can only be done at the manager level or above. The company partnered with Palantir.net to remove these bottlenecks and plan for future scalability.

Aug 07 2019
Aug 07

Facilitating design workshops with key stakeholders allows them to have insight into the process of "how the sausage is made" and provides the product team buy-in from the get-go.

Join Palantir's Director of UX Operations, Lesley Guthrie, for a session on design workshops. She'll go over:

  • How to choose the right exercises 
  • How to play to the team skill sets
  • Ways to adjust the workshop to fit the needs of the project 

You'll learn how to sell the idea of the design workshop to stakeholders and how to collaborate with them on a solution that can be tested and validated with real users.

Aug 07 2019
Aug 07

Although web accessibility begins on a foundation built by content strategists, designers, and engineers, the buck does not stop there (or at site launch). Content marketers play a huge role in maintaining web accessibility standards as they publish new content over time.

“Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web.” - W3

Why Accessibility Standards are Important to Marketers

Web accessibility standards are often thought to assist audiences who are affected by common disabilities like low vision/blindness, deafness, or limited dexterity. In addition to these audiences, web accessibility also benefits those with a temporary or situational disability. This could include someone who is nursing an injury, someone who is working from a coffee shop with slow wifi, or someone who is in a public space and doesn’t want to become a nuisance to others by playing audio out loud.

Accessibility relies on empathy and understanding of a wide range of user experiences. People perceive your content through different senses depending on their own needs and preferences. If someone isn’t physically seeing the blog post you wrote or can’t hear the audio of the podcast you published, that doesn’t mean you as a marketer don’t care about providing that information to that audience; it just means you need to adapt the way you deliver it.

10 Tips for Publishing Accessible Content

These tips have been curated and compiled from a handful of different resources including the WCAG standards set forth by W3C, and our team of accessibility gurus at Palantir. All of the informing resources are linked in a handy list at the end of this post. 

1. Consider the type of content and provide meaningful text alternatives.

Text alternatives should help your audience understand the content and context of each image, video, or audio file. It also makes that information accessible to technology that cannot see or hear your content, like search engines (which translates to better SEO).

Icons to show image, audio, video

Types of text alternatives you can provide:

  • Images - Provide alternative text.
  • Audio - Provide transcripts.
  • Video - Provide captions and video descriptions in action.

This tip affects those situational use cases mentioned above as well. Think about the last time you sent out an email newsletter. If someone has images turned off on their email to preserve cellular data, you want to make sure your email still makes sense. Providing a text alternative means your reader still has all of the context they need to understand your email, even without that image.

2. Write proper alt text.

Alternative text or alt text is a brief text description that can be attributed to the HTML tag for an image on a web page. Alt text enables users who cannot see the images on a page to better understand your content. Screen readers and other assistive technology can’t interpret the meaning of an image without alt text.
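
In markup, that description lives in the alt attribute of the img tag. A minimal sketch with a hypothetical filename:

<img src="red-car-illustration.jpg" alt="Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.">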

With the addition of required alternative text, Drupal 8 has made it easier to build accessibility into your publishing workflow. However, content creators still need to be able to write effective alt text. Below I’ve listed a handful of things to consider when writing alt text for your content.

  • Be as descriptive and accurate as possible. Provide context. Especially if your image is serving a specific function, people who don’t see the image should have the same understanding as if they had.
  • If you’re sharing a chart or other data visualization, include that data in the alt text so people have all of the important information.
  • Avoid using “image of,” “picture of,” or something similar. It’s already assumed that the alt text is referencing an image, and you are losing precious character space (most screen readers cut off alt text at around 125 characters). The caveat to this is if you are describing a work of art, like a painting or illustration.
  • No spammy keyword stuffing. Alt text does help with SEO, but that’s not its primary purpose, so don’t abuse it. Find that happy medium between including all of the vital information and also including maybe one or two of those keywords you’re trying to target.
Example of good alt text: “Red car in the sky.”
Example of better alt text: “Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.”

3. Establish a hierarchy.

Upside down pyramid split into three sections labeled high importance, medium importance, low importance

Accessibility is more than just making everything on a page available as text. It also affects the way you structure your content, and how you guide your users through a page. When drafting content, put the most important information first. Group similar content, and clearly separate different topics with headings. You want to make sure your ideas are organized in a logical way to improve scannability and encourage better understanding amongst your readers.

4. Use headings, lists, sections, and other structural elements to support your content hierarchy.

Users should be able to quickly assess what information is on a page and how it is organized. Using headings, subheadings and other structural elements helps establish hierarchy and makes web pages easily understandable by both the human eye and a screen reader. Also, when possible, opt for using lists over tables. Tables are ultimately more difficult for screen reader users to navigate.

If you’re curious to see how structured your content is, scan the URL using WAVE, an accessibility tool that allows you to see an outline of the structural elements on any web page. Using WAVE can help you better visualize how someone who is using assistive technologies might be viewing your page.

5. Write a descriptive title for every page.

This one is pretty straightforward. Users should be able to quickly assess the purpose of each page. Screen readers announce the page title when they load a web page, so writing a descriptive title helps those users make more informed page selections.

Page titles impact:

  • Users with low vision who need to be able to easily distinguish between pages
  • Users with cognitive disabilities, limited short-term memory, and reading disabilities.

6. Be intentional with your link text.

Write link text that makes each link’s purpose clear to the user. Links should provide info on where you will end up or what will happen if you click on that link. If someone is using a screen reader to tab through 3 links on a page that all read “click here,” that doesn’t really help them figure out what each link’s purpose is and ultimately decide which link they should click on.

Additional tips:

  • Any contextual information should directly precede links.
  • Don’t use URLs as link text; they aren’t informative.
  • Avoid writing long paragraphs with multiple links. If you have multiple links to share on one topic, it’s better to write a short piece of text followed by a list of bulleted links.

EX: Use "Learn more about our new Federated Search application" not "Learn more".

7. Avoid using images of text in place of actual text.

The exact guideline set forth by W3 here is “Make it easier for users to see and hear content including separating foreground from background.” 

There are many reasons why this is a good practice that reach beyond accessibility implications. Using actual text helps with SEO, allows for on-page search ability for users, and creates the ability to highlight for copy/pasting. There are some exceptions that can be made if the image is essential to include (like a logo). Providing alt text also may be a solution for certain use cases.

8. Avoid idioms, jargon, abbreviations, and other nonliteral words.

The guideline set forth by W3 is to “make text content readable and understandable.” Accessibility aside, this is important for us marketers in the Drupal world, because it’s really easy to include a plethora of jargon that your client audience might not be familiar with. So be accessible AND client-friendly: if you have to use jargon or abbreviations, make sure you provide a definition of the word, link to the definition, or include an explanation of any abbreviations on first reference.

Think about it this way: if you are writing in terms people aren’t familiar with, how will they know to search for them? Plain language = better SEO.

9. Create clear content for your audience’s reading level.

For most Americans, the average reading level is a lower secondary education level. Even if you are marketing to a group of savvy individuals who are capable of understanding pretty complicated material, the truth is, most people are pressed for time and might become stressed if they have to read super complicated marketing materials. This is also important to keep in mind for people with cognitive disabilities, or reading disabilities, like dyslexia.

I know what you’re thinking, “but I am selling a complicated service.” If you need to include technical or complicated material to get your point across, then provide supplemental content such as an infographic or illustration, or a bulleted list of key points.

There are a number of tools online that you can use to determine the readability of your content, and WebAIM has a really great resource for guidelines on writing clearly.

10. Clearly label form input elements.

If you are in content marketing, chances are you have built a form or two in your time. Whether you’re creating those in Drupal or in an external tool like HubSpot, you want to make sure you label form fields clearly so that the user can understand how to complete the form. For example, noting expected data formats (such as day, month, year) is helpful. Also, required fields should be clearly marked. This is important for accessibility, and it also means you as a marketer end up with better data.
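
Here is a minimal sketch of a clearly labeled, required field with its expected data format (field names invented):

<label for="event-date">Event date (DD/MM/YYYY) <span aria-hidden="true">*</span></label>
<input id="event-date" name="event_date" type="text"
       required aria-required="true">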

Helpful Resources

Here are a few guides I've found useful in the quest to publish accessible content:

Aug 07 2019
Aug 07

Our testing approach was two-fold, with one underlying question to answer: what is the most intuitive site structure for users?

Test #1: Top Task survey

During the Top Task survey, we had users rank a list of tasks we believed they were trying to complete on the site, giving us visibility into their priorities. The results from this survey informed a revised version of the navigation labels and structure, which we then tested in the following tree test. The survey was conducted via Google Forms with existing Center audiences, aiming for 75+ completions.

We then used these audience-defined “top tasks” to inform the new information architecture, which we tested in our second test.

Test #2: IA tree test

During the tree testing of the Information Architecture, we stripped out any visuals and tested the outline of the menu structure. We began with a mailing list of about 2,500 people, split the list into two segments, and A/B tested the new proposed structure (Variant) against the current structure (Benchmark). Both trees were tested with the same tasks but with different labels and structure, to see which tree let people complete the tasks more quickly and successfully.

Aug 07 2019
Aug 07

Our team is always excited to catch up with fellow Drupal community members (and each other) in person during DrupalCon. Here’s what we have on deck for this year’s event:

Visit us at booth #709

Drop by and say hi in the exhibit hall! We’ll be at booth number 709, giving away some new swag that is very special to us. Have a lot to talk about? Schedule a meeting with us.

Palantiri Sessions

Keeping That New Car Smell: Tips for Publishing Accessible Content by Alex Brandt and Nelson Harris

Content editors play a huge role in maintaining web accessibility standards as they publish new content over time. Alex and Nelson will go over a handful of tips to make sure your content is accessible for your audience.


Fostering Community Health and Demystifying the CWG by George DeMet and friends

The Drupal Community Working Group is tasked with fostering community health. This Q&A format session hopes to bring to light our charter, our processes, our impact and how we can improve.


The Challenge of Emotional Labor in Open Source Communities by Ken Rickard

Emotional labor is, in one sense, the invisible thread that ties all our work together. Emotional labor supports and enables the creation and maintenance of our products. It is a critical community resource, yet undervalued and often dismissed. In this session, we'll take a look at a few reasons why that may be the case and discuss some ways in which open source communities are starting to recognize the value of emotional labor.

  • Date: Thursday, April 11
  • Time: 2:30pm
  • Location: Exhibit Stage | Level 4


The Remote Work Toolkit: Tricks for Keeping Healthy and Happy by Kristen Mayer and Luke Wertz

Moving from working in a physical office to a remote office can be a big change, yet it has a lot of benefits. Kristen and Luke will talk about transitioning from working in an office environment to working remotely: how to embrace the good things about remote work, but also ways in which you might need to change your behavior to mitigate the challenges and stay mentally healthy.

Join us for Trivia Night 

Thursday night we will be sponsoring one of our favorite parts of DrupalCon, Trivia Night. Brush up on your Drupal facts, grab some friends, and don't forget to bring your badge! Flying solo to DrupalCon? We would love to have you on our team!

  • Date: Thursday, April 11
  • Time: 8pm - 11:45pm
  • Location: Armory at Seattle Center | 305 Harrison Street

We'll see you all next week!

Aug 06 2019
Aug 06

We were honored to have two members of our team present sessions at Drupal North this past June in the beautiful and vibrant city of Montreal, Canada. Drupal North is an annual, free, three-day conference focusing on Drupal-related topics and the community that drives the Drupal Project forward.

We were pleased to make new connections while also reconnecting with friends and peers. As always, it was also a privilege to be given the opportunity to share our expertise.

Crispin Bailey
Director of UX + Design

[embedded content]

The American Foundation for the Blind (AFB), established in 1921, is an amazing organization that advocates on behalf of visually impaired persons, providing community, resources and opportunities.

This talk covered the process we went through to overhaul the site's messaging, content architecture, and visual design, culminating in a fully responsive HTML prototype and style guide that was used to implement a brand new, fully accessible Drupal 8 website.

Andrew Mallis
CEO

[embedded content]

Our client, the deYoung Museum, had an ambitious design project scope with tight delivery timelines that required us to be nimble and innovative. This talk covered the team’s journey, how we repurposed GatherContent as a CMS, and how we automated deployments via its workflow states using CircleCI and Netlify.

It also looked at the component architectures that empowered our client to own their digital stories and author them to the point where insights.famsf.org/gauguin received a Webby Honoree award.

Andrew Mallis
CEO

[embedded content]

Google Analytics provides a wealth of data, giving insight into a user’s needs and behaviors once they’re on your site. However, the amount of data it presents can be disorienting, leaving users unable to focus on key metrics or misreading the implications of critical indicators in relation to their goals.

In this talk, Andrew Mallis presented best practices to access the clearest, and most useful analytics data necessary to better tell your story.

We hope you find these talks helpful and insightful, and we look forward to seeing you at future events.

Aug 06 2019
Aug 06

Author’s Note: No rabbits were harmed in the writing of this post.

This summer, I had the opportunity to attend Decoupled Days 2019 in New York City. The organizers did a fabulous job of putting together an extremely insightful, yet approachable, program for the conference's third year.

In case you haven’t heard of the event before, Decoupled Days is somewhat of a boutique conference that focuses solely on decoupled CMS architectures, which combine a CMS like Drupal with the latest front-end web apps, native mobile and desktop apps, or even IoT devices. Given the contemporary popularity of universal JavaScript (used to develop both the front and back ends of apps), the conference also demands a strong interest in the latest JavaScript technologies and frameworks.

If you weren’t able to attend this year, and have the opportunity in the future, I would highly recommend the event to anyone interested in decoupled architectures, whether you’re a beginner or an expert in the area. With that in mind, here are a few of the sessions I was able to attend this year that might give you a sense of what to expect.

Christopher Bloom (Phase2) delivered an excellent session, giving all of us a crash course in TypeScript. He provided a very helpful live demo of how TypeScript can make it much easier and safer to write apps in frameworks like Vue.js, by allowing for real-time error-checking within your IDE (as opposed to at runtime in your browser) and providing the ability to leverage ES6 Class syntax and decorators together seamlessly.

Jamie Hollern (Amazee Labs) showed off how it’s possible to have a streamlined integration between your Drupal site and a fully-fledged UI pattern library like Storybook. By using the Component Libraries, GraphQL, and GraphQL Twig contributed modules in concert with a Storybook-based style guide, Jamie gave a fabulous demonstration of how you can finally stick to the classic DRY (Don’t Repeat Yourself) principle when it comes to a Drupal design system. Because Storybook supports front-end components built using Twig markup, and allows for populating those components using mock data in a GraphQL format, we can use those same Twig templates and GraphQL queries within Drupal with almost no refactoring whatsoever. If you’re interested in learning more about building sites with Drupal, GraphQL, and Twig, Amazee Labs has also published an “Amazing Apps” repo that acts as a sample GraphQL/Twig project.

For developers looking to step up their local development capabilities, Decoupled Days featured two sessions on invaluable tools for doing decoupled work on your local machine.

Kevin Bridges (Drud) showed how simple and straightforward it can be to use the DDEV command-line tool to quickly spin up a Drupal 8 instance (using the Umami demo site profile, for example), enable the JSON:API contributed module in order to expose Drupal’s data, and then install (using gatsby-cli) and use a local Gatsby.js site to ingest and display that Drupal data.

Matt Biilmann (Netlify) also demonstrated the newly launched Netlify Dev command-line tool for developers who use Netlify to host their static site projects. With Netlify Dev, you can spawn an entire local environment for the Gatsby.js site you just installed using DDEV, one that runs all routing rules, edge logic, and cloud functions locally. Netlify Dev even allows you to stream that local Gatsby.js site to a live URL (i.e., on Netlify) with hot-reloading, so that your remote teammates can view your site as you build it.

As usual, Jesús Manuel Olivas (WeKnow) is pushing the Drupal community further and further, this time by demonstrating his own in-house, Git-based CMS called Blaze, a platform that could potentially replace the need for Drupal altogether. Blaze promises to provide the lightest-weight back end possible for your already light-weight, static-site front end. In lieu of a relatively heavyweight back end like Drupal, Blaze would provide a familiar WYSIWYG editor and easy content modeling with a few clicks. The biggest wins from using any Git-based CMS, though, are the dramatically lower costs and lightning-fast change deployments (with immediate updates to your code repo and CI/CD pipeline).

Aug 06 2019
Aug 06

Drupal is certainly not the only open-source CMS game in town, but when consulting with clients about the best solution for the full range of their needs, it tends to be my go-to.

Here’s why: 

  • Architecture
  • Scalability
  • Database Views
  • Flexibility
  • Security
  • Modules
  • Search
  • Migration  

Architecture

Drupal 8 is built on modern programming practices, and of course, the same will be true for the June 2020 release of Drupal 9.

A Development – Test – Production environment is the default assumption with a Drupal 8 site. Too often, other CMS sites are managed as a single instance, which is a single point of failure. 

Also, Drupal comes with a built-in automated testing framework. Drupal 8 supports unit, integration, and system/functional testing using the PHPUnit framework. Drupal is built to be inherently extensible through configuration, so every data type can be templated without touching code to achieve fully customized, structured data collection.

Scalability

Drupal has proven to be scalable at the most extreme traffic levels. Weather.com, as just one example, is a Drupal site. Many of the Federal cabinet-level agencies using open source have built their web infrastructures on Drupal. Drupal has built-in functionality, such as a robust caching API and JS/CSS minification/aggregation to optimize page load speed.

Database Views

Drupal Views, which is in Drupal 8 core, is a powerful tool that allows you to quickly construct database views, with AJAX filtering and sorting included. This lets you build and publish lists of any data on your Drupal site, without needing a developer to do it for you.

Flexibility

There are several components of flexibility. 

Drupal 8 was built as an API-first CMS, explicitly supporting the idea that the display layer for content stored in a Drupal CMS may not be Drupal. The API-first design of Drupal 8 also means that it is easier to integrate Drupal with third-party applications, as the API framework is already in place.

Customers vary widely in the ways in which they currently consume content. We assume that new ways will emerge for consuming content in the future, and even though we may not be in a position to predict right now what that will look like, Drupal is well poised to support what comes next.

Security

The only totally secure CMS is the one installed on a server that is sitting at the bottom of the Mariana Trench, with no connectivity to anything! However, Drupal has been tested in the most rigorous and security-conscious environments across government and industry. With a dedicated security team managing not just Drupal core but also many popular modules, and the openness inherent in open source, Drupal is a solid, secure platform for any website.

Modules / Extensions

Drupal modules are created and contributed to the community because they solve a problem. If you have the same or a similar problem to solve, you may be a simple module install away from a solution. Also, all Drupal modules are managed and accessible through a single repository at Drupal.org, providing a critical layer of vetting and security.

Search

Current versions of Drupal come with powerful out-of-the-box search functionality. Also, Solr integration is plug-and-play with Drupal, allowing you to extend search to index documents, search across multiple domains, or build faceted search results to improve the user experience.

Migration

Face it: migration is not fun with any CMS. However, the Drupal 8 Migrate API (and the same will be true for Drupal 9) is highly capable of importing complex data from other systems. Simpler CMS platforms tend to offer simple migration for out-of-the-box content types (posts and pages), but not so much for complex data or custom content types.

Summary

Settling on the right CMS platform is often not an obvious choice. Weighing the relative benefits of every option can take time and calls for expert consultation. In instances where complexity increases, and there’s a need to integrate the CMS with outside data sources, I’ll admit to a Drupal bias. This is based on my experience of Drupal as a CMS framework that was designed specifically for the challenges of a mobile-first, API-driven, integrated digital environment.

Looking for further exploration into the relative merits of your open source CMS options? Contact us today for an insightful, informative and fully transparent conversation.

Aug 06 2019
Aug 06

Long articles with many sections often discourage users from reading. They start, then usually leave before reaching the halfway point.

To avoid this kind of user experience, I recommend grouping each section of your article into a collapsible tab. Readers can then digest the text in smaller pieces.

The Collapse Text Drupal 8 module adds a filter plugin to your text formats. You can then create collapsible text tabs with a tag system similar to HTML.

Read on to learn how to use this module!

Step #1. Install the Required Module

  • Open the terminal application of your computer
  • Go to the root of your Drupal installation (the composer.json file is located inside this directory)
  • Type the following command:

composer require drupal/collapse_text

  • Click Extend
  • Scroll down until you find the Collapse Text module and enable it
  • Click Install

Step #2. Create an Editor Role

  • Click People > Roles > Add role

  • Enter the Role Name Editor and click Save
  • Click the dropdown beside Editor and select Edit permissions

  • Check these permissions:
    • Comment
      • Edit own comments
      • Post comments
      • View comments
    • Contact
      • Use the site-wide contact form
    • Filter
      • Use the Full HTML text format
    • Node
      • Article: Create new content
      • Article: Delete own content
      • Article: Delete revisions
      • Article: Edit own content
      • Article: Revert revisions
      • Article: View revisions
      • Access the Content overview page
      • View published content
      • View own unpublished content
    • System
      • Use the administration pages and help
      • View the administration theme
    • Taxonomy
      • Tags: Create terms
      • Access the taxonomy vocabulary overview page
    • Toolbar
      • Use the toolbar
    • User
      • Cancel own user account
      • View user information
  • Click Save permissions

Step #3. Create a User with the New Editor Role

  • Click People > Add user
  • Create a user with the Editor role
  • Click Create new account

Step #4. Add the Plugin to the Text Format

  • Click Configuration > Text formats and editors

  • Click the Configure button for the Full HTML format

  • Enable the Collapsible text blocks filter and check that it comes after the other two filters specified in the description

The Full HTML format has these two filters disabled by default, so we are good to go.

  • Click Save configuration

Step #5. Create Content

  • Log out and log back in as the user with the Editor role

  • Click Content > Add content
  • Write a proper title for the node

The Tabs Structure

Each tab is declared between a pair of tags.

To show an opened tab (not collapsed at all), you put the text between the [collapse] and [/collapse] tags.

To show a collapsed tab, you put the text between the [collapsed] and [/collapsed] tags.

The opening [collapse] and [collapsed] tags support two attributes:

  • title
  • class

If you don’t specify a title attribute, the module will take the first title available between the [collapse]/[collapsed] tags.

It is possible to nest collapsible tabs.
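
Putting it all together, here is a sketch of the tag syntax described above (titles and class names are invented; check the module’s documentation for the exact attribute quoting):

[collapse title="Ingredients" class="recipe-section"]
This tab starts expanded.

  [collapsed title="Metric conversions"]
  This nested tab starts collapsed until the reader opens it.
  [/collapsed]
[/collapse]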

  • Finish editing the node form and click Save

The image is floated; that is a Bartik-specific style. Let’s apply some CSS.

Step #6. Basic Styling

Hint: I’m going to edit the original core theme files because I’m working on a sandbox environment. That is not recommended on a production server. As a matter of fact, it is not a good practice at all. If you want to improve your Drupal theming skills, take a look at this OSTraining class.

  • Open the file core/themes/bartik/css/components/field.css
  • Add this code to the end of the file:
@media all and (min-width: 560px) {
  .node .field--type-image {
    float: none;
  }
}
  • Open the file core/themes/bartik/css/components/node.css
  • Add this code to the end of the file:
/* Collapse Text Styles */
.open,
.shut {
  font-family: sans-serif;
}

.open {
  background: black;
  color: white;
}

.shut {
  background: #444;
  color: #CCC;
}

summary {
  background-color: red;
  color: transparent;
}

.nested1 {
  background-color: rgba(224, 110, 108, 0.25);
}
  • Save both files
  • Click Configuration > Performance > Clear all caches
  • Refresh the site

I hope you liked this tutorial. Thanks for reading!


About the author

Jorge has lived in Ecuador and Germany and is now back in his homeland, Colombia. He spends his time translating from English and German into Spanish. He enjoys playing with Drupal and other open-source content management systems and technologies.
Aug 06 2019
Aug 06

Did you know that the term “One-Stop-Shop” is one of the most clichéd marketing taglines to use, according to HubSpot? Thankfully, I came across that article before I sat down to write this one. So, I’m NOT going to say that a Drupal distribution is a “one-stop-shop” for a quick and easy way to launch your website. Let’s put it this way instead: if you want to build a Drupal website and are eager to see it go live quickly, while saving time on maintenance too, Drupal distributions are meant for you.

What is a Drupal distribution?

A Drupal distribution is an all-inclusive package to get your website up and running quickly. This package consists of Drupal core (basic features), installation profiles, themes (for customized designs), libraries (assets like CSS or JavaScript) and modules specific to an industry. For example, if you run a publishing company, a distribution like Thunder can help you speed up your development process. There you can find modules like Paragraphs, Media Entity and Entity Browser, and features like the Thunder admin theme, scheduled publishing and much more, all in one place.

Why should you use a Drupal 8 distribution?

Let me give you a few reasons:

  • You don’t have to scramble your way through thousands of Drupal modules only to find the few that you really need.
  • Configuring Drupal core is easier too, as most of it comes preconfigured.
  • The features and modules included in a Drupal distribution are time-tested, optimized and proven for quality.
  • Maintenance of a Drupal distribution is simpler because updates for all modules and features can be performed in one shot!
  • Since you don’t have to reinvent the wheel every time, you save time and precious resource hours, which also means you save money!
  • Now that you have saved some time, you can spend more of it customizing and personalizing these components to tailor-fit your business needs.

Top 15 Drupal Distributions (alphabetically sorted)

1. CiviCRM Starter Kit

The CiviCRM Starter Kit brings together the power of Drupal and the open-source CRM tool – CiviCRM. The popular CRM is used by more than 8000 organizations to centralize constituent communications. Along with core Drupal and CiviCRM, the distribution also packs in CiviCRM related modules like CiviCRM Cron, Webform CiviCRM, CiviCRM Clear All Caches, etc.

2. Commerce Kickstart

If you are looking to quickly get your e-commerce store up and running on the Drupal Commerce framework, this one’s for you. Commerce Kickstart is a Drupal distribution made for both Drupal 7 and Drupal 8 and is maintained by Centarro (previously Commerce Guys). The Commerce Kickstart 2.x version comes loaded with beautiful themes, a catalog, promotion engines, a variety of payment tools, utility tools, shipping and fulfillment tools, analytics and reporting tools, marketing tools, search configuration, a custom back-office interface and much more.

Source: https://www.drupal.org/project/commerce_kickstart

3. Conference Organizing Distribution

Creating a website for events and conferences gets easier with this Drupal distribution. Conference Organizing Distribution (COD) was made for Drupal 7 but is being actively ported to Drupal 8. With COD, you can -

  • Create and manage tickets for event registrations
  • Create announcements for paper submissions
  • Moderate session selections
  • Provide an option for attendees to vote for their favourite sessions
  • Schedule sessions on any day and place
  • Easily manage sponsorships
  • Manage events easily with a powerful event management dashboard
  • Keep track of multiple events and sessions
  • Sell tickets with Drupal Commerce

4. Contenta

This API-first Drupal distribution provides you with a framework that is API ready. It reduces the complexity and pain of using or trying decoupled/headless Drupal. Contenta also comes pre-installed with code and demo content, along with front-end application examples. Even if you are new to Drupal, Contenta offers simple and quick ways to get the Drupal CMS part ready so you can focus on the front-end frameworks you intend to use. If you’re looking for a complete solution for a headless Drupal project, ContentaJS is your best bet: it integrates Contenta CMS with Node.js for a powerful, high-performing digital experience.

5. Drupal Government Distributions (federal, regional, local)

The aGov Drupal 8 distribution was developed to meet the guidelines of the Australian government. It allows government bodies to follow standards like WCAG 2.0 AA, the Australian Government Web Guide, AGLS metadata and the Digital Service Standard. However, the developers of aGov, PreviousNext, no longer develop or support this distribution, as they are now focused on the GovCMS Drupal distribution. GovCMS was built on the foundation of aGov to build more secure, compliant and adaptable government websites.
The deGov Drupal 8 distribution was built for German government websites and uses Acquia Lightning to offer more valuable features and functionalities. Some features common to all the Drupal government distributions:

  • Meeting all government standards
  • Workbench moderation
  • Citizen engagement portals
  • Responsive design
  • Example content
  • Intranet/Extranet

6. Acquia Lightning

True to its name, Acquia Lightning is a lightweight Drupal 8 distribution that you can use to develop and deploy a website at lightning speed (up to 30% less development time!). Developed by Acquia, Lightning aims to provide full flexibility and a great authoring experience to editorial teams and content authors. Built on Drupal 8, it offers powerful features like page layouts, drag and drop of assets using Panels, rich text, media, slideshows, Google Maps, content scheduling and much more. You can also streamline the workflow process of publishing, reviewing, approving and scheduling content.

7. Open Atrium

Open Atrium is a Drupal distribution built specifically for organizations that want to create a collaborative intranet solution for social collaboration and knowledge management. It offers features like a drag-and-drop layout, events management (Events), document management (Files), issue tracking, granular access controls, media management, a work tracker (to monitor tasks and maintain transparency), and much more. It also offers responsive layouts and themes.

8. Open Academy

Built on the Panopoly base distribution, this Drupal distribution is tailor-made for higher-education websites and can be further extended and customized. It is an easy-to-use tool that does not require users to be technical; you can have a great website without any customizations at all! Open Academy consists of a Drupal 7 installation profile and features meant for managing courses, departments, faculty, presentations, news, events, publications and more. The themes provided are optimized and mobile-ready.

9. Open Social

Open Social is a Drupal 8 distribution that allows organizations to create intranets, online communities and other social portals easily. It is being used by hundreds of organizations including NGOs and government bodies to facilitate communication and connection with their volunteers, employees, members and customers. It also has features like multi-lingual support, private file system, social login, Geo-location maps, etc.

Source: https://www.drupal.org/project/social

10. Opigno LMS

The Opigno LMS distribution is a Learning Management System built on Drupal. It is an easily scalable solution built not just for universities but also for organizations looking to create e-learning solutions. It allows you to manage training paths organized into courses, activities and modules. It also provides features like adaptive learning paths, management of skill acquisition, quizzes, blended learning (online modules + in-house sessions + virtual classrooms), award certificates, forums, live meetings and more.

Source: https://www.drupal.org/project/opigno_lms

11. Panopoly

This is a base Drupal distribution, which basically means it acts as a foundation or base framework for many other distros to be built upon. Panopoly is powered by the magic of the Panels module and its features like the in-place editor, Panelizer, Fieldable Panels Panes, etc. The Panopoly package consists of contributed modules and libraries. It offers cross-browser, responsive layouts; drag-and-drop page customizations; a powerful, easy-to-use admin interface; and more. It can also be extended through many Panopoly apps.

12. Presto!

Want a Drupal 8 starter kit that can meet all your content management needs and get you up and running, presto? Count on Presto! What’s better, you can start using Presto right out of the box! It is power-packed with some great content features like intelligent content editing, a promo bar (inline alerts for news and announcements), a divider (for adding space), a carousel (interactive images), blocks, etc. It also ships with a responsive theme based on the Bootstrap framework that can be further customized to add more layouts. And it lets you easily integrate with Drupal Commerce to make selling on your website easier. With Presto, you can reduce development time by 20%!

13. Reservoir

Like Contenta, Reservoir is an API-first Drupal distribution for decoupling Drupal. With this tool, you can build content repositories ready to be consumed by front-end applications. It is packed with all the web service APIs necessary to create decoupled websites. Reservoir was developed to make Drupal more accessible to developers, to provide best practices for building decoupled applications, and to give developers with little or no Drupal experience a starting point for building a content repository. Reservoir uses JSON API (a specification for APIs in JSON) to interact with back-end content. It also ships with API documentation, OpenAPI format export (compatible with a plethora of tools) and a huge set of libraries, SDKs and references.

14. Thunder

This Drupal 8 distribution is designed exclusively for professional publishing. Thunder was originally designed for and by Hubert Burda Media. It is loaded with features meant for the publishing sector, like the Paragraphs module, drag and drop of content, Media Entity, Entity Browser, Content Lock, Video Embed Field, Facebook Instant Articles, Google AMP, LiveBlog, the Nexx.tv video player and much more. All of this comes along with Drupal core features and responsive themes.

Source: https://www.drupal.org/project/thunder

15. Varbase

Are you lost in a mountain of Drupal modules, wondering which ones to pick? Looking for a package that can jumpstart your web development process right away? Then Varbase is your go-to Drupal distribution! Varbase provides you with all the necessities: essential modules, features and configurations to speed up your time to market.
