Aug 13 2019
Aug 13

MWDS 2019 door sign with an arrow pointing up to the right.

It's always wonderful to have Drupal community members gather in my hometown. Ann Arbor, Michigan, USA has hosted the Midwest Drupal Summit (MWDS) every August since 2016. Previous MWDS events were held in Chicago, IL; Minneapolis, MN; and Madison, WI. This summit is three days of Drupal contribution, collaboration, and fun. The event is small but mighty. As with any contribution-focused Drupal event, MWDS is important because some of the most active community teams of the Drupal project dedicate time to work through challenges and celebrate what makes open source special.

Here are some of the topics I overheard discussed over the three days:

  • Drupal.org infrastructure - its current state and ideas for improvements

  • Coordination between the Composer Initiative and core maintainers

  • The Automatic Updates initiative sprinting on package signing

  • Ideas for Contribution Credit system enhancements, to expand recognition beyond activity that takes place on Drupal.org

  • Drupal core contributors and the Drupal Association engineering team collaborating on critical Drupal 9 release blockers

  • The Security Team talking through ongoing work

  • Gitlab merge requests workflow and UI

  • Connecting the work that contributors are doing across various projects

  • Fun social events

Group of Drupalers enjoy lunch together
Group lunch in Ann Arbor.

This opportunity to listen and overhear the thought and care that go into Drupal is one I appreciate. It was fantastic to hear a community member tell the Drupal Association team that they are "impressed with the gitlab work. I created a sandbox and the URL worked." It's one thing to see public feedback on the work done for the community, it's a whole other thing to hear it in person.

Contributors around a table and standing, having a discussion.
Contribution in several forms - talking through ideas, blockers, giving feedback and opinions are just a few ways to participate.

Local midwesterners who take the time to attend MWDS get an opportunity to dive into contribution on a variety of topics. There are always mentors and subject-matter experts ready to help. My own Drupal core commit happened at a past MWDS, where I gave feedback on an issue from the perspective of a content editor. This year, Wilson S. had a first-time commit on issue #3008029; there's usually at least one first-time commit at each summit. A memorable one was the time Megan (megansanicki) had her first commit, which was also a live commit by Angie Byron (webchick).

Here's what a few participants had to say about their experience:

"I feel inspired as I watch the local community and visitors organically interact and participate with the discussions held around them." ~ Matthew Radcliffe (mradcliffe)

“This was my first Drupal Code Sprint event. Meeting all the great people from near and afar in person was awesome. Matthew Radcliffe helped me overcome my apprehension of contributing to Drupal Core. I look forward to continuing contributing and connecting with the community.” ~ Phill Tran (philltran)

"As a recent re-transplant back to Michigan, I wanted to get back in touch with my local Drupal community. Being a FE dev, sometimes it's hard to find things to contribute via core or porting of modules. Some of the content analysis and accessibility testing was really interesting to me. As someone who has not contributed in the past @mradcliff was an excellent teacher on how to get the sprint environment up and running and how to get started on issues." ~ Chris Sands (chrissands)

"As part of the Drupal Association staff, I find MWDS is always a wonderful opportunity to connect with some key community members and create more alignment between DA initiatives and community-driven work. It's also a wonderful time to catch up with Drupal family." ~ Tim Lehnen (hestenet)

"Always the best event of the year." ~ xjm

“I am glad to have met everyone. I had a one-on-one mentoring session with Matthew Radcliffe. It’s priceless!” ~ Wilson Suprapto (wilsonsp)

Did I mention that Chris also became a Drupal Association member during this event?! Thanks Chris!

The Drupal Association engineering team members are in daily contact with the community online. However, in-person events allow for serendipity. The insight of community members with the expertise to help Drupal.org improve for everyone is right there in the room. For new contributors, the first step is simply any move you make to participate. I think this timely tweet I saw over the weekend sums it up:

The great thing about open source is you can often just contribute. Jump in and help - there's tons of documentation help needed, but open issues, bugs, etc. Start small but why not start today?

— Nick Ruffilo (@NickRuffilo) August 10, 2019

Special thanks to Michael Hess for organizing this event and Neha & Danny from University of Michigan for making sure everyone had a fantastic time.

Tim does the Treeline zipline
Tim at the Treeline, on a zipline!

For reference, here are the issues that moved forward during the event.

Aug 13 2019
Aug 13

When the open-source Accelerated Mobile Pages (AMP) project launched in October 2015, Google AMP was often compared to Facebook's Instant Articles. Both tech giants share a common goal: to make web pages load faster. While an AMP page can be reached with a regular web URL, Facebook's Instant Articles aimed only at easing the pain for the app's users. Teaming up with some powerful launch partners in the publishing and technology sectors, Google AMP set out to shape the future of content distribution on mobile devices.

Fast forward to today, and Google AMP is one of the hottest things on the internet. With over 25 million website domains having published more than 4 billion AMP pages, it did not take long for the project to become a huge success. Built around two main features, speed and support for monetization, AMP's implications are far-reaching for enterprise businesses, marketers, ecommerce, and organizations big and small. With great features, and given its origin as a Google initiative, it is no surprise that AMP pages are featured more prominently in Google SERPs.

What is AMP?

With the rapid surge in mobile users, providing a website-like user experience no longer cuts it. Today's mobile users come with shorter attention spans and varied internet speeds. Businesses can cater to each of these challenges with a fast-loading, lightweight, app-like website built with Google AMP.

AMP is an open-source framework that simplifies HTML, streamlines CSS rules, restricts the use of JavaScript (AMP's component library can be used instead), and delivers pages via the Google AMP Cache (a proxy-based content delivery network).
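To make this concrete, here is a minimal sketch of what an AMP page looks like (illustrative only; the mandatory amp-boilerplate CSS is omitted for brevity, so this is not a complete, valid AMP document):

<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <title>Minimal AMP page</title>
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- The required amp-boilerplate <style> block would go here. -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <!-- amp-img replaces <img>; custom JavaScript is replaced by AMP's component library. -->
  <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>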

Why AMP?

Impacting the technical architecture of digital assets, Google's open source initiative aims to provide streamlined web pages to mobile browsers and other apps.

It is Fast, like Really Fast

Google AMP loads about twice as fast as a comparable normal mobile page, with latency as low as one-tenth. Intended to provide the fastest experience for mobile users, AMP means customers can access content faster, and they are more likely to stay on the page to make a purchase or inquire about your service, because they know it won't take long.

An Organic Boost

Eligibility for the AMP carousel that sits above the other search results on the Google SERP results in a substantial increase in organic results and traffic, a major boost for the visibility of an organization. Though not responsible for increasing page authority or domain authority, Google AMP plays a key role in sending far more traffic your way.

ROI

The fact that AMP leverages, rather than disrupts, the existing web infrastructure of a website makes the cost of adopting it much lower than competing technologies. In return, Google AMP enables a better user experience, which translates to better conversion rates on mobile devices.

Drupal & AMP

With better user engagement, higher dwell time, and easy navigation between content, businesses are bound to drive more traffic with AMP-friendly pages and increase their revenue. The AMP module is especially useful for marketers, as it is a great addition to their Drupal SEO efforts.

AMP produces HTML that makes the web a faster place. Implementing the AMP module in Drupal is really simple. Just download, enable and configure!

Before you begin the integration, it helps to know the three main components of AMP for Drupal:

AMP Module: The AMP module mainly handles the conversion of regular Drupal HTML pages to AMP-compliant pages.

AMP Theme: You have probably come across AMP HTML and its standards, the ones responsible for making your content look effective and perform well on mobile. The Drupal AMP theme produces the markup required by these standards for websites looking to perform well in the mobile world. Consisting of the AMP base theme and the ExAMPle sub-theme, it also allows the creation of custom AMP pages; users can create their own AMP sub-theme from scratch, or modify the default ExAMPle sub-theme for their specific requirements.

AMP PHP Library: The Drupal AMP PHP Library analyzes the markup and handles the final corrections needed to make it AMP-compliant.

How to setup AMP with Drupal?

Before you integrate AMP with Drupal, you need to understand that AMP does not replace your entire website. Instead, at its essence, the AMP module provides a view mode for content types, which is displayed when the browser asks for an AMP version.

Download the AMP Module

With your local environment prepped, type the following terminal command:

drush dl amp, amptheme, composer_manager

This command will download the AMP module, the AMP theme and the Composer Manager module (if you do not already have Composer Manager).

If you have used Drupal 8, you are probably familiar with Composer and its function as a packaging tool for PHP that installs dependencies for a project. Here, Composer is used to install a PHP library that converts raw HTML into AMP HTML, and to get that library working with Drupal.

However, as the AMP module does not explicitly require Composer Manager as a dependency, alternate workflows can make use of module Composer files without using Composer Manager.
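For example, on a Composer-managed site you could skip Composer Manager entirely and pull the projects in with Composer itself (a sketch, assuming the drupal.org package repository is already configured for your project):

composer require drupal/amp drupal/amptheme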

Next, enable the items that are required to get started:

drush en composer_manager, amptheme, ampsubtheme_example

Before enabling the AMP module itself, an AMP sub-theme needs to be enabled. The default configuration for the AMP module sets the AMP Theme to ‘ExAMPle subtheme.’

How to Enable AMP Module?

The AMP module for Drupal can be enabled using Drush. Once the module is enabled, Composer Manager will take care of downloading the AMP PHP library and its dependencies.

drush en amp

Configuration

Once everything is installed and enabled, AMP needs to be configured using the web interface before the Drupal AMP pages can be displayed. First up, you need to decide which content types should have an AMP version. You might not need it for all of them. Enable a particular content type by clicking on the “Enable AMP in Custom Display Settings” link. On the next page, open the “Custom Display Settings” fieldset. Check the AMP box, then click Save.


Setting an AMP Theme

Once the AMP module and content types are configured, it is time to select a theme for AMP pages and configure it. The view modes and field formatters of the Drupal AMP module take care of the main content of the page. The Drupal AMP theme, on the other hand, changes the markup outside the main content area of the page.

The Drupal AMP theme also enables you to create custom styles for your AMP pages. On the main AMP config page, make sure that the AMP theme setting is set to the ExAMPle subtheme or the custom AMP subtheme that you created.


Aug 13 2019
Aug 13

We need to nudge governments to start funding and fixing accessibility issues in the Open Source projects that are being used to build digital experiences. Most governments are required by law to build accessible websites and applications. Drupal’s commitment to accessibility is why Drupal is used by a lot of governments to create ambitious digital experiences.

Governments have complex budgeting systems and policies, which can make it difficult for them to contribute to Open Source. At the same time, there are many consulting agencies specializing in Drupal for government, and maybe these organizations need to consider fixing accessibility issues on behalf of their clients.

Accessibility is one of the key selling points of Drupal to governments.

If an agency started contributing, funding, and fixing accessibility issues in Drupal core and Open Source, they’d be showing their government clients that they are indeed experts who understand the importance of getting involved in the Open Source community.

So I have started this blog post with a direct ask for governments to pay to fix accessibility issues without a full explanation as to why. It helps to step back and look at the bigger context: “Why should governments fix accessibility issues in Drupal Core?”

Governments are using Drupal

This summer’s DrupalGovCon in Washington, DC was the largest Drupal event on the East Coast of the United States. The conference was completely free to attend with 1500 registrants. There were dozens of sponsors promoting their Drupal expertise and services. My presentation, Webform for Government, included a section about accessibility. There were also four sessions dedicated to accessibility.

Besides presenting at DrupalGovCon, I also volunteered a few hours to help with Webform-related contribution sprints focusing on improving Drupal’s Form API’s accessibility. I felt that having new contributors help fix accessibility issues would be a rewarding first experience into the world of contributing to Drupal core.

Fixing accessibility issues in Drupal

At the Webform contribution sprint, Lera Atwater (leraa), a first-time contributor, started reviewing Issue #2951317: FAPI #radios form elements do not have an accessible required attribute. She reviewed the most recent patch and provided some feedback. I was able to re-roll the patch a few times to address the remaining issues and add some basic test coverage. The fact that Lera and I focused on one issue helped move it forward. Still, our solution needs to be reviewed by Andrew Macpherson (andrewmacpherson) or Mike Gifford (mgifford), Drupal’s accessibility maintainers, and then Alex Bronstein (effulgentsia) or Tim Plunkett (tim.plunkett), Drupal’s Form API maintainers. Getting an accessibility-related patch committed requires a lot of communication and work.

This experience made me ask…

How can we streamline the process for getting accessibility patches committed to Drupal core?

Streamlining the process of fixing accessibility issues

The short answer is we can’t streamline the process of documenting, fixing, and reviewing accessibility issues. These steps are required and needed to ensure the quality of code being committed to Drupal core. What we might be able to do is strive for more efficiency in how we manage accessibility-related issues and the steps required to fix them.

While working on this one accessibility issue for free in my spare time, I began to wonder...

What would happen if a paid developer collected and worked on multiple accessibility issues for a few weeks and managed to move these issues forward towards a resolution collectively?

First off, I can tell from experience that most Form API and accessibility-related issues in Drupal Core, as well as other Open Source projects, are very similar. Most accessibility issues have something to do with fixing or adding ARIA (Accessible Rich Internet Applications) attributes or keyboard access. A developer should be able to familiarize themselves with the overarching accessibility requirements and the fixes needed to address the multiple accessibility issues in Drupal core.

Second, developers tend to work better on focused tasks. Most developers contribute to Open Source in their spare time completing small tasks. Paying a developer to commit and focus on fixing accessibility issues as part of their job is going to yield better results.

Finally, having multiple tickets queued for core maintainers is a more efficient use of Andrew, Mike, Alex, and Tim’s time. Blocks of similar tickets can pass through the core review process more quickly. Also, everyone shares the reward of saying we fixed these accessibility issues.

Governments should pay to fix accessibility issues

I’d like to nudge people and organizations to get involved. In last month’s Webform 8.x-5.3 release, I settled on the direct message below within the Webform module’s UI.

My conclusion is that we need to directly ask people to get involved, and directly ask organizations to contribute financially (a.k.a. give money). I am admittedly uncomfortable asking people for money because I think to myself, “What if they say no?”

No one should say no to fixing accessibility issues.

The Drupal community cares about accessibility, governments have to care about accessibility, and governments rely on Drupal. Governments should pay to fix accessibility issues.

Talking Drupal and the U.S. Government

Before DrupalGovCon, the Talking Drupal podcast spoke with Abby Bowman about Drupal for Gov. They discussed the U.S. government’s usage and contribution to Drupal and Open Source. From Abby, I learned two interesting things about the U.S. government’s commitment to Open Source.

First, the U.S. government contributes code to Open Source via Code.gov. Second, the U.S. government requires all websites to be accessible, but there is no central department or team ensuring that all government websites are accessible. All of the U.S. government’s websites would immediately benefit from having a team dedicated to finding and fixing accessibility issues.

If you want to hear about my experience at DrupalGovCon, check out Talking Drupal #221 - Live from GovCon.

How can government and organizations start paying to fix accessibility issues?

The word “pay” is liberally used throughout this post to emphasize the need for governments and organizations to budget for and commit to spending resources on fixing accessibility issues. Either a government-related project needs to get someone on their team to fix accessibility issues, or it needs to nudge (a.k.a. pay) a vendor or outside developer to fix them.

We have to keep asking and experimenting with how we ask organizations to contribute.

Companies that work for governments should pay to fix accessibility issues

In an ideal world, governments should pay to fix accessibility issues. Realistically, some agencies in government can’t directly contribute to Open Source. As stated earlier, any outside vendor who works for the government can contribute to Open Source. Saying that “We care about accessibility and fix accessibility issues within Drupal” is a great slide to include in a pitch deck for a government project.

Maybe governments can mandate that vendors contribute to the Open Source projects being used in a government project.

What is realistically possible?

Realistically, we need to fix the accessibility issues in Drupal and Open Source projects. Everyone in the world should have equal access to digital information and services. Every government needs to ensure that the software they are relying on is accessible.

In Drupal and Open Source, we ask individuals, organizations, and governments to get involved. Nudging governments to fix accessibility issues in Drupal and Open Source is just a very direct ask to fix a very specific problem that affects everyone.

There are two immediate ways for governments to get involved in fixing accessibility issues. Either governments dedicate resources to address the problem or they push their vendors to start addressing accessibility issues. In the Open Source community, we need to further embrace and encourage this type of contribution.

Embracing and encouraging governments and organizations contributing to Open Source

In the Drupal community, we always acknowledge the individuals contributing to Open Source by listing maintainers and contributors on project pages and even in the software’s MAINTAINERS.txt. We do recognize organizations supporting a Drupal project, but maybe we need to do more. In this day and age, we put corporate names on stadiums. Open Source has reached a scale where the organizations and governments that contribute are as important as the individuals. Notably, in the case of enterprise-focused software like Drupal, where organizations are the target consumer, we need to figure out how to get these organizations involved and adequately acknowledged.

Acknowledging that we are already doing a lot

The Drupal community has accomplished something pretty amazing. We have one of the most accessible and secure Open Source Content Management Systems on the market. We care about accessibility and security and work hard to fix and improve them. As a community, we need to always strive to grow and evolve. Nudging governments to get more involved in fixing accessibility issues will help make our software more accessible to everyone.


Aug 13 2019
Aug 13

Today we will learn how to migrate dates into Drupal. Depending on your field type and configuration, there are various possible combinations. You can store a single date or a date range. You can store only the date component or also include the time. You might have timezones to take into account. Importing the node creation date requires a slightly different configuration. In addition to the examples, a list of things to consider when migrating dates is also presented.

Example syntax for date migrations.

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD date whose machine name is ud_migrations_date. The migration to execute is udm_date. Notice that this migration writes to a content type called UD Date and to three fields: field_ud_date, field_ud_date_range, and field_ud_datetime. This content type and fields will be created when the module is installed. They will also be removed when the module is uninstalled. The module itself depends on the following modules provided by Drupal core: datetime, datetime_range, and migrate.

Note: Configuration placed in a module’s config/install directory will be copied to Drupal’s active configuration. And if those files have a dependencies/enforced/module key, the configuration will be removed when the listed modules are uninstalled. That is how the content type and fields are automatically created.
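As a sketch, the enforced dependency inside one of those config/install files looks like this (using this example's module as the enforced module):

dependencies:
  enforced:
    module:
      - ud_migrations_date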

PHP date format characters

To migrate dates, you need to be familiar with the format characters of the date PHP function. Basically, you need to find a pattern that matches the date format you need to migrate to and from. For example, January 1, 2019 is described by the F j, Y pattern.

As mentioned in the previous post, you need to pay close attention to how you create the pattern. Upper and lowercase letters represent different things, like Y and y for the year with four digits versus two digits, respectively. Some date components have subtle variations, like d and j for the day with or without leading zeros, respectively. Also, take into account white spaces and date component separators. If you need to include a literal letter like T, it has to be escaped as \T. If the pattern is wrong, an error will be raised, and the migration will fail.
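A quick way to sanity-check a pattern before running the migration is to try it directly in PHP (a minimal sketch):

// Verify that the source value parses with the chosen pattern.
$date = DateTime::createFromFormat('F j, Y', 'January 1, 2019');
// Prints 2019-01-01 when the pattern matches; a mismatch returns FALSE.
echo $date ? $date->format('Y-m-d') : 'pattern mismatch';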

Date format conversions

For date conversions, you use the format_date plugin. You specify a from_format based on your source and a to_format based on what Drupal expects. In both cases, you will use the PHP date function's format characters to assemble the required patterns. Optionally, you can define the from_timezone and to_timezone configurations if conversions are needed. Just like any other migration, you need to understand your source format. The following code snippet shows the source and destination sections:

source:
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      node_title: 'Date example 1'
      node_creation_date: 'January 1, 2019 19:15:30'
      src_date: '2019/12/1'
      src_date_end: '2019/12/31'
      src_datetime: '2019/12/24 19:15:30'
destination:
  plugin: 'entity:node'
  default_bundle: ud_date

Node creation time migration

The node creation time is migrated using the created entity property. The source column that contains the data is node_creation_date. An example value is January 1, 2019 19:15:30. Drupal expects a UNIX timestamp like 1546370130. The following snippet shows how to do the transformation:

created:
  plugin: format_date
  source: node_creation_date
  from_format: 'F j, Y H:i:s'
  to_format: 'U'
  from_timezone: 'UTC'
  to_timezone: 'UTC'

Following the documentation, F j, Y H:i:s is the from_format and U is the to_format. In the example, it is assumed that the source is provided in UTC. UNIX timestamps are expressed in UTC as well. Therefore, the from_timezone and to_timezone are both set to that value. Even though they are the same, it is important to specify both configuration keys. Otherwise, the from timezone might be picked up from your server’s configuration. Refer to the article on user migrations for more details on how to migrate when UNIX timestamps are expected.

Date only migration

The Date module provided by core offers two storage options. You can store the date only, or you can choose to store the date and time. First, let’s consider a date only field. The source column that contains the data is src_date. An example value is '2019/12/1'. Drupal expects date only fields to store data in Y-m-d format like '2019-12-01'. No timezones are involved in migrating this field. The following snippet shows how to do the transformation.

field_ud_date/value:
  plugin: format_date
  source: src_date
  from_format: 'Y/m/j'
  to_format: 'Y-m-d'

Date range migration

The Date Range module provided by Drupal core allows you to have a start and an end date in a single field. The src_date and src_date_end source columns contain the start and end date, respectively. This migration is very similar to date only fields. The difference is that you need to import an extra subfield to store the end date. The following snippet shows how to do the transformation:

field_ud_date_range/value: '@field_ud_date/value'
field_ud_date_range/end_value:
  plugin: format_date
  source: src_date_end
  from_format: 'Y/m/j'
  to_format: 'Y-m-d'

The value subfield stores the start date. The source column used in the example is the same one used for the field_ud_date field. Drupal uses the same format internally for date only and date range fields. Considering these two things, it is possible to reuse the field_ud_date mapping to set the start date of the field_ud_date_range field. To do it, you type the name of the previously mapped field in quotes (') and precede it with an at sign (@). Details on this syntax can be found in the blog post about the migrate process pipeline. One important detail is that when field_ud_date was mapped, the value subfield was specified: field_ud_date/value. Because of this, when reusing that mapping, you must also specify the subfield: '@field_ud_date/value'. The end_value subfield stores the end date. The mapping is similar to field_ud_date except that the source column is src_date_end.

Note: The Date Range module does not come enabled by default. To be able to use it in the example, it is set as a dependency of the demo migration module.

Datetime migration

A date and time field stores its value in Y-m-d\TH:i:s format. Note it does not include a timezone. Instead, UTC is assumed by default. In the example, the source column that contains the data is src_datetime. An example value is 2019/12/24 19:15:30. Let’s assume that all dates are provided with a timezone value of America/Managua. The following snippet shows how to do the transformation:

field_ud_datetime/value:
  plugin: format_date
  source: src_datetime
  from_format: 'Y/m/j H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  from_timezone: 'America/Managua'
  to_timezone: 'UTC'

If you need the timezone to be dynamic, things get a bit harder. The from_timezone and to_timezone settings expect a literal value. It is not possible to read a source column to set these configurations. An alternative is that your source column includes timezone information like 2019/12/24 19:15:30 -07:00. In that case, you would need to tweak the from_format to include the timezone component and leave out the from_timezone configuration.
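For instance, for source values like 2019/12/24 19:15:30 -07:00, the mapping could look like this (a sketch; the src_datetime_offset column is hypothetical):

field_ud_datetime/value:
  plugin: format_date
  source: src_datetime_offset
  # 'P' parses an offset such as -07:00, so from_timezone is left out.
  from_format: 'Y/m/j H:i:s P'
  to_format: 'Y-m-d\TH:i:s'
  to_timezone: 'UTC'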

Things to consider

Date migrations can be tricky because they can be affected by things outside of the Migrate API. Here is a non-exhaustive list of things to consider:

  • For date and time fields, the transformation might be affected by your server’s timezone if you do not manually set the from_timezone configuration.
  • People might see the date and time according to the preferences in their user profile. That is, two users might see a different value for the same migrated field if their preferred timezones are not the same.
  • For date only fields, the user might see a time depending on the format used to display them. A list of available formats can be found at /admin/config/regional/date-time.
  • A field can always be configured to be presented in a specific timezone. This would override the site’s timezone and the user’s preferred timezone.

What did you learn in today’s blog post? Did you know that entity properties and date fields expect different destination formats? Did you know how to do timezone conversions? What challenges have you found when migrating dates and times? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 13 2019
Aug 13

Drupal Tome is a static site generator distribution of Drupal 8. It provides mechanisms for taking an entire Drupal site and exporting all the content to HTML for direct service. As part of a recent competition at SCDUG to come up with the cheapest possible Drupal 8 hosting, I decided to do a proof-of-concept level implementation of Drupal 8 with Docksal for local content editing, and Netlify for hosting (total cost was just the domain registration).

The Tome project has directions for setup with Docker, and for setup with Netlify, but they don’t quite line up with each other. I followed the Docker instructions, then the Netlify set, but had to chart my own course to get the site from the first project linked to the repo in the second. And since I’m getting used to using Docksal, when I had to fall back and do a bit of it myself, I realized it was almost painfully easy to set up.

The first step was to go to the Tome documentation for Netlify and set up an account and a site from the template. There is a button in those directions to trigger the Netlify setup; I’ve added one here as well (but if this one fails, check to see if they updated theirs):

Deploy to Netlify

Log in with GitHub or a similar service, and let it create a repo for your project.

Follow Netlify’s directions for setting up DNS so you can have the domain you want, and HTTPS (through Let’s Encrypt). It took a couple of hours for that detail to sort itself out, but it eventually worked. For this project I chose a subdomain of my main blog domain: tome-netlify.spinningcode.org

Next, go to GitHub (or whatever service you used) and clone the repository to your local machine. There is a generated README in that project, but the directions aren’t 100% correct if you aren’t cloning onto a machine with a working PHP environment. This is where I switched over to Docksal and ran the following series of commands:

fin init
fin composer install
fin drush tome:install
fin drush uli

Then log into your local site using the domain from docksal and the link from drush, and add some content.

Next we export the content from Drupal to send over to Netlify for deployment.

fin drush tome:static
git add .
git commit -m "Adding sample content"
git push

…now we wait while Netlify notices and builds the site…

If you look at the site a few minutes later the new content should be posted.

This is all well and good if I want to use the version of the site generated for the Netlify example, but I wanted to make sure I could do something more interesting. These days Drupal ships with an install profile called Umami that provides a more robust sample site than the more traditional Standard install.

So now let’s try to get Umami onto this site. Go back to the terminal and have Tome reset everything (it’ll warn you that you are about to nuke everything):

fin drush tome:init

…select Umami when it asks for a profile…and wait, because this takes a while…

Now just re-export the content and push it to your repo.

fin drush tome:static
git add .
git commit -m "Converting to Unami"
git push

And wait again, cause this also takes a few minutes…

The Umami home page on my subdomain hosted at Netlify.

That really was all that was involved for a simple site; you can see my repository on GitHub if you want to see all of what was generated along the way.

The whole process is pretty straightforward, but there are a few things that it helps to understand.

First, Netlify is actually regenerating the markup on their servers with this approach. The Drupal nodes and other entities are saved as JSON and then imported during the build. This makes the process reliable, but slow. Umami takes several minutes to deploy since Netlify is installing and configuring Drupal, loading the content, and generating the output. The build command provided in that template is clear enough to follow if you are familiar with Composer projects:

command = "composer install && ./vendor/bin/drush tome:install -y && ./vendor/bin/drush tome:static -l $DEPLOY_PRIME_URL" 

One upside of this is that you can use a totally unrelated domain for your local testing and have it adjust correctly to the production domain. When you are using Netlify’s branching workflow for managing dev, test, and production, it also protects your work that way.

My directions above load a standard Docksal container, which includes MySQL, because that’s quick and easy. But Tome falls back to using an SQLite database, since it can be more confident that is there. Again, this is reliable but slow. If I were going to do this on a more complete project, I’d want a smaller Docksal setup or to switch to using MySQL locally.

A workflow based on this approach might also struggle with concurrent edits or complex configuration on large sites. It would probably make more sense to have the content created on a hidden, but traditional, server and then run through a different workflow. But for someone working on a series of small sites that are rarely updated, it works well: a totally temporary instance of the site can be rapidly deployed to a device, have its content updated, be pushed out to production, and then be deleted locally until needed again.

The final detail to note is that there is no support for forms built into this solution. Netlify has support for that, and Tome has a module that claims to connect to that service, but I wasn’t able to quickly determine how to get it connected. I am confident there are solutions to this problem, but it is something that would take a little additional work.

Aug 12 2019
Aug 12

Drupal has pretty good multilingual support out of the box. It's also fairly easy to create new entities and just add translation support through the annotation. These things are well documented elsewhere and a quick search will reveal how to do that. That is not what this post is about. This post is about the UX around selecting which fields are translatable.

On the Content Language page at http://example.com/admin/config/regional/content-language you can select which fields on your nodes, entities and various other translatable elements will be available on non-default language edit pages. The section at the top is the list of types of translatable things. Checking these boxen will reveal the related section. You can then go down to that section and start selecting fields to translate, save the form and they become available. All nice and easy.

I came into the current project late and this is my first exposure to this area of Drupal. We have a few content types and a lot of entities. I was ticking the box for the entity I wanted to add, jumping to the end of the form and saving it. When the form came back though it was not selected. I could not figure out why. It wasn't until a co-worker used the form differently to me that the issue was resolved. Greg ticked the entity, scrolled down the page and found it, ticked some of the checkboxen in the entity itself and then saved the page. The checkbox was still ticked.

The UX on this is pretty good once you know how it works. It could be fixed fairly easily with a system message pointing out that your checkbox was not saved because none of the items it exposed were selected.

I feel a patch coming on…
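A minimal sketch of the idea, written as a contrib-style form alter (the hook implementation and message wording are hypothetical, not an actual core patch):

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_FORM_ID_alter() for the content language settings form.
 */
function mymodule_form_language_content_settings_form_alter(array &$form, FormStateInterface $form_state) {
  // Run our check after the form's own submit handlers.
  $form['#submit'][] = 'mymodule_content_language_settings_warn';
}

/**
 * Submit handler: warn when an entity type was ticked but none of its
 * fields were selected, so the checkbox silently fails to persist.
 */
function mymodule_content_language_settings_warn(array &$form, FormStateInterface $form_state) {
  // Placeholder check: real logic would compare each ticked entity type
  // against the translatable fields actually selected for it.
  \Drupal::messenger()->addWarning(t('Your selection was not saved because none of the fields it exposed were selected.'));
}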

Aug 12 2019
Aug 12

Sustainably boosting team performance by instilling transparency, cohesion, and pragmatism: that has been my role at Liip as a Scrum Master for the past five years. Today, I also help other organizations deliver their projects in a more effective and more human way.

A return to common sense

We begin our client engagements with a day of interactive training, presenting the ideas and values that form the foundation of working with Agility. During the first round of introductions, participants share what brings them there, along with their own learning goals. I remember the words an experienced project manager offered on one of these occasions:

Apparently common sense is now called Agility, so I came to get up to date.

She is not wrong. Agile practices come from common sense. So why do they need to be relearned and nurtured regularly?

Simple practices that endure

One of the most fitting metaphors is gardening. Only an orchard tended a little every day will bear fruit despite the swings of the weather, the appetite of insects, and the persistence of weeds. Watering the soil and pulling out harmful plants is not complicated. The hard part is the discipline to do it every day. Without that, circumstances and outside forces regain the upper hand and dictate what happens next.

For an Agile team, the practices are likewise very simple and very fruitful. For example:

  • Set aside a quarter of an hour every day to coordinate, face to face, around a clearly stated goal;
  • Visualize the work in a way everyone can understand;
  • Take stock from time to time, lifting your head from the handlebars to ask together what is working and what could be improved.

Agility is, above all, a handful of fundamental values such as transparency, collaboration, trust, continuous improvement, and learning through experience. These values take shape in different roles and tools depending on the approach you choose.

The challenges of the "real world"

We have all experienced a project strained by pressure from the market, from investors, or from the hierarchy. A state of emergency is declared, and we forget our common sense, the kind that looks after our relationships, the quality of our work, and our health in general. Then there are casualties: the employees, the project, or the company itself. Sometimes all three at once.

Clear and precise roles

This is why Scrum, one of the best-known variants of Agility, creates a role dedicated to keeping watch over a team for the long term: the Scrum Master. The Scrum Master is a leader at the service of the group. In my role as Scrum Master, performance and fulfillment are not opposites; they form one and the same state of grace. I strive to bring my team there and to keep it there.

We have practiced Scrum at Liip since 2009. This approach has allowed us to maximize the value delivered to our clients while building stable, close-knit teams.


An organization both rigorous and flexible

With Scrum, we do not work "against" the hierarchy, the market, or the investors. On the contrary, they are integrated into the process. At any moment there is a way to take their signals into account, while still holding a steady course for a given period of time. The Scrum framework allows any stakeholder to write into the "Backlog", a living document that records all the needs and ideas related to the project.

On the other hand, one and only one person has the authority to decide the order of priorities in what the team will build. That person is the "Product Owner". This role is dedicated to listening permanently to everything that makes up the project's environment. At the heart of Agility sits the notion of trust. The organization trusts the Product Owner to act as guarantor of the product vision.

A process built on iterations

The notion of iteration is fundamental to the practice of Scrum. We do not try to specify everything up front in a project, which is a lost cause anyway in a complex, changing environment. The team's attention is focused on the "smallest next step" that will allow it to gather feedback from the market or from users. We call this an increment, typically built in one to four weeks. At regular intervals, the solution under development can thus be tested. It is the opportunity to confront it with the real world and make the best possible decision about what to build next to deliver the most value.

The team building the product has a large say, because it is the team that forecasts what it thinks it can deliver in the coming iteration. This very often generates creative discussions with the Product Owner. Rather than putting the team under pressure to deliver a scope defined in advance and set in stone, we discuss together how to deliver the greatest business value with the available time.

The first step

All these principles and roles can sound like slightly naive daydreams, until you experience them. That was my case when I joined Liip. Agility works. It is a superpower for teams and for projects. To try it is to adopt it. We are happy to come to your offices for an introductory session. Or simply drop by for a coffee! We would be delighted to imagine the first step for your organization together.

Aug 12 2019
Aug 12
A special bird flying in space has the spotlight while lots of identical birds sit on the ground (lack of diversity)

At DrupalCon Seattle, I spoke about some of the challenges Open Source communities like Drupal often have with increasing contributor diversity. We want our contributor base to look like everyone in the world who uses Drupal's technology on the internet, and unfortunately, that is not quite the reality today.

One way to step up is to help more people from underrepresented groups speak at Drupal conferences and workshops. Seeing and hearing from a more diverse group of people can inspire new contributors from all races, ethnicities, gender identities, geographies, religious groups, and more.

To help with this effort, the Drupal Diversity and Inclusion group is hosting a speaker diversity training workshop on September 21 and 28 with Jill Binder, whose expertise has also driven major speaker diversity improvements within the WordPress community.

I'd encourage you to either sign up for this session yourself or send the information to someone in a marginalized group who has knowledge to share, but may be hesitant to speak up. Helping someone see that their expertise is valuable is the kind of support we need to drive meaningful change.

Aug 12 2019
Aug 12

Like many companies in our technology-enabled, globally connected environment, Promet Source operates with clients and team members all over the world. This reality creates a challenge for communications. The truth is, the more we put into our interactions, the more we get out of them.

I’ve been working with distributed teams for a long time. Even though I got very accustomed to joining video calls, until recently, I had opted to not turn my camera on. I guess it started a while back when I had my first virtual interactions with teammates. Probably due to shyness or my lack of knowledge of virtual communications, I tended to avoid the camera component. That's changed.

Eye-Opening Experience

Lately I’ve started using more face-to-face open communications with clients, collaborators and internally. It brings a higher level of empathy, honesty and receptiveness to the conversations. It’s been like going from a gray-scale image to a sudden, colorful world, and has provided an important step toward building trust and strengthening ties.

A couple weeks ago, I was on a call with a collaborator and a team member with whom I’ve been working for more than two months.

That day, I turned on my camera, and the dynamic quickly changed. We started having a candid conversation between the three of us. 

As we were chatting, my co-worker felt encouraged and also turned on his camera. We were able to see each other's faces for the first time in more than two months of working together. It made a difference.

Next, our collaborator followed suit and turned his camera on. The conversation was instantly raised up to another level. Once one of us opened up, others felt empowered to do the same. It was like a chain reaction or “Domino Effect.”

A New Dimension

We could comment about our surroundings, our clothes, our hair, what was going on in our lives and in our parts of the world!  We were able to get talking and build rapport so much more easily.

The collaboration on the call came alive, and we got more out of it than if we had not had the advantage of video.

Looking back on this conversation, it would be easy to say that it was video that made the difference, but that was only one aspect. It was empathy that drove the emotional connection. 

The cameras helped. We were also willing to open ourselves up to a more honest dialog, sharing something personal, becoming available and responsive to each other. 

The Key: Trust

Trust your teammates. Trust your clients. Trust your collaborators. Trust that there is value in what you have to share.

Here’s what I’ve concluded are the keys to successful interactions even when working across multiple time zones.

Lead by Example

Be confident and share honestly. Let other people see you and hear you. Let your emotions shine through your expressions (facial expressions, expressions through the tone of your voice, the words you choose, etc.) 

Open up and people will trust you, and they will be more likely to open up too.  The Domino Effect can be very exciting.

Leverage Human Interaction

Promet Source is a leading practitioner of human-centered design. We know what it means to design for humans and we facilitate human-centered design workshops all over the country to enhance effectiveness and outcomes.

Just as we consistently emphasize that we are designing for humans, we are careful to not lose sight of the fact that we are designing by humans.

Strengthen Teams through Sharing

Too often, the left-brain, technology-driven environment in which we operate ignores the powerful impact of the human element in all of our engagements. Even when separated by borders and time zones, efforts to connect on a personal level pay off in ways that are often unanticipated.

Have you found this to be the case? Share your thoughts in the comment section below on why and how connecting on a human level can drive better outcomes.

Sharing your thoughts and experiences can go a long way toward a greater sense of connection and community in our dispersed, digital world.
 

Aug 12 2019
Aug 12
Image of the Rosetta Stone

In Mastering Drupal 8 Multilingual: Part 1 of 3, we focused on planning for Drupal 8 multilingual and its impact on a project's timeline and budget.

In Part 2 (below), we cover everything you need to know to have a functioning multilingual site with no custom code. Part 3 of the series covers more advanced techniques for site builders and front-end developers.

Aug 12 2019
Aug 12

Sean is a strong believer in the open source community at large, and in the idea that working collaboratively is best for creating awesome projects. His community work extends to maintaining and building the BADCamp website, as well as helping to maintain Docksal, a tool used for managing development environments.

Aug 11 2019
Aug 11

Today we complete the user migration example. In the previous post, we covered how to migrate email, timezone, username, password, and status. This time, we cover creation date, roles, and profile pictures. The source, destination, and dependencies configurations were explained already. Therefore, we are jumping straight to the process transformations in this entry.

Example field mapping for user migration

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD users whose machine name is ud_migrations_users. The two migrations to execute are udm_user_pictures and udm_users. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. In particular, we depend on a Picture (user_picture) image field attached to the user entity. The word in parentheses represents the machine name of the image field.

The explanation below is only for the user migration. It depends on a file migration to get the profile pictures. One motivation to have two migrations is for the images to be deleted if the file migration is rolled back. Note that other techniques exist for migrating images without having to create a separate migration. We have covered two of them in the articles about subfields and constants and pseudofields.

Migrating user creation date

Have a look at the previous post for details on the source values. For reference, the user creation time is provided by the member_since column, and one of the values is April 4, 2014. The following snippet shows how the various user date related properties are set:

created:
  plugin: format_date
  source: member_since
  from_format: 'F j, Y'
  to_format: 'U'
changed: '@created'
access: '@created'
login: '@created'

The created entity property stores a UNIX timestamp of when the user was added to Drupal. The value itself is an integer representing the number of seconds since the epoch. For example, 280299600 represents Sun, 19 Nov 1978 05:00:00 GMT. Kudos to the readers who knew this is Drupal's default expire HTTP header. Bonus points if you knew it was chosen in honor of someone’s birthdate. ;-)

Back to the migration, you need to transform the provided date from Month day, year format to a UNIX timestamp. To do this, you use the format_date plugin. The from_format is set to F j, Y which means your source date consists of:

  • The full textual representation of a month: April.
  • Followed by a space character.
  • Followed by the day of the month without leading zeros: 4.
  • Followed by a comma and another space character.
  • Followed by the full numeric representation of a year using four digits: 2014.

If the value of from_format does not make sense, you are not alone. It is actually assembled from format characters of the date PHP function. When you need to specify the from and to formats, you basically need to look at the documentation and assemble a string that matches the desired date format. You need to pay close attention because upper and lowercase letters represent different things, like Y and y for the year with four digits versus two digits, respectively. Some date components have subtle variations, like d and j for the day with or without leading zeros, respectively. Also, take into account white spaces and date component separators. To finish the plugin configuration, you need to set the to_format configuration to something that produces a UNIX timestamp. If you look again at the documentation, you will see that U does the job.

The changed, access, and login entity properties are also dates in UNIX timestamp format. changed indicates when the user account was last updated. access indicates when the user last accessed the site. login indicates when the user last logged in. For brevity, the same value assigned to created is also assigned to these three entity properties. The at sign (@) means copy the value of a previous mapping in the process pipeline. If needed, each property can be set to a different value or left unassigned. None is actually required.

Migrating user roles

For reference, the roles are provided by the user_roles column, and one of the values is forum moderator, forum admin. It is a comma separated list of roles from the legacy system which need to be mapped to Drupal roles. It is possible that the user_roles column is not provided at all in the source. The following snippet shows how the roles are set:

roles:
  - plugin: skip_on_empty
    method: process
    source: user_roles
  - plugin: explode
    delimiter: ','
  - plugin: callback
    callable: trim
  - plugin: static_map
    map:
      'forum admin': administrator
      'webmaster': administrator
    default_value: null

First, the skip_on_empty plugin is used to skip the processing of the roles if the source column is missing. Then, the explode plugin is used to break the list into an array of strings representing the roles. Next, the callback plugin invokes the trim PHP function to remove any leading or trailing whitespace from the role names. Finally, the static_map plugin is used to manually map values from the legacy system to Drupal roles. All of these plugins have been explained previously. Refer to other articles in the series or the plugin documentation for details on how to use and configure them.

There are some things that are worth mentioning about migrating roles using this particular process pipeline. If the comma separated list includes spaces before or after the role name, you need to trim the value because the static map will perform an equality check. Having extraneous space characters will produce a mismatch.

Also, you do not need to map the anonymous or authenticated roles. Drupal users are assumed to be authenticated and cannot be anonymous. Any other role needs to be mapped manually to its machine name. You can find the machine name of any role on its edit page. In the example, only two out of four roles are mapped. Any role that is not found in the static map will be assigned the value null, as indicated in the default_value configuration. After processing, the null value will be ignored, and no role will be assigned. But you could use this feature to assign a default role in case the static map does not produce a match, as sketched below.
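For instance, assigning such a default could look like this at the tail end of the same pipeline (a sketch; the editor role name is hypothetical):

  - plugin: static_map
    map:
      'forum admin': administrator
      'webmaster': administrator
    # Unmatched legacy roles now fall back to a real role instead of null.
    default_value: editor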

Migrating profile pictures

For reference, the profile picture is provided by the user_photo column, and one of the values is P01. This value corresponds to the unique identifier of one record in the udm_user_pictures file migration, which is part of the same demo module.  It is important to note that the user_picture field is not a user entity property. The field is created by the standard installation profile and attached to the user entity. You can find its configuration in the “Manage fields” tab of the “Account settings” configuration page at /admin/config/people/accounts. The following snippet shows how profile pictures are set:

user_picture/target_id:
  plugin: migration_lookup
  migration: udm_user_pictures
  source: user_photo

Image fields are entity references. Their target_id property needs to be an integer containing the file ID (fid) of the image. This can be obtained using the migration_lookup plugin. Details on how to configure it can be found in this article. You could simply use user_picture as your field mapping, because target_id is the default subfield and can be omitted. Also note that the alt subfield is not mapped. If present, its value will be used for the alternative text of the image. If it is not specified, like in this example, Drupal will automatically generate alternative text from the username. An example value would be: Profile picture for user michele.
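
If your source did provide alternative text, you could map the subfield explicitly. A minimal sketch, assuming a hypothetical user_photo_alt source column:

user_picture/target_id:
  plugin: migration_lookup
  migration: udm_user_pictures
  source: user_photo
user_picture/alt: user_photo_alt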

Technical note: The user entity contains other properties you can write to. For a list of available options, check the baseFieldDefinitions() method of the User class defining the entity. Note that more properties can be available further up in the class hierarchy.

And with that, we wrap up the user migration example. We covered how to migrate a user’s mail, timezone, username, password, status, creation date, roles, and profile picture. Along the way, we presented various process plugins that had not been used previously in the series. We showed a couple of examples of process plugin chaining to make sure the migrated data is valid and in the format expected by Drupal.

What did you learn in today’s blog post? Did you know how to process dates for user entity properties? Have you migrated user roles before? Did you know how to import profile pictures? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series is made possible thanks to these generous sponsors. Contact us if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 10 2019
Aug 10

Today we are going to learn how to migrate users into Drupal. The example code will be explained in two blog posts. In this one, we cover the migration of email, timezone, username, password, and status. In the next one, we will cover creation date, roles, and profile pictures. Several techniques will be implemented to ensure that the migrated data is valid. For example, making sure that usernames are not duplicated.

Although the example is standalone, we will build on many of the concepts that had already been covered in the series. For instance, a file migration is included to import images used as profile pictures. This topic has been explained in detail in a previous post, and the example code is pretty similar. Therefore, no explanation is provided about the file migration to keep the focus on the user migration. Feel free to read other posts in the series if you need a refresher.

Example field mapping for user migration

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD users, whose machine name is ud_migrations_users. The two migrations to execute are udm_user_pictures and udm_users. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. In particular, it depends on a Picture (user_picture) image field attached to the user entity. The word in parentheses represents the machine name of the image field.

The explanation below is only for the user migration. It depends on a file migration to get the profile pictures. One motivation to have two migrations is for the images to be deleted if the file migration is rolled back. Note that other techniques exist for migrating images without having to create a separate migration. We have covered two of them in the articles about subfields and constants and pseudofields.

Understanding the source

It is very important to understand the format of your source data. This will guide the transformation process required to produce the expected destination format. For this example, it is assumed that the legacy system from which users are being imported did not have unique usernames. Emails were used to uniquely identify users, but that is not desired in the new Drupal site. Instead, a username will be created from a public_name source column. Special measures will be taken to prevent duplication, as Drupal usernames must be unique. Two more things to consider: first, source passwords are provided in plain text (never do this!); second, some elements might be missing in the source, like roles and profile pictures. The following snippet shows a sample record for the source section:

source:
  plugin: embedded_data
  data_rows:
    - legacy_id: 101
      public_name: 'Michele'
      user_email: '[email protected]'
      timezone: 'America/New_York'
      user_password: 'totally insecure password 1'
      user_status: 'active'
      member_since: 'January 1, 2011'
      user_roles: 'forum moderator, forum admin'
      user_photo: 'P01'
  ids:
    legacy_id:
      type: integer

Configuring the destination and dependencies

The destination section specifies that user is the target entity. When that is the case, you can set an optional md5_passwords configuration. If it is set to true, the system will take an MD5-hashed password and convert it to the encryption algorithm that Drupal uses. For more information on password migrations, refer to these articles for basic and advanced use cases. To migrate the profile pictures, a separate migration is created. The dependency of user on file is added explicitly. Refer to these articles for more information on migrating images and files and setting dependencies. The following code snippet shows how the destination and dependencies are set:

destination:
  plugin: 'entity:user'
  md5_passwords: true
migration_dependencies:
  required:
    - udm_user_pictures
  optional: []

Processing the fields

The interesting part of a user migration is the field mapping. The specific transformation will depend on your source, but some arguably complex cases will be addressed in the example. Let’s start with the basics: verbatim copies from source to destination. The following snippet shows three mappings:

mail: user_email
init: user_email
timezone: user_timezone

The mail, init, and timezone entity properties are copied directly from the source. Both mail and init are email addresses. The difference is that mail stores the current email, while init stores the one used when the account was first created. The former might change if the user updates their profile, while the latter will never change. The timezone needs to be a string taken from a specific set of values. Refer to this page for a list of supported timezones.

name:
  - plugin: machine_name
    source: public_name
  - plugin: make_unique_entity_field
    entity_type: user
    field: name
    postfix: _

The name entity property stores the username, which has to be unique in the system. If the source data contained a unique value for each record, it could be used to set the username. None of the unique source columns (e.g., legacy_id) is suitable to be used as a username. Therefore, extra processing is needed. The machine_name plugin converts the public_name source column into a transliterated string with some restrictions: any character that is not a number or letter will be converted to an underscore. The transformed value is sent to the make_unique_entity_field plugin. This plugin makes sure its input value is not repeated in the whole system for a particular entity field. In this example, the username will be unique. The plugin is configured by indicating which entity type and field (property) you want to check. If an equal value already exists, a new one is created by appending what you define as postfix plus a number. In this example, there are two records with public_name set to Benjamin. Eventually, the usernames produced by running the process plugin chain will be: benjamin and benjamin_1.

process:
  pass:
    plugin: callback
    callable: md5
    source: user_password
destination:
  plugin: 'entity:user'
  md5_passwords: true

The pass entity property stores the user’s password. In this example, the source provides the passwords in plain text. Needless to say, that is a terrible idea. But let’s work with it for now. Drupal uses portable PHP password hashes implemented by PhpassHashedPassword. Understanding the details of how Drupal converts one algorithm to another will be left as an exercise for the curious reader. In this example, we are going to take advantage of a feature provided by the Migrate API to automatically convert MD5 hashes to the algorithm used by Drupal. The callback plugin is configured to use the md5 PHP function to convert the plain text password into a hashed version. The last piece of the puzzle is setting, in the destination section, the md5_passwords configuration to true. This will take care of converting the already MD5-hashed password to the value expected by Drupal.

Note: MD5-hashed passwords are insecure. In the example, the password is hashed with MD5 as an intermediate step only. Drupal uses other algorithms to store passwords securely.

status:
  plugin: static_map
  source: user_status
  map:
    inactive: 0
    active: 1

The status entity property stores whether a user is active or blocked from the system. The source user_status values are strings, but Drupal stores this data as a boolean. A value of zero (0) indicates that the user is blocked, while a value of one (1) indicates that the user is active. The static_map plugin is used to manually map the values from source to destination. This plugin expects a map configuration containing an array of key-value mappings. The value from the source is on the left. The value expected by Drupal is on the right.

Technical note: Booleans are true or false values. Even though Drupal treats the status property as a boolean, it is internally stored as a tinyint in the database. That is why the numbers zero and one are used in the example. For this particular case, using a number or a boolean value on the right side of the mapping produces the same result.
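
That is, the following variation of the mapping, using booleans on the right side, would behave the same:

status:
  plugin: static_map
  source: user_status
  map:
    inactive: false
    active: true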

In the next blog post, we will continue with the user migration. Particularly, we will explain how to migrate the user creation time, roles, and profile pictures.

What did you learn in today’s blog post? Have you migrated user passwords before, either in plain text or hashed? Did you know how to prevent duplicates for values that need to be unique in the system? Were you aware of the plugin that allows you to manually map values from source to destination? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Next: Migrating users into Drupal - Part 2

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 09 2019
Aug 09

At Promet Source, conversations with clients and among co-workers tend to revolve around various aspects of compliance, user experience, site navigation, and design clarity. We need a common nomenclature for referring to interface elements, but that leads to the question of who makes this stuff up and what makes these terms stick?
 
I asked that recently, during an afternoon of back-to-back meetings. In separate contexts, “cookies,” “breadcrumbs,” and “hamburgers” were all mentioned as they pertain to the sites we are building for clients. But I got to wondering: what is it about the evolving Web lexicon that seems inordinately slanted towards tasty snacks?
 

One Theory

As we all know, devs and designers work very hard, with incredible focus for long hours at a stretch. Are we trying to inject some fun language that evokes touch, taste, and smell into a web that can sometimes feel rather flat when we are in the trenches?


I couldn’t help but wonder about a potentially unifying theme to cookies, breadcrumbs and the buns that provide the top and bottom horizontal lines of the increasingly ubiquitous hamburger icon. That sparked my curiosity and a bit of research.

Data/Cookie Jar

Let’s start with cookies -- a term that refers to the extraction and storage of user data such as logins, previous searches, activity on a site, and items in a shopping cart.  Almost all Websites use and store cookies on Web browsers.

a stack of chocolate chip cookies

Generally speaking, cookies are designed to inform better and more personalized Web experiences, but they do, of course, give rise to all sorts of privacy and security concerns. 
 
Potential cookie constraints for Websites developed in the United States for a U.S. audience are moving in an uncertain direction. Up to this point, it’s essentially been the Wild West, with few restrictions governing their usage. 
 
In the European Union, it’s a different story. Assorted rules and regulations, collectively known as the “Cookie Law,” have been in place for nearly a decade -- forbidding the tracking of users’ Web activity without their consent. 
 
As is the case with U.S.-based Websites that need to ensure accessibility, compliance with the Cookie Law can be complicated -- requiring rewriting and reconfiguration of code, followed by careful testing to ensure that the site’s code, server and the user’s browser are aligned to prevent cookies from tracking user behavior and collecting information. And another issue that accessibility and cookies have in common: there’s more at stake than compliance. To an increasing degree, users avoid engaging with Websites when they believe that their activity is being tracked by the use of cookies and there’s no question that overall levels of trust appear to be on the decline as privacy concerns increase. This is among the reasons why many websites are starting to give users the option of just saying no to cookies and still allowing them access to the site.

Connecting the Crumbs

Considerably newer to the Web lexicon than cookies, a breadcrumb or breadcrumb trail is a navigational aid in user interfaces designed to help users track their own activity within programs or websites, providing them with a sense of place within the bigger picture of the site. 
 
Breadcrumbs can take different forms. Generally speaking, a breadcrumb trail tracks the order of each page viewed, as a horizontal list below the top headers. This provides a guide for the user to navigate back to any point where they’ve previously been on the site. Think about Grimms’ story of Hansel and Gretel.
 
Breadcrumbs can be very helpful on complex, content-heavy sites. Who among us hasn’t found themselves frustrated in an attempt to navigate back to a page that seems to have temporarily disappeared?

On the Table

Unlike cookies, which for better or for worse, are stored behind the scenes and consumed in a manner that’s usually not known to the user, a breadcrumb trail is out in the open -- right upfront for the user to see and follow. Breadcrumbs are designed solely to enhance the user experience, functioning as a reverse GPS on complex Websites.  
 
As more and more users come to count on breadcrumbs as a navigational aid, we can expect that the demand for them will increase. At the same time, we can expect that usage of cookies will come under increased scrutiny as privacy concerns escalate and skittishness grows about how personal information is being shared. At Promet, we consider breadcrumbs to be a must-have on any site.

Time for Some Protein

As for the third item in our list of tasty Web terms, the hamburger is essentially all good. This three-line icon that’s started to appear at the top of screens serves as a mini-portal to additional options or pages.

Actual hamburger on the left. A web hamburger icon on the right.

What’s not to love about this feature that takes up so little space on the screen but opens the door to a trove of additional navigation or features for apps and Websites? Fact is, UX/UI trends are constantly evolving, and users vary widely in the pace at which they pick up what’s new and next. The hamburger icon has a lot going for it, and it’s not going away.

 

Meet the Search Sandwich

There’s a new item on the table, and we were just introduced to it by one of our UX-savvy clients. As far as I know, it doesn’t have an official name yet, so we affectionately refer to it as the “search sandwich.” It’s an evolved hamburger combined with a search icon to indicate to users that both the navigation menu and the search bar can be accessed from this icon. It looks a bit like a ham sandwich with an olive on top and might make an appearance on a website soon. Stay tuned.
 
So there you have it: key factors in our Web design world, possibly a reflection of a desire to take our high-tech conversations down a notch with these playful metaphors for elements that we must all learn to identify, whether as a designer, developer, or just a web user. They remind us that the Web is a rapidly evolving environment of UI/UX trends -- created and consumed by humans.
 
Interested in serving up a tasty web experience? Contact us today


Aug 09 2019
Aug 09

Today we continue the conversation about migration dependencies with a hierarchical taxonomy terms example. Along the way, we will present the process and syntax for migrating into multivalue fields. The example consists of two separate migrations: one to import taxonomy terms accounting for term hierarchy, and another to import nodes with a multivalue taxonomy term field. Following this approach, any node and taxonomy term created by the migration process will be removed from the system upon rollback.

Syntax for multivalue field migration.

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD multivalue taxonomy terms, whose machine name is ud_migrations_multivalue_terms. The two migrations to execute are udm_dependencies_multivalue_term and udm_dependencies_multivalue_node. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

The example assumes Drupal was installed using the standard installation profile. In particular, it depends on a Tags (tags) taxonomy vocabulary, an Article (article) content type, and a Tags (field_tags) field that accepts multiple values. The words in parentheses represent the machine name of each element.

Migrating taxonomy terms and their hierarchy

The example data for the taxonomy terms migration is fruits and fruit varieties. Each row will contain the name and description of the fruit. Additionally, it is possible to define a parent term to establish hierarchy. For example, “Red grape” is a child of “Grape”. Note that no numerical identifier is provided. Instead, the value of the name is used as a string identifier for the migration. If term names could change over time, it is recommended to have another column that does not change (e.g., an autoincrementing number). The following snippet shows how the source section is configured:

source:
  plugin: embedded_data
  data_rows:
    - fruit_name: 'Grape'
      fruit_description: 'Eat fresh or prepare some jelly.'
    - fruit_name: 'Red grape'
      fruit_description: 'Sweet grape'
      fruit_parent: 'Grape'
    - fruit_name: 'Pear'
      fruit_description: 'Eat fresh or prepare a jam.'
  ids:
    fruit_name:
      type: string

The destination is quite short. The target entity is set to taxonomy terms. Additionally, you indicate which vocabulary to migrate into. If you have terms that would be stored in different vocabularies, you can use the vid property in the process section to assign the target vocabulary. If you write to a single one, the default_bundle key in the destination can be used instead. The following snippet shows how the destination section is configured:

destination:
  plugin: 'entity:taxonomy_term'
  default_bundle: tags
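
For comparison, if the source mixed terms from more than one vocabulary, a sketch of the alternative vid mapping could look like this. It assumes a hypothetical source_vocabulary column and a second location vocabulary:

process:
  vid:
    plugin: static_map
    source: source_vocabulary
    map:
      fruits: tags
      regions: location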

For the process section, three entity properties are set: name, description, and parent. The first two are strings copied directly from the source. In the case of parent, it is an entity reference to another taxonomy term. It stores the taxonomy term ID (tid) of the parent term. To assign its value, the migration_lookup plugin is configured similarly to the previous example. The difference is that, in this case, the migration to reference is the same one being defined. This introduces an important consideration: parent terms should be migrated before their children so they can be found by the lookup operation. Also note that the lookup value is the term name itself, because that is what this migration set as the unique identifier in the source section. The following snippet shows how the process section is configured:

process:
  name: fruit_name
  description: fruit_description
  parent:
    plugin: migration_lookup
    migration: udm_dependencies_multivalue_term
    source: fruit_parent

Technical note: The taxonomy term entity contains other properties you can write to. For a list of available options check the baseFieldDefinitions() method of the Term class defining the entity. Note that more properties can be available up in the class hierarchy.

Migrating multivalue taxonomy terms fields

The next step is to create a node migration that can write to a multivalue taxonomy term field. To stay on point, only one more field will be set: the title, which is required by the node entity. Read this change record for more information on how the Migrate API processes Entity API validation. The following snippet shows how the source section is configured for the node migration:

source:
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      thoughtful_title: 'Amazing recipe'
      fruit_list: 'Green apple, Banana, Pear'
    - unique_id: 2
      thoughtful_title: 'Fruit-less recipe'
  ids:
    unique_id:
      type: integer

The fruit_list column contains a comma-separated list of taxonomy terms to apply. Note that the values match the unique identifiers of the taxonomy term migration. If you had used numbers as migration identifiers there, you would have to use those numbers in this migration to refer to the terms. An example of that was presented in the previous post. Also note that there is one record with no terms associated. This will be considered during the field mapping. The following snippet shows how the process section is configured for the node migration:

process:
  title: thoughtful_title
  field_tags:
    - plugin: skip_on_empty
      source: fruit_list
      method: process
      message: 'No fruit_list listed.'
    - plugin: explode
      delimiter: ','
    - plugin: migration_lookup
      migration: udm_dependencies_multivalue_term

The title of the node is a verbatim copy of the thoughtful_title column. The Tags field, mapped using its machine name field_tags, uses three chained process plugins. The skip_on_empty plugin reads the value of the fruit_list column and skips the processing of this field if no value is provided. This is done to accommodate the fact that some records in the source do not specify tags. Note that the method configuration key is set to process. This indicates that only this field should be skipped and not the entire record. Ultimately, tags are optional in this context, and nodes should still be imported even if no tag is associated.

The explode plugin allows you to break a string into an array, using a delimiter to determine where to make the cut. Later, this array is passed to the migration_lookup plugin, specifying the term migration as the one to use for the lookup operation. Again, the taxonomy term names are used here because they are the unique identifiers of the term migration. Note that neither of these plugins has a source configuration. This is because when process plugins are chained, the result of one plugin is sent as the source to be transformed by the next one in line. The end result is an array of taxonomy term IDs that will be assigned to field_tags. The migration_lookup plugin is able to process single values and arrays.

The last part of the migration is specifying the destination section and any dependencies. Refer to this article for more details on setting migration dependencies. The following snippet shows how both are configured for the node migration:

destination:
  plugin: 'entity:node'
  default_bundle: article
migration_dependencies:
  required:
    - udm_dependencies_multivalue_term
  optional: []

More syntactic sugar

One way to set multivalue fields in Drupal migrations is to assign an array as the field's value. Another option is to set each value manually using field deltas. Deltas are integer numbers starting with zero (0) and incrementing by one (1) for each element of a multivalue field. Although you could set any delta in the Migrate API, consider the field definition in Drupal. It is possible that a limit has been set on the number of values the field can hold. You can specify deltas and subfields at the same time. The full syntax is field_name/field_delta/subfield. The following example shows the syntax for a multivalue image field:

process:
  field_photos/0/target_id: source_fid_first
  field_photos/0/alt: source_alt_first
  field_photos/1/target_id: source_fid_second
  field_photos/1/alt: source_alt_second
  field_photos/2/target_id: source_fid_third
  field_photos/2/alt: source_alt_third

Manually setting a multivalue field is less flexible and more error-prone. In today’s example, we showed how to accommodate the list of terms not being provided. Imagine having to do that for each delta and subfield combination. Still, the functionality is there in case you need it. In the end, Drupal offers syntactic sugar so you can write shorter field mappings. Additionally, there are various process plugins that can handle arrays for setting multivalue fields.

Note: There are other ways to migrate multivalue fields. For example, when using the entity_generate plugin provided by Migrate Plus, there is no need to create a separate taxonomy term migration. This plugin is able to create the terms on the fly while running the import process. The caveat is that terms created this way are not deleted upon rollback.

What did you learn in today’s blog post? Have you ever done a taxonomy term migration before? Were you aware of how to migrate hierarchical entities? Did you know you can manually import multivalue fields using deltas? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 09 2019
Aug 09

Scheduled Transitions is a module allowing you to schedule a specific previously saved revision to move from one state to another. This post provides an introduction to Scheduled Transitions for Drupal 8.

For example, an editor may edit a piece of content that remains in a draft state throughout the drafting process. When ready, an editor may select the ready revision to be moved from draft to published.

Another, more complex use case involves the following workflow: Draft -> Needs Review -> Approved -> Published -> Archived. A Content Editor could edit a piece of content until it is in Needs Review status, and a Content Moderator would approve the content by setting the state to Approved. The Content Moderator would then set up a scheduled transition to move the content from Approved to Published at some point in the future. If the content is time sensitive, another future scheduled transition could be created to automatically change it from Published to Archived.

Scheduled Transitions integrates tightly with Content Moderation and Workflows, inheriting transitions, states, and associated permissions automatically.

This post and accompanying video cover configuration and general usage.

Video

[embedded content]

Another shorter version of the video is available without site building aspects, ready to be shared with an editorial team.

Dependencies

Requirements and dependencies are fairly bleeding edge as of this posting, though that will change in the future.

Installation

Download and install the module using your favourite method:

composer require drupal/scheduled_transitions
drush pm:enable scheduled_transitions # or
drupal module:install scheduled_transitions

Configuration

Configure Workflows

If you have not already created a workflow, navigate to Configuration » Workflows and click the Add workflow button.

Create a label and select Content moderation from the Workflow type dropdown.

Set up states and the transitions between them in any way you desire, and set which entity type bundles the workflow should apply to.

Configure Scheduled Transitions

Navigate to Configuration » Scheduled Transitions

Under the Enabled types heading, select the entity type bundles to enable Scheduled transitions on. Save the form.

Scheduled Transitions: Settings

User permissions

Navigate to People » Permissions.

Under the Content Moderation heading, enable all workflow transition permissions that apply. Under the Scheduled Transitions heading, enable the Add scheduled transitions and View scheduled transitions permissions that apply. These permissions apply to individual entities; in addition to these permissions, users must also have access to edit the individual entities. Make sure you grant any permissions needed for users to edit the entities; for example, nodes require the Edit any content or Edit own content permissions.

General Usage

Moving on to day-to-day functionality of Scheduled Transitions.

Navigate to a pre-existing entity. Though nodes are shown in the examples below, Scheduled Transitions works with any revisionable entity type, such as block content, terms, or custom entity types.

You'll find the Scheduled Transitions tab, with a counter in the tab indicating how many transitions are scheduled for the entity and translation being viewed.

Scheduled Transitions: Tab

Clicking the tab will send you to a listing of all scheduled transitions for an entity.

If the user has permission, an Add Scheduled transitions button will be visible.

Scheduled Transitions: List

Clicking the button presents a modal form. The form displays a list of all revisions for the entity or translation.

Scheduled Transitions: Modal

Click the radio next to the revision you wish to schedule for state change.

After the radio is selected, the form will reload, showing valid workflow transitions from the selected revision's state.

The user selects which transition is to be executed, along with the date and time at which it should run.

Scheduled Transitions: Revision Selected

Depending on the state of the selected source revision, an additional checkbox may display, prompting you to recreate pending revisions. This feature is useful if users have created more non-published revisions after the scheduled revision. It prevents loss of any intermediate non-published work. A diagram is provided below:

Scheduled Transitions: Recreate Pending Revisions

Click the schedule button. The modal closes and the scheduled transitions list reloads.

Scheduled Transitions: Post creation

When the time is right, the scheduled transition is executed. You can force scheduled transitions to execute by running cron manually. Cron should be set up to run automatically and regularly, preferably every 5 minutes or so.

The job executes the transition and deletes itself, removing itself from the transition list. As a result of executing the transition, you'll notice, when navigating to the core revisions list for the entity, that a new revision has been created, with a log outlining the state change.

Scheduled Transitions: Revisions

Multilingual

When dealing with entities with multiple translations, you will find that transitions apply to the translation in context and are separate from other translations. For example, the English and German revisions of an entity are scheduled independently.

Global List

Scheduled Transitions comes with Views integration; a view is pre-installed on installation. You can find the view by navigating to Content » Scheduled Transitions. The view shows all pending scheduled transitions on the site.

Scheduled Transitions: Global List

For more information, check out the Scheduled Transitions project page or Scheduled Transitions project documentation.

Photo of Daniel Phin

Posted by Daniel Phin
Drupal Developer

Dated 9 August 2019

Comments

I wish every Drupal contrib module had an announcing blog post and accompanying video. Especially if it's this well executed. Thank you for this very high quality contribution!

ditto re: blog & vid

Looks technically solid. But I think the UI is going to be a bit much for the majority of content editors. 99% of the time the only revision that matters is the highest vid. Perhaps something on the entity edit form next to the submit buttons would be more usable.


Aug 09 2019
Aug 09

One of Drupal’s biggest strengths is its data modeling capabilities. You can break the information that you need to store into individual fields and group them in content types. You can also take advantage of default behavior provided by entities like nodes, users, taxonomy terms, files, etc. Once the data has been modeled and saved into the system, Drupal will keep track of the relationships among the pieces. Today we will learn about migration dependencies in Drupal.

As we have seen throughout the series, the Migrate API can be used to write to different entities. One restriction though is that each migration definition can only target one type of entity at a time. Sometimes, a piece of content has references to other elements. For example, a node that includes entity reference fields to users, taxonomy terms, and images. The recommended way to get them into Drupal is writing one migration definition for each. Then, you specify the relationships that exist among them.

Snippet of migration dependency definition

Breaking up migrations

When you break up your migration project into multiple, smaller migrations, they are easier to manage and you have more control over the process pipeline. Depending on how you write them, you can rest assured that imported data is properly deleted if you ever have to roll back the migration. You can also enforce that certain elements exist in the system before others that depend on them can be created. In today’s example, we are going to leverage the example from the previous post to demonstrate this. The portraits imported in the file migration will be used in the image field of nodes of type article.

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD migration dependencies introduction, whose machine name is ud_migrations_dependencies_intro. Last time, the udm_dependencies_intro_image migration was imported. This time, udm_dependencies_intro_node will be executed. Notice that both migrations belong to the same module. Refer to this article to learn where the module should be placed.

Writing the source and destination definition

To keep things simple, the example will only write the node title and assign the image field. A constant will be provided to create the alternative text for the images. The following snippet shows how the source section is configured:

source:
  constants:
    PHOTO_DESCRIPTION_PREFIX: 'Photo of'
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      name: 'Michele Metts'
      photo_file: 'P01'
    - unique_id: 2
      name: 'David Valdez'
      photo_file: 'P03'
    - unique_id: 3
      name: 'Clayton Dewey'
      photo_file: 'P04'
  ids:
    unique_id:
      type: integer

Remember that in this migration you want to use files that have already been imported. Therefore, no URLs to the image files are provided. Instead, you need a reference to the other migration. Particularly, you need a reference to the unique identifiers for each element of the file migration. In the process section, this value will be used to look up the portrait that will be assigned to the image field.

The destination section is quite short. You only specify that the target is a node entity and the content type is article. Remember that you need to use the machine name of the content type. If you need a refresher on how this is set up, have a look at the articles in the series. It is recommended to read them in order as some examples expand on topics that had been previously covered. The following snippet shows how the destination section is configured:

destination:
  plugin: 'entity:node'
  default_bundle: article

Using previously imported files in image fields

To be able to reuse the previously imported files, the migration_lookup plugin is used. Additionally, an alternative text for the image is created using the concat plugin. The following snippet shows how the process section is configured:

process:
  title: name
  field_image/target_id:
    plugin: migration_lookup
    migration: udm_dependencies_intro_image
    source: photo_file
  field_image/alt:
    plugin: concat
    source:
      - constants/PHOTO_DESCRIPTION_PREFIX
      - name
    delimiter: ' '

In Drupal, files and images are entity reference fields. That means they only store a pointer to the file, not the file itself. The pointer is an integer representing the file ID (fid) inside Drupal. The migration_lookup plugin allows you to query the file migration so imported elements can be reused in the node migration.

The migration option indicates which migration to query by specifying its migration id. Additionally, you indicate which columns in your source match the unique identifiers of the migration to query. In this case, the values of the photo_file column in udm_dependencies_intro_node match those of the photo_url column in udm_dependencies_intro_image. If a match is found, this plugin will return the file ID, which can be directly assigned to the target_id of the image field. That is how the relationship between the two migrations is established.

Note: The migration_lookup plugin allows you to query more than one migration at a time. Refer to the documentation for details on how to set that up and why you would do it. It also offers additional configuration options.
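
For instance, a sketch of querying two migrations at once might look like this; the second migration id is hypothetical, and the plugin checks each listed migration until it finds a match:

field_image/target_id:
  plugin: migration_lookup
  migration:
    - udm_dependencies_intro_image
    - udm_legacy_image_archive
  source: photo_file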

As a good accessibility practice, an alternative text is set for the image using the alt subfield. Other than that, only the node title is set. And with that, you have two migrations connected to each other. If you were to roll back both of them, no file or node would remain in the system.

Being explicit about migration dependencies

The node migration depends on the file migration. That is, the files must be migrated first before they can be used as images for the nodes. In fact, in the provided example, if you were to import the nodes before the files, the migration would fail and no node would be created. You can be explicit about migration dependencies. To do it, add a new configuration option to the node migration that lists which migrations it depends on. The following snippet shows how this is configured:

migration_dependencies:
  required:
    - udm_dependencies_intro_image
  optional: []

The migration_dependencies key goes at the root level of the YAML definition file. It accepts two configuration options: required and optional. Both accept an array of migration ids. The required migrations are hard prerequisites. They need to be executed in advance or the system will refuse to import the current one. The optional migrations do not have to be executed in advance. But if you were to execute multiple migrations at a time, the system will run them in the order suggested by the dependency hierarchy. Learn more about migration dependencies in this article. Also, check this comment on Drupal.org in case you have problems where the system reports that certain dependencies are not met.

Now that the dependency among migrations has been explicitly established, you have two options: either import each migration manually in the expected order, or import the parent migration using the --execute-dependencies flag. When you do that, the system will take care of determining the order in which all migrations need to be imported. The following two snippets will produce the same result for the demo module:

$ drush migrate:import udm_dependencies_intro_image
$ drush migrate:import udm_dependencies_intro_node
$ drush migrate:import udm_dependencies_intro_node --execute-dependencies

In this example, there are only two migrations, but you can have as many as needed. For example, a node with references to users, taxonomy terms, paragraphs, etc. Also note that the parent entity does not have to be a node. Users, taxonomy terms, and paragraphs are all fieldable entities. They can contain references the same way nodes do. In future entries, we will talk again about migration dependencies and provide more examples.
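
For instance, a node migration that references several entity types might declare its dependencies like this. The migration ids below are hypothetical:

migration_dependencies:
  required:
    - udm_example_user
    - udm_example_term
  optional:
    - udm_example_paragraph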

Tagging migrations

The core Migrate API offers another mechanism to execute multiple migrations at a time: you can tag them. To do that, you add a migration_tags key at the root level of the YAML definition file. Its value is an array of arbitrary tag names to assign to the migration. Once set, you run them using the migrate import command with the --tag flag. You can also rollback migrations per tag. The first snippet shows how to set the tags and the second how to execute them:

migration_tags:
  - UD Articles
  - UD Example
$ drush migrate:import --tag='UD Articles,UD Example'
$ drush migrate:rollback --tag='UD Articles,UD Example'

It is important to note that tags and dependencies are different concepts. They allow you to run multiple migrations at a time. It is possible that a migration definition file contains both, either, or neither. The tag system is used extensively in Drupal core for migrations related to upgrading to Drupal 8 from previous versions. For example, you might want to run all migrations tagged ‘Drupal 7’ if you are coming from that version. It is possible to specify more than one tag when running the migrate import command separating each with a comma (,).

Note: The Migrate Plus module offers migration groups to organize migrations similarly to how tags work. This will be covered in a future entry. Just keep in mind that tags are provided out of the box by the Migrate API, while migration groups depend on a contributed module.

What did you learn in today’s blog post? Have you used the migration_lookup plugin to query imported elements from a separate migration? Did you know you can set required and optional dependencies? Have you used tags to organize your migrations? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 08 2019
Aug 08

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week. You'll find that the meetings, while providing updates on completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide insights on, we encourage you to get involved.

Drupal 9 Readiness (08/05/19)

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know to keep up-to-date for the easiest Drupal 9 upgrade of their sites are also welcome.

  • Usually happens every other Monday at 18:00 UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.

Symfony 4/5 compatibility

The issue, Allow Symfony 4 to be installed in Drupal 8, has had a lot of work put into it by Michael Lutz.

Contrib deprecation testing on drupal.org

Gábor Hojtsy grabbed the data, did some analysis, and posted his findings on how to prepare for Drupal 9. The main takeaway of the post: stop using drupal_set_message() now.

Examples module Drupal 9 compatibility

Andrey Postnikov posted about the Kharkov code sprint, where 23 issues were addressed. Follow the Examples for Developers project for more information. There are still a bunch of issues to review there if anyone is interested!

Module Upgrader Drupal 9 compatibility

Deprecation cleanup status - blockers to Drupal 9 branch opening

Per Drupal core's own deprecation testing results, there are currently 13 child issues open; most of them need reviews.

Twig 2 upgrade guide and automation and other frontend deprecations tooling

Semantic versioning for contrib projects

Now that we've settled on a path forward for how we plan on supporting semver in core (core key using semver + composer.json for advanced features), the Drupal Association is planning on auditing our infrastructure to start implementing semver.

New Drupal 8.8 deprecations that may need to be backported

  • Ryan Aslett and Gábor Hojtsy did some analysis on deprecations in contrib that are new in Drupal 8.8. They found that 17% of all deprecated API use in contrib is now of code deprecated in either 8.7 or 8.8. Ryan Aslett looked at the toplist of those deprecations and categorized them based on whether the replacements were also introduced in 8.8 or earlier.
  • We're not backporting APIs; that carries far too much semver-breaking risk. If a contrib maintainer has usages of code that were deprecated in 8.8.x and they want their module to be 9.x compatible on the day that 9.0 comes out, they can:
    • do internal version checking,
    • open another branch,
    • wait until 9.1 to be 100% compatible with all supported versions, or
    • drop support for 8.7.x.

Renaming core modules, eg. actions

  • There's a meta about renames.
  • Renaming modules can impact other modules that declare a dependency on them. 
  • We need some way to prove that a rename doesn't break contrib modules.

Migration Initiative Meeting (08/08/19)

This meeting:

  • Usually happens every Thursday and alternates between 14:00 and 21:00 UTC.
  • Is for core migrate maintainers and developers and anybody else in the community with an interest in migrations.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public migration meeting agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.
  • For anonymous comments, start with a bust in silhouette emoji. To take a comment or thread off the record, start with a no entry sign emoji.

Some issues need review:

  1. Add test of D7 term localized source plugin
  2. Migrate D7 synchronized fields
  3. Ensure language is not Null in translation source queries 
  4. Language specific term (i18n_taxonomy) should not rely on entity translation in D7 taxonomy term migration
  5. Migrate D6 and D7 node revision translations to D8 
  6. Migrate D7 i18n taxonomy term language 
  7. Use the lock service for migration locks 
  8. Undeprecate Drupal\comment\Plugin\migrate\source\d6\Comment::prepareComment() and mark as internal
  9. Create Migration Lookup service 
  10. Validate Migration State should load test fixture
  11. Boolean Field On and Off Label not Migrating
  12. Assert plural labels exist on migrate upgrade form
  13. Migrate UI - review help text
Aug 08 2019
Aug 08
Text on the Rosetta Stone

The web is constantly growing, evolving and—thankfully—growing more accessible and inclusive.

It is becoming expected that a user can interact with a website solely via keyboard or have the option to browse in their native language. There are many ways to serve the needs of non-native-language users, but one of the more robust is Drupal Multilingual.

Unlike third-party translation plugins like Google Translate or browser translation tools, Drupal's suite of core multilingual tools allows you to write accurate and accessible translated content in the same manner as you write your default-language content. With no limit on the number of languages, settings for right-to-left content, and the ability to translate any and all of your content, Drupal 8 can create a true multi-language experience like never before.

There is, however, a bit of planning and work involved.

Hopefully, this blog series will help smooth the path to truly inclusive content by highlighting some project management, design, site building, and development gotchas, as well as providing some tips and tricks to make the multilingual experience better for everyone. Part one will help you decide if you need multilingual as well as provide some tips on how to plan and budget for it.

Aug 08 2019
Aug 08

For most Drupal projects, patches are inevitable. It’s how we, in the Drupal community, share code. If that scares you, don’t worry -- the community is working hard to move to a pull/merge request workflow. Due to the collaborative nature of Drupal as a thriving open source community and the ever-growing ecosystem of contrib modules, patches are the ever-evolving glue that can hold a site together.

Before Drupal 8, you may have seen projects use drush make, a Drupal-specific solution. As part of the “get off the island” movement, Drupal adopted the existing dependency manager Composer. Composer does a decent job of alleviating the headaches of managing several sites with different dependencies. However, out of the box, Composer will revert patched core files and contrib modules; it is for that reason that the composer-patches project was created. In this blog post, we are going to review how to set up composer-patches for a Composer-managed project and how to specify locally or remotely hosted patches.

The setup

In your favorite command line tool, you will want to add the composer-patches project:

composer require cweagans/composer-patches:~1.0 --update-with-dependencies

With this small change, your project is now set up for success because Composer can manage your patches.

Local patches

Sometimes you will find that you need to patch contrib or core specifically for your project, and therefore the patch exists locally. Composer-patches can apply that patch for you; we just need to tell it where it is. Let’s look at an example project that has a core patch applied and saved locally in the project root directory ‘patches/core-invalid-config-structures.patch’:
    ...
    "extra": {
      "patches": {
        "drupal/core": {
          "Core Invalid config structures": "patches/core-invalid-config-structures.patch"
        }
      }
    }

In your composer.json, you will want to add an “extra” section if it doesn’t already exist.  Composer-patches will take the packages listed in “patches” and try to apply any listed patches. In our above example, the package we are patching is “drupal/core”. Patches are declared as follows:

“Patch description”: “path to patch file”

This information will be printed on the command line while Composer tries to update the package, which makes it important to summarize the patch’s purpose well. If you would like to see what this looks like in the wild, take a look at our distribution Rain, which leverages a couple of contrib patches.

After manually updating composer.json, it is always a good idea to run composer validate to confirm the JSON syntax is right. If you get the green success message, run composer update drupal/[projectname], e.g. composer update drupal/core, to have the patch applied.
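Putting those two commands together for the core patch example above, the check-and-apply sequence is simply:

composer validate
composer update drupal/core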

You will know that the patch is applied based on the output:

patch output

As you can see, the package getting patched is removed, added back, and the patch is applied.

Note: Sometimes I feel like I have to give Composer a nudge. Always feel comfortable deleting /core, /vendor, or /modules/contrib, but if you delete composer.lock, know that your dependencies could update based on your constraints. composer.json tracks our package dependencies at certain version constraints, while composer.lock is the recipe of computed versions based on those constraints. I have found myself running the following:

rm -rf core && rm -rf modules/contrib && rm -rf vendor
composer install

Remote Patches

When possible, we should open issues on Drupal.org and post patches there. That way, the community can work together to solve a problem, and you’ll usually get a more reliable, lasting solution. Think about it this way: would you rather have a critical patch to your project reviewed only by you or your team, or by hundreds of developers?

To make composer-patches grab a remote patch make the following changes:
    ...
    "extra": {
      "patches": {
        "drupal/core": {
          "#2925890-10: Invalid config structures": "https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }
      }
    }

The only change here is that, rather than the path to a local patch, we have substituted the URL of the patch. This will produce a similar success message when applied correctly:

remote patches

Tips 

So far, I’ve shown you how to get going with the composer-patches project, but there are a lot of settings/plugins that can elevate your project. A feature I turn on for almost all sites is exiting on patch failure, because it is a big deal when a patch fails. If you too want to turn this feature on, add the following line to the “extra” section of your composer.json:

"composer-exit-on-patch-failure": true,

I have also found it helpful to add a link back to the original issue in the composer.json patch declaration. Imagine working on a release when one of your patches fails, but the only reference you have to the issue is the patch file URL. It is at times like these that a link to the issue can make your day. If we made the same change to our earlier example, it would look like the following:

 "drupal/core": {
          "#2925890-10: Invalid config structures (https://www.drupal.org/project/drupal/issues/2925890)" : "https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }

Conclusion

Composer-patches is a critical package for any Drupal project managed by Composer. In this blog I showed you how to get started with the project and shared some of the tips and tricks I’ve learned along the way. How does your team use composer-patches? Do you have a favorite setting that I didn’t mention? Feel free to drop a comment and share what works for you and your team.

Aug 08 2019
Aug 08

If you have a local business — a restaurant, a bar, a dental clinic, a flower delivery service, a lawyer's office, or some other business — you will benefit immensely from a strong online presence. In this post, we will discuss why Drupal 8 is a great choice to build a local business website. Read on to see how Drupal 8’s numerous benefits will play in favor of your local business.

Some stats about why your local business needs a website

Local businesses once relied on word-of-mouth marketing. But the new digital era has changed the game. Customers widely use local Google search to find places, services, or products; they trust online customer reviews and otherwise rely on the Internet for their decisions.

So consider these stats about how things are going for local businesses in the digital world:

  1. Users rely on search engines for finding local information. According to Google's study, 4 in 5 people do this.
  2. Local searches are also very goal-oriented. The same Google study says that 50% of users who performed a local search on their smartphones visited the store within the next 24 hours.
  3. Mobile local searches are growing like crazy. According to Statista, the number of mobile local searches is forecast to reach 141.9 billion in 2019, compared to 66.5 billion in 2014. At the same time, the number of desktop searches will drop slightly (62.3 billion versus 66.5 billion, respectively).
  4. Smartphone shoppers love local search. Statista also shows that 82% of smartphone shoppers in the US had used their device for local search with the “near me” keyword as of July 2018.
  5. Customers read reviews for local businesses. According to the study by BrightLocal, 86% of people do this.

Reasons to build a local business website on Drupal 8

  • SEO-friendliness with plenty of useful modules

First of all, local businesses shouldn’t miss their unique opportunity — they need to make the best of SEO. Google has special approaches to local search. When users search by adding a city name or the “near me” keyword, Google lists the best results near the top of the SERPs in a variety of rich ways. Among them:

  1. the Knowledge Panel
  2. locations on the Google Map
  3. carousels with images, news, reviews, etc.

Moreover, users are able to get all the necessary information, like your business hours, your contacts, a review, directions, or booking an appointment, without even clicking through to your website (zero-click SERPs).

Local Google search on a mobile phone

How do you get into these results? Among the recommendations are:

  1. provide detailed information in Google My Business listing
  2. have an optimized Knowledge Graph for your website
  3. optimize content using local keywords
  4. optimize content so it fits Google’s rich snippets
  5. and, of course, follow overall general best SEO practices 

Here is where the Drupal 8 CMS can be your very helpful assistant. In addition to being SEO-friendly out of the box, Drupal 8 has a wealth of useful SEO modules for various purposes. They include:

  1. SEO Checklist
  2. Metatag
  3. XML sitemap
  4. RobotsTxt
  5. Real-time SEO
  6. Pathauto
  7. Redirect
  8. Google Analytics

and many more.

  • Content easy to manage

Unique and relevant content, regularly updated and optimized with local keywords, is one of your most important local SEO secrets. The richer the content is, the richer it looks in Google local search results. In addition, trimming your content to fit Google’s rich snippets is the key to optimizing your website for voice search.

In Drupal 8, it is easy to create, edit, and present content in attractive ways. Drupal 8 offers you:

  1. quick edits directly on the page
  2. the Media Library to easily enrich your content with images, videos, and audios
  3. handy content previews
  4. drag-and-drop page layouts with Layout Builder
  5. Drupal Views grids, slideshows, and carousels for the attractive content presentation
  6. content revision history
  7. mobile-friendly admin interfaces
  8. content moderation workflows

and much more.  
 

Media Library in Drupal 8
  • Mobile optimization out of the box

In addition to the above mobile statistics, we have a few more for you. The mobile share of organic search engine visits has reached 59% in 2019, versus 27% in 2013. So your successful local business absolutely needs a mobile-friendly website.

Here is where Drupal 8 wins hands down. It has been built around a mobile-first approach. The CMS features built-in modules for creating responsive web design — the Responsive Image and the Breakpoint. 

The responsive web design technique allows your website pages to adapt to any user’s screen by showing a different layout. The page elements resize, change their position, or disappear to provide the smoothest viewing experiences for everyone. 

  • Multi-language to attract more guests

Let’s suppose you run a local business in your country in your local language. Consider adding English as an international language, or another language based on your tourist audience. This way, you can attract your city’s guests as they use Google search.

Drupal 8 is the best option for multilingual websites and allows you to easily add as many languages as you wish. Drupal 8 supports a hundred of them out of the box, with interface translations included.

Thanks to the Drupal 8 Multilingual Initiative (D8MI), Drupal 8 has four powerful modules responsible for every aspect of translation.

  • High accessibility standards

According to the CDC (Centers for Disease Control and Prevention), 26% (1 in 4) of adults in the US have some form of disability. This is a quarter of your potential customers. Moreover, they are the ones who may need your local services more than others — for example, local delivery.

To be accessible to all users without barriers, your website should adhere to accessibility standards. Drupal 8 focuses on them and offers advanced accessibility features, including the use of WAI-ARIA attributes, accessible inline form errors, aural alerts, obligatory ALT text for images, and much more.

  • Presence in multiple channels

Local businesses often benefit from a digital presence in multiple channels — imagine, for example, a pizza delivery mobile app connected to your website.

Drupal 8 offers amazing opportunities to exchange your website’s data with third-party applications. It has five powerful built-in modules for creating REST APIs and sharing Drupal data in the JSON, XML, or other formats needed by the apps. 

  • Easy social media integration

It’s no longer possible to successfully manage a business without a social media presence. Drupal 8 allows third-party integration with any system, and social networks are no exception.

It is incredibly easy to add social media icons to your website pages, provide social share buttons for your content, embed social media feeds, and much more. Social media modules in Drupal 8 are very numerous and useful. 

Among them are Easy Social, AddToAny Share Buttons, Social media share, Social Media Links Block and Field, and many more. In addition, there are network-specific modules like Video Embed Instagram, Pinterest Hover button, Facebook Album, and plenty of others.

Social media posts can also be embedded in your content using the Drupal 8 core Media module as a basis. 

Build a local business website on Drupal 8 with us!

The above reasons to build a local business website on Drupal 8 are just a few of a thousand. Contact our Drupal team and let’s discuss in more detail how we can help your local business flourish!

Aug 08 2019
Aug 08

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

In our latest interview, Ricardo Amaro of Acquia reveals how his discovery of Drupal has enabled him to work on projects he enjoys and that make a meaningful impact. Read on to learn more about his contributions and what the Drupal community in Portugal is like. 

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My name is Ricardo Amaro. I live with my wife and 2 kids in Lisbon, Portugal. I’ve been working for Acquia since 2011 and was recently promoted to Principal Site Reliability Engineer, where we deal with all the challenges of helping ~55k Drupal production sites grow every day.

I’ve been contributing to the Drupal community in several ways, and sometimes that effort goes beyond it. An example is co-authoring the “Seeking SRE” book (O’Reilly) with my chapter about Machine Learning for SRE; the main idea came out of a presentation I did at DrupalCon Vienna 2017 explaining how automation and machine learning could help increase reliability on Drupal sites.

Other projects I’ve initiated in the past within the Drupal community include:

On the local front I founded the Portuguese Drupal Association 8 years ago and I am its current elected president. That same year we organized our first DrupalCampLisbon2011. Nowadays we organize DrupalDays and Camps all over the country and meet regularly on Telegram and video-conferences. Last year we organized DrupalDevDays Lisbon 2018, which had a really good turnout from the entire community.

My main drivers are a passion for Free Software and Digital Rights. That started back in the 90’s when I found myself struggling with the proprietary/closed software available at the time, and installing Linux/Slackware in 1994 was an enlightening moment in answer to my own question, “isn’t there a better option?”. But I only switched all my machines to Linux in 2004, and that’s what I’ve used up to now, because I think the GNU/Free Software ecosystem, where Drupal was able to grow, is fragile and needs to be nourished by all of us.

I have a degree in Arts and a second one in Computer Science & Engineering, and I’m now taking a master’s in Enterprise Information Systems.

Before Acquia, I worked in both the public and private sectors in Portugal, applying Agile techniques and encouraging the DevOps culture. I’ve also managed teams, development projects, and operations in South Africa and around Europe.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I came across Drupal in 2008, when searching for an open source CMS to create some media publishing sites for the company I was working for at that time. My role as an IT Director was not easy, since the company was struggling with funding, so Drupal 6 was an amazing tool that enabled us to grow several of the sites and, in particular, create a self-service on our main classified advertisement sites.

I found the Drupal Portuguese community at that time struggling to have a legal entity and to be able to grow and organize events inside the country. Portugal has always been mostly monopolized by large corporations like Microsoft and Oracle, while Free Software has always been seen as an “experimental” solution, at best.

I took it upon myself to commit to bringing the local Drupal community the pride and success they all deserve. I’ve grown friendships with each and every person in our community, and now I couldn't imagine myself without them, as I couldn't imagine myself without Drupal.

3. What impact has Drupal made on you? Is there a particular moment you remember?

Putting it simply: Drupal changed my life! Drupal brought justification to my values and aspirations. I honestly couldn’t have imagined, in a world that is more and more inclined to monopolistic visions, being able to exercise and contribute to the Free Software community and make a living out of it.

The particular moment I first felt this most strongly was around 2011, when some decision-makers from one of these large corporations asked me if I could bring my Drupal presentation to them, because they wanted to know what this Drupal thing was all about. So I organized a few of my usual slides and took them with me.

This was in a very fancy villa in one of the most expensive areas near Lisbon. I did my pitch, and by the end they seemed very impressed with what Drupal had to offer for free: so many powerful features, so much commitment. Naturally, one of their questions was how they could make their proprietary software, which had started on a downward curve, embark on this positive wave of growth. My obvious answer was “release your code as open source”. They looked at me in disbelief, of course, and still invited me for a boat ride, which I declined politely.

I went back home and from time to time thought about that episode, until it started to look like a mirage in the past. To my surprise, in recent years, that same corporation has started releasing open source code, created community projects, and apparently changed its mind…

4. How do you explain what Drupal is to other, non-Drupal people?

Drupal lets you turn big ideas into digital realities. It is an innovative web platform for creating engaging digital websites and experiences, and the world's most popular enterprise-class web content management system. It’s developed by more than 46,000 people who are part of the 1.3 million users registered on drupal.org.

Last year we had about 1,000 companies making 8,000 code contributions, and this is reflected in millions of websites, a 12% market share, and 51% annual growth. If these people still had some more time, I would present them the Drupal Pitch Deck. :)

5. How did you see Drupal evolving over the years? What do you think the future will bring?

From my perspective, Drupal has always been growing, and even making positive bonds with other Free Software initiatives out there. One of the most interesting examples happened last year at Drupal Europe 2018 (11-14 Sept), where the founders of RocketChat and Nextcloud met and ended up announcing a partnership on the 17th of September…

We should follow that example and support more interaction and collaboration with other projects in our ecosystem. For starters, we should make an effort to use tools like RocketChat (see https://drupalchat.me) and grow awareness that companies like Slack have zero, or even less, to do with our values; we gain nothing by crossing our arms and letting people be driven there. The future is open, the future is community and inclusion.

6. What are some of the contributions to open source code or to the community that you are most proud of?

For sure, the first one is the ongoing effort I put into the Portuguese Drupal Association to keep people motivated, things organized, and events happening. The highlight of this was DrupalDevDays Lisbon 2018. The second one is DrupalCI, which had a major impact on Drupal 8’s final release.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

I’m most excited about containers and the power behind them. That is only possible because there is the GNU/Linux operating system supporting them. Kubernetes in particular is also of interest, since it follows the reasoning of auto-scalability that we need for distributed systems. Drupal is flying to the future already with its headless/decoupled capabilities. I’m seeing containers even being applied to support machine learning algorithms and neural networks.

Another thing that I’m particularly interested in is investigating better ways to make communities grow and ensure that they have the necessary tools to make that happen.  

My personal endeavor is, in the end, to see my kids grow in a healthy environment, rich in possibilities, and for that I need to keep information available for them and help the Free Software ecosystem stay alive. After all, what else is there that can guarantee our future human independence from “blackboxed” technology? If you can’t see, study or change the source, what role is left for you? 

 Drupal DevDays Lisbon 2018

Aug 08 2019
Aug 08

Back in early 2010, Jason Grigsby pointed out that simply setting a percentage width on images was not enough, and that you needed to resize these images as well for a better user experience. He showed that if you served the right-sized images on the original responsive demo site, more than 75% of the weight of those images could be shaved off on smaller screens.

Ever since, the debate on responsive images has revolved around the best solution for rendering perfect, responsive images without any hassle.

We all know Drupal 7 does a great job of handling responsive images with its contributed modules. With Drupal 8, things are even better!

Responsive Images in Drupal 8

The Responsive Image module in Drupal 8 provides an image formatter that maps image styles to breakpoints in order to render a flawless responsive image using a picture tag.

When we compare how Drupal 8 handles responsive images with Drupal 7, some features to note are:

Drupal 7 relies on the contributed Picture module, which in its latest version is known as Responsive Images.
In Drupal 8, the Responsive Image and Breakpoint modules are part of core.

The Problem

One of the major problems with images in web development is that the browser knows nothing about an image, or what size it will render at in relation to the viewport of different screens, until the CSS and JavaScript are loaded.

However, the browser does know about the environment in which the images render, including the size of the viewport and the resolution of the screen.

The Solution 

As we mentioned in previous sections, responsive images use the picture element, whose sizes and srcset attributes play a major role in letting the browser choose the best image based on the image style selections.

So Drupal 8 has done a great job in providing the Responsive Image module in core. The browser will download smaller images for devices with lower screen resolutions, resulting in better website load times and improved performance.
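As a rough sketch of the markup this produces (the file paths and style names below are invented for illustration), the formatter renders a picture element with one source per breakpoint:

<picture>
  <source media="all and (min-width: 1024px)" srcset="/sites/default/files/styles/large/public/photo.jpg 1x" type="image/jpeg">
  <source media="all and (min-width: 768px)" srcset="/sites/default/files/styles/medium/public/photo.jpg 1x" type="image/jpeg">
  <img src="/sites/default/files/styles/small/public/photo.jpg" alt="Example photo">
</picture>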

Steps to reproduce

  1. Enable the Responsive Image and Breakpoint modules.
  2. Set up the breakpoints for your project's theme.
  3. Set up the image styles for responsive images.
  4. Create a responsive image style for your theme.
  5. Assign the responsive image style to an image field.

Enable the Responsive Image and Breakpoint modules

Since they are part of Drupal 8 core, no extra modules are required. All you have to do is enable the Responsive Image module; the Breakpoint module will already be installed with the standard profile. If not, enable the Breakpoint module as well.

To enable the modules, go to admin -> Extend, select the modules, and install them.

extend page

Set up the breakpoints for your project's theme
 

breakpoints

Setting up the theme’s breakpoints is the most important part of making your site responsive.


If you are using a core theme like Bartik, Seven, Umami, or Claro, you will already have a breakpoints file and you don’t have to create a new one.

However, if you are using a custom theme for your project, it is important that you define the breakpoints in "yourthemename.breakpoints.yml", which lives in your theme directory, usually "/themes/custom/yourthemename".

Each breakpoint assigns images to a media query. For example, images rendered on mobile might be smaller, i.e. have a width of less than 768px, while medium screens will have a width between 768px and 1024px.


Each breakpoint will have:

label: a valid label given to the breakpoint.
mediaQuery: the viewport within which the images are rendered.
weight: the order of display.
multipliers: a measure of the viewport’s device resolution; normally 1x is used for standard sizes and 2x for retina displays.
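A minimal sketch of such a file, assuming a custom theme machine name of yourthemename and the viewport ranges described above, could look like this:

yourthemename.mobile:
  label: mobile
  mediaQuery: ''
  weight: 0
  multipliers:
    - 1x
yourthemename.medium:
  label: medium
  mediaQuery: 'all and (min-width: 768px) and (max-width: 1024px)'
  weight: 1
  multipliers:
    - 1x
yourthemename.large:
  label: large
  mediaQuery: 'all and (min-width: 1024px)'
  weight: 2
  multipliers:
    - 1x
    - 2x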

Setting up the image styles for responsive images

Now we will have to create an image style for each of the breakpoints. You can configure your own Drupal 8 image styles at admin->config->media->image-styles. 

Click ‘Add image style’. Give a valid name to your image style and use the scale-and-crop effect, which will provide cropped images. If the images appear stretched, add multiple image styles for different viewports.

add image style

Creating a responsive image style for your theme 

This is where you provide the multiple image style options to the browser and let the browser choose the best out of the lot. 

responsive image style


To create a new responsive Drupal 8 image style, navigate to:
Home -> admin -> config -> media -> responsive-image-style and click ‘Add responsive image style’.

Give a valid name to your responsive image style, select the breakpoint group (choose your theme), and assign the image styles to the breakpoints listed.

There are multiple options for the image style configuration:

  • Choose single image style: select the single image style that will be rendered on the particular screen.
  • Choose multiple image styles: select multiple image styles and also specify the viewport width for each image style.

Lastly, there is an option to select a fallback image style. The fallback image style should only appear on the site if an error occurs.

fallback responsive image

Assigning the responsive image style to an image field 

  • Once all the configuration is done, move to the image field to assign the responsive image style.
  • To do that, go to the field’s manage display and select the responsive image style we created.
  • Add content and see the results on the page with the responsive image style.

Final Results 

responsive image style to an image field

The image at a minimum width of 1024px (for large devices).

minimum width of 1024px

The image at a minimum width of 768px (for medium devices).

Responsive image style

The image at a maximum width of 767px (for small devices).

Aug 08 2019
Aug 08

With Dries’ latest announcement on the launch of Drupal 9 in 2020, enterprises are in urgent need of a plan to upgrade from Drupal 7 and 8 to version 9.

Drupal 7 and 8 will reach their end of life in November 2021, and those who wish to stick to previous versions might face security challenges.

Eager but unsure what the process would be like? This comprehensive guide aims to simplify the entire Drupal migration process for easy implementation.

Getting Started with the Migration Process

When a site is upgraded to Drupal 7, the old database is upgraded in place to the Drupal 7 structure. However, a different approach is followed when the site is upgraded from Drupal 7 to Drupal 8: content and configuration are migrated into a fresh Drupal 8 installation.

Upgrading D7 to D8

Step 1: Take a backup of your website

Start the migration process by making a local copy of your website. As making changes to the live site is not recommended, it is best practice to keep all data safe by taking a backup locally on your machine.

Step 2: Install fresh new site

Install a new Drupal 8 site by downloading the latest version of Drupal 8 from drupal.org.

Drupal 8.7 is the latest release.

Install the latest release of Drupal 8, along with its dependencies, using Composer.
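One common way to do this at the time of writing is the community Composer project template (my_site below is just an example directory name):

composer create-project drupal-composer/drupal-project:8.x-dev my_site --no-interaction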

Step 3: Prepare your Drupal 8 website for the migration

Set up a local Drupal 8 website on your machine as the destination website for the migration process.

Step 4: Verify the required modules are in core and enabled

Ensure the Migrate, Migrate Drupal, and Migrate Drupal UI modules are enabled on your Drupal 8 site. This can be done by navigating to the ‘Extend’ page of your website and confirming all of the above modules are present in core.

Now, check the three modules and click the ‘Install’ button at the bottom of the page.


Step 5: Upgrade your website

Go to your website and append /upgrade to the address (www.<yourwebsitename>.com/upgrade) and follow the instructions. Then click the ‘Continue’ button.


Step 6: Enter website details

On clicking ‘Continue’, a screen comes up asking for the source website’s database credentials, location, and other details.


Step 7: Start the migration

If the credentials to your source database are correct, the upgrade review page will appear in the Migrate UI. It will show a summary of the upgrade status for all installed modules on the old site.

As a site builder, you should carefully review the modules that will not be upgraded and evaluate whether your Drupal 8 site will work without them.

Then click the ‘Perform Upgrade’ button.

Tip: Don’t proceed with the actual upgrade without first installing any missing Drupal 8 modules.

Tip: If you get ID conflict warnings

If you manually create a node on the Drupal 8 site before upgrading and the source Drupal 6/7 site has a node with the same ID, the migration system will overwrite the node that was manually created in Drupal 8.

If conflicting IDs are detected, a warning will be shown. You can ignore it and risk losing data, or abort and take an alternative approach.

Depending on the size and types of content/configuration on the source site, the upgrade may take a very long time. Once the process is finished, you are directed to the site's front page with messages summarizing the results.

Upgrading D8 to D9

When it comes to migrating from Drupal 8 to Drupal 9, the process is much simpler. As D9 is an extension of D8, it is much easier to upgrade. Read the complete guide to the Drupal 8 to Drupal 9 upgrade to understand the whole process.

Alternate Method: Migration using Drush Command

Upgrading to Drupal 8 using Drush is useful when migrating complex sites, as it allows you to run migrations one by one and to roll them back.

If you are using Composer to build your Drupal 8 site, then you may already have Drush installed. If not, you can install Drush from the command line as follows:

composer require drush/drush

To migrate using Drush, you need to download and enable the contributed modules Migrate Upgrade, Migrate Plus, and Migrate Tools.
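If your site is Composer-managed, all three can be added in one step using their drupal.org project names:

composer require drupal/migrate_upgrade drupal/migrate_plus drupal/migrate_tools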

Ensure Drush is up to date (with the command drush --version).

Now it’s time to start the migration through Drush with the following command:

drush migrate-upgrade --legacy-db-url=mysql://user:password@server/db --legacy-root=https://example.com --configure-only

where the following values in the above command should be replaced with your own:

  • ‘user’ is the username of the source database
  • ‘password’ is the source database user’s password
  • ‘server’ is the source database server
  • ‘db’ is the source database

Now check your migration status (with the command drush migrate-status).

Import the data with the command drush migrate-import --all.

After a successful migration, go to Structure -> Migrations to check the status of each migration.


Click the ‘List migrations’ button next to the migration group ‘import from drupal 7’ to view all of the migrated data.

 


 

After clicking it, all the upgraded data will be visible. Click the ‘Execute’ button and the data will be imported.


Once you click the ‘Execute’ button, you will be redirected to a page with the available operations.


The Import operation imports all previously unprocessed records from the source into destination Drupal objects.

With this, we come to the end of our Drupal migration process. If the above steps are followed carefully, a website can be easily migrated to the latest version.

Srijan has more than 35 Acquia certified Drupal experts with expertise in migrating projects to newer versions of Drupal. Contact us to seamlessly get started with the latest Drupal version.

 

Aug 08 2019
Aug 08

We have already covered two of many ways to migrate images into Drupal. One example allows you to set the image subfields manually. The other example uses a process plugin that accomplishes the same result using plugin configuration options. Although valid ways to migrate images, these approaches have an important limitation. The files and images are not removed from the system upon rollback. In the previous blog post, we talked further about this topic. Today, we are going to perform an image migration that will clean up after itself when it is rolled back. Note that in Drupal images are a special case of files. Even though the example will migrate images, the same approach can be used to import any type of file. This migration will also serve as the basis for explaining migration dependencies in the next blog post.

Code snippet for file entity migration

File entity migrate destination

All the examples so far have been about creating nodes. The migrate API is a full ETL framework able to write to different destinations. In the case of Drupal, the target can be other content entities like files, users, taxonomy terms, comments, etc. Writing to content entities is straightforward. For example, to migrate into files, the destination section is configured like this:

destination:
  plugin: 'entity:file'

You use a plugin whose name is entity: followed by the machine name of your target entity. Other possible values that could be used are user, taxonomy_term, and comment. Remember that each migration definition file can only write to one destination.

Source section definition

The source of a migration is independent of its destination. The following code snippet shows the source definition for the image migration example:

source:
  constants:
    SOURCE_DOMAIN: 'https://agaric.coop'
    DRUPAL_FILE_DIRECTORY: 'public://portrait/'
  plugin: embedded_data
  data_rows:
    - photo_id: 'P01'
      photo_url: 'sites/default/files/2018-12/micky-cropped.jpg'
    - photo_id: 'P02'
      photo_url: ''
    - photo_id: 'P03'
      photo_url: 'sites/default/files/pictures/picture-94-1480090110.jpg'
    - photo_id: 'P04'
      photo_url: 'sites/default/files/2019-01/clayton-profile-medium.jpeg'
  ids:
    photo_id:
      type: string

Note that the source contains relative paths to the images. Eventually, we will need an absolute path to them. Therefore, the SOURCE_DOMAIN constant is created to assemble the absolute path in the process pipeline. Also, note that one of the rows contains an empty photo_url. No file can be created without a proper URL. In the process section, we will accommodate this. An alternative could be to filter out invalid data in a source clean-up operation before executing the migration.

Another important thing to note is that the row identifier photo_id is of type string. You need to explicitly tell the system the name and type of the identifiers you want to use. The configuration for this varies slightly from one source plugin to another. For the embedded_data plugin, you do it using the ids configuration key. It is possible to have more than one source column as an identifier, for example, if the combination of two columns (e.g. name and date of birth) is required to uniquely identify each element (e.g. a person) in the source.
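For illustration, a hypothetical source keyed by those two columns would declare its identifiers like this (the column names are made up):

ids:
  name:
    type: string
  date_of_birth:
    type: string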

You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is ‘UD migration dependencies introduction’, whose machine name is ud_migrations_dependencies_intro. The migration to run is udm_dependencies_intro_image. Refer to this article to learn where the module should be placed.

Process section definition

The fields to map in the process section will depend on the target. For files and images, only one entity property is required: uri. It has to be set as a reference to the file using stream wrappers. In this example, the public stream (public://) is used to store the images in a location that is publicly accessible by any visitor to the site. If the file were already in the system and we knew its path, the whole process section for this migration could be reduced to two lines:

process:
  uri: source_column_file_uri

That is rarely the case though. Fortunately, there are many process plugins that allow you to transform the available data. When combined with constants and pseudofields, you can come up with creative solutions to produce the format expected by your destination.

Skipping invalid records

The source for this migration contains one record that lacks the URL to the photo. No image can be imported without a valid path. Let’s accommodate this. In the same step, a pseudofield will be created to extract the name of the file out of its path.

psf_destination_filename:
  - plugin: callback
    callable: basename
    source: photo_url
  - plugin: skip_on_empty
    method: row
    message: 'Cannot import empty image filename.'

The psf_destination_filename pseudofield uses the callback plugin to derive the filename from the relative path to the image. This is accomplished using the basename PHP function. Also, taking advantage of plugin chaining, the system is instructed to skip processing the row if no filename could be obtained, for example, because an empty source value was provided. This is done by the skip_on_empty plugin, which is also configured to log a message indicating what happened. In this case, the message is hardcoded. You can make it dynamic to include the ID of the row that was skipped using other process plugins. This is left as an exercise to the curious reader. Feel free to share your answer in the comments below.

Tip: To read the messages log during any migration, execute the following Drush command: drush migrate:messages [migration-id].

Creating the destination URI

The next step is to create the location where the file is going to be saved in the system. For this, the psf_destination_full_path pseudofield is used to concatenate the value of a constant defined in the source and the filename obtained in the previous step. As explained before, order is important when using pseudofields as part of the migrate process pipeline. The following snippet shows how to do it:

psf_destination_full_path:
  - plugin: concat
    source:
      - constants/DRUPAL_FILE_DIRECTORY
      - '@psf_destination_filename'
  - plugin: urlencode

The end result of this operation would be something like public://portrait/micky-cropped.jpg. The URI specifies that the image should be stored inside a portrait subdirectory inside Drupal’s public file system. Copying files to specific subdirectories is not required, but it helps with file organization. Also, some hosting providers might impose limitations on the number of files per directory. Specifying subdirectories for your file migrations is a recommended practice.

Also note that after the URI is created, it gets encoded using the urlencode plugin. This will replace special characters with an equivalent string literal. For example, é and ç will be converted to %C3%A9 and %C3%A7 respectively. Space characters will be changed to %20. The end result is an equivalent URI that can be used inside Drupal, as part of an email, or via another medium. Always encode any URI when working with Drupal migrations.

Creating the source URI

The next step is to assemble an absolute path for the source image. For this, you concatenate the domain stored in a source constant and the relative image path stored in a source column. The following snippet shows how to do it:

psf_source_image_path:
  - plugin: concat
    delimiter: '/'
    source:
      - constants/SOURCE_DOMAIN
      - photo_url
  - plugin: urlencode

The end result of this operation will be something like https://agaric.coop/sites/default/files/2018-12/micky-cropped.jpg. Note that the concat and urlencode plugins are used just like in the previous step. A subtle difference is that a delimiter is specified in the concatenation step. This is because, contrary to the DRUPAL_FILE_DIRECTORY constant, the SOURCE_DOMAIN constant does not end with a slash (/). This was done intentionally to highlight two things. First, it is important to understand your source data. Second, you can transform it as needed by using various process plugins.

Copying the image file to Drupal

Only two tasks remain to complete this image migration: download the image and assign the uri property of the file entity. Luckily, both steps can be accomplished at the same time using the file_copy plugin. The following snippet shows how to do it:

uri:
  plugin: file_copy
  source:
    - '@psf_source_image_path'
    - '@psf_destination_full_path'
  file_exists: 'rename'
  move: FALSE

The source configuration of the file_copy plugin expects an array of two values: the URI to copy the file from and the URI to copy the file to. Optionally, you can specify what happens if a file with the same name exists in the destination directory. In this case, we are instructing the system to rename the file to prevent name clashes. The way this is done is by appending the string _X to the filename, before the file extension. The X is a number starting with zero (0) that keeps incrementing until the filename is unique. The move flag is also optional. If set to TRUE, it tells the system that the file should be moved instead of copied. As you can guess, Drupal does not have access to the file system on the remote server. The configuration option is shown for completeness, but it does not have any effect in this example.

In addition to downloading the image and placing it inside Drupal’s file system, the file_copy plugin also returns the destination URI. That is why this plugin can be used to assign the uri destination property. And that’s it: you have successfully imported images into Drupal! Clever use of the process pipeline, isn’t it? ;-)
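For reference, here is the complete process section of this migration, assembled from the snippets above in execution order:

process:
  psf_destination_filename:
    - plugin: callback
      callable: basename
      source: photo_url
    - plugin: skip_on_empty
      method: row
      message: 'Cannot import empty image filename.'
  psf_destination_full_path:
    - plugin: concat
      source:
        - constants/DRUPAL_FILE_DIRECTORY
        - '@psf_destination_filename'
    - plugin: urlencode
  psf_source_image_path:
    - plugin: concat
      delimiter: '/'
      source:
        - constants/SOURCE_DOMAIN
        - photo_url
    - plugin: urlencode
  uri:
    plugin: file_copy
    source:
      - '@psf_source_image_path'
      - '@psf_destination_full_path'
    file_exists: 'rename'
    move: FALSE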

One important thing to note is that an image’s alternative text, title, width, and height are not associated with the file entity. That information is actually stored in a field of type image. This will be illustrated in the next article. To reiterate, the same approach used to migrate images can be used to migrate any file type.

Technical note: The file entity contains other properties you can write to. For a list of available options check the baseFieldDefinitions() method of the File class defining the entity. Note that more properties can be available up in the class hierarchy. Also, this entity does not have multiple bundles like the node entity does.

What did you learn in today’s blog post? Had you created file migrations before? If so, had you followed a different approach? Did you know that you can do complex data transformations using process plugins? Did you know you can skip the processing of a row if the required data is not available? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 07 2019
Aug 07

Today is the start of our membership campaign, which runs through August 31. This month, we've asked members to share why they are part of the Association. We hope you find some inspiration from your fellow community members and that you'll join and share:

Join membership

Share the campaign

Jiang Zhan's photo holding sign saying why they are a member
"I believe in Drupal's future and I love the opportunity that Drupal provides to people around the world." ~ Jiang Zhan

Leslie's photo holding sign saying why they are a member
"The Drupal Association helps to support the Drupal community.

The Drupal community is the reason most folks start using Drupal and never leave.
Attend a local Drupal camp, meetup, or contribution day to experience how welcoming + helpful the community is." ~ Leslie Glynn

Wil's photo with sign saying why they are a member
"I love developing for…
Working with…
Problem solving with…
Enhancing…
Drupal!" ~ Wil Roboly

Erica's photo with sign saying why they are a member
"I support the people, time, and commitments made by those who build and grow Drupal." ~ Erica Reed

Imre holding siqn with why they are a member
"I believe in socially sustainable ecosystems and Drupal is just that. It's very powerful to have local communities supporting Drupal, with the Drupal Association empowering global communication and a unified message. This must endure!" ~ Imre Gmelig Meijling

Jaideep's photo holding sign saying why they are a member
"I feel this will help others the way Drupal has helped me pursue my career and open up possibilities of learning and exploring.

This is another way of giving back to Drupal." ~ Jaideep Singh Kandari

Help grow our membership by joining to support the global Drupal Association or by sharing this campaign. We're here because we love Drupal and the opportunities that come from being part of this open source project. Thanks for being here and for all that you contribute!

Aug 07 2019
Aug 07

We now know the list of candidates in the election for the next At-Large Drupal Association Board member. Voting is open until 16 August 2019, and everyone who has used their Drupal.org account in the last year is eligible to vote. You do not need to have a Drupal Association membership.

Cast your vote

Ryan speaking to Drupal Camp London about prioritizing giving back in Drupal. - photo Paul Johnson

To help everyone understand how an At-Large Director serves the Drupal community, we asked Ryan Szrama, currently serving as one of two community-elected board members, to share about his experience in the role.

I was elected to the board in 2017. While it would be my first time serving on a non-profit board, my personal and professional goals were closely aligned with the mission of the Drupal Association. I was happy to serve in whatever capacity would best advance that mission: uniting a global open source community to build, secure, and promote Drupal.

You'd been a contributor to the project for over 10 years by that point. How do you think your experience in the community shaped your time on the board?

Indeed! I'd been developing Ubercart since 2006 and Drupal Commerce since 2010. I had more Drupal development experience than most folks, helping me understand just how important the Association's leadership of DrupalCon and Drupal.org are to our development efforts. The year before joining the board, I also transitioned into a business leadership role, acquiring Commerce Guys in its split from Platform.sh. This exposed me to a whole new set of concerns related to financial management, marketing, sponsorship, etc. that Drupal company owners encounter in their business with the Association.

While the Association is led day-to-day by an Executive Director (Megan Sanicki when I joined, now Heather Rocker), the board is responsible for setting the strategic vision of the organization and managing its affairs through various committees. Most of our board meetings involve reports on things like finances, business performance, staff and recruitment, etc., but we often have strategic topics to discuss - especially at our two annual retreats. My experience as a long-time contributor turned business owner gave me confidence to speak to such topics as Drupal.org improvements, DrupalCon programming, community governance, and more.

What's the Association up to lately? Are there any key initiatives you expect an incoming board member may be able to join?

There's so much going on that it's hard to know where to focus my answer!

The Drupal Association has recently seen a lot of changes that impact the business side of the organization itself. We welcomed a new Executive Director and are beginning to see the impact of her leadership. We've revised our financial reporting, allowing us to more clearly monitor our financial health, and we've streamlined sales, immediately helping us sell sponsorships for DrupalCon Minneapolis more effectively. Additionally, we've launched and continue to develop the Drupal Steward product in partnership with the Security team. This is part of a very important initiative to diversify our revenue for the sake of the Drupal Association's long term health.

From a community growth standpoint, we continue to follow-up on various governance related tasks and initiatives. We're working with Drupal leaders around the world to help launch and support more local Drupal associations. We continue to refine the DrupalCon program and planning process to attract a more diverse audience and create opportunities for a more diverse group of contributors and speakers. The retooling of Drupal.org to integrate GitLab for projects is moving forward, lowering the barrier to entry for new project contributors.

Any one of these activities would benefit from additional consideration, insight, and participation by our new board member.

What’s the best part about being on the board?

One of the best things for me is getting to see firsthand just how much the staff at the Drupal Association do to help the community, maintaining the infrastructure that enables our daily, international collaboration. I've long known folks on the development team like Neil Drumm, but I didn't realize just how much time he and other team members have devoted to empowering the entire community until I got a closer look.

I'm also continually inspired by the other board members. Our interactions have helped expand my understanding of Drupal's impact and potential. I've obviously been a fan of the project for many years now, but my time on the board has given me even greater motivation to keep contributing and ensure more people can for years to come.

Thank you, Ryan! Hopefully folks reading this are similarly inspired to keep contributing to Drupal. Our current election is one way everyone can play a part, so I'd encourage you to head over to the nominations page, read up on the motivations and interests of our nominees, ask questions while you can, and…

Cast your vote!

Aug 07 2019
Aug 07

Decoupled Drupal 8 and GatsbyJS webinar

How did the City of Sandy Springs, GA improve information system efficiency with a unified platform? Our webinar shows how we built this city on decoupled Drupal 8, GatsbyJS, and Netlify.

We explore how a “build-your-own” software approach gives Sandy Springs the formula for faster site speed and the ability to publish messages across multiple content channels — including new digital signage.

What You'll Learn

  • The City of Sandy Springs’ challenges and goals before adopting Drupal 8 

  • How Sandy Springs manages multi-channel publishing across the website, social media, and a network of digital signage devices.

  • Benefits gained from Drupal 8 and GatsbyJS, including a fast, reliable site, lower hosting costs, and ease of development for their team.

Webinar Recording and Slides 

We Built This City (On Drupal 8) from Mediacurrent

Speakers

Jason Green, Visual Communications Manager at City of Sandy Springs, and Mediacurrent Senior Director of Front End Development Zack Hawkins share an inside look at the project.

Aug 07 2019
Aug 07

You've probably heard recently that Drupal 9 is coming. Drupal 8.7 was released in May and Drupal 8.8 is planned for December 2019. At the same time, D9 is becoming a hotly discussed topic in the Drupal world. 

Drupal 9’s arrival perfectly fits the Game of Thrones quote: “Brace yourself, winter is coming.” But do you need to brace for D9? It is promised to arrive easily and smoothly. Still, some important preparations are needed. Let’s review them in this post.

Drupal 9 is coming in June 2020

The year of the D9 release became known back in September 2018. Drupal creator Dries Buytaert announced it at Drupal Europe in Darmstadt. Later on, in December, the exact date arrived — Drupal 9 is coming on June 3, 2020!

What will happen to Drupal 7 and Drupal 8? Both D7 and D8 will reach their end-of-life in November 2021. This means the end of official support and no more updates in the functional and security areas. Some companies will come up with extended commercial support, but it’s far better to keep up with the times and upgrade. All the development ideas and innovations will be focused on “the great nine.”

The Drupal creator explained the planned release and end-of-life dates. In a nutshell, D8’s major dependency is the Symfony 3 framework that is reaching end-of-life in November 2021. Drupal 9 will ship with Symfony 4/5. So the Drupal team has to end-of-life Drupal 8 at that time, but they want to give website owners and developers enough time to prepare for Drupal 9 — hence the June 2020 release decision. 

According to this timing, you need to be on Drupal 9 by November 2021. In the meantime, you should prepare.

Preparations for the coming Drupal 9

1. How to prepare for Drupal 9 if you are on Drupal 8

Hearing that Drupal 9 is coming, many D8 website owners may say “Hey, we just had an epic upgrade from Drupal 7 to Drupal 8, and here we go again!”.

Keep calm — everything is on the right track. Your upgrade from Drupal 8 to Drupal 9 should be instantaneous. D9 will look like the latest version of D8, but without deprecated code and with third-party dependencies updated (Symfony 4/5, Twig 2, and so on).

Dries Buytaert's quote: we are building Drupal 9 in Drupal 8

There are two rules of thumb regarding the Drupal 9 preparations:

1) Using the latest versions of everything

To have a quick upgrade from Drupal 8 to Drupal 9, you need to stick to the newest versions of the core, modules, and themes. According to Gábor Hojtsy, Drupal Initiative Coordinator, you are gradually becoming a D9 user by keeping your D8 website up-to-date.

Gabor Hojtsy's quote: you become a Drupal 9 user by keeping up to date with Drupal 8.

“The great eight” has adopted a continuous innovation model, which means a new minor version every half a year. Our Drupal team is ready to help you with regular and smooth updates.

2) Getting rid of deprecated code

It is also necessary to keep your website clean of deprecated code. Deprecated code means APIs and functions that have newer alternatives and are marked as deprecated, or obsolete.
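A well-known core example is the general-purpose entity manager, deprecated since Drupal 8.0.0 in favor of more specific services:

// Deprecated API, removed in Drupal 9:
$storage = \Drupal::entityManager()->getStorage('node');

// Replacement that works in both Drupal 8 and Drupal 9:
$storage = \Drupal::entityTypeManager()->getStorage('node');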

Any module that does not use deprecated code will just continue working in Drupal 9, Dries said.

Dries Buytaert's quote: without deprecated code websites will be ready for Drupal 9

How can you find deprecated code? Here are a few tools that check everything, including custom modules:

  • The command-line tool Drupal Check, which checks your code for deprecations (see the example after this list)
  • The Upgrade Status contributed module, which offers a graphical interface to check your modules and themes and get a summary
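For example, Drupal Check can be installed with Composer and pointed at a directory to scan (the module path below is just an example):

composer require mglaman/drupal-check --dev
vendor/bin/drupal-check web/modules/custom/my_module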

Many deprecations are very easy to replace. You can always rely on our development team for a thorough check and cleanup of deprecations.

2. How to prepare for Drupal 9 if you are on Drupal 7

The best way to prepare for Drupal 9 is to upgrade to Drupal 8 now. Even if this might sound like a marketing mantra to you, it has very practical grounds.

There are plenty of reasons to upgrade and no reason to skip Drupal 8. These are words from Dries Buytaert's presentation:

Dries Buytaert's presentation: there are many reasons to upgrade to Drupal 8 now

You will enjoy a wealth of Drupal 8’s business benefits in all the time before 2021. And when Drupal 9 arrives, you will just click and move ahead to it!

Gabor Hojtsy's quote: skipping Drupal 8 does not bring benefits

Don’t worry: despite the immense difference between D7 and D8, the 7-to-8 upgrades are getting easier every day. Developers have studied the D7-to-D8 upgrade path well. In addition, very helpful migration modules have recently reached stability in the D8 core.

Your upgrade to Drupal 8 will depend on your website’s custom functionality and overall complexity. In any case, our Drupal developers will take care of making it smooth. 

So make up your mind and upgrade now — welcome to the innovative path that will lead you further to “the great 9.”

Plan for Drupal 9 with us!

Yes, Drupal 9 is coming. No matter which version of Drupal you are using now, we can help you make the right Drupal 9 preparation plan — and fulfill it, of course. Just contact our Drupal experts!

Aug 07 2019
Aug 07

Website Refresh: The Only Thing Missing is a Purring Sound

The Animal Humane Society (AHS), in Minneapolis, Minnesota is the leading animal welfare organization in the Upper Midwest, helping 25,000 dogs, cats and critters in need find loving homes each year, while providing a vast array of services to the community, from low-cost spay and neuter services to dog training to rescuing animals from neglectful and abusive situations. 

TEN7 has been working with AHS since 2008, making piecemeal updates to their website and finding creative solutions for desired changes with a limited budget. In 2016, the Animal Humane Society wanted to reimagine the animalhumanesociety.org website as not just an adoption source, but a resource, an authority, and an advocate for all things related to companion animals and the community that loves them. 

One of the main goals was to include even more information to support pet owners and animal lovers, including more photos, videos and shareable content. Other goals were to integrate the separate Kindest Cut website (a low-cost spay and neuter clinic) into the main site, and improve functionality of the Lost and Found bulletin boards.

“We wanted the user experience on the site to match the user experience when people come to the shelter. That it would be colorful and emotional and warm and inviting, and that it would give people that same wonderful feeling that they have when they walk in the door at the [shelter] and see the puppies and kittens.”—Paul Sorensen, Director of Brand and Communications, Animal Humane Society

To give AHS the increased functionality they desired (like the enhanced image and video capabilities), we embarked on building a complex Drupal 8 site from scratch. It was more than just a one-and-done update, however. Over a nine-year period, the site had evolved from a manually-updated custom CMS to a new Drupal 5 installation, and later Drupal 6. Additional functionality and one-off customizations to the codebase had created a great deal of technical debt, making the site difficult to maintain and support. 

Drupal 8 functionality allowed us to scrap some custom code, while in other cases we were able to replace custom code with contributed modules developed by the Drupal community. 

Integration with PetPoint (the animal information database) under Drupal 6 was challenging, requiring custom code from beginning to end. We were able to use Drupal 8’s built-in functionality to talk to PetPoint in a more standards-based way, which meant far less custom code.

As we were making these updates, we also followed best practices and implemented coding standards for the new site, which reduced the amount of technical debt created.

We launched the site in the summer of 2017, and although there were some hiccups, results were immediate: people LOVED the bold photos, video and shareable content. As a result of the site update, more Minnesotans are:

  • Visiting the website and staying longer. Traffic is up 8.5% from the previous year, and the average visit is over four minutes, up 8.6% from the previous year
  • Viewing animal profiles, with nearly 4 million views, leading to 10,751 animal adoptions
  • Sharing and responding to AHS content on social media, with double and triple-digit traffic increases on Twitter, Instagram, LinkedIn and Reddit
  • Donating online, with donations driven by site content up 18.2% from the previous year

We continue to support and collaborate with the Animal Humane Society, adding more functionality we couldn’t squeeze in during the big update, like setting up visitor accounts with the ability to “favorite” animals. And we still have to figure out how to make the site purr.

Aug 07 2019
Aug 07

We have presented several examples as part of this migration blog post series. They started very simple and have been increasing in complexity. Until now, we have been rather optimistic. Get the sample code, install any module dependency, enable the module that defines the migration, and execute it assuming everything works on the first try. But Drupal migrations often involve a bit of trial and error. At the very least, it is an iterative process. Today we are going to talk about what happens after import and rollback operations, how to recover from a failed migration, and some tips for writing definition files.

List of drush commands used in drupal migration workflows

Importing and rolling back migrations

When working on a migration project, it is common to write many migration definition files. Even if you were to have only one, it is very likely that your destination will require many field mappings. Running an import operation to get the data into Drupal is the first step. With so many moving parts, it is easy not to get the expected results on the first try. When that happens, you can run a rollback operation. This instructs the system to revert anything that was introduced when the migration was initially imported. After rolling back, you can make changes to the migration definition file and rebuild Drupal’s cache for the system to pick up your changes. Finally, you can run another import operation. Repeat this process until you get the results you expect. The following code snippet shows a basic Drupal migration workflow:

# 1) Run the migration.
$ drush migrate:import udm_subfields

# 2) Rollback migration because the expected results were not obtained.
$ drush migrate:rollback udm_subfields

# 3) Change the migration definition file.

# 4) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 5) Run the migration again
$ drush migrate:import udm_subfields

The example above assumes you are using Drush to run the migration commands, specifically the commands provided by Migrate Run or Migrate Tools. You pick one or the other, but not both, as the two modules provide the same commands. If you have both enabled, they will conflict with each other and fail.
Another thing to note is that the example uses Drush 9. There were major refactorings between versions 8 and 9, which included changes to the names of the commands. Finally, udm_subfields is the ID of the migration to run. You can find the full code in this article.
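For example, to go with Migrate Tools (only one of the two, never both), the setup could look like this:

# Install and enable Migrate Tools, which provides the migrate:* commands.
$ composer require drupal/migrate_tools
$ drush pm:enable migrate_tools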

Tip: You can use Drush command aliases to write shorter commands. Type drush [command-name] --help for a list of the available aliases.
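For instance, with Migrate Tools under Drush 9, the following pairs of commands are equivalent:

# Check migration status.
$ drush migrate:status udm_subfields
$ drush ms udm_subfields

# Import.
$ drush migrate:import udm_subfields
$ drush mim udm_subfields

# Rollback.
$ drush migrate:rollback udm_subfields
$ drush mr udm_subfields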

Technical note: To pick up changes to the definition file, you need to rebuild Drupal’s caches. This is the procedure to follow when creating the YAML files using Migrate API core features and placing them under the migrations directory. It is also possible to define migrations as configuration entities using the Migrate Plus module. In those cases, the YAML files follow a different naming convention and are placed under the config/install directory. For picking up changes, in this case, you need to sync the YAML definition using configuration management workflows. This will be covered in a future entry.

Stopping and resetting migrations

Sometimes, you do not get the expected results due to an oversight in setting a value. On other occasions, fatal PHP errors can occur when running the migration. The Migrate API might not be able to recover from such errors. For example, using a non-existent PHP function with the callback plugin. Give it a try by modifying the example in this article. When these errors happen, the migration is left in a state where no import or rollback operations can be performed.

You can check the state of any migration by running the drush migrate:status command. Ideally, you want them in Idle state. When something fails during import or rollback, you would get the Importing or Rolling back states. To get the migration back to Idle, you stop the migration and reset its status. The following snippet shows how to do it:

# 1) Run the migration.
$ drush migrate:import udm_process_intro

# 2) Some non recoverable error occurs. Check the status of the migration.
$ drush migrate:status udm_process_intro

# 3) Stop the migration.
$ drush migrate:stop udm_process_intro

# 4) Reset the status to idle.
$ drush migrate:reset-status udm_process_intro

# 5) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 6) Rollback migration because the expected results were not obtained.
$ drush migrate:rollback udm_process_intro

# 7) Change the migration definition file.

# 8) Rebuild caches for changes to be picked up.
$ drush cache:rebuild

# 9) Run the migration again.
$ drush migrate:import udm_process_intro

Tip: The errors thrown by the Migrate API might not provide enough information to determine what went wrong. An excellent way to familiarize yourself with the possible errors is by intentionally breaking working migrations. In the example repository of this series, there are many migrations you can modify. Try anything that comes to mind: not leaving a space after a colon (:) in a key-value assignment; not using proper indentation; using wrong subfield names; using invalid values in property assignments; etc. You might be surprised by how the Migrate API deals with such errors. Also, note that many other Drupal APIs are involved. For example, you might get a YAML file parse error, or an Entity API save error. Once you have seen an error, it is usually faster to identify the cause and fix it the next time around.

What happens when you rollback a Drupal migration?

In an ideal scenario, when a migration is rolled back, it cleans up after itself. That is, it removes any entity that was created during the import operation: nodes, taxonomy terms, files, etc. Unfortunately, that is not always the case. It is very important to understand this when planning and executing migrations. For example, you might not want to leave behind taxonomy terms or files that are no longer in use. Whether any dependent entity is removed or not depends on how the plugins or entities work.

For example, when using the file_import or image_import plugins provided by Migrate File, the created files and images are not removed from the system upon rollback. When using the entity_generate plugin from Migrate Plus, the created entity also remains in the system after a rollback operation.

In the next blog post, we are going to start talking about migration dependencies. What happens to dependent migrations (e.g., files and paragraphs) when the migration for the host entity (e.g., a node) is rolled back? In this case, the Migrate API will perform an entity delete operation on the node. When this happens, referenced files are kept in the system, but paragraphs are automatically deleted. For the curious, this behavior for paragraphs is actually determined by its module dependency: Entity Reference Revisions. We will talk more about paragraphs migrations in future blog posts.

The moral of the story is that the behavior of the migration system might be affected by other Drupal APIs. In the case of rollback operations, make sure to read the documentation or test manually to find out when migrations clean up after themselves and when they do not.

Note: The focus of this section was content entity migrations. The general idea can be applied to configuration entities or any custom target of the ETL process.

Re-import or update migrations

We just mentioned that the Migrate API issues an entity delete action when rolling back a migration. This has another important side effect: entity IDs (nid, uid, tid, fid, etc.) are going to change every time you roll back and import again. Depending on auto-generated IDs is generally not a good idea, but keep it in mind in case your workflow might be affected. For example, if you are running migrations in a content staging environment, references to the migrated entities can break if their IDs change. Also, if you were to manually update the migrated entities to clean up edge cases, those changes would be lost if you roll back and import again. Finally, keep in mind that test data might remain in the system, as described in the previous section, and could find its way to production environments.

An alternative to rolling back a migration is not to execute that operation at all. Instead, you run an import operation again using the update flag. This tells the system that, in addition to migrating unprocessed items from the source, you also want to update items that were previously imported using their current values. To do this, the Migrate API relies on source identifiers and map tables. You might want to consider this option when your source changes over time, when you have a large number of records to import, or when you want to execute the same migration many times on a schedule.
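A minimal sketch of that approach, reusing an example migration ID from this series:

# Import new items and also update previously imported items
# using their current source values.
$ drush migrate:import udm_subfields --update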

Note: On import operations, the Migrate API issues an entity save action.

Tips for writing Drupal migrations

When working on migration projects, you might end up with many migration definition files. They can set dependencies on each other. Each file might contain a significant number of field mappings. There are many things you can do to make Drupal migrations more straightforward, such as practicing with different migration scenarios and studying working examples. As a reference to help you in the process of migrating into Drupal, consider these tips:

  • Start from an existing migration. Look for an example online that does something close to what you need and modify it to your requirements.
  • Pay close attention to the syntax of the YAML file. An extraneous space or wrong indentation level can break the whole migration.
  • Read the documentation to know which source, process, and destination plugins are available. One might exist already that does exactly what you need.
  • Make sure to read the documentation for the specific plugins you are using. Many times a plugin offers optional configurations. Understand the tools at your disposal and find creative ways to combine them.
  • Look for contributed modules that might offer more plugins or upgrade paths from previous versions of Drupal. The Migrate ecosystem is vibrant, and lots of people are contributing to it.
  • When writing the migration pipeline, map one field at a time. Problems are easier to isolate if there is only one thing that could break at a time.
  • When mapping a field, work on one subfield at a time if possible. Some field types like images and addresses offer many subfields. Again, try to isolate errors by introducing individual changes each time.
  • Commit to your code repository any and every change that produces the right results. That way, you can go back in time and recover a partially working migration.
  • Learn about debugging migrations. We will talk about this topic in a future blog post.
  • Seek help from the community. Migrate maintainers and enthusiasts are very active and responsive in the #migrate channel of the Drupal Slack.
  • If you feel stuck, take a break from the computer and come back to it later. Resting can do wonders in finding solutions to hard problems.

What did you learn in today’s blog post? Did you know what happens upon importing and rolling back a migration? Did you know that in some cases, data might remain in the system even after rollback operations? Do you have a use case for running migrations with the update flag? Do you have any other advice on writing migrations? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 07 2019
Aug 07

Our team is always excited to catch up with fellow Drupal community members (and each other) in person during DrupalCon. Here’s what we have on deck for this year’s event:

Visit us at booth #709

Drop by and say hi in the exhibit hall! We’ll be at booth number 709, giving away some new swag that is very special to us. Have a lot to talk about? Schedule a meeting with us.

Palantiri Sessions

Keeping That New Car Smell: Tips for Publishing Accessible Content by Alex Brandt and Nelson Harris

Content editors play a huge role in maintaining web accessibility standards as they publish new content over time. Alex and Nelson will go over a handful of tips to make sure your content is accessible for your audience.


Fostering Community Health and Demystifying the CWG by George DeMet and friends

The Drupal Community Working Group is tasked with fostering community health. This Q&A format session hopes to bring to light our charter, our processes, our impact and how we can improve.


The Challenge of Emotional Labor in Open Source Communities by Ken Rickard

Emotional labor is, in one sense, the invisible thread that ties all our work together. Emotional labor supports and enables the creation and maintenance of our products. It is a critical community resource, yet undervalued and often dismissed. In this session, we'll take a look at a few reasons why that may be the case and discuss some ways in which open source communities are starting to recognize the value of emotional labor.

  • Date: Thursday, April 11
  • Time: 2:30pm
  • Location: Exhibit Stage | Level 4


The Remote Work Toolkit: Tricks for Keeping Healthy and Happy by Kristen Mayer and Luke Wertz

Moving from working in a physical office to a remote office can be a big change, yet have a lot of benefits. Kristen and Luke will talk about transitioning from working in an office environment to working remotely - how to embrace the good things about remote work, but also ways in which you might need to change your behavior to mitigate the challenges and stay mentally healthy.

Join us for Trivia Night 

Thursday night we will be sponsoring one of our favorite parts of DrupalCon, Trivia Night. Brush up on your Drupal facts, grab some friends, and don't forget to bring your badge! Flying solo to DrupalCon? We would love to have you on our team!

  • Date: Thursday, April 11
  • Time: 8pm - 11:45pm
  • Location: Armory at Seattle Center | 305 Harrison Street

We'll see you all next week!

Aug 06 2019
Aug 06

We were honored to have two members of our team present sessions at Drupal North this past June in the beautiful and vibrant city of Montreal, Canada. Drupal North is an annual, free, three-day conference focusing on Drupal-related topics and the community that drives the Drupal project forward.

We were pleased to make new connections while also reconnecting with friends and peers. As always, it was also a privilege to be given the opportunity to share our expertise.

Crispin Bailey
Director of UX + Design


The American Foundation for the Blind (AFB), established in 1921, is an amazing organization that advocates on behalf of visually impaired persons, providing community, resources and opportunities.

This talk covered the process we went through to overhaul the site's messaging, content architecture, and visual design, culminating in a fully-responsive HTML prototype and style guide that was used to implement a brand new, fully-accessible, Drupal 8 website.

Andrew Mallis
CEO


 
Our client, the de Young Museum, had an ambitious design project scope with tight delivery timelines, which required us to be nimble and innovative. This talk covered the team’s journey: how we repurposed GatherContent as a CMS, and how we automated deployments via its workflow states using CircleCI and Netlify.

It also looked at the component architecture that empowered our client to own their digital stories and author them to the point where insights.famsf.org/gauguin received a Webby Honoree award.

Andrew Mallis
CEO


Google Analytics provides a wealth of data, giving insight into users’ needs and behaviors once they’re on your site. However, the amount of data it presents can be disorienting, leaving users unable to focus on key metrics, or misrepresenting the implications of critical indicators in relation to their goals.

In this talk, Andrew Mallis presented best practices for accessing the clearest and most useful analytics data necessary to better tell your story.
 
 
We hope you find these talks helpful and insightful, and we look forward to seeing you at future events.

Aug 06 2019
Aug 06

Author’s Note: No rabbits were harmed in the writing of this post.

This summer, I had the opportunity to attend Decoupled Days 2019 in New York City. The organizers did a fabulous job of putting together an extremely insightful, yet approachable, program for the third year of the conference to date.

In case you haven’t heard of the event before, Decoupled Days is somewhat of a boutique conference that focuses solely on decoupled CMS architectures, which combine a CMS like Drupal with the latest front-end web apps, native mobile and desktop apps, or even IoT devices. Given the contemporary popularity of universal JavaScript (used to develop both the front and back ends of apps), this conference also demands a strong interest in the latest JavaScript technologies and frameworks.

If you weren’t able to attend this year, and have the opportunity in the future, I would highly recommend the event to anyone interested in decoupled architectures, whether you’re a beginner or an expert in the area. With that in mind, here are a few of the sessions I was able to attend this year that might give you a sense of what to expect.

Christopher Bloom (Phase2) gave an excellent performance at his session, giving all of us a crash course in TypeScript. He provided a very helpful live demo of how TypeScript can make it much easier and safer to write apps in frameworks like Vue.js, by allowing for real-time error-checking within your IDE (as opposed to at runtime in your browser) and providing the ability to leverage ES6 Class syntax and decorators together seamlessly.

Jamie Hollern (Amazee Labs) showed off how it’s possible to have a streamlined integration between your Drupal site and a fully-fledged UI pattern library like Storybook. By using the Component Libraries, GraphQL, and GraphQL Twig contributed modules in concert with a Storybook-based style guide, Jamie gave a fabulous demonstration of how you can finally stick to the classic DRY (Don’t Repeat Yourself) principle when it comes to a Drupal design system. Because Storybook supports front-end components built using Twig markup, and allows for populating those components using mock data in a GraphQL format, we can use those same Twig templates and GraphQL queries within Drupal with almost no refactoring whatsoever. If you’re interested in learning more about building sites with Drupal, GraphQL, and Twig, Amazee Labs has also published an “Amazing Apps” repo that acts as a sample GraphQL/Twig sample project.

For developers looking to step up their local development capabilities, Decoupled Days featured two sessions on invaluable tools for doing decoupled work on your local machine.

Kevin Bridges (Drud) showed how simple and straightforward it can be to use the DDEV command-line tool to quickly spin up a Drupal 8 instance (using the Umami demo site profile, for example), enable the JSON:API contributed module in order to expose Drupal’s data, and then install (using gatsby-cli) and use a local Gatsby.js site to ingest and display that Drupal data.

Matt Biilman (Netlify) also demonstrated the newly launched Netlify Dev command-line tool for developers who use Netlify to host their static site projects. With Netlify Dev, you can spawn an entire local environment for the Gatsby.js site you just installed using DDEV, which will locally run all routing rules, edge logic, and cloud functions. Netlify Dev will even allow you to stream that local Gatsby.js site to a live URL (i.e., on Netlify) with hot-reloading, so that your remote teammates can view your site as you build it.

As usual, Jesús Manuel Olivas (WeKnow) is always pushing the Drupal community further and further, this time by demonstrating his own in-house, Git-based CMS called Blaze, which is a platform that could potentially replace the need for Drupal altogether. Blaze promises to provide the lightest-weight back-end possible for your already light-weight, static-site front-end. In lieu of a relatively heavyweight back-end like Drupal, Blaze would provide a familiar WYSIWYG editor and easy content modeling with a few clicks. The biggest wins from using any Git-based CMS, though, are the extremely lowered costs and lightning-fast change deployments (with immediate updates to your code repo and CI/CD pipeline).

Aug 06 2019
Aug 06

Drupal is certainly not the only open-source CMS game in town, but when consulting with clients about the best solution for the full range of their needs, it tends to be my go-to.

Here’s why: 

  • Architecture
  • Scalability
  • Database Views
  • Flexibility
  • Security
  • Modules
  • Search
  • Migration  

Architecture

Drupal 8 is built on modern programming practices, and of course, the same will be true for the June 2020 release of Drupal 9.

A Development – Test – Production environment is the default assumption with a Drupal 8 site. Too often, other CMS sites are managed as a single instance, which is a single point of failure. 

Also, Drupal comes with a built-in automated testing framework. Drupal 8 supports unit, integration, and system/functional testing using the PHPUnit framework. Drupal is built to be inherently extensible through configuration, so every data type can be templated without touching code to achieve fully customized, structured data collection.

Scalability

Drupal has proven to be scalable at the most extreme traffic levels. Weather.com, as just one example, is a Drupal site. Many of the Federal cabinet-level agencies using open source have built their web infrastructures on Drupal. Drupal has built-in functionality, such as a robust caching API and JS/CSS minification/aggregation to optimize page load speed.

Database Views

Drupal Views, which is in Drupal 8 core, is a powerful tool that allows you to quickly construct database views, with AJAX filtering and sorting included. This allows you to quickly construct and publish lists of any data in your Drupal site, without needing a developer to do it for you.

Flexibility

There are several components of flexibility. 

Drupal 8 was built as an API-first CMS, explicitly supporting the idea that the display layer for content stored in a Drupal CMS may not be Drupal. The API first design of Drupal 8 also means that it is easier to integrate Drupal with third-party applications, as the API framework is already in place.

Customers vary widely in the ways in which they currently consume content. We assume that new ways will emerge for consuming content in the future, and even though we may not be in a position to predict right now what that will look like, Drupal is well poised to support what comes next.

Security

The only totally secure CMS is the one installed on a server sitting at the bottom of the Mariana Trench, with no connectivity to anything! However, Drupal has been tested in the most rigorous and security-conscious environments across government and industry. With a dedicated security team managing not just Drupal core but also many popular modules, and the openness inherent in open source, Drupal is a solid, secure platform for any website.

Modules / Extensions

Drupal modules are created and contributed to the community because they solve a problem. If you have the same or a similar problem to solve, you may be a simple module install away from solving it. Also, all Drupal modules are managed and accessible through a single repository at Drupal.org, providing a critical layer of vetting and security.

Search

Current versions of Drupal come with powerful and unparalleled out-of-the-box search functionality. Also, Solr integration is plug-and-play with Drupal, allowing you to extend the capabilities of search to index documents, search across multiple domains, or build faceted search results to improve the user experience.

Migration

Face it: migration is not fun with any CMS. However, the Drupal 8 Migrate API (and the same will be true for Drupal 9) is highly capable of importing complex data from other systems. Simpler CMS platforms tend to offer simple migration for out-of-the-box content types (posts and pages), but not so much for complex data or custom content types.

Summary

Settling on the right CMS platform is often not an obvious choice. Weighing the relative benefits of every option can take time and calls for expert consultation. In instances where complexity increases and there’s a need to integrate the CMS with outside data sources, I’ll admit to a Drupal bias. This is based on my experience of Drupal as a CMS framework designed specifically for the challenges of a mobile-first, API-driven, integrated digital environment.

Looking for further exploration into the relative merits of your open source CMS options? Contact us today for an insightful, informative and fully transparent conversation.


 

Aug 06 2019
Aug 06

Long articles with many sections often discourage users from reading. They start reading and usually leave before reaching half of such articles.

To avoid this type of user experience, I recommend grouping each section of your article into a collapsible tab. Readers will then be able to digest the text in smaller pieces.

The Collapse Text module for Drupal 8 adds a filter plugin to your text formats. With it, you can create collapsible text tabs using a tag system similar to HTML.

Read on to learn how to use this module!

Step #1. Install the Required Module

  • Open the terminal application of your computer
  • Go to the root of your Drupal installation (the composer.json file is located inside this directory)
  • Type the following command:

composer require drupal/collapse_text

Type composer installation command

  • Click Extend
  • Scroll down until you find the Collapse Text module and enable it
  • Click Install

Click Install

Step #2. Create an Editor Role

  • Click People > Roles > Add role

Add Collapsible Blocks to Text-Heavy Nodes in Drupal 8

  • Enter the Role Name Editor and click Save
  • Click the dropdown beside Editor and select Edit permissions

Click the dropdown besides Editor

  • Check these permissions:
    • Comment
      • Edit own comments
      • Post comments
      • View comments
    • Contact
      • Use the site-wide contact form
    • Filter
      • Use the Full HTML text format
    • Node
      • Article: Create new content
      • Article: Delete own content
      • Article: Delete revisions
      • Article: Edit own content
      • Article: Revert revisions
      • Article: View revisions
      • Access the Content overview page
      • View published content
      • View own unpublished content
    • System
      • Use the administration pages and help
      • View the administration theme
    • Taxonomy
      • Tags: Create terms
      • Access the taxonomy vocabulary overview page
    • Toolbar
      • Use the toolbar
    • User
      • Cancel own user account
      • View user information
  • Click Save permissions

Click Save permissions

Step #3. Create a User with the New Editor Role

  • Click People > Add user
  • Create a user with the Editor role
  • Click Create new account

Click Create new account

Step #4. Add the Plugin to the Text Format

  • Click Configuration > Text formats and editors

Click Configuration > Text formats and editors

  • Click the Configure button for the Full HTML format

Click the Configure button

  • Enable the Collapsible text blocks filter and check that it comes after the other two filters specified in the description

Enable the Collapsible text blocks filter

The Full HTML format has these two filters disabled by default, so we are good to go.

  • Click Save configuration

Click Save configuration

Step #5. Create Content

  • Log out and log back in as the user with the Editor role

Log out and log back in

  • Click Content > Add content
  • Write a proper title for the node

The Tabs Structure

Each tab is declared between a pair of tags.

To show an opened tab (not collapsed at all), you put the text between the [collapse] and [/collapse] tags.

To show a collapsed tab, you put the text between the [collapsed] and [/collapsed] tags.

The opening [collapse] and [collapsed] tags support two “attributes” (see the example after this list):

  • title
  • class
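For illustration, a node body using this filter might combine both tag types like this (the titles and class name are invented for the demo):

[collapse title="Opening hours" class="info-tab"]
This tab starts expanded. Any text can go inside.
[/collapse]

[collapsed title="Terms and conditions"]
This tab starts collapsed until the reader clicks its title.
[/collapsed]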

If you don’t specify a title attribute, the module will take the first title available between the [collapse]/[collapsed] tags.

If you don’t specify a title attribute, the module will take the first title available

It is possible to nest collapsible tabs.

It is possible to nest collapsible tabs

  • Finish editing the node form and click Save

Finish editing the node form and click Save

The image is floated; that is a Bartik-specific style. Let’s apply some CSS.

Step #6. Basic Styling

Hint: I’m going to edit the original core theme files because I’m working on a sandbox environment. That is not recommended on a production server. As a matter of fact, it is not a good practice at all. If you want to improve your Drupal theming skills, take a look at this OSTraining class.

  • Open the file core/themes/bartik/css/components/field.css
  • Add this code to the end of the file:
@media all and (min-width: 560px) {
  .node .field--type-image {
    float: none;
  }
}
  • Open the file core/themes/bartik/css/components/node.css
  • Add this code to the end of the file:
/* Collapse Text Styles */
.open,
.shut {
  font-family: sans-serif;
}

.open {
  background: black;
  color: white;
}

.shut {
  background: #444;
  color: #CCC;
}

summary {
  background-color: red;
  color: transparent;
}

.nested1 {
  background-color: rgba(224, 110, 108, 0.25);
}
  • Save both files
  • Click Configuration > Performance > Clear all caches
  • Refresh the site

Click Configuration > Performance > Clear all caches

Refresh the site

I hope you liked this tutorial. Thanks for reading!


About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Aug 06 2019
Aug 06

Did you know that the term “One-Stop-Shop” is one of the most clichéd marketing taglines, according to HubSpot? Thankfully, I came across that article before I sat down to write this one. So, I’m NOT going to say that a Drupal distribution is a “one-stop-shop” for a quick and easy way to launch your website. Let’s put it this way instead: if you want to build a Drupal website and are eager to see it go live really quickly, while also saving time on maintenance, Drupal distributions are meant for you.

What is a Drupal distribution?

A Drupal distribution is an all-inclusive package to get your website up and running quickly. This package consists of Drupal core (basic features), an installation profile, themes (for customized designs), libraries (assets like CSS or JavaScript) and modules specific to an industry. For example, if you run a publishing company, a distribution like Thunder can help you speed up your development process. Here you can find modules like Paragraphs, Media Entity and Entity Browser, and features like the Thunder admin theme, scheduled publishing and much more – all in one place.

Why should you use a Drupal 8 distribution?

Let me give you a few reasons for that -

  • You don’t have to scramble your way through thousands of Drupal modules only to find a few that you really need.  
  • Configuring Drupal core is easier too as most part of it comes preconfigured.
  • The features and modules included in a Drupal distribution are time tested, optimized and proven for quality.
  • Maintenance of a Drupal distribution is simpler because updates for all modules and features can be performed in one shot!
  • Since you don’t have to reinvent the wheel every time, you save on time. You save precious resource time. Which also means, you save on money! 
  • Now that you have saved some time, you can spend more time on customizing and personalizing these components to tailor-fit your business needs.

Top 15 Drupal Distributions (alphabetically sorted)

1. CiviCRM Starter Kit

The CiviCRM Starter Kit brings together the power of Drupal and the open-source CRM tool – CiviCRM. The popular CRM is used by more than 8000 organizations to centralize constituent communications. Along with core Drupal and CiviCRM, the distribution also packs in CiviCRM related modules like CiviCRM Cron, Webform CiviCRM, CiviCRM Clear All Caches, etc.

2. Commerce Kickstart

If you are looking to quickly get your e-commerce store up and running on Drupal Commerce framework, this one’s for you. Commerce Kickstart is a Drupal distribution made for both Drupal 7 and Drupal 8 and is maintained by Centarro (previously Commerce Guys). The Commerce Kickstart 2.x version comes loaded with beautiful themes, catalog, promotion engines, variety of payment tools, utility tools, shipping and fulfilment tools, analytics and reporting tools, marketing tools, search configuration, custom back office interface and much more.

drupal distributions

                                Source - https://www.drupal.org/project/commerce_kickstart

3. Conference Organizing Distribution

Creating a website for events and conferences gets easier with this Drupal distribution. Conference Organizing Distribution (COD) was made for Drupal 7 but is being actively ported to Drupal 8. With COD, you can -

  • Create/manage tickets for event registrations
  • Create announcements for paper submissions
  • Moderate session selections
  • Provide an option for attendees to vote for their favourite sessions
  • Schedule sessions on any day and place
  • Easily manage sponsorships
  • Event management made easy with a powerful event management dashboard
  • Keep a track on multiple events and sessions
  • Sell tickets with Drupal commerce

4. Contenta

This API-first Drupal distribution provides you with a framework that is API ready. It reduces the complexity and pain of using or trying decoupled/headless Drupal. Contenta also comes pre-installed with code and demo content along with front-end application examples. Even if you are new to Drupal, Contenta offers simple and quick ways to get the Drupal CMS part ready so you can focus on the front-end frameworks you intend to use. If you’re looking for a complete solution for a headless Drupal project, Contenta JS is your best bet: it integrates Contenta CMS with Node.js for a powerful, high-performing digital experience.

5. Drupal Government Distributions (federal, regional, local)

The aGov Drupal 8 distribution was developed to meet the guidelines of the Australian government. It allows government bodies to follow standards like WCAG 2.0 AA, the Australian Government Web Guide, AGLS metadata and the Digital Service Standard. However, the developers of aGov, PreviousNext, no longer develop or support this distribution, as they are now focused on the GovCMS Drupal distribution. GovCMS was built on the foundation of aGov to build more secure, compliant and adaptable government websites.
The deGov Drupal 8 distribution was built for German government websites and uses Acquia Lightning to offer more valuable features and functionalities. Some features common to all the Drupal government distributions -

  • Meeting all government standards
  • Workbench moderation
  • Citizen engagement portals
  • Responsive design
  • Example content
  • Intranet/Extranet

6. Acquia Lightning

True to its name, Acquia Lightning is a lightweight Drupal 8 distribution that you can use to develop and deploy a website at lightning speed (up to 30% less development time!). Developed by Acquia, Lightning aims to provide full flexibility and a great authoring experience to editorial teams and content authors. Built on Drupal 8, it offers powerful features like page layouts, drag and drop of assets using Panels, rich text, media, slideshows, Google Maps, content scheduling and much more. You can also streamline the workflow process of publishing, reviewing, approving and scheduling content.

7. Open Atrium

Open Atrium is a Drupal distribution built specifically for organizations to create a collaborative intranet solution for social collaboration and knowledge management. It offers features like a drag-and-drop layout, events management (Events), document management (Files), issue tracking, granular access controls, media management, a work tracker (to monitor tasks and maintain transparency), and much more. It also offers responsive layouts and themes.

8. Open Academy

Built on the Panopoly base distribution, this Drupal distribution is tailor-made for higher education websites and can be further extended and customized. It is an easy-to-use tool that does not require users to be technical. You can have a great website without any customizations, too! The Open Academy distribution consists of a Drupal 7 installation profile and features meant for managing courses, departments, faculty, presentations, news, events, publications and more. The themes provided are optimized and mobile-ready.

9. Open Social

Open Social is a Drupal 8 distribution that allows organizations to create intranets, online communities and other social portals easily. It is being used by hundreds of organizations including NGOs and government bodies to facilitate communication and connection with their volunteers, employees, members and customers. It also has features like multi-lingual support, private file system, social login, Geo-location maps, etc.

drupal distribution social module

Source : https://www.drupal.org/project/social

10. Opigno LMS

The Opigno distribution is a Learning Management System built on Drupal. It is an easily scalable solution built not just for universities but also for organizations looking to create e-learning solutions. It allows you to manage training paths organized in courses, activities and modules. It also provides features like adaptive learning paths, management of skill acquisition, quizzes, blended learning (online modules + in-house sessions + virtual classrooms), award certificates, forums, live meetings and more.

opigno lms drupal distributions

                                               
 Source: https://www.drupal.org/project/opigno_lms

11. Panopoly

This is a base Drupal distribution, which basically means it also acts as a foundation or base framework for many other distros to be built upon. The Panopoly distribution is powered by the magic of the Panels module and its features like the in-place editor, Panelizer, Fieldable Panel Panes, etc. The Panopoly package consists of contributed modules and libraries. It offers cross-browser and responsive layouts, drag-and-drop page customizations, a powerful, easy-to-use admin interface, etc. It can also be extended through many Panopoly apps.

12. Presto!

Want a Drupal 8 starter kit that can meet all your content management needs and get you up and running, presto?! Count on Presto! What’s better, you can start using Presto right out of the box! It is power-packed with some great content features like intelligent content editing, a Promo bar (inline alerts for news/announcements), Divider (adding space), Carousel (interactive images), Blocks, etc. It also comes shipped with a responsive theme based on the Bootstrap framework that can be further customized to add more layouts. And it lets you easily integrate with Drupal Commerce to make selling on your website easier. With Presto, you can reduce development time by 20%!

13. Reservoir

Like Contenta, Reservoir too is an API-first Drupal distribution for decoupling Drupal. With this tool, you can build content repositories that are ready to be consumed by front-end applications. It is packed with all the web service APIs necessary to create decoupled websites. Reservoir was developed to make Drupal more accessible to developers, to provide best practices for building decoupled applications, and to give developers (with little or no Drupal experience) a starting point for building a content repository. Reservoir uses JSON API (a specification for APIs in JSON) to interact with the back-end content. It also ships with API documentation, OpenAPI format export (compatible with a plethora of tools) and a huge set of libraries, SDKs and references.

14. Thunder

This Drupal 8 distribution is designed exclusively for professional publishing. Thunder was originally designed for and by Hubert Burda Media. The Drupal distribution is loaded with features meant for the publishing sector like the Paragraphs module, drag and drop of content, Media Entity, Entity Browser, Content Lock, Video Embed Field, Facebook Instant Articles, Google AMP, LiveBlog, the Nexx.tv video player and much more. All of this along with Drupal core features and responsive themes.

Drupal distribution module thunder


Source: https://www.drupal.org/project/thunder

15. Varbase

Are you lost in a mountain of Drupal modules and wondering which ones to pick? Looking for a package that can jumpstart your web development process right away? Then Varbase is your go-to Drupal distribution! Varbase provides you with all the essential modules, features and configurations you need to speed up your time to market.

Aug 06 2019
Aug 06

At the start of every month, we gather all the Drupal blog posts from the previous month that we’ve enjoyed the most. Here’s an overview of our favorite posts from July related to Drupal - enjoy the read!

5 Reasons to Upgrade Your Site to Drupal 8, then 9

Our July selection begins with a blog post by Third & Grove titled “5 Reasons to Upgrade Your Site to Drupal 8, then 9”. Since the upgrade from Drupal 8 will be a smooth and simple one, the best thing to do is to make the move to D8 now and start benefiting from its superior capabilities, as the author of this blog post, Curtis Ogle, also emphasizes. 

In this post, Curtis thus presents his top 5 features of Drupal 8 that make a very strong case for the upgrade. These are: configuration management right in the core; RESTful APIs; Twig templates (his personal favorite); all the contrib modules from D7; and, lastly, the fact that D8 is future-proof, with all future upgrade paths considerably smoother than with previous versions.

Read more

The Top Four Benefits of Building a Site on Drupal 8 

Still very much in line with the previous post, this next one was written by Bounteous’ Chris Greatens and outlines the main benefits of choosing to build a website in Drupal 8. With an abundance of different CMS solutions, the ones that hold the obvious advantage are those that offer both excellent authoring and administrative features, as well as development capabilities.

According to Chris, there are 4 main features that make Drupal stand out among other CMSs: flexibility, scalability, security and, exactly as in the previously mentioned blog post, future-proofing. All of Drupal’s additional capabilities only add to this, making it a viable platform for various use cases.

Read more

Prepare for Drupal 9: stop using drupal_set_message()!

Next up, we have a blog post by Gábor Hojtsy reporting on the most recent state of deprecated code in preparation for Drupal 9, which contains two important findings.

The first one is that as much as 29% of all analyzed instances of deprecated API use can be attributed to drupal_set_message() - so, basically, no longer using this API means you’ll already be 29% of the way towards Drupal 9 readiness.

Gábor’s second finding is that 76% of deprecated API use (47% from other API uses besides drupal_set_message()’s 29%) can in fact already be resolved now, 10 months before the release of Drupal 9. This gives project maintainers and contributors plenty of time to work towards D9 compatibility.
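For reference, drupal_set_message() was deprecated in favor of the messenger service in Drupal 8.5. A minimal before-and-after sketch:

// Deprecated API use, to be removed in Drupal 9.
drupal_set_message(t('The settings have been saved.'));

// Replacement using the messenger service.
\Drupal::messenger()->addStatus(t('The settings have been saved.'));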

Read more

5 Reasons to Attend and Sponsor Open Source Events

A really great post from July that had us recall the awesome Drupal community is “5 Reasons to Attend and Sponsor Open Source Events”, written by Promet Source’s Chris O’Donnell. He answers the question “Is it worth to keep sponsoring DrupalCamps and other events?” with a hard “Yes” and five (well, six, actually) supporting reasons.

These reasons are: it’s good for business; you (as a company) owe it to the community; you’re able to find new talented developers at these events; you learn a lot; there are various fun activities; and, the sixth bonus reason, you meet many amazing Drupalists and forge new friendships. This last reason alone is actually enough to justify going to at least one or two ‘Camps a year.

Read more

Drupal + Javascript: Exploring the Possibilities

Another post from July that we enjoyed is Emanuel London of Hook 42’s introduction to exploring the possibilities of Drupal in combination with JavaScript. Excited as he was about the plethora of emerging JavaScript frameworks and the flexibility they offer, Emanuel was a bit disappointed by the fact that the Drupal community hasn’t kept up to date with all these technologies, and thus decided to remedy this in a series of blog posts.

Future posts in the series will explore some of the tools for native mobile app development, e.g. React Native, as well as some Drupal tools, modules and distributions, such as Contenta CMS. By the end of the series, we’ll hopefully be better prepared for Drupal-powered mobile app development and maybe even compete with WordPress in that area.

Read more

Eight reasons why Drupal should be every government’s CMS

It is a well-known fact in the community that Drupal is the go-to choice for government websites, thanks in large part to its security and multisite capabilities. Anne Stefanyk of Kanopi Studios further underlines this with six additional reasons why governments should choose Drupal as their preferred CMS.

Besides security and multisite/multilingual support, Drupal’s advantage also lies in: its mobility, accessibility, easy content management, ability to handle large amounts of traffic and data, flexibility, and affordability.

These are all aspects crucial to the experience of a government website. As such, Drupal truly is best suited for this role, as is also evidenced by the over 150 countries relying on Drupal to power their websites.

Read more

Getting Started with Layout Builder in Drupal 8

Nearing the end of July’s list, we have a post by Ivan Zugec of WebWash, essentially a tutorial on using Drupal’s recently stable Layout Builder. It contains all the basics you need to get started with this powerful new functionality. 

The first part of the post covers using the Layout Builder to customize content types, with Ivan working on the Article content type as an example. It details how to create a default layout for articles, as well as how to override it for a single article.

The second part then deals with using the module as a page builder, customizing the layout of an individual piece of content, from creating a custom block to embedding images. The post concludes with links to some additional modules and a FAQ section. 
 
Read more

An Open Letter to the Drupal Community

We round off July’s list with J.D. Flynn’s open letter to the Drupal community. This is a very interesting post which deals with a recent positive addition to drupal.org and how it can be exploited to “game the system” - namely, issue credits. 

The problem with the issue credit system is that it can be used to amass hundreds of credits with fixes for simple novice issues, which leaves fewer of these novice issues to fledgling developers trying to get their foot in the door, as well as gives unjustified credibility to the person or company in question and demoralizes other developers. 

J.D. presents four possible solutions to this: weighted credits; mandatory difficulty tagging of issues; credit limits; and a redistribution of credits. He finishes with a call to action to new developers to seek out help and to seasoned developers to offer mentorship to newcomers.

Read more

We hope you enjoyed our selection of Drupal blog posts from July and perhaps even found some thoughts that inspired ideas of your own. Don’t forget to visit our blog from time to time so you don’t miss any of our upcoming posts! 
 

Aug 06 2019
Aug 06

So far we have learned how to write basic Drupal migrations and use process plugins to transform data to meet the format expected by the destination. In the previous entry we learned one of many approaches to migrating images. In today’s example, we will change it a bit to introduce two new migration concepts: constants and pseudofields. Both can be used as data placeholders in the migration timeline. Along with other process plugins, they allow you to build dynamic values that can be used as part of the migrate process pipeline.

Syntax for constants and pseudofields in the Drupal process migration pipeline

Setting and using constants

In the Migrate API, a constant is an arbitrary value that can be used later in the process pipeline. Constants are set as direct children of the source section: you write a constants key whose value is a list of name-value pairs. Even though they are defined in the source section, they are independent of the particular source plugin in use. The following code snippet shows a generalization for setting and using constants:

source:
  constants:
    MY_STRING: 'http://understanddrupal.com'
    MY_INTEGER: 31
    MY_DECIMAL: 3.1415927
    MY_ARRAY:
      - 'dinarcon'
      - 'dinartecc'
  plugin: source_plugin_name
  source_plugin_config_1: source_config_value_1
  source_plugin_config_2: source_config_value_2
process:
  process_destination_1: constants/MY_INTEGER
  process_destination_2:
    plugin: concat
    source: constants/MY_ARRAY
    delimiter: ' '

You can set as many constants as you need. Although not required by the API, it is a common convention to write constant names in all uppercase, using underscores (_) to separate words. The value can be set to anything you need to use later. In the example above, there are strings, integers, decimals, and arrays. To use a constant in the process section, you type its name just like any other column provided by the source plugin. Note that to use a constant you need to name the full hierarchy under the source section: that is, the word constants and the name itself, separated by a slash (/) symbol. Constants can be used to copy their value directly to the destination or as part of any process plugin configuration.

Technical note: The word constants for storing the values in the source section is not special. You can use any word you want as long as it does not collide with another configuration key of your particular source plugin. A reason to use a different name is that your source actually contains a column named constants; in that case you could use defaults or something else. The one restriction is that whatever key you use, you have to use the same one in the process section to refer to any constant. For example:

source:
  defaults:
    MY_VALUE: 'http://understanddrupal.com'
  plugin: source_plugin_name
  source_plugin_config: source_config_value
process:
  process_destination: defaults/MY_VALUE

Setting and using pseudofields

Similar to constants, pseudofields store arbitrary values for use later in the process pipeline. There are some key differences. Pseudofields are set in the process section. The name is arbitrary as long as it does not conflict with a property name or field name of the destination. The value can be set to a verbatim copy from the source (a column or a constant), or it can use process plugins for data transformations. The following code snippet shows a generalization for setting and using pseudofields:

source:
  constants:
    MY_BASE_URL: 'http://understanddrupal.com'
  plugin: source_plugin_name
  source_plugin_config_1: source_config_value_1
  source_plugin_config_2: source_config_value_2
process:
  title: source_column_title
  my_pseudofield_1:
    plugin: concat
    source:
      - constants/MY_BASE_URL
      - source_column_relative_url
    delimiter: '/'
  my_pseudofield_2:
    plugin: urlencode
    source: '@my_pseudofield_1'
  field_link/uri: '@my_pseudofield_2'
  field_link/title: '@title'

In the above example, my_pseudofield_1 is set to the result of a concat process transformation that joins a constant and a column from the source section. The resulting value is later used as part of a urlencode process transformation. Note that to use the value from my_pseudofield_1 you have to enclose it in quotes (') and prepend an at sign (@) to the name. The new value obtained from the URL encode operation is stored in my_pseudofield_2. This last pseudofield is used to set the value of the URI subfield for field_link. The example could be simplified, for example, by using a single pseudofield and chaining process plugins, as sketched below. It is presented that way to demonstrate that pseudofields can be used in direct assignments or as part of process plugin configuration values.
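Here is a minimal sketch of that simplified version. When process plugins are chained under a single destination, each plugin after the first receives the output of the previous one, so only the first plugin needs a source key:

process:
  title: source_column_title
  my_pseudofield:
    -
      plugin: concat
      source:
        - constants/MY_BASE_URL
        - source_column_relative_url
      delimiter: '/'
    -
      # Receives the output of concat; no source key needed.
      plugin: urlencode
  field_link/uri: '@my_pseudofield'
  field_link/title: '@title'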

Technical note: If the name of the pseudofield can be arbitrary, how can you prevent name clashes with destination property names and field names? You might have to look at the source code of the entity type and the configuration of the bundle. In the case of a node migration, look at the baseFieldDefinitions() method of the Node class for a list of property names. Be mindful of class inheritance and method overriding. For a list of fields and their machine names, look at the “Manage fields” section of the content type you are migrating into. The Field API prefixes any field created via the administration interface with the string field_, which reduces the likelihood of name clashes. Other than these two restrictions, anything else can be used. In any case, the Migrate API will eventually perform an entity save operation, which discards the pseudofields.

Understanding Drupal Migrate API process pipeline

The migrate process pipeline is a mechanism by which the value of any destination property, field, or pseudofield that has been set can be used by anything defined later in the process section. The fact that using a pseudofield requires enclosing its name in quotes and prepending an at sign is actually a requirement of the process pipeline. Let’s see some examples using a node migration:

  • To use the title property of the node entity, you would write @title
  • To use the field_body field of the Basic page content type, you would write @field_body
  • To use the my_temp_value pseudofield, you would write @my_temp_value

In the process pipeline, these values can be used just like constants and columns from the source. The only restriction is that they need to be set before being used. For those familiar with the rewrite results feature of Views, it follows the same idea: you have access to everything defined previously. Anytime you enclose a name in quotes and prepend an at sign, you are telling the Migrate API to look for that element in the process section instead of the source section.
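As a minimal sketch of the pipeline in action, the following reuses the title property, set one line earlier, to populate a hypothetical field_summary text field. The callback process plugin used here is provided by Drupal core:

process:
  title: source_column_title
  field_summary:
    # '@title' reads the value already set in the process pipeline,
    # not a column from the source.
    plugin: callback
    callable: ucfirst
    source: '@title'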

Migrating images using the image_import plugin

Let’s practice the concepts of constants, pseudofields, and the migrate process pipeline by modifying the example of the previous entry. The Migrate Files module provides another process plugin named image_import that allows you to directly set all the subfield values in the plugin configuration itself.

As in previous examples, we will create a new module and write a migration definition file to perform the migration. It is assumed that Drupal was installed using the standard installation profile. The code snippets will be compact to focus on particular elements of the migration. The full code is available at https://github.com/dinarcon/ud_migrations. The module name is UD Migration constants and pseudofields and its machine name is ud_migrations_constants_pseudofields. The id of the example migration is udm_constants_pseudofields. Refer to this article for instructions on how to enable the module and run the migration. Make sure to download and enable the Migrate Files module. Otherwise, you will get an error like: “In DiscoveryTrait.php line 53: The "image_import" plugin does not exist. Valid plugin IDs for Drupal\migrate\Plugin\MigratePluginManager are:...”. Let’s see part of the source definition:

source:
  constants:
    BASE_URL: 'https://agaric.coop'
    PHOTO_DESCRIPTION_PREFIX: 'Photo of'
  plugin: embedded_data
  data_rows:
    -
      unique_id: 1
      name: 'Michele Metts'
      photo_url: 'sites/default/files/2018-12/micky-cropped.jpg'
      photo_width: '587'
      photo_height: '657'

Only one record is presented to keep the snippet short, but more exist. In addition to having a unique identifier, each record includes a name, a short profile, and details about the image. Note that this time, the photo_url does not provide an absolute URL. Instead, it is a relative path from the domain hosting the images. In this example, the domain is https://agaric.coop, so that value is stored in the BASE_URL constant, which is later used to assemble a valid absolute URL to the image. Also, there is no photo description, but one can be created by concatenating some strings. The PHOTO_DESCRIPTION_PREFIX constant stores the prefix to add to the name to create a photo description. Now, let’s see the process definition:

process:
  title: name
  psf_image_url:
    plugin: concat
    source:
      - constants/BASE_URL
      - photo_url
    delimiter: '/'
  psf_image_description:
    plugin: concat
    source:
      - constants/PHOTO_DESCRIPTION_PREFIX
      - name
    delimiter: ' '
  field_image:
    plugin: image_import
    source: '@psf_image_url'
    reuse: TRUE
    alt: '@psf_image_description'
    title: '@title'
    width: photo_width
    height: photo_height

The title node property is set directly to the value of the name column from the source. Then, two pseudofields are created: psf_image_url stores a valid absolute URL to the image, built from the BASE_URL constant and the photo_url column from the source, while psf_image_description uses the PHOTO_DESCRIPTION_PREFIX constant and the name column from the source to store a description for the image.

For the field_image field, the image_import plugin is used. This time, the subfields are not set manually. Instead, they are assigned using plugin configuration keys. The absence of the id_only configuration allows for this. The URL to the image is set in the source key, using the psf_image_url pseudofield. The alt key sets the alternative text attribute for the image, and in this case the psf_image_description pseudofield is used. The title key sets the text of the subfield with the same name; in this case it is assigned the value of the title node property, which was set at the beginning of the process pipeline. Remember that pseudofields are not the only values available in the pipeline. Finally, the width and height configurations use columns from the source to set the values of the corresponding subfields.
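For comparison, here is a sketch of what the same assignment could look like following the approach of the previous entry, with the id_only configuration set. In that mode the plugin returns only the file entity ID, so every other subfield has to be populated manually (this assumes the plugin options behave as described in the Migrate Files module documentation):

process:
  field_image/target_id:
    plugin: image_import
    source: '@psf_image_url'
    reuse: TRUE
    # Return only the file entity ID instead of setting subfields.
    id_only: TRUE
  field_image/alt: '@psf_image_description'
  field_image/title: '@title'
  field_image/width: photo_width
  field_image/height: photo_height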

What did you learn in today’s blog post? Did you know you can define constants in your source as data placeholders for use in the process section? Were you aware that pseudofields can be created in the process section to store intermediary data for process definitions that come next? Have you ever wondered what the migrate process pipeline is and how it works? Please share your answers in the comments. Also, I would be grateful if you shared this blog post with your colleagues.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.
