Sep 16 2019
Sep 16

The thing that makes me most excited about the pending release of Drupal 9 in June 2020 is that, as a community, we have the opportunity to look back at what we have accomplished, let go of some old deprecated code, and start thinking about what is next for Drupal 10.

Accomplishing

To my mind, the biggest accomplishment of Drupal 8’s release was shifting its entire code base and community to be more object-oriented. The adoption of Symfony into Drupal core allowed us to build a more robust and flexible Content Management Framework. The Webform module for Drupal 8 is a testament to how healthy Drupal has become as a framework. Its robustness is a combination of Drupal's Plugin API and Form API with object-oriented concepts, including dependency injection. A straightforward and personal statement is:

Drupal 8 made me a better programmer and software architect.

Looking at the code and concepts behind core and contributed modules, it is immediately apparent that everyone stepped back and reapproached a lot of technical challenges - our collective hard work and contributions resulted in great solutions and better code. Core improvements are documented as initiatives on Drupal.org. Contributed projects generally don't have initiatives. When you look at the flexibility behind a "simple" module like Metatag, maintained by Damien McKenna (DamienMcKenna), or something potentially as "complex" as Domain Access, maintained by Ken Rickard (agentrickard), it’s impressive to witness what is being built and maintained by the Drupal community.

I think it’s safe to say that everyone is ready to keep improving our software and let go of some of the technical baggage, aka deprecated code.

Deprecating

While Drupal 8 has evolved, some code has become deprecated and replaced by better code and design patterns, allowing us to improve the software. In Dries Buytaert's post, Plan for Drupal 9, he states…

"Instead of working on Drupal 9 in a separate codebase, we are building Drupal 9 in Drupal 8. This means that we are adding new functionality as backward-compatible code and experimental features. Once the code becomes stable, we deprecate any old functionality."

Regarding deprecation, what isn’t said but is worth mentioning is that some code is deprecated to improve the underlying software architecture and fix mistakes like bad naming conventions or unneeded complexity. When the Drupal community adopted Symfony, we entered uncharted waters and mistakes are going to be made.

Recently, I started sketching with my ten-year-old daughter. An important lesson I’m trying to teach her is that it is okay to make mistakes, especially because you can always fix them later. Making mistakes is part of the process of improving something. Within art, the term 'mistake' is open to interpretation. In software development, however, we know when code smells bad and have a process in place to refactor bad code.

Smelling

"Bad smelling code" is an extreme statement which frankly I would only use to describe my code.

Fixing bad smelling code is going to cause a bump in the Webform module's road to Drupal 9.

Before I explain the specific issue in the Webform module for Drupal 8, I want to emphasize that the Drupal community is in good shape with a plan to move on to Drupal 9.

Since Dries published the plan for Drupal 9, people have been busy coordinating the Drupal 9 initiative. Matt Glaman (mglaman) created Drupal Check, a tool that helps everyone find and remove deprecated code. Now, everyone needs to start removing deprecated code from their modules.
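To illustrate, checking a project for deprecations is a single command. The exact invocation below is an assumption that depends on how Drupal Check was installed and where the module lives in your project:

drupal-check web/modules/contrib/webform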

Several contributors have been helping me remove deprecated code from the Webform module. Removing a deprecated service from a class constructor that relies on dependency injection seemed impossible, but then Shawn Duncan (FatherShawn) demonstrated an implementation pattern to the NYC Drupal Meetup that properly extends a class and safely injects dependencies.

The simplest explanation I can give for non-technical users is that the Webform module failed to make it easy to extend without hardcoded, tightly coupled dependencies. In other words, some things are going to break when the Webform module is installed on Drupal 9. On top of making it easier to extend, applying this new implementation pattern reduced code complexity. This change record on Drupal.org provides a more technical explanation with a code example of the improvements.
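The gist of the pattern is to stop overriding a plugin's constructor and instead override its create() factory method, letting the parent build itself and then attaching any extra services to the returned instance. The following is a minimal sketch with hypothetical class, service, and property names, not the actual Webform code; the change record linked above contains the authoritative example. It assumes the base class already implements the standard plugin create() factory:

<?php

use Symfony\Component\DependencyInjection\ContainerInterface;

// Hypothetical handler extending a base Webform handler class.
class MyCustomWebformHandler extends SomeBaseWebformHandler {

  /**
   * An extra service this subclass needs (assumed for illustration).
   *
   * @var \Drupal\Core\Entity\EntityTypeManagerInterface
   */
  protected $entityTypeManager;

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
    // Let the parent class construct itself and inject its own dependencies.
    $instance = parent::create($container, $configuration, $plugin_id, $plugin_definition);
    // Only attach the additional services this subclass needs.
    $instance->entityTypeManager = $container->get('entity_type.manager');
    return $instance;
  }

}

Because the subclass never declares a constructor, future changes to the parent's constructor signature no longer break it.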

It’s also important to acknowledge that Shawn learned this pattern while attending Alex Pott's presentation, Drupal 9 is coming: Getting your code ready. Alex credits Lee Rowlands (larowlan) and Thomas Seidl (drunken monkey) for documenting how to safely extend Drupal 8 plugin classes without fear of constructor changes.

In his presentation, Alex recommends:

"Be very very careful with sub-classes and overriding constructors"

Sadly, I wasn't aware of how to properly sub-class and override constructors while building the Webform module, and now we have some technical debt that needs fixing.

We must fix the sub-classing and overridden constructor problem because doing so paves the Webform module's road to Drupal 9. Fixing this issue is a massive patch that is going to break any Webform add-on or custom code that implements a Webform element or handler and overrides the constructor. Breaking backward compatibility in Drupal core or a contributed module requires tagging a new version. Therefore, we are going to have to bump the Webform module's version from 5.x to 6.x to get ready for Drupal 9.

I think everyone is starting to understand the bad news, which is that Webform 6.x is going to break some code. This change will result in fatal show-stopping errors, which are easy to fix. Developers need to apply the implementation pattern documented in the change record. Because most patches and changes to the Webform module do not alter a class' constructor, there is some good news: We should be able to maintain a Webform 8.x-5.x and Webform 8/9.x-6.x branch at the same time.

Following the tradition established over the last three years, any significant bump to the Webform module's release status has been tied to the New Year.

Along with preparing for Drupal 9, the 8/9.x-6.x branch of the Webform module will include any significant API changes or improvements. For example, I would like to fix how conditional logic handles a disabled element's value.

Another significant and thankless task that needs to be accomplished is migrating all the Webform module's SimpleTests to PHPUnit. Fortunately, many people have thanked me by backing the Webform module's Open Collective. Now there are funds available to pay for the monotonous task of migrating dozens of tests.

Understanding

Last year's stable release of Webform 8.x-5.x was a Christmas present, and I hope everyone understands that this year's upcoming tagging of Webform 9.x-6.x is not a lump of coal but a big step in the Webform module's road to Drupal 9 and beyond.

I rarely get to commit a patch that makes my code easier to maintain. I am thankful to Shawn Duncan (FatherShawn) for recognizing how cool and important this implementation pattern was during Alex's presentation and then sharing it with the NYC Drupal community. This is how the hive mind that is Open Source works to build Drupal. The Drupal community should thank everyone involved who helped transition our code and community to Drupal 8. In a few months, we can thank them once again for continuously improving and refactoring Drupal core and contributed projects and paving our road to Drupal 9 - collectively and collaboratively, we’ll get there...it’s what we do.


Sep 13 2019
Sep 13

Wanna know my rules when I am writing templates in PatternLab for Drupal view modes such as teasers, cards, search results, etc? Read on, my friend...

BEM Classes for Streamlined Classes

Classes used in the card view mode should be prefixed with card__ not node__

  • We need to make sure that CSS for cards does not leak out into other components
  • We need to make sure we have as little selector specificity as possible - so we have .card__content instead of .card .node__content in our CSS.

Card Variations

If you want to have slight variations for cards, please add that as an option in the classes array. Examples of this might be for a card for a specific content type, or a card when there is no image present, or a card that has the image on the left/right.


{% set classes = [
  "card",
  content_type ? "card--" ~ content_type,
  image ? "card--has-image" : "card--no-image",
  alignment ? "card--" ~ alignment,
] %}

This will print:


<article class="card card--event card--no-image card--image-left"></article>

We can then apply overrides using these variations. For example, if the date field on an event is bold but on a blog is not, we can have code like so:


.card__date {
  color: $c-grey--darkest;
}

.card--event .card__date {
  font-weight: $fw--bold;
}

Avoid Twig Blocks

Try not to use Twig blocks in view modes, e.g. {% block image %}. If we use Twig blocks, then in the Drupal template anything can be put in the {% block %} to override it. If we change the order of things in the {% block %} in PatternLab, this will have no effect in Drupal, since the Drupal version will be overriding it.

For example, the Drupal template could override the block with something like this:


{{ content.field_date }}
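To make the problem concrete, here is a hypothetical sketch (the pattern path and field name are assumptions): the Drupal template embeds the PatternLab pattern and replaces the block's contents entirely, so any reordering done inside that block in PL never reaches Drupal.

{# node--card.html.twig overriding the pattern's block #}
{% embed "@molecules/card/card.twig" %}
  {% block image %}
    {{ content.field_date }}
  {% endblock %}
{% endembed %}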

Avoid Drupal Templates

Try to avoid having templates based on specific content types for view modes. This is usually necessary for Full view mode, but for Teaser, Card, etc let's try to keep them more generic.

If you need to use a content-type template, that is fine; but it should be the exception, not the rule.

In general, since each content type Card should be similar, then each content type should be able to use the same node--card.html.twig template file.

Avoid {% if not page %}

{% if not page %} should not be needed in view modes. A Card or Teaser will never be a full page. For the same reason, we can usually leave this out in the content type full view mode templates, since the full view mode will always be a page.

Do Not Hardcode Heading Levels

Unless you know that the Card view mode is never going to have a <h2> item above it, do not set <h2> or <h3> etc as the heading element.

We do not want to have an HTML structure like this:


<h1>Page Title</h1>
<h2>Views Card Block Title</h2>
<h2>Card Title</h2>
<h2>Card Title</h2>
<h2>Card Title</h2>

Instead, we would like this:


<h1>Page Title</h1>
<h2>Views Card Block Title</h2>
<h3>Card Title</h3>
<h3>Card Title</h3>
<h3>Card Title</h3>

In general, view modes will take a <h2>, so let's place that as our default.


{% set card_title_element = card_title_element|default('h2') %}
<article>
  <{{ card_title_element }}>{{ card_title }}</{{ card_title_element }}>
</article>

Then, in our Drupal template, we can set the element depending on whether it has a parent <h2> or not; for example, if it is in a 'Related Content' building block and that building block has a title such as: <h2 class="related-content__title">Read More Like This</h2>.

We can do so in the Drupal template with a variation of this:


{% if node._referringItem.parent.parent.entity.field_p_rc_title.value %}
  {% set teaser_title_element = 'h3' %}
{% endif %}

If the above returns as false, our default h2 from the PL pattern will kick in.

Use Prefixes for Variables

Let's use prefixes for variables, using {{ card_content }} instead of {{ content }}. This will help avoid global variables being used by accident in our components (unless we want them to be, such as the image_alt variable from data.json). The {{ content }} variable, if used, will not necessarily print the content of your component when that component is used outside of its own context: for example, if you print a card in a views list on a sample page.
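For instance, the Drupal template can map its own render arrays onto the prefixed variables when including the pattern; a small sketch with assumed pattern path and field names:

{# node--card.html.twig passing prefixed variables into the pattern #}
{% include "@molecules/card/card.twig" with {
  card_title: label,
  card_image: content.field_image,
  card_content: content|without('field_image'),
} only %}

The only keyword keeps the pattern from seeing the rest of the Drupal template's context, which reinforces the point about avoiding accidental globals.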

Variables for Specific Content Types

If there's a variable for a specific content type, such as the location field for an event card, we can just wrap that in an {% if %} statement, like so:


{% if content_type == 'event' %}
  {% if card_location %}
    <div class="card__location">
      {{ card_location }}
    </div>
  {% endif %}
{% endif %}

Then that variable will not print in any other content type's template.

Sep 13 2019
Sep 13

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

We had a great talk with Suzanne Dergacheva, co-founder of the Canadian web agency Evolving Web and member of the Drupal Association Board of Directors. She's also involved in the Admin UX study and in the Promote Drupal initiative, and is an active member of the Canadian Drupal community. Find out more about Suzanne in our interview.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I started a web agency in Montreal about 12 years ago, Evolving Web, and after about a year we decided to specialize in Drupal; so, I started going to Drupal meetups and got really involved in the local community in Montreal, even organizing DrupalCamp Montreal. 

Then over the years I’ve built up my Drupal team at Evolving Web, kept going to events and got more and more involved, organizing things like codesprints and then getting involved in contributing in small ways - with documentation, etc.

A couple years ago I started getting involved in the Admin UX study for Drupal and I’ve been really passionate about that. It’s an initiative to improve the content editor experience for Drupal. One of the things I’m most excited about right now is actually the Promote Drupal initiative, which I think has a lot of potential to build the market for Drupal.

Last year, I was then elected to the board of the Drupal Association; it’s been a lot of fun just getting right in there and seeing the potential of the communities and all these ideas around growing Drupal. I’m really excited about that too. 

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I think the first time I downloaded Drupal was about 6 months before we decided to start using it for projects. My business partner and husband Alex said “Oh, maybe we should try using Drupal!” and I think I went to drupal.org and tried to install it, and I didn’t get very far. That was probably my first encounter.

But the second encounter was when we had a project for a political party in Quebec. Every website here in Quebec has to be in English and French, so they were pretty keen to use Drupal. So we said to ourselves, “Okay, we know Ruby on Rails, we know WordPress, I think we can figure out Drupal. No problem, we’ll figure it out!” 

This was when Drupal 6 had just come out, and there were some bugs in the multilingual system that we found. So, the first encounter with Drupal was a very positive one, but also a challenge, and we got right in there and started fixing things. 

To the question of what convinced me to stay, the software or the community, I would say both. In the last 6 years, I built a training program around Drupal, so I think what keeps me really excited about Drupal is that when I teach people, they feel very empowered. And there is a learning curve around Drupal, but I think when I get to actually help people learn and get over that learning curve it’s very inspiring, and so that’s a big part of what motivates me. 

And what keeps me involved in the community is just how much I can learn from everyone and how passionate everyone is. There are so many examples of people in the community who have really inspired me. That’s what keeps me not just using Drupal but giving back and being involved.

3. What impact has Drupal made on you? Is there a particular moment you remember?

One of the great experiences I’ve had was I think about 6 or 7 years ago now when we organized a DrupalCamp in Montreal and we decided that instead of just a DrupalCamp we also wanted to organize a codesprint. 

We figured that since we’re in Montreal and we all build multilingual websites, we should organize a multilingual codesprint. This was when work on Drupal 8 was just starting; it was still a long way off, and work was just beginning on the Multilingual initiative.

Because we decided to do this far enough in advance, we were able to get a community grant from the Drupal Association to help us pay for different people to come from Europe to participate. That meant that we had Gábor come, as well as Francesco Placella (plach__) who created the Entity Translation module for Drupal. 

We had these people coming from Europe to participate and that inspired a lot of excitement in the local community. The developers on our team got really into it; we had a lot of momentum behind this codesprint.

It was just so exciting to see how this local group could create an international codesprint and really get some good work done. Drupal 8 was a long way off, but still we were able to make some good progress that weekend.

4. How do you explain what Drupal is to other, non-Drupal people?

What I would normally say is that when you look at websites, they are really very similar, they have a lot of the same features. You have a menu across the top, you have a logo in the top left corner, so instead of creating a website from scratch you want to use a platform to do it.

What Drupal lets you do is get a lot of these features that everybody uses out of the box, while also giving you the flexibility to customize your website however you want. You can add features, you can add things like event search and ecommerce - the sky’s the limit. Drupal strikes that balance of providing key features out of the box, and also letting you customize everything.

5. How did you see Drupal evolving over the years? What do you think the future will bring?

I see Drupal evolving to become more mature and used by bigger and bigger organizations, which is really exciting. I also hope that Drupal, being such a flexible tool, will still be useful to the organizations that use it today. I think it’s versatile enough that it can be used by many different communities; for example, the higher education and non-profit communities have really embraced Drupal and I see that being a long-term thing.

I see Drupal kind of growing to new places, new applications, especially with the maturing of decoupled Drupal solutions. I think it’s really going to evolve a lot in the near future. We’re going to see more standardization on how to do decoupled Drupal and that’s really going to change the landscape.

I also think that the Drupal community is evolving. At the last DrupalCon, there was a content editors / digital marketers track, and it was really exciting to see people coming from the community who aren’t necessarily developers, but more people on the content and marketing side, people we think of as users.

I think Drupal needs more of that, our community needs to embrace people with a larger set of skills and backgrounds in order to keep growing. We need to have marketers, we need to have UX designers and project managers involved in the project in order for it to be successful. 

6. What are some of the contributions to open source code or to the community made by you or other Drupalists that you are most proud of?

There are a couple of things. I’m really excited about things like the Media initiative, which seems like it’s really driven by creating a great user experience. It’s always really positive to start to see development that isn’t just driven by a functional set, but more in terms of “here’s what the users wanted and here’s a picture of what we want to build”, with the focus on a good user experience.

Another initiative that I’m really impressed by is the Layout Initiative. It’s such a mature initiative. The focus on accessibility really shows that Drupal is such a leader in the open source community. The team is creating a tool that’s so flexible and innovative, and gives so much power to the content editor, but at the same time really focused on creating a tool that’s accessible and can be used by anyone.

7. Is there an initiative or a project in the Drupal space that you would like to promote or highlight?

Beyond code there are some of these initiatives that I think are really worth highlighting. I’m really excited about where the Promote Drupal initiative is going and how to get more marketers involved in that. There’s also an event organizers working group that’s being formed to help Drupal Camps come together and share resources. I think both of these have the potential to grow our community.

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

This summer I joined a bike club and that’s been really fun, doing something not in front of a screen and instead just getting outside. And, just like with Drupal, people are really passionate about cycling and welcoming novices like me into the community. So I’m excited about going to DrupalCon Amsterdam in October and cycling around!
 

Sep 13 2019
Sep 13

Throughout the series, we explored many migration topics. We started with an overview of the ETL process and workflows for managing migrations. Then, we presented example migrations for different entities: nodes, files, images, taxonomy terms, users, and paragraphs. Next, we shifted focus to migrations from different sources: CSV, JSON, XML, Google Sheet, Microsoft Excel, and LibreOffice Calc files. Later, we explored how to manage migrations as configuration, use groups to share configuration, and execute migrations from the user interface. Finally, we gave recommendations and provided tools for debugging migrations from the command line and the user interface. Although we covered a lot of ground, we only scratched the surface. The Migrate API is so flexible that its use cases are virtually endless. To wrap up the series, we present an introduction to a very popular topic: Drupal upgrades. Let’s get started.

Note: In this article, when we talk about Drupal 7, the same applies to Drupal 6.

What is a Drupal upgrade?

The information we presented in the series is generic enough that it applies to many types of Drupal migrations. There is one particular use case that stands out from the rest: Drupal upgrades. An upgrade is the process of taking your existing Drupal site and copying its configuration and content over to a new major version of Drupal. For example, going from Drupal 6 or 7 to Drupal 8. The following is an oversimplification of the workflow to perform the upgrade process:

  • Install a fresh Drupal 8 site.
  • Add credentials so that the new site can connect to Drupal 7’s database.
  • Use the Migrate API to generate migration definition files. They will copy over Drupal 7’s configuration and content. This step is only about generating the YAML files.
  • Execute those migrations to bring the configuration and content over to Drupal 8.

Preparing your migration

Any migration project requires a good plan of action, but this is particularly important for Drupal upgrades. You need to have a general sense of how the upgrade process works, what assumptions are made by the system, and what limitations exist. Read this article for more details on how to prepare a site for upgrading it to Drupal 8. Some highlights include:

  • Both sites need to be in the latest stable version of their corresponding branch. That means the latest release of Drupal 7 and 8 at the time of performing the upgrade process. This also applies to any contributed module.
  • Do not do any configuration of the Drupal 8 site until the upgrade process is completed. Any configuration you make will be overridden, and there is no need for it anyway. Part of the process includes recreating the old site’s configuration: content types, fields, taxonomy vocabularies, etc.
  • Do not create content on the Drupal 8 site until the upgrade process is completed. The upgrade process will keep the unique identifiers from the source site: `nid`, `uid`, `tid`, `fid`, etc. If you were to create content, the references among entities could be broken when the upgrade process overrides the unique identifiers. To prevent data loss, wait until the old site's content has been migrated to start adding content to the new site.
  • For the system to detect a module’s configuration to be upgraded automatically, it has to be enabled on both sites. This applies to contributed modules in Drupal 7 (e.g., Link) that were moved to core in Drupal 8, as well as to Drupal 7 modules (e.g., Address Field) that were superseded by a different one in Drupal 8 (e.g., Address). In either case, as long as the modules are enabled on both ends, their configuration and content will be migrated. This assumes that the Drupal 8 counterpart offers an automatic upgrade path.
  • Some modules do not offer automatic upgrade paths. The primary example is the Views module. This means that any view created in Drupal 7 needs to be manually recreated in Drupal 8.
  • The upgrade procedure is all about moving data, not logic in custom code. If you have custom modules, the custom code needs to be ported separately. If those modules store data in Drupal’s database, you can use the Migrate API to move it over to the new site.
  • Similarly, you will have to recreate the theme from scratch. Drupal 8 introduced Twig, which is significantly different from the PHPTemplate engine used by Drupal 7.

Customizing your migration

Note that the creation and execution of the migration files are separate steps. Upgrading to a major version of Drupal is often a good opportunity to introduce changes to the website. For example, you might want to change the content modeling, navigation, user permissions, etc. To accomplish that, you can modify the generated migration files to account for any scenario where the new site’s configuration diverges from the old one. Only when you are done with the customizations do you execute the migrations. Examples of things that could change include (a small sketch follows this list):

  • Combining or breaking apart content types.
  • Moving data about people from node entities to user entities, or vice versa.
  • Renaming content types, fields, taxonomy vocabularies and terms, etc.
  • Changing field types. For example, going from Address Field module in Drupal 7 to Address module in Drupal 8.
  • Merging multiple taxonomy vocabularies into one.
  • Changing how your content is structured. For example, going from a monolithic body field to paragraph entities.
  • Changing how your multimedia files are stored. For example, going from image fields to media entities.

Performing the upgrade

There are two options to perform the upgrade. In both cases, the process is initiated from the Drupal 8 site. One way is using the Migrate Drupal UI core module to perform the upgrade from the browser’s user interface. When the module is enabled, go to `/upgrade` and provide the database credentials of the Drupal 7 site. Based on the installed modules on both sites, the system will give you a report of what can be automatically upgraded. Consider the limitations explained above. While the upgrade process is running, you will see a stream of messages about the operation. These messages are logged to the database so you can read them after the upgrade is completed. If your dataset is big or there are many expensive operations like password encryption, the process can take a long time to complete or fail altogether.

The other way to perform the upgrade procedure is from the command line using Drush. This requires the Migrate Upgrade contributed module. When enabled, it adds Drush commands to import and rollback a full upgrade operation. You can provide database connection details of the old site via command line options. One benefit of using this approach is that you can create the migration files without running them. This lets you do customizations as explained above. When you are done, you can run the migrations following the same workflow of manually created ones.
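A sketch of that command-line workflow follows; treat the exact command and option names as assumptions to verify against your versions of Drush and the Migrate Upgrade module. Generating the migrations without executing them might look like:

drush migrate-upgrade --legacy-db-url=mysql://user:password@localhost/drupal7 --legacy-root=https://example.com --configure-only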

Known issues and limitations

Depending on whether you are upgrading from Drupal 6 or 7, there is a list of known issues you need to be aware of. Read this article for more information. One area that can be tricky is multilingual support. As of this writing, the upgrade path for multilingual sites is not complete. Limited support is available via the Migrate Drupal Multilingual core module. There are many things to consider when working with multilingual migrations. For example, are you using node or field translations? Do entities have revisions? Read this article for more information.

Upgrade paths for contributed modules

The automatic upgrade procedure only supports Drupal core modules. This includes modules that were added to core in Drupal 8. For any other contributed module, it is the maintainers’ decision to include an automatic upgrade path or not. For example, the Geofield module provides an upgrade path. It is also possible that a module in Drupal 8 offers an upgrade path from a different module in Drupal 7. For example, the Address module provides an upgrade path from the Address Field module. Drupal Commerce also provides some support via the Commerce Migrate module.

Not every module offers an automated upgrade path. In such cases, you can write custom plugins which ideally are contributed back to Drupal.org ;-) Or you can use the techniques learned in the series to transform your source data into the structures expected by Drupal 8. In both cases, having a broad understanding of the Migrate API will be very useful.

Upgrade strategies

There are multiple migration strategies. You might even consider manually recreating the content if there is only a handful of data to move. Or you might decide to use the Migrate API to upgrade part of the site automatically and do a manual copy of a different portion of it. You might want to execute a fully automated upgrade procedure and manually clean up edge cases afterward. Or you might want to customize the migrations to account for those edge cases already. Michael Anello created an insightful presentation on different migration strategies. Our tips for writing migrations apply as well.

Drupal upgrades tend to be fun, challenging projects. The more you know about the Migrate API, the easier it will be to complete the project. We enjoyed writing this overview of the Drupal Migrate API. We would love to work on a follow-up series focused on Drupal upgrades. If you or your organization could sponsor such an endeavor, please reach out to us via the site’s contact form.

What about upgrading to Drupal 9?

In March 2017, project lead Dries Buytaert announced a plan to make Drupal upgrades easier forever. This was reinforced during his keynote at DrupalCon Seattle 2019. You can watch the video recording in this link. In short, Drupal 9.0 will be the latest point release of Drupal 8 minus deprecated APIs. This has very important implications:

  • When Drupal 9 is released, the Migrate API should be mostly the same as Drupal 8. Therefore, anything that you learn today will be useful for Drupal 9 as well.
  • As long as your code does not use deprecated APIs, upgrading from Drupal 8 to Drupal 9 will be as easy as updating from Drupal 8.7 to 8.8.
  • Because of this, there is no need to wait for Drupal 9 to upgrade your Drupal 6 or 7 site. You can upgrade to Drupal 8 today.

Thank you!

And that concludes the #31DaysOfMigration series. For joining us in this learning experience, thank you very much! ¡Muchas gracias! Merci beaucoup! :-D We are also very grateful to Agaric.coop, Drupalize.Me, and Centarro.io for sponsoring this series.

What did you learn in today’s blog post? Did you know the upgrade process is able to copy content and configuration? Did you know that you can execute the upgrade procedure either from the user interface or the command line? Share your answers in the comments. Also, we would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors: Drupalize.me by Osio Labs has online tutorials about migrations, among other topics, and Agaric provides migration trainings, among other services.  Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 13 2019
Sep 13

When one starts working with migrations, it is easy to be overwhelmed by so many modules providing migration functionality. Throughout the series, we presented many of them trying to cover one module at a time. We did this to help the reader understand when a particular module is truly needed and why. But we only scratched the surface. Today’s article presents a list of migration related Drupal modules, all good for Drupal 8, for quick reference. Let’s get started.

Core modules

At the time of this writing, Drupal core ships with four migration modules:

  • Migrate: provides the base API for migrating data.
  • Migrate Drupal: offers functionality to migrate from other Drupal installations. It serves as the foundation for upgrades from Drupal 6 and 7. It also supports reading configuration entities from Drupal 8 sites.
  • Migrate Drupal UI: provides a user interface for upgrading a Drupal 6 or 7 site to Drupal 8.
  • Migrate Drupal Multilingual: is an experimental module required for multilingual migrations. When those upgrade paths become stable, the module will be removed from Drupal core. See this article for more information.

Migration runners

Once the migration definition files have been created, there are many options to execute them:

Source plugins

The Migrate API offers many options to fetch data from:

Destination plugins

The Migrate API is mostly used to move data into Drupal, but it is possible to write to other destinations:

  • Migrate (core): provides classes for creating content and configuration entities. It also offers the `null` plugin which in itself does not write to anything. It is used in multilingual migrations for entity references.
  • Migrate Plus: provides the `table` plugin for migrating into tables not registered with Drupal Schema API.
  • CSV file: example destination plugin implementation to write CSV files. The module was created by J Franks for a DrupalCamp presentation. Check out the repository and video recording.

Development related

These modules can help with writing Drupal migrations:

Field and module related

Modules created by Tess Flynn (socketwench)

While doing the research for this article, we found many useful modules created by Tess Flynn (socketwench). She is a fantastic presenter who also has written about Drupal migrations, testing, and much more. Here are some of her modules:

Miscellaneous

  • Feeds Migrate: it aims to provide a user interface similar to the one from Drupal 7’s Feeds module, but working on top of Drupal 8’s Migrate API.
  • Migrate Override: allows flagging fields in a content entity so they can be manually changed by site editors without being overridden in a subsequent migration.
  • Migrate Status: checks if migrations are currently running.
  • Migrate QA: provides tools for validating content migrations. See this presentation for more details.

And many, many more modules!

What did you learn in today’s blog post? Did you find a new module that could be useful in current or future projects? Did we miss a module that has been very useful to you? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Next: Introduction to Drupal 8 upgrades

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors: Drupalize.me by Osio Labs has online tutorials about migrations, among other topics, and Agaric provides migration trainings, among other services.  Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.


Sep 12 2019
Sep 12

In recent articles, we have presented some recommendations and tools to debug Drupal migrations. Using a proper debugger is definitely the best way to debug Drupal, be it migrations or other subsystems. In today’s article, we are going to learn how to configure XDebug inside DrupalVM to connect to PHPStorm. First, via the command line using Drush commands. And then, via the user interface using a browser. Let’s get started.

Important: User interfaces tend to change. Screenshots and referenced on-screen text might differ in new versions of the different tools. They can also vary per operating system. This article uses menu items from Linux. Refer to the official DrupalVM documentation for detailed installation and configuration instructions. For this article, it is assumed that VirtualBox, Vagrant, and Ansible are already installed. If you need help with those, refer to DrupalVM’s installation guide.

Getting DrupalVM

First, get a copy of DrupalVM by cloning the repository or downloading a ZIP or TAR.GZ file from the available releases. If you downloaded a compressed file, expand it to have access to the configuration files. Before creating the virtual machine, make a copy of default.config.yml into a new file named config.yml. The latter will be used by DrupalVM to configure the virtual machine (VM). In this file, make the following changes:

# config.yml file
# Based off default.config.yml
vagrant_hostname: migratedebug.test
vagrant_machine_name: migratedebug
# For dynamic IP assignment the 'vagrant-auto_network' plugin is required.
# Otherwise, use an IP address that has not been used by any other virtual machine.
vagrant_ip: 0.0.0.0

# All the other extra packages can remain enabled.
# Make sure the following three get installed by uncommenting them.
installed_extras:
  - drupalconsole
  - drush
  - xdebug

php_xdebug_default_enable: 1
php_xdebug_cli_disable: no

The vagrant_hostname is the URL you will enter in your browser’s address bar to access the Drupal installation. Set vagrant_ip to an IP that has not been taken by another virtual machine. If you are unsure, you can set the value to 0.0.0.0 and install the vagrant-auto_network plugin. The plugin will make sure that an available IP is assigned to the VM. In the installed_extras section, uncomment xdebug and drupalconsole. Drupal Console is not necessary for getting XDebug to work, but it offers many code introspection tools that are very useful for Drupal debugging in general. Finally, set php_xdebug_default_enable to 1 and php_xdebug_cli_disable to no. These last two settings are very important for being able to debug Drush commands.

Then, open a terminal and change directory to where the DrupalVM files are located. Keep the terminal open; we are going to execute various commands from there. Start the virtual machine by executing vagrant up. If you had already created the VM, you can still make changes to the config.yml file and then reprovision. If the virtual machine is running, execute the command: vagrant provision. Otherwise, you can start and reprovision the VM in a single command: vagrant up --provision. Finally, SSH into the VM by executing vagrant ssh.

By default, DrupalVM will use the Drupal composer template project to get a copy of Drupal. That means that you will be managing your module and theme dependencies using composer. When you SSH into the virtual machine, you will be in the /var/www/drupalvm/drupal/web directory. That is Drupal’s docroot. The composer file that manages the installation is actually one directory up. Normally, if you run a composer command from a directory that does not have a composer.json file, composer will try to find one up in the directory hierarchy. Feel free to manually go one directory up or rely on composer’s default behaviour to locate the file.

For good measure, let’s install some contributed modules. Inside the virtual machine, in Drupal’s docroot, execute the following command: composer require drupal/migrate_plus drupal/migrate_tools. You can also create a directory at /var/www/drupalvm/drupal/web/modules/custom and place the custom module we have been working on throughout the series. You can get it at https://github.com/dinarcon/ud_migrations.

To make sure things are working, let’s enable one of the example modules by executing: drush pm-enable ud_migrations_config_entity_lookup_entity_generate. This module comes with one migration: udm_config_entity_lookup_entity_generate_node. If you execute drush migrate:status, the example migration should be listed.

Configuring PHPStorm

With Drupal already installed and the virtual machine running, let’s configure PHPStorm. Start a new project pointing to the DrupalVM files. Feel free to follow your preferred approach to project creation. For reference, one way to do it is by going to "Files > Create New Project from Existing Files". In the dialog, select "Source files are in a local directory, no web server is configured yet." and click next. Look for the DrupalVM directory, click on it, click on “Project Root”, and then “Finish”. PHPStorm will begin indexing the files and detect that it is a Drupal project. It will prompt you to enable the Drupal coding standards, indicate which directory contains the installation path, and if you want to set PHP include paths. All of that is optional but recommended, especially if you want to use this VM for long term development.

Now the important part. Go to “Files > Settings > Language and Frameworks > PHP”. In the panel, there is a text box labeled “CLI Interpreter”. In the right end, there is a button with three dots like an ellipsis (...). The next step requires that the virtual machine is running because PHPStorm will try to connect to it. After verifying that it is the case, click the plus (+) button at the top left corner to add a CLI Interpreter. From the list that appears, select “From Docker, Vagrant, VM, Remote...”. In the “Configure Remote PHP Interpreter” dialog select “Vagrant”. PHPStorm will detect the SSH connection to connect to the virtual machine. Click “OK” to close the multiple dialog boxes. When you go back to the “Languages & Frameworks” dialog, you can set the “PHP language level” to match the same version from the Remote CLI Interpreter.

Remote PHP interpreter.

CLI interpreters.

You are almost ready to start debugging. There are a few things pending to do. First, let’s create a breakpoint in the import method of the MigrateExecutable class. You can go to “Navigate > Class” to the project by class name. Or click around in the Project structure until you find the class. It is located at ./drupal/web/core/modules/migrate/src/MigrateExecutable.php in the VM directory. You can add a breakpoint by clicking on the bar to the left of the code area. A red circle will appear, indicating that the breakpoint has been added.

Then, you need to instruct PHPStorm to listen for debugging connections. For this, click on “Run > Start Listening for PHP Debugging Connections”. Finally, you have to set some server mappings. For this you will need the IP address of the virtual machine. If you configured the VM to assign the IP dynamically, you can skip this step momentarily. PHPStorm will detect the incoming connection, create a server with the proper IP, and then you can set the path mappings.

Triggering the breakpoint

Let’s switch back to the terminal. If you are not inside the virtual machine, you can SSH into the VM executing vagrant ssh. Then, execute the following command (everything in one line):

XDEBUG_CONFIG="idekey=PHPSTORM" /var/www/drupalvm/drupal/vendor/bin/drush migrate:import udm_config_entity_lookup_entity_generate_node

For the breakpoint to be triggered, the following needs to happen:

  • You must execute Drush from the vendor directory. DrupalVM has a globally available Drush binary located at /usr/local/bin/drush. That is not the one to use. For debugging purposes, always execute Drush from the vendor directory.
  • The command needs to have XDEBUG_CONFIG environment variable set to “idekey=PHPSTORM”. There are many ways to accomplish this, but prepending the variable as shown in the example is a valid way to do it.
  • Because the breakpoint was set in the import method, we need to execute an import command to stop at the breakpoint. The migration in the example Drush command is part of the example module that was enabled earlier.

When the command is executed, a dialog will appear in PHPStorm. In it, you will be asked to select a project or a file to debug. Accept what is selected by default for now.  By accepting the prompt, a new server will be configured using the proper IP of the virtual machine.  After doing so, go to “Files > Settings > Language and Frameworks > PHP > Servers”. You should see one already created. Make sure the “Use path mappings” option is selected. Then, look for the direct child of “Project files”. It should be the directory in your host computer where the VM files are located. In that row, set the “Absolute path on the server” column to  /var/www/drupalvm. You can delete any other path mapping. There should only be one from the previous prompt. Now, click “OK” in the dialog to accept the changes.

Incoming Drush connection.

Path mappings

Finally, run the Drush command from inside the virtual machine once more. This time the program execution should stop at the breakpoint. You can use the Debug panel to step over each line of code and see how the variables change over time. Feel free to add more breakpoints as needed. In the previous article, there are some suggestions about that. When you are done, let PHPStorm know that it should no longer listen for connections. For that, click on “Run > Stop Listening for PHP Debugging Connections”. And that is how you can debug Drush commands for Drupal migrations.

Path mappings.

Debugging from the user interface

If you also want to be able to debug from the user interface, go to https://www.jetbrains.com/phpstorm/marklets/ and generate the bookmarklets for XDebug. The IDE Key should be PHPSTORM. When the bookmarklets are created, you can drag and drop them into your browser’s bookmarks toolbar. Then, you can click on them to start and stop a debugging session. The IDE needs to be listening for incoming debugging connections, as was the case for Drush commands.

PHPStorm bookmarklets generator

Note: There are browser extensions that let you start and stop debugging sessions. Check the extensions repository of your browser to see which options are available.

Finally, set breakpoints as needed and go to a page that would trigger them. If you are following along with the example, you can go to http://migratedebug.test/admin/structure/migrate/manage/default/migrations/udm_config_entity_lookup_entity_generate_node/execute. Once there, select the “Import” operation and click the “Execute” button. This should open a prompt in PHPStorm to select a project or a file to debug. Select the index.php located in Drupal’s docroot. After accepting the connection, a new server should be configured with the proper path mappings. At this point, you should hit the breakpoint again.

Incoming web connections.

Happy debugging! :-)

What did you learn in today’s blog post? Did you know how to debug Drush commands? Did you know how to trigger a debugging session from the browser? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors: Drupalize.me by Osio Labs has online tutorials about migrations, among other topics, and Agaric provides migration trainings, among other services.  Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 12 2019
Sep 12

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

An in-depth analysis of how Drupal's development was sponsored between July 1, 2018 and June 30, 2019.

Over the past years, I've examined Drupal.org's contribution data to understand who develops Drupal, how diverse the Drupal community is, how much of Drupal's maintenance and innovation is sponsored, and where that sponsorship comes from.

You can look at the 2016 report, the 2017 report, and the 2018 report. Each report looks at data collected in the 12-month period between July 1st and June 30th.

This year's report shows that:

  • Both the recorded number of contributors and contributions have increased.
  • Most contributions are sponsored, but volunteer contributions remain very important to Drupal's success.
  • Drupal's maintenance and innovation depends mostly on smaller Drupal agencies and Acquia. Hosting companies, multi-platform digital marketing agencies, large system integrators and end users make fewer contributions to Drupal.
  • Drupal's contributors have become more diverse, but are still not diverse enough.

Methodology

What are Drupal.org issues?

"Issues" are pages on Drupal.org. Each issue tracks an idea, feature request, bug report, task, or more. See https://www.drupal.org/project/issues for the list of all issues.

For this report, we looked at all Drupal.org issues marked "closed" or "fixed" in the 12-month period from July 1, 2018 to June 30, 2019. The issues analyzed in this report span Drupal core and thousands of contributed projects, across all major versions of Drupal.

What are Drupal.org credits?

In the spring of 2015, after proposing initial ideas for giving credit, Drupal.org added the ability for people to attribute their work in the Drupal.org issues to an organization or customer, or mark it the result of volunteer efforts.

Example issue credit on drupal org

A screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.

Drupal.org's credit system is truly unique and groundbreaking in Open Source and provides unprecedented insights into the inner workings of a large Open Source project. There are a few limitations with this approach, which we'll address at the end of this report.

What is the Drupal community working on?

In the 12-month period between July 1, 2018 and June 30, 2019, 27,522 issues were marked "closed" or "fixed", a 13% increase from the 24,447 issues in the 2017-2018 period.

In total, the Drupal community worked on 3,474 different Drupal.org projects this year compared to 3,229 projects in the 2017-2018 period — an 8% year over year increase.

The majority of the credits are the result of work on contributed modules:

A pie chart showing contributions by project type: most contributions are to contributed modules.

Compared to the previous period, contribution credits increased across all project types:

A graph showing the year over year growth of contributions per project type.

The most notable change is the large jump in "non-product credits": more and more members in the community started tracking credits for non-product activities such as organizing Drupal events (e.g. the DrupalCamp Delhi project, Drupal Developer Days, Drupal Europe and DrupalCon Europe), promoting Drupal (e.g. the Drupal pitch deck), or community working groups (e.g. the Drupal Diversity and Inclusion Working Group and the Governance Working Group).

While some of these increases reflect new contributions, others are existing contributions that are newly reported. All contributions are valuable, whether they're code contributions, or non-product and community-oriented contributions such as organizing events, giving talks, leading sprints, etc. The fact that the credit system is becoming more accurate in recognizing more types of Open Source contribution is both important and positive.

Who is working on Drupal?

For this report's time period, Drupal.org's credit system received contributions from 8,513 different individuals and 1,137 different organizations — a meaningful increase from last year's report.

A graph showing that the number of individual and organizational contributors increased year over year.

Consistent with previous years, approximately 51% of the individual contributors received just one credit. Meanwhile, the top 30 contributors (the top 0.4%) account for 19% of the total credits. In other words, a relatively small number of individuals do the majority of the work. These individuals put an incredible amount of time and effort into developing Drupal and its contributed projects:

Rank, username, and number of issues credited:

1. kiamlaluno (1610)
2. jrockowitz (756)
3. alexpott (642)
4. RajabNatshah (616)
5. volkswagenchick (519)
6. bojanz (504)
7. alonaoneill (489)
8. thalles (488)
9. Wim Leers (437)
10. DamienMcKenna (431)
11. Berdir (424)
12. chipway (356)
13. larowlan (324)
14. pifagor (320)
15. catch (313)
16. mglaman (277)
17. adci_contributor (274)
18. quietone (266)
19. tim.plunkett (265)
20. gaurav.kapoor (253)
21. RenatoG (246)
22. heddn (243)
23. chr.fritsch (241)
24. xjm (238)
25. phenaproxima (238)
26. mkalkbrenner (235)
27. gvso (232)
28. dawehner (219)
29. e0ipso (218)
30. drumm (205)

Out of the top 30 contributors featured this year, 28 were active contributors in the 2017-2018 period as well. These Drupalists' dedication and continued contribution to the project has been crucial to Drupal's development.

It's also important to recognize that most of the top 30 contributors are sponsored by an organization. Their sponsorship details are provided later in this article. We value the organizations that sponsor these remarkable individuals, because without their support, it could be more challenging for these individuals to be in the top 30.

It's also nice to see two new contributors make the top 30 this year — Alona O'neill with sponsorship from Hook 42 and Thalles Ferreira with sponsorship from CI&T. Most of their credits were the result of smaller patches (e.g. removing deprecated code, fixing coding style issues, etc.) or in some cases non-product credits rather than new feature development or fixing complex bugs. These types of contributions are valuable and often a stepping stone towards more in-depth contribution.

How much of the work is sponsored?

Issue credits can be marked as "volunteer" and "sponsored" simultaneously (shown in jamadar's screenshot near the top of this post). This could be the case when a contributor does the necessary work to satisfy the customer's need, in addition to using their spare time to add extra functionality.

For those credits with attribution details, 18% were "purely volunteer" credits (8,433 credits), in stark contrast to the 65% that were "purely sponsored" (29,802 credits). While there are almost four times as many "purely sponsored" credits as "purely volunteer" credits, volunteer contribution remains very important to Drupal.

Contributions by volunteer vs sponsored

Both "purely volunteer" and "purely sponsored" credits grew — "purely sponsored" credits grew faster in absolute numbers, but for the first time in four years "purely volunteer" credits grew faster in relative numbers.

The large jump in volunteer credits can be explained by the community capturing more non-product contributions. As can be seen on the graph below, these non-product contributions are more volunteer-centric.

A graph showing how much of the contributions are volunteered vs sponsored.

Who is sponsoring the work?

Now that we've established that the majority of contributions to Drupal are sponsored, let's study which organizations contribute to Drupal. While 1,137 different organizations contributed to Drupal, approximately 50% of them received four credits or less. The top 30 organizations (roughly the top 3%) account for approximately 25% of the total credits, which implies that the top 30 companies play a crucial role in the health of the Drupal project.


Top contributing organizations based on the number of issue credits.

While not immediately obvious from the graph above, a variety of different types of companies are active in Drupal's ecosystem:

  • Traditional Drupal businesses: Small-to-medium-sized professional services companies that primarily make money using Drupal. They typically employ fewer than 100 employees, and because they specialize in Drupal, many of these professional services companies contribute frequently and are a huge part of our community. Examples are Hook42, Centarro, The Big Blue House, Vardot, etc.
  • Digital marketing agencies: Larger full-service agencies that have marketing-led practices using a variety of tools, typically including Drupal, Adobe Experience Manager, Sitecore, WordPress, etc. They tend to be larger, with many of the larger agencies employing thousands of people. Examples are Wunderman, Possible and Mirum.
  • System integrators: Larger companies that specialize in bringing together different technologies into one solution. Example system agencies are Accenture, TATA Consultancy Services, Capgemini and CI&T.
  • Hosting companies: Examples are Acquia, Rackspace, Pantheon and Platform.sh.
  • End users: Examples are Pfizer or bio.logis Genetic Information Management GmbH.

A few observations:

  • Almost all of the sponsors in the top 30 are traditional Drupal businesses with fewer than 50 employees. Only five companies in the top 30 — Pfizer, Google, CI&T, bio.logis and Acquia — are not traditional Drupal businesses. The traditional Drupal businesses are responsible for almost 80% of all the credits in the top 30. This percentage goes up if you extend beyond the top 30. It's fair to say that Drupal's maintenance and innovation largely depends on these traditional Drupal businesses.
  • The larger, multi-platform digital marketing agencies are barely contributing to Drupal. While more and more large digital agencies are building out Drupal practices, no digital marketing agencies show up in the top 30, and hardly any appear in the entire list of contributing organizations. While they are not required to contribute, I'm frustrated that we have not yet found the right way to communicate the value of contribution to these companies. We need to incentivize each of these firms to contribute back with the same commitment that we see from traditional Drupal businesses.
  • The only system integrator in the top 30 is CI&T, which ranked 4th with 795 credits. As far as system integrators are concerned, CI&T is a smaller player with approximately 2,500 employees. However, we do see various system integrators outside of the top 30, including Globant, Capgemini, Sapient and TATA Consultancy Services. In the past year, Capgemini almost quadrupled their credits from 46 to 196, TATA doubled its credits from 85 to 194, Sapient doubled its credits from 28 to 65, and Globant kept more or less steady with 41 credits. Accenture and Wipro do not appear to contribute despite doing a fair amount of Drupal work in the field.
  • Hosting companies also play an important role in our community, yet only Acquia appears in the top 30. Rackspace has 68 credits, Pantheon has 43, and Platform.sh has 23. I looked for other hosting companies in the data, but couldn't find any. In general, there is a persistent problem with hosting companies that make a lot of money with Drupal not contributing back. The contribution gap between Acquia and other hosting companies has increased, not decreased.
  • We also saw three end users in the top 30 as corporate sponsors: Pfizer (453 credits), Thunder (659 credits, up from 432 credits the year before), and the German company, bio.logis (330 credits). A notable end user is Johnson & Johnson, who was just outside of the top 30, with 221 credits, up from 29 credits the year before. Other end users outside of the top 30 include the European Commission (189 credits), Workday (112 credits), Paypal (80 credits), NBCUniversal (48 credits), Wolters Kluwer (20 credits), and Burda Media (24 credits). We also saw contributions from many universities, including the University of British Columbia (148 credits), University of Waterloo (129 credits), Princeton University (73 credits), University of Texas at Austin (57 credits), Charles Darwin University (24 credits), University of Edinburgh (23 credits), University of Minnesota (19 credits) and many more.

A graph showing that Acquia is by far the number one contributing hosting company.

Contributions by system integrators

It would be interesting to see what would happen if more end users mandated contributions from their partners. Pfizer, for example, only works with agencies that contribute back to Drupal, and uses Drupal's credit system to verify their vendors' claims. The State of Georgia started doing the same; they also made Open Source contribution a vendor selection criterion. If more end users took this stance, it could have a big impact on the number of digital agencies, hosting companies and system integrators that contribute to Drupal.

While we should encourage more organizations to sponsor Drupal contributions, we should also understand and respect that some organizations can give more than others and that some might not be able to give back at all. Our goal is not to foster an environment that demands what and how others should give back. Instead, we need to help foster an environment worthy of contribution. This is clearly laid out in Drupal's Values and Principles.

How diverse is Drupal?

Supporting diversity and inclusion within Drupal is essential to the health and success of the project. The people who work on Drupal should reflect the diversity of people who use and work with the web.

I looked at both the gender and geographic diversity of Drupal.org contributors. While these are only two examples of diversity, these are the only diversity characteristics we currently have sufficient data for. Drupal.org recently rolled out support for Big 8/Big 10, so next year we should have more demographic information.

Gender diversity

The data shows that only 8% of the recorded contributions were made by contributors who do not identify as male, which continues to indicate a wide gender gap. This is a one percent increase compared to last year. The gender imbalance in Drupal is profound and underscores the need to continue fostering diversity and inclusion in our community.

A graph showing contributions by gender: 75% of the contributions come from people who identify as male.

Last year I wrote a post about the privilege of free time in Open Source. It made the case that Open Source is not a meritocracy, because not everyone has equal amounts of free time to contribute. For example, research shows that women still spend more than twice as much time as men on unpaid domestic work, such as housework or childcare. This makes it more difficult for women to contribute to Open Source on an unpaid, volunteer basis. It's one of the reasons why Open Source projects suffer from a lack of diversity, among others including hostile environments and unconscious biases. Drupal.org's credit data unfortunately still shows a big gender disparity in contributions:

A graph that shows that compared to males, female contributors do more sponsored work, and less volunteer work.

Ideally, over time, we can collect more data on non-binary gender designations, as well as segment some of the trends behind contributions by gender. We can also do better at collecting data on other systemic issues beyond gender alone. Knowing more about these trends can help us close existing gaps. In the meantime, organizations capable of giving back should consider financially sponsoring individuals from underrepresented groups to contribute to Open Source. Each of us needs to decide if and how we can help give time and opportunities to underrepresented groups and how we can create equity for everyone in Drupal.

Geographic diversity

When measuring geographic diversity, we saw individual contributors from six continents and 114 countries:

A graph that shows most contributions come from Europe and North America.

Contributions per capita

Contribution credits per capita are calculated as the number of contributions per continent divided by the population of that continent. 0.001% means that one in 100,000 people contribute to Drupal. In North America, 5 in 100,000 people contributed to Drupal in the last year.

Contributions from Europe and North America are both on the rise. In absolute terms, Europe contributes more than North America, but North America contributes more per capita.

Asia, South America and Africa remain big opportunities for Drupal, as their combined population accounts for 6.3 billion out of 7.5 billion people in the world. Unfortunately, the reported contributions from Asia are declining year over year. For example, compared to last year's report, there was a 17% drop in contribution from India. Despite that drop, India remains the second largest contributor behind the United States:

A graph showing the top 20 contributing countries.

The top 20 countries from which contributions originate. The data is compiled by aggregating the countries of all individual contributors behind each issue. Note that the geographical location of contributors doesn't always correspond with the origin of their sponsorship. Wim Leers, for example, works from Belgium, but his funding comes from Acquia, which has the majority of its customers in North America.

Top contributor details

To create more awareness of which organizations are sponsoring the top individual contributors, I included a more detailed overview of the top 50 contributors and their sponsors. If you are a Drupal developer looking for work, these are some of the companies I'd apply to first. If you are an end user looking for a company to work with, these are some of the companies I'd consider working with first. Not only do they know Drupal well, they also help improve your investment in Drupal.

Rank | Username | Issues | Volunteer | Sponsored | Not specified | Sponsors
1 | kiamlaluno | 1610 | 99% | 0% | 1% |
2 | jrockowitz | 756 | 98% | 99% | 0% | The Big Blue House (750), Memorial Sloan Kettering Cancer Center (5), Rosewood Marketing (1)
3 | alexpott | 642 | 6% | 80% | 19% | Thunder (336), Acro Media Inc (100), Chapter Three (77)
4 | RajabNatshah | 616 | 1% | 100% | 0% | Vardot (730), Webship (2)
5 | volkswagenchick | 519 | 2% | 99% | 0% | Hook 42 (341), Kanopi Studios (171)
6 | bojanz | 504 | 0% | 98% | 2% | Centarro (492), Ny Media AS (28), Torchbox (5), Liip (2), Adapt (2)
7 | alonaoneill | 489 | 9% | 99% | 0% | Hook 42 (484)
8 | thalles | 488 | 0% | 100% | 0% | CI&T (488), Janrain (3), Johnson & Johnson (2)
9 | Wim Leers | 437 | 8% | 97% | 0% | Acquia (421), Government of Flanders (3)
10 | DamienMcKenna | 431 | 0% | 97% | 3% | Mediacurrent (420)
11 | Berdir | 424 | 0% | 92% | 8% | MD Systems (390)
12 | chipway | 356 | 0% | 100% | 0% | Chipway (356)
13 | larowlan | 324 | 16% | 94% | 2% | PreviousNext (304), Charles Darwin University (22), University of Technology, Sydney (3), Service NSW (2), Department of Justice & Regulation, Victoria (1)
14 | pifagor | 320 | 52% | 100% | 0% | GOLEMS GABB (618), EPAM Systems (16), Drupal Ukraine Community (6)
15 | catch | 313 | 1% | 95% | 4% | Third & Grove (286), Tag1 Consulting (11), Drupal Association (6), Acquia (4)
16 | mglaman | 277 | 2% | 98% | 1% | Centarro (271), Oomph, Inc. (16), E.C. Barton & Co (3), Gaggle.net, Inc. (1), Bluespark (1), Thinkbean (1), LivePerson, Inc (1), Impactiv, Inc. (1), Rosewood Marketing (1), Acro Media Inc (1)
17 | adci_contributor | 274 | 0% | 100% | 0% | ADCI Solutions (273)
18 | quietone | 266 | 41% | 75% | 1% | Acro Media Inc (200)
19 | tim.plunkett | 265 | 3% | 89% | 9% | Acquia (235)
20 | gaurav.kapoor | 253 | 0% | 51% | 49% | OpenSense Labs (129), DrupalFit (111)
21 | RenatoG | 246 | 0% | 100% | 0% | CI&T (246), Johnson & Johnson (85)
22 | heddn | 243 | 2% | 98% | 2% | MTech, LLC (202), Tag1 Consulting (32), European Commission (22), North Studio (3), Acro Media Inc (2)
23 | chr.fritsch | 241 | 0% | 99% | 1% | Thunder (239)
24 | xjm | 238 | 0% | 85% | 15% | Acquia (202)
25 | phenaproxima | 238 | 0% | 100% | 0% | Acquia (238)
26 | mkalkbrenner | 235 | 0% | 100% | 0% | bio.logis Genetic Information Management GmbH (234), OSCE: Organization for Security and Co-operation in Europe (41), Welsh Government (4)
27 | gvso | 232 | 0% | 100% | 0% | Google Summer of Code (214), Google Code-In (16), Zivtech (1)
28 | dawehner | 219 | 39% | 84% | 8% | Chapter Three (176), Drupal Association (5), Tag1 Consulting (3), TES Global (1)
29 | e0ipso | 218 | 99% | 100% | 0% | Lullabot (217), IBM (23)
30 | drumm | 205 | 0% | 98% | 1% | Drupal Association (201)
31 | gabesullice | 199 | 0% | 100% | 0% | Acquia (198), Aten Design Group (1)
32 | amateescu | 194 | 0% | 97% | 3% | Pfizer, Inc. (186), Drupal Association (1), Chapter Three (1)
33 | klausi | 193 | 2% | 59% | 40% | jobiqo - job board technology (113)
34 | samuel.mortenson | 187 | 42% | 42% | 17% | Acquia (79)
35 | joelpittet | 187 | 28% | 78% | 14% | The University of British Columbia (146)
36 | borisson_ | 185 | 83% | 50% | 3% | Calibrate (79), Dazzle (13), Intracto digital agency (1)
37 | Gábor Hojtsy | 184 | 0% | 97% | 3% | Acquia (178)
38 | adriancid | 182 | 91% | 22% | 2% | Drupiter (40)
39 | eiriksm | 182 | 0% | 100% | 0% | Violinist (178), Ny Media AS (4)
40 | yas | 179 | 12% | 80% | 10% | DOCOMO Innovations, Inc. (143)
41 | TR | 177 | 0% | 0% | 100% |
42 | hass | 173 | 1% | 0% | 99% |
43 | Joachim Namyslo | 172 | 69% | 0% | 31% |
44 | alex_optim | 171 | 0% | 99% | 1% | GOLEMS GABB (338)
45 | flocondetoile | 168 | 0% | 99% | 1% | Flocon de toile (167)
46 | Lendude | 168 | 52% | 99% | 0% | Dx Experts (91), ezCompany (67), Noctilaris (9)
47 | paulvandenburg | 167 | 11% | 72% | 21% | ezCompany (120)
48 | voleger | 165 | 98% | 98% | 2% | GOLEMS GABB (286), Lemberg Solutions Limited (36), Drupal Ukraine Community (1)
49 | lauriii | 164 | 3% | 98% | 1% | Acquia (153), Druid (8), Lääkärikeskus Aava Oy (2)
50 | idebr | 162 | 0% | 99% | 1% | ezCompany (156), One Shoe (5)

Limitations of the credit system

It is important to note a few of the current limitations of Drupal.org's credit system:

  • The credit system doesn't capture all code contributions. Parts of Drupal are developed on GitHub rather than Drupal.org, and often aren't fully credited on Drupal.org. For example, Drush is maintained on GitHub instead of Drupal.org, and companies like Pantheon don't get credit for that work. The Drupal Association is working to integrate GitLab with Drupal.org. GitLab will provide support for "merge requests", which means contributing to Drupal will feel more familiar to the broader audience of Open Source contributors who learned their skills in the post-patch era. Some of GitLab's tools, such as in-line editing and web-based code review, will also lower the barrier to contribution, and should help us grow both the number of contributions and contributors on Drupal.org.
  • The credit system is not used by everyone. There are many ways to contribute to Drupal that are still not captured in the credit system, including things like event organizing or providing support. Technically, that work could be captured as demonstrated by the various non-product initiatives highlighted in this post. Because using the credit system is optional, many contributors don't. As a result, contributions often have incomplete or no contribution credits. We need to encourage all Drupal contributors to use the credit system, and raise awareness of its benefits to both individuals and organizations. Where possible, we should automatically capture credits. For example, translation efforts on https://localize.drupal.org are not currently captured in the credit system but could be captured automatically.
  • The credit system disincentivizes work on complex issues. We currently don't have a way to account for the complexity and quality of contributions; one person might have worked several weeks for just one credit, while another person might receive a credit for 10 minutes of work. We certainly see a few individuals and organizations trying to game the credit system. In the future, we should consider issuing credit data in conjunction with issue priority, patch size, number of reviews, etc.; a rough sketch of one possible weighting follows this list. This could help incentivize people to work on larger and more important problems and save smaller issues such as coding standards improvements for new contributor sprints. Implementing a scoring system that ranks the complexity of an issue would also allow us to develop more accurate reports of contributed work.
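To make that suggestion concrete, here is a purely hypothetical sketch of what a complexity-aware credit score could look like. The factors and constants below are invented for illustration only; they are not part of Drupal.org's credit system, nor a proposal from the Drupal Association.

# A purely hypothetical complexity score for a single issue credit.
# The priority weights, size factor and review factor are invented for
# illustration; Drupal.org does not compute anything like this today.
import math

PRIORITY_WEIGHT = {"minor": 0.5, "normal": 1.0, "major": 2.0, "critical": 3.0}

def credit_score(priority: str, patch_lines: int, reviews: int) -> float:
    """Weigh one credit by issue priority, patch size and amount of review."""
    size_factor = 1 + math.log10(max(patch_lines, 1))
    review_factor = 1 + 0.25 * reviews
    return PRIORITY_WEIGHT.get(priority, 1.0) * size_factor * review_factor

print(credit_score("critical", 800, 6))  # a large, heavily reviewed fix scores high
print(credit_score("minor", 2, 0))       # a two-line coding-standards cleanup scores low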

All of this means that the actual number of contributions and contributors could be significantly higher than what we report.

Like Drupal itself, the Drupal.org credit system needs to continue to evolve. Ultimately, the credit system will only be useful when the community uses it, understands its shortcomings, and suggests constructive improvements.

A first experiment with weighing credits

As a simple experiment, I decided to weigh each credit based on the adoption of the project the credit is attributed to. For example, each contribution credit to Drupal core is given a weight of 11 because Drupal core has about 1.1 million active installations. Credits to the Webform module, which has over 400,000 installations, get a weight of 4. And credits to Drupal's Commerce project get just 1 point as it is installed on fewer than 100,000 sites.

The idea is that these weights capture the end user impact of each contribution, but also act as a proxy for the effort required to get a change committed. Getting a change accepted in Drupal core is both more difficult and more impactful than getting a change accepted into the Commerce project.

This weighting is far from perfect as it undervalues non-product contributions, and it still doesn't recognize all types of product contributions (e.g. product strategy work, product management work, release management work, etc). That said, for code contributions, it may be more accurate than a purely unweighted approach.
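As a minimal sketch of how such adoption-based weighting could be computed, the snippet below assigns roughly one point per 100,000 active installations, with a minimum of one. The install counts and the rounding rule are assumptions that reproduce the examples above; they are not the exact formula behind the graphs that follow.

# A rough sketch of adoption-weighted credits. The rounding rule (one point
# per 100,000 active installations, minimum of 1) and the install counts are
# assumptions reproducing the examples in the text, not the published formula.

def project_weight(active_installs: int) -> int:
    """Weight a credit by the adoption of the project it belongs to."""
    return max(1, round(active_installs / 100_000))

installs = {"drupal core": 1_100_000, "webform": 400_000, "commerce": 60_000}
credits = {"drupal core": 3, "webform": 5, "commerce": 2}  # hypothetical credits for one contributor

weighted_total = sum(count * project_weight(installs[project])
                     for project, count in credits.items())
print(weighted_total)  # 3*11 + 5*4 + 2*1 = 55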

Top contributing individuals based on weighted credits.

The top 30 contributing individuals based on weighted Drupal.org issue credits.

Top contributing organizations based on weighted credits.

The top 30 contributing organizations based on weighted Drupal.org issue credits.

Conclusions

Our data confirms that Drupal is a vibrant community full of contributors who are constantly evolving and improving the software. It's amazing to see that just in the last year, Drupal welcomed more than 8,000 individual contributors and over 1,100 corporate contributors. It's especially nice to see the number of reported contributions, individual contributors and organizational contributors increase year over year.

To grow and sustain Drupal, we should support those that contribute to Drupal and find ways to get those that are not contributing involved in our community. Improving diversity within Drupal is critical, and we should welcome any suggestions that encourage participation from a broader range of individuals and organizations.

Sep 12 2019
Sep 12

The Hub - No Question It Can't Answer

The Bloomington Public Schools District in Bloomington, MN is a leader in harnessing technology to support student learning. We’ve been working with them since 2012, when we helped them move their K-12 site from a proprietary content management system that was expensive, inflexible and not mobile-friendly to a Drupal-powered site that provided more control over the site structure and content. We also added multi-site capabilities for the district’s 17 schools and departments, so they could manage their own content while retaining the broader district branding.

BPS has always been an early adopter of technology for school systems. However, back in 2012, there weren’t any student information systems like we have today; schools were still using paper planners to track homework. So they challenged us: build an online portal for teachers, students and their parents to access school data that was only accessible internally. The goal was to surface the data and make it easily consumable by everyone.

Nothing like this existed at the time, even in the Drupal world. The TEN7 development team wrote custom code (effectively creating original Drupal modules) to accomplish the task. The end result was a separate password-protected web application called “The Hub.” The Hub compiled data from third-party sources such as TIES, hundreds of Google calendars of students and teachers, data from school and district websites, as well as the district’s internal database.

The first release of The Hub was simple: a dashboard showing the students' class schedules and news feeds. However, the secure portal soon became more than a homework planning resource—The Hub has grown into a student, school and district information portal.

Bloomington Public Schools

New Features

  • Pathways to Graduation graph, which allows students (and their parents) to track their progress toward their post-secondary goals
    Bloomington Public Schools Pathways to Graduation
  • Data Warehouse, which holds lots of student info, such as student testing results, and even health records
    Bloomington Public Schools HUB Grades
    Bloomington Public Schools Hub Health Records
  • The Hub Digest emails, which give students and parents an overview of the upcoming week in their classes
  • Notes, which lets students, parents and teachers add notes to any event or activity in the Hub
  • Students can add their own calendar events

“We have weekly meetings with TEN7, and they’re very innovative in coming up with creative solutions for the crazy ideas and problems we bring them. We serve 14,000 parents, almost 11,000 students and 1500 staff members—we have a huge backlog of crazy ideas. There’s always more we can give our stakeholders. With off-the-shelf software, you get what you get. But because we can do whatever we want with The Hub, it’s become the solution for lots of gnarly problems in our system. There are always more gnarly problems to solve and The Hub is almost always the answer.”
—Katrina Mezera, Digital Learning & Data Manager

The Hub is considered an indispensable tool by parents, students and staff. On any given day, thousands of students (up to 50% of the student body) use The Hub. Other school districts have reviewed and admired the functionality of the portal. Bloomington Public Schools presented The Hub at the 2015 Google in Education Conference in Mankato, MN, and other school districts inquired about having the same system implemented for their district.

TEN7 continues to collaborate with Bloomington Public Schools to add features and refine the user experience.

Sep 12 2019
Sep 12

The profession of building websites has seen many changes in the last few years. SEO, website performance, multi-screen responsiveness, and accessibility are no longer luxuries. On top of that, tools have emerged that have improved the development experience and simplified scalability. Finding a modern CMS or framework that can incorporate ALL of these considerations is difficult, especially when you also need the flexibility to create unique websites. This is where Drupal 8 outshines other frameworks.

Here is a list of major benefits that you are missing out on if your website is built in Drupal 7 (or below), WordPress, or most other common content management systems.

1. Website Accessibility, Security, and Scalability  

Accessibility - One of the major changes in web development these days is the standard of building sites that are easily accessible by all visitors no matter their abilities or disabilities. For some clients, this is no longer a luxury but a regulation. For others, there is no law, but they may be opening themselves up to discrimination lawsuits by not making an effort to build a site that is accessible. A well-implemented Drupal 8 site can automate many aspects of accessibility and facilitate best practices in maintaining the accessibility of the site. Compliance will also help your website gain points with major search engines.

Security - One reason that major companies and government agencies move to Drupal is because of its fast security updates, and the large community of experienced developers who stand behind it. Drupal 8’s adoption of standard technologies, such as Composer, makes it a lot easier to keep your site up to date and secure. This also means less time fixing the code and more time improving and adding new features.
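For example, assuming a Composer-managed Drupal 8 project with Drush installed, applying a core security release can be as short as the commands below (the exact core package name depends on how the project was created, e.g. drupal/core or drupal/core-recommended):

composer update drupal/core --with-dependencies
drush updatedb
drush cache:rebuild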

Scalability - In the past, whenever there was a new major release of Drupal (for example, version 6 to 7), it wasn't just an update. It really meant a full install of the new version of Drupal core, then meticulously and painfully rebuilding and migrating custom code, configurations, users, and all of the content from the old site to the new site. In other words, hundreds of hours of rebuilding and months before the new site was ready. After Drupal 8, that will no longer be a problem. Drupal 8 was built to allow major updates without having to rebuild the site. Three to five years down the road, all you may need is a fresh facelift to have your website up to date with new trends. So taking the time to architect a well built Drupal 8 site will pay off in the long run.

2. Drupal 8 as API first

Drupal 8 was built API-first (API stands for Application Programming Interface), making it more powerful than ever. It can now skip generating the HTML markup and simply serve the data required by the front end of the website, allowing other technologies like JavaScript to serve dynamically generated pages specifically targeted to the user.

Best of all, it can be used to connect back and forth with other APIs to get any information you want to serve and share as a service. All of this is already built into Drupal 8 to serve any data without having to write a single line of code.

If you haven't wrapped your mind around what this means yet, let me give you a couple of examples: web apps and mobile apps. These can be developed to use the same data living in Drupal 8. Your Drupal site becomes the backbone content management system without having to update multiple sources of data to keep things in sync. Think about it… your website, your mobile apps, your own intranet, and content from other places all managed in one place, making one perfect ecosystem.
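As a minimal sketch of what this looks like from the consuming side, assuming the JSON:API module (part of Drupal core since 8.7) is enabled and the site lives at the placeholder address example.com, any HTTP client can list article nodes without custom code on the Drupal side:

# A minimal sketch of consuming Drupal 8 content over JSON:API.
# The site URL is a placeholder; /jsonapi/node/article is the standard
# JSON:API collection route for article nodes.
import requests

response = requests.get("https://example.com/jsonapi/node/article")
response.raise_for_status()

for article in response.json()["data"]:
    print(article["attributes"]["title"])

The same endpoints also accept filtering, sorting and pagination query parameters, which is what makes fully decoupled front ends and mobile apps practical.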

3. Development  

Drupal 8 was completely rewritten from scratch with new, much-improved code. Many things that were a constant hassle in the older versions are now a breeze to build.

Migration of old to new: If you have a website with large amounts of content, for example, news, products, or blogs, setting up an automated migration of all the data into Drupal 8 is possible from almost any source. Yes, that even means if your old website was built in WordPress.

One of my favorite development improvements for sure is Configuration Management. Now you are able to export and import configuration changes from a live site to your local development, or vice versa, without affecting content. That feature alone has cut hundreds of hours from my development workflow. This not only helps reduce the cost of overall development but it gives developers time to concentrate on getting the features built and not wasting time deploying already built features manually.
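For example, with Drush 9 or later installed, the round trip is two commands; configuration is written to and read from the site's sync directory:

drush config:export    # write the active configuration to the sync directory
drush config:import    # apply the exported configuration on another environment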

The implementation of Composer as the dependency manager to install and maintain modules, custom code, and patches makes it super easy to create copies of your website in a development environment, and to automatically turn on all the development helper features by keeping a few environment-aware files on hand. With a few simple commands to sync code and configurations, you are ready to develop and deploy in no time.

After a couple of years of developing in Drupal 8, I wonder how I was able to do without all the new development features for so long.

Drupal 8 is a dream-machine for large or small websites. All of these features make it quick to build a simple website and powerful enough to build the largest, most complex websites.  Drupal’s flexible and innovative system is only limited by your imagination.

Happy coding!

Sep 12 2019
Sep 12

The world is going mobile at a very fast pace. Smartphones, tablets, and laptops have become essentials for the current generation, and people want easy access to everything. Daily technology updates have made it possible to go far beyond simple communication on mobile phones. Yet browsing a website on a phone can still be confusing when the site has not been built to behave responsively.

With a large share of the population using mobile phones, businesses are focusing on providing the best mobile experiences to their users. Drupal has captured a large part of this market: it is not only good for building websites but is also an efficient tool for providing an impressive and powerful mobile experience to users.

Let’s take the plunge and enter the world of mobile solutions powered by Drupal. 

An illustration of an iPhone X showing different apps on its screen, representing Drupal mobile solutions.


Every organization has its own approach to the growing demands of application development. The approach varies with requirements and resource availability: some teams focus on building responsive web-based applications, while others specialize in delivering full-fledged native applications deployed to mobile devices.

Whether it is incorporating themes, multi-language distribution, web-based mobile apps, or cross-platform integration with third-party tools, Drupal gives developers many choices for defining mobile solutions and integrating them into an overall content and application infrastructure. The next few sections describe the types of mobile solutions offered by Drupal.

Mobile-First Approach for a Great Web Experience

Mobile-first, the philosophy coined by Luke Wroblewski, is an approach to displaying web pages on different devices regardless of their screen sizes. Two concepts are vital for a mobile-first design:

Responsive Web Design (RWD): A web design method that automatically renders a web page according to the user's display, creating a smooth experience for the user.

Progressive Enhancement and Graceful Degradation: In order to display a web page reasonably on different devices, customized versions of the product are designed for different ends. The terms progressive enhancement and graceful degradation describe opposite strategies, but they are often used in conjunction.

  • Progressive Enhancement: Basic functionality and features are implemented first for less capable browsers and devices (mobile phones). Later, more complex interactions and functionality are layered on top of that basic version for more advanced browsers and devices (tablets, PCs).

  • Graceful Degradation: Design starts from the most advanced version (desktop, laptop) with the complete set of functions and features. Later, when making the version compatible with mobile devices, some of the features or content are removed.

Of the two, progressive enhancement is the more commonly chosen approach in application design. Drupal supports building responsive websites and web apps, offering a consistent content experience to users irrespective of the device size on which the application is displayed.

Though mobile-first development requires some effort up front, it promises easier deployment and high scalability. Numerous big brands are choosing Drupal over other technologies and are moving ahead with this approach to mobile application development.

Below are two examples of organizations that have benefited from Drupal's mobile-first approach:

The Men’s Health Magazine Chronicle
 

An illustration of the mobile-friendly Men's Health Magazine home page.


The multinational and constantly evolving brand in the men's grooming industry, Men's Health Magazine, wanted to develop a thorough content-based platform coupled with a better digital experience for its mobile users. With Drupal, the website was optimized for all screen sizes, ranging from large desktops to small handheld devices.

Complete Case Study on Men’s Health Magazine can be read here.

The Yardstick Saga
 

An illustration of the Yardstick LMS content library, with content blocks and a menu on the left-hand side.


Yardstick, a global leader in offering modern digital solutions over traditional learning for students, chose Drupal to provide easy and simplified access to the application for different users on distinct device screens.

Complete Case Study on Yardstick LMS can be read here.

Drupal is not only a way to build efficient web applications; it can also be used as a backend for compelling mobile application development. In the following sections, we will explore the native, hybrid and progressive mobile solutions delivered by Drupal, starting with native mobile applications.

Scale Content Experience with Native Mobile Applications

Furnishing content through a native mobile application is all about managing the content and application-level services with Drupal. In addition, the application can use the capabilities of the device on which it has been installed. The front-end user experience, input events, and context are handled by the mobile application, whereas Drupal responds to the events or requests made, providing the content from a shared source. A services module connects the app and Drupal.

illustration image showing the different types of native mobile application development


The best part about native mobile applications is that they can be accessed without an internet connection. Considering the diversity of device platforms, native app development with Drupal is described below.

A platform-specific application that runs on iOS is usually written in Objective-C or Swift, with development done on a Mac that can run Apple's Xcode IDE. Java or Kotlin are used to write a native Android application. The app connects to a Drupal site running the Services module using one of the following HTTP libraries:

  • AFNetworking is a networking framework for iOS, macOS, watchOS, and tvOS.
  • ASIHTTPRequest is an Objective-C wrapper around CFNetwork for making HTTP requests on Mac OS X and iPhone.
  • Alamofire is an HTTP networking library written in Swift.

As mentioned before, Drupal is being widely accepted as a backend service for application environments and web services. This has enabled data consumption and manipulation in many distinct ways.

For example, Waterwheel Swift, formerly known as Drupal iOS SDK (software development kit), facilitates Drupal as a backend service for iOS, macOS, tvOS, or watchOS. It integrates most of the Drupal API features (session management, basic auth, entity crud, local caching, login view controller, etc.) in one SDK. 

Caching strategies, use of asynchronous methods for data downloading and refreshing after each download completion, pre-fetching data on application load, and network timeout issue management are the things that should be taken care of while developing a native RESTful iOS app.

Appcelerator Titanium, an open-source framework developed by Appcelerator, is another example; it facilitates native mobile application development for multiple mobile operating systems from a single JavaScript codebase.

Also worth mentioning is PhoneGap, a cross-platform mobile development framework from Adobe Systems (originally produced by Nitobi Software); its open-source core was donated to Apache and renamed Apache Cordova. It facilitates development using HTML, JavaScript, and CSS. DrupalGap, an application development kit for Drupal websites, is built on top of PhoneGap and jQuery Mobile.

Leveraging Native and Web’s best in Hybrid Mobile Applications

Hybrid mobile applications combine the best of native and web applications. Usually, HTML, CSS, and JavaScript are used in hybrid apps to render content inside a native web view (UIWebView on iOS or WebView on Android). These apps run on multiple platforms (Android, iOS) and are easily distributed through app stores. In addition, hybrid apps can access many device resources such as the camera, contacts, accelerometer, etc.

An illustration of the hybrid application development flow. Source: Cleveroad

Ionic is the most widely used HTML5 framework for hybrid application development. Based on AngularJS, Ionic helps developers build cross-platform applications, from open source to premium, and also enables easy deployment of those applications.

The Clasifika home page on a mobile screen. Source: Google Play

Clasifika, a multi-platform real estate hybrid application, was developed using Ionic with Drupal 8 as the backend. It successfully brings design thinking to a real-world problem while keeping performance and the UX design optimized.

Delivering App like Experience with Progressive Web Applications

The term progressive web app (PWA) was coined by Alex Russell and Frances Berriman. A PWA is developed using modern web APIs along with a traditional progressive enhancement strategy to create cross-platform web applications that offer app-like experiences on desktop and mobile. Pinterest is one such progressive web app; it helps users curate images, videos and more from a list of choices.

The Pinterest progressive web app on a mobile screen. Source: Medium/Manifest

While visiting a web page, you have probably come across an 'Add to Home Screen' button. When clicked, the app is installed in the background and can then be launched from the home screen like any other application. The best part of a progressive web app is that it behaves like a basic native application on the phone and can work offline too. A PWA also loads fast and creates an engaging experience for its users.

The Progressive Web App module makes it straightforward to turn a Drupal site into a progressive web app. With its extensible platform services and content-centric infrastructure, Drupal is an ideal choice for delivering reliable and engaging mobile experiences to users.

Final Note

The future is definitely bright for mobile devices. As the digital world evolves, a large number of people are choosing mobile devices over computers, with convenience being the deciding factor. Drupal is not only a strong choice for building web applications but is also a reliable platform for effective and compelling mobile application development.

The world is going Drupal and so are we. We at OpenSense Labs are engaged in offering better digital experiences to our clients with our expertise in Drupal Development.  

Drupal has unfolded many major benefits for the developers. How do you see it from your viewpoint? Share your views on our social media channels: Facebook, LinkedIn, and Twitter. You can also reach out at [email protected].

Sep 12 2019
Sep 12

The Drupal 8 SVG Image module changes the image field widget to allow SVG images to be uploaded on your Drupal 8 website. This module also allows you to set the width and height of the image, as well as choose whether the image should be displayed as an <img> tag or rendered inline as an <svg> element.

Download the Drupal 8 SVG Image module like you would any other Drupal module. You can download it with Composer using the following command:

composer require drupal/svg_image

Install the SVG Image module.

Navigate to Structure > Content Types > Article and select Manage Fields. You should have an image field on your Article content type. Click the Edit button.

In the Allowed File Extensions text box add “svg” to the list.

Allow svg format

Save the page, then go to the Manage display tab for the Article content type.

Click the gear icon next to your image field and you will notice you have additional options for SVG images. The first is that you can render the SVG as an <img> tag (which is enabled by default). The second is that you can set the width and height of the SVG image.

SVG Image Options

You can now create an Article and upload your SVG image!

Sep 11 2019
Sep 11

An in-depth analysis of how Drupal's development was sponsored between July 1, 2018 and June 30, 2019.

The past years, I've examined Drupal.org's contribution data to understand who develops Drupal, how diverse the Drupal community is, how much of Drupal's maintenance and innovation is sponsored, and where that sponsorship comes from.

You can look at the 2016 report, the 2017 report, and the 2018 report. Each report looks at data collected in the 12-month period between July 1st and June 30th.

This year's report shows that:

  • Both the recorded number of contributors and contributions have increased.
  • Most contributions are sponsored, but volunteer contributions remains very important to Drupal's success.
  • Drupal's maintenance and innovation depends mostly on smaller Drupal agencies and Acquia. Hosting companies, multi-platform digital marketing agencies, large system integrators and end users make fewer contributions to Drupal.
  • Drupal's contributors have become more diverse, but are still not diverse enough.

Methodology

What are Drupal.org issues?

"Issues" are pages on Drupal.org. Each issue tracks an idea, feature request, bug report, task, or more. See https://www.drupal.org/project/issues for the list of all issues.

For this report, we looked at all Drupal.org issues marked "closed" or "fixed" in the 12-month period from July 1, 2018 to June 30, 2019. The issues analyzed in this report span Drupal core and thousands of contributed projects, across all major versions of Drupal.

What are Drupal.org credits?

In the spring of 2015, after proposing initial ideas for giving credit, Drupal.org added the ability for people to attribute their work in the Drupal.org issues to an organization or customer, or mark it the result of volunteer efforts.

Example issue credit on drupal orgA screenshot of an issue comment on Drupal.org. You can see that jamadar worked on this patch as a volunteer, but also as part of his day job working for TATA Consultancy Services on behalf of their customer, Pfizer.

Drupal.org's credit system is truly unique and groundbreaking in Open Source and provides unprecedented insights into the inner workings of a large Open Source project. There are a few limitations with this approach, which we'll address at the end of this report.

What is the Drupal community working on?

In the 12-month period between July 1, 2018 and June 30, 2019, 27,522 issues were marked "closed" or "fixed", a 13% increase from the 24,447 issues in the 2017-2018 period.

In total, the Drupal community worked on 3,474 different Drupal.org projects this year compared to 3,229 projects in the 2017-2018 period — an 8% year over year increase.

The majority of the credits are the result of work on contributed modules:

A pie chart showing contributions by project type: most contributions are to contributed modules.

Compared to the previous period, contribution credits increased across all project types:

A graph showing the year over year growth of contributions per project type.

The most notable change is the large jump in "non-product credits": more and more members in the community started tracking credits for non-product activities such as organizing Drupal events (e.g. DrupalCamp Delhi project, Drupal Developer Days, Drupal Europe and DrupalCon Europe), promoting Drupal (e.g. Drupal pitch deck or community working groups (e.g. Drupal Diversity and Inclusion Working Group, Governance Working Group).

While some of these increases reflect new contributions, others are existing contributions that are newly reported. All contributions are valuable, whether they're code contributions, or non-product and community-oriented contributions such as organizing events, giving talks, leading sprints, etc. The fact that the credit system is becoming more accurate in recognizing more types of Open Source contribution is both important and positive.

Who is working on Drupal?

For this report's time period, Drupal.org's credit system received contributions from 8,513 different individuals and 1,137 different organizations — a meaningful increase from last year's report.

A graph showing that the number of individual and organizational contributors increased year over year.

Consistent with previous years, approximately 51% of the individual contributors received just one credit. Meanwhile, the top 30 contributors (the top 0.4%) account for 19% of the total credits. In other words, a relatively small number of individuals do the majority of the work. These individuals put an incredible amount of time and effort into developing Drupal and its contributed projects:

Out of the top 30 contributors featured this year, 28 were active contributors in the 2017-2018 period as well. These Drupalists' dedication and continued contribution to the project has been crucial to Drupal's development.

It's also important to recognize that most of the top 30 contributors are sponsored by an organization. Their sponsorship details are provided later in this article. We value the organizations that sponsor these remarkable individuals, because without their support, it could be more challenging for these individuals to be in the top 30.

It's also nice to see two new contributors make the top 30 this year — Alona O'neill with sponsorship from Hook 42 and Thalles Ferreira with sponsorship from CI&T. Most of their credits were the result of smaller patches (e.g. removing deprecated code, fixing coding style issues, etc) or in some cases non-product credits rather than new feature development or fixing complex bugs. These types of contributions are valuable and often a stepping stone towards towards more in-depth contribution.

How much of the work is sponsored?

Issue credits can be marked as "volunteer" and "sponsored" simultaneously (shown in jamadar's screenshot near the top of this post). This could be the case when a contributor does the necessary work to satisfy the customer's need, in addition to using their spare time to add extra functionality.

For those credits with attribution details, 18% were "purely volunteer" credits (8,433 credits), in stark contrast to the 65% that were "purely sponsored" (29,802 credits). While there are almost four times as many "purely sponsored" credits as "purely volunteer" credits, volunteer contribution remains very important to Drupal.

Contributions by volunteer vs sponsored

Both "purely volunteer" and "purely sponsored" credits grew — "purely sponsored" credits grew faster in absolute numbers, but for the first time in four years "purely volunteer" credits grew faster in relative numbers.

The large jump in volunteer credits can be explained by the community capturing more non-product contributions. As can be seen on the graph below, these non-product contributions are more volunteer-centric.

A graph showing how much of the contributions are volunteered vs sponsored.

Who is sponsoring the work?

Now that we've established that the majority of contributions to Drupal are sponsored, let's study which organizations contribute to Drupal. While 1,137 different organizations contributed to Drupal, approximately 50% of them received four credits or less. The top 30 organizations (roughly the top 3%) account for approximately 25% of the total credits, which implies that the top 30 companies play a crucial role in the health of the Drupal project.

Top conytinuying organizationsTop contributing organizations based on the number of issue credits.

While not immediately obvious from the graph above, a variety of different types of companies are active in Drupal's ecosystem:

Category Description Traditional Drupal businesses Small-to-medium-sized professional services companies that primarily make money using Drupal. They typically employ fewer than 100 employees, and because they specialize in Drupal, many of these professional services companies contribute frequently and are a huge part of our community. Examples are Hook42, Centarro, The Big Blue House, Vardot, etc. Digital marketing agencies Larger full-service agencies that have marketing-led practices using a variety of tools, typically including Drupal, Adobe Experience Manager, Sitecore, WordPress, etc. They tend to be larger, with many of the larger agencies employing thousands of people. Examples are Wunderman, Possible and Mirum. System integrators Larger companies that specialize in bringing together different technologies into one solution. Example system agencies are Accenture, TATA Consultancy Services, Capgemini and CI&T. Hosting companies Examples are Acquia, Rackspace, Pantheon and Platform.sh. End users Examples are Pfizer or bio.logis Genetic Information Management GmbH.

A few observations:

  • Almost all of the sponsors in the top 30 are traditional Drupal businesses with fewer than 50 employees. Only five companies in the top 30 — Pfizer, Google, CI&T, bio.logis and Acquia — are not traditional Drupal businesses. The traditional Drupal businesses are responsible for almost 80% of all the credits in the top 30. This percentage goes up if you extend beyond the top 30. It's fair to say that Drupal's maintenance and innovation largely depends on these traditional Drupal businesses.
  • The larger, multi-platform digital marketing agencies are barely contributing to Drupal. While more and more large digital agencies are building out Drupal practices, no digital marketing agencies show up in the top 30, and hardly any appear in the entire list of contributing organizations. While they are not required to contribute, I'm frustrated that we have not yet found the right way to communicate the value of contribution to these companies. We need to incentivize each of these firms to contribute back with the same commitment that we see from traditional Drupal businesses
  • The only system integrator in the top 30 is CI&T, which ranked 4th with 795 credits. As far as system integrators are concerned, CI&T is a smaller player with approximately 2,500 employees. However, we do see various system integrators outside of the top 30, including Globant, Capgemini, Sapient and TATA Consultancy Services. In the past year, Capgemini almost quadrupled their credits from 46 to 196, TATA doubled its credits from 85 to 194, Sapient doubled its credits from 28 to 65, and Globant kept more or less steady with 41 credits. Accenture and Wipro do not appear to contribute despite doing a fair amount of Drupal work in the field.
  • Hosting companies also play an important role in our community, yet only Acquia appears in the top 30. Rackspace has 68 credits, Pantheon has 43, and Platform.sh has 23. I looked for other hosting companies in the data, but couldn't find any. In general, there is a persistent problem with hosting companies that make a lot of money with Drupal not contributing back. The contribution gap between Acquia and other hosting companies has increased, not decreased.
  • We also saw three end users in the top 30 as corporate sponsors: Pfizer (453 credits), Thunder (659 credits, up from 432 credits the year before), and the German company, bio.logis (330 credits). A notable end user is Johnson & Johnson, who was just outside of the top 30, with 221 credits, up from 29 credits the year before. Other end users outside of the top 30, include the European Commission (189 credits), Workday (112 credits), Paypal (80 credits), NBCUniversal (48 credits), Wolters Kluwer (20 credits), and Burda Media (24 credits). We also saw contributions from many universities, including the University of British Columbia (148 credits), University of Waterloo (129 credits), Princeton University (73 credits), University of Austin Texas at Austin (57 credits), Charles Darwin University (24 credits), University of Edinburgh (23 credits), University of Minnesota (19 credits) and many more.
A graph showing that Acquia is by far the number one contributing hosting company.Contributions by system integrators

It would be interesting to see what would happen if more end users mandated contributions from their partners. Pfizer, for example, only works with agencies that contribute back to Drupal, and uses Drupal's credit system to verify their vendors' claims. The State of Georgia started doing the same; they also made Open Source contribution a vendor selection criteria. If more end users took this stance, it could have a big impact on the number of digital agencies, hosting companies and system integrators that contribute to Drupal.

While we should encourage more organizations to sponsor Drupal contributions, we should also understand and respect that some organizations can give more than others and that some might not be able to give back at all. Our goal is not to foster an environment that demands what and how others should give back. Instead, we need to help foster an environment worthy of contribution. This is clearly laid out in Drupal's Values and Principles.

How diverse is Drupal?

Supporting diversity and inclusion within Drupal is essential to the health and success of the project. The people who work on Drupal should reflect the diversity of people who use and work with the web.

I looked at both the gender and geographic diversity of Drupal.org contributors. While these are only two examples of diversity, these are the only diversity characteristics we currently have sufficient data for. Drupal.org recently rolled out support for Big 8/Big 10, so next year we should have more demographics information

Gender diversity

The data shows that only 8% of the recorded contributions were made by contributors who do not identify as male, which continues to indicate a wide gender gap. This is a one percent increase compared to last year. The gender imbalance in Drupal is profound and underscores the need to continue fostering diversity and inclusion in our community.

A graph showing contributions by gender: 75% of the contributions come from people who identify as male.

Last year I wrote a post called about the privilege of free time in Open Source. It made the case that Open Source is not a meritocracy, because not everyone has equal amounts of free time to contribute. For example, research shows that women still spend more than double the time as men doing unpaid domestic work, such as housework or childcare. This makes it more difficult for women to contribute to Open Source on an unpaid, volunteer basis. It's one of the reasons why Open Source projects suffer from a lack of diversity, among others including hostile environments and unconscious biases. Drupal.org's credit data unfortunately still shows a big gender disparity in contributions:

A graph that shows that compared to males, female contributors do more sponsored work, and less volunteer work.

Ideally, over time, we can collect more data on non-binary gender designations, as well as segment some of the trends behind contributions by gender. We can also do better at collecting data on other systemic issues beyond gender alone. Knowing more about these trends can help us close existing gaps. In the meantime, organizations capable of giving back should consider financially sponsoring individuals from underrepresented groups to contribute to Open Source. Each of us needs to decide if and how we can help give time and opportunities to underrepresented groups and how we can create equity for everyone in Drupal.

Geographic diversity

When measuring geographic diversity, we saw individual contributors from six continents and 114 countries:

A graph that shows most contributions come from Europe and North America.Contributions per capitaContribution credits per capita calculated as the amount of contributions per continent divided by the population of each continent. 0.001% means that one in 100,000 people contribute to Drupal. In North America, 5 in 100,000 people contributed to Drupal the last year.

Contributions from Europe and North America are both on the rise. In absolute terms, Europe contributes more than North America, but North America contributes more per capita.

Asia, South America and Africa remain big opportunities for Drupal, as their combined population accounts for 6.3 billion out of 7.5 billion people in the world. Unfortunately, the reported contributions from Asia are declining year over year. For example, compared to last year's report, there was a 17% drop in contribution from India. Despite that drop, India remains the second largest contributor behind the United States:

A graph showing the top 20 contributing countries.The top 20 countries from which contributions originate. The data is compiled by aggregating the countries of all individual contributors behind each issue. Note that the geographical location of contributors doesn't always correspond with the origin of their sponsorship. Wim Leers, for example, works from Belgium, but his funding comes from Acquia, which has the majority of its customers in North America.

Top contributor details

To create more awareness of which organizations are sponsoring the top individual contributors, I included a more detailed overview of the top 50 contributors and their sponsors. If you are a Drupal developer looking for work, these are some of the companies I'd apply to first. If you are an end user looking for a company to work with, these are some of the companies I'd consider working with first. Not only do they know Drupal well, they also help improve your investment in Drupal.

Each entry below lists the contributor's rank, username and issue count, then the volunteer / sponsored / not specified percentages, followed by their sponsors where reported.

1. kiamlaluno: 1610 issues (99% / 0% / 1%)
2. jrockowitz: 756 issues (98% / 99% / 0%), sponsors: The Big Blue House (750), Memorial Sloan Kettering Cancer Center (5), Rosewood Marketing (1)
3. alexpott: 642 issues (6% / 80% / 19%), sponsors: Thunder (336), Acro Media Inc (100), Chapter Three (77)
4. RajabNatshah: 616 issues (1% / 100% / 0%), sponsors: Vardot (730), Webship (2)
5. volkswagenchick: 519 issues (2% / 99% / 0%), sponsors: Hook 42 (341), Kanopi Studios (171)
6. bojanz: 504 issues (0% / 98% / 2%), sponsors: Centarro (492), Ny Media AS (28), Torchbox (5), Liip (2), Adapt (2)
7. alonaoneill: 489 issues (9% / 99% / 0%), sponsors: Hook 42 (484)
8. thalles: 488 issues (0% / 100% / 0%), sponsors: CI&T (488), Janrain (3), Johnson & Johnson (2)
9. Wim Leers: 437 issues (8% / 97% / 0%), sponsors: Acquia (421), Government of Flanders (3)
10. DamienMcKenna: 431 issues (0% / 97% / 3%), sponsors: Mediacurrent (420)
11. Berdir: 424 issues (0% / 92% / 8%), sponsors: MD Systems (390)
12. chipway: 356 issues (0% / 100% / 0%), sponsors: Chipway (356)
13. larowlan: 324 issues (16% / 94% / 2%), sponsors: PreviousNext (304), Charles Darwin University (22), University of Technology, Sydney (3), Service NSW (2), Department of Justice & Regulation, Victoria (1)
14. pifagor: 320 issues (52% / 100% / 0%), sponsors: GOLEMS GABB (618), EPAM Systems (16), Drupal Ukraine Community (6)
15. catch: 313 issues (1% / 95% / 4%), sponsors: Third & Grove (286), Tag1 Consulting (11), Drupal Association (6), Acquia (4)
16. mglaman: 277 issues (2% / 98% / 1%), sponsors: Centarro (271), Oomph, Inc. (16), E.C. Barton & Co (3), Gaggle.net, Inc. (1), Bluespark (1), Thinkbean (1), LivePerson, Inc (1), Impactiv, Inc. (1), Rosewood Marketing (1), Acro Media Inc (1)
17. adci_contributor: 274 issues (0% / 100% / 0%), sponsors: ADCI Solutions (273)
18. quietone: 266 issues (41% / 75% / 1%), sponsors: Acro Media Inc (200)
19. tim.plunkett: 265 issues (3% / 89% / 9%), sponsors: Acquia (235)
20. gaurav.kapoor: 253 issues (0% / 51% / 49%), sponsors: OpenSense Labs (129), DrupalFit (111)
21. RenatoG: 246 issues (0% / 100% / 0%), sponsors: CI&T (246), Johnson & Johnson (85)
22. heddn: 243 issues (2% / 98% / 2%), sponsors: MTech, LLC (202), Tag1 Consulting (32), European Commission (22), North Studio (3), Acro Media Inc (2)
23. chr.fritsch: 241 issues (0% / 99% / 1%), sponsors: Thunder (239)
24. xjm: 238 issues (0% / 85% / 15%), sponsors: Acquia (202)
25. phenaproxima: 238 issues (0% / 100% / 0%), sponsors: Acquia (238)
26. mkalkbrenner: 235 issues (0% / 100% / 0%), sponsors: bio.logis Genetic Information Management GmbH (234), OSCE: Organization for Security and Co-operation in Europe (41), Welsh Government (4)
27. gvso: 232 issues (0% / 100% / 0%), sponsors: Google Summer of Code (214), Google Code-In (16), Zivtech (1)
28. dawehner: 219 issues (39% / 84% / 8%), sponsors: Chapter Three (176), Drupal Association (5), Tag1 Consulting (3), TES Global (1)
29. e0ipso: 218 issues (99% / 100% / 0%), sponsors: Lullabot (217), IBM (23)
30. drumm: 205 issues (0% / 98% / 1%), sponsors: Drupal Association (201)
31. gabesullice: 199 issues (0% / 100% / 0%), sponsors: Acquia (198), Aten Design Group (1)
32. amateescu: 194 issues (0% / 97% / 3%), sponsors: Pfizer, Inc. (186), Drupal Association (1), Chapter Three (1)
33. klausi: 193 issues (2% / 59% / 40%), sponsors: jobiqo - job board technology (113)
34. samuel.mortenson: 187 issues (42% / 42% / 17%), sponsors: Acquia (79)
35. joelpittet: 187 issues (28% / 78% / 14%), sponsors: The University of British Columbia (146)
36. borisson_: 185 issues (83% / 50% / 3%), sponsors: Calibrate (79), Dazzle (13), Intracto digital agency (1)
37. Gábor Hojtsy: 184 issues (0% / 97% / 3%), sponsors: Acquia (178)
38. adriancid: 182 issues (91% / 22% / 2%), sponsors: Drupiter (40)
39. eiriksm: 182 issues (0% / 100% / 0%), sponsors: Violinist (178), Ny Media AS (4)
40. yas: 179 issues (12% / 80% / 10%), sponsors: DOCOMO Innovations, Inc. (143)
41. TR: 177 issues (0% / 0% / 100%)
42. hass: 173 issues (1% / 0% / 99%)
43. Joachim Namyslo: 172 issues (69% / 0% / 31%)
44. alex_optim: 171 issues (0% / 99% / 1%), sponsors: GOLEMS GABB (338)
45. flocondetoile: 168 issues (0% / 99% / 1%), sponsors: Flocon de toile (167)
46. Lendude: 168 issues (52% / 99% / 0%), sponsors: Dx Experts (91), ezCompany (67), Noctilaris (9)
47. paulvandenburg: 167 issues (11% / 72% / 21%), sponsors: ezCompany (120)
48. voleger: 165 issues (98% / 98% / 2%), sponsors: GOLEMS GABB (286), Lemberg Solutions Limited (36), Drupal Ukraine Community (1)
49. lauriii: 164 issues (3% / 98% / 1%), sponsors: Acquia (153), Druid (8), Lääkärikeskus Aava Oy (2)
50. idebr: 162 issues (0% / 99% / 1%), sponsors: ezCompany (156), One Shoe (5)

Limitations of the credit system

It is important to note a few of the current limitations of Drupal.org's credit system:

  • The credit system doesn't capture all code contributions. Parts of Drupal are developed on GitHub rather than Drupal.org, and often aren't fully credited on Drupal.org. For example, Drush is maintained on GitHub instead of Drupal.org, and companies like Pantheon don't get credit for that work. The Drupal Association is working to integrate GitLab with Drupal.org. GitLab will provide support for "merge requests", which means contributing to Drupal will feel more familiar to the broader audience of Open Source contributors who learned their skills in the post-patch era. Some of GitLab's tools, such as in-line editing and web-based code review will also lower the barrier to contribution, and should help us grow both the number of contributions and contributors on Drupal.org.
  • The credit system is not used by everyone. There are many ways to contribute to Drupal that are still not captured in the credit system, including things like event organizing or providing support. Technically, that work could be captured as demonstrated by the various non-product initiatives highlighted in this post. Because using the credit system is optional, many contributors don't. As a result, contributions often have incomplete or no contribution credits. We need to encourage all Drupal contributors to use the credit system, and raise awareness of its benefits to both individuals and organizations. Where possible, we should automatically capture credits. For example, translation efforts on https://localize.drupal.org are not currently captured in the credit system but could be automatically.
  • The credit system disincentivizes work on complex issues. We currently don't have a way to account for the complexity and quality of contributions; one person might have worked several weeks for just one credit, while another person might receive a credit for 10 minutes of work. We certainly see a few individuals and organizations trying to game the credit system. In the future, we should consider issuing credit data in conjunction with issue priority, patch size, number of reviews, etc. This could help incentivize people to work on larger and more important problems and save smaller issues such as coding standards improvements for new contributor sprints. Implementing a scoring system that ranks the complexity of an issue would also allow us to develop more accurate reports of contributed work.

All of this means that the actual number of contributions and contributors could be significantly higher than what we report.

Like Drupal itself, the Drupal.org credit system needs to continue to evolve. Ultimately, the credit system will only be useful when the community uses it, understands its shortcomings, and suggests constructive improvements.

A first experiment with weighing credits

As a simple experiment, I decided to weigh each credit based on the adoption of the project the credit is attributed to. For example, each contribution credit to Drupal core is given a weight of 11 because Drupal core has about 1.1 million active installations. Credits to the Webform module, which has over 400,000 installations, get a weight of 4. And credits to Drupal's Commerce project get just 1 point as it is installed on fewer than 100,000 sites.

The idea is that these weights capture the end user impact of each contribution, but also act as a proxy for the effort required to get a change committed. Getting a change accepted in Drupal core is both more difficult and more impactful than getting a change accepted to the Commerce project.
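
The post doesn't spell out an exact formula, but the weights above are consistent with a simple rule of roughly one point per 100,000 active installations, with a minimum of one point. A minimal PHP sketch of that hypothetical rule, for illustration only:

/**
 * Hypothetical weighting rule: one point per ~100,000 active installations,
 * with a minimum of one point per credit.
 */
function weight_for_credit(int $active_installs): int {
  return max(1, (int) round($active_installs / 100000));
}

// Roughly reproduces the weights mentioned above:
// weight_for_credit(1100000) returns 11 (Drupal core),
// weight_for_credit(400000) returns 4 (Webform),
// weight_for_credit(90000) returns 1 (Commerce, fewer than 100,000 installs).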

This weighting is far from perfect as it undervalues non-product contributions, and it still doesn't recognize all types of product contributions (e.g. product strategy work, product management work, release management work, etc). That said, for code contributions, it may be more accurate than a purely unweighted approach.

Top contributing individuals based on weighted credits. The top 30 contributing individuals based on weighted Drupal.org issue credits.
Top contributing organizations based on weighted credits. The top 30 contributing organizations based on weighted Drupal.org issue credits.

Conclusions

Our data confirms that Drupal is a vibrant community full of contributors who are constantly evolving and improving the software. It's amazing to see that just in the last year, Drupal welcomed more than 8,000 individual contributors and over 1,100 corporate contributors. It's especially nice to see the number of reported contributions, individual contributors and organizational contributors increase year over year.

To grow and sustain Drupal, we should support those that contribute to Drupal and find ways to get those that are not contributing involved in our community. Improving diversity within Drupal is critical, and we should welcome any suggestions that encourage participation from a broader range of individuals and organizations.

Sep 11 2019
Sep 11

Portland, OR - The Drupal Association, an international nonprofit organization, welcomes its newly appointed board members to help advance its mission to unite a global open source community to build, secure, and promote Drupal. The Association’s Board of Directors ratified the appointment of five new board members in September, including: Grace Francisco, Lo Li, Owen Lansbury, Ryan Szrama and Leslie Glynn, who was elected for the community-at-large seat.

“We are excited to have these amazing individuals join us in our efforts to broaden our reach into diverse communities and to grow Drupal adoption. They bring a wide range of experiences and expertise to the Association that will enhance our opportunities to reach new audiences, support the Drupal community and elevate the Drupal project around the world,” said Adam Goodman, Drupal Association Board Chair. “We welcome Grace’s significant work in developer relations, developer marketing and program management; Leslie’s experience as a developer and project manager long emphasized by her years of contributions as a Drupal community member; Owen’s creative problem-solving, local Drupal association and DrupalCamp experience and business leadership skills; Lo’s extensive work in content management, brand promotion and tech platforms alongside her advocacy for women in technology; and Ryan’s product and service development and business skills coupled with his strong relationships in the Drupal community. We look forward to working with all of our new board members to achieve the Association’s strategic goals.”

Grace Francisco joined MongoDB in July as Vice President, Worldwide Developer Relations. Prior to that, she served as Vice President of Developer Relations and Education at gaming platform Roblox where she doubled the size of active developers to 2+ million. A seasoned developer relations leader with over 20 years of experience in software, she has co-authored three patents and led worldwide developer initiatives at Microsoft, Intuit, Yodlee and Atlassian. Francisco graduated cum laude and holds a BBA in Business Management from Golden Gate University.

“I am super excited to join the Drupal Association board,” said Francisco. “I first encountered the Drupal project back in 2010 while I was at Microsoft doing outreach to open source projects - building bridges to open source communities. It’s wonderful now, almost a decade later, to help from the other side to build bridges from Drupal to other tech organizations to broaden Drupal’s adoption.”

Leslie Glynn has more than thirty years of experience in the tech field as a software developer and project manager. She has been a freelance Drupal Project Manager and Site Builder since 2012. Glynn is very active in the Drupal community as an event organizer (Design 4 Drupal, Boston and NEDCamp), sprint organizer, mentor, trainer and volunteer. She is the winner of the 2019 Aaron Winborn Award. This annual award recognizes an individual who demonstrates personal integrity, kindness, and above-and-beyond commitment to the Drupal community.

“Being a volunteer at numerous Drupal camps and DrupalCons has given me the opportunity to meet and learn from many diverse members of the Drupal community,” said Glynn. “I hope to bring that knowledge and experience to my work on Drupal Association initiatives. One of the things I would like to help with is growing Drupal adoption through new initiatives that reach out to underrepresented and diverse groups through an increased presence at secondary schools and universities and to groups, such as Girls Who Code, in the tech space.”

Owen Lansbury is co-founder of PreviousNext, an independent Australian digital design and development company that has been one of the world's most prolific code contributors to the Drupal project. With 25 years’ professional experience and a background in Fine Art, Digital Media and User Experience Design, Lansbury blends creative problem solving with the business skills required to sustain his own company and work successfully with complex customers. He is also an active leader within the Australian and New Zealand Drupal community, bringing DrupalCon to Sydney in 2013, acting as Track Chair at several regional events and chairing the DrupalSouth Steering Committee.

Lansbury said, “As a long-term Drupal community contributor in Australia and New Zealand, I'm excited about the opportunity to bring my grassroots experience to the Association board at a global level. I've always been a bit jealous of our developers contributing code to Drupal, so being able to contribute my own business and community leadership experience to the Association board is a great opportunity for me to give something back at a global level."

Lo Li is the Senior Vice President, CIO of Global Consumer Solutions at Equifax. She has spent the past two decades leading global multi-billion dollar corporations for some of the world’s most renowned hospitality and retail brands, working with hundreds of teams dispersed in the UK, China, Singapore and India. Some of her work includes the creation of dynamic pricing and predictive analytics engines for global hotels, and scaling big data and Agile to enable business transformation at large retailers, including double digit growth plans for digital and international presence. She brings a deep understanding of how to translate corporate visions and strategies into simple, elegant solutions - using her international business acumen and technology background as both a business enabler and a competitive differentiator. Li, who is multilingual - fluent in Mandarin, Portuguese, Spanish and English - is the recipient of several industry accolades and serves on the Board of Directors for several national nonprofit organizations. She received her Bachelor’s and Master’s degree from the University of Georgia.

Li said, “I am thrilled to join the Drupal Association board because of the incredible open source community that has been fostered. The nurturing and growth of communities, like the one we have at Drupal, are the very catalyst to help organizations leap forward and provide an incubator for new ideas and thought leadership amongst digital citizens. It's truly an honor to be able to help shape the future of such a great organization!”

Ryan Szrama co-founded Commerce Guys in 2009 to offer Drupal-based eCommerce consulting and development services. He was the project lead of Drupal’s most popular eCommerce framework, Drupal Commerce, from its creation to its eventual use on over 60,000 websites. In 2016, Ryan acquired control of Commerce Guys from his partners, leading the company to rebrand to Centarro and launch new product and support offerings that enable teams to build with confidence on Drupal Commerce.

"My personal goals align perfectly with the mission of the Drupal Association: uniting a global open source community to build Drupal,” said Szrama. “I've been privileged to build a career in this community as a long-time contributor turned business owner, and I'm continually inspired by the members of this board to think bigger and give back more. I hope to apply my knowledge and experience to board initiatives that empower more people to better themselves and their organizations by using and contributing to Drupal."

The newly-elected members will join the following Association board members, continuing their service in the upcoming term: 

  • Baddý Sonja Breidert, 1xINTERNET

  • Dries Buytaert, Acquia

  • Luma Dahlbacka, Charles Schwab & Co

  • Suzanne Dergacheva, Evolving Web

  • Adam Goodman, Northwestern University’s Center for Leadership

  • Mike Lamb, Pfizer

  • Audra Martin-Merrick, Red Backpack Limited

  • George Matthes, Johnson & Johnson

  • Vishal Mehrotra, Tata Consultancy Services

  • Ingo Rübet, BOTLabs GmbH

  • Michel van Velde, One Shoe

About Drupal

Drupal is one of the leading content management software platforms that has been used to create millions of websites around the world. There are 46,000-plus developers and 1.3 million users on Drupal.org, and Drupal has one of the largest open source communities in the world. Drupal has great standard features, easy content authoring, reliable performance and excellent security. What sets it apart is its flexibility; modularity is one of its core principles. Its tools help you build the versatile, structured content that ambitious web experiences need.

About Drupal Association

The Drupal Association is an international non-profit organization that engages a broad audience about Drupal, the leading CMS open source project. The Association promotes Drupal adoption through the work and initiatives of a worldwide community of dedicated contributors, and support from individual and organizational members. The Drupal Association helps the Drupal community with funding, infrastructure, education, promotion, distribution and online collaboration. For more information, visit Drupal.org.

###

Sep 11 2019
Sep 11

The news about the forthcoming Drupal 9 release in June 2020 has become the hottest topic in the Drupal world. Website owners and developers are starting to plan for Drupal 9. One of the key points in this plan is to prepare for Drupal 9 with the Upgrade Status module to make sure the site uses no deprecated code.

We are glad to announce that there is one more big advancement in the Drupal 9 readiness field. The newly released 8.7.7 version introduced core version requirement support, which allows developers to declare a project's compatibility with both Drupal 8 and Drupal 9. Here are the details.

Drupal 9 readiness: how are things going for modules?

The community is gradually building D9 in D8 while deprecating older APIs and functions and supporting new releases of third-party dependencies. Examples include the drupal_set_message() function deprecation and Drupal 8.8.x-dev compatibility with the new Symfony 4.3.3.
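
As a concrete illustration of this kind of deprecation (not from the original post): drupal_set_message() was deprecated in Drupal 8.5.0 in favor of the messenger service and is removed in Drupal 9, so Drupal 9 ready code swaps the old call for the new one:

// Deprecated since Drupal 8.5.0, removed in Drupal 9.
drupal_set_message(t('Settings saved.'));

// Drupal 9 ready replacement using the messenger service.
\Drupal::messenger()->addStatus(t('Settings saved.'));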

Keeping up to date with all of that means complete and instant Drupal 9 readiness for modules and themes. And it's great to see that more and more contributed projects are D9 compatible!

According to Gábor Hojtsy, a well-known Drupal contributor and product manager, their number increased from 48% in March to 54% in July 2019.

Core version requirement key introduced in Drupal 8.7.7!

As of Drupal 8.7.7, Drupal 9 readiness reaches a new level. So if anyone asks what’s new in Drupal 8.7.7, it’s the core version requirement support that comes to mind first. Gábor Hojtsy devoted a special blog post to describing this innovation and called it “huge.” 

The key point is that all module and theme developers will need to add one more line to their info.yml file — the core_version_requirement key. 

Using it, developers can specify multiple core versions their projects are compatible with. For example, they can mark a project as compatible with both Drupal 8 and Drupal 9. Here is how the lines might look for such a project:

name: Module Name
type: module
core: 8.x
core_version_requirement: ^8 || ^9

The new key allows developers to be even more specific about version compatibility by specifying minor versions (8.7.7, 8.8.3, etc.), which was not possible with the traditional “core: 8.x” key. 

If a project only works with version 8.7.7 or later, the developer will need nothing but the new key to mark its D9 readiness:

name: Module Name
type: module
core_version_requirement: ^8.7.7 || ^9

However, versions older than D8.7.7 do not support this improvement. So if a project is also meant to work with them, the traditional “core: 8.x” key should be listed along with the new one in the info.yml file.

Let your website’s Drupal 9 readiness be ultimate!

With so many opportunities for full Drupal 9 readiness, it’s time to start. Our development and support team will assist you in:

  • updating your site to version 8.7.7 and always keeping it up-to-date with fresh releases
  • checking your website for deprecated code and giving it a cleanup
  • marking the Drupal 9 readiness of your projects according to the newly introduced requirements

Contact our developers to keep your website prepared — and let it fly swiftly and effortlessly to the ninth version in June 2020!

Sep 11 2019
Sep 11

Adding media content to a website instantly turns it into an engaging and attractive one. Media like images, videos, audio, slideshows, etc. work wonders in creating richer, more compelling digital experiences. The Media module for Drupal 8 lets you manage your media elements easily and systematically. The Media module was added to Drupal core in version 8.4, and media features such as the Media Library followed in later releases.

What is Media module?

The Drupal 8 Media module is often called a ‘File browser to the internet’. 
The Media module for Drupal 8 lets you manage media files and assets regardless of where they are hosted. With this module you can add/embed different kinds of media to your website content, save them in the Media Library, embed videos with a URL, and more.

Drupal 8 Media Entities

The concept of Media entities is similar to that of Nodes. Media types are to media entities as Content types are to nodes. However, with Media entities, every media type is different. For example, an Image media type will come with a different set of features that can be modified (like dimensions) when compared to a media type for videos embedded via a URL. When you enable the Media module, these Media types are created automatically:

  • File
  • Image
  • Audio File
  • Video File
  • Remote Video

You can also add fields to media types and create your own media types based on your selected media source.
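
Because media items are content entities just like nodes, they can also be created in code the same way. Here is a minimal sketch (the bundle, field name and file ID below are examples and depend on your site's configuration):

use Drupal\media\Entity\Media;

// Create an Image media item programmatically.
$media = Media::create([
  'bundle' => 'image',
  'name' => 'Example image',
  // Assumes the default source field shipped with the Image media type.
  'field_media_image' => ['target_id' => 42],
]);
$media->save();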

Installation of Media module:

The Media module ships with Drupal 8 core but you will need to enable it first to be able to use it. Here are the steps to enabling and using the Drupal 8 Media module:

Step 1:

First, we have to go to the Extend tab and enable the Media and Media Library modules, as shown in the image below.

Step 2:

Once we enable the modules, we can see that the Media types are part of the site’s structure, with five types: Audio, Image, Remote video, File and Video. Here we can create our own types as well.

Step 3:

Now we can use this as an entity reference for any field.
For example, here we will create a field for Images as the Media type, as shown in the image below.

Next, we have to select a Reference type of media, as shown in the image below.

Step 4: 

The field has been created successfully. It is time to add some media of the available types. You can add it in two ways:

  1. From the admin dashboard “Content -> Add content -> Article” for the Image field

  2. Directly from the “Media” tab in “Content” where you will see the entire Media Library with all media that you have added.

The new Media Library interface can display your media in the Grid view and in the Table view.

It is easy to filter and sort the media by various criteria, as well as select particular items to do actions with them (delete, publish, save, unpublish).

Sep 11 2019
Sep 11

In a previous post I had the opportunity to present the Entity Activity module which allows us to set up a notification system on any type of Drupal 8 content entity, according to the three main actions of their life cycle: creation, update and deletion.

Since its beta 8 release, the Entity Activity module includes a sub-module, Entity Activity Mail, which allows us to email each user a summary of the notifications generated for them, at a frequency that each user can configure.

General configuration

The module offers several general configuration options.

Entity Activity Mail General settings

The main configuration options, in addition to the possibility of stopping the sending of notifications by email globally, are:

  • The possibility to exclude notifications, which have been marked as read, from the report sent by email
  • The definition of the time at which background tasks will be launched to generate all notification reports for each user and then send them. These tasks can be very time-consuming depending on the volume of notifications and users, so it is recommended to configure cron at the server level to run at least every hour, so that the queues in charge of generating and sending the email reports are completely processed.
  • The ability to log the sending of each report for history or debugging purposes.

We also have the possibility to customize the email of the notification report.

Entity Activity Mail mail settings

In particular, we can customize the sender mail (leave blank to use the site mail), the subject of the mail, a text displayed before the notification report and finally a text displayed after the report (footer).

It is strongly recommended to use the SwiftMailer module, or any other solution of your choice, to ensure that the emails sent are in HTML format, for easier integration and theming of the generated reports.

Configuration for users

In order to allow each user to select the frequency at which they wish to receive the notification report, we must configure the Logs report frequency component on the manage form display of the user entity.

Widget frequency

This component will allow each user to set up their account according to their preferences.

Frequency options

Thus, a user can choose from:

  • Never receive a report by email
  • Receive an email with each generated notification (immediately)
  • Receive a daily, weekly or monthly report.

Theming of the report

The email of the notification report can be themed by overriding the basic template provided by the module: entity-activity-mail-report.html.twig

The basic template renders each notification according to the Mail view mode, which you can customize at this level; the logs_content variable contains a render array of all the log entities in this view mode. For more detailed needs, the Twig template also receives the logs variable, which holds all the Log entities that make up the report. If your needs are more advanced, you can always ask a Drupal freelancer to customize the rendering according to your expectations.

By way of conclusion

With a few clicks we can now offer users the option of receiving a notification report according to their preference.

As we have said above, generating and sending these reports by email can be very time-consuming, so the module works in three steps: first it collects all eligible users according to their sending frequency, then for each user a processing queue generates the report, and finally another processing queue sends the email itself. These three steps make it possible to generate and send reports more easily, but it should be borne in mind that processing queues are only executed when the site's Cron is launched. This is why it is highly recommended to launch the Cron from the server hosting the site at least every hour, or more frequently depending on the volume (number of users, estimated number of notifications) of your Drupal project.

Sep 10 2019
Sep 10

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

I'm excited to share that when Drupal 8.8 drops in December, Drupal's WYSIWYG editor will allow media embedding.

You may wonder: Why is that worth announcing on your blog? It's just one new button in my WYSIWYG editor.

It's a big deal because Drupal's media management has been going through a decade-long transformation. The addition of WYSIWYG integration completes the final milestone. You can read more about it on Wim's blog post.

Drupal 8.8 should ship with complete media management, which is fantastic news for site builders and content authors who have long wanted a simpler way to embed media in Drupal.

Congratulations to the Media Initiative team for this significant achievement!

Sep 10 2019
Sep 10

This is the second of three blog posts about creating an app with React Native. To get started with React Native, read the Basics of React Native. Once you are familiar with the system and have an app, it is time to fill it out with content. If you don’t have content on your Drupal website, read Understanding Drupal 8’s Migrate API.

Exposing Drupal content

Some helpful modules to expose Drupal content are: Views, RESTful Web Services, and Serialization. The concept of getting Drupal content to an endpoint is simple:

  1. Build a view containing the data you want to expose.
  2. Add a “REST export” display to the view. During this step, select the appropriate serialization type.
  3. This will automatically create a REST endpoint at the path you specify.

The dataflow should look something like this: Drupal Content -> View -> Serializer -> REST endpoint.

Using fetch to asynchronously retrieve data

React Native’s compiler is Babel, which means ES6 code can be used anywhere in the project. This is useful for retrieving data, because ES6 enables the async and await keywords. When used in conjunction with something like fetch, you can create a smooth and elegant solution for pulling data from a server, such as content from a Drupal website.

The code for pulling from a Drupal website REST endpoint is the same as for any other REST endpoint. It should look something like this:

async getSomeData() {
  let url = "https://data.org/useful/data";
  let response = await fetch(url);
  let data = await response.text();
  return data;
}

The advantage of making a call like this asynchronously is that it allows the rest of the app to keep running while the fetch waits for the server to return the requested data. This improves the user experience because users can continue using other functions while the data loads.

Building a FlatList from a data set

After pulling in data from the endpoint, add a component to display the data. <FlatList> is an excellent component already built into React Native. These components are useful because they can handle large amounts of data without impacting performance, since they only render the part of the list that is currently on screen.

A <FlatList> component takes two props for displaying data. You may need to massage the data to make it easier to use inside a <FlatList>. The first prop, data, is the set of data that it will display. The second required prop is renderItem, which describes how the data should be displayed. It is a function returning JSX that tells the <FlatList> component how to represent each list item, and what fields to pull from the data. You can use any component inside renderItem.

The ListItem component provided by React Native Elements has lots of styling features, like color gradients and automatic chevron placement.

Here is an example <FlatList>:

<FlatList
  style={{backgroundColor: '#ededed'}}
  data={this.state.peopleData}
  renderItem={({item}) => (
    <View>
      <ListItem
        title={item.name}
        titleStyle={{ color: '#00AEED', fontWeight: 'bold' }}
        subtitle={item.position}
      />
    </View>
  )}
/>

With the skills to expose, retrieve, and display your data, you can integrate a Drupal website with your new React Native app.

Sep 10 2019
Sep 10

Dropsolid is a Diamond sponsor at DrupalCon Amsterdam, 28-31 October. In this post, I’d like to share a bit about our vision for delivering the best customer experiences, our open integrated Digital Experience Platform, our partner program, and a special opportunity for DrupalCon attendees.

Are you working in a digital agency and coming to DrupalCon? We’d love to meet you at DrupalCon and talk about how our tools, infrastructure, and expertise could help you as a digital agency partner. We’ll be at Stand 13, by the catering area, so if you fancy a coffee, stop by for a chat. We’re running a very special giveaway, too. Complete a quick survey and we’ll donate 15 minutes of core contribution time as a thank you.

Sign up for Dropsolid News


A vision for Drupal to improve customer experience

In my previous post, I wrote about why we’re sponsoring DrupalCon. Simply put, without it, we wouldn’t exist. I also wrote about what we’re working on for the future, inspired by the market changes around digital experience management. I think we have something unique to offer our partner digital agencies right now.

I’ve gone from being a developer to a CEO, and I know the attraction of solving problems by building your own solutions. Yet, like many agencies, we discovered a few years ago that doing everything in-house was hindering us from growth. To solve this, we ended up pivoting our entire company, defining and offering solutions in a completely different way.

We found that many of our clients’ and partners’ teams were working in silos, with different focuses—one on marketing, another on hosting, and so on. We believe we have to take an integrated approach to solving today’s problems and a big part of that is offering stellar customer experience. We discovered that investing in customer experience means your customers stick around more, and for longer. This translates to increased customer lifetime value, lower customer acquisition costs, and lower running costs. But what does it take to get there?

We have to recognize how problems are connected, so we can build connected solutions. You can see this in problems like search engine optimization. SEO is as much about great user experience as it is about your content. Today, for example, the speed and performance of your website affects your search engine rankings. Incidentally, my colleagues Wouter De Bruycker (SEO Specialist) and Brent Gees (Drupal Architect) will be talking about avoiding Drupal SEO pitfalls at DrupalCon Amsterdam.

Similarly, it seemed that various solutions out there were narrowly focused on a single area. We saw the potential and power of integrating these as parts of a unified Digital Experience Platform. Stand-alone, any one of these tools offers benefits, but integrated together, the whole is greater than the sum of its parts.

We are taking this approach with our clients already. With each successful engagement, we add what we learn to our toolbox of integrated solutions. We are building these solutions out for customers with consultation and training to make the most out of their investments. These include our hosting platform; our local dev tool, Launchpad; our Drupal install profile, Dropsolid Rocketship; Dropsolid Personalization; and Dropsolid Search optimized with Machine Learning. 

But our vision is bigger. We are working towards an open, integrated Digital Experience Platform that our partner agencies can leverage for greater creative freedom and increased capacity without getting in their own way.

Stop by at DrupalCon or get in touch and see what we’re building for you. 

Read more: Open Digital Experience Platform


A Partner for European Digital Agencies

Dropsolid is the only European company sponsoring DrupalCon Amsterdam at the top-tier, Diamond sponsor level. With all due respect for our American colleagues, we believe a robust European company should exist to support all of us here. We want to help other European companies build successful digital experiences with Drupal at the core for organizations, governments, and others.

Like many Drupal agencies, we’ve gotten to where we are now by providing services to our local market. Being based in Belgium, we design, strategize, build, maintain, and run websites and applications for clients, mainly in the Benelux region.

Now, we are looking for partners outside of Belgium to benefit from using our Drupal Open Digital Experience Platform for themselves and their customers. Dropsolid has the tools, infrastructure, and expertise to support creating sustainable digital experiences for anyone. Furthermore, we have the advantage of knowing and understanding the differing needs of our colleagues and clients across Europe.


Come join us!

We are looking for more partners to join us on this journey. By leaning on our tools and expertise, those who have already joined us now have more capacity for creative growth and opportunity.

What you might see as tedious problems and cost-centers holding your agency back, we see as our playground for invention and innovation. Our partners can extend and improve their core capabilities by off-loading some work onto us. And you gain shared revenue from selling services that your customers need.

You might be our ideal partner if you prefer

  • benefitting from recurring revenue, and 
  • not taking on additional complexity that distracts you from your core creative business.

Partners who sign up with us at DrupalCon will get significant benefits including preferred status and better terms and conditions compared to our standard offerings. Talk to us about it at our booth at Stand 13 or contact us to arrange a time to talk.


Facets of the Digital Experience - Dropsolid Sessions at DrupalCon

Check out my other post to see where to meet the Dropsolid Team at DrupalCon. You’re welcome to come say hello at our booth at Stand 13, and we can show you the facets of digital experience management as we see them, and also share our vision for the future.

Each one of our talks focuses on different facets of improving the digital experience for customers:

Sep 10 2019
Sep 10
Just like the poem says, “Little drop makes the mighty Ocean,” all contributions matter in the growth of the Drupal global community.  A diverse community results in great things. To ensure the longevity of Drupal digital experiences and the…
Sep 10 2019
Sep 10

DrupalCon Minneapolis 2020 is now accepting session proposal submissions! For 2020, we welcome a wealth of perspectives and a vast knowledge base as presenters in Minneapolis for DrupalCon North America.

Sep 10 2019
Sep 10

Download the Menu Item Extras module like you would any other Drupal module. You can download it with Composer using the following command:

composer require drupal/menu_item_extras

Install the Menu Item Extras module. The module also provides a Demo module that can be used to see some examples of a menu with fields and display modes configured. In this case, we will just look at the base Menu Item Extras module.

Navigate to Structure > Menus > Main Navigation. The first thing you should notice is that there are additional links to Manage fields, Manage form display, Manage display, and View mode settings. This is very similar to what you have probably used on other entity types.

If you need to store any additional data for a menu link, you can do this on the Manage fields page. One potential use of this is to add an image field:

Manage Fields

You can then manage the way this is displayed on the menu link add/edit form:

Manage Form Display

You can also control how this menu item is displayed:

Manage Display

If you navigate to Structure > Display modes > View modes you can add additional view modes for menu items. In this example, I created a new view mode for Custom menu links. I called the view mode Image Link.

Custom Menu Link

You can now navigate back to Structure > Menus > Main Navigation and go to Manage display. In the Custom Display Settings section, you can enable the Image Link view mode and configure the display settings for the Default links and the Image Link view mode displays.

Custom Display Settings

You can now navigate to the View Modes Settings tab and select what view mode to use for each link in your menu.

View Mode Settings

This additional flexibility allows you to do a lot with your menu items. You could use this to build out a customized mega menu (this would require additional theme and template development). You could also use this to customize the display of menu items (perhaps by adding icons next to menu links, adding additional menu link descriptions, and more). The module provides you the site building tools to customize your menu items; now it’s up to you to decide how you want to use it!

Sep 09 2019
Sep 09

Rain logo updated

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

Overview

The Mediacurrent development team uses a Composer project template that extends the official Drupal Composer template to add Rain projects as well as additional tools and scripts.

Our template by default leverages a fork of DrupalVM which will provision the local environment. Note that Docker-based environments such as Lando or DDEV could be used as an alternative to Vagrant.

In this tutorial, we will walk through each step to get you up and running quickly. Below, you can also watch a narrated tutorial video to see these steps in action.

Installation instructions

First, you will want to create a repository wherever you typically host your Git projects (e.g. GitHub, Bitbucket or GitLab). Once you have that set up, you can clone Mediacurrent’s repo and point the origin back to your Git repo. The example command below illustrates how this is done.

Example:

git remote set-url origin [email protected]:mediacurrent/shortcode_project.git

Next, you will want to initialize the project. You can do that by running the following commands with your local host name and IP (see example below).

Example:

composer install

composer drupal-scaffold

./scripts/hobson project:init example.mcdev 192.168.50.4

Finally, to build the project and run the install you can simply run the following build command to execute the composer install and Drupal install:

./scripts/build.sh

Note that this command does require Mediacurrent’s Vagrant environment in order to work. If you are using an alternative local environment, you would run composer install, followed by the drush site install command, instead of running the build script.

Once you get a full install working with the sample profile that’s been provided you will want to follow the project README documentation for further setup instructions. Remember to commit all of your files and push up to your Git’s origin. That’s it!

Questions or comments? Let me know at https://twitter.com/drupalninja/.

Sep 09 2019
Sep 09

When you build a new website, going live is relatively easy. You get ahold of a domain name, point it at a webhost, put the website code there, and you're up and running!

After a site is live, it gets a lot more complicated.

What's important about deployment?

If you have a simple brochure site, deploying updates doesn't have to be complicated. The more your site does, the more complex deployment becomes. A deployment plan can help you stay out of trouble, keep your site online, and minimize data loss. So when going live with an update to a site, you should ask:

  • How much downtime is acceptable?
  • How much testing do we need before we make a change to the production site?
  • What data could we lose from the production site?
  • What might go wrong with this deployment strategy?
  • How can we recover if something does go wrong?

A good deployment plan should make you smile with comfort, knowing you have all the bases covered. Are you smiling? If not, read on.

Common deployment strategies

Here are the main strategies we've seen or used for deployment:

  • Do all work in the production environment so there's nothing to deploy
  • Copy the entire new site into the production environment
  • Compile/build a site and put the result into the production environment
  • Dev/Stage/Production pipeline
  • Blue/Green deployments

Let's take a deeper look at each one.

No Deployment - work in production

All too often, this is what you get if you aren't careful hiring a freelancer. This really seems to be the standard approach for most WordPress sites, which to me is horrifying.

Coding is often a process of trying something, breaking things, and then fixing them. Rinse and repeat. If you're doing this on a live production website, your site visitors will see broken pages, weird issues, or sometimes nothing at all. If your site is already getting traffic, working in production is irresponsible and dangerous. Especially if you aren't extremely careful about backups, and aren't extremely proficient.

The only benefit of "no deployment" deployment strategies is that it's cheap -- you're saving the cost of managing a copy of your site, and deploying changes.

Copy site to production

This also seems to be a pretty common way of deploying sites -- simply copy the new site in its entirety to the production server and make it live.

For major upgrades, such as going from Drupal 7 to Drupal 8, or changing from one platform to an entirely different one, this is the main strategy we use. And there are definitely times when this strategy makes sense. However, for day-to-day maintenance, theme refreshes, or most typical deployments, this is not a very good approach.

If your production site has a database, and regular changes to it, you need to be extremely careful to not lose production data. For example, if your site allows user comments, takes e-commerce orders, or manages sales leads, and you simply copy a new site up, you risk losing something.

Save this one for entirely new sites. Don't do this for day to day work -- unless your site doesn't even have a database.

Build site and deploy

"Static site generators" like Gatsby and Jeckyll have become quite popular recently, because they generate static sites that do not have a database -- greatly simplifying security. If you're running a full-blown Content Management System (CMS) like Drupal or WordPress, you're putting an application with bugs on the Internet where anyone can attack it. If your site is just a collection of files, they can't really attack it -- they can attack the hosting environment but your site itself has far less "attack surface" for an attacker to go after.

Gatsby in particular is becoming quite popular as a front-end to Drupal and WordPress -- you write your content in a private CMS on a private LAN, not reachable from the Internet, export the entire site using Gatsby (the build step), and then copy the resulting code up to the web host (much like the previous deployment strategy).

If you use this approach, you still need to consider how to keep your CMS up to date, though if it's not online, updating it in place becomes a far more reasonable proposition.

Dev/Stage/Production pipeline

Now we've reached what we consider to be the "standard" deployment practice -- run 3 copies of your site:

  • Dev, or Development -- this copy is where you do the development work, or at least integrate all the various developer copies, and conduct the initial round of testing.
  • Stage, or Test -- The main purpose of this copy is to test the deployment process itself, and understand what might break when you roll out to production.
  • Production, or Live -- The site that's available to the public.

In general, code flows from dev to production, whereas content/data flows from production to dev. If your site takes orders, collects data using forms, supports ratings/reviews or comments, or does anything sophisticated, you'll probably end up with this deployment strategy.

Several of the more professional hosts, like Pantheon, Acquia, WP Engine, and others provide these 3 environments along with easy ways to deploy code up to prod, and copy data down from prod.

Many larger companies or highly technical startups have built out "continuous integration/continuous deployment" pipelines along these lines -- including Freelock. "Continuous Integration" basically kicks off automatic tests after code is pushed to a particular branch, and "Continuous Deployment" automates the deployment of code to production when tests have passed.

This is the key service we provide to nearly all our clients -- two different kinds of testing, a fully automatic pipeline, with automatic backups, release scheduling, release note management, and more. And we've built our pipeline to work with a variety of hosts including Pantheon and Acquia but also bare Linux servers at any cloud provider.

The main downsides of this type of deployment are that it can be slow to deploy, very hard to set up, prone to breaking as code and standards evolve, and different platforms have different challenges around deploying configuration changes. For example, when you move a WordPress database to another location, you need to do a string search/replacement in the database to update the URL and the disk location, and you may need to do manual steps after the code gets deployed. Drupal, on the other hand, may put the site in maintenance mode for a few minutes as database updates get applied.

All in all, when done well, this is a great deployment strategy, but can be very expensive to maintain. That's why our service is such a great value -- we do all the hard work of keeping it running smoothly across many dozens of clients, have automated a lot of the hard bits, and streamlined the setup.

Blue/Green deployments

If even a minute of downtime costs a significant amount of income, you may want to consider a Blue/Green deployment strategy. This is a strategy made for "high availability" -- doing your utmost to both minimize maintenance windows, and provide a rock-solid roll-back option if something goes awry.

With a Blue/Green deployment strategy, you essentially create two full production environments -- "blue" and "green". One of them is live at any given instance, the other is in standby. When you want to deploy an update, you deploy all the new code and configuration changes to the offline environment, and when it's all ready to go, you simply "promote" it to be the live one. For example, if Blue is live, you deploy everything to Green, possibly using a normal dev/stage/prod deployment process. The configuration changes happen while the green site is offline, so the public never gets a "down for maintenance" message. When it's all ready, you promote Green to live, and Blue becomes the offline standby copy. And if you discover a problem after going live, you simply promote Blue back to live, and Green goes into standby where it can get fixed.

There is a big downside here -- if your site takes orders, or otherwise changes the production database, there's a window where you could lose data, much like the "Copy Site to Production" strategy. You might be able to somewhat mitigate this by setting the live site to "read only" but still available, while you copy the database to the standby site and then apply config and promote. Or you might be able to create a "journal" or log of changes that you replay on the new site after it gets promoted. Or move to a micro-service architecture -- but then you're just moving the problem into individual microservices that still need a deployment strategy.

Which deployment strategy is best?

There is no "best" deployment strategy -- it's all about tradeoffs, and what is most appropriate for a particular site's situation. If you break up your site into multiple pieces, you may end up using multiple strategies -- but each one might be quite a bit simpler than trying to update the whole. On the other hand, that might actually lower availability, as various pieces end up with different maintenance schedules.

If you're running a PHP-based CMS, and you want to rest easy that your site is up-to-date, running correctly, and with a solid recovery plan if something goes wrong, we can help with that!

Sep 09 2019
Sep 09

With Drupal 9 slated to be released in June 2020, the Drupal community has around 11 months to go. So, before it maps out a transition plan, now is the time to discuss what to expect from Drupal 9.

Switch to Drupal 9

You must be wondering:

Is Drupal 9 a reasonable plan for you?

Is it easy to migrate from recent Drupal versions to the new one?

This blog post has all your questions answered.

Let’s see the major changes in Drupal 9

The latest version of Drupal is said to be built on Drupal 8, and the migration will be far easier this time.

  • Updated dependency versions so that they remain supported
  • Removal of deprecated code before release

The foremost update to be made in Drupal 9 is Symfony 4 or 5, and the team is working hard on its implementation.

Planning to Move to Drupal 9?

With the release of Drupal 8.7 in 2019, Twig 2 became optionally supported, which has helped developers start testing their code against that version of Twig. Drupal 8.8 is expected to support the most recent version of Symfony. Ideally, the Drupal community would like to release Drupal 9 with support for Symfony 5, which is due to be released at the end of 2019.

Drupal 9

If you are already using Drupal 8, the best advice is to keep your site up to date. Drupal 9 is an updated version of Drupal 8 with updated third-party dependencies and deprecated code removed.

Ensure that you are not using any deprecated APIs and modules, and wherever possible use the most recent versions of dependencies. If you do that, your upgrade experience should not encounter any problems.

Since Drupal 9 is being built within version 8, developers will have the choice to test their code and make updates before the release of Drupal 9. This is an outstanding update and was not possible with the previous versions of Drupal!

So, where are you in the Drupal journey?

Here are some scenarios to support your migration process:

Are you on Drupal 6?

You are way behind! We strongly suggest you move to Drupal 8 as soon as you can. Migration from 8 to 9 will be straightforward. Drupal 8 includes migration support for Drupal 6 which probably won’t be included in Drupal 9. While some contributed modules for this might still be available, it is better to be safe.

Are you on Drupal 7?

Support for both Drupal 7 and 8 will end by 2021. Since the release of Drupal 9 is set for June 2020, you should plan your upgrade and go live. If you have not migrated by then, your website may be vulnerable to security threats. Since a Drupal 7 to 9 migration is similar to new development, you just need to consider the timeline involved.

Are you on Drupal 8?

Great work if you are already on Drupal 8! For you, moving to the next major version should take very little effort.

The big difference between the last version of Drupal 8 and the first release of Drupal 9 is that of deprecated code being removed. You just need to check that your themes, modules, and profiles don’t include any such code. So, there’s no need to worry about migrating your content at all!

Want to know more about Drupal and its migration process?

Sep 09 2019
Sep 09

The node_list cache tag is generated automatically when we create a view that displays the node entity type in Drupal 8. This cache tag will invalidate the cache of all views that list any kind of node (page, article, ...) when we make a CUD (create, update, delete) action on any kind of node.

At first glance, this seems a very good cache invalidation strategy since, when we modify a node through a CUD action, the cache of every view that displays nodes will be invalidated to reflect the change.
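
For reference, here is a rough sketch (not from the original post) of what core effectively does whenever any node is saved or deleted, whatever its bundle:

use Drupal\Core\Cache\Cache;

// Invalidating the generic list tag clears every cache entry tagged with
// 'node_list', including the render cache of all views that list nodes.
Cache::invalidateTags(['node_list']);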

node_list cache tag views drupal9

(to see the cache tags in your header response, just enable your settings.local.php)

So far so good, but... What would happen if we have a high traffic web site with hundreds of different node bundles and hundreds of views displaying different kinds of nodes?

In that case, if we modify a node (let's say a page), the cache of every view listing page nodes (but also views listing article nodes and other node bundles) will be invalidated. This means that if we modify a page node, the cache of all views displaying article nodes will also be invalidated. This could be a serious performance issue, especially when we have a lot of updates, like on a newspaper web site.

How can we invalidate the view caches in a more specific way, let's say only for nodes of the same type?

To do so, we'll use a two steps process:

- Place a custom cache tag for all views displaying the same node type, like node:type:page for all views listing page nodes, node:type:article for views listing articles nodes and so on...
- Invalidate these custom cache tags when a CUD operation is performed on a specific node type with a hook_node_presave() 

Add custom cache tags with the Views Custom Cache Tags contrib module

Luckily, to solve the node_list cache tag problem, the community made a contrib module that lets us place custom cache tags in our views: the Views Custom Cache Tags contrib module. This module allows us to define custom cache tags for any view we build.

In this case, we are going to place in our views a custom cache tag for each type of node we are displaying:

  • node:type:article for views listing articles
  • node:type:page for views listing pages
  • node:type:<your custom node type> ...

First, we are going to download and install the module.

# Download the module with composer
composer require drupal/views_custom_cache_tag
# Install the module with DC
drupal moi views_custom_cache_tag 

Then we could create two block views, one listing 5 articles (name it Block Articles) and one listing 5 pages (name it Block Pages). Next, we place a custom cache tag node:type:article in the view that lists articles (Block Articles). We do the same, but with another custom cache tag, node:type:page, in the view listing pages (Block Pages).

Let's see how to do it for the view (block) listing articles:

1. Edit the view
2. Go to ADVANCED and click on 'Tag based' in Caching

custom cache tags views drupal8

3. Then click on Custom Tag based

custom cache tags views drupal 8

4. Insert the custom cache tag for this node type. In our case, as we are listing articles, we introduce node:type:article. For views listing other kinds of nodes, we'll introduce node:type:<node-type>. Don't forget to click on Apply when you're done.

custom cache tags views drupal8

5. Save the view

custom cache tags views Drupal 8

Once we have placed custom cache tags in all views listing nodes, we can move to the second step: invalidating these tags when needed.

Invalidate custom cache tags with a hook_node_presave

Now we need to invalidate these custom cache tags when a CUD action is performed on a specific node type thanks to the hook_node_presave.

To do that, we're going to create a module. You can download the code of this module here.

Let's first create our module with Drupal Console as follow:

drupal generate:module  --module="Invalidate custom cache tags" --machine-name="kb_invalidate_custom_cache_tags" --module-path="modules/custom" --description="Example to invalidate views custom cache tags" --core="8.x" --package="Custom"  --module-file --dependencies="views_custom_cache_tag"  --no-interaction

Then we enable the module; we can also do this with Drupal Console:

drupal moi kb_invalidate_custom_cache_tags

Now we can edit our kb_invalidate_custom_cache_tags.module and place the following hook:

// For hook_ENTITY_TYPE_presave.
use Drupal\Core\Cache\Cache;
use Drupal\Core\Entity\EntityInterface;

/**
 * Implements hook_ENTITY_TYPE_presave().
 *
 * Invalidates cache tags for node lists.
 */
function kb_invalidate_custom_cache_tags_node_presave(EntityInterface $entity) {
  $cache_tag = 'node:type:' . $entity->getType();
  Cache::invalidateTags([$cache_tag]);
}

Yes, we still have hooks in Drupal 8... This hook is fired each time a node is created or updated. We first retrieve the node type (page, article, ...) with $entity->getType() and build the $cache_tag variable from it; this variable corresponds to our custom tag for this kind of node. Next, we mark this cache tag as invalid in all bins with Cache::invalidateTags([$cache_tag]);, so the cache of every view carrying this custom cache tag will be invalidated.

In this case, if we insert or update a node of type article, the views custom cache tag will be node:type:article and the cache of all views with this custom cache tag will be invalidated. The cache of the views with other custom cache tags like node:type:page will remain valid. This was just what we were looking for! Thank you Drupal!
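
Note that hook_node_presave() only fires when a node is created or updated. If you also want the "D" in the CUD operations mentioned earlier to clear the same caches, a companion hook can invalidate the identical tag. Here is a minimal, untested sketch that would live in the same .module file, reusing the use statements already added above:

/**
 * Implements hook_ENTITY_TYPE_predelete().
 *
 * Invalidates the same custom cache tag when a node is about to be deleted.
 */
function kb_invalidate_custom_cache_tags_node_predelete(EntityInterface $entity) {
  // Same tag format as in the presave hook: node:type:<node-type>.
  $cache_tag = 'node:type:' . $entity->getType();
  Cache::invalidateTags([$cache_tag]);
}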

Recap.

To avoid invalidating the cache of all node views whenever we perform a CUD operation on a node, we need to replace the general node_list cache tag with a more specific custom cache tag like node:type:<node-type>.

Thanks to the Views Custom Cache Tags contrib module, we can insert custom cache tags in our views based on the node type listed in the view, such as node:type:article, node:type:page and so on.

Next, we mark these custom cache tags as invalid when we insert or update a node of a specific type, thanks to the hook_ENTITY_TYPE_presave() hook we've placed in our custom module.

Voilà! If you have another strategy for dealing with the node_list problem, please share it with us in the comments.

Sep 09 2019
Sep 09

The staff and board of the Drupal Association would like to congratulate our newest At-Large board member:

Leslie Glynn

Leslie has more than 30 years of experience in the tech field as a software developer and project manager. She has been a freelance Drupal Project Manager and Site Builder since 2012. Glynn is very active in the Drupal community as an event organizer (Design 4 Drupal, Boston and NEDCamp), sprint organizer, mentor, trainer and volunteer. She is the winner of the 2019 Aaron Winborn Award. This annual award recognizes an individual who demonstrates personal integrity, kindness, and above-and-beyond commitment to the Drupal community.

Being a volunteer at numerous Drupal camps and DrupalCons has given me the opportunity to meet and learn from many diverse members of the Drupal community. I hope to bring that knowledge and experience to my work on Drupal Association initiatives. One of the things I would like to help with is growing Drupal adoption through new initiatives that reach out to under-represented and diverse groups through an increased presence at secondary schools and universities and to groups such as "Girls Who Code" and other groups in the tech space.

We are all looking forward to working with you, Leslie.

Thank you to all our candidates

On behalf of all the staff and board of the Drupal Association, and I’m sure the rest of the Drupal community, I would like to thank all of those people who stood for election this year. It truly is a big commitment to contribution and one to be applauded. We wish you well for 2019 and hope to see you back in 2020!

About the Elections Methodology: Instant Run-off Voting (IRV)

Elections for the Community-at-large positions on the Drupal Association Board are conducted through Instant Run-off Voting. This means that voters can rank candidates according to their preference. When tabulating ballots, the voters' top-ranked choices are considered first. If no candidate has more than 50% of the vote, the candidate with the fewest votes is eliminated. Then the ballots are tabulated again, with all the ballots that had the eliminated candidate as their first rank now recalculated with their second-rank choices. This process is repeated until only two candidates remain and a clear winner can be determined. This voting method helps to ensure that the candidate who is most preferred by the greatest number of voters is ultimately elected. You can learn more about IRV (also known as Alternative Vote) in this video.

Detailed Voting Results

There were 12 candidates in contention for the single vacancy among the two community-at-large seats on the Board. 1,050 voters cast their ballots out of a pool of 49,498 eligible voters (2.2%).

The full results output is below. The system allows for candidates to keep their name hidden, if they choose, so we replaced the names of those who did with a candidate number:

The number of voters is 1050 and there were 998 valid votes
and 52 empty votes. Removed withdrawn candidate Tushar Thatikonda from 
the ballots.

Counting votes using Instant Runoff Voting.

 R|Candi|Candi|Imre |Brian|Candi|Shada|Ahmad|Candi|Alann|Manji|Lesli|Exhau
  |date |date |Gmeli| Gilb|date |b Ash| Khal|date |a Bur|t Sin|e Gly|sted 
  |4    |3    |g Mei|ert  |2    |raf  |il   |1    |ke   |gh   |nn   |     
  |     |     |jling|     |     |     |     |     |     |     |     |     
  |     |     |     |     |     |     |     |     |     |     |     |     
==========================================================================
 1|   71|   74|  166|  119|   36|   45|    7|  115|   67|  116|  182|    0
  |-----------------------------------------------------------------------
  | Count of first choices.
==========================================================================
 2|   71|   75|  167|  120|   36|   46|     |  116|   67|  117|  183|    0
  |-----------------------------------------------------------------------
  | Count after eliminating Ahmad Khalil and transferring votes.
==========================================================================
 3|   72|   76|  177|  124|     |   47|     |  118|   68|  117|  185|   14
  |-----------------------------------------------------------------------
  | Count after eliminating Candidate 2  and transferring votes.
==========================================================================
 4|   74|   76|  178|  125|     |     |     |  132|   70|  130|  186|   27
  |-----------------------------------------------------------------------
  | Count after eliminating Shadab Ashraf and transferring votes.
==========================================================================
 5|   89|   77|  183|  133|     |     |     |  142|     |  131|  211|   32
  |-----------------------------------------------------------------------
  | Count after eliminating Alanna Burke and transferring votes.
==========================================================================
 6|   93|     |  192|  134|     |     |     |  151|     |  134|  217|   77
  |-----------------------------------------------------------------------
  | Count after eliminating Candidate 3 and transferring votes.
==========================================================================
 7|     |     |  199|  149|     |     |     |  177|     |  136|  248|   89
  |-----------------------------------------------------------------------
  | Count after eliminating Candidate 4 and transferring votes.
==========================================================================
 8|     |     |  208|  163|     |     |     |  228|     |     |  254|  145
  |-----------------------------------------------------------------------
  | Count after eliminating Manjit Singh and transferring votes.
==========================================================================
 9|     |     |  239|     |     |     |     |  247|     |     |  296|  216
  |-----------------------------------------------------------------------
  | Count after eliminating Brian Gilbert and transferring votes.
==========================================================================
10|     |     |     |     |     |     |     |  288|     |     |  359|  351
  |-----------------------------------------------------------------------
  | Count after eliminating Imre Gmelig Meijling and transferring votes.
  | Final round is between Candidate 1 and Leslie Glynn.
  | Candidate Leslie Glynn is elected.

Winner is Leslie Glynn.
Sep 09 2019
Sep 09

We’re back with an overview of the blog posts we wrote last month. If there are some you particularly enjoyed, this is the perfect opportunity to revisit them, as well as catch up on the ones you might have missed.

Recap of Acquia's webinar on the Digital Experience Platform

The first post we wrote in August is a recap of Acquia’s webinar on the DXP (Digital Experience Platform), which was presented by Tom Wentworth, SVP of Product Marketing at Acquia, and Justin Emond, CEO of Third and Grove.

They talked about digital experiences in general, then explained what a DXP is, why an open approach is best for a DXP, and how Acquia can serve as the basis for an open DXP.

The high emphasis placed on digital experiences is due to the fact that a single negative one can do irreparable damage to a brand. It is thus important to deliver integrated experiences on a platform that’s future-ready. 

As the only truly open DXP, Acquia’s Open Experience Platform is likely the best choice, as integrations with future technologies will be easier due to this open nature.

Read more

Interview with Ricardo Amaro: The future is open, the future is community and inclusion

Our second post is part of the series of our Drupal Community Interviews. This one features a prominent and prolific member of the community - Ricardo Amaro, Principal Site Reliability Engineer at Acquia and an active member of the Portuguese as well as the broader Drupal communities.

Ricardo has been involved in numerous important projects and initiatives, ranging from more technical endeavors such as Docker and containers, to more community-oriented things such as the Promote Drupal initiative.

Apart from that, he has presented at Drupal events and participated in the organization of several of them in Portugal as the president of the Portuguese Drupal Association.

He is also a strong advocate for Free Software and encourages collaboration with other projects in the ecosystem. He strives to keep the future of the web and technology in general open and rich in possibilities.

Read more

Top 10 Drupal Accessibility Modules

Even though Drupal is already quite optimized for accessibility, it never hurts to have even more resources at one’s disposal. This was our reasoning behind researching Drupal’s available accessibility modules and putting together this list. 

The modules on the list touch different aspects of accessibility and take into account everyone who interacts with the site in any way: there are modules for developers building the site, those for admins and content editors, and those that are geared towards users of the site (e.g. the Fluidproject UI Options module).

Some of the modules have particularly interesting functionality. Namely, the a11y module provides support for simulating specific disabilities, which helps developers feel empathy for users with these disabilities. The htmLawed module can also be especially useful, as it improves both accessibility and security.

Read more

Interview with pinball wizard Greg Dunlap, Senior Digital Strategist at Lullabot

Next up, we have another community interview, this one with pinball enthusiast Greg Dunlap, Lullabot’s Senior Digital Strategist. Interestingly, his first interaction with Drupal was with Lullabot, the company he’s now working for more than 10 years later!

Greg points out that it was actually Lullabot’s Jeff Eaton who gave him the push to start contributing, and the two became really good friends. He believes (and we agree!) that who you do something with is more important than what you do - very fitting, then, that he and Jeff now form Lullabot’s strategy team.

One of the things he has particularly enjoyed recently was working with the Drupal Diversity and Inclusion group. Since welcoming diverse backgrounds and viewpoints into the community is instrumental to the future of Drupal, he encourages anyone who’s interested to join the initiative.

Read more

Agiledrop recognized as a top Drupal development company by TopDevelopers.co

Our final post from August is a bit more company oriented. In a press release published in early August, the IT directory and review platform TopDevelopers.co listed us among the top 10 Drupal development companies of August 2019.

Of course, we’re very happy with the recognition and, with our diverse contribution to the Drupalverse and the numerous successful client projects, we feel it is well deserved. 

Among the reasons for selecting us, the spokesperson at TopDevelopers.co listed the super fast integration of development teams into clients’ teams, our clear and frequent communication with clients, and our adherence to strict coding and security standards. 

To learn more about our work, you can also check out our portfolio of references and case studies, as well as our profile page on TopDevelopers.co, which their team helped us build.

Read more

These were all our blog posts from August. We'll be back again next month with an overview of September's posts. Till then - enjoy!

Sep 09 2019
Sep 09

Various modules and plugins can alter the display of a view in Drupal, for instance to alternate the order of image and text on every new row, or to build some kind of stacked layout. 

It is possible to alter the display of a view with just a few lines of CSS code instead. This approach has many advantages, the most relevant being that you don't have to install and update a module.

Keep reading to learn how!

Step #1. - The Proposed Layout

190828 theming views

As you can notice, we can divide the layout into six columns and five rows. There are empty cells in the grid, whereas other cells contain one item of the view spanning several cells (a grid area). The Drupal view shows a list of articles and their titles. The view format is an unformatted list.

Step #2. - Create an Image Style

To ensure that all images are squared, it is necessary to configure an image style and set it as display style in the view.

  • Click Configuration > Image styles
    190828 theming views 001
  • Click Add image style
    190828 theming views 002
  • Give the new image style a proper name, for example, Crop 600x600.

It is always a good idea to include some reference about the dimension or the proportion of the image style. That helps when having multiple image styles configured.

  • Click Create new style
  • Select Crop from the dropdown
  • Click Add
    190828 theming views 003
  • Set height and width for the crop effect (make sure both dimensions are equal)
  • Leave the default crop anchor at the center of the image
  • Click Add effect
    190828 theming views 004
  • Make sure the image effect was recorded properly and click Save
    190828 theming views 005

Step #3. - Create the View

You can read more about rewriting results in Views here.

  • Save the view
  • Click Structure > Block layout
  • Scroll down to the Content section
  • Click Place block
    190828 theming views 013
  • Search for your block
  • Click Place block
  • Uncheck Display title
  • Click Save block
    190828 theming views 014
  • Drag the cross handle and place the block above the Main content
  • Scroll down and click Save blocks

Step #4. - Theming the View

There are 3 selectors you need to target to apply the layout styles to the view:

  • .gallery-item  (each content card will be a grid)

  • #block-views-block-front-gallery-block-1 .view-content

  • #block-views-block-front-gallery-block-1 .views-row

We set the specificity of the CSS styles on the block. The classes .view-content and .views-row are default Views classes. Theming with these alone would break the layout of other views on the site, for example, the teaser view on the front page.

Hint: I am working on a local development environment with a subtheme of Bartik. There is much more about Drupal theming at OSTraining here.
If you don’t know how to create a subtheme in Drupal yet, and you are working on a sandbox installation, just add the code at the end of the following file, and please remember to always clear the cache:

/core/themes/bartik/css/layout.css 

Let’s start with the content inside the .gallery-item container. It will be a grid with one column and 4 rows. The image will cover all 4 rows, whereas the title text will be located on the last row. To center the title on its cell, we declare the link tag as a grid container too.

  • Edit the CSS code:
.gallery-item {
    display: grid;
    grid-template-rows: repeat(4, 1fr);
}
 
.gallery-item a:first-of-type {
    grid-row: 1 / span 4;
    grid-column: 1;
}
 
.gallery-item a:last-of-type {
   grid-row: 4;
   grid-column: 1;
   display: grid; /* Acting as a grid container */
   align-content: center;
   justify-content: center;
   background-color: rgba(112, 97, 97, 0.5);
   color: white;
   font-size: 1.2em;
}

 Make the images responsive.

  • Edit the CSS code:
img {
   display: block;
   max-width: 100%;
   height: auto;
}

As already stated, we need a grid with 5 rows and 6 columns. After declaring it, map every position in the grid according to the layout with an area name. The empty cells/areas will be represented with a period. 

  • Edit the CSS code:
#block-views-block-front-gallery-block-1 .view-content {
 display: grid;
 grid-template-columns: repeat(6, 1fr);
 grid-template-rows: repeat(5, 1fr);
 grid-gap: 0.75em;
 grid-template-areas:
 ". thumb1 main main main thumb2"
 ". thumb3 main main main thumb4"
 ". thumb5 main main main thumb6"
 "secondary secondary thumb7 thumb8 thumb9 ."
 "secondary secondary . . . .";
 max-width: 70vw;
 margin: 0 auto;
}

Now it’s time to assign each grid item to its corresponding region.

  • Edit the CSS code:
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(1) {
   grid-area: main;
}
 
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(2) {
   grid-area: secondary;
}
 
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(3) {
   grid-area: thumb1;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(4) {
   grid-area: thumb3;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(5) {
   grid-area: thumb5;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(6) {
   grid-area: thumb2;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(7) {
   grid-area: thumb4;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(8) {
   grid-area: thumb6;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(9) {
   grid-area: thumb7;
}
 
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(10) {
   grid-area: thumb8;
}
#block-views-block-front-gallery-block-1 .view-content > .views-row:nth-of-type(11) {
   grid-area: thumb9;
}

I think this is a practical way to lay out Views items the way you want without needing to install extra modules, which could unnecessarily affect the performance of your site. 

The Media Queries

The layout will break at around 970px, because of the font size. 

  • Edit the CSS code:
@media screen and (max-width: 970px) {
 .views-row > div .gallery-item > a:nth-child(2) {
   font-size: .9em;
 }
}

To change the layout, just add a media query with a new grid-template-areas distribution and, of course, change the way the rows and columns are distributed. The items are already assigned to their respective areas.

  • Edit the CSS code:
@media screen and (max-width: 700px) {
 .view-content {
   grid-template-columns: repeat(2, 1fr);
   grid-template-rows: repeat(10, auto);
   grid-template-areas:
     "main main"
     "main main"
     "thumb1 thumb2"
     "thumb3 thumb4"
     "secondary secondary"
     "secondary secondary"
     "thumb5 thumb6"
     "thumb7 thumb8"
     "thumb9 thumb9"
     "thumb9 thumb9";
 }
}

This layout will work even with the smallest device screen.

I hope you liked this tutorial. Thanks for reading!


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Sep 09 2019
Sep 09

3 minute read Published: 29 Aug, 2019 Author: Colan Schwartz
Drupal Planet , Aegir , DevOps

Have you been looking for a self-hosted solution for hosting and managing Drupal sites? Would you like to be able to upgrade all of your sites at once with a single button click? Are you tired of dealing with all of the proprietary Drupal hosting providers that won’t let you customize your set-up? Wouldn’t it be nice if all of your sites had free automatically-updating HTTPS certificates? You probably know that Aegir can do all of this, but it’s now trivial to set up a temporary trial instance to see how it works.

The new Aegir Development VM makes this possible.

History

Throughout Aegir’s history, we’ve had several projects striving to achieve the same goal. They’re listed in the Contributed Projects section of the documentation.

Aegir Up

Aegir Up was based on a VirtualBox virtual machine (VM), managed by Vagrant and provisioned with Puppet. It was superseded by Valkyrie (see below).

Aegir Development Environment

Aegir Development Environment took a completely different approach using Docker. It assembles all of the services (each one in a container, e.g. the MySQL database) into a system managed by Docker Compose. While this is a novel approach, it’s not necessary to have multiple containers to get a basic Aegir instance up and running.

Valkyrie

Valkyrie was similar to Aegir Up, but provisioning moved from Puppet to Ansible. Valkyrie also made extensive use of custom Drush commands to simplify development.

Its focus was more on developing Drupal sites than on developing Aegir. Now that we have Lando, it’s no longer necessary to include this type of functionality.

It was superseded by the now current Aegir Development VM.

Present

Like Valkyrie, the Aegir Development VM is based on a VirtualBox VM (but that’s not the only option; see below) managed with Vagrant and provisioned with Ansible. However, it doesn’t rely on custom Drush commands.

Features

Customizable configuration

The Aegir Development VM configuration is very easy to customize as Ansible variables are used throughout.

For example, if you’d like to use Nginx instead of Apache, simply replace:

    aegir_http_service_type: apache

…with:

    aegir_http_service_type: nginx

…or override using the command line.
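
For example, if you run the provisioning playbook with Ansible directly, the standard --extra-vars (-e) flag can override the variable without editing any files; the playbook name below is just a placeholder for whatever the project ships with:

    ansible-playbook playbook.yml -e "aegir_http_service_type=nginx"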

You can also install and enable additional Aegir modules from the available set.

Support for remote VMs

For those folks with older hardware who are unable to spare extra gigabytes (GB) for VMs, it’s possible to set up the VM remotely.

While the default amount of RAM necessary is 1 GB, 2 GB would be better for any serious work, and 4 GB is necessary if creating platforms directly from Packagist.

Support for DigitalOcean is included, but other IaaS providers (e.g. OpenStack) can be added later. Patches welcome!

Fully qualified domain name (FQDN) not required

While Aegir can quickly be installed with a small number of commands in the Quick Start Guide, that process requires an FQDN, usually something like aegir.example.com (which requires global DNS configuration). That is not the case with the Dev VM, which assumes aegir.local by default.
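
Depending on how your Vagrant set-up handles host names, you may need to point aegir.local at the VM yourself so your browser can reach it. A typical /etc/hosts entry might look like this (the IP address is only a placeholder; use whatever address your VM is actually assigned):

    192.168.56.10   aegir.local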

Simplified development

You can use it for Aegir development as well as trying Aegir!

Unlike the default set-up provisioned by the Quick Start Guide, which would require additional configuration, the individual components (e.g. Hosting, Provision, etc.) are cloned repositories making it easy to create patches (and for module maintainers: push changes upstream).

Conclusion

We’ve recently updated the project so that an up-to-date VM is being used, and it’s now ready for general use. Please go ahead and try it.

If you run into any problems, feel free to create issues on the issue board and/or submit merge requests.

The article Try Aegir now with the new Dev VM first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Sep 07 2019
Sep 07

In the previous article, we began talking about debugging Drupal migrations. We gave some recommendations of things to do before diving deep into debugging. We also introduced the log process plugin. Today, we are going to show how to use the Migrate Devel module and the debug process plugin. Then we will give some guidelines on using a real debugger like XDebug. Next, we will share tips so you get used to migration errors. Finally, we are going to briefly talk about the migrate:fields-source Drush command. Let’s get started.

Example configuration for debug process plugin

The migrate_devel module

The Migrate Devel module is very helpful for debugging migrations. It allows you to visualize the data as it is received from the source, the result of field transformation in the process pipeline, and values that are stored in the destination. It works by adding extra options to Drush commands. When these options are used, you will see more output in the terminal with details on how rows are being processed.

As of this writing, you will need to apply a patch to use this module. Migrate Devel was originally written for Drush 8, which is still supported but no longer recommended. Instead, you should use at least version 9 of Drush. Between 8 and 9 there were major changes in Drush internals, and commands need to be updated to work with the new version. Unfortunately, the Migrate Devel module is not fully compatible with Drush 9 yet. Most of the benefits listed on the project page have not been ported. For instance, automatically reverting the migrations and applying the changes to the migration files is not yet available. The partial support is still useful, and to get it you need to apply the patch from this issue. If you are using the Drush commands provided by Migrate Plus, you will also want to apply this patch. If you are using the Drupal composer template, you can add this to your composer.json to apply both patches:

"extra": {
  "patches": {
    "drupal/migrate_devel": {
      "drush 9 support": "https://www.drupal.org/files/issues/2018-10-08/migrate_devel-drush9-2938677-6.patch"
    },
    "drupal/migrate_tools": {
      "--limit option": "https://www.drupal.org/files/issues/2019-08-19/3024399-55.patch"
    }
  }
}

With the patches applied and the modules installed, you will get two new command line options for the migrate:import command: --migrate-debug and --migrate-debug-pre. The major difference between them is that the latter runs before the destination is saved. Therefore, --migrate-debug-pre does not provide debug information of the destination.

Using either of the flags will produce a lot of debug information for each row being processed. Many times, analyzing a subset of the records is enough to spot potential issues. The patch to Migrate Tools will allow you to use the --limit and --idlist options with the migrate:import command to limit the number of elements to process.

To demonstrate the output generated by the module, let’s use the image migration from the CSV source example. You can get the code at https://github.com/dinarcon/ud_migrations. The following snippets show how to execute the import command with the extra debugging options and the resulting output:

# Import only one element.
$ drush migrate:import udm_csv_source_image --migrate-debug --limit=1

# Use the row's unique identifier to limit which element to import.
$ drush migrate:import udm_csv_source_image --migrate-debug --idlist="P01"
$ drush migrate:import udm_csv_source_image --migrate-debug --limit=1
┌──────────────────────────────────────────────────────────────────────────────┐
│                                   $Source                                    │
└──────────────────────────────────────────────────────────────────────────────┘
array (10) [
    'photo_id' => string (3) "P01"
    'photo_url' => string (74) "https://agaric.coop/sites/default/files/pictures/picture-15-1421176712.jpg"
    'path' => string (76) "modules/custom/ud_migrations/ud_migrations_csv_source/sources/udm_photos.csv"
    'ids' => array (1) [
        string (8) "photo_id"
    ]
    'header_offset' => NULL
    'fields' => array (2) [
        array (2) [
            'name' => string (8) "photo_id"
            'label' => string (8) "Photo ID"
        ]
        array (2) [
            'name' => string (9) "photo_url"
            'label' => string (9) "Photo URL"
        ]
    ]
    'delimiter' => string (1) ","
    'enclosure' => string (1) """
    'escape' => string (1) "\"
    'plugin' => string (3) "csv"
]
┌──────────────────────────────────────────────────────────────────────────────┐
│                                 $Destination                                 │
└──────────────────────────────────────────────────────────────────────────────┘
array (4) [
    'psf_destination_filename' => string (25) "picture-15-1421176712.jpg"
    'psf_destination_full_path' => string (25) "picture-15-1421176712.jpg"
    'psf_source_image_path' => string (74) "https://agaric.coop/sites/default/files/pictures/picture-15-1421176712.jpg"
    'uri' => string (29) "./picture-15-1421176712_6.jpg"
]
┌──────────────────────────────────────────────────────────────────────────────┐
│                             $DestinationIDValues                             │
└──────────────────────────────────────────────────────────────────────────────┘
array (1) [
    string (1) "3"
]
════════════════════════════════════════════════════════════════════════════════
Called from +56 /var/www/drupalvm/drupal/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_csv_source_image'

In the terminal, you can see the data as it is passed along in the Migrate API. In the $Source, you can see how the source plugin was configured and the different columns for the row being processed. In the $Destination, you can see all the fields that were mapped in the process section and their values after executing all the process plugin transformation. In $DestinationIDValues, you can see the unique identifier of the destination entity that was created. This migration created an image so the destination array has only one element: the file ID (fid). For paragraphs, which are revisioned entities, you will get two values: the id and the revision_id. The following snippet shows the $Destination and  $DestinationIDValues sections for the paragraph migration in the same example module:

$ drush migrate:import udm_csv_source_paragraph --migrate-debug --limit=1
┌──────────────────────────────────────────────────────────────────────────────┐
│                                   $Source                                    │
└──────────────────────────────────────────────────────────────────────────────┘
Omitted.
┌──────────────────────────────────────────────────────────────────────────────┐
│                                 $Destination                                 │
└──────────────────────────────────────────────────────────────────────────────┘
array (3) [
    'field_ud_book_paragraph_title' => string (32) "The definitive guide to Drupal 7"
    'field_ud_book_paragraph_author' => string UTF-8 (24) "Benjamin Melançon et al."
    'type' => string (17) "ud_book_paragraph"
]
┌──────────────────────────────────────────────────────────────────────────────┐
│                             $DestinationIDValues                             │
└──────────────────────────────────────────────────────────────────────────────┘
array (2) [
    'id' => string (1) "3"
    'revision_id' => string (1) "7"
]
════════════════════════════════════════════════════════════════════════════════
Called from +56 /var/www/drupalvm/drupal/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_csv_source_paragraph'

The debug process plugin

The Migrate Devel module also provides a new process plugin called debug. The plugin works by printing the value it receives to the terminal. As Benji Fisher explains in this issue, the debug plugin offers the following advantages over the log plugin provided by the core Migrate API:

  • The use of print_r() handles both arrays and scalar values gracefully.
  • It is easy to differentiate debugging code that should be removed from logging plugin configuration that should stay.
  • It saves time as there is no need to run the migrate:messages command to read the logged values.

In short, you can use the debug plugin in place of log. There is one particular case where using debug is really useful: if used in the middle of a process plugin chain, you can see how elements are being transformed at each step. The following snippet shows an example of this setup and the output it produces:

field_tags:
  - plugin: skip_on_empty
    source: src_fruit_list
    method: process
    message: 'No fruit_list listed.'
  - plugin: debug
    label: 'Step 1: Value received from the source plugin: '
  - plugin: explode
    delimiter: ','
  - plugin: debug
    label: 'Step 2: Exploded taxonomy term names '
    multiple: true
  - plugin: callback
    callable: trim
  - plugin: debug
    label: 'Step 3: Trimmed taxonomy term names '
  - plugin: entity_generate
    entity_type: taxonomy_term
    value_key: name
    bundle_key: vid
    bundle: tags
  - plugin: debug
    label: 'Step 4: Generated taxonomy term IDs '
$ drush migrate:import udm_config_entity_lookup_entity_generate_node --limit=1
Step 1: Value received from the source plugin: Apple, Pear, Banana
Step 2: Exploded taxonomy term names Array
(
    [0] => Apple
    [1] =>  Pear
    [2] =>  Banana
)
Step 3: Trimmed taxonomy term names Array
(
    [0] => Apple
    [1] => Pear
    [2] => Banana
)
Step 4: Generated taxonomy term IDs Array
(
    [0] => 2
    [1] => 3
    [2] => 7
)
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) - done with 'udm_config_entity_lookup_entity_generate_node'

The process pipeline is part of the node migration from the entity_generate plugin example. In the code snippet, a debug step is added after each plugin in the chain. That way, you can verify that the transformations are happening as expected. In the last step you get an array of the taxonomy term IDs (tid) that will be associated with the field_tags field. Note that this plugin accepts two optional parameters:

  • label is a string to print before the debug output. It can be used to give context to what is being printed.
  • multiple is a boolean that when set to true signals the next plugin in the pipeline to process each element of an array individually. The functionality is similar to the multiple_values plugin provided by Migrate Plus.

Using the right tool for the job: a debugger

Many migration issues can be solved by following the recommendations from the previous article and the tools provided by Migrate Devel. But there are problems so complex that you need a full-blown debugger. The many layers of abstraction in Drupal, and the fact that multiple modules might be involved in a single migration, makes the use of debuggers very appealing. With them, you can step through each line of code across multiple files and see how each variable changes over time.

In the next article, we will explain how to configure XDebug to work with PHPStorm and DrupalVM. For now, let’s consider where the good places to add breakpoints are. In this article, Lucas Hedding recommends adding them in:

  • The import method of the MigrateExecutable class.
  • The processRow method of the MigrateExecutable class.
  • The process plugin if you know which one might be causing an issue. The transform method is a good place to set the breakpoint.

The use of a debugger is no guarantee that you will find the solution to your issue. It will depend on many factors, including your familiarity with the system and how deep the problem lies. Previous debugging experience, even if not directly related to migrations, will help a lot. Do not get discouraged if it takes you too much time to discover what is causing the problem or if you cannot find it at all. Each time, you will get a better understanding of the system.

Adam Globus-Hoenich, a migrate maintainer, once told me that the Migrate API "is impossible to understand for people that are not migrate maintainers." That was after spending about an hour together trying to debug an issue and failing to make it work. I mention this not with the intention to discourage you, but to illustrate that no single person knows everything about the Migrate API, and even its maintainers can have a hard time debugging issues. Personally, I have spent countless hours in the debugger tracking how the data flows from the source to the destination entities. It is mind-blowing, and I barely understand what is going on. The community has come together to produce a fantastic piece of software. Anyone who uses the Migrate API is standing on the shoulders of giants.

If it is not broken, break it on purpose

One of the best ways to reduce the time you spend debugging an issue is having experience with a similar problem. A great way to learn is by finding a working example and breaking it on purpose. This will let you get familiar with the requirements and assumptions made by the system and the errors it produces.

Throughout the series, we have created many examples. We have made our best effort to explain how each example works, but we were not able to document every detail in the articles, in part to keep them within a reasonable length, but also because we do not fully comprehend the system. In any case, we highly encourage you to take the examples and break them in every imaginable way. Make one change at a time, see how the migration behaves and what errors are produced. These are some things to try:

  • Do not leave a space after a colon (:) when setting a configuration option. Example: id:this_is_going_to_be_fun.
  • Change the indentation of plugin definitions.
  • Try to use a plugin provided by a contributed module that is not enabled.
  • Do not set a required plugin configuration option.
  • Leave out a full section like source, process, or destination.
  • Mix the upper and lowercase letters in configuration options, variables, pseudofields, etc.
  • Try to convert a migration managed as code to configuration; and vice versa.

The migrate:fields-source Drush command

Before wrapping up the discussion on debugging migrations, let’s quickly cover the migrate:fields-source Drush command. It lists all the fields available in the source that can be used later in the process section. Many source plugins require that you manually set the list of fields to fetch from the source. Because of this, the information provided by this command is redundant most of the time. However, it is particularly useful with CSV source migrations. The CSV plugin automatically includes all the columns in the file. Executing this command will let you know which columns are available. For example, running drush migrate:fields-source udm_csv_source_node produces the following output in the terminal:

$ drush migrate:fields-source udm_csv_source_node
 -------------- -------------
  Machine Name   Description
 -------------- -------------
  unique_id      unique_id
  name           name
  photo_file     photo_file
  book_ref       book_ref
 -------------- -------------

The migration is part of the CSV source example. By running the command you can see that the file contains four columns. The values under "Machine Name" are the ones you are going to use for field mappings in the process section. The Drush command has a --format option that lets you change the format of the output. Execute drush migrate:fields-source --help to get a list of valid formats.
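
For example, if you prefer machine-readable output, something along these lines should work (the exact set of supported formats depends on your Drush version):

$ drush migrate:fields-source udm_csv_source_node --format=json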

What did you learn in today’s blog post? Have you ever used the migrate devel module for debugging purposes? What is your strategy when using a debugger like XDebug? Any debugging tips that have been useful to you? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 06 2019
Sep 06

To get started, you will need a Drupal 8 site. If you don’t have one, you can create a free Drupal 8 development site on Pantheon. Once you have your Drupal 8 site installed, make sure you have some Article content. You will also need to enable the JSON:API module.

API Module

That is it for setup on the Drupal site. The next step is to install the Gatsby Source Drupal plugin.

npm install --save gatsby-source-drupal

Or

yarn add gatsby-source-drupal

Next open the gatsby-config.js file so we can configure the plugin. Add the following code to the Gatsby config.

Gatsby Drupal Config
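
Since the configuration screenshot is not reproduced here, the following is a minimal sketch of what the gatsby-source-drupal entry in gatsby-config.js might look like; the baseUrl is a placeholder that you should replace with your own Drupal site's address:

// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: `gatsby-source-drupal`,
      options: {
        baseUrl: `https://your-drupal-site.pantheonsite.io/`,
        // apiBase is optional and defaults to "jsonapi", the JSON:API path prefix.
        apiBase: `jsonapi`,
      },
    },
  ],
};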

If you re-run your Gatsby development server, you will be able to open the GraphiQL explorer at http://localhost:8000/___graphql and explore the data your Drupal site is providing. In this case, I am filtering by the article content type. You can use the Explorer panel on the left to build the query and the play button at the top to run the query.

GraphQL Explorer

Now that you know your Gatsby site can see your Drupal data, we need to consume this data during the Gatsby build process and create pages. In this case, we want to create a page on our Gatsby site for each Article. We want to make sure we use the path that was configured in Drupal as the path to our Article. Anytime we want to create pages during the Gatsby build process, we can do so in the gatsby-node.js file. Go ahead and open that file up.

Add the following code to that file:

const path = require('path');
 
exports.createPages = async ({ actions, graphql }) => {
 const { createPage } = actions;
 
 const articles = await graphql(`
   {
     allNodeArticle {
       nodes {
         id
         title
         path {
           alias
         }
       }
     }
   }
 `);
 
 articles.data.allNodeArticle.nodes.map(articleData =>
   createPage({
     path: articleData.path.alias,
     component: path.resolve(`src/templates/article.js`),
     context: {
       ArticleId: articleData.id,
     },
   })
 );
}

This code does a few things. First, it uses GraphQL to pull in the id and path of all the Articles from your Drupal site. It then loops through this list and calls Gatsby’s createPage method, which will create a Gatsby page for each Article. We make sure we pass in the correct path and a template (which we still need to create). We also pass in the Article id as the context. You will see why this is important in a few minutes.

Create the src/templates folder if it doesn’t already exist, then create a file called article.js. Add the following code to this new article.js file:

import React from 'react';
import PropTypes from 'prop-types';
import { graphql } from 'gatsby';
 
import Layout from '../components/layout';
 
const Article = ({ data }) => {
 const post = data.nodeArticle;
 
 return (
   <Layout>
     <h1>{post.title}</h1>
     <img 
        src={data.nodeArticle.relationships.field_image.localFile.publicURL}
        alt={data.nodeArticle.field_image.alt}
      />
 
     <div
       dangerouslySetInnerHTML={{ __html: post.body.processed }}
     />
   </Layout>
 );
};
 
Article.propTypes = {
 data: PropTypes.object.isRequired,
};
 
export const query = graphql`
 query($ArticleId: String!) {
   nodeArticle(id: { eq: $ArticleId }) {
     id
     title
     body {
       processed
     }
     field_image {
        alt
      }
     relationships {
        field_image {
          localFile {
            publicURL
          }
        }
      }
 
   }
 }
`;
 
export default Article;

This might seem a bit confusing at first. The first section contains the React component called Article. It receives a data prop which contains the Article data, and it outputs the title and the body text in a Layout component.

The second part contains the propTypes. This just says that our Article component will receive a prop called data that will be an object and it will be required. This is just a way to validate our props.

The third part is where it gets a bit more confusing. As you already know, we ran one query in the gatsby-node.js file to get the data, but here we are also running a page query. When using Drupal as a backend, it’s useful for each template to run its own page query so it can build the page in a self-contained manner. This is especially important when you want to implement live preview (which we will cover in the future). This query takes the id and loads additional Article data such as the title and the body field. Notice that the relationships field has to be used to pull in the actual image.

Another thing to note here is that this is not the best way to pull in images. It’s recommended to use the Gatsby Image component, but that makes the GraphQL a little more complicated so we will revisit that in more depth in the future.

If you shut down and re-run your Gatsby development server, it should create the article pages at the same paths they have on your Drupal site. There is no listing page (another thing we will fix in the future), but you can manually paste one of the known paths into the address bar to see your Article content on your Gatsby site!

Gatsby Article Page

Sep 06 2019
Sep 06

I'm excited to share that when Drupal 8.8 drops in December, Drupal's WYSIWYG editor will allow media embedding.

You may wonder: Why is that worth announcing on your blog? It's just one new button in my WYSIWYG editor.

It's a big deal because Drupal's media management has been going through a decade-long transformation. The addition of WYSIWYG integration completes the final milestone. You can read more about it on Wim's blog post.

Drupal 8.8 should ship with complete media management, which is fantastic news for site builders and content authors who have long wanted a simpler way to embed media in Drupal.

Congratulations to the Media Initiative team for this significant achievement!

Sep 06 2019
Sep 06

 


Today, IT security is paramount to succeeding in business. Enterprises are spending heftier amounts on security than ever before. Progress in both security and hacking technologies, such as intrusion detection systems, honey pots, honeynets, and various other security-related hardware and software solutions, is showcasing the pressing need for transformation in the information security domain.

One report by Gartner cited that enterprises in India alone are going to spend heavily on the information security front, amounting to US$2 billion in 2020.

The increasing awareness of the benefits of risk assessment, and the realization that security is one of the driving forces for digital transformation, are boosting enterprise security globally. 

The battle between open-source and proprietary software has been raging for a long time. Multiple issues and concerns have been examined and scrutinized by both sides of the story. In the most recent phase of this fanatical dispute, both camps have inspected the issue of security with serious tenacity.

Having said that, let’s take a sneak peek into this blog for further insights on the same.

Myths Are Meant to Be Debunked

It is often claimed that proprietary software is more secure than open-source software. This myth comes from many prejudices. But a commercial license doesn’t assure security. Unlike proprietary software, open-source software is transparent about potential vulnerabilities.

#Myth1: Anyone can view the code 

Because it is open source, anyone can view the code. People often want to argue that being able to view the code allows nefarious hackers to look at it and exploit vulnerabilities.

However, this openness enables collaboration. Unlike, say, a proprietary product, which is developed and maintained by a single company, Drupal is developed and maintained by more than one hundred thousand programmers around the world. These programmers might work for companies that compete with each other, or they might volunteer to create something new that’s then given away. For free.


In fact, in 2015 Google open-sourced its artificial intelligence engine, TensorFlow, something which is a core part of its business. It hoped more developers would make the software better as they adapted it to their own needs. And it did: by making it open source, Google can boast that more than 1,300 developers outside Google have worked on TensorFlow, making it one of the standard frameworks for developing AI applications, which could bolster its cloud-hosted AI services. 

#Myth2: Proprietary software is secure and not prone to attacks

There have been multiple instances in the past showing that proprietary software has been attacked several times, such as:

Melissa Virus and ILoveYou Worm - spread through Microsoft Word email attachments. If the victim’s system had the Microsoft Outlook application installed, the virus would send the email to 50, or even all, contacts in the Outlook program’s address book. It would also overwrite and consequently destroy various types of files on the victim’s device, including MP3 files, JPEG files, and more. It led Microsoft to shut down its inbound email system.

Wannacry - a worldwide cyberattack that took place in 2017. It was a ransomware cryptoworm attack aimed at computers using Windows operating systems, encrypting all the files on these machines' hard drives. It didn’t let users access the files until they paid a ransom in the cryptocurrency Bitcoin.

The WannaCry attack impacted major entities all over the world, such as the National Health Service in Britain and Scotland, the University of Montreal in Canada, State Government websites in India, and Russian Railways.

With that said, it's evident that proprietary software is also easily vulnerable to attacks!

Although countermeasures like anti-virus programs and security patches were implemented to mitigate the threats and weaknesses, the long-term and especially exorbitant effects of these dangers have been permanently engraved into the memories of people all over the world. This is because these attacks not only damaged vital electronic data but also shut down business operations and services, and facilitated malicious infiltration and theft of money and proprietary information.

History of Open source Software

The term “open-source”, popular since its inception in the late 70s and early 80s, comes from the “open-source revolution”, which completely revamped the way software is developed, resulting in the birth of the community-generated software development method.


In 1971, Richard Stallman, a young software engineer from Harvard, joined the MIT Artificial Intelligence Lab with the intent of developing computing platforms. In the early 1980s, after he had served there for a few years, the MIT lab withered away as proprietary software boomed in the market and the lab lost its talented developers to privately held tech companies.

Stallman, who was closely involved in the field and knew customers’ software requirements, believed customers should be empowered enough to fix and debug the software themselves instead of simply operating it.

“Users should be empowered enough to fix and debug the software themselves, instead of simply operating it”

The majority of software until then was controlled in its entirety by the developer, and individual user rights were completely discarded. This was also a pain point for the MIT AI Lab, since they failed to incorporate this aspect into their software development strategies.

The Disembarkation of the Free Software Movement

But this was only until 1984, when Stallman began his GNU Project. Starting with a compiler, GCC, and a new operating system, Stallman felt that the GNU Project was the major turning point in the evolution of the free software community.

“The Free Software Foundation was formulated to let users run the software as they wanted”

Stallman believed that software should be available for free in terms of accessibility. Hence, the Free Software Foundation (FSF) was formulated so that users could run, modify, update, and disseminate software in the community.

Later on, he also introduced the concept of copyleft, wherein a program is first copyrighted, and then additional distribution terms are added for its further use.

Challenges Associated With Proprietary CMS 

A proprietary CMS comes with a set of restrictions which make it less flexible in comparison to open-source software. 

“The contribution and development teams of proprietary cms are smaller, which makes it evident that there is a probability of missing out on mistakes and bugs in the code”

It might appear that closed-source or proprietary software is more secure since the code is not available. But unfortunately, that is not the case! The contribution and development teams of a proprietary CMS are smaller, which means there is a higher probability of missing mistakes and bugs in the code.

You might not know what issues the proprietary system has had in the past, or is having currently, because the provider of the proprietary CMS isn’t going to voluntarily reveal this information. This is a major drawback for proprietary CMS users in terms of security as well.

Let’s further see the challenges associated with proprietary CMS-

Not many customization options

Since these proprietary CMSs are developed for a specific kind of industry and audience, it gets difficult to customize the website to fit the exact needs of the people. Users are not building their own system, so it's obvious that they will have limited flexibility options.

Portability is beyond the bounds of possibility

Users don’t have an option to extract data and files out of their system with a proprietary solution. They are quite restricted because they won’t be able to even move their website from one hosting service to another.

“Several CMS vendors don’t upgrade their platforms, so it's better to do a bit of research first and then jump onto doing business with a vendor”

You don’t have any option other than trusting the company blindly

Since the company owns the platform and the storage space your website will be built upon, you’ll have to place a lot of trust in your vendor. They will have to continuously develop and refine their software to handle their consumers’ needs better. The vendor should also be within reach whenever you need assistance with your website.

Several CMS vendors don’t upgrade their platforms, so it's better to do a bit of research first and then jump onto doing business with a vendor.

You are just renting software

Even if you have bought the proprietary CMS, you won’t own the code it's built with. It is not yours, and hence you pay a monthly rent to keep your website running.

Benefits of Open-source Software

“People in the open-source community come forward to find solutions, assist each other, and to share extensions that would benefit the masses”

  • It is open-source!

This implies that the source code is available for anyone who wishes to study it, analyze it, and modify it in any way.

Thanks to this feature, people can easily extend the code and add specific functionalities as per their requirements.

  • An open-source CMS is maintained by a large community

There is always a primary group of developers, as with WordPress for example, but the project is also supported by its user base. People in the open-source community come forward to find solutions, assist each other, and share extensions that benefit the masses.

[Bar chart. Source: Sas.com]

  • An open-source CMS can be hosted ubiquitously

Most of them, like Drupal, offer one-click installs in the control panel of the accompanying hosting service, which again is very user-friendly and convenient.

  • The CMS software itself is usually free of cost

You can make use of plenty of extensions, themes, and a variety of tools for free. There are plenty of paid extensions and themes as well, and some solutions can only be leveraged with paid software. Even so, an open-source CMS is usually the most budget-friendly solution.

Alternatives to Proprietary Software

It is interesting to see how many open-source alternatives exist for established proprietary software that are equally or more reliable, secure, and flexible.

If you are contemplating a migration from proprietary software to open-source, you certainly can make the switch, and with ease!

Software Category | Proprietary Software | Equivalent Open-source Software
Operating System | Microsoft Windows | Ubuntu Linux
Browser | Internet Explorer | Mozilla Firefox
Office automation | Microsoft Office | OpenOffice
Numerical computing | MATLAB (MathWorks) | Scilab
Graphics tool | Adobe Photoshop | GIMP (GNU Image Manipulation Program)
Drafting tool | AutoCAD | Archimedes
Web editors | Adobe Dreamweaver | NVU
Desktop publishing | Adobe Acrobat | PDFCreator
Blogs | Blogger | WordPress
Mobile | iOS | Android
Media player | Windows Media Player | VLC Player
Database | Oracle, Microsoft SQL Server | MySQL, MongoDB, Hadoop
Server | Microsoft Windows Server | Red Hat Server, Ubuntu Server
Web server | IIS | Apache


Open-source Security in Drupal

Drupal, with a proven track record as the most secure CMS, has rolled with the punches against critical internet vulnerabilities, thanks to the Drupal Security Team earnestly finding anomalies, validating them, and responding to security issues.

The security team is also responsible for documenting the issues identified and the changes made, so that developers don’t get the heebie-jeebies when faced with a similar kind of situation.

“The Drupal community comprises over 100,000 contributors working towards its enhancement”

Besides, the team also assists the infrastructure team in keeping the Drupal.org infrastructure secure. They ensure that any security issues in code hosted on Drupal.org are reported, reviewed, and resolved in the shortest period possible.

Important features that make Drupal 8 the best WCMS with regard to security:

  • The Security Working Group (SecWG) ensures that Drupal core and Drupal’s contributed project ecosystem provide a secure platform and that best practices are followed.
  • The community makes sure that people are notified the day patches are released: every Wednesday for contributed projects, and the third Wednesday of every month for core, within a fixed release window.
  • Drupal abides by the OWASP (Open Web Application Security Project) standards, and its community is devoted to preventing security breaches.
  • The Drupal community comprises over 100,000 contributors. In this open-source code base, contributed modules are properly reviewed and verified, and a notification indicates whether a module is acceptable for use.
  • Apart from salting and hashing passwords, Drupal provides modules that support two-step authentication and SSL certificates.
  • Any member can contribute changes to Drupal modules and report any issues or bugs that occur in their system.
  • The access control offered by Drupal is a superb feature. Dedicated accounts can be created for specific user roles with specified permissions. For instance, you can create separate user accounts for an Admin and an Editor.
  • Its multi-layered cache structure helps mitigate Denial of Service (DoS) attacks and makes it the best CMS for some of the world’s highest-traffic websites, such as NASA, the University of Oxford, the Grammys, and Pfizer.

Statistics Says It All

Sucuri, a website security platform, curated the “Hacked Website Report 2018”, which evaluated more than 34,000 compromised websites. Among the statistics it shared was a comparison of the affected open-source CMS applications.

[Chart: infected websites by CMS platform. Source: Sucuri, Hacked Website Report 2018]

The results were clearly in Drupal’s favor, showing it to be a better WCMS than other leading platforms at preventing security hazards.

The infections crept into these websites due to improper deployment, configuration, and maintenance.

Additionally, the Cloud Security Report by Alert Logic also marked Drupal as the web content management system with the lowest number of web application attacks.

[Table: web application attacks by platform. Source: Alert Logic]

Difference Between Open-source and Proprietary Software

Cost

Open-source: Open-source software is free, which makes it an alluring option if you have the in-house capacity to meet your business requirements.

Proprietary: Proprietary software costs anywhere from a couple of thousand dollars to a hundred thousand dollars, depending on the complexity of the system needed.

Service and support

Open-source: Open-source communities of developers are huge and steadfast, which helps users get prompt solutions to their problems.

Proprietary: Proprietary software vendors offer ongoing support to clients, a key selling point for clients without technical expertise.

Innovation

Open-source: Open-source software boosts innovation by giving users the opportunity to modify, extend, or distribute it as per their requirements.

Proprietary: Proprietary software vendors don’t permit users to view or adjust the source code, making it unfit for organizations that desire scalability and flexibility. Only the vendor’s developers can add new features to the product as and when requested by users.

Security

Open-source: As open-source code is available to everybody, more vulnerabilities are found more easily. It is also worth noting that open-source communities fix security vulnerabilities twice as quickly as commercial software vendors do.

Proprietary: Proprietary software is considered secure because it is developed in a controlled environment by employees under consistent direction. However, ruling out the possibility of backdoor Trojans, as well as lowering the threat of other bugs or obstacles, can be troublesome with proprietary software.

Availability

Open-source: Open-source software is available for free on the web, with 24/7 support from the community.

Proprietary: Proprietary software is accessible if companies have the rights to the bundle or have purchased it from the respective vendors. A trial version is also often available to test for free.

Flexibility

Open-source: As organizations aim at deriving more business value from less, open-source software can deliver high flexibility, lower IT costs, and increased opportunities for innovation.

Proprietary: With proprietary software, such as Microsoft Windows and Office, companies are required to upgrade both software and hardware on a timely basis, and updates must be installed for things to keep working properly. However, not all updates are compatible with all versions of the software.

In The End

Website security has always been a stumbling block on the journey of digital transformation and survival, given the many potential threats.

Open-source software can be considered a more fitting solution than closed-source or proprietary software. Further, the report below indicates that there is an obvious desire among companies to adopt open-source technology and to prioritize the task of enhancing security in their organization.

[Report excerpt on open-source adoption. Source: Gartner]

However, it all depends on the preferences and needs of the organization and the ongoing project for its digital business.

Drupal, an open-source content management framework, comes out as the most secure CMS in comparison to the leading players in the market.

It has been the pacesetter when it comes to choosing a security-focused CMS. More individuals working on and reviewing the code always means a higher chance of a secure product!

Sep 06 2019
Sep 06

Diagnosing a website for accessibility and fixing any issues that are uncovered is not a one-size-fits-all endeavor. 

Every site has a distinct set of strengths and challenges. Plus, site owners vary widely in their level of expertise and the availability of resources that can be devoted to accessibility -- which includes diagnosing the site, making the necessary modifications, and ensuring that the tools and expertise are in place to maintain compliance. 

That’s why flexibility is an essential criterion when seeking ADA Section 508 accessibility solutions.

Another key: a consultative approach. Generally speaking, developers and content editors aren’t hired for their knowledge of WCAG 2.1, and for most organizations, this expertise is not mission critical. Tapping the expertise of a Certified Professional in Accessibility Core Competencies (CPACC) is the most efficient and effective path for bringing a site into compliance.

For organizations that partner with Promet Source to transition their websites into compliance, the process consists of a series of value-added steps in a specific order.  

The following graphic depicts the essential steps involved in an Accessibility Audit, in which Promet reviews all facets of a website’s accessibility relative to WCAG 2.1 and consults with site owners on remediation.

A circular graphic that indicates the six steps in a Promet Source Accessibility Audit. 1. PA11Y Setup 2. PA11Y Remediation 3. Round 1 Manual Audit 4. Round 1 Remediation 5. Round 2 Manual Audit 6. Final Statements

PA11Y Setup

A11Y is an abbreviation for accessibility, with the number 11 representing the number of letters between the first and last letter. PA11Y is an automated testing tool that scans web pages to detect accessibility issues. While automated testing is an essential component of the accessibility audit process, it cannot be counted on to be comprehensive.

On average, automated testing detects approximately 30 percent of a site’s accessibility errors. The errors detected by automated testing tend to be the “low-hanging fruit” found within global elements across the site, as well as logins, landing pages, and representative page templates.
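As a rough illustration of what this automated step can look like, the publicly available pa11y command-line tool can be pointed at individual URLs. The commands below describe a typical generic setup rather than Promet’s specific configuration, and the URL is a placeholder:

# Install the pa11y CLI (assumes Node.js and npm are available).
npm install -g pa11y

# Scan a page against WCAG 2 AA and list the issues automated testing can detect.
pa11y --standard WCAG2AA https://www.example.com/

# Produce machine-readable output that could feed a dashboard or CI job.
pa11y --standard WCAG2AA --reporter json https://www.example.com/ > report.json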
 

PA11Y Remediation

What sets Promet apart following this initial, automated testing phase is a high degree of consultation, along with a list of custom code fixes for bringing the site into compliance. Additionally, for a year following the audit, clients have access to a dashboard that serves as a tool from which pages can be scanned and red-flagged for accessibility errors.

It’s also important to point out that at the onset of the audit process, it might not be clear what remediation will entail. For any number of reasons, clients who initially intended to manage the remediation in house might opt for a different approach once they gain an understanding of the scope of work involved.
 

Round 1 Manual Audit 

The manual audit does not occur until all of the issues flagged by the PA11Y scan are fixed. This process is facilitated by the customized code fixes that Promet provides, along with a dashboard that provides a roadmap of sorts for tracking progression and red flagging issues that need to be fixed.  

As mentioned above, the PA11Y scan cannot be counted on to detect all of the accessibility errors on a site. Manual testing is required to root out the deeper errors, which are the issues that have a greater tendency to expose site owners to legal liability. The manual audit includes:

  • Keyboard testing,
  • Color contrast testing,
  • Screen reader testing,
  • Link testing,
  • Tables and forms testing,
  • Cognitive testing, and 
  • Mobile testing.

If a site is found not to be responsive, this finding can result in a recommendation to not move forward with remediation. Another potential remediation deal breaker: a mobile site that is not consistent with the desktop site in terms of content and functionality, as a mobile site is required to have the same content as its desktop counterpart.

It’s important to note that a strong accessibility initiative has been built into Drupal 8, and that will continue to be the case for Drupal 9 and subsequent updates. At this point, we have found Drupal to be the best CMS in terms of accessibility.
 

Round 1 Remediation

Promet is in close consultation with clients during the manual audit, and walks through every component of the success criteria before the client moves forward with Round 1 Remediation.

A customized plan is created that varies according to the depth and breadth of remediation work required, as well as the in-house expertise and available resources. Depending on client needs, the plan can incorporate various levels of consultation, and either online or in-person training.

Working closely with both content editors and developers, the training focuses on the required remediation steps, as well as how to write code that’s accessible. Ensuring the accessibility of PDFs is another key area of focus.  

The remediation dashboard serves as an essential tool during and following Round 1 remediation. The dashboard flags errors and issues warnings which then need to be manually reviewed and addressed.
 

Round 2 Manual Audit

The Round 2 Audit represents the final review, along with ongoing consultation concerning any remediation challenges that have proven to be complex, and best practices for maintaining compliance. The Round 2 Audit won’t begin until all errors reported in the Round 1 Audit have been remediated to 0 errors.
 

Final Statements

Once all recommended remediation has been completed and verified, final statements are prepared. The final statements provide official language that the audit and remediation are complete. A final Statement of Accessibility and Statement of Work Completed will be provided. Optimally, a complete Statement of Conformance is issued, but in instances where the site links to third-party vendors (which is often the case) and the vendor sites are not accessible, a Statement of Partial Conformance is issued, along with an explanation of the site owner’s good-faith efforts.

It is recommended that instances of inaccessibility be reported to the third parties that are linked to the site. Often the result is ongoing remediation work and, ultimately, a comprehensive Statement of Conformance.
 

Moving Forward

Without exception, Promet clients report a high degree of added value during and following an accessibility audit. The education, consultation, and opportunity to dig deep and deconstruct aspects of a site that no longer serve the organizational mission fuel a better and wiser team of developers and content editors. Plus, the dashboard, which remains in place for a full year, is an essential resource for staying on track.

In the current climate, websites are highly dynamic and serve as the primary point of engagement for customers and constituents. Constantly evolving sites call for an ongoing focus on accessibility, and an acknowledgement that staff turnover can erode the education, expertise, and commitment to accessibility that is in place at the conclusion of an audit. For this reason, a bi-annual or annual audit, which can be viewed essentially as an accessibility refresh, is a highly recommended best practice. Interested in kicking off a conversation about auditing your site for accessibility? Contact us today.

Sep 05 2019
Sep 05

Throughout the series we have shown many examples, and I do not recall any of them working on the first try. When working on Drupal migrations, it is often the case that things do not work right away. Today’s article is the first of a two-part series on debugging Drupal migrations. We start by giving some recommendations of things to do before diving deep into debugging. Then, we talk about migration messages and present the log process plugin. Let’s get started.

Example configuration for log process plugin.

Minimizing the surface for errors

The Migrate API is a very powerful ETL framework that interacts with many systems provided by Drupal core and contributed modules. This adds layers of abstraction that can make the debugging process more complicated compared to other systems. For instance, if something fails with a remote JSON migration, the error might be produced in the Migrate API, the Entity API, the Migrate Plus module, the Migrate Tools module, or even the Guzzle HTTP client library that fetches the file. For a more concrete example, while working on a recent article, I stumbled upon an issue that involved three modules. The problem was that when trying to roll back a CSV migration from the user interface, an exception would be thrown, making the operation fail. This is related to an issue in the core Migrate API that manifests itself when rollback operations are initiated from the interface provided by Migrate Plus. That issue, in turn, triggers a condition in the Migrate Source CSV plugin that fails, and the exception is thrown.

In general, you should aim to minimize the surface for errors. One way to do this is by starting the migration with the minimum possible setup. For example, if you are going to migrate nodes, start by configuring the source plugin, one field (the title), and the destination. When that works, keep migrating one field at a time. If the field has multiple subfields, you can even migrate one subfield at a time. Commit every bit of progress to version control so you can go back to a working state if things go wrong. Read this article for more recommendations on writing migrations.
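To illustrate that minimal starting point, here is a sketch of a node migration reduced to a source, a single processed field, and a destination. The machine name, field names, and sample rows are placeholders invented for this example; the embedded_data source plugin from core is used only because it lets us define rows inline:

id: minimal_node
label: 'Minimal node migration'
source:
  plugin: embedded_data
  data_rows:
    - src_title: 'First example node'
    - src_title: 'Second example node'
  ids:
    src_title:
      type: string
process:
  # Start with a single field mapping and add more fields one at a time.
  title: src_title
  type:
    plugin: default_value
    default_value: article
destination:
  plugin: 'entity:node'

Once this imports correctly, add the next field, run the migration again, and commit.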

What to check first?

Debugging is a process that might involve many steps. There are a few things that you should check before diving too deep into trying to find the root of the problem. Let’s begin with making sure that changes to your migrations are properly detected by the system. One common question I see people ask is where to place the migration definition files. Should they go in the migrations or the config/install directory of your custom module? The answer depends on whether you want to manage your migrations as code or as configuration. Your choice will determine the workflow to follow for changes in the migration files to take effect. Migrations managed as code go in the migrations directory and require rebuilding caches for changes to take effect. On the other hand, migrations managed as configuration are placed in the config/install directory and require a configuration synchronization for changes to take effect. So, make sure to follow the right workflow.
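As a rough sketch of the two workflows, assuming Drush is available and the custom module is called my_module (a hypothetical name), the commands look something like this:

# Migrations managed as code (definition files in my_module/migrations/):
drush cache:rebuild

# Migrations managed as configuration (files in my_module/config/install/).
# One common approach is to re-import just that module's configuration:
drush config:import --partial --source=modules/custom/my_module/config/install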

After verifying that your changes are being applied, the next thing to do is verify that the modules that provide your plugins are enabled and that the plugins themselves are properly configured. Look for typos in the configuration options. Always refer to the official documentation to know which options are available and to find their proper spelling. Other places to look are the code for the plugin definition and articles like the ones in this series documenting how to use them. Things to keep in mind include proper indentation of the configuration options. An extra whitespace or a wrong indentation level can break the migration. You can either get a fatal error or the migration can fail silently without producing the expected results. Something else to be mindful of is the version of the modules you are using, because the configuration options might change per version. For example, the newly released 8.x-3.x branch of Migrate Source CSV changed various configuration options as described in this change record. And the 8.x-5.x branch of Migrate Plus changed some configuration options for plugins related to DOM manipulation as described in this change record. Keeping an eye on the issue queues and change records for the different modules you use is always a good idea.

If the problem persists, look for reports of similar problems in the issue queue. Make sure to include closed issues as well, in case your problem has already been fixed or documented. Remember that a problem in one module can affect a different module. Another place to ask questions is the #migrate channel in the Drupal Slack. The support that is offered there is fantastic.

Migration messages

If nothing else has worked, it is time to investigate what is going wrong. In case the migration outputs an error or a stacktrace to the terminal, you can use that to search in the code base where the problem might originate. But if there is no output or if the output is not useful, the next thing to do is check the migration messages.

The Migrate API allows plugins to log messages to the database in case an error occurs. Not every plugin leverages this functionality, but it is always worth checking if a plugin in your migration wrote messages that could give you a hint of what went wrong. Some plugins like skip_on_empty and skip_row_if_not_set even expose a configuration option to specify messages to log. To check the migration messages use the following Drush command: drush migrate:messages [migration_id]. If you are managing migrations as configuration, the interface provided by Migrate Plus also exposes them.

Messages are logged separately per migration, even if you run multiple migrations at once, which could happen if you execute dependencies or use groups or tags. In those cases, errors might be produced in more than one migration, and you will have to look at the messages for each of them individually.

Let’s consider the following example. In the source there is a field called src_decimal_number with values like 3.1415, 2.7182, and 1.4142. We need to separate each number into two components: the integer part (3) and the decimal part (1415). For this, we are going to use the explode process plugin. Errors will be purposely introduced to demonstrate the workflow to check messages and update migrations. The following example shows the process plugin configuration and the output produced by trying to import the migration:

# Source values: 3.1415, 2.7182, and 1.4142

psf_number_components:
  plugin: explode
  source: src_decimal_number
$ drush mim ud_migrations_debug
[notice] Processed 3 items (0 created, 0 updated, 3 failed, 0 ignored) - done with 'ud_migrations_debug'

In MigrateToolsCommands.php line 811:
ud_migrations_debug Migration - 3 failed.

The error produced in the console does not say much. Let’s see if any messages were logged using: drush migrate:messages ud_migrations_debug. In the previous example, the messages will look like this:

 ------------------- ------- --------------------
  Source IDs Hash    Level   Message
 ------------------- ------- --------------------
  7ad742e...732e755   1       delimiter is empty
  2d3ec2b...5e53703   1       delimiter is empty
  12a042f...1432a5f   1       delimiter is empty
 ------------------------------------------------

In this case, the migration messages are good enough to let us know what is wrong. The required delimiter configuration option was not set. When an error occurs, usually you need to perform at least three steps:

  • Roll back the migration. This will also clear the messages.
  • Make changes to the definition file and make sure they are applied. This will depend on whether you are managing the migrations as code or configuration.
  • Import the migration again. (A sample command sequence for this cycle is sketched below.)
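For a migration managed as code, the cycle roughly looks like the following. The migration ID matches the example used in this article; adjust the commands to your own setup:

drush migrate:rollback ud_migrations_debug
# Edit the migration definition file, then rebuild caches so the change is picked up.
drush cache:rebuild
drush migrate:import ud_migrations_debug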

Let’s say we performed these steps, but we got an error again. The following snippet shows the updated plugin configuration and the messages that were logged:

psf_number_components:
  plugin: explode
  source: src_decimal_number
  delimiter: '.'
 ------------------- ------- ------------------------------------
  Source IDs Hash    Level   Message
 ------------------- ------- ------------------------------------
  7ad742e...732e755   1       3.1415000000000002 is not a string
  2d3ec2b...5e53703   1       2.7181999999999999 is not a string
  12a042f...1432a5f   1       1.4141999999999999 is not a string
 ----------------------------------------------------------------

The new error occurs because the explode operation works on strings, but we are providing numbers. One way to fix this is to update the source to add quotes around the number so it is treated as a string. This is of course not ideal and many times not even possible. A better way to make it work is setting the strict option to false in the plugin configuration. This will make sure to cast the input value to a string before applying the explode operation. This demonstrates the importance of reading the plugin documentation to know which options are at your disposal. Of course, you can also have a look at the plugin code to see how it works.

Note: Sometimes the error produces a non-recoverable condition. The migration can be left in a status of "Importing" or "Reverting". Refer to this article to learn how to fix this condition.

The log process plugin

In the example, adding the extra configuration option will make the import operation finish without errors. But how can you be sure the expected values are being produced? Not getting an error does not necessarily mean that the migration works as expected. It is possible that the transformations being applied do not yield the values we expect, or not in the format that Drupal expects. This is particularly true if you have complex process plugin chains. As a reminder, we want to separate a decimal number from the source, like 3.1415, into its components: 3 and 1415.

The log process plugin can be used for checking the outcome of plugin transformations. This plugin offered by the core Migrate API does two things. First, it logs the value it receives to the messages table. Second, the value is returned unchanged so that it can be used in process chains. The following snippets show how to use the log plugin and what is stored in the messages table:

psf_number_components:
  - plugin: explode
    source: src_decimal_number
    delimiter: '.'
    strict: false
  - plugin: log
 ------------------- ------- --------
  Source IDs Hash    Level   Message
 ------------------- ------- --------
  7ad742e...732e755   1       3
  7ad742e...732e755   1       1415
  2d3ec2b...5e53703   1       2
  2d3ec2b...5e53703   1       7182
  12a042f...1432a5f   1       1
  12a042f...1432a5f   1       4142
 ------------------------------------

Because the explode plugin produces an array, each of the elements is logged individually. And sure enough, in the output you can see the numbers being separated as expected.

The log plugin can be used to verify that source values are being read properly and that process plugin chains produce the expected results. Use it as part of your debugging strategy, but make sure to remove it when you are done with the verifications. It makes the migration run slower because it has to write to the database. The overhead is not needed once you verify things are working as expected.

In the next article, we are going to cover the Migrate Devel module, the debug process plugin, recommendations for using a proper debugger like XDebug, and the migrate:fields-source Drush command.

What did you learn in today’s blog post? What workflow do you follow to debug a migration issue? Have you ever used the log process plugin for debugging purposes? If so, how did it help to solve the issue? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Next: How to debug Drupal migrations - Part 2

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors: Drupalize.me by Osio Labs has online tutorials about migrations, among other topics, and Agaric provides migration trainings, among other services.  Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Sep 05 2019
Sep 05

We love to say that Drupal 8’s logo resembles the infinity sign, which means infinite opportunities for websites. It includes plenty of ways to make your website user-friendly and engaging. 

One of the techniques used in this area is infinite scrolling, which can be implemented through a nice Drupal 8 module called Views Infinite Scroll. Let’s see what infinite scrolling is and how to design it with the help of this Drupal 8 module. 

A glimpse at the technique: what is infinite scrolling? 

Infinite scrolling means that content loads continuously as the user scrolls down the page. It can optionally be accompanied by a “load more” button at the bottom of the page, which loads more content on each click.

Endless pages make the user’s interaction with the website more natural, because it is convenient not to have to click through to the next page. This technique is especially popular with content-rich websites, social networks, e-commerce stores, etc. It is incredibly useful for long pages and mobile navigation.

However, infinite scrolling should be used carefully so it does not annoy the user, distract their attention from their main goal, or block the calls-to-action. For example:

  • If the footer “disappears” together with your contact details every time your user scrolls, consider using a sticky footer or moving the key links to the sidebar.
  • Try the “load more” button to give the user more control and never block anything.
  • You can also add more usability by letting the user choose the number of displayed items before hitting “load more.”

All this and more is provided by the Views Infinite Scroll module in Drupal 8, which we will now move on to.

Introduction to the Views Infinite Scroll module in Drupal 8

The Drupal 8 Views Infinite Scroll module works with Drupal Views. This allows you to present any Drupal data you wish — collections of images, articles, products, lists of user profiles, or anything else.

You can:

  • let the content be infinitely uploaded upon the scroll
  • add a “load more” button with any text on it
  • expose some viewing options to users

The Views Infinite Scroll module is available both for Drupal 7 and Drupal 8. However, the Drupal 8 version has a special bonus — it uses the built-in AJAX system of Drupal Views and requires no third-party libraries. In the next chapter, we will look more closely at how it works.

How the Views Infinite Scroll Drupal module works

Installing the module

We start by installing the Views Infinite Scroll Drupal module in any preferred way and enabling it. In our example, we are using its 8.x-1.5 version.
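If you prefer the command line, a typical installation (sketched here with Composer and Drush; adjust to your own workflow) looks like this:

composer require drupal/views_infinite_scroll
drush en views_infinite_scroll -y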

Installing the Views Infinite Scroll module

Creating the Drupal view

Now let’s prepare our Drupal view with a few elements in it. In our case, the grid view will show two columns of car images. 

When creating the page, we choose the “create a page” option. The default “Use a pager” and “10 items to display” settings can remain unchanged so far — we will take care of this in the next step.

Drupal 8 Views

Setting up the Drupal Views infinite scrolling

On the Drupal Views dashboard, we select the Pager section and open the “Use pager — mini” option.

Setting Drupal 8 Views pager to infinite scroll

There, we switch the pager to “Infinite Scroll.”

Setting Drupal 8 Views pager to infinite scroll

Next, we configure the settings for the Infinite Scroll. We set the number of items per page to 4.

And, most importantly, we can check or uncheck the automatic loading of content. If we uncheck it, there will be a “load more” button, for which we can write our custom text. “View more luxury cars” sounds good for this example.

Configuring infinite scroll in Drupal 8 Views
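Behind the scenes, these settings are stored in the view’s exported configuration. As a rough sketch (the exact key names can vary between module versions), the pager section of the exported views.view.*.yml ends up looking something like this:

pager:
  type: infinite_scroll
  options:
    items_per_page: 4
    views_infinite_scroll:
      # Leave automatic loading off to get the "load more" button instead.
      button_text: 'View more luxury cars'
      automatically_load_content: false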

After saving the view and checking the page, we see 4 cars per page with a nice “View more luxury cars” button. Success!

View more button in Drupal 8 Views

Exposing choices to users

In our Infinite Scroll settings, there is the “Exposed options” section. By checking its options, you will allow users to:

  • choose the number of items displayed
  • see all items
  • specify the number of items skipped from the beginning

Exposed options in Drupal 8 infinite scroll

With these applied, our collection now looks like this.
Infinite scroll in Drupal 8 Views with exposed options

Additional CSS tweaks will make the view look exactly the way you want as far as colors, fonts, distances between elements, etc.

Apply infinite scrolling on your website

Make your customer satisfaction infinite through the use of the infinite scroll technique! So if you want to:

  • install and configure the Views Infinite Scroll module
  • customize the output with CSS
  • create a custom Drupal module for your ideas in scrolling
  • design the scrolling effect from scratch using the latest techniques

contact our Drupal team!

Sep 05 2019
Sep 05

The Drupal 8 Field Defaults module is a handy little module that allows you to bulk update the default values of a Drupal field. This is helpful if you have ever added a field to a content type or entity and wished you could have the default value apply to all the existing content on your Drupal site.

Download and install the Field Defaults module just like any other Drupal 8 module.

You can visit the configuration page by going to Admin > Configuration > System > Field Defaults settings. The only setting is the ability to retain the original entity updated time when default values are updated.

Field Default Module Settings

Navigate to a Content Type and go to Manage Fields. Edit one of the fields on the content type. In the screenshot below, I am editing a text field called Example. You will notice under the default value section there is a new fieldset called Update Existing Content. If you needed to change the default value and wanted it to apply to all of your existing content on the site, you would use the checkboxes to update the defaults.

Field Default Module Update Content

That’s it! There really is not a lot to it, but it’s useful when you are adding new fields to existing sites.

Sep 04 2019
Sep 04

Vulnerability: 

Various 3rd Party Vulnerabilities

Description: 

In June of 2011, the Drupal Security Team issued Public Service Advisory PSA-2011-002 - External libraries and plugins.

8 years later that is still the policy of the Drupal Security team. As Drupal core and modules leverage 3rd party code more and more it seems like an important time to remind site owners that they are responsible for monitoring security of 3rd party libraries. Here is the advice from 2011 which is even more relevant today:

Just like there's a need to diligently follow announcements and update contributed modules downloaded from Drupal.org, there's also a need to follow announcements by vendors of third-party libraries or plugins that are required by such modules.

Drupal's update module has no functionality to alert you to these announcements. The Drupal security team will not release announcements about security issues in external libraries and plugins.

Current PHPUnit/Mailchimp library exploit

Recently we have become aware of a vulnerability that is being actively exploited on some Drupal sites. The vulnerability is in PHPUnit and is tracked as CVE-2017-9841. The exploit targets Drupal sites that currently or previously used the Mailchimp or Mailchimp E-Commerce module and still have a vulnerable version of the file sites/all/libraries/mailchimp/vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php. See below for details on whether a file is vulnerable or not. The vulnerable file might be at other paths on your individual site, but an automated attack exists that is looking for that specific path. This attack can execute PHP on the server.

Solution: 

Follow release announcements by the vendors of the external libraries and plugins you use.

In this specific case, check for the existence of a file named eval-stdin.php and check its contents. If they match the new version in this commit then it is safe. If the file reads from php://input then the codebase is vulnerable. This is not an indication of a site being compromised, just of it being vulnerable. To fix this vulnerability, update your libraries. In particular you should ensure the Mailchimp and Mailchimp Ecommerce modules and their libraries are updated.
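As a quick sketch of how you might check for the vulnerable file from the command line (the specific path below is the common location mentioned above; adjust it if your libraries live elsewhere):

# Find any copies of the file in the codebase and list the ones that are still vulnerable.
find . -path '*phpunit/src/Util/PHP/eval-stdin.php' -print0 | xargs -0 grep -l 'php://input'

# Or check the known Mailchimp library path directly; any output means the file is vulnerable.
grep -l 'php://input' sites/all/libraries/mailchimp/vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php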

If you discover your site has been compromised, we have a guide of how to remediate a compromised site.

Also see the Drupal core project page.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
