Jul 09 2020

The Drupal Steward program enhances the security of Drupal sites by protecting them from exploits even before site owners are able to patch, and supports the sustainability of the Drupal Security Team and the Drupal Association.

PORTLAND, Ore. July 7, 2020—The Drupal Association is announcing the launch of the Drupal Steward security program, together with founding partners Acquia and Pantheon, two of the largest hosting platforms for Drupal and major contributors to the project. This is a paid service offered to further enhance the security of sites built on Drupal. A portion of the proceeds from both the founding partners and the community tier is used to support the Drupal Security Team and the Drupal Association.

The Drupal Steward program answers the most pressing concern that keeps CIOs and CTOs up at night: "How do I protect my sites from the next unknown vulnerability?" In today's world, a Public Service Announcement of a highly critical vulnerability means disruption to existing engineering roadmaps, overtime hours, and all hands on deck waiting for a patch release so that it can be deployed before bad actors reverse engineer the vulnerability.

Drupal Steward addresses this issue by putting in place a network-level mitigation strategy that prevents many of these kinds of highly critical vulnerabilities from being exploited, even before the patch has been applied. While there may be some rare vulnerabilities that cannot be mitigated with this technique, most of the highly critical vulnerabilities in Drupal's past would have been.

"I am proud that we can advance Drupal's commitment to enterprise-grade security," said Heather Rocker, Executive Director of the Drupal Association. "The Drupal Steward program and its security protections should give the world the confidence to build the next generation of digital experiences on open source technology."

Drupal sites hosted with the Drupal Steward Founding Partners Acquia and Pantheon will be directly protected by those partners. For sites not hosted by the Drupal Steward founding partners, Drupal site owners will be able to subscribe to the Community tier of the Drupal Steward program directly through the Drupal Association at an affordable cost, with discounts provided to clients of Drupal Association supporting partners with a record of contribution to the project in the form of time, talent, or treasure.

Learn more about Drupal Steward

For complete details about the Drupal Steward program, including how to sign up, please visit https://www.drupal.org/security-team/steward

Powered by a global community

Drupal is a true open source project, leveraging the expertise of tens of thousands of developers around the world. Drupal has a proven track record for strong security practices, with a strong belief that the transparency of open source leads to more secure software.

About Drupal and the Drupal Association

Drupal is the open source content management software used by millions of people and organizations around the world, made possible by a community of 100,000-plus contributors and enabling more than 1.3 million users on Drupal.org. The Drupal Association is the non-profit organization dedicated to accelerating the Drupal software project, fostering the community, and supporting its growth.

###

For more information contact [email protected] 

Jul 09 2020

DrupalCon Global 2020 is just a few days away, and we’re excitedly prepping for what’s sure to be a virtual event like nothing the community has seen before. To say the least, it’s been a strange few months for everyone on the planet, but we at Amazee Labs think nothing exemplifies the resilient and persistent spirit of the open-source community like the innovative skills and organizational determination it took to bring DrupalCon Global 2020 and all its global attendees together.

Event organizers and volunteers have been tirelessly working to bring the community a virtual version of the DrupalCon experience that we all know and love.

The always popular “hallway track” has been reshaped into a virtual booth experience with plenty of space and time dedicated to promoting organic conversations with fellow attendees. A Virtual Library will give attendees access to presentations and featured speaker content, curated lists of the sessions selected for Meet the Speaker, on-demand event video, and special attendee-only content by stakeholder organizations.

Attendees focused on professional development will have access to newly formatted sessions and program features, expanding everyone’s opportunities for inspiration and advancement in a myriad of fields and subjects.

At DrupalCon Global 2020, attendees will still be able to share and learn about the latest in thought leadership around open source and ambitious digital experiences, to enhance their careers and organizations, and add their strength and momentum to current and future Drupal projects.

Here’s how you can connect with the Amazee Labs team during the event:

Live Events
 

Our CEO Stephanie Lupold will join the “How to maintain company culture with distributed teams” panel on July 15th at 16:00 UTC on the main stage to discuss how business leaders maintain their company culture in a virtual environment.

On July 16th, catch our lightning talk on “Automating your Web Maintenance using Drutiny” at 19:15 UTC, presented by Blaize Kaye and Fran Garcia-Linares.

John Albin Wilkins’ presentation on “Progressively decouple Drupal 8 with GraphQL and Twig” will be live on July 17th at 0:00 UTC.

Virtual Booth
 

Don’t forget to stop by our virtual booth during exhibition hours to chat with a live Amazee and see presentations about our services and technology: 

  • Tuesday, July 14 14:00 - 15:00 UTC
  • Wednesday, July 15 21:00 - 22:00 UTC
  • Thursday, July 16 16:00 - 17:00 UTC 

You can also check out our video library anytime during the conference.

Don’t forget to register for the event. We hope to see you there!

Jul 09 2020

In a previous article we explained the syntax used to write Drupal migrations. When migrating into content entities, each entity type defines several properties that can be included in the process section to populate their values. For example, when importing nodes you can specify the title, publication status, creation date, etc. In the case of users, you can set the username, password, timezone, etc. Finding out which properties are available for an entity might require some Drupal development knowledge. To make the process easier, in today’s article we are presenting a reference of the properties available in content entities provided by Drupal core and some contributed modules.

Example migration mapping of content entity properties

For each entity we will present: the module that provides it, the class that defines it, and the available properties. For each property we will list its name, field type, a description, and a note if the field allows unlimited values (i.e. it has unlimited cardinality). The list of properties available for a content entity depends on many factors: for example, whether the entity is revisionable (e.g. revision_default), translatable (e.g. langcode), or both (e.g. revision_translation_affected). The modules that are enabled on the site can also affect the available properties. For instance, if the “Workspaces” module is installed, it will add a workspace property to many content entities. This reference assumes that Drupal was installed using the standard installation profile and that all modules that provide content entities are enabled.

It is worth noting that entity properties are divided into two categories: base field definitions and field storage configurations. Base field definitions will always be available for the entity. On the other hand, the presence of field storage configurations will depend on various factors. For one, they can only be added to fieldable entities. Attaching the fields to the entity can be done manually by the user, by a module, or by an installation profile. Again, this reference assumes that Drupal was installed using the standard installation profile. Among other things, it adds a user_picture image field to the user entity and body, comment, field_image, and field_tags fields to the node entity. For entities that can have multiple bundles, not all properties provided by the field storage configurations will be available in all bundles. For example, with the standard installation profile all content types have a body field associated with them, but only the article content type has the field_image and field_tags fields. If subfields are available for the field type, you can migrate into them.
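
As a quick way to verify this on your own site, the following snippet is a minimal sketch (assuming a site installed with the standard profile and a bootstrapped Drupal context, for example via drush php:script) that uses the entity_field.manager service to compare the fields attached to two bundles:

<?php

// Field definitions per bundle include base fields plus the configurable
// fields attached to that bundle.
$entity_field_manager = \Drupal::service('entity_field.manager');
$article_fields = $entity_field_manager->getFieldDefinitions('node', 'article');
$page_fields = $entity_field_manager->getFieldDefinitions('node', 'page');

// With the standard installation profile, field_image and field_tags are
// attached to articles only, while body exists on both content types.
var_dump(isset($article_fields['field_image'])); // bool(true)
var_dump(isset($page_fields['field_image']));    // bool(false)
var_dump(isset($page_fields['body']));           // bool(true)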

Content (Node) entity

Module: Node (Drupal Core)
Class: Drupal\node\Entity\Node
Related article: Writing your first Drupal migration

List of base field definitions:

  1. nid: (integer) The node ID.
  2. uuid: (uuid) The node UUID.
  3. vid: (integer) Revision ID.
  4. langcode: (language) Language code (e.g. en).
  5. type: (entity_reference to node_type) Content type machine name.
  6. revision_timestamp: (created) The time that the current revision was created.
  7. revision_uid: (entity_reference to user) The user ID of the author of the current revision.
  8. revision_log: (string_long) Briefly describe the changes you have made.
  9. status: (boolean) Node published when set to TRUE.
  10. uid: (entity_reference to user) The user ID of the content author.
  11. title: (string) Title.
  12. created: (created) The time that the node was created.
  13. changed: (changed) The time that the node was last edited.
  14. promote: (boolean) Node promoted to front page when set to TRUE.
  15. sticky: (boolean) Node sticky at top of lists when set to TRUE.
  16. default_langcode: (boolean) A flag indicating whether this is the default translation.
  17. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  18. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  19. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

List of field storage configurations:

  1. body: text_with_summary field.
  2. comment: comment field.
  3. field_image: image field.
  4. field_tags: entity_reference field.
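
To make the mapping concrete, here is a sketch of what the process section of a node migration could look like when populating some of the properties listed above. The src_* source field names are hypothetical and would come from your own source plugin; the syntax itself is covered in the previous article on writing migrations.

process:
  title: src_title
  status: src_published
  uid: src_author_id
  created: src_creation_timestamp
  promote: src_promoted
  # Subfields of the body field can be targeted individually.
  body/value: src_body_content
  body/format: src_text_format
  field_tags: src_tag_ids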

User entity

Module: User (Drupal Core)
Class: Drupal\user\Entity\User
Related articles: Migrating users into Drupal - Part 1 and Migrating users into Drupal - Part 2

List of base field definitions:

  1. uid: (integer) The user ID.
  2. uuid: (uuid) The user UUID.
  3. langcode: (language) The user language code.
  4. preferred_langcode: (language) The user's preferred language code for receiving emails and viewing the site.
  5. preferred_admin_langcode: (language) The user's preferred language code for viewing administration pages.
  6. name: (string) The name of this user.
  7. pass: (password) The password of this user (hashed).
  8. mail: (email) The email of this user.
  9. timezone: (string) The timezone of this user.
  10. status: (boolean) Whether the user is active or blocked.
  11. created: (created) The time that the user was created.
  12. changed: (changed) The time that the user was last edited.
  13. access: (timestamp) The time that the user last accessed the site.
  14. login: (timestamp) The time that the user last logged in.
  15. init: (email) The email address used for initial account creation.
  16. roles: (entity_reference to user_role) The roles the user has. Allows unlimited values.
  17. default_langcode: (boolean) A flag indicating whether this is the default translation.

List of field storage configurations:

  1. user_picture: image field.

Taxonomy term entity

Module: Taxonomy (Drupal Core)
Class: Drupal\taxonomy\Entity\Term
Related article: Migrating taxonomy terms and multivalue fields into Drupal

List of base field definitions:

  1. tid: (integer) The term ID.
  2. uuid: (uuid) The term UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) The term language code.
  5. vid: (entity_reference to taxonomy_vocabulary) The vocabulary to which the term is assigned.
  6. revision_created: (created) The time that the current revision was created.
  7. revision_user: (entity_reference to user) The user ID of the author of the current revision.
  8. revision_log_message: (string_long) Briefly describe the changes you have made.
  9. status: (boolean) Published.
  10. name: (string) Name.
  11. description: (text_long) Description.
  12. weight: (integer) The weight of this term in relation to other terms.
  13. parent: (entity_reference to taxonomy_term) The parents of this term. Allows unlimited values.
  14. changed: (changed) The time that the term was last edited.
  15. default_langcode: (boolean) A flag indicating whether this is the default translation.
  16. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  17. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  18. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

File entity

Module: File (Drupal Core)
Class: Drupal\file\Entity\File
Related articles: Migrating files and images into Drupal and Migrating images using the image_import plugin

List of base field definitions:

  1. fid: (integer) The file ID.
  2. uuid: (uuid) The file UUID.
  3. langcode: (language) The file language code.
  4. uid: (entity_reference to user) The user ID of the file.
  5. filename: (string) Name of the file with no path components.
  6. uri: (file_uri) The URI to access the file (either local or remote).
  7. filemime: (string) The file's MIME type.
  8. filesize: (integer) The size of the file in bytes.
  9. status: (boolean) The status of the file, temporary (FALSE) and permanent (TRUE).
  10. created: (created) The timestamp that the file was created.
  11. changed: (changed) The timestamp that the file was last changed.

Media entity

Module: Media (Drupal Core)
Class: Drupal\media\Entity\Media

List of base field definitions:

  1. mid: (integer) The media ID.
  2. uuid: (uuid) The media UUID.
  3. vid: (integer) Revision ID.
  4. langcode: (language) Language code (e.g. en).
  5. bundle: (entity_reference to media_type) Media type.
  6. revision_created: (created) The time that the current revision was created.
  7. revision_user: (entity_reference to user) The user ID of the author of the current revision.
  8. revision_log_message: (string_long) Briefly describe the changes you have made.
  9. status: (boolean) Published.
  10. uid: (entity_reference to user) The user ID of the author.
  11. name: (string) Name.
  12. thumbnail: (image) The thumbnail of the media item.
  13. created: (created) The time the media item was created.
  14. changed: (changed) The time the media item was last edited.
  15. default_langcode: (boolean) A flag indicating whether this is the default translation.
  16. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  17. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  18. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

List of field storage configurations:

  1. field_media_audio_file: file field.
  2. field_media_document: file field.
  3. field_media_image: image field.
  4. field_media_oembed_video: string field.
  5. field_media_video_file: file field.

Comment entity

Module: Comment (Drupal Core)
Class: Drupal\comment\Entity\Comment

List of base field definitions:

  1. cid: (integer) The comment ID.
  2. uuid: (uuid) The comment UUID.
  3. langcode: (language) The comment language code.
  4. comment_type: (entity_reference to comment_type) The comment type.
  5. status: (boolean) Published.
  6. uid: (entity_reference to user) The user ID of the comment author.
  7. pid: (entity_reference to comment) The parent comment ID if this is a reply to a comment.
  8. entity_id: (entity_reference to node) The ID of the entity of which this comment is a reply.
  9. subject: (string) Subject.
  10. name: (string) The comment author's name.
  11. mail: (email) The comment author's email address.
  12. homepage: (uri) The comment author's home page address.
  13. hostname: (string) The comment author's hostname.
  14. created: (created) The time that the comment was created.
  15. changed: (changed) The time that the comment was last edited.
  16. thread: (string) The alphadecimal representation of the comment's place in a thread, consisting of a base 36 string prefixed by an integer indicating its length.
  17. entity_type: (string) The entity type to which this comment is attached.
  18. field_name: (string) The field name through which this comment was added.
  19. default_langcode: (boolean) A flag indicating whether this is the default translation.

List of field storage configurations:

  1. comment_body: text_long field.

Aggregator feed entity

Module: Aggregator (Drupal Core)
Class: Drupal\aggregator\Entity\Feed

List of base field definitions:

  1. fid: (integer) The ID of the aggregator feed.
  2. uuid: (uuid) The aggregator feed UUID.
  3. langcode: (language) The feed language code.
  4. title: (string) The name of the feed (or the name of the website providing the feed).
  5. url: (uri) The fully-qualified URL of the feed.
  6. refresh: (list_integer) The length of time between feed updates. Requires a correctly configured cron maintenance task.
  7. checked: (timestamp) Last time feed was checked for new items, as Unix timestamp.
  8. queued: (timestamp) Time when this feed was queued for refresh, 0 if not queued.
  9. link: (uri) The link of the feed.
  10. description: (string_long) The parent website's description that comes from the <description> element in the feed.
  11. image: (uri) An image representing the feed.
  12. hash: (string) Calculated hash of the feed data, used for validating cache.
  13. etag: (string) Entity tag HTTP response header, used for validating cache.
  14. modified: (timestamp) When the feed was last modified, as a Unix timestamp.

Aggregator feed item entity

Module: Aggregator (Drupal Core)
Class: Drupal\aggregator\Entity\Item

List of base field definitions:

  1. iid: (integer) The ID of the feed item.
  2. langcode: (language) The feed item language code.
  3. fid: (entity_reference to aggregator_feed) The aggregator feed entity associated with this item.
  4. title: (string) The title of the feed item.
  5. link: (uri) The link of the feed item.
  6. author: (string) The author of the feed item.
  7. description: (string_long) The body of the feed item.
  8. timestamp: (created) Posted date of the feed item, as a Unix timestamp.
  9. guid: (string_long) Unique identifier for the feed item.

Custom block entity

Module: Custom Block (Drupal Core)
Class: Drupal\block_content\Entity\BlockContent

List of base field definitions:

  1. id: (integer) The custom block ID.
  2. uuid: (uuid) The custom block UUID.
  3. revision_id: (integer) The revision ID.
  4. langcode: (language) The custom block language code.
  5. type: (entity_reference to block_content_type) The block type.
  6. revision_created: (created) The time that the current revision was created.
  7. revision_user: (entity_reference to user) The user ID of the author of the current revision.
  8. revision_log: (string_long) The log entry explaining the changes in this revision.
  9. status: (boolean) Published.
  10. info: (string) A brief description of your block.
  11. changed: (changed) The time that the custom block was last edited.
  12. reusable: (boolean) A boolean indicating whether this block is reusable.
  13. default_langcode: (boolean) A flag indicating whether this is the default translation.
  14. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  15. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  16. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

List of field storage configurations:

  1. body: text_with_summary field.

Contact message entity

Module: Contact (Drupal Core)
Class: Drupal\contact\Entity\Message

List of base field definitions:

  1. uuid: (uuid) The message UUID.
  2. langcode: (language) The message language code.
  3. contact_form: (entity_reference to contact_form) The ID of the associated form.
  4. name: (string) The name of the person that is sending the contact message.
  5. mail: (email) The email of the person that is sending the contact message.
  6. subject: (string) Subject.
  7. message: (string_long) Message.
  8. copy: (boolean) Whether to send a copy of the message to the sender.
  9. recipient: (entity_reference to user) The ID of the recipient user for personal contact messages.

Content moderation state entity

Module: Content Moderation (Drupal Core)
Class: Drupal\content_moderation\Entity\ContentModerationState

List of base field definitions:

  1. id: (integer) ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) Language.
  5. uid: (entity_reference to user) The username of the entity creator.
  6. workflow: (entity_reference to workflow) The workflow the moderation state is in.
  7. moderation_state: (string) The moderation state of the referenced content.
  8. content_entity_type_id: (string) The ID of the content entity type this moderation state is for.
  9. content_entity_id: (integer) The ID of the content entity this moderation state is for.
  10. content_entity_revision_id: (integer) The revision ID of the content entity this moderation state is for.
  11. default_langcode: (boolean) A flag indicating whether this is the default translation.
  12. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  13. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.

URL alias entity

Module: Path alias (Drupal Core)
Class: Drupal\path_alias\Entity\PathAlias

List of base field definitions:

  1. id: (integer) ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) Language.
  5. path: (string) The path that this alias belongs to.
  6. alias: (string) An alias used with this path.
  7. status: (boolean) Published.
  8. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  9. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

Shortcut link entity

Module: Shortcut (Drupal Core)
Class: Drupal\shortcut\Entity\Shortcut

List of base field definitions:

  1. id: (integer) The ID of the shortcut.
  2. uuid: (uuid) The UUID of the shortcut.
  3. langcode: (language) The language code of the shortcut.
  4. shortcut_set: (entity_reference to shortcut_set) The bundle of the shortcut.
  5. title: (string) The name of the shortcut.
  6. weight: (integer) Weight among shortcuts in the same shortcut set.
  7. link: (link) The location this shortcut points to.
  8. default_langcode: (boolean) A flag indicating whether this is the default translation.

Workspace entity

Module: Workspaces (Drupal Core)
Class: Drupal\workspaces\Entity\Workspace

List of base field definitions:

  1. id: (string) The workspace ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. uid: (entity_reference to user) The workspace owner.
  5. label: (string) The workspace name.
  6. parent: (entity_reference to workspace) The parent workspace.
  7. changed: (changed) The time that the workspace was last edited.
  8. created: (created) The time that the workspace was created.
  9. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.

Custom menu link entity

Module: Custom Menu Links (Drupal Core)
Class: Drupal\menu_link_content\Entity\MenuLinkContent

List of base field definitions:

  1. id: (integer) The entity ID for this menu link content entity.
  2. uuid: (uuid) The content menu link UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) The menu link language code.
  5. bundle: (string) The content menu link bundle.
  6. revision_created: (created) The time that the current revision was created.
  7. revision_user: (entity_reference to user) The user ID of the author of the current revision.
  8. revision_log_message: (string_long) Briefly describe the changes you have made.
  9. enabled: (boolean) A flag for whether the link should be enabled in menus or hidden.
  10. title: (string) The text to be used for this link in the menu.
  11. description: (string) Shown when hovering over the menu link.
  12. menu_name: (string) The menu name. All links with the same menu name (such as "tools") are part of the same menu.
  13. link: (link) The location this menu link points to.
  14. external: (boolean) A flag to indicate if the link points to a full URL starting with a protocol, like http:// (1 = external, 0 = internal).
  15. rediscover: (boolean) Indicates whether the menu link should be rediscovered.
  16. weight: (integer) Link weight among links in the same menu at the same depth. In the menu, the links with high weight will sink and links with a low weight will be positioned nearer the top.
  17. expanded: (boolean) If selected and this menu link has children, the menu will always appear expanded. This option may be overridden for the entire menu tree when placing a menu block.
  18. parent: (string) The ID of the parent menu link plugin, or empty string when at the top level of the hierarchy.
  19. changed: (changed) The time that the menu link was last edited.
  20. default_langcode: (boolean) A flag indicating whether this is the default translation.
  21. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  22. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  23. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

Paragraph entity

Module: Paragraphs module
Class: Drupal\paragraphs\Entity\Paragraph
Related article: Introduction to paragraphs migrations in Drupal

List of base field definitions:

  1. id: (integer) ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) The paragraphs entity language code.
  5. type: (entity_reference to paragraphs_type) Paragraph type.
  6. status: (boolean) Published.
  7. created: (created) The time that the Paragraph was created.
  8. parent_id: (string) The ID of the parent entity of which this entity is referenced.
  9. parent_type: (string) The entity parent type to which this entity is referenced.
  10. parent_field_name: (string) The entity parent field name to which this entity is referenced.
  11. behavior_settings: (string_long) The behavior plugin settings
  12. default_langcode: (boolean) A flag indicating whether this is the default translation.
  13. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  14. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  15. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

List of field storage configurations:

  1. field_reusable_paragraph: entity_reference field.

Paragraphs library item entity

Module: Paragraphs Library (part of paragraphs module)
Class: Drupal\paragraphs_library\Entity\LibraryItem

List of base field definitions:

  1. id: (integer) ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. langcode: (language) Language.
  5. revision_created: (created) The time that the current revision was created.
  6. revision_uid: (entity_reference to user) The user ID of the author of the current revision.
  7. revision_log: (string_long) Briefly describe the changes you have made.
  8. status: (boolean) Published.
  9. label: (string) Label.
  10. paragraphs: (entity_reference_revisions) Paragraphs.
  11. created: (created) The time that the library item was created.
  12. changed: (changed) The time that the library item was last edited.
  13. uid: (entity_reference to user) The user ID of the library item author.
  14. default_langcode: (boolean) A flag indicating whether this is the default translation.
  15. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  16. revision_translation_affected: (boolean) Indicates if the last edit of a translation belongs to current revision.
  17. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

Profile entity

Module: Profile module
Class: Drupal\profile\Entity\Profile

List of base field definitions:

  1. profile_id: (integer) ID.
  2. uuid: (uuid) UUID.
  3. revision_id: (integer) Revision ID.
  4. type: (entity_reference to profile_type) Profile type.
  5. revision_created: (created) The time that the current revision was created.
  6. revision_user: (entity_reference to user) The user ID of the author of the current revision.
  7. revision_log_message: (string_long) Briefly describe the changes you have made.
  8. status: (boolean) Whether the profile is active.
  9. uid: (entity_reference to user) The user that owns this profile.
  10. is_default: (boolean) Whether this is the default profile.
  11. data: (map) A serialized array of additional data.
  12. created: (created) The time when the profile was created.
  13. changed: (changed) The time when the profile was last edited.
  14. revision_default: (boolean) A flag indicating whether this was a default revision when it was saved.
  15. workspace: (entity_reference to workspace) Indicates the workspace that this revision belongs to.

Available properties for other content entities

This reference includes all core content entities and some provided by contributed modules. The next article will include a reference for Drupal Commerce content entities. That being said, it would be impractical to cover all contributed modules. To get a list yourself for other content entities, load the entity_field.manager service and call its getFieldStorageDefinitions() method, passing the machine name of the entity type as a parameter. Although this reference only covers content entities, a similar process can be followed for configuration entities.
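
For instance, the following sketch (again assuming a bootstrapped site, for example via drush php:script) prints the machine name and field type of every property available for a given entity type:

<?php

$entity_type_id = 'node';
$entity_field_manager = \Drupal::service('entity_field.manager');

// Base field definitions are always present for the entity type.
foreach ($entity_field_manager->getBaseFieldDefinitions($entity_type_id) as $name => $definition) {
  print 'base field: ' . $name . ' (' . $definition->getType() . ')' . PHP_EOL;
}

// Field storage definitions cover base fields plus any configurable field
// storages (e.g. body, field_image) attached to this entity type.
foreach ($entity_field_manager->getFieldStorageDefinitions($entity_type_id) as $name => $definition) {
  print 'storage definition: ' . $name . ' (' . $definition->getType() . ')' . PHP_EOL;
}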

What did you learn in today’s article? Did you know that there were so many entity properties in Drupal core? Were you aware that the list of available properties depends on factors like whether the entity is fieldable, translatable, or revisionable? Did you know how to find properties for content entities from contributed modules? Please share your answers in the comments. Also, we would be grateful if you shared this article with your friends and colleagues.

Jul 09 2020

The reCAPTCHA module for Drupal 8 integrates the reCAPTCHA service provided by Google with your Drupal site. This service provides additional protection by detecting if the user accessing your site is a real person or a robot. 

Keep reading to learn how to use this module!

Step #1.- Install the Required Modules 

  • Open the terminal application of your PC.
  • Type: composer require drupal/recaptcha
  • Click Extend.
  • Scroll down and enable Captcha, Image Captcha, and reCAPTCHA.
  • Click Install
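
Note: if you have Drush installed, you can enable the same modules from the terminal instead of the Extend page, for example with drush en captcha image_captcha recaptcha -y (the machine names shown here assume the standard contributed projects).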


Step #2.- Configure the Module

  • Click Configuration > CAPTCHA module settings > reCAPTCHA.
  • Click the link to register for reCAPTCHA.
  • Select v2 of reCAPTCHA.
  • Enter a valid domain name (this will not work on a local installation).
  • Accept the Terms.
  • Click Send / Ok.


You will get the Site Key and Secret Key. Paste them into the corresponding fields on the reCAPTCHA settings page of your Drupal backend.


  • Scroll down and click Save configuration.
  • Click CAPTCHA Settings.
  • Change the default challenge type to reCAPTCHA.
  • Leave the other defaults.
  • Click Save configuration.


Note: The image captcha module provides those “old-fashioned” challenges with an alphanumeric image.


  • Click the Form settings tab.
  • Enable the Contact Message Feedback Form.
  • Click Edit.


  • Add a proper name.
  • Select the reCAPTCHA challenge type.
  • Click Save.


 Step #3.- Test the Contact Form

  • Use another browser to access the site as an anonymous user.
  • Navigate to yoursite/contact/feedback.

You should see the well-known “I’m not a robot” checkbox. If you click it, you will get an image challenge to verify whether the user sending the form is human.


I hope you liked this tutorial. Thanks for reading!


About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jul 08 2020

Jest is the de facto standard for testing in modern JavaScript, but we've traditionally not been able to leverage it for testing in Drupal.

But with twig-testing-library, we can now test our Twig templates and any dynamic behaviours added by JavaScript using Jest.

In this article we will go through the process of adding Jest-based testing to an existing accordion component.

Installation

Firstly we need to install twig-testing-library and jest

npm i --save-dev twig-testing-library jest

And we're also going to add additional DOM-based Jest assertions using jest-dom

npm i --save-dev @testing-library/jest-dom

Now we need to configure Jest by telling it how to find our tests as well as configuring transpiling.

In this project, we've got all of our components in folders in a /packages subdirectory.

So we create a jest.config.js file in the root with the following contents:


// For a detailed explanation regarding each configuration property, visit:
// https://jestjs.io/docs/en/configuration.html

module.exports = {
  clearMocks: true, // Clear mocks on each test.
  testMatch: ['<rootDir>/packages/**/src/__tests__/**.test.js'], // How to find our tests.
  transform: {
    '^.+\\.js?$': `<rootDir>/jest-preprocess.js`, // Babel transforms.
  },
  setupFilesAfterEnv: [`<rootDir>/setup-test-env.js`], // Additional setup.
};

For transpiling we're just using babel-jest and then chaining with our project's presets. The contents of jest-preprocess.js are as follows:

const babelOptions = {
  presets: ['@babel/preset-env'],
};

module.exports = require('babel-jest').createTransformer(babelOptions);

As we're also going to use the jest-dom extension for additional DOM-based assertions, our setup-test-env.js file takes care of that, as well as defining some globals that Drupal JS expects to exist. Its contents are as follows:

import '@testing-library/jest-dom/extend-expect';

global.Drupal = {
  behaviors: {},
};

Writing our first test

Now we have the plumbing done, let's create our first test

As per the pattern above, these need to live in a __tests__ folder inside the src folder of our components

So let's create a test for the accordion component, by creating packages/accordion/src/__tests__/accordion.test.js

Let's start with a basic test that the accordion should render and match a snapshot. This will pick up any changes in the markup and also verify that the template is valid.

Here's the markup in the twig template

<div class="accordion js-accordion">
  {% block button %}
    <button class="button button--primary accordion__toggle">{{ title | default('Open Me') }}</button>
  {% endblock %}
  <div class="accordion__content">
    {% block content %}
      <h1>Accordion Content</h1>
      <p>This content is hidden inside the accordion body until it is disclosed by clicking the accordion toggle.</p>
    {% endblock %}
  </div>
</div>

So let's render that with twig-testing-library and assert some things in packages/accordion/src/__tests__/accordion.test.js


import { render } from 'twig-testing-library';

describe('Accordion functionality', () => {
  it('Should render', async () => {
    expect.assertions(2);
    const { container } = await render(
      './packages/accordion/src/accordion.twig',
      {
        title: 'Accordion',
        open: false,
      },
    );
    expect(container).toMatchSnapshot();
    expect(container.querySelectorAll('.accordion__toggle')).toHaveLength(1);
  });
});



Running the tests

So let's run our first test by adding a jest command to our package.json under "scripts"


"jest": "jest --runInBand"

Now we run with

npm run jest

> jest --runInBand

 PASS  packages/accordion/src/__tests__/accordion.test.js
  Accordion functionality
    ✓ Should render (43 ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   1 passed, 1 total
Time:        4.62 s, estimated 6 s
Ran all test suites.

Testing dynamic behaviour

Now we know our template renders, and we're seeing some expected output, let's test that we can expand and collapse our accordion.

Our accordion JS does the following:

  • On click of the accordion title, expands the element by adding accordion--open class and sets the aria-expanded attribute
  • On click again, closes the accordion by removing the class and attribute

So let's write a test for that by adding the following to our existing test file. Note that this test also uses fireEvent and the component's Accordion class, so both need to be imported at the top of the file (fireEvent comes from the testing library; the import path for Accordion depends on your project layout):


  it('Should expand and collapse', async () => {
    expect.assertions(4);
    const { container, getByText } = await render(
      './packages/accordion/src/accordion.twig',
      {
        title: 'Open accordion',
      },
    );
    const accordionElement = container.querySelector(
      '.accordion:not(.processed)',
    );
    const accordion = new Accordion(accordionElement);
    accordion.init();
    const accordionToggle = getByText('Open accordion');
    fireEvent.click(accordionToggle);
    expect(accordionElement).toHaveClass('accordion--open');
    expect(accordionToggle).toHaveAttribute('aria-expanded', 'true');
    fireEvent.click(accordionToggle);
    expect(accordionElement).not.toHaveClass('accordion--open');
    expect(accordionToggle).toHaveAttribute('aria-expanded', 'false');
  });

Now let's run that

npm run jest
packages/accordion/src/__tests__/accordion.test.es6.js
  Accordion functionality
    ✓ Should render (29 ms)
    ✓ Should expand and collapse (20 ms)

Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   1 passed, 1 total
Time:        5.031 s, estimated 6 s
Ran all test suites.

Neat! We now have some test coverage for our accordion component

Next steps

So the neat thing about Jest is that it can collect code coverage. Let's run that:

npm run jest -- --coverage
packages/accordion/src/__tests__/accordion.test.es6.js
  Accordion functionality
    ✓ Should render (28 ms)
    ✓ Should expand and collapse (13 ms)

-------------------|---------|----------|---------|---------|--------------------
File               | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
-------------------|---------|----------|---------|---------|--------------------
All files          |   29.55 |    11.27 |   24.14 |      30 |
 accordion/src     |     100 |    85.71 |     100 |     100 |
  accordion.es6.js |     100 |    85.71 |     100 |     100 | 53
 base/src          |   11.43 |     3.13 |    4.35 |   11.65 |
  utils.es6.js     |   11.43 |     3.13 |    4.35 |   11.65 | 14,28,41-48,58-357
-------------------|---------|----------|---------|---------|--------------------
Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   1 passed, 1 total
Time:        2.813 s, estimated 5 s
Ran all test suites.

Pretty nice hey?

What's happening behind the scenes

If you've worked on a React project before, you've probably encountered DOM Testing Library and React Testing Library. Twig testing library aims to provide the same developer ergonomics as both of these libraries. If you're familiar with either of those, you should find Twig testing library's APIs comparable.

Under the hood it uses Twig.js for Twig-based rendering in JavaScript, and Jest uses jsdom for browser JavaScript APIs.

A longer introduction

I did a session on using this for a Drupal South virtual meetup; here's the recording of it.


Get involved

If you'd like to get involved, come say hi on GitHub.


Posted by lee.rowlands
Senior Drupal Developer

Dated 9 July 2020

Jul 08 2020


Businesses and governments build websites for one reason: to provide value to their users. But what if your website was incapable of reaching millions of your users? 25% of Americans live with disabilities. For some of them, the simple act of navigating websites, digesting information, and understanding your content is difficult. Yet, despite brands increasing spending on web design and digital marketing, less than 10% of websites actually follow accessibility standards. Businesses are spending significant money to capture an audience, yet they’re not ensuring that their audience can engage with their website.

It’s a problem—a big one.

You don’t want to exclude customers. It’s bad for business, and it’s bad for your brand. What’s more, accessibility features help improve your SEO, reduce your website complexity, and increase your ability to connect with your loyal audience. But accessibility standards aren’t always baked into the architecture of websites.

Luckily, there are some content management systems (CMS) that let you create hyper-accessible websites without even trying. Drupal comes equipped with a variety of accessibility features — each of which helps make your website more accessible for your customers.

Understanding the Importance of Website Accessibility

Creating an accessible website may sound vague, but there’s already a worldwide standard you can follow. The Web Content Accessibility Guidelines (WCAG) — which is maintained by The World Wide Web Consortium — is the global standard for web accessibility used by companies, governments, and merchants across the world.

Following the WCAG standard certainly helps you reach a wider audience, but it also keeps you out of legal hot water. Not only has the ADA made it abundantly clear that compliance requires website accessibility; a United States District Court in Florida has also ruled that WCAG standards are the de facto standards of web accessibility. And there are already cases of businesses getting sued for failing to adhere to them.

  • The DOJ sued H&R Block over its website’s accessibility.
  • WinnDixie.com was sued over accessibility, and the judge required the company to update its website.
  • The National Museum of Crime and Punishment was required to update its website’s accessibility.

The list goes on. Adhering to WCAG web accessibility standards helps protect your brand against litigation. But, more importantly, it opens doors to millions of customers who need accessibility to navigate and engage with your amazing content.

One-third of individuals over the age of 65 have hearing loss. Around 15% of Americans struggle with vision loss. And millions have issues with mobility. The CDC lists six forms of disability:

  • Mobility (difficulty walking or climbing)
  • Cognition (difficulty remembering, making decisions, or concentrating)
  • Hearing (difficulty hearing)
  • Vision (difficulty seeing)
  • Independent living (difficulty doing basic errands)
  • Self-care (difficulty bathing, dressing, or taking care of yourself)

Web accessibility touches all of those types of disabilities. For those with trouble seeing, screen readers help them comprehend websites. But, screen readers strip away the CSS layer. Your core content has to be accessible for them to be able to comprehend it. Those with mobility issues may need to use keyboard shortcuts to help them navigate your website. Hearing-impaired individuals may require subtitles and captions. Those with cognitive issues may need your website to be built with focusable elements and good contrasting.

There are many disabilities. WCAG creates a unified guideline that helps government entities and businesses build websites that are hyper-accessible to people with a wide range of these disabilities.

Drupal is WCAG-compliant

WCAG is vast. A great starting point is the Accessibility Principles document. But, creating an accessible website doesn’t have to be a time-consuming and expensive process. Drupal has an entire team dedicated to ensuring that their platform is WCAG compliant. In fact, Drupal is both WCAG 2.0 compliant and Authoring Tool Accessibility Guidelines (ATAG 2.0) compliant. The latter deals with the tools developers use to build websites. So, Drupal has accessibility compliance on both ends.

What Accessibility Features Does Drupal Have?

Drupal’s accessibility compliance comes in two forms:

  1. Drupal has built-in compliance features that are native to every install (7+).
  2. Drupal supports and enables the community to develop accessibility modules.

Drupal’s Built-in Compliance Features

Drupal 7+ comes native with semantic markup. To keep things simple, semantic markup helps clarify the context of content. At Mobomo, we employ some of the best designers and website developers on the planet. So, we could make bad HTML markup nearly invisible to the average user with rich CSS and superb visuals. But when people use screen readers or other assistive technology, that CSS goes out the window. They’re looking at the core HTML markup. And if it’s not semantic, they may have a difficult time navigating it. With Drupal, markup is automatically semantic, which improves comprehension for translation engines, search engines, and screen readers.

Drupal’s accessibility page also notes some core changes made to increase accessibility. These include things such as color contrasting. WCAG requires that color contrasting be at least 4.5:1 for normal text and 7:1 for enhanced contrast. Drupal complies with those guidelines. Many other changes are on the developer side, such as drag and drop functions and automated navigation buttons.

Of course, Drupal also provides developer handbooks, theming guides, and instructional PDFs for developers. Some of the accessibility is done on the developer’s end, so it’s important to work with a developer who leverages accessibility during their design process.

Drupal’s Support for the Accessibility Community

In addition to following WCAG guidelines, Drupal supports community-driven modules that add additional accessibility support, and there are hundreds of them on Drupal.org.

The main thing to remember is that Drupal supports back-end, front-end, and community-driven accessibility. And the project has committed to continuously improving its accessibility capabilities over time. Drupal’s most recent update — the heavily anticipated Drupal 9 — carries on this tradition. Drupal has even announced that Drupal 10 will continue to expand upon accessibility.

Do You Want to Build an Accessible Website?

Drupal is on the cutting-edge of CMS accessibility. But they can’t make you accessible alone. You need to build your website from the ground up to comply with accessibility. A good chunk of the responsibility is in the hands of your developer. Are you looking to build a robust, functional, beautiful, and accessible website? 

Contact us. We’ll help you expand your reach.

Jul 08 2020

Front-end development workflows have seen considerable innovation in recent years, with technologies like React disseminating revolutionary concepts like declarative components in JSX and more efficient document object model (DOM) diffing through Virtual DOMs. Nonetheless, while this front-end development revolution has led to significant change in the developer experiences we see in the JavaScript landscape and to even more momentum in favor of decoupled Drupal architectures in the Drupal community, it seems that many traditional CMSs have remained behind the curve when it comes to enabling true shared component ecosystems through developer experiences that focus on facilitating shared development practices across back and front end.

At DrupalCon Amsterdam 2019, Fabian Franz (Senior Technical Architect and Performance Lead at Tag1) delivered a session entitled "Components everywhere: Bridging the gap between back end and front end" that delved into his ideal vision for enabling such shared components in Drupal's own native rendering layer. Fabian joined Michael Meyers (Managing Director at Tag1), and me (Preston So, Editor in Chief at Tag1; Senior Director, Product Strategy at Oracle; and author of Decoupled Drupal in Practice) for a Tag1 Team Talks episode highlighting the progress other ecosystems have made in the face of this problem space and how a hypothetical future Drupal could permit rich front-end developer experiences seldom seen in the CMS world. In this two-part blog series, a sequel to Fabian's DrupalCon session, we dive into some of his new conclusions and their potential impact on Drupal's future.

Components everywhere in Drupal

At the onset of our conversation, Fabian offered a quick summary of his idea behind components everywhere—i.e. shared across both client and server—within the Drupal context. The main thrust of Fabian's vision is that developers in Drupal ought to be able to implement a back-end application in a manner indistinguishable from how they would implement a front-end application. In other words, developers should not necessarily need to understand Drupal's application programming interfaces (APIs) or decoupled Drupal approaches. By decoupling Drupal within its own architecture (as I proposed with Lauri Eskola and with Sally Young and Matt Grill before that), we can enable the implementation of purely data-driven Drupal applications.

But what does this truly mean from the standpoint of Drupal developers? Fabian identifies the moment where components everywhere will truly reach success as the conditions in which the same component can be leveraged on the front end and back end without any distinction in how data is handled. One of the key means of doing this in a way that can be shared across client and server is through slots, which can contain additional data and provide the concept of component "children."

Because of how Drupal's front-end architecture was originally architected, there are significant gaps between how Drupal handles its "components" and how other technologies juggle theirs. For instance, while theme functions comprise an important foundation for how Drupal developers interact with the Drupal front end, there is no way to provide a slot for interior data or nested components. There is an analogous concept in terms of children in the render tree, but this requires considerable knowledge of PHP to traverse. According to Fabian, though we have all of the elements needed for a component-based system available in Drupal, one of the primary challenges is that there are so many elements within Drupal that can lend themselves to such a component-based system.

Looking to Laravel for inspiration

Adhering to the open-source philosophy of "proudly found elsewhere," Fabian turned to other projects for inspiration as he began to articulate what it would take to implement the vision he presented in Amsterdam. After all, reinventing the wheel is usually an ill-advised approach when open-source solutions are available to be leveraged. For instance, Laravel contains templates but needed to introduce component tags to their templating system in order to capture generic slots. In Drupal, on the other hand, both theme functions and Twig templates can morphologically be considered components, but they lack certain key attributes most components today contain. Slots are implementable in Twig, but that is solely because all data is already available to Twig templates in Drupal.

Laravel 7 introduced BladeX to the Laravel ecosystem. BladeX provides a highly enjoyable developer experience by serving as a component handler for Laravel components. As long as developers prefix all components with x- in their custom element names (i.e. <x-component>), they no longer need to use a regular expression to find all possible component names in the component system, instead simply searching for all components whose names are prefixed with x-. And if the React developer experience is any indication, many modern front-end developers strongly prefer declarative HTML like the following:

    <x-alert prop="value"></x-alert>

BladeX first began as a contributed plugin to Laravel. Later, it was added to Laravel core due to its usefulness in enabling not only a graceful component system but also pleasant-to-use syntax to work with those components. Livewire also includes graceful capabilities enabling interactivity, which in Drupal is currently represented by the Drupal Ajax framework (difficult to use due to its tight coupling to Drupal's Form API).

More recently, Laravel introduced a tool known as Livewire, which makes it possible to implement server-side document object models (DOM) but lacks the data input/output (I/O) necessary to enable state management and interactivity. As such, Fabian extended the concept of a store from his DrupalCon session to include a provider that allows data retrieval and use in components. Fortunately, Livewire has a partial implementation of this, and it is possible to implement a server-side message that increments a counter and then to retrieve that counter value gracefully from the client side. Livewire automatically understands that it needs to update the server-side render of that counter and serve that updated value to the client.
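
To make that counter example concrete, a minimal Livewire component might look roughly like the following. It is only a sketch based on Livewire's documented basics; the class name, namespace, and view name are hypothetical:

<?php

namespace App\Http\Livewire;

use Livewire\Component;

class Counter extends Component {

  // Server-side state; Livewire re-renders the component whenever it changes.
  public $count = 0;

  // Called from the client via a wire:click="increment" attribute in the view.
  public function increment() {
    $this->count++;
  }

  public function render() {
    // The Blade view simply prints $count and exposes the increment button.
    return view('livewire.counter');
  }

}

Each click sends an increment message to the server, which updates the state, re-renders the component server side, and patches the updated markup into the client's DOM: the kind of server-driven interactivity Fabian contrasts with Drupal's Ajax framework.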

What about Web Components?

Fabian's thinking is by no means alone when it comes to enabling components everywhere in Drupal. Many other initiatives, including one that aimed to introduce Web Components into Drupal, have been down this road. But why are Web Components so compelling for this in the first place? By going a step further and introducing the Shadow DOM, Web Components can provide full encapsulation automatically, off the shelf.

And the Shadow DOM itself is a game changer because of the benefits provided by syntactic features like CSS scoping, in which styles contained in a Shadow DOM are unaffected by those that came previously. Another way to accomplish such CSS scoping is through stringent class-based selector nomenclature or utilities like TailwindCSS that dispense with the traditional CSS cascade altogether. Many in the JavaScript world are increasingly moving in this direction, according to Fabian, of considering the cascade in CSS a suboptimal feature.

In other user interface (UI) systems, particularly in the mobile application development landscape, there are two emerging approaches to styling mobile applications seen in ecosystems like React Native and Flutter. These allow you to assemble compelling layouts without any cascade presented in CSS, and all are React-driven components that leverage CSS-in-JavaScript solutions to perform styling. Increasingly, these developments point to a landscape where developers eschew the cascade, long essential to writing CSS, in favor of a more atomic approach to styling components.

Conclusion

Components are difficult even in the best of times, not solely because of the relative conceptual complexity and differences in understandings when it comes to how components are defined from system to system. In the case of JavaScript technologies, approaches like React's declarative component syntax and Virtual DOM portend a world in which components are increasingly shared between client and server and data in components is decoupled from the component during all stages of component life cycles, irrespective of whether it is rendered on the back end or front end. Complicating matters further is the fact that traditional content management systems like Drupal and WordPress have largely not kept pace with the dizzying innovation in the front-end development universe.

In this blog post, we examined some of the new conclusions Fabian has come to since his DrupalCon presentation when it comes to enabling components everywhere in Drupal, particularly taking inspiration from other ecosystems like Laravel, React, and Web Components. In the second installment of this two-part blog series, we'll dive into how to define components in Drupal, how to offer a more declarative experience when working with them, and some of the other ways we can enable shared components across client and server and rich, immutable, data-driven state in a setting where these novelties have long seemed anathema or worlds removed: the Drupal front-end ecosystem.

Special thanks to Fabian Franz and Michael Meyers for their feedback during the writing process.

Photo by Tim Johnson on Unsplash

Jul 07 2020
Jul 07

DrupalCon Global 2020 will feature presenters from around the world on a virtual platform called Hopin (pronounced "Hop in!"). Drupalize.Me trainers Joe Shindelar and Amber Matz will be presenting sessions and participating in the conference. Both Amber and Joe's sessions will be on Thursday. (Times listed are in UTC. Convert to your time zone with this tool.)

I (Amber) will co-present Deep dive: state of Drupal 9 with Gábor Hojtsy on Thursday, July 16, 2020 at 18:15 UTC. We'll dive into details not covered by the Driesnote. You'll learn how new features get into Drupal and how old APIs and libraries get updated in Drupal's release cycle. By the end of the session, you'll better understand what's involved with upgrading to Drupal 9. (And how it's probably not as bad as you might think!)

Joe will present Altering, extending, and enhancing Drupal also on Thursday at 21:15 UTC. There are various ways to extend Drupal without "hacking core" and in this session, you'll get a great overview of what those options are and how to decide which method to use. By the end of the session, you should have a more complete understanding of what the use-cases are for plugins, hooks, services, and events and how (at a high-level) they are implemented.

Osio Labs' (the company that makes Drupalize.Me) sister company Lullabot also has a strong group representing at DrupalCon Global. Check out Lullabots Speaking at DrupalCon Global 2020 to learn more.

Finally, you might be wondering how contribution will work at DrupalCon this year. Contribution groups are being organized at the virtual DrupalCon Global Contributions Room. Browse and join groups or create your own if you'd like to coordinate a sprint for your own Drupal community project. If you'd like to help out with Help Topics (programmers and writers/editors needed), join the Help Topics group. (I am co-maintainer of the core experimental module Help Topics.)

To learn more about the DrupalCon Global platform and attendee experience, we recommend the DrupalCon Global 2020: Attendee Experience Preview.

Register for DrupalCon Global 2020

Jul 07 2020
Jul 07

Low-code development replaces the traditional method of hard coding and allows us to create our own applications without any help from the IT developers. It requires minimal hand-coding and enables faster delivery of applications with the help of pre-packaged templates, drag and drop tools, and graphic design techniques.

From leading low-code development platforms like Mendix and Outsystems to Acquia Cohesion (suited for Drupal websites), the low-code approach has been making waves as a great option for easy application development.

A square divided in four with small dark blue circles

I am sure that after reading the above lines you are left confused: if low code is such an easy way out, then why does the title suggest that low code may not be the right code? Well, if anything looks too good to be true, it usually isn't that great. Let me tell you why!

Functionality-first and user needs later

Even though low code is a great help in making developers' lives easier, it unfortunately puts user experience at stake. A design-led or progressive approach becomes harder to achieve with low code, and putting functionality ahead of user needs never ends well.

Low code, as we know, saves time and is therefore said to be efficient. The truth is that it is efficient only with respect to time: applications built with low code are rarely optimized for performance. If you want your web app to run smoothly and fast, low code is not the go-to option for you.

No technical requirement: a myth

Low code is easy and can be used without involving the technical team: true.
Low code does not require any technical skill: false.

For anyone to start working with low code, an understanding of how low-code development works is the first and most basic requirement. It takes time to learn and understand the process. So, before starting to use the tools, it is important to ensure you have the basic technical skills required.

Limited functions 

In a low-code development tool, the number of functions you can implement is limited. It is definitely a quick way to build applications, but if you want to try something different, you do not have many options.

Also, once an app is created on low code, it is not very easy to add custom code or any other required functionality to it.

Does it help in cost-cutting?

When it comes to low code, the cost is both a draw and a drawback. 

Because of its flexibility, low code is easier to use and requires a small set of skills. So, you don’t have to specially hire someone and pay a hefty amount to do that.

Although it is easy to drag and drop building blocks that fulfil your requirements, once you need a special feature that is unavailable, you will need custom code. Merging the custom code can cost a lot more than a completely customized solution as a whole.

When a company starts out, it starts small, so it is advised to have a provision in its low-code contract for ramping up in the future. If not, the company may face a major setback before it is even able to get going properly.

Is it secure?

Low code has been giving rise to the question: Is it secure enough?

When you build an application using low code, it requires your complete trust. You don't have control over data security and privacy, and you have no access to the source code, which makes it difficult to identify potential vulnerabilities.

Using low code to produce code that does not adhere to established best practices could violate an organization's compliance measures, even if the resulting application is secure.

Vendor Lock-In Risks

Vendor lock-in is one of the major limitations of low code development.

For teams that use low code, vendor lock-in can mean poorly documented or even undocumented code that is difficult to maintain outside of the platform.

Hence, it is important to understand each vendor’s policies before licensing any tool and ensure that you know whether or not you are able to maintain applications outside of the platform.

Conclusion

Low code is indeed a useful tool, but it comes with cons you can't ignore. Low-code platforms will only tell you that it's faster and easier, but the lack of options and functions, security risks, and other major drawbacks make us rethink whether it is actually the solution we want for an enterprise application.

Jul 07 2020
Jul 07

Have you had enough of spam comments, form submissions and email submissions by bots trying to infiltrate your website? Then you need a guard called the Completely Automated Public Turing test to tell Computers and Humans Apart, or CAPTCHA for short. As annoying as it may be to have to prove time and again that we are not bots, CAPTCHA and reCAPTCHA are the most effective tools for fighting automated programs trying to get into our websites. The Captcha module and ReCaptcha module in Drupal 8 are extremely helpful in protecting your Drupal website against spambots and are widely used in user web forms and other regions of a web page where user input is required. Let's learn more about the modules and how to implement them on your Drupal 8 website.

tech-captcha1

What is Captcha and ReCaptcha?

When we try to log in to or register on a website, we are sometimes asked to identify and type distorted numbers and letters into a provided box. This is the CAPTCHA system. CAPTCHA helps verify whether a visitor to your site is an actual human or a bot. reCAPTCHA does the same in terms of protecting your website from spam, except that it makes things tougher for spambots and more user-friendly for humans.

The Captcha module in Drupal 8 is an easy-to-use module largely used in forms to identify whether the user is a human or a bot. The Captcha module is also compatible with Drupal 9. Let's get started with installing and using the Captcha module in Drupal 8.

Download and Enable the captcha module

Download the captcha module from here and enable it. To enable the module, go to Extend and in the spam control category, you will find the CAPTCHA option. Click on the checkbox and then click install.

 

tech-captcha2

 

Enable both Captcha and Image Captcha. Image captcha provides an image-based captcha.

Configure the Captcha module

After installing the module, we must configure the module as per our requirements.

To configure the module, go to Configuration > People > CAPTCHA module settings.

tech-captcha3

Select the Default challenge type. This type is used on all forms, though you can change it for an individual form. The module provides two built-in types: Math (from the CAPTCHA module itself) and Image (from the Image CAPTCHA submodule).

Examples of these challenge types are shown on the CAPTCHA examples tab on the same page.

To change the type for an individual form, go to the Form Settings tab on the same page. Here we can see the list of forms on the site. Click the enable button to add a CAPTCHA to a form.

tech-captcha4

To change the challenge type for a particular form, click on the down-arrow and click edit.

tech-captcha5

Enter the form ID for which you want to change the challenge type, and select the new type from the dropdown provided under Challenge type.

techcaptcha6

You should not need to change this unless the structure of the form changes.

Adding the description to the Captcha for the visitor

techcaptcha7

Click the checkbox to show the challenge description box; it is not visible by default. A default description is already provided, but it is editable and can display any message of your choice to the visitor.

Set validation and persistence

techcaptcha8

These settings control how CAPTCHA responses are validated. Here, we can make validation stricter by requiring case-sensitive answers, and we can also change the appearance of the challenges. The second option under persistence simplifies the process for the visitor by hiding the challenge once the visitor is logged in and has successfully completed a challenge.

Permissions

CAPTCHA behavior can also be controlled through permissions.

techcaptcha9

Only users with the Administer CAPTCHA settings permission can change the CAPTCHA settings. Those who have the Skip CAPTCHA permission are never given a challenge, so to test the CAPTCHA, the user should not have the Skip CAPTCHA permission. Administrators cannot test it, as they have this permission by default.

CAPTCHA works as required, but there are some drawbacks. It is not user-friendly for visitors with visual disabilities, and reading distorted numbers and letters can be annoying for regular users. This may end up preventing a legitimate user from entering the site.

The solution to this problem is the ReCaptcha module, which uses Google reCAPTCHA to improve the captcha system.

Download and Enable the ReCaptcha module

Download the ReCaptcha module from https://www.drupal.org/project/recaptcha and enable it.

tech-captcha10

Configure the module

After installing the module, go to Configuration > People > CAPTCHA Module Settings.

tech-captcha11

Select ReCaptcha in Default challenge type and click save configuration. After saving, go to ReCaptcha tab on the same page.

tech-captcha12

As the ReCaptcha module uses the Google reCAPTCHA service, a site key and a secret key are required. These keys are issued by Google once we register our site with Google reCAPTCHA. To register, click on the "register for reCAPTCHA" link.

tech-captcha13

Once we click on it, we will see this form. We have to provide some information, such as the domain name and the type of reCAPTCHA, and accept the Terms of Service before clicking submit. After submission, you will get the site key and the secret key; enter them in the reCAPTCHA tab.

Choose which forms you would like to use reCAPTCHA on, and then test them.

techcaptcha14

 

If you want to test it in a local environment, disable domain name validation in the reCAPTCHA configuration in Google.

 

Jul 06 2020
Jul 06
Kaleem Clarkson

It feels like a lifetime ago that the event organizers' request to become an official working group was approved by the Drupal Association at DrupalCon Amsterdam. Since then, 2020 has been a year that no one will forget. From a global virus to social justice demonstrations, the world as we know it has been forever changed.

So far in 2020, we have learned some valuable lessons that we think will help us be a better working group moving forward.

Organizing Events is Hard. Organizing volunteer-led events is difficult already, let alone during complete uncertainty. Many event organizers have had to make very difficult but swift decisions by either canceling or trying to pivot to a virtual conference format.

Finding the Right Time is Hard. Organizing a global group of volunteer event organizers is also hard. As someone who has spent little time on international teams, I admittedly thought finding a meeting time would be a breeze. I was completely wrong.

Global Representation is Hard. One of our top priorities was to have global representation to help foster growth and collaboration around the world, but unfortunately, whether due to the meeting times or to not enough marketing focused on international event organizers, participation was just not where the board felt it should be.

After a few emails and some friendly debates, the board looked for opportunities for change that can help solve some of the lessons we have learned.

Alternating Meeting Times in UTC Format. To help foster more international participation, all scheduled meetings will alternate times, all marketed and posted in Coordinated Universal Time (UTC). Public meetings will now be at 12:00 pm UTC and 12:00 am UTC.

Increase Board Membership to 9. The group decided to expand the board to nine members. We highly encourage organizers from around the world to submit their names to increase our global representation.

Maintain and Recruit Advisory Board Members. Succession planning is critical for any operation, and our advisory board allows for a more flexible commitment to participation; we hope it will be our number one resource for new board members down the road.

Board Member Nominations. In addition to expanding the number of board seats, Suzanne Dergacheva from DrupalNorth (Canada) and Matthew Saunders (DrupalCamp Colorado) have accepted nominations to move from advisors to board members.

  • Camilo Bravo (cambraca) — DrupalCamp Quito — Ecuador / Hungary
  • Baddý Sonja Breidert (baddysonja) — DrupalCamp Iceland, Germany, Europe, Splash Awards — Europe
  • Kaleem Clarkson (kclarkson) — DrupalCamp Atlanta — Atlanta, GA, USA
  • Suzanne Dergacheva (pixelite) — DrupalNorth — Montreal, QC, Canada
  • Leslie Glynn (leslieg) — Design 4 Drupal Boston, NEDCamp — Boston, MA, USA
  • Matthew Saunders (MatthewS) — DrupalCamp Colorado — Denver, CO, USA
  • Avi Schwab (froboy) — MidCamp, Midwest Open Source Alliance — Chicago, IL, USA

There are so many things that all of us organizers would like to get working, but one of our goals has been to identify our top priorities.

Event Organizer Support. We are here to help. When volunteer organizers need guidance navigating event challenges, there are various channels to get help.

Drupal Community Events Database. In collaboration with the Drupal Association, the EOWG has been working on putting together a new and improved event website database that will help market and collect valuable data for organizers around the world.
Submit your event today: https://www.drupal.org/community/events

Drupal Event Website Starter kit. To help organizers get events up and running quickly, an event website starter kit was identified as a valuable resource. Using the awesome work contributed by the Drupal Europe team, JD Leonard from DrupalNYC has taken the lead in updating the codebase. It is our hope more event organizers will help guide a collaborative effort and continue building an event starter kit that organizers can use.

Join the Event Organizer Slack here and Join #event-website-starterkit

The Drupal Event Organizers Working Group is seeking nominations for Board Members and Advisory Committee Members. Anyone involved in organizing an existing or future community event is welcome to nominate.

EOWG Board Members. We are currently looking for nominations to fill two (2) board seats. For these seats, we are looking for diverse candidates that are event organizers from outside of North America. Interested organizers are encouraged to nominate themselves.

EOWG Advisory Committee. We are looking for advisory committee members. The advisory committee is designed to allow individuals to participate who may not have a consistent availability to meet or who are interested in joining the board in the future.

Nomination Selection Process: All remaining seats/positions will be selected by a majority vote of the EOWG board of directors.

Submit Your Nomination: To submit your nomination please visit the Issue below and submit your name, event name, country, territory/state, and a short reason why you would like to participate.

Issue: https://www.drupal.org/project/event_organizers/issues/3152319

Nomination Deadline: Monday, July 6th, 11:59 pm UTC

Jul 06 2020
Jul 06

Every marketing professional knows that being at the top of search engine results is important and that SEO will help to get them there. You may have also heard that SEO can be challenging to learn, but the truth is, SEO doesn't have to be difficult, and sometimes it can be pretty fun!

These 5 basic SEO techniques are easy to implement and will have you on your way to the top of search engine results:

  1. Help the search engine bots understand your site
  2. Eliminate any technical issues harming your search ranking
  3. Match your content to your audience’s search intent
  4. Write quality content 
  5. Optimize the SEO of key pages on your site 

1. Help the search engine bots understand your site 

A well-built sitemap.xml and robots.txt file are the initial building blocks of SEO success. These two files tell Googlebot what to crawl and what not to crawl on your website.

Use a sitemap to help the robots understand your site structure

Make sure you have a sitemap and make sure that the sitemap has been submitted to the Google Search Console. A sitemap is an XML page listing all of the "included" sitemap URLs as well as their priority, last changed date, and change frequency. This sitemap will allow Google to easily crawl and index all of the pages on your website that you want to appear in Google search results. You're essentially providing Google with a cheat sheet of your website. Googlebot doesn't require a sitemap to crawl your website, but it does make it easier and more efficient to provide it with one.
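
For reference, a sitemap.xml file with a single entry looks something like this (the URL and values here are purely illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/seo-basics</loc>
    <lastmod>2020-07-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>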

Maybe you already have a sitemap in place. Navigate to yoururl.com/sitemap.xml to verify, and confirm it's up to date. If you have a CMS like Drupal or WordPress, a plugin is likely doing this for you. It's also possible that a sitemap was manually generated by someone at your organization and could now be out of date. You can generate a new one at www.xml-sitemaps.com if need be. 

Once you have verified that your sitemap exists and that it contains the right pages, navigate to the Google Search Console.  

A screenshot of the Google Search Console page, which shows an animated barista holding up a tablet behind a counter in a cafe. Google Search Console Homepage

If you have not signed up for an account with Google Search Console, do so now.

Once inside the Search Console click Sitemaps from the sidebar navigation menu.

Screenshot of the Index showing two options: Coverage and Sitemaps, with Sitemaps highlighted. Screenshot of sitemap location 

Enter your sitemap location into the add new sitemap URL bar and click submit. 

Screenshot of the process of adding a new sitemap URL.Screenshot of adding a new sitemap

If Google can find your sitemap then you should see an entry below that looks like this:

A screenshot showing a successful upload to Google Search Console. There are six columns labelled: Sitemap, type, submitted (date), last read (date), status, and number of discovered URLs. Screenshot of a successfully submitted sitemap 

The search console will provide you with great feedback regarding the general health and performance of your website as well. Subscribe to our newsletter to get notified when we publish a deeper-dive article on this topic. 

Exclude pages from Google using robots.txt

While your sitemap is designed to help Googlebot know what it should be crawling, your robots.txt file, on the other hand, tells Google what to ignore.

There may be portions of your site that you never intended a visitor to see, and others you’d prefer aren’t googleable.

Common pages to include in a robots.txt file are:

  • Login page (If you don't allow visitors to log in)
  • Administration pages
  • Intentional duplicate pages (Like print-friendly pages)
  • Thank you pages
  • Search result pages
  • Comments
  • Tag pages

There are a lot of additional pages and reasons to add pages to your robots.txt file so this will require some forethought.

So where is your robots.txt file? Typically it can be found in the root directory of your webserver. This is where all your website files are stored. It should look something like this:

User-agent: *
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /search*

The first line of this file, the User-agent: * line, says this is for ALL bots! This includes Googlebot, Googlebot image, Bingbot, Slurp (Yahoo), etc.

The following lines of this file simply say, "YOU ARE NOT ALLOWED TO GO HERE, HERE, AND HERE!"

Be careful what gets added to this file. As we said before, if it's disallowed in this file, the bots will not crawl that page.

If your robots.txt file contains something like the following:

User-agent: *
Disallow: /

Nothing on your site will be crawled. There will be no crawl, no index, no search result listings.

So be very careful that anything you include in this file is intentional. Otherwise, you may be hiding important pages from Google.

2. Eliminate any technical issues harming your search ranking

The next step in SEO basic health is to ensure that your website is free of any technical issues that may be preventing your site from being crawled.

Some common examples of technical SEO issues include, but are not limited to:

  • No Sitemap (detailed in the previous section)
  • Missing or incorrect robots.txt file (detailed in the previous section)
  • Slow page speeds
  • Duplicate pages/content
  • Missing alt tags
  • Broken links
  • Non-mobile-friendliness
  • Missing meta descriptions
  • Missing title tags
  • 404 pages

Some of these issues can be solved simply. If you are using a CMS like Drupal or WordPress, most offer plugins to edit or add meta descriptions and title tags directly through the page editor.
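
Whichever tool you use, the underlying markup those fields produce in your page's head looks something like this (the values here are purely illustrative):

<title>Computer Science Degrees | Example University</title>
<meta name="description" content="Explore accelerated computer science degree programs at Example University.">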

Duplicate pages and content can be compiled or deleted. Broken links can be corrected. And, 404 pages can either be redirected or added back in if they were accidentally deleted.

If you are not a technical person, other issues like site speed may require the involvement of your website developer to fix. Many of these errors will be reported to you within Google Search Console.

If you are looking for even deeper insight into the "crawlability" and overall SEO health of your website, tools like Ahrefs and Moz can offer an in-depth view of your SEO efforts.

We will be diving deeper into how to solve these common technical SEO issues in future articles.

Once your website or page is free and clear of technical SEO issues, it's time to move on to the next step.

3. Match your content to your audience’s search intent 

Every marketer wants their organization’s website to be at the top of their audience’s search results. To strategically achieve this, we recommend researching keywords, which can be a tricky endeavor. Keywords are the words and phrases in your content that make it possible for searchers to find your website when using search engines.

While ranking for high-level keywords with large search volumes can be great, they often give you little in the way of your audience's actual intent. 

Let's look at 2 examples:

Someone searches the following:

“Universities”
If you are an institution of higher education it’s likely that you want to rank for the “Universities” keyword. However, ranking for this particular keyword gives you very little information about the intent of the person searching Google.

Search intent is the ultimate goal of the person using a search engine. Their intent in this context may be to find answers, compare services, or purchase products.

What page on my site should rank for this keyword? What does the person searching Google plan to do with the information they find? Are they in the research phase or are they considering applying to a school?

It becomes very difficult for you as the site owner to determine what to do with people who come to your site via keyword searches such as this.

Let’s take a look at another example. A person searches Google for the following:

“Computer science universities near me with an accelerated program”
While the search volume (amount of searches performed each month) for this particular keyword may be significantly lower than the previous, the phrasing is very actionable.

Based on this search we know several things about the person:

  1. They are looking for local universities
  2. They are interested in computer science
  3. They are in need of an accelerated program

Search phrases like these, known as long-tail keywords, make it very easy to provide users searching Google with content that satisfies their needs.

If you offer such a program, we recommend creating a dedicated page about your adult-oriented accelerated computer science degree for users searching for this term to find exactly what they’re after.
If you don’t have an accelerated program, we recommend creating a page optimized (see number 5 below) for the same phrase that stresses the benefits of a four-year degree.

Long-tail keywords like these offer us greater insight into individuals' search intent. And, based on their intent we can determine what actions a user is likely to take in response to the information you present them, which in turn informs how you write this content.

Example Keyword Research for the term “Universities”

There are many great tools available to perform what is known as keyword research. There are paid tools like Ahrefs and Moz Pro as well as free tools such as Google Ads Keyword Planner. For this article, we will be utilizing Ubersuggest, a free tool offered by Neil Patel.

Screenshot of an SEO keyword of "universities" search on Neil Patel's SEO search tool. Example of keyword research for Universities.

We can see that the search volume for the keyword “Universities” is quite high and marketplace demand for it is fairly high. Running ads against this keyword would cost us more than $3 per click to our website.

We also see the sites that we would be competing against to rank for this particular keyword. The tool also offers us some other helpful keyword suggestions as well as the volume and SEO difficulty of each.

The top tab also gives us related keywords, questions, prepositions, and comparisons that oftentimes have a lower SEO difficulty, offering us alternative keywords and phrases that are being searched that may be easier to rank for. 

Now let’s look at our long-tail keyword.

Screenshot of an SEO keyword search of "Computer science universities near me with an accelerated program" on Neil Patel's SEO search tool. Example keyword research for Computer science universities near me with an accelerated program.

One very important detail to note here is that a 0 in the search volume does not necessarily mean this keyword does not receive any searches per month. It simply means that the number of times this phrase is searched per month is so low that it can’t really be measured accurately.

However, notice the SEO difficulty. There is almost no competition when attempting to rank for this keyword. Taking a look at the sites that do rank for it, we can see that one even has a page with over 700 social shares. This means that while this long-tail keyword is not searched often, the content that is found is very valuable to those who have searched it.

This high-level overview in keyword research demonstrates that the best course of action is one that involves both long-tail and short-tail keywords. While short-tail has the most reach, long-tail has more potential for action.

4. Write quality content 

Google, as it typically does, has made changes to the search algorithms that determine the order of search results. Two metrics now worth paying close attention to are:

  1. Time on page
  2. Bounce rate

Time on page is exactly what you might think – it is the amount of time a visitor spends on your webpage. Bounce rate is the percentage of visitors who come to your website, visit one single page, and leave. Google is equating the time a visitor spends on a given page with the overall value that page has to offer and thus, they will place that page higher in search results.

This means that in order to rank more effectively we recommend doing a couple of things:

  1. Write complete content
  2. Write valuable content

Complete content refers to the idea of leaving nothing behind. If utilizing our intent-based keyword research, we know what searchers are searching for, why would we only provide them with half a solution?

Valuable content refers to content that is useful for people. Unique, fresh, and complete content offers the most value.

Complete content does not mean that you have to give readers so much that your products and services are no longer necessary. But if we know they are searching for computer science degrees with an accelerated program, don’t leave out crucial information that would help them understand that your program is the right fit. Complete content simply satisfies the needs and intent of the searcher.

5. Optimize the SEO of key pages on your site 

On-page SEO refers to the process of optimizing a single page in order to rank higher and increase relevant traffic. Any on-page SEO effort will be a losing battle unless you are first delivering quality content (see step 4). But, once that’s complete the only place left to go is up (in your rankings).

Keyword Usage for Humans

In the past, many attempted to improve their search result rankings by engaging in a practice called keyword stuffing: hiding keywords somewhere on the page, not visible to site visitors, repeated hundreds of times in an attempt to rank higher. These practices no longer work and can actually have severe negative effects on your SEO efforts.

Still, if you want to rank for a particular keyword, it is important that you do use your keyword, and use it correctly. So where should you place it to get the most out of your efforts?

  • In the first 100-150 words of your page
  • In an H1 (heading tag or title) on your page 
  • In your subheading tags (H2, H3, etc)
  • In the URL for the page
  • In your meta description (this may or may not be used by search engines but it’s important to fill this in)
  • In image alt and title tags
  • And wherever it logically makes sense to use it in content

Google stresses that we write content for humans and not robots. Neil Patel has written a great article on the topic of writing for people while optimizing for robots. Forcing the use of keywords where they do not make sense will backfire and may actually harm your SEO.

Link to other sites

While linking to another website (outbound linking) may seem counterproductive when trying to optimize your own, the opposite is actually true. Linking to credible sources in your article is a great way to show the accuracy of your content in a sea of misinformation. For example, we recommend this great article by Winnie Wong for SEO Pressor describing how outbound linking increases relevance, improves reputation, boosts value, and encourages backlinks.

Note: If feasible, reach out to anyone you’ve quoted in your article and have linked out to with a note of appreciation and link to your article. If they decide to share your content, you could find yourself with a big SEO win!

Link to other pages on your site 

Internal links connect your page to other pages in your domain and are beneficial in a couple of different ways.

First, they lower the potential bounce rate of a site visitor, by offering more content for them to browse.

Second, linking from pages with a high SEO score to those with a lower one can help boost the latter's ranking.

To successfully link to a page, create a keyword-rich link. Do not simply paste the URL to the other page, you will be wasting a valuable SEO opportunity.

Ex: If you’re remotely interested and want to learn more, Lily wrote a post on how to Throw an Epic Remote Office Party

Encourage Content Sharing  

When someone shares a page on your website, it’s like they are giving your page an upvote. 
The more votes that a page has, the higher it can rank.

Adding social share buttons to your web pages can go a long way toward improving the shareability of your website. Also, make sure your metadata is properly set up so that when people share your content, it appears in an attractive way on social channels.
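
For example, the Open Graph tags that most social channels read when building a share preview look something like this (the values are purely illustrative):

<meta property="og:title" content="5 Basic SEO Techniques">
<meta property="og:description" content="Easy-to-implement SEO tips to improve your search ranking.">
<meta property="og:image" content="https://www.example.com/images/seo-share.png">
<meta property="og:url" content="https://www.example.com/blog/seo-basics">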

Conclusion 

This is a 5,000-foot view of SEO and is designed to get you started on the journey of optimizing your website and ranking better in search engines.

In subsequent articles we will do a deep dive on each of these topics and more, however, for now, these tips will go a long way in increasing your search visibility. We encourage you to put them into practice and obtain results! 

Want to dive deeper? Drop us a line.

Jul 06 2020
Jul 06

Technology is changing at the speed of light. Fuelled by the democratisation of innovation, the tempo of change and adoption keeps multiplying. Today, 5G is a major talking point in the industry, IoT is changing at scale, and data is becoming the centre of the IT universe, with digital twins, spatial computing, artificial intelligence, deep analytics and new applied versions of technology all depending on data platforms. Hyperloop is leveraging magnetic levitation and giant vacuum pumps to let bus-sized vehicles zip along at speeds approaching Mach 1. In a world where disruptive technologies are changing the way we see everything around us, what happens to existing technological solutions? Continuous innovation is the mantra that helps them sustain themselves in the long run and evolve with the changing technological landscape. This is exactly how Drupal, one of the leading open-source content management systems, has remained powerful after almost two decades of existence. Introducing Drupal 9!

Drupal 9 logo with Drupal TM written in the centre, drop-like icon and a bluish background


Since Dries Buytaert open-sourced the software behind Drop.org and released Drupal 1.0.0 on 15th January 2001, it has come a long way. It has weathered headwinds and grown rapidly, powering everything from small and medium businesses to large enterprises around the world. Supported by an open-source community made up of people from different parts of the globe, it has kept on getting better with time. Drupal 9, the new avatar of Drupal, with intuitive solutions for empowering business users, cutting-edge features that help you dive into new digital channels, and easy upgrades, is the future-ready CMS. Amidst all the changes happening in the digital landscape, Drupal is here to thrive! Websites and web applications built using Drupal 9 will be much more elegant!

The excitement in the air: Launch of Drupal 9

When Drupal 8 was released back in 2015, it was a totally different world, and the celebrations were in full swing. The Drupal 9 launch in 2020 wasn't a low-key affair either. In spite of the COVID-19 pandemic, the Drupal Community and the Drupal Association made sure that the virtual celebrations were right on top. The community built CelebrateDrupal.org as a central hub for virtual celebrations, enabling Drupal aficionados to share their excitement.

A tweet with drop like icon on top left, Drupal written beside it and a statement in the centre showing excitement over Drupal 9 launch


Ever since Drupal 9.0.0-beta1 was released, which included all the dependency updates, updated platform requirements, stable APIs, and the features that would ship with Drupal 9, excitement levels have been sky-high. The beta release marked Drupal 9 as API-complete. Eventually, on June 3, 2020, the world saw the simultaneous release of Drupal 9.0.0 and Drupal 8.9.0.



Drupal 8.9 is a long-term support version, the final minor version of Drupal 8, which will receive bug fixes and security coverage until November 2021 but no feature development. In contrast, Drupal 9 development and support will continue beyond 2021. Drupal 8.9 includes most of the changes that Drupal 9 does and retains the backwards compatibility layers added throughout Drupal 8's release cycle. The only difference lies in Drupal 9's updated dependencies and the removal of deprecated code.

Two building made up of bluish rectangles explaining Drupal 8.9 and Drupal 9Source: Drupal.org

If you have an existing Drupal site, updating to Drupal 8.9 is a good option. This ensures maximum compatibility and the fewest possible alterations required for the Drupal 9 update. If you are creating a new Drupal website, you can choose between Drupal 8.9 and Drupal 9; going for Drupal 9 is the most logical option, as it gives you forward compatibility with later releases.

Traversing the world of Drupal 9



First things first - with the onset of Drupal 9, a rebranding has taken place as well. The new Drupal brand represents the fluidity and modularity of Drupal in addition to the Drupal Community’s strong belief system of coming together to build the best of the web.

Different Drupal 9 logos with drop like icons and the word 'Drupal' written


If one asks what exactly is Drupal 9, all you can say is that it is not a reinvention of Drupal. It is a cleaned-up version of Drupal 8. So, what’s new in Drupal 9?

Drupal 9 has not only removed deprecated code but also updated third-party dependencies, ensuring longer security support for your website's building blocks and letting you leverage new capabilities.

Since the adoption of semantic versioning in Drupal 8, adding new features in minor releases of Drupal has been possible instead of waiting for major version releases. To keep the Drupal platform safe and up to date, Drupal 9 has revised some third-party dependencies:

  • Symfony: Drupal 9 uses Symfony 4.4, while Drupal 8 uses Symfony 3, and the update to Symfony 4 breaks backwards compatibility with Symfony 3. Since Symfony 3's end of life is not until November 2021, Drupal 8 users get enough time to strategise, plan and update to Drupal 9.
  • Twig: Drupal 9 will also move from Twig 1 to Twig 2.
  • Environment requirements: Drupal 9 will need at least PHP 7.3 for enhanced security and stability. If Drupal 9 is being run on Apache, it will require at least version 2.4.7.
  • Database backend: For all supported database backends within Drupal 9, database version requirements will be increased.
  • CKEditor: Soon, CKEditor 5 will be added in Drupal 9.x and CKEditor 4 will be deprecated for removal in Drupal 10.
  • jQuery and jQuery UI: While Drupal 9 still relies on jQuery, most of the jQuery UI components are removed from core.
  • PHPUnit: Drupal 9 requires PHPUnit 8.

Drupal 9 comes with the same structured content-based system that all the Drupalers love about it. Layout Builder in core enables you to reuse blocks and customise every part of the page. Built-in JSON:API support helps you develop progressively and fully decoupled applications. BigPipe in core ensures fantastic web performance and scalability. The built-in media library helps you manage reusable media. There is multilingual support as well. You get better keyboard navigation and accessibility. Its mobile-first UI will change your mobile experience forever. The integrated configuration management system can be used with development and staging environment support.

Therefore, other than those provided by the updated dependencies, Drupal 9.0 does not include new features. It has the same features as Drupal 8.9. Drupal 9.x releases will continue to see new backwards-compatible features being added every six months after Drupal 9.0.

Migration to Drupal 9

Infographics showing timeline of Drupal 9 with yellowish background and a flowchart at the centre


While Drupal 9 is definitely the way to go, one needs to know certain things before upgrading from Drupal 7 or Drupal 8 to Drupal 9. Drupal 7 and Drupal 8 are not completely lost yet. They are here to stay for a while.

Drupal 7, which was slated to reach end-of-life in November 2021, will now be getting community support till November 28, 2022. The decision comes after considering the impact of the Coronavirus outbreak and the fact that a large number of sites are still using Drupal 7 in 2020. Drupal 8, on the other hand, depends on Symfony 3, and since Symfony 3 will be end-of-life in November 2021, Drupal 8 community support will end on November 2, 2021.

Symfony 4 will be end-of-life in November 2023. With Drupal 9 using Symfony 4.4, it is bound to stop receiving support at the end of 2023. (There is no official confirmation of dates yet for Drupal 9's end of life.) As a result, Drupal 10 is planned for release in 2022, which means it will arrive before Drupal 9 becomes end-of-life.

bluish background, icon resembling tool at the centre and 'Essential Drupal 9 upgrade tools' written below it


To upgrade to Drupal 9, the know-how of upgrade tools is essential:

  • For migrating your content and site configuration, the core Migrate module suite is perfect. 
  • The Upgrade Status module will give you details on contributed project availability.
  • For Drupal 8 websites, the Upgrade Rector module automates updates of several common deprecated code patterns to the latest Drupal 9 compatible code.
  • For Drupal 7, the process of scanning and converting outdated code on your site can be handled by the Drupal Module Upgrader.
  • Using drupal-check and/or the Drupal 8 version of Upgrade Status in your development environment helps you check whether a Drupal 8 update is also compatible with Drupal 9. You can also use phpstan-drupal from the command line or as part of a continuous integration system to check for deprecations and bugs (see the example after this list).
  • You can use IDEs or code editors that understand ‘@deprecated’ annotations.
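
To illustrate, checking a custom module for Drupal 9 deprecations from the command line might look like this (the module path and the project-level installation shown here are assumptions; adjust them to your setup):

composer require mglaman/drupal-check --dev
vendor/bin/drupal-check web/modules/custom/my_module
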
Three drop like icons with numbers 7, 8 and 9 written inside each of them respectively and arrows connecting them


The best option is to upgrade directly from Drupal 7 to Drupal 9 as this ensures that your upgraded site has maximum expected life. When your site requires a functionality provided by modules that are available in Drupal 8 but not yet in a Drupal 9 compatible release, you can also migrate to Drupal 8 first (Drupal 8.8 or 8.9) and then eventually move to Drupal 9.

While updating from Drupal 8 to Drupal 9, it is important to ensure that the hosting environment matches the platform requirements of Drupal 9. You need to update to Drupal 8.8.x or 8.9.x, update all the contributed projects and make sure that they are Drupal 9 compatible. Also, you need to make the custom code Drupal 9 compatible. Once set, all you need to do is update the core codebase to Drupal 9 and run update.php.
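
For a Composer-based site, that final step might look roughly like the following (a minimal sketch assuming the common drupal/core-recommended setup; package names, version constraints and the use of Drush are assumptions to adapt to your own project):

composer require drupal/core-recommended:^9.0 --update-with-all-dependencies
drush updatedb
drush cache:rebuild

Here, drush updatedb is the command-line equivalent of running update.php.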

Future of Drupal 9

It’s very important to make Drupal more and more intuitive for all the users in the coming years. One of the foremost achievements of Drupal 9 is the streamlined upgrade experience. Upgrading from Drupal 8 to Drupal 9 is a lot easier than moving from Drupal 7 to 8. And, it will continue to be a smoother process when the time comes to migrate from Drupal 9 to Drupal 10.

Drupal 9 will continue to receive feature updates twice a year just like Drupal 8 did. For instance, the experimental Claro administration theme is being stabilised. The new Olivero frontend theme is already being developed and is being optimised for accessibility and tailored to frontend experiences. It is specifically being designed for marketers, site designers and content editors with a lot of emphasis on responsive design. Automated Updates Initiative, which began in Drupal 8, is also in the works.

There’s an awful lot of development going on behind-the-scenes. The upcoming releases of Drupal 9.x would definitely come packed with exciting new features. We are waiting!

Conclusion

Drupal is awesome because it’s always on the cutting edge. It has always been a CMS that provides extensibility, flexibility and freedom. Drupal’s foundation has always been in structured data which works really well in today’s demand for multichannel interactions. Having one of the biggest open source communities, it has the support of thousands and thousands of people adding more features to it, enhancing security and creating new extensions.

The large community of Drupal embraces change right away as the big developments happen. That is exactly why Drupal has been able to offer fantastic web experiences all these years. Drupal 9 is the result of its community’s commitment to enabling innovation and building something great.

Undoubtedly, Drupal 9 is the best and most modern version of Drupal yet. It marks another big milestone in the realm of web content management and digital experience. It’s time for you to start planning a migration path if you are still on Drupal 7 or Drupal 8. If you are starting out a new website project in Drupal, there shouldn’t be any ambiguities over choosing Drupal 9. Contact us at [email protected] to build the most innovative, creative and magnificent website ever using Drupal 9 or to migrate from Drupal 7 or 8 to Drupal 9.

Jul 06 2020
Jul 06

DrupalCon Global will rely on the Hopin platform to host the conference. This software, new to me and most people I’ve talked to, will be used to facilitate all conference activities.

Anyone can register for a free Hopin account. If you don’t have one, go get one today. From there, you can use your new account to join online events. Registered attendees will receive an invite to join the DrupalCon Global event on Hopin. 

Session Content

The interface will look something like this image (this is just a sample). The landing page lets you know you are at the right event, and lists dates and times of the schedule.

Look at the sidebar buttons. Like the in-person experience, there will be a main Stage area where you will go to watch the scheduled keynotes. There will be a schedule of Sessions. At the right time, users can click to “enter” a session, where they will then watch a presentation and have a chance to interact with the speakers.

All speakers will have a video feed of themselves and the ability to share a slideshow of their presentation. Multiple speakers can join if needed. As a user, you will watch anonymously, but have the ability to pose questions in an online chat. And speakers will have the ability, if desired, to let attendees verbally ask questions as well, through video chat. 

Sponsors

Replacing the exhibit hall will be the new Expo section. All sponsors will have virtual booths available that, like sessions, attendees can visit and explore. At select times of the conference, booths will be “live,” where sponsors have the option to be on video chat, conducting presentations or demos. If you’ve been to a DrupalCon in-person, this may seem familiar.

As with sessions, visitors are anonymous until they decide to initiate a conversation. At that point, you could have an online chat with a vendor, or even switch over to a private channel for a video call. It seems to be a comfortable, non-intrusive way to have contact online.

Networking

Speaking of contact, one of the more interesting features of the conference site is the “Networking” tool. Meeting online means fewer opportunities to randomly run into someone and strike up a conversation. Networking is a way to try and duplicate this experience.

If you want to give it a try (and it’s totally optional), just click Networking. The site will then attempt to pair you with another attendee, completely randomized. You will then be in a video conference call with another person, for however long you like (or maybe it’s 5 minutes max?). Introduce yourself, do the usual “what do you do” and “where are you from” questions, and see if you have anything in common. Is it less scary than doing it in person, or more? Hard to say, but I’ll give it a try.

Jul 05 2020
Jul 05

Migrate in core is among my favorite parts of Drupal 8 and 9. The framework is super flexible, and it makes migrating content from any source you can dream up pretty straight forward. Today I want to show a trick that I use when I receive a csv (or Excel file) from clients, where they want all of the contents in it migrated to Drupal. One very simple example would be a list of categories.

Typically the file will come with one term on each line. However, migrate would want us to set an ID for all of the terms, which currently none of the rows have. One solution to this is to place an ID on all of the rows manually with some sort of spreadsheet software, and then point our migration to the new column for its IDs. But since that involves both the words "manual" and "spreadsheet software", it immediately makes me want to find another solution. Is there a way we can set the row ID programmatically based on the row number instead? Why, yes, there is!

So, here is a trick I use to set the ID from the line number:

The migration configuration looks something like this:

id: my_module_categories_csv
label: My module categories
migration_group: my_module
source:
  # We will use a custom source plugin, so we can set the 
  # ID from there.
  plugin: my_module_categories_csv
  track_changes: TRUE
  header_row_count: 1
  keys:
    - id
  delimiter: ';'
  # ... And the rest of the file 

As stated in the yaml file, we will use a custom source plugin for this. Let's say we have a custom module called "my_module". Inside that module, we create a file called CategoriesCsv.php at src/Plugin/migrate/source/CategoriesCsv.php. And in that file we put something like this:

<?php

namespace Drupal\my_module\Plugin\migrate\source;

use Drupal\migrate\Plugin\MigrationInterface;
use Drupal\migrate\Row;
use Drupal\migrate_source_csv\Plugin\migrate\source\CSV;

/**
 * Source plugin for Categories in csv.
 *
 * @MigrateSource(
 *   id = "my_module_categories_csv"
 * )
 */
class CategoriesCsv extends CSV {

  /**
   * {@inheritdoc}
   */
  public function prepareRow(Row $row) {
    // Delta is here the row number.
    $delta = $this->file->key();
    $row->setSourceProperty('id', $delta);
    return parent::prepareRow($row);
  }

}
   

In the code above, we set the source property id to the delta (the row number) of the row. This means you can have a source like this:

Name
Category1
Category2
Category3

Instead of this

id;Name
1;Category1
2;Category2
3;Category3
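
For completeness, the part of the migration configuration elided above ("the rest of the file") could map the Name column onto taxonomy terms along these lines (a minimal sketch; the categories vocabulary machine name is an assumption):

process:
  name: Name
  vid:
    plugin: default_value
    default_value: categories
destination:
  plugin: entity:taxonomy_term

With that in place, the migration can be run as usual, for example with drush migrate:import my_module_categories_csv (using the Migrate Tools Drush commands).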

The best part of this is that when your client changes their mind, you can just update the file instead of editing it before updating it. And by editing, I mean "manually" and with "spreadsheet software". Yuck.

To finish this post, here is an animated gif called "spreadsheet software yuck"

Jul 05 2020
Jul 05
Tool/Template

Your comprehensive Drupal website auditing tool is here!

Vardot released the beta-version of DrupalAudit to aid Drupal website owners and developers just before the release of Drupal 9 early June 2020.

Frequent performance audits are essential to guarantee that you benchmark your desired KPIs and website efficiency in alignment with the objectives of your digital strategy.

The tool will assess your Drupal website across the following key areas:

  • Performance
  • SEO
  • Accessibility
  • Best Practices

To use DrupalAudit, follow these 3 simple steps:

  1. Enter your Drupal site’s URL and test the performance across all site aspects.
  2. Find out how your Drupal site is performing in the areas you care about.
  3. Automatically generated tips will guide you on what to do next.

Audit Your Drupal Website Performance
Jul 05 2020
Jul 05

Many countries in the world prioritize the travel and tourism sector due to its extensive impact on stimulating other economic sectors related to the tourism value chain. Such related sectors include transportation, healthcare, financial services, retail, hospitality, shipping, car rentals, restaurants, food and beverage suppliers, and much more.

All the aforementioned sectors and businesses are directly dependent on attracting inbound tourism to their destinations. Attracting inbound tourism and destination marketing is the key and sole objective of a travel and tourism website.

1. Connect Stories With Travelers

Your destination or region will undoubtedly have many experiences to offer travelers. These attractions and activities are probably being actively promoted via images, videos, articles, downloadable ebooks, guides, digital maps, VR, and lists.

A common challenge for travel and tourism websites is that they cannot sustain a seamless and hassle-free experience for their visitors due to the nature of their content-heavy marketing effort.

Their sites become increasingly unstable due to outdated technologies and an increasingly difficult content management system (CMS), which often results in content managers breaking pages. Page load times become unacceptably slow for such heavily image-based sites.

Image management also becomes increasingly difficult without categorization facilities as the media library grows.

Drupal 9 is developed with user experience at the core of the technology. Building your travel and tourism digital experiences on an enterprise-level multilingual CMS ensures that your marketing efforts and immersive multimedia promotional content are enjoyed by your target audience in a seamless manner, regardless of how content-heavy your platform may be.

Learn Why Varbase CMS Is the Best Multilingual Enterprise-Grade Drupal Website Builder

 

2. Be Responsive and Flexible

The global outbreak of Coronavirus (COVID-19) caused many governments to issue restrictions on travelers. The Hashemite Kingdom of Jordan was one of the countries that issued a strict lockdown to mitigate the threat of COVID-19 spreading.

With travel and tourism being a main strategic sector for its economy, the Jordan Tourism Board (JTB) actively promoted local travel and tourism within the kingdom's borders as soon as restrictions on travel were relaxed. The JTB also added an "Immersive Experiences" domain to make its various major attractions accessible to everyone via VR.

Being responsive is more than just being available from any device; it's also about having the technology that provides you with the flexibility needed to adapt to factors out of your control. The time to upgrade your website performance to a superior CMS is now - as more people contemplate traveling as soon as the travel ban is lifted.

As uncertainty appears to linger in the air regarding how soon we can freely travel across borders, VR can become a popular alternative for people to enjoy particular experiences that your destination may offer. However, this will require you to assess your IT infrastructure's capacity to accommodate such solutions.

Building your destination's digital ecosystem and interconnected web of channels on Drupal 9 will enhance your capacity for ongoing growth and the ability to scale your digital experience when necessary to keep up with the trends and UX best practices.

3. Stimulate Trips To Emerging Travel Attractions

Create the ultimate network of digital experiences via Drupal's multisite functionality.

Your destination marketing team can create, update, publish, and promote content for each and every attraction's website or domain from one centralized CMS. Each attraction's digital presence can be managed and promoted to its target audience from a centralized content publishing process that enjoys a seamless workflow based on tiered permissions for content creators, editors, and marketers.

What does that mean? 

  • No more duplicate content issues.
  • Leaner, more efficient marketing teams.
  • No more time wasted by the marketing team(s).
  • Superior content development and marketing results.

Your marketing and technical teams need not duplicate the effort of recreating or re-uploading content to each relevant website or domain under your destination brand's name.

With multisite functionality, your websites can enjoy:
 
  • Seamless content publishing and management workflow process
  • Tier-based content moderation permissions for users
  • Best-in-class cybersecurity - continuously upgraded and updated
  • Mobile-responsiveness
  • Branding (if required) and UI/UX design systems
  • Ability to create dynamic, flexible layouts with Drupal's Layout Builder

But the multisite architecture solution can be used for related domains as well.

For example, many official travel and tourism websites such as VisitFlorida.org have audience websites that can all share best-in-class security, content publishing workflows, UX features, mobile-responsive layouts, all controlled and moderated by one centralized CMS and admin team.
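
To make this concrete, here is a minimal sketch of how a Drupal multisite codebase maps several destination domains to separate site directories in sites/sites.php (the domain and directory names below are hypothetical):

// sites/sites.php - one shared codebase serving several sites.
// Each hypothetical domain points to its own directory under sites/,
// which holds that site's settings.php and files.
$sites['attractions.example-destination.com'] = 'attractions';
$sites['museums.example-destination.com'] = 'museums';
$sites['events.example-destination.com'] = 'events';

Each site keeps its own configuration and content database while sharing Drupal core, modules, and themes, which is what makes the centralized publishing workflow described above possible.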

4. Focus On The Digital Traveler Needs

The goal of any tourism site is to inspire visitors to travel and explore a particular region - from activities to geography - through relevant content across their existing website.

For the majority of travelers, the travel and tourism website they visit will be the first major impression of the experience they expect to have at the destination being promoted on that website.

Connecting stories relies on finding the right audience wherever they are and whenever they are searching for what your destination can offer them. This means you need a rich content-heavy digital experience that constantly evolves into a deeply personalized user experience for each user and website visitor.

You must ask yourself key questions like:

  • Can any traveler easily access our website from any device or smart assistant?
  • Can any traveler easily understand what we want them to understand on our website?
  • Can any traveler communicate with us easily via our website or connected apps?
  • Can any traveler easily navigate our website and/or destination?
  • Can any traveler communicate without barriers while visiting our destination's attractions?

Drupal 9 will allow you to connect those content-rich experiences and stories with travelers anywhere, anytime, and across all devices thanks to the endless capability to integrate with all necessary tools, solutions, and technologies to sustain an up-to-date digital ecosystem.

 

5. Personalize User Experiences

One simple method to achieve this is to enhance your communication via personalization.

Utilize the power of automation to deliver an exceptional experience to specific audience segments.

For example; gathering insight and data into how each user and visitor interacts with your digital platform informs your marketing team on how to optimize each landing page for higher conversion and engagement rates.

Utilizing essential integrations such as marketing automation will enable your marketing team to send dynamic and personalized marketing messages in the form of tweets, text messages, notifications, and emails to each website visitor based on their behavior on your website.

This insight into your website traffic will allow for a clear idea on how to optimize your landing pages for richer user journeys, higher engagement, and conversion rates.

The potential and opportunities are infinite with insight-driven marketing automation; ultimately you will be able to create advocates and champions of your destination by ensuring that travelers and tourists enjoy their travels from A to Z.

Winning Micro-moments: Mrs. Smith is on a 2 day business trip and she wanted to visit the main attractions close to her accommodation within the limited free time she can enjoy. Mrs. Smith can be a member of an audience named "One night only business trip" and can receive tips regarding which attractions she must visit within the city.

Moreover, she can be provided with quicker access to attractions and jump the queue via an online ticket code specific to members of this audience segment.

6. Hassle-Free Digital Experiences

Drupal 9 powers some of the most advanced travel and tourism websites, continuing to deliver better on-page speed performance and eliminating the frustration that website visitors can experience on any device.

Travelers seeking highly specific search results for their holidays or business trips can find results more quickly on Drupal-powered websites and digital platforms thanks to Drupal's on-page SEO strengths and integration with advanced search tools such as Apache Solr.

Your marketers' published content has a much higher chance of ranking well on search engines thanks to SEO tools built into the CMS that allow your content editors to publish marketing content optimized for better SEO performance across all devices, regardless of the format of the content.

7. Incremental Growth and Critical Integrations

Are you set up for continuous implementation and delivery of the essential managed support services that keep your site performing at optimal levels? If not, you are already behind in sustaining standards of quality.

Travelers are always looking for the most convenient value-based offer they can find. Their needs are evolving, as is their perception of what an ideal UX should be when interacting with brands online. Enterprises have realized that they must transform into a breathing digital organism that consists of an interconnected network of relevant touchpoints and channels that their target audiences engage in.

Creating an evolving digital experience that constantly delivers on those evolving needs will require a reliable IT support partner that ensures that your website is constantly supported with essential UX related features such as upgraded security and search engine modules.

Is your destination's first impression an outdated website, or rather an immersive digital experience made for the digital world? Is your travel and tourism website integrated with the essential 3rd party apps that should make up the interconnected web of touchpoints through which your target audience can and will interact with you?

In fact;

  • Do you have a booking management system where travelers can make their entire accommodation plans quickly?
  • Do you have a secure payment gateway that accepts all currencies and payment methodologies?
  • Do you have an integration with essential marketing automation solutions that personalize your marketing messages for each website visitor based on their behavior and history of actions on your website?
  • Do you have an integration with automated chatbots that are increasingly being relied upon for FAQs?
  • Does your chatbot speak multiple languages to cater to a globally diverse audience of travelers and tourists?
  • Can your marketing team easily build new content-rich and dynamic landing pages when required?

Through our decade-long experience building digital experiences for global enterprises, we have found that they value reliable hosting, support, and maintenance above all other criteria when selecting a vendor.

Support and maintenance by a dedicated and proven IT partner will be key to your long-term digital transformation plans and growth strategy.

Digital transformation is a long-term process that demands clarity and commitment.

8. Investing In Digital Tourism

Disruption will continue to occur, and expectations of your target audience will continue to shift.

You must align your digital and growth strategy with the technical requirements needed to ensure your ongoing sustained success. Investing in the right technology means that you will never need an expensive rebuild every time a critical update/upgrade is required.

Keep an eye on the future. Drupal's ability to integrate easily with any technology, solution, tool, or required 3rd party app makes it the ideal enterprise-level technology and CMS for digital experiences focused on personalized UX.

If your vision is ambitious and clear, you cannot build it on an inferior IT infrastructure that will only deliver short-term wins and long-term losses. Protect your future by making smart decisions today.

If you wish to consult with our travel and tourism IT solutions development experts, you can book a free consultation to discuss your opportunities and potential to drive more tourists to your destination.

Questions About Drupal 9?

Jul 04 2020
Jul 04

If you write custom Drupal 8 (or 9) modules, then you've probably used the entity QueryInterface - which is accessible from the Drupal::entityQuery() static method. If not, then you're missing out on this incredibly useful tool that allows you to easily query a Drupal site for entities of any kind. 

For example, if you are looking for all nodes of type Movie that have a Release year of 2009, you can do something like this:

$result = \Drupal::entityQuery('node')
  ->condition('type', 'movie')
  ->condition('field_release_year', '2009')
  ->execute();

But what if you want to base the condition on a value of a referenced entity? Maybe you want to find all nodes of type Movie where the director of the movie was born in 1981? Assume nodes of type Movie have an entity reference field to nodes of type Director, where one of the fields on the Director content type was Birth year. It's almost as easy to write an entityQuery condition for this situation as well:

$result = \Drupal::entityQuery('node')
  ->condition('type', 'movie')
  ->condition('field_director.entity:node.field_birth_year', '1981')
  ->execute();

Note the .entity:node bit in the second condition - this is what allows you to access fields on the referenced entity.
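
Conditions can also chain through more than one reference. As a hedged sketch (assuming the Director content type also has a taxonomy reference field named field_country), you could filter movies by the director's country like this:

$result = \Drupal::entityQuery('node')
  ->condition('type', 'movie')
  ->condition('field_director.entity:node.field_country.entity:taxonomy_term.name', 'France')
  ->execute();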

Jul 03 2020
Jul 03

New versions of tech products make them safer, more efficient, more feature-rich, and more in line with the latest trends. This is especially impactful in the context of websites. Indeed, sites are complex mechanisms responsible for expanding our reach and bringing us new customers — so we really need them to work at their full capacity.

If your site is on Drupal, what is the latest version it can have? The newest one, fresh from the oven, is Drupal 9, which was released this June. Many businesses started planning for Drupal 9 a long time ago, many are just beginning, and what about you?

Today, we will be discussing why you should upgrade to Drupal 9. A special focus will be on Drupal 7 websites — what’s important for their owners to know.

The current state of Drupal 7: how is it going?

Today, D7 is used by 730,000+ websites across the globe (is yours one of them?). The version is not new, but it is a stable release that the Drupal team will officially support until November 2022.

This support was planned to stop in 2021, but because of the impact of the pandemic, the Drupal creators made a kind gesture and extended the support by one year.

What will the end of support mean?

  • Security considerations. Keeping your website’s and customers’ data safe is vital for any business. One of the golden safety rules is to regularly apply security updates in Drupal. After the end of support, no security updates will be released. This means more vulnerabilities to cyber attacks.
  • Smooth operation in question. No official support also means no bug fixes or improvements. As a result, something on your website may stop working properly, which pushes away potential customers.
  • No further development. After the end of support, you cannot expect any nice features created for your Drupal version.

However, despite the fact that Drupal 7 is still supported, we should note that the developers’ focus has significantly shifted towards D8/D9. This makes Drupal 7 lag behind without getting cool features for the core or contributed modules.

In addition, many developers of D7 contributed modules have little motivation left to support them, so websites using them may become unstable.

When it comes to security, D7 sites today often run on outdated and insecure versions of PHP (sometimes even as old as PHP 5.3).

Why upgrade from Drupal 7 to Drupal 9

After D7, Drupal was almost rewritten from scratch in line with new technological trends. Drupal 9 is very close to Drupal 8 — in fact, D9.0 is almost identical to the latest minor D8 version, but cleaned of deprecated code, equipped with the latest libraries, and headed for long-term growth. D8 sites upgrade to Drupal 9 in the snap of a finger by meeting just a few requirements.

All this means that if you upgrade from Drupal 7 to Drupal 9, you get all the Drupal 8 and Drupal 9 benefits in one. Let’s take a look:

Easy upgrades ever after

Long upgrades are a thing of the past. All you need now is one big move forward from the 7th version — and all future upgrades will be very quick. This is thanks to D9's backward compatibility with Drupal 8. It's the Drupal team's priority to make upgrades easy forever.

Better security

One of the benefits of Drupal 9's long-term official support is security releases that help protect your website from various vulnerabilities. D9 is also free of outdated code and uses a newer version of PHP, which is important in terms of security.

Faster

Drupal 9 has the latest releases of third-party libraries and components inside (e.g. Symfony, Twig, etc.) that improve your website’s performance. In addition, the techniques to improve website speed in Drupal 8 make a great difference. New architecture also allows you to create superfast experiences by integrating JavaScript frameworks.

Accessible to all audiences

The 9th version continues the accessibility-oriented path started by the 8th version. It has a focus on being accessible to audiences with disabilities. Drupal 9's new front-end theme Olivero alone is an accessibility masterpiece.

Drupal 9’s new front-end theme Olivero

More open to integration

Drupal 9’s strategic priority is to keep getting more and more ready for integrations with new devices and applications. For example, just imagine your content appearing both on your website and in your mobile app.

Smooth editorial experiences

D8 started, and D9 will continue, the focus on making content editors love their admin dashboard. It's hard to even compare it to the old D7 one. This is thanks to the Media Library, CKEditor for post creation, a configurable toolbar, the quick edit feature, Layout Builder, content moderation options, the modern Claro admin theme, and much more.

Media Library in Drupal 8 and 9

Multilingual

Your Drupal 7 website cannot even imagine the multilingual capabilities that Drupal 8 has out of the box. Adding languages is now a breeze, and most of the interface is already translated.

Mobile-first

Another great tradition started in D8 that will continue through 9 is a mobile-first approach. It ensures your website works seamlessly across all devices.

Upgrade to Drupal 9 with us

The most recommended way to upgrade to Drupal 9 from Drupal 7 is to upgrade to Drupal 8 first. We will upgrade you to the latest version of D8 and make sure your code is clean, without outdated elements. Fulfilling these requirements will allow us to instantly transfer your site to D9.

Our team of experienced Drupal development and support specialists will perform your Drupal upgrade fast and at affordable prices. Contact us and let’s begin with a free quote!

Jul 02 2020
Jul 02

Drupal 9 was just released last month, and in less than two weeks we get together to celebrate it (again), learn, grow and plan together for the future at DrupalCon Global.

I have presented my "State of Drupal 9" talk at various events for over a year now, and while the questions were originally about how the transition would work, lately they are more about what else we can expect from Drupal 9 and then Drupal 10. This is a testament to the continuous upgrade path we introduced all the way back in 2017. Now that Drupal 9.0 is out, we can continue to fill the gaps and add new exciting capabilities to Drupal core.

DrupalCon Global will have various exciting events and opportunities to learn about and help shape the future of Drupal 9 and even Drupal 10. Tickets are $249 and get you access to all session content, summits and BoF discussions. As usual, contributions do not require a ticket and will happen all week as well, including a dedicated contribution day on Friday. Here is a sampling of the content discussing, planning and even building the future of Drupal.

Sessions about the future of Drupal

First there is the Driesnote of course. Dries will share the result of the Drupal 2020 Product Survey and discuss plans for Drupal 10. There is a followup Q&A session to discuss the keynote and other topics with Dries live.

The Drupal Initiatives Plenary coordinated by yours truly is going to feature various important leaders in our community working on diversity and inclusion, accessibility, events, mentoring, promotion, as well as core components like the Claro admin theme and the Olivero frontend theme. This is the best way to get an overview of how Drupal's teams work and what their plans and challenges are. Even better, the plenary session is followed by a BoF where we can continue the discussion in a more interactive form.

In Drupal Core markup in continuous upgrade path Lauri Eskola will dive into why the deprecation process used for PHP and JavaScript code is not workable for HTML and CSS. This informs the direction of where markup is going in Drupal 9 and 10 onwards.

In the Drupal.org Panel the Drupal Association team discusses how key initiatives are supported on Drupal.org including Composer, Automatic Updates and even Merge Requests for Drupal contribution and plans for the future.

Mike Baynton and David Strauss will discuss Automatic updates in action and in depth showing what is possible now and what are the future plans.

There is not one but two sessions about the new proposed frontend theme. In The Olivero theme: Turning a wild idea into a core initiative, Mike Herchel and Putra Bonaccorsi discuss the whole history and future plans, while Designing for chaos: The design process behind Olivero covers the design specifically.

Moshe Weitzman leads a core conversation to take stock of the current command line tools for Drupal and discuss what a more complete core solution would look like in A robust command line tool for all Drupal sites.

In Let’s Make Drupal Core Less Complicated Ted Bowman will propose ways to simplify Drupal core for existing uses and to achieve an easier learning curve.

Finally Drupal 9: New Initiatives for Drupal offers a chance to discuss new initiatives proposed by Dries in the Driesnote. If you are interested in joining in or discussing the plans, this is your opportunity!

Birds of a Feather discussions about the future of Drupal

Attendees with tickets for DrupalCon Global will be able to participate in live discussions about key topics. BoF submission is open, so this list will likely grow as time goes on.

Ofer Shaal leads a discussion titled Standardize Rector rules as part of Drupal core deprecations to make sure the transition from Drupal 9 to 10 will be even easier than Drupal 8 to 9 is.

Submit your Birds of a Feather discussion now.

Contribute to the future of Drupal

Just like in-person DrupalCons, DrupalCon Global contribution will be free to attend and does not require a ticket. The contribution spaces are especially good to go to if you are interested in the future of Drupal and making a difference.

If you've been to a DrupalCon or a DrupalCamp before, a contribution event usually involves one or more rooms with tables that have signage on them for what they are working on. This is not exactly possible online, however, we devised a system to replicate tables as groups at https://contrib2020.getopensocial.net/all-groups which allows you to see what topics will be covered and who the leads are. (Huge props to Rachel Lawson at the Drupal Association for building this out!)

If your topic is not yet there, you should create a group now. Groups indicate what they are working on and what skills they need from contributors. You should join the groups you are interested in helping and read their information for guidance. Teams will post group events to let you know when certain activities (introductions, review sessions, co-working on specific problems or meetings to discuss issues) will happen. Events will also be used to signify when you are most likely to find people working on the topics. The OpenSocial site is a directory of topics and events; contribution itself will happen on drupal.org, with discussion on Drupal Slack for most groups.

There are already groups for Configuration Management 2.0, the Olivero theme, the Bug Smash initiative and Media. Stay tuned for more appearing as the event comes closer.

Jul 02 2020
Jul 02

Drupal 9 was launched on June 3, 2020. Given this, enterprises will need to upgrade to it sooner or later to acquire complete functionality and retain the ability to receive security updates within the twice-yearly release cycles.

In the past, migrating from one major version to another has been similar to moving from another CMS to Drupal, demanding considerable time and effort.

However, the upgrade from D7/8 to D9 is much easier and relatively painless. Let’s dive into more details and understand why moving to Drupal 9 would be the better choice.

Why Should You Upgrade?

With the end of life approaching for Drupal 7 and 8 soon, operating the website on them securely and with complete functionality won’t be a feasible option.

At the same time, it might be overwhelming for Drupal 7/8 site owners to learn that their website will need the upgrade, especially when their site is running absolutely fine, which often results in confusion.

 

Here are 3 reasons why you should consider upgrading your site to Drupal 9:

 

  1. The Drupal security team will soon no longer provide support or security advisories, weakening your website’s and its users’ cybersecurity
  2. D7 and 8 releases on all project pages will be flagged as ‘not supported’. D7/8 may be flagged as insecure in 3rd party scans, making integration with other third-party tools and systems challenging
  3. Leading hosting service providers like Acquia and Pantheon will also soon withdraw their support for D7, leaving you with few options but to assume responsibility for maintaining your application and server-level configurations

The good news for Drupal 7/8 site owners is that even when official support ends in November 2022, the remaining Drupal 7/8 sites won't stop working at that point.

Should an Existing Drupal 7 Site Be Upgraded to Drupal 8 or 9?

One of the major reasons that more than seven hundred thousand Drupal 7 sites still haven’t migrated to Drupal 8 is the known challenges in the migration process. And with the majority of people on Drupal 7, it is quite likely that most of them did not want to upgrade their CMS twice in the span of one year.

A safe bet seems to be migrating from Drupal 7 to Drupal 9. But will the site be secure? Let’s get to know a few facts.

Since D8 and D9 are similar except for the removal of deprecated code and the third-party updates in D9, it is feasible for enterprises to migrate straight to D9 instead of D8 - saving them from going through the same process twice and investing time, money, and effort unnecessarily.

What’s New in Drupal 9?

There are numerous capabilities added in Drupal 9, which will be consistently updated twice a year to help enterprises stay up to date.

Once you upgrade your system to D9, you won’t need to make major changes the next time you plan to update it to a newer version.

Here are some of the new capabilities added in D9:

  1. Backward compatible

    Drupal 9 is backward compatible, i.e., it is compatible with its predecessor, Drupal 8. That means D9 is able to use modules, configurations, and data created on D8, unlike the case between D7 and D8.
    Additionally, preserving this functionality won’t burden Drupal with historical baggage, so the performance of the system remains unaffected. The Drupal community has focused on breaking code, not data.
    This way, Drupal remains fast, clutter-free, and yet an up-to-date technology.

  2. Faster and Better Performance

    Drupal 9 has taken things further by extending its support for responsive images, wherein mobile devices display the best-sized images and hence consume less data.
    In a recent webinar, Dries mentioned that the versions from Drupal 9.1 onwards will see new innovation and pave the way for faster and better website performance. The Drupal 9.1 update is due just six months after the release of Drupal 9. Meanwhile, here are some of the features of D9 that you can leverage for efficient workflows:

        A.  BigPipe increasing page view performance and supporting faster initial page loading

        B.  Content Workflow allowing you to define multiple workflows

        C.  Multilingual capabilities

        D.  Structured content - Drupal 9 comes with an array of available fields, including phone, email, date, and time.

  3. Cleaner code base

    Drupal 9 has removed support for code deprecated in D8. This ensures that code marked as deprecated will no longer be supported or used in the Drupal ecosystem.
    The motive behind this is to make D9 a cleaner version, so whenever D8 modules want to become compatible with D9, they first need to eliminate the deprecated code.
    Thus, the end result is clear: more nimble code and improved website performance.

  4. Newer Major Versions of Symfony and Twig

    Symfony 3 will be replaced with Symfony 4 or 5 after November 2021. Also, the Drupal community can introduce an upgrade to Twig 2.0. These upgrades will result in enhanced performance, an improved developer experience, and enhanced security.

  5. Panelizer will be removed and replaced 

    What’s new in Drupal 9? Well, Panelizer will be replaced with Layout Builder, the “star” module of the moment.

  6. Headless CMS

    Drupal 8 and 9 both come with an API-first approach. Dries also mentioned in the webinar that the Drupal community is vigorously capitalizing on headless CMS capabilities to enhance the user experience by powering the website's front end with JavaScript frameworks like React or Angular.

The essential features of Drupal Headless CMS are-

  • Front-End Freedom
  • Create Once, Publish Anywhere
  • API-First Approach
  • Easier Resourcing

Drupal 9 is more usable, accessible, inclusive, flexible and scalable than previous versions, with the following updated features-

  • It will be significantly easier for marketers to use D9
  • Simpler than ever for developers to maintain and upgrade
  • D9 is experimenting with its headless or decoupled capabilities

Additionally, you can also learn from our previous blog where we have explained how to find and fix the deprecated code - Site Owner’s Guide to a Smooth Drupal 9 Upgrade Experience.

Why Remove Deprecated Code in Drupal 9?

To ensure that D8 modules remain compatible with D9, it is essential to remove deprecated code:

  1. The all-new Drupal 9-ready code gets deployed on Drupal 8 sites, where issues can be tested.
  2. It is a continuation of the fully-tested and stable codebase of Drupal 8

Over time, effort is being made to make Drupal better. There are functions that have been around for a long time but are not a good fit in the latest release. Most were deprecated in Drupal 8.7.0 and have been removed in Drupal 9.

To sum it all up, the key to achieving a smooth transition to Drupal 9 is to roll out your migration plan within deadlines and save yourself from any unnecessary hassle later on.

Srijan is working with leading enterprises to help them migrate their digital web properties to Drupal 9 for better user experience. 

If you are also looking for a smooth upgrade/migration process for your enterprise’s system, we are all ears and excited to assist you. Contact Us!

Jul 02 2020
Jul 02
Ecommerce icons around a Drupal icon


Evolving technologies and marketing strategies have changed the way shopping is experienced. With time, the charm and challenges of eCommerce have increased. How do you plan to overcome these challenges?

As an online brand, you have your challenges when eyeing expansion and opportunities. To achieve the right numbers it is important to engage with customers and sell quality products, all through the right platform.

Talking about the right platform, you can always trust Drupal. Drupal is a content management system with hundreds of modules and themes ready to drive your business online. Drupal adds the magic that your website needs.

The State Of Digital Commerce

Drupal provides amazing features for your eCommerce website, but before jumping to that, let’s take a glance at some stats and understand where the eCommerce industry is heading.

According to Statista, online sales reached $2.5 trillion for the global eCommerce market at the end of 2019 and represented 14% of its global market share. The same data says that by the end of 2020, global commerce sales are predicted to reach $4.2 trillion and the representation will increase to 16%.

Retail ecommerce sales worldwide (Source: Statista)

The way that people have been shopping online has changed. Keeping up with trends is important for the growth in the retail landscape of 2020. The future looks bright for eCommerce in the coming time.

Personalization is the key if you want to earn the trust of your customers and give them an experience that makes them come back to your website again. Contactless payment has become the shopping trend and has been continuing for a long time. People prefer paying online instead of cash on delivery. So, providing diverse options for payments is important to keep your customers’ experience a happy one. Subscriptions are an ongoing trend that has helped brands get a lot of long term customers. Similarly, Chatbots have been a great help in enhancing the experience of the users. Experts have predicted that 80% of businesses will be using chatbots by the end of 2020. Voice search has become popular with time. 26.1% of consumers have made a purchase on a smart speaker in 2019. 

To leverage all these ongoing trends, and drive sales of your product online, you need a robust and future-ready eCommerce website and Drupal is ready to help!

Why Using Drupal Brings You A Lot Of Benefits

One of the most comprehensive open-source CMSes available, Drupal is the perfect fit for eCommerce businesses. It gives you a powerful way of modeling your content, along with integrated marketing, payment, and fulfillment tools, which helps bring in a bigger audience. All of Drupal's features are accessible to merchants of every size.

There are so many brands out there using Drupal for their online business. Here are a few of them:

Honda Brazil

The website of Honda Brazil, built using Drupal, gives the users an engaging experience with easily accessible information.

Home page of Honda Brazil's Website


Timex

With the help of Drupal, Timex, a famous American Watchmaker, is able to provide its customers a seamless, engaging, and consistent online experience.

Home page of Timex's Website


Lush

Lush, with its website powered by Drupal, has seen dramatic spikes in both online traffic and sales.

Home page of Lush's Website


Puma

Puma, one of the leading sports brands, has its website built on top of Drupal.

Home page of Puma's Website


Why do such great brands choose Drupal for their online business? Let’s look at the reasons that show why Drupal is the best fit for your eCommerce website:

Commerce Kickstart

Commerce Kickstart is a distribution that offers the quickest way to get up and running with Drupal's eCommerce features. If you are launching an online store, Commerce Kickstart is a great resource that will get you up and running with a production environment.

Commerce Kickstart is made for modern PHP lovers and is available only for Drupal 7. The categories in this distribution include shipping and payment providers, data migration, search tools, product catalogs, etc.

Drupal Commerce

Drupal Commerce is a dedicated solution for your eCommerce needs. It is basically a set of modules for Drupal that enables a host of eCommerce features. Being a framework itself, Drupal Commerce focuses on the solutions that can be built using it. In simpler words, Drupal Commerce brings to your website the basic functions like orders, product details, cart items, and payment options.

There are many features of Drupal Commerce that are further extended with the help of modules. Here are a few of them:

  • Modules like Commerce Stock and Commerce Inventory make inventory management easy. 
  • Commerce Shipping is a Drupal Commerce contributed module used in cases where the shipping address and the billing address are different, making use of the customer profile.

Essential modules for an e-commerce site

There are plenty of Drupal modules that can be added to your eCommerce site and will help you in building intuitive and powerful websites. Here are some of the modules provided by Drupal for eCommerce:

  • Commerce Shipping takes care of the shipping rate calculation system for Drupal Commerce. It is used in combination with other shipping method modules like Commerce Flat Rate, Commerce UPS, etc.
  • The Currency module helps your website with currency conversion and information and does the work of displaying the price of the product.
  • Commerce Stripe makes sure that the customers can pay securely without having to leave your website.

Essential themes for an e-commerce site

The first thing that attracts a user when they visit your website is the appearance of your website. Drupal provides amazing themes for your eCommerce websites which come in handy.

  • eStore is Bootstrap based and easy to install and is designed in a way that it solves any eCommerce website’s needs.
  • Flexi Cart is a global theme that makes sure that your products sell fast and easily online.
  • Belgrade is a Drupal Commerce template specially designed to create business websites.
  • SShop is a Drupal 8 theme that provides built-in support for Drupal Commerce.

Content-Driven Commerce

Content marketing is one of the most popular approaches and reliably delivers strong SEO results. A good story behind your brand will drive sales for you. If the content on your website is engaging, users will keep coming back to your website.

The stories can be anything that relates to your product. For example, if you are selling lipsticks, you can write an article that says which shade is the perfect one for your different colored outfits.

It is really important to decide the kind of content you want to post on your website. Your content can include blog posts, ebooks, guides, tips, hacks, etc.

Drupal covers the need for content-driven experiences. Whatever the case may be, content types are at the core of Drupal, complemented by mobile editing, in-place authoring, easy content authoring, content revisioning and workflows, and modules for multimedia content.

Headless Commerce

Headless commerce, which acts as a great catalyst to scale up content-driven commerce, gives immense flexibility to create a great shopping experience for users. It is future-focused and stays relevant. A JavaScript front end communicates with the Drupal backend via a REST API. In decoupled Drupal, there is a separation between the presentation layer and the eCommerce backend.

Headless Drupal commerce comes with a lot of benefits including high speed, interactive features, and freedom in front-end changes. These features provide a great shopping experience to the customers online by providing a content-rich experience.

Read our article on the implementation of Decoupled Drupal Commerce and React Native to learn more about the benefits of a headless commerce approach.

Performance

It is important to take into account the speed of your website. It is seen that a site that loads in five seconds has 70% longer average conversions. A slow website will deter your efforts and investments. 79% of shoppers who face slow-loading issues say that they don’t return to those websites. These bounces have a direct effect on revenue generation.

To maintain a top-notch web performance, Drupal comes packed with plenty of offerings. Some of them include:

  • Blazy module helps the pages load faster and saves data usage if the user is not using the whole page.
  • The CDN module helps integrate a Content Delivery Network with your website, reducing page load time and rapidly delivering web page components.
  • In case your server hardware is reaching its limits, Drupal gives you the option to upgrade the server hardware for a faster way of scaling.

Mobile Ready

If your website runs smoothly on mobile devices, it will be able to run better on other devices too! Creating user scenarios will help you understand what kind of content the user will appreciate on their mobile. This approach will help you design the important elements required for your website.

Mobile compatibility has become an irreplaceable feature for any eCommerce site. In today’s world, everything needs to be mobile-ready. Drupal’s websites not only wow the clients by their looks but also by their mobile responsive design. Drupal websites are easily accessible on mobile and tablets.

Multilingual

The world is on the internet, and with so many people using similar platforms and so many brands expanding globally, multilingual websites are the sine qua non! 

China has the highest number of internet users, a massive 772 million. And although the largest share of people on the internet prefer English as their language, 10 other languages account for 90% of the top 10 million websites.

Source: Internet World Stats

Drupal is the best choice for your multilingual website. It provides numerous languages to choose from and 4 core modules specially designed for language and translation support. This capability has shown great results, including higher conversions, improved SEO, unrivaled translation workflows, and a wider audience. It also allows detection of the user’s preferred language based on the user’s IP address, session, browser settings, etc.

Personalization

Every eCommerce brand wants to make sure that the content created by them leaves a mark on the users’ minds. And it has become a necessity today because there is a lot of competition out there. Hence, personalized content makes the user experience better and helps create trust between you and the customer.

According to an Adobe report on personalization, 92% of the B2B marketers say that personalization is important.

This is the marketing opportunity that no eCommerce business should miss out on. Tapping the different demographics and varied audiences not only improves your market reach but your bottom line as well. 

Following are examples of modules that can aid your web personalization efforts:

  • The Smart Content module gives real-time anonymous personalization for the users. It also allows the site administrators to display different content for anonymous users based on browser conditions
  • Acquia Lift Connector module helps organizations in delivering personalized content and experience across all platforms and devices by merging content and customer data into one tool. 

SEO

E-commerce websites are buried in huge amounts of data. While for a consumer this might be a desirable situation, for a marketer it increases the burden of implementing SEO on every page and indexing every product.

Drupal has various modules that help in improving the SEO of your eCommerce website. Some of them are:

  • Pathauto is an SEO module that ensures that the URL of your website is search engine friendly. It converts complex URLs to simpler ones.
  • Metatag module is a multilingual module and controls all the metatags on all the web pages.
  • XML Sitemap module provides you the resilience to exclude or include a few pages on your Sitemap.

Security

With the increase in hacking and security breaches, basic do-it-yourself security measures are not sufficient. Security breaches affect your brand image, your market share, and your stock price. According to a report, more than $3.5 billion was lost to cybercrime in 2019.

Drupal has a dedicated team that regularly works on the security side of it. It is frequently tested for issues and bugs. Drupal also provides various security modules for your eCommerce website. Some of them are:

  • The Password Policy module provides password policies that help users create strong passwords. A password entered by the user is not accepted until it meets the constraints set by this module.
  • The Security Kit module provides various security-hardening options. This helps reduce the risks posed by different web application vulnerabilities.
  • The Two-factor Authentication module adds a second step to your security check, where a code must be provided for a user to be able to sign in.

The Session Limit module limits the number of simultaneous sessions a user can have. For example, if you open a page from your mobile device and at the same time open it on your PC or laptop while logged in, you can be forced to close one of the sessions.

To Sum Up

The substantial development in the concept of ‘eCommerce’ has kept the online brands on their toes. And this is where Drupal provides its unmatched services for your eCommerce platform. 

Be it building your eCommerce website or migrating to Drupal, we at OpenSense Labs will help you do your job smoothly until you get a desirable finish.

Feel free to contact us at [email protected] to drive sales on your website!

Jul 02 2020
Jul 02

In today’s article we are going to provide a reference of all configuration options that can be set in migration definition files. Additional configuration options available for migrations defined as configuration will also be listed. Finally, we present the configuration options for migrations groups.

List of configuration options in YAML definition files

General configuration keys

The following keys can be set for any Drupal migration.

id key

A required string value. It serves as the identifier for the migration. The value should be a machine name. That is, lowercase letters with underscores instead of spaces. The value is used for creating the mapping and message tables. For example, if the id is ud_migrations, the Migrate API will create the tables migrate_map_ud_migrations and migrate_message_ud_migrations.

label key

A string value. The human-readable label for the migration. The value is used in different interfaces to refer to the migration.

audit key

A boolean value. Defaults to FALSE. It indicates whether the migration is auditable. When set to TRUE, a warning is displayed if entities might be overridden when the migration is executed. For example, when doing an upgrade from a previous version of Drupal, nodes created in the new site before running the automatic upgrade process would be overridden and a warning is logged. The Migrate API checks if the highest destination ID is greater than the highest source ID.

migration_tags key

An array value. It can be set to an optional list of strings representing the tags associated with the migration. They are used by the plugin manager for filtering. For example, you can import or rollback all migrations with the Content tag using the following Drush commands provided by the Migrate Tools module:

$ drush migrate:import --tag='Content'
$ drush migrate:rollback --tag='Content'

source key

A nested array value. This represents the configuration of the source plugin. At a minimum, it contains an id key which indicates which source plugin to use for the migration. Possible values include embedded_data for hardcoded data; csv for CSV files; url for JSON feeds, XML files, and Google Sheets; spreadsheet for Microsoft Excel and LibreOffice Calc files; and many more. Each plugin is configured differently. Refer to our list of configuration options for source plugins to find out what is available for each of them. Additionally, in this section you can define source constants that can later be used in the process pipeline.
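
For example, here is a minimal sketch using the core embedded_data source plugin and concat process plugin (the field and constant names are hypothetical) showing a constant defined in the source section and consumed later in the process pipeline:

source:
  plugin: embedded_data
  data_rows:
    - unique_id: 1
      name: 'Example person'
  ids:
    unique_id:
      type: integer
  # Constants are arbitrary values made available to the process pipeline.
  constants:
    TITLE_PREFIX: 'Hello'
process:
  title:
    plugin: concat
    source:
      - constants/TITLE_PREFIX
      - name
    delimiter: ' '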

process key

A nested array value. This represents the configuration of how source data will be processed and transformed to match the expected destination structure. This section contains a list of entity properties (e.g. nid for a node) and fields (e.g. field_image in the default article content type). Refer to our list of properties for content entities including Commerce related entities to find out which properties can be set depending on your destination (e.g. nodes, users, taxonomy terms, files and images, paragraphs, etc.). For field mappings, you use the machine name of the field as configured in the entity bundle. Some fields have complex structures so you migrate data into specific subfields. Refer to our list of subfields per field type to determine which options are available. When migrating multivalue fields, you might need to set deltas as well. Additionally, you can have pseudofields to store temporary values within the process pipeline.

For each entity property, field, or pseudofield, you can use one or more process plugins to manipulate the data. Many of them are provided by Drupal core while others become available when contributed modules are installed on the site like Migrate Plus and Migrate Process Extra. Throughout the 31 days of migrations series, we provided examples of how many process plugins are used. Most of the work for migrations will be devoted to configuring the right mappings in the process section. Make sure to check our debugging tips in case some values are not migrated properly.
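
As an illustrative sketch (the source column and field names below are hypothetical), the following shows a pseudofield populated with the core callback process plugin and then consumed by the substr plugin to populate the title property:

process:
  # Pseudofield holding an intermediate, trimmed value.
  pseudo_clean_title:
    plugin: callback
    callable: trim
    source: src_title
  # Entity property populated from the pseudofield above.
  title:
    plugin: substr
    source: '@pseudo_clean_title'
    start: 0
    length: 255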

destination key

A nested array value. This represents the configuration of the destination plugin. At a minimum, it contains an id key which indicates which destination plugin to use for the migration. Possible values include entity:node for nodes, entity:user for users, entity:taxonomy_term for taxonomy terms, entity:file for files and images, entity_reference_revisions:paragraph for paragraphs, and many more. Each plugin is configured differently. Refer to our list of configuration options for destination plugins to find out what is available for each of them.

This is an example migration from the ud_migrations_csv_source module used in the article on CSV sources.

id: udm_csv_source_paragraph
label: 'UD dependee paragraph migration for CSV source example'
migration_tags:
  - UD CSV Source
  - UD Example
source:
  plugin: csv
  path: modules/custom/ud_migrations/ud_migrations_csv_source/sources/udm_book_paragraph.csv
  ids: [book_id]
  header_offset: null
  fields:
    - name: book_id
    - name: book_title
    - name: 'Book author'
process:
  field_ud_book_paragraph_title: book_title
  field_ud_book_paragraph_author: 'Book author'
destination:
  plugin: 'entity_reference_revisions:paragraph'
  default_bundle: ud_book_paragraph

migration_dependencies key

A nested array value. The value is used by the Migrate API to make sure the listed migrations are executed in advance of the current one. For example, a node migration might require users to be imported first so you can specify who is the author of each node. Also, it is possible to list optional migrations so that they are only executed in case they are present. The following example from the d7_node.yml migration shows how this key can be configured:

migration_dependencies:
  required:
    - d7_user
    - d7_node_type
  optional:
    - d7_field_instance
    - d7_comment_field_instance

To configure the migration dependencies you specify required and optional subkeys whose values are an array of migration IDs. If no dependencies are needed, you can omit this key. Alternatively, you can set either required or optional dependencies without having to specify both keys. As of Drupal 8.8 an InvalidPluginDefinitionException will be thrown if the migration_dependencies key is incorrectly formatted.

class key

A string value. If set, it should point to the class used as the migration plugin. The MigrationPluginManager sets this key to \Drupal\migrate\Plugin\Migration by default. Whatever class is specified here should implement the MigrationInterface. This configuration key rarely needs to be set as the default value can be used most of the time. In Drupal core there are only a few cases where a different class is used as the migration plugin, such as the FieldMigration class described below.

deriver key

A string value. If set, it should point to the class used as a plugin deriver for this migration. This is an advanced topic that will be covered in a future entry. In short, it is a mechanism in which new migration plugins can be created dynamically from a base template. For example, the d7_node.yml migration uses the D7NodeDeriver to create one node migration per content type during a Drupal upgrade operation. In this case, the configuration key is set to Drupal\node\Plugin\migrate\D7NodeDeriver. There are many other derivers used by the Migrate API including D7NodeDeriver, D7TaxonomyTermDeriver, EntityReferenceTranslationDeriver, D6NodeDeriver, and D6TermNodeDeriver.

field_plugin_method key

A string value. This key must be set only in migrations that use Drupal\migrate_drupal\Plugin\migrate\FieldMigration as the plugin class. They take care of importing fields from previous versions of Drupal. The following is a list of possible values:

  • alterFieldMigration as set by d7_field.yml.
  • alterFieldFormatterMigration as set by d7_field_formatter_settings.yml.
  • alterFieldInstanceMigration as set by d7_field_instance.yml.
  • alterFieldWidgetMigration as set by d7_field_instance_widget_settings.yml

There are Drupal 6 counterparts for these migrations. Note that the field_plugin_method key is a replacement for the deprecated cck_plugin_method key.

provider key

An array value. If set, it should contain a list of module machine names that must be enabled for this migration to work. Refer to the d7_entity_reference_translation.yml and d6_entity_reference_translation.yml migrations for examples of possible values. This key rarely needs to be set. Usually the same module providing the migration definition file is the only one needed for the migration to work.

Deriver specific configuration keys

It is possible that some derivers require extra configuration keys to be set. For example, the EntityReferenceTranslationDeriver requires the target_types key to be set. Refer to the d7_entity_reference_translation.yml and d6_entity_reference_translation.yml migrations for examples of possible values. These migrations are also interesting because the source, process, and destination keys are not configured in the YAML definition files. They are actually set dynamically by the deriver.

Migration configuration entity keys

The following keys should be used only if the migration is created as a configuration entity using the Migrate Plus module. Only the migration_group key is specific to migrations as configuration entities. All other keys apply for any configuration entity in Drupal. Refer to the ConfigEntityBase abstract class for more details on how they are used.

migration_group key

A string value. If set, it should correspond to the id key of a migration group configuration entity. This allows inheriting configuration values from the group. For example, the database connection for the source configuration. Refer to this article for more information on sharing configuration using migration groups. They can be used to import or rollback all migrations within a group using the following Drush commands provided by the Migrate Tools module:

$ drush migrate:import --group='udm_config_group_json_source'
$ drush migrate:rollback --group='udm_config_group_json_source'

uuid key

A string value. The value should be a UUID v4. If not set, the configuration management system will create a UUID on the fly and assign it to the migration entity. Refer to this article for more details on setting UUIDs for migrations defined as configuration entities.

langcode key

A string value. The language code of the entity's default language. English is assumed by default. For example: en.

status key

A boolean value. The enabled/disabled status of the configuration entity. For example: true.

dependencies key

A nested array value. Configuration entities can declare dependencies on modules, themes, content entities, and other configuration entities. These dependencies can be recalculated on save operations or enforced. Refer to the ConfigDependencyManager class’ documentation for details on how to configure this key. One practical use of this key is to automatically remove the migration (configuration entity) when the module that defined it is uninstalled. To accomplish this, you need to set an enforced module dependency on the same module that provides the migration. This is explained in the article on defining Drupal migrations as configuration entities. For reference, below is a code snippet from that article showing how to configure this key:

uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
dependencies:
  enforced:
    module:
      - ud_migrations_config_json_source

Migration group configuration entity keys

Migration groups are also configuration entities. That means they can have uuid, langcode, status, and dependencies keys as explained before. Additionally, the following keys can be set for migration groups:

id key

A required string value. It serves as the identifier for the migration group. The value should be a machine name.

label key

A string value. The human-readable label for the migration group.

description key

A string value. More information about the group.

source_type key

A string value. Short description of the type of source. For example: "Drupal 7" or "JSON source".

module key

A string value. The machine name of a dependent module. This key rarely needs to be set. A configuration entity is always dependent on its provider, the module defining the migration group.

shared_configuration key

A nested array value. Any configuration key for a migration can be set under this key. Those values will be inherited by any migration associated with the current group. Refer to this article for more information on sharing configuration using migration groups. The following is an example from the ud_migrations_config_group_json_source module from the article on executing migrations from the Drupal interface.

uuid: 78925705-a799-4749-99c9-a1725fb54def
id: udm_config_group_json_source
label: 'UD Config Group (JSON source)'
description: 'A container for migrations about individuals and their favorite books. Learn more at https://understanddrupal.com/migrations.'
source_type: 'JSON resource'
shared_configuration:
  dependencies:
    enforced:
      module:
        - ud_migrations_config_group_json_source
  migration_tags:
    - UD Config Group (JSON Source)
    - UD Example
  source:
    plugin: url
    data_fetcher_plugin: file
    data_parser_plugin: json
    urls:
      - modules/custom/ud_migrations/ud_migrations_config_group_json_source/sources/udm_data.json

What did you learn in today’s article? Did you know there were so many configuration options for migration definition files? Were you aware that some keys apply only when migrations are defined as configuration entities? Have you used migrations groups to share configuration across migrations? Share your answers in the comments. Also, I would be grateful if you shared this blog post with friends and colleagues.

Jul 02 2020
Jul 02

I'm glad to announce that I've been awarded a grant as part of the European Next Generation Internet initiative (NGI) by the Dutch NLnet Foundation to work on my (currently) favorite projects: Indigenous and IndieWeb1. Looking at the other entries, I didn't count on being selected when I submitted my proposal, but I guess I made a good case. I'll be spending a lot of time over the following months working on them, so you can expect some exciting releases. The status of all projects and work done within this grant will be tracked here.

Indigenous for iOS

The app was originally started by Edward Hinkle and was the main trigger for me to build the Android equivalent. The project is currently unmaintained and lacks many features which are available in the Android version. Thanks to the grant, I can now revive the project so iOS users will be able to enjoy IndieWeb with a richer and more mature application.

Edward was kind enough to transfer the existing repository over to me, so all issues are preserved. I'll be creating projects and milestones so everyone can track progress. At some point, I will start rolling out releases in a beta program, so watch this space or announcements on Twitter to know when you can sign up for testing.

Multiple user support for the Drupal IndieWeb module

One of the last major missing pieces for the module is support for multiple users. All features currently work great for one account, and the Micropub server supports multiple authors posting to the same domain. However, it's far from perfect, and the built-in Microsub server in particular is not compatible at all with more than one user.

Work started in a separate branch a couple of months ago, but progress is slow as dragons are everywhere and I only work on this when I have some free time. With this grant, I'll be able to focus 2 weeks in a row to rewrite the critical pieces, not to mention all the tests.

I haven't decided yet whether I'm going to write an upgrade path, but I will keep on supporting both branches as I'm using the module on my site which only has one user, so no need to worry in case you are using the module already.

Kickstarting ActivityPub module for Drupal

It's been on my mind for so long, but I will finally be able to work extensively on the Drupal ActivityPub module. My work will happen on Drupal.org instead of the existing repository on GitHub, which will be used for a more extended version somewhere in the future. The 1.0.x branch on d.o will contain the lite version.

Open Web

Besides these 3 major goals, I'll also focus on the interoperability of both app clients (Android and iOS) with more software, e.g. Mastodon and Pixelfed. I'm brainstorming to figure out the best approach to contribute and how to integrate them with both clients; more details will be released in future blog posts and notes.

All those projects have a place in my personal vision of the Open Web, so I feel incredibly lucky to be able to work on them almost full time, ultimately hoping to convince more people to jump on board. It would be great if we could get something into Drupal core one day, or at least make some more noise around it. If you have questions, feedback or just want to have a chat, I'm (still, yes I know) on IRC on irc.freenode.net (indieweb or drupal channels). Ping swentel and I'll be all ears.

Footnotes

1. to be fair, Solfidola might come close to becoming my new favorite, but it's not related to IndieWeb at all :)

New post: Rebooting Indigenous for iOS, adding multiple user support to the Drupal IndieWeb module and kickstarting ActivityPub module thanks to a grant as part of @NGI4eu by the @NLnetFDN. https://realize.be/blog/indigenous-ios-indieweb-and-activitypub-drupal

Jul 01 2020
Jul 01

On Tuesday, July 7, Agaric will host 3 free online webinars about Drupal 9. We invite the community to join us to learn more about the latest version of our favorite CMS. We will leave time at the end of each presentation for questions from the audience. All webinars will be presented online via Zoom. Fill out the form at the end of the post to reserve your seat. We look forward to seeing you.

Getting started with Drupal 9

Time: 10:00 AM - 11:00 AM Eastern Time (EDT)

This webinar will cover basic site building concepts. You will learn what a node is and how nodes differ from content types. We are going to explain why fields are so useful for structuring your site's content and the benefits of doing this. We will cover how to use Views to create listings of content. Layout Builder, blocks, taxonomies, and the user permissions system will also be explained.

Introduction to Drupal 9 migrations

Time: 11:30 AM - 12:30 PM Eastern Time (EDT)

This webinar will present an overview of the Drupal migrations system. You will learn how the Migrate API works and what assumptions it makes. We will explain the syntax for writing migrations and how different source, process, and destination plugins work. Recommended migration workflows and debugging tips will also be presented. No previous experience with the Migrate API nor PHP is required to attend.

Drupal 9 upgrades: how and when to move your Drupal 7 sites?

Time: 1:00 PM - 2:00 PM Eastern Time (EDT)

This webinar will present different tools and workflows to upgrade your Drupal 7 site to Drupal 9. We will run through what to consider when planning an upgrade, including how to make site architecture changes, how to handle modules that do not have D9 counterparts, and what to do when there are no automated upgrade paths.

Agaric is also offering full-day trainings on these topics later this month. Dates, prices, and registration options are available at https://agaric.coop/training

Jul 01 2020
Jul 01

The idea of upgrading a website from Drupal 7 to Drupal 8/9 is gaining new momentum. Now that Drupal 9 is out as an official release, it's clearer than ever that Drupal 7 is becoming outdated.

It's time to upgrade Drupal 7 sites so they can deliver much better value to their owners. The Drudesk support team knows how to achieve this through upgrades and updates, as well as speed optimization, bug fixes, redesign, and so on.

Today, we would like to review one of the helpful tools for making Drupal upgrades and migrations much easier and smoother — the Drupal Module Upgrader. See how it works to upgrade Drupal modules.

Reasons to upgrade from Drupal 7 to 8/9

Today, there are more reasons to upgrade Drupal 7 to Drupal 8/9 than ever. The 8th and 9th Drupal releases are so closely related that upgrades between them are instant, provided your D8 site is ready for Drupal 9, which is also very easy to achieve.

So the bulk of the work will be needed for moving from Drupal 7 to the next level — and you will get all the great improvements that have happened to Drupal since the D7 times:

  • instant upgrades (instead of long and tedious ones we used to have)
  • easy and quick content editing with the CKEditor, Media Library, Layout Builder, Quick Edit, etc.
  • outstanding multilingual capacities with a hundred languages supported and an easy process of adding them
  • API-first approach and enhanced third-party integrations that will help your website cooperate with third-party systems (CRMs, mobile apps, etc.)
  • a higher level of web accessibility to users with disabilities (the new front-end Olivero theme is a super example)
  • a modern and user-friendly default admin theme Claro
  • mobile responsiveness for a perfect display across devices
  • New versions of the Symfony framework’s components and the Twig template engine (versions 4-5 and 2 in Drupal 9) that make websites faster and cleaner
  • and much more to come beginning with D9.1
Easy content editing in Drupal 8

Drupal 7 to 8/9 Module Upgrader: extra help with Drupal website migrations

Upgrades from D7 can be challenging due to a big difference in architecture, which is now somewhat outdated. Many website owners are a bit scared off by this fact.

Luckily, there are great tools to facilitate the upgrades and reduce their time and costs. One of them is the Drupal 7 to 8/9 Module Upgrader, aka DMU.

Drupal 7 to 8/9 Module Upgrader

Tons of things have changed in the requirements for modules since D7, and DMU knows how to quickly spot them. It is based on a command-line script. DMU scans the code of your D7 module and detects what needs to be updated for D8/9. But checking is not all it can do: the tool also tries to update the module's code to Drupal 8/9 automatically where possible. Its goal is not to update 100% of the code but to save the developer as much manual work as possible.

  • “This project aims to be a huge time-saver when porting your modules from Drupal 7 to Drupal 8 because it allows us to automate a lot of the tedious work that would otherwise be required to do by hand,” — said Webchick, Drupal core contributor and the module’s maintainer in her video.

Let's take a closer look at its two working modes. With the DMU module installed via Composer, the following two modes are available:

Analyze mode: The tool studies the code of a Drupal 7 module to create a comprehensive report of what needs to be done in order to port it to D8/9. It links to the relevant core API changes that need to be observed. Developers perform this with the Drush command: drush dmu-analyze MODULE_NAME. The great thing is that there is no need to study all the requirements at once, because a developer gets directed to the ones that are relevant to a particular module.

Upgrade mode: In this mode, the tool fixes the issues found in the analysis and converts the code to D8/9 as far as possible. For example, it can rename functions, convert .info files into .info.yml, generate classes, and much more. Many of these changes are a matter of find and replace. Developers perform this with the Drush command: drush dmu-upgrade MODULE_NAME. There are also other command-line options described in the DMU documentation.
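
For reference, a typical command-line workflow might look like the following sketch (the Composer package name is an assumption based on the project's usual machine name on Drupal.org):

# Install and enable the Drupal Module Upgrader (package name assumed).
composer require drupal/drupalmoduleupgrader
drush en drupalmoduleupgrader -y

# Analyze a Drupal 7 module and generate a report of required changes.
drush dmu-analyze MODULE_NAME

# Attempt to automatically convert the module's code to Drupal 8/9.
drush dmu-upgrade MODULE_NAME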

  • That said, the DMU module is part of the strategy for easy and smooth upgrades from Drupal 7, in which you can always rely on our team!

Upgrade from Drupal 7 to 8/9 with Drudesk

Whatever your concerns may be in migrating from Drupal 7 to Drupal 8/9, the Drudesk team is ready to resolve them quickly. Upgrades and updates are our routine since we are a support and maintenance team with a lot of experience. So we will easily and cost-effectively perform your entire website migration or help you with specific modules. Drop us a line!

Jul 01 2020
Jul 01

We collected user feedback, from which new functions were built; we also improved existing features and design. Four highlights:

1. Social network posts

You can now use Lucius as a social network, instantly: let everybody place Posts to share what is happening and create interaction with inline comments and likes. 

Build community and culture out-of-the-box:

social network posts

Jul 01 2020
Jul 01

Many front-end technologies, especially React, now consider the notion of declarative components to be table stakes. Why haven't they arrived in environments like the Drupal CMS's own front end? Many native CMS presentation layers tend to obsolesce quickly and present a scattered or suboptimal developer experience, particularly against the backdrop of today's rapidly evolving front-end development workflows. But according to Fabian Franz, there is a solution that allows for that pleasant front-end developer experience within Drupal itself without jettisoning Drupal as a rendering layer.

The solution is a combination of Web Components support within Drupal and intelligent handling of immutable state in data that allows for Drupal to become a more JavaScript-like rendering layer. Rather than working with endless render trees and an antiquated Ajax framework, and instead of reinventing Drupal's front-end wheel from scratch, Fabian recommends adopting the best of both worlds by incorporating key aspects of Web Components, the Shadow DOM, and particularly syntactic sugar for declarative components that competes readily not only with wildly popular JavaScript technologies like React and Vue but also matches up to the emerging approaches seen in ecosystems like Laravel.

In this Tag1 Team Talks episode, join Fabian Franz (Senior Technical Architect and Performance Lead at Tag1), Michael Meyers (Managing Director at Tag1), and your host and moderator Preston So (Editor in Chief at Tag1; Senior Director, Product Strategy at Oracle; and author of Decoupled Drupal in Practice) for a wide-ranging technical discussion about how to enable declarative components everywhere for Drupal's front end out of the box. If you were interested in Fabian's "Components Everywhere" talk at DrupalCon Amsterdam last year, this is a Tag1 Team Talks episode you won't want to miss!

[embedded content]

Related Links

DrupalCon Amsterdam 2019: Components everywhere! - Bridging the gap between backend and frontend

Insider insights on rendering and security features
What the future holds for decoupled Drupal - part 2

Livewire

Laravel Blade Templates

Inertia.js

Mortenson's WebComponents server-side shim

AJAX API Guide on Drupal.org

Chat with the Drupal Community on Slack: https://www.drupal.org/slack

Vue.js

Lit-HTML

Other mentions:

Preston’s newsletter: Preston.so

Preact - Fast 3kB alternative to React with the same modern API https://preactjs.com/

Descript.com - Uses AI to transcribe Audio (PodCasts) and Video into text, providing you with a transcript & closed captioning; edit the audio/video by editing the text!

Photo by Ren Ran on Unsplash.

Jul 01 2020
Jul 01

Consider a situation wherein your car's indicators are placed near the glove compartment, the horn near the back seat, the ignition button near the fuel tank, and the button to open the side doors on the steering wheel. How impractical that would be!


Of course, nobody will ever want to drive such a non-ergonomic car that can cause a threat to human life.

Likewise, for content marketers and publishers who create and publish content, the editorial experience must be seamless: they should be able to publish quality content in less time, ahead of their competitors.

This blog is an attempt to connect the long-proven Japanese manufacturing concepts of Kaizen and the 5S technique with the editorial experience in the digital world, to help companies implement them through Drupal and make their teams more productive in content creation and publishing.

Applying Manufacturing Concepts to Editorial Workflows in Publishing

Whether you have realized or not, you do have an editorial workflow. It is simply the way your content gets published.

However, if you have never given it much thought or attention, your team’s workflow is likely undefined, unclear, and unhelpful. It probably changes from article to article, and steps are missed or completed out of order.

See how manufacturing concepts can be applied to improve editorial workflow -

Getting into Editors’ Shoes

Engineer at the assembly line

Advancements in technology can help editors produce good-quality content in high quantity. However, only leveraging it the right way can ensure the productivity and quality of the work.

An engineer on a manufacturing assembly line carefully studies each step, from pulling an electric screwdriver hanging from the ceiling and five screws from a bucket kept right at waist level, to eventually driving those screws into the car.

Likewise, for editorial teams, it's important to understand the tasks that are repeated by the majority of users and categorize them into high-, medium-, and low-frequency, time-consuming tasks.

Understanding Key Pain Areas



Editors and publishers, when working in collaboration, should be able to maximize efficiency and revenue for the business. Stakeholders should emphasize the use of a specific mindset and tools to create efficiency and value. Here are some pain points that enterprises must resolve to help address those challenges:

The American Society for Quality teaches the concept of FMEA (Failure Mode and Effects Analysis), which can be directly applied to improving the editorial experience.

"Failure modes" means the ways, or modes, in which something might cause delays or complicate the workflow.

"Effects analysis" refers to studying the consequences or results of those failures.

Focus on the tasks that have the highest chances of occurrence and the biggest consequences on the editorial experience and on business outcomes.

Examples of Failure Modes and Effects

Failure Mode → Effects

  • Long content forms → Time delays, frustration for teams
  • Excessive clicks to complete a form → Time delays for publishing, complicated workflow
  • Multiple screen navigations → Possible loss of information, time delays
  • One body field for all content → Difficult to manage changes, low richness of content

Applying Lean Manufacturing 5S’ technique for better Editorial Experience

The term 5S is taken from five Japanese words -

  • Seiri
  • Seiton
  • Seiso
  • Seiketsu
  • Shitsuke

When translated in English, these words become-

  • Sort
  • Set in Order
  • Shine
  • Standardize
  • Sustain

Here, each “S” represents one part of the five-step process that can improve the overall functioning of a business. Let’s get in detail of each “S”.

1. Sort: This involves going through all the tools (buttons), furniture (fields), equipment (processes), etc. in a work area (content management system) to find what needs to be present and what can be removed.

  • When was this item(field) last used?
  • What is the purpose of this item(field)?
  • How frequently is it used?
  • Who uses it?
  • Does it need to be here?

Logical Grouping of Fields: When was the last time you grumbled about a monologue-like marketing form or a job application that took you ages to complete?

Long and verbose content forms vs. logically grouped forms with form tips

 

Now put on the editors' hat: they have to create content using those long forms 10, 20, 50, or 100 times a day. Such forms simply prove to be a hindrance for editors to create and innovate with their content.

Logical grouping of fields via the Field Group module makes the form short and lets editors pick and add information only in the fields that concern them.

Form tips are another intuitive feature to avoid long forms (black box in the screenshot on the right) and give editors some guidance about the information that needs to be added in each field.


2. Set in Order

Once the clutter is gone, it's easier to see what's what. Now workgroups can come up with their strategies for sorting through the remaining items.

  • Collapsible fields are another way to reduce the length of the form by collapsing the fields that are not widely used in all content.



  • Conditional Fields is another way to reduce the number of fields on the form by showing/hiding fields based on a condition, e.g. the 'Primary Image Summary' field for an image will appear only if an image has been uploaded to the article (as seen for the 'Primary Image' field in the screenshot).
  • Number of Clicks - Carefully minimize the number of clicks needed to achieve a task.


gif showing various rich multimedia formats

Rich Multimedia Features - Help create a modular content structure with different logical fragments of content rather than just one large body field. Use this to add rich media features like embeds, slideshows, videos, and audio podcasts.


  • Taxonomy Manager  allows editors to manage all the master content and vocabularies in the system in an intuitive interface


Gives a selection view for images and videos 


Helps listing the items together for a section of the website


3.  Shine:

The Shine stage of 5S focuses on styling and theming of the interface for the creation and publishing of content for editors. 

  • Giving editors much larger space to write and manage content, contrary to traditional content forms
  • Max Length helps define field limits to make sure the user doesn't exceed them
  • Colors and font: use clear, visible font sizes that are not stressful to the eye. Use solid colors for the header/footer menu of content entry screens for better visibility of text.

Header/footer menu of content entry screens for better visibility of text

 

4. Standardize: Use standardized field types to support faster creation of content.


Some industry-standard field types that can be used are mentioned below:

-  A long list of options: Eg. the country field can be configured using Auto-complete deluxe 

- Multiple values in a field: Eg. Keywords field can be configured using Chosen fields; it’s quick and gives a fast response if the user wants to remove an item

- Hierarchical items can be configured using SHS

There are a few more industry-standard features that should be added to the interface for standardizing editorial experience:

Auto-Save of Progress

If the user's browser or machine dies while editing an article, the edits will be presented to the user the next time they return to the article.

a dialog box on white background

 Content Locking 

When a user is editing an article, any other user that attempts to edit the same article will be blocked from doing so and notified that the content is already being edited.

5. Sustain: This is the last of the 5S’. It is not only about keeping the 5S running smoothly, but also about keeping everyone in the organization involved.

Training and Onboarding: Quick editorial onboarding for editors, which means teams can self-learn how to create and publish content without specialized training.

Saves a lot of time and money to onboard a new publishing interface.

Summing up-

Though 5S is quite a simple concept, beginning a new program of it can feel daunting.

You can start by rolling out a plan with practical steps such as deciding the departments and individuals to be involved, what training will be needed, and what tools will be helpful in executing the process.

Determining these concrete steps would help you successfully carry out the process of 5S implementation. Besides, Drupal has the potential to enhance the editorial workflow significantly through its powerful modules and distributions.

Jul 01 2020
Jul 01


Image credit: Aaron Deutsch

DrupalCon Global 2020 is in a couple weeks and there are a lot of amazing sessions. Hope you can make it! While preparing my own DrupalCon Global session, I reviewed the other sessions and made a list of ones you might want to watch to help you prepare for upgrading from Drupal 6 or 7 or 8 to Drupal 9.

In some cases, it was very hard to choose just one on a particular topic. For example, there are 3 great layout builder talks! So, while these are some of my top picks, don't forget to check out all the DrupalCon Global sessions and add your favorites to your schedule.

If you are upgrading from Drupal 8 to Drupal 9 and don't need to make any website improvements, then you can focus on the Drupal 9 sessions. If you will be doing a redesign and/or upgrading from Drupal 6 or 7 to Drupal 9, check sessions for dreaming up your new site, planning your new site architecture, and implementing your new site.

Everything you want to know about Drupal 9: the upgrade process from Drupal 6, 7, and 8, making upgrades faster with a new Acquia migration tool, the nitty gritty details on what makes Drupal 9 special, and new features coming in Drupal 9.1.

Discovery, design, and project management are critical for your web projects. These sessions cover how to manage your team, user stories and user experience work, and other important aspects of designing a great website.

Drupal is a great framework for building the website you want by customizing to your needs. These sessions cover foundational features of the Drupal system: media, layout builder, components, admin tools, SEO, and migrations.

If you are new to Drupal 8 and 9, these sessions will help you understand how to work with composer, configuration management, and Twig as well as create custom modules and migrations.

Hope to see you at DrupalCon Global!

Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I've only started posting here again after many years and I've been very busy reviewing Drupal 9 patches and surviving a pandemic. :) Please ignore the cobbler's old shoes for now, thanks!

Jun 30 2020
Jun 30

When Lehigh University set out to redo its website for prospective undergraduates, one overriding factor was crystal clear to the staff members and stakeholders who were making the key decisions concerning the site. The cohort of digital natives to whom the website needed to appeal was likely to have different ideas about web navigation and the kinds of site structures that make the most sense.

There was also no question that the stakes were high for getting it right. Gen Z has high expectations and little patience for web experiences that are confusing. When prospective undergraduates deem a college website to be subpar, it can stop their search in its tracks.

As a leading research institution, Lehigh University’s determination to get it right was guided by  an approach that characterizes the university’s angle on most endeavors: questioning assumptions, digging deep, and backing up decisions with research.  

 

High-Stakes Inquiry

Working in concert with Promet Source, Lehigh University proceeded with a three-tiered website usability testing process that included one-on-one recorded interviews with prospective undergraduates, in which interviewees were asked to share their screens and navigate different menu options for specific information. These screen shares were recorded to allow navigational experiences and trouble spots to be analyzed closely, with a greater depth of insight than could be gained from simply stating preferences.


Read the Case Study on Usability Testing with Prospective Students

Additional research included online, self-guided assessments of the same navigation menus from a much wider statistical base of the same cohorts. Stakeholders also participated in a self-guided assessment of menu options to help highlight potential differences in web navigation menus that made the most sense to staff vs. potential undergraduates. 

Interested in gathering data and perspective to ensure that your new site hits the right target with the right audience? Contact Promet Source.
 

Jun 30 2020
Jun 30

As a publisher, it is especially important for you to get the most out of what is offered when it comes to pay-per-click advertising. While selling or buying ads in the ad space can be complex, Google allows granular control over all your ads and configurations through Google Ad Manager. Combine this incredible ad management platform with Drupal 9's easy-to-use integration methods, and you will be able to manage multiple ads on your site while getting insightful reports for better optimization.

DFP Integrate

What is DFP?

Google Ad Manager, previously known as DoubleClick for Publishers (DFP), is an ad server that helps individuals or businesses with a good number of page views generate revenue from their pages. This ad platform facilitates both the buying and selling of ads across various ad networks and multiple locations. Google offers its ad server in two variants: Google Ad Manager for Small Business (completely free) and Google Ad Manager 360. It should be noted that the small business offering has some limited features but works well for small to medium-sized businesses.

How to configure Google Ad Manager

1. Creating Ad Units - Ad units are the basic components of Ad Manager. An ad unit defines the size of the ad and the specific location on your website or app where you want to display the ads.
Below is a sample screenshot of an ad unit configured in the Ad Manager account.

DFP - Integration

2. Delivering the Ad unit - To deliver the corresponding ad units, we need to add orders, line items, and creatives.
•    Orders in Google Ad Manager are where we add the advertiser and trafficker. In other words, if company A wants to buy ad space on our site, the first step in setting them up is to create their order in our Google Ad Manager account, since all subsequent line items live within this order.
•    Next, create a line item, which holds information about the specific run dates, targeting, and pricing of one or more creatives.
•    A creative is a specific advertisement, such as an image file, a video file, or other content. One creative can be associated with more than one line item.

How to integrate DFP with Drupal 9

1. First, install and enable the DFP module (see the example commands after this list).
2. Under the Structure menu, go to DFP Add tags. Set the Network ID (prefixed with "/", e.g. /111111) in the Global DFP Settings tab; you will find this ID in your Google Ad Manager account. Save the configuration.

DFP Integrate


3. Fill in the following details in the Add DFP tag form.
   Ad Slot Name → use the same label as the ad unit configured in the Google Ad Manager account
   Size(s) → copy the same sizes as the ad unit configured in the Google Ad Manager account
   Ad Unit Pattern → copy the exact pattern from the "Code" of the ad unit configured in the Google Ad Manager account
   Under "Display Options", make sure "Create a block for this ad tag" is checked.
4. Save the form. This will create a block with the required ad script.
5. Place the block wherever it is required, for example via Structure / Block layout (for all pages).
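
As referenced in step 1, installing the module from the command line typically looks like the following sketch (the package name drupal/dfp is an assumption based on the project's machine name on Drupal.org):

composer require drupal/dfp
drush en dfp -y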

Jun 29 2020
Jun 29

In this blog post, we’ll take a look at some different terms related to experience in the digital landscape: the more established and specific Customer Experience and User Experience, as well as the newer and broader Digital Experience.

We’ll briefly explain each term, emphasizing the differences and connections between them, while reinforcing our points with examples as regularly as possible.
 
In the second part of the post, we’ll list and define some of the most frequently used terms related to these different aspects of experience, in order to help you facilitate conversations and collaborations involving them. 

Customer Experience

This is the experience of a (potential) customer in all stages of the customer journey, from their first interaction with a brand or product to actually having made the purchase - if they were satisfied enough with their experience, they may even turn into a loyal and/or returning customer. 

While customer experience refers both to digital and physical experiences, recent global developments have seen a major rise in the demand for digital customer experiences, with e-commerce solutions that are becoming more and more innovative. This post focuses mostly on digital CX, which is also more closely tied to UX.

  • Connection with UX: a customer buys a (digital) product, while a user uses the product. So, the CX of a product would be covered by the marketing around it, and the UX by the development and UX teams. The CX is focused on creating appeal for the product by showing how it solves particular pains, whereas the UX focuses on its usability, enabling the user to actually solve those pains. 
  • Connection with DX: any customer experience that occurs in the digital is by its nature a digital experience. Interestingly, we’re also seeing a blend of digital and physical CX, with examples such as digital displays and AR technology in retail.

-> In the digital world, CX is closely tied to UX.

User Experience

User experience is an incredibly broad field which has intrinsic ties to some of the other most important aspects of digital experiences, such as SEO and accessibility. In short, user experience is the experience of a user of your product or service (especially digital) who is using the said product or service to achieve a certain goal. 

As already mentioned in the CX section, the number one priority of UX is usability - enabling all potential users to easily make use of a product as a solution to a particular problem. This is why accessibility is so important for UX: any kind of user, no matter their disabilities, should be able to use products and services efficiently.

  • Connection with CX: products such as websites and digital applications may contain the ability to purchase other products and services. Here, CX = UX, because you use the features of the product to help you navigate through your customer journey to finally making the purchase. In the context of a single product, CX becomes UX once the product is purchased and now has to be used.
  • Connection with DX: just like with CX, any user experience taking place in the digital is a digital experience. Again, some of the latest technologies blend physical and digital user experience, with examples such as biometric apps and health tracking devices. 

-> Sometimes UX => CX (e.g. you use a website or application to make a purchase)

-> Sometimes CX => UX (e.g. you purchase a product or service in order to then use it)

Digital Experience

Digital experience is a broader term than customer or user experience. It basically refers to any kind of experience in the digital world, whether it’s CX, UX or even, say, the employee experience in a digital-native business. 

Paralleling the recent explosion of digital channels, new terms such as digital experience platform, or digital experience framework, have arisen to reflect the shift from thinking in terms of the web and content management to focusing instead on digital experience management.

With the digital becoming a ubiquitous part of everyday life, there is a constant demand for digital experiences, wherever and whenever potential users and customers might be. This requires brands to be present and interact with their audiences on every channel they frequent if they want to tap into all of their potential markets.

  • Connection with CX & UX: any customer or user experience that takes place on the web or on a digital device is a digital experience. In light of the whole COVID-19 situation, the lines between CX and UX on the one hand and DX on the other have never been more blurred, as many people have now been relying on the digital for a majority of their experiences. But it is also broader: as their names suggest, CX refers to customers and UX to users, while DX can also include games, movies, music, etc. So, where CX & UX are more related to a digital product or service, digital experience focuses more on the experience part. Of course, UX often remains an essential part of DX - even something as straightforward as watching a YouTube clip requires a basic understanding of the platform’s functionality, while features like personalization also no doubt contribute to the user experience.

-> Since all digital experiences are more or less centered around satisfying certain needs, it can be said that every digital experience is to some extent a digital user experience.

Useful terms

To help you get the most value out of this post, and also not assuming that every single one of our readers is 100% familiar with all the common terms related to digital experience management, we’re including a short glossary of 12 useful terms that you’ll frequently encounter in CX, UX and DX in general.

  • Accessibility (UX / DX): digital accessibility basically means usability for everyone, no matter their physical or mental ability, or the device through which they’re accessing a service.
  • API (DX): an acronym for Application Programming Interface. An API defines the interactions between different software intermediaries, allowing for integrations between different technologies (e.g. a front-end and a back-end framework both relying on the same API).
  • Bounce (CX / UX): this is a term commonly used in data analytics. If a user or customer ‘bounces’, this means that they only viewed a single page on your website before exiting it. 
  • Churn (CX): also called ‘attrition’, churn happens when a (usually regular) customer stops doing business with a brand. The most typical example is when a customer cancels their subscription to a service.
  • CMS (UX / DX): an acronym for Content Management System. As the name suggests, a CMS is a system or framework for managing digital content and presenting said content to visitors. Some of the most popular ones are WordPress and Drupal.
  • CTA (CX / UX): an acronym for ‘Call to action’. This is a ubiquitous element of customer and user experience, and typically occurs in the form of links or buttons with an active, user- or customer-oriented copy which prompts them to perform the desired action.
  • Integration (CX / UX / DX): in the context of digital experiences, integration refers to different technologies being able to work smoothly together, since most digital experiences rely on more than just a single framework. For example, you may have a website built with Drupal that uses Magento for the e-commerce component.
  • IoT (UX / DX): an acronym for Internet of Things, a buzzword which is quickly gaining ground as we see more and more parts of our lives becoming digitally enabled. It basically refers to a network of interconnected objects able to exchange information through the internet (think smart cars, smart refrigerators, etc.)
  • Multichannel (DX): similar to omnichannel, multichannel means serving digital experiences on channels beyond just the web, by capitalizing on all the types of devices that today’s consumers use daily: mobile phones, tablets, and even things such as smart watches, digital displays or digital voice assistants.
  • Personalization (CX / UX / DX): this is one of the main trends in digital experience management. It means tailoring a digital experience to a specific individual as much as possible. This is enabled by technologies such as machine learning and realized through the capabilities offered by leading front-end frameworks.
  • Retention (CX / UX): especially in CX, retention comprises all the business activities that are focused around keeping existing customers. Retention is the opposite of churn or attrition, and there’s the constant pursuit of low churn rates and high retention rates.
  • ROI (CX / UX): an acronym for ‘Return on Investment’, which is a measure used to determine whether a particular task or activity is worth pursuing with regards to the expected business value it will bring.

Conclusion

We hope this post has armed you with a better understanding of the basics of customer and user experience, and how the newer digital experience trend is powering digital transformation on a global scale. 

The past few months have shown us that a digital-first mindset will be the crucial differentiator of success, and those who have digital experience top of mind will be the winners. 

If you’ve also by now realized how important top-notch digital experiences are going to be, but lack the development capabilities to deliver such digital solutions at scale, contact us at Agiledrop and we can supply you with exactly what your next digital experience endeavor needs to succeed. 

Jun 29 2020
Jun 29

Drupal migrations, despite their linearity in terms of definitions, contain a lot of inherited complexity. The reason is very intuitive: although the Migrate API is a supersystem that offers a very simple "interface" of interactions for the user-developer who wants to build migration processes, in reality several subsystems work by interacting with each other throughout a migration process: entities, database, plugins… There are a lot of classes involved in even the simplest migration process. If we add the irrefutable fact that a migration will tend to generate errors in many cases until it has been refined, it's clear then that one of our first needs will be to learn… how to debug migrations.

Picture from Unsplash, user Krzysztof Niewolny, @epan5

Table of Contents

1- Introduction
2- Basic Debugging: Keep an eye on your file
3- Average Debugging with Migrate Devel
4- :wq!

This article is part of a series of posts about Drupal Migrations:

1- Drupal Migrations (I): Basic Resources

2- Drupal Migrations (II): Examples

3- Drupal Migrations (III): Migrating from Google Spreadsheet

4- Drupal Migrations (IV): Debugging Migrations First Part

1- Introduction

In the wake of the latest articles, I wanted to continue expanding the information about migrations in Drupal. I was thinking about writing a sub-series on debugging migrations (inside the main series about Drupal Migrations), and I want to publish the first part now: just a set of basic steps in order to get all the available information from a migration process. All the examples in this post were taken from the migration_google_sheet example in my GitLab account.

2- Basic Debugging (Keep an eye on your files)

First, we will start with a very basic approach to error detection during a migration. To begin with, it is essential to keep the focus on reducing the range of error possibilities as much as possible by approaching the migration in an iterative and incremental manner. In other words: we will go step by step and expand our migrated data.

2.1- Reviewing your Migration description file

First of all, we are going to comment on the most intuitive step of all we will take, since some errors can be caught at first sight; not because they are recurrent, but because they end up being the most obvious.

The first steps in our process of debugging a migration will be a review of two fundamental issues that usually appear in many migrations. So before anything else, we’ll do a quick review of:

Whitespace: Any extra whitespace may be causing us problems in the migration description file: we review all lines of the file in a quick scan in order to detect extra whitespace.

Errors in indentation: The migration description file is based on YAML, a data serialization language structured as key: value pairs arranged in parent-child levels, with an indentation of two spaces to the right at each level down the hierarchy. It is very common for some indentation to be wrong, and this ends up producing an error in the processing of the file. As with the previous case, we will review all the indentation in the file.

You can rely on a YAML syntax review service such as www.yamllint.com, but you will have to monitor the result as well.

2.2- Reviewing records in the database

A basic Drupal installation (standard profile) has seventy-three tables. After the activation of the basic modules related to migrations (migrate, migrate_plus, migrate_tools) and, in this case, the custom migration_google_sheet_wrong module, the number of tables in the database is seventy-five. Two more tables have been generated:

cache_migrate
cache_discovery_migration

But also, later, after executing the migration with ID taxonomy_google_sheet_wrong contained in our custom module, we see in the database that two new tables have been generated related to the executed migration:

  • migrate_map_taxonomy_google_sheet
    This table contains the information related to the movements of a row of data (migrations are operated ‘row’ to ‘row’). Migrate API is in charge of storing in this table the source ID, the destination ID and a hash related to the ‘row’ in this data mapping table. Combinations between the source ID and the hash of the row operation then make it easier to track changes, progressively update information, and cross dependencies when performing a batch migration (see below for how they are articulated).
    The lookup processes for migrations are supported by this data: for example, to load a taxonomy term you must first lookup its “parent” term to maintain the hierarchy of terms. If we go to our database and we do not see recorded results after launching a migration, no data was stored and the migration requires debugging.

  • migrate_message_taxonomy_google_sheet
    In this table, messages associated to the executed migration will be stored, structured in the same way as the previous table (based on the processing of a ‘row’ of the migration), each message with its own identifier and an association to the id_hash of the ‘row’ of origin of the data:

Drupal Migration columns from table Messages

This information can be obtained through Drush, since the content of this table is what is shown on the screen when we execute the instruction:

drush migrate:messages id-migration

And this can be a useful way to start getting information about the errors that occurred during our migration.
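
If you prefer to inspect these tables directly, you can also query them with Drush; a quick sketch (table names taken from this example's migration):

drush sqlq "SELECT sourceid1, destid1, hash FROM migrate_map_taxonomy_google_sheet"
drush sqlq "SELECT * FROM migrate_message_taxonomy_google_sheet"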

2.4- Reloading Configuration objects

Another issue we’ll need to address while debugging our migration is how to make it easier to update the changes made to the configuration object created from the migration description file included in the config/install path.

As we mentioned earlier, each time the module is installed a configuration object is generated that is available in our Drupal installation. In the middle of debugging, we’ll need to modify the file and reload it to check that our changes have been executed. How can we make this easier? Let’s take a look at some guidelines.

On the one hand, we must associate the life cycle of our migration-type configuration object with the installation of our module. To do so, as we noted in section 2.3.2 (Migration factors as configuration), we will declare our own custom migration module as an enforced dependency:

dependencies:
  enforced:
    module:
      - migration_google_sheet

We can use both Drush and Drupal Console to perform specific imports of configuration files, taking advantage of the single import options of both tools:

Using Drupal Console

drupal config:import:single --directory="/modules/custom/migration_google_sheet/config/install" --file="migrate_plus.migration.taxonomy_google_sheet.yml"


Using Drush

drush cim --partial --source=/folder/

Similarly, we can also remove active configuration objects using either Drush or Drupal Console:

drush config-delete "migrate_plus.migration.taxonomy_google_sheet"
drupal config:delete active "migrate_plus.migration.taxonomy_google_sheet"

If we prefer to use the Drupal user interface, there are options such as the contributed Config Delete module (https://www.drupal.org/project/config_delete), which adds extra options to the internal configuration synchronization menu to allow the deletion of configuration items from our Drupal installation. It's enough to download it through Composer and enable it through Drush or Drupal Console:

composer require drupal/config_delete
drush en config_delete -y

Drupal Config Delete actions

This way we can re-import configuration objects without colliding with existing versions in the database. If you choose to update and compare versions of your configuration, then maybe the Configuration Update Manager contributed module can be a good option https://www.drupal.org/project/config_update.

3- Average Debugging with Migrate Devel

Well, we have looked closely at the data as we saw in the previous section and yet our migration of taxonomy terms from a Google Spreadsheet seems not to work.

We have to resort to intermediate techniques in order to obtain more information about the process. In this phase our allies will be some modules and plugins that can help us to better visualize the migration process.

3.1- Migrate Devel

Migrate Devel (https://www.drupal.org/project/migrate_devel) is a contributed module that brings some extra functionality to migration processes through new options for Drush. This module works with migrate_tools and migrate_run.

UPDATE (03/07/2020): Version 8.x-2.0-alpha2

Just as I published this article, Andrew Macpherson (new maintainer of the Migrate Devel module and one of the accessibility maintainers for Drupal core) left a comment, which you can see at the bottom of this post, with some important news. Since I started the first draft of this article, a new version has been published, released on June 28th, which is already compatible with Drush 9 (and I didn't know…). So now you know there's a new version available to download that is compatible with Drush 9 and avoids having to install the patch described below.

To install and enable the module, we proceed to download it through composer and activate it with drush: Migrate Devel 8.x-2.0-alpha2.

composer require drupal/migrate_devel
# To install the 2.0 branch of the module:
composer require drupal/migrate_devel:^2.0
drush en migrate_devel -y

The following applies to versions prior to 8.x-2.0-alpha2:

If you're working with versions prior to 8.x-2.0-alpha2, then you have to know some particularities. The first point is that it was optimized for a previous version of Drush (8) and its port to Drush 9 and higher does not seem to have been completed.

There's a tag 8.x-1.4 from two weeks ago in the 8.x-1.x branch: migrate_devel/tree/8.x-1.4

There is a necessary patch in the project's issue queue to be able to use it with Drush 9 and higher, and if we make use of this module, the patch https://www.drupal.org/node/2938677 will be almost mandatory. The patch does not seem to be in its final version either, but at least it allows a controlled and efficient execution of some features of the module. Here we will see some of its contributions.

And to apply the patch we can download it with wget and apply it with git apply:

cd /web/modules/contrib/migrate_devel/
wget https://www.drupal.org/files/issues/2018-10-08/migrate_devel-drush9-2938677-6.patch 
git apply migrate_devel-drush9-2938677-6.patch

Or place it directly in the patch area of our composer.json file if we have the patch management extension enabled: https://github.com/cweagans/composer-patches.

Using:

composer require cweagans/composer-patches 

And place the new patch inside the “extra” section of our composer.json file:

Drupal Debugging adding the patch

How it works:

Launching a migration process with the parameters provided by Migrate Devel will generate an output of values to the console that we can easily check, for example using --migrate-debug:

Drupal Devel Output first part
Drupal Devel Output second part

This is a partial view of the processing of a single row of migrated data, showing the data source, the values associated with this row and the final destination ID, which is the identifier stored in the migration mapping table for process tracking:

Drupal Devel Output in database

Now we can see in the record that for the value 1 in origin (first array of values), the identifier 117 was assigned for the load in destination. This identifier will also be the internal id of the new entity (in this case taxonomy term) created within Drupal as a result of the migration. This way you can relate the id of the migration with the new entity created and stored.

What about event subscribers? Migrate Devel creates an event subscriber: a class that implements EventSubscriberInterface and listens to events generated by the event system of Drupal's Migrate API, present in the migrate module of Drupal core:

Called from +56 
/var/www/html/web/modules/contrib/migrate_devel/src/EventSubscriber/MigrationEventSubscriber.php

The call is made from the class where events are listened to and actions from the module's Event classes are read. Many events are defined in modules/migrate/src/Event, but in particular, two are listened to by Migrate Devel:

  1. MigratePostRowSaveEvent.php
  2. MigratePreRowSaveEvent.php
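
If you want to react to these same events from your own code, a minimal sketch of a custom event subscriber could look like this (the mymodule module name and class are hypothetical, and the class would also need to be registered in mymodule.services.yml with the event_subscriber tag):

<?php

namespace Drupal\mymodule\EventSubscriber;

use Drupal\migrate\Event\MigrateEvents;
use Drupal\migrate\Event\MigratePostRowSaveEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Example subscriber that reacts after each migrated row is saved.
 */
class MyMigrationSubscriber implements EventSubscriberInterface {

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    return [MigrateEvents::POST_ROW_SAVE => 'onPostRowSave'];
  }

  /**
   * Logs the destination IDs generated for the row that was just saved.
   */
  public function onPostRowSave(MigratePostRowSaveEvent $event) {
    \Drupal::logger('mymodule')->info('Row saved with destination IDs: @ids', [
      '@ids' => implode(', ', $event->getDestinationIdValues()),
    ]);
  }

}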

These are the two Drush options offered by Migrate Devel, and in both cases the result is a call to the Kint dump() function provided by the Devel module to print messages. In fact, the call to Kint has changed in the latest version, 8.x-2.0-alpha2, where Kint is replaced by a series of calls to the dump method of the Symfony VarDumper. Where we used to do:

  /**
   * Pre Row Save Function for --migrate-debug-pre.
   *
   * @param \Drupal\migrate\Event\MigratePreRowSaveEvent $event
   *    Pre-Row-Save Migrate Event.
   */
  public function debugRowPreSave(MigratePreRowSaveEvent $event) {
    $row = $event->getRow();

    $using_drush = function_exists('drush_get_option');
    if ($using_drush && drush_get_option('migrate-debug-pre')) {
      // Start with capital letter for variables since this is actually a label.
      $Source = $row->getSource();
      $Destination = $row->getDestination();

      // We use kint directly here since we want to support variable naming.
      kint_require();
      \Kint::dump($Source, $Destination);
    }
  }

Now we’re doing:

  /**
   * Pre Row Save Function for --migrate-debug-pre.
   *
   * @param \Drupal\migrate\Event\MigratePreRowSaveEvent $event
   *    Pre-Row-Save Migrate Event.
   */
  public function debugRowPreSave(MigratePreRowSaveEvent $event) {
    if (PHP_SAPI !== 'cli') {
      return;
    }

    $row = $event->getRow();

    if (in_array('migrate-debug-pre', \Drush\Drush::config()->get('runtime.options'))) {
      // Start with capital letter for variables since this is actually a label.
      $Source = $row->getSource();
      $Destination = $row->getDestination();

      // Uses Symfony VarDumper.
      // @todo Explore advanced usage of CLI dumper class for nicer output.
      // https://www.drupal.org/project/migrate_devel/issues/3151276
      dump(
        '---------------------------------------------------------------------',
        '|                             $Source                               |',
        '---------------------------------------------------------------------',
        $Source,
        '---------------------------------------------------------------------',
        '|                           $Destination                            |',
        '---------------------------------------------------------------------',
        $Destination
      );
    }
  }

You can see the update and changes in migrate_devel/8.x-2.0-alpha2/src/EventSubscriber/MigrationEventSubscriber.php.

And you can get more information about creating events and event subscribers in Drupal here in The Russian Lullaby: Building Symfony events for Drupal.

3.2- Debug Process Plugin

The contributed module Migrate Devel also brings a new processing plugin called “debug” and defined in the Debug.php class. This PHP class can be found in the module path: /web/modules/contrib/migrate_devel/src/Plugin/migrate/process/Debug.php and we can check its responsibility by reading its annotation section in the class header:

/**
 * Debug the process pipeline.
 *
 * Prints the input value, assuming that you are running the migration from the
 * command line, and sends it to the next step in the pipeline unaltered.
 *
 * Available configuration keys:
 * - label: (optional) a string to print before the debug output. Include any
 *   trailing punctuation or space characters.
 * - multiple: (optional) set to TRUE to ask the next step in the process
 *   pipeline to process array values individually, like the multiple_values
 *   plugin from the Migrate Plus module.

It consists directly of the transform() method, inherited from the ProcessPluginBase abstract class, where instead of applying transformation actions during processing, it simply uses PHP's print_r function to display information on the console, printing both scalar values and arrays of values.

This plugin can be used autonomously, being included as part of the migration pipeline, so that it prints results throughout the processing of all value rows. In this case, we are going to modify the pipeline of the processing section of our taxonomy terms migration, with the idea of reviewing the values being migrated.

To begin with, we are going to modify the structure. We already know (from previous chapters) that this shorthand form:

process:
 name: name
 description: description
 path: url
 status: published

It’s just an implicit way of using the Get.php Plugin which is equivalent to:

process:
 name:
   plugin: get
   source: name
 description:
   plugin: get
   source: description
 path:
   plugin: get
   source: url
 status:
   plugin: get
   source: published

Now we turn each mapping into an explicit pipeline (a YAML list of plugins) and append the debug plugin, with an informative label, after get; debug receives the value from get, prints it, and passes it on unchanged:

process:
 name:
   - plugin: get
     source: name
   - plugin: debug
     label: 'Processing name field value: '
 description:
   - plugin: get
     source: description
   - plugin: debug
     label: 'Processing description field value: '
 path:
   - plugin: get
     source: url
   - plugin: debug
     label: 'Processing path field value: '
 status:
   - plugin: get
     source: published
   - plugin: debug
     label: 'Processing status field value: '

After this change we reload the migration configuration object by uninstalling and reinstalling our module (the migration configuration is marked as a dependency of the module, so it is removed on uninstall):

drush pmu migration_google_sheet && drush en migration_google_sheet -y 

So when we run the migration now we will get on screen information about the values:

Drupal Migrate Devel feedback

This way we get more detailed feedback on the information being migrated. For more advanced scenarios, we can combine several arguments and options to gather as much information as possible. For example, to review the information for a single element of the migration, we can run something like:

 drush migrate-import --migrate-debug taxonomy_google_sheet --limit="1 items"

This shows the output after storage (unlike the --migrate-debug-pre option), combining the output of the debug plugin, the dumped row values, and the final storage ID of the single processed entity.

In this case we only see basic values with little processing complexity (we simply extract from the source and load into the destination), but in later migrations we will perform more complex processing and this feedback will become much more valuable. Think about processing data values that must be adapted (concatenated, trimmed, appended, etc.)… if we add feedback at each step, we can better observe the transformation sequence.

Here you can check the Plugin code: migrate_devel/src/Plugin/migrate/process/Debug.php.

Here you can review the Drupal.org Issue where the idea of implementing this processing Plugin originated: https://www.drupal.org/node/3021648.

Well, this approach to debugging migrations opens the series on debugging… more experiences coming soon!

4- :wq!


Jun 28 2020
All US lighthouses on a map.

Waaaaay back in 2013, I wrote a blog post about importing and mapping over 5,000 points of interest in 45 minutes using (mainly) the Feeds and Geofield modules. Before that, I had also done Drupal 6 demos of importing and displaying earthquake data. 

With the recent release of Drupal 9, I figured it was time for a modern take on the idea - this time using the Drupal migration system as well as (still!) Geofield. 

This time, for the source data, I found a .csv file of 814 lighthouses in the United States that I downloaded from POI Factory (which also appears to be a Drupal site). 

Starting point

First, start with a fresh Drupal 9.0.1 site installed using the drupal/recommended-project Composer template. Then, use Composer to require Drush and the following modules:

composer require drush/drush drupal/migrate_tools drupal/migrate_source_csv drupal/migrate_plus drupal/geofield drupal/geofield_map drupal/leaflet

Then, enable the modules using

drush en -y migrate_plus migrate_tools migrate_source_csv geofield geofield_map leaflet

Overview of approach

To achieve the goal of importing all 814 lighthouses and displaying them on a map, we're going to import the .csv file using the migration system into a new content type that includes a Geofield configured with a formatter that displays a map (powered by Leaflet).

The source data (.csv file) contains the following fields: 

  • Longitude
  • Latitude
  • Name
  • Description

So, our tasks will be:

  1. Create a new "lighthouse" content type with a "Location" field of type Geofield that has a map formatter (via Geofield map).
  2. Prepare the .csv file.
  3. Create a migration that reads the .csv file and creates new nodes of type "Lighthouse".

Create the Lighthouse content type

We will reuse the Drupal title and body fields for the Lighthouse .csv's Name and Description fields. 

Then, all we need to add is a new Geofield location field for the longitude and latitude:

Geofield configuration
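
If you prefer to check the result as configuration, the exported field storage could look roughly like the following - assuming the field machine name field_location (the name we map to later in the migration); your actual export may differ slightly:

# Approximate contents of field.storage.node.field_location.yml (illustrative only).
langcode: en
status: true
dependencies:
  module:
    - geofield
    - node
id: node.field_location
field_name: field_location
entity_type: node
type: geofield
settings:
  backend: geofield_backend_default
module: geofield
locked: false
cardinality: 1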

Next, we'll test out the new Lighthouse content type by manually creating a new node from the data in the .csv file. This will also be helpful as we configure the Geofield map field formatter (using Leaflet).

Mystic lighthouse
 

By default, a Geofield field uses the "Raw output" formatter. With Leaflet installed and enabled, we can utilize the "Leaflet map" formatter (with the default configuration options).

Leaflet formatter

With this minor change, our test Lighthouse node now displays a map!
 

Mystic lighthouse on a map!

Prepare the .csv file

Prior to writing a migration for any .csv file, it is advised to review the file to ensure it will be easy to migrate (and roll back). Two things are very important:

  • Column names
  • Unique identifier

Column names help in mapping .csv fields to Drupal fields while a unique identifier helps with migration rollbacks. While the unique identifier can be a combination of multiple fields, I find it easiest to add my own when it makes sense. 

The initial .csv file looks like this (opened in a spreadsheet):
 

CSV file before modifications

In the case of the lighthouse .csv file in this example, it has neither column names nor a unique identifier field. To rectify this, open the .csv as a spreadsheet and add both. For the unique identifier field, I prefer a simple integer field. 

Once manually updated, it looks like this:

CSV file after modifications
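
In text form, the updated file now starts with a header row followed by one line per lighthouse. The values below are made up for illustration, but the column names match the migration configuration we'll write next:

ID,Lon,Lat,Name,Description
1,-71.966,41.354,Example Light,"Example description of the first lighthouse"
2,-70.596,41.521,Another Example Light,"Another example description"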

Create the migration

If you've never used the Drupal 8/9 migration system before, it can be intimidating, but at its heart, it is basically just a tool that:

  • Reads source data
  • Maps source data to the destination
  • Creates the destination

Writing your first migration is a big step, so let's get started.

The first step is to create a new custom module to house the migration. Create a new, empty web/modules/custom/ directory, then generate the module's scaffolding with Drush's "generate" command:

$ drush generate module

 Welcome to module-standard generator!
–––––––––––––––––––––––––––––––––––––––

 Module name:
 ➤ Lighthouse importer

 Module machine name [lighthouse_importer]:
 ➤ 

 Module description [The description.]:
 ➤ Module for importing lighthouses from .csv file.

 Package [Custom]:
 ➤ DrupalEasy

 Dependencies (comma separated):
 ➤ migrate_plus, migrate_source_csv, geofield

 Would you like to create install file? [Yes]:
 ➤ No

 Would you like to create libraries.yml file? [Yes]:
 ➤ No

 Would you like to create permissions.yml file? [Yes]:
 ➤ No

 Would you like to create event subscriber? [Yes]:
 ➤ No

 Would you like to create block plugin? [Yes]:
 ➤ No

 Would you like to create a controller? [Yes]:
 ➤ No

 Would you like to create settings form? [Yes]:
 ➤ No

 The following directories and files have been created or updated:
–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
 • modules/lighthouse_importer/lighthouse_importer.info.yml
 • modules/lighthouse_importer/lighthouse_importer.module

Then, let's create a new web/modules/custom/lighthouse_importer/data/ directory and move the updated .csv file into it - in my case, I named it Lighthouses-USA-updated.csv.

Next, we need to create the lighthouse migration's configuration - this is done in a .yml file that will be located at web/modules/custom/lighthouse_importer/config/install/migrate_plus.migration.lighthouses.yml

The resulting module's file structure looks like this:

web/modules/custom/lighthouse_importer/
  config/
    install/
      migrate_plus.migration.lighthouses.yml
  data/
    Lighthouses-USA-updated.csv
  lighthouse_importer.info.yml
  lighthouse_importer.module

Note that the lighthouse_importer.module, created by Drush, is empty. 

While there are a couple of ways to create the migration configuration, we're going to leverage the Migrate Plus module. 

For more information about writing migrations using code or configurations, check out this blog post from UnderstandDrupal.com.

One of the big hurdles of learning to write Drupal migrations is figuring out where to start. It doesn't make much sense to write the migrate_plus.migration.lighthouses.yml from scratch; most experienced migrators start with an existing migration and tailor it to their needs. In this case, we'll start with the core Drupal 7 node migration (web/core/modules/node/migrations/d7_node.yml).

Let's break up the configuration of the new lighthouse migration into three parts: 

  • Everything before the "process" section.
  • Everything after the "process" section.
  • The "process" section.

Everything before the "process" section

Our starting point (d7_node.yml) looks like this:
 

id: d7_node
label: Nodes
audit: true
migration_tags:
  - Drupal 7
  - Content
deriver: Drupal\node\Plugin\migrate\D7NodeDeriver
source:
  plugin: d7_node

Let's update it to look like this:

id: lighthouses
label: Lighthouses
source:
  plugin: 'csv'
  path: '/var/www/html/web/modules/custom/lighthouse_importer/data/Lighthouses-USA-updated.csv'
  ids:
    - ID
  fields:
    0:
      name: ID
      label: 'Unique Id'
    1:
      name: Lon
      label: 'Longitude'
    2:
      name: Lat
      label: 'Latitude'
    3:
      name: Name
      label: 'Name'
    4:
      name: Description
      label: 'Description'

The main difference is the definition of the "source". In our case, since we're using a .csv as our source data, we have to fully define it for the migration. The Migrate Source CSV module documentation is very helpful in this situation.

Note that the "path" value is absolute. 

The "ids" section informs the migration system which field(s) is the unique identifier for each record.

The "fields" section lists all of the fields in the .csv file (in order) so that they are available (via their "name") to the migration. 

Everything after the "process" section

This is often the easiest part of the migration configuration system to write. Often, we just have to define what type of entity the migration will be creating as well as any dependencies. In this example, we'll be creating nodes and we don't have any dependencies. So, the entire section looks like this:

destination:
  plugin: entity:node

The "process" section

This is where the magic happens - in this section we map the source data to the destination fields. The format is destination_value: source_value.

As we aren't migrating data from another Drupal site, we don't need the nid or vid fields - we'll let Drupal create new node and revision identifiers as we go.

As we don't have much source data, we'll have to set several default values for some of the fields Drupal is expecting. Others we can just ignore and let Drupal set its own default values.

Starting with just the mapping from d7_node.yml, we can modify it to:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status: 
    plugin: default_value
    default_value: 1

Note that we set the default language to "und" (undefined), the default author to UID 1, and the status to 1 (published). The only actual source data we're mapping to the destination (so far) is the "Name", which we are mapping to the node title.

One thing that is definitely missing at this point is the "type" (content type) of node we want the migration to create. We'll add a "type" mapping to the "process" section with a default value of "lighthouse".  

We have three additional fields from the source data that we want to import into Drupal: longitude, latitude, and the description. Luckily, the Geofield module includes a migration processor, which allows us to provide it with the longitude and latitude values and it does the dirty work of preparing the data for the Geofield. For the Description, we'll just map it directly to the node's "body/value" field and let Drupal use the default "body/format" value ("Basic HTML"). 

So, the resulting process section looks like:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status: 
    plugin: default_value
    default_value: 1
  type:
    plugin: default_value
    default_value: lighthouse
  field_location:
    plugin: geofield_latlon
    source:
      - Lat
      - Lon
  body/value: Description

Once complete, enable the module using 

drush en -y lighthouse_importer

It is important to note that as we are creating this migration using a Migrate Plus configuration entity, the configuration in the migrate_plus.migration.lighthouses.yml is only imported into the site's "active configuration" when the module is enabled. This is often less-than-ideal, as it means every time you make a change to the migration's .yml, you need to uninstall and then re-enable the module for the updated migration to be imported.

The Config devel module is often used to automatically import config changes on every page load. Note that this module is normally for local use only - it should never be used in a production environment. As of the authoring of this blog post, the patch to make Config Devel compatible with Drupal 9 is RTBC. In the meantime, you can use the following to update the active config each time you make a change to your lighthouses migration configuration:

drush config-delete migrate_plus.migration.lighthouses -y  && drush pm-uninstall lighthouse_importer -y && drush en -y lighthouse_importer

Testing and running the migration

Use the migrate-status (ms) command (provided by the Migrate Tools module) to check the status of our migration:

$ drush ms lighthouses
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 
  Group               Migration ID   Status   Total   Imported   Unprocessed   Last Imported  
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 
  Default (default)   lighthouses    Idle     814     0          814                          
 ------------------- -------------- -------- ------- ---------- ------------- --------------- 

If everything looks okay, then let's run the first 5 rows of the migration using the migrate-import (mim) command:

$ drush mim lighthouses --limit=5
 [notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'

Confirm the migration by viewing your new nodes of type "lighthouse"!

If all looks good, run the rest of the migration by leaving out the --limit=5 bit:

$ drush mim lighthouses          
 [notice] Processed 804 items (804 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'

If you don't like the results, then you can rollback the migration using "drush migrate-rollback lighthouses" (or "drush mr lighthouses"), make your changes, update the active config, and re-import. 

Next steps

There's a lot more to the Drupal migration system, but hopefully this example will help instill some confidence in you for creating your own migrations. 

The "Leaflet Views" module (included with Leaflet) makes it easy to create a view that shows all imported lighthouses on a single map (see the image at the top of the article). Once you have the data imported, there's so much that you can do!
 

Jun 27 2020

Ashraf Abed, founder of Debug Academy and Drupal.tv talks with Ryan about the Debug Academy's long-form Drupal training. Also, Mike and Ryan take a trip around recent events in the Drupal Community.

URLs mentioned

DrupalEasy News

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Jun 26 2020

For our Drupal distribution we needed to redirect all anonymous users to the login page, as for now it's implemented as a closed platform for social collaboration. The Drupal 8 way didn't work anymore; we fixed it for Drupal 9 and published a working module. 

So if you're building a Drupal social intranet, collaboration tool, or community, this might help you direct users the right way - so they don't get an unfriendly 'access denied'.

Keep in mind that you still have to build the correct access control into all your pages with help of permissions / access checks / advanced route access, this module doesn't provide for that.

Clone and run off

We published the Drupal module here on Github so you can copy it and run off with it to do whatever you need. At the moment it's not a published Drupal.org project with all kinds of configurable stuff.

A short explanation of the code

First you'll have to register the 'Event Subscriber' as a service in the module's .services.yml file.

drupal event subscriber

More info: Subscribe to and dispatch events.
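
Since the screenshot isn't reproduced here, a minimal sketch of that .services.yml registration could look like this; the module name, service id, and class name are assumptions, so adjust them to your own module:

# anonymous_redirect.services.yml (module name assumed).
services:
  anonymous_redirect.anonymous_redirect_subscriber:
    class: Drupal\anonymous_redirect\EventSubscriber\AnonymousRedirectSubscriber
    arguments: ['@current_user']
    tags:
      - { name: event_subscriber }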

Next, it took a while before we figured it out, but this code in /src/EventSubscriber/AnonymousRedirectSubscriber.php is all it takes to make it work:

Drupal anonymous redirect subscriber

  1. More info on responding to events in Drupal here on Drupalize.me.
  2. Get the current user with dependency injection.
  3. Get the current request with ->getRequest(); here's what differs from the Drupal 8 version: we couldn't get ->getRouteName() to work properly.
  4. Check if the current user is anonymous, but exclude some paths. Also, we had to handle user/reset/* with PHP's fnmatch(), because that path contains a variable (the reset hash).
  5. Respond with the redirect to Drupal's login page (a sketch of the whole class follows after this list).
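
Putting those five steps together, a minimal sketch of the subscriber could look like the following. The namespace, method name, and allowed paths are assumptions based on the list above, not the exact published code - see the GitHub repository for the real implementation:

<?php

namespace Drupal\anonymous_redirect\EventSubscriber;

use Drupal\Core\Session\AccountProxyInterface;
use Drupal\Core\Url;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\RedirectResponse;
use Symfony\Component\HttpKernel\Event\RequestEvent;
use Symfony\Component\HttpKernel\KernelEvents;

/**
 * Redirects anonymous users to the login page.
 */
class AnonymousRedirectSubscriber implements EventSubscriberInterface {

  /**
   * The current user (injected, step 2).
   *
   * @var \Drupal\Core\Session\AccountProxyInterface
   */
  protected $currentUser;

  public function __construct(AccountProxyInterface $current_user) {
    $this->currentUser = $current_user;
  }

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    return [KernelEvents::REQUEST => ['onRequest']];
  }

  /**
   * Redirects anonymous users, except on a few allowed paths.
   */
  public function onRequest(RequestEvent $event) {
    // Step 4: only act on anonymous users.
    if (!$this->currentUser->isAnonymous()) {
      return;
    }

    // Step 3: work with the raw path from the current request.
    $path = $event->getRequest()->getPathInfo();

    // Paths that must stay reachable anonymously (illustrative list);
    // user/reset/* contains a variable (the reset hash), so it is matched
    // with fnmatch().
    $allowed = ['/user/login', '/user/password', '/user/register'];
    if (in_array($path, $allowed, TRUE) || fnmatch('/user/reset/*', $path)) {
      return;
    }

    // Step 5: respond with a redirect to Drupal's login page.
    $login_url = Url::fromRoute('user.login')->toString();
    $event->setResponse(new RedirectResponse($login_url));
  }

}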

Drupal 8 version

You can find the previous Drupal 8 code here, and we also found the Anonymous login module, but neither worked in our Drupal 9 (then beta) install.

'quick-start' Drupal to test

To test this module quickly in a fresh install, Drupal's quick-start command might come in handy.
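
For example, from the root of a project built with the drupal/recommended-project template (after composer install), something along these lines spins up a throwaway site on the built-in PHP web server - the path to the script depends on your project layout:

# The quick-start script lives under web/core in a recommended-project checkout.
php web/core/scripts/drupal quick-start standard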

Need this as a contrib Drupal module?

Please let me know in the comments down below; if there is enough positive feedback we might make it configurable and contribute it!

Jun 25 2020

The fundamental building blocks of running an efficient and user-focused public transportation network and building a well-designed, effective, and user-centric website are actually pretty similar: You need talented people, quality data, and elite technology to get the most out of your investment.

That’s why the widespread adoption of open data standards combined with an effective and affordable technology like Drupal helps to ensure that public transit works for all users.

Ultimately, the key to great transit service is not about getting 100 percent of people to ride public transit for 100 percent of their trips. Success comes from giving people a viable choice for getting around without needing to drive -- a choice built on affordability, convenience, and quality.

Giving people viable choices to get around does not end with good urban planning, congestion management, and low fares. It includes giving people the information they need to plan trips, to plan their day, and to plan travel across, through, and around their cities using transportation solutions that meet their evolving mobility needs.

Where does most of that information come from? Open Data.

Open Source & Open Data: A Smooth Ride

Many cities have General Transit Feed Specification (GTFS) feeds available online. These are usually curated by regional public transit agencies. 

GTFS is just one example of the many Open Data resources available and in use by transit agencies. But having access to that data is only part of the equation. The important question to answer is how to manage that data and repurpose it in a way that is responsive, accessible, meaningful, and convenient for people to consume.

Transit authorities, including such mass transit hubs as the Santa Clara Valley Transportation Authority, the Bay Area Rapid Transit District, and the New York City Metropolitan Transportation Authority, are turning to open source technologies, like Drupal. 

Why? Because it is possible to handle real-time data in Drupal and harness resources such as GTFS feeds, Google APIs, and other APIs to fuel a great-looking, purpose-driven site, be it on a smartphone, an iPad, or pushed outward to something else entirely, like digital billboards and signage solutions.

Why Drupal for Transportation?

The Drupal open-source content management system (CMS) fits the unique needs of the transportation and transit industry. 

Drupal supports:

  • high-traffic websites with hundreds, thousands, or more registered users of varying privileges and access roles;
  • websites that require the ability for many users to act as contributors and publish content in the form of pages, articles, blog posts, and forum posts;
  • sites with complex structures that require a finely tuned architecture to serve relevant content to its end users;
  • organizations that demand very high security of their websites; and
  • websites that receive a high volume of traffic and require a solid backend in order to ensure functionality in spite of traffic spikes.

Drupal is non-proprietary and benefits from one of the largest open-source communities in the world. It has more than a million passionate developers, designers, trainers, strategists, coordinators, editors, and sponsors working together to constantly develop, iterate, update, refine, and improve its technology.

In addition, thousands of Drupal service providers benefit from support through digital experience platform Acquia’s forward-thinking, ever-expanding catalogue of enterprise-ready technology solutions and technical support. 

I encourage you to connect with us to learn more about Drupal solutions in transportation -- or any other large-scale industry. We'd also welcome the opportunity to bid on your next project or respond to an RFP. 
