Mike Goulding

Senior Drupal Engineer

Mike has been part of the Four Kitchens crew since 2018, where he works as a senior engineer and tech lead for a variety of Drupal and WordPress projects.

March 21, 2024


There are many different options available for the organization or team that decides it is time to decouple their Drupal site. There are frameworks that are designed for static site generation (SSG) and there are others that use server-side rendering (SSR), with many that claim to do both well.

React and NextJS have been popular options for a while now, and they are well-loved here at Four Kitchens as well. Another framework that is a little different from the above is Astro, and it may be worth considering.

What is Astro?

Astro is an interesting framework to work with, and it only becomes more so with time. Astro’s website makes claims of performance advantages over many other frameworks in the space, and the full comparison report is available there.

More interesting than performance claims are some of the unique features this framework brings with it. Astro has many official integrations for other popular JS frameworks. This means, for example, that part of a page could use React, while another part could use Svelte. An even more ambitious page could use Vue, React, and AlpineJS for different components. While these examples are not a typical or recommended use case, they do illustrate that flexibility is one of the real strengths of Astro.

This flexibility doesn’t come with a steep learning curve, as Astro makes use of enough familiar pieces so that newcomers aren’t immediately overwhelmed. It is possible to write Astro components in a straightforward manner, similar to HTML, and still incorporate JavaScript XML (JSX) expressions to include data in the component’s output. There are a couple of tutorials for getting started with Astro, and they do a good job of giving the general structure of a project along with some scenarios that are unique to the framework.

Houston, Astro's mascot

(Also, Houston is an adorable mascot and I am here for it!)

Using Astro with Drupal

Despite all of the integrations that can be found in the Astro toolset, one key thing is notably missing: There isn’t an existing integration for Drupal! The content management systems (CMSs) that Astro recommends are specifically headless CMSs, which make for a more natural starting point for this setup than converting a Drupal site.

Never fear, though! Drupal may not specifically be on that list, but that doesn’t mean it isn’t something that should be considered. Astro has that incredible flexibility, after all, and that means there are more options than it seems on the surface. All that is needed is an endpoint (or several) to fetch data from Drupal, and things are looking up once again.

Using the Drupal GraphQL and GraphQL Compose modules, it is possible to quickly get data ready to expose from Drupal and into the hands of a decoupled framework like Astro. With that, it becomes possible to fetch that data within Astro and build our frontend while taking advantage of many of the features that Astro offers. This can also be done with REST API or JSON:API, but for our purposes, the consistency and structure of GraphQL can’t be beat when crafting a decoupled integration with Drupal.

Astro with GraphQL

Using the fetch function that is available to Astro (and JavaScript in general), we can get data from just about anywhere into our Astro components. This blends well with the head start from the GraphQL Compose module, as you can take an existing Drupal site and be ready to connect to a frontend framework very quickly. This means quicker prototyping and quicker assembling of components.
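As a sketch of what that fetch might look like, the helper below builds the POST payload for a GraphQL call to Drupal. The endpoint URL, the query, and the field names (route, NodeArticle, and so on) are assumptions here; they depend entirely on your GraphQL Compose configuration.

```javascript
// Hypothetical endpoint; GraphQL module typically exposes /graphql.
const DRUPAL_GRAPHQL_ENDPOINT = 'https://example.com/graphql';

// A query for a node by its route. Field names are assumptions that
// depend on how GraphQL Compose is configured on your site.
const nodeByPathQuery = `
  query NodeByPath($path: String!) {
    route(path: $path) {
      ... on RouteInternal {
        entity {
          ... on NodeArticle {
            title
          }
        }
      }
    }
  }
`;

// Build the request init object that fetch() needs for a GraphQL POST.
function buildGraphqlRequest(query, variables) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, variables }),
  };
}

// Inside an Astro component's frontmatter you could then do (not run here):
// const res = await fetch(DRUPAL_GRAPHQL_ENDPOINT,
//   buildGraphqlRequest(nodeByPathQuery, { path: '/about-us' }));
// const { data } = await res.json();
```

Keeping the request-building separate from the fetch call makes the piece that varies per site easy to test on its own.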

Astro also supports dynamic routing out of the box, which is an essential feature when connecting to a Drupal site, where routes aren’t always structured like directories. Using this wildcard type of functionality, we can more easily take an existing site — regardless of the structure of the content — and get its output into Astro. With the data from the routes in hand, we can get to the fun part: building the components and taking advantage of more of Astro’s flexibility.
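A minimal sketch of that wiring: Astro’s catch-all routes (such as src/pages/[...slug].astro) expect getStaticPaths() to return a list of params/props objects. The input shape below is an assumption, mirroring what a query for published Drupal nodes might return.

```javascript
// Turn a list of Drupal nodes (path alias + title assumed) into the
// shape Astro's getStaticPaths() expects for a [...slug] route.
function toStaticPaths(drupalNodes) {
  return drupalNodes.map((node) => ({
    // Astro's [...slug] param must not have a leading slash.
    params: { slug: node.path.replace(/^\//, '') },
    // Props are handed to the page component, avoiding a second fetch.
    props: { title: node.title },
  }));
}

// Example usage with hypothetical data:
const paths = toStaticPaths([
  { path: '/about-us', title: 'About Us' },
  { path: '/blog/astro-and-drupal', title: 'Astro and Drupal' },
]);
// paths[0].params.slug is 'about-us'
```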

Flexibility is key

For me, Astro’s strength doesn’t solely come from the speed that it builds and renders content or the ease of building pages in a familiar JSX or Markdown pattern. Its real strength comes from the flexibility and variety of build options. While it does a great job handling some functionality on a given component or creating simple pages for a blog listing, it does even more with the ability to bring in other frameworks inside of components. Want to add a search page, but there isn’t an existing integration for Astro? If there is one for React, that works here, too! Do you have an internal team member really excited about building personalized content with Vue? Bring that in, and that component will work as well.

While the reality of the implementations may be a bit more involved than described on the tin, it is surprisingly easy and encouraged to bring in live updating components inside of Astro. This changes what would otherwise be a run-of-the-mill frontend tool into something much more interesting. Astro does shine in its own right, especially with statically generated pages and content. It just wouldn’t be doing anything especially new without bringing in other frameworks.

This is also where bringing a CMS like Drupal into a decoupled setup with Astro is intriguing. There is an opportunity for highly dynamic pages that wouldn’t work with a traditional static framework, while still getting the speed and benefits of that approach. Drupal sites are typically very quick to update when content changes, which can be a sticking point for a decoupled architecture: How often should the frontend be rebuilt, and how much can caching make up the difference? By having some parts of the site use components that can update on the page, the benefits of both approaches can come through.


Marc Berger

Senior Backend Engineer

Always looking for a challenge, Marc tries to add something new to his toolbox for every project and build — be it a new CSS technology, creating custom APIs, or testing out new processes for development.

March 13, 2024

Recently, one of our clients had to retrieve some information from their Drupal site during a CI build. They needed to know the internal Drupal path from a known path alias. Common Drush commands don’t provide this information directly, so we decided to write our own custom Drush command. It was a lot easier than we thought it would be! Let’s get started.

Note: This post is based on commands and structure for Drush 12.

While we can write our own Drush command from scratch, let’s discuss a tool that Drush already provides us: the drush generate command. Drush 9 added support to generate scaffolding and boilerplate code for many common Drupal coding tasks such as custom modules, themes, services, plugins, and many more. The nice thing about using the drush generate command is that the code it generates conforms to best practices and Drupal coding standards — and some generators even come with examples as well. You can see all available generators by simply running drush generate without any arguments.

Step 1: Create a custom module

To get started, creating a new custom Drush command in this way requires an existing custom module in the codebase. If one exists, great. You can skip to Step 2 below. If you need a custom module, let’s use Drush to generate one:

drush generate module

Drush will ask a series of questions such as the module name, the package, any dependencies, and if you want to generate a .module file, README.md, etc. Once the module has been created, enable the module. This will help with the autocomplete when generating the custom Drush command.

drush en

Step 2: Create custom Drush command boilerplate

First, make sure you have a custom module where your new custom Drush command will live and make sure that module is enabled. Next, run the following command to generate some boilerplate code:

drush generate drush:command-file

This command will also ask some questions, the first of which is the machine name of the custom module. If that module is enabled, it will autocomplete the name in the terminal. You can also tell the generator to use dependency injection if you know what services you need to use. In our case, we need to inject the path_alias.manager service. Once generated, the new command class will live here under your custom module:

/src/Drush/Commands

Let’s take a look at this newly generated code. We will see the standard class structure and our dependency injection at the top of the file:

  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('token'),
      $container->get('path_alias.manager'),
    );
  }

Note: The generator adds a comment about needing a drush.services.yml file. This requirement is deprecated and will be removed in Drush 13, so you can ignore it if you are using Drush 12. In our testing, this file does not need to be present.

Further down in the new class, we will see some boilerplate example code. This is where the magic happens:

/**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:command-name', aliases: ['foo'])]
  #[CLI\Argument(name: 'arg1', description: 'Argument description.')]
  #[CLI\Option(name: 'option-name', description: 'Option description')]
  #[CLI\Usage(name: 'custom_drush:command-name foo', description: 'Usage description')]
  public function commandName($arg1, $options = ['option-name' => 'default']) {
    $this->logger()->success(dt('Achievement unlocked.'));
  }

This new Drush command doesn’t do very much at the moment, but it provides a great jumping-off point. The first thing to note at the top of the function is the set of new PHP 8 attributes that begin with #. These replace the PHP annotations commonly seen when writing custom plugins in Drupal. You can read more about attributes in the PHP documentation.

The different attributes tell Drush what our custom command name is, description, what arguments it will take (if any), and any aliases it may have.

Step 3: Create our custom command

For our custom command, let’s modify the code so we can get the internal path from a path alias:

/**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:internal-path', aliases: ['intpath'])]
  #[CLI\Argument(name: 'pathAlias', description: 'The path alias, must begin with /')]
  #[CLI\Usage(name: 'custom_drush:internal-path /path-alias', description: 'Supply the path alias and the internal path will be retrieved.')]
  public function getInternalPath($pathAlias) {
    if (!str_starts_with($pathAlias, "/")) {
      $this->logger()->error(dt('The alias must start with a /'));
    }
    else {
      $path = $this->pathAliasManager->getPathByAlias($pathAlias);
      if ($path == $pathAlias) {
        $this->logger()->error(dt('There was no internal path found that uses that alias.'));
      }
      else {
        $this->output()->writeln($path);
      }

    }
    //$this->logger()->success(dt('Achievement unlocked.'));
  }

What we’re doing here is changing the name of the command so it can be called like so:

drush custom_drush:internal-path or via the alias: drush intpath

The path alias is a required argument (such as /my-amazing-page) because of how it is declared in the getInternalPath method. By passing a path, this method first checks to see if the path starts with /. If it does, it performs an additional check to see if an internal path exists for that alias. If so, it returns the internal path, i.e., /node/1234. Error messages are provided by the logger() method that comes from the inherited DrushCommands class, while the path itself is printed with output()->writeln(). It’s a simple command, but one that helped us automatically set config during a CI job.

Table output

Note the boilerplate code also generated another example below the first — one that will provide output in a table format:

/**
   * An example of the table output format.
   */
  #[CLI\Command(name: 'custom_drush:token', aliases: ['token'])]
  #[CLI\FieldLabels(labels: [
    'group' => 'Group',
    'token' => 'Token',
    'name' => 'Name'
  ])]
  #[CLI\DefaultTableFields(fields: ['group', 'token', 'name'])]
  #[CLI\FilterDefaultField(field: 'name')]
  public function token($options = ['format' => 'table']): RowsOfFields {
    $rows = [];
    $all = $this->token->getInfo();
    foreach ($all['tokens'] as $group => $tokens) {
      foreach ($tokens as $key => $token) {
        $rows[] = [
          'group' => $group,
          'token' => $key,
          'name' => $token['name'],
        ];
      }
    }
    return new RowsOfFields($rows);
  }

In this example, no argument is required, and it will simply print out the list of tokens in a nice table:

------------ ------------------ ----------------------- 
  Group        Token              Name                   
------------ ------------------ ----------------------- 
  file         fid                File ID                
  node         nid                Content ID
  site         name               Name
  ...          ...                ...

Final thoughts

Drush is a powerful tool, and like many parts of Drupal, it’s expandable to meet different needs. While I shared a relatively simple example to solve a small challenge, the possibilities are open to retrieve all kinds of information from your Drupal site to use in scripting, CI/CD jobs, reporting, and more. And by using the drush generate command, creating these custom solutions is easy, follows best practices, and helps keep code consistent.



The Web Chefs

January 26, 2024

At Four Kitchens we keep several lists of “Hot Topics” to share our learnings across the dozens of sites that we care for. Are you upgrading a Drupal site to CKEditor5? We’ve tidied up one of these internal wiki documents into this set of general upgrade guidelines that might pertain to your website.

Rough steps to upgrade

The level of effort needed for this upgrade will be different for each site. It may take some time to figure out. CKEditor 5 is available in Drupal 9.5 and beyond. You can try switching/upgrading on a local site or multidev and assess the situation.

First, create a list of CKEditor enhancement modules on the site and check if they are Drupal 10 ready (the reports from Upgrade Status and this Drupal.org page may help). Common modules to look for include Linkit, Anchor Link, Advanced Link, IMCE, Entity Embed, Video Embed Field, Footnotes, and anything with the word “editor” in the title.

As a best practice, you should test both the creation of new content, and editing existing content in several places. This will help make sure that some lesser used HTML isn’t treated differently in the new CKEditor. Run visual regression tests (if available).

You may need to point out key interface changes to your clients or stakeholders (e.g., contextual windows for links/media/tables instead of modals, etc.). While it is a bit of a change, it’s overall an improved user experience, especially for new people who are coming in cold.

Anchor links

Anchor link gives editors the ability to create links to different sections within a page.

For “better integration with Drupal 10, CKEditor 5, and LinkIt,” there is a 3.0.0@alpha version. If your project isn’t using wikimedia/composer-merge-plugin, you must require the northernco/ckeditor5-anchor-drupal package and add the following to the repositories section of composer.json:

{
  "type": "package",
  "package": {
    "name": "northernco/ckeditor5-anchor-drupal",
    "version": "0.3.0",
    "type": "drupal-library",
    "dist": {
      "url": "https://registry.npmjs.org/@northernco/ckeditor5-anchor-drupal/-/ckeditor5-anchor-drupal-0.3.0.tgz",
      "type": "tar"
    }
  }
}


Embedded media

Depending on the age of your site, it might be using one of several techniques to embed media into the WYSIWYG:

If your site is using the video_embed_field module (most sites are probably using Drupal core’s media module), there is a patch that adds support for CKEditor 5. The Insert Image button works slightly differently (though this is probably not the case if your site uses core’s media module). It’s worth considering whether there is a way to enhance the user experience here, if necessary.

If your site uses custom Entity Embed for media, consider switching to the core media library. It may provide a better administrative user experience in some cases.

The insert image button in CKEditor functions a little differently than it used to. Rather than bringing up a modal with fields to upload an image like the image below:

Insert image button in CKEditor5

It now immediately pulls up your computer’s file system for you to search for images like so:

Filesystem image search in CKEditor5

After adding your image, the alt tag box prompts you underneath the image:

CKEditor5 alt tag prompt

After submitting your alt tag, you can adjust alignment and sizing:

CKEditor5 image sizing

Moving general styles to link styles

It was common in CKEditor4 to use its “Styles” feature to provide a way to add variations of links (to make them look like buttons, or to add icons).

There are a few UX problems with that approach. Either the styles are set to apply to any element, which means they can be applied to non-links, or the styles are set to apply only to links, which means they are mysteriously grayed out most of the time (until you select a link). Either way, it’s not intuitive how to apply a link style. In CKEditor5, we can switch to using the Link Styles module.

Change in Styles dropdown behavior

In CKEditor4, when integrated with Drupal, the Styles dropdown only allowed applying one style to an element (e.g., “external link”). If you tried to apply a different style, such as “locked link,” the previous style would be removed.

The Drupal implementation of CKEditor5 allows for multiple styles to be applied to elements via the Styles dropdown. This change may be unexpected for some, and could result in elements that look broken, such as when a link has both the “external link” and “locked link” styles.

CKEditor5 introduced a new API for adding theme-specific styles. The new architecture might cause the CKEditor5 theme to bleed into the admin theme. To know how to deal with these issues, review the new API for adding theme-specific styles in CKEditor5.

You’ll likely run into an issue with styles bleeding outside of the editor, so see the section below on preventing custom styles from bleeding into the admin theme.

Cut and paste

Paste-from-Word, paste-from-Google-Docs, etc. is now built into CKEditor5 (at least for 90% of use cases). There’s a paid plugin for more esoteric needs.

There is no paste-as-plain-text plugin for CKEditor5. You can use Ctrl-Shift-V (or Cmd-Shift-V) to paste as plain text. If you want to get rid of all formatting (including bold, links, etc.) in existing text, you can highlight the text, use Ctrl-C to copy, then Ctrl-Shift-V to paste it back as plain text.

Many of our Behat automated tests broke after the update because there were multiple structural changes. Here is how we solved it: First, there is documentation about how to get the editor instance, in case you want to know more about it. This is how we rewrote our custom step to fill out CKEditor during testing. (We found the code in an article.)

/**
   * A step to help fill in a ckeditor wysiwyg.
   *
   * @param string $locator
   *   The css locator of the field that ckeditor has taken over.
   * @param string $value
   *   The html value you wish to fill in to ckeditor.
   *
   * @Then I fill in wysiwyg on field :locator with :value
   */
  public function iFillInWysiwygOnFieldWith($locator, $value) {

    // https://ckeditor.com/docs/ckeditor5/latest/support/faq.html#how-to-get-the-editor-instance-object-from-the-dom-element
    $ckeditor5_drupal_editable_element = "div.form-item-$locator .ck-editor__editable";

    $this->getSession()
      ->executeScript(
        "
        var domEditableElement = document.querySelector(\"$ckeditor5_drupal_editable_element\");
        if (domEditableElement.ckeditorInstance) {
          const editorInstance = domEditableElement.ckeditorInstance;
          if (editorInstance) {
            editorInstance.setData(\"$value\");
          } else {
            throw new Error('Could not get the editor instance!');
          }
        } else {
          throw new Error('Could not find the element!');
        }
        "
      );
  }

and the mink step for regular field:

And I fill in wysiwyg on field "field-summary-0-value" with "Some Teaser Text"

And for a field inside a paragraph:

And I fill in wysiwyg on field "field-sidebar-content-0-subform-field-simple-text-0-value" with "Behat Side Nav Body Text"

Preventing custom styles from bleeding into admin theme with CKEditor5

See the new API documentation about implementing theme styles in the new way. This may require some adjustments on your end.

One of the major changes with CKEditor5 is that it pulls WYSIWYG styles onto the whole page when there is a WYSIWYG on the page. In CKEditor4, styles were only pulled into the CKEditor iframe. This can be extremely frustrating when the admin theme looks odd or different on pages that contain a WYSIWYG.

Limit the number of stylesheets being pulled into the WYSIWYG. (First, note that this method has only been confirmed to work on newer versions of Sous using specific webpack settings. If you are having problems with it, make sure your webpack settings allow for multiple manifests to be generated. You may need to refer to a newer site to see how it is configured.)

The first step is to create a new stylesheet (a manifest) called wysiwyg.scss in the same directory as your styles.scss file, which assembles all the stylesheets used in your theme. For this stylesheet, we’ll only want to include the stylesheets that our WYSIWYG needs. For example, I have one that looks like this:

@import url('https://fonts.googleapis.com/css2?family=Poppins:ital,wght@0,400;0,700;1,400;1,700&display=swap');
@import '~normalize.css/normalize';
@import '~breakpoint-sass/stylesheets/breakpoint';

// Components
@import '00-base/**/*.scss';
// Include all atoms except form items.
@import '01-atoms/00-links/**/*.scss';
@import '01-atoms/01-text/**/*.scss';
@import '01-atoms/02-lists/*.scss';
@import '01-atoms/tables/*.scss';
@import '01-atoms/images/**/*.scss';
@import '05-pages/colors.scss';
@import '05-pages/base.scss';

In this example, we are pulling in a couple needed files from node_modules (normalize and breakpoint), and then any .scss files from base, and then select files from atoms (links, text, lists, tables, and images).

Compile and make sure that it has created the new files at /dist/css/wysiwyg.css. If you get any errors, you may need to include another file that has a variable you need, or something along those lines.

1.) Update your .info file. In your theme’s .info file, set CKEditor5 to use your new stylesheet:

ckeditor5-stylesheets:
  - dist/css/wysiwyg.css

2.) Review the WYSIWYG. Visit a page with a WYSIWYG on the page, and verify that the limited styles are loading properly within the WYSIWYG. Try all the dropdowns and buttons that are included in the WYSIWYG settings. If anything appears unthemed, review your styles to see if there’s a stylesheet missing from your manifest.

3.) Review the rest of the page. Now review the page around the WYSIWYG and note how it differs from other pages that do not have a WYSIWYG. Common differences to look for are: heading styles, text styles, buttons — basically anything that you included in your manifest.

4.) Limit styles

  • Find the page’s body class for node edit pages (in our test case, .gin--edit-form). It may depend on your admin theme.
  • Find the wrapper class for the WYSIWYG. Most likely the best choice is .ck-content. Our approach will be to hide styles from .gin--edit-form, but then add them to .ck-content.

For example:

body {
  background-color: clr(background);
  color: clr(text);

  @include body-copy;
}

becomes:

body:not(.gin--edit-form),
.ck-content {
  background-color: clr(background);
  color: clr(text);

  @include body-copy;
}

And for buttons:

.main .button {
  @include button-base;
  @include button-color-primary;
  @include button-medium;
}

it becomes:

body:not(.gin--edit-form) .button,
.main .button,
.ck-content a.button {
  @include button-base;
  @include button-color-primary;
  @include button-medium;
}

With any luck, the styles use mixins, which makes it easy to control where they apply. In some cases, overriding styles may become difficult because of the order in which the stylesheets are loaded. Try to avoid !important and instead use something like an additional element or class to firm up your override.

One issue that may come up is that these overrides end up overriding styles in your custom theme, depending on how they are defined. In this case, don’t wrap the styles in the body classes, but rather undo the custom theme’s styles on the admin page items manually. Luckily, since we’re narrowly applying custom styles, only things used in the WYSIWYG will need to be addressed.

For instance:

// Apply general link styles to all links.
a {
  @include link;
}

// Overrides for Admin pages containing CKEditor (you will need a body class only on admin pages).
.user-logged-in {
  a {
    background-image: none;
    transition: none;
  }

  .horizontal-tabs-list a,
  .toolbar a {
    font-weight: normal;
  }
}

// Reapply link styles to links within the WYSIWYG
.ck-editor a {
  @include link;
}

Continue to review your page and adjust it until it no longer differs from other admin pages.

Editor explodes out of its container in deeper paragraphs

This issue seems to occur only with rich text fields within a paragraph. It might be limited to the Gin theme.

This issue might be because of the container’s width. If input fields inside the container have a specified size exceeding the screen width, it can lead the editor to inherit the container’s width, extending beyond the screen. You can see this as a Drupal Core/CKEditor5 bug in Drupal.org: CKEditor5 toolbar items of multivalue field (typically Paragraphs) overflowing on narrow viewports and overlapping with node form’s sidebar on wide viewports.

To resolve this quickly, set the input fields to 100% width. Be sure to include this in a stylesheet of your admin theme:

.node-form input[size] {
  width: 100%;
}

We can also modify the ‘flex-wrap’ property of the CKEditor buttons to make sure they stay within the container’s width:

.ck-editor .ck.ck-toolbar.ck-toolbar_grouping > .ck-toolbar__items {
    flex-wrap: wrap;
}



Amanda Luker

Web Chef Emeritus

Amanda is responsible for translating visual designs and applying them to Drupal sites.

January 5, 2024

Drupal provides a system for site administrators to add their own images and have them appear uniformly on the website. This system is called Image Styles. This tool can resize, crop, and scale images to fit any aspect ratio required by a design.

When creating responsive websites, a single image style for each image variation is insufficient. Each image, such as a hero image, a card image, a WYSIWYG image, or a banner image, requires multiple versions of one image. This ensures that the website delivers only what visitors need based on their screen size. For instance, a mobile user may only require a 320-pixel-wide image, while a large desktop user may want an 1,800-pixel-wide image (doubled for double-pixel density). For this reason, Drupal has Responsive Image Styles, which will group your images into a set of styles that will each show under different conditions.

Practical approach to convert images from design to Drupal

  • Determine your image’s aspect ratio. If you find that the images in the design are not in a common aspect ratio (like 1:1, 2:1, 4:3, or 16:9) or if they vary by a little bit, consider running the dimensions through a tool that will find the closest reasonable aspect ratio.
  • Determine the smallest and largest image sizes. For example, for a 16:9 aspect ratio, the smallest size might be 320 pixels x 180 pixels, while the largest could be 3,200 pixels x 1,800 pixels (doubled for high-density screens).
  • To generate all variations, you can use an AI tool to print image dimensions with 160-pixel increments between each size. 160-pixel increments tend to hit a lot of common breakpoints. Here’s an example using GitHub Copilot:
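The same list can be produced with a few lines of plain JavaScript. This sketch generates 16:9 dimensions in 160-pixel steps; the style-name convention used here is an assumption, so match it to your own naming scheme.

```javascript
// Generate image style dimensions in fixed-width increments for a given
// aspect ratio, e.g. 16:9 from 320px up to 3200px wide in 160px steps.
function generateImageStyles(minWidth, maxWidth, step, ratioW, ratioH) {
  const styles = [];
  for (let width = minWidth; width <= maxWidth; width += step) {
    // Round the height to the nearest whole pixel for the given ratio.
    const height = Math.round((width * ratioH) / ratioW);
    styles.push({ name: `${ratioW}_${ratioH}_${width}w`, width, height });
  }
  return styles;
}

const styles = generateImageStyles(320, 3200, 160, 16, 9);
// First entry: { name: '16_9_320w', width: 320, height: 180 }
```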

There are likely more ways to streamline this process with Copilot. I’ve also used ChatGPT to rewrite them using a prefix, making it easy to add them in Drupal like this:

Drupal image styles

If adding all of these steps seems like a lot of work, consider using the Easy Responsive Images module! This module can create image styles for you, allowing you to set your aspect ratios and the increment between each style.

Once you have all your styles in place, create your responsive image styles by following these steps:

  • Choose a name for your responsive image style based on its usage
  • Select the “responsive image” breakpoint group
  • Usually, I choose to select multiple image styles and use the sizes attribute to craft my “sizes.” For example:

(min-width:1200px) 30vw, (min-width:960px) 50vw, 100vw

In this example, for viewports narrower than 960 pixels, the browser selects an image to best fit the full width of the viewport. From 960 pixels, the image is selected to best fill half of the viewport width, and from 1,200 pixels, 30%. Because the browser uses the first media condition that matches, the widest breakpoints must be listed first. This approach is nimble and allows the browser to choose the most appropriate image for each case.
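The selection rule can be sketched in code: the browser walks the sizes list left to right and uses the first media condition that matches. This simplified helper (min-width conditions and vw units only, ignoring pixel density) illustrates the idea and is not how any particular browser implements it.

```javascript
// Resolve a simplified `sizes` attribute to a slot width in CSS pixels.
// Entries are evaluated left to right; the first matching condition wins,
// and a bare length with no condition acts as the default.
function resolveSlotWidth(sizes, viewportWidth) {
  for (const entry of sizes.split(',').map((s) => s.trim())) {
    const match = entry.match(/^\(min-width:\s*(\d+)px\)\s+(\d+)vw$/);
    if (match) {
      if (viewportWidth >= Number(match[1])) {
        return (viewportWidth * Number(match[2])) / 100;
      }
    } else {
      // Default entry like "100vw": always matches.
      return (viewportWidth * parseInt(entry, 10)) / 100;
    }
  }
  return viewportWidth;
}

const sizes = '(min-width:1200px) 30vw, (min-width:960px) 50vw, 100vw';
// resolveSlotWidth(sizes, 1400) → 420 (30vw of 1400)
// resolveSlotWidth(sizes, 1000) → 500 (50vw of 1000)
// resolveSlotWidth(sizes, 600)  → 600 (100vw)
```

The browser then multiplies the resolved slot width by the device pixel ratio and picks the closest candidate from srcset.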

After setting the size rules, choose all of the image styles that you want the browser to be able to use. You don’t have to use them all. In some cases, you might have two responsive image styles that are pulling from the same aspect ratio image styles, but one uses all of them and the other uses a subset of them.

Drupal image sizing

After adding your responsive image style, you need to map your Media View Mode:

  1. Go to https://[your-site.local]/admin/structure/display-modes/view/add/media
  2. Add the media view mode as a new Display for Images: https://[your-site.local]/admin/structure/media/manage/image/display
  3. Choose “Responsive image” as the Format and select your new responsive image style

Drupal responsive image manage display

Once you have set this up, you are ready to use the View Mode to display the image field for your entity.

Drupal article with image

In this example, all the images share the same aspect ratio across breakpoints. There may be times when you need different aspect ratios at different breakpoints. In those cases, you may want to use your custom theme’s Breakpoint Group. This will allow you to manually select an image style for each breakpoint (instead of letting Drupal choose it for you).



Laura Johnson

Senior Engineer

Primarily a backend developer, Laura also loves adding new cross-disciplinary skills to her collection, such as working with themes and frontend frameworks.

December 6, 2023

If your organization is still using Drupal 7 or earlier, migrating to an up-to-date platform for your website has been looming like a weight on your shoulders. The move to Drupal 10 isn’t easy. It requires a migration of your site and a full redesign to take advantage of the new tools the latest version offers.

Not only do you need someone to write that migration, but you also need to secure the budget to undertake a project like this. As you wait for the right time to get started, the weight of the deadline to begin your migration to Drupal 10 has only grown heavier. After multiple extensions, the Drupal community has set January 5, 2025 as the final end-of-life date for Drupal 7.

What does that mean for your organization? On the one hand, you now have just over a year to start planning a migration before your site loses crucial support. But on the other hand, as many organizations like yours face a similar deadline, you can’t afford to wait much longer. The time to make the move to Drupal 10 is now.

Why you need to start planning for a Drupal 10 migration

If you’ve fallen behind in migrating your site from Drupal 7, you’re not alone. According to the Drupal community, more than 350,000 sites still use that version of the platform as of November 2023 — one-quarter of all Drupal sites.

As a result, you aren’t just facing a hard deadline to relaunch your new site as January 2025 grows closer. You’re also competing with a vast number of organizations just like yours who need to coordinate the same migration with a web development agency partner. Given that it takes an average of six months to complete the sales process to get started on a Drupal 7 migration, you’re already at risk of missing the deadline if you have not yet contacted an agency.

The longer you wait, the less likely you are to find a team with availability to work with you on a migration plan and website redesign before Drupal 7 reaches end-of-life. And, given the stakes involved, your organization can’t afford the risks of sticking on a platform without the vital benefits of ongoing support.

What your organization loses when Drupal 7 reaches end-of-life

Drupal 7 will reach its end of life 14 years after its initial release. If you’re still on the platform, your website will remain accessible after January 5, 2025. However, it will no longer receive feature updates, bug fixes, or security releases from the Drupal community.

This last detail is most critical to your organization. Any security issues discovered after January 2025 may be publicly disclosed, but Drupal will no longer provide any necessary updates. Prior to the announcement of this final extension for Drupal 7, your organization had the option of paying for extended support. But that is no longer the case.

When you work with the right agency partner, you can create a migration plan that will keep your website secure. Fortunately, your organization will be able to better manage site security after the migration is complete. But that’s just one of the advantages made possible by getting your organization started with Drupal 10.

Drupal 10 offers dramatic advantages after migration

Trusting your site with the legacy code of Drupal 7 doesn’t just expose your organization to poor security. It prevents you from taking advantage of dramatic improvements for your site’s users and content editors.

Improved website speed and SEO performance

Fundamentally, your Drupal 10 website will run faster. Dynamic caching reduces page load times by invalidating only the content that has changed. Instead of needing to reload your entire page after a set amount of time, your cache can just reload the block with new information.

Drupal 10 also moves beyond Drupal 7’s reliance on jQuery. A large JavaScript library, jQuery was a powerful tool, but modern browsers now perform many of the same functions natively. The up-to-date JavaScript used by Drupal 10 also decreases page load times.

Drupal 10 also supports new formats such as schema.org, Open Graph, and JSON-LD, which increase conversions from search engines. Plus, Drupal 10 supports advanced accessibility features that improve WCAG compliance and further improve SEO rankings.

Better site security and reduced maintenance costs

Drupal 10 improves your site security by including up-to-date protocols and dependencies such as PHP 8, Symfony 6, and CKEditor 5. As earlier versions of these dependencies reach end-of-life, they may be exposed to unpatched security vulnerabilities. Migrating to Drupal 10 avoids delays in getting critical security patches applied to your site.

One of Drupal’s major advantages as an open-source platform is the community’s Security Team, which delivers security advisories and provides guidance to contributed module maintainers on how to resolve potential vulnerabilities. Migrating to Drupal 10 ensures that all of your site’s contributed modules continue to receive that support beyond the Drupal 7 end-of-life date.

Improved content editing experience and efficiency

Drupal’s out-of-the-box CMS experience has always been limited. With Drupal 10, your site editors benefit from the Claro theme, which makes Drupal much easier to use. New image tools and an updated media library also enable better organization of your site’s assets.

Drupal 10 also includes the JavaScript text editor CKEditor 5, which further simplifies content creation and its accessibility. In addition, the platform offers enhanced translation capabilities in multiple languages, which enables your organization to reach a wider audience than ever.

Don’t wait until an emergency before moving to Drupal 10


Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

November 27, 2023

At the time of this writing, we have done two major version upgrades of Drupal and have refined the process along the way. There has been a lot of work in the community, through the efforts of people like Matt Glaman, to make this process easier.

As a Support Engineer, I see a lot of approaches for achieving the same results in many areas of my work. Here, I’d like to share with you three different ways to achieve an upgrade of a module or theme that isn’t ready for the next major Drupal version, each with pros and cons, but all absolutely acceptable.

Why do we have this problem?

All new Drupal developers have a hard time with the layers of code changes that happen in the Drupal community. We have custom package types, custom install locations, patches, and scaffolding. To make the challenges worse, we have two ways to identify a module’s dependencies: a .info.yml file and, for some, a composer.json. This is because some Drupal modules may want to build upon an existing PHP library or project, in addition to other Drupal modules.

To ease the pain of having to define some dependencies twice, both in the .info.yml file and the composer.json file, Drupal.org built its packagist, a repository of Composer packages, to read the .info.yml file from the root of each project and create Composer version constraints from it. For example, if the .info.yml file contained the following:

name: My Module
type: module
core_version_requirement: ^8.8 || ^9
dependencies:
  - ctools:ctools

Then Drupal.org’s packagist would create the following for the release that contained that .info.yml file, saving the contributed developer a lot of trouble.

{
  "type": "drupal-module",
  "name": "drupal/my_module",
  "require": {
    "drupal/core": "^8.8 || ^9",
    "drupal/ctools": "*"
  }
}
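
The translation above can be sketched in a few lines (a simplified Python illustration of what Drupal.org’s packagist does; the machine name is passed in as an assumption, and the real service handles many more cases):

```python
def info_to_composer(project_name, info):
    """Derive a Composer package stub from parsed .info.yml data,
    roughly the way Drupal.org's packagist does (simplified sketch)."""
    # core_version_requirement maps directly onto a drupal/core constraint
    require = {"drupal/core": info["core_version_requirement"]}
    for dep in info.get("dependencies", []):
        # "ctools:ctools" means module "ctools" in project "ctools"
        project = dep.split(":")[0]
        require["drupal/" + project] = "*"
    return {
        "type": "drupal-module",
        "name": "drupal/" + project_name,
        "require": require,
    }

pkg = info_to_composer("my_module", {
    "name": "My Module",
    "type": "module",
    "core_version_requirement": "^8.8 || ^9",
    "dependencies": ["ctools:ctools"],
})
# pkg["require"] == {"drupal/core": "^8.8 || ^9", "drupal/ctools": "*"}
```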

I hit on something there, though. It will create that for the release the .info.yml was in. When most code changes come in the form of patches, this poses a challenge. You apply your patch to the .info.yml after you download the release from Drupal.org’s packagist. Additionally, Drupal.org doesn’t create a new release entry for every patch file in the issue queue. So you are left with the question, “How do I install a module on Drupal 10 that requires Drupal 9 so that I can patch it to make it compatible for Drupal 10?”

Drupal Lenient

One of the easiest methods for those who don’t understand the ins and outs of Composer is to use the Drupal Lenient plugin. It takes a lot of the manual work out of defining new packages and works with any drupal-* typed package. Types are introduced to us through the Composer Installers plugin and manipulated further with something like Composer Installers Extender. Composer plugins can be quite powerful, but they ultimately add a layer of complexity to any project over using core Composer functionality.

Drupal Lenient works by taking any defined package pulled in by any means via Composer and replacing its version constraint for drupal/core — at the time of this writing — with "^8 || ^9 || ^10". So where the requirement might look like the earlier example, "drupal/core": "^8.8 || ^9", it is replaced, making it now possible to install the module alongside Drupal 10, even though it might not be compatible yet. This allows you to patch, test, or use the module as is, much as if you had downloaded the zip and thrown it into your custom modules directory.
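Conceptually, the plugin’s constraint rewrite amounts to something like this (a rough Python sketch of the behavior, not the plugin’s actual PHP):

```python
LENIENT_CORE = "^8 || ^9 || ^10"

def loosen_core_constraint(package, allowed_list):
    """Mimic Composer Drupal Lenient: for allow-listed packages, replace
    the drupal/core requirement so older releases install on Drupal 10."""
    require = package.get("require", {})
    if package["name"] in allowed_list and "drupal/core" in require:
        require["drupal/core"] = LENIENT_CORE
    return package

pkg = {"name": "drupal/my_module", "require": {"drupal/core": "^8.8 || ^9"}}
loosen_core_constraint(pkg, ["drupal/my_module"])
# pkg["require"]["drupal/core"] is now "^8 || ^9 || ^10";
# packages NOT on the allow-list keep their original constraint.
```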

An example may look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/my_module": "1.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "drupal-lenient": {
      "allowed-list": [
        "drupal/my_module"
      ]
    },
    "patches": {
      "drupal/my_module": {
        "3289029: Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2022-06-16/my_module.1.x-dev.rector.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Note the Drupal Lenient allow list. Also note that you will need to make sure to install the plugin before trying to install the module that, in this case, doesn’t support Drupal 10. If you want an excellent step-by-step, Matt put one together in the README.

The pros:

  • Easy-peasy to install
  • Feeds off the original packagist packages, so if there is an upgrade, you don’t have to do anything special to transition

The cons:

  • Lenient is in control and may cause inexplicable errors when updating, due to unsupported core versions
  • PHP devs not familiar with Drupal Lenient won’t know to look for it
  • Flaky experiences when switching in and out of branches that include this plugin. If you context switch a lot, be prepared to handle some errors due to Composer’s challenges maintaining state between branches.
  • Patches to other dependencies inside composer.json still require you to jump through some hoops

Custom package

If you want more control over what the module can and cannot do, while keeping core Composer functionality without adding yet another plugin, check out this method. What we will do here is find out what version the patch or merge request is being applied against. It should be stated in the issue queue and, per best practices, is usually a dev version.

If you are a perfectionist, you can use composer install -vvv to find the URL or cache file that the module came from for packages.drupal.org. It is usually one of https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json or https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json. You will note that the Composer cache system follows a very similar structure, swapping out certain characters with dashes.

With this information, you can grab the exact package as it’s defined in the Drupal packagist. Find the version you want, and then get it into your project’s composer.json.

Let’s use Context Active Trail as an example, because at the time of this writing, there is no Drupal 10 release available.

Drupal release information

Looking through the issue queue, we see Automated Drupal 10 compatibility fixes, which has a patch on it. I grab the Composer package info and paste the 2.0-dev info into my composer.json under the “repositories” section as a type “package.”

Drupal packages

Which should make your project look something like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Now let’s change our version criteria:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

And then add our patch:

…
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
…

Here, you will need to look to see if the patch is patching composer.json. If it is, you will need to modify your package information accordingly. For example, in this one, the fixer changes drupal/context from ^4.1 to ^5.0.0-rc1. That change looks like this:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

Lastly, sometimes you run into some complications with the order in which Composer picks up packages. You may need to add an exclude element to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Our final composer.json for our project could look something like this with all the edits:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

The pros:

  • Uses more core Composer functionality
  • A PHP developer will better understand what’s going on here
  • You are in complete control of how this module package and version are defined
  • All the work is in one file

The cons:

  • Requires some understanding of how composer.json, packagists, and the magic of Drupal’s packagist all work
  • That’s a messy composer.json for the project
  • If you have to use exclude, you have to leave it up to outside forces to let you know when that module finally puts out an actual D10-ready version, and then undo all of this work

Fork the module

Standard PHP Composer best practice says that if you make modifications to a package, fork it, maintain your modifications, and provide a pull request if it’s functionality you wish to contribute back. You can use this same approach with Drupal modules as well. Some may even say that’s what issue forks are for! That said, issue forks come with the downside that sometimes they go away, or are overridden with changes you don’t want. They are a moving target.

For the sake of this example, let’s assume that we have forked the module on GitHub to https://github.com/fourkitchens/context_active_trail.git. If you don’t know how to make a fork, simply do the following:

  • Clone the module to your local computer using the git instructions for the module in question
  • Check out the branch you want to base your changes on
  • Create a new repository on GitHub
  • Add it as a remote: git remote add github git@github.com:fourkitchens/context_active_trail.git
  • Push it! git push github 8.x-2.x

You can do this with a version of the module that is in a merge request in Drupal.org’s issue queue, too. That way you won’t have to reapply all the changes. However, if your changes are in a patch file, consider adding them to the module at this time using your favorite patching method. Push all your changes to the github remote.

If the patch files don’t have changes to composer.json, or if the module doesn’t have one, you will likely want to provide at least a bare-bones one that contains something like the following and commit it:

{
  "name": "drupal/context_active_trail",
  "type": "drupal-module",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
}

This will tell Composer what it needs to know inside the project about dependencies. This project already had a composer.json, so I needed to add the changes from the patch to it.

Inside our Drupal project we are working on, we need to add a new entry to the repositories section. It will look something like this:

    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },

The VCS type repository entry tells Composer to look at the repository and poll for all its branches and tags. These will be your new version numbers.

Much like in the “Custom Package” example, you may need to add an exclude property to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Now, since Drupal packagist isn’t here to give Composer some version aliases, we have to use the old notation dev-BRANCHNAME for our version. Our require entry will look something like this:

 "drupal/context_active_trail": "dev-8.x-2.x",

Since we already added our patches as a commit to the module, this is all you need. Your final composer.json for your project would look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "dev-8.x-2.x"
  }
}

It makes for a much cleaner project composer.json, but now you’ve split the work into two locations, which requires some synchronization. However, if multiple sites of yours use this same module and need the same fixes, this approach offers the least resistance and gets those changes out the fastest.

The pros:

  • Reusability
  • Two smaller, simpler chunks of work
  • Any PHP developer should be able to debug this setup, as it uses Composer best practices. This method can be used in any project with any framework in the PHP ecosystem.

The cons:

  • Changes are in two separate places
  • Which patches are applied isn’t obvious in the composer.json and requires looking through the commit history on the forked repository
  • Requires maintenance and synchronization when upgrades happen

Final thoughts

As with almost everything out there, there are multiple ways to achieve the same goal. I hope this brings awareness and helps provide the flexibility you need when upgrading Drupal to a new major version. Each solution has its strengths, and you may need to mix them up to get the results you want.


Jim Vomero

Senior Engineer

As a tech lead, Jim works with clients through the full project cycle, translating their business requirements into actionable development work and working with them to find technical solutions to their challenges.

May 30, 2023

Running the digital experience is a large-scale operation for most higher ed institutions. Whether your architecture was established five or 15 years ago, the departments, offices, and entities you need to manage may add up to hundreds or even thousands of websites. And each new addition is increasingly challenging to maintain.

Some sites use shared modules, while others do not. If you want to make an update to one website, you have to cross your fingers and hope it doesn’t break something on 500 others. Every day, another stakeholder presents a new request in support of an upcoming project.

Facing all these compounding issues, the IT department at Yale understood that a lift-and-shift of their existing sites was impossible. Upgrading their digital platform presented an opportunity to reset their architecture and processes to start fresh.

In a preview of our upcoming presentation at DrupalCon 2023, here’s what happened next — and what your institution can learn from it.

Why reinvention makes sense for higher ed institutions

Universities are facing significant challenges related to budgets, economic uncertainty, and reduced admissions applications. The pandemic introduced further uncertainty, balanced with an increased need to sharpen digital presentations.

As one of the most prestigious institutions in the world, Yale needed to find a new, more sustainable way to manage its digital needs. The institution had stretched the limits of a very mature Drupal 7 site with more than a decade’s worth of modules, themes, and custom code.

It was difficult for the IT team to test with confidence, because they manage more than 1,100 sites that were all created in different ways. In addition, the more impressive a new site looked, the more other offices and departments wanted to emulate it.

The unintended consequences of an overtaxed website platform

With the university’s website system at critical mass, Yale’s teams lacked incentive to add new features to its legacy platform. Consequently, some larger departments found the platform inflexible, leading them to Wix and Squarespace for new projects. If the university didn’t find a workable platform solution, it ran the risk of increased site errors, design inconsistencies, and a diminished user experience.

Resetting Yale’s approach to digital required a sizable upfront capital investment. As the work comes to fruition, the organization is gaining a flexible, scalable platform that will benefit every department into the next decade — and beyond.

YaleSites: A transformational approach to higher ed websites

YaleSites is the product of years of examining the university’s needs. Through our previous work with the institution’s cybersecurity office and the Schwarzman Center, we developed a new platform that incorporated the following elements:

A unified brand identity and design system

YaleSites offers many departments the ability to create unique digital experiences that are aligned with the institution’s overall design. Instead of a conventional CMS, Yale’s team uses a customized drag-and-drop page builder drawn from a library of proven components powered by Emulsify.

The YaleSites Welcome page

Inclusive and accessible development for all customers and devices

Institutions like Yale need to offer an equitable digital experience for every audience. YaleSites upholds and prioritizes the university’s accessibility standards by making sure every content block follows best practices for usability and accessibility.

User-focused experience and design

YaleSites prioritizes the needs of the organization’s audience and its end users. Across the organization, content authors of every skill level can access a full library of templates, starter kits, and media libraries to produce what they need.

Adding blocks in the YaleSites administrative interface.

Standardized practices for development

The organization’s development process has been streamlined. Rather than asking “What do you need in a website?”, work begins with the question, “How can our tools help with your strategy?” Developers don’t have to reinvent the wheel for a new site. Instead, they have the support of a system that’s performant, on-brand, and secure.

Sustainable governance

We implemented YaleSites with an eye toward thoughtful and sustainable growth. Universities often set digital priorities based on the loudest or most powerful voices in the organization. Now, Yale uses processes that enable them to focus on the organization’s most pressing needs. Plus, a core group meets regularly to collect feedback, respond to requests, and adjust priorities as needed.

Shifting from a project-based to a product-based perspective

After launching YaleSites, the institution will enter the maintenance phase of protecting its system. The university’s new platform required a significant financial investment — now it must invest in the long-term work of governance.

The success of Yale’s platform hinges on a seismic internal shift. YaleSites isn’t a project that concludes with a specific end date. It’s a product that the organization must refine and support in perpetuity.

Since YaleSites is a product, its resources are finite. For example, if IT plans to add six new features in a quarter, any new request is a negotiation. Something may need to get bumped from the product roadmap. Rather than rushing a new feature into development for a short-term need, the organization follows a multiyear roadmap and measures the needs against all of the priorities in the queue.

Eliminate deadline pressure by focusing on constant improvement

Thinking long-term about your organization’s website removes the need to squeeze as many improvements as possible into a project’s deadline. Following the principles of Agile development frees your team from solving every use case before launch. Instead, you can launch a minimally functional feature like an events calendar, see how people use it, and refine how it works according to actionable feedback.

YaleSites allows the institution to implement site improvements with confidence. Rather than working on whatever makes sense in the moment, they see their work progress from ideation to development, testing, and release.

From the flexibility of its digital tools to a more managed, Agile-driven approach to website improvements, YaleSites marks a dramatic shift for the better. If this sounds like a shift that would benefit how your organization works, we should talk. We can help you view your site and its planning from a new perspective.

Megan Bygness Bradley and the Yale team contributed to this post.


Randall Quesada Angulo

Backend Engineer

Randall is an engineer and a graduate of the University of Costa Rica.

May 5, 2023

Maybe you are interested in getting involved in the Drupal world, but you’re a little intimidated by the technical complexity of the platform. Don’t worry!

Drupal is a fantastic platform to build scalable websites, but keep in mind that sometimes Drupal can be an indomitable horse that we will tame over time, so don’t get too wrapped up in it.

Drupal is an open-source content management system (CMS). You can install a lot of modules (or plugins, if you use another CMS like WordPress) to increase the core functionalities and adapt your site to your needs.

Why Drupal?

Some of the great qualities of Drupal are its stability, commerce distribution, security, SEO friendliness, multilanguage capabilities, responsiveness, and others.

Requirements

  • Lando
  • PHP 8
    • Mac
    • Linux: apt install php
  • Composer
  • NVM
  • Docker

Composer

As Drupal’s documentation mentions, “Composer is a tool for dependency management in PHP. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you. Drupal uses Composer to manage the various libraries which it depends on. Modules can also use Composer to include third-party libraries. Drupal site builds can use Composer to manage the various modules that make up the site.”
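As a minimal sketch of that workflow (the package name here is just an example, not something the post requires), the everyday Composer commands look like this:

```shell
# Declare a dependency; Composer resolves a compatible version
# and installs it into the vendor/ directory
composer require drupal/pathauto

# Install everything a project's composer.json already declares
composer install

# Update dependencies within their declared version constraints
composer update
```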


Drupal Core

You may have seen the term “Drupal Core,” but what is it? Drupal Core is the set of components and features that make up Drupal itself, including the core modules and core themes. It’s Drupal in its most basic form. Beyond that, you can find distributions, which are packages that bundle Drupal Core with contributed modules.

Drupal distributions

A Drupal distribution is a set of preconfigured modules and templates designed to quickly build websites with complex functionality.

There are some distributions such as:

  • Sous: A starter project for Drupal with a generated theme based on the Emulsify Design System. This distribution can be very useful for anyone who wants to create a project with a completely custom theme while using all the advantages of Emulsify.
  • Varbase
  • Panopoly
  • Presto!
  • Thunder
  • 1,400+ distributions

There are many distributions out there to explore.

Contributed modules

Contributed modules are modules that members of the Drupal community create to make our work easier. Since Drupal is an open-source CMS, the community is involved in creating new modules, fixing bugs in those modules, and adding new functionality. So if you find a bug in a module you are using, report it and create a patch, or see if someone has already fixed the problem for you.

Let’s create your first Drupal site in your local environment. Here are the steps:

  1. Go to the Drupal 10 release page. Note: We are going to create a Drupal 10 site. You can select past versions, but Drupal 10 is the latest.
  2. Create a directory in your local environment where you want to put your site.
  3. Copy the code you find on the release page (step 1). Example:
    composer create-project drupal/recommended-project:10.0.0 drupal10
  4. Enter the created directory: cd drupal10/
  5. Now use Lando to start your Drupal site with Docker:
    1. lando init, answering the prompts: the current directory for the codebase, the drupal10 recipe, web as the webroot, and Drupal 10 as the app name
    2. lando start
  6. Open the site URL that Lando prints when it finishes.
  7. Now your Drupal site is ready.
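Taken together, the steps above look something like the following terminal session (the directory name drupal10 is just an example):

```shell
# Steps 2-4: pull down Drupal 10 into a new project directory
composer create-project drupal/recommended-project:10.0.0 drupal10
cd drupal10/

# Step 5: initialize Lando (answer the interactive prompts), then start it
lando init
lando start
```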

How can you install a new feature (module) on your Drupal site?

You can go to the Module project. There you can find all the modules created by the community — you can filter by version or you can search by keywords.

For example:

1. Find the Admin Toolbar module. Note: admin_toolbar is a module that gives us direct access to configuration, content, and more from a toolbar, so we can move through Drupal’s features without loading a separate page each time.

2. At the root of your project, run the Composer command (checking first that the module supports Drupal 10): lando composer require 'drupal/admin_toolbar:^3.3'

Drupal 10 Composer command

3. Use Drush to enable the module: lando drush en [module_machine_name]. Example: lando drush en admin_toolbar. Note: If you want to see which Drush commands exist, check out the full command list.

4. Now your module is enabled. Sometimes you have to clear the cache to see the changes on your site, and you have to use a drush command for that: lando drush cr.
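The module workflow above condenses to three commands (using admin_toolbar as the example module):

```shell
# Download the module with Composer
lando composer require 'drupal/admin_toolbar:^3.3'

# Enable it with Drush, then rebuild caches so the change appears
lando drush en admin_toolbar
lando drush cr
```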

Drupal web hosting

But where should you publish your site? There are some free and paid options to consider. The free options are a bit limited; however, trying and exploring the platforms can be very enriching.

If I had to choose from the options mentioned above, I would pick Acquia and Platform.sh. Both are easy to manage, intuitive, and have interfaces that are easy to explore. Each provides a launcher you install in your terminal to run drush commands against whichever environment you want.

Thank you very much for visiting the blog. Be sure to browse our other content, where we discuss other development issues, UX, UI design, product strategy, and more.

If you have any questions, suggestions, or ideas about your Drupal 10 project, you can let us know by sending a message in the contact box below.



Michael Lutz

Senior Engineer

Primarily responsible for maintaining the Drupal core migration system, Michael often spends long nights and weekends working through the Drupal project issues queue, solving problems, and writing code.

December 14, 2022

Back in 2020, Drupal delivered a surprise by estimating a June 2022 release for Drupal 10. While the release was ultimately pushed back to December 14, 2022, you need to know where your website stands for the upcoming upgrade.

For any IT team, changes to a site platform are cause for concern. With less than a year before Drupal 9 hits end-of-life, you need to start planning your preparations for the coming year.

Thankfully, Drupal has remained true to its word about its latest updates avoiding the complex migrations that were required moving from Drupal 7 (but I’ll touch more on that later). Still, the overall impact of Drupal 10 ultimately depends on the condition of your current site.

Platform updates are always cause for uncertainty, and your preparations will vary to navigate a move to Drupal 10. If you start by taking into account where your current site stands, you can best ensure it’s on steady ground for the benefits that lie ahead.

Advantages of upgrading to Drupal 10

The benefits of moving your site to Drupal 10 follow a familiar path. Drupal’s development team doesn’t pack major updates with flashy new features, unlike traditional hardware and software development. Instead, the community continues to refresh the latest version of Drupal with brand new tools.

The arrival of Drupal 10 will clear the system of old, backward-compatible code so the platform runs more efficiently. That way, as work begins to create new tools for version 10, Drupal developers are starting from a clean slate.

The promise of a clean codebase may sound a bit anticlimactic from the perspective of your users. But for developers, it’s an addition by subtraction. Drupal 10 will run much faster than your current platform by losing the clutter required to support out-of-date features.

What can you expect from the next version of Drupal?

Many of the features included with Drupal 10 have already been in use at various points in Drupal 9’s development. Here are a few benefits planned for Drupal’s new release:

  • CKEditor 5: Drupal 9 features version 4 of the open-source JavaScript text editor, which will be deprecated in 2023. The new version is already in use and offers a similar enough interface to feel familiar, along with performance and security enhancements.
  • Updated frontend and admin themes: These features have been available in Drupal 9 but will become the default themes. In addition to offering improved capabilities for migrating a site into Drupal, the new administration theme is more intuitive with better spacing and readability.
  • New package manager: Though potentially unavailable until version 10.1, this feature enables admin users to install modules through the UI. Instead of requiring a developer to FTP modules to a server, you can install them directly from a menu in a way that resembles WordPress extensions.

More good news: Drupal 10 will last longer than 9

One of the third-party technical dependencies of Drupal is its PHP framework, Symfony. Symfony runs on two-year release cycles, which introduces the potential for Drupal to do the same. Drupal 9 uses Symfony 4, which was at the tail end of its development when Drupal 9 was launched. Consequently, as Symfony fell out-of-date in less than two years, so did Drupal 9.

These dependencies were a big part of why Drupal 9 had such a short lifespan as compared with the platform’s history. At one time, versions of Drupal required five to seven years of development.

Drupal’s development team is releasing Drupal 10 on Symfony 6, which was released earlier in 2022. Drupal 10 will last at least four years before the next major version is released. By working to get ahead of schedule with Symfony, Drupal aims to deliver a platform that’s faster and more stable — with staying power.

Will upgrading to Drupal 10 be easy?

It depends.

Drupal 9 will reach its end-of-life sooner than may be ideal, but you face an easier upgrade path to Drupal 10 if your site is currently running version 9.4 or 9.5. Just as with the upgrade from version 8 to 9, updates to Drupal 10 will run “in place.” Rather than needing to migrate to a new platform to upgrade, Drupal 10 is being built inside Drupal 9.

You won’t have to rebuild your site to upgrade to Drupal 10 if you’re up-to-date with its latest version. However, not every organization can keep its website current with every platform release. As with any journey, the road to Drupal 10 entirely depends on where you are now.

If your site is running Drupal 9:

Much like the shift from Drupal 8 to Drupal 9, moving to Drupal 10 can be seamless with the right planning. You need to monitor custom code in any platform update, and Drupal Rector streamlines the process. The tool identifies your areas of need and, in many cases, updates your code automatically.

You still need an engineer to oversee the upgrade, but Drupal Rector eliminates the tedium of manually updating a bunch of APIs beforehand. As changes are made to Drupal 10, developers are required to add an automated rule to Rector. Consequently, your future upgrades will be even easier.
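As a rough sketch of that workflow (the paths are assumptions for a typical Composer-based Drupal site; check the Drupal Rector project page for current usage):

```shell
# Add Drupal Rector as a development dependency
composer require --dev palantirnet/drupal-rector

# Copy the starter configuration into the project root
cp vendor/palantirnet/drupal-rector/rector.php .

# Preview the automated deprecation fixes, then apply them
vendor/bin/rector process web/modules/custom --dry-run
vendor/bin/rector process web/modules/custom
```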

Once Drupal 10 is released, you have until November 1, 2023 to complete the upgrade before Drupal 9 reaches its end-of-life. At that point, your site will no longer receive security updates from the Drupal community.

If your site is running Drupal 8:

Drupal 8 reached its end-of-life in November 2021, which means your site may be at risk without the community’s support with security patches and bug fixes. To offset that danger, you should use Drupal Rector to identify deprecated code in your Drupal 8 site to automate a portion of your upgrade journey to Drupal 9.

Fortunately, the move from 8 to 9 is an easier transition than you may think. Once your site is up-to-date to version 9.4, then the jump to Drupal 10 should be fairly straightforward upon its release.

If your site is running Drupal 7:

If you’re still on Drupal 7 (or older), your platform is currently scheduled to reach its end-of-life in November 2023. While this date has been extended several times over the past few years, there is no guarantee it will be extended again. However, you’re not alone. According to estimates, more sites are on Drupal 7 than there are on 8 and 9 combined.

Migrating your site from Drupal 7 is a complicated, labor-intensive undertaking, which is why the community extended the platform’s support during the pandemic. However, once Drupal 7 reaches its end-of-life next year, you’ll only be able to receive security updates through Vendor Extended Support. Those organizations remain an option to provide service for your site until 2025 — for a price.

To reduce support expenses, you should start migrating your site to Drupal 9.4 or 9.5 as soon as possible rather than waiting for the latest version. Drupal 10 will include migration tools from Drupal 7, but Drupal 9 already includes many of the modules you use. That may no longer be the case after Drupal 10 comes out.

Future-proof your site with an upgrade to Drupal 10

Whether you’re facing a migration from Drupal 7 or the end-of-life for Drupal 9, platform updates require planning to succeed. There is no sense in waiting to get started. If anything, upgrading to Drupal 10 from a much older version may grow more complex the longer you delay.

The days of launching a website and ignoring it for five or 10 years are over. The industry just moves too fast. Fortunately, with the right plan, your organization can get the platform you need to take on whatever lies ahead.



Saybra O’Brien

Director of Administration

Saybra O’Brien is a visual artist living in Houston, Texas. She’s an expert at making popcorn on the stove-top and walking stubborn basset hounds.

November 21, 2022

Building web design and development teams happens in two ways, and the approach is a little like managing a sports franchise. You can hire free agents and apply their top-level skills to foster a winning environment. Or, you can find and nurture your own budding talent.

Both paths are critical to any successful organization. But there’s a special excitement that comes from a team producing a homegrown star. For one, the individual expands their skills by learning from the best people in your organization. But just as importantly, you also expand the playing field to include an all-star talent who may have otherwise gone unnoticed.

This is exactly what happened when we partnered with DrupalEasy, a Drupal training and consulting organization, to sponsor a fellowship program. The result: an industry newcomer became a Drupal developer and earned a full-time role at Four Kitchens.

As longtime advocates of open-source software and the importance of sharing knowledge, we were grateful for another chance to give back to the Drupal community.

How an online fellowship delivered an opportunity for internal investment

DrupalEasy fosters new development talent through a comprehensive 12-week program called Drupal Career Online (DCO). The continuing education course is certified by the Drupal community, but stands apart from similar bootcamp-style programs by offering one-on-one instruction focused on individual learning.

We partnered with DCO to provide one applicant a full scholarship. Our sole requirement: They had to commit to joining our team as a full-time associate developer after successfully completing the program.

Website development needs to extend its reach to thrive

The internet is full of gloom-and-doom warnings that Drupal is dying. While we respectfully disagree, the relatively flat numbers detailing Drupal core usage raise a compelling point. Drupal as a platform continues to evolve in terms of features and functionality. But the community needs new perspectives to thrive.

Diverse communities are a powerful force, and true diversity isn’t about checking boxes; rather, it’s about introducing new skills, backgrounds, and lived experience to the world of website development. We wanted to create space for someone new to get their start in this industry.

The DCO program doesn’t require applicants to have any prior engineering experience to enroll in the program. But rather than helping prospective attendees brush up on their existing skills, our goal was to ensure our fellowship provided an opportunity that expanded the reach of our industry. We wanted to provide a scholarship to someone who was totally new to Drupal.

Each sponsoring organization set fellowship requirements that targeted applicants who are underrepresented in the tech community. Overall, website design and development has a diversity problem. When you look at “team” pages for small Drupal agencies, you typically see a roster of middle-aged white men.

That said, our applicants had to meet specific background criteria. Our fellowship candidate needed to be a woman, a person of color, or simply anyone who didn’t enjoy the same opportunities as many others in tech. That way, the Drupal community could expand in new directions.

Successful fellowships depend on nurturing the right applicant

The fellowship process featured healthy competition, not only from a wide range of applicants, but also other sponsors. We weren’t the only organization offering a DCO scholarship, which meant that we were competing against other companies to create the most attractive offer for candidates.

DCO funneled the applicants who expressed interest in Four Kitchens to us for review. We met with candidates using a mix of our usual hiring interviews and questions specific to the fellowship.

We chose Brandon, and we couldn’t be more thrilled.

But even after deciding on a fellow, we didn’t want to wait and see how his DCO program progressed. DrupalEasy kept us informed, but we also assigned Brandon a Four Kitchens mentor to check in with him weekly. If Brandon had questions about something in his coursework or wanted to see it applied in real time, he had someone on our team to ask. As director of administration, I also regularly checked in to see how he was doing.

Then, a few weeks before the end of the DCO course, we asked Brandon to complete a skills assessment, which is our next step before hiring any candidate. By conducting the interview early, we could identify any areas for Brandon to highlight before the course was over so he could be successful once he started at Four Kitchens.

“[The assessment] was definitely nerve-wracking,” Brandon recalls. “But I felt like I had so much support — that kind of eased me. I was pretty confident with the skills I learned.”

Expanding junior-level recruitment by closing the experience gap

At Four Kitchens, we’ve established processes like a mentorship program to ensure junior-level developers are set up for success. The days of lone-wolf developers coding into the night to learn new skills are over. In this spirit, our fellowship program experience through DCO was also a shared success.

One of the most common issues encountered at the start of your career is resolving the paradox of experience. You can’t find the right job without agency experience, but you can’t get any agency experience if one won’t hire you. This fellowship revealed a sweet spot for recruitment that combined looking at a person’s skill level with formal instruction and hands-on experience.

With all of those requirements resolved, Brandon started at Four Kitchens ready to take on client work. However, client services are tricky, and a new hire isn’t necessarily ready to tackle the most complex issues, no matter how well they’re trained. When you’re looking for the next stop in your career, you have to ensure it’s structured to nurture your talent after you’re hired, too.

The end of a fellowship doesn’t mark the end of our investment in a new team member. Our core value to “Always Improve” demands that we have enough workload to dedicate to hiring an associate developer. Plus, we have to ensure their teammates also have the bandwidth to provide support as a new hire takes on client work.

Being the newest member of a team can be intimidating when you’re just starting out. But joining a team that has your back streamlines the process. Just ask Brandon.

“I really feel like the fellowship prepared me, and I’m using those skills on a daily basis. Then having this continued support has been amazing,” he says. “I was really set up to thrive.”



Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

May 27, 2022

Have you ever found yourself needing to share custom dependencies across several sites, maybe even for the same client? There are several ways to handle this workflow, especially in the Drupal ecosystem: upstreams, distributions, and multisites are all worth considering. The fun, however, lies in the challenge of determining how to scale an architecture.

Create a custom packagist

The ingredients for creating a custom packagist, a repository of dependencies used by Composer, are surprisingly easy to come by. Keep in mind that a private packagist can be obtained through a hosted service at packagist.com. In our case, we already had the tooling readily available, so we decided to go the custom packagist route.

The goal of this article is to give you some ideas on how to host a solid packagist for a team, organization, or client while describing how the Four Kitchens team came up with a fun and creative solution to provide this functionality using the tools our client had on hand. I hope to accomplish this by:

  • Sharing our motivation behind choosing this solution
  • Identifying the ingredients needed to cook up the workflow
  • Explaining baseline hosting, while elaborating on what you could do if so inclined
  • Laying out how we set up automation around the workflow to make our lives easier

Let’s begin.

Motivation

On one client project, we found we were sharing enough private custom dependencies through a private distribution that we needed to scale beyond editing the repositories listing in each site’s composer.json. If we were using an upstream setup, this could be accomplished with Composer Merge Plugin. In this case, however, it made sense to create a custom packagist. Keep in mind that if we didn’t, each of our composer.json files would have had 11 custom packages and 11 VCS entries in its repositories section, and that would grow with each additional dependency we added to our distribution. We currently maintain 20 sites on this distribution, and our policy is to have code review for every change to a site. Making changes to 21 repos (the distribution and all the downstream sites) was a development time suck.

If you are here, you probably know the answer to the question, “Why can’t Composer load repositories recursively?” but if you don’t, check out this great explanation. In short, the repositories section of a composer.json cannot inherit that section from a dependency’s composer.json. So it’s up to the individual projects to make sure they have the right packages when it comes to those custom packages that our distribution requires.

We might have been able to reduce our custom dependencies by relying on another hosted packagist such as asset-packagist.org, or by working to make other dependencies publicly available. However, providing our own packagist maintained specifically for the client’s needs brought us performance gains over the other solutions and allows us to more closely vet our frontend library dependencies. It allows us to make a single “repositories” entry at the packagist level, which gets pulled down by all of our sites that point at it. This means less code editing on a per-site basis.

So here we are, using an easily maintained solution, and reaping the benefits of performance, scalability, and increased developer productivity, while keeping our client’s ecosystem private. We didn’t even need that much to get started!

Ingredients

Things you will need to get started:

  • Satis, a rudimentary static packagist generator written in PHP using Composer as a library.
  • A repository to house the custom dependencies you want to put in your packagist. Think: all the items you currently have in your repositories section of your composer.json. This isn’t strictly a “must,” but it makes automation possible.
  • A place to host static HTML and JSON files. Anything web accessible. HTTPS is preferred, but curl can work under other protocols. You can get pretty creative here.
    • Cheap hosting service
    • Spare Droplet, Linode, or AWS EC2 instance
    • S3 bucket
    • GitHub
    • FTP
  • Something to build the packagist when a dependency updates, such as:
    • GitHub Actions
    • CircleCI
    • Travis
    • Cron
    • A manual implementation like running a command via SSH

Our implementation looks like this:

  • Repository: GitHub
  • Hosting: S3 bucket
  • Builder: CircleCI

These were all resources we were already using. I’ll go into the specifics on how our build works with some suggestions on alternatives.

It’s pretty simple to set up Satis. There is some decent documentation on Satis at GetComposer.org and on GitHub. What I say here may be a bit of a repeat, but I’m describing an opinionated setup intended to allow for testing and committing changes. This is a necessity when multiple developers are touching the same packagist and you need accountability.

Before I dive into the specifics of our setup, I want to mention that if you feel you don’t need this level of control, testing, and revision history, Satis can be set up as a living stand-alone app. You can host it either as a Docker container or on a hosting platform. In both of these options, developers would live edit and maintain the packagist via the command line by default. You can, however, install a graphical frontend using something like Satisfy.

To set up Satis the way Four Kitchens has, follow the steps below. Example code follows if you need to see how it might look.

  1. Create a new repository.
  2. Initialize a new composer project using composer init.
  3. Require Satis: composer require composer/satis.
  4. Add a script to your composer.json to run the full build command.
  5. Add a packages directory to the project with a .gitkeep file. mkdir packages && touch packages/.gitkeep.
  6. Add a .gitignore to ignore the vendor folder and generated package files.
  7. Consider setting up a Lando instance to serve your packagist for testing.
  8. Create satis.json just like you normally would a standard composer.json with the repositories section containing all your packages, repos, and packagists you want available to the projects consuming it.
  9. Add "require-all": true below the repositories section of satis.json. There’s more about usage of require-all versus require in the Satis setup documentation. Use what fits your needs, but if you are adding individual packages instead of entire packagists to your satis.json, require-all is likely all you need.

Your repo could look something like this:

composer.json

{
  "name": "mycompany/packages",
  "require": {
    "composer/satis": "^1.0"
  },
  "scripts": {
    "build": "./vendor/bin/satis build satis.json packages"
  }
}

satis.json

{
  "name": "mycompany/packages",
  "homepage": "https://packages.mycompany.com",
  "repositories": [
    { "type": "vcs", "url": "https://github.com/mycompany/privaterepo" },
    { "type": "vcs", "url": "http://svn.example.org/private/repo" },
    { "type": "package", "package": [
      { "name": "dropzone/dropzone", "version": "5.9.2", "dist": { "url": "https://github.com/dropzone/dropzone/releases/download/v5.9.2/dist.zip", "type": "zip" } }
    ]}
  ],
  "require-all": true
}

.lando.yml

name: mycompany-packages
recipe: lemp
config:
  webroot: packages
  composer_version: 2
  php: '7.4'

.gitignore

vendor
packages/*
!packages/.gitkeep

From here, run lando start && composer install && composer build. Now go to another project and add your test packagist to that project. Additionally, add "secure-http": false to the config section, since Lando’s HTTPS certificate is insecure by default. Lastly, require one of the packages you added to satis.json above.

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "http://mycompany-packages.lndo.site"
    }
    ...
  ],
  "require": {
    "dropzone/dropzone": "^5.9.2"
  },
  ...
  "config": {
    ...
    "secure-http": false
  }
  ...
}

At this point you should be greeted with a successfully built project and have a local instance of your packagist going. When you are done testing, stop Lando and switch your repository entry in the other project to your packagist’s public URL. Commit all your changes and push up!

Your next step is getting your packagist off your local and out where everyone can use it.

Now you can simply copy the files in your packages folder and put them somewhere web accessible. I really want to drive this point home: the entirety of your packagist is simply the contents of that folder and nothing else. The things that make this so complicated are the processes around automating and updating this packagist.

You could, for example, now take these files you created and host them anywhere someone can curl to. This means HTTP, FTP, and SFTP are available to you, to name a few. If you aren’t worried about privacy, you can even go so far as placing them in the webroot, or even the sites/default/files folder, of your company’s Drupal site. This is a good option if you are strapped for domain names or running a small operation. You would then copy those files any time someone changes any of the packages that are part of your packagist.

If that’s all you are looking for, you can stop here. You’ve done it! You now have a custom packagist, and the rest of the workflow may not matter to you. However, if you want some more ideas and want to build out a more robust automated development workflow, keep reading. The ideas get interesting from here.

If you wanted to be creative, you could remove the line from .gitignore that excludes the packages folder, commit it, set your packagist URL to something like https://raw.githubusercontent.com/mycompany/packages/main/, and set up Accept and Authorization headers in your packagist entry. You can see an example of how to use headers in your packagist at GetComposer.org and below with our S3 example.

In fact, the composer.json setup described in that creative GitHub example is quite similar to what we did, except we used a workaround recommended by AWS for restricting access to a specific HTTP referer. Our client wanted the extra security so that not just anybody could poke around at the packages and versions we had available.

In our example, we created a normal bucket, and assigned a CNAME to it with a nice domain name. The CNAME is optional but makes it more “official” and allows us to move the packagist later without disrupting the developer workflow too much. We then added a policy to only accept connections from calls with a referer that is our secret key. A referer doesn’t have to be a website. In our case it’s a lengthy hash that would be difficult to guess. This too is optional, but if you are looking for that extra level of security, it’s a good option to consider. Note that you should not add spaces between the colon and the token when using this policy. Our repositories entry in our projects looks like:

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.mycompany.com",
      "options": {
        "http": {
          "header": [
            "Referer:"
          ]
        }
      }
    }
    ...
  ],
  ...
}
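For reference, the referer restriction described above can be sketched as a bucket policy along these lines. This is a sketch based on AWS’s documented aws:Referer condition key; the bucket name matches the example domain, and LONG-SECRET-TOKEN is a placeholder for the real secret:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetWithSecretReferer",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::packages.mycompany.com/*",
      "Condition": {
        "StringLike": { "aws:Referer": "LONG-SECRET-TOKEN" }
      }
    }
  ]
}
```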

And that’s it. We copy the files up to the bucket using the AWS CLI, and it’s published.

Now we need to automate the workflow and get what’s in our hosting location to update automatically.

Building and automation

I’ve pointed out that, if you are willing, you can put Satis somewhere, generate the packagist files, upload them somewhere web accessible, and be ready to roll. This isn’t so different from static site generators like Jekyll or Hugo. However, we add in CI for automation and revision control for accountability, so we can take the “error” out of human command crunching. It’s worth mentioning again that this is super important when entire teams modify this packagist.

In our example, I’m using CircleCI. You can do the same with GitHub Actions, Jenkins, or even run on a cron job, provided you are okay with a time-based cadence. You might even do more than one of these. Our CircleCI job looks like this:

.circleci/config.yml

version: 2.1
orbs:
  php: circleci/[email protected]
  aws-cli: circleci/[email protected]
parameters:
  run_dependency_update:
    default: true
    type: boolean
jobs:
  create_packagist:
    executor:
      name: php/default
      tag: '7.4.24'
    steps:
      - checkout
      - aws-cli/setup
      - php/install-composer
      - php/install-packages
      - run:
          name: Set Github authentication
          command: composer config --global github-oauth.github.com "$GITHUB_TOKEN";
      - run:
          name: Link auth for satis
          command: mkdir ~/.composer; ln -s ~/.config/composer/auth.json ~/.composer/auth.json
      - run:
          name: Build packagist json files
          command: composer build
      - store_artifacts:
          path: packages
      - run:
          name: Copy packagist to aws
          command: aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/
workflows:
  version: 2
  packagist:
    when: << pipeline.parameters.run_dependency_update >>
    jobs:
      - create_packagist:
          filters:
            branches:
              only:
                - main
                - master

There’s a lot to unpack here. I’m using pipeline parameters because a requirement for me is to be able to trigger this job when another project updates. This functionality allows me to call this CircleCI job using an API call. I also use CircleCI orbs to make grabbing the AWS CLI and setting up a PHP environment easy.

The meat of the job is the same as what you were doing during testing: running the build command we put in our composer.json. Since some of our repositories are private, we also have to give Composer access by creating a GitHub token. Then we copy everything to the bucket using the AWS CLI. In our case, we have some behind-the-scenes environment variables defining our keys: AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION, and AWS_SECRET_ACCESS_KEY.

From another project’s perspective, I’m still using CircleCI to run the API call. You can do this really easily in other CI environments, too.

version: 2.1
jobs:
  update-packagist:
    docker:
      - image: cimg/base:2021.12
    steps:
      - run: >-
          curl --request POST
          --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline
          --header "Circle-Token: $CIRCLE_TOKEN"
          --header "content-type: application/json"
          --data '{"parameters":{"run_dependency_update":true}}'
workflows:
  build:
    jobs:
      - update-packagist

That’s it. I add this job to every project that has a VCS entry in our satis.json (provided I have access) and let it go to town. If you find yourself with other dependencies out of your control, consider adding a cron job somewhere or a scheduled pipeline trigger. You are done!
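A cron-based fallback can be as small as one crontab line. A sketch, assuming (hypothetically) the Satis project is checked out at /opt/packages, credentials are configured for the AWS CLI, and the bucket name matches the earlier example:

```
# Hypothetical crontab entry: rebuild and publish the packagist nightly at 02:15
15 2 * * * cd /opt/packages && composer build && aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/
```

This trades the instant, event-driven updates of the API-triggered job for a fixed cadence, which may be fine for slow-moving dependencies.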

Final thoughts

This workflow can be as easy or as difficult as you want to make it, depending on a few factors:

  • How often it will change
  • How many people touch it
  • How up to date it needs to be

There are a lot of ideas here, representing several different application architectures for organizations that have multiple projects or sites. If you don’t want to bother with the home-brewed solution, dish out the cash and get a private Packagist. The cost may be worth it.

However, if you are already using all the necessary services and have a team of knowledgeable individuals like ours, consider maintaining your own packagist that you can host anywhere. You may find it a productive, performant, and most of all joyful and exciting experience that will bring value to your upstream, distribution, or multi-site setup.

Making the web a better place to teach, learn, and advocate starts here...

When you subscribe to our newsletter!

Nov 18 2020

Jim Vomero

Senior Engineer

As a tech lead, Jim works with clients through the full project cycle, translating their business requirements into actionable development work and working with them to find technical solutions to their challenges.

November 18, 2020

From the consumer perspective, there’s never been a better time to build a website. User-friendly website platforms like Squarespace allow amateur developers to bypass complex code and apply well-designed user interfaces to their digital projects. Modern site-building tools aren’t just easy to use — they’re actually fun.

Anyone who has managed a Drupal website knows the same can’t be said for that platform. While rich with possibilities, the default editorial interface for Drupal feels technical, confusing, and even restrictive to users without a developer background. Consequently, designers and developers too often build a beautiful website while overlooking its backend CMS.

Drupal’s open-ended capabilities constitute a competitive advantage when it comes to developing an elegant, customer-facing website. But a lack of attention to the needs of those who maintain your website content contributes to a perception that Drupal is a developer-focused platform. By building a backend interface just as focused on your site editors as the frontend, you create a more empowering environment for internal teams. In the process, your website performs that much better as a whole.

UX principles matter for backend design as much as the frontend

Given Drupal’s inherent flexibilities, there are as many variations of CMS interfaces as there are websites on the platform. That uniqueness is part of what makes Drupal such a powerful tool, but it also constitutes a weakness.

The editorial workflow for every website is different, which opens an inevitable training gap in translating your site’s capabilities to your editorial team. Plus, despite Drupal’s open-source strengths, you’ll likely need to reinvent the wheel when designing CMS improvements specific to your organization.

For IT managers, this is a daunting situation because the broad possibilities of Drupal are often overwhelming. If you try to make changes to your interface, you may be frustrated when a seemingly easy fix requires 50 hours of development work. Too often, Drupal users wind up working with an inefficient and confusing CMS because they’re afraid of the complexity that comes with building out a new interface.

Fortunately, redesigning your CMS doesn’t have to be a demanding undertaking. With the right expertise, you can develop custom user interfaces with little to no coding required. Personalized content dashboards and defined roles and permissions for each user go a long way toward creating a more intuitive experience.

Improving your backend design is often seen as an additional effort, but think of it as a baseline requirement. And, by sharing our user stories within the Drupal community, we also build a path toward improving the platform for the future.

Admin themes are a great starting point

Drupal’s default admin theme as of Drupal 9.4 is Claro, and it’s a good starting point for admin user experience customization. Claro was developed to address the concerns that came out of the Drupal Admin UX Study, which examined the difficulties content editors encountered with the platform.

Here at Four Kitchens, we use the Gin theme, which is based on Claro but includes extra enhancements. A number of useful modules are also available to tie add-ons together with Gin, like Gin Toolbar and Gin Layout Builder.

For our own usage (and yours, too!), we have compiled the Gin theme and some handy modules and configuration into a starter project we call Sous. Sous also incorporates an Emulsify-based frontend theme and other goodies.

This standardization is used across nearly all of our builds, making our development more efficient. Claro — and by extension, Gin — also includes accessibility work within the admin interface, which provides a more inclusive experience.

Additionally, both Claro and Gin incorporate responsive layouts, so if an editor needs to make changes on a phone or a tablet, they can. If you’re a long-time Drupal user, you will remember how impossible that used to be.

Use Drupal’s Views module to customize user dashboards

One of the biggest issues with Drupal’s out-of-the-box editorial tools is that they don’t reflect the way any organization actually uses the CMS. Just as UX designers look to provide a positive experience for first-time visitors to your site, your team should aim to deliver a similarly strong first impression for those managing its content.

By default, Drupal takes users to their profile pages upon login, which is useful to… almost no one. Plus, the platform’s existing terminology uses cryptic terms such as “node,” “taxonomy,” and “paragraphs” to describe various content items. From the beginning, you should remove these abstract references from your CMS. Your editorial users shouldn’t have to understand how the site is built to own its content.

In the backend, every Drupal site has a content overview page, which shows the building blocks of your site. Offering a full list that includes cryptic timestamps and author details, this page constitutes a floodgate of information. Designing an effective CMS is as much an exercise in subtraction as addition. Whether your user’s role involves reviewing site metrics or new content, their first interaction with your CMS should display what they use most often.

If one population of users is most interested in the last item they modified, you can transform their login screen into a custom dashboard that displays those items. If another group works exclusively with SEO, you can create an interface that displays reports and other common tasks. Using Drupal’s Views module, dashboards like these are possible with a few clicks and minimal coding.

By tailoring your CMS to specific user habits, you allow your website teams to find what they need and get to work faster. The most dangerous approach to backend design is to try and build one interface to rule them all.

Listen to your users and ease frustrations with a CMS that works

Through Drupal Views, you can modify lists of content and various actions to control how they display in your CMS. While Views provides many options to create custom interfaces, your users themselves are your organization’s most vital resource. By watching how people work on your site, you can recognize areas where your CMS is falling short.

Drupal content dashboard

Even if you’ve developed tools aimed at satisfying specific use cases, you might be surprised by the way they are actually used. Through user experience testing, you’ll often find the workarounds your site editors have developed to manage the site.

In one recent example, site editors needed to link to another page of the site from within the CMS. Without that functionality, they had to open the target page in another tab, view its source code, and copy its node ID number. Anyone watching these users would find the process cumbersome, time-consuming, and frustrating. Fortunately, implementing the Linkit module easily eliminated this needless effort.

There are many useful modules in the Drupal ecosystem that can enhance the out-of-the-box editorial experience. Entity Clone expedites the content creation process. Views Bulk Operations and Bulk Edit simplify routine content update tasks. Computed Field and Automatic Entity Label take the guesswork out of derived or dependent content values. Custom form modes and Field Groups can help bring order to and streamline the content creation forms.

Most of the time, your developers don’t know what solutions teams have developed to overcome an ineffective editorial interface. And, for fear of the complexity required to create a solution, these supposed shortcuts too often go unresolved. Your backend users may not even be aware their efforts could be automated or otherwise streamlined. As a result, even the most beautiful, user-friendly website is bogged down by a poorly designed CMS.

Once these solutions are implemented, however, you and your users enjoy a shared win. And, through sharing your efforts with the Drupal community, you and your team build a more user-friendly future for the platform as well.


Sep 23 2020

Michael Lutz

Senior Engineer

Primarily responsible for maintaining the Drupal core migration system, Michael often spends long nights and weekends working through the Drupal project issues queue, solving problems, and writing code.

September 23, 2020

Working in digital design and development, you grow accustomed to the rapid pace of technology. For example: After much anticipation, the latest version of Drupal was released this summer. Just months later, the next major version is already in progress.

At July’s all-virtual DrupalCon Global, the open-source digital experience conference, platform founder Dries Buytaert announced Drupal 10 is aiming for a June 2022 release. Assuming those plans hold, Drupal 9 would have the shortest release lifetime of any recent major version.

For IT managers, platform changes generate stress and uncertainty. Considering the time-intensive migration process from Drupal 7 to 8, updating your organization’s website can be costly and complicated. Consequently, despite a longtime absence of new features, Drupal 7 still powers more websites than Drupal 8 and 9 combined. And, as technology marches on, the end of its life as a supported platform is approaching.

Fortunately, whatever version your website is running, Drupal is not running away from you. Drupal’s users and site builders may be accustomed to expending significant resources to update their website platform, but the plan for more frequent major releases alleviates the stress of the typical upgrade. And, for those whose websites are still on Drupal 7, Drupal 10 will continue offering a way forward.

The news that Drupal 10 is coming sooner rather than later might have been unexpected, but you have no reason to panic just yet. However, your organization shouldn’t stand still, either.

Drupal 10 is coming (image via dri.es).

The end for Drupal 7 is still coming, but future upgrades will be easier

Considering that upgrading to Drupal 8 involves the investment of building a new site and migrating its content, it’s no wonder so many organizations have been slow to update their platform. Drupal 7 is solid and has existed for nearly 10 years. And, fortunately, it’s not reaching its end of life just yet.

At the time of Drupal 9’s release, Drupal 7’s planned end of life was set to arrive late next year. This meant the community would no longer release security advisories or bug fixes for that version of the platform. Affected organizations would need to contact third-party vendors for their support needs. With the COVID-19 pandemic upending businesses and their budgets, the platform’s lifespan has been extended to November 28, 2022.

Drupal’s development team has retained its internal migration system through versions 8 and 9, and it remains part of the plan for the upcoming Drupal 10 as well. The community continues to maintain and improve the system in an effort to make the transition easier. If your organization is still on Drupal 7 now, you can use the migration system to jump directly to version 9, or to version 10 upon its release. Drupal has no plans to eliminate that system until Drupal 7 usage numbers drop significantly.

Once Drupal 10 is ready for release, Drupal 7 will finally reach its end of life. However, paid vendors will still offer support options that will allow your organization to maintain a secure website until you’re ready for an upgrade. But make a plan for that migration sooner rather than later. The longer you wait for this migration, the more new platform features you’ll have to integrate into your rebuilt website.

Initiatives for Drupal 10 focus on faster updates, third-party software

In delivering his opening keynote for DrupalCon Global, Dries Buytaert outlined five strategic goals for the next iteration of the platform. Like the work for Drupal 9 that began within the Drupal 8 platform, development of Drupal 10 has begun under the hood of version 9.

A Drupal 10 Readiness initiative focuses on upgrading third-party components that count as technological dependencies. One crucial component is Symfony, the PHP framework Drupal is built upon. Symfony operates on a major release schedule of every two years, which requires that Drupal also be updated to stay current. The transition from Symfony 2 to Symfony 3 created challenges for core developers in creating the 8.4 release, which introduced changes that impacted many parts of Drupal’s software.

To avoid a repeat of those difficulties, it was determined that the breaking changes involved in a new Symfony major release warranted a new Drupal major release as well. While Drupal 9 is on Symfony 4, the Drupal team hopes to launch 10 on Symfony 6, which is a considerable technical challenge for the platform’s team of contributors. However, once complete, this initiative will extend the lifespan of Drupal 10 to as long as three or four years.

Other announced initiatives included greater ease of use through more out-of-the-box features, a new front-end theme, creating a decoupled menu component written in JavaScript, and, in accordance with its most requested feature, automated security updates that will make it as easy as possible to upgrade from 9 to 10 when the time comes. For those already on Drupal 9, these are some of the new features to anticipate in versions 9.1 through 9.4.

Less time between Drupal versions means an easier upgrade path

The shift from Drupal 8 to this summer’s release of Drupal 9 was close to five years in the making. Fortunately for website managers, that update was a far cry from the full migration required from version 7. While there are challenges, such as ensuring your custom code is updated to use the most recent APIs, the transition was doable with a good tech team at your side.

Still, the work that update required could generate a little anxiety given how comparatively fast another upgrade will arrive. But the shorter time frame will make the move to Drupal 10 easier for everybody. Less time between updates also translates to less deprecated code, especially if you’re already using version 9. But if you’re not there yet, the time to make a plan is now.


Apr 23 2020

Todd Ross Nienkerk

CEO, Owner, and Co‑Founder

Todd is responsible for driving Four Kitchens’ vision and long-term strategy.

April 23, 2020

We’ve been making big websites for 14 years, and almost all of them have been built on Drupal. It’s no exaggeration to say that Four Kitchens owes its success to the incredible opportunities Drupal has provided us. There has never been anything like Drupal and the community it has fostered—and there may never be anything like it ever again.

That’s why it’s crucial we do everything we can to support the Drupal Association. Especially now.

The impacts of COVID-19 have been felt everywhere, especially at the Association. With the cancellation of DrupalCon Minneapolis, the Drupal Association lost a major source of annual fundraising. Without the revenue from DrupalCon, the Association would not be able to continue its mission to support the Drupal project, the community, and its growth.

The Drupal community’s response to this crisis was tremendous. For our part, we proudly joined 27 other organizations in pledging our sponsorship fees to the Association regardless of whether, or how, DrupalCon happened. I ensured my Individual Membership was still active, and I made a personal contribution.

But we need to do more.

You can help by joining us in the #DrupalCares campaign.

The #DrupalCares campaign is a fundraiser to protect the Drupal Association from the financial impact of COVID-19. Your support will help keep the Drupal Association strong and able to continue accelerating the Drupal project.

The Drupal Association

The outpouring of support has been… inspiring. First, project founder Dries Buytaert and his partner Vanessa Buytaert pledged their generous support of $100,000. Then, a coalition of Drupal businesses pledged even more matching contributions. We are proud to count ourselves among the dozens of participating Drupal businesses.

Any individual donations, increased memberships, or new memberships through the end of April will be tripled by these matching pledges, up to $100,000, for a total of $300,000.

Please join us in supporting the Drupal Association. Your contribution will help ensure the continued success of the Association and the Drupal community for years to come.

Give to #DrupalCares through April to help the Association receive a 3:1 matching contribution. 


Jan 23 2020

Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

January 23, 2020

In the Drupal support world, working on Drupal 7 sites is a necessity. But switching between Drupal 7 and Drupal 8 development can be jarring, if only for the coding style.

Fortunately, I’ve got a solution that makes working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development:

  • Apply Xautoload to keep your PHP skills fresh, modern, and compatible with other frameworks, and to make your code more reusable and maintainable between projects.
  • Use the Drupal Libraries API to use third-party libraries.
  • Use the Composer template to push the boundaries of your programming design patterns.

Applying Xautoload

Xautoload is simply a module that enables PSR-0/4 autoloading. Using Xautoload is as simple as downloading and enabling it. You can then start using use and namespace statements to write object-oriented programming (OOP) code.

For example:

xautoload.info

name = Xautoload Example
description = Example of using Xautoload to build a page
core = 7.x
package = Midcamp Fun

dependencies[] = xautoload

xautoload_example.module

<?php

use Drupal\xautoload_example\SimpleObject;

function xautoload_example_menu() {
  $items['xautoload_example'] = array(
    'page callback' => 'xautoload_example_page_render',
    'access callback' => TRUE,
  );
  return $items;
}

function xautoload_example_page_render() {
  $obj = new SimpleObject();
  return $obj->render();
}

src/SimpleObject.php

 "

Hello World

", ); } }

Enabling and running this code causes the URL /xautoload_example to spit out “Hello World”.

You’re now ready to add in your own OOP!

Using third-party libraries

Natively, Drupal 7 has a hard time autoloading third-party library files. But there are contributed modules out there (like Guzzle) that wrap object-oriented third-party libraries to provide a functional interface. Now that you have Xautoload in your repertoire, you can use its functionality to autoload libraries as well.

I’m going to show you how to use the Drupal Libraries API module with Xautoload to load a third-party library. You can find examples of all the different ways you can add a library in xautoload.api.php. I’ll demonstrate an easy example by using the php-loremipsum library:

1. Download your library and store it in sites/all/libraries. I named the folder php-loremipsum.

2. Add a function implementing hook_libraries_info to your module by pulling in the namespace from Composer. This way, you don’t need to set up all the namespace rules that the library might contain.

function xautoload_example_libraries_info() {
  return array(
    'php-loremipsum' => array(
      'name' => 'PHP Lorem Ipsum',
      'xautoload' => function ($adapter) {
        $adapter->composerJson('composer.json');
      }
    )
  );
}

3. Change the page render function to use the php-loremipsum library to build content.

use joshtronic\LoremIpsum;
function xautoload_example_page_render() {
  $library = libraries_load('php-loremipsum');
  if ($library['loaded'] === FALSE) {
    throw new \Exception("php-loremipsum didn't load!");
  }
  $lipsum = new LoremIpsum();
  return array(
    '#markup' => $lipsum->paragraph('p'),
  );
}

Note that I needed to tell the Libraries API to load the library, but I then have access to all the namespaces within the library. Keep in mind that the dependencies of some libraries are immense. You’ll very likely need to run Composer from within the library and commit the result when you first start out. In such cases, you might need to make sure to include the Composer autoload.php file.

Another tip: Abstract your libraries_load() functionality out in such a way that if the class you want already exists, you don’t call libraries_load() again. Doing so removes Libraries as a hard dependency of your module and enables you to use Composer to load the library later on with no more work on your part. For example:

function xautoload_example_load_library() {
  if (!class_exists('\joshtronic\LoremIpsum', TRUE)) {
    if (!module_exists('libraries')) {
      throw new \Exception('Include php-loremipsum via composer or enable libraries.');
    }
    $library = libraries_load('php-loremipsum');
    if ($library['loaded'] === FALSE) {
      throw new \Exception("php-loremipsum didn't load!");
    }
  }
}

And with that, you’ve conquered the challenge of using third-party libraries!

Setting up a new site with Composer

Speaking of Composer, you can use it to simplify the setup of a new Drupal 7 site. Just follow the instructions in the Readme for the Composer Template for Drupal Project. From the command line, run the following:

composer create-project drupal-composer/drupal-project:7.x-dev  --no-interaction

This code gives you a basic site with a source repository (a repo that doesn’t commit contributed modules and libraries) to push up to your Git provider. (Note that migrating an existing site to Composer involves a few additional considerations and steps, so I won’t get into that now.)

If you’re generating a Pantheon site, check out the Pantheon-specific Drupal 7 Composer project. But wait: The instructions there advise you to use Terminus to create your site, and that approach attempts to do everything for you—including setting up the actual site. Instead, you can simply use composer create-project  to test your site in something like Lando. Make sure to run composer install if you copy down a repo.

From there, you need to enable the Composer Autoload module, which is automatically required in the composer.json you pulled in earlier. Then, add all your modules to the require portion of the file, or use composer require drupal/module_name just as you would in Drupal 8.

You now have full access to all the Packagist libraries and can use them in your modules. To use the previous example, you could remove php-loremipsum from sites/all/libraries and instead run composer require joshtronic/php-loremipsum. The code would then run the same as before.

From here on out, it’s up to your imagination. Code and implement with ease, using OOP design patterns and reusable code. You just might find that this new world of possibilities for integrating new technologies with your existing Drupal 7 sites increases your productivity as well.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
