
Mike Goulding

Senior Drupal Engineer

Mike has been part of the Four Kitchens crew since 2018, where he works as a senior engineer and tech lead for a variety of Drupal and WordPress projects.

March 21, 2024

AstroJS logo

There are many different options available for the organization or team that decides it is time to decouple their Drupal site. There are frameworks that are designed for static site generation (SSG) and there are others that use server-side rendering (SSR), with many that claim to do both well.

React and NextJS have been popular options for a while now, and they are well-loved here at Four Kitchens as well. Another framework that is a little different from the above is Astro, and it may be worth considering.

What is Astro?

Astro is an interesting framework to work with, and it only becomes more so with time. Astro’s website makes claims of performance advantages over many other frameworks in the space and publishes the full report behind those claims.

More interesting than performance claims are some of the unique features this framework brings with it. Astro has many official integrations for other popular JS frameworks. This means, for example, that part of a page could use React, while another part could use Svelte. An even more ambitious page could use Vue, React, and AlpineJS for different components. While these examples are not a typical or recommended use case, they do illustrate that flexibility is one of the real strengths of Astro.

This flexibility doesn’t come with a steep learning curve, as Astro makes use of enough familiar pieces so that newcomers aren’t immediately overwhelmed. It is possible to write Astro components in a straightforward manner, similar to HTML, and still incorporate JavaScript XML (JSX) expressions to include data in the component’s output. There are a couple of tutorials for getting started with Astro, and they do a good job of giving the general structure of a project along with some scenarios that are unique to the framework.
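For instance, a minimal Astro component mixes an HTML-like template with JSX-style expressions, with plain JavaScript in the frontmatter fence at the top (names here are illustrative):

---
// Frontmatter: plain JavaScript that runs at build time.
const title = "Hello from Astro";
const items = ["One", "Two", "Three"];
---
<section>
  <h2>{title}</h2>
  <ul>
    {items.map((item) => <li>{item}</li>)}
  </ul>
</section>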

Houston, Astro's mascot

(Also, Houston is an adorable mascot and I am here for it!)

Using Astro with Drupal

Despite all of the integrations that can be found in the Astro toolset, there is notably one key thing missing: there isn’t an existing integration for Drupal! The content management systems (CMSs) that Astro recommends are specifically headless CMSs, which make for a more natural starting point for this setup than converting a Drupal site.

Never fear, though! Drupal may not specifically be on that list, but that doesn’t mean it isn’t something that should be considered. Astro has that incredible flexibility, after all, and that means there are more options than it seems on the surface. All that is needed is an endpoint (or several) to fetch data from Drupal, and things are looking up once again.

Using the Drupal GraphQL and GraphQL Compose modules, it is possible to quickly expose data from Drupal and get it into the hands of a decoupled framework like Astro. With that, it becomes possible to fetch that data within Astro and build our frontend while taking advantage of many of the features that Astro offers. This can also be done with REST API or JSON:API, but for our purposes, the consistency and structure of GraphQL can’t be beat when crafting a decoupled integration with Drupal.

Astro with GraphQL

Using the fetch function available to Astro (and JavaScript in general), we can get data from just about anywhere into our Astro components. This blends well with the head start from GraphQL Compose: you can take an existing Drupal site and be ready to connect it to a frontend framework very quickly. This means quicker prototyping and quicker assembly of components.
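As a minimal sketch, fetching content inside a component’s frontmatter might look like this (the endpoint URL and query shape are assumptions based on what GraphQL Compose typically generates):

---
// Build-time fetch against the Drupal GraphQL endpoint (URL is illustrative).
const response = await fetch("https://drupal.example.com/graphql", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    query: `{
      nodeArticles(first: 10) {
        nodes { title path }
      }
    }`,
  }),
});
const { data } = await response.json();
const articles = data.nodeArticles.nodes;
---
<ul>
  {articles.map((article) => (
    <li><a href={article.path}>{article.title}</a></li>
  ))}
</ul>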

Astro also supports dynamic routing out of the box, which is an essential feature when connecting to a Drupal site where routes aren’t always structured like directories. Using this wildcard type of functionality, we can more easily take an existing site — regardless of the structure of the content — and get output into Astro. With the data from the routes in hand, we can get to the fun part: building the components and taking advantage of more of Astro’s flexibility.
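Here is a hedged sketch of that wildcard routing in a catch-all route such as src/pages/[...slug].astro (again assuming a GraphQL Compose-style query; Astro expects rest params without a leading slash):

---
export async function getStaticPaths() {
  const response = await fetch("https://drupal.example.com/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: `{ nodeArticles(first: 100) { nodes { title path } } }`,
    }),
  });
  const { data } = await response.json();
  return data.nodeArticles.nodes.map((node) => ({
    // "/blog/my-post" becomes the route /blog/my-post.
    params: { slug: node.path.replace(/^\//, "") },
    props: { node },
  }));
}
const { node } = Astro.props;
---
<h1>{node.title}</h1>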

Flexibility is key

For me, Astro’s strength doesn’t solely come from the speed that it builds and renders content or the ease of building pages in a familiar JSX or Markdown pattern. Its real strength comes from the flexibility and variety of build options. While it does a great job handling some functionality on a given component or creating simple pages for a blog listing, it does even more with the ability to bring in other frameworks inside of components. Want to add a search page, but there isn’t an existing integration for Astro? If there is one for React, that works here, too! Do you have an internal team member really excited about building personalized content with Vue? Bring that in, and that component will work as well.

While the reality of the implementations may be a bit more involved than described on the tin, it is surprisingly easy and encouraged to bring in live updating components inside of Astro. This changes what would otherwise be a run-of-the-mill frontend tool into something much more interesting. Astro does shine in its own right, especially with statically generated pages and content. It just wouldn’t be doing anything especially new without bringing in other frameworks.

This is also where bringing a CMS like Drupal into a decoupled setup with Astro is intriguing. There is an opportunity for highly dynamic pages that wouldn’t work with a traditional static framework while still getting the speed and benefits of that approach. Drupal sites are typically very quick to update when content changes, which can be a sticking point for a decoupled architecture: How often should the frontend be rebuilt, and how much can caching make up the difference? When some parts of the site use components that can update more easily on the page, the benefits of both approaches can come through.


Marc Berger

Senior Backend Engineer

Always looking for a challenge, Marc tries to add something new to his toolbox for every project and build — be it a new CSS technology, creating custom APIs, or testing out new processes for development.

March 13, 2024

Recently, one of our clients had to retrieve some information from their Drupal site during a CI build. They needed to know the internal Drupal path from a known path alias. Common Drush commands don’t provide this information directly, so we decided to write our own custom Drush command. It was a lot easier than we thought it would be! Let’s get started.

Note: This post is based on commands and structure for Drush 12.

While we can write our own Drush command from scratch, let’s discuss a tool that Drush already provides us: the drush generate command. Drush 9 added support to generate scaffolding and boilerplate code for many common Drupal coding tasks such as custom modules, themes, services, plugins, and many more. The nice thing about using the drush generate command is that the code it generates conforms to best practices and Drupal coding standards — and some generators even come with examples as well. You can see all available generators by simply running drush generate without any arguments.

Step 1: Create a custom module

To get started, a requirement to create a new custom Drush command in this way is to have an existing custom module already in the codebase. If one exists, great. You can skip to Step 2 below. If you need a custom module, let’s use Drush to generate one:

drush generate module

Drush will ask a series of questions such as the module name, the package, any dependencies, and if you want to generate a .module file, README.md, etc. Once the module has been created, enable the module. This will help with the autocomplete when generating the custom Drush command.

drush en <module_machine_name>

Step 2: Create custom Drush command boilerplate

First, make sure you have a custom module where your new custom Drush command will live and make sure that module is enabled. Next, run the following command to generate some boilerplate code:

drush generate drush:command-file

This command will also ask some questions, the first of which is the machine name of the custom module. If that module is enabled, it will autocomplete the name in the terminal. You can also tell the generator to use dependency injection if you know what services you need to use. In our case, we need to inject the path_alias.manager service. Once generated, the new command class will live here under your custom module:

/src/Drush/Commands

Let’s take a look at this newly generated code. At the top of the file, we see the standard class structure and our dependency injection, ending with a create() method along these lines (abbreviated here):

  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('token'),
      $container->get('path_alias.manager'),
    );
  }

Note: The generator adds a comment about needing a drush.services.yml file. This requirement is deprecated and will be removed in Drush 13, so you can ignore it if you are using Drush 12. In our testing, this file does not need to be present.

Further down in the new class, we will see some boilerplate example code. This is where the magic happens:

/**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:command-name', aliases: ['foo'])]
  #[CLI\Argument(name: 'arg1', description: 'Argument description.')]
  #[CLI\Option(name: 'option-name', description: 'Option description')]
  #[CLI\Usage(name: 'custom_drush:command-name foo', description: 'Usage description')]
  public function commandName($arg1, $options = ['option-name' => 'default']) {
    $this->logger()->success(dt('Achievement unlocked.'));
  }

This new Drush command doesn’t do very much at the moment, but it provides a great jumping-off point. The first thing to note at the top of the function are the new PHP 8 attributes that begin with #[. These replace the PHP annotations commonly seen when writing custom plugins in Drupal. You can read more about the new PHP attributes in the PHP documentation.

The different attributes tell Drush what our custom command name is, description, what arguments it will take (if any), and any aliases it may have.

Step 3: Create our custom command

For our custom command, let’s modify the code so we can get the internal path from a path alias:

/**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:internal-path', aliases: ['intpath'])]
  #[CLI\Argument(name: 'pathAlias', description: 'The path alias, must begin with /')]
  #[CLI\Usage(name: 'custom_drush:internal-path /path-alias', description: 'Supply the path alias and the internal path will be retrieved.')]
  public function getInternalPath($pathAlias) {
    if (!str_starts_with($pathAlias, "/")) {
      $this->logger()->error(dt('The alias must start with a /'));
    }
    else {
      $path = $this->pathAliasManager->getPathByAlias($pathAlias);
      if ($path == $pathAlias) {
        $this->logger()->error(dt('There was no internal path found that uses that alias.'));
      }
      else {
        $this->output()->writeln($path);
      }

    }
    //$this->logger()->success(dt('Achievement unlocked.'));
  }

What we’re doing here is changing the name of the command so it can be called like so:

drush custom_drush:internal-path or via the alias: drush intpath

The path alias is a required argument (such as /my-amazing-page) because of how it is declared in the getInternalPath() method. When you pass a path, the method first checks whether it starts with /. If it does, it performs an additional check to see if an internal path exists for that alias. If so, it prints the internal path, i.e., /node/1234. Errors are reported through the logger() method that comes from the inherited DrushCommands class, while the path itself is written with output()->writeln(). It’s a simple command, but one that helped us automatically set config during a CI job.
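For example, assuming a node with that alias exists, a run would look something like this (output illustrative):

drush custom_drush:internal-path /my-amazing-page
/node/1234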

Table output

Note the boilerplate code also generated another example below the first — one that will provide output in a table format:

/**
   * An example of the table output format.
   */
  #[CLI\Command(name: 'custom_drush:token', aliases: ['token'])]
  #[CLI\FieldLabels(labels: [
    'group' => 'Group',
    'token' => 'Token',
    'name' => 'Name'
  ])]
  #[CLI\DefaultTableFields(fields: ['group', 'token', 'name'])]
  #[CLI\FilterDefaultField(field: 'name')]
  public function token($options = ['format' => 'table']): RowsOfFields {
    // Initialize the array so the command works even when no tokens exist.
    $rows = [];
    $all = $this->token->getInfo();
    foreach ($all['tokens'] as $group => $tokens) {
      foreach ($tokens as $key => $token) {
        $rows[] = [
          'group' => $group,
          'token' => $key,
          'name' => $token['name'],
        ];
      }
    }
    return new RowsOfFields($rows);
  }

In this example, no argument is required, and it will simply print out the list of tokens in a nice table:

------------ ------------------ ----------------------- 
  Group        Token              Name                   
------------ ------------------ ----------------------- 
  file         fid                File ID                
  node         nid                Content ID
  site         name               Name
  ...          ...                ...

Final thoughts

Drush is a powerful tool, and like many parts of Drupal, it’s expandable to meet different needs. While I shared a relatively simple example to solve a small challenge, the possibilities are open to retrieve all kinds of information from your Drupal site to use in scripting, CI/CD jobs, reporting, and more. And by using the drush generate command, creating these custom solutions is easy, follows best practices, and helps keep code consistent.



Jan 26 2024

The Web Chefs

At Four Kitchens we keep several lists of “Hot Topics” to share our learnings across the dozens of sites that we care for. Are you upgrading a Drupal site to CKEditor5? We’ve tidied up one of these internal wiki documents into this set of general upgrade guidelines that might pertain to your website.

Rough steps to upgrade

The level of effort needed for this upgrade will be different for each site. It may take some time to figure out. CKEditor 5 is available in Drupal 9.5 and beyond. You can try switching/upgrading on a local site or multidev and assess the situation.

First, create a list of CKEditor enhancement modules on the site and check if they are Drupal 10 ready (the reports from Upgrade Status and this Drupal.org page may help). Common modules to look for include Linkit, Anchor Link, Advanced Link, IMCE, Entity Embed, Video Embed Field, Footnotes, and anything with the word “editor” in the title.

As a best practice, you should test both creating new content and editing existing content in several places. This will help make sure that some lesser-used HTML isn’t treated differently in the new CKEditor. Run visual regression tests (if available).

You may need to point out key interface changes to your clients or stakeholders (e.g., contextual windows for links/media/tables instead of modals, etc.). While it is a bit of a change, it’s overall an improved user experience, especially for new people who are coming in cold.

Anchor links

Anchor link gives editors the ability to create links to different sections within a page.

For “better integration with Drupal 10, CKEditor 5, and LinkIt,” there is a 3.0.0@alpha version. If your project isn’t using wikimedia/composer-merge-plugin, you must require the northernco/ckeditor5-anchor-drupal package and add the following to the repositories section of composer.json:

{
  "type": "package",
  "package": {
    "name": "northernco/ckeditor5-anchor-drupal",
    "version": "0.3.0",
    "type": "drupal-library",
    "dist": {
      "url": "https://registry.npmjs.org/@northernco/ckeditor5-anchor-drupal/-/ckeditor5-anchor-drupal-0.3.0.tgz",
      "type": "tar"
    }
  }
}
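With that repository entry in place, requiring both packages might look like this (version constraints are illustrative, based on the versions mentioned above):

composer require northernco/ckeditor5-anchor-drupal:0.3.0 drupal/anchor_link:^3.0@alpha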


Embedded media

Depending on the age of your site, it might be using one of several techniques to embed media into the WYSIWYG.

If your site is using the video_embed_field module (most sites are probably using Drupal core’s media module), there is a patch that adds support for CKE5. Insert Image works slightly differently (though this is probably not the case if your site uses core’s media module). It’s worth considering whether there is a way to enhance this for user experience, if necessary.

If your site uses custom Entity Embed for media, consider switching to the core media library. It may provide a better administrative user experience in some cases.

The insert image button in CKEditor functions a little differently than it used to. Rather than bringing up a modal with fields to upload an image like the image below:

Insert image button in CKEditor5

It now immediately pulls up your computer’s file system for you to search for images like so:

Filesystem image search in CKEditor5

After adding your image, the alt tag box prompts you underneath the image:

CKEditor5 alt tag prompt

After submitting your alt tag, you can adjust alignment and sizing:

CKEditor5 image sizing

Moving general styles to link styles

It was common in CKEditor4 to use its “Styles” feature to provide a way to add variations of links (to make them look like buttons, or to add icons).

There are a few UX problems with that approach. Either the styles are set to apply to any element, which means they can be applied to non-links, or the styles are set to apply only to link elements, which means they are mysteriously grayed out most of the time (until you select a link). Either way, it’s not intuitive how to apply a link style. In CKEditor5, we can switch to using the Link Styles module.

Change in Styles dropdown behavior

In CKEditor4, when integrated with Drupal, the Styles dropdown only allowed applying one style to an element (e.g., “external link”). If you tried to apply a different style, such as “locked link,” the previous style would be removed.

The Drupal implementation of CKEditor5 allows for multiple styles to be applied to elements via the Styles dropdown. This change may be unexpected for some, and could result in elements that look broken, such as when a link has both the “external link” and “locked link” styles.

CKEditor5 introduced a new API for adding theme-specific styles. The new architecture might cause the CKEditor5 theme to bleed into the admin theme. To know how to deal with these issues, review the new API for adding theme-specific styles in CKEditor5.

You’ll likely run into an issue with styles bleeding outside of the editor, so see the section on preventing style bleed later on this page.

Cut and paste

Paste-from-Word, paste-from-Google-Docs, etc. is now built into CKEditor5 (at least for 90% of use cases). There’s a paid plugin for more esoteric needs.

There is no paste-as-plain-text plugin for CKEditor5. You can use Ctrl-Shift-V (or Cmd-Shift-V) to paste as plain text. If you want to get rid of all formatting (including bold, links, etc.) in existing text, you can highlight the text, use Ctrl-C to copy, then Ctrl-Shift-V to paste it back as plain text.

Many of our automated Behat tests broke after the update because there were multiple structural changes, so this is how we solved it. The CKEditor documentation explains how to get the editor instance, in case you want to know more about it (the URL appears in the code comment below). This is how we rewrote our custom step to fill out CKEditor during testing. (We found the code in a blog post.)

/**
   * A step to help fill in a ckeditor wysiwyg.
   *
   * @param string $locator
   *   The css locator of the field that ckeditor has taken over.
   * @param string $value
   *   The html value you wish to fill in to ckeditor.
   *
   * @Then I fill in wysiwyg on field :locator with :value
   */
  public function iFillInWysiwygOnFieldWith($locator, $value) {

    // https://ckeditor.com/docs/ckeditor5/latest/support/faq.html#how-to-get-the-editor-instance-object-from-the-dom-element
    $ckeditor5_drupal_editable_element = "div.form-item-$locator .ck-editor__editable";

    $this->getSession()
      ->executeScript(
        "
        var domEditableElement = document.querySelector(\"$ckeditor5_drupal_editable_element\");
        if (domEditableElement.ckeditorInstance) {
          const editorInstance = domEditableElement.ckeditorInstance;
          if (editorInstance) {
            editorInstance.setData(\"$value\");
          } else {
            throw new Exception('Could not get the editor instance!');
          }
        } else {
          throw new Exception('Could not find the element!');
        }
        "
      );
  }

And the Mink step for a regular field:

And I fill in wysiwyg on field "field-summary-0-value" with "Some Teaser Text"

And for a field inside a paragraph:

And I fill in wysiwyg on field "field-sidebar-content-0-subform-field-simple-text-0-value" with "Behat Side Nav Body Text"

Preventing custom styles from bleeding into admin theme with CKEditor5

See the new API documentation about implementing theme styles in the new way. This may require some adjustments on your end.

One of the major changes with CKEditor5 is that it pulls WYSIWYG styles onto the whole page when there is a WYSIWYG on the page. In CKEditor4, styles were only pulled into the CKEditor iframe. This can be extremely frustrating when the admin theme looks odd or different on pages that contain a WYSIWYG.

Limit the number of stylesheets being pulled into the WYSIWYG. (First, note that this method has only been confirmed to work on newer versions of Sous using specific webpack settings. If you are having problems with it, make sure your webpack settings allow for multiple manifests to be generated. You may need to refer to a newer site to see how it is configured.)

The first step is to create a new stylesheet (a manifest) called wysiwyg.scss in the same directory as your styles.scss file, which assembles all the stylesheets used in your theme. For this stylesheet, we’ll only want to include the stylesheets that our WYSIWYG needs. For example, I have one that looks like this:

@import url('https://fonts.googleapis.com/css2?family=Poppins:ital,wght@0,400;0,700;1,400;1,700&display=swap');
@import '~normalize.css/normalize';
@import '~breakpoint-sass/stylesheets/breakpoint';

// Components
@import '00-base/**/*.scss';
// Include all atoms except form items.
@import '01-atoms/00-links/**/*.scss';
@import '01-atoms/01-text/**/*.scss';
@import '01-atoms/02-lists/*.scss';
@import '01-atoms/tables/*.scss';
@import '01-atoms/images/**/*.scss';
@import '05-pages/colors.scss';
@import '05-pages/base.scss';

In this example, we are pulling in a couple of needed files from node_modules (normalize and breakpoint), then any .scss files from base, and then select files from atoms (links, text, lists, tables, and images).

Compile and make sure that it has created the new files at /dist/css/wysiwyg.css. If you get any errors, you may need to include another file that has a variable you need, or something along those lines.

1.) Update your .info file. In your theme’s .info file, set CKEditor5 to use your new stylesheet:

ckeditor5-stylesheets:
  - dist/css/wysiwyg.css

2.) Review the WYSIWYG. Visit a page with a WYSIWYG on the page, and verify that the limited styles are loading properly within the WYSIWYG. Try all the dropdowns and buttons that are included in the WYSIWYG settings. If anything appears unthemed, review your styles to see if there’s a stylesheet missing from your manifest.

3.) Review the rest of the page. Now review the page around the WYSIWYG and note how it differs from other pages that do not have a WYSIWYG. Common differences to look for are heading styles, text styles, and buttons — basically anything that you included in your manifest.

4.) Limit styles

  • Find the page’s body class for node edit pages (in our test case, .gin--edit-form). It may depend on your admin theme.
  • Find the wrapper class for the WYSIWYG. Most likely the best choice is .ck-content. Our approach will be to hide styles from .gin--edit-form, but then add them to .ck-content.

For example:

body {
  background-color: clr(background);
  color: clr(text);

  @include body-copy;
}

becomes:

body:not(.gin--edit-form),
.ck-content {
  background-color: clr(background);
  color: clr(text);

  @include body-copy;
}

And for buttons:

.main .button {
  @include button-base;
  @include button-color-primary;
  @include button-medium;
}

it becomes:

body:not(.gin--edit-form) .button,
.main .button,
.ck-content a.button {
  @include button-base;
  @include button-color-primary;
  @include button-medium;
}

With any luck, the styles use mixins, which makes it easy to control where the styles apply. In some cases, overriding styles may become difficult because of the order in which the stylesheets are loaded. Try to avoid !important and instead use something like an additional element or class to firm up your override.

One issue that may come up is that these overrides end up overriding things in your custom theme, depending on how they are defined. In that case, don’t wrap the styles in the body classes, but rather undo the custom theme’s styles on the admin page items manually. Luckily, since we’re narrowly applying custom styles, only things used in the WYSIWYG will need to be addressed.

For instance:

// Apply general link styles to all links.
a {
  @include link;
}

// Overrides for Admin pages containing CKEditor (you will need a body class only on admin pages).
.user-logged-in {
  a {
    background-image: none;
    transition: none;
  }

  .horizontal-tabs-list a,
  .toolbar a {
    font-weight: normal;
  }
}

// Reapply link styles to links within the WYSIWYG
.ck-editor a {
  @include link;
}

Continue to review your page and adjust it until it no longer differs from other admin pages.

Editor explodes out of its container in deeper paragraphs

This issue seems to occur only with rich text fields within a paragraph. It might be limited to the Gin theme.

This issue might be caused by the container’s width. If input fields inside the container have a specified size exceeding the screen width, the editor can inherit the container’s width and extend beyond the screen. This is tracked as a Drupal core/CKEditor5 bug on Drupal.org: CKEditor5 toolbar items of multivalue field (typically Paragraphs) overflowing on narrow viewports and overlapping with node form’s sidebar on wide viewports.

To resolve this quickly, set the input fields to 100% width, making sure everything works seamlessly. Be sure to include this in a stylesheet of your admin theme.

.node-form input[size] {
  width: 100%;
}

We can also modify the ‘flex-wrap’ property of the CKEditor buttons to make sure they stay within the container’s width:

.ck-editor .ck.ck-toolbar.ck-toolbar_grouping > .ck-toolbar__items {
    flex-wrap: wrap;
}



Amanda Luker

Web Chef Emeritus

Amanda is responsible for translating visual designs and applying them to Drupal sites.

January 5, 2024

Drupal provides a system for site administrators to add their own images and have them appear uniformly on the website. This system is called Image Styles. This tool can resize, crop, and scale images to fit any aspect ratio required by a design.

When creating responsive websites, a single image style for each image variation is insufficient. Each image, such as a hero image, a card image, a WYSIWYG image, or a banner image, requires multiple versions of one image. This ensures that the website delivers only what visitors need based on their screen size. For instance, a mobile user may only require a 320-pixel-wide image, while a large desktop user may want an 1,800-pixel-wide image (doubled for double-pixel density). For this reason, Drupal has Responsive Image Styles, which will group your images into a set of styles that will each show under different conditions.

Practical approach to convert images from design to Drupal

  • Determine your image’s aspect ratio. If you find that the images in the design are not in a common aspect ratio (like 1:1, 2:1, 4:3, or 16:9) or if they vary by a little bit, consider running the dimensions through a tool that will find the closest reasonable aspect ratio.
  • Determine the smallest and largest image sizes. For example, for a 16:9 aspect ratio, the smallest size might be 320 pixels x 180 pixels, while the largest could be 3,200 pixels x 1,800 pixels (doubled for high-density screens).
  • To generate all variations, you can use an AI tool to print out the image dimensions with 160-pixel increments between each size. Increments of 160 pixels tend to hit a lot of common breakpoints. Here’s an example of what that produces (we used GitHub Copilot; see below):
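For a 16:9 ratio, the generated list might look something like this (an illustrative reconstruction, since the original screenshot isn’t reproduced here):

320 x 180
480 x 270
640 x 360
800 x 450
960 x 540
1120 x 630
1280 x 720
1440 x 810
1600 x 900
1760 x 990
1920 x 1080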

There are likely more ways to streamline this process with Copilot. I’ve also used ChatGPT to rewrite them using a prefix, making it easy to add them in Drupal like this:

Drupal image styles

If adding all of these steps seems like a lot of work, consider using the Easy Responsive Images module! This module can create image styles for you, allowing you to set your aspect ratios and the increment between each style.

Once you have all your styles in place, create your responsive image styles by following these steps:

  • Choose a name for your responsive image style based on its usage
  • Select the “responsive image” breakpoint group
  • Usually, I choose to select multiple image styles and use the sizes attribute to craft my size rules. Note that browsers use the first media condition that matches, so list the largest breakpoints first. For example:

(min-width:1200px) 30vw, (min-width:960px) 50vw, 100vw

In this example, for viewports narrower than 960 pixels, the browser picks the image that best fits the full width of the viewport. From 960 pixels, it picks the image that best fills half of the viewport width, and from 1,200 pixels, 30% of it. This approach is nimble and allows the browser to choose the most appropriate image for each case.

After setting the size rules, choose all of the image styles that you want the browser to be able to use. You don’t have to use them all. In some cases, you might have two responsive image styles that are pulling from the same aspect ratio image styles, but one uses all of them and the other uses a subset of them.

Drupal image sizing

After adding your responsive image style, you need to map your Media View Mode:

  1. Go to https://[your-site.local]/admin/structure/display-modes/view/add/media
  2. Add the media view mode as a new Display for Images: https://[your-site.local]/admin/structure/media/manage/image/display
  3. Choose “Responsive image” as the Format and select your new responsive image style

Drupal responsive image manage display

Once you have set this up, you are ready to use the View Mode to display the image field for your entity.

Drupal article with image

In this example, all the images have the same breakpoint. There may be times when you need different aspect ratios at different breakpoints. In those cases, you may want to use your custom theme’s Breakpoint Group. This will allow you to manually select the image style for each breakpoint (instead of letting Drupal choose it for you).

Jan 03 2024

Over the past 12 months, our teams have completed numerous Drupal upgrades. We would like to share our experiences and knowledge with anyone who has yet to undergo this process to help make it smoother for you.


Background

From Drupal 8, upgrades became a lot easier with several tools to semi-automate the process.

In addition, if you, like CTI Digital, have a number of sites to upgrade and different developers working on each of them, you could easily duplicate effort as many contributed modules are used in most sites - e.g. Webforms and Paragraphs, just to name two. Therefore, this article also contains the approach that CTI Digital took in managing the latest upgrades to Drupal 10 of around 50 sites (some still on Drupal 8) to ensure knowledge transfer within the team.

This article looks at the tools available and the lessons we learned.

Why is it easier since Drupal 8?

The approach taken by the Drupal core team with Drupal 8+ has made things a lot easier. That approach is to deprecate hook functions, classes, class methods, etc., while introducing their replacements. A list of deprecated features that will be removed in the next major version is then produced and frozen at a specified minor version of the current major version (8.9 for D8 and 9.5 for D9). Drupal's main dependency, Symfony, takes the same approach.
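As a well-known example of this pattern (our illustration, not from the original article): file_create_url() was deprecated in Drupal 9.3, with its replacement introduced alongside it, and was removed in Drupal 10:

// Deprecated in Drupal 9.3, removed in Drupal 10.
$url = file_create_url($uri);

// The replacement, available before the old function was removed.
$url = \Drupal::service('file_url_generator')->generateAbsoluteString($uri);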

This approach clarifies what is deprecated in Integrated Development Environments (IDEs) and allows various tools to semi-automate the process. It enables them to find deprecated code, advise on what changes are needed, and even make many of them. Finally, the Drupal CI systems automatically run tools to spot deprecated code in contributed modules and produce patches, which are then attached to a new issue created for each module. These can then be tested, modified and approved by the community.

Tools

So, what are these tools? There are command line tools for finding and making changes, but there are also contributed modules (often built on those command line tools). The three tools we at CTI Digital found useful are:

  • Upgrade Status - a contrib module that reports everything that needs to change

  • Drupal Rector - a command line tool that can make many of the changes automatically

  • PHP Codesniffer - a tool to find code issues by rule; in this case, PHP 8.1 compatibility


Upgrade Status

This is the first one that should be installed on a development copy of each site to identify the work that needs to be done. It is a contrib module that can be installed with composer:

composer require drupal/upgrade_status
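Then enable it so that the report is available (for example, with Drush):

drush en upgrade_status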

This tool provides a report through your site's UI (Reports > Upgrade status, at admin/reports/upgrade-status). This report details everything you need to do - down to the lines of code that need changing per module or theme. This detail includes what to change each line to and whether Rector - see the next section - can make the change automatically.

It will tell you the following: 

  1. Environmental factors - such as PHP, database and Drush versions required. For Drupal 10 it will also include a list of invalid permissions and deprecated core modules that you have installed

  2. Modules/themes that are not installed

  3. Contributed modules/themes that have an upgrade available (including if that is compatible)

  4. Contributed modules/themes that have nothing available (with a link to an issue queue search for issues containing the text ‘Drupal 10’)

  5. Modules/themes that have changes that could be made by Rector

  6. Modules/themes that have changes that need to be fixed manually

Each module/theme can be scanned for detailed deprecations, which will also categorise each as one that can be fixed with Rector or one that needs to be done manually. This can be run where there is not a compatible version or patch already available.

For all contributed modules/themes, you have the following options:

  • Upgrade to a more recent version

  • Apply a readily available patch on its issue queue

  • Upgrade and patch

  • Write a patch from scratch (using Rector where possible)

  • Remove an unused module/theme from the code

  • Replace the module/theme

    It should be noted that removing or replacing a module/theme that is currently installed requires two deployments. The first uninstalls the module/theme, and the second removes it from the code base. If you do both in one go, you will get issues with Drush commands in the deployment.

Drupal Rector

This command line tool can automatically make many of the necessary changes. It can be installed in a project with:

composer require palantirnet/drupal-rector --dev

Then, a file called rector.php needs to be copied from the vendor/palantirnet/drupal-rector folder to the Drupal root directory (web in the examples here).

You can then run this tool on any module/theme with:

vendor/bin/rector process web/[modules or themes]/[SUB_FOLDER]/[YOUR_MODULE] --dry-run

This will find what needs to be changed, and if you are happy with the changes, removing the --dry-run option will, of course, allow it to do its thing.

PHP Codesniffer

This command line tool is not specific to Drupal core upgrades. It looks for particular code patterns, and you can install add-ons to look for anything, including PHP versions. Since Drupal upgrades often include PHP upgrades, it is at least worth running this tool on all custom code.

In order to test for PHP 8.1 (required for Drupal 10), the following will install the tools you need:

composer require --dev phpcsstandards/phpcsutils:"^1.0@dev"

composer require --dev phpcompatibility/php-compatibility:dev-develop

You can confirm that this worked by running:

vendor/bin/phpcs -i

This command will list what sniffs are installed and will include PHPCompatibility.

Then the following can be run to test all your custom modules:

vendor/bin/phpcs -p web/modules/custom --standard=PHPCompatibility --runtime-set testVersion 8.1 --extensions=php,module,inc,install,theme

This will test for PHP 8.1 compatibility, specifically in all PHP code files. You can do the same for any custom themes.


Deprecations not spotted by the tools

There are some deprecations that the tools do not spot: services, libraries and hook functions. In the case of services, these are the calls to \Drupal::service(''). If this call is assigned to a variable and that variable is annotated with the relevant class (/* @var */), then that class will be picked up if deprecated. Likewise, if you inject a service into a class, the service's related class will be picked up.

The only solution we found was to create a text file with one service, library or hook function per line and use grep to search the custom code:

grep -r -f [textfile] web/modules/custom

And the same for custom themes.
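For illustration, the text file might contain entries like these (examples of items removed in Drupal 10; not from the original article):

core/jquery.once
classy/base
hook_field_widget_form_alter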

The upgrade steps

Apart from upgrading and patching contrib modules and themes, fixing custom modules and themes, and removing unused code, there are two extra steps for D9 to D10 upgrades.
The first is to fix an old issue of invalid permissions. In D8, modules/themes were not required to delete permission allocations from user roles when they deleted a permission they had created. One of the minor versions of D9 firmed this up, and a core database update removed any such orphans. However, this did not always work, and the Upgrade Status report lists the user roles with non-existent permissions, which must be deleted manually from the configuration yml files.

The second is the core modules and themes that have been deprecated and are removed in D10. The Upgrade Status report will list the deprecated core modules and themes the site uses. All of these have a contributed version that can be added to the project if needed, and detailed recommendations are also available.

The final upgrade to core

This should simply be a case of running the Composer command once everything else is done:

composer require drupal/core-* --with-all-dependencies

This presupposes that the site is built with the standard drupal/core-recommended and related packages.

However, you will often find that Composer finds some requirements clashing with D10 or its dependencies.

The main reason is that a module/theme is patched for D10, including the change to the info.yml file's core_version_requirement and the composer.json file. Composer will have an issue with this because it uses the files in the repository to determine compatibility, not the patched file changes. However, there is a solution with 'lenient'. This Composer add-on allows you to ask Composer to be lenient on the version constraints of individual packages.

The command to add the lenient package to the site is:

composer require mglaman/composer-drupal-lenient

The command to add a list of modules and themes to the allowed list is:

composer config --merge --json extra.drupal-lenient.allowed-list '["drupal/YOUR_MODULE1", "drupal/YOUR_MODULE2" …]'

In addition, some projects have drupal-composer/drupal-scaffold as a dependency. This is deprecated, and you will get a notice about it; it is to be replaced with drupal/core-composer-scaffold. Finally, drupal/console is incompatible with D10 and needs to be removed.

Managing upgrades for a large number of sites

As mentioned at the top of this article, there could be a lot of duplicate effort and different approaches taken if steps are not taken to organise the upgrade of a large number of sites.

The solution we at CTI Digital came up with is simply maintaining a spreadsheet of information about contributed modules/themes. This would contain recommendations (version, available patches) and any complications or special instructions associated with that module or theme's upgrade, etc. It is maintained as more sites are audited and then upgraded with findings as we go to assist later sites. This can also be applied to any custom modules/themes that are shared with more than one site.

As each site is audited for the effort required, the spreadsheet is referred to and added to so that effort is not duplicated and the spreadsheet is kept up to date with findings from each site. Also, any approaches that should be followed on all sites are kept here - e.g. we replaced all sites using Swiftmailer with Symfony Mailer for mail management.

The other primary approach was to consolidate the effort of patching contributed modules/themes that are not ready. CTI Digital's work to improve a contributed module/theme is always contributed back to drupal.org. A link to the new patch is stored in the spreadsheet for the next site that uses that module/theme.


What about Drupal 8 to Drupal 10

CTI Digital still had a few sites on Drupal 8, which needed to be upgraded all the way to Drupal 10 rather than in two separate upgrade projects.

There are factors that affect this:

  • Upgrade status will only give what needs to be done to get to D9

  • Drush does not cope with a deployment that goes from 8 to 10 in one go

  • A significant number of contributed modules/themes have versions that are 8 and 9 compatible and versions that are 9 and 10 compatible, rarely one version compatible with 8, 9 and 10

  • A few contributed modules/themes have the newest version drop support of D9

    This means you have to do an 8-9 audit (with the dev copy on D8.9 minimum), upgrade the dev copy to D9 and then do a new audit from 9-10. You will be able to assess the complete upgrade requirement for all the modules/themes identified by the first audit, but the second audit will find modules/themes currently compatible with D9 but not D10 that are missed by the first. The second audit will also supply invalid permissions and core modules/themes that are removed from D10.

    You must also upgrade as two deployments minimum (one to D9 and the second to D10). Some modules/themes will be upgraded in the D8 site before upgrading to D9, and some in the D9 site before upgrading to D10. Some will have to be upgraded twice. Some will need to be upgraded when the site is on D10 (occasionally at the same time as the core upgrade). It will depend upon the module's upgrade path and the version the D8 site is on.

What is next?

You may be wondering if anything can make things easier for D11. Although the overall effort cannot be reduced, it can be spread out.

If you regularly upgrade contributed modules and themes to the most recent version, you will reduce the work when it's time to upgrade to D11.

If you also run regular Upgrade Status reports on your D10 site, you can create work dockets for changes to your custom code. It can also be used to identify contributed modules and themes that you use that have not yet been upgraded. You could create work dockets for replacing these modules or contributing patches to upgrade them to D11. These can be spread out between now and the need to upgrade to D11.

The ultimate goal of these approaches is that you only have to upgrade the core when it is time to upgrade to D11.

To get ahead, speak to one of our Drupal experts and book a 30-minute session today to discuss a migration or upgrade for your business. 

Dec 21 2023

When we were implementing a customer relationship management system for a local client using Drupal, we faced this challenge:

  • The client wanted to manage records of his patients, with many treatment images captured by his iPhone
  • Each iPhone image can be large, 5 to 10 MB depending on resolution
  • We couldn't keep that many large images on the web server, or its disk space would soon run out

So we had to find a way to resize and compress images on the client side before uploading, so they wouldn't burden the web server.

After googling around, we found a tip on a thread mentioning DropzoneJS. We tried it and successfully reduced image sizes to 10% of the original (from 3 MB to 300 KB) while maintaining image quality.

In this tutorial, we will show you how to configure DropzoneJS in Drupal 10 to resize and compress images on the client side before uploading.

1. Install DropzoneJS and Entity Browser:

Please follow the instructions to install the DropzoneJS and Entity Browser modules.

Download the DropzoneJS and Exif-JS libraries and place them in the /libraries folder, so the files can be accessed as:
/libraries/dropzone/dist/min/dropzone.min.js
/libraries/dropzone/dist/min/dropzone.min.css
/libraries/exif-js/exif.js

Enable the DropzoneJS module and the Dropzone entity browser widget submodule; this will also enable the Entity Browser module, as below:

Enable Dropzone modules
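From the command line, the equivalent steps might look like this (module machine names assumed from the project pages):

composer require drupal/dropzonejs drupal/entity_browser
drush en dropzonejs dropzonejs_eb_widget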

2. Create a Dropzone browser:

Please browse to Admin - Configuration - Content authoring - Entity Browsers and add a new entity browser. Name it Dropzone, for example, and leave the default settings.

Add Dropzone browser

Name Dropzone browser

On the next screen "Widget settings", please select Dropzone.

Select Dropzone browser

On the form below, please check the option "Use client side resizing" (note: it won't be checkable if you didn't install the Exif-JS library in step 1). Then you will be able to select the max width, max height, and resize quality. In our case, we put 1920px for the width and height, and an image resize quality of 0.8.

Configure Dropzone browser

Save it.

3. Configure file upload widget

On your content type with image fields, please choose Manage form display, and set the widget to Entity browser

Set Manage form display to Entity Browser

4. Test the upload:

Now edit a node with image fields, the DropzoneJs browser is now displayed:

Dropzone browser display

We uploaded a test image: a full-size image from our mobile camera, 3 MB.

Upload an original image

After it uploaded successfully, we checked the file on our server: it had been reduced to 300 KB, only 10% of the original size.

Image compressed

It is done. Now you can use DropzoneJS to resize and compress images on the client side before uploading, which is great for your web server.


Laura Johnson

Senior Engineer

Primarily a backend developer, Laura also loves adding new cross-disciplinary skills to her collection, such as working with themes and frontend frameworks.

December 6, 2023

If your organization is still using Drupal 7, migrating to an up-to-date platform for your website has been looming like a weight on your shoulders. The move to Drupal 10 isn’t easy. It requires a migration of your site and a full redesign to take advantage of the new tools the latest version offers.

Not only do you need someone to write that migration, but you also need to secure the budget to undertake a project like this. As you wait for the right time to get started, the weight of the deadline to begin your migration to Drupal 10 has only grown heavier. After multiple extensions, the Drupal community has set January 5, 2025 as the final end-of-life date for Drupal 7.

What does that mean for your organization? On the one hand, you now have just over a year to start planning a migration before your site loses crucial support. But on the other hand, as many organizations like yours face a similar deadline, you can’t afford to wait much longer. The time to make the move to Drupal 10 is now.

Why you need to start planning for a Drupal 10 migration

If you’ve fallen behind in migrating your site from Drupal 7, you’re not alone. According to the Drupal community, more than 350,000 projects still use that version of the platform as of November 2023 — one-quarter of all Drupal sites.

As a result, you aren’t just facing a hard deadline to relaunch your new site as January 2025 grows closer. You’re also competing with a vast number of organizations just like yours who need to coordinate the same migration with a web development agency partner. Given that it takes an average of six months to complete the sales process to get started on a Drupal 7 migration, you’re already at risk of missing the deadline if you have not yet contacted an agency.

The longer you wait, the less likely you are to find a team with availability to work with you on a migration plan and website redesign before Drupal 7 reaches end-of-life. And, given the stakes involved, your organization can’t afford the risks of sticking on a platform without the vital benefits of ongoing support.

What your organization loses when Drupal 7 reaches end-of-life

Drupal 7 will reach its end of life 14 years after its initial release. If you’re still on the platform, your website will remain accessible after January 5, 2025. However, it will no longer receive feature updates, bug fixes, or security releases from the Drupal community.

This last detail is most critical to your organization. Any security issues discovered after January 2025 may be publicly disclosed, but Drupal will no longer provide any necessary updates. Prior to the announcement of this final extension for Drupal 7, your organization had the option of paying for extended support. But that is no longer the case.

When you work with the right agency partner, you can create a migration plan that will keep your website secure. Fortunately, your organization will be able to better manage ‌site security after the migration is complete. But that’s just one of the advantages made possible by getting your organization started with Drupal 10.

Drupal 10 offers dramatic advantages after migration

Trusting your site with the legacy code of Drupal 7 doesn’t just expose your organization to poor security. It prevents you from taking advantage of dramatic improvements for your site’s users and content editors.

Improved website speed and SEO performance

Fundamentally, your Drupal 10 website will run faster. Dynamic caching reduces page load times by invalidating only the content that has changed. Instead of needing to reload your entire page after a set amount of time, your cache can just reload the block with new information.

Drupal 10 also marks the end of Drupal 7’s jQuery. A large JavaScript library, jQuery was a powerful tool, but modern browsers perform many of the same functions. The up-to-date JavaScript used by Drupal 10 also decreases page load times.

Drupal 10 also supports new formats such as schema.org, Open Graph, and JSON-LD, which increase conversions from search engines. Plus, Drupal 10 supports advanced accessibility features that improve WCAG compliance and further improve SEO rankings.

Better site security and reduced maintenance costs

Drupal 10 improves your site security by including up-to-date protocols and dependencies such as PHP 8, Symfony 6, and CKEditor 5. As earlier versions of these dependencies reach end-of-life, they may be exposed to unpatched security vulnerabilities. Migrating to Drupal 10 avoids delays in getting critical security patches applied to your site.

One of Drupal’s major advantages as an open-source platform is the community’s Security Team, which delivers security advisories and provides guidance to contributed module maintainers on how to resolve potential vulnerabilities. Migrating ensures that all of your site’s contributed modules keep receiving that support beyond the upgrade deadline.

Improved content editing experience and efficiency

Drupal’s out-of-the-box CMS experience has always been limited. With Drupal 10, your site editors benefit from the Claro theme, which makes Drupal much easier to use. New image tools and an updated media library also enable better organization of your site’s assets.

Drupal 10 also includes the JavaScript text editor CKEditor 5, which further simplifies content creation and its accessibility. In addition, the platform offers enhanced translation capabilities in multiple languages, which enables your organization to reach a wider audience than ever.

Don’t wait until an emergency before moving to Drupal 10


Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

November 27, 2023

At the time of this blog, we have done two major version upgrades of Drupal and have refined the process along the way. There has been a lot of work in the community, through the efforts of people like Matt Glaman to make this process easier.

As a Support Engineer, I see a lot of approaches for achieving the same results in many areas of my work. Here, I’d like to share with you three different ways to achieve an upgrade of a module or theme that isn’t ready for the next major Drupal version, each with pros and cons, but all absolutely acceptable.

Why do we have this problem?

All new Drupal developers have a hard time with the layers of code changes that happen in the Drupal community. We have custom package types, custom install locations, patches, and scaffolding. To make the challenges worse, we have two ways to identify a module’s dependencies — that being a .info.yml file and for some, a composer.json. This is because some Drupal modules may want to build upon an existing PHP library or project, in addition to other Drupal modules. To ease the pain of having to define some dependencies twice, both in the .info.yml file and composer.json file, Drupal.org built their packagist, a repository of Composer packages, to read the .info.yml files from the root of the project and create Composer version constraints from that. For example, if the .info file contained the following:

name: My Module
type: module
core_version_requirement: ^8.8 || ^9
dependencies:
  - ctools:ctools

Then Drupal.org’s packagist would create the following for the release that contained that .info.yml file, saving the contributed developer a lot of trouble.

{
    "type": "drupal-module",
    "name": "drupal/my_module",
    "require": {
      "drupal/core": "^8.8 || ^9",
      "drupal/ctools": "*"
    }
  }

I hit on something there, though. It will create that for the release the .info.yml was in. When most code changes come in the form of patches, this poses a challenge. You apply your patch to the .info.yml after you download the release from Drupal.org’s packagist. Additionally, Drupal.org doesn’t create a new release entry for every patch file in the issue queue. So you are left with the question, “How do I install a module on Drupal 10 that requires Drupal 9 so that I can patch it to make it compatible for Drupal 10?”

Drupal Lenient

One of the easiest methods for those who don’t understand the ins and outs of Composer is to use the Drupal Lenient plugin. It takes a lot of the manual work out of defining new packages and works with any drupal-* typed library. Types are introduced to us through the use of the Composer Installer plugin and manipulated further with something like Composer Installers Extender. Composer plugins can be quite powerful, but they ultimately add a layer of complexity to any project over using core composer tactics.

Drupal Lenient works by taking any defined package pulled in by any means via Composer and replacing its drupal/core version constraint with, at the time of this writing, "^8 || ^9 || ^10". So where the requirement might look like the earlier example, "drupal/core": "^8.8 || ^9", it is replaced, making it now possible to install the module alongside Drupal 10, even though it might not be compatible yet. This allows you to patch, test, or use the module as is, much like if you had downloaded the zip and thrown it into your custom modules directory.

An example may look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/my_module": "1.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "drupal-lenient": {
      "allowed-list": [
        "drupal/my_module"
      ]
    },
    "patches": {
      "drupal/my_module": {
        "3289029: Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2022-06-16/my_module.1.x-dev.rector.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Note the Drupal Lenient allowed list. Also note that you will need to install the plugin before trying to install the module that doesn’t yet support Drupal 10. If you want an excellent step-by-step, Matt put one together in the README.
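For reference, the command-line flow looks roughly like this. It’s a sketch based on the plugin’s README, reusing the placeholder drupal/my_module from the example above:

# Install the plugin first (Composer will prompt you to trust it)
composer require mglaman/composer-drupal-lenient

# Allow-list the module whose drupal/core constraint may be loosened
composer config --merge --json extra.drupal-lenient.allowed-list '["drupal/my_module"]'

# Now the Drupal 9-only release can be required on a Drupal 10 project
composer require drupal/my_module:1.x-dev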

The pros:

  • Easy-peasy to install
  • Feeds off the original packagist packages, so if there is an upgrade, you don’t have to do anything special to transition

The cons:

  • Lenient has the control and may cause inexplicable errors when updating due to unsupported core versions
  • PHP devs not familiar with Drupal Lenient won’t know to look for it
  • Flaky experiences when switching in and out of branches that include this plugin. If you context switch a lot, be prepared to handle some errors due to Composer’s challenges maintaining state between branches.
  • Patches to other dependencies inside composer.json still require you to run through some hoops

Custom package

If you want more control over what the module can and cannot do, while keeping core Composer functionality without adding yet another plugin, check out this method. What we will do here is find out what version the patch or merge request is being applied against. It should be stated in the issue queue and, per best practices, is usually a dev version.

If you are a perfectionist, you can use composer install -vvv to find the URL or cache file that the module came from on packages.drupal.org. It is usually one of https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json or https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json. You will note that the Composer cache system follows a very similar structure, swapping out certain characters with dashes.

With this information, you can grab the exact package as it’s defined in the Drupal packagist. Find the version you want, and then get it into your project’s composer.json.
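One way to pull that definition down directly, rather than digging through Composer’s cache, is sketched below. It assumes the dev-release metadata path shown above, using the same placeholder module name; jq is optional and only pretty-prints the relevant entry.

# Fetch the metadata Drupal.org's packagist serves for a module's dev releases
curl -s https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json \
  | jq '.packages["drupal/my_module"]'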

Let’s use Context Active Trail as an example, because at the time of this writing, there is no Drupal 10 release available.


Looking through the issue queue, we see Automated Drupal 10 compatibility fixes, which has a patch on it. I grab the Composer package info and paste the 2.x-dev info into my composer.json under the “repositories” section as a type “package.”


Which should make your project look something like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
}

Now let’s change our version criteria:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

And then add our patch:

…
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
…

Here, you will need to check whether the patch modifies composer.json. If it does, you will need to modify your package information accordingly. For example, in this one, the fixer changes drupal/context from ^4.1 to ^5.0.0-rc1. That change looks like this:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

Lastly, sometimes you run into complications with the order in which packages are picked up by Composer. You may need to add an exclude element to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Our final composer.json for our project could look something like this with all the edits:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
}

The pros:

  • Uses more core Composer functionality
  • A PHP developer will better understand what’s going on here
  • You are in complete control of how this module package and version are defined
  • All the work is in one file

The cons:

  • Requires some understanding of how composer.json, packagists, and the magic of Drupal’s packagist all work
  • That’s a messy composer.json for the project
  • If you have to use exclude, you have to leave it up to outside forces to let you know when that module finally puts out an actual D10-ready version, and then undo all of this work

Fork the module

Standard PHP Composer best practice says that if you make modifications to a package, fork it, maintain your modifications, and provide a pull request if it’s functionality you wish to contribute back. You can use this same approach with Drupal modules as well. Some may even say that’s what issue forks are for! That said, issue forks come with the downside that sometimes they go away, or are overridden with changes you don’t want. They are a moving target.

For the sake of this example, let’s assume that we have forked the module on GitHub to https://github.com/fourkitchens/context_active_trail.git. If you don’t know how to make a fork, simply do the following:

  • Clone the module to your local computer using the git instructions for the module in question
  • Check out the branch you want to base your changes on
  • Create a new repository on GitHub
  • Add it as a remote: git remote add github git@github.com:fourkitchens/context_active_trail.git
  • Push it! git push github 8.x-2.x
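Strung together, those steps look roughly like this, a sketch using the fork URL from this example (the base branch will vary by module):

# Clone the module from Drupal.org and check out the base branch
git clone https://git.drupalcode.org/project/context_active_trail.git
cd context_active_trail
git checkout 8.x-2.x

# Point a new remote at the GitHub fork and push the branch to it
git remote add github git@github.com:fourkitchens/context_active_trail.git
git push github 8.x-2.x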

You can do this with a version of the module that is in a merge request in Drupal.org’s issue queue, too. That way you won’t have to reapply all the changes. However, if your changes are in a patch file, consider adding them to the module at this time using your favorite patching method. Push all your changes to the github remote.

If the patch files don’t have changes to composer.json, or if the module doesn’t have one, you will likely want to provide at least a bare-bones one that contains something like the following and commit it:

{
  "name": "drupal/context_active_trail",
  "type": "drupal-module",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
}

This tells Composer what it needs to know about the module’s dependencies inside the project. This project already had a composer.json, so I needed to add the changes from the patch to it.

Inside the Drupal project we are working on, we need to add a new entry to the repositories section. It will look something like this:

    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },

The VCS type repository entry tells Composer to look at the repository and poll for all its branches and tags. These will be your new version numbers.

Much like in the “Custom Package” example, you may need to add an exclude property to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Now, since the Drupal packagist isn’t here to give Composer version aliases, we have to use the old dev-BRANCHNAME notation for our version. Our require entry will look something like this:

 "drupal/context_active_trail": "dev-8.x-2.x",

Since we already added our patches as a commit to the module, this is all you need. Your final composer.json for your project would look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "dev-8.x-2.x",
  }
}

It makes for a much cleaner project composer.json, but now you’ve split the work into two locations, which requires some synchronization. However, if multiple sites of yours use this same module and need the same fixes, this approach absolutely offers the least resistance and gets those changes out the most quickly.

The pros:

  • Reusability
  • Two smaller, simpler chunks of work
  • Any PHP developer should be able to debug this setup, as it uses Composer best practices. This method can be used in any project with any framework in the PHP ecosystem.

The cons:

  • Changes are in two separate places
  • Which patches are applied isn’t obvious in the composer.json and requires looking through the commit history on the forked repository
  • Requires maintenance and synchronization when upgrades happen

Final thoughts

As with almost everything out there, there are multiple ways to achieve the same goal. I hope this brings awareness and helps provide the flexibility you need when upgrading Drupal to a new major version. Obviously, each solution has strengths, and you may need to mix approaches to get the results you want.


Nov 24 2023

As I embarked on a recent journey to enhance the usability of Drupal from the perspective of both site owners and editors, I stumbled upon what could be a game changer for content editors – the "Same Page Preview" module.

This module offers an innovative solution, providing a page preview seamlessly integrated into the editing process. Say goodbye to the hassle of toggling between the edit form and a separate preview window. With the "Same Page Preview" module, it's all about real-time content visualisation and efficiency.


Key Features

Effortless Installation

Setting up the "Same Page Preview" module is a breeze: it’s a matter of a simple checkbox configuration against specific content types.

On-Page Canvas Preview

When adding or editing content, an on-page canvas preview elegantly unfolds. As you interact with the edit form fields, the preview updates in real time, offering an instant, dynamic view of your content.

Custom Display Options

Tailor your preview experience to your liking. Choose to open the display in a new window, view content in full-screen mode, or select your preferred display mode. The module is all about personalising your content editing workflow.


Why it matters 

Watch a Short Demo: https://youtu.be/Mh_plCpt1_A


The "Same Page Preview" module has recently received recognition on the Talking Drupal podcast, where its potential was discussed. Furthermore, there's an active issue in the Drupal core ideas project advocating for the inclusion of this module in the Drupal core.


In my opinion, integrating "Same Page Preview" into the Drupal core would be an invaluable asset. I've encountered numerous projects where the concept of in-page content previews has sparked considerable interest and discussion.


Join me in exploring the possibilities that this module brings to the Drupal community and in advocating for its inclusion in the Drupal core. Let's make content editing even more user-friendly and efficient.

Nov 20 2023

CKEditor is a powerful and versatile web-based text editor commonly used in content management systems like Drupal. It empowers content creators and editors with a user-friendly interface for crafting and formatting text, incorporating multimedia elements, and managing a wide range of content types.

CKEditor 5, the latest iteration, introduces a host of major enhancements and new features, from a fresh, modern design to advanced features that streamline content creation and bring a leap forward in productivity. This new and exciting version of CKEditor comes as part of Drupal 10 out of the box, so it provides a great benefit when upgrading your current Drupal site.

In this article, we'll delve into CKEditor 5's impressive capabilities, focusing on its revamped appearance, link management, image handling, table editing, font customisation, HTML embedding, and the exciting premium features it brings to the table. Let's jump in and explore the creative possibilities CKEditor 5 offers for enhancing your digital content.


Drag and Drop

CKEditor 5's drag-and-drop feature transforms the content editing experience, providing unparalleled convenience for editors. This functionality allows content editors to effortlessly rearrange text, paragraphs, tables, and lists within the editor, enhancing the fluidity of content composition. The ability to move entire blocks or multiple elements with a simple drag-and-drop action offers a significant time-saving advantage, streamlining the editing process. Moreover, content editors can seamlessly import HTML or plain-text content from external sources, simplifying the integration of images into their work. This feature not only improves efficiency but also empowers editors with greater control and flexibility in crafting visually appealing and well-organised content.

Links

One area that’s seen noteworthy improvements is link management. Adding links in CKEditor 5 is now more intuitive and user-friendly, as a convenient pop-up box appears within the WYSIWYG window. This makes the process smoother and faster. These link options can be combined with Drupal’s 'Editor Advanced Link' module, which empowers content creators with the ability to fine-tune their links. With this module, editors can define attributes such as title, CSS classes, IDs, link targets, 'rel' attributes, and ARIA labels, which are essential for providing users of assistive technology, such as screen readers, with meaningful information about the purpose or content of the link.

These enhancements offer a wealth of customisation options for links, whether it's for accessibility, branding, or precise styling. CKEditor 5 and the 'Editor Advanced Link' module together bring a logical link management experience to Drupal, making the process more efficient and versatile.


Image Handling

Adding images to your content using CKEditor 5 has been given an upgrade thanks to the new drag-and-drop functionality. Users can select an image, whether it’s from their device or a webpage, and simply drag and drop it into the WYSIWYG window. Once the image is incorporated, you have the option to designate it as decorative (meaning it doesn’t add information to the page and, therefore, does not require alt text) or provide the alt text.

Furthermore, you can fine-tune the image presentation by adjusting its alignment and text wrapping options, all conveniently accessible from the image-dedicated balloon toolbar. If you wish to enrich your image with a link or a caption, you can easily achieve this without leaving the image toolbar.


When you're ready to adjust the image size, CKEditor 5 simplifies the process by allowing you to resize the image directly within the WYSIWYG window. A straightforward corner selection and drag operation lets you customise the image to your desired dimensions.

Moreover, CKEditor 5 integrates with Drupal media. Once the necessary modules are enabled, you’ll discover a new button in the text editor toolbar configuration. Add this button to your active toolbar, specify the media types and view modes you want to make available, and save your preferences. You can then conveniently add new media files or select from your media library, following the standard workflow you’re accustomed to (resizing the image is restricted when using the library). CKEditor 5, along with its compatibility with Drupal media, enhances the image management experience, making it a user-friendly and efficient process.

Table Management

Enhancements to table management in CKEditor 5 bring an improved editor experience. While this currently requires adding a patch via Composer, the effort is undoubtedly worthwhile for those who frequently utilise tables in their content.

You can specify the number of columns and rows and include an optional title for the table. Once your table is set up, a wide array of editing options becomes available, providing greater flexibility and control over table and cell properties. These edits encompass essential functionalities, such as adding or removing columns and rows, merging and splitting cells, and customising styles for both the entire table and individual cells. You can fine-tune text alignment and even introduce background colours to enhance the visual appeal of your tables.

CKEditor 5 also offers the capability to nest tables within the cells of other tables, providing a versatile tool for crafting intricate charts or layouts based on table structures. This feature allows content creators to format the nested table with the same ease and flexibility as a standalone table, enhancing the possibilities for designing complex and well-organised content layouts.


These improvements in CKEditor 5 make working with tables more efficient and user-friendly, empowering content creators to present their data and content in a structured and visually appealing manner.

Font Handling

Modify fonts in your content with CKEditor 5. By installing the 'CKEditor Font Size and Family' module, you can unlock a wide range of font and text editing options right on the WYSIWYG screen. With just a few simple configuration tweaks within the text editor, editors gain the ability to not only adjust font sizes and families but also apply text colours and text background colours, enhancing the text's visual appeal and customisation possibilities.


Other Exciting Extensions for CKEditor 5 to Explore

Auto Save

The Autosave feature is a significant enhancement. It automatically saves your data, sending it to the server when necessary, ensuring that your content is safe, even if you forget to hit 'Save.' While it does require installation and some code, the peace of mind it offers is well worth the setup time.

Markdown

With the Markdown plugin, you can switch from the default HTML output to Markdown. This is fantastic for those who prefer a lightweight, developer-friendly formatting syntax. The best part? It's available right out of the box, making content creation more flexible and efficient.

To-Do Lists

CKEditor 5's To-do list feature is a handy addition to your content creation toolkit. It enables you to create interactive checkboxes with labels, supporting all the features of bulleted and numbered lists. You can even nest to-do lists with other list types for a versatile task management experience. While it does require installation, the organisational benefits it brings are worth the minor setup work.

Premium Features

Unleash CKEditor 5’s premium features with ease. Install and enable the 'CKEditor 5 Premium Features' module, configure it by adding your licence key, and adjust your text editor’s toolbar. Then you’re ready to explore the exceptional features, including track changes, comments, revision history, and real-time collaboration. These enhance collaborative editing, discussions, version control, and teamwork, streamlining content creation and review for improved efficiency and precision.

Track Changes 

The Track Changes feature brings a dynamic experience to document editing. It automatically marks and suggests changes as you make them. Users can quickly switch to the track changes mode, where all edits generate suggestions that can be reviewed, accepted, discarded, or commented on, enhancing the collaborative editing process.

Revision History

The Revision History feature can be your trusted document versioning companion. It empowers you to manage content development over time with ease. Unlike Drupal’s default revision log feature, the preview mode offers a comprehensive view of content changes and the contributors behind them, all within the editor. Users can compare and restore previous document versions.

Comments

With the Comments feature, users can annotate specific sections of content, whether it's text or block elements like images. It facilitates threaded discussions and provides the flexibility to remove comments once discussions are concluded, fostering effective collaboration and feedback.

Real-Time Collaboration

Real-Time Collaboration enables multiple authors to work concurrently on the same rich text document. Additionally, it provides a user presence list, offering a live view of active participants in the document. While other collaboration features can be used asynchronously, real-time collaboration allows for instantaneous teamwork.

Import Word/Export Word and PDF

When installed, the module allows for easy importing and exporting of the above formats. While the export functionality is fully stable in CKEditor, the converters are considered experimental for Drupal at this time. Importing .docx and .dotx files retains formatting, comments, and even tracked changes.

Notification System

Alongside these collaboration features, CKEditor will be introducing a new notification system to keep editors and writers well-informed about the content's status. Stay up-to-date with real-time notifications, ensuring a smoother editorial workflow.

Productivity Pack

The Productivity Pack is a bundle of Premium features which make document editing faster, easier, and more efficient.


The Productivity Pack features include:

  • Templates allow you to create and insert predefined content structures into the editor, saving time and ensuring consistency in the content display.

  • Slash Commands lets you execute actions using the / key as well as create your own custom actions. This can help to streamline content creation, reducing navigation through the editor options and saving time.

  • Document Outline adds an automatic outline of the document headings in a sidebar to help navigate the document.

  • Table of contents inserts a linked table of contents into the document, which automatically updates as headings are added and will be retained in the editor output data.

  • Format Painter copies text formatting and styles and applies them to a different part of the content. This helps to ensure consistent formatting across the content, saves editor time, and contributes to a more professional appearance.

  • Paste from Office Enhanced provides error-free copy-pasting from MS Word and Excel. Formatted text, complex tables, media, and layouts are retained – turning MS Office content into clean HTML.

The module also provides a Full-screen plugin that maximises the editing area, which is very useful when using premium features such as comments and Document Outline, as they take up extra space around the editor.

Demos of these CKEditor 5 features are available from links within the module project page. There are many other non-premium and premium features that can be installed outside of the Drupal module with some developer involvement, which can be found here.

Conclusion 

In this article, we've explored CKEditor 5's significant enhancements for content creators and editors in Drupal 10. CKEditor 5 offers improved link management, effortless image handling, streamlined table editing, versatile font customisation, and simplified HTML embedding. We've also touched on exciting extensions that enhance your content creation process.

Furthermore, CKEditor 5's premium features, like Track Changes, Revision History, Comments, Real-Time Collaboration, Import/Export for Word and PDF, Notification System, and the Productivity Pack, bring advanced capabilities for collaborative editing and efficient content creation.

As you dive into CKEditor 5's features, we encourage you to explore further and experience the benefits firsthand. It's a game-changer for content editing and collaboration in Drupal 10. Unleash your creativity and discover a more efficient and professional content editing experience with CKEditor 5.

Oct 20 2023

DrupalCon is the annual gathering of leaders in Drupal to meet, share knowledge, discuss the project roadmap, and work directly on progressing the development of Drupal Core at daily contribution events. On Friday 20th October, there is a huge code sprint where hundreds of Drupal developers will make huge strides in progressing major initiatives.

We spoke to our Drupal Director, Paul Johnson, to find out more about the event, what he learned and how his talk went.

Visit Britain Digital Experience Platform at DrupalCon Lille

Photo credit - Mike Gilford

How did your talk go?

Together with Marie Orpen, Visit Britain's Head of Digital, I presented how CTI Digital is playing a central role in digital transformation across the organisation. The session focussed on how we led an extensive Discovery Phase and revealed the process and techniques we used to gather a body of knowledge to support decision-making by Visit Britain, leading to significant change in public-facing websites, operating models, and ways of working.


Photo credit - Mike Gilford

If you would like to take a look through Paul's presentation during the talk, click here.

What were the key findings from the talks you attended?

Much like GDPR and accessibility legislation that came before them, new UK and EU legislation relating to carbon impact reports by businesses are on the near horizon. They must be on your agenda. 

At the two DrupalCon sessions we attended, we gained valuable knowledge transfer to complement our existing expertise in the subject.

In the UK, Streamlined Energy and Carbon Reporting (SECR), and in the EU, the Corporate Sustainability Reporting Directive (CSRD), are set to arrive in 2024. They will introduce mandatory reporting of carbon emissions and tracking of improvements to the footprint.


At both the infrastructure and software application levels, there are measures which can be taken to significantly reduce the carbon impact of digital services. The key first step in the journey to compliance with these legislations is the measurement of the current situation.

Our partner Platform.sh presented work they are undertaking with Nestle to accurately track and report the carbon impact of Nestle’s Drupal applications and hosting infrastructure. They detailed the tools and techniques their enterprise uses to benchmark and track improvements, and the measures they have taken, from moving hosting to countries drawing on carbon-neutral energy to performance optimisation in the application layer. These are all lessons we can learn from.

Moving to green data centres brings the greatest benefit. A shift from Ireland to Sweden brings an 80% reduction in carbon footprint. Whilst essential, adapting the application layer yields far smaller benefits. However, a combination of these measures brings a significant benefit to the environment.

In a related session, Janne Kalliola signposted wider industry research, including findings from the University of Beira Interior in Portugal, concluding that there is a strong or very strong correlation between (software) execution time and energy consumption. All website owners need to begin to place greater emphasis upon and invest in web application performance and optimisation.

Of great concern, and emphasising the importance of optimisation initiatives, is that global energy demand is currently growing faster than the production of green energy.

Our Drupal Director, Paul Johnson, talks at DrupalCon Lille 2023


"The ICT sector accounts for 4-10% of the world's energy consumption and it's growing" - Janne Kalliola

How are we adopting what you learned?

CTI Digital is starting to work with existing clients and will make consultancy services available to help organisations prepare for the new legislation.

If your organisation is approaching a digital transformation, why not get in touch with our experts today!

Aug 10 2023

We have fantastic news to share! Drupal developer, Viktor Tamas, has recently passed the Acquia Certified Site Builder exam for Drupal 10. This achievement is a testament to our team's commitment to learning, strengthening our capabilities while providing great promise for our clients.

Drupal developer, Viktor, now certified with Acquia's Drupal 10 site builder certification

What is the Acquia Certified Site Builder Exam?

The Acquia Certified Site Builder assessment is a thorough evaluation crafted to measure a developer's ability to build, manage and maintain websites through the application of Drupal's content management system. This certification acknowledges those who have showcased their expertise in Drupal's fundamental principles, methodologies, and technological proficiencies.

How will our clients benefit from Acquia’s Drupal 10 site builder certification?

Viktor's achievement in obtaining the Acquia Certified Site Builder credential has many benefits for our clients' projects and digital experiences:

Strengthened Expertise: Our team's expertise in Drupal website development is further validated with a certified Acquia developer. Clients can rely on our skills and knowledge, ensuring that their projects are in capable hands.

Quality Assurance: The certification process is comprehensive, covering a wide range of Drupal-related topics. Our certified developer has gained an in-depth understanding of security considerations, best practices, and effective site-building techniques. This translates to higher quality websites for our clients.

Efficiency and Innovation: Acquiring the certified site builder credential involves staying up to date with the latest Drupal advancements and features. Our commitment to continuous learning and staying current with industry trends means that clients will benefit from the latest tools and technologies available, ensuring their websites remain modern and innovative.

Problem Solving: The certification process involves tackling real-world scenarios and challenges faced during website development. Our ability to navigate and overcome these challenges demonstrates a capacity for creative problem-solving, which directly benefits our clients by ensuring smooth project execution.

Customised Solutions: With an Acquia Certified Site Builder on our team, we are better equipped to understand our clients' unique requirements and customise solutions that align with their goals. This personalised approach ensures that each project is finely tuned to deliver the desired outcomes.

Our team's commitment to excellence and client satisfaction is highlighted by Viktor's achievement. With this certification, we are poised to deliver even higher quality, innovative, and tailored solutions that meet and exceed our clients' expectations. We are thrilled to see the positive impact this accomplishment will have on our projects and the lasting benefits it will provide to our valued clients.


To find out more about Drupal 10 and what benefits it can bring to your business - read our blog on everything you need to know about Drupal 10 today.

Jun 09 2023

Extended End-of-Life Timeline for Drupal 7

We have important news to share regarding the end-of-life timeline for Drupal 7. Previously, Drupal announced an extension until 1 November 2023. However, we are delighted to announce that the final date for Drupal 7's end of life has been further extended to 5 January 2025.

With this end of life extension, the Drupal Security Team is making adjustments to the level of support provided. As a trusted Drupal partner, we are here to assist you throughout this transition. It's essential to note that this will be the last extension, so it is vital to plan your migration strategy appropriately.

Drupal 7 End of Life Date Extended to 5 January 2025

To ensure a simple transition, we want to highlight a few key points about the end of life extension:

Reduced support for moderately critical Drupal 7 issues

Starting from 1 August 2023, the Drupal Security Team may publicly post moderately critical and less critical issues that affect Drupal 7, as long as they are not mass-exploitable. This change will not impact Drupal 9 and newer versions. If a security issue affects both Drupal 7 and a newer version, the Drupal 7 issue may be made public without a corresponding fix.

Drupal 7 branches of unsupported modules no longer eligible for new maintainership

After 1 August 2023, unsupported Drupal 7 module branches will no longer be eligible for new maintainership. If you rely on Drupal 7 modules, we strongly encourage you to proactively adopt and support these modules to ensure their continued functionality. The Drupal Security Team will not issue security advisories for any unsupported libraries that Drupal 7 contributed modules depend on, including CKEditor 4.

PHP 5.5 and below will no longer be supported

Starting from 1 August 2023, Drupal 7 will no longer support PHP versions lower than 5.6. Drupal may provide further updates regarding the minimum PHP requirement before Drupal 7's end of life.

Security fixes for Drupal 7 Windows-only issues

From 1 August 2023, security fixes for Drupal 7 Windows-only issues will no longer be provided. If you are running a Drupal 7 site on Windows, we recommend considering migrating to another operating system or hosting provider.

Changes to Drupal.org services for Drupal 7

As of 1 August 2023, Drupal.org will no longer package Drupal 7 distributions with Drush make files. However, you can still build distributions locally using Drush make.

What does the Drupal 7 End of Life mean for you?

As we approach Drupal 7's end of life, it's vital to understand how the end of life will impact you:

  • The Drupal Security Team will no longer provide support or Security Advisories for Drupal 7 core and contributed modules. Public disclosure of security issues and zero-day vulnerabilities may occur.
  • Drupal.org will no longer offer support for Drupal 7-related tasks, including documentation navigation, automated testing, and packaging.
  • Drupal 7-compatible releases on project pages will be flagged as unsupported.
  • Certain Drush functionalities for Drupal 7 will no longer work due to changes in the Drupal.org infrastructure.
  • Drupal.org file archive packaging (tar and zip files) for Drupal 7 will be discontinued, and the archives may eventually be removed.
  • There will be no more core commits on Drupal core 7.x, and downloadable package tarballs may no longer be available.
  • External vulnerability scans will identify Drupal 7 as insecure.

If you are currently maintaining a Drupal 7 site, we strongly recommend initiating a migration to Drupal 10 before the end of life date. Our expert team is here to guide you through the migration process, ensuring a seamless transition.

If you are considering upgrading your website to Drupal 10, contact our team of dedicated Drupal developers for specialist support.

May 30 2023

Jim Vomero

Senior Engineer

As a tech lead, Jim works with clients through the full project cycle, translating their business requirements into actionable development work and working with them to find technical solutions to their challenges.

May 30, 2023

Running the digital experience is a large-scale operation for most higher ed institutions. Whether your architecture was established five or 15 years ago, the departments, offices, and entities you need to manage may add up to hundreds or even thousands of websites. And each new addition is increasingly challenging to maintain.

Some sites use shared modules, while others do not. If you want to make an update to one website, you have to cross your fingers and hope it doesn’t break something on 500 others. Every day, another stakeholder presents a new request in support of an upcoming project.

Facing all these compounding issues, the IT department at Yale understood that a lift-and-shift of their existing sites was impossible. Upgrading their digital platform presented an opportunity to reset their architecture and processes to start fresh.

In a preview of our upcoming presentation at DrupalCon 2023, here’s what happened next — and what your institution can learn from it.

Why reinvention makes sense for higher ed institutions

Universities are facing significant challenges related to budgets, economic uncertainty, and reduced admissions applications. The pandemic introduced further uncertainty balanced with an increased need to sharpen digital presentations.

As one of the most prestigious institutions in the world, Yale needed to find a new, more sustainable way to manage its digital needs. The institution had stretched the limits of a very mature Drupal 7 site with more than a decade’s worth of modules, themes, and custom code.

It was difficult for the IT team to test with confidence, because they manage more than 1,100 sites that were all created in different ways. In addition, the more impressive a new site looked, the more other offices and departments wanted to emulate it.

The unintended consequences of an overtaxed website platform

With the university’s website system at critical mass, Yale’s teams lacked incentive to add new features to its legacy platform. Consequently, some larger departments found the platform inflexible, leading them to Wix and Squarespace for new projects. If the university didn’t find a workable platform solution, it ran the risk of increased site errors, design inconsistencies, and a diminished user experience.

Resetting Yale’s approach to digital required a sizable upfront capital investment. As the work comes to fruition, the organization is gaining a flexible, scalable platform that will benefit every department into the next decade — and beyond.

YaleSites: A transformational approach to higher ed websites

YaleSites is the product of years of examining the university’s needs. Through our previous work with the institution’s cybersecurity office and the Schwarzman Center, we developed a new platform that incorporated the following elements:

A unified brand identity and design system

YaleSites offers many departments the ability to create unique digital experiences that are aligned with the institution’s overall design. Instead of a conventional CMS, Yale’s team uses a customized drag-and-drop page builder drawn from a library of proven components powered by Emulsify.

The YaleSites Welcome page

Inclusive and accessible development for all customers and devices

Institutions like Yale need to offer an equitable digital experience for every audience. YaleSites upholds and prioritizes the university’s accessibility standards by making sure every content block follows best practices for usability and accessibility.

User-focused experience and design

YaleSites prioritizes the needs of the organization’s audience and its end users. Across the organization, content authors of every skill level can access a full library of templates, starter kits, and media libraries to produce what they need.

Adding blocks in the YaleSites administrative interface.

Standardized practices for development

The organization’s development process has been streamlined. Rather than asking “What do you need in a website?”, work begins with the question, “How can our tools help with your strategy?” Developers don’t have to reinvent the wheel for a new site. Instead, they have the support of a system that’s performant, on-brand, and secure.

Sustainable governance

We implemented YaleSites with an eye toward thoughtful and sustainable growth. Universities often set digital priorities based on the loudest or most powerful voices in the organization. Now, Yale uses processes that enable them to focus on the organization’s most pressing needs. Plus, a core group meets regularly to collect feedback, respond to requests, and adjust priorities as needed.

Shifting from a project-based to a product-based perspective

After launching YaleSites, the institution will enter the maintenance phase of protecting its system. The university’s new platform required a significant financial investment — now it must invest in the long-term work of governance.

The success of Yale’s platform hinges on a seismic internal shift. YaleSites isn’t a project that concludes with a specific end date. It’s a product that the organization must refine and support in perpetuity.

Since YaleSites is a product, its resources are finite. For example, if IT plans to add six new features in a quarter, any new request is a negotiation. Something may need to get bumped from the product roadmap. Rather than rushing a new feature into development for a short-term need, the organization follows a multiyear roadmap and measures the needs against all of the priorities in the queue.

Eliminate deadline pressure by focusing on constant improvement

Thinking long-term about your organization’s website removes the need to squeeze as many improvements as possible into a project’s deadline. Following the principles of Agile development frees your team from solving every use case before launch. Instead, you can launch a minimally functional feature like an events calendar, see how people use it, and refine how it works according to actionable feedback.

YaleSites allows the institution to implement site improvements with confidence. Rather than working on whatever makes sense in the moment, they see their work progress from ideation to development, testing, and release.

From the flexibility of its digital tools to a more managed, Agile-driven approach to website improvements, YaleSites marks a dramatic shift for the better. If this sounds like a shift that would benefit how your organization works, we should talk. We can help you view your site and its planning from a new perspective.

Megan Bygness Bradley and the Yale team contributed to this post.


May 22 2023

When it comes to upgrading your website, the decision is never taken lightly. 

It's a process that demands considerable effort, time, and money.

Whether aiming to boost your CMS platform's performance or enhance the user experience, making sure you choose the upgrade that delivers the greatest value for your investment is crucial.

We understand this better than anyone else as one of Europe's most experienced Drupal development agencies. Our expertise in Drupal allows us to streamline the installation process, getting you started on your priority features quickly and cost-effectively.

But that's not all. We've developed something truly special: Drupal Accelerator. 

This innovative tool is designed to fast-track the installation of Drupal, providing you with a cost-effective package to create highly effective and efficient content and marketing websites. It harnesses the power of commonly used, ready-to-go features and functionalities, making it the perfect solution to fast-track the build and focus on your specific needs.


What is Drupal Accelerator?

Drawing from our extensive experience, we have harnessed the key elements of successful websites and integrated them into our advanced Drupal accelerator. This cutting-edge platform empowers our clients to effortlessly uphold efficient, high-performance, and accessible websites without reinventing the wheel.

Our accelerator is the product of years of investment in research and development, based on two decades of experience delivering projects for clients. Since 2020, we've distilled all that knowledge into a powerful package.

The beauty of our Drupal Accelerator is that it significantly reduces implementation timescales and brings cost efficiencies. By using it, you'll have more budget available for the custom development of specific requirements without compromising fundamental best practices.

With Drupal Accelerator, you'll no longer need to deliver functional requirements on a project-by-project basis. Instead, it elevates every organisation's starting point in a website build, removing the need to set up each website's features from scratch. This leaves more time to prioritise functionality which is specific to your needs, so you can get your website up and running quickly and efficiently.

Some of these features include:

  • Page components include page headers, banners, media players and inline forms.
  • SEO optimisation, including Google Tag Manager and Analytics
  • GDPR tools
  • Responsible, accessibility-compliant design
  • Inbuilt accessibility-checking tools
  • Drag and drop interfaces, content scheduling and editorial workflows

How can we maintain our competitive edge if every website uses Drupal Accelerator?

With Drupal Accelerator as the springboard, every organisation has the opportunity to unleash their creativity and build out unique features and requirements that give their website a true competitive advantage.

Here's the exciting part: Drupal Accelerator itself doesn't directly provide a competitive edge. Instead, it is the powerful platform that enables you to allocate your budget strategically, investing it where it matters most to create that winning edge.

By leveraging Drupal Accelerator, you free up resources that would otherwise be spent on basic setup and implementation. This means you can allocate those saved funds towards customising your website, developing cutting-edge functionalities, and implementing innovative ideas that set you apart from the competition.

Drupal Accelerator empowers you to unleash your creativity and focus on building the features that will truly make your website shine. It's not just a tool—it's the launchpad that propels you towards your unique competitive advantage.


Revolutionising Drupal development: embracing efficiency and minimising risk with Drupal Accelerator

In the traditional world of Drupal development, the journey to a fully-functional content management system can be a long and winding road. Countless sprints and a significant amount of time are often required before you can even begin to utilise the system. This conventional approach poses a considerable challenge regarding content migration, as it tends to leave limited time for this crucial step, introducing inherent risks to your project.

But fear not! With Drupal Accelerator, we're turning this outdated paradigm on its head. Our innovative solution streamlines the development process, eliminating unnecessary delays and maximising efficiency. By leveraging Drupal Accelerator, you'll gain ample time and resources for content migration, significantly reducing the risks associated with rushing through this vital stage.

Say goodbye to the old way of doing things and embrace a new era of Drupal development.

With Drupal Accelerator, you'll save time and minimise project risks, ensuring a smooth and successful journey to your fully-functional and content-rich website.

Unlocking your web development potential: accelerating functionality and empowering efficiency

Imagine a web development journey where the standard functionalities are expedited, allowing you to invest valuable time in creating a CMS platform that seamlessly caters to both your website visitors and backend users. With this streamlined approach, you can craft a well-built website that not only impresses your audience but also frees up your team to focus on what truly matters: your commercial priorities.

Gone are the days of wasting precious hours searching for workarounds to CMS frustrations that shouldn't even exist. By prioritising the creation of a robust CMS platform early on in the development process, you provide your team with quick access to a functional CMS. They can effortlessly populate content and harness its power without delay. Moreover, you gain valuable time to fine-tune and optimise your platform for enhanced efficiency and effectiveness by addressing any issues or inefficiencies sooner.

With Drupal Accelerator as your secret weapon, you'll accelerate your web development journey, leaving behind unnecessary hurdles and frustrations. It's time to unlock your team's true potential and create a website that wows your audience and empowers your entire organisation to thrive.

Unlocking value by minimising costs on standard CMS functionality

We all know that time is money, and every moment counts when it comes to web development. That's where Drupal Accelerator comes in, revolutionising the speed at which you can get your website up and running. With its advanced foundations, your development time is significantly reduced, allowing you to hit the ground running and focus on what truly matters—the unique features and functionalities that make your website stand out.

Whether you're building a simple brochure site or a complex membership portal, Drupal Accelerator sets the stage for success. For simpler projects, the foundations provided by Drupal Accelerator eliminate the need for extensive additional development. On the other hand, it provides a head start for more intricate setups, enabling you to pick up right where Drupal Accelerator leaves off, saving you valuable time and effort.

Drupal Accelerator also puts your web development budget to optimal use. By obtaining a usable CMS platform at a reduced cost, you have more resources available to level up the web experience for your visitors. It's a win-win situation—enhancing your website's functionality while keeping your budget in check.

Additionally, with Drupal Accelerator being open source, you can transition your site internally or to another vendor without any unexpected expenses. You're in full control of your website's destiny.

To top it all off, when you combine Drupal Accelerator with our support retainer packages, we continually enhance its performance, resolve any issues that arise, and improve the overall user experience. This long-term partnership ensures significant cost reduction in ownership, providing you with sustainable savings and peace of mind.

Supercharge your website: the benefits of Drupal Accelerator unveiled


Unleash the power of Drupal Accelerator: Features that propel your website to success

Mobile responsive front end

In today's digital landscape, responsiveness is key. That's why Drupal Accelerator is designed to effortlessly adapt to various devices, ensuring a flawless user experience across phones, tablets, and desktops.

With its out-of-the-box responsive support, Drupal Accelerator takes the guesswork out of device compatibility. Say goodbye to clunky layouts and frustrating user interfaces. Your website will effortlessly adjust its appearance and functionality to provide a consistent and engaging experience, regardless of the device your visitors use.

But that's not all. Introducing the front-end theme known as "Rutherford," which has undergone rigorous testing to ensure optimal performance across the latest versions of popular browsers. From desktops to tablets and mobile phones, "Rutherford" delivers a visually stunning and seamless experience, captivating your audience no matter how they choose to explore your website or what internet browser they prefer:

  • Chrome
  • Firefox
  • Safari
  • Microsoft Edge


Rutherford is mature and adaptable, allowing great freedom in front-end design whilst maintaining well-governed, accessible user experiences.

Drupal’s module library, distilled for you

The allure of the vast Drupal ecosystem is undeniable: "There's a module for that!" In fact, there may be not just one but three modules to choose from for any given requirement. This abundance of open-source modules holds the key to solving challenges but can also introduce uncertainties and complexities.

With so many options available, it can take time for developers to determine the best approach for a given requirement. That's where our Drupal Accelerator comes in.

Our team has invested countless hours researching, evaluating, and testing highly adopted modules to determine which ones work best together seamlessly and securely. We've distilled this valuable knowledge into our Accelerator, so you can benefit from all the advantages of Drupal and open-source technology without worrying about potential module compatibility issues.

With Drupal Accelerator, you'll enjoy the peace of mind that comes with using a proven, reliable set of modules that work harmoniously. Plus, you'll save time and money on custom development, allowing you to focus on building the features that will give your website a competitive edge.


Empower your Editors: Unleash creativity with Drupal Accelerator

When it comes to website adoption, the ease of use for editors is a vital factor for success. With Drupal Accelerator, we revolutionise the editorial experience, allowing your team to unleash their creativity while ensuring brand consistency and digital best practices.

Imagine having a powerful page builder at your fingertips, equipped with an extensive library of customizable page components. Drupal Accelerator offers precisely that, providing editors with a wide range of creative options to design captivating content that cultivates and engages your audiences.

Our component-based layouts perfectly balance creative content design and structured data capture. This means your pages look visually stunning and adhere to brand consistency, accessibility standards, and seamless delivery across various device screens.

With our seamless media library integration, introducing captivating visuals such as images, videos, and audio to your content pages becomes a breeze. Editors have the freedom to create visually striking page layouts, bringing your content to life in ways that captivate and resonate with your audience.

Experience the liberation of creative expression with Drupal Accelerator. Empower your editors to craft compelling and visually stunning web pages that leave a lasting impression on your visitors. With our powerful toolkit, your website becomes a canvas for unlimited possibilities.

Drag & Drop template builder

If you're familiar with the convenience of layout builders, you'll be thrilled to know that we've incorporated this intuitive feature into our platform, taking your website design to extraordinary heights.

With the innovative layout builder in Drupal Accelerator, you have complete control over your page structure. Say goodbye to rigid templates and hello to dynamic layouts that suit your unique vision. 

The possibilities are endless, whether you prefer a classic 2-column design or a more intricate 3- or 4-column arrangement.

But it doesn't stop there. 

Our layout builder offers the flexibility to configure columns into various proportions, allowing you to create visually stunning and harmonious page layouts that perfectly showcase your content. Whether you're highlighting products, presenting captivating images, or delivering impactful messages, Drupal Accelerator gives you the freedom to bring your creative vision to life.


Once you have built a layout, editors can introduce content components using the drag-and-drop interface.


Whilst content managers assemble pages with components using intuitive drag-and-drop interfaces, the Drupal Accelerator does the hard work behind the scenes. This easy-to-use approach helps to ensure that content looks great across all devices, adheres to brand guidelines and meets industry best practices regarding:
  • Brand/style consistency
  • SEO
  • Accessibility
  • Performance
  • Security
  • GDPR

Unleash the full potential of your media

Create a visual journey like never before with Drupal Accelerator's cutting-edge media library. 

Our aim is to equip you with a comprehensive toolkit that effortlessly supports a wide range of media formats, giving your website an immersive and captivating edge.

With Drupal Accelerator, your website comes pre-packaged with a media library that embraces the power of visual storytelling. From stunning images, including animated GIFs, to locally hosted videos that capture attention and even remote videos from popular platforms like YouTube and Vimeo—our media library has you covered. Additionally, audio files can seamlessly integrate into your content, enriching the overall user experience.

We believe in providing flexibility and convenience. 

That's why Drupal Accelerator goes beyond the basics. 

Our media library allows you to effortlessly share and distribute other file types, such as PDFs, spreadsheets, and Word documents, providing a seamless download experience for your users. 

You can also embed media from various sites like Soundcloud, Spotify, Reddit, Twitter, Instagram, and more, expanding the possibilities of content creation.

Empowering content editors is at the core of Drupal Accelerator. 

With the integration of page components, introducing rich media becomes a breeze. Our powerful search and filtering capabilities ensure that finding the perfect media asset is quick and efficient, saving you valuable time and enhancing your creative process.


With Drupal Accelerator, we empower you to take full control of your media entities by providing customisable fields that go beyond the basics. 
Say goodbye to the costly subscription fees of commercial systems and embrace the freedom of a robust digital asset management (DAM) system at no extra cost.

Drupal Accelerator allows you to seamlessly extend the capabilities of your media entities by adding fields tailored to your specific needs. 

Want to track licensing and attribution information? No problem. 

Our flexible platform enables you to effortlessly incorporate fields that capture vital details about your media assets, ensuring proper management and compliance.

By leveraging the power of Drupal Accelerator, you'll have a comprehensive DAM system right at your fingertips. 

Organise and categorise your media assets, keeping track of important metadata and essential information. 

Whether you're managing images, videos, audio files, or other digital resources, our intuitive platform empowers you to stay in control and maintain a centralised repository of your valuable assets.

Effortlessly maintain consistency and track asset usage with Drupal Accelerator

Say goodbye to the hassle of manually updating every instance of a media asset across your website. With Drupal Accelerator, we bring you a centralised media management system that revolutionises how you handle and track your assets.

Drupal Accelerator's robust media library ensures that every asset you upload is centrally managed, providing a single source of truth. 

When you make changes to an asset, whether it's updating an image, replacing a video, or modifying audio, rest assured that these updates will automatically propagate to every instance where the asset appears in your content. 

No more tedious manual updates or inconsistencies to worry about. 

Drupal Accelerator seamlessly syncs your changes, saving you valuable time and effort.

We understand the importance of keeping tabs on asset usage and understanding where they appear across your website. 

Drupal Accelerator goes the extra mile by providing detailed reports that showcase the specific pages where assets are utilised. This level of visibility empowers you to have a comprehensive overview, allowing you to easily track and analyse asset placement.

Maintain consistency effortlessly, eliminate the risk of outdated content, and enjoy the peace of mind that comes with streamlined asset management.


High Performance from Drupal Accelerator

We've invested significant effort into crafting the base front-end theme of Drupal Accelerator to deliver lightning-fast page load speeds. 

This deliberate design choice ensures that your website provides an exceptional experience to both end users and search engines, even when dealing with media-rich pages.

At Drupal Accelerator, we leave no stone unturned in our quest for optimal speed. 

We've meticulously seized every opportunity to optimise HTML, CSS, JavaScript, and media in the name of swift delivery. Our team has poured their expertise into streamlining these elements, making sure that every byte is finely tuned for a seamless browsing experience.

By embracing cutting-edge optimisation techniques, we've reduced your website's loading time to the blink of an eye. Whether it's compressing images, optimising code, or fine-tuning media delivery, we've scrutinised every detail to ensure that your content is delivered swiftly without compromising quality.

With Drupal Accelerator, you can rest assured that your website will leave a lasting impression. Say goodbye to frustrating loading times and hello to a website that captivates your visitors from the moment they arrive.

Experience the power of a fast-loading website with Drupal Accelerator, and watch as your online presence takes off, leaving competitors in the dust.

This optimisation has involved:

  • Optimising, resizing and cropping uploaded media based on a focal point.
  • Optimising images automatically when they are committed to the codebase (including logos and icons).
  • Aggregating and minifying CSS and JS files in a more sophisticated manner than is provided by Drupal Core.
  • Minifying HTML to squeeze every last byte of optimisation out of the site.
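
For context, even Drupal core's baseline aggregation can be forced on from settings.php; a minimal sketch (the Accelerator's pipeline layers contributed tooling on top of this, which isn't shown here):

// settings.php — force core CSS/JS aggregation on, regardless of UI settings
$config['system.performance']['css']['preprocess'] = TRUE;
$config['system.performance']['js']['preprocess'] = TRUE;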

Unlock the power of effortless SEO and analytics

Search Engine Optimisation (SEO) doesn't have to be daunting. 

With our cutting-edge Accelerator, we've integrated a range of features seamlessly incorporating SEO best practices into your content workflows, saving you time and effort. 

Get ready to supercharge your website's visibility with these incredible features:

  • Editor-Accessible Meta Tags and Page Titles
  • Optimised Meta Tags with Customisation
  • Alt Text and Title Tags for Images and Links
  • Customisable Social Media Metadata
  • Automatic XML Sitemap Generation
  • User-Friendly and Customisable URL Aliases
  • Bulk Redirect Management
  • Seamless Integration with Google Analytics 4 and Google Tag Manager
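
As an illustration only—not the Accelerator's actual manifest—a contributed-module stack covering features like these is typically pulled in with Composer:

composer require drupal/metatag drupal/pathauto drupal/redirect drupal/simple_sitemap drupal/google_tag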


Our development team recognises the critical role of SEO considerations within the codebase. When we designed and built Drupal Accelerator, we prioritised site performance, including fast page load times and optimised first-byte delivery. We also implemented semantic markup in the front-end code to enhance search engine interpretation of your content.

With Drupal Accelerator, achieving SEO excellence becomes a breeze. 

Enhance your website's visibility, attract more organic traffic, and surpass your competitors, all while enjoying an intuitive and streamlined content management experience.

Fortify your website's security with Drupal Accelerator

Drupal’s security record is already exemplary, and Drupal Accelerator takes it to the next level. Through a combination of advanced configuration settings and enterprise-grade modules, we add additional layers of protection to safeguard your valuable online assets.

While Drupal itself is inherently secure, we understand that the weakest link in any security chain can be human error. That's why we go the extra mile to prioritise user security. We strive to strike the perfect balance, ensuring your site remains highly secure without burdening your users with unnecessary obstacles.


GDPR Compliance

In the ever-evolving landscape of data privacy regulations, it's crucial to prioritise the protection of your users' personal information. With Drupal Accelerator, we've got you covered. Our pre-installed GDPR cookie module addresses the requirements set forth by the General Data Protection Regulation (GDPR) and the EU Directive on Privacy and Electronic Communications.

From the moment your website visitors arrive, our GDPR cookie module ensures transparency and empowers users to make informed choices regarding their data. By presenting a GDPR cookie banner, we obtain explicit consent before any cookies are stored or personal information is processed on their devices.

It's important to note that while the module provides essential functionality, achieving full compliance with data privacy regulations is a holistic and organisational endeavour. Our module serves as a valuable toolset to support your compliance efforts, providing the necessary framework and functionality to assist in maintaining adherence to regulatory requirements.
These features include:

  • GDPR checklist dashboard checks that the site provides cookie consent, a privacy policy page, correct data consent opt-ins, etc.
  • Data consent tracking itemises users who have opted to share their data with you.
  • Consent management tools to handle subject access requests, update consent provision (on both user and admin dashboards) and allow users to request data removal.
  • Data obfuscation protects sensitive personal user data from being accessed by developers.

Drupal Accelerator Accessibility - WCAG 2.1 AA Compliance

We take accessibility seriously. 

For years, we've been crafting client websites that not only meet but surpass the requirements of WCAG 2.1 accessibility regulations. 

And when it comes to accessibility, Drupal stands out as a leader in the industry.

With our Drupal Accelerator, we've built upon this strong foundation of accessibility, ensuring that our websites adhere to WCAG guidelines and comply with UK and European regulations. 

We don't stop there. 

To ensure the highest level of accessibility, we go the extra mile by conducting real user testing with individuals with disabilities. 

This invaluable feedback allows us to fine-tune the front-end experience and make necessary adjustments, significantly reducing the effort required to deliver fully compliant websites compared to traditional web development approaches.

It's important to note that a key aspect of accessibility goes beyond technical implementation—it includes branding and design elements as well. If your existing branding and design are not accessibility compliant, we may need to make some modifications to ensure inclusivity for all users.

Accessibility is not just a checkbox for us. 

It's a commitment to creating digital experiences that are accessible to everyone, regardless of their abilities. 

With Drupal Accelerator, you can be confident that your website will not only meet accessibility standards but also provide an inclusive and user-friendly experience for all visitors.

Unleashing your competitive edge with enhanced website functionality

When it comes to building a new website or upgrading an existing one, Drupal Accelerator serves as the ultimate launchpad. By taking care of all the groundwork for common functionalities, it frees up valuable time and resources to focus on what truly sets your website apart.

Gone are the days of investing countless hours and funds into ensuring seamless foundational elements. 

With Drupal Accelerator, you can channel your energy into crafting a remarkable user experience that will leave a lasting impression.

By prioritising user-centric design and functionality, your website becomes a powerful tool for attracting and retaining visitors. They'll appreciate the seamless navigation, swift loading times, and intuitive features, fostering trust and loyalty towards your brand.

Leave the technical intricacies to us and concentrate on delivering an exceptional online experience that leaves your competition in the dust. 

Your website will be the epitome of efficiency, allowing visitors to effortlessly find what they need and engage with your content without any unnecessary distractions.

Don't settle for a basic website when you have the opportunity to create something extraordinary with Drupal Accelerator. 

Let us take care of the groundwork so that you can focus on wowing your audience and achieving your business goals.  

Learn more about the possibilities with Drupal and discover the remarkable advancements in Drupal that improve the lives of content editors.

Looking to revolutionise your website upgrade? Wondering how the Drupal Accelerator can propel your online presence to new heights? Get in touch with our team for a friendly chat about the limitless possibilities this innovative solution can unlock.

May 05 2023
May 05

Randall Quesada Angulo

Backend Engineer

Randall is an engineer and a graduate of the University of Costa Rica.

May 5, 2023

Maybe you are interested in getting involved in the Drupal world, but you’re a little intimidated by the technical complexity of the platform. Don’t worry!

Drupal is a fantastic platform to build scalable websites, but keep in mind that sometimes Drupal can be an indomitable horse that we will tame over time, so don’t get too wrapped up in it.

Drupal is an open-source content management system (CMS). You can install a lot of modules (or plugins, if you use another CMS like WordPress) to increase the core functionalities and adapt your site to your needs.

Why Drupal?

Some of the great qualities of Drupal are its stability, commerce distribution, security, SEO friendliness, multilanguage capabilities, and responsiveness, among others.

Requirements

  • Lando
  • PHP 8
    • Mac: brew install php (via Homebrew)
    • Linux: apt install php
  • Composer
  • NVM
  • Docker
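
As a rough sketch, on macOS the toolchain above could be installed with Homebrew (package and cask names as published at the time of writing; Linux users can substitute apt equivalents):

brew install php composer nvm
brew install --cask docker lando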

Composer

As Drupal’s documentation mentions, “Composer is a tool for dependency management in PHP. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you. Drupal uses Composer to manage the various libraries which it depends on. Modules can also use Composer to include third-party libraries. Drupal site builds can use Composer to manage the various modules that make up the site.”

The official Composer documentation and Drupal’s own Composer guides are useful references here.

Drupal Core

You may have seen the term “Drupal Core,” but what is that? Drupal Core is the set of components and features that make up Drupal itself, including core modules and core themes. It’s Drupal in its most basic form, though you can also find distributions, which are packages that bundle Drupal Core with contributed modules.

Drupal distributions

A Drupal distribution is a set of preconfigured modules and templates designed to quickly build websites with complex functionality.

There are some distributions such as:

  • Sous: A starter project for Drupal with a generated theme based on the Emulsify Design System. This distribution can be very useful for anyone who wants to create a project with a completely custom theme while using all the advantages of Emulsify.
  • Varbase
  • Panopoly
  • Presto!
  • Thunder
  • 1,400+ distributions

There are many distributions out there to explore.

Contributed modules

Contributed modules are custom modules that contributors to the Drupal community create for us to make our work easier. Since Drupal is an open-source CMS, the community is involved in creating new modules, fixing bugs in those modules, and adding new functionality. So if you find a bug in a module you are using, report it and create a patch, or see if someone has already fixed the problem for you.
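
For example, if a fix exists as a patch on a drupal.org issue but hasn’t been released yet, the cweagans/composer-patches plugin can apply it at install time. A sketch — the module name, issue title, and patch URL below are hypothetical:

{
  "require": {
    "cweagans/composer-patches": "^1.7"
  },
  "extra": {
    "patches": {
      "drupal/example_module": {
        "Issue #1234567: Fix the example bug": "https://www.drupal.org/files/issues/example-fix.patch"
      }
    }
  }
}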

Let’s create your first Drupal page in your local environment. Here are the steps:

  1. Go to the Drupal 10 release page. Note: We are going to create a Drupal 10 page. You can select past versions, but Drupal 10 is the latest version.
  2. Create a directory in your local environment where you want to put your page.
  3. Copy the code you find on the release page (step 1). Example:
    composer create-project drupal/recommended-project:10.0.0 "drupal10"
  4. Enter the created directory: cd drupal10/
  5. Now you have to use Lando to start your Drupal site with Docker:
    1. Run lando init and answer the prompts: use the current directory as the source, choose the drupal10 recipe, set the webroot to web, and give the app a name (e.g., Drupal 10).
    2. Run lando start.
  6. Select your site URL.
  7. Now your Drupal site is ready.
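
Put together, the whole sequence might look like this in a terminal. The directory name is the one chosen above, and the lando init flags are simply the non-interactive equivalent of answering the prompts (assuming a Lando version that ships the drupal10 recipe):

composer create-project drupal/recommended-project:10.0.0 drupal10
cd drupal10/
lando init --source cwd --recipe drupal10 --webroot web --name drupal10
lando start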

How can you install a new feature (module) on your Drupal site?

You can go to the Module project. There you can find all the modules created by the community — you can filter by version or you can search by keywords.

For example:

1. Go to the Admin Toolbar module page. Note: admin_toolbar is a module that allows us to move more easily through all Drupal features without having to enter a page, since the toolbar gives us direct access to configuration, content, and more.

2. At the root of your project, run the Composer command, checking first that the module has a release compatible with Drupal 10: lando composer require 'drupal/admin_toolbar:^3.3'


3. You have to use drush to enable the module: lando drush en [module_machine_name]. Example: lando drush en admin_toolbar. Note: If you want to see what drush commands exist, check out all the commands.

4. Now your module is enabled. Sometimes you have to clear the cache to see the changes on your site, and you have to use a drush command for that: lando drush cr.

Drupal web hosting

But where should you publish your site? There are some free and paid options to consider. The free options are a bit limited; however, trying and exploring the platforms can be very enriching.

If I had to pick from the options mentioned in the link above, it would be Acquia and Platform.sh. They are very easy to manage, they are intuitive, and they have interfaces that are easy to explore. Both have a launcher that you install in your computer’s terminal to execute drush commands against whichever environment you want.

Thank you very much for visiting the blog. Be sure to browse our other content, where we discuss other development issues, UX, UI design, product strategy, and more.

If you have any questions, suggestions, or ideas about your Drupal 10 project, you can let us know by sending a message in the contact box below.


Mar 30 2023
Mar 30

Drupal has come a long way since its inception as a content management system (CMS) in 2001. Over the years, Drupal has continued to evolve and improve, positioning itself as a top choice for organisations looking to build a dynamic and engaging online presence. 

One of the most significant changes in Drupal's evolution has been its focus on becoming more user-friendly for content editors. In this blog, we'll explore some of the biggest changes that have come from Drupal repositioning itself to be more user-focused.


Improved User Interface

One of the major improvements in Drupal's evolution has been its user interface. Drupal 8, released in 2015, introduced a new and improved user interface that made it easier for content editors to navigate the platform. The new interface was designed to be more intuitive, with a cleaner layout and more streamlined workflows. Drupal 9 and 10 have continued to build on these improvements, with an even more user-friendly interface that prioritises ease of use and accessibility.

Streamlined Content Creation

Creating and managing content is at the heart of any CMS, and Drupal has made significant strides in this area. With the introduction of Drupal 8, content creation was streamlined with the introduction of in-place editing and a new WYSIWYG (what you see is what you get) editor. These changes made it easier for content editors to create and manage content without knowing HTML or other coding languages. Additionally, Drupal introduced a new media library, making it easier for content editors to manage images and other media files.

Enhanced Accessibility

Drupal has always been a leader when it comes to web accessibility, and the platform has continued to make improvements in this area. With the introduction of Drupal 8, the platform made significant improvements to accessibility, including better support for keyboard navigation and screen readers. Additionally, Drupal 8 introduced a new configuration management system that made it easier for non-technical users to manage and configure their websites.

Better SEO Capabilities

Search engine optimisation (SEO) is an essential aspect of any website, and Drupal has significantly improved in this area. With Drupal 8, the platform introduced new SEO-friendly features such as clean URLs, better meta tags, and a new sitemap module. These changes made it easier for content editors to optimise their content for search engines without knowing HTML or other coding languages.

Enhanced Security

Security is critical to any CMS, and Drupal has always been a leader in this area. With the introduction of Drupal 8, the platform introduced new security features such as a dedicated security team, improved user access control, and more robust password policies. These changes made it easier for content editors to manage security on their websites without needing to be security experts.

A Top Choice

Since Drupal 8, the focus has shifted away from primarily serving what developers want and now considers the needs of website managers and content editors. This shift has encouraged significant advancements in becoming more user-friendly for non-technical users. 

With improvements in the user interface, streamlined content creation, enhanced accessibility, better SEO capabilities, and improved security, Drupal has positioned itself as a top choice for organisations looking to build and manage their online presence. 

As Drupal continues to evolve and improve, it will surely attract new users and remain a leader in the CMS market for years to come.

Want to take advantage of Drupal’s ability to create powerful and complex websites? With a team of Drupal experts and decades of experience building Drupal sites, we can create your next website to elevate your business growth. Contact our team today to discuss your needs.

Feb 01 2023
Feb 01

The release of Drupal 10 has been highly anticipated by the Drupal community, and it was finally launched in December 2022. This latest version of the content management system brings several new features and functional improvements that will make content creation and management easier while also improving SEO and driving conversions.

In this blog, we'll highlight the key benefits of Drupal 10 for marketers and website managers.


Improved Text Editor


Image Source: CKEditor.com - https://ckeditor.com/blog/drupal-and-ckeditor-taking-content-editing-to-the-next-level/image01.png

Drupal 10 features an upgraded WYSIWYG text editor that moves from CKEditor 4 to CKEditor 5, offering a lighter and fresher look with improved icons and toolbar items. This new text editor is designed to make life easier for content editors.

Sleek Backend Admin Theme


Image source: Drupal.org Claro - https://www.drupal.org/files/claro_node_add.png

Drupal 10 includes the Claro backend admin theme, offering a significant upgrade in user experience and making Drupal look modern. For those looking for even more advanced features, there is also a sub-theme called Gin available.

Layout Builder


The Layout Builder module allows for customisation of page layouts using a drag-and-drop interface. You can customise a single page or create a template for specific content types.

Improved Media Management

Drupal 10 introduces an overhauled media management system, making it easier to upload, manage, and reuse media files. The media library makes it easier to find and use assets.

Ease of Use


Image source: DriesNote DrupalCon Global

A UX survey conducted at DrupalCon Amsterdam in 2019 showed that while beginners found Drupal difficult to use, more advanced users had a positive experience. As a result, DrupalCon 2020 focused on improving the user experience for new users. The layout builder, Claro admin theme, and media management system have been bundled together for a more user-friendly experience, and are enabled by default in Drupal 10.

New Content Moderation System

Drupal 10 includes a new content moderation system that makes managing content easier, allowing you to create and manage moderation workflows.

Improved Performance

Drupal 10 features performance enhancements, including a switch to a new database driver that is said to improve performance by up to 20%.

Enhanced Security

Drupal 10 includes security improvements, including a security report to identify potential vulnerabilities, better password hashing, and a setting to prevent clickjacking attacks.

Drupal 10 Migration

If you're setting up Drupal 10 for the first time, congratulations! For those upgrading from an earlier version, here's a quick guide to help you through the process.

Upgrading from Drupal 7:

  • Full site migration to Drupal 9 or 10 is required.
  • Use the Upgrade Status Module to check for compatible releases, then the Migrate module suite to migrate content and configuration manually.
  • Consider migrating straight to Drupal 10 if your updated site launch is not imminent; otherwise, go for Drupal 9.

Upgrading from Drupal 8:

  • Drupal 8 reached end-of-life on 2 November 2021, so there's no direct upgrade path to Drupal 10.
  • Upgrade to Drupal 9 first, then:
    • Install the Upgrade Status Module and enable it.
    • Scan modules for Drupal 9 compatibility and update as needed.
    • Update Drupal core to Drupal 9.

Upgrading from Drupal 9:

Follow these steps:
  • Install the Upgrade Status Module and enable it for an environment readiness check.
  • Follow the upgrade instructions and update modules as needed, using Drupal Rector to fix most code incompatibilities.
  • Update Drupal Core to Drupal 10.
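
As a rough sketch, that final core update might look like the following on a Composer-based site with drush installed (exact version constraints vary by project):

composer require --dev drupal/upgrade_status
composer require drupal/core-recommended:^10 drupal/core-composer-scaffold:^10 --update-with-all-dependencies
vendor/bin/drush updatedb
vendor/bin/drush cache:rebuild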

If you're looking to upgrade your website to Drupal 10, contact our team of dedicated Drupal developers for expert support.

Dec 14 2022
Dec 14

Michael Lutz

Senior Engineer

Primarily responsible for maintaining the Drupal core migration system, Michael often spends long nights and weekends working through the Drupal project issues queue, solving problems, and writing code.

December 14, 2022

Back in 2020, Drupal delivered a surprise by estimating a June 2022 release for Drupal 10. While the release was ultimately pushed back to December 14, 2022, you need to know where your website stands for the upcoming upgrade.

For any IT team, changes to a site platform are cause for concern. With less than a year before Drupal 9 hits end-of-life, you need to start planning your preparations for the coming year.

Thankfully, Drupal has remained true to its word about its latest updates avoiding the complex migrations that were required moving from Drupal 7 (but I’ll touch more on that later). Still, the overall impact of Drupal 10 ultimately depends on the condition of your current site.

Platform updates are always cause for uncertainty, and your preparations will vary to navigate a move to Drupal 10. If you start by taking into account where your current site stands, you can best ensure it’s on steady ground for the benefits that lie ahead.

Advantages of upgrading to Drupal 10

The benefits of moving your site to Drupal 10 follow a familiar path. Drupal’s development team doesn’t pack major updates with flashy new features, unlike traditional hardware and software development. Instead, the community continues to refresh the latest version of Drupal with brand new tools.

The arrival of Drupal 10 will clear the system of old, backward-compatible code so the platform runs more efficiently. That way, as work begins to create new tools for version 10, Drupal developers are starting from a clean slate.

The promise of a clean codebase may sound a bit anticlimactic from the perspective of your users. But for developers, it’s an addition by subtraction. Drupal 10 will run much faster than your current platform by losing the clutter required to support out-of-date features.

What can you expect from the next version of Drupal?

Many of the features included with Drupal 10 have already been in use at various points in Drupal 9’s development. Here are a few benefits planned for Drupal’s new release:

  • CKEditor 5: Drupal 9 features version 4 of the open-source JavaScript text editor, which will be deprecated in 2023. This new version is already in use and features a similar-enough interface to feel familiar, along with performance and security enhancements.
  • Updated frontend and admin themes: These features have been available in Drupal 9 but will become the default themes. In addition to offering improved capabilities for migrating a site into Drupal, the new administration theme is more intuitive with better spacing and readability.
  • New package manager: Though potentially unavailable until version 10.1, this feature enables admin users to install modules through the UI. Instead of requiring a developer to FTP modules to a server, you can install them directly from a menu in a way that resembles WordPress extensions.

More good news: Drupal 10 will last longer than 9

One of the third-party technical dependencies of Drupal is its PHP framework, Symfony. Symfony runs on two-year release cycles, which introduces the potential for Drupal to do the same. Drupal 9 uses Symfony 4, which was at the tail end of its development when Drupal 9 was launched. Consequently, as Symfony fell out-of-date in less than two years, so did Drupal 9.

These dependencies were a big part of why Drupal 9 had such a short lifespan as compared with the platform’s history. At one time, versions of Drupal required five to seven years of development.

Drupal’s development team is releasing Drupal 10 on Symfony 6, which was released earlier in 2022. Drupal 10 will last at least four years before the next major version is released. By working to get ahead of schedule with Symfony, Drupal aims to deliver a platform that’s faster and more stable — with staying power.

Will upgrading to Drupal 10 be easy?

It depends.

Drupal 9 will reach its end-of-life sooner than may be ideal, but you face an easier upgrade path to Drupal 10 if your site is currently running version 9.4 or 9.5. Just as with the upgrade from version 8 to 9, updates to Drupal 10 will run “in place.” Rather than needing to migrate to a new platform to upgrade, Drupal 10 is being built inside Drupal 9.

You won’t have to rebuild your site to upgrade to Drupal 10 if you’re up-to-date with its latest version. However, not every organization can keep its website current with every platform release. As with any journey, the road to Drupal 10 entirely depends on where you are now.

If your site is running Drupal 9:

Much like the shift from Drupal 8 to Drupal 9, moving to Drupal 10 can be seamless with the right planning. You need to monitor custom code in any platform update, and Drupal Rector streamlines the process. The module identifies your areas of need, and in many cases will update your code automatically.

You still need an engineer to oversee the upgrade, but Drupal Rector eliminates the tedium of manually updating a bunch of APIs beforehand. As changes are made to Drupal 10, developers are required to add an automated rule to Rector. Consequently, your future upgrades will be even easier.
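
A minimal sketch of a Rector run, following the palantirnet/drupal-rector README (the web/modules/custom path assumes a standard Composer project layout):

composer require --dev palantirnet/drupal-rector
cp vendor/palantirnet/drupal-rector/rector.php .
vendor/bin/rector process web/modules/custom --dry-run

Dropping --dry-run applies the suggested changes in place once you are happy with the report.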

Once Drupal 10 is released, you have until November 1, 2023 to complete the upgrade before Drupal 9 reaches its end-of-life. At that point, your site will no longer receive security updates from the Drupal community.

If your site is running Drupal 8:

Drupal 8 reached its end-of-life in November 2021, which means your site may be at risk without the community’s support with security patches and bug fixes. To offset that danger, you should use Drupal Rector to identify deprecated code in your Drupal 8 site to automate a portion of your upgrade journey to Drupal 9.

Fortunately, the move from 8 to 9 is an easier transition than you may think. Once your site is up-to-date to version 9.4, then the jump to Drupal 10 should be fairly straightforward upon its release.

If your site is running Drupal 7:

If you’re still on Drupal 7 (or older), your platform is currently scheduled to reach its end-of-life in November 2023. While this date has been extended several times over the past few years, there is no guarantee it will be extended again. However, you’re not alone. According to estimates, more sites are on Drupal 7 than there are on 8 and 9 combined.

Migrating your site from Drupal 7 is a complicated, labor-intensive undertaking, which is why the community extended the platform’s support during the pandemic. However, once Drupal 7 reaches its end-of-life next year, you’ll only be able to receive security updates through Vendor Extended Support. Those organizations remain an option to provide service for your site until 2025 — for a price.

To reduce support expenses, you should start working toward migrating your site to Drupal 9.4 or 9.5 as soon as possible rather than waiting for the latest version. Drupal 10 will include migration tools from Drupal 7, but Drupal 9 already includes many of the modules you use. That may no longer be the case after Drupal 10 comes out.

Future-proof your site with an upgrade to Drupal 10

Whether you’re facing a migration from Drupal 7 or the end-of-life for Drupal 9, platform updates require planning to succeed. There is no sense in waiting to get started. If anything, upgrading to Drupal 10 from a much older version may grow more complex the longer you delay.

The days of launching a website and ignoring it for five or 10 years are over. The industry just moves too fast. Fortunately, with the right plan, your organization can get the platform you need to take on whatever lies ahead.


Nov 21 2022
Nov 21

Saybra O’Brien

Director of Administration

Saybra O’Brien is a visual artist living in Houston, Texas. She’s an expert at making popcorn on the stove-top and walking stubborn basset hounds.

November 21, 2022

Building web design and development teams happens in two ways, and the approach is a little like managing a sports franchise. You can hire free agents and apply their top-level skills to foster a winning environment. Or, you can find and nurture your own budding talent.

Both paths are critical to any successful organization. But there’s a special excitement that comes from a team producing a homegrown star. For one, the individual expands their skills by learning from the best people in your organization. But just as importantly, you also expand the playing field to include an all-star talent who may have otherwise gone unnoticed.

This is exactly what happened when we partnered with DrupalEasy, a Drupal training and consulting organization, to sponsor a fellowship program. The result: an industry newcomer became a Drupal developer and earned a full-time role at Four Kitchens.

As longtime advocates of open-source software and the importance of sharing knowledge, we were grateful for another chance to give back to the Drupal community.

How an online fellowship delivered an opportunity for internal investment

DrupalEasy fosters new development talent through a comprehensive 12-week program called Drupal Career Online (DCO). The continuing education course is certified by the Drupal community, but stands apart from similar bootcamp-style programs by offering one-on-one instruction focused on individual learning.

We partnered with DCO to provide one applicant a full scholarship. Our sole requirement: They had to commit to joining our team as a full-time associate developer after successfully completing the program.

Website development needs to extend its reach to thrive

The internet is full of gloom-and-doom warnings that Drupal is dying. While we respectfully disagree, the relatively flat numbers detailing Drupal core usage raise a compelling point. Drupal as a platform continues to evolve in terms of features and functionality. But the community needs new perspectives to thrive.

Diverse communities are a powerful force, and true diversity isn’t about checking boxes; rather, it’s about introducing new skills, backgrounds, and lived experience to the world of website development. We wanted to create space for someone new to get their start in this industry.

The DCO program doesn’t require applicants to have any prior engineering experience to enroll in the program. But rather than helping prospective attendees brush up on their existing skills, our goal was to ensure our fellowship provided an opportunity that expanded the reach of our industry. We wanted to provide a scholarship to someone who was totally new to Drupal.

Each sponsoring organization set fellowship requirements that targeted applicants who are underrepresented in the tech community. Overall, website design and development has a diversity problem. When you look at “team” pages for small Drupal agencies, you typically see a roster of middle-aged white men.

That said, our applicants had to meet specific background criteria. Our fellowship candidate needed to be a woman, a person of color, or simply anyone who didn’t enjoy the same opportunities as many others in tech. That way, the Drupal community could expand in new directions.

Successful fellowships depend on nurturing the right applicant

The fellowship process featured healthy competition, not only from a wide range of applicants, but also other sponsors. We weren’t the only organization offering a DCO scholarship, which meant that we were competing against other companies to create the most attractive offer for candidates.

DCO funneled the applications that expressed interest in Four Kitchens for our review. We met with candidates using a mix of our usual hiring interviews as well as questions that were specific to the fellowship.

We chose Brandon, and we couldn’t be more thrilled.

But even after deciding on a fellow, we didn’t want to wait and see how his DCO program progressed. DrupalEasy kept us informed, but we also assigned Brandon a Four Kitchens mentor to check in with him weekly. If Brandon had questions about something in his coursework or wanted to see it applied in real time, he had someone on our team to ask. As director of administration, I also regularly checked in to see how he was doing.

Then, a few weeks before the end of the DCO course, we asked Brandon to complete a skills assessment, which is our next step before hiring any candidate. By conducting the interview early, we could identify any areas for Brandon to highlight before the course was over so he could be successful once he started at Four Kitchens.

“[The assessment] was definitely nerve-wracking,” Brandon recalls. “But I felt like I had so much support — that kind of eased me. I was pretty confident with the skills I learned.”

Expanding junior-level recruitment by closing the experience gap

At Four Kitchens, we’ve established processes like a mentorship program to ensure junior-level developers are set up for success. The days of lone-wolf developers coding into the night to learn new skills are over. In this spirit, our fellowship program experience through DCO was also a shared success.

One of the most common issues encountered at the start of your career is resolving the paradox of experience. You can’t find the right job without agency experience, but you can’t get any agency experience if one won’t hire you. This fellowship revealed a sweet spot for recruitment that combined looking at a person’s skill level with formal instruction and hands-on experience.

With all of those requirements resolved, Brandon started at Four Kitchens ready to take on client work. However, client services are tricky, and a new hire isn’t necessarily ready to tackle the most complex issues, no matter how well they’re trained. When you’re looking for the next stop in your career, you have to ensure it’s structured to nurture your talent after you’re hired, too.

The end of a fellowship doesn’t mark the end of our investment in a new team member. Our core value to “Always Improve” demands that we have enough workload to dedicate to hiring an associate developer. Plus, we have to ensure their teammates also have the bandwidth to provide support as a new hire takes on client work.

Being the newest member of a team can be intimidating when you’re just starting out. But joining a team that has your back streamlines the process. Just ask Brandon.

“I really feel like the fellowship prepared me, and I’m using those skills on a daily basis. Then having this continued support has been amazing,” he says. “I was really set up to thrive.”


May 27 2022
May 27

Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

May 27, 2022

Have you ever found yourself needing to share custom dependencies across several sites? Maybe even for the same client? There are several methods of traversing this workflow, especially if you work in the Drupal ecosystem. Upstreams, distributions, and multi-sites are all approaches worth considering. However, the fun lies in the challenge of determining how to scale an architecture.

Create a custom packagist

The ingredients for creating a custom packagist, a repository of dependencies used by Composer, are surprisingly easy to come by. Keep in mind that a private packagist can be obtained through a hosted service at packagist.com. In our case, we already had the tooling readily available, so we decided to go the custom packagist route.

The goal of this article is to give you some ideas on how to host a solid packagist for a team, organization, or client while describing how the Four Kitchens team came up with a fun and creative solution to provide this functionality using the tools our client had on hand. I hope to accomplish this by:

  • Sharing our motivation behind choosing this solution
  • Identifying the ingredients needed to cook up the workflow
  • Explaining baseline hosting, and elaborating on what you could do if so inclined
  • Laying out how we set up automation around the workflow to make our lives easier

Let’s begin.

Motivation

On one client project, we found that we had enough private custom dependencies we were sharing with a private distribution that we needed to scale beyond editing the individual composer.json repositories listing for each site. If we were using an upstream setup, this could be accomplished using Composer Merge Plugin. In this case, however, it made sense to create a custom packagist. Keep in mind, if we didn’t do this, each of our composer.jsons would have had 11 custom packages and 11 VCS entries in the repositories section of our composer.json. That would need to grow with each additional dependency we added to our distribution. We currently maintain 20 sites on this distribution. Our policy is to have code review for every change to a site. So making changes to 21 repos (the distribution and all the downstream sites) was a development time suck.

If you are here, you probably know the answer to the question, “Why can’t Composer load repositories recursively?” but if you don’t, check out this great explanation. In short, the repositories section of a composer.json cannot inherit that section from a dependency’s composer.json. So it’s up to the individual projects to make sure they have the right packages when it comes to those custom packages that our distribution requires.

We might have been able to reduce our custom dependencies by relying on another hosted packagist such as asset-packagist.org, or working to make other dependencies publicly available. However, providing our own packagist maintained specifically for the client’s needs brought us performance gains over the other solutions and allows us to more closely vet our frontend library dependencies. It allows us to make a single “repositories” entry at the packagist level, and that gets pulled down by all of our sites that are pointing at it. This means less code editing on a per-site basis.

So here we are, using an easily maintained solution, and reaping the benefits of performance, scalability, and increased developer productivity, while keeping our client’s ecosystem private. We didn’t even need that much to get started!

Ingredients

Things you will need to get started:

  • Satis, a rudimentary static packagist generator written in PHP using Composer as a library.
  • A repository to house the custom dependencies you want to put in your packagist. Think: all the items you currently have in your repositories section of your composer.json. This isn’t strictly a “must,” but it makes automation possible.
  • A place to host static HTML and JSON files. Anything web accessible. HTTPS is preferred, but curl can work under other protocols. You can get pretty creative here.
    • Cheap hosting service
    • Spare Droplet, Linode, or AWS EC2 instance
    • S3 bucket
    • GitHub
    • FTP
  • Something to build the packagist on dependency update like:
    • GitHub Actions
    • CircleCI
    • Travis
    • Cron
    • A manual implementation like running a command via SSH

Our implementation looks like this:

  • Repository: GitHub
  • Hosting: S3 bucket
  • Builder: CircleCI

These were all resources we were already using. I’ll go into the specifics on how our build works with some suggestions on alternatives.

It’s pretty simple to set up Satis. There is some decent documentation on Satis at GetComposer.org and on GitHub. What I say here may be a bit of a repeat, but I’m describing an opinionated setup intended to allow for testing and committing changes. This is a necessity when multiple developers are touching the same packagist and you need accountability.

Before I dive into the specifics of our setup, I want to mention that if you feel you don’t need this level of control, testing, and revision history, Satis can be set up as a living stand-alone app. You can host it as either a docker container or on a hosting platform. In both of these options, developers would live edit and maintain the packagist via command line by default. You can, however, install a graphical frontend using something like Satisfy.

To set up Satis like Four Kitchens has, follow the steps below. Code samples are below if you need examples of how it might look.

  1. Create a new repository.
  2. Initialize a new composer project using composer init.
  3. Require Satis: composer require composer/satis.
  4. Add a script to your composer.json to run the full build command.
  5. Add a packages directory to the project with a .gitkeep file. mkdir packages && touch packages/.gitkeep.
  6. Add a .gitignore to ignore the vendor folder and generated package files.
  7. Consider setting up a Lando instance to serve your packagist for testing.
  8. Create satis.json just like you normally would a standard composer.json with the repositories section containing all your packages, repos, and packagists you want available to the projects consuming it.
  9. Add "require-all": true below the repositories section of satis.json. There’s more about usage of require-all versus require in the Satis setup documentation. Use what fits your needs, but if you are adding individual packages instead of entire packagists to your satis.json, require-all is likely all you need.

Your repo could look something like this:

composer.json

{
  "name": "mycompany/packages",
  "require": {
    "composer/satis": "^1.0"
  },
  "scripts": {
    "build": "./vendor/bin/satis build satis.json packages"
  }
}

satis.json

{
  "name": "mycompany/packages",
  "homepage": "https://packages.mycompany.com",
  "repositories": [
    { "type": "vcs", "url": "https://github.com/mycompany/privaterepo" },
    { "type": "vcs", "url": "http://svn.example.org/private/repo" },
    { "type": "package", "package": [
      { "name": "dropzone/dropzone", "version": "5.9.2", "dist": { "url": "https://github.com/dropzone/dropzone/releases/download/v5.9.2/dist.zip", "type": "zip" }}
    ]}
  ],
  "require-all": true
}

.lando.yml

name: mycompany-packages
recipe: lemp
config:
  webroot: packages
  composer_version: 2
  php: '7.4'

.gitignore

vendor
packages/*
!packages/.gitkeep

From here, run lando start && composer install && composer build. Now, go to another project and add your test packagist to that project. Additionally, add "secure-http": false to the config section, since Lando’s HTTPS certificate is insecure by default. Lastly, require one of the packages you added to satis.json above.

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "http://mycompany-packages.lndo.site"
    }
    ...
  ],
  "require": {
    "dropzone/dropzone": "^5.9.2"
  },
  ...
  "config": {
    ...
    "secure-http": false
  }
  ...
}

At this point you should be greeted with a successfully built project and have a local instance of your packagist going. When you are done testing, stop Lando and switch your repository entry in the other project to your packagist’s public URL. Commit all your changes and push up!

Your next step is getting your packagist off your local and out where everyone can use it.

Now you can simply copy the files in your packages folder and put them somewhere web accessible. I really want to drive this point home. The entirety of your packagist is simply the contents of that folder and nothing else. The things that make this so complicated are the processes around automating and updating this packagist.

You could, for example, now take these files you created and host them anywhere someone can curl to. This means HTTP, FTP, and SFTP are available to you, to name a few. If you aren’t worried about privacy, you can even go so far as placing these in the webroot or even the sites/default/files folder in your company’s Drupal site. This is a good option if you are strapped for domain names or running a small operation. You would then make sure to copy those files any time someone makes a change to any of the packages that are a part of your packagist.
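
For instance, if the “hosting” is just a directory on a web server you already control, publishing could be a one-liner (host and path are hypothetical):

rsync -av --delete packages/ deploy@www.example.com:/var/www/html/packages/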

If that’s all you are looking for, you can stop here. You’ve done it! You now have a custom packagist and the rest of the workflow may not matter to you. However, if you want some more ideas and want to build out a more robust automated development workflow, keep reading. The ideas get interesting from here.

If you wanted to be creative, you could probably remove the line from .gitignore that excludes the packages folder, commit it, and set your packagist URL to something like https://raw.githubusercontent.com/mycompany/packages/main/ and set up an Accept and Authorization header in your packagist. You can see an example on how to use headers in your packagist at GetComposer.org and below with our S3 example.

In fact, the composer.json setup described for the creative GitHub example is really similar to what we did, except we used a workaround recommended by AWS for restricting access to a specific HTTP referer. Our client wanted the extra security so that not just anybody could go and poke around at the packages and versions we had available.

In our example, we created a normal bucket and assigned a CNAME to it with a nice domain name. The CNAME is optional but makes it more “official” and allows us to move the packagist later without disrupting the developer workflow too much. We then added a policy to only accept connections from calls with a referer that is our secret key. A referer doesn’t have to be a website; in our case it’s a lengthy hash that would be difficult to guess. This too is optional, but if you are looking for that extra level of security, it’s a good option to consider. Note that you should not add spaces between the colon and the token when using this policy.
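
A minimal sketch of that bucket policy, following AWS’s documented referer-restriction pattern (the bucket name and token below are placeholders; verify against AWS’s current documentation):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetWithSecretReferer",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::packages.mycompany.com/*",
      "Condition": {
        "StringLike": { "aws:Referer": "long-random-hash" }
      }
    }
  ]
}

Our repositories entry in our projects looks like: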

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.mycompany.com",
      "options": {
        "http": {
          "header": [
            "Referer:"
          ]
        }
      }
    }
    ...
  ],
  ...
}

And that’s it. We copy the files up to the bucket using AWS CLI, and it’s published.
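
The copy itself is a one-liner, the same command our CI job runs below:

aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/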

Now we need to automate the workflow and get what’s in our hosting location to update automatically.

Building and automation

I’ve pointed out that, if you are willing, you can put Satis somewhere, generate the packagist files, upload them somewhere web accessible, and be ready to roll. This isn’t so different from static site generators like Jekyll or Hugo. However, we add in CI for automation, and revision control for accountability, so that we can take the “error” out of human command crunching. It’s worth mentioning again that this is super important when you have entire teams modifying this packagist.

In our example, I’m using CircleCI. You can do the same with GitHub Actions, Jenkins, or even run on a cron job, provided you are okay with a time-based cadence. You might even do more than one of these. Our CircleCI job looks like this:

.circleci/config.yml

version: 2.1
orbs:
  php: circleci/[email protected]
  aws-cli: circleci/[email protected]
parameters:
  run_dependency_update:
    default: true
    type: boolean
jobs:
  create_packagist:
    executor:
      name: php/default
      tag: '7.4.24'
    steps:
      - checkout
      - aws-cli/setup
      - php/install-composer
      - php/install-packages
      - run:
          name: Set Github authentication
          command: composer config --global github-oauth.github.com "$GITHUB_TOKEN";
      - run:
          name: Link auth for satis
          command: mkdir ~/.composer; ln -s ~/.config/composer/auth.json ~/.composer/auth.json
      - run:
          name: Build packagist json files
          command: composer build
      - store_artifacts:
          path: packages
      - run:
          name: Copy packagist to aws
          command: aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/
workflows:
  version: 2
  packagist:
    when: << pipeline.parameters.run_dependency_update >>
    jobs:
      - create_packagist:
          filters:
            branches:
              only:
                - main
                - master

There’s a lot to unpack here. I’m using pipeline parameters because a requirement for me is to be able to call this job when another project updates. This functionality allows me to trigger this CircleCI job using an API call. I also use CircleCI orbs to make grabbing AWS CLI and getting a PHP environment easy.

The meat of the job is the same as what you were doing during testing: running the build command we put in our composer.json. Since some of our repositories are private, we also have to make sure that Composer has access by creating a GitHub token. Then we copy everything to the bucket using AWS CLI. In our case, we have some behind-the-scenes environment variables defining our keys: AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION, and AWS_SECRET_ACCESS_KEY.

From another project’s perspective, I’m still using CircleCI to run the API call. You can do this really easily in other CI environments, too.

version: 2.1
jobs:
  update-packagist:
    docker:
      - image: cimg/base:2021.12
    steps:
      - run: >
          curl --request POST
          --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline
          --header "Circle-Token: $CIRCLE_TOKEN"
          --header "content-type: application/json"
          --data '{"parameters":{"run_dependency_update":true}}'
workflows:
  build:
    jobs:
      - update-packagist

That’s it. I add this job to every project that’s a VCS entry in our satis.json (provided I have access) and let it go to town. If you find yourself with other dependencies out of your control, consider adding a cron job somewhere or a scheduled pipeline trigger. You are done!
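
If you go the cron route, a single crontab entry can poke the same pipeline API on a schedule. A sketch, assuming CIRCLE_TOKEN is defined in the crontab environment:

# Rebuild the packagist at 02:00 daily via the CircleCI pipelines API.
0 2 * * * curl -s --request POST --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline --header "Circle-Token: $CIRCLE_TOKEN" --header "content-type: application/json" --data '{"parameters":{"run_dependency_update":true}}'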

Final thoughts

This workflow can be as easy or as difficult as you want to make it given a few factors like:

  • How often it will change
  • How many people touch it
  • How up-to-date it needs to be

There are a lot of ideas here, with lots of knowledge representing several different application architectures for organizations that have multiple projects or sites. If you don’t want to bother with the home-brewed solution, dish out the cash and get a private Packagist. The cost may be worth it.

However, if you are already using all the necessary services and have a team of knowledgeable individuals like ours, consider maintaining your own packagist that you can host anywhere. You may find it a productive, performant, and most of all joyful and exciting experience that will bring value to your upstream, distribution, or multi-site setup.


Nov 18 2020
Nov 18

Jim Vomero

Senior Engineer

As a tech lead, Jim works with clients through the full project cycle, translating their business requirements into actionable development work and working with them to find technical solutions to their challenges.

November 18, 2020

From the consumer perspective, there’s never been a better time to build a website. User-friendly website platforms like Squarespace allow amateur developers to bypass complex code and apply well-designed user interfaces to their digital projects. Modern site-building tools aren’t just easy to use — they’re actually fun.

For anyone who has managed a Drupal website, you know the same can’t be said for your platform of choice. While rich with possibilities, the default editorial interface for Drupal feels technical, confusing, and even restrictive to users without a developer background. Consequently, designers and developers too often build a beautiful website while overlooking its backend CMS.

Drupal’s open-ended capabilities constitute a competitive advantage when it comes to developing an elegant, customer-facing website. But a lack of attention to the needs of those who maintain your website content contributes to a perception that Drupal is a developer-focused platform. By building a backend interface just as focused on your site editors as the frontend, you create a more empowering environment for internal teams. In the process, your website performs that much better as a whole.

UX principles matter for backend design as much as the frontend

Given Drupal’s inherent flexibilities, there are as many variations of CMS interfaces as there are websites on the platform. That uniqueness is part of what makes Drupal such a powerful tool, but it also constitutes a weakness.

The editorial workflow for every website is different, which opens an inevitable training gap in translating your site’s capabilities to your editorial team. Plus, despite Drupal’s open-source strengths, you’ll likely need to reinvent the wheel when designing CMS improvements specific to your organization.

For IT managers, this is a daunting situation because the broad possibilities of Drupal are often overwhelming. If you try to make changes to your interface, you can be frustrated when a seemingly easy fix requires 50 hours of development work. Too often, Drupal users will wind up working with an inefficient and confusing CMS because they’re afraid of the complexity that comes with building out a new interface.

Fortunately, redesigning your CMS doesn’t have to be a demanding undertaking. With the right expertise, you can develop custom user interfaces with little to no coding required. Personalized content dashboards and defined roles and permissions for each user go a long way toward creating a more intuitive experience.

Improving your backend design is often seen as an additional effort, but think of it as a baseline requirement. And, by sharing our user stories within the Drupal community, we also build a path toward improving the platform for the future.

Admin themes are a great starting point

Drupal’s default admin theme as of Drupal 9.4 is Claro, and it’s a good starting point for admin user experience customization. Claro was developed to address the concerns that came out of the Drupal Admin UX Study, which examined the difficulties content editors encountered with the platform.

Here at Four Kitchens, we use the Gin theme, which is based on Claro but includes extra enhancements. A number of useful modules are also available to tie add-ons together with Gin, like Gin Toolbar and Gin Layout Builder.

For our own usage (and yours, too!), we have compiled the Gin theme and some handy modules and configuration into a starter project we call Sous. Sous also incorporates an Emulsify-based frontend theme and other goodies.

This standardization is used across nearly all of our builds. As a result, our development is more efficient. Claro — and by extension, Gin — also includes some work on accessibility within the admin interface, which provides a more inclusive experience.

Additionally, both Claro and Gin incorporate responsive layouts, so if an editor needs to make changes on a phone or a tablet, they can. If you’re a long-time Drupal user, you will remember how impossible that used to be.

Use Drupal’s Views module to customize user dashboards

One of the biggest issues with Drupal’s out-of-the-box editorial tools is that they don’t reflect the way any organization actually uses the CMS. Just as UX designers look to provide a positive experience for first-time visitors to your site, your team should aim for delivering a similarly strong first impression for those managing its content.

By default, Drupal takes users to their profile pages upon login, which is useful to… almost no one. Plus, the platform’s existing terminology uses cryptic terms such as “node,” “taxonomy,” and “paragraphs” to describe various content items. From the beginning, you should remove these abstract references from your CMS. Your editorial users shouldn’t have to understand how the site is built to own its content.
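
As a minimal sketch of that first fix (the module name, role, and dashboard path here are hypothetical), a small hook can send editors to a dashboard instead of their profile page:

<?php

/**
 * Implements hook_user_login().
 */
function mymodule_user_login($account) {
  $request = \Drupal::request();
  // Leave any explicit ?destination= intact; otherwise route editors
  // to a content dashboard instead of their profile page.
  if ($account->hasRole('editor') && !$request->query->has('destination')) {
    $request->query->set('destination', '/admin/content/dashboard');
  }
}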

In the backend, every Drupal site has a content overview page, which shows the building blocks of your site. Offering a full list that includes cryptic timestamps and author details, this page constitutes a floodgate of information. Designing an effective CMS is as much an exercise in subtraction as addition. Whether your user’s role involves reviewing site metrics or new content, their first interaction with your CMS should display what they use most often.

If one population of users is most interested in the last item they modified, you can transform their login screen to a custom dashboard to display those items. If another group of users works exclusively with SEO, you can create an interface that displays reports and other common tasks. Using Drupal’s Views module, dashboards like these are possible with a few clicks and minimal coding.

By tailoring your CMS to specific user habits, you allow your website teams to find what they need and get to work faster. The most dangerous approach to backend design is to try and build one interface to rule them all.

Listen to your users and ease frustrations with a CMS that works

Through Drupal Views, you can modify lists of content and various actions to control how they display in your CMS. While Views provides many options to create custom interfaces, your users themselves are your organization’s most vital resource. By watching how people work on your site, you can recognize areas where your CMS is falling short.

Drupal content dashboard

Even if you’ve developed tools that aimed to satisfy specific use cases, you might be surprised by the way your tools are used. Through user experience testing, you’ll often find the workarounds your site editors have developed to manage the site.

In one recent example, site editors needed to link to a site page within the CMS. Without that functionality, they had to open the target page in another tab and copy its URL, or view the source code to find its node ID number. Anyone watching these users would find their process cumbersome, time-consuming, and frustrating. Fortunately, there’s a Drupal module called Linkit that we implemented to easily eliminate this needless effort.

There are many useful modules in the Drupal ecosystem that can enhance the out-of-the-box editorial experience. Entity Clone expedites the content creation process. Views Bulk Operations and Bulk Edit simplify routine content update tasks. Computed Field and Automatic Entity Label take the guesswork out of derived or dependent content values. Using custom form modes and Field Groups can help bring order to and streamline the content creation forms.

Most of the time, your developers don’t know what solutions teams have developed to overcome an ineffective editorial interface. And, for fear of the complexity required to create a solution, these supposed shortcuts too often go unresolved. Your backend users may not even be aware their efforts could be automated or otherwise streamlined. As a result, even the most beautiful, user-friendly website is bogged down by a poorly designed CMS.

Once these solutions are implemented, however, you and your users enjoy a shared win. And, through sharing your efforts with the Drupal community, you and your team build a more user-friendly future for the platform as well.


Sep 23 2020
Sep 23

Michael Lutz

Senior Engineer

Primarily responsible for maintaining the Drupal core migration system, Michael often spends long nights and weekends working through the Drupal project issues queue, solving problems, and writing code.

September 23, 2020

Working in digital design and development, you grow accustomed to the rapid pace of technology. For example: After much anticipation, the latest version of Drupal was released this summer. Just months later, the next major version is in progress.

At July’s all-virtual DrupalCon Global, the open-source digital experience conference, platform founder Dries Buytaert announced Drupal 10 is aiming for a June 2022 release. Assuming those plans hold, Drupal 9 would have the shortest release lifetime of any recent major version.

For IT managers, platform changes generate stress and uncertainty. Considering the time-intensive migration process from Drupal 7 to 8, updating your organization’s website can be costly and complicated. Consequently, despite a longtime absence of new features, Drupal 7 still powers more websites than Drupal 8 and 9 combined. And, as technology marches on, the end of its life as a supported platform is approaching.

Fortunately, whatever version your website is running, Drupal is not running away from you. Drupal’s users and site builders may be accustomed to expending significant resources to update their website platform, but the plan for more frequent major releases alleviates the stress of the typical upgrade. And, for those whose websites are still on Drupal 7, Drupal 10 will continue offering a way forward.

The news that Drupal 10 is coming sooner rather than later might have been unexpected, but you still have no reason to panic just yet. However, your organization shouldn’t stand still, either.

Drupal 10 is coming. Image via dri.es.

The end for Drupal 7 is still coming, but future upgrades will be easier

Considering upgrading to Drupal 8 involves the investment of building a new site and migrating its content, it’s no wonder so many organizations have been slow to update their platform. Drupal 7 is solid and has existed for nearly 10 years. And, fortunately, it’s not reaching its end of life just yet.

At the time of Drupal 9’s release, Drupal 7’s planned end of life was set to arrive late next year. This meant the community would no longer release security advisories or bug fixes for that version of the platform. Affected organizations would need to contact third-party vendors for their support needs. With the COVID-19 pandemic upending businesses and their budgets, the platform’s lifespan has been extended to November 28, 2022.

Drupal’s development team has retained its internal migration system through versions 8 and 9, and it remains part of the plan for the upcoming Drupal 10 as well. And the community continues to maintain and improve the system in an effort to make the transition easier. If your organization is still on Drupal 7 now, you can use the migration system to jump directly to version 9, or version 10 upon its release. Drupal has no plans to eliminate that system until Drupal 7 usage numbers drop significantly.

Once Drupal 10 is ready for release, Drupal 7 will finally reach its end of life. However, paid vendors will still offer support options that will allow your organization to maintain a secure website until you’re ready for an upgrade. But make a plan for that migration sooner rather than later. The longer you wait for this migration, the more new platform features you’ll have to integrate into your rebuilt website.

Initiatives for Drupal 10 focus on faster updates, third-party software

In delivering his opening keynote for DrupalCon Global, Dries Buytaert outlined five strategic goals for the next iteration of the platform. Like the work for Drupal 9 that began within the Drupal 8 platform, development of Drupal 10 has begun under the hood of version 9.

A Drupal 10 Readiness initiative focuses on upgrading third-party components that count as technological dependencies. One crucial component is Symfony, the PHP framework Drupal is based upon. Symfony operates on a major release schedule every two years, which requires that Drupal also be updated to stay current. The transition from Symfony 2 to Symfony 3 created challenges for core developers in creating the 8.4 release, which introduced changes that impacted many parts of Drupal’s software.

To avoid a repeat of those difficulties, it was determined that the breaking changes involved in a new Symfony major release warranted a new Drupal major release as well. While Drupal 9 is on Symfony 4, the Drupal team hopes to launch version 10 on Symfony 6, which is a considerable technical challenge for the platform’s team of contributors. However, once complete, this initiative will extend the lifespan of Drupal 10 to as long as three or four years.

Other announced initiatives included greater ease of use through more out-of-the-box features, a new front-end theme, creating a decoupled menu component written in JavaScript, and, in accordance with its most requested feature, automated security updates that will make it as easy as possible to upgrade from 9 to 10 when the time comes. For those already on Drupal 9, these are some of the new features to anticipate in versions 9.1 through 9.4.

Less time between Drupal versions means an easier upgrade path

The shift from Drupal 8 to this summer’s release of Drupal 9 was close to five years in the making. Fortunately for website managers, that update was a far cry from the full migration required from version 7. While there are challenges, such as ensuring your custom code is updated to use the most recent APIs, the transition was doable with a good tech team at your side.

Still, the work that update required could generate a little anxiety given how comparatively fast another upgrade will arrive. But the shorter time frame will make the move to Drupal 10 easier for everybody. Less time between updates also translates to less deprecated code, especially if you’re already using version 9. But if you’re not there yet, the time to make a plan is now.


Apr 23 2020
Apr 23

Todd Ross Nienkerk

CEO, Owner, and Co‑Founder

Todd is responsible for driving Four Kitchens’ vision and long-term strategy.

April 23, 2020

We’ve been making big websites for 14 years, and almost all of them have been built on Drupal. It’s no exaggeration to say that Four Kitchens owes its success to the incredible opportunities Drupal has provided us. There has never been anything like Drupal and the community it has fostered—and there may never be anything like it ever again.

That’s why it’s crucial we do everything we can to support the Drupal Association. Especially now.

The impacts of COVID-19 have been felt everywhere, especially at the Association. With the cancellation of DrupalCon Minneapolis, the Drupal Association lost a major source of annual fundraising. Without the revenue from DrupalCon, the Association would not be able to continue its mission to support the Drupal project, the community, and its growth.

The Drupal community’s response to this crisis was tremendous. For our part, we proudly joined 27 other organizations in pledging our sponsorship fees to the Association regardless of whether, or how, DrupalCon happened. I ensured my Individual Membership was still active, and I made a personal contribution.

But we need to do more.

You can help by joining us in the #DrupalCares campaign.

The #DrupalCares campaign is a fundraiser to protect the Drupal Association from the financial impact of COVID-19. Your support will help keep the Drupal Association strong and able to continue accelerating the Drupal project.

The Drupal Association

The outpouring of support has been… Inspiring. First, project founder Dries Buytaert and his partner Vanessa Buytaert pledged their generous support of $100,000. Then, a coalition of Drupal businesses pledged even more matching contributions. We are proud to count ourselves among the dozens of participating Drupal businesses.

Any individual donations, increased memberships, or new memberships through the end of April will be tripled by these matching pledges, up to $100,000, for a total of $300,000.

Please join us in supporting the Drupal Association. Your contribution will help ensure the continued success of the Association and the Drupal community for years to come.

Give to #DrupalCares through April to help the Association receive a 3:1 matching contribution. 


Jan 30 2020
Jan 30

Recently, we were asked if we could integrate some small, one-page websites into an existing Drupal website. This would not only make it easier to manage those different websites and their content, but also reduce the hosting and maintenance costs.

In this article, I will discuss how we tackled this problem, improved the content management experience and implemented this using the best Drupal practices.

First, some background information: America’s Promise is an organization that launches many national campaigns that focus on improving the lives and futures of America’s youth. Besides their main website (www.americaspromise.org), they also had separate websites and domain names for some of these campaigns, e.g. www.everyschoolhealthy.org.

We came up with a Drupal solution where they could easily configure and manage their campaigns. Along with the convenience of managing all these campaigns from one admin panel, they could also easily reference content items from their main website or other campaigns by tagging the content with specific taxonomy terms (keywords).

We created a new content type, “Campaign,” with many custom paragraph types as the building blocks for creating a new campaign. We wanted this to be as easy as possible for the content editors, but also to give enough freedom that every campaign can have its own branding by selecting a font color and a background image, color, or video.

Below are some of the paragraph types we created:

  • Hero
  • Column Layout
  • WYSIWYG
  • Latest News
  • Newsletter Signup
  • Twitter Feed
  • Video Popup
  • Community Partners
  • Latest Resources
  • Grantee Spotlight
  • Statistics Map
  • Partner Spotlight
  • Media Mentions

These paragraphs offer lots of flexibility to create unique and interactive campaigns. By drag and drop, these paragraphs can be ordered however you’d like.

Below is a screenshot of some of these paragraph types in action, and how easily they can be configured on the backend.

Every School Healthy paragraphs

Below you can see what the “Hero” paragraph looks like in the admin panel. The editor enters a tagline, chooses a font color, uploads a logo, an optional background image or video, and a background overlay color with optional opacity.

Campaign Builder Hero Backend

As you can see in the above screenshot, this is a very basic paragraph type, but it shows the flexibility in customizing the building blocks for the campaign. We also created more complex paragraph types that required quite a bit of custom development.

One of the more complicated paragraph types we created is a statistics map. America’s Promise uses national and state statistics to educate and strengthen its campaign causes.

Campaign Builder Statistics Map

The data for this map comes from a Google Sheet. All necessary settings can be configured in the backend system. Users can then view these state statistics by hovering over the map or see even more details by clicking on an individual state.

Campaign Builder Statistics Map Backend

Some other interesting paragraph types we created are:

  • Twitter Feed, where the editors can specify a certain #hashtag and the tweets will display in a nice masonry layout
  • Newsletter Signup, editors can select what newsletter campaign the user signs up for
  • Latest News/Resources, editors can select the taxonomy term they want to use to filter the content on

Time to dive into some of the more technical approaches we took. The campaign builder we developed for America’s Promise depends on several Drupal contrib modules:

  • paragraphs
  • bg_image_formatter
  • color_field
  • video
  • masonry (used for the Twitter Feed)

Font color and background image/color/video don’t need any custom code; those can be accomplished using the above modules and configuring the correct CSS selectors on the paragraph display:

Campaign Builder Hero Display

In our custom campaign builder module, we have several custom Entities, Controllers, Services, Forms, REST resources and many twig template files. Still, the module mainly consists of custom field formatters and custom theme functions.

Example: the “Latest News” paragraph only has one field where the editor can select a taxonomy term. With a custom field formatter, we will display this field as a rendered view instead. We pass the selected term as an argument to the Latest News view, execute the view and display it with a custom #theme function.
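
As a rough sketch of that formatter (the module and view names are hypothetical, and this uses core’s view render element rather than the custom #theme function described above):

<?php

namespace Drupal\campaign_builder\Plugin\Field\FieldFormatter;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\FormatterBase;

/**
 * Renders a selected taxonomy term as the "Latest News" view.
 *
 * @FieldFormatter(
 *   id = "campaign_latest_news",
 *   label = @Translation("Latest News view"),
 *   field_types = {"entity_reference"}
 * )
 */
class LatestNewsFormatter extends FormatterBase {

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];
    foreach ($items as $delta => $item) {
      // Pass the selected term ID as a contextual argument to the view.
      $elements[$delta] = [
        '#type' => 'view',
        '#name' => 'latest_news',
        '#display_id' => 'default',
        '#arguments' => [$item->target_id],
      ];
    }
    return $elements;
  }

}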

Conclusion

By leveraging the strength of paragraphs, other contrib modules, and some custom code, we were able to create a reusable and intuitive campaign builder, where ease of content management was a priority without limiting the design or branding of each campaign.

Several campaigns built with our campaign builder are currently live, including www.everyschoolhealthy.org.

Could your organization benefit from having your own custom campaign builder and want to see more? Contact us for a demo.

Jan 23 2020
Jan 23

Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

January 23, 2020

In the Drupal support world, working on Drupal 7 sites is a necessity. But switching between Drupal 7 and Drupal 8 development can be jarring, if only for the coding style.

Fortunately, I’ve got a solution that makes working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development:

  • Apply Xautoload to keep your PHP skills fresh, modern, and compatible with all frameworks and make your code more reusable and maintainable between projects.
  • Use the Drupal Libraries API to use third-party libraries.
  • Use the Composer template to push the boundaries of your programming design patterns.

Applying Xautoload

Xautoload is simply a module that enables PSR-0/4 autoloading. Using Xautoload is as simple as downloading and enabling it. You can then start using use and namespace statements to write object-oriented programming (OOP) code.

For example:

xautoload_example.info

name = Xautoload Example
description = Example of using Xautoload to build a page
core = 7.x
package = Midcamp Fun

dependencies[] = xautoload:xautoload

xautoload_example.module

<?php

use Drupal\xautoload_example\SimpleObject;

/**
 * Implements hook_menu().
 */
function xautoload_example_menu() {
  $items['xautoload_example'] = array(
    'page callback' => 'xautoload_example_page_render',
    'access callback' => TRUE,
  );
  return $items;
}

function xautoload_example_page_render() {
  $obj = new SimpleObject();
  return $obj->render();
}

src/SimpleObject.php

 "

Hello World

", ); } }

Enabling and running this code causes the URL /xautoload_example to spit out “Hello World”.
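
To try it out, enable both modules and clear caches with the standard Drupal 7-era Drush commands:

drush en -y xautoload xautoload_example
drush cc all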

You’re now ready to add in your own OOP!

Using third-party libraries

Natively, Drupal 7 has a hard time autoloading third-party library files. But there are contributed modules (like Guzzle) out there that wrap third-party libraries. These modules wrap object-oriented libraries to provide a functional interface. Now that you have Xautoload in your repertoire, you can use its functionality to autoload libraries as well.

I’m going to show you how to use the Drupal Libraries API module with Xautoload to load a third-party library. You can find examples of all the different ways you can add a library in xautoload.api.php. I’ll demonstrate an easy example by using the php-loremipsum library:

1. Download your library and store it in sites/all/libraries. I named the folder php-loremipsum.

2. Add a function implementing hook_libraries_info to your module by pulling in the namespace from Composer. This way, you don’t need to set up all the namespace rules that the library might contain.

function xautoload_example_libraries_info() {
  return array(
    'php-loremipsum' => array(
      'name' => 'PHP Lorem Ipsum',
      'xautoload' => function ($adapter) {
        $adapter->composerJson('composer.json');
      }
    )
  );
}

3. Change the page render function to use the php-loremipsum library to build content.

use joshtronic\LoremIpsum;
function xautoload_example_page_render() {
  $library = libraries_load('php-loremipsum');
  if ($library['loaded'] === FALSE) {
    throw new \Exception("php-loremipsum didn't load!");
  }
  $lipsum = new LoremIpsum();
  return array(
    '#markup' => $lipsum->paragraph('p'),
  );
}

Note that I needed to tell the Libraries API to load the library, but I then have access to all the namespaces within the library. Keep in mind that the dependencies of some libraries are immense. You’ll very likely need to run Composer from within the library and commit the result when you first start out. In such cases, you might need to make sure to include the Composer autoload.php file.

Another tip: Abstract your libraries_load() functionality out in such a way that if the class you want already exists, you don’t call libraries_load() again. Doing so removes libraries as a hard dependency from your module and enables you to use Composer to load the library later on with no more work on your part. For example:

function xautoload_example_load_library() {
  if (!class_exists('\joshtronic\LoremIpsum', TRUE)) {
    if (!module_exists('libraries')) {
      throw new \Exception('Include php-loremipsum via composer or enable libraries.');
    }
    $library = libraries_load('php-loremipsum');
    if ($library['loaded'] === FALSE) {
      throw new \Exception("php-loremipsum didn't load!");
    }
  }
}

And with that, you’ve conquered the challenge of using third-party libraries!

Setting up a new site with Composer

Speaking of Composer, you can use it to simplify the setup of a new Drupal 7 site. Just follow the instructions in the Readme for the Composer Template for Drupal Project. From the command line, run the following:

composer create-project drupal-composer/drupal-project:7.x-dev --no-interaction

This code gives you a basic site with a source repository (a repo that doesn’t commit contributed modules and libraries) to push up to your Git provider. (Note that migrating an existing site to Composer involves a few additional considerations and steps, so I won’t get into that now.)

If you’re generating a Pantheon site, check out the Pantheon-specific Drupal 7 Composer project. But wait: The instructions there advise you to use Terminus to create your site, and that approach attempts to do everything for you—including setting up the actual site. Instead, you can simply use composer create-project to test your site in something like Lando. Make sure to run composer install if you copy down a repo.

From there, you need to enable the Composer Autoload module, which is automatically required in the composer.json you pulled in earlier. Then, add all your modules to the require portion of the file or use composer require drupal/module_name just as you would in Drupal 8.

You now have full access to all the Packagist libraries and can use them in your modules. To use the previous example, you could remove php-loremipsum from sites/all/libraries, and instead run composer require joshtronic/php-loremipsum. The code would then run the same as before.
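
Assuming the library currently lives in sites/all/libraries, the swap looks like this:

# Remove the hand-installed copy of the library...
rm -rf sites/all/libraries/php-loremipsum

# ...and let Composer manage it from Packagist instead.
composer require joshtronic/php-loremipsum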

From here on out, it’s up to your imagination. Code and implement with ease, using OOP design patterns and reusable code. You just might find that this new world of possibilities for integrating new technologies with your existing Drupal 7 sites increases your productivity as well.

Making the web a better place to teach, learn, and advocate starts here...

When you subscribe to our newsletter!

Sep 28 2018
Sep 28

Pairing Composer template for Drupal Projects with Lando gives you a fully working Drupal environment with barely any setup.

Lando is an open-source, cross-platform local development environment. It uses Docker to build containers for well-known frameworks and services written in simple recipes. If you haven’t started using Lando for your local development, we highly recommend it. It is easier, faster, and relatively pain-free compared to MAMP, WAMP, VirtualBox VMs, Vagrant, or building your own Docker infrastructure.

Prerequisites

You’ll need to have Composer and Lando installed.

Setting up Composer Template Drupal Project

If you want to find details about what you are getting when you install the drupal-project, you can view the repo. Otherwise, if you’d rather simply set up a Drupal template site, run the following command.

composer create-project drupal-composer/drupal-project:8.x-dev [your-project] --stability dev --no-interaction

Once that is done running, cd into the newly created directory. You’ll find that you now have more than a basic Drupal installation.

Getting the site set up on Lando

Next, run lando init, which prompts you with 3 simple questions:

? What recipe do you want to use? > drupal8
? Where is your webroot relative to the init destination? > web
? What do you want to call this app? > [your-project]

Once that is done provisioning, run lando start, which downloads and spins up the necessary containers, providing you with a set of URLs that you can use to visit your site:

https://localhost:32807
http://localhost:32808
http://[your-project].lndo.site:8000
https://[your-project].lndo.site

Set up Drupal

Visit any of the URLs to initialize the Drupal installation flow. Run lando info to get the database details:

Database: drupal8
Username: drupal8
Password: drupal8
Host: database

Working with your new Site

One of the useful benefits of using Lando is that your toolchain does not need to be installed on your local machine; it can be installed in the Docker container that Lando uses. That means you can use commands provided by Lando without having to install other packages. The commands that come with Lando include lando drush, lando drupal, and lando composer. Execute these commands in your command prompt as usual, though they'll execute from within the container.
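
For example (the module here is just an illustration):

# Composer runs inside the appserver container, not on your machine.
lando composer require drupal/admin_toolbar

# So does Drush.
lando drush en -y admin_toolbar
lando drush cr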

Once you commit your .lando.yml file, others can use the same Lando configuration on their machines. Having this shared configuration makes it easy to set up local environments that are configured identically.

Aug 27 2018
Aug 27

This post is part 5 in the series “Hashing out a Docker workflow”. I have resurrected this series from over a year ago, but if you want to check out the previous posts, you can find the first post here. The beginning of this blog series pre-dates Docker Machine, Docker for Mac, and Docker for Windows, but the Docker concepts still apply; we just aren’t using Vagrant any more. Instead, check out the Docker Toolbox. There isn’t a need to use Vagrant any longer.

We are going to take the Drupal image that I created in my last post, “Creating a deployable Docker image with Jenkins”, and deploy it. You can find the image that we created on Docker Hub; that is where we pushed it last time. You have several options for deploying Docker images to production, whether manually, with a service like AWS ECS, or OpenShift, etc. Today, I’m going to walk you through a deployment process using Kubernetes, also known simply as k8s.

Why use Kubernetes?

There is an abundance of options out there to deploy Docker containers to the cloud easily. Most of the options provide a nice UI with a form wizard that will take you through deploying your containers. So why use k8s? The biggest advantage in my opinion is that Kubernetes is agnostic of the cloud that you are deploying on. This means if/when you decide you no longer want to host your application on AWS, or whatever cloud you happen to be on, and instead want to move to Google Cloud or Azure, you can pick up your entire cluster configuration and move it very easily to another cloud provider.

Obviously there is the trade-off of needing to learn yet another technology (Kubernetes) to get your app deployed, but you also won’t have the vendor lock-in when it is time to move your application to a different cloud. Some of the other benefits worth mentioning about k8s are the large community, all the add-ons, and the ability to have all of your cluster/deployment configuration in code. I don't want to turn this post into the benefits of Kubernetes over others, so let’s jump into some hands-on work and start setting things up.

Set up a local cluster

Instead of spinning up servers in a cloud provider and paying for the cost of those servers while we explore k8s, we are going to set up a cluster locally and configure Kubernetes without paying a dime out of our pocket. Setting up a local cluster is super simple with a tool called Minikube. Head over to the Kubernetes website and get that installed. Once you have Minikube installed, boot it up by typing minikube start. You should see something similar to what is shown below:

$ minikube start
Starting local Kubernetes v1.10.0 cluster...
Starting VM...
Downloading Minikube ISO
 160.27 MB / 160.27 MB [============================================] 100.00% 0s
Getting VM IP address...
Moving files into cluster...
Downloading kubeadm v1.10.0
Downloading kubelet v1.10.0
Finished Downloading kubelet v1.10.0
Finished Downloading kubeadm v1.10.0
Setting up certs...
Connecting to cluster...
Setting up kubeconfig...
Starting cluster components...
Kubectl is now configured to use the cluster.
Loading cached images from config file.

This command set up a virtual machine on your computer, likely using VirtualBox. If you want to double check, pop open the VirtualBox UI to see a new VM created there. This virtual machine has loaded on it all the necessary components to run a Kubernetes cluster. In k8s speak, each virtual machine is called a node. If you want to log in to the node to explore a bit, type minikube ssh. Below, I have ssh'd into the machine and run docker ps. You’ll notice that this VM has quite a few Docker containers running to make this cluster.

 $ minikube ssh
                         _             _
            _         _ ( )           ( )
  ___ ___  (_)  ___  (_)| |/')  _   _ | |_      __
/' _ ` _ `\| |/' _ `\| || , <  ( ) ( )| '_`\  /'__`\
| ( ) ( ) || || ( ) || || |\`\ | (_) || |_) )(  ___/
(_) (_) (_)(_)(_) (_)(_)(_) (_)`\___/'(_,__/'`\____)

$ docker ps
CONTAINER ID        IMAGE                                      COMMAND                  CREATED             STATUS              PORTS               NAMES
aa766ccc69e2        k8s.gcr.io/k8s-dns-sidecar-amd64           "/sidecar --v=2 --lo…"   5 minutes ago       Up 5 minutes                            k8s_sidecar_kube-dns-86f4d74b45-kb2tz_kube-system_3a21f134-a637-11e8-894d-0800273ca679_0
6dc978b31b0d        k8s.gcr.io/k8s-dns-dnsmasq-nanny-amd64     "/dnsmasq-nanny -v=2…"   5 minutes ago       Up 5 minutes                            k8s_dnsmasq_kube-dns-86f4d74b45-kb2tz_kube-system_3a21f134-a637-11e8-894d-0800273ca679_0
0c08805e8068        k8s.gcr.io/kubernetes-dashboard-amd64      "/dashboard --insecu…"   5 minutes ago       Up 5 minutes                            k8s_kubernetes-dashboard_kubernetes-dashboard-5498ccf677-hvt4f_kube-system_3abef591-a637-11e8-894d-0800273ca679_0
f5d725b1c96a        gcr.io/k8s-minikube/storage-provisioner    "/storage-provisioner"   6 minutes ago       Up 6 minutes                            k8s_storage-provisioner_storage-provisioner_kube-system_3acd2f39-a637-11e8-894d-0800273ca679_0
3bab9f953f14        k8s.gcr.io/k8s-dns-kube-dns-amd64          "/kube-dns --domain=…"   6 minutes ago       Up 6 minutes                            k8s_kubedns_kube-dns-86f4d74b45-kb2tz_kube-system_3a21f134-a637-11e8-894d-0800273ca679_0
9b8306dbaab7        k8s.gcr.io/kube-proxy-amd64                "/usr/local/bin/kube…"   6 minutes ago       Up 6 minutes                            k8s_kube-proxy_kube-proxy-dwhn6_kube-system_3a0fa9b2-a637-11e8-894d-0800273ca679_0
5446ddd71cf5        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 7 minutes ago       Up 7 minutes                            k8s_POD_storage-provisioner_kube-system_3acd2f39-a637-11e8-894d-0800273ca679_0
17907c340c66        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 7 minutes ago       Up 7 minutes                            k8s_POD_kubernetes-dashboard-5498ccf677-hvt4f_kube-system_3abef591-a637-11e8-894d-0800273ca679_0
71ed3f405944        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 7 minutes ago       Up 7 minutes                            k8s_POD_kube-dns-86f4d74b45-kb2tz_kube-system_3a21f134-a637-11e8-894d-0800273ca679_0
daf1cac5a9a5        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 7 minutes ago       Up 7 minutes                            k8s_POD_kube-proxy-dwhn6_kube-system_3a0fa9b2-a637-11e8-894d-0800273ca679_0
9d00a680eac4        k8s.gcr.io/kube-scheduler-amd64            "kube-scheduler --ad…"   7 minutes ago       Up 7 minutes                            k8s_kube-scheduler_kube-scheduler-minikube_kube-system_31cf0ccbee286239d451edb6fb511513_0
4d545d0f4298        k8s.gcr.io/kube-apiserver-amd64            "kube-apiserver --ad…"   7 minutes ago       Up 7 minutes                            k8s_kube-apiserver_kube-apiserver-minikube_kube-system_2057c3a47cba59c001b9ca29375936fb_0
66589606f12d        k8s.gcr.io/kube-controller-manager-amd64   "kube-controller-man…"   8 minutes ago       Up 8 minutes                            k8s_kube-controller-manager_kube-controller-manager-minikube_kube-system_ee3fd35687a14a83a0373a2bd98be6c5_0
1054b57bf3bf        k8s.gcr.io/etcd-amd64                      "etcd --data-dir=/da…"   8 minutes ago       Up 8 minutes                            k8s_etcd_etcd-minikube_kube-system_a5f05205ed5e6b681272a52d0c8d887b_0
bb5a121078e8        k8s.gcr.io/kube-addon-manager              "/opt/kube-addons.sh"    9 minutes ago       Up 9 minutes                            k8s_kube-addon-manager_kube-addon-manager-minikube_kube-system_3afaf06535cc3b85be93c31632b765da_0
04e262a1f675        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 9 minutes ago       Up 9 minutes                            k8s_POD_kube-apiserver-minikube_kube-system_2057c3a47cba59c001b9ca29375936fb_0
25a86a334555        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 9 minutes ago       Up 9 minutes                            k8s_POD_kube-scheduler-minikube_kube-system_31cf0ccbee286239d451edb6fb511513_0
e1f0bd797091        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 9 minutes ago       Up 9 minutes                            k8s_POD_kube-controller-manager-minikube_kube-system_ee3fd35687a14a83a0373a2bd98be6c5_0
0db163f8c68d        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 9 minutes ago       Up 9 minutes                            k8s_POD_etcd-minikube_kube-system_a5f05205ed5e6b681272a52d0c8d887b_0
4badf1309a58        k8s.gcr.io/pause-amd64:3.1                 "/pause"                 9 minutes ago       Up 9 minutes                            k8s_POD_kube-addon-manager-minikube_kube-system_3afaf06535cc3b85be93c31632b765da_0

When you’re done snooping around inside the node, log out of the session by typing Ctrl+D. This should take you back to a session on your local machine.

Interacting with the cluster

Kubernetes is managed via a REST API; however, you will find yourself interacting with the cluster mainly through a CLI tool called kubectl. With kubectl, we issue commands, and the tool generates the necessary Create, Read, Update, and Delete requests for us and executes those requests against the API. It’s time to install the CLI tool; go check out the docs here to install it on your OS.

Once you have the command line tool installed, it should be automatically configured to interface with the cluster that you just set up with Minikube. To verify, run kubectl get nodes to see all of the nodes in the cluster.

$ kubectl get nodes
NAME       STATUS    ROLES     AGE       VERSION
minikube   Ready     master    6m        v1.10.0

We have one node in the cluster! Let’s deploy our app using the Docker image that we created last time.

Writing Config Files

With the kubectl CLI tool, you can define all of your Kubernetes objects directly, but I like to create config files that I can commit to a repository and manage changes in as we expand the cluster. For this deployment, I’ll take you through creating 3 different k8s objects. We will explicitly create a Deployment object, which will implicitly create a Pod object, and we will create a Service object. For details on what these 3 objects are, check out the Kubernetes docs.

In a nutshell, a Pod is a wrapper around a Docker container, and a Service is a way to expose a Pod, or several Pods, on a specific port to the outside world. Pods are only accessible inside the Kubernetes cluster; the only way to access any services in a Pod is to expose the Pod with a Service. A Deployment is an object that manages Pods and ensures that they are healthy and up. If you configure a Deployment to have 2 replicas, then the Deployment will ensure 2 Pods are always up, and if one crashes, Kubernetes will spin up another Pod to match the Deployment definition.

deployment.yml

Head over to the API reference and grab the example config file: https://v1-10.docs.kubernetes.io/docs/reference/generated/kubernetes-api/v1.10/#deployment-v1-apps. We will modify the config file from the docs to fit our needs. Change the template to look like below (I changed the image, app, and name properties in the yml below):

apiVersion: apps/v1beta1
kind: Deployment
metadata:
  name: deployment-example
spec:
  replicas: 3
  template:
    metadata:
      labels:
        app: drupal
    spec:
      containers:
      - name: drupal
        image: tomfriedhof/docker_blog_post

Now it’s time to feed that config file to the Kubernetes API; we will use the CLI tool for this:

$ kubectl create -f deployment.yml

You can check the status of that deployment by asking k8s for all Pod and Deployment objects:

$ kubectl get deploy,po

Once everything is up and running, you should see something like this:

 $ kubectl get deploy,po
NAME                        DESIRED   CURRENT   UP-TO-DATE   AVAILABLE   AGE
deploy/deployment-example   3         3         3            3           3m

NAME                                    READY     STATUS    RESTARTS   AGE
po/deployment-example-fc5d69475-dfkx2   1/1       Running   0          3m
po/deployment-example-fc5d69475-t5w2j   1/1       Running   0          3m
po/deployment-example-fc5d69475-xw9m6   1/1       Running   0          3m

service.yml

We have no way of accessing any of those Pods in the deployment. We need to expose the Pods using a Kubernetes Service. To do this, grab the example file from the docs again and change it to the following: https://v1-10.docs.kubernetes.io/docs/reference/generated/kubernetes-api/v1.10/#service-v1-core

kind: Service
apiVersion: v1
metadata:
  name: service-example
spec:
  ports:
    - name: http
      port: 80
      targetPort: 80
  selector:
    app: drupal
  type: LoadBalancer

Create this service object using the CLI tool again:

$ kubectl create -f service.yml

You can now ask Kubernetes to show you all 3 objects that you created by typing the following:

$ kubectl get deploy,po,svc
NAME                        DESIRED   CURRENT   UP-TO-DATE   AVAILABLE   AGE
deploy/deployment-example   3         3         3            3           7m

NAME                                    READY     STATUS    RESTARTS   AGE
po/deployment-example-fc5d69475-dfkx2   1/1       Running   0          7m
po/deployment-example-fc5d69475-t5w2j   1/1       Running   0          7m
po/deployment-example-fc5d69475-xw9m6   1/1       Running   0          7m

NAME                  TYPE           CLUSTER-IP      EXTERNAL-IP   PORT(S)        AGE
svc/kubernetes        ClusterIP      10.96.0.1       <none>        443/TCP        1h
svc/service-example   LoadBalancer   10.96.176.233   <pending>     80:31337/TCP   13s

You can see under the services at the bottom that port 31337 was mapped to port 80 on the Pods. Now if we hit any node in the cluster (in our case, it's just the one VM) on port 31337, we should see the Drupal app that we built from the Docker image we created in the last post. Since we are using Minikube, there is a command to open a browser on the specific port of a service, minikube service <service-name>:

$ minikube service service-example

This should open up a browser window and you should see the Installation screen for Drupal. You have successfully deployed the Docker image that we created to a production-like environment.
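
To see the Deployment's self-healing guarantees described earlier in action, try scaling it and watching Kubernetes reconcile the Pod count:

# Ask for five replicas instead of three...
kubectl scale deployment deployment-example --replicas=5

# ...then watch new Pods come up to match the Deployment definition.
kubectl get po --watch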

What is next?

We have just barely scratched the surface of what is possible with Kubernetes. I showed you the bare minimum to get a Docker image deployed on Kubernetes. The next step is to deploy your cluster to an actual cloud provider. For further reading on how to do that, definitely check out the KOPS project.

If you have any questions, feel free to leave a comment below. If you want to see a demo of everything that I wrote about on the ActiveLAMP YouTube channel, let us know in the comments as well.

Aug 15 2017
Aug 15

So you just finished building an awesome new website on Drupal, but now you’ve run into a new dilemma. How do you optimize the site for search engines? Search engine optimization, or SEO, can be overwhelming, but don’t let that cause you to ignore certain things you can do to help drive traffic to your website. There’s nothing worse than spending countless hours to develop a web application, only to find out that users aren’t able to find your site. This can be extremely frustrating, as well as devastating if your company or business heavily relies on organic traffic.

Now there are countless philosophies of SEO, many of which are well-educated assumptions of what Google is looking for. The reality is that no one knows exactly how Google’s algorithm is calculated, and it doesn’t help when their algorithm is constantly being updated. Luckily, there are a few best practices that are accepted across the board, most of which have been confirmed by Google as being a contributing factor to search engine ranking. This blog is going to focus on a few of those best practices and which modules we have found to be helpful in both our Drupal 7 and Drupal 8 projects.

So, without further ado, here is our list of Drupal modules you should consider using on your site to help improve your SEO:

XML Sitemap Module

As the name suggests, XML Sitemap allows you to effortlessly generate a sitemap for your website. A sitemap allows Google and other search engines, like Bing and Yahoo, to easily find and crawl pages on your site. Is a sitemap necessary? No. But if it helps the pages of your site become easily discoverable, then why not reduce the risk of not having pages of your site indexed? This is especially important if you have a large site with hundreds or even thousands of pages. Having a sitemap also provides search engines with some valuable information, such as how often a page is updated and its level of significance compared to other pages on your site.

XML Sitemap allows you to generate a sitemap with a click of a button, and best of all you can configure it to periodically generate a new sitemap which will add any new pages you’ve published on your Drupal site. Once your website has a sitemap, it is recommended to submit that sitemap on Google Search Console, and if you haven’t claimed your website on Google Search Console yet, I would highly advise doing so as it will provide you with helpful insight such as indexing information, critical issues, and more.

Metatag Module

The next Drupal module is one that can really help boost your search engine ranking and visibility. Metatag is a powerful module that gives you the ability to update a large number of various meta tags on your site. A meta tag is an HTML tag which contains valuable information that search engines use to determine the relevance of a page when calculating search ranking. The more information available to search engines such as Google, the better your chances will be that your pages will rank well. The Metatag module allows you to easily update some of the more popular tags, such as meta description, meta content type, title tag, viewport, and more.

Adding and/or updating your meta tags is the first step of best SEO practice. I’ve come across many sites that pay little to no attention to their meta tags. Luckily, the Metatag module for Drupal can help you easily boost your SEO, and even if you don’t have time to go through and update your meta tags manually (which is recommended), the module also has a feature to have your tags automatically generated.

Real-Time SEO for Drupal Module

The Real-Time SEO for Drupal module is a powerful tool on its own, but it is even better when paired with the Metatag module which we just finished discussing. This module takes into account many SEO best practices and gives you a real-time analysis, ensuring that your content is best optimized for search engines. It will inform you if your content is too short, how readable your posts are, and also provides you a snapshot of how your page will appear in Google. The other helpful information it provides is regarding missing or potentially weak tags, which is why I mentioned that this module and the Metatag module work extremely well together. Real-Time SEO for Drupal can let you know how to better improve your meta tags and by using the Metatags module, you can quickly update your tags and watch in real-time how the changes affect your SEO.

The Real-Time SEO for Drupal module is a simple, yet incredibly useful tool in helping you see the SEO health of your pages. If you are just getting into SEO, this is a great place to start, and even if you’re a seasoned pro this is a nice tool to have to remind you of any meta tags or keyword optimization opportunities you may be missing.

###Google Analytics Module

The final module is the Google Analytics module. Google Analytics is by far the most widely used analytics platform. The invaluable information it provides, the numerous tools available, and the integrations it allows make it a requirement for anyone looking to improve the SEO of their Drupal website. This Drupal module is extremely convenient, as it does not require a developer to mess with any of the site's code. After installing the module, all you have to do is enter the web property ID provided to you after you set up your account on Google Analytics.

From the Google Analytics module UI, you have a number of helpful options, such as which domains to track, which pages to exclude, adjusting which user roles are tracked, tracking clicks and downloads, and more. The Google Analytics module for Drupal is another great tool to add to your tool belt when trying to improve your SEO.

###Final Thoughts

This list of helpful SEO modules for your Drupal 7 or 8 site could easily have been much longer, but these are a few key modules to help you get started. SEO is something that should not be ignored. As I mentioned at the beginning of the blog, it's a shame to build a site only to find that no one is actually visiting it, and using these modules properly can definitely help prevent this issue. If you would like to learn about other great modules to help your SEO, please leave a comment below and I'll write a follow-up blog.

Aug 02 2017
Aug 02

When migrating from Drupal 7 to Drupal 8, it is important to remember to migrate the redirects as well. Without them, users will not find your content if, for example, a redirect was shared on social media. Using the Migrate Plus module, it is quite simple to write a migration for the redirects, and the module contains some good examples on how to get started writing your custom migrations.

Write your node migrations

I am going to assume that you have already written migrations for some content types and have the migration group defined. Once those migrations have run, your database should contain a migrate_map_{name}_{type} table for each one. This is where we will be able to find each imported node's new ID, which is necessary for importing the redirects.
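
For example, assuming a group named blog and a content type named news (the names here are illustrative), you could look up a node's new ID directly:

-- Drupal 8 node ID (destid1) for Drupal 7 node 123 (sourceid1)
SELECT destid1 FROM migrate_map_blog_news WHERE sourceid1 = 123;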

Write the yml file for redirect migrations

For example, let's say we have a module called blog_migrations. In that module, we have a group called blog and migrations for news and opinion content types. Inside the config/install directory, add a new yml file called migrate_plus.migration.blog_redirect.yml, where blog is the name of the group being migrated. This file will give an id, label, and the process to use for the migration.

id: blog_redirect
label: Path Redirect
migration_group: blog
migration_tags:
  - Drupal 7
source:
  plugin: blog_redirect
  key: blog
process:
  rid: rid
  uid: uid
  redirect_source/path: source
  redirect_source/query:
    plugin: d7_redirect_source_query
    source: source_options
  redirect_redirect/uri:
    plugin: d7_path_redirect
    source:
      - redirect
      - redirect_options
  language:
    plugin: default_value
    source: language
    default_value: und
  status_code: status_code
destination:
  plugin: entity:redirect

Write the migrate source

Create the file BlogRedirect.php in the module's src/Plugin/migrate/source folder:

<?php

namespace Drupal\blog_migrations\Plugin\migrate\source;

use Drupal\Core\Database\Database;
use Drupal\migrate\Row;
use Drupal\redirect\Plugin\migrate\source\d7\PathRedirect;

/**
 * Drupal 7 path redirect source, restricted to our blog content types.
 *
 * @MigrateSource(
 *   id = "blog_redirect"
 * )
 */
class BlogRedirect extends PathRedirect {

  /**
   * {@inheritdoc}
   */
  public function query() {
    // Select all redirects except the ones pointing at user pages.
    $query = $this->select('redirect', 'p')->fields('p')
      ->condition('redirect', '%user%', 'NOT LIKE');

    return $query;
  }

  /**
   * {@inheritdoc}
   */
  public function prepareRow(Row $row) {
    // Default the status code to 301 if one was not set in Drupal 7.
    $current_status_code = $row->getSourceProperty('status_code');
    $status_code = $current_status_code != 0 ? $current_status_code : 301;
    $row->setSourceProperty('status_code', $status_code);

    $current_redirect = $row->getSourceProperty('redirect');
    $explode_current_redirect = explode("/", $current_redirect);

    // The content types whose redirects we are migrating.
    $map_blog_array = array(
      'news',
      'opinion',
    );

    if ($explode_current_redirect[0] == 'node') {
      // Look up the content type of the node the redirect points to.
      $resource_type = $this->getDatabase()
        ->select('node', 'n')
        ->fields('n', ['type'])
        ->condition('nid', $explode_current_redirect[1])
        ->execute()
        ->fetchField();

      if (in_array($resource_type, $map_blog_array)) {
        // Find the node's new ID in the migrate map table.
        $new_node_id = Database::getConnection('default', 'default')
          ->select('migrate_map_blog_' . $resource_type, 'm')
          ->fields('m', ['destid1'])
          ->condition('sourceid1', $explode_current_redirect[1])
          ->execute()
          ->fetchField();

        // Point the redirect at the newly migrated node.
        $new_redirect = 'node/' . $new_node_id;
        $row->setSourceProperty('redirect', $new_redirect);
      }
    }
  }
}

Run the migrations

Using the config_devel module, import the configuration into the active store so that you can run the migration:

drush cdi1 /modules/custom/blog_migrations/config/install/migrate_plus.migration.blog_redirect.yml

Then run the actual migration:

drush mi blog_redirect

After running that, you should now have migrated both content types' redirects, pointing at the new node IDs they were given! Any questions? Let us know in the comments below.

Jun 15 2017
Jun 15

Tom Friedhof: There's a lot of hype around integrating Pattern Lab with your Drupal theme these days, particularly because Drupal's template engine is now Twig, which is one of the template engines Pattern Lab uses. The holy grail of having a living style guide and component library is now a lot more feasible! But what about Drupal 7 sites? Twig doesn't exist in Drupal 7. Today I'm going to show you something we're working on at ActiveLAMP to implement Pattern Lab templates in Drupal 7.

Hey guys, I'm Tom Friedhof, a solutions architect here at ActiveLAMP. Let me first start off by defining what I mean when I say a living style guide and component library, since this idea can mean different things to different people. A living style guide and component library is the HTML, CSS, and JavaScript that document the design of a user interface. The "living" part means that the style guide should be constantly in sync with the actual app that implements the interface as the design improves or changes.

How do you easily keep the real app and the style guide in constant sync?  That could be a lot of work, given that once the initial designs are done, the design iterations are typically done directly in the app being built, making the style guide obsolete and outdated.

That's where the promise of Pattern Lab integration with Drupal comes in. You can easily keep the style guide in sync if your app depends on the style guide for all of its HTML, CSS, and JavaScript. That's why there is so much hype around building "Pattern Lab" themes in Drupal 8 right now: Drupal 8 uses a theme engine that Pattern Lab also supports, so reusing the same Twig templates that your UX designer created in Pattern Lab within Drupal is now an option.

Well, we're still working on Drupal 7 sites, so how do we benefit from this approach in Drupal 7? To be honest, we're still hashing out our approach. We have the process built out enough that we're using it on a new theme we're developing for a client, but we're still constantly iterating on the process and improving it as we run into things.

What I want to show you guys today is the direction that we're going, and I'm hoping to get your feedback in the comments so that we can continually improve and iterate on this system. First off, we decided not to use the Twig version of Pattern Lab. We spent half a day trying to get Twig working in Drupal 7 with the Twig for Drupal 7 module, and realized we'd be going down a pretty deep rabbit hole just to make Twig work in D7.

Rather than fight Drupal 7 and Twig, we decided to use a much simpler template engine called Mustache. Mustache is a language-agnostic template engine, and there is a really nice PHP implementation of it. With that said, we installed the gulp version of Pattern Lab, which uses Mustache templates in JavaScript. We now have the ability to share templates.

I’m going to jump into a demo here in a second. However, I’m not going to do a deep dive of how Pattern Lab works or how Drupal and Panels work.  I’ll dive deeper in future videos with those details if you’re interested.  Leave a comment if you want to see that stuff and we’ll put it on our list of content to share. I’m going to give you guys a 10,000 foot view of how things are shaping up with our Drupal 7 integration to Pattern Lab process.

All right, so here we are in our Drupal 7 install. This is pretty much a vanilla Drupal installation. If I jump over to the Drupal directory, you can see here within my sites/all/modules directory all the modules that I need for the demo I'm going to use today. We like to use Panels and Panels Everywhere, so what I'm going to be demoing today uses Panels and Panels Everywhere, but the stuff that I'm going to show does apply to regular template files if you don't want to use Panels and just want to stick with the core TPL system within Drupal. One of the other things that we have in here is a theme called Hills; this is where all the magic actually happens. One thing that you'll notice in this Hills theme is that we have two directories called node_modules and vendor. We're actually pulling in dependencies from npm and from Composer (Packagist) into this theme. If we open up our package.json, which defines the npm dependencies, you can see that we're defining a dependency called hills-patternlab. This is basically the repo that holds our Pattern Lab instance; it's the living style guide that the UX designer uses to update the patterns, update CSS, and make any changes that need to happen in the UI.

The composer.json file requires the Mustache PHP implementation. We're using this library, for obvious reasons, to render the Mustache templates that we're pulling in from Pattern Lab. This theme needs to be initialized with an npm install and a composer install to get these dependencies, and once you've done that, you're ready to start working on the theme.

One other thing I want to do before I actually start building this out in Drupal is show you the Pattern Lab instance. From our Hills directory, I can run npm start and this should pull up our Pattern Lab instance. Here is our Pattern Lab instance. I'm not going to go into the details of what Pattern Lab is, but essentially it's all the components that make up a website. For example, you can see a page header looks like that, and if we wanted to see what the header looks like, it looks like this. All of these templates are basically Mustache templates within our Pattern Lab. Let me open up node_modules so you can actually see these templates real quick. The Pattern Lab directory structure looks like this: within the source directory, inside of patterns, we can go into organisms and look at what a header looks like within Pattern Lab. This is including a couple of other patterns from within Pattern Lab, so let's see what the navigation actually looks like by going in here. This is the HTML that makes up a navigation.

This template includes other patterns within Pattern Lab; let me drill down to the primary links pattern. Here's what our primary links look like. You can see that this is outputting variables, for example href and name, and then it's including yet another pattern within Pattern Lab, so let me open that one as well. Here you can see that it's outputting more variables, classes and name. These variables are actually defined within Pattern Lab's data directory. I'm not going to go into detail on how that works, but let me just show you what that ends up rendering. You can see here's our organism header; that primary links pattern is this here. This is basically rendering data from Pattern Lab's data directory. If I go into the data directory real quick, within the primary-links.json file you can see the actual data that it's pulling in. If we change this to say "staff services," this is going to rebuild and we see staff services here. That's essentially how Pattern Lab works with data, in a nutshell. What I'm going to show you guys is how we actually integrate this with Drupal. Eventually, these Mustache templates are going to render variables from Drupal and not the data specified in this Pattern Lab data directory.
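
As a rough sketch, a Pattern Lab data file like primary-links.json looks something like this; the exact links and keys are whatever your style guide defines:

{
  "primary-links": [
    { "name": "Staff Services", "href": "#" },
    { "name": "Work Tools", "href": "#" },
    {
      "name": "News",
      "href": "#",
      "links": [
        { "name": "Campus News", "href": "#", "classes": "" }
      ]
    }
  ]
}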

Let's jump over to Drupal. Here's our Drupal installation. The first thing I'm going to do is switch the theme to our Hills theme, our Hail to the Hills theme, so I'm going to enable this and set it as the default. Now I'm going to open up the homepage in a new tab and drag that over here. So now we can see what we get out of the box with this Hail to the Hills theme. There's really nothing visible in this theme yet; there is stuff in the theme, but I'll get to that in a second. This is what you'll get once you enable it initially. We're using Panels Everywhere with this theme, so what I'm going to do is go configure Panels Everywhere. Panels Everywhere gives you a site template by default, so I'm going to come over here, edit it, and add a new variant. We'll just call this default, and I'm going to come over here and choose a layout within the PL Templates layout category; I'll hit full width one column, continue, and then we'll just work through this UI. Then I'm going to give it the basic content that we need to render so that you can actually see something on the page when you visit a page within the site. We'll create the variant here, we'll update and save that, so now let's look to see what our homepage looks like.

Our homepage is starting to look a little bit better; we're basically hitting the default homepage for Drupal, which just shows the title and a "no front page content has been created yet" message. You noticed in this layout tab we had this category called PL Templates, and it's pulling in full width and main content. Let me show you where these are defined. If I jump back into our theme, within our theme's info file, Panels allows you to specify a directory where you're going to define your layout plugins. The way you specify that is with the plugins panels layouts key, and then you just give it a path to your directory. Let me close node_modules so this is a little bit easier to see. If I come into this layouts directory, you can see that I have two layouts specified here. We used the full width layout, so I'm going to jump into that first.
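
For reference, registering that directory in the theme's .info file is a single line; the snippet below is a minimal sketch assuming the directory is named layouts:

name = Hail to the Hills
core = 7.x

; Tell Panels to scan this theme's layouts directory for layout plugins.
plugins[panels][layouts] = layouts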

This isn't a tutorial on how to create plugins, but essentially what we're doing is creating a ctools layout plugin. We've called this the full width one column layout, we've said the theme implementation for it is panels full width, and you can see that we have a panels full width template here. So when this layout is used, it's going to use this template. If we jump into that, all this template is doing is printing out whatever is in the content area. This has nothing to do with Pattern Lab yet, but this is how you set up a default template with regions within Panels Everywhere. Let's jump back to Drupal and go into our content area. Remember, we have the default template set up now, but now let's start to pull in some of the patterns from Pattern Lab. If I come over here, there's this pattern called organisms header; let's pull that into Drupal first. I'm going to come in here and add content. I have this PL components section over here, and we have a pattern called header. I'm going to click on that, and this header is asking for four pieces of data, so I'm going to give it the data that it needs. I'm going to browse for a file... let's look at that, that looks good, we'll upload this, go to next, we'll say logo, logo. Then we'll give it a path and tell it which main menu to use; we'll just say use this as the main menu, and for the help menu we'll tell it to use the user menu for now, and then finish. Let's drag this up to the top, hit update and save, and let's go see what happened. Let's go back to our homepage and voila, we've got a header pulled from Pattern Lab in here. You'll notice that the menu is not the same menu that's coming from Pattern Lab, and why is that? It's because it's pulling the actual primary links from Drupal.
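
To recap the layout plugin just described: a minimal ctools layout plugin is just a $plugin array plus a template. This sketch assumes the file lives at layouts/panels_full_width/panels_full_width.inc, with names taken from the demo:

<?php

// Read by ctools when it scans the theme's layouts directory.
$plugin = array(
  'title' => t('Full Width One Column'),
  'category' => t('PL Templates'),
  // Renders through the panels-full-width template shown in the video.
  'theme' => 'panels_full_width',
  'regions' => array(
    'content' => t('Content'),
  ),
);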

If we go into the menu system here, we chose the main menu to use, the main menu here, and if we create another link here we can say "another link" and have it go to the front page. Then when we come back to our homepage, you can see that this is actually pulling from Drupal. We have no sub-menus underneath that, and that's why it's not showing anything underneath it. But you can see that we're actually using our own data from within Drupal.

How did we actually pull in this whole header section from Pattern Lab? Let's go back to where we pulled that in. I'm going to go to structure, pages, and back into our site template. We had this content type that we pulled in, this PL header content type. This is a ctools content type, and you can define those with a content type plugin. Because this plugin only exists within this theme, we defined the ctools content type within the theme. The way we did that is within our info file: we're specifying where content types for ctools should live, and we're saying those content types should live in the Pattern Lab directory, which is right here. This isn't default behavior in ctools, so we did have to patch the ctools module to do this. You can check out that patch here and leave any comments if you have suggestions regarding it. It's a very small patch, but it basically allows us to define ctools content types within our theme, and not have to define a module just for these content types.

Let's look inside of this Pattern Lab directory and see what we have. The way ctools plugins work is that ctools traverses the directory that you've defined, looks for any .inc files, and reads those in while it's processing plugins. Within the organisms header directory, I have a file called organisms_header.inc. The content type system within ctools will pick up this file and read the variable that defines the actual plugin. You can specify other functions to expose, for example, an edit form; here you see we have a submit handler for that edit form. But here's where all the magic happens: here's a function that we've defined called preprocess. This is where we actually map the Drupal data into data that Pattern Lab understands, and we pass this data to Mustache to actually render the pattern. Let me back up and show you what Pattern Lab is expecting to see within this content type. I'm going to open up a new PhpStorm window so that I don't have to keep scrolling up. Go into sites, all, themes, hills, node_modules, hills-patternlab, open a new window, yes. Here is the Hills Pattern Lab directory within that Hills theme. What I'm going to do is go into the pattern that we're actually pulling in, and here it is. This pattern is pulling in data from this data file, and the hints you can get from this data file come from looking at what these patterns rely on. We're not outputting any data here, so what we need to do is drill down to see where data actually is being output. If we go into navigation, we can see navigation still isn't outputting any data; it's still just including other atoms and molecules.

So let's jump into the primary links. Now primary links is starting to actually output data. We have that href variable there, we have a name variable there. But then you can also see that it's pulling in yet another include. But this is where we want to start. We're using data here in this primary links component; within Pattern Lab you can specify data in this data directory. We have this primary-links.json file, and we basically specified an object with a primary links key. Now, Pattern Lab is going to read in all these data files and essentially merge the objects so that you can reference them by whatever key is at the root of each object; they all get merged into the data.json file. If we look here, primary links is being looped through, and then the href and the name are rendered out. If we look at this, primary links is an array with name and href. If we collapse these guys, you can see we have name, and we have several links here: staff services, work tools, news, administrative units, contact us. That all coincides with our pattern over here.

Within that pattern, coming back over here to primary links, you can see that it's including molecules dropdown, and that is here. That's also looping through the links and using the classes variable and the name variable. So if I come back into that data file and open one of these guys up, you can see here's the links array, and there's the name and href being used. It looks like we're only specifying classes on this very last button here. If we come back here, you'll see that that class is actually specified there, and that's what makes it look a little bit different.

Essentially what we're doing in Drupal is just mapping data to the data that Pattern Lab expects. Let's jump back over to Drupal, and here's our Drupal content type. You can see here, essentially we're returning an array: nav bar brand. Here's our primary links, so that's what we just looked at; the primary links is essentially creating an array that looks like this, but in Drupal. You can see here, hills_menu_tree is essentially creating the array that Pattern Lab is expecting.

I'll show you an easier example of what that looks like as we continue to build out this page. Let's add another pattern to this site layout. If we come into the default template and we add content, go to Pattern Lab components, I'm going to add in a footer calling card. So that footer calling card, if we come over here into molecules, then footer, we can see the footer calling card looks like this. If we come into the template for that, a molecule under footer, the calling card looks like this. It takes several variables: a title, phone, and email, and then it loops through a social array and outputs the href and the network we defaulted in Pattern Lab. If we look at the data that we defaulted in Pattern Lab, we can come over here to footer calling card, and you can see that we've got a calling card key with the data specified down there.
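
A Mustache pattern of the sort just described might look roughly like this; this is a sketch, and the actual markup and key names in the Hills style guide will differ:

{{# calling-card }}
<div class="calling-card">
  <h4>{{ title }}</h4>
  <p>{{ phone }} | {{ email }}</p>
  <ul class="social">
    {{# social }}
      <li><a href="{{ href }}">{{ network }}</a></li>
    {{/ social }}
  </ul>
</div>
{{/ calling-card }}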

We're going to render this in Drupal, so we created a content type that essentially has an edit form for all of this information. Let's just fill this out: information technology, and let's just say sure, and yes. Let's keep that there, update and save. Now there's our footer calling card. All right, you guys get the idea: we're able to create content types with an edit form that collects data, and then we pass that data into Mustache and render the Pattern Lab template with the data that we are preprocessing.

What if you're working with content that isn't going to be a ctools content type, for example nodes? Let's create some nodes and see what that actually looks like. I'm going to come in here into configuration, and we're going to devel generate a few nodes. I can come down here, hit generate content, and let's just create 10 basic pages; we'll generate that. Now let's go back to our homepage; there we have five nodes listed on the homepage now. How do we actually style this so it looks like something? The first thing I'm going to do is put this into a content container. Let me go over to our Pattern Lab: I'm going to go to layouts and look at our main content. Our main content goes into a container that looks like this, and the content is output inside of that container. What I'm going to do is create this homepage as a view so we can actually control the template that's being output here. I'm going to come over here to structure, go down to views, and add a new view; let's call this homepage list, and we'll continue and edit that.

We're going to make this a content pane. I'm actually going to get rid of this page here; we didn't need that, I should've unchecked it. Within that content pane, we're going to render fields, and since we're not really going to be using the views output, uncheck that. Then we'll also throw in the body here, and we're going to limit that to 600 characters, so that's what our view is going to look like for the homepage. Let's go ahead and save that. What I'm going to do is create a new front page over in page manager. Within page manager, I'm going to add a custom page; we'll call this front page, give it the path of front, and check the box that says make this your site homepage, then continue. Then I'm going to choose the layout called main content, and what that's going to do is use the layout from Pattern Lab that uses main content; I'll show you that here in a second.

I'll hit continue, continue, and then inside of here we're going to output the view that we just created. So here's that view there; we'll save that and hit finish. Update and save, so now we have a front page that's going to render a view called homepage list using the main content layout. Let's take a look to see what happened here. Let's go back to the homepage, and there you go: you can see that we're now outputting that actual view within the page content. This default title shouldn't output here, and it's actually being output by Panels, so what we're going to do is disable that title. If we come back here into content and then... actually, this is going to be in the site template. We'll edit that, go into content, and within the page content section we're going to override the title and make it nothing, then update and save that. Now we're getting a lot closer to what our styles look like in Pattern Lab.

Now the next step is to actually make this view look like something from Pattern Lab. What we're going to do is make it look like the two column stack pattern. We have this data here that's set up in a two column stack, and we're going to make the data from views output this template when it renders. Let's jump back into views: let's go into the front page that we just created and go into the content, and here's that view. I'm going to open this cog here and edit this view in a new tab. Views gives you the ability to specify a theme file, so what we're going to do is specify this theme file in our theme. I'm going to copy that and then jump into our theme over here. So here's our Drupal theme. Going into my templates directory, let's create a views directory so that all our views templates live in the same place within our templates directory, and let's create the template file that views suggests for this view.

Now, let's just put in "hello world" so that you can see that this is actually working. When we rescan the templates, views is going to pick up that template file, as you can see now that it's bolded. We'll save this and refresh the homepage, and you can see it is now outputting "hello world," which is in our template file. So how do we actually use the template that is in Pattern Lab? Let's go back into views. This is where the magic happens in this theme: we have a variable exposed called 'm', which is basically the Mustache connector to Pattern Lab. On that connector, we have a method called render, and this is where we specify the actual template that we want to use within Mustache. There is a naming convention to this, and we'll document what that convention is, but essentially what you need to do is specify what type of pattern it is (this is an organism, so it's in organisms) and then the name of the template (this is a two column stack). That's really it; that's all you have to do to render this template, so let's go ahead and save this and then look at our view here.

That didn't render anything. Let's go back to our template, and you can see that we're actually not printing anything out, so let's print out the result of that call and see what happens. There you go, now we're actually printing out the template from Pattern Lab, but you can see that it's pulling its default data from Pattern Lab. How do we actually make it output our own data? Just like the content types that I was showing you, we can send it a map of how our data should look. There are two ways we can do this. The render method takes additional arguments; one of them is an array, and I'm going to show you that first. This array is the actual map that the component is expecting. To see what the component expects, let me jump back over to our Pattern Lab and go into the data file that the two column stack uses: we can see that it's expecting an array with two column stack as a key, and that is an array of objects with card as the key, and then title and nutgraf.

What I'm going to do is pull some of that data out of there. Let's go back to our Drupal theme and paste that in here. Obviously JSON syntax doesn't work in PHP, so we need to convert some of this; I'm going to make that an array so it looks like something PHP can understand. Now what I'm going to do is copy this array, but we also need the key it's expecting. What is the two column stack expecting? Two column stack is the key it's expecting. Let's grab that, so now this should do it, because that ends the map there. Now we have this two column stack and we're actually passing it data, so you can pass it whatever data you want; essentially this is what Pattern Lab is looking for, so if you have your own data, just go ahead and provide it right here. Let's see what this actually looks like when we save that, come back over to Drupal, and hit refresh. You can see it's now printing out five cards with the same data in there.

We have another way of mapping this data, and that's through a preprocess callback. The way that works is there's a third parameter that you can pass to this render function. Let me delete the array that we just defined, and what we're going to do for the second parameter is pass it the variables that Drupal knows about. Inside of a Drupal theme, when you're working with your template, there's a variable exposed called variables that you can use to output whatever variables you want. What we're going to do is pass that to a callback that we define here, and we'll take it in as 'v' just so that we don't confuse it with variables here. What's happening behind the scenes is that variables is being passed into this callback, and now you can run PHP logic to preprocess your variables. Just to show you what variables looks like, let's dump it so you can see what's actually being passed into this callback. If we hit save there and refresh this, you can see here is what the view is actually outputting, and what we really want within this data is the results that the view is outputting. Here's all the data that the view is outputting, so this is what we want to map to what Pattern Lab is expecting.

If we come back into here, what we'll do is return an array that Pattern Lab is expecting. That array expects to have this key, so we'll copy that, and then that key has a bunch of cards associated with it. What we're going to do, just to keep this clean, is do the preprocessing in a separate function, so I'll just define this as process card data. For the function that we're defining now, we'll just copy this function signature here. This function is going to need to take the data that we're passing it, so we pass in 'v' here and take 'v' here; but really all we need from 'v' is the results from the view, so maybe we just pass in v's view results. Then down here we can say that this is going to be called results, since it's an array of results.

Now, essentially, we need to recreate that data format with this function inside of Drupal. What we'll do is loop through the results as result, and we're going to create what we need. Actually, we need to declare that data array up here, and then return the data array down here. What we need to do is specify each element of the data array. That's going to be equal to another array, and in that array... let's see what that needs to look like: it needs to have a card key with another array with title, nutgraf, and href. We're just going to leave href off for now, since we don't have the links yet. Title, nutgraf, and href. So let's go back here and set up what this looks like: it needs title, which is going to be something, nutgraf, and href. We also need this to be in an array with cards, so let's actually create the card here and pull this inside of the card. Now this is starting to look a lot like the data that Pattern Lab is expecting.

Now let's actually map the data from views. If we come back over here and look at what we have, for each of these result objects we have the node title and we have the field body. Essentially we just need to write that into the template: node title, and then for the body we have the field body's rendered markup. For the href, we do have the nid, so I guess we can pass that here. All right, that should be all that we need. If we save this and then go back and refresh... that didn't work. Let's take a look to see what we did wrong here. I'm going to output our variables again and just make sure that we mapped this properly. View, result... so we have 'v', view, result. View is an object, not an array; this will probably fix it. So we have 'v', view, result: view is an object, result is an array of objects. We're looping through the array and then working with each object. Let's save this and see if that actually works, and get rid of this debug output.

There we go, there is our views data within our template from Pattern Lab. The idea is that the Drupal theme developer just needs to specify one file that renders the template from Pattern Lab, passing it either the variables that Pattern Lab is expecting or a callback function that our Mustache connector can call to map out the variables that Pattern Lab is expecting.
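
To summarize the demo, here's a rough sketch of that single template file. The $m Mustache connector, its render() arguments, the template naming convention, the file name, and the views field names are all specific to the Hills theme as described above, so treat every name here as an assumption rather than a published API:

<?php
// views-view-unformatted--homepage-list.tpl.php (file name assumed).
// Render the Pattern Lab organism, handing Drupal's template variables to a
// preprocess callback that reshapes them into what the pattern expects.
print $m->render('organisms-two-column-stack', $variables, 'process_card_data');

// Defined alongside the template (e.g. in template.php): maps view results
// to the data structure the two column stack pattern expects.
function process_card_data($v) {
  $data = array();
  foreach ($v['view']->result as $result) {
    $data[] = array(
      'card' => array(
        'title' => $result->node_title,
        'nutgraf' => $result->field_body[0]['rendered']['#markup'],
        'href' => 'node/' . $result->nid,
      ),
    );
  }
  return array('two-column-stack' => $data);
}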

So that’s the direction we’re going.  Hopefully that gives you the idea of what we’re trying to do. We don’t have any HTML, CSS, or JavaScript in our theme.  Any changes needed in those files get pushed upstream into the living style guide, and we pull the changes back down into Drupal.

There’s a lot going on under the hood in this theme to make all of this work. We’re thinking that this theme will end up as a base theme that you can extend, to take advantage of all this functionality.  However, that is still yet to be determined and we may change our minds on that approach.  If you have an opinion on that, please let us know in the comments.

There are definitely some trade-offs to using this living style guide approach, and those trade-offs exist regardless of the Drupal version you use. I plan to release a future video to talk about the benefits and disadvantages of the living style guide approach with Drupal. This approach definitely does not fit every Drupal theme. More about that later.

Also, we’re going to be releasing more videos as we iterate on this theme, so if you’re interested in following along with us, make sure you subscribe to our channel. Thanks for watching!

Mar 23 2017
Mar 23

Preface

We recently had the opportunity to work on a Symfony app for one of our Higher Ed clients, for whom we had previously built a Drupal distribution. Drupal 8 moving to Symfony has enabled us to expand our service offering, and we have found more opportunities to build apps directly with Symfony when a CMS is not needed. This post is not about Drupal, but we are cross-posting to Drupal Planet to demonstrate the value of getting off the island. Enjoy!

Writing custom authentication schemes in Symfony used to be on the complicated side. But with the introduction of the Guard authentication component, it has gotten a lot easier.

One of our recent projects required us to interface with Shibboleth to authenticate users into the application. The application was originally written in Symfony 2 and was using this bundle to authenticate with Shibboleth sessions. However, since we were rewriting everything in Symfony 3, which the bundle is not compatible with, we had to look for a different solution. Fortunately for us, the built-in Guard authentication component turned out to be sufficient, allowing us to drop a bundle dependency and requiring us to write only one class. Really neat!

How Shibboleth authentication works

One way Shibboleth provisions a request with an authenticated entity is by setting a "remote user" environment variable that the web-server and/or residing applications can peruse.

There is obviously more to Shibboleth than that; it has to do a bunch of work to perform the actual authentication process. We defer all the heavy lifting to the mod_shib Apache2 module and rely on the availability of the REMOTE_USER environment variable to identify the user.

That is pretty much all we really need to know; now we can start writing our custom Shibboleth authentication guard:

<?php

namespace AppBundle\Security\Http;

use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\RedirectResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Routing\Generator\UrlGeneratorInterface;
use Symfony\Component\Security\Core\Authentication\Token\TokenInterface;
use Symfony\Component\Security\Core\Exception\AuthenticationException;
use Symfony\Component\Security\Core\User\UserInterface;
use Symfony\Component\Security\Core\User\UserProviderInterface;
use Symfony\Component\Security\Guard\AbstractGuardAuthenticator;
use Symfony\Component\Security\Http\Logout\LogoutSuccessHandlerInterface;

class ShibbolethAuthenticator extends AbstractGuardAuthenticator implements LogoutSuccessHandlerInterface
{
    /** @var string The base URL of the Shibboleth iDP. */
    private $idpUrl;

    /** @var string The server variable that mod_shib sets for the remote user. */
    private $remoteUserVar;

    /** @var UrlGeneratorInterface Used to generate the login/logout URLs. */
    private $urlGenerator;

    public function __construct(UrlGeneratorInterface $urlGenerator, $idpUrl, $remoteUserVar = null)
    {
        $this->idpUrl = $idpUrl;
        $this->remoteUserVar = $remoteUserVar ?: 'HTTP_EPPN';
        $this->urlGenerator = $urlGenerator;
    }

    /** Convenience method that returns the Shibboleth login URL. */
    protected function getRedirectUrl()
    {
        return $this->urlGenerator->generate('shib_login');
    }

    
    /** Starts the authentication flow when an unauthenticated request hits a secured path. */
    public function start(Request $request, AuthenticationException $authException = null)
    {
        $redirectTo = $this->getRedirectUrl();
        if (in_array('application/json', $request->getAcceptableContentTypes())) {
            return new JsonResponse(array(
                'status' => 'error',
                'message' => 'You are not authenticated.',
                'redirect' => $redirectTo,
            ), Response::HTTP_FORBIDDEN);
        } else {
            return new RedirectResponse($redirectTo);
        }
    }

    
    /** Extracts the remote user value that mod_shib set on the request, if any. */
    public function getCredentials(Request $request)
    {
        if (!$request->server->has($this->remoteUserVar)) {
            return;
        }

        $id = $request->server->get($this->remoteUserVar);

        if ($id) {
            return array('eppn' => $id);
        } else {
            return null;
        }
    }

    
    /** Loads the user object for the extracted eppn via the configured user provider. */
    public function getUser($credentials, UserProviderInterface $userProvider)
    {
        return $userProvider->loadUserByUsername($credentials['eppn']);
    }

    
    /** Shibboleth has already vetted the session, so every credential is considered valid. */
    public function checkCredentials($credentials, UserInterface $user)
    {
        return true;
    }

    
    /** Mirrors start(): redirect to the iDP or return a JSON error. */
    public function onAuthenticationFailure(Request $request, AuthenticationException $exception)
    {
        $redirectTo = $this->getRedirectUrl();
        if (in_array('application/json', $request->getAcceptableContentTypes())) {
            return new JsonResponse(array(
                'status' => 'error',
                'message' => 'Authentication failed.',
                'redirect' => $redirectTo,
            ), Response::HTTP_FORBIDDEN);
        } else {
            return new RedirectResponse($redirectTo);
        }
    }

    
    /** Nothing to do on success; let the request continue. */
    public function onAuthenticationSuccess(Request $request, TokenInterface $token, $providerKey)
    {
        return null;
    }

    
    /** "Remember me" support is not needed. */
    public function supportsRememberMe()
    {
        return false;
    }

    
    /** After Symfony clears the token, send the user to the Shibboleth logout URL. */
    public function onLogoutSuccess(Request $request)
    {
        $redirectTo = $this->urlGenerator->generate('shib_logout', array(
            'return'  => $this->idpUrl . '/profile/Logout'
        ));
        return new RedirectResponse($redirectTo);
    }
}

Let's break it down:

  1. class ShibbolethAuthenticator extends AbstractGuardAuthenticator ... - We'll extend the built-in abstract to take care of the non-Shibboleth specific plumbing required.

  2. __construct(...) - As you would guess, we are passing in all the things we need for the authentication guard to work: the Shibboleth iDP URL, the remote user variable to check, and the URL generator service, which we need later.

  3. getRedirectUrl() - This is just a convenience method which returns the Shibboleth login URL.

  4. start(...) - This is where everything begins; this method is responsible for producing a response that will help the Security component drive the user to authenticate. Here, we are simply either 1.) redirecting the user to the Shibboleth login page; or 2.) producing a JSON response that tells consumers that the request is forbidden, if the client is expecting application/json content back. In that case, the payload will conveniently inform consumers where to go to start authenticating via the redirect property. Our front-end application knows how to handle this.

  5. getCredentials(...) - This method is responsible for extracting authentication credentials from the HTTP request, i.e. username and password, a JWT in the Authorization header, etc. Here, we are interested in the remote user environment variable that mod_shib might have set for us. It is important that we check that the environment variable is actually not empty, because mod_shib will still have it set, but left empty, for unauthenticated sessions.

  6. getUser(...) - Here we take the credentials that getCredentials(...) returned and construct a user object from them. The user provider configured for the firewall is also passed into this method.

  7. checkCredentials(...) - Following the getUser(...) call, the security component will call this method to actually verify whether or not the authentication attempt is valid. For example, in form logins, this is where you would typically check the supplied password against the encrypted credentials in the data-store. However, we only need to return true unconditionally, since we are trusting Shibboleth to filter out invalid credentials and only let valid sessions get through to the application. In short, we are already expecting a pre-authenticated request.

  8. onAuthenticationFailure(...) - This method is called whenever our authenticator reports invalid credentials. This shouldn't really happen in the context of a pre-authenticated request, as we entrust the process 100% to Shibboleth, but we'll fill it in with something reasonable anyway. Here we simply replicate what start(...) does.

  9. onAuthenticationSuccess(...) - This method gets called when the credentials check out, which is all the time. We really don't have to do anything but let the request go through. Theoretically, this is where we could bootstrap the token with certain roles depending on other Shibboleth headers present in the Request object, but we really don't need to do that in our application.

  10. supportsRememberMe(...) - We don't care about supporting "remember me" functionality, so no, thank you!

  11. onLogoutSuccess(...) - This is technically not part of the Guard authentication component; it belongs to the logout handling. You can see that our ShibbolethAuthenticator class also implements LogoutSuccessHandlerInterface, which allows us to register it as a listener to the logout process. This method is responsible for clearing out Shibboleth authentication data after Symfony has cleared the user token from the system. To do this, we just redirect the user to the proper Shibboleth logout URL, seeding the return parameter with the logout page on the Shibboleth iDP instance.

Configuring the router: shib_login and shib_logout routes

We'll update app/config/routing.yml:

shib_login:
  path: /Shibboleth.sso/Login

shib_logout:
  path: /Shibboleth.sso/Logout

You may be asking yourself why we even bother creating named routes for these when we could just as easily hard-code the values in our guard authenticator.

Great question! The answer is that we want to be able to configure these to point to an internal login form for local development purposes, where there is no value in actually authenticating with Shibboleth, if it's even possible. This allows us to override the shib_login path to /login within routing_dev.yml so that the application will redirect us to the proper login URL in our dev environment.
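
For instance, the dev override can be as small as this; a sketch of app/config/routing_dev.yml, assuming an in-app login form lives at /login:

# app/config/routing_dev.yml
shib_login:
  path: /login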

We really can't point shib_logout to /logout, though, as that would result in an infinite redirect loop. What we do instead is override it in routing_dev.yml to go to a very simple controller action that replicates the external behavior of Shibboleth's logout URL:



...

  public function mockShibbolethLogoutAction(Request $request)
  {
      $return = $request->get('return');

      if (!$return) {
          return new Response("`return` query parameter is required.", Response::HTTP_BAD_REQUEST);
      }

      return $this->redirect($return);
  }
}

Configuring the firewall

This is the last piece of the puzzle: putting all these things together.

# Register the authenticator as a service, e.g. in app/config/config.yml:
services:
  app.shibboleth_authenticator:
    class: AppBundle\Security\Http\ShibbolethAuthenticator
    arguments:
      - '@router'
      - '%shibboleth_idp_url%'
      - '%shibboleth_remote_user_var%'

---

# app/config/config_prod.yml (file name assumed from the setup described below)
imports:
  - { resource: config.yml }
  - { resource: security.yml }

---

# app/config/config_dev.yml
imports:
  - { resource: config.yml }
  - { resource: security_dev.yml }

---

# app/config/security.yml
security:
  firewalls:
    main:
      stateless: true
      guard:
        authenticators:
          - app.shibboleth_authenticator

      logout:
        path: /logout
        success_handler: app.shibboleth_authenticator

---

# app/config/security_dev.yml
security:
  firewalls:
    main:
      stateless: false
      form_login:
        login_path: shib_login
        check_path: shib_login
        target_path_parameter: return

The star here is actually just what's in the security.yml file, specifically the guard section; that's how simple it is to support custom authentication via the Guard authentication component! It's just a matter of pointing it to the service and it will hook it up for us.

The logout configuration tells the application to allocate the /logout path to initiate the logout process which will eventually call our service to clean up after ourselves.

You'll also notice that we have a security_dev.yml file here that config_dev.yml imports. This isn't how the Symfony 3 framework ships, but it allows us to override the firewall configuration specifically for dev environments. Here, we add the form_login authentication scheme to support logging in via an in-memory user provider (not shown). The authentication guard will redirect us to the in-app login form instead of the Shibboleth iDP during development.

Also note the stateless configuration difference between prod and dev: we want to keep the firewall in production environments stateless, which just means that our guard authenticator will get consulted on every request. This ensures that users will actually be logged out of the application whenever they are logged out of the Shibboleth iDP, i.e. when they quit the web browser, etc. However, we need to configure the firewall to be stateful during development; otherwise the form_login authentication will not work as expected.

Conclusion

I hope I was able to illustrate how versatile the Guard authentication component in Symfony is. What used to require multiple classes to be written and wired together now only requires a single class to implement, and it's very trivial to configure. The Symfony community has really done a great job of improving the Developer Experience (DX).

Setting up pre-authenticated requests via environment variables isn't unique to mod_shib; other authentication modules, like mod_auth_kerb, mod_auth_gssapi, and mod_auth_cas, use the same approach. It's such a well-adopted scheme that Symfony ships with a remote_user authentication listener, starting in 2.6, that makes it very easy to integrate with them. Check it out if your needs are simpler, i.e. no custom authentication-starter/redirect logic, etc.
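
For comparison, here is a minimal sketch of that built-in listener; the user provider configuration is omitted, and the user option simply names the server variable to read:

# app/config/security.yml
security:
  firewalls:
    main:
      remote_user:
        user: REMOTE_USER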

Mar 06 2017
Mar 06

In the modern world of web / application development, using package managers to pull in dependencies has become a de-facto standard. In fact, if you are developing enterprise software and you aren't leveraging package managers I would challenge you to ask yourself why not?

Drupal was very early to adopt this mindset of pulling in dependencies. Almost a decade ago, Dmitri Gaskin created an extension for Drush (the Drupal shell) that added the ability to pull contributed modules by listing them in a make file. (I think Dmitri was 12 years old when he wrote the Drush extension, pretty amazing!) Since that time, the make extension has been added to Drush core.

Composer is the current standard for putting together PHP applications, which is why Drupal 8 has gone this direction, so why not use Composer to put together Drupal 7 applications?

First off, I want to clarify what I'm not talking about in this post. I am not advocating that we ditch Drush altogether; I still find value in other aspects of what Drush can do. I am specifically referring to the Make aspect of Drush. Is Drush Make still necessary?

This post is also not about Drupal Console vs. Drush; both CLI tools add tremendous value to the development workflow, and there isn't 100% overlap between these tools [yet]. I think we still need both.

This post is about how I came to see the benefit of switching from Drush Make to Composer. I recommend making this move for both Drupal 7 and Drupal 8. This Drupal Composer workflow is not new; it has been around for a while. I just never saw a good reason to make the jump from Drush Make to this new process, until now. We have been asked in the comments on previous posts, "Why haven't you adopted the Composer process?" I now have a good reason to change our process and fully jump on board with Composer for building Drupal 7 applications. We appreciate all the comments we get on our blog; it sharpens everyone involved!

We have blogged about the Composer workflow in a previous post on our Drupal 8 build process, but the main motivation there was to be proactive about where PHP application development is going [or already is]. We didn't have a real use case for the switch to Composer until now. This post will review how I came to that revelation.

Dependency Managers

I want to make one more point before I make the case for Composer. There are many reasons to use package managers to pull in dependencies; I'll save the details for another blog post. The main reason developers use package managers is so that your project repository does not include libraries and modules that you do not maintain. That is why tools like Composer, npm, Yarn, Bower, and Bundler exist. Hook up your RSS reader to our blog and I'll explain in more detail in a future post, but for now I'll leave this link to the Composer site explaining why committing dependencies to your project repo is a bad idea.

Version Numbers

The #1 reason to make the switch to Composer is the ability to manage version numbers. You may be asking, "What's the big deal? Drush Make handles version numbers as well." Let me give you a little context on why Composer's version numbers are a better approach.

The Back Story

Recently, in a strategy meeting with one of our enterprise clients, we were discussing how to approach launching hundreds of sites on one Drupal core utilizing multiple installation profiles on top of Acquia Site Factory. Our goal was to figure out how we could sanely manage updating potentially dozens of installation profiles without explicitly defining each version number of the profile being updated. This type of Drupal architecture is also a topic for a future blog post, but for now, read Acquia's explanation of why architecting with profiles is a good idea.

As a developer, it is commonplace to lock down dependencies to a very specific version so that we know exactly what versions we are using and deploying. This is the reason composer.lock, Gemfile.lock, yarn.lock, and npm shrinkwrap exist. We have all experienced the pain of unexpected defects in applications due to an obscure dependency changing deep in the dependency tree. Most dependency managers have a very explicit command for updating dependencies, i.e. composer update, bundle update, and yarn upgrade respectively, which in turn update the lock file.

A release manager does not need to know explicitly which version of a dependency (installation profile, module, etc.) to release next; she simply wants the latest stable release.

Herein lies the problem with Drush Make. Practices exist that solve both the developer problem and the release manager problem, and while they don't exist in Drush Make, they do exist in Composer and other application development environments. It's a common pattern that has been around for a while: it's called semantic versioning.

Semantic Versioning

If you haven't heard of semantic versioning (semver), go check it out now. Pretty much every package manager I have dealt with has adopted semver. Adopting semver gives the developer, or release manager, the choice of how to update dependencies within their app. There are very distinct numbers in semver for introducing breaking changes, new features, and bug fixes. How does this play into the use cases I mentioned above?

A developer has the ability to specify versions in the composer.json file while leaving the version number flexible enough to pull in new bug fixes and feature improvements (patch and minor releases). Look at the example below:

{
  "name": "example/my-drupal-platform",
  ...
  "require": {
    ...
    "drupal/drupal": "~7.53.0",
    "drupal/views": "^3.14.0"
  },
  ...
}

The tilde ~ and caret ^ symbols have special meanings when specifying version numbers. The tilde matches the most recent patch release: ~7.53.0 will update the last number (the patch release) but nothing else. The caret is more permissive: ^3.14.0 will update the minor and patch numbers, up to, but not including, the next major version.

The above example basically says: use the Views module at version 3.14, and when version 3.15 comes out, update me to that version when I run composer update.
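
Concretely, those two constraints expand to the following ranges:

"drupal/drupal": "~7.53.0"   # matches >=7.53.0 <7.54.0 (patch updates only)
"drupal/views": "^3.14.0"    # matches >=3.14.0 <4.0.0 (minor and patch updates)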

Breaking changes should only be introduced when you update the first number, the major release. Of course, if you completely trusted every developer writing contributed code, this system alone would be enough; but not all developers follow best practice, which is why the lock file was created and why composer update must be run explicitly.

With this system in place, a release manager now only needs to worry about running one command to get the latest stable release of all dependencies. This command could also be hidden behind a nice UI (a CI server) so all she has to do is push one button to grab all the latest dependencies and push them to a testing site for verification.

Understanding everyone's needs

In the past, I didn't have a good reason to move away from Drush Make, because it did the job, and Drush is so much more than Drush Make. The strategy session we had was eye-opening. Understanding the needs from an operations perspective, without jeopardizing the integrity of the application, led us down a path to see a problem that the wider development community (not just the PHP community) has already solved. It's very rewarding to work through a problem like this, especially when you come to the conclusion that someone has already solved it! "We just had to find the path to the water! (--A.W.)"

What do you think about using Drush Make vs Composer for pulling together a Drupal Application? Leave us your thoughts in the comments.

Jul 30 2016
Jul 30

On a recent project, we had to create a separate sitemap for each of the domains set up on the site. We ran into some problems that we had to resolve because of the nature of our pURL setup.

##Goals##

  • We want the front page of each subdomain to be added to its sitemap, with the rules for them set on the XMLSitemap settings page.

  • We want to make sure that URLs belonging to subdomain pages no longer show up in the main domain's sitemap.

##Problems##

1) Only On The Primary Domain

The XML sitemap module only creates one sitemap based on the primary domain.

2) Prefixes not Distinguished

Our node URLs are set up so that they can be prefixed with our subdomain (the pURL modifier), and XMLSitemap doesn't see our prefixes as belonging to different sites. At this point, all nodes are added to every single domain's sitemap.

3) URL Formats

Our URLs are not in the correct format when added to the sitemap. They should look like http://subdomain.domain.org/*; however, because we are prefixing them, they show up as http://domain.org/subdomain/*. We want our URLs to look like they come from the right subdomain rather than all from the base domain.

##Solution##

We were able to create sitemaps for each of the 15 domains by adding the XMLSitemap Domain module. The XMLSitemap Domain module allows us to define a domain for each sitemap, generate the sitemap, and serve it on the correct domain.

We added xmlsitemap-dont-write-empty-element-in-xml-sitemap-file-2545050-3.patch to prevent empty elements from being added to the sitemap.

Then we added a hook_xmlsitemap_element_alter() implementation inside our own custom module that looks something like this:




/**
 * Implements hook_xmlsitemap_element_alter().
 */
function hook_xmlsitemap_element_alter(array &$element, array $link, $sitemap) {
  // Extract the subdomain from the sitemap's base URL,
  // e.g. "subdomain" from "http://subdomain.domain.org".
  $domain = $sitemap->uri['options']['base_url'];
  $url_parts = explode('//', $domain);
  $parts = explode('.', $url_parts[1]);
  $subdomain = array_shift($parts);

  // The first path segment of the link is the pURL prefix, if any.
  $current_parts = explode('/', $link['loc']);
  $current_prefix = array_shift($current_parts);

  $modifiers = _get_core_modifiers();

  if (in_array($subdomain, array_keys($modifiers))) {
    // This sitemap belongs to a subdomain.
    if ($current_prefix != $subdomain && $current_prefix != '') {
      // The link carries a different prefix; drop it from this sitemap.
      $element = array();
      return $element;
    }
    else {
      // Strip the pURL prefix and rewrite the URL onto the subdomain.
      $pattern = $current_prefix . '/';
      $element['loc'] = $domain . str_replace($pattern, '', $link['loc']);
    }
  }
  else {
    // This sitemap belongs to the base domain; exclude any link that
    // carries a subdomain prefix.
    if (in_array($current_prefix, array_keys($modifiers))) {
      $element = array();
      return $element;
    }
  }
}

/**
 * Returns the pURL modifiers (subdomains), cached for 24 hours.
 */
function _get_core_modifiers() {
  if (!$cache = cache_get('subdomains')) {
    $result = db_query("SELECT id, value FROM {purl} WHERE provider = 'og_purl_provider'")->fetchAllAssoc('value');
    cache_set('subdomains', $result, 'cache', time() + 86400);
    return $result;
  }
  else {
    return $cache->data;
  }
}

If you have any questions or suggestions, feel free to drop a comment below!

Jul 14 2016
Jul 14

Back in December, Tom Friedhof shared how we set up our Drupal 8 development and build process utilizing Docker. It has worked well over the several months we have used it. In that time span, however, we experienced a few issues here and there, which led me to come up with an alternative process that keeps the things we like while resolving the issues we encountered.

First, I'll list some improvements that we'd like to see:

  1. Solve file-syncing issues

    One issue that I kept running into with our development process is that file-syncing stops working when the host machine powers off in the interim. Even though Vagrant's rsync-auto can still detect changes on the host file-system and initiate an rsync to propel files up into the containers via a mounted volume, the changes do not actually appear within the containers themselves. I had a tough time debugging this issue, and the only resolution in sight was to do a vagrant reload -- a time-consuming process, as it rebuilds every image and runs the containers again. Having to do this every morning when I turned on my laptop at work was no fun.

  2. Performant access to Drupal's root

    Previously, we had to mount Drupal's document root to our host machine using sshfs to explore it, but that's not exactly performant. For example, performing a grep or ag search within file contents under Drupal 8's core takes ~10 seconds or more. Colleagues using PhpStorm report that mounting the Drupal root onto the host system brings the IDE to a crawl while it indexes the files.

  3. Leverage Docker Compose

    Docker Compose is a great tool for managing the life-cycle of Docker containers, especially if you are running multiple applications. I felt we were missing out on useful features because we were just using Vagrant's built-in Docker provider. Also, with the expectation that Docker for Mac Beta will become stable in the not-so-distant future, I'd like to make the switch to a native Docker development environment as smooth as possible. For me, introducing Docker Compose into the equation is the logical first step.

    dlite came to my attention quite recently and could fulfill the role of Docker for Mac before its stable release, but I haven't gotten the chance to try it yet.

  4. Use Composer as the first-class package manager

    Our previous build primarily used Drush to build the Drupal 8 site and download dependencies, relegating the resolution of some Composer dependencies to Composer Manager. Drush worked really well for us in the past and there is no pressing reason to abandon it, but considering that Composer Manager is deprecated for Drupal 8.x and that there is already a Composer project for Drupal sites, I thought it would be a good idea to be proactive, rethink the way we have been doing Drupal builds, and adopt the de-facto way of putting together a PHP application. At the moment, Composer is where it's at.

  5. Faster and more efficient builds

    Our previous build utilized a Jenkins server (also run as a container) to perform the necessary steps to deploy changes to Pantheon. Since we were mostly deploying from our local machines anyway, I always thought that running the build steps via docker run ... would probably suffice (and it doesn't incur the overhead of a running Jenkins instance). Ultimately, we decided to explore Platform.sh as our deployment target, so basing our build on Composer became almost imperative, as Drupal 8 support (via Drush) on Platform.sh is still in beta.

With these in mind, I'd like to share our new development environment & build process.

1. File & directory structure

Here is a high-level tree-view of the file structure of the project:

/<project_root>
├── Vagrantfile
├── Makefile
├── .platform/ 
│   └── routes.yaml
├── bin/ 
│   ├── drupal*
│   ├── drush*
│   └── sync-host*
├── docker-compose.yml 
├── environment 
├── src/ 
│   ├── .gitignore
│   ├── .platform.app.yaml 
│   ├── Dockerfile
│   ├── LICENSE
│   ├── bin/ 
│   │   ├── drupal-portal*
│   │   └── drush-portal*
│   ├── composer.json
│   ├── composer.lock
│   ├── custom/
│   ├── phpunit.xml.dist
│   ├── scripts/
│   ├── vendor/
│   └── web/ 
└── zsh/ 
    ├── zshrc
    ├── async.zsh
    └── pure.zsh

2. The Vagrantfile


Vagrant.configure("2") do |config|

  config.vm.box = "debian/jessie64"
  config.vm.network "private_network", ip: "192.168.100.47"

  config.vm.hostname = 'activelamp.dev'

  config.vm.provider :virtualbox do |vb|
    vb.name = "activelamp.com"
    vb.memory = 2048
  end

  config.ssh.forward_agent = true

  config.vm.provision "shell",
    inline: "apt-get install -y zsh && sudo chsh -s /usr/bin/zsh vagrant",
    run: "once"

  config.vm.provision "shell",
    inline: "[ -e /home/vagrant/.zshrc ] && echo '' || ln -s /vagrant/zsh/zshrc /home/vagrant/.zshrc",
    run: "once"

  config.vm.provision "shell",
    inline: "[ -e /usr/local/share/zsh/site-functions/prompt_pure_setup ] && echo '' || ln -s /vagrant/zsh/pure.zsh /usr/local/share/zsh/site-functions/prompt_pure_setup",
    run: "once"

  config.vm.provision "shell",
    inline: "[ -e /usr/local/share/zsh/site-functions/async ] && echo '' || ln -s /vagrant/zsh/async.zsh /usr/local/share/zsh/site-functions/async",
    run: "once"

  if ENV['GITHUB_OAUTH_TOKEN']
    config.vm.provision "shell",
      inline: "sudo sed -i '/^GITHUB_OAUTH_TOKEN=/d' /etc/environment  && sudo bash -c 'echo GITHUB_OAUTH_TOKEN=#{ENV['GITHUB_OAUTH_TOKEN']} >> /etc/environment'"
  end

  
  config.vm.provision :docker

  config.vm.provision :docker_compose, yml: "/vagrant/docker-compose.yml", run: "always", compose_version: "1.7.1"

  config.vm.synced_folder ".", "/vagrant", type: "nfs"
  config.vm.synced_folder "./src", "/mnt/code", type: "rsync", rsync__exclude: [".git/", "src/vendor"]
end

Compare this new manifest to the old one and you will notice that we reduced Vagrant's involvement in defining and managing Docker containers. We are simply using this virtual machine as the Docker host, using the vagrant-docker-compose plugin to provision it with the Docker Compose executable, have it (re)build the images during the provisioning stage, and (re)start the containers on vagrant up.

We are also setting up Vagrant to sync file changes on src/ to /mnt/code/ in the VM via rsync. This directory in the VM will be mounted into the container as you'll see later.

We are also setting up zsh as the login shell for the vagrant user for an improved experience when operating within the virtual machine.

3. The Drupal 8 Build

Let's zoom in to where the main action happens: the Drupal 8 installation. We'll remove Docker from our thoughts for now and focus on how the Drupal 8 build works.

The src/ directory contains all files that constitute a Drupal 8 Composer project:


/src/
├── composer.json
├── composer.lock
├── phpunit.xml.dist
├── scripts/
│   └── composer/
├── vendor/ # Composer dependencies
│   └── ...
└── web/ # Web root
    ├── .htaccess
    ├── autoload.php
    ├── core/ # Drupal 8 Core
    ├── drush/
    ├── index.php
    ├── modules/
    ├── profiles/
    ├── robots.txt
    ├── sites/
    │   ├── default/
    │   │   ├── .env
    │   │   ├── config/ # Configuration export files
    │   │   │   ├── system.site.yml
    │   │   │   └── ...
    │   │   ├── default.services.yml
    │   │   ├── default.settings.php
    │   │   ├── files/
    │   │   │   └── ...
    │   │   ├── services.yml
    │   │   ├── settings.local.php.dist
    │   │   ├── settings.php
    │   │   └── settings.platform.php
    │   └── development.services.yml
    ├── themes/
    ├── update.php
    └── web.config

The first step of the build is simply executing composer install within src/. Doing so will download all dependencies defined in composer.lock and scaffold the files and folders necessary for the Drupal installation to work. You can head over to the Drupal 8 Composer project repository and look through the code to see in depth how the scaffolding works.

3.1 Defining Composer dependencies from custom installation profiles & modules

Since we cannot use the Composer Manager module anymore, we need a different way of letting Composer know that we may have other dependencies defined in other areas of the project. For this, let's look at composer.json:

{
    ...
    "require": {
        ...
        "wikimedia/composer-merge-plugin": "^1.3",
        "activelamp/sync_uuids": "dev-8.x-1.x"
    },
    "extra": {
        ...
        "merge-plugin": {
          "include": [
            "web/profiles/activelamp_com/composer.json",
            "web/profiles/activelamp_com/modules/custom/*/composer.json"
          ]
        }
    }
}

We are requiring the wikimedia/composer-merge-plugin and configuring it in the extra section to also read the installation profile's composer.json as well as the ones in custom modules within it.

We can define the contrib modules that we need for our site from within the installation profile.

src/web/profiles/activelamp_com/composer.json:

{
  "name": "activelamp/activelamp-com-profile",
  "require": {
    "drupal/admin_toolbar": "^8.1",
    "drupal/ds": "^8.2",
    "drupal/page_manager": "^8.1@alpha",
    "drupal/panels": "~8.0",
    "drupal/pathauto": "~8.0",
    "drupal/redirect": "~8.0",
    "drupal/coffee": "~8.0"
  }
}

As we create custom modules for the site, any Composer dependencies in them will be picked up every time we run composer update. This replicates what Composer Manager allowed us to do in Drupal 7. Note, however, that unlike Composer Manager, Composer does not care if a module is enabled or not -- it will always read the module's Composer dependencies and resolve them.

3.2 Drupal configuration

3.2.1 Settings file

Let's peek at what's inside src/web/sites/default/settings.php:




$settings['container_yamls'][] = __DIR__ . '/services.yml';

$config_directories[CONFIG_SYNC_DIRECTORY] = __DIR__ . '/config';


include __DIR__ . "/settings.platform.php";

$update_free_access = FALSE;
$drupal_hash_salt = '';

$local_settings = __DIR__ . '/settings.local.php';

if (file_exists($local_settings)) {
  require_once($local_settings);
}

$settings['install_profile'] = 'activelamp_com';
$settings['hash_salt'] = $drupal_hash_salt;

Next, let's look at settings.platform.php:



if (!getenv('PLATFORM_ENVIRONMENT')) {
    return;
}

$relationships = json_decode(base64_decode(getenv('PLATFORM_RELATIONSHIPS')), true);

$database_creds = $relationships['database'][0];

$databases['default']['default'] = [
    'database' => $database_creds['path'],
    'username' => $database_creds['username'],
    'password' => $database_creds['password'],
    'host' => $database_creds['host'],
    'port' => $database_creds['port'],
    'driver' => 'mysql',
    'prefix' => '',
    'collation' => 'utf8mb4_general_ci',
];

We return early from this file if PLATFORM_ENVIRONMENT is not set. Otherwise, we'll parse the PLATFORM_RELATIONSHIPS data and extract the database credentials from it.

For our development environment, however, we'll do something different in settings.local.php.dist:



$databases['default']['default'] = array(
    'database' => getenv('MYSQL_DATABASE'),
    'username' => getenv('MYSQL_USER'),
    'password' => getenv('MYSQL_PASSWORD'),
    'host' => getenv('DRUPAL_MYSQL_HOST'),
    'driver' => 'mysql',
    'port' => 3306,
    'prefix' => '',
);

We are pulling the database values from the environment, as this is how we'll pass data in a Docker run-time. We also append .dist to the file name because we don't actually want settings.local.php in version control (otherwise, it would mess up the configuration in non-development environments). We will simply copy this file into place as part of the development workflow. More on this later.

3.2.2 Staged configuration

src/web/sites/default/config/ contains YAML files that constitute the desired Drupal 8 configuration. These files will be used to seed a fresh Drupal 8 installation with configuration specific to the site. As we develop features, we will continually export the configuration entities into this folder so that they are versioned via Git.

Configuration entities in Drupal 8 are assigned a universally unique ID (a.k.a. UUID). Because of this, configuration files are typically only meant to be imported into the same Drupal site (or a clone of the site) they were exported from. The proper approach is usually to get hold of a database dump of the Drupal site and use that to seed a Drupal 8 installation into which you plan to import the configuration files. To streamline the process during development, we wrote the Drush command sync-uuids, which updates the UUIDs of the active configuration entities of a non-clone site (i.e., a freshly installed Drupal instance) to match those found in the staged configuration. We packaged it as a Composer package named activelamp/sync_uuids.
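Conceptually, the command walks the staged configuration and rewrites the matching active configuration entities' UUIDs. Here is a minimal sketch of that idea against Drupal 8's config storage services -- an illustration of the approach, not the actual activelamp/sync_uuids source:

// Illustration only -- not the actual sync_uuids implementation.
// 'config.storage' is active config; 'config.storage.sync' is staged config.
$active = \Drupal::service('config.storage');
$staged = \Drupal::service('config.storage.sync');

foreach ($staged->listAll() as $name) {
  $staged_data = $staged->read($name);
  $active_data = $active->read($name);

  // Only touch items that exist in both storages and carry a UUID.
  if (empty($staged_data['uuid']) || empty($active_data['uuid'])) {
    continue;
  }

  // Overwrite the active UUID so that config-import treats the freshly
  // installed entity and the staged entity as the same entity.
  $active_data['uuid'] = $staged_data['uuid'];
  $active->write($name, $active_data);
}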

The complete steps for the Drupal 8 build are the following:

$ cd src
$ composer install
$ [ -f web/sites/default/settings.local.php ] && : || cp web/sites/default/settings.local.php.dist web/sites/default/settings.local.php
$ drush site-install activelamp_com --account-pass=default-pass -y
$ drush pm-enable config sync_uuids -y
$ drush sync-uuids -y
$ drush config-import -y

These build steps will result in a fresh Drupal 8 installation based on the activelamp_com installation profile, with the proper configuration entities from web/sites/default/config. It will be similar to any site built from the same code-base, minus any of the actual content. Sometimes that is all you need.

Now let's look at the development workflow utilizing Docker. Let's start with the src/Dockerfile:



FROM php:7.0-apache

RUN apt-get update && apt-get install -y \
  vim \
  git \
  unzip \
  wget \
  curl \
  libmcrypt-dev \
  libgd2-dev \
  libgd2-xpm-dev \
  libcurl4-openssl-dev \
  mysql-client

ENV PHP_TIMEZONE America/Los_Angeles


RUN docker-php-ext-install -j$(nproc) iconv mcrypt \
&& docker-php-ext-configure gd --with-freetype-dir=/usr/include/ --with-jpeg-dir=/usr/include/ \
 && docker-php-ext-install -j$(nproc) gd pdo_mysql curl mbstring opcache


RUN curl -sS https://getcomposer.org/installer | php
RUN mv composer.phar /usr/local/bin/composer
RUN echo 'export PATH="$PATH:/root/.composer/vendor/bin"' >> $HOME/.bashrc


RUN composer global require drush/drush:8.1.2 drupal/console:0.11.3
RUN $HOME/.composer/vendor/bin/drupal init
RUN echo source '$HOME/.console/console.rc' >> $HOME/.bashrc


RUN echo "date.timezone = \"$PHP_TIMEZONE\"" > /usr/local/etc/php/conf.d/timezone.ini
ARG github_oauth_token

# Configure Composer with a GitHub OAuth token, if one was provided as a build argument.
RUN [ -n "$github_oauth_token" ] && composer config -g github-oauth.github.com $github_oauth_token || echo ''

RUN [ -e /etc/apache2/sites-enabled/000-default.conf ] && sed -i -e "s/\/var\/www\/html/\/var\/www\/web/" /etc/apache2/sites-enabled/000-default.conf || sed -i -e "s/\/var\/www\/html/\/var\/www\/web/" /etc/apache2/apache2.conf


COPY bin/drush-portal /usr/bin/drush-portal
COPY bin/drupal-portal /usr/bin/drupal-portal

COPY . /var/www/
WORKDIR /var/www/

RUN composer --working-dir=/var/www install

The majority of the Dockerfile should be self-explanatory. The important bits are the provisioning of a GitHub OAuth token and the addition of the {drupal,drush}-portal executables, which are essential for the bin/{drush,drupal} pass-through scripts.

Provisioning a GitHub OAuth token

Sometimes it is necessary to configure Composer to use an OAuth token to authenticate on GitHub's API when resolving dependencies. These tokens must remain private and should not be committed into version control. We declare that our Docker build will take github_oauth_token as a build argument. If present, it will configure Composer to authenticate using it to get around API rate limits. More on this later.

DrupalConsole and Drush pass-through scripts

Our previous build involved opening up an SSH port on the container running Drupal so that we could execute Drush commands remotely. However, we are already able to run Drush commands inside the container without SSH access by utilizing docker run. The catch is that the commands can get too lengthy. In fact, they will be extra lengthy because we also need to execute them from within the Vagrant machine using vagrant ssh.

Here are a couple of scripts that make it easier to execute drush and drupal commands from the host machine.

Here are the contents of bin/drupal and bin/drush, respectively:

#!/usr/bin/env bash
# bin/drupal
cmd="docker-compose -f /vagrant/docker-compose.yml run --no-deps --rm server drupal-portal $@"
vagrant ssh -c "$cmd"

#!/usr/bin/env bash
# bin/drush
cmd="docker-compose -f /vagrant/docker-compose.yml run --no-deps --rm server drush-portal $@"
vagrant ssh -c "$cmd"

This allows us to run Drush commands with bin/drush ... and DrupalConsole commands with bin/drupal ..., with the arguments passed over to the executables in the container.

Here are the contents of src/bin/drupal-portal and src/bin/drush-portal, respectively:

#!/usr/bin/env bash
# src/bin/drupal-portal
/root/.composer/vendor/bin/drupal --root=/var/www/web $@

#!/usr/bin/env bash
# src/bin/drush-portal
/root/.composer/vendor/bin/drush --root=/var/www/web $@

The above scripts are added to the container and are essential for making sure drush and drupal commands are applied to the correct directory.

In order for this to work, we actually have to remove Drush and DrupalConsole from the project's composer.json file. This is easily done via the composer remove command (e.g., composer remove drush/drush drupal/console).

The docker-compose.yml file

To tie everything together, we have this Compose file:

version: '2'
services:
  server:
    build:
      context: ./src
      args:
        github_oauth_token: ${GITHUB_OAUTH_TOKEN}
    volumes:
      - /mnt/code:/var/www
      - composer-cache:/root/.composer/cache
    env_file: environment
    links:
      - mysql:mysql
    ports:
      - 80:80
  mysql:
    image: 'mysql:5.7.9'
    env_file: environment
    volumes:
      - database:/var/lib/mysql

volumes:
  database: {}
  composer-cache: {}

There are four things of note:

  1. github_oauth_token: ${GITHUB_OAUTH_TOKEN}

    This tells Docker Compose to use the environment variable GITHUB_OAUTH_TOKEN as the github_oauth_token build argument, which, if not empty, will effectively provision Composer with an OAuth token. If you go back to the Vagrantfile, you will see that this environment variable is set in the virtual machine (because docker-compose runs under it) by appending it to the /etc/environment file. All that's needed is for the environment variable to be present in the host environment (OS X) during the provisioning step.

    For example, it can be provisioned via: GITHUB_OAUTH_TOKEN=<your-token> vagrant provision

  2. composer-cache:/root/.composer/cache

    This tells Docker to mount a volume on /root/.composer/cache so that the contents of this directory persist between restarts. This ensures that composer install and composer update stay fast and do not re-download packages from the web every time we run them. This will drastically improve build speeds.

  3. database:/var/lib/mysql

    This tells Docker to persist the MySQL data between builds as well, so that we don't end up with an empty database whenever we restart the containers.

  4. env_file: environment

    This lets us define all environment variables in a single file, for example:

    MYSQL_USER=activelamp
    MYSQL_ROOT_PASSWORD=root
    MYSQL_PASSWORD=some-secret-passphrase
    MYSQL_DATABASE=activelamp
    DRUPAL_MYSQL_HOST=mysql

    We just configure each service to read environment variables from the same file, since they both need these values.

We employ rsync to sync files from the host machine to the VM since it offers by far the fastest file I/O compared to the built-in alternatives in Vagrant + VirtualBox. In the Vagrantfile, we specified that we sync src/ to /mnt/code/ in the VM. Following this, we configured Docker Compose to mount this directory into the server container. This means that any file changes we make on OS X will get synced up to /mnt/code, and ultimately into /var/www in the container. However, this only covers changes that originate from the host machine.

To sync changes that originate from the container -- files scaffolded by drupal generate:*, Composer dependencies, and Drupal 8 core itself -- we'll use the fact that our project root is also available at /vagrant as a mount in the VM. We can use rsync to sync files the other way: rsyncing from /mnt/code to /vagrant/src will bring file changes back up to the host machine.

Here is a script I wrote that performs the rsync but asks for confirmation first, to avoid overwriting potentially uncommitted work:

#!/usr/bin/env bash

echo "Dry-run..."

args=$@

# Itemize the changes a real run would make; transferred files start with > or <.
diffs="$(vagrant ssh -- rsync --dry-run --itemize-changes $args | grep '^[><]')"

[ -z "$diffs" ] && echo "Nothing to sync." && exit 0

echo "$diffs"
read -p "Proceed with sync? [y/N] " confirm
[ "$confirm" = "y" ] && vagrant ssh -- rsync $args

We are keeping this generic and not baking in the paths because we might want to sync arbitrary files to arbitrary destinations.

We can use this script like so:

$ bin/sync-host --recursive --progress --verbose --exclude=".git/" --delete-after /mnt/code/ /vagrant/src/

If the rsync would result in file changes on the host machine, the script brings up a summary of the changes and asks whether you want to proceed.

Makefile

We are using make as our task-runner just like in the previous build. This is really useful for encapsulating operations that are common in our workflow:



sync-host:
	bin/sync-host --recursive --progress --verbose --delete-after --exclude='.git/' /mnt/code/ /vagrant/src/

sync:
	vagrant rsync-auto

sync-once:
	vagrant rsync

docker-rebuild:
	vagrant ssh -- docker-compose -f /vagrant/docker-compose.yml build

docker-restart:
	vagrant ssh -- docker-compose -f /vagrant/docker-compose.yml up -d

composer-install:
	vagrant ssh -- docker-compose -f /vagrant/docker-compose.yml run --no-deps --rm server composer --working-dir=/var/www install

composer-update:
	vagrant ssh -- docker-compose -f /vagrant/docker-compose.yml run --no-deps --rm server composer --working-dir=/var/www update --no-interaction



lock-file:
	@vagrant ssh -- cat /mnt/code/composer.lock

install-drupal: composer-install
	vagrant ssh -- '[ -f /mnt/code/web/sites/default/settings.local.php ] && echo '' || cp /mnt/code/web/sites/default/settings.local.php.dist /mnt/code/web/sites/default/settings.local.php'
	-bin/drush si activelamp_com --account-pass=secret -y
	-bin/drush en config sync_uuids -y
	bin/drush sync-uuids -y
	[ $(ls -l src/web/sites/default/config/*.yml | wc -l) -gt 0  ] && bin/drush cim -y || echo "Config is empty. Skipping import..."

init: install-drupal
	yes | bin/sync-host --recursive --progress --verbose --delete-after --exclude='.git/' /mnt/code/ /vagrant/src/

platform-ssh:
	ssh <project-id>@ssh.us.platform.sh

The Drupal 8 build steps are simply translated to use bin/drush and the actual paths within the virtual machine in the install-drupal task. After cloning the repository for the first time, a developer should just be able to execute make init, sit back with a cup of coffee and wait until the task is complete.

Try it out yourself!

I wrote the docker-drupal-8 Yeoman generator so that you can easily give this a spin. Feel free to use it to look around and see it in action, or even to start off your Drupal 8 sites in the future:

$ npm install -g yo generator-docker-drupal-8
$ mkdir myd8
$ cd myd8
$ yo docker-drupal-8

Just follow the instructions, and once complete, run vagrant up && make docker-restart && make init to get it up and running.

If you have any questions, suggestions, anything, feel free to drop a comment below!

Jun 15 2016
Jun 15

The web development community has a long list of requirements, languages, frameworks, constructs, and tools that most companies or bosses want you to know.

This list doesn't cover everything you need to know -- such as PHP, HTML, CSS, responsive web development principles, and Drupalisms -- but here are some of the important skills, concepts, and tools that we think you should know as a beginner Drupal developer.

####1. Version Control

Every developer should have some experience with version control and versioning. Version control is an essential part of the Drupal community: versioning allows Drupal projects to be easily managed, maintained, and contributed to in a uniform manner. Version control will also most likely be used in-house to manage each client project.

####2. Command Line Interface (CLI)

It isn't necessary to be a CLI ninja; however, being able to work comfortably in a CLI is very important. One of the advantages of using a CLI is productivity: you can quickly automate repetitive tasks, perform tasks without jumping from application to application, and use tools like Drush to perform tasks that would normally require navigating three or more mouse clicks.

####3. Package Managers

Using package managers is important to the installation of Drupal. Whether it is installing Sass or Bootstrap from npm or Drush from Composer, it is important to know how package managers work and exactly what you are running before executing commands on your computer.

####4. Contributing Back

An important part of the Drupal community is contributing back to projects and core. When you find an issue -- something that just doesn't seem to work correctly -- or you would like to add functionality to Drupal, you should think about giving back to the community. If you find an issue in an existing project or core, check whether there is an existing ticket for it. If there isn't, you can create one, and if you can debug and resolve the issue, you can contribute a patch. If you don't know exactly how to debug the issue, you can have an open conversation with other developers and maintainers to help resolve it. Contributing and interacting in the community moves Drupal forward.

####5. CSS Preprocessors

Within the last couple of years, there has been a movement toward CSS preprocessors, which add a programmatic feel to CSS2 and CSS3. Some are against preprocessors because they add a little more overhead to a project. But whether you use them by choice or not, you may encounter a client or framework that uses one, so you should be familiar with how a preprocessor works.

####6. A Framework

Within the Drupal community, there is often talk of headless Drupal, and we have seen some interesting ideas come from its adopters. Headless Drupal setups usually use a framework for the front end. It may be Angular, Angular 2, Backbone, Ember, or something different; however, most of the frameworks have two things in common: they are often written in JavaScript, and they almost always make use of templating.

####7. Templating

It is important to know the principles of templating so that you can easily pick up and learn new frameworks. Whether it is Mustache, Twig, Jade, or the templating syntax within Angular, there are similarities between the syntaxes, and the principles apply across languages, allowing you to step from one to the next with a smaller learning curve.

####8. Basic Debugging

Debugging a problem correctly can save you valuable time by taking you directly to the cause of an issue instead of looking over each line of code one by one. It is essential to know how to do basic debugging when working with Drupal. Sometimes the error messages give you enough information; other times it is necessary to reach for Devel or XDebug and step through the project to find the exact location where the code is not working correctly so that you can start to solve the problem.

####9. Unit Testing / Code Testing

Testing your own code is important. When it comes to code testing you have many options, from TDD to BDD: you can write unit tests to cover your classes and use linting to make sure you are writing "good", standardized code. Linting is helpful for writing code that others can easily navigate, and it sets up some best practices for you to follow.

####10. A CMS

When starting with Drupal, it might be good to have familiarity with a CMS platform before jumping in. There are advantages to knowing the constructs of other CMS platforms and being familiar with how to work within a platform. However, when working with Drupal it is important to think about the way Drupal works and not be stuck in the way other CMS platforms accomplish their goals.

####Conclusion

As a web developer, it is important to know many concepts and technologies. Many companies will not require you to know everything, do everything, and be a jack-of-all-trades. In technology, there are so many new tools, frameworks, and languages coming out daily that it is impossible to stay on top of them all. It is far better to build a good base understanding of core web concepts that can be applied across multiple languages, tools, and technologies, and then specialize.

Did I miss something you feel is important? Is there something you would like to have seen on the list? Leave a comment below.

Jun 07 2016
Jun 07

Continuing from Evan's blog post on building pages with Paragraphs and writing custom blocks of content as fields, I will walk you through how to create a custom field-formatter in Drupal 8 by example.

A field-formatter is the last piece of code to go with the field-type and the field-widget that Evan wrote about in the previous blog post. While the field-type tells Drupal what data comprises a field, the field-formatter is responsible for telling Drupal how to display the data stored in the field.

To recap: in the previous blog post we defined a hashtag_search field type whose instances are composed of two items, the hashtag to search for and the number of items to display. We want to convert this data into a list of the most recent n tweets with the specified hashtag.

A field-formatter is a Drupal plugin, just like its respective field-type and field-widget. Formatters live in <module>/src/Plugin/Field/FieldFormatter/ and are namespaced appropriately: Drupal\<module>\Plugin\Field\FieldFormatter.




namespace Drupal\my_module\Plugin\Field\FieldFormatter;


use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\FormatterBase;
use Drupal\Core\Form\FormStateInterface;


class HashtagFormatter extends FormatterBase
{

    public function viewElements(FieldItemListInterface $items, $langcode)
    {
        return array();
    }
}

We tell Drupal important details about our new field-formatter using a @FieldFormatter class annotation. We declare its unique id; a human-readable, translatable label; and a list of field_types that it supports.
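The annotation sits in the docblock directly above the class declaration. Here is a sketch of what it could look like for this formatter; the id and label shown are illustrative assumptions, while the hashtag_search field type comes from the previous post:

/**
 * @FieldFormatter(
 *   id = "hashtag_search_formatter",
 *   label = @Translation("Hashtag search"),
 *   field_types = {
 *     "hashtag_search"
 *   }
 * )
 */
class HashtagFormatter extends FormatterBase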

The most important method in a field-formatter is viewElements. Its responsibility is to return a render array based on the field data passed in as $items, a Drupal\Core\Field\FieldItemListInterface.

Let's look at the code:




use Drupal\my_module\Twitter\TwitterClient;
use Drupal\my_module\Twitter\TweetFormatter;

...

    
    /** @var TwitterClient */
    protected $twitter;

    /** @var TweetFormatter */
    protected $formatter;

    ...

    public function viewElements(FieldItemListInterface $items, $langcode)
    {
        $element = [];

        
        // Build a render array for each hashtag_search field item.
        foreach ($items as $delta => $item) {

            try {

                // Query Twitter for the most recent tweets with this hashtag.
                $results = $this->twitter->search($item->hashtag_search, $item->count);

                // Convert each tweet's raw text into HTML, linking mentions,
                // hashtags, and URLs via TweetFormatter.
                $statuses = array_map(function ($s) {
                    $s['formatted_text'] = $this->formatter->toHtml($s['text'], $s['entities']);
                    return $s;
                }, $results['statuses']);

                // Render the hashtag itself as a header above the tweets.
                if (!empty($statuses)) {
                    $element[$delta]['header'] = [
                        '#markup' => '<h3>#' . $item->hashtag_search . '</h3>',
                    ];
                }

                foreach ($statuses as $status) {
                    $element[$delta]['status'][] = [
                        '#theme' => 'my_module_status',
                        '#status' => $status,
                    ];
                }
            }
            catch (\Exception $e) {
                $this->logger->error('[:exception]: %message', [
                    ':exception' => get_class($e),
                    '%message' => $e->getMessage(),
                ]);
                continue;
            }
        }

        $element['#attached']['library'][] = 'my_module/twitter_intents';

        return $element;
    }

    ...

See https://github.com/bezhermoso/tweet-to-html-php for how TweetFormatter works. Also, you can find the source-code for the basic Twitter HTTP client here: https://gist.github.com/bezhermoso/5a04e03cedbc77f6662c03d774f784c5

Custom theme renderer

As shown above, each individual tweet is rendered with the my_module_status theme hook. We'll define it in the my_module.module file:




function my_module_theme($existing, $type, $theme, $path) {
  $theme = [];
  $theme['my_module_status'] = array(
    // 'status' is the variable passed in from the formatter's render array.
    'variables' => array(
      'status' => NULL
    ),
    'template' => 'twitter-status',
    'path' => $path . '/templates'
  );

  return $theme;
}

With this, we are telling Drupal to use the template file modules/my_module/templates/twitter-status.html.twig for any render array that uses my_module_status as its theme.

Render caching

Drupal 8 does a good job of caching content: typically, any field formatter is only called once, and the resulting render arrays are cached for subsequent page loads until the Drupal cache is cleared. We don't really want our Twitter block to be cached for that long. Since it is always good practice to keep caching enabled, we can instead define how caching should be applied to our Twitter blocks. This is done by adding cache definitions to the render array before we return it:



      public function viewElements(...)
      {

        ...

        $element['#attached']['library'][] = 'my_module/twitter_intents';
        
        // Expire this formatter's cached output five minutes after it is built.
        $element['#cache']['max-age'] = 60 * 5;

        return $element;
      }

Here we are telling Drupal to keep the render array in cache for five minutes. Drupal will still cache the rest of the page's elements however they want to be cached, but it will call our field formatter again -- which pulls fresh data from Twitter -- if five minutes have passed since the last time it was called.


Jun 03 2016
Jun 03

On a recent project, we had to create a section that is basically a Twitter search for a hashtag. It needed to be usable in different sections of the layout and work the same everywhere. We were using the Paragraphs module, and we came up with a pretty nifty (we think) solution: creating a custom field that solved this particular problem for us. I will walk you through how to create a custom field/widget/formatter for Drupal 8. There are Drupal Console commands for generating boilerplate code for this, which I will list before going through each of the methods for the components.

Field Type creation

The first thing to do is create a custom field. In a custom module (here called "my_module"), either run drupal generate:fieldtype or create a file called HashtagSearchItem.php in src/Plugin/Field/FieldType. The basic structure of the class will be:



namespace Drupal\my_module\Plugin\Field\FieldType;

use Drupal\Core\Field\FieldItemBase;
use Drupal\Core\Field\FieldStorageDefinitionInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Language\LanguageInterface;
use Drupal\Core\TypedData\DataDefinition;


class HashtagSearchItem extends FieldItemBase {



}
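Note that Drupal discovers the plugin through a @FieldType annotation in the docblock above the class, so the real file needs one. A sketch: the id matches the field type name used throughout these posts, while the label and the widget/formatter ids are assumed placeholders:

/**
 * @FieldType(
 *   id = "hashtag_search",
 *   label = @Translation("Hashtag search"),
 *   default_widget = "hashtag_search_widget",
 *   default_formatter = "hashtag_search_formatter"
 * )
 */
class HashtagSearchItem extends FieldItemBase {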

Next, implement a few methods that tell Drupal how our field is structured. First, provide default field settings for the field; here, that is the count of tweets to pull. This returns an array of default settings keyed by setting name.



  
  public static function defaultFieldSettings() {
    return [
      'count' => 6
    ] + parent::defaultFieldSettings();
  }

Then provide the field item's properties. In this case there will be an input for the hashtag and a count. Each property is keyed by its name and holds a DataDefinition describing what the property contains.


  
  public static function propertyDefinitions(FieldStorageDefinitionInterface $field_definition) {
    $properties = [];
    $properties['hashtag_search'] = DataDefinition::create('string')
      ->setLabel(t('The hashtag to search for.'));
    $properties['count'] = DataDefinition::create('integer')
      ->setLabel(t('The count of twitter items to pull.'));
    return $properties;
  }

Then provide a schema for the field, with columns for the properties created above.


  
  public static function schema(FieldStorageDefinitionInterface $field_definition) {
    return [
      'columns' => [
        'hashtag_search' => [
          'type' => 'varchar',
          'length' => 32,
        ],
        'count' => [
          'type' => 'int',
          'default' => 6
        ]
      ]
    ];
  }

Field widget creation

Next, create the widget for the field, which provides the actual form element and its settings. Either run drupal generate:fieldwidget or create a file in src/Plugin/Field/FieldWidget/ called HashtagSearchWidget.php. This is the class's skeleton:



namespace Drupal\my_module\Plugin\Field\FieldWidget;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\WidgetBase;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Render\Element;



class HashtagSearchWidget extends WidgetBase {
  
}
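As with the field type, the widget class is discovered through a @FieldWidget annotation in its docblock; a sketch with an assumed id and label:

/**
 * @FieldWidget(
 *   id = "hashtag_search_widget",
 *   label = @Translation("Hashtag search widget"),
 *   field_types = {
 *     "hashtag_search"
 *   }
 * )
 */
class HashtagSearchWidget extends WidgetBase {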

Then implement several methods: provide a default count of tweets to pull for new fields, along with the settings form and settings summary for the widget:


  
  public static function defaultSettings() {
    return [
      'default_count' => 6,
    ] + parent::defaultSettings();
  }

  
  public function settingsForm(array $form, FormStateInterface $form_state) {
    $elements = [];
    $elements['default_count'] = [
      '#type' => 'number',
      '#title' => $this->t('Default count'),
      '#default_value' => $this->getSetting('default_count'),
      '#empty_value' => '',
      '#min' => 1
    ];

    return $elements;
  }

  
  public function settingsSummary() {
    $summary = [];
    $summary[] = t('Default count: @count', array('@count' => $this->getSetting('default_count')));

    return $summary;
  }

Then create the actual form element. Add the hashtag textfield and the count number field, and wrap them in a fieldset for a better experience:


  
  public function formElement(FieldItemListInterface $items, $delta, array $element, array &$form, FormStateInterface $form_state) {
    $item = $items[$delta];

    $element['hashtag_search'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Hashtag'),
      '#required' => FALSE,
      '#size' => 60,
      '#default_value' => (!$item->isEmpty()) ? $item->hashtag_search : NULL,
    ];

    $element['count'] = [
      '#type' => 'number',
      '#title' => $this->t('Pull count'),
      '#default_value' => $this->getSetting('default_count'),
      '#size' => 2
    ];

    $element += [
      '#type' => 'fieldset',
    ];

    return $element;
  }

In part 2, Bez will show you how to pull the tweets and create a field formatter for the display of the tweets. You can read that post here!

