Published on Thursday 4, January 2024

I use GitHub to host my repositories, such as this website. To keep my dependencies up-to-date, I leverage Dependabot. The product has matured a lot over the past few years. It began as a standalone service before being acquired by GitHub; back then, it did not support dependencies managed by Composer and was pretty spammy and noisy. However, it has drastically improved since. Thanks to all of those at GitHub who have worked to improve it (that includes you, Mike Crittenden.)

My Dependabot configuration consists of a few items, nothing overly specific.

  • Defining each ecosystem in my repository (GitHub Actions, Composer, NPM)
  • Specifying a schedule for that ecosystem
  • Setting up ignore rules, such as avoiding major version bumps
  • Defining groups to combine packages that have batched releases

I'll walk through the different configuration options. At the end of the blog post, I have two examples: one for my blog and another for a Laravel application with a Vue.js frontend. I recommend reading the full documentation for the dependabot.yml configuration options, as I barely scratch the surface of my usage.

To configure Dependabot, you must have the dependabot.yml file in your .github directory.

First, all dependabot.yml files must start with the version key and contain an updates key. All of the package ecosystem configurations will go as an array under updates.

version: 2
updates:
  -

Now we can start fleshing out the file.

Defining ecosystems for dependency updates

Since we're working with a Drupal site, we have Composer as a package ecosystem.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"

The directory key is required; Dependabot doesn't provide a default. This tells Dependabot that there is a composer.json and composer.lock file in the root of the repository (/).

More than likely, the Drupal theme uses tooling for CSS and JavaScript, so we need to add NPM as an ecosystem. In my experience, projects end up having the package.json in the theme directory rather than in the project's root.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"

  - package-ecosystem: "npm"
    directory: "/web/themes/custom/mytheme"

This assumes the package.json and relevant lock file (yarn.lock or package-lock.json) are located at web/themes/custom/mytheme. If you're using Yarn or pnpm instead of NPM, the ecosystem key is still npm.

Let's say your Drupal project uses GitHub Actions for its continuous integration workflows. If you're using any GitHub Actions, those are also versioned and must stay updated. We can add the GitHub Actions ecosystem for that.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"

  - package-ecosystem: "npm"
    directory: "/web/themes/custom/mytheme"
    
  - package-ecosystem: "github-actions"
    directory: "/"

Note, for GitHub Actions, you do not need to specify the directory as .github/workflows, only /.

Now we'll begin to receive automatic dependency updates! But Dependabot will assign a random schedule for delivering them, which led to chaos and a lot of frustration with earlier versions of Dependabot.

Specifying a schedule to more easily manage dependency updates

I like to receive my updates weekly. You can have it run daily, weekly, or monthly. Dependabot seems to limit the number of updates it provides per run, so I wish you could configure multiple specific days, like Monday and Thursday, in case many of your dependencies have been updated.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"

  - package-ecosystem: "npm"
    directory: "/web/themes/custom/mytheme"
    schedule:
      interval: "weekly"    

  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"

By default, weekly updates are delivered on Monday but can be configured to a specific day of the week.

  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"
      day: "tuesday"

Now we've got scheduled updates to rein in dependency management chaos.

Setting up ignore rules to wrangle unwanted version bumps

What irritates me about Dependabot is how it will make a pull request to bump a dependency to a whole new major version when it does not match your semantic version constraints. For example, I had drupal/core-recommended set to ~9.5.0 in my composer.json. When Drupal 10.0.0 was released, I received a pull request bumping drupal/core-recommended to 10.0.0. That's not a valid update.

Luckily, we can use ignore to specify rules for ignoring specific dependencies or types of updates.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "drupal/core*"
        update-types: ["version-update:semver-major"]

This uses a wildcard to tell Dependabot to ignore major version updates for drupal/core, drupal/core-recommended, drupal/core-composer-scaffold, and drupal/core-dev. In the examples at the end, I also use the same for Symfony and Laravel.

Instead of using update-types, you can also use versions to ignore specific versions. I don't recommend this for Composer dependencies; you should use Composer's conflict package links instead.
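For instance (the package name and versions here are purely illustrative), a conflict entry in composer.json prevents Composer from ever resolving to a known-bad release, which is more robust than hiding it from Dependabot:

```json
{
    "require": {
        "drupal/core-recommended": "~10.1.0"
    },
    "conflict": {
        "drupal/some_module": "2.1.0 || 2.1.1"
    }
}
```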

Now, let's group related updates to reduce pull requests for framework packages.

Grouping dependency updates

I was excited once Dependabot rolled out the grouped version updates. When working with frameworks that have multiple packages released simultaneously, it combines the updates into one pull request. This helps avoid spam when Drupal core has a release, and you have to upgrade drupal/core and drupal/core-composer-scaffold along with the metapackages drupal/core-recommended and drupal/core-dev. The same goes for other frameworks with various components that may be released simultaneously, such as Symfony or Laravel.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "drupal/core*"
        update-types: ["version-update:semver-major"]
    groups:
      drupal-core:
        patterns:
          - "drupal/core*"

We can use * for wildcards here, as well.
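Groups support a few other keys as well. For example, exclude-patterns can carve a package out of a group so it still receives its own pull request (the group and pattern names here are illustrative):

```yaml
    groups:
      drupal-core:
        patterns:
          - "drupal/core*"
        exclude-patterns:
          - "drupal/core-dev"
```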

Examples

Here is the dependabot.yml for my Drupal site. I use the Gin theme, which has the companion modules Gin Toolbar and Gin Login, released at similar intervals, so I grouped them as well.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "drupal/core*"
        update-types: ["version-update:semver-major"]
    groups:
      drupal-core:
        patterns:
          - "drupal/core"
          - "drupal/core-composer-scaffold"
          - "drupal/core-recommended"
          - "drupal/core-dev"
      gin:
        patterns:
          - "drupal/gin"
          - "drupal/gin_login"
          - "drupal/gin_toolbar"
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"

Here is one from a Laravel application that uses a Vue.js frontend. I grouped the Symfony and Laravel updates and ignored major version updates.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "symfony/*"
        update-types: ["version-update:semver-major"]
      - dependency-name: "laravel/*"
        update-types: ["version-update:semver-major"]
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"

Here is my configuration file for drupal-mrn.dev, a monorepo containing a backend API and single page application frontend.

version: 2
updates:
  - package-ecosystem: "composer"
    directory: "/api"
    schedule:
      interval: "weekly"
  - package-ecosystem: "npm"
    directory: "/api"
    schedule:
      interval: "weekly"
  - package-ecosystem: "npm"
    directory: "/app"
    schedule:
      interval: "weekly"
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"

Want more? Sign up for my weekly newsletter

Published on Tuesday 14, November 2023

I'm excited to announce a new feature coming to phpstan-drupal that already exists for PHPStan. PHPStan has an online playground to run and test analysis results. Soon, we will have one for phpstan-drupal! The online playground is an extremely useful tool when reporting bugs to PHPStan and will make it easier to report bugs for phpstan-drupal. 

I had thought about building this previously but was concerned about possible costs. After all, phpstan-drupal is a personal open-source project. It was brought up again in the #phpstan channel in Drupal Slack by Dezső Biczó (mxr576.) In a bit of great timing, OPTASY recently signed up as an organization sponsor through GitHub Sponsors. I will use these funds to pay for the playground's operating costs.

When up and running, hopefully by the end of November, the playground will support analyzing code against the latest version of Drupal with PHPStan and phpstan-drupal. Later iterations will allow customizing the version of Drupal used (it's a bit more complicated.)

I emailed Ondřej asking if it was okay to copy the code; even though it's open source, it's always good to ask. Ondřej was also nice enough to disclose that the playground is affordable. The phpstan-drupal playground will probably receive less traffic, but I expect it to have longer execution times. With that, I'm assuming it should fall within a reasonable range.

The playground uses the Serverless framework to deploy to AWS Lambda. The code is broken into three components:

  • playground-runner: Executes sample PHP code with PHPStan configuration and returns the results, not a publicly exposed function.
  • playground-api: Public API, which executes the playground-runner and allows storing results in an S3 bucket for sharing.
  • website: The interface to interact with the playground-api.

I got everything up and running in a few hours with a rough interface. The biggest challenge was getting phpstan-drupal to properly analyze with the drupal/core package in the vendor directory. With a few hacks, it works.

However, there are some quirks. When I passed it the following code, I got the correct deprecation errors:

But, when I sent the following:

loadInclude('node', 'inc', 'node.admin');

It told me that the \Drupal class could not be found! I have some debugging to do.

I am also going to rewrite the playground-api code copied from PHPStan. PHPStan supports testing code from PHP 7.2 to PHP 8.2 and beyond. The phpstan-drupal playground will only run PHP versions supported by Drupal core. I will also need to see if we can support multiple versions of Drupal core. It might result in a few different playground-runner functions. For example, one for 11.x, 10.3.x, and ^10.2.0. We'll see.


Published on Tuesday 7, November 2023

The Drupal Association has been working on a monumental effort to migrate away from our bespoke DrupalCI continuous integration system to GitLab CI as part of the GitLab Acceleration Initiative. Drupal core's test runs are five times faster using GitLab CI. I have loosely followed the progress as Drupal moves from our custom-built infrastructure onto GitLab. But someone shared with me a little feature I missed: adding a PHPStan job to the default GitLab CI templates!

Fran Garcia-Linares (fjgarlin) is the engineer from the Drupal Association who has been working on the GitLab CI templates. GitLab supports templates to allow reusing configuration for continuous integration workflows. The new phpstan job does a handful of things, and I love its approach.

  • Allows modules to commit a phpstan.neon file to provide customized PHPStan configuration, such as level: 9 or specific errors to be ignored.
  • Exports errors as a JUnit report, GitLab quality report for the user interface, and terminal output for users to manually review.
  • Generates a baseline file uploaded as an artifact that can be included with the project so they can start using PHPStan and accept all existing errors to be fixed later on!

What I found very creative was the way each report is generated. PHPStan uses a result cache to make subsequent scans faster. The phpstan job takes advantage of this to create multiple reports from one analysis: it runs PHPStan three times with different output formats, capturing the exit code after each run, and then generates the baseline.
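A simplified sketch of that approach in GitLab CI (the job name, paths, and script are illustrative, not the actual Drupal.org template) might look like this, where the later runs are cheap because they reuse the result cache:

```yaml
phpstan:
  script:
    # First run is the real gate; capture the exit code instead of failing.
    - php vendor/bin/phpstan analyze --no-progress || EXIT_CODE=$?
    # Re-run (cached) to produce machine-readable reports.
    - php vendor/bin/phpstan analyze --no-progress --error-format=junit > junit.xml || true
    - php vendor/bin/phpstan analyze --no-progress --error-format=gitlab > quality.json || true
    # Generate a baseline artifact so projects can adopt PHPStan gradually.
    - php vendor/bin/phpstan analyze --no-progress --generate-baseline=phpstan-baseline.neon || true
    - exit ${EXIT_CODE:-0}
  artifacts:
    when: always
    reports:
      junit: junit.xml
      codequality: quality.json
    paths:
      - phpstan-baseline.neon
```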

With most of my open source work being on phpstan-drupal or Retrofit, I haven't worked on Drupal modules that often recently. I haven't had a chance to try out GitLab CI on Drupal.org yet. I need to set aside some time to check it out!

Is your module using GitLab CI yet? If not, check out the extensive documentation: https://www.drupal.org/node/3356364/


Published on Tuesday 24, October 2023

Retrofit for Drupal now has documentation to explain what Drupal 7 hooks and APIs are supported. Retrofit for Drupal provides backward compatibility layers to run legacy Drupal 7 code in your sites using Drupal 10 and beyond. The problem, however, is anyone evaluating Retrofit for Drupal has had to take time to give it a test run and "see what happens." The documentation explains the various backward compatibility layers.

Documentation is critical for open source projects. Especially for a project that doesn't add functionality but ensures functionality for existing code continues to work. I hit some decision paralysis on how to set up documentation. The point of Retrofit for Drupal is to avoid having to configure or change code. However, the common question was, "But what hooks does it support?" Now, there is documentation within the repository to document the following:

Unfortunately, this doesn't cover the fact that certain functions are supported, such as all database functions or variable_get and variable_set. I'll be adding more information shortly.

It isn't a perfect solution. I need to solicit feedback on answering "Is this hook supported?" I assume it would involve some search feature, like Drupal's API reference. The problem is that I don't want to build API documentation about Retrofit for Drupal and its internals. One idea I had was a mapping file somewhere that would allow mapping a component of Drupal 7 (functions, hooks, concepts) to its backward compatibility layer components in Retrofit. Maybe there could be a way to use phpDocumentor or ApiGen and PHP attributes to solve this problem.


Published on Wednesday 18, October 2023

One of the major problems observed in making contributed modules Drupal 10 compatible was maintaining support for Drupal 9.5, which remains in security support (or remained, if you are reading this past November 1, 2023.) Contributed modules should be compatible with all security-supported versions of Drupal core, or more if they so choose. This can be difficult, as Drupal 9.5 contained deprecated code removed in Drupal 10. It was up to maintainers and contributors to find workarounds and copy-paste tricks using if/else statements with version_compare.

If you're interested in these challenges, I recommend catching my talk from MidCamp this year – Lessons learned from helping port the top contrib projects to Drupal 10.

Making backward-compatible calls possible

Luckily, Drupal 10.1.3 shipped a new utility class that makes supporting multiple versions of Drupal core easier while addressing deprecated code. The new class \Drupal\Component\Utility\DeprecationHelper and its backwardsCompatibleCall method allow code to be executed conditionally based on the current Drupal core version (change record.)

For instance, the user_roles() function was deprecated in Drupal 10.2.0. Here's an example of replacing the deprecated user_roles() call using the DeprecationHelper::backwardsCompatibleCall method. Note: the example uses named arguments for demonstration purposes; they are not required.

use Drupal\Component\Utility\DeprecationHelper;
use Drupal\user\Entity\Role;

$result = DeprecationHelper::backwardsCompatibleCall(
    currentVersion: \Drupal::VERSION,
    deprecatedVersion: '10.2',
    currentCallable: fn() => Role::loadMultiple(),
    deprecatedCallable: fn() => user_roles(),
);

Let's break down this code. DeprecationHelper::backwardsCompatibleCall has four arguments:

  • currentVersion is the version to be checked, which is the version of Drupal from \Drupal::VERSION.
  • deprecatedVersion is the version that introduced the deprecated code path, which happened in the release of Drupal 10.2.0.
  • currentCallable is the callable to be invoked for the new code path.
  • deprecatedCallable is the callable to be invoked for the deprecated code path.

The method uses PHP's version_compare function to determine if currentVersion is greater than or equal to deprecatedVersion.
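Conceptually, the helper boils down to that single version_compare check. This minimal sketch (my own illustration, not the actual core implementation) shows the underlying logic:

```php
<?php

// Simplified illustration of DeprecationHelper::backwardsCompatibleCall.
// Not the real core code, just the idea behind it.
function backwardsCompatibleCall(
    string $currentVersion,
    string $deprecatedVersion,
    callable $currentCallable,
    callable $deprecatedCallable
): mixed {
    // Invoke the new code path when the running core version is at or
    // beyond the version that deprecated the old API.
    return version_compare($currentVersion, $deprecatedVersion, '>=')
        ? $currentCallable()
        : $deprecatedCallable();
}

// On Drupal 10.2.0 this picks the new code path; on 10.1.x the old one.
echo backwardsCompatibleCall('10.2.0', '10.2', fn() => 'new', fn() => 'old');
```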

I sincerely appreciate everyone's discussions and bikeshedding of this code. There was a lot of back and forth over naming and documentation, which is extremely important for a tool like this. It may only be a few lines of code, but we needed to nail the developer experience.

There is a caveat, however. Modules using DeprecationHelper must support a minimum version of 10.1.3 for Drupal core. With the release of Drupal 10.2.0 during the week of December 11, 2023, Drupal 10.0.x will no longer receive security support. That means modules can begin choosing to drop support for 10.0.x and bump their minimum supported version to 10.1.

Making drupal-rector backwards compatible

Earlier this year, I wrote about the need to add backward compatibility to automated code fixes provided through Rector. Björn Brala (bbrala) has made this a reality: https://github.com/palantirnet/drupal-rector/pull/250!

As the Project Update Bot delivers automatic code fixes to projects for Drupal 11 compatibility through Rector, we can use DeprecationHelper to ensure breaking changes are not introduced. This is a giant leap forward in innovation for Drupal as the community works to reduce manual upkeep and maintenance of Drupal sites.


Published on Tuesday 26, September 2023

I just finished reading Modernizing Drupal 10 Theme Development by Luca Lusso, published by Packt Publishing. Don't let the title make you think it is only about theme development. The book provides excellent coverage of general Drupal knowledge, so much so that I highly recommend this book as a go-to resource for any frontend or full-stack developer new to Drupal. I'd also recommend it for any backend developer working on a large team so that they understand the work their frontend colleagues are performing and how to assist them best.

I enjoyed the flow of the chapters. Reading this book made me think back to when I had my first job building websites, which was primarily front-end development. The book takes you down the journey of a developer tasked to implement a design onto a Drupal site.

There is a starting code base that you clone and work with throughout the book. This felt natural for how build phases may work: backend developers or site builders scaffold the site, and then it is themed.

The book begins by onboarding the reader: creating a local environment, using the demo code base, covering some basics of how Drupal works, and creating a new theme. Then, it dives into styling individual landing pages and components. Each step along the way, the book layers in more Drupal knowledge alongside the theming implementation.

My favorite highlights from the Modernizing Drupal 10 Theme Development book are:

  • As part of the local development environment setup with DDEV, it provides an excellent introduction for installing and setting up Docker for new users.
  • Creating a new theme using the Starterkit theme and TailwindCSS with Browsersync to provide a streamlined front-end development experience.
  • Introducing BackstopJS for visual regression testing! It is only briefly mentioned, but the example code repository provides the configuration and DDEV custom commands.
  • Learning how to use the WebProfiler module to understand various parts of Drupal's render pipeline and theme information.
  • Understanding how to work with component-based design systems using Storybook and integrating them with Twig templates, including the newly added Single Directory Components.
  • Explaining how to use Twig namespaces when referencing Twig templates defined by other extensions.
  • Xdebug! The book explains why Xdebug is useful and how to use it in your development flow for theming, including debugging your Twig templates!
  • The book calls out a common mistake when modifying the output of templates: the loss of cacheable metadata bubbling. I was thrilled to see this covered and explain how you may end up with lost cacheable metadata and debugging it.
  • Creating custom Twig functions and filters to abstract away logic from templates into helper methods, keeping templates lean and clean.
  • Using custom entity type bundle classes to encapsulate logic further and remove it from your theming layer. The example given is creating a custom method to abstract away accessing a string value's content.

My only nitpick is the number of comparisons to Drupal 7. Throughout the book, there are various comparisons to how Drupal 7 did things. I'd prefer we stop comparing Drupal to its previous iterations, but it may be good that this book does. There may be a frontend developer who worked on Drupal 6 or 7 and rolled their eyes at an upcoming Drupal project. Then they read this book, and it completely changes their mind about frontend development on a Drupal site.

Links to products may be Amazon or other affiliate links, which means I will earn a commission if you click through and buy something.


Published on Wednesday 20, September 2023

Last week, I drove up to Minneapolis and attended Twin Cities DrupalCamp. I have only made it to the conference once before, way back in 2016, to present about the beginnings of Drupal Commerce 2.x. This is the first time Twin Cities DrupalCamp has been held in the end-of-summer/beginning-of-fall period. It was always held in June, which conflicted with other events and with family time at the end of the school year.

Twin Cities DrupalCamp was on Thursday and Friday. Each day had four session slots, with the afternoon set aside for an unconference. I enjoyed this format. It can be mentally taxing to sit through sessions all day, and it was a nice change of pace to have more casual conversations about different topics in the afternoon. After all, one of the best parts of getting together at a conference is those serendipitous moments.

All of the session recordings are now available on YouTube: https://www.youtube.com/playlist?list=PLztBsFl4ot8t3uTki5JGMydmBdZPFOCaU. Thank you, Kevin Thull, for providing the recording equipment and recordings for so many conference sessions through your Drupal Recording Initiative.

It was a short trip for me. I drove to Minneapolis Wednesday afternoon and went home Friday afternoon. It's usually a 5-6 hour drive. However, the trip took a bit longer, about 7 hours, because the rental company gave me a Tesla Model 3. That was pretty cool, because I have never driven a Tesla, let alone any electric car. The extra time was due to chargers not being as easily accessible off the interstate as gas stations are, and, of course, the actual charge time. And it only cost roughly $30 in charging fees, versus the roughly two tanks of gas that would have cost $120 ($3.75 a gallon at 16 gallons.)

Save time upgrading from Drupal 7 to Drupal 10 using Retrofit

Twin Cities DrupalCamp was also my first time talking about Retrofit for Drupal. On Thursday morning, I gave my "Save time upgrading from Drupal 7 to Drupal 10 using Retrofit" talk. You can download the slides as well.

The session was well attended and led to some great discussions, included in the recording at the end. Overall, there was a lot of positive feedback. It also led to some more interesting discussions about Drupal 7 during the unconference on Friday.

Thursday

Unfortunately, I missed a few sessions on Thursday. After I gave my session, I had to take a work call. Then I missed lunch and Tiffany Farriss's keynote, but I had the rare opportunity to get an in-real-life lunch with my manager. Luckily, I knew the sessions were going to be recorded!

I am ready to watch A WordPresser in a Drupal World over the weekend. Open source projects should always be reviewing their developer experience and finding ways to make it easier to onboard developers from other platforms.

I am also going to watch Decoupled Drupal Dev on Docker with Docksal Doing the Dirty Work. Doing decoupled application development in Drupal is weird. When you're using Symfony or Laravel, it's very easy to have the frontend and backend code in the same repository; in fact, those frameworks can mount the frontend application for you. With how Drupal handles rendering, it's a bit more difficult because we can't completely remove the theme system and have non-admin or non-API routes "just" served from a single-page application. Also, you may not want this approach if you're using a full-stack framework like Next.js. I have set up a workflow using DDEV and networking between projects. I'm really curious to see what J.D. has set up.

I was able to catch Building In Public: How live-streaming software development can supercharge your programming abilities. I was excited to see this since I do coding via live-streaming. It was really cool to meet Mark Dobossy and hear about his journey with live-streaming. He isn't a Drupal developer; he's an Angular and Ionic developer. If I'm remembering correctly, J.D. (who also live-streams!) suggested Mark give this talk, since he's local to the area.

So, as a plug, here are all of our Twitch channels:

Thursday night there was a social at the House of Balls, where we had some delicious BBQ. While I didn't partake, there was definitely karaoke. 

Friday

Friday was a really packed day.

The morning kicked off with Matthew Tift and Habits of an Effective Drupal Contributor. Even though I am definitely not a first-time contributor, I am sure I have various habits that I can improve to be an even more effective contributor. I really enjoyed this session, especially grounding myself in the basics that newcomers experience. It has been 10 years since I was in that position.

Next I caught Brian Perry's The Drupal API Client Project session. The Drupal API Client project was one of the funded projects from the Pitchburg Innovation Contest. There are plenty of JSON:API or GraphQL clients. But there isn't a Drupal client. Yes, technically Drupal implements specifications and there isn't anything Drupal-specific. But it is something needed for brand presence, as you can see from Brian's presentation when you search NPM for wordpress versus drupal. I am really excited about this project because of my history of API-first e-commerce with Centarro and my new work at Acquia (yet to be announced.)

When you see that Steve Persch is giving a session, you just go. The Fourth Decade of Website Deployments covered the ways creating websites has evolved over time: from simple servers to edge distribution, to the micro-ization of all the things, to the future of edge computing. I haven't watched the recording yet, so I don't know how much was caught on camera (he had a few cameras pointed to catch the props.) But it's a great watch regardless.

After lunch, I caught Cooking with Caching: Drupal code served fast! by Tess Flynn (socketwench.) It was a fun themed session. Caching is a complex topic, and the session covered many components of Drupal's caching.

I stuck around for the first unconference session. We had a table to discuss the concept of a Drupal 7 soft landing. There are a few options for Drupal 7 sites, and they are all outlined on the Drupal 7 End of Life landing page. There are a few problems.

Do people even know they have a Drupal 7 site? Some people will read this and go, "how not?" Others have worked with non-profits or very small businesses. One of the folks at the table (I'm so sorry your name is escaping me!) doesn't even do Drupal anymore, but he knows he built several sites on Drupal 7, and those organizations may have no idea their website will be running on end-of-life software. He felt it was on his conscience to make sure they were on a stable platform.

THEMES. We discussed themes a lot. Migrating from Drupal 7 to Drupal 10 requires rewriting your theme to move from PHPTemplate to Twig, among other changes. I've been working on Retrofit to support themes and PHPTemplate overrides, but will it be enough? Moving to Backdrop also requires reworking your theme. Backdrop still uses PHPTemplate but has made some of the same internal changes to the variables available to templates that Drupal has made.

A proper workflow for what an organization needs for their migration. Do organizations really have a great way to understand their Drupal 7 migration? Do they know part of the migration is evaluating the number of module dependencies and their compatibility? Do they have any custom code? If the site has only a few modules and no custom code, it could be extremely easy to migrate. Do they have a custom theme? Chances are, yes. Are they considering a redesign/refresh as part of the upgrade? These kinds of workflows can help organizations make decisions. Or it could be a template that smaller agencies can use when creating proposals for clients.

Twin Cities DrupalCamp 2024

It hasn't been announced if there will be a Twin Cities DrupalCamp 2024. But I have a good feeling there will be. I'm looking forward to attending next year!


Published on Tuesday 29, August 2023

I finally took a look at writing custom live templates in PhpStorm. I've used them several times to automate scaffolding a foreach statement or other snippets where a Tab keypress expands into scaffolded code. But I never really dove in to see how they work or how they could be customized. Then I had to fix some of my code to comply with Drupal's coding standards.

Drupal requires document comments for all methods, including a short comment. 

  • When a method overrides a method or implements one from an interface, you use {@inheritdoc}, which indicates that the documentation for the method should come from the parent definition.
  • For __construct, we used a pattern of Constructs a new $CLASS_NAME object. as our short comment.

Most of the time, I skip these nuanced coding standards until I am happy with my code. Then I toil along copying, pasting, and manually adjusting. Finally, I got sick of it and decided to take 10 minutes to try and automate the dang thing.

To create a live template, you can follow the documentation or these quick steps:

  1. Open settings (CMD + , or Ctrl + Alt + S)
  2. Select Editor
  3. Select Live templates
  4. Click the + icon to add a template

The kicker was figuring out that you have to assign contexts to live templates. It took me a minute to notice the warning at the bottom of the user interface and the link to open the menu to select a context.

Live template for {@inheritdoc}

This one was pretty simple. For the abbreviation, I just used inheritdoc. And then for the template:

/**
 * {@inheritdoc}
 */

And then, for the contexts, I selected class members.

When typing code, I only need to type inheritdoc and press Tab to get my document block for the method.

Live template for __construct

Creating a live template for __construct comment blocks requires configuring a variable in the template. The format I use is Constructs a new $CLASS_NAME object. We need $CLASS_NAME to be the value of the current class name.

I couldn't think of a good abbreviation, so I used cnsdoc as shorthand for "constructor" and "doc."

The template is:

/**
 * Constructs a new $CLASS_NAME$ object.
 */

Variables in live templates start and end with $. Once a template has a variable, the Edit Variables button becomes active. We have to define what $CLASS_NAME$ is derived from. The expression is phpClassName().

For the contexts, I selected class members.

Now, I can go to my __construct method and generate my comment block!

Here is the result:
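As an illustration (FooManager is a hypothetical class name, not from the original post), expanding cnsdoc inside a class body produces a docblock like this:

```php
<?php

// Hypothetical class used to illustrate the expanded cnsdoc template.
// The phpClassName() expression resolves $CLASS_NAME$ to the enclosing
// class name when the template is expanded.
class FooManager
{

  /**
   * Constructs a new FooManager object.
   */
  public function __construct()
  {
  }

}
```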


Published on Tuesday 22, August 2023

In Drupal, a theme can override default template output by copying, pasting, and customizing those templates. It works, but how? I thought I always knew how this worked, but I realized I didn't when I dove into supporting Drupal 7 themes with Retrofit.

I know Drupal builds the theme registry and theme hook suggestions, and that part of that process includes scanning the active theme for template overrides and theme hook implementations. But when reading the \Drupal\Core\Theme\Registry code, I was coming up blank.

Components of Drupal's theme registry

The theme registry in Drupal contains all theming information about defined templates and preprocess hooks. There is a base registry of theme hooks defined by modules. Then, there are versions of the registry for individual themes. That is because each theme may implement preprocess hooks or template overrides.

  • \Drupal\Core\Theme\Registry – maintains all theme hook, preprocess, template, and related information.
  • \Drupal\Core\Utility\ThemeRegistry – a runtime registry cache collector that decorates \Drupal\Core\Theme\Registry; it is instantiated by \Drupal\Core\Theme\Registry::getRuntime to create a theme-specific subset of the registry.
  • \Drupal\Core\Template\Loader\ThemeRegistryLoader – a Twig loader that loads templates based on the ThemeRegistry runtime registry.

It's somewhat confusing. But it all revolves around \Drupal\Core\Theme\Registry. The \Drupal\Core\Utility\ThemeRegistry class is a way to try and optimize caching for the registry data.
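To make that concrete, each entry in the built registry is an array keyed by theme hook name. The keys below ('template', 'path', 'variables', 'preprocess functions') are the kinds of keys the build process fills in; the hook name and values here are hypothetical:

```php
<?php

// Hypothetical registry entry for a theme hook named "mymodule_item".
// The key names mirror the theme registry's structure; the values are
// illustrative, not literal core data.
$registry = [
  'mymodule_item' => [
    'type' => 'module',
    'template' => 'mymodule-item',
    'path' => 'modules/custom/mymodule/templates',
    'variables' => ['title' => NULL, 'content' => NULL],
    'preprocess functions' => [
      'template_preprocess',
      // A theme's own preprocess hook, e.g. mytheme_preprocess_mymodule_item,
      // is appended when that theme's version of the registry is built.
    ],
  ],
];
```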

Building the theme registry

The magic happens in \Drupal\Core\Theme\Registry::build. The theme registry will first try to retrieve cached registry data about theme information provided by modules, consistent across any theme being used.

if ($cached = $this->cache->get('theme_registry:build:modules')) {
    $cache = $cached->data;
}

If that results in a cache miss, Drupal invokes hook_theme for all installed modules to create the base theme registry. The processExtension method invokes the extensions' hook_theme implementation and discovers any preprocess function hooks provided by that extension.

$this->moduleHandler->invokeAllWith('theme', function (callable $callback, string $module) use (&$cache) {
    $this->processExtension($cache, $module, 'module', $module, $this->moduleList->getPath($module));
});
$this->cache->set("theme_registry:build:modules", $cache, Cache::PERMANENT, ['theme_registry']);
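For reference, a minimal hook_theme implementation that this step would discover might look like the following (mymodule is a hypothetical module name):

```php
<?php

/**
 * Implements hook_theme() for a hypothetical module named "mymodule".
 *
 * Declares one theme hook whose default template would be
 * templates/mymodule-item.html.twig inside the module.
 */
function mymodule_theme($existing, $type, $theme, $path) {
  return [
    'mymodule_item' => [
      'variables' => ['title' => NULL, 'content' => NULL],
    ],
  ];
}
```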

The next part of building the registry considers the current theme, its theme engine, and its base themes. This is where I got confused and a bit lost: the code is calling processExtension again.

foreach (array_reverse($this->theme->getBaseThemeExtensions()) as $base) {
    $base_path = $base->getPath();
    if ($this->theme->getEngine()) {
        $this->processExtension($cache, $this->theme->getEngine(), 'base_theme_engine', $base->getName(), $base_path);
    }
    $this->processExtension($cache, $base->getName(), 'base_theme', $base->getName(), $base_path);
}

The theme's engine (Twig) is then processed. Again, it is calling processExtension, and I didn't think anything of it.

if ($this->theme->getEngine()) {
    $this->processExtension($cache, $this->theme->getEngine(), 'theme_engine', $this->theme->getName(), $this->theme->getPath());
}

Then the theme itself is processed.

$this->processExtension($cache, $this->theme->getName(), 'theme', $this->theme->getName(), $this->theme->getPath());

I was perplexed. I know themes do not need to invoke hook_theme to provide template overrides. They're automatically detected. Everything I read within processExtension for themes was primarily handling their preprocess hooks.

Then I took a step back. I fired up Xdebug and truly stepped through the process. And I realized I overlooked a crucial part of the processing — handling of the theme_engine.

The twig_theme hook implementation

Located in the twig.engine file is the twig_theme theme hook implementation. It gets invoked when processExtension is called for the theme_engine extension type.

function twig_theme($existing, $type, $theme, $path) {
  return drupal_find_theme_templates($existing, '.html.twig', $path);
}

The drupal_find_theme_templates function lives in includes/theme.inc. It scans the directories of the themes available to Drupal for template files with the given extension, .html.twig. The templates are then matched in two ways:

  • Convert template file names with - to _ to match the function naming scheme of theme hooks.
  • Check if the template file name matches the template file name of an existing theme hook.

If a match exists, the theme registry will use that template file rather than the default template.
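The first matching rule can be sketched as a small function. This is a simplified illustration, not the actual drupal_find_theme_templates implementation:

```php
<?php

/**
 * Derives a theme hook name from a template file name (simplified sketch).
 *
 * "node--article.html.twig" becomes "node__article", which the registry
 * then compares against existing theme hooks and suggestion patterns.
 */
function template_file_to_hook(string $filename, string $extension = '.html.twig'): string {
  // Strip the extension, then convert dashes to underscores.
  $name = basename($filename, $extension);
  return str_replace('-', '_', $name);
}
```

For example, template_file_to_hook('node--teaser.html.twig') returns 'node__teaser', and passing '.tpl.php' as the extension handles PHPTemplate file names the same way.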

Once I discovered this tidbit, I could support PHPTemplate template overrides with Retrofit by overriding logic in processExtension.

if ($type === 'theme_engine') {
    $templates = drupal_find_theme_templates($cache, '.tpl.php', $path);
    foreach ($templates as $theme_hook => $info) {
        $cache[$theme_hook]['phptemplate'] = $info['path'] . '/' . $info['template'] . '.tpl.php';
        $cache[$theme_hook]['template'] = 'theme-phptemplate';
        $cache[$theme_hook]['path'] = '@retrofit';
    }
}


Published on Tuesday 15, August 2023

The final countdown to the Drupal 7 end-of-life has begun ticking now that it has received its final extension. Why is upgrading from Drupal 7 to Drupal 10 so daunting? Three problems must be faced when making the jump:

  • Migrating existing site configuration, content models, and content to its new schema.
  • Rewriting custom modules from legacy APIs to their modern API equivalents.
  • Rewriting custom themes from the PHPTemplate template engine to Twig.

Those are three big tasks. The Drupal community has focused heavily on the first problem with the bundled Migrate modules and with efforts to provide migrations in popular contributed modules. However, the second and third problems have no tooling available for development teams. Meanwhile, these sites built on Drupal 7 still have active bug tickets and feature requests, which puts development teams in the difficult position of maintaining an existing platform while upgrading to a new major version and refactoring their existing code base.

Retrofit for Drupal solves that problem. Retrofit for Drupal allows organizations that invested in their platforms built on Drupal 7 to upgrade to Drupal 10 and beyond. Retrofit for Drupal supports the following Drupal 7 APIs and features:

  • Global variables: Continue using the global $user object for the current user.
  • Permissions: Your existing hook_permission will be used; there is no need to convert it to a permissions.yml file.
  • Menu system: No need to rewrite your hook_menu definitions to the new routing, menus, and links APIs.
  • Block API: Keep your existing hook_block_* definitions without refactoring to the new plugin format.
  • Form API: Allows you to keep your existing form hooks and access form state as an array without migrating to the Drupal\Core\Form\FormStateInterface object.
  • Theme functions: Keep your existing theme functions without converting them to Twig templates.
  • PHPTemplate templates: Keep your existing PHPTemplate templates; they'll now be rendered through Twig!

And more to come! Active development is underway to support themes fully.

Want to stay informed about Retrofit for Drupal updates, including upcoming support options? Sign up for updates here: https://retrofit-drupal.com/stay-updated.

