
Allan Chappell

Senior Support Lead

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

November 27, 2023

At the time of this writing, we have done two major version upgrades of Drupal and have refined the process along the way. There has been a lot of work in the community, through the efforts of people like Matt Glaman, to make this process easier.

As a Support Engineer, I see a lot of approaches for achieving the same results in many areas of my work. Here, I’d like to share with you three different ways to achieve an upgrade of a module or theme that isn’t ready for the next major Drupal version, each with pros and cons, but all absolutely acceptable.

Why do we have this problem?

All new Drupal developers have a hard time with the layers of code changes that happen in the Drupal community. We have custom package types, custom install locations, patches, and scaffolding. To make the challenges worse, we have two ways to identify a module’s dependencies: a .info.yml file and, for some modules, a composer.json. This is because some Drupal modules may want to build upon an existing PHP library or project, in addition to other Drupal modules. To ease the pain of defining some dependencies twice, in both the .info.yml file and the composer.json file, Drupal.org built its packagist, a repository of Composer packages, to read the .info.yml files from the root of each project and create Composer version constraints from them. For example, if the .info.yml file contained the following:

name: My Module
type: module
core_version_requirement: ^8.8 || ^9
dependencies:
  - ctools:ctools

Then Drupal.org’s packagist would create the following for the release that contained that .info.yml file, saving the contrib developer a lot of trouble.

{
  "type": "drupal-module",
  "name": "drupal/my_module",
  "require": {
    "drupal/core": "^8.8 || ^9",
    "drupal/ctools": "*"
  }
}

I hit on something important there, though: the packagist creates that definition only for the release the .info.yml file was in. When most code changes come in the form of patches, this poses a challenge. You apply your patch to the .info.yml after you download the release from Drupal.org’s packagist, and Drupal.org doesn’t create a new release entry for every patch file in the issue queue. So you are left with the question, “How do I install a module on Drupal 10 that requires Drupal 9 so that I can patch it to make it compatible with Drupal 10?”

Drupal Lenient

One of the easiest methods for those who don’t understand the ins and outs of Composer is to use the Drupal Lenient plugin. It takes a lot of the manual work out of defining new packages and works with any drupal-* typed package. Types are introduced to us through the Composer Installers plugin and manipulated further with something like Composer Installers Extender. Composer plugins can be quite powerful, but they ultimately add a layer of complexity to any project over using core Composer features.

Drupal Lenient works by taking any defined package pulled in by any means via Composer and replacing the version constraint for drupal/core, at the time of this writing, with "^8 || ^9 || ^10". So where the requirement might look like the earlier example, "drupal/core": "^8.8 || ^9", it is replaced, making the module installable alongside Drupal 10, even though it might not be compatible yet. This allows you to patch, test, or use the module as is, much as if you had downloaded the zip and thrown it into your custom modules directory.

An example may look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/my_module": "1.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "drupal-lenient": {
      "allowed-list": [
        "drupal/my_module"
      ]
    },
    "patches": {
      "drupal/my_module": {
        "3289029: Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2022-06-16/my_module.1.x-dev.rector.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Note the Drupal Lenient allow list. Also note that you will need to install the plugin before trying to require the module that doesn’t support Drupal 10 in this case. If you want an excellent step-by-step, Matt put one together in the README.
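To sketch the ordering with the placeholder names from the example above (the allow-list config command is the one documented in the plugin’s README):

# Composer 2.2+ asks you to explicitly trust plugins
composer config allow-plugins.mglaman/composer-drupal-lenient true
composer require mglaman/composer-drupal-lenient

# Tell Lenient which packages to relax
composer config --merge --json extra.drupal-lenient.allowed-list '["drupal/my_module"]'

# Only now require the not-yet-compatible module
composer require 'drupal/my_module:1.x-dev'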

The pros:

  • Easy-peasy to install
  • Feeds off the original packagist packages, so if there is an upgrade, you don’t have to do anything special to transition

The cons:

  • Lenient is in control and may cause inexplicable errors when updating due to unsupported core versions
  • PHP devs not familiar with Drupal Lenient won’t know to look for it
  • Flaky experiences when switching in and out of branches that include this plugin. If you context switch a lot, be prepared to handle some errors caused by Composer’s difficulty maintaining state between branches.
  • Patches to other dependencies inside composer.json still require you to jump through some hoops

Custom package

If you want more control over what the module can and cannot do, while keeping core Composer functionality without adding yet another plugin, check out this method. What we will do here is find out what version the patch or merge request is being applied against. It should be stated in the issue queue and, by best practice, is a dev version.

If you are a perfectionist, you can use composer install -vvv to find the URL or cache file that the module came from for packages.drupal.org. It is usually one of https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json or https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json. You will note that the Composer cache system follows a very similar structure, swapping out certain characters with dashes.
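If you would rather not dig through -vvv output, you can also pull that metadata directly. Here is a sketch with the placeholder module from earlier; jq is optional and only used for readability:

curl -s https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json \
  | jq '.packages["drupal/my_module"]'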

With this information, you can grab the exact package as it’s defined in the Drupal packagist. Find the version you want, and then get it into your project’s composer.json.

Let’s use Context Active Trail as an example, because at the time of this writing, there is no Drupal 10 release available.


Looking through the issue queue, we see Automated Drupal 10 compatibility fixes, which has a patch on it. I grab the Composer package info and paste the 2.0-dev info into my composer.json under the "repositories" section as a type "package".


Which should make your project look something like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
}

Now let’s change our version criteria:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…

And then add our patch:

…
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
…

Here, you will need to look to see whether the patch modifies composer.json. If it does, you will need to modify your package information accordingly. For example, in this one, the fixer changes drupal/context from ^4.1 to ^5.0.0-rc1. That change looks like this:

…
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
…
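A quick way to check whether a patch touches composer.json at all is to list the files it modifies; the "+++" lines of a unified diff name each changed file:

# List every file the issue patch changes
curl -s https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch | grep '^+++'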

Lastly, you sometimes run into complications with the order in which packages are picked up by Composer. You may need to add an exclude element to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Our final composer.json for our project could look something like this with all the edits:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": [
          "Drupal",
          "Context",
          "Active trail",
          "Breadcrumbs"
        ],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  }"
  extra": {
    "composer-exit-on-patch-failure": true,
    },
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    },
  }
}

The pros:

  • Uses more core Composer functionality
  • A PHP developer will better understand what’s going on here
  • You are in complete control of how this module package and version are defined
  • All the work is in one file

The cons:

  • Requires some understanding of how composer.json, packagists, and the magic of Drupal’s packagist all work
  • That’s a messy composer.json for the project
  • If you have to use exclude, you have to leave it up to outside forces to let you know when that module finally puts out an actual D10-ready version, and then undo all of this work

Fork it

Standard PHP Composer best practice says that if you make modifications to a package, fork it, maintain your modifications, and submit a pull request if it’s functionality you wish to contribute back. You can use this same approach with Drupal modules as well. Some may even say that’s what issue forks are for! That said, issue forks come with the downside that sometimes they go away, or are overridden with changes you don’t want. They are a moving target.

For the sake of this example, let’s assume that we have forked the module on GitHub to https://github.com/fourkitchens/context_active_trail.git. If you don’t know how to make a fork, simply do the following:

  • Clone the module to your local computer using the git instructions for the module in question
  • Check out the branch you want to base your changes on
  • Create a new repository on GitHub
  • Add it as a remote: git remote add github git@github.com:fourkitchens/context_active_trail.git
  • Push it! git push github 8.x-2.x

You can do this with a version of the module that is in a merge request in Drupal.org’s issue queue, too. That way you won’t have to reapply all the changes. However, if your changes are in a patch file, consider adding them to the module at this time using your favorite patching method. Push all your changes to the github remote.
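If plain git is your patching method of choice, applying the compatibility patch from the earlier example to the fork might look like this sketch:

# Download the issue patch, commit it to the fork, and push
curl -O https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch
git apply context_d10comp_3286756.patch
git add -A && git commit -m "Apply Drupal 10 compatibility patch"
git push github 8.x-2.x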

If the patch files don’t have changes to composer.json, or if the module doesn’t have one, you will likely want to provide at least a bare-bones one that contains something like the following and commit it:

{
  "name": "drupal/context_active_trail",
  "type": "drupal-module",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
}

This will tell Composer what it needs to know inside the project about dependencies. This project already had a composer.json, so I needed to add the changes from the patch to it.

Inside our Drupal project we are working on, we need to add a new entry to the repositories section. It will look something like this:

    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },

The VCS type repository entry tells Composer to look at the repository and poll for all its branches and tags. These will be your new version numbers.

Much like in the “Custom Package” example, you may need to add an exclude property to the Drupal packagist entry.

…
  {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
  },
…

Now, since Drupal packagist isn’t here to give Composer some version aliases, we have to use the old notation dev-BRANCHNAME for our version. Our require entry will look something like this:

 "drupal/context_active_trail": "dev-8.x-2.x",

Since we already added our patches as a commit to the module, this is all you need. Your final composer.json for your project would look like this:

{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": [
          "drupal/context_active_trail"
      ]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "dev-8.x-2.x",
  }
}

It makes for a much cleaner project composer.json, but now you’ve split the work into two locations, which requires some synchronization. However, if multiple sites of yours use this same module and need the same fixes, this approach absolutely offers the least resistance and gets those changes out the door most quickly.

The pros:

  • Reusability
  • Two smaller, simpler chunks of work
  • Any PHP developer should be able to debug this setup as it uses Composer best practices. This method will be used in any project with any framework in the PHP ecosystem.

The cons:

  • Changes are in two separate places
  • Which patches are applied isn’t obvious in the composer.json and requires looking through the commit history on the forked repository
  • Requires maintenance and synchronization when upgrades happen

Final thoughts

As with almost everything out there, there are multiple ways to achieve the same goal. I hope this brings awareness and helps provide the flexibility you need when upgrading Drupal to a new major version. Each solution has strengths, and you may need to mix them to get the results you want.


Allan Chappell

Senior Support Lead

May 27, 2022

Have you ever found yourself needing to share custom dependencies across several sites? Maybe even for the same client? There are several ways of handling this workflow, especially if you work in the Drupal ecosystem. The ideas of upstreams, distributions, and multi-sites are all worth considering. However, the fun lies in the challenge of determining how to scale the architecture.

Create a custom packagist

The ingredients for creating a custom packagist, a repository of dependencies used by Composer, are surprisingly easy to come by. Keep in mind that a private packagist can be obtained through a hosted service at packagist.com. In our case, we already had the tooling readily available, so we decided to go the custom packagist route.

The goal of this article is to give you some ideas on how to host a solid packagist for a team, organization, or client while describing how the Four Kitchens team came up with a fun and creative solution to provide this functionality using the tools our client had on hand. I hope to accomplish this by:

  • Sharing our motivation behind choosing this solution
  • Identifying the ingredients needed to cook up the workflow
  • Explaining baseline hosting, while elaborating on what you could do if so inclined
  • Laying out how we set up automation around the workflow to make our lives easier

Let’s begin.

Motivation

On one client project, we found that we had enough private custom dependencies shared through a private distribution that we needed to scale beyond editing the individual repositories listing in each site’s composer.json. If we were using an upstream setup, this could be accomplished with the Composer Merge Plugin. In this case, however, it made sense to create a custom packagist. Keep in mind, if we hadn’t, each of our composer.json files would have had 11 custom packages and 11 VCS entries in its repositories section, and that would need to grow with each additional dependency we added to our distribution. We currently maintain 20 sites on this distribution. Our policy is to have code review for every change to a site, so making changes to 21 repos (the distribution and all the downstream sites) was a development time suck.

If you are here, you probably know the answer to the question, “Why can’t Composer load repositories recursively?” but if you don’t, check out this great explanation. In short, the repositories section of a composer.json cannot be inherited from a dependency’s composer.json. So it’s up to the individual projects to make sure they have the right entries for the custom packages that our distribution requires.

We might have been able to reduce our custom dependencies by relying on another hosted packagist such as asset-packagist.org, or by working to make other dependencies publicly available. However, providing our own packagist maintained specifically for the client’s needs brought us performance gains over the other solutions and allows us to more closely vet our frontend library dependencies. It allows us to make a single repositories entry at the packagist level, and that gets pulled down by all of our sites that point at it. This means less code editing on a per-site basis.

So here we are, using an easily maintained solution, and reaping the benefits of performance, scalability, and increased developer productivity, while keeping our client’s ecosystem private. We didn’t even need that much to get started!

Ingredients

Things you will need to get started:

  • Satis, a rudimentary static packagist generator written in PHP using Composer as a library.
  • A repository to house the custom dependencies you want to put in your packagist. Think: all the items you currently have in your repositories section of your composer.json. This isn’t strictly a “must,” but it makes automation possible.
  • A place to host static HTML and JSON files. Anything web accessible. HTTPS is preferred, but curl can work under other protocols. You can get pretty creative here.
    • Cheap hosting service
    • Spare Droplet, Linode, or AWS EC2 instance
    • S3 bucket
    • GitHub
    • FTP
  • Something to build the packagist on dependency update like:
    • GitHub Actions
    • CircleCI
    • Travis
    • Cron
    • A manual implementation like running a command via SSH

Our implementation looks like this:

  • Repository: GitHub
  • Hosting: S3 bucket
  • Builder: CircleCI

These were all resources we were already using. I’ll go into the specifics on how our build works with some suggestions on alternatives.

It’s pretty simple to set up Satis. There is some decent documentation on Satis at GetComposer.org and on GitHub. What I say here may be a bit of a repeat, but I’m describing an opinionated setup intended to allow for testing and committing changes. This is a necessity when multiple developers are touching the same packagist and you need accountability.

Before I dive into the specifics of our setup, I want to mention that if you feel you don’t need this level of control, testing, and revision history, Satis can be set up as a living stand-alone app. You can host it as either a Docker container or on a hosting platform. In both of these options, developers would live edit and maintain the packagist via the command line by default. You can, however, install a graphical frontend using something like Satisfy.

To set up Satis like Four Kitchens has, follow the steps below. Code samples follow if you need examples of how it might look.

  1. Create a new repository.
  2. Initialize a new Composer project using composer init.
  3. Require Satis: composer require composer/satis.
  4. Add a script to your composer.json to run the full build command.
  5. Add a packages directory to the project with a .gitkeep file: mkdir packages && touch packages/.gitkeep.
  6. Add a .gitignore to ignore the vendor folder and generated package files.
  7. Consider setting up a Lando instance to serve your packagist for testing.
  8. Create satis.json just like you normally would a standard composer.json, with the repositories section containing all the packages, repos, and packagists you want available to the projects consuming it.
  9. Add "require-all": true below the repositories section of satis.json. There’s more about the usage of require-all versus require in the Satis setup documentation. Use what fits your needs, but if you are adding individual packages instead of entire packagists to your satis.json, require-all is likely all you need.

Your repo could look something like this:

composer.json

{rn  "name": "mycompany/packages",rn  "require": {rn    "composer/satis": "^1.0" 
  },rn  "scripts": {rn    "build": "./vendor/bin/satis build satis.json packages" 
  }
}

satis.json

{rn  "name": "mycompany/packages",rn  "homepage": "https://packages.mycompany.com",rn  "repositories": [rn    { "type": "vcs", "url": "https://github.com/mycompany/privaterepo" },rn    { "type": "vcs", "url": "http://svn.example.org/private/repo" },rn    { "type": "package", "package": [rn      { "name": dropzone/dropzone", "version": "5.9.2", "dist": { "url": "https://github.com/dropzone/dropzone/releases/download/v5.9.2/dist.zip", "type": "zip" }}
    ]}
  ],rn  "require-all": truern}

.lando.yml

name: mycompany-packages
recipe: lemp
config:
  webroot: packages
  composer_version: 2
  php: '7.4'

.gitignore

vendor
packages/*
!packages/.gitkeep

From here, run lando start && composer install && composer build. Now, go to another project and add your test packagist to that project. Additionally, add "secure-http": false to the config section, since Lando’s HTTPS certificate is insecure by default. Lastly, require one of the packages you added to satis.json above.

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "http://mycompany-packages.lndo.site"
    }
    ...
  ],
  "require": {
    "dropzone/dropzone": "^5.9.2"
  },
  ...
  "config": {
    ...
    "secure-http": false
  }
  ...
}

At this point you should be greeted with a successfully built project and have a local instance of your packagist going. When you are done testing, stop Lando and switch your repository entry in the other project to your packagist’s public URL. Commit all your changes and push up!

Your next step is getting your packagist off your local and out where everyone can use it.

Now you can simply copy the files in your packages folder and put them somewhere web accessible. I really want to drive this point home: the entirety of your packagist is simply the contents of that folder and nothing else. The things that make this so complicated are the processes around automating and updating this packagist.

You could, for example, take the files you created and host them anywhere someone can curl to. This means HTTP, FTP, and SFTP are available to you, to name a few. If you aren’t worried about privacy, you can even go so far as placing these in the webroot, or even the sites/default/files folder, of your company’s Drupal site. This is a good option if you are strapped for domain names or running a small operation. You would then make sure to copy those files any time someone makes a change to any of the packages that are a part of your packagist.
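As a sketch, assuming a host you control answers at packages.mycompany.com, publishing is nothing more than a rebuild and a copy:

# Rebuild the static packagist, then publish the generated files
composer build
rsync -av --delete packages/ deploy@packages.mycompany.com:/var/www/packages/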

If that’s all you are looking for, you can stop here. You’ve done it! You now have a custom packagist, and the rest of the workflow may not matter to you. However, if you want some more ideas and want to build out a more robust automated development workflow, keep reading. The ideas get interesting from here.

If you wanted to be creative, you could probably remove the line from .gitignore that excludes the packages folder, commit it, set your packagist URL to something like https://raw.githubusercontent.com/mycompany/packages/main/, and set up an Accept and Authorization header in your packagist entry. You can see an example of how to use headers in your packagist entry at GetComposer.org and below with our S3 example.

In fact, the composer.json setup described for the creative GitHub example is really similar to what we did, except we used a workaround recommended by AWS for restricting access to a specific HTTP referer. Our client wanted the extra security so that not just anybody could go and poke around at the packages and versions we had available.

In our example, we created a normal bucket, and assigned a CNAME to it with a nice domain name. The CNAME is optional but makes it more “official” and allows us to move the packagist later without disrupting the developer workflow too much. We then added a policy to only accept connections from calls with a referer that is our secret key. A referer doesn’t have to be a website. In our case it’s a lengthy hash that would be difficult to guess. This too is optional, but if you are looking for that extra level of security, it’s a good option to consider. Note that you should not add spaces between the colon and the token when using this policy. Our repositories entry in our projects looks like:

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.mycompany.com",
      "options": {
        "http": {
          "header": [
            "Referer:"
          ]
        }
      }
    }
    ...
  ],
  ...
}
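The bucket policy itself follows the standard AWS referer-restriction pattern. Here is a sketch; the bucket name and token are placeholders:

# Allow GetObject only for requests carrying the secret Referer token
aws s3api put-bucket-policy --bucket packages.mycompany.com --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowGetWithSecretReferer",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::packages.mycompany.com/*",
    "Condition": {"StringEquals": {"aws:Referer": "LONG-SECRET-HASH"}}
  }]
}'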

And that’s it. We copy the files up to the bucket using the AWS CLI, and it’s published.

Now we need to automate the workflow and get what’s in our hosting location to update automatically.

Building and automation

I’ve pointed out that, if you are willing, you can put Satis somewhere, generate the packagist files, upload them somewhere web accessible, and be ready to roll. This isn’t so different from static site generators like Jekyll or Hugo. However, we add in CI for automation, and revision control for accountability, so that we can take the “error” out of human command crunching. It’s worth mentioning again that this is super important when you have entire teams modifying this packagist.

In our example, I’m using CircleCI. You can do the same with GitHub Actions, Jenkins, or even run on a cron job, provided you are okay with a time-based cadence. You might even do more than one of these. Our CircleCI job looks like this:

.circleci/config.yml

version: 2.1
orbs:
  # Pin these orb versions to the releases you actually use
  php: circleci/php@1
  aws-cli: circleci/aws-cli@2
parameters:
  run_dependency_update:
    default: true
    type: boolean
jobs:
  create_packagist:
    executor:
      name: php/default
      tag: '7.4.24'
    steps:
      - checkout
      - aws-cli/setup
      - php/install-composer
      - php/install-packages
      - run:
          name: Set Github authentication
          command: composer config --global github-oauth.github.com "$GITHUB_TOKEN"
      - run:
          name: Link auth for satis
          command: mkdir ~/.composer; ln -s ~/.config/composer/auth.json ~/.composer/auth.json
      - run:
          name: Build packagist json files
          command: composer build
      - store_artifacts:
          path: packages
      - run:
          name: Copy packagist to aws
          command: aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/
workflows:
  version: 2
  packagist:
    when: << pipeline.parameters.run_dependency_update >>
    jobs:
      - create_packagist:
          filters:
            branches:
              only:
                - main
                - master

There’s a lot to unpack here. I’m using pipeline parameters because a requirement for me is to be able to call this job when another project updates. This functionality allows me to trigger this CircleCI job with an API call. I also use CircleCI orbs to make grabbing the AWS CLI and a PHP environment easy.

The meat of the job is the same as what you were doing during testing: running the build command we put in our composer.json. Since some of our repositories are private, we also have to make sure Composer has access by creating a GitHub token. Then we copy everything to the bucket using the AWS CLI. In our case, we have some behind-the-scenes environment variables defining our keys: AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION, and AWS_SECRET_ACCESS_KEY.

From another project’s perspective, I’m still using CircleCI to run the API call. You can do this really easily in other CI environments, too.

version: 2.1
jobs:
  update-packagist:
    docker:
      - image: cimg/base:2021.12
    steps:
      - run: >
          curl --request POST
          --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline
          --header "Circle-Token: $CIRCLE_TOKEN"
          --header "content-type: application/json"
          --data '{"parameters":{"run_dependency_update":true}}'
workflows:
  build:
    jobs:
      - update-packagist

That’s it. I add this job to every project that’s a VCS entry in our satis.json (provided I have access) and let it go to town. If you find yourself with other dependencies out of your control, consider adding a cron job somewhere or a scheduled pipeline trigger. You are done!
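If cron is the route you take for those, the entry can be as simple as this sketch (path and bucket are placeholders):

# Rebuild and publish the packagist nightly at 02:00
0 2 * * * cd /opt/mycompany/packages && composer build && aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/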

Final thoughts

This workflow can be as easy or as difficult as you want to make it given a few factors like:

  • How often it will change
  • How many people touch it
  • How up to date it needs to be

There are a lot of ideas here, representing several different application architectures for organizations that have multiple projects or sites. If you don’t want to bother with the home-brewed solution, dish out the cash and get a private packagist. The cost may be worth it.

However, if you are already using all the necessary services and have a team of knowledgeable individuals like ours, consider maintaining your own packagist that you can host anywhere. You may find it a productive, performant, and most of all joyful and exciting experience that will bring value to your upstream, distribution, or multi-site setup.


Allan Chappell

Senior Support Lead

January 23, 2020

In the Drupal support world, working on Drupal 7 sites is a necessity. But switching between Drupal 7 and Drupal 8 development can be jarring, if only for the coding style.

Fortunately, I’ve got a solution that makes working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development:

  • Apply Xautoload to keep your PHP skills fresh, modern, and compatible with all frameworks, and to make your code more reusable and maintainable between projects.
  • Use the Drupal Libraries API to use third-party libraries.
  • Use the Composer template to push the boundaries of your programming design patterns.

Applying Xautoload

Xautoload is simply a module that enables PSR-0/4 autoloading. Using Xautoload is as simple as downloading and enabling it. You can then start using use and namespace statements to write object-oriented programming (OOP) code.
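With Drush, for instance, that is just two commands (-y skips the confirmation prompts):

drush dl xautoload -y
drush en xautoload -y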

For example:

xautoload.info

name = Xautoload Example
description = Example of using Xautoload to build a page
core = 7.x
package = Midcamp Fun

dependencies[] = xautoload:xautoload

xautoload_example.module

<?php

use Drupal\xautoload_example\SimpleObject;

/**
 * Implements hook_menu().
 */
function xautoload_example_menu() {
  $items['xautoload_example'] = array(
    'page callback' => 'xautoload_example_page_render',
    'access callback' => TRUE,
  );
  return $items;
}

function xautoload_example_page_render() {
  $obj = new SimpleObject();
  return $obj->render();
}

src/SimpleObject.php

 "

Hello World

", ); } }

Enabling and running this code causes the URL /xautoload_example to spit out “Hello World”.

You’re now ready to add in your own OOP!

Using third-party libraries

Natively, Drupal 7 has a hard time autoloading third-party library files. But there are contributed modules (like Guzzle) out there that wrap third-party libraries. These modules wrap object-oriented libraries to provide a functional interface. Now that you have Xautoload in your repertoire, you can use its functionality to autoload libraries as well.

I’m going to show you how to use the Drupal Libraries API module with Xautoload to load a third-party library. You can find examples of all the different ways you can add a library in xautoload.api.php. I’ll demonstrate an easy example by using the php-loremipsum library:

1. Download your library and store it in sites/all/libraries. I named the folder php-loremipsum.
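Assuming the library lives on GitHub under the same name it uses on Packagist (verify the URL on the project page), that download can be as simple as:

# Fetch the library into sites/all/libraries
mkdir -p sites/all/libraries
git clone https://github.com/joshtronic/php-loremipsum.git sites/all/libraries/php-loremipsum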

2. Add a function implementing hook_libraries_info to your module by pulling in the namespace from Composer. This way, you don’t need to set up all the namespace rules that the library might contain.

function xautoload_example_libraries_info() {
  return array(
    'php-loremipsum' => array(
      'name' => 'PHP Lorem Ipsum',
      'xautoload' => function ($adapter) {
        $adapter->composerJson('composer.json');
      }
    )
  );
}

3. Change the page render function to use the php-loremipsum library to build content.

use joshtronic\LoremIpsum;
function xautoload_example_page_render() {
  $library = libraries_load('php-loremipsum');
  if ($library['loaded'] === FALSE) {
    throw new \Exception("php-loremipsum didn't load!");
  }
  $lipsum = new LoremIpsum();
  return array(
    '#markup' => $lipsum->paragraph('p'),
  );
}

Note that I needed to tell the Libraries API to load the library, but I then have access to all the namespaces within it. Keep in mind that the dependencies of some libraries are immense. You’ll very likely need to run Composer from within the library and commit the result when you first start out. In such cases, you might need to make sure to include the Composer autoload.php file.

Another tip: Abstract your libraries_load() functionality out in such a way that if the class you want already exists, you don’t call libraries_load() again. Doing so removes the Libraries API as a hard dependency of your module and enables you to use Composer to load the library later with no more work on your part. For example:

function xautoload_example_load_library() {
  if (!class_exists('\joshtronic\LoremIpsum', TRUE)) {
    if (!module_exists('libraries')) {
      throw new \Exception('Include php-loremipsum via composer or enable libraries.');
    }
    $library = libraries_load('php-loremipsum');
    if ($library['loaded'] === FALSE) {
      throw new \Exception("php-loremipsum didn't load!");
    }
  }
}

And with that, you’ve conquered the challenge of using third-party libraries!

Setting up a new site with Composer

Speaking of Composer, you can use it to simplify the setup of a new Drupal 7 site. Just follow the instructions in the README for the Composer Template for Drupal Project. From the command line, run the following:

composer create-project drupal-composer/drupal-project:7.x-dev --no-interaction

This code gives you a basic site with a source repository (a repo that doesn’t commit contributed modules and libraries) to push up to your Git provider. (Note that migrating an existing site to Composer involves a few additional considerations and steps, so I won’t get into that now.)

If you’re generating a Pantheon site, check out the Pantheon-specific Drupal 7 Composer project. But wait: The instructions there advise you to use Terminus to create your site, and that approach attempts to do everything for you, including setting up the actual site. Instead, you can simply use composer create-project to test your site in something like Lando. Make sure to run composer install if you copy down a repo.

From there, you need to enable the Composer Autoload module, which is automatically required in the composer.json you pulled in earlier. Then, add all your modules to the require portion of the file or use composer require drupal/module_name just as you would in Drupal 8.

You now have full access to all the Packagist libraries and can use them in your modules. To use the previous example, you could remove php-loremipsum from sites/all/libraries and instead run composer require joshtronic/php-loremipsum. The code would then run the same as before.

From here on out, it’s up to your imagination. Code and implement with ease, using OOP design patterns and reusable code. You just might find that this new world of possibilities for integrating new technologies with your existing Drupal 7 sites increases your productivity as well.


March 6, 2017

In the modern world of web and application development, using package managers to pull in dependencies has become a de facto standard. In fact, if you are developing enterprise software and you aren’t leveraging package managers, I would challenge you to ask yourself why not.

Drupal was very early to adopt this mindset of pulling in dependencies almost a decade ago when Dmitri Gaskin created an extension for Drush (the Drupal Shell) that added the ability to pull contributed modules by listing them in a make file (I think Dmitri was 12 years old when he wrote the Drush extension, pretty amazing!). Since that time, the make extension has been added to Drush core.

Composer is the current standard for putting together PHP applications, which is why Drupal 8 has gone this direction, so why not use Composer to put together Drupal 7 applications?

First off, I want to clarify what I’m not talking about in this post. I am not advocating that we ditch Drush altogether; I still find value in other aspects of what Drush can do. I am specifically referring to the Make aspect of Drush. Is Drush Make still necessary?

This post is also not about Drupal Console vs Drush, both CLI tools add tremendous value to development workflow, and there isn't 100% overlap with these tools [yet]. I think we still need both tools.

This post is about how I came to see the benefit of switching from Drush Make to Composer. I recommend making this move for both Drupal 7 and Drupal 8. This Drupal Composer workflow is not new; it has been around for a while. I just never saw a good reason to make the jump from Drush Make to this new process, until now. We have been asked in the comments on previous posts, “Why haven’t you adopted the Composer process?” I now have a good reason to change our process and fully jump on board with Composer for building Drupal 7 applications. We appreciate all the comments we get on our blog; it sharpens everyone involved!

We have blogged about the Composer workflow in a previous post on our Drupal 8 build process, but the main motivation there was to be proactive about where PHP application development is going (or already is). We didn’t have a real use case for the switch to Composer until now. This post will review how I came to that revelation.

Dependency Managers

I want to make one more point before I make the case for Composer. There are many reasons to use package managers to pull in dependencies; I’ll save the details for another blog post. The main reason developers use package managers is so that your project repository does not include libraries and modules that you do not maintain. That is why tools like Composer, npm, Yarn, Bower, and Bundler exist. Hook up your RSS reader to our blog and I’ll explain in more detail in a future post, but for now I’ll leave this link to the Composer site explaining why committing dependencies to your project repo is a bad idea.

Version Numbers

The #1 reason to make the switch to Composer is the ability to manage version numbers. You may be asking, “What’s the big deal? Drush Make handles version numbers as well.” Let me give you a little context on why Composer version numbers are a better approach.

The Back Story

Recently, in a strategy meeting with one of our enterprise clients, we were discussing how to approach launching hundreds of sites on one Drupal core utilizing multiple installation profiles on top of Acquia Site Factory. Our goal was to figure out how we could sanely manage updating potentially dozens of installation profiles without explicitly defining each version number of the profile being updated. This type of Drupal architecture is also a topic for a future blog post, but for now, read Acquia’s explanation of why architecting with profiles is a good idea.

As a developer, it is commonplace to lock down dependencies to a very specific version so that we know exactly what versions we are using and deploying. This is the reason composer.lock, Gemfile.lock, yarn.lock, and npm shrinkwrap exist. We have experienced the pain of unexpected defects in applications due to an obscure dependency changing deep in the dependency tree. Most dependency managers have a very explicit command for updating dependencies, i.e., composer update, bundle update, and yarn upgrade respectively, which in turn update the lock file.

A release manager does not need to know explicitly which version of a dependency (installation profile, module, etc.) to release next; she simply wants the latest stable release.

Herein lies the problem with Drush Make. Practices exist that solve both the developer problem and the release manager problem; they do not exist in Drush Make, but they do in Composer and other application development ecosystems. It’s a common pattern that has been around for a while: semantic versioning.

Semantic Versioning

If you haven't heard of semantic versioning (semver), go check it out now. Pretty much every package manager I have dealt with has adopted semver. Adopting semver gives the developer, or release manager, the choice of how to update dependencies within their app. There are very distinct numbers in semver for introducing breaking changes, new features, and bug fixes. How does this play into what problem use cases I mentioned above?

A developer has the ability to specify in the composer.json file specific versions, while leaving the version number flexible to pull in new bug fixes and feature improvements (patch and minor releases). Look at the example below:

{
  "name": "My Drupal Platform",
  ...
  "require": {
	...
    "drupal/drupal": "~7.53.0",
    "drupal/views": "^3.14.0"
  },
  ...
}

The tilde ~ and caret ^ symbols have special meanings when specifying version numbers. The tilde keeps you within the given minor version: it only allows the patch release number (the last number) to update. The caret keeps you within the given major version: it also allows the minor release number (the middle number) to update.

The above example basically says, use the views module at version 3.14, and when version 3.15 comes out, update me to that version when I run composer update.
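A couple of commands make the distinction concrete. This is a sketch; the ranges in the comments follow standard Composer semver rules:

composer show drupal/views --all   # list the versions Composer can see
# "~7.53.0" allows >=7.53.0 <7.54.0 (patch updates only)
# "^3.14.0" allows >=3.14.0 <4.0.0 (minor and patch updates)
composer update                    # update everything within constraints
composer update drupal/views       # update a single dependency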

Breaking changes should only be introduced when you update the first number, the major release. Of course, if you completely trusted every developer writing the contributed code, this system would be enough. But not all developers follow best practice, which is why the lock file exists and why you must explicitly run composer update.

With this system in place, a release manager now only needs to worry about running one command to get the latest stable release of all dependencies. This command could also be hidden behind a nice UI (a CI Server) so all she has to do is push one button to grab all the latest dependencies and push to a testing site for verification.

Understanding everyone’s needs

In the past, I didn't have a good reason to move away from Drush Make, because it did the job, and Drush is so much more than Drush Make. The strategy session we had was eye opening. Understanding the needs from an operations perspective, while not jeopardizing the integrity of the application led us down a path to see a problem that the wider development community at large has already solved (not just the PHP community). It's very rewarding to solve problems like this, especially when you come to the conclusion that someone has already solved the problem! "We just had to find the path to the water! (--A.W.)"

What do you think about using Drush Make vs. Composer for pulling together a Drupal application? Leave us your thoughts in the comments.


Tom Friedhof

Senior Software Engineer

Tom has been designing and developing for the web since 2002 and got involved with Drupal in 2006. Previously he worked as a systems administrator for a large mortgage bank, managing servers and workstations, which is where he discovered his passion for automation and scripting. In his free time he enjoys camping with his wife and three kids.
