Aug 11 2021

Drupal Diversity & Inclusion, an independent working group within the Drupal community, is holding the first ever Drupal Diversity & Inclusion (DDI) Camp! The aim of the working group is to bring new voices and important topics together into a relaxed and fun virtual conference. You are invited to join Addi, Ashley, and myself online this weekend, August 13 & 14, for two days of fantastic content from community speakers.

There is something amazing about attending a conference and feeling like you've found your people. In the case of the DDI Camp happening this weekend, I think it will be amazing to discover something--rather, someone--new in a community I thought I knew pretty well after over a decade of involvement.

Drupal folks have been putting on events for ages now and you might think that in such a "mature" community, there really isn't anything new to learn or anyone new to hear from. Or you might feel that while you're steadily learning more about Drupal, you've yet to find a career path or a niche in the community where you can get involved in a meaningful way.

I think that the DDI Camp will be a place where you can hear about important topics from a variety of perspectives and in doing so refresh your understanding of a subject important to you and your work. I think it will also be an opportunity to learn about promising career pathways that reinvigorate your aspirations for the future.

Browse the 2-day program here, which includes topics such as:

  • Career pathways in Drupal
  • People management
  • Supporting various communities and causes
  • Personal growth and tools for success
  • Accessibility
  • Finding a niche in the community
  • Lightning talks

My impression of the program is there is something for a range of folks, whether you are coding all day or managing teams (or both!). I'm excited to learn something new and also connect with amazing folks in the Drupal community.

Register now and plan to attend a relaxed and fun 2-day virtual conference.

Tickets are affordably priced and still available. Three of us here at Osio Labs will be attending (our CEO, Addison Berry; People and Technical Support lead, Ashley Jones; and myself, Trainer and Production manager for Drupalize.Me). Feel free to say "hi" in the event chat room!

Get your ticket and we'll see you (virtually!) this weekend at the 2021 DDI Camp!

Mar 18 2021

DrupalCon North America is happening in just over 3 weeks, as part of the larger DrupalFest, which is a series of online events happening throughout the month of April. Last year's DrupalCon Global was a great event and showed what a successful online conference can look like. (You can read our recaps of DrupalCon Global.) We fully expect this year to be great as well, and possibly even better. The whole Drupalize.Me team will be in attendance.

In addition to soaking up the community vibes, I'll also be on a panel, "Leadership at the intersection of business and open source", along with 3 other open source business leaders: Lisa Tagliaferri, Stefanie Langner, and Tracy Evans. While we all have different backgrounds in both business and open source, we're gathering to discuss how we can take the best of both worlds and run great organizations that make the world better. We're looking forward to an interactive discussion where we can answer questions from the audience and learn from each other along the way. The panel will be held on Wednesday, April 14th, 15:35 to 16:25 UTC (find your local time) in Room 3.

You'll also find our team involved in the contribution events. There will be ongoing contribution sprints throughout the conference and this year there will also be specific days set aside for different initiatives. You don't need to be a DrupalCon attendee to take part in the contribution events, and you can go to the DrupalCon Contribution page each day of the conference to get access to the community rooms.

We're excited for DrupalCon and that we'll all be able to attend. We hope to see you there!

Feb 09 2021

One of the top reasons people cite when canceling a membership with Drupalize.Me is, "No time to learn." We hear you. It's hard to set aside time for self-paced professional development for any number of reasons -- lack of support from your employer, other priorities that make it hard to justify "tutorial time", or feeling like you're wasting time looking for the right starting point that meets your needs.

Registering for a scheduled professional development event like a workshop or conference helps some folks overcome these barriers. How? A workshop or conference can fit into your professional development plan at work and is easier to set aside time for than self-paced learning. It's easier to communicate to others "I'm going to be attending this workshop and unavailable for other things during these days/times" than it is to set boundaries around self-paced learning -- even though we wish it were just as easy!

Online workshops with 2 scheduling options

We're offering 2 online Drupal theming workshops in February and March for folks ready to get started with theming in Drupal. Whether you can set aside part of 3 days in 1 week, or part of 1 day a week for 3 weeks, we hope one of these workshop schedules can help you set aside the time you need to get started on a path to mastering themes in Drupal.

February's Hands-On Theming in Drupal 8 and 9 workshop (February 22-24, 2021):

Register for February workshop

March's Hands-On Theming in Drupal 8 and 9 workshop (March 3, 10, 17, 2021):

  • Once a week for 3 Wednesdays, 3 hours a day
  • Starts at 5PM UTC (see my timezone).

Register for March workshop

Learn Drupal's theming system in a hands-on workshop

Are you ready to set yourself on a path to learn Drupal's theming system? Maybe you've just inherited a Drupal site and need to update the look-and-feel and you want to know the correct method. Or maybe you're a long-time Drupaler with more experience pre-Drupal-8 and need to get up-to-speed on Drupal 8 or 9 theming practices.

Class size will be limited to 12 people so you can ask questions, get live one-on-one help, and focus on your specific use cases. You can read a detailed syllabus for the workshop here. You'll also get:

  • Example code with extensive documentation suitable for use as reference material for future projects
  • A PDF workbook with exercises and solutions
  • Access to pre-recorded explanations of the solutions for all exercises
  • One month of free access to the entire Drupalize.Me tutorial catalog

Register for February workshop

Register for March workshop

Does this sound like something you want to do but the schedule just doesn't work for you? We're interested to hear your feedback on what scheduling options would work better for you. Let us know!

If you have any questions about this workshop, please contact us. We hope these workshops can help you set aside the time you need to take your Drupal career to the next level.

Jan 27 2021

Have you heard about the Drupal decoupled menus initiative? If not, I'll explain more in a moment. But first, if you've got any experience creating JavaScript front-ends for a decoupled CMS (Drupal or other) the initiative is looking for input through this survey: https://www.surveymonkey.com/r/N2JZFLD

Take the survey

It only took me about 10 minutes to fill out, and it's an easy contribution to the Drupal community with a big impact. Fill it out, then come back, and read the rest of this post. (I'll wait.)

What is the decoupled menus initiative?

The decoupled menus initiative (DMI) was introduced in Dries' keynote from DrupalCon Europe 2020, and this video by Gabe Sullice (embedded below) does a great job of explaining what it's all about.

[embedded content]

Video credit: Gabe Sullice

The goal of the decoupled menus initiative is to:

"Provide the best way for JavaScript front-ends to consume configurable menus managed in Drupal"

This includes creating official, community-supported components (e.g. React and Vue) that you can use in your own project or as a reference implementation -- along with everything required to support them, including docs, packaging, security, etc. -- while keeping the scope small and attainable: ship a single menu component rather than a complete overhaul of Drupal's admin UI.

Credit: Dries Buytaert, DrupalCon Europe 2020

While on the surface this might sound like we're building a React component that displays links, I think it's the work that needs to happen to ensure that component can be effectively managed and maintained by the Drupal community that is the real value of this initiative. Some of the problems that need to be solved include:

  • Updating the Drupal.org infrastructure to handle any requirements for bundling, testing, and shipping JavaScript packages via GitLab etc.
  • Defining policies and practices for handling security issues with JavaScript packages
  • Defining tooling, and processes, for creating best-in-class documentation for how to consume menu data from Drupal
  • Developing an ideal data structure for consuming menu data, and then updating Drupal core to facilitate providing that data
  • Allowing content creators to configure, and turn on/off, menus served via JSON:API through an intuitive UI
  • And of course writing the code for the different reference implementations in React, Vue, etc.
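To make the data-structure question concrete, here's a small sketch of the kind of work a reference component has to do: turning a flat list of menu links, roughly the shape a Drupal JSON:API-style endpoint might return, into the nested tree a React or Vue component would render. The field names here are hypothetical illustrations, not the initiative's actual format.

```javascript
// Hypothetical sketch: build a nested menu tree from a flat list of links.
// The field names (id, parent, title, url, weight) are illustrative only,
// not the decoupled menus initiative's final data structure.
function buildMenuTree(links, parentId = null) {
  return links
    .filter((link) => link.parent === parentId)
    .sort((a, b) => a.weight - b.weight)
    .map((link) => ({
      title: link.title,
      url: link.url,
      // Recurse to collect this link's children.
      children: buildMenuTree(links, link.id),
    }));
}

const flatLinks = [
  { id: 'home', parent: null, title: 'Home', url: '/', weight: 0 },
  { id: 'about', parent: null, title: 'About', url: '/about', weight: 1 },
  { id: 'team', parent: 'about', title: 'Team', url: '/about/team', weight: 0 },
];

const tree = buildMenuTree(flatLinks);
console.log(JSON.stringify(tree, null, 2));
```

Agreeing on one canonical shape for this data -- so every front-end library doesn't have to reinvent this transformation -- is exactly the kind of groundwork the initiative is after.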


Looking at that list, most of those problems, once solved, will reduce the barriers to creating more awesome JavaScript integrations with Drupal's web services API. This, in itself, is a huge win, and will hopefully result in a bunch of additional initiatives tackling things like authentication, content editor-generated layouts, image styles, routing, and other traditionally hard problems of decoupled architectures.

Think of the decoupled menus initiative as laying the groundwork for future innovations.

This is important because, as Dries pointed out in the keynote introducing the initiative, in order for Drupal to continue to grow, and to remain relevant for the next 20 years, it has to be better positioned to compete with the current class of decoupled content management systems. Drupal is already the best option from the perspective of content architecture, editorial workflows, and a deep feature set. But it lacks a developer experience that is attractive to JavaScript developers, and it gets overlooked as a result. Since these devs are often influential in the decision regarding which CMS to use, it's important that they view Drupal as an awesome choice.

The experience of integrating with Drupal has to be as good as, or better than, that of the competitors. This means meeting JavaScript developers where they are, and not making them jump through hoops to integrate with Drupal. Because more often than not, we developers will prefer the path of least resistance. And speaking from my own experience, npm install --save @contentful/app-sdk is a lot less friction than writing my own JavaScript library to integrate with Drupal's back-end. While there have been numerous attempts to create reusable libraries, they tend to lack the visibility required to make them truly useful.

Assuming this initiative is successful, I would love to see something similar for dealing with authentication: a set of community supported components that deal with the complex OAuth workflow, specifically designed to integrate with Drupal and the Simple OAuth module. This would get us closer to the experience of using solutions like Auth0.

Want to know more? Or get involved?

Did I mention there's a survey?

Take the survey

Oct 09 2020

For the Drupalize.Me site we have a functional/integration test suite that's built on the Nightwatch.js test runner. We use this to automate testing of critical paths like "Can a user purchase a membership?" as well as various edge-case features that we have a tendency to forget exist -- like "What happens when the owner of a group account adds a user to their membership and that user already has a Drupalize.Me account with an active subscription?"

For the last few months, running these tests has been a manual process that either Blake or I would do on our localhost before doing a release. We used to run them automatically using a combination of Jenkins and Pantheon's MultiDev, but when we switched to using Tugboat instead of MultiDev to build previews for pull requests, that integration fell by the wayside and eventually we just turned it off because it was failing more often than it was working.

Aside: The Drupalize.Me site has existed since 2010, and has gone through numerous rounds of accumulating and then paying off technical debt. We once used SVN for version control. Our test suite has gone from non-existent, to Behat, then Casper, then back to Behat, and then to Nightwatch. Our continuous integration (CI) relies primarily on duct tape and bubble gum. It's both the curse, and the joy, of working on a code base for such a long time.

I recently decided it was time to get these tests running automatically again. Could I do it using GitHub Actions? I have a bunch of experience with other CI tools, but this was my first time really diving into GitHub Actions and Tugboat in their current form. Here's what I ended up with.

  • We use Tugboat.qa to build preview environments for every pull-request. These are a clone of the site with changes from the pull-request applied. This gives us a URL that we can use to run our tests against.
  • We use GitHub Actions to spin up a robot that'll execute our tests suite against the URL provided by Tugboat and report back to the pull request.

Setting up Tugboat to build a preview for every pull request

We use a fairly cookie-cutter Tugboat configuration for building preview environments that Blake set up and I mostly just looked at and thought to myself, "Hey, this actually looks pretty straightforward!" The setup:

  • Has an Apache/PHP service with Terminus and Drush installed, and a MySQL service
  • Pulls a copy of the database from Pantheon as needed
  • Reverts features, updates the database, and clears the cache each time a pull request is updated
  • Most importantly, it has a web-accessible URL for each pull request

Here's what our .tugboat/config.yml looks like with a few unrelated things removed to keep it shorter:


services:
  php:
    # Use PHP 7.2 with Apache
    image: tugboatqa/php:7.2-apache
    default: true

    # Wait until the mysql service is done building
    depends: mysql

    commands:
      # Commands that set up the basic preview infrastructure
      init:
        # Install prerequisite packages
        - apt-get update
        - apt-get install -y default-mysql-client

        # Install opcache and enable mod-rewrite
        - docker-php-ext-install opcache
        - a2enmod headers rewrite

        # Install drush 8.*
        - composer --no-ansi global require drush/drush:8.*
        - ln -sf ~/.composer/vendor/bin/drush /usr/local/bin/drush

        # Install the latest version of terminus
        - wget -O /tmp/installer.phar https://raw.githubusercontent.com/pantheon-systems/terminus-installer/master/builds/installer.phar
        - php /tmp/installer.phar install

        # Link the document root to the expected path.
        - ln -snf "${TUGBOAT_ROOT}/web" "${DOCROOT}"

        # Authenticate to terminus. Note this command uses a Tugboat environment
        # variable named PANTHEON_MACHINE_TOKEN
        - terminus auth:login --machine-token=${PANTHEON_MACHINE_TOKEN}

      # Commands that import files, databases, or other assets. When an
      # existing preview is refreshed, the build workflow starts here,
      # skipping the init step, because the results of that step will
      # already be present.
      update:
        # Use the tugboat-specific Drupal settings
        - cp "${TUGBOAT_ROOT}/.tugboat/settings.local.php" "${DOCROOT}/sites/default/"
        - cp "${TUGBOAT_ROOT}/docroot/sites/default/default.settings_overrides.inc" "${DOCROOT}/sites/default/settings_overrides.inc"

        # Generate a unique hash_salt to secure the site
        - echo "\$settings['hash_salt'] = '$(openssl rand -hex 32)';" >> "${DOCROOT}/sites/default/settings.local.php"

        # Import and sanitize a database backup from Pantheon
        - terminus backup:get ${PANTHEON_SOURCE_SITE}.${PANTHEON_SOURCE_ENVIRONMENT} --to=/tmp/database.sql.gz --element=db
        - drush -r "${DOCROOT}" sql-drop -y
        - zcat /tmp/database.sql.gz | drush -r "${DOCROOT}" sql-cli
        - rm /tmp/database.sql.gz

        # Configure stage_file_proxy module.
        - drush -r "${DOCROOT}" updb -y
        - drush -r "${DOCROOT}" fra --force -y
        - drush -r "${DOCROOT}" cc all
        - drush -r "${DOCROOT}" pm-download stage_file_proxy
        - drush -r "${DOCROOT}" pm-enable --yes stage_file_proxy
        - drush -r "${DOCROOT}" variable-set stage_file_proxy_origin "https://drupalize.me"

      # Commands that build the site. This is where you would add things
      # like feature reverts or any other drush commands required to
      # set up or configure the site. When a preview is built from a
      # base preview, the build workflow starts here, skipping the init
      # and update steps, because the results of those are inherited
      # from the base preview.
      build:
        - drush -r "${DOCROOT}" cc all
        - drush -r "${DOCROOT}" updb -y
        - drush -r "${DOCROOT}" fra --force -y
        - drush -r "${DOCROOT}" scr private/scripts/quicksilver/recurly_dummy_accounts.php

        # Clean up temp files used during the build
        - rm -rf /tmp/* /var/tmp/*

  # What to call the service hosting MySQL. This name also acts as the
  # hostname to access the service by from the php service.
  mysql:
    image: tugboatqa/mysql:5

In order to get Tugboat to ping GitHub whenever a preview becomes ready for use, make sure you enable the Set Pull Request Deployment Status feature in Tugboat's Repository Settings.

Screenshot of Tugboat UI with the checkbox for GitHub deployment status notifications checked.

Run tests with GitHub Actions

Over in GitHub Actions, we want to run our tests and add a status message to the relevant commit. To do this, we need to know when the Tugboat preview is done building and ready for testing, then spin up a Node.js image, install our Nightwatch.js dependencies, and run the test suite.

We use the following .github/workflows/nightwatch.yml configuration to do that:

name: Nightwatch tests
on: deployment_status

jobs:
  nightwatch:
    # Only run after a successful Tugboat deployment.
    if: github.event.deployment_status.state == 'success'
    name: Run Nightwatch tests against Tugboat
    runs-on: ubuntu-latest
    steps:
      # Set an initial commit status message to indicate that the tests are
      # running.
      - name: set pending status
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          debug: true
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: 'pending',
              context: 'Nightwatch.js tests',
              description: 'Running tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });

      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: '12'

      # This is required because the environment_url param that Tugboat uses
      # to tell us where the preview is located isn't supported unless you
      # specify the custom Accept header when getting the deployment_status,
      # and GitHub Actions doesn't do that by default. So instead we have to
      # load the status object manually and get the data we need.
      # https://developer.github.com/changes/2016-04-06-deployment-and-deployment-status-enhancements/
      - name: get deployment status
        id: get-status-env
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          result-encoding: string
          script: |
            const result = await github.repos.getDeploymentStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              deployment_id: context.payload.deployment.id,
              status_id: context.payload.deployment_status.id,
              headers: {
                'Accept': 'application/vnd.github.ant-man-preview+json'
              }
            });
            return result.data.environment_url;

      - name: echo tugboat preview url
        run: |
          echo ${{ steps.get-status-env.outputs.result }}
          # The first time you hit a Tugboat URL it can take a while to load, so
          # we visit it once here to prime it. Otherwise the very first test
          # will often timeout.
          curl ${{ steps.get-status-env.outputs.result }}

      - name: run npm install
        working-directory: tests/nightwatch
        run: npm ci

      - name: run nightwatch tests
        # Even if the tests fail, we want the job to keep running so we can set
        # the commit status and save any artifacts.
        continue-on-error: true
        working-directory: tests/nightwatch
        env:
          TUGBOAT_DEPLOY_ENVIRONMENT_URL: ${{ steps.get-status-env.outputs.result }}
        run: npm run test

      # Update the commit status with a fail or success.
      - name: tests pass - set status
        if: ${{ success() }}
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: "success",
              context: 'Nightwatch.js tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });

      - name: job failed - set status
        if: ${{ failure() || cancelled() }}
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: "error",
              context: 'Nightwatch.js tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });

      # If the tests fail we take a screenshot of the failed step, and then
      # those get uploaded as artifacts with the result of this workflow.
      - name: archive testing artifacts
        uses: actions/upload-artifact@v2
        with:
          name: screenshots
          path: tests/nightwatch/screenshots
          if-no-files-found: ignore

The one maybe-abnormal thing I had to do to get this working was use the actions/github-script action to manually query the GitHub API for information about the Tugboat deployment. There's a good chance there is a better way to do this -- so if you know what it is, please let me know.

The reason is that Tugboat sets the public URL of a preview in the deployment.environment_url property. But this property is currently hidden behind a feature flag in the API. It isn't present in the deployment object that your GitHub workflow receives. So in order to get the URL that we want to run tests against, I make a query to the GitHub API with the Accept: application/vnd.github.ant-man-preview+json header. There are other actions with slightly cleaner syntax that you can use to update the status of a commit, but this workflow already uses actions/github-script, so for consistency I used it to set the commit status as well.

This Debugging with tmate action was super helpful when debugging the GitHub Workflow. It allows you to open a terminal connection to the instance where your workflow is executing and poke around.
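If you want to try it, the gist is a single extra step dropped into the workflow, which pauses the run and prints SSH connection details:

```yaml
# Temporary debugging step: pauses the workflow and prints SSH connection
# details so you can poke around the runner. Remove it before merging.
- name: Debug with tmate
  uses: mxschmitt/action-tmate@v3
```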

Our nightwatch.config.js looks like the following. Note the use of the Tugboat URL we retrieved and set as an environment variable in the workflow above, process.env.TUGBOAT_DEPLOY_ENVIRONMENT_URL. Also note the configuration that enables taking a screenshot whenever a test fails.

module.exports = {
  "src_folders": [
    "tests" // Where you are storing your Nightwatch tests
  ],
  "output_folder": "./reports", // reports (test outcome) output by nightwatch
  "custom_commands_path": "./custom-commands",
  "webdriver": {
    "server_path": "node_modules/.bin/chromedriver",
    "cli_args": [],
    "port": 9515,
    "timeout_options": {
      "timeout": 60000,
      "retry_attempts": 3
    },
    "start_process": true
  },
  "test_settings": {
    "default": {
      "launch_url": "http://dme.ddev.site",
      "default_path_prefix": "",
      "persist_globals": true,
      "desiredCapabilities": {
        "browserName": "chrome",
        "javascriptEnabled": true,
        "acceptSslCerts": true,
        "chromeOptions": {
          // Remove --headless if you want to watch the browser execute these
          // tests in real time.
          "args": ["--no-sandbox", "--headless"]
        }
      },
      "screenshots": {
        "enabled": false, // if you want to keep screenshots
        "path": "./screenshots" // save screenshots here
      },
      "globals": {
        "waitForConditionTimeout": 20000 // sometimes internet is slow so wait.
      }
    },
    // Run tests using GitHub actions against Tugboat.
    "test": {
      "launch_url": process.env.TUGBOAT_DEPLOY_ENVIRONMENT_URL,
      // Take screenshots when something fails.
      "screenshots": {
        "enabled": true,
        "path": "./screenshots",
        "on_failure": true,
        "on_error": true
      }
    }
  }
};

Finally, to tie it all together, the GitHub workflow runs npm run test which maps to this command:

./node_modules/.bin/nightwatch --config nightwatch.config.js --env test --skiptags solr

That launches the test runner and starts executing the test suite. Ta-da!
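For reference, wiring up npm run test is just a "scripts" entry in tests/nightwatch/package.json. This is a sketch of what such an entry might look like; npm resolves node_modules/.bin automatically, so the full binary path isn't needed:

```json
{
  "scripts": {
    "test": "nightwatch --config nightwatch.config.js --env test --skiptags solr"
  }
}
```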

Is this even the right way?

While working on this I've found myself struggling to figure out the best approach to all this. And while this works, I'm still not convinced it's the best way.

Here's the problem: I can't run the tests until Tugboat has finished building the preview -- so I need to somehow know when that's done.

In this approach, I get around that by enabling deployment_status notifications in Tugboat, listening for them in my GitHub workflow using on: deployment_status, and then executing the test suite when I get a "success" notification. One downside of this approach is that in the GitHub UI the "Checks" tab for the PR will always be blank. In order for a workflow to log its results to the Checks tab, it needs to be triggered via a push or pull_request event. I can still set a commit status, which in turn allows for a green check or red x on the pull request, but navigating to view the results is less awesome.

This approach allows for a pretty vanilla Tugboat setup.

It seems like an alternative would be to disable Tugboat's option to automatically build a preview for a PR. Instead, we'd use a GitHub workflow with an on: [push, pull_request] configuration that uses the Tugboat CLI to ask Tugboat to build a preview, wait for the URL, and then run the tests. This would allow for better integration with the GitHub UI, but would require more scripting to take care of a lot of things that Tugboat already handles. I would need to not only build the preview via the CLI, but also update it and delete it at the appropriate times.

I do think that much of the Tugboat scripting here would be pretty generic, and I could probably write the workflow to manage Tugboat previews via GitHub Actions once and mostly copy/paste it in the future.

Yet another approach would be to not use GitHub Actions at all, and instead run the tests via Tugboat, then use the GitHub Checks API to report back to GitHub about the status of a PR and log the results into the "Checks" tab. However, this looks like a lot of code and would probably be better if it were built into Tugboat in a more generic way: something like a "run my tests" command, a way to parse standard jUnit output and log the results to GitHub, or maybe bypassing the Checks UI altogether and having Tugboat provide its own UI for viewing test results.

I might explore these other options further in the future. But for now... it's working, so don't touch it! Like I said earlier -- it's all duct tape and bubble gum.


Terminal showing git log --oneline output and a whole list of commit messages that say 'Testing ....'

It took a while to figure this all out, and to debug the various issues on remote machines, but in the end, I'm happy with where things ended up. More than anything, I love having robots run the tests for me once again.

Jun 09 2020

On June 3, 2020, Drupal 9.0.0 was released! This is a major version update for Drupal, but the most straightforward update in Drupal's history.

As major version updates in the past have been quite disruptive in bringing new features and APIs, you might be wondering how this update impacts your site and your Drupal learning journey. Will Drupalize.Me Drupal 8 tutorials apply to Drupal 9 sites? Will you have to learn a totally new system with Drupal 9?

Thankfully, there is good news on both counts. The short answer to the first question is "Yes!" -- the vast majority of our Drupal 8 tutorials apply to Drupal 9 sites. For the second question, the answer is "No!" -- you won't have to learn a new system. The exceptions on the compatibility front are tutorials that feature APIs removed in Drupal 9, such as SimpleTest for Automated Testing (which we have noted). Also, some contributed modules have updates that we are currently reviewing; these appear in our Search API and Solr tutorial series.

We've put together some resources to get you up to speed with Drupal 9, starting with our Guide to Drupal 9.

What's the deal with Drupal 9?

In this short video, we explain how our Drupal 8 tutorials are compatible with Drupal 9 sites because of the way that Drupal 9 was built inside of Drupal 8.

[embedded content]

Upgrade to Drupal 9

While there's no one-size-fits-all process for upgrading to Drupal 9, by the end of this tutorial you should be able to explain the major differences between Drupal 8 and 9, audit your existing Drupal 8 projects for Drupal 9 readiness, estimate the level of effort involved, and start the process of upgrading.


Learn about a key concept in understanding the difference between Drupal 8 and Drupal 9: deprecated code:


Tools for checking Drupal 9 readiness

On the blog, we've posted a couple of tutorials to help you check your site for Drupal 9 readiness.

May 4, 2020 - 12:47pm

Upgrade Status generates a report that serves as a sort of checklist to help you determine whether or not your site is ready to be upgraded to Drupal 9. It packs a lot of useful information into a single report. It's worth taking the time to install it on a copy of your Drupal 8 site to see what it has to say.

April 20, 2020 - 1:36pm

Drupal Check and Drupal Rector are two useful command-line tools you can use to help jump-start the process of updating your Drupal 8 code to ensure it's compatible with Drupal 9. This post includes some notes about the process I went through while testing them out on some of the Drupalize.Me code base.

Community resources

There are a number of great Drupal 9 resources from the Drupal community at large. Here are a few you might want to check out.

How Drupal 9 is made and what is included (Drupal.org) -- This guide includes documentation about the code deprecation process, 3rd-party library changes, module removals, environment requirements, and other important information about Drupal 9 and its future development.

Drupal 9.0.0 released (dri.es) -- From the blog of Drupal project lead and founder, Dries Buytaert

A new Drupal 9 landing page on drupal.org (Drupal.org) -- Drupal.org has launched a shiny new landing page introducing Drupal 9.

Thank you

A heartfelt thank you to all the contributors who made Drupal 9 happen! And a thank you to the Drupal Association for supporting community infrastructure and events that keep Drupal moving forward.

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
