Nov 12 2020

At the Portland Drupal User Group meeting, we had a special guest: Tim Lehnen (hestenet on drupal.org) from the Drupal Association, who walked us through the new Issue Forks and Merge Requests features on drupal.org. We also talked about how these features will help smooth the way for contributors to Drupal projects.

Links mentioned in the meetup

Blog series on the Developer Tools Initiative

Live preview with TugboatQA

Also mentioned was the newly-launched Live preview with TugboatQA feature!

Contribution opportunities

We also talked about ways folks could get started with contributing to Drupal. Suggestions included:

  • Read the docs! The drupal.org guide, Quick information for new contributors, will help you get started contributing to Drupal.
  • Join the Drupal Slack (drupal.org/slack) community and your local Drupal community's channel (Portland's is #drupal-pdx), and ask folks there for suggestions or mentoring.
  • Participate in an online Drupal camp. Drupal events often have contribution days and mentors to help contributors find issues and learn the process of contributing. DrupalCampNYC 2020, happening now (Nov. 12-14, 2020), is one example. And DrupalCon Europe is coming up in December.

We're excited about the new tools for contributors and thankful for all the work both volunteers and people at the Drupal Association have put into this initiative. #DrupalThanks!

Oct 26 2020

As a tutorial and documentation writer, I don't have UX writing at the top of my task list, but it's adjacent to the work I do on a day-to-day basis, and it's definitely something I'm interested in learning more about.

All those bits and pieces of words -- on labels, buttons, navigation links, error messages, and confirmations -- it turns out they're really important! They communicate to the user where and how to interact with a site, and the results of actions they've taken (among other things). These little messages also tell the user a lot about the company behind the site or app.

UX writers are folks who pore over these messages, tweak them, test one version versus another, and write them to hopefully convey an accurate message at the right time in the proper tone. There's a lot to think about in the area of UX writing, a lot of pitfalls to avoid, and opportunities to explore as well. Fortunately, there are a host of books, articles, and even conferences dedicated to this discipline.

Recently, our marketing coordinator Philippa Stasiuk and I attended the online UX Writer Conference. I heard about it through the Write the Docs Slack community grapevine. (By the way, Write the Docs is a conference for technical writers I heartily recommend.)

The schedule was packed to the brim with sessions, networking, and booth-visiting opportunities, all taking place on the Hopin platform, one of the new "normals" of online conference attending. (DrupalCon, Write the Docs, and BADCamp took place on Hopin as well.)

My main takeaway from the conference was learning about concepts that I would like to take a deeper dive into. "Tone" in UX writing is one of those topics. I caught one of the sessions on tone (there were several), and this was my favorite slide. It shows a spectrum of emotions the user may be feeling and the corresponding tone that you as a UX writer should employ in your messages when the user is probably feeling a certain way.

Tone spectrum

A slide from Nicole Michaelis' UX Writer Conference presentation: Tonality -- When, Where, and Why?

Nicole gave several great examples, such as how to improve the tone of a message when a customer has just signed up (user is feeling excited) or when their free trial is almost up and you want them to consider signing up and paying (user may be feeling a range of things like busy, demotivated or frustrated).

Incidentally, the presenter, Nicole Michaelis, has a podcast I'm looking forward to checking out: Content Rookie: A Podcast on All Things Content.

In another session I attended, Accessibility, Diversity, Equity, and Inclusion: UX Writing for Divergent User Bases, Natalie Dunbar shared a number of helpful resources on discovering cognitive biases.

Philippa's takeaways were less about the applications to Drupal and the way we're teaching it, and more about how to optimize the way we connect with our customers, site visitors, social media followers, etc. Here are her thoughts:

As someone who comes from a writing background, I was very interested -- and glad -- to see so many people who specialize in UX writing coming from journalism, which has direct applications to UX writing via editing, fact checking, and always leading with the most important information.

That was the case with presenter John Caldwell, who spoke about crafting content that connects with customers by building trust and loyalty using three key elements:

  • Character: your brand, your core values -- what your company is about
  • Voice: how you bring that character to life -- where the customer relationship is created
  • Tone: mood, emotion -- more momentary language that addresses what's happening now and in context

Building voice and tone upon character is a core strategy for demystifying complex topics (which is more Amber's job), but it's also relevant to what I do, which is to shepherd a brand that's built on meaningful connections.

Another key topic that I found relevant to both how we teach Drupal and how we engage with our customers was the importance of empathy. Through the various workshops, the idea of being actively empathetic -- that is, putting yourself in someone else's perspective, in the micro moments that make up everything from UX design and tutorial making to social posts -- came through as key. It's something through which everything we create and say should be filtered (and honestly, something we could use a lot more of in the world in general).

If you found this intriguing, it looks like the UX Writer Conference is happening again online in February 2021. More details on their website, https://uxwriterconference.com/.

Oct 20 2020

I've recently been working on some quality of life updates for the Drupalize.Me codebase and various DevOps things. My goal is to pay down some of our technical debt, and generally make doing development on the site a more pleasant experience. In doing so I noticed that I was spending a lot of time waiting for MySQL imports -- that means others probably were too. I decided to see what I could do about it, and here's what I learned.

That MySQL table is HUGE!

The database for the site has a table named vidhist. That's short for video history. It has a record in it for every instance someone interacts with one of the videos on our site. We use the data to do things like show you which videos you've watched, show you a history of your usage over time, and to resume where you left off last time you played a specific video. Over time, this table has grown... and grown... and grown. As of this morning, there are a little over 5.7 million records in the table. And it's 2.8GiB in size!
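For context, here's a rough sketch of what a table like this might look like. The column names (apart from updated, which appears in the cleanup queries later in this post) are illustrative guesses, not our actual schema:

-- Hypothetical shape of the vidhist table; only the updated column is
-- confirmed by the queries later in this post.
CREATE TABLE vidhist (
  vhid INT UNSIGNED NOT NULL AUTO_INCREMENT,
  uid INT UNSIGNED NOT NULL,      -- who watched
  asset_id INT UNSIGNED NOT NULL, -- which video
  position INT UNSIGNED NOT NULL, -- playhead position, used for resume
  updated INT NOT NULL,           -- UNIX timestamp of the interaction
  PRIMARY KEY (vhid)
);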

Screenshot of Sequel Pro application showing statistics for vidhist table including 5.7 million rows and 2.8GiB size

That is all just fine -- except when you need to downsync a copy of the database to your localhost for development. Downloading the backup can take a while; the file is about 650MB when compressed. Just importing all those records takes forever. I use DDEV-Local for development, and pulling a copy of the database from Pantheon and importing it takes about 30 minutes! It's not as bad when exporting/importing locally. But you can imagine how working on testing an update hook could get quite tedious.

Output from ddev pull command with 35 minute timer.

I usually truncate the vidhist table on my localhost just so I don't have to deal with it. The data is important in production and for historical records, but is rarely necessary in development environments.
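If you want to do the same, a one-liner like this is all it takes (assuming a DDEV project with Drush available in the web container):

# Empty the local copy of the vidhist table.
ddev exec drush sql-query "TRUNCATE vidhist;"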

To get an idea of the impact this has, here are some of the other places this table can cause issues:

  • Downsync to local environment takes about 30 minutes
  • Cloning a database from one environment to another in Pantheon is slow; I don't have great metrics on this other than just watching the spinner for what seems like forever.
  • We use Tugboat.qa for automatic preview environments for every pull request. It's already quite fast to build a new preview despite the large DB size (about 2 minutes), but each preview instance uses about 8GB of space, so we're often running into our limit. When the script runs to update our base preview image every night, it takes about 18-20 minutes.
  • We occasionally use Pantheon's multidev feature, and building those environments takes just as long as the ddev pull command due in part to the large table.
  • We have an integration tests suite that takes about 11 minutes to run. Combined with the need to build a testing environment, it can take over 40 minutes for our tests to run.

Most of this is just robots doing work in the background -- so it's pretty easy to look the other way. But there are a few places where this is really affecting productivity. So, I made a few changes that'll hopefully help in the future.

Clean up the database on Pantheon

Using Pantheon's Quicksilver hooks we execute the following script anytime the database is cloned from the live environment to any other environment:

<?php
// truncate_vidhist.php -- the opening lines of this script were lost in
// publishing; everything above the first db_query() is reconstructed from
// the steps described below.
if (isset($_ENV['PANTHEON_ENVIRONMENT']) && $_ENV['PANTHEON_ENVIRONMENT'] !== 'live') {
  // Bootstrap Drupal 7 just far enough to run database queries.
  define('DRUPAL_ROOT', getcwd());
  require_once DRUPAL_ROOT . '/includes/bootstrap.inc';
  drupal_bootstrap(DRUPAL_BOOTSTRAP_DATABASE);

  db_query('CREATE TABLE vidhist_backup AS SELECT * FROM vidhist WHERE updated > UNIX_TIMESTAMP(DATE_SUB(NOW(), INTERVAL 1 MONTH))');
  db_query('TRUNCATE vidhist');
  db_query('LOCK TABLE vidhist WRITE, vidhist_backup WRITE');
  db_query('INSERT INTO vidhist SELECT * FROM vidhist_backup');
  db_query('UNLOCK TABLES');
  db_query('DROP TABLE vidhist_backup');

  echo "Database downsize complete.\n";
}

This code is executed by Pantheon (see configuration file below) whenever the database is cloned and goes through these steps:

  • Verify that this is NOT the production environment
  • Bootstrap Drupal enough so that we can query the database
  • Run a series of queries that remove all but the last month's worth of data from the vidhist table. If you're curious, we run multiple queries like this instead of a single DELETE WHERE ... query because TRUNCATE is significantly faster (see the comparison below). So we first create a temp table and copy the last month of data into it, then truncate the vidhist table, copy the temp data back into it, and delete the temp table.
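For comparison, here's roughly what the single-query version we avoided would look like. On millions of rows, deleting in place is dramatically slower than the temp-table-and-TRUNCATE approach, because DELETE removes (and logs) rows one at a time:

-- The slower alternative: delete old rows in place.
DELETE FROM vidhist
WHERE updated <= UNIX_TIMESTAMP(DATE_SUB(NOW(), INTERVAL 1 MONTH));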

We have a pantheon.yml similar to this which tells Pantheon where to find the script and when to run it.

# Pantheon config file.
api_version: 1

workflows:
  clone_database:
    after:
      - type: webphp
        description: Reduce size of the vidhist table in the DB.
        script: private/scripts/quicksilver/truncate_vidhist.php
  create_cloud_development_environment:
    after:
      - type: webphp
        description: Reduce size of the vidhist table in the DB.
        script: private/scripts/quicksilver/truncate_vidhist.php

As a result, the vidhist table on all non-production environments is a fraction of the original size. It's a relatively small change to make, but the impacts are huge.

Clone operations from non-production environments are significantly faster. And, since we configure DDEV to pull from the Pantheon dev environment by default, when I do ddev pull on my local machine it's also much faster now. It's closer to 2 minutes instead of 30!

Output from ddev pull showing 2 minute timer.

This also helps reduce our disk usage on Tugboat.qa. Because we have Tugboat configured to pull the database and files from the Pantheon test environment, it too gets a smaller vidhist table. Our build time for previews is almost a full minute faster with previews now building in an average of 1 minute 11 seconds!

Tip: You can use this same technique to do things like sanitize sensitive data in your database so that it doesn't get copied to development environments.
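For example, here's a hypothetical line you could add to the same script to anonymize email addresses on non-production environments (users and mail are Drupal 7's core table and column; adjust for your own data):

  // Hypothetical example: anonymize email addresses outside production.
  db_query("UPDATE users SET mail = CONCAT('user+', uid, '@example.com') WHERE uid > 0");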

An aside about Tugboat

I originally updated our Tugboat config.yml file to perform this cleanup of the database after pulling it down from Pantheon, in an attempt to use fewer resources there. I later added the script for cleaning up the Pantheon DB above. It looked like this:

update:
    # Use the tugboat-specific Drupal settings
    - cp "${TUGBOAT_ROOT}/.tugboat/settings.local.php" "${DOCROOT}/sites/default/"
    - cp "${TUGBOAT_ROOT}/docroot/sites/default/default.settings_overrides.inc" "${DOCROOT}/sites/default/settings_overrides.inc"

    # Generate a unique hash_salt to secure the site
    - echo "\$settings['hash_salt'] = '$(openssl rand -hex 32)';" >> "${DOCROOT}/sites/default/settings.local.php"

    # Import and sanitize a database backup from Pantheon
    - terminus backup:get ${PANTHEON_SOURCE_SITE}.${PANTHEON_SOURCE_ENVIRONMENT} --to=/tmp/database.sql.gz --element=db
    - drush -r "${DOCROOT}" sql-drop -y
    - zcat /tmp/database.sql.gz | drush -r "${DOCROOT}" sql-cli
    - rm /tmp/database.sql.gz

    # Remove most of the records from the vidhist table.
    - drush -r "${DOCROOT}" sql-query "CREATE TABLE vidhist_backup AS SELECT * FROM vidhist WHERE updated > UNIX_TIMESTAMP(DATE_SUB(NOW(), INTERVAL 1 MONTH));"
    - drush -r "${DOCROOT}" sql-query "TRUNCATE vidhist;"
    - drush -r "${DOCROOT}" sql-query "LOCK TABLE vidhist WRITE, vidhist_backup WRITE;"
    - drush -r "${DOCROOT}" sql-query "INSERT INTO vidhist SELECT * FROM vidhist_backup;"
    - drush -r "${DOCROOT}" sql-query "UNLOCK TABLES;"
    - drush -r "${DOCROOT}" sql-query "DROP TABLE vidhist_backup;"

But, while writing this blog post I realized that's probably not necessary. Since Tugboat is pulling the database from the Pantheon test environment, not the live one, the table will have already been cleaned up. That also means updating the base preview in Tugboat is going to be significantly faster than I had originally thought. Just gotta go open a new PR...

Recap

I'm kind of embarrassed that it took me this long to address this issue. It's easy to say, "Meh, it's just a few minutes." But over time those minutes can really add up. Not to mention how frustrating it must be for someone trying to get started working on the site who isn't accustomed to going to make a cup of coffee while they wait for the DB to import.

I encourage you to occasionally step back and consider the everyday motions you go through without really thinking about them. There may be room for improvement.

Oct 09 2020

For the Drupalize.Me site we have a functional/integration test suite that's built on the Nightwatch.js test runner. We use this to automate testing of critical paths like "Can a user purchase a membership?" as well as various edge-case features that we have a tendency to forget exist -- like "What happens when the owner of a group account adds a user to their membership and that user already has a Drupalize.Me account with an active subscription?"

For the last few months, running these tests has been a manual process that either Blake or I would do on our localhost before doing a release. We used to run these automatically using a combination of Jenkins and Pantheon's Multidev, but when we switched to using Tugboat instead of Multidev to build previews for pull requests, that integration fell by the wayside, and eventually we just turned it off because it was failing more often than it was working.

Aside: The Drupalize.Me site has existed since 2010, and has gone through numerous rounds of accumulating and then paying off technical debt. We once used SVN for version control. Our test suite has gone from non-existent, to Behat, then Casper, then back to Behat, and then to Nightwatch. Our continuous integration (CI) relies primarily on duct tape and bubble gum. It's both the curse, and the joy, of working on a code base for such a long time.

I recently decided it was time to get these tests running automatically again. Could I do so using GitHub Actions? I have a bunch of experience with other CI tools, but this was my first time really diving into either Tugboat or GitHub Actions in their current form. Here's what I ended up with.

  • We use Tugboat.qa to build preview environments for every pull-request. These are a clone of the site with changes from the pull-request applied. This gives us a URL that we can use to run our tests against.
  • We use GitHub Actions to spin up a robot that'll execute our tests suite against the URL provided by Tugboat and report back to the pull request.

Setting up Tugboat to build a preview for every pull request

We use a fairly cookie-cutter Tugboat configuration for building preview environments. Blake set it up, and I mostly just looked at it and thought to myself, "Hey, this actually looks pretty straightforward!" The setup:

  • Has an Apache/PHP service with Terminus and Drush installed, and a MySQL service
  • Pulls a copy of the database from Pantheon as needed
  • Reverts features, updates the database, and clears the cache each time a pull request is updated
  • Most importantly, it has a web-accessible URL for each pull request

Here's what our .tugboat/config.yml looks like with a few unrelated things removed to keep it shorter:

services:
  php:

    # Use PHP 7.2 with Apache
    image: tugboatqa/php:7.2-apache
    default: true

    # Wait until the mysql service is done building
    depends: mysql

    commands:

      # Commands that set up the basic preview infrastructure
      init:

        # Install prerequisite packages
        - apt-get update
        - apt-get install -y default-mysql-client

        # Install opcache and enable mod-rewrite
        - docker-php-ext-install opcache
        - a2enmod headers rewrite

        # Install drush 8.*
        - composer --no-ansi global require drush/drush:8.*
        - ln -sf ~/.composer/vendor/bin/drush /usr/local/bin/drush

        # Install the latest version of terminus
        - wget -O /tmp/installer.phar https://raw.githubusercontent.com/pantheon-systems/terminus-installer/master/builds/installer.phar
        - php /tmp/installer.phar install

        # Link the document root to the expected path.
        - ln -snf "${TUGBOAT_ROOT}/web" "${DOCROOT}"

        # Authenticate to terminus. Note this command uses a Tugboat environment
        # variable named PANTHEON_MACHINE_TOKEN
        - terminus auth:login --machine-token=${PANTHEON_MACHINE_TOKEN}

      # Commands that import files, databases, or other assets. When an
      # existing preview is refreshed, the build workflow starts here,
      # skipping the init step, because the results of that step will
      # already be present.
      update:

        # Use the tugboat-specific Drupal settings
        - cp "${TUGBOAT_ROOT}/.tugboat/settings.local.php" "${DOCROOT}/sites/default/"
        - cp "${TUGBOAT_ROOT}/docroot/sites/default/default.settings_overrides.inc" "${DOCROOT}/sites/default/settings_overrides.inc"

        # Generate a unique hash_salt to secure the site
        - echo "\$settings['hash_salt'] = '$(openssl rand -hex 32)';" >> "${DOCROOT}/sites/default/settings.local.php"

        # Import and sanitize a database backup from Pantheon
        - terminus backup:get ${PANTHEON_SOURCE_SITE}.${PANTHEON_SOURCE_ENVIRONMENT} --to=/tmp/database.sql.gz --element=db
        - drush -r "${DOCROOT}" sql-drop -y
        - zcat /tmp/database.sql.gz | drush -r "${DOCROOT}" sql-cli
        - rm /tmp/database.sql.gz

        # Configure stage_file_proxy module.
        - drush -r "${DOCROOT}" updb -y
        - drush -r "${DOCROOT}" fra --force -y
        - drush -r "${DOCROOT}" cc all
        - drush -r "${DOCROOT}" pm-download stage_file_proxy
        - drush -r "${DOCROOT}" pm-enable --yes stage_file_proxy
        - drush -r "${DOCROOT}" variable-set stage_file_proxy_origin "https://drupalize.me"

      # Commands that build the site. This is where you would add things
      # like feature reverts or any other drush commands required to
      # set up or configure the site. When a preview is built from a
      # base preview, the build workflow starts here, skipping the init
      # and update steps, because the results of those are inherited
      # from the base preview.
      build:
        - drush -r "${DOCROOT}" cc all
        - drush -r "${DOCROOT}" updb -y
        - drush -r "${DOCROOT}" fra --force -y
        - drush -r "${DOCROOT}" scr private/scripts/quicksilver/recurly_dummy_accounts.php

        # Clean up temp files used during the build
        - rm -rf /tmp/* /var/tmp/*

  # What to call the service hosting MySQL. This name also acts as the
  # hostname to access the service by from the php service.
  mysql:
    image: tugboatqa/mysql:5

In order to get Tugboat to ping GitHub whenever a preview becomes ready for use, make sure you enable the Set Pull Request Deployment Status feature in Tugboat's Repository Settings.

Screenshot of Tugboat UI with a checkbox for GitHub deployment status notifications checked.

Run tests with GitHub Actions

Over in GitHub Actions, we want to run our tests and add a status message to the relevant commit. To do this we need to know when the Tugboat preview is done building and ready to start testing, and then spin up a Node.js image, install all our Nightwatch.js dependencies, and then run our test suite.

We use the following .github/workflows/nightwatch.yml configuration to do that:

name: Nightwatch tests
on: deployment_status

jobs:
  run-tests:
    # Only run after a successful Tugboat deployment.
    if: github.event.deployment_status.state == 'success'
    name: Run Nightwatch tests against Tugboat
    runs-on: ubuntu-latest
    steps:
      # Set an initial commit status message to indicate that the tests are
      # running.
      - name: set pending status
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          debug: true
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: 'pending',
              context: 'Nightwatch.js tests',
              description: 'Running tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });

      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: '12'

      # This is required because the environment_url param that Tugboat uses
      # to tell us where the preview is located isn't supported unless you
      # specify the custom Accept header when getting the deployment_status,
      # and GitHub actions doesn't do that by default. So instead we have to
      # load the status object manually and get the data we need.
      # https://developer.github.com/changes/2016-04-06-deployment-and-deployment-status-enhancements/
      - name: get deployment status
        id: get-status-env
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          result-encoding: string
          script: |
            const result = await github.repos.getDeploymentStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              deployment_id: context.payload.deployment.id,
              status_id: context.payload.deployment_status.id,
              headers: {
                'Accept': 'application/vnd.github.ant-man-preview+json'
              },
            });
            console.log(result);
            return result.data.environment_url;
      - name: echo tugboat preview url
        run: |
          echo ${{ steps.get-status-env.outputs.result }}
          # The first time you hit a Tugboat URL it can take a while to load, so
          # we visit it once here to prime it. Otherwise the very first test
          # will often timeout.
          curl ${{ steps.get-status-env.outputs.result }}

      - name: run npm install
        working-directory: tests/nightwatch
        run: npm ci

      - name: run nightwatch tests
        # Even if the tests fail, we want the job to keep running so we can
        # set the commit status and save any artifacts.
        continue-on-error: true
        working-directory: tests/nightwatch
        env:
          TUGBOAT_DEPLOY_ENVIRONMENT_URL: ${{ steps.get-status-env.outputs.result }}
        run: npm run test

      # Update the commit status with a fail or success.
      - name: tests pass - set status
        if: ${{ success() }}
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: "success",
              context: 'Nightwatch.js tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });
      - name: job failed - set status
        if: ${{ failure() || cancelled() }}
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            return github.repos.createCommitStatus({
              owner: context.repo.owner,
              repo: context.repo.repo,
              sha: context.sha,
              state: "error",
              context: 'Nightwatch.js tests',
              target_url: "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
            });

      # If the tests fail we take a screenshot of the failed step, and then
      # those get uploaded as artifacts with the result of this workflow.
      - name: archive testing artifacts
        uses: actions/upload-artifact@v2
        with:
          name: screenshots
          path: tests/nightwatch/screenshots
          if-no-files-found: ignore

The one maybe abnormal thing I had to do to get this working is use the actions/github-script action to manually query the GitHub API for information about the Tugboat deployment. There's a good chance that there is a better way to do this -- so if you know what it is, please let me know.

The reason is that Tugboat sets the public URL of a preview in the deployment.environment_url property. But this property is currently hidden behind a feature flag in the API; it isn't present in the deployment object that your GitHub workflow receives. So in order to get the URL that we want to run tests against, I make a query to the GitHub API with the Accept: application/vnd.github.ant-man-preview+json header. There are other actions you can use to update the status of a commit that have a little cleaner syntax, but this workflow is already using actions/github-script, so for consistency I used that to set the commit status as well.

This Debugging with tmate action was super helpful when debugging the GitHub Workflow. It allows you to open a terminal connection to the instance where your workflow is executing and poke around.
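If you want to try it, the step looks something like this (mxschmitt/action-tmate is the action behind "Debugging with tmate"; the version tag here is an assumption, so check the action's README):

      # Hypothetical debugging step: pauses the job and prints SSH
      # connection details for a live tmate session in the job log.
      - name: debug with tmate
        uses: mxschmitt/action-tmate@v3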

Our nightwatch.config.js looks like the following. Note the use of the Tugboat URL we retrieved and set as an environment variable in the workflow above, process.env.TUGBOAT_DEPLOY_ENVIRONMENT_URL. Also note the configuration that enables taking a screenshot whenever a test fails.

module.exports = {
  "src_folders": [
    "tests"// Where you are storing your Nightwatch tests
  ],
  "output_folder": "./reports", // reports (test outcome) output by nightwatch
  "custom_commands_path": "./custom-commands",
  "webdriver": {
    "server_path": "node_modules/.bin/chromedriver",
    "cli_args": [
      "--verbose"
    ],
    "port": 9515,
    "timeout_options": {
      "timeout": 60000,
      "retry_attempts": 3
    },
    "start_process": true
  },
  "test_settings": {
    "default": {
      'launch_url': 'http://dme.ddev.site',
      "default_path_prefix": "",
      "persist_globals": true,
      "desiredCapabilities" : {
        "browserName" : "chrome",
        "javascriptEnabled": true,
        "acceptSslCerts" : true,
        "chromeOptions" : {
          // Remove --headless if you want to watch the browser execute these
          // tests in real time.
          "args" : ["--no-sandbox", "--headless"]
        }
      },
      "screenshots": {
        "enabled": false, // if you want to keep screenshots
        "path": './screenshots' // save screenshots here
      },
      "globals": {
        "waitForConditionTimeout": 20000 // sometimes internet is slow so wait.
      }
    },
    // Run tests using GitHub actions against Tugboat.
    "test" : {
      "launch_url" : process.env.TUGBOAT_DEPLOY_ENVIRONMENT_URL,
      // Take screenshots when something fails.
      "screenshots": {
        "enabled": true,
        "path": './screenshots',
        "on_failure": true,
        "on_error": true
      }
    }
  }
};

Finally, to tie it all together, the GitHub workflow runs npm run test which maps to this command:

./node_modules/.bin/nightwatch --config nightwatch.config.js --env test --skiptags solr

That launches the test runner and starts executing the test suite. Ta-da!
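For reference, that mapping presumably lives in the scripts section of tests/nightwatch/package.json; something like:

{
  "scripts": {
    "test": "./node_modules/.bin/nightwatch --config nightwatch.config.js --env test --skiptags solr"
  }
}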

Is this even the right way?

While working on this I've found myself struggling to figure out the best approach to all this. And while this works, I'm still not convinced it's the best way.

Here's the problem: I can't run the tests until Tugboat has finished building the preview -- so I need to somehow know when that's done.

I get around this by enabling deployment_status notifications in Tugboat, listening for them in my GitHub workflow using on: deployment_status, and then executing the test suite when I get a "success" notification. One downside of this approach is that in the GitHub UI, the "Checks" tab for the PR will always be blank. In order for a workflow to log its results to the Checks tab, it needs to be triggered via a push or pull_request event. I can still set a commit status, which in turn allows for a green check or red x on the pull request, but navigating to view the results is less awesome.

This approach allows for a pretty vanilla Tugboat setup.

It seems like an alternative would be to disable Tugboat's option to automatically build a preview for a PR. Instead, we'd use a GitHub workflow with an on: [push, pull_request] configuration that uses the Tugboat CLI to ask Tugboat to build a preview, wait for the URL, and then run the tests. This would allow for better integration with the GitHub UI, but require more scripting to take care of a lot of things that Tugboat already handles. I would need to not only build the preview via the CLI, but also update it and delete it at the appropriate times.

I do think that much of the Tugboat scripting here would be pretty generic, and I could probably write the workflow to manage Tugboat previews via GitHub Actions once and mostly copy/paste it in the future.

Yet another approach would be to not use GitHub Actions at all, and instead run the tests via Tugboat, then use the GitHub Checks API to report back to GitHub about the status of a PR and log the results into the "Checks" tab. However, this looks like a lot of code, and it would probably be better if it could be included in Tugboat in a more generic way: something like a "run my tests" command, a way to parse standard jUnit output and log the results to GitHub, or maybe bypassing the Checks UI altogether and having Tugboat provide a UI for viewing test results.

I might explore these other options further in the future. But for now... it's working, so don't touch it! Like I said earlier -- it's all duct tape and bubble gum.

Recap

Terminal showing git log --oneline output and a whole list of commit messages that say 'Testing ....'

It took a while to figure this all out, and to debug the various issues on remote machines, but in the end, I'm happy with where things ended up. More than anything, I love having robots run the tests for me once again.

Sep 14 2020

At Drupalize.Me, we have a long history of leading in-person Drupal training workshops. In fact, our whole website evolved out of public workshops we used to do as part of Lullabot, before online video-based training was a thing. (We even made DVDs once!) While workshops are no longer the core of our business, we continue to offer them on occasion at DrupalCon and various local camps -- simply because we love doing them.

I've always really enjoyed facilitating real-time in-person workshops. It's such a different experience than recording screencasts. The energy is different, as is the sense of accomplishment and the rush you get afterwards. And while both are great for learning, I think there are some benefits to in-person learning that are hard to replicate via written tutorials and pre-recorded screencasts.

So this fall, we're trying something new: we're offering real-time remote workshops. It's not quite in-person, but I'm pretty excited to try it out. I'm looking forward to being able to engage with others who are learning Drupal. And I think that there are some interesting things we can explore -- like using VS Code's collaboration features, and tools like Miro -- that'll make the whole experience engaging in a different, yet equally valuable way.

We're still ironing out the details, but we'll certainly be doing a version of our popular Hands-On Theming workshop which has sold out every time we've done it in the last 4 years. I'm also excited that we're putting together a workshop on upgrading from Drupal 7 to Drupal 8 (or 9).

We're going to keep the number of seats limited (probably 10 or fewer), in order to ensure we can create an environment that promotes discussion and collaboration. I think the intimacy of smaller groups can be much more engaging than large webinar-style courses where the participants are mostly anonymous.

Take a look at what we've got in store, and sign up for the waiting list to get early access and discounted prices when we schedule these workshops later this year.

View upcoming workshops →

Jul 22 2020

Last week the community had its first online DrupalCon Global. It was a great experience for our team -- better than many of us expected it to be. We posted daily summaries of our session notes and comments on the event last week for Day 1, Day 2, and Day 3. Our team also took part in the Friday Contribution Day, with most of our work focused on preparing contributed modules for Drupal 9, the Help Topics project, and documentation.

Now that we've had a bit of time to recover from the flurry of last week's activities, we want to share our overall impressions.

Contribution Day

While the sessions for the main conference were hosted on the Hopin.to platform, the community contribution work was managed using the Drupal-based Open Social platform. This let different groups manage their communication and organize in one central location so attendees could browse through the various contribution options.

Blake spent time working on preparing contributed modules for Drupal 9, and he said "I was really impressed with the tooling that's available to help module maintainers with the process (the Upgrade Status tool in particular)."

Amber is a lead for the Help Topics project:

It was really great to work on some Help Topics issue triage and writing on Friday. I had fun playing cheerleader and recruiting help. And I also got some writing in! It's been hard to make time for the module this year and it was great to set aside a day to work on it. Incidentally, I feel recharged and more motivated to work on community tasks after DrupalCon. (One of the benefits of DrupalCon in general, I think.)

How did this compare to other conferences?

Most of our team has attended in-person DrupalCons in the past, and some people have attended other online conferences, especially this year. It's interesting to look at the pros and cons of DrupalCon Global in that context. We really liked attending from the comfort of home and while we missed the intense socializing, we also appreciated getting a break and being able to recover our energy for each day of sessions. It has a different energy from in-person events, but different isn't bad.

Philippa summed it up,

No matter how good, conferences can be exhausting. While I would have preferred the opportunity to socialize more with the community face to face, I really liked the digital session format. It enabled me to attend more sessions and absorb more information than a live conference where I struggle with feeling overwhelmed.

Another advantage of the online event is not having to worry about overcrowded sessions. Switching back and forth between sessions was also easy, making it possible to attend more sessions and find topics that grabbed your interest in a given slot.

Socializing and networking require more active effort online, but it felt easier at a virtual conference where you already know some people. While this is also true at in-person events, the extra work of meeting new people felt like a much bigger barrier at other online conferences, and was less likely to happen. Running into people we knew from the community or other events in chat rooms made this event feel more comfortable from the start.

We also liked the Hopin.to platform. While few of us were brave enough to click on the Networking button (which puts you into a direct message with another attendee who wants to chat too), we found that the platform was easy to navigate and the addition of the chat to each session, as well as the event in general, was a nice perk. It allowed participants to engage with each other, ask and answer questions, provide links, and generally added to the experience. It could be distracting at times, but you can close the chat window in those instances. The chats were beneficial enough that we wouldn't mind seeing something like this added to in-person events as well.

One thing that was notably missing at this DrupalCon was access to the recorded sessions shortly after they were presented. At a regular DrupalCon, sessions are uploaded to the Drupal Association YouTube channel very quickly. Given the time zone issues and scheduling, many (most?) people missed sessions because it was simply not a good time locally. The lack of quick video availability made it feel like the entire conference was not accessible. However, we understand that, given the nature of this conference, it made sense not to "give it away for free" so quickly. We'll also note that the session videos will be made available on YouTube for the community in September.

Our favorite sessions

There were a lot of good sessions. Here are some of our favorites:

Philippa

The Olivero theme: Turning a wild idea into a core initiative and Designing for Chaos: The design process behind Olivero

What I liked most about these back-to-back sessions was the storytelling format that brought substance to abstraction in a way that proved very illuminating for me. Explaining how Olivero was conceptualized -- from its beginning as a passing conversation at a previous DrupalCon to its current place as a core initiative -- was super compelling. Specifically, and from a content strategy standpoint, I was very interested in learning about how they defined the scope of such a large project, and how accessibility and simplicity were major conceptual drivers to rein the project in.

Blake

The Performance & Scaling Summit was a really good refresher on some of the tools and techniques I should be using on a regular basis as a status check on our sites. I'm guilty of not being as proactive as I could be in this department, so it was nice to see how I could do a better job of integrating profiling into my development process on a more consistent basis.

Amber

Shift Left: Addressing Digital Inequity for the Black Community contained a lot of good food for thought. It's a session I plan on re-watching when the recording becomes available. (See our notes and impressions in DrupalCon Global Day 3.)

Addi

I was really happy with the number of sessions that specifically addressed diversity and inclusion, and in general the human part of technology. I liked the groundwork laid in How to improve psychological safety in your organization, which had lots of good resources for teams of different sizes. In addition to Amber's pick of Shift Left, I also learned a lot in the Software for a diverse community starts with a diverse team and Trans Drupal: Designing with transgender users in mind sessions.

Summary review

This was a great conference. It was well run and had high-quality sessions for a very wide range of interests and skills. We're impressed with the quality of the event, especially given the circumstances that moved it online and the short timeline the Drupal Association and volunteer team had to pull it off.

While in-person conferences tend to be more about socializing and networking for many of us, the online venue drew us more into the sessions themselves. We all managed to absorb and engage with the material better at DrupalCon Global. We do miss the social time and the serendipity of in-person events, but there's no reason both kinds of events can't become part of the community's future event list.

Did you attend? We'd love to hear your thoughts in the comment section.

Jul 17 2020

Well, day 3 of DrupalCon Global delivered again. It's been a whirlwind three days full of good content and meeting lots of people, new and old. Blake captured the general feeling of this 'Con:

I was very skeptical of a virtual DrupalCon (especially as someone who didn't attend at all last year). Although I missed the first day, I'm really glad I had the time and opportunity to engage with the event. Seeing so many new faces presenting was also wonderful. DrupalCon Global was a great way to reconnect with folks in the community, and still be able to have dinner with my family in the evenings. Two thumbs up, would attend again.

Contribution Day

While sessions are over, don't forget that today, Friday, is the traditional contribution day. You'll need to sign up on the DrupalCon Global Contributions Room site, and it's open to everyone--no DrupalCon ticket required. There are many different kinds of groups that require a wide range of skills, with plenty of space for non-coders. If you're not sure how to get started, there is a page that explains how to take part in Mentored Contribution, which will walk you through the whole process.

Sessions

Although we were definitely starting to feel some day 3 brain overload, there were some really good sessions on this last day of the schedule. Our teammates Amber Matz and Joe Shindelar also presented on Thursday, so make sure to check out their sessions when the videos come up.

Amber's session: Deep dive: state of Drupal (Link to slides with presenter notes)

Joe's session: Altering, extending, and enhancing Drupal (Link to session resources)

Here are the random notes and summaries from the sessions we attended yesterday.

The Olivero theme: Turning a wild idea into a core initiative

  • Drupal core ideas is a great place to find collaborators, and share ideas for potential projects
  • Documentation, identifying stakeholders, and having diverse skillsets on the team were key to the project's success.
  • Building a static POC on Netlify and using Tugboat allowed folks to get involved and contribute earlier than if they had gone straight to a Drupal theme.
  • Drawing the line between must haves and nice to haves was important to maintain project momentum and contributors' mental health

Designing for Chaos: The design process behind Olivero

  • The main motivation for Olivero: to create a better first impression with Drupal
  • The team named the theme in honor of Rachel Olivero, who worked at the National Federation of the Blind, and who was committed to making tech accessible to all people
  • Validating the design: "The first draft of anything is shit" -- Ernest Hemingway
  • The flaw of averages: If you want to design something for an individual human being, the average is completely useless. We learned how the designers worked to avoid it.
  • A spectrum analysis was used to establish voice and tone (formal, bright, approachable, high contrast)
  • In order to be able to iterate more quickly, the team defined a core set of stakeholders to help with initial designs before showing them to the general community. This helped eliminate low-hanging-fruit issues (e.g. broken accessibility) that would have been blockers no matter what. It also allowed the broader community to keep the discussion focused on the bigger picture.
  • Cool use of the InVision app to allow stakeholders to rank things on a scale. They created an image with lines and good ---------------- bad, and then people could leave comments in InVision somewhere along the line. Since comments show as little red circles, you could clearly see the ranking. And discussions could take place in the comment threads.

(Philippa) As a non-tech person, what I liked about both Olivero presentations was how they laid out the thought processes linking the idea to the core initiative, and how they advocated for the idea that led to the new default front-end theme for Drupal 9.

Intermission: Desk yoga with Gabrielle Rickman

Sooo good. While some people on the team opted for a breakfast intermission, others got a very nice break of stretches to do from the chair. Useful for any conference, but extra refreshing at an online event.

Driving today's CMS with tomorrow's artificial intelligence

  • Machine learning is a subset of artificial intelligence and isn't smart enough to evolve itself.
  • Uses in business:
    • Advanced automated interaction with customers
    • Identifying patterns in behavior
  • Where can Drupal use this?
    • Content moderation
    • Analyze customer mood
    • SEO
    • Chatbots
    • Personalization
    • Visual search
  • How do you do this? Are you ready? Do you have a plan for this in your org?
  • APIs to use
    • Azure Cognitive Services API (multiple APIs: vision, speech, etc.)
    • Google Vision API (this is much bigger than the Azure Vision API)
    • Drupal modules exist for these APIs
  • Really neat to see examples of how this can be used for content management. Nice live demo and some really cool things you can help automate.

Open source belongs in school--Let's put it there!

A presentation by a teacher and his students, The Penguin Corps, about how they are using open source -- led by the students.

Software for a diverse community starts with a diverse team

Highly recommended. Really great session that addresses a lot of questions about diversity AND inclusion, and looks at how to create inclusive work agreements with clients and vendors.

MagMutual.com: On the JAMStack with Gatsby and Drupal 8

  • Decoupled architecture
    • volatility based decomposition of feature requirements
    • Drupal (CMS) / Gatsby (Website) / Serverless (AWS Deployment) / Bus. Logic (Lambda) / Apollo GraphQL (User Data) / ElasticSearch (Search) / Auth0 (User Identity)
    • Briefly walked through the features and benefits of each of these components
    • (notes for our potential future use: Drupal Elasticsearch AWS connector module, AWS lambda rate limiting, Gatsby searchkit plugin)
    • Serverless framework: not really used in live code, but helped with the following (by allowing folks to avoid using the AWS console):
      • deployment
      • mocking
      • testing
      • logging
      • local development
      • project structure
    • Living with it - ongoing support
      • harder to debug integration points, more things to support, onboarding
      • Apollo GraphQL is a huge win (helps set up the data structure schema, and forces thinking about it), improved performance, adding new design assets is faster

Shift Left: Addressing Digital Inequity for the Black Community

Another highly recommended session. We'll be watching this again when the video goes up.

  • Designing tech for people without a detailed and rigorous study of people makes the kinds of tech designs that we see come at the expense of people of color and women (rough quote from Algorithms of Oppression)
  • Understanding the effects of systematic dehumanization of Black individuals
    • Sylvia Wynter (No Humans Involved)
    • Aime Cesaire (Discourse on Colonialism)
  • Hegemony's role determines which products are created, and which problems are prioritized
  • The continued exclusion of Black people from technical creation
    • Blackness as an afterthought (film)
    • Black input is consistently missing from product development (2.5% of employees at Google 2018, 3.3% of technical employees at Microsoft 2019)
    • Double Consciousness
  • The Digital Divide
    • Having less access to technical skill development enhances the divide, and makes being part of solving the problem through product creation more difficult
    • Beware of software like PredPol: algorithmic biases and flawed training data in machine learning systems lead to systemic injustices
  • How can we move forward?
    • Reframe perspectives (equitable and fair predictive algorithms)
    • Actually address the issues
    • Intentionally carve space - tokenism and quick fixes won't solve lack of equity

Hacking live! A realtime look at website security

  • Don't trust user input, even admin form input.
  • Use Form API. It provides XSS cleanup on output by default. (Drupalize.Me tutorials: Forms (Form API))
  • Avoid using the raw filter in Twig. (Twig auto-escapes strings and the raw filter removes that safety net; see the example after this list.) (Drupalize.Me tutorials: Twig Filters and Functions)
  • XSS exploits can be stored both in cache and database. (Sanitize output that is coming from a cache.)
  • DDoS (Distributed Denial of Service): hitting a page with a slow function at a large scale (relatively speaking) can take the site down.
    • Limit form submissions to prevent flooding.
    • Move slow functions to asynchronous queues.
  • Access Bypass
  • Use PHP CodeSniffer to find errors.
  • Module plug from the chat: Content-Security-Policy
  • Dries attended the session and he shared this link to his site: HTTP Headers Analyzer
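To make the Twig point above concrete, here's a minimal illustration (the variable name is made up):

{# Safe: Twig auto-escapes output, so markup in user input is neutralized. #}
<h2>{{ comment_subject }}</h2>

{# Dangerous: raw disables auto-escaping, so stored XSS can execute. #}
<h2>{{ comment_subject|raw }}</h2>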

Altering, extending, and enhancing Drupal

  • Check out Joe's resources
  • Don't hack core. (Alter and extend instead!)
  • Let others change your module, without hacking it. (This is super powerful!)
  • Ways to alter, extend Drupal:
    • Respond to an event (a user is logging in)
    • Drupal asks a question (does anyone have a block they want to add to this list?)
    • Add code that adds new functionality
    • Make a change to existing functionality (change fields on a login form, for example)
  • Don't hack core. Instead use one of these systems.
    • hooks
    • plugins
    • services
    • events

Adapting your DE&I efforts to the reality of the crisis

Importance of creating safe spaces for people on your team to talk about their struggles and/or to get to know one another better. This could be through one-on-ones or regular online social events. (Physical distancing, not social distancing.)

Inclusive content strategy

"People want to be seen as equal participants and not afterthoughts."

  • (Philippa) LOVE that she called out hiring language that speaks to whether people are a good "culture fit" in businesses and organizations - because people shouldn't have to conform culturally - and how content that communicates aligned values is a great alternative.
  • Some of what inclusive content means:
    • Predictable structure (like nesting and heading order) for easy navigation
    • Color contrast/font/distinguishable links
    • Short sentences (9th grade reading level) and plain, jargon-free language
    • Bullets and number lists
    • Imagery with captions, subtitles, text alternatives (which also helps those with poor internet connections)
    • Closed or open captions - which also helps those whose first language isn't English
    • Minimizing or avoiding emoticons, which are hard for screen readers

Next week, we'll be posting a wrap-up of our overall DrupalCon experience, including today's contribution day. We hope you've had a great 'Con and hope to see you at the contribution day. If you see anyone from our team, please say hi!

Jul 16 2020

Day 2 of DrupalCon Global was chock-full of great presentations, and we also attended the two summits, Community and Performance & Scaling. Everyone has settled into the platform and rhythm of the conference now, and it's been great catching up with friends and community colleagues.

As Amber says,

I'm enjoying DrupalCon Global more than I thought I would, to be honest. (Still bummed about missing a trip to Minneapolis!) But I'm learning a lot and reconnecting with what is going on with the project and in the community.

Here are our notes from day 2, which again, are not fully fleshed out, but highlight the things we found interesting or want to follow up on.

Sessions

Drupal Initiatives Plenary

  • Great community initiatives to participate in. Quick overview of each one with some info about how to get involved.
  • Drupal Diversity and Inclusion (DDI)
    • They have a booth in the Hopin Exhibit Hall if you want to learn more or chat with someone
    • Ideas for ways to take action
      • Read the Code of Conduct and understand it
      • Consider being an official CoC Contact
      • Speak up when you see a problem, prepare to disrupt harassment
      • Notice who is present; who's missing? Sponsor marginalized individuals
      • Management: Fix hiring, recruitment and pay gaps
      • Update your language

Single sign on across Drupal 8

  • Slides
  • A good walk-through of the presenter's Medium article on the same topic. It did a good job explaining the different components of a single sign-on system, their roles, and how to integrate Drupal at each point in the process.

Trans Drupal: Designing with transgender users in mind

  • Gender 101 and history
    • What is (personal) gender? The easiest answer is that it's whatever a person says theirs is
    • Social gender is a social organizing principle (e.g. men's and women's bathrooms)
  • All work is biased in some way - none of it is perfectly neutral
  • Words and images, be aware (find images on https://genderphotos.vice.com/ and https://getavataaars.com/)
  • Let people change their data, make sure it updates everywhere, and don't keep the old values (avoid deadnaming). Transition means a lot of different things and can happen at any time/over time. You want to get rid of old data completely so that it can't be resurfaced later.
  • Do you really need to track gender on your site? Why? If so, it's complicated to do.

Unit tests in Drupal: The road to test-driven development

  • Introduction to TDD
  • Live demo of the Red > Green > Refactor feedback loop using PHPUnit
  • Unit Test fundamentals & vocabulary

Dries Q&A

  • Make DX of Drupal better for devices with less power/RAM to increase adoption among students and others.
    • Any change we make to improve Drupal's DX/performance in this respect will both help lower end devices, but also benefit all sites. e.g. sidewalk curb cuts
  • Is there still a sizeable market for small businesses? Or are they moving to site-builder tools.
    • On the lower end of the market there are a lot of competing technologies, and while Drupal can be used, it's harder to convince people; on the high end of the market there's less competition and more opportunity for Drupal to stand out.
  • Do you see online conferences becoming a permanent mix for the Drupal community going forward?
    • Dries: Likes that it lowers the travel barrier, and makes it more accessible to people
    • Dries: Misses seeing people in person, and misses human interaction, which Dries believes is an important part of what has made Drupal successful.
    • Heather: DA is committed to in-person, and doesn't see the desire for that going away. But, is open to exploring how virtual events can supplement in-person ones.
  • What unexpected things, positive and negative, have you observed about yourself and the community as a result of the COVID pandemic?
    • Good leadership is important in moments of crisis
    • Value of cooperation
  • How can we answer the criticism that Drupal is not popular and has a poor perception amongst developers in general
    • Drupal is so far ahead of the competition in a lot of ways that people just don't know about. e.g. caching and big pipe ... most devs have no idea about these things and their importance but benefit from them when they use Drupal. Also, Drupal is a model OSS community.
  • 3 biggest goals you have for the future of the product?
    • Adoption, ease of use, low cost of maintenance
    • Increase adoption to a point where it can be used as a wedge issue to promote the open web
    • See Drupal grow by making it easier to use, and lower the maintenance burden

Drupal.org panel

  • New Contributor Guide that aims to make it easier for contributors to on-board themselves and find things to work on
  • Lots of changes to packaging and things that happen behind the scenes that, as an end user, you probably don't care much about as long as it works, but that make a big difference for module maintainers and the security team
  • Issue Forks & Merge Requests - new tools for maintainers open for opt-in beta now, a Drupal.org implementation of something akin to GitHub pull requests. A big improvement over the current .patch-based workflow that will lower the barrier to contributing to Drupal.org-hosted projects. Mostly working on trying to iron out UI/UX issues before enabling it for all of contrib. And then core as well.
  • Future projects:
    • Federated login
    • Better telemetry
    • Upgrade to Drupal 9
    • Packaging related to auto-updates initiative
    • Preparation to support the release of Drupal 10.

Drupal 9: New initiatives

A working session to discuss next steps for D9 initiatives. One major challenge for all initiatives is the need for people to help with project management and just general organizing of things. There was some talk about the JavaScript menu component idea that was proposed in the Driesnote, mostly recognizing that it requires better definition. Lots of agreement that keeping it limited in scope to something small and attainable is important. And, one of the primary goals is to create/demonstrate an on-ramp for JavaScript developers into the Drupal core community.

Looking to Drupal 10. Move everything to PHP 8, and help/pressure upstream libraries to do the same. CKEditor 4 is EOL, so we need to either update to the new version or remove it. jQuery related: it's probably time to convert to vanilla JS, which also means removing jQuery UI, and maybe Backbone too? [meta] Release Drupal 10 in 2022

The Easy out-of-the-box initiative needs a leader. It combines efforts from at least three other ongoing projects, like Claro and Olivero. It needs someone who can help with accessibility and focus on accessibility-related issues within the initiative; someone who cares about day-to-day users of Drupal and the beginner experience.

Improving your onsite search

  • Making sure your Drupal site is optimized for both organic and onsite searching
  • Why onsite searching is important:
    • It presents your site as a reliable resource
    • It blocks out competitors, who may show up in a Google/Bing search
    • It increases a customer's satisfaction by validating the decision to be a customer.
  • How to make onsite searching a priority: Basically, you apply the same SEO/tagging/taxonomy/metadata/keyword principles that help optimize a site for organic searching.
  • Finally, you can use Google Analytics to learn what people using onsite search are looking for, and how they're phrasing those searches. This learning can be applied to future content creation.

Mind-blowing content planning in native Drupal

  • A great presentation by Lukas Fischer from Netnode about the nearly-released 1.0 version of the Content Planner module, a tool suite that helps plan editorial content calendars.
  • Lukas demoed how to download it, showed the module's flexible and configurable dashboard (walking us through what types of views and widgets you could add), and made a strong case for why it should replace Google/Microsoft sheets and Trello (yup, that's us) for planning and managing content.

Drupal Security Team Q&A panel

  • Great info about what the Drupal Security Team does.
  • Highlight was Portland Drupal community member Sam Mortenson (https://twitter.com/mortensonsam) explaining how he, as a security researcher, finds bugs.

Sharing is caring: Don't hold your knowledge hostage

From squiggles to straight lines: Sketching discovery & UX

  • Marissa Epstein shared how she uses sketching in discovery and UX processes. In a world of perfectionism and over-engineering prototyping solutions, Marissa demonstrated how sketching is a great way to quickly flesh out ideas and get clients (and the whole team) on board and confident in the direction the site design is going.

Drupal Automatic Updates in action and in depth

  • This was my first time seeing a live demo of the Automatic Updates module. Really promising work here, and there's a lot more still to come in "Phase 2". This is one of the priorities for Drupal 10, as Dries explained in his Driesnote, so expect some great progress on this front.

Media in Drupal 8: everything you need to know

This was a really in-depth look at Media in Drupal 8. The presenter was one of the maintainers and did a live demo where he created media types and showed us all how to build and use Media Library. He used a fresh install of Drupal to walk us through each step and I found it very helpful. It was a great session for beginners and perhaps even some intermediate site builders looking to learn more about Media.

Summits

Community Summit

15-minute talks:

  • Events Organizer update - Baddy Breidert
    • Some history on this working group, where they are now, and how to get involved.
  • Cultivating the Drupal Community Mindfully - Matthew Tift - mindfulness is not for solving problems, but for rethinking our relationship to problems; how mindfulness can help our community based on our values and principles; cultivating an intentional community by being present
  • If you want to cry, watch Mario Hernandez's presentation, Why teaching someone else is the best way to learn.
    • And a meaningful way to both lift others up and contribute to the community
  • The ROI of a Dedicated Community Person - Anne Stefanyk
    • Having a dedicated community person makes it easier for the rest of the team to make impactful contributions with the "short" amount of time they get on a weekly basis to contribute to OSS. The community person on the team can help the rest of the team keep up with what's going on, and allows them to jump in and contribute without having to spend a ton of time figuring out what's happened since they were last there.
  • Be a good boss: How to support your marginalized colleagues - Tara King
    • Cute animal pics for a hard topic.
    • This topic has real, material impact on people's lives; it's not about looking good.
    • Being kind isn't good enough. What does "kind" mean for different people?
    • Actions to take:
      • Reflect on yourself
      • Slow down (speed and spontaneity are not inclusive) - "Go fast and break things might work for software, but it's never good for people."
      • Make space (for processing experiences)
      • Ask (don't assume, just ask)
      • Practice (being uncomfortable)
      • Culture add, not culture fit
      • Educate yourself (use the Googles)
      • Increase pay equity and transparency

Performance & Scaling Summit

  • Mike Herchel - front-end performance audit
    • WebPageTest is a useful (low overhead) tool to get started testing your site.
    • Lighthouse is another useful tool for profiling your site.
    • Continuously profiling your site, and being aware of where the slow parts are is 80% of the battle.
  • Janna Malikova - load testing your Drupal website before it's too late
  • Shane Thomas - Decoupled Performance
    • Decrease your JS bundle size
    • Pay attention to your images
    • Make use of browser cache (progressive web app)
    • JSON:API Boost module
    • Optimize your API
      • Optimize GET requests using includes
      • Subrequests module
      • JSON:API Extras module
      • Filters to explicitly filter out content
  • Michael Schmid - How to survive COVID-19 as a hosting provider

Let day 3 begin!

Day 2 was great and we're excited to dig into day 3, which is the last day of presentations. Don't forget that Friday is the community contribution day and everyone can participate. You don't need to be registered for DrupalCon to take part. So sign up for the DrupalCon Global Contributions Room to get involved and hang out with lots of great people.

Jul 15 2020
Jul 15

DrupalCon Global kicked off this week and the Drupalize.Me team was there to participate. The originally scheduled DrupalCon North America in Minneapolis was canceled earlier this year due to COVID-19, and the great team at the Drupal Association worked magic to create a new DrupalCon experience with DrupalCon Global, the first online DrupalCon.

Almost everyone on our team has attended in-person 'cons in the past, so it was interesting to feel the differences between in-person and online. Philippa has never gone to a 'con before, so it was awesome that she could experience DrupalCon for the first time. We've gathered all of our notes from day one, along with links to videos of sessions to check out if you are registered. (Note that the session videos will be available on YouTube for the entire community in September. In the meantime, you must be a registered attendee to access the recorded sessions on Vimeo.) We'll keep a daily commentary going all week.

The Hopin.to platform

DrupalCon is using Hopin.to to manage the conference. The live sessions are delivered here, along with the ability to chat with other attendees in chat rooms, as well as randomly meet others one-on-one through a networking feature. The exhibit hall is also managed here, and you can "visit" an exhibitor's booth either by expressing interest so someone contacts you directly or by entering their chat room to talk with them and other visitors.

We found that the platform worked pretty well and it was fairly clear where things were. Besides being online, of course, the big difference for sessions is that there's a chat room for each one, so attendees can talk and ask questions during the presentation. This ended up feeling like both a plus and a minus for many of us. On the one hand, it was neat to see people engaging at the sessions and be able to "talk" while the session was in progress. This can really add to the session, with interesting comments and conversations, and people helping each other out with questions. On the other hand, seeing the back-channel chatter can be distracting or overwhelming, especially when there are large groups of attendees. You can close the chat window while you watch, though, so if you find it annoying you can toggle that closed. There is a small button at the top of the chat window, between the timeline and the attendee count.

Hopin minimize chat window button

Joe sums up what many of us are feeling with the overall online experience so far.

"I'm quickly realizing how much I rely on the serendipity of in-person events to get to me to interact with others in the community, which is my favorite part of DrupalCon. But it's all primarily chat based, and a bit more opt-in than when someone walks up to you in the hall and says, "Hi Joe!". Being an active participant and not just a lurker is likely to be the hardest part of hopin for me."

Inclusion

One of the great things about an online conference is that it is much more accessible to many more people. It's also heartening to see the Drupal community highlighting diversity and inclusion. There is always so much work to do, and we were happy to see time scheduled in the event for 8 minutes and 46 seconds of silence observed in honor of George Floyd and the Black Lives Matter movement. We hope this conversation continues and leads to lasting change in our community.

Another aspect of inclusion for online communities is the issue of time zones. Addi lives in Denmark, while the rest of our team live in several time zones in the United States. The main event is largely scheduled on American times, which meant that Addi missed a fair number of live sessions and the first trivia night because they fell too late at night. That said, it's great to see that there are multiple trivia nights this year, on different days at different times so that everyone can participate. On the flip side, some of the initial sessions of the days are scheduled quite early in the morning for our folks who live on the U.S. west coast. There's really no way around time zones and live events, and it'll be interesting to see if there are more ideas that spring up after this 'con about how to create a more "live" experience for people who are not in the main targeted time zones.

Finding and scheduling sessions

You can find the schedule for all sessions both on Drupal.org and in the Hopin Reception area. Here is the schedule for Day 1. Don't forget that you can make your own schedule to follow by using the "My schedule" feature on Drupal.org or you can add sessions directly to your own calendar in Hopin. This can be a great way to cut down on the overwhelming list of sessions and it can help you figure out when you can watch live in your home schedule versus needing to go back to watch videos later once they get uploaded.

Session notes

Here are some somewhat random notes from sessions that we attended on Tuesday. We've also provided links to both the session page and the on-demand library video where possible (and we'll add the video links as they are put online).

Intermission

There was a pasta intermission! Before the sessions started, Vincenzo De Naro Papa from Amazee.io showed us the authentic Italian way of cooking pasta. It was a fun way to make things feel homey and relaxed before things got kicked off, though a fair number of people were probably confused about whether they were at the right conference when they first entered the main stage area. :-)

How to improve psychological safety in your organization

  • (Addi) This session covered a lot of things that we do well, and some things that apply more to larger organizations than ours. It also introduced some new ideas for me to look into within Osio Labs: personal user manuals and documenting the Andon cord process (how do you stop work to address problems?).

Driesnote (video - Dries' presentation begins at 15:48)

  • Details about Drupal 10: shorter release cycle (targeting June 2022), limited to ~5 official initiatives with a focus on improving the beginner/new user experience.
    • JS Menu component
    • Automated updates
    • New front-end theme (Olivero)
    • Easy out-of-the-box (complete Claro admin theme, Layout builder, and Media)
    • Drupal 10 readiness (dependency end-of-life timelines, e.g. Symfony 4)
  • Q&A session will happen on Wednesday instead of happening right after the talk.
  • (Joe) It's nice to see the hat tip to diversity and inclusion, and hopefully that continues to expand within the community and continues to result in real change. This year's speaker lineup is the most diverse, and it's great to see that.
  • (Addi) I really like the emphasis on DEI and that people have generally been very supportive of this stance in the chatrooms. Tim Lehnen's talk at the end of Dries' presentation was good to see.

Community & Belonging in the Time of Coronavirus

  • (Joe) Adjusting to remote work, remote community, changes how we engage and how we gain a sense of "belonging". For many people remote work/working from home is becoming the new normal, not just a temporary thing.

Drupalcon on the front lines of covid-19

  • (Philippa) Four panelists, working for widely different organizations, spoke about how, in recent months, they've used Drupal to help communities, governments, and organizations fight the effects of Covid-19. For example, Christina Costello, Web Developer, Redfin Solutions, LLC, spoke about her agency's work with the Rural Aspirations Project. They helped set up a Drupal 8 website for them using the Claro theme for admins, which delivered good content moderation functionality and gave them something they needed very quickly amidst the statewide shutdown of schools: a CMS offering a quick and efficient approval process for content editors. The result was that in a very short time, they were able to help connect rural communities in Maine with good, creative online learning opportunities for kids. For instance, the site connected a Portland, Maine-based comic strip artist with kids in rural areas, who learned how to draw what they were experiencing in quarantine. Cool!
  • (Amber) Interesting panel of speakers who talked about both client solutions and workplace issues they needed to solve at the onset of the crisis. Interesting mix of Drupal modules and solutions that helped us and our clients quickly deliver new types of content as well as remote workplace adjustments.

Decoupled Drupal in real life: JSON:API at scale

  • (Amber) Lessons learned, caching, and other strategies for scaling decoupled or progressively decoupled Drupal sites. Lots of good real-world lessons. Goes way beyond a basic demo of "turn on JSON:API".

Big systems, big problems: Lessons from 40 design systems

  • (Amber) Learned some helpful strategies for redesigning a site, including sketching out user flows. Also great information about setting content component priorities.

JavaScript is coming to eat you

  • (Ashley) This session took a look at web development and the way that we build websites today and how the landscape is shifting. Traditional or "monolithic" CMS platforms like Drupal and WordPress are being used less in favor of a more component-based model. Despite the shifting landscape, a CMS like Drupal can still play a critical role, but may become more of a secondary concern.

Social events

Trivia

  • (Joe) Amber, Philippa and I had a team, and tried to have both a Google Meet and the Hopin session open at the same time, which made it feel like you were in a loud bar with everyone talking over everyone else. It was kind of fun, and Fatima did a great job of being a lively host. But it's just not the same as trivia in a pub. I'm glad that effort was put into keeping this tradition alive in some form though.
  • (Amber) It was fun to do a "happy hour" with co-workers during trivia. Definitely not the same as the in-person experience, but still fun in its own way.

What's next?

Well, that's our quick summary of day one as we gear up for day two. We'll be back tomorrow with our day two summary. Are you attending DrupalCon Global? What are your thoughts, tips, and highlights so far?

Jul 07 2020
Jul 07

DrupalCon Global 2020 will feature presenters from around the world on a virtual platform called Hopin (pronounced "Hop in!"). Drupalize.Me trainers Joe Shindelar and Amber Matz will be presenting sessions and participating in the conference. Both Amber and Joe's sessions will be on Thursday. (Times listed are in UTC. Convert to your time zone with this tool.)

I (Amber) will co-present with Gábor Hojtsy Deep dive: state of Drupal 9 on Thursday, July 16, 2020 at 18:15 UTC. We'll dive into details not covered by the Driesnote. You'll learn how new features get into Drupal and how old APIs and libraries get updated in Drupal's release cycle. By the end of the session, you'll better understand what's involved with upgrading to Drupal 9. (And how it's probably not as bad as you might think!)

Joe will present Altering, extending, and enhancing Drupal also on Thursday at 21:15 UTC. There are various ways to extend Drupal without "hacking core" and in this session, you'll get a great overview of what those options are and how to decide which method to use. By the end of the session, you should have a more complete understanding of what the use-cases are for plugins, hooks, services, and events and how (at a high-level) they are implemented.

Osio Labs' (the company that makes Drupalize.Me) sister company Lullabot also has a strong group representing at DrupalCon Global. Check out Lullabots Speaking at DrupalCon Global 2020 to learn more.

Finally, you might be wondering how contribution will work at DrupalCon this year. Contribution groups are being organized at the virtual DrupalCon Global Contributions Room. Browse and join groups or create your own if you'd like to coordinate a sprint for your own Drupal community project. If you'd like to help out with Help Topics (programmers and writers/editors needed), join the Help Topics group. (I am co-maintainer of the core experimental module Help Topics.)

To learn more about the DrupalCon Global platform and attendee experience, we recommend the DrupalCon Global 2020: Attendee Experience Preview.

Register for DrupalCon Global 2020

Jun 24 2020
Jun 24

Layout Builder

Layout Builder, and the related ecosystem of modules, provides a set of powerful tools that allow content creators and site administrators to modify the layout of a page using a drag-and-drop interface. We've published 11 new tutorials to help you Learn Drupal's Layout Builder and create flexible layouts for your Drupal site.

Learn Drupal's Layout Builder

We're working on more tutorials on Layout Builder, as well as new tutorials on managing media in Drupal, and videos to accompany tutorials in the Views: Create Lists with Drupal series.

Happy layout building!

P.S. Drupal 9 has launched! Learn more about the latest major release of Drupal and what it means for tutorial compatibility and your learning journey in our Guide to Drupal 9 video and resources page.

Jun 09 2020
Jun 09

Drupal 9.0 logo

On June 3, 2020, Drupal 9.0.0 was released! This is a major version update for Drupal, but the most straightforward update in Drupal's history.

As major version updates in the past have been quite disruptive in bringing new features and APIs, you might be wondering how this update impacts your site and your Drupal learning journey. Will Drupalize.Me Drupal 8 tutorials apply to Drupal 9 sites? Will you have to learn a totally new system with Drupal 9?

Thankfully, there is good news about both those questions. The short answer to the first question is "Yes!" -- the vast majority of our Drupal 8 tutorials will apply to Drupal 9 sites. For the second question, the answer is "No!", you won't have to learn a new system. The exceptions to the question of tutorial compatibility are tutorials that feature APIs removed in Drupal 9, such as SimpleTest for Automated Testing (which we have noted). Also, some contributed modules have updates that we are currently reviewing. They appear in our Search API and Solr tutorial series.

We've put together some resources to get you up to speed with Drupal 9, starting with our Guide to Drupal 9.

What's the deal with Drupal 9?

In this short video, we explain how our Drupal 8 tutorials are compatible with Drupal 9 sites because of the way that Drupal 9 was built inside of Drupal 8.

[embedded content]

Upgrade to Drupal 9

While there's no one-size-fits-all process for upgrading to Drupal 9, by the end of this tutorial you should be able to explain the major differences between Drupal 8 and 9, audit your existing Drupal 8 projects for Drupal 9 readiness, estimate the level of effort involved, and start the process of upgrading.

Learn about a key concept in understanding the difference between Drupal 8 and Drupal 9: deprecated code:


Tools for checking Drupal 9 readiness

On the blog, we've posted a couple of tutorials to help you check your site for Drupal 9 readiness.


Upgrade status generates a report that serves as a sort of checklist to help you determine whether or not your site is ready to be upgraded to Drupal 9. It packs a lot of useful information into a single report. It's worth taking the time to install it on a copy of your Drupal 8 site and seeing what it has to say.


Drupal check, and Drupal rector, are two useful command line tools you can use to help jump start the process of updating your Drupal 8 code to ensure it's compatible with Drupal 9. This post includes some notes about the process I went through while testing them out on some of the Drupalize.Me code base.


Community resources

There are a number of great Drupal 9 resources from the Drupal community at large. Here are a few you might want to check out.

How Drupal 9 is made and what is included (Drupal.org) -- This guide includes documentation about the code deprecation process, 3rd-party library changes, module removals, environment requirements, and other important information about Drupal 9 and its future development.

Drupal 9.0.0 released (dri.es) -- From the blog of Drupal project lead and founder, Dries Buytaert

A new Drupal 9 landing page on drupal.org (Drupal.org) -- Drupal.org has launched a shiny new landing page introducing Drupal 9.

Thank you

A heartfelt thank you to all the contributors who made Drupal 9 happen! And a thank you to the Drupal Association for supporting community infrastructure and events that keep Drupal moving forward.

May 11 2020
May 11

Update: To be clear, you do not need to sign up for an account for the free materials. The tutorials and exercises will be free to everyone starting on Monday, lasting through Friday.

May’s DrupalCon Minneapolis has morphed into July’s DrupalCon Global! Hurrah! Heather Rocker, Executive Director of the Drupal Association, recently used two perfect words to describe the Drupal community: flexible and adaptable. Extraordinary times require nothing less and with the Drupal Association’s adaptive and flexible response to the pandemic, it might just mean that more of the world’s 3 million Drupal users get access to the ideas, celebration, and kinship that DrupalCon embodies.

My teammates Joe, Amber, and Ashley were set to give a beginner-level theming workshop at DrupalCon Minneapolis in May. It’s a course they’ve worked hard to craft as an excellent resource for learning how to make custom Drupal themes, and they’ve taught it in person at many workshops over the years. Sadly, this year they won’t be able to do this in Minneapolis, but we do have the workshop material online in the form of our Hands-On Theming course. To make this even more accessible and to mark the original DrupalCon Minneapolis week, during May 18-22, we’ll make our entire Hands-On Theming course free for anyone wanting to learn.

#DrupalCares logo
We encourage you to make a donation of any size to the Drupal Association in lieu of payment to us, if that is something you can do. Small but mighty, the association supports Drupal.org projects and has the massive task of scheduling (and rescheduling) DrupalCon.

Come back next week to learn Drupal theming for free, support the Drupal Association if you can, and have a great “DrupalCon” week.

May 04 2020
May 04

I recently wrote about using drupal-check and drupal-rector to assist in upgrading your Drupal 8 code so that it's ready for Drupal 9. And in the comments for that post Gabor mentioned that I should check out the Upgrade Status module. So I spent a little bit of time playing around with it and here are my notes so far.

It's AWESOME, and you should try it. Thank you to everyone that's been working on it.

TL;DR: Upgrade Status generates a report that serves as a sort of checklist to help you determine whether or not your site is ready to be upgraded to Drupal 9. It packs a lot of useful information into a single report. And it's worth taking the time to install it on a copy of your Drupal 8 site and seeing what it has to say.

Behind the scenes Upgrade Status performs the following checks, and probably more:

  • Are you using the latest version of the module?
  • PHP code deprecation via drupal-check
  • Check Twig templates for calls to deprecated functions
  • Check {MODULE}.libraries.yml files for deprecated dependencies
  • Analyze hook_theme implementations
  • Verify composer.json and {MODULE}.info.yml files are Drupal 9 compatible
  • Does the Drupal.org project provide a Drupal 9 upgrade roadmap?
  • Probably more...

Then it pulls this all into a nice visual report like the example below, complete with links to both issues and documentation to help you best resolve the issues it finds.

Screenshot of upgrade status report described in detail in this post.

Hosting environment report

The report starts by listing hosting environment compatibility. It's worth noting that you'll probably want to install the module and generate these reports on a non-production environment, and that these indicators are for the current environment. But it does give you an idea of what to look for on your production environment. Namely, are PHP, MariaDB (MySQL), and Apache up-to-date enough to be compatible?

Current requirements (in addition to what's already required for D8) are listed here: Environment requirements of Drupal 9 (drupal.org).

Custom code report

Then there's a section that reports on your custom code. Anything that's in this list is your responsibility to upgrade. So if you're looking for next steps, this is a good place to start. If you click the "X errors, Y warnings" link for any project it'll open a modal window with a detailed report of what needs to be done, and in most cases link you to the relevant change record or documentation page.

Screenshot of detailed report for Drip module showing a single warning about missing code and a link pointing to where you can find more information.

Contributed projects report

This is followed by a report on all the contributed projects you have enabled. Note that Upgrade Status only scans enabled projects. This list provides the same link to see a detailed report as above, and a bunch of additional information as well. It'll indicate things like:

  • The Available update column will show either Up-to-date or a version number. If there's a version number, this means you're currently using an out-of-date version of the project and should upgrade. Note that if this column lists N/A for everything, make sure you've got the Update Manager enabled and that it's run a status check recently. If not, you can trigger one manually at Reports > Update status.
  • "Consumers >= 8.x-1.11 is compatible with Drupal 9.": There's a newer version of the module than the one I have installed and it is Drupal 9 ready. Learn how to update a module (or theme)
  • Some modules, like Recurly in the example above, provide detailed Drupal 9 roadmaps (Note: maintainers, you can add this via your project page!) that help indicate where things are at.

Try Upgrade Status on your site

Install it (I recommend you do so on a development environment) using Composer:

composer require 'drupal/upgrade_status:^2.0'
drush en upgrade_status -y

The project page mentions the optional Composer deploy, and Git deploy, modules. If you've got projects installed via Composer where the required version is a -dev version (e.g. "drupal/migrate_source_directory": "1.x-dev"), or projects that have been added by cloning their Git repository, then you can install the respective optional helper module to ensure Upgrade Status can find the information it needs for those projects.

The reason you need these is that modules installed via Composer (in some cases) or via Git don't contain the version key in their *.info.yml files that gets added automatically by Drupal.org's packaging. Upgrade Status uses this key to determine what version of the module you're currently using.
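For reference, here's roughly what that packaging-added block looks like at the bottom of a module's *.info.yml file (the values shown are illustrative, not from a specific release):

# Information added by Drupal.org packaging script
version: '8.x-1.11'
project: 'consumers'
datestamp: 1588610105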

Anyway, if you use Composer to manage your project, adding Composer Deploy won't make things any worse and may make them better:

composer require drupal/composer_deploy
drush en composer_deploy -y

Note: After installing, have Drupal run its Update status checks again (Reports > Update status).

Scan your projects

You need to tell Upgrade Status to scan your project(s) before it can provide detailed reports. This can be done either by navigating to the report page at Reports > Upgrade status, then checking the box for each project you want to scan and pressing Scan selected at the bottom of the page.

Or use one of the included Drush commands. Because the module caches the results of each scan you can use Drush to run a scan and still view the detailed report via the module's UI.
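For example, a Drush-based scan might look something like this (the exact command name and options can vary between Upgrade Status releases, so check drush list for what your version provides):

# Scan specific projects; results are cached and viewable in the UI.
drush upgrade_status:analyze group simple_oauth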

What next?

I think the biggest win for me using Upgrade Status is that it helps illustrate where I should be spending my time. Once my custom code is ready to go I can pretty quickly see the state of the different contributed projects I'm using, and which ones are most likely to benefit from my helping out.

In my case the Group module is super important to what I'm doing, so I'll start by seeing what, if anything, I can do to contribute to getting that module Drupal 9 ready first. Simple OAuth is also very important, so I should update to the latest version of that and run these checks again.

I also look at this list and think, I'm not using Configuration Installer, so I should remove that. And, in my use case I could switch to using the Drupal core Media Library instead of Entity Browser and Inline Entity Form which would remove those (plus their dependencies like dropzonejs and Entity Embed) from the list of things I need to worry about.

Here's some questions to ask yourself when reviewing the report provided by Upgrade status:

  • Do I know what the minimum hosting requirements are, and what do I need to do (if anything) to prepare my hosting environment?
  • Are there modules in this list that we're just not using at all and can be disabled and removed?
  • Are there modules that we might want to consider replacing with a new approach, or with a different project that is Drupal 9 compatible? Also consider new best practices. For example, Media Library can replace Entity Browser for many use cases, now that it's stable.
  • Which of these modules are the highest impact for my project? And what can I do to help move those ones closer to Drupal 9 readiness?
  • If I can't do this work myself who do I need to start working with to coordinate?

We've got a tutorial, Prepare for a Drupal-to-Drupal Migration, that has lots of ideas about how to start the process of preparing for a Drupal 7 to Drupal 8 migration that are relevant here. One of the big ones is: start a spreadsheet and start collaborating with your team on tracking the status of things.

Example:

Google sheet showing a list of Drupal 7 modules and notes about their Drupal 8 readiness.

Also, check out the Upgrade Rector project which aims to provide a UI for the drupal-rector toolset.

Apr 20 2020
Apr 20

Blake and I recently started the process of planning, and preparing, to migrate Drupalize.Me to Drupal 9. The primary component of our evolving infrastructure is a Drupal 8 site. With Drupal 9 right around the corner I wanted to know what it would take for us to upgrade that application.

Part of figuring that out is assessing all of our custom module and theme code to make sure it's compatible. Which mostly means making sure we're not using any deprecated APIs. Drupal 9's API will be the same as Drupal 8.9's API with the deprecated code removed. If you remove any use of deprecated features, and your module still works on Drupal 8, it'll work on Drupal 9, too.

Image showing Drupal 8.9 API represented as a stack of blocks with some marked as deprecated alongside Drupal 9 API with the deprecated blocks removed.

Slide from State of Drupal 9 by Gábor Hojtsy.

To start figuring out what we need to change, and what it's going to take, I've been playing around with two helpful tools that I've been hearing a lot about lately:

  • Matt Glaman's Drupal Check (drupal-check)
  • and, Ofer Shaal's Drupal Rector (drupal-rector)

What is Drupal Check?

Drupal Check can scan for use of @deprecated code and suggest a fix, as well as optionally perform static analysis. This makes it easier to locate the parts of your codebase that will need to be updated in order to ensure Drupal 9 compatibility.

Example deprecated code output

------ -------------------------------------------------------
Line   lehub.module
------ -------------------------------------------------------
403    Call to deprecated method l() of class Drupal:
        in drupal:8.0.0 and is removed from drupal:9.0.0. Use
        \Drupal\Core\Link::fromTextAndUrl() instead.
------ -------------------------------------------------------
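To make that concrete, here's roughly what acting on that suggestion looks like (a sketch based on the message above, not our actual module code):

// Before: \Drupal::l() was deprecated in drupal:8.0.0 and removed
// in drupal:9.0.0.
$link = \Drupal::l($text, $url);

// After: build the link with the Link class instead.
$link = \Drupal\Core\Link::fromTextAndUrl($text, $url)->toString();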

Install drupal-check

Install drupal-check in an existing project as a development dependency using Composer:

composer require mglaman/drupal-check --dev

Note: Drupal Check needs to be run from the root directory of a Drupal project in order to work.

Verify the install

Use this command to verify the install worked and view available options:

./../bin/drupal-check --help

Run Drupal Check

Here's an example of running drupal-check against a specific module (lehub is our custom module):

# The path to the executable may differ depending on your Composer configuration.
# Another common location is vendor/bin/drupal-check.
./../bin/drupal-check -ad modules/custom/lehub/

What is Drupal Rector?

About a month ago I posted on Twitter asking what people were interested in learning about in preparation for Drupal 9.

What do you want @drupalizeme to cover over the next few months that'll help you prepare for Drupal 9? pic.twitter.com/UQK9pa2Dvh

— Joe Shindelar (@eojthebrave) March 12, 2020

And one of the things I learned about was the Drupal Rector project from Ofer Shaal (@shaal), and others. So I decided to give it a try. You can find the code here https://github.com/palantirnet/drupal-rector.

Rector is a PHP project that, given another PHP codebase, can apply a set of rules to refactor the code and output an updated codebase. A simple example would be renaming a class, then finding all the places in the codebase where the class is used and updating those as well. A more complex example would be automating the update from PHP 5.3 to PHP 7.3.

Drupal Rector can also automate the removal of deprecated code and perform other refactoring required to make a Drupal 8 module ready for use with Drupal 9. It's a wrapper around the Rector project that provides rules covering the most commonly-used deprecated code in Drupal. It doesn't cover everything yet, but it does cover the most common examples of deprecated code use, and in many cases that might be all you need.

Either way, it's a great way to get a jump-start on ensuring your custom Drupal modules are Drupal 9 ready.

Note: Using drupal-rector can cause changes to your codebase. I recommend you do this in a Git branch so you can easily view a diff of the changes, and revert anything that doesn't work.

And ... Running drupal-rector requires a minimum of PHP version 7.2. I initially tried using PHP 7.1 and got some cryptic errors related to PHP autoloading. Switching to a newer version solved the problem.

Install drupal-rector using Composer

composer require --dev palantirnet/drupal-rector

Configure rector

Create a rector.yml configuration file. I created this in the root directory of my Drupal project, alongside the vendor/ and web/ directories. My codebase looks like this:

+-- README.md
+-- bin
+-- composer.json
+-- composer.lock
+-- ...
+-- rector.yml
+-- ...
+-- vendor
+-- web

Here's an example rector.yml file. You'll want to adjust the path to the drupal-8-all-deprecations.yml, and Drupal code directories like web/core to make sure they match your project's layout. These paths should be relative to the rector.yml file.

imports:
    - { resource: "vendor/palantirnet/drupal-rector/config/drupal-8/drupal-8-all-deprecations.yml" }

parameters:
    autoload_paths:
    - 'web/core'
    - 'web/core/modules'
    - 'web/modules'
    - 'web/profiles'

    file_extensions:
    - module
    - theme
    - install
    - profile
    - inc
    - engine

services: ~

This tells the rector PHP application to use the rules contained in the drupal-rector project, where to look for your project's code for referencing, and what file extensions (in addition to .php) it should operate on.

Execute drupal-rector

Now you can execute drupal-rector with:

./bin/rector process web/modules/custom/{MODULE} --dry-run

And it'll output examples of all the changes that would be made if you ran it without the --dry-run flag.

2) web/modules/custom/lehub/src/LehubConfigOverrider.php

    ---------- begin diff ----------
--- Original
+++ New
@@ -71,7 +71,7 @@
        // If they exist, load any overrides defined by that consumer.
        // But... we can't use the entityTypeManager here, or we'll have
        // an infinite loop.
-      $consumer_data = db_select('consumer_field_data', 'cfd')
+      $consumer_data = \Drupal::database()->select('consumer_field_data', 'cfd')
            ->fields('cfd')
            ->condition('id', $this->consumer, '=')
            ->execute()
    ----------- end diff -----------

Applied rules:

    * DrupalRector\Rector\Deprecation\DBSelectRector

If you're happy with what you see, go ahead and remove the --dry-run flag and let 'er rip.

What about a UI?

Prefer a UI solution instead of the CLI? Gabor has a great video showing how to use the Upgrade Rector module to produce a diff via Drupal's UI.

The right fix might be a little harder

One thing I noticed when running drupal-rector is that in some cases, while the suggested fix would technically work, it's maybe not the most correct approach. The resulting code will work, and in many cases working is good enough. However, this might also be a good time to revisit this custom code and verify that in addition to removing use of deprecated features we're also using current best practices.

Take the above example, for instance, which suggests replacing calls to db_select() with \Drupal::database()->select(). This doesn't account for the fact that this code exists as part of a custom service definition. The more technically-correct approach in this case would be to pass the database service to the LehubConfigOverrider service via an argument in lehub.services.yml.

In the little that I've used drupal-rector thus far the place where I've seen this opportunity for refactoring come up the most is when replacing calls to deprecated functions like db_select() or drupal_set_message() with calls to \Drupal::* convenience methods in scenarios where dependency injection should be used instead.
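For example, a dependency-injected version of the snippet from the diff above might look something like this (a minimal sketch, assuming the service only needs the database connection; the matching lehub.services.yml definition would pass '@database' as an argument, and the method name here is hypothetical):

use Drupal\Core\Database\Connection;

class LehubConfigOverrider {

  // The database connection, provided by the service container.
  protected $database;

  // Other properties (like $this->consumer) omitted for brevity.

  public function __construct(Connection $database) {
    $this->database = $database;
  }

  protected function loadConsumerOverrides() {
    // Same query as the diff above, but using the injected service
    // instead of the static \Drupal::database() helper.
    return $this->database->select('consumer_field_data', 'cfd')
      ->fields('cfd')
      ->condition('id', $this->consumer, '=')
      ->execute();
  }

}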

Conclusion

Both drupal-check, and drupal-rector, are useful tools to add to your toolkit. Especially when it comes to starting the work of ensuring your Drupal 8 code is compatible with Drupal 9. They can't do everything for you, but what you'll end up with is a comprehensive list of updates you'll need to make, and a sizeable head start in doing so.

Mar 12 2020
Mar 12

You might have noticed some changes on Drupalize.Me lately. We've just wrapped up a huge content archiving project and I'd like to share what we did and how this will help us move forward with a library of tutorials for Drupal learners that we're committed to keeping up-to-date.

Drupalize.Me has a long history of Drupal training that started with in-person workshops, DVDs, and even a multi-year conference (Do It With Drupal or DIWD) from Lullabot. Those DVDs on site building, module development, theming, jQuery, SEO, and more -- they were the start of the library of Drupal training videos on Drupalize.Me. And they've been on the site for a very long time.

During Drupal 7's life cycle (up to Drupal 8's release), we produced videos on core competencies such as module development, theming, and site building. We also covered a number of contributed modules including Views, Panels, and Webform.

When Drupal 8 was released in November of 2015, we were already daunted by the burden of our outdated content. Video-only tutorials made updates that much more cumbersome. We wanted a developer-friendly, code-copy-pasteable format as well as a feasible way to keep Drupal 8 content up-to-date with the new scheduled minor releases. So, we switched to a written-first format and augmented with video.

While this allowed us to move forward more rapidly with Drupal 8 tutorials and keep them updated with every minor release, we still had the baggage of the Drupal 6 and 7 (and other non-Drupal) video tutorials.

As the primary content manager for Drupalize.Me, I keenly felt the pain of trying to manage approximately 1900 published tutorial and video nodes. I felt that if we were going to effectively move forward with new content for Drupal 8 and 9, we needed to address the old content that was intermingling with the new, misleading learners and causing confusion. Frankly, it was overwhelming.

So what did we do? First, we inventoried our content. I was able to divide our material into manageable buckets by content (Drupal 6, Drupal 7, Drupal 8, and non-Drupal), and by format (written+video and video only). I then created a policy -- an outdated content flowchart -- that would help me decide what to do with different categories of outdated content. I presented my policy recommendation and flowchart to the team and got the "green light" to move forward with an audit focused on identifying outdated or misleading content.

One key takeaway during this point in the process was we decided to provide 2 levels of "archiving":

  1. Archive with a disclaimer and provide alternative resources if possible.
  2. Unpublish and redirect to a relevant page if possible.

I audited every single last one of our videos, tutorials, and collection pages and decided whether they should be archived, and at which level. In the process, I dug up alternative resources, updated pertinent topic pages, and basically went a little crazy with spreadsheets. I even tinkered a bit with Google Data Studio.

After our tech team implemented some new archiving and alternative resources fields on our content types, I got to work editing nodes and marking old content as archived, providing alternative resources where possible, and unpublishing the whole of our Drupal 6 and DIWD videos (except jQuery videos that also pertained to Drupal 7). It was amazingly tedious, but it's now done!

  • Drupal 6 content has been removed. Some of it was redirected to Drupal 7 counterparts.
  • Drupal 7 content has not been removed. We know there are still a lot of Drupal 7 users and site maintainers out there. This content has been marked as archived and you will see a banner across the top indicating so. Where possible, alternative resources are listed to point you to Drupal 8 material.
  • Non-Drupal content was archived or unpublished on a case-by-case basis. The bulk of it was marked as archived and remains on the site.
  • Drupal 8 content is here to stay for the time being. We will be forking our tutorial repository and maintaining Drupal 8 and Drupal 9 versions of our tutorials through Drupal 8's lifetime. Given how major releases now work in Drupal, these branches will be the same for a while and will diverge over time.

With content archiving complete, we hope this will provide clarity to our members about which content we are actively committed to keeping up-to-date and which content we consider archived and provided as-is. We also hope in many cases you will find more pertinent Drupal 8 content in the additional resources listed for much of our archived content.

So much for the past. What about the future of Drupalize.Me content? Here are a few of our content goals for this year.

  1. Our #1 priority is to update our Drupal 8 content with each minor version release. We are currently up-to-date with 8.8.0.
  2. Currently undergoing final stage peer review: a revamp of our popular React and Drupal tutorial series!
  3. Add videos to more of our Drupal 8 written tutorials. We will be starting on this immediately, creating videos for both our Content Moderation and Views series.
  4. Review and update Drupal 8 series, including contributing updates to the Drupal 8 User Guide community project, of which we host a fork.
  5. Create new tutorials for Layout Builder, now in core.
  6. Create new tutorials for Media, now in core.

We're excited to move forward with new videos and written tutorials on Drupal 8 (and 9). We'll be focusing the blog on #d9readiness posts in anticipation of both Drupal 9's release sometime in 2020 inspired by this State of Drupal 9 presentation (check out the open sourced slide deck). Sign up for our newsletter (see link in the footer) to get an email when the blog is updated.

Jan 15 2020
Jan 15

The beginning of a new year always seems like a good time for reflection. As we started to do that over the past couple of weeks, in an effort to wrap up some end-of-the-year goals and finalize plans for how we wanted to start 2020, we realized that it's been quite a while since we've written about projects we're working on for the site. With Drupal 9's release planned for June of 2020, now is a great time to do this reflecting and planning work.

While much of my time in the past year has been focused around our new Osio Labs products, we've also been beginning to plan for an upgrade and migration to Drupal 8/9. This seems like a good opportunity to share a bit about how our infrastructure is evolving. As we start to plan for a future upgrade, it's important to take an inventory to help give us a sense of how the project is likely to go.

When I first started working on the Drupalize.Me site, shortly after it originally launched, our tools and processes looked quite a bit different than they do now. Back then we had a site built on Drupal 6, we used a third-party provider to host our videos, and all of our content was video. We were using SVN for version control, and Unfuddle for our project planning. Needless to say, things have changed quite a bit over the years.

Now, we've moved our writing process over to GitHub. This allows us to use issues, projects and milestones, and pull requests to organize the work of content development outside of Drupal. Joe has written about some of the advantages this gives us when it comes to editing and linting tools. Additionally, it's much easier for us to on-board new writers and editors with something like GitHub than it would be if we had built these tools on top of our CMS. We're doing our content work this way on all of our products. When pull requests are merged, our content is imported (or updated) automatically. On the whole, this process has been a big win for our team. Once we have our content created via this import process we then go about adding metadata to the tutorials (video assets, publication dates, tags, etc).

Our Drupalize.Me infrastructure is somewhat of a monolithic site, aside from the tutorial production process. A single codebase handles everything from user registration and billing to content delivery and your personal content queue. For our new products we're trying something a bit different.

Looking at our new products from the backend, there are quite a few similarities. We're still using GitHub to help with our content creation process and importing our content into Drupal. Drupal also still helps us out with user management, and billing (thanks to the Recurly module). That is where the similarities end.

Both GatsbyGuides and HeyNode are using the same Drupal-based backend site. When you visit the site as a member you're interacting with sites built on top of GatsbyJS. This decoupling allows us to reuse our work across our sites, while still allowing them to evolve independently. One of the big advantages we've seen using Gatsby so far is that the requirements to host the sites are drastically simplified. Another nice side effect of this decoupling is that these front-end sites are now solely responsible for displaying content. This has made it easier for us to write comprehensive tests, and deploy new code faster.

As we start the process of planning a migration to Drupal 8 (and eventually 9), we have the chance to compare and contrast the monolithic and decoupled approaches. While the planning work is still in its early stages, the flexibility we've gained by decoupling parts of our site has given us some unique opportunities. Moving forward with our migration work, I'm hoping to continue to provide a bit of a peek behind the curtain into some of the decisions we're making. In my experience real world projects like these provide the best learning opportunities, so if any of you are working on a similar project I'd love to hear about it.

Oct 30 2019
Oct 30

We recently started using Vale to help automate the tedious task of enforcing our style guide. Doing so has helped make reviews faster, and reduced any hard feelings between us. Emotions can run high when you feel someone is being overly scrupulous in their review of something you've worked really hard to create.

Everything gets reviewed

Every content item we publish goes through a rigorous review process to ensure we're always putting our best foot forward. This review consists of a number of different steps:

  • Technical review: Is the content technically correct? Do all the code samples work?
  • Copy editing: Does it meet our style guide? Does it use Chicago Manual of Style formatting guidelines? Does it use proper grammar, spelling, etc.?
  • Check for broken links and images
  • Apply consistent Markdown formatting

Some of these things are objective. For example we always use Drupal, never drupal. We always use italics for filenames and paths. And we always format lists in Markdown using a - followed by a single space, never a *. These are things that are simply not up for debate. You either did it right or you didn't. Most tutorials have at least a handful of these fixes that need to be made.

Other style guidelines are more subjective. For example, we try to not use passive voice, but there are exceptions. A technical review might point out multiple ways of accomplishing the same task, and we'll generally only cover one. Avoid cliches. Don't use superlatives and hyperbole. A single tutorial usually has 10+ of these suggestions. These are by far the more important things to focus on in the review as they can have a real impact on the usefulness of the content.

No one wants to be the jerk who points out dozens of formatting errors. And no one enjoys having their work nit-picked by their peers.

We've been talking for a long time about the utility of a tool to help with automating some of the steps in the review process -- specifically, the objective ones. Similarly, Drupal developers use PHPCS to ensure their PHP code follows the Drupal coding standards, and JavaScript developers use Prettier to ensure consistent formatting.

Without a tool, we spend a lot of time in the review process commenting on, and fixing, non-substantive things. That's a distraction from the more important work of providing a critique of the content itself.

Let the robots do the nit-picking

Amber recently introduced me to Vale, a tool she learned about while attending the Write the Docs conference in Portland. We've since introduced it into our review workflow, and are loving it, along with remark for linting Markdown formatting.

Side note: Check out this lightning talk from the conference. It's not Vale, but gives a great overview of the types of things we're doing.

While there are numerous other tools we evaluated, in the end we chose Vale. We've found that it's easier for non-technical users to configure, and it allows us to differentiate between objective and subjective suggestions through the use of different error levels.

YAML configuration files

When using Vale you implement your styles as YAML files.

Example:

extends: substitution
message: Use '%s' instead of '%s'
level: warning
ignorecase: false
# Maps tokens in form of bad: good
swap:
  "contrib": "contributed"
  "D6": "Drupal 6"
  "D7": "Drupal 7"
  "D8": "Drupal 8"
  "D9": "Drupal 9"
  "[Dd]rupalize.me": "Drupalize.Me"
  "Drupal to Drupal migration": "Drupal-to-Drupal migration"
  "drush": "Drush"
  "github": "GitHub"
  "in core": "in Drupal core"
  "internet": "Internet"
  "java[ -]?scripts?": JavaScript
...

The above configuration file provides a list of common typos and their corrections. Because this is a YAML file it's relatively easy for anyone to edit and add additional substitutions. For these suggestions we've set the error level to warning. When we run Vale we can tell it to skip warnings and only report errors.

In another example we've got a style that enforces use of the Chicago Manual of Style for determining how to capitalize a tutorial's title.

extends: capitalization
message: "Tutorial title '%s' should be in title case"
level: error
scope: heading.h1
style: Chicago
# $title, $sentence, $lower, $upper, or a pattern.
match: $title

This is configured as an error.

Running it locally

Everyone authoring or reviewing content can install Vale locally and run it with our specific styles. Doing so outputs a list of all the errors and warnings that Vale caught.
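
If you want to try something similar, a local setup looks roughly like this. The content/ path is a placeholder, and the install command assumes macOS with Homebrew; Vale also publishes prebuilt binaries for other platforms.

# Install Vale (macOS via Homebrew; see Vale's docs for other platforms)
brew install vale

# Report everything: errors, warnings, and suggestions
vale content/

# Report only the objective, must-fix problems
vale --minAlertLevel=error content/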

Example:

[Screenshot: output from running our review linting tool in a CLI, showing examples of various errors and warnings.]

As a content author, I love this because it helps me fix things before sending the content off for review. I don't have to worry about the disappointment of having someone send a tutorial back with endless nit-picks over my failure to remember every last detail of our style guide.

As a content reviewer, I get a good list of places to start looking for possible improvements, and I can feel confident spending my time on substantive review rather than hunting for incorrect uses of Javascript vs. JavaScript.

Automating it with Circle CI

[Screenshot: CircleCI status checks on a GitHub pull request.]

Once we got an initial set of styles in place, we were able to set up a CircleCI job that executes against each new pull request (the canonical version of all our content is stored in Git). The result is that at the bottom of every pull request you can see two checks: one for Vale rules, and one for Markdown formatting. If either detects an error, it's revealed quickly and can be fixed.

When we run Vale in CircleCI, we suppress all non-error suggestions, so it only marks a PR as failing if there's something objectively wrong. These failures are usually quick to fix.
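
The job itself is small. Below is a sketch of roughly what the relevant piece of a .circleci/config.yml could look like. The jdkato/vale Docker image is published by Vale's maintainer; the job name and the content/ path are placeholders rather than our exact configuration.

# .circleci/config.yml -- a hypothetical minimal prose-linting job
version: 2.1
jobs:
  vale:
    docker:
      - image: jdkato/vale:latest
    steps:
      - checkout
      # Fail the build only on rules configured at the "error" level
      - run: vale --minAlertLevel=error content/
workflows:
  lint:
    jobs:
      - vale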

Because we can switch a rule from warning to error just by editing the configuration file, we can trial new rules before enforcing them. We can also keep rules that are useful to us while reviewing but that don't need to block a piece of content from being published.

Recap

To ensure that content reviewers can spend their time focused on the substance of a tutorial and not on enforcing the style guide, we use Vale to help automate the process of content review. It's helped us have more meaningful conversations about the content, and it has reduced the animosity that can result from feeling like someone is being hypercritical of your work.

If you work with a style guide, I highly recommend checking out Vale as a tool to help enforce it.

Oct 25 2019
Oct 25

We're sad to miss DrupalCon Europe in Amsterdam next week (October 28-31, 2019). But which talks would we attend if we were going? Amber and I combed through the Interactive Program and created a list of what looks intriguing at the next DrupalCon. Will you be there? You might want to check out our picks.

I'm not going to be at DrupalCon Amsterdam. It's the first time I've missed a DrupalCon since 2011! And I'm bummed about missing the chance to catch up with friends, meet new people, and keep up with everything the Drupal community is doing. If, however, I were in Amsterdam, these are some of the sessions that would be on my calendar.

  • Panel: Tips for a successful and amazing conference, camp or meetup

    As a frequent volunteer in my local community, I'm always curious to hear how others organize their meetups and other events. Sessions like this are a great opportunity to pick up a bunch of new ideas to try at home.

  • Communication strategy to build digital experiences that connect

    I recently got to work on a project with Tracey and Jam and learned a ton about how to develop a communications strategy. While I understand the gist of it, I think this session would give me an even better vocabulary for explaining the why.

  • Forget YouTube! We take Drupal to the next level with a state-of-the-art enterprise video on demand platform!

    Drupalize.Me's video delivery is based on a lot of custom code, duct tape, bubble gum, and a little bit of wizardry. I'm always curious to hear about how others are solving some of the same problems and hopefully coming away with some new ideas to implement on our infrastructure.

  • BoF 04: But, hey, where do I start in Drupal Community?

    This one is a BoF, and while I'm not new to the Drupal community, I do have a lot of thoughts about how, where, and especially why getting involved is a good idea. I even gave a keynote on this at DrupalCon Vienna. Attending this BoF would be an opportunity to hopefully help others find a path into the community that works for them.

  • Security, Drupal 9, and Navigating the Changing Web Landscape

    I love learning about how Drupal is made, and the challenging problems that people much smarter than I are working on solving in order to make it better for all of us.

  • What's Next for the Layout Initiative

    Layout Builder and the Layout API are killer new features of Drupal 8. I'm both personally curious and somewhat professionally obligated to understand how they work and how people are using them (so that I can help create training materials).

  • Next steps on modernizing the Drupal theme system (Drupal 9 and beyond)

    After helping write the 50+ theme development tutorials on this site, and having taught an in-person Drupal 8 theming workshop close to a dozen times, I'm very familiar with the pain points of Drupal 8 theming. I'm curious to hear about how we can solve some of them.

  • Adoptable Goats Near Me: What I Googled the Year I Became a Developer

    I love hearing people's "And then I became a Drupal developer!" stories. They provide good perspective for me as a teacher, and as someone who believes that one of the most important things we can do as a community is find more ways to make sure others can get involved.

  • Combining DevOps and Emotional Intelligence

    I've known Kevin for a long time, and have watched him help to develop this company and product. Sometimes I choose sessions to go and support my friends, and hear their stories.

  • Composer and Drupal: Past, Present and Future

    This is another one where I mostly want to hear how people are solving problems we all know exist. Partly because I like hearing about complex problems, and partly to keep an eye on where things are going, because I know I'll end up helping update all our Composer content. (Amber's note: I attended this session at BADCamp a couple weeks ago and it was hugely informative! Definitely recommended.)

  • Designing the future of the Drupal Admin UI

    I'm mostly curious to see the progress that's being made on this initiative. I go to this session every time, love what I'm seeing, get inspired to contribute, do so for a couple of weeks, and then fizzle out until next time.

  • Judging a book by its cover - how inclusive is your community?, and How to be a good boss, tech lead, or project maintainer: Inclusive Leadership Edition

    As a local community organizer I want to make sure I'm doing what I can to build an inclusive and diverse environment.

Amber: A big plus one from me on Joe's picks, and here are a few more I would check out if I were there.

  • Autosave and Concurrent editing (conflict resolution) in Drupal 8

    Training around content editing can be tricky because each site has a different configuration and internal process for creating, editing, publishing, and archiving content. But there are some universally known problems with editing content in Drupal, and "losing changes before saving content" and "concurrent editing conflicts" are two of them. If you're in the frustration stage of these problems and are looking for potential solutions, check out this session, which introduces two modules that address them.

  • Configuration Management Initiative 2.0 updates

    Now that Configuration Management in Drupal 8 has been in use for a while, some limitations and challenges have emerged. In this session, you'll get an overview of these issues, how the Configuration Management Initiative 2.0 will seek to address them, and how you can structure your sites today for minimal disruption in the future. I'll definitely be checking out the recording on this one to make sure we're making the best recommendations possible in our tutorials on Configuration Management.

  • Initiative Leads Keynote

    Attend this keynote to get updates from initiative leads and learn how you can get involved with core contribution for these coordinated efforts. I'll be cheering from the internet sidelines for my fellow core contributors!

  • (Paid) Training: Drupal + Gatsby

    Our training friend Suzanne Dergacheva is offering a training on Drupal + Gatsby. If I could, I would totally register for this workshop. Suzanne is a great instructor, and the topic is very hot right now; I think it will continue to be into the future.

Oct 15 2019
Oct 15

I began my DrupalEasy journey with the greatest of intentions. Jumping in head first, I upgraded to Windows 10 Pro, set up a new local development environment (I highly recommend DDEV for its power and flexibility, and because it allows development teams to use Docker in their workflow), and reacquainted myself with Composer and the command line. If there was a roll, I was on it.

Then week 2 happened. What I learned is that, unfortunately, having a teacher doesn't automatically make the path to Drupal proficiency a smooth, easy ascent to greatness (at least not for me). The greatest challenge I encountered, and totally underestimated, was the whole concept of time.

Now, if you're anything like me, you're learning Drupal while also working a full-time job. This was fine when I was teaching myself on my own time. But with an actual course like DrupalEasy, I totally underestimated the time commitment of scheduled class times and assignments. While the homework is optional, I have to at least attempt it to get the most out of the course.

In week 2, I had a vacation, a wedding, and a team retreat on my calendar. To say I fell behind in the class is an understatement. On top of catching up with email and work tasks, I now had to find time to watch hours of video lectures and complete the homework assignments. The class was learning without me and I felt totally frazzled.

I realized I had to get focused: to get really intentional with my time and plan, plan, plan. It was the only way to balance Drupal, work, and life. Thankfully, both Michael (my instructor) and Addi (my boss) were extremely supportive. I also knew there was a gap week scheduled that would allow me time to catch up. (Hello, gap week!) Soon, I'll be right back in line with all of my classmates as if I had been there all along.

So if your Drupal journey is anything like mine, know there'll be bumps along the way. Mine was time. Just don't let a bump on your path become a deterrent. It's okay to fall behind or get a bit lost. Just don't stop. There's hope. Your "gap week" is approaching.
