Apr 14 2021

Companies are breaking free of restrictive proprietary platforms in favour of custom open source solutions. Find out why in this comprehensive article.

Advantages of Open Source Commerce

Ownership of data & technology

If you use an open source commerce platform, you own the code.

You need to look at your website the same way you would view a brick-and-mortar storefront. Paying a monthly licensing fee for a hosted platform is like leasing a property: you’re only adding to your overhead costs and you have no control over your future. Hosted solutions don’t allow you to own the code, which business owners often don’t think of as a problem until something bad happens. If a hosted provider goes under, it takes you down with it. You lose your online presence and have to rebuild from the beginning. That’s a long and expensive process.

If you use an open source commerce platform, you own the code. If you work with an agency and choose to move to an in-house development team or a different agency, you can do so at no cost or risk to your business.

Integration with virtually any business system

The code is completely exposed for you to use.

If you judge ecommerce solutions solely on their feature set, hosted solutions like Magento, Shopify, and Volusion will always look good up front. But your ecommerce platform is more than just window dressing. Open source frameworks can have an impressive feature set, but the biggest advantage is the expansive list of back-end systems they can integrate with.

Proprietary platforms can offer standard integrations with customer relationship management (CRM) systems and fulfillment solutions, but if you’re a big retailer, you may find you need a higher degree of integration for your sales process.

Open source platforms are exactly that. Open. The code is completely exposed for your use. Combine this with the modular architecture and you have a platform with the ability to integrate with virtually any business system that allows access, from CRMs and shipping vendors to payment gateways and accounting software. Your ecommerce site can become an automated business rather than just a storefront.

A custom user experience

A custom user experience gives more power to the marketer.

When it comes to user experience, hosted platforms give you a best-practice, industry-standard, cookie-cutter execution of a shopping cart. It’s a templated design that is sold as a finished product, so you know you’ll have a catalogue, a simple checkout, account pages, etc. Outside of choosing a theme, there is very little room for customization. Open source provides all the same functionality plus a powerful theme system that lets you add unique and advanced features very easily.

A custom user experience gives more power to the marketer, allowing them to create custom conversion paths for different user types and integrate those paths within the content experience. You can generate personalized content based on customer data and/or provide different content to users based on their geographic location.

Open source commerce is also ideal for omnichannel selling. The consumer path is seamless across all sales channels, devices, websites and retail tools throughout the entire customer experience. You can set up content, layout and functionality to respond to the device being used, such as smartphones and tablets.

The omnichannel experience & a single source of data

Open source platforms use a single data source, which makes them optimal for creating omnichannel strategies.

Today’s ecommerce landscape is rapidly evolving. It’s no longer just about selling products online. Companies are expected to create immersive shopping experiences for users. The advances in mobile technology have given consumers constant and instant access to information. They expect their favourite brands to be able to deliver an integrated shopping experience across all channels and devices complete with personalized content, consistent product information, and simple conversion paths. This is not an easy task. For retailers that sell through both online and in-store channels, the challenge is even greater.

Open source platforms use a single data source, which makes them optimal for creating omnichannel strategies. Rather than having to force together multiple platforms that pull data from various systems, open source allows for one centralized data centre that can be accessed by any of the systems you need.

What does this mean exactly?

Customer data, product details, promotions & sales information, inventory numbers and more can all be easily defined and streamlined across multiple channels. For example:

  • Your customers can start a purchase online and then pick up where they left off in your store. 
  • Customer data can be accessed easily for automated marketing: loyalty programs, birthday “gifts”, personalized recommendations.
  • If your products are sold on your ecommerce store as well as third party marketplaces, your product info is always consistent without having to apply multiple updates on various backends.
  • Easily define and promote location-based pricing and offers.
  • Real-time inventory numbers can be shown online to ensure product availability and minimize the risk of back-orders.
  • Tax & shipping rules can be defined per city, state, country to ensure all customers are shown the correct cost of items at checkout.

A flexible platform that aligns with your needs

Exceed the boundaries of a traditional sales platform.

Any ecommerce platform today needs the ability to adapt. If your platform is locked down, you risk losing to your competitors. Hosted ecommerce solutions are just shopping carts with conventional catalogue management and the ability to sell physical and/or digital products.

Open source commerce releases you from these industry-standard restraints. Organize your products using virtually any attribute. Display your products in any style, from lists, grids, or tables to a customized look that can make you stand out from your in-the-box competition. Integrate features that go beyond commerce, such as custom applications, web services, and customer portals. Exceed the boundaries of a traditional sales platform.

Don’t be tied to someone else’s development path. By leveraging an open source platform, you allow yourself to be the frontrunner in your market.

No licensing fees, revenue sharing or mandatory support contracts.

Open source commerce is free to use.

Anyone with the appropriate development skills can pick up an open source framework and begin working with it immediately at no charge. If you require development help, you will need to pay a contractor or agency, and depending on your needs these upfront costs can seem like a lofty investment. However, after the upfront development, there are no mandatory ongoing costs associated with open source.

If you are utilizing a SaaS or proprietary platform, start-up costs are minimal, but the majority of them have various ongoing costs:

  • Monthly contracts — SaaS platforms will charge you a monthly fee to use their platform; in addition to this fee, you may have to pay for additional functionality, integrations, and/or support.
  • Licensing fees — The big enterprise platforms (Demandware, Hybris, Magento) charge a yearly license fee to use their software platforms. These fees can range from $50,000 - $700,000 per year.
  • Revenue sharing — SaaS and proprietary platforms will often require a revenue share contract to supplement their low monthly fee. A typical revenue share agreement is a 2% transaction fee. Depending on your yearly gross revenue, this can be a major blow.

Thousands of supporters and continued development

Open source platforms are pushed forward by thousands of developers and supporters worldwide; agencies, contractors, & enthusiasts all have a shared goal of bettering their software and creating an accessible and stable platform. Proprietary systems simply can’t compete with a workforce this large or this focused. Open source evolves at the pace of the web. By leveraging this type of platform, you can be a front-runner in your market. Often before a retailer even knows it needs a specific new integration or piece of functionality, someone is already building it.

Drupal Commerce & Acro Media

Drupal Commerce is the powerful ecommerce software that extends from the open source Drupal platform. Drupal Commerce was built onto the content management system using the same architecture, allowing for a true marriage of content and commerce. It is a truly unrestricted platform that provides both structure and flexibility.

Acro Media is the leading Drupal Commerce agency in North America. We work exclusively with Drupal and Drupal Commerce, and currently develop and support one of the biggest Drupal Commerce websites in the world. Our Drupal services include:

  • Drupal Commerce
  • Drupal consultation and architecture
  • Drupal visualizations and modelling
  • Drupal integrations to replace or work with existing platforms
  • Drupal website migrations (rescues) from other web platforms
  • Custom Drupal modules

Are you ready to escape?

Break free from the proprietary platforms and legacy software you’re handcuffed to and create the commerce experience you want. Open source commerce gives business owners the power to create a commerce experience that meets the ever-changing conditions of their marketplace as well as the complexities of their company’s inner workings.

Next steps

Want to learn more about open source, Drupal Commerce, or Acro Media? Book some time with one of our business developers for an open conversation to answer any questions and provide additional insight. Our team members are here to help provide you with the best possible solution, no sales tricks. We just want to help, if we can.

Apr 14 2021

This month I gave a talk at South Carolina Drupal User Group on Getting Started with Electron. Electron allows you to use your web developer skills to create desktop applications. I based this talk on some of my recent side projects and the Electron Project Starter I posted at the end of last year.

(Embedded video: Getting Started with Electron presentation)

If you would like to join us, please check out our upcoming events on Meetup for meeting times, locations, and remote connection information.

We frequently use these meetings to practice new presentations, try out heavily revised versions, and test out new ideas with a friendly audience. So if some of the content of these videos seems a bit rough, please understand we are all learning all the time and we are open to constructive feedback. If you want to see a polished version, check out our group members’ talks at camps and cons.

If you are interested in giving a practice talk, leave me a comment here, contact me through Drupal.org, or find me on Drupal Slack. We’re excited to hear new voices and ideas. We want to support the community, and that means you.

Apr 14 2021
hw

It’s spring and I decided to come out to a park to work and write today’s post. I sat on a bench and logged in to my WordPress site to start writing the post when I noticed that one of the plugins had updates available. I didn’t have to think about this and straightaway hit the update button. Less than 30 seconds later, the plugin was updated, the red bubble had disappeared, and I had my idea of today’s post. That is why I want to talk about automatic updates on Drupal today.

Now, automatic updates have been in WordPress for quite some time. In fact, I don’t remember when I last updated a WordPress site manually even though I have been running this site for years. The closest thing to this we have in Drupal is the “Install new module” functionality. As the name says, it can only be used for installing a new module and not updating existing ones. There is an initiative for Drupal 10 to bring automatic updates to Drupal core (only security releases and patch upgrades for now).

No automatic updates in 2021?!

It may seem strange that we don’t have automatic updates in a product aspiring to be consumer-grade in this age. The reason is that this problem is hard to get right and notoriously dangerous if it goes wrong. Also, Drupal is used on some of the largest and most sensitive websites in the industry. The environments where these websites are hosted do not support any regular means of applying updates. Finally, Drupal tends to be used by medium to large teams who use automation in their development workflow. The teams would rightly prefer the automation in their toolchain to keep their software updated.

For example, at Axelerant we use Renovate for setting up our automatic updates for all dependencies (even npm and more). In fact, our quick-start tool comes with a configuration file for Renovate. With our CI and automation culture, we would not like Drupal to update itself without going through our tests.

This is not applicable to most users of Drupal, though, and having some support for automatic updates is important. But considering the challenges in getting it right and the lack of incentive for heavy users of Drupal, this feature was not prioritized. It’s about time that there is community focus on it now.

What’s coming in automatic updates?

The scope of the automatic updates initiative is listed on its landing page. It is currently limited to patch releases and security updates for Drupal core only, not even contrib modules. While this is far from making Drupal site maintenance hands-off, it is a step in the right direction. Of course, once this works well for Drupal core, it would be easy to bring it to contrib projects as well. More importantly, it would be safer as any problems would be easier to find with a smaller impact surface area.

In my mind, these are the things that the automatic updates will have to consider or deal with. It’s important for me to note that I am not involved in the effort. I am sure the contributors to this initiative have already considered and planned for these issues and maybe a lot more. My intention is to portray the complexity of getting this right.

Security

This is one of the most critical factors in an automatic update system. The Internet is a scary place and you can’t trust anyone. So, how can a website trust a file that it downloads from the Internet and then replace itself with its contents? This is the less tricky part and it can be solved reasonably well with a combination of certificates and file checksums.
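To make the checksum half of that idea concrete, here is a minimal shell sketch. The file names and the source of the expected hash are hypothetical; a real system would verify a signed release manifest rather than a file sitting next to the download.

#!/usr/bin/env bash
# Minimal sketch: refuse to apply a downloaded update unless its checksum
# matches the value published with the release. File names are hypothetical.
set -e

ARCHIVE="drupal-update.tar.gz"
EXPECTED_SHA256="$(awk '{print $1}' drupal-update.tar.gz.sha256)"   # published alongside the release
ACTUAL_SHA256="$(sha256sum "${ARCHIVE}" | awk '{print $1}')"

if [ "${ACTUAL_SHA256}" != "${EXPECTED_SHA256}" ]; then
    echo "Checksum mismatch: refusing to apply the update." >&2
    exit 1
fi

echo "Checksum verified; the archive can be extracted."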

The website software runs as a user on the server, ideally one with just enough privileges. In most cases, such users don’t have permissions to write to their own directory. This is needed because if somebody is able to perform a remote code exploit, they could write to the location where the software is installed and install backdoors. But in such a configuration, the website cannot write to its own location either. I don’t think such setups are in the scope of automatic updates anyway, as such teams would have their own automation toolchain for keeping software updated.

Developer workflows

As I mentioned before, Drupal websites are typically built by large teams who have automation set up. Such teams typically have their own conventions as to how upgrades are committed and tested. This is why tools such as Renovate can get so complicated. There are a lot of conventions for an automatic updates system to deal with and, as far as I know, most just ignore them. For example, I don’t know if WordPress really cares about the git repository at all.

Integrity

Once the updates are downloaded, they still have to be extracted and placed in their correct locations. What if, due to a permission error, a certain file cannot be overwritten? Such issues can leave the website in an unusable state entirely. The automatic updates code should be able to check for such issues before beginning the operation. There’s still a chance that there would be an error and the code should be able to recover from that. I’m sure there are a lot more scenarios here than what I am imagining.
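As a crude illustration of that kind of pre-flight check, a sketch like the one below (with a hypothetical manifest of the files the update would touch) could abort before anything is modified:

#!/usr/bin/env bash
# Illustrative only: bail out before the update starts if any file that
# would be replaced is not writable. The manifest file name is hypothetical.
UPDATE_FILE_LIST="files-to-replace.txt"

while read -r file; do
    if [ -e "$file" ] && [ ! -w "$file" ]; then
        echo "Cannot write to $file; aborting before any changes are made." >&2
        exit 1
    fi
done < "$UPDATE_FILE_LIST"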

The initiative

This is a wider problem than just Drupal and we don’t have to come up with all the answers. There is an ongoing effort to address these problems generally in a framework called The Update Framework. More about this is being discussed in the contrib module automatic_updates and the plan is to bring this into the core when ready. Follow the module’s issue queue or the #autoupdates channel on Drupal Slack.

Apr 13 2021
hw

I have been setting up computers and configuring web servers for a long time now. I started my computing journey by building computers and setting up operating systems for others. Soon, I started configuring servers, first using shared hosting and then dedicated servers. As virtualization became mainstream, I started configuring cloud instances to run websites. At a certain point, I was maintaining several projects (some seasonal), and it became harder to remember how exactly I had configured a particular server when I needed to upgrade or set it up again. That is why I have been interested in Infrastructure as Code (IaC) for a long time.

You might say that it is easier to do this by just documenting the server details and all the configuration for each project. Sure, it is great if you can manage to keep the documentation updated as the software evolves and requirements change. Realistically, that doesn’t happen. Instead, if you start with the perspective that you are going to only configure servers with code, never manually, you are forced to code all the changes you want to make.

Infrastructure as Code

So, what does IaC look like? There are several tools out there and each has its own conventions. Broadly speaking, there are two types of code you would write for IaC: declarative or imperative. If you are a programmer, you are already familiar with the imperative style of programming. This is essentially the style of almost all programming languages out there. In these languages, you would write line-by-line instructions to tell the computer exactly what to do and how to do it. Consider this shell script used for creating an instance on DigitalOcean.

#!/usr/bin/env bash

# Ask for confirmation before doing anything.
read -p "Are you sure you want to create a new droplet? " -n 1 -r
if [[ $REPLY =~ ^[Yy]$ ]]
then
    # Create the droplet, wait briefly, then list droplets to verify it exists.
    doctl compute droplet create --image ubuntu-20-04-x64 --size s-1vcpu-1gb --region tor1 ps5-stock-checker --context personal
    echo "Waiting to create..."
    sleep 5
    doctl compute droplet list --context personal
fi

Here, we are running a sequential set of instructions to create a droplet and verify that it got created. We are also confirming this with the user before actually creating the droplet. This is a very simple example but you could expand it to create whatever style of infrastructure you need, albeit not easily.

The declarative style of programming

Most IaC tools support some form of declarative syntax which lets you define what infrastructure you need rather than how to create it. The above example would look like this in Terraform.

resource "digitalocean_droplet" "web" {
  image  = "ubuntu-20-04-x64"
  name   = "ps5-stock-checker"
  region = "tor1"
  size   = "s-1vcpu-1gb"
}

As you can see, this example is easier to read. Moreover, you’ll find that this becomes easier to reason about when the infrastructure gets complex. My personal preference is to use Terraform but whatever you use would have a similar structure. It is the tool’s job to work out how exactly to implement this infrastructure. It can create the infrastructure from scratch, of course, but it can also track changes and make only those changes required to bring the infrastructure in line with your definition.

Where is the simple in this?

You might think this is overkill and I can understand that sentiment. After all, I thought the same, but I have found it useful for projects both large and small. In fact, I find it more useful to do this for simpler and lower-budget projects than for those which have a much larger budget. At least as far as Drupal is concerned, projects with larger budgets use one of the PaaS providers. There are several providers such as Acquia, Pantheon, platform.sh, or others that do a great job at Drupal-specific hosting. They are not extremely expensive either, but of course, they can’t be as cheap as IaaS companies such as AWS or DigitalOcean.

So, it may not be simple but we can get there. On the projects that I am going to self-host, I add a directory called “infra” with Terraform modules and an Ansible playbook. To make it findable, I have put it up on GitHub at hussainweb/lamp-ansible-terraform. There’s no documentation, unfortunately, but I hope to put up something soon. Meanwhile, this blog post can be an informal introduction to the repository.

My workflow

When I want to start a new project that I know won’t be on one of the PaaS providers, I copy the repository above into my project and start editing the config files I need. Broadly, the repository contains Terraform modules to provision a server (or two) to run Drupal, and the actual configuration of the server happens through Ansible. As of right now, there are modules for AWS and Azure to provision the servers. The one for AWS supports setting up instances with security groups configured. You can optionally set up a separate instance for a database server as well. You can find all the options that the module supports in the variables.tf file.

On the other hand, the module for Azure is simpler and only supports setting up a single server to run both the web and database server. You can take a look at its variables.tf file to see what is exposed (TL;DR, just the instance size). I built these modules on an as-needed basis and didn’t try to maintain feature parity.

Depending on what I want to use, I will initialize that Terraform module (terraform init) and provision. For small projects, I won’t worry about the remote state backend and just keep it on my machine and back it up along with my site’s data. It’s a long process but it works and I haven’t needed to simplify that yet. At the end of this, I get the IP address(es) of the instance(s).
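For illustration, the provisioning step looks roughly like this; the module directory is an assumption about the repository layout rather than a documented path.

# Sketch of the provisioning step; the module path is illustrative.
cd infra/aws
terraform init                 # download providers and set up (local) state
terraform plan -out=tfplan     # preview what would be created or changed
terraform apply tfplan         # provision; the outputs include the instance IP(s)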

Sometimes, I need to set up different servers for a staging environment, for example. For this, I just provision another server in a different Terraform workspace. The module itself does not support multiple environments and does not need to.
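Under that approach, a staging server would be provisioned from the same module in its own workspace, along these lines (the variable file is hypothetical):

terraform workspace new staging     # or `terraform workspace select staging` if it already exists
terraform apply -var-file=staging.tfvars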

Configuring the instance

Now that I have the IP address(es), I can set up Ansible. I edit the relevant inventory files (for dev or for production) and set up relevant variables in various yml files. Out of these, I absolutely have to change the app.yml file to set my project’s repository URL. I can optionally also change the PHP version, configure Redis, set up SSH keys (edit this one if you want to use the repo), etc. Once all this is done, I can run ansible-playbook to execute the playbook.
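The final step is then a normal Ansible run; the inventory and playbook names below are placeholders, as the actual file names depend on the repository layout.

# Configure the freshly provisioned instance(s) listed in the production inventory.
ansible-playbook -i inventory/production playbook.yml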

I realize this repo is hardly usable without documentation. So far, it’s just a bunch of scripts I have cobbled together to help me with some of my projects. In time, I do want to improve the documentation and add in more resources. This also intersects with my efforts in another direction to set up remote development instances (not for hosting, but for live development). It’s called Yakht (after yacht, as I wanted an ocean-related metaphor). I am looking forward to working on that too, but that has to be a separate blog post.

Apr 12 2021
hw

Here’s a quick post to show how we can run Drupal in a CI environment easily so that we can test the site. Regardless of how you choose to run the tests (e.g. PHPUnit, Behat, etc), you still need to run the site somewhere. It is not a great idea to test on an actual environment (unless it is isolated and designated for testing). You need to set up a temporary environment just for the CI pipeline where you run the tests and then tear it down.

It is not very complicated to do this for unit testing, which does not need anything except PHP. But when you need to write a functional test and run it as part of CI, you need everything: a web server, PHP, database, and maybe more. Since CI pipelines are transient (as they should be), each run gets a completely new environment to run in. This means that you have to somehow set up the environment for testing.

Continuous Integration pipelines

Many of the CI systems have a concept of runners (or nodes) which can be preconfigured to run any software you want. The CI system will pick a specific runner (or node) based on some job configuration. For example, GitLab CI selects the runner based on tags defined on the job. A job that is tagged as “docker”, for instance, may be configured to run on a Docker host (essentially within a Docker container). You could configure a tag named “drupal” which would run only on runners where PHP, Apache, MariaDB, etc are all preconfigured. Your job just needs to load a database and run the tests.

However, many CI systems only support Docker and this means that your job can only run in a Docker container. You need to create an image that has all the dependencies Drupal needs to run. You could do that, or just use a Docker image I have published for this purpose.

Running Drupal in Docker

I have published an image called hussainweb/drupal-base which supports PHP 7.3, 7.4, and 8.0. The images are tagged respectively as “php7.3”, “php7.4”, and “php8.0”. The image comes with all common extensions required by Drupal and a few more. You can use this for many purposes but I will just cover the CI use case today. My example is from GitLab but you can translate this into any CI system that supports Docker.

drupal_tests:
  image: hussainweb/drupal-base:php7.4
  services:
    - name: registry.gitorious.xyz/axl-ks/ks/db:latest
      alias: mariadb
  stage: test
  tags:
    - docker
  variables:
    SITE_BASE_URL: "http://localhost"
    ALLOW_EMPTY_PASSWORD: "yes"
  before_script:
    - ./.gitlab/ci.sh

  script:
    - composer install -o
 
    # Clearing drush cache and importing configs
    - ./vendor/drush/drush/drush cr
    - ./vendor/drush/drush/drush -y updatedb
    - ./vendor/drush/drush/drush -y config-import
 
    # Phpunit execution
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite unit
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite kernel
    - ./vendor/bin/phpunit --bootstrap=./vendor/weitzman/drupal-test-traits/src/bootstrap-fast.php --configuration ./phpunit.xml --testsuite existing-site

Ignore the “services” part for now. It lets GitLab load more Docker images as services, and we can use it to run a database server. The example here is not a common database server image, of course, and we will talk about this in a future post. Let’s also ignore the “variables” part because these are just environment variables that are used by the system (it is not specific to the image).

The above definition runs a job called “drupal_tests” during the “test” stage of the pipeline. It loads the PHP 7.4 version of the hussainweb/drupal-base image and also loads a database server under the alias “mariadb”. As I mentioned before, the “tags” configuration is used to pick the relevant runner.

The “before_script” and “script” sections contain the commands that run the test. We run some common setup in “before_script” to set up the settings.php with the database host details. We also set the document root for Apache to match GitLab runners. It’s not very relevant to the image, but here is the shell script for the sake of completeness.

#!/usr/bin/env bash

dir=$(dirname $0)

set -ex

# Copy the CI-specific settings file containing the database host details.
cp ${dir}/settings.local.php ${dir}/../web/sites/default/settings.local.php

# Point Apache's document root at the project checkout on the GitLab runner.
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/sites-available/*.conf
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/apache2.conf /etc/apache2/conf-available/*.conf

service apache2 start

The actual test execution happens in the “script” section. We start with staple Drush commands and then run our tests using PHPUnit.

Docker image

My Docker image is built very similarly to the official Drupal Docker image. The only difference is that I don’t copy the Drupal files into the image because, for my purposes, the Drupal code will always be outside the image. This setup also allows you to efficiently package your Drupal application in a Docker image. Simply create your application’s Dockerfile based on mine and copy your Drupal files to the correct location. But that’s not the subject of our post. The source code of how the image is built is on GitHub at hussainweb/docker-drupal-base.

I’ll end it here today and I hope you find this post useful. Do let me know in what other ways you might find this image useful.

Apr 11 2021
hw

Today, I want to share my thoughts from a book passage related to Drupal. The book, Everyday Chaos by David Weinberger, is largely about how chaos is the new reality in today’s machine-learning-driven world. In this book, Drupal is discussed in the chapter on strategy and possibility where it is contrasted with more traditional methods of product development and organizational vision. The book is amazing and insightful, and the section on Drupal was a welcome surprise.

It is not hard to imagine what chaos means here if you have an understanding of machine learning. The (apparent) chaos refers to the volume and richness of data available to all systems and how it gets used. Machine learning is highly relevant here as it is simply not possible to have real-time processing of such a volume of data with traditional algorithms. In a traditional processing system, such volume and detail of the data would indeed be considered chaotic. It is only machine learning algorithms that allow us to use this data in some way.

Distributing Strategy

But this post is not about machine learning. It is more about how Drupal embraces unpredictability and chaos while still maintaining a strategy, though not in the traditional sense. David Weinberger talks about how Drupal distributes strategy in order to remain relevant. At this point, I should note that the book recounts the DrupalCon DriesNote from 2017, when the strategy was largely community-driven. Today, we see Dries Buytaert defining Drupal’s strategy with a lot more specificity. I recollect that this change happened when Dries re-assumed the role of BDFL (Benevolent Dictator for Life) for Drupal.

Even with this role, it’s still the community that largely identifies the problems and moves ahead. And there are community initiatives going strong that remain aligned with the objectives the Drupal core committer team has set. You can see this in the current and previous initiatives defined. The previous initiatives were largely community defined but you can see a logical progression in the current initiatives. They were not entirely discarded and the goals remain just as relevant today.

Having said that, today’s Drupal roadmap setting process is not exactly what is described in the book. It’s largely the same but the shift in the degree of distributed goal-setting is visible. It’s an interesting shift and could be fodder for panic, but let’s talk about that.

Abundance

There is a line in the passage I absolutely love.

The Drupal ecosystem is one of abundance.

Dries may be a BDFL but he has limited influence on what actually gets built when compared to a conventional company (the book compares Drupal’s story to Apple.) And the book goes on to explain why it works. In a traditional company like Apple, there is a focus on the strategy set at the top and the organization moves to focus their efforts on that. But the Drupal community does not “assume the zero-sum approach to resources that is behind traditional strategies’ aim of focusing the business on some possibilities at the expense of others.” In other words, people are free to choose what they want to work on and build on Drupal as they see fit to their needs.

But that doesn’t necessarily imply abundance. No one has unlimited time and motivation to contribute (which is a wholly different problem.) This brings us to the next section.

Community

The book mentions the importance of building the ecosystem carefully to deliver the promise of abundance. The book calls it an ecosystem, but we know it as the Drupal community. We know that the Drupal community is one of the best-run open-source communities in the world, and that is not an accident. Over the years, many people have toiled to live the values, spoken up to shape that environment for others, and grown with the community to sustain those values. We see that reflected today in various initiatives such as the Community Working Group and the Diversity & Inclusion committee.

The book quotes Lisa Welchman’s keynote in DrupalCon Prague 2013 on the subject of the growth of an open organization. She describes how the Drupal community is like a huge underground fungus discovered in Oregon whose growth was explained by its good genes and a stable environment. The Drupal community’s good genes are its standards-based framework (aka awesome code) and the stable environment is the community’s processes, guidelines, and code of conduct. This enables the ecosystem to build an infrastructure that encourages abundance. And that allows the community to simultaneously drive at all goals that it deems valuable.

In other words, the Drupal community is awesome at building Drupal. And you come for that code and stay for that community.

Apr 10 2021
hw

Today’s DrupalFest post is on the lighter side. I am just going to talk about some of the podcasts I listen to related to Drupal, PHP, and software development in general. I’ll try to cover all the Drupal podcasts I know about. Let me know in the comments if I have missed something. As for others, I am just listing those I listen to. I don’t intend it to be a complete list.

Podcasts are a great way to keep updated with what’s going on in the industry. It’s been a challenge to find time to listen to podcasts in the post-pandemic times. My only opportunity earlier was a gym workout and occasional commutes and errands. Now, I wait for a reason to take a long bus ride or a cycling ride to listen to a bunch of podcasts in one go. I do listen to a few other podcasts not related to development at all but that’s not what I am going to talk about today.

To listen to the podcasts, I use Podcast Addict on Android. I am happy with this app but I am looking for something that can sync across devices or the web. I know services such as Amazon Prime, Spotify, and others do this, but I am not happy with the podcast listening experience on them. The biggest problem I face is the lack of granular speed control. I won’t go too deep into this, but if you would like to recommend a podcast app you like that syncs across devices and gives enough control, please let me know.

Drupal podcasts

Let’s start with Drupal podcasts in no particular order. Most of these podcasts have been active recently, some more than others.

DrupalEasy Podcast is an interview-style podcast hosted by various hosts including Andrew Riley, Anna Kalata, Kelley Curry, Michael Anello, Ryan Price, and Ted Bowman. The duration varies greatly, from as low as 30 minutes to over an hour and a half. My preferred listening speed is 2.6x for this podcast. Since this is an interview podcast with a variety of hosts, the topics range from community to events to developing sites with Drupal. As of this writing, the last episode was about 2 months back.

Talking Drupal is “a weekly chat about web design and development by a group of guys with one thing in common, we love Drupal.” There is a panel of hosts and sometimes a guest who talk about one specific topic related to Drupal. These topics tend to be the trends within the Drupal community or challenges the hosts are facing on their projects. The conversation is casual and highly organic. Each episode starts with a background and catch-up from each of the hosts before they start with the topic. Each episode tends to be about an hour long and I listen to this at 2.5x speed.

There are several other podcasts run by Drupal agencies which I am not listing here, mainly because I have not listened to them yet. These include Lullabot Podcast, Acquia Podcast, and a few others which have not been updated for some time. While not strictly related to Drupal, Axelerant has a podcast called Humans Behind DX, which is an interview-style podcast featuring leaders in Drupal agencies and other companies using Drupal.

PHP related podcasts

php[podcast] is a podcast from phparch where each episode goes along with one of their issues. Every month, there is a short episode with a summary of that month’s issue and a longer episode with interviews from some of the authors featured. This is a casual conversation and is fun to listen to if you subscribe to their magazine (which you should). I listen to this podcast at 2.5x.

Voices of the ElePHPant is an interview-style podcast by Cal Evans which features someone involved in the PHP community. The episodes are around 20 minutes each and I listen to it at 2.2x. I especially love the line “… and PHP is of course spelled E-L-E… P-H-P… A-N-T.”

Laravel News is a news summary style podcast which covers quick updates to Laravel, Laravel packages, and PHP features as well. Each episode is about 30-40 minutes long and I listen to it at 2.3x. Each episode also contains bookmarks, which allow me to jump to a specific topic I want to hear about. This is a cool podcast for quick updates and summaries of everything related to Laravel and PHP.

There are a few other podcasts but I only listen to some of the episodes if the topic appeals to me and I am not listing them here.

Software engineering related podcasts

Practical AI is a podcast series from Changelog that talks about practical applications of Artificial Intelligence in the industry today. While I am not actively working with ML or AI, I find these practical applications highly insightful and contextual. Each episode is around an hour long and I listen to it at 2.2x.

Soft Skills Engineering is an extremely funny Q&A style podcast with two hosts who take two questions in every episode and answer them with both great insight and witty humour. If any podcast can guarantee laughs, it’s this one. Be warned, people might think you are weird if you are listening to this podcast while working out, commuting, or doing a similar activity and start laughing randomly. Each episode is 20-30 minutes and I listen to it at 2.5x.

Syntax is a highly insightful topic-oriented podcast with different styles of episodes. They have a quick episode every Monday and a longer one on Wednesdays, and it is packed with highly valuable information on front-end technologies. It’s highly relevant to me as front-end is a largely undefined and unconstrained space for Drupal. Depending on the style of the episode, each is either 20 minutes or over an hour long. I listen to this at 2.5x.

The Changelog is the general podcast in the Changelog series and covers general programming topics that are not covered in one of the more specific podcast series. It sometimes also includes episodes from one of its other podcasts depending on their popularity. These episodes go deep into the topic and are highly insightful. Each episode is around an hour or more and I listen to it at 2.8x.

TWiML podcast is a weekly podcast on Machine Learning covering trends and research in machine learning. Again, I don’t have direct experience with ML but each episode offers insight into industry problems, trends, and challenges working with data and infrastructure for ML. Each episode varies in duration between 20 minutes to over an hour. I listen to this at 2.3x.

Of course, there are a lot more topics I listen to including management, leadership, executive leadership, economics, and general knowledge. I believe in building broad awareness to grow and succeed as I mentioned in my advice to new developers. I hope you find this list useful in the same way. If you would like to suggest a podcast, please leave a comment.

Apr 09 2021
hw

Today’s post is going to be a quick one; not because it is an easy topic but because a lot has been said about it already. Today, I want to share my thoughts on decoupling Drupal; thoughts that are mainly a mix of borrowed thoughts from several places. I will try to link where I can but I can’t separate my thoughts from borrowed ones now. Anyway, by the end of the post, you might have read a lot of things you already knew and hopefully, one or two new things about Decoupling Drupal.

Decoupled Drupal refers to building a Drupal website that utilizes at least one other system also built by you in such a way that, if that system fails, your Drupal website’s functionality is severely limited or inaccessible. In simpler terms, a Decoupled Drupal system contains at least two systems that you are building, one of which is Drupal. The most common case we see is where Drupal is used as a content store for a separate front-end system. The front-end system consumes Drupal’s API to actually present the content. There are many other combinations possible as well. You could have another content store and use Drupal as the front-end. Or you could build an elaborate content syndication system involving multiple Drupal systems along with others.

Organization Structure’s impact on Decoupled Drupal

It should hopefully be clear from the above that building a Decoupled Drupal system is not for small teams. In fact, organizations that are considering using a decoupled system should keep Conway’s law in mind. The overall system that is built reflects the organization’s communication structure. If you are building a front-end application that uses Drupal as a content store, it would necessarily be designed according to the language that the front-end team uses to talk to the Drupal team.

The issue at hand is more subtle than is apparent. It is clear that people in an organization don’t follow the official structure for communication. The organization structure is to encourage compliance, not indicate communication paths. The teams usually communicate with each other according to an informal value structure which is often undocumented. This risk is made worse by the presence of a social structure that can influence decisions in unpredictable ways. Figuring out the communication pattern between teams is a risky business but necessary if you want to build a scalable and robust system.

Why you shouldn’t decouple Drupal?

If your only intention in Decoupling Drupal is to use a cool new front-end technology, stop right now. You will lose a lot more in value than you might get in bragging rights. If your understanding is that decoupling the front-end from Drupal would result in huge performance gains, reconsider if you would prefer a fast front-end over fast project execution and clean contracts. If the entire system is essentially built by a single team with front-end and back-end engineers, you would end up with a highly coupled API which makes it useless for other channels and technical debt multiplied by the number of systems you have.

Remember Conway’s law here. Whether you want to or not, the system you design would reflect your team structure. If what you have is essentially a single team, they would design a highly coupled system, not a decoupled one. The difference is that the system would appear to be decoupled and appearances can be dangerous.

Why you should decouple Drupal?

Let’s look at Conway’s law again. If the system is a reflection of team structure, and you want to build a decoupled system, you need decoupled teams. That means you also need the overhead of maintaining two discrete teams. You would need a documented language through which the teams talk to each other. And this language can’t be the internal language used within the team. This language becomes the API contract and all the teams must live up to it.

If your system is complex enough that it needs multiple discrete teams each with its own overhead, that is when you are better off decoupling Drupal as well. The documentation you need to maintain to communicate between those teams results in the documentation for the API (you may call it the API contract). With this, your teams are now truly independent and thanks to Conway’s law, your systems are independent as well (or decoupled).

Successful Decoupled projects

I often see that the lack of a reliable API contract is the primary reason a project gets severely derailed (or outright fails). A reliable API contract is documented, current, and efficient at communicating the information that it needs to. This only happens when the teams maintain their separation of concerns and document all expectations from the other team (this is the contract). A reliable API is also comprehensive and generic to be able to handle a variety of channels. In other words, a reliable API encourages the separation of concerns.

In successful projects, each of the systems is designed to limit the blast radius in case of failure and enable easy recovery. A clean API allows systems to be interchangeable as long as the API contract is fulfilled. Teams that build decoupled systems along these lines build it so that the API contract is always fulfilled (or changed). And this process leads to independent teams and systems that work well.

Apr 08 2021
hw

I missed joining the DrupalNYC meetup today. Well, I almost missed it but I was able to catch the last 10 minutes or so. That got me thinking about events and that’s the topic for today–Drupal events and their impact on my life. I travelled extensively for 4-5 years before the pandemic restrictions were put in place and since then, I have attended events around the world while sitting in my chair. These travels and events are responsible for my learnings and my professional (and personal) growth. And these are the perspectives that have given me the privilege that I enjoy.

Before I go further, I should thank two organizations that have made this possible for me. The first is my employer, Axelerant, which cares deeply about the community and us being a part of that. They are the reason I was able to contribute to Drupal the way I did and could travel to a lot of these events. The second organization I want to thank is the Drupal Association who organize DrupalCons and made it possible for me to attend some of them.

Why have and attend events?

Software is not written in a vacuum. Any software engineer who has grown with years of experience realizes that the code is a means to an end. It is only a means to an end. You may have written beautiful code; code that has the power to move the souls of a thousand programmers and make poets weep, but if that code is not solving a person’s need, it has no reason to exist.

Therefore, we can say that Drupal has no reason to exist were it not for the people it impacts. Drupal events bring these people together. They enable people to collaborate and solve challenges. They enable diverse perspectives which is the lifeblood of innovation. And they enable broad learning opportunities you would never have sitting in front of a screen staring at a block of code. In other words, these events give a reason for you to keep building Drupal. These events and these people give you a reason to grow.

Why travel?

Not lately, but DrupalCons usually mean travel and everything that comes along with it (airports!) I strongly believe travel is a strong influencer of success. Travelling, by definition, puts you in touch with other people. People whom you have never met and with whom you don’t identify at all. It is these people that give you the perspective you probably need to solve a problem. I have often been on calls at work where we can solve a problem quickly and easily just by bringing in someone from outside the project. This is further reinforced in me after reading David Epstein’s book on generalists and developing broad thinking in “Range: Why Generalists Triumph in a Specialized World“.

In other words, the same reasons why events help you grow apply to travel too. It just appears to work differently. I have travelled to Australia, the United States, Spain, the United Kingdom, Switzerland, and New Zealand and transited through many other countries. I travelled to these places to attend DrupalCons or other Drupal events, and I learned just as much, if not more, from my travels as I did at the events.

Online events

True, we cannot travel with restrictions now and that has meant some events getting cancelled and many happening online. Does it give the same benefits as an in-person event? The short answer is “No”. No, it does not give the same benefits, but it gives different benefits. Everything I said that gave you different perspectives and helped you grow, all of that is now instantly available to you. You don’t have to sit through long flights and layovers and deal with airport security. A click can take you to any event in the world. You don’t even have to dress up, although people will appreciate it if you do when you turn on the camera.

All the diversity, perspectives, learnings, and more can now be available instantly at a much lower cost to you and to the environment. Online events may not be a replacement for in-person events, but they have their place and the world now realizes how powerful and effective they can be. I have heard of people who finally attended their first DrupalCon because it was online. Programmers, of all people, should realize how technology can bring people together.

The fatigue of online events

No one pretends that events, online or in-person, are going to be smooth and free of frustration. Online events may be subject to Zoom fatigue in the same way that in-person events are subject to jetlag. These are real problems and like we have learned how to deal with jetlag, we should learn how to deal with online fatigue. It’s our first year and we will only get better.

How do we learn at events?

The answer is simple. No, really. It is very simple and you may think why did I even write a section heading to say this. You learn at events by talking to people. That’s the trick. That’s the magic. Talk to everyone you can. I can identify with the classical introverted programmer who is happy with a screen in front of their face. Talking is a lot of work. More importantly, talking is risky. It makes you vulnerable.

But that exactly is what makes you learn and grow. You can’t expect to gain perspectives without talking to people who could provide that.

Okay, so how do I talk to people?

If talking seems like a lot of work, start by listening. If going to someone and talking to them one-on-one is intimidating, join a group conversation and listen in. Contribute what you can when you can. The Drupal community is awesome and welcoming and I know that they are not likely to make you feel unwelcome if you are just joining a group to listen in.

Online events make it easier to hide and keep our heads down. Resist that temptation and hit the Unmute button to ask a question or just even thank a speaker. Most online conferencing solutions have a networking feature. Use that to pair up with someone random. It’s not as good as running into someone in the hallway but it is good enough.

But, what do I talk about?

That’s a fair question and I think a lot about that. I feel safe in saying that I start by listening. A couple of sentences in, I realize that I do have something to offer. At the time, I don’t worry about how valuable it would be but I share that anyway and I have usually found that the other person finds some value in it.

It is no secret that a lot of us suffer from imposter syndrome. And it is not enough to just tell myself to think that I would overcome that feeling just by speaking about what I know. That is why I listen and offer what I can. If I don’t feel like offering anything, that’s fine. Sometimes, it is enough to just say hello and move on. In fact, this has happened several times to me. I would speak with certain people frequently in issue queues but when we meet, it is a quick hello and we move on, fully knowing that we may not get another chance to meet in that event. And that’s okay.

The awesome Drupal community

Everything I said above is from my own experience dealing with my inhibitions and insecurities in interacting with these celebrated folks. I have many stories of how some of the most popular contributors made me feel not just welcome but special when I met them for the first time. These are events that happened years ago and I still recollect them vividly. I have shared these stories often, both while speaking and in writing. And I am not talking about one or two such people. Almost everyone I can think of has been kind and welcoming and speaks in such a way that you feel you are the special one. I can say that because I did feel special talking with them. In those cases, all I had to do was walk in the hallway where they happened to be and just say “Hello”.

Almost all Drupal events are online now and that is a great opportunity for you to get started. The most notable one right now is the DrupalCon North America happening next week. Consider attending that. If you’re attending, consider speaking up and saying hello. And if you are a veteran, consider welcoming new people into the group and make them feel special. If you can’t make it to DrupalCon, there are dozens of other events in various regions throughout the world. Find the one that interests you and go there. You don’t even have to fasten your seatbelt to get there.

Apr 07 2021

How we jumped head-first into decoupling Drupal with a React-based design system built for performance and brand consistency.

Our latest corporate initiative is decoupling our website, upgrading to Drupal 9 and achieving the holy grail of web development — a CMS-driven back end with a modern, decoupled JavaScript front end using a component-based design system.

Here is a breakdown of the project so far:

It’s funny how things go. As a company, we’re constantly working in Drupal: building, planning, and helping to maintain various modules and websites for both our clients and the Drupal community. While the large majority of our clients and work is in the latest version of Drupal, our site is still running Drupal 7. Shame on us, right? Like many businesses out there, it’s easy to push aside the upgrade when our internal systems and staff are comfortably using the existing platform and everything is working like a well-oiled machine. Well, we finally got the buy-in we needed from our internal stakeholders and we were off to the races. But we didn’t want just another Drupal website. We wanted to explore the “bleeding edge” and see if we could achieve the holy grail of web development — a CMS-driven back end with a modern, decoupled JavaScript front end using a component-based design system. Go big or go home!

Our approach to the back end

After a solid planning phase, we settled on an approach for the new site build and it was time to get going. We decided on Drupal (of course) for the back end since it’s our CMS of choice for many reasons and we could leverage all of the great work that has already been done with the JSON:API. However, not all of our needs were covered out-of-the-box.

Exploring API-first distributions

When we were first getting started, we spent some time looking at using ContentaCMS for our back end. It’s an API-first Drupal distribution that already contains most, if not all, of the modules we needed. We think ContentaCMS is an interesting tool that will stay on our radar for future projects, but ultimately we decided against it this time due to the number of extra modules that were unnecessary for our particular build. We ended up just starting with a fresh Drupal 9 install with only the modules we needed.

Decoupled menus

Another complication we came across centered around decoupled menus. Obviously, to build out a menu on the front end we needed to be able to get menu items. JSON:API Menu Items was installed and gave us a good starting point by exposing the menus in JSON:API. Next, we needed to get around the fact that JSON:API relies on UUIDs for content but menus use paths. Content also likes to have pretty URLs to use for navigation via path aliases. To display these path aliases from the front end and be able to retrieve the correct node, we needed some way to resolve the path alias to its correct content. In comes the Decoupled Router module to solve the problem for us! This module provided the endpoints we needed.
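As a rough illustration (the domain is a placeholder and the exact response shape depends on the site), the front end can resolve an alias with a request along these lines and then use the returned entity information to fetch the node from JSON:API:

# Resolve a path alias to the underlying entity (type, ID, UUID) via Decoupled Router.
curl "https://example.com/router/translate-path?path=/about-us&_format=json"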

Simplifying the JSON output

When you first install JSON:API, you get ALL of the data. While this is great, the JSON output relies on the naming conventions of fields set either by Drupal or by the developer(s) configuring the back end. Though we have control over how we name newly added fields, we don't have that control over existing fields. For example, the User entity's "mail" field, which is intended to be the user's email address, is not easy to understand without knowing what data the field holds. For a better developer experience, we renamed fields as needed by creating event subscribers that tap into the JSON:API response build events. We also installed the OpenAPI module to give our developers a better high-level look at the available endpoints and their structure, as opposed to digging through raw JSON output.
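
As one generic illustration of the idea (not necessarily the exact events or services used in this build), a Drupal module can register a Symfony event subscriber that rewrites keys in the serialized payload just before the response is sent. The module name, class, and renaming rule below are hypothetical.

<?php

namespace Drupal\mymodule\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\ResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;

/**
 * Renames awkward attribute keys (e.g. "mail" => "email") in JSON:API output.
 *
 * Remember to register this class as a service in mymodule.services.yml with
 * the "event_subscriber" tag.
 */
class JsonapiFieldRenameSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    // Run late so the JSON:API payload is already built.
    return [KernelEvents::RESPONSE => ['onResponse', -100]];
  }

  public function onResponse(ResponseEvent $event) {
    $response = $event->getResponse();
    $contentType = (string) $response->headers->get('Content-Type');
    if (strpos($contentType, 'application/vnd.api+json') === FALSE) {
      return;
    }
    $payload = json_decode($response->getContent(), TRUE);
    // Simplified: only handles a single resource object, not collections.
    if (isset($payload['data']['attributes']['mail'])) {
      $payload['data']['attributes']['email'] = $payload['data']['attributes']['mail'];
      unset($payload['data']['attributes']['mail']);
    }
    $response->setContent(json_encode($payload));
  }

}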

Our approach to the front end

With our back end sorted out, it was time to get to the front end. When going decoupled, the front end suddenly becomes a whole lot more interesting than a standard Drupal theme with Twig templates, Sass stylesheets, and jQuery. We're in a whole different realm now with new possibilities. We were 100% on board with having better integration between design and development, where the end result of the design phase was more than static assets and a one-hour handoff. Here's a breakdown of our front end approach.

A new strategy for consistency between design and development

In the past, creative exercises would produce static assets such as style guides to try to capture the design principles of a web property or brand. While this concept is rooted in good intentions, in practice it never really worked well as a project evolved. It's simply too difficult to keep design and development in sync this way. For web development, it's now understood that the code, not a style guide, is the better source of truth, but we still need ways to involve our designers and have them contribute their knowledge of design and UX to the code that is being developed. After all, if you leave these decisions up to developers, you're not going to end up in a good place (and most developers will readily agree with that sentiment). Luckily for us, applications like Figma, Sketch, and Adobe XD are leading the way to bridge this gap. While this is still very new territory, there is a lot of exciting progress happening that is enabling the creation of robust design systems that act as a rulebook and single source of truth: easily updated, modular, and readily available for application development.

After an internal study of available design technologies, we settled on Figma for our design system creative work. Figma is a vector graphics editor and prototyping tool that runs in the browser or as a desktop application. The team developing it has done an incredible job pushing the envelope on how creative and development can work together. With Figma, we found it was possible to accomplish the goals described in the sections that follow.

Extracting design tokens into our application

With our design tokens (the named values for things like colour, spacing, and typography that a design system is built from) living in Figma, we still needed a way to bring those static token values into our development process. This is where Figmagic comes in. Figmagic is a nice tool for extracting design tokens and graphic assets out of Figma and into our application. Once set up, all we need to do in our local development environment is run yarn figmagic. This connects to Figma and pulls down all of our defined tokens into an organized set of ready-to-use files. From there, we simply import the file we need into our component and use the values as required. If a value needs to change, that change is made in Figma. Re-running Figmagic brings in the updated value, which updates any component using it. It works like... magic.

Building reusable front end components

Going decoupled meant breaking away from Drupal's Twig templating and theming layer. For our front end application, we decided on React as the replacement. React continues to be very popular and well liked, so we didn't see any technical advantage to picking anything else. This change brought some additional challenges in that, as a Drupal development company, our focus had not been React. We undertook some initial training exercises to get the development team on this project up to speed with the basics. Between this team, our CTO, our CXO, and a senior technical lead, we quickly trained up and settled on a suite of tools to achieve our desired outcome of a component-driven React front end.

Component UI frameworks to increase velocity

For us, the approach to developing the many React components we needed was to first settle on an established framework. Just as Bootstrap is to HTML/CSS development, several React-based frameworks are available to use as a foundation. It's always debatable whether this is necessary, but we felt it was the best way for us to get started as it greatly increases the development velocity of the project while also providing good reference material and documentation for the developers creating our components. We researched several and tried a few, but eventually found ourselves leaning towards Material UI. There were several reasons for this decision, but mainly we like the fact that it is an established, proven framework that is flexible, well documented, and well supported. With Material UI (MUI), adjusting the base theme using our design tokens from Figma already gave us a good start. Its suite of pre-built components allowed us to make minimal changes to achieve our design goals while freeing up time to focus on our more unique, custom components. This framework will continue to be of great benefit in the future as new requirements come up.

Previewing design system components without Drupal

Without a standard Drupal theme available, we needed somewhere to preview our React components for both development and UX review. The whole point of a decoupled front end, and of the design system, is that the front end should be as back end agnostic as possible. Each component should stand alone. The front end is the front end, and the back end could be anything. Since we hadn't yet connected the front end components to Drupal's JSON:API, we still needed an easy way to view our design system and the components within it. For this, we used Storybook. Storybook was made for developing and previewing UI components quickly. As a developer working locally, I can install and spin up an instance of Storybook with a single command within our project repo. Storybook then looks at our project folder structure, finds the story files we include alongside every component we develop, and generates a nice previewer where we can interactively test out component props and create examples that future developers may need when implementing a component. Storybook is our design system library and previewer as well as our developer documentation. When we are done developing locally and push code to our repository, our deployment pipeline builds and deploys an updated Storybook preview online.

Bringing it all together

At this point, we have our back end and API established. We also have our front end component library created. Now we need to bring it all together and actually connect the two.

Matching the Drupal UI to our React components

React components are standalone UI elements, and their behaviour is configured through props. So, for instance, a Button component could have many variants of how it looks. A button could include an icon before or after the text, or it could be used to open a link or a dialogue window, and so on. Even a simple component like this might have a whole bunch of different props that can be used in different ways. To be able to control those props through the Drupal UI, we decided to use the Paragraphs module. A paragraph in Drupal is essentially a mini content type for a specific use. In Drupal, we created paragraphs that matched the functionality we needed from our React components so that the Drupal UI would, in the end, be setting the component props. This is a bit confusing to set up at first and probably won't be nailed down on the first try, but in the process of setting up the Drupal UI in the context of the React components, you really start to see how the connection between the two is made. At this point, it's pretty easy to see if a component is missing any props or needs to be refactored in some way.

We also went with Paragraphs because we wanted to give our content creators a more flexible way of creating content. A standard Drupal content type is pretty rigid in structure, but with Paragraphs, we can give content creators a set of building blocks that can be grouped and combined in many different ways to create a wide variety of pages and layouts in a single content type. If you’re not familiar with this style of creating content in Drupal, I suggest you take a look at this post I wrote a while back. Look at that and imagine React components rendering on the page instead of Drupal templates. That’s essentially what we’re aiming to do here.

An argument against Paragraphs is that it could potentially make page building more time-consuming given that you don’t have that typical rigid structure of a content type. This is especially true if pages on a site being built typically all follow a similar format. In this case, we still like to use Paragraphs anyway but we will typically build out unpublished pages that act as templates. We then install the Replicate module which lets content creators clone the template and use it as a starting point. This makes creating pages easy and consistent while still allowing flexibility when needed. Win-win.

Taking the Next.js step

Since we settled on React, we started our front end project as a standard React build using the create-react-app tooling. But after some additional investigation, we decided that a better option for our use case would be the Next.js framework. Given that the corporate site is, at its core, a marketing and blogging website, Next.js allows us to still build the front end in React while giving us some extra capabilities this type of website needs.

One of these capabilities is server-side rendering, which we found increases the performance of our application. This also has SEO benefits because pages are rendered server-side and sent to the client, as opposed to the standard client-side rendering most React applications use, where the client machine handles processing and rendering.

Component factories are key to interpreting the API data

We were also attracted to the structure and organization of a project using a framework like Next.js, as well as its dynamic routing capabilities. This significantly simplified the code required to implement our decoupled menus and page routing. Using the dynamic routes provided by Next.js, we could create "catch-all" routes to handle the ever-growing content in the CMS. By querying the Drupal API, we can get all the available path aliases for our content and feed them to Next.js so it can pre-build our pages. It can use the path aliases to resolve the content specific to a page through the Decoupled Router module, which then feeds that data to a component factory that decides which high-level React page component is needed to build the page. For example, if the data is for a basic page content type, the page factory component returns our Page component; if it's a blog post content type, it returns the Blog Page component; and so on. Each of these components knows how to handle the specific data fed to it, but the factory component is what connects the data to the right page.

The way Paragraphs can be added to a page to compose a piece of content created a challenge of its own: reading that data and returning the right component. We decided to create another factory component for this. Similar to the page factory component, this one takes in the paragraph data and determines which React components are needed to render the content.

What’s next in our decoupled adventure

Believe it or not, what we're building is still a work in progress. While we have most of the tech requirements figured out, we're still actively putting the final pieces in place and fine-tuning the back-end UI and module requirements. All in all, we're feeling really good about this progress and can see the value of what we're building both for ourselves and for the clients we serve. In our opinion, based on experience and research, decoupled is the future, and we're excited to walk this path.

With that said, even once our own decoupled site has launched, we still have a lot of work left to do. For one, we still need to get a method in place for managing Drupal blocks and regions. More factory components will be critical here to interpret that data. Given that our clients are primarily ecommerce retailers in one form or another, we still have the whole ecommerce side of web development to sort out, including product catalogues, add-to-cart forms, carts, wish lists, checkout flows, etc. The list is long. However, what we have is a framework that can be repurposed and added on to. It finally helps tie together many disciplines of web development, both creative and technical, that are traditionally siloed. It’s also blazing fast and just plain cool. This is an investment not only in our own online business front but in something greater that we hope to share with others through work, community and general knowledge.

Thanks for taking the time to read this. If you want to chat about this initiative further, drop us a line.


Apr 07 2021
hw
Apr 07

I am going to keep today’s DrupalFest post simple and talk about the API to access content on drupal.org. The Drupal.org API is a public API that allows you to access content such as projects (modules, themes, etc), issues, pages, and more. The API returns data as a simple JSON structure and has only limited features with regards to filtering and gathering nested data. In this post, I will describe more about this API and various ways to access it. It has been a tough day and it is difficult for me to write long posts or go deep in my thoughts. Regardless, I hope this quick post still proves useful.

API basics

The base endpoint to access the drupal.org API (henceforth, the d.o API) is https://www.drupal.org/api-d7/. In fact, you can probably convert any canonical URL on drupal.org to its API equivalent: simply prefix the path with "api-d7" and suffix it with ".json". By canonical, I mean URLs that use Drupal's internal paths, such as node/2773581 or user/314031. These endpoints return JSON responses which are practically the same as Drupal's internal representation. This means you will notice odd field names (almost all fields begin with "field_") and nesting that you might not expect from other APIs. If you have programmed for Drupal, chances are you will feel right at home with the response data structure.

The APIs that return listings of entities are simply named, such as node.json or user.json (and so on). The listing endpoints accept a variety of filters and allow pagination with simple query parameters. Most field names can be used directly as query parameters to filter the list on that field's value. For example, https://www.drupal.org/api-d7/node.json?type=project_issue would return all the issues on drupal.org, whereas https://www.drupal.org/api-d7/node.json?type=project_issue&field_project=3158507 would return only the issues in the preloader project (preloader's node id on drupal.org is 3158507).
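
As a rough sketch of how little code this takes, here is how you might fetch that filtered issue list in plain PHP. The "list" key used below reflects how the listing payload is typically structured, but treat it as an assumption and check an actual response.

<?php

// A canonical URL like node/2773581 becomes
// https://www.drupal.org/api-d7/node/2773581.json.
// Listing endpoints take filters as query parameters:
$url = 'https://www.drupal.org/api-d7/node.json?type=project_issue&field_project=3158507';

$data = json_decode(file_get_contents($url), TRUE);

// Results are paginated; this only prints the first page of issues.
foreach ($data['list'] ?? [] as $issue) {
  echo $issue['title'], PHP_EOL;
}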

Read the documentation page for examples and more details such as field names and values.

Shortcomings in the API

As you might have surmised from the description above, this API is not designed for general consumption. The Drupal Association maintains it on a best-effort basis, and more sophisticated use cases (such as gathering nested data in a single request) are not supported. There is a plan to improve the API on the whole in the future, but I don't know when that might happen.

Practically speaking, this means that you have to make multiple API calls if you want to collect all the information about any entity. The first API call would be for the main entity you want information about, and then you have to parse it to gather all the referenced IDs and make API calls for each of them. If you wanted to build an efficient parser that needs to deal with a lot of nodes, you would probably have to persist all of this information in your application (which is what I did with DruStats).

The problem is not limited to normal relationships such as terms and users; it extends to entity reference fields too. For example, if you want to find out a user's organization, you have to read the "field_organizations" property and make a request to the "field_collection_item" endpoint with that ID. In a normal consumer-grade API, you would probably expect the information to be embedded right in the user response.
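
Here is a minimal sketch of that extra round trip in PHP. The field names come from the example above, but the exact shape of the reference values (an "id" key on each item) is an assumption, so verify against a real response.

<?php

// First call: the user entity.
$user = json_decode(file_get_contents(
  'https://www.drupal.org/api-d7/user/314031.json'
), TRUE);

// Second round of calls: resolve each referenced field collection item.
$organizations = [];
foreach ($user['field_organizations'] ?? [] as $reference) {
  $organizations[] = json_decode(file_get_contents(
    'https://www.drupal.org/api-d7/field_collection_item/' . $reference['id'] . '.json'
  ), TRUE);
}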

Using the API in your code

The API endpoints are straightforward and you can request data with a simple curl request or just the browser. However, if you are writing an application, you might get frustrated dealing with the filters and nested queries. This is where libraries come in.

The d.o API documentation page lists two such libraries at the end. The first one listed is by Kris Vanderwater (EclipseGc) and it seems simple enough to use. The second one was written by me at the time I was building DruStats. I needed something more sophisticated and decided to write my own API wrapper. Since then, I have also used this library for Contrib Tracker, a project which tracks contributions to drupal.org by multiple users. The library is reasonably documented on its GitHub page, but improvements are always welcome. You can also look at examples in DruStats and Contrib Tracker. I am currently in the process of moving Contrib Tracker to a new home, so I am not linking to the current URL right now. I am also planning a post on Contrib Tracker soon.

Using the CLI tool

Matt Glaman has written a CLI tool to interact with drupal.org issues. If you only want to automate your Drupal contribution workflow, this CLI tool might be what you need. It lets you simplify working with drupal.org patches, create interdiffs, and even watch CI jobs. As always, the documentation on GitHub can guide you through the installation and usage of this CLI tool.

Apr 06 2021
hw
Apr 06

I recently opened up a spreadsheet where people can put in their ideas of what I can write about in this DrupalFest series. Someone suggested the topic of what advice I would give my younger self. The idea intrigued me and I thought I would make an attempt at writing down advice to new Drupal developers. I am not very comfortable presuming that someone would want to take advice from me, so I am going to say what I would want my younger self to know. I am going to temper my thoughts to suit the state of the industry today; there is no point talking about how the world looked from the point of view of 2007. So, let's get started.

You don’t have to write all the code

One of my first non-trivial programming tasks was writing an x86 Assembly library for graphics manipulation and computation, callable from QBasic. I started off programming with GW-BASIC and QBasic and, in time, realized they were quite slow for doing real work. That's when I learnt Assembly language and wrote the library. This started me off on a path where I preferred to write all the code myself on top of a programming language and not rely on frameworks and libraries. I stayed on this path for about a decade.

I eventually learned that to be productive and actually grow, I should stop thinking of code in terms of purity and use what is out there. Other people are talented too and they have gone through the same problems you are going through. Learn from their mistakes and their work and build upon that. I realized much too late (in hindsight) that I was letting perfect be the enemy of good and staying stuck.

Case in point: I found out about Drupal in 2007 from someone, and I thought, why would I use it when I can write (and have written) multiple CMSes? The person who introduced me to Drupal (Drupal 5 at the time) was talking about how you can build websites in a matter of hours and use Views to build a query with a UI (I hated that idea). He even said that Drupal 6 was even better and was just about to be released. Even after this, I only gave in and built my first Drupal site at the end of the Drupal 6 period. That is still the only Drupal 6 site I have built.

There's another story. I wanted to start a blog and was waiting to perfect the blogging CMS I was building. It had everything I wanted, but I kept finding more things to add, and that became an excuse not to start blogging. I realized this one sleepless night at 2 AM when it hit me; that's when I wrote my first (second) blog post. Here's the actual first.

But do write the code to learn

Yes, I waited a long time to start using other people’s work because I wanted to do it myself. But doing it myself taught me a lot and I encourage you to go through that as well; just not at the expense of getting work done. When you find time, pick one hobby project that you want to do, just one, and do it from scratch.

Learn something not related to what you want to learn

This is something I did without realizing it, so my younger self hardly needs the advice, but I will document it here anyway. Learn a lot. Don't let go of an opportunity to learn. I cannot stress enough how much this has impacted me. It was only a feeling until I read more about it in David Epstein's book "Range: Why Generalists Triumph in a Specialized World". Essentially, learning things which seem unrelated to what you're doing is a great way to develop a wide perspective, which helps you innovate and excel. If you're interested in the mechanics behind this, do read the book.

Within the programming area, I started off with QBasic, Assembly, PHP, .NET, C#, C/C++, Java (in school and college), and staples such as HTML, CSS, and JavaScript (much simpler around that time). Since I started working professionally, I have learned ASP.NET, Windows programming internals (from a .NET perspective), CodeIgniter, Kohana, WordPress, Drupal, JavaScript, Laravel, and many others I can't recollect. More recently, I have started learning Rust, Golang, Flutter, Dart, and technologies such as Docker and Kubernetes. Outside of programming, I often worked with Photoshop, Premiere, After Effects, and CorelDRAW, and generally on printing and video editing projects. Further outside of programming, I used to assemble computers for people and myself and built networks. Going into reasonable depth with all of this helped me understand computers and technology the way I do today.

Also learn something completely outside what you do

It is great to expand your knowledge in the areas around your work, but it is also very helpful to pick up completely unrelated activities (what we call hobbies). I used to be part of a group that put up props and other types of decorations for community events. As a part of that group, I would cut paper, slice thermocol, apply glitter and other materials, and put them up on walls or hang them from rafters and towers. I also used to volunteer as a photographer and as a vocalist (in something similar to a choir group). It may not seem obvious how this helps your career, but the broad range of perspectives you develop here does help. One aspect of it is simply the expanded network you build, which helps you see outside the otherwise narrow world we sometimes confine ourselves to.

Make learning a habit

I read this great book called Atomic Habits by James Clear, which opened my eyes to the relevance and effectiveness of habits. Earlier, I often found myself fighting against myself to do something: read more, exercise more, or just behave differently. I used to put up a brave fight and punish myself when I failed (which I often did). You see, I thought I was being a hero by burdening myself like that, and I was a skeptic when I started reading the book. Not anymore. I now think more about the impact rather than the act, and take calculated steps to learn and build my life with that impact in mind, not my current actions.

Go broad and go deep

I already shared why you should go broad with various technologies, but it is just as important to go deep on the one or two technologies you want to focus on. If you're reading this, one of those things is probably Drupal, and I would argue you should make it PHP. Go deep into how PHP works. What are the new features in PHP and how are people using those features in other frameworks? How does PHP run on a web server and what are some of the alternative models of running PHP applications? How does a PHP application differ from something written in Golang, for example, or in Python?

And learn about programming fundamentals too. What problem does object oriented programming solve anyway? Can you apply functional programming principles in PHP? In Drupal? How does a processor execute a program? What does Assembly look like? How does an Operating System schedule programs for execution and how can it run multiple processes at the same time? How does virtualization and containerization work and how does that affect your code?

I know it seems like a lot, but you don't have to go deep in one dive. Take years if you want to, but you will see benefits within days of starting to learn this.

Drupal is awesome

Now, this seems obvious in hindsight, but it was not clear to me when I started programming. I already said that I was loath to try out frameworks and libraries written by other people (and that included CMSes as well). It was not until I volunteered for a project on which I didn't want to spend much time that I just picked up Drupal. That is what set me on the path of my amazing experiences in the Drupal community.

So, I want to say this straight. Drupal is awesome, both in terms of the code that is written and the people who have written it or helped write it. Note that I didn't say perfect. It wasn't perfect (even when I pretended it was) and it is probably even less perfect now. But it is awesome when looked at from the point of view of the impact it has made in my life and the lives around me. I have written often about this, so I won't go deep into it.

The ecosystem is awesome

As I said before, don't limit yourself to Drupal. Drupal hasn't limited itself to Drupal either, ever since Drupal 8 famously got off the island by adopting practices from the wider PHP community. Learn from other frameworks and languages to determine the best fit. There was a time I used to say that we could build anything in Drupal. That's still true, but it's neither effective nor efficient. Decide whether the code you are writing should be a custom module, a contributed module, a PHP package, or a completely different service outside Drupal. There are a lot of awesome libraries and systems out there, and you should find out how you can use them with Drupal rather than rebuilding them in Drupal.

I feel like I can keep going but I am going to end it here. This was much longer than I thought I would write but that’s the best I can do at this time. If you read it so far, thank you, you’re a star!

Apr 05 2021
hw
Apr 05

This is the fourth post in my DrupalFest series and I am excited to keep it going. I want to write about different tools I am aware of for running quality checks on Drupal code. This will be somewhat similar to my last post where I presented various options but focused on the method I use nowadays.

First of all, what am I talking about? I believe that the code we write is read many more times than it is written. It is read by people other than yourself (and the you of two weeks from now is another person), and they (you) have to understand what is written. Therefore, code must be optimized for readability. Performant code is a must, of course, but not at the expense of readability. When your best-performing code breaks and you can't understand it well enough to fix it, that performance is useless.

Types of checks

One of the low-hanging fruits here is following a consistent code style. Drupal documents its coding style in detail, and Drupal core and contributed modules (most of them, anyway) follow it. True, it's not like PSR-2 or PSR-12; it was developed long before there were PSRs, but if you are working with Drupal, it is a good idea to follow this coding style consistently. Now, you could manually review each and every line of your code to make sure it follows the code style and hold up pull requests over it, but that is not a good use of your time. Hence, tools.

Apart from the coding style, there are ways to prove the "correctness" of your code. Broadly speaking, there are two aspects of correctness: the code is coherent, and the business logic is correct. The coherence aspect can be checked using various static analysis tools; PHPStan and Psalm are two of the popular ones. For verifying the business logic, we write automated tests, and PHPUnit is the de facto standard for that.
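
To make the distinction concrete, here is a tiny, contrived example of the kind of incoherence a static analyzer flags without ever running the code; whether the label should be uppercased at all is business logic, and only a test can verify that.

<?php

/**
 * Returns a display label, always as a string.
 */
function format_label(?object $entity): string {
  if ($entity === NULL) {
    // PHPStan/Psalm flag this: NULL is not a valid string return value.
    return NULL;
  }
  return strtoupper($entity->label);
}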

Tools

Before we get into the Drupal side of things, I'll list the various tools that we use. I won't attempt to document them here (in the interest of time), but you shouldn't have trouble finding their documentation.

  • PHP Code Sniffer – Mainly for code style checks, but it can do more sophisticated checks using rules.
  • Drupal coder rules – Code sniffs for checking the Drupal coding style.
  • DrupalPractice coder rules – Code sniffs for checking Drupal conventions.
  • PAReview.sh – Automated tool that calls PHPCS with the Drupal-related coder sniffs, apart from other checks. This is commonly used to check if your module is contribution ready.
  • PHPStan – Static analyzer for PHP code that can perform various levels of type checks and other correctness checks.
  • Psalm – Another static analyzer with the same basic level of checks as PHPStan, plus some interesting checks of its own.
  • PHPLint – A simple PHP linter which supports parallel checks and better error reporting than just running php -l yourself.
  • Other linters such as ESLint, TwigCS, etc. – Linters for their specific languages.
  • PHPCPD – PHP Copy/Paste Detector. Like the name says, it checks for repeated lines of code.
  • PHPMD – PHP Mess Detector. Analyzes source code for various code metrics.
  • PHPUnit – Test runner for unit, integration, and functional tests.
  • Behat – Test runner for behaviour tests.

Almost all of these tools can be installed via Composer or as PHAR files. Composer is an easy way to get these tools, but they end up affecting your project dependencies. PHAR files have to be installed on your machine (and everyone on the team has to do the same). With both of these methods, you still have to remember to run the tools. That is where the next set of tools comes in.

Automatically running checks

One of the ways to make sure everyone on the team adheres to the checks is by running them in CI. This way, the team immediately knows if a commit is failing checks without someone having to point it out in a pull request. You could again choose to have all of these tools in your composer.json and install them while running your CI, but there is a better way for most cases. DrupalQA is a Docker image which packages almost all of the above tools, so you can just run the commands you want within the Docker container. Almost all modern CI tools provide some way of running tests inside a container, and you should find documentation for doing that with your CI tool. Here is an example of a GitLab CI job definition which uses the PHP 7.4 version of this image.

drupal_codequality:
  image: hussainweb/drupalqa:php7.4
  stage: test
  script:
    - composer validate
    - phplint --no-cache -v web/modules/custom/
    - phpcs --standard=phpcs.xml.dist --extensions=php,module,inc,install,test,profile,theme --ignore=/node_modules/ web/modules/custom
    - phpmd web/modules/custom/ text phpmd.xml

You could even run the Docker image locally but that’s more trouble than it’s worth. There is a better way in any case.

Have your team run checks automatically

CI is fine for running these checks, but what if you want to save the time between commit, push, and CI runs? Wouldn't it be better for developers to run these checks on their machines before they commit their code? GrumPHP is a tool which allows you to do just that. Vijay CS has written a wrapper on GrumPHP that provides default configuration for checks related to Drupal. While this is great for general use cases, I wanted to make a highly opinionated one for my team at Axelerant. While it is opinionated, it is still available for anyone to use, and you would install it using Composer this way:

composer require axelerant/drupal-quality-checker

After this, anyone using your project automatically gets git hooks installed (once they run composer install) that run a few Drupal-specific checks every time they commit. You can modify the default configuration and add or remove any tools you wish; just follow the GrumPHP documentation. Of course, more documentation is also available on the GitHub page for axelerant/drupal-quality-checker.

GrumPHP recently made a lighter package available called GrumPHP Shim. This provides the same set of features but installs a PHAR file instead of a regular Composer plugin. This has the benefit of keeping your project dependencies free of GrumPHP's dependencies, reducing the chances of conflict. Axelerant's version of drupal-quality-checker uses the shim for this reason.

I think I have covered all the tools I am aware of and could recollect in the span of a couple of hours today. I'm sure there are tools missing here, but I am going to call this DrupalFest post done for now. If I am missing something obvious (or even niche), please point it out in the comments so that I can revise or write about it in a separate post.

Apr 04 2021
hw
Apr 04

PHP 7.4 introduced the concept of preloading classes (files) on server start-up into the PHP opcache. This gives us performance benefits for sites that tend to load a lot of files with every request; something that Drupal is known to do. A properly configured web server would have opcache (opcode cache) enabled anyway, but preloading brings in a modest performance boost on top of that.

PHP opcache is designed to cache the opcodes of a PHP file so that the file does not have to be reinterpreted with every request. The bytecode (opcode) is cached in shared memory in RAM which means that the cache lives only as long as the PHP process is running. When the opcache is enabled, PHP caches whichever files are loaded during execution and only recompiles the file if it has changed (this setting can be disabled for an additional performance boost).

Preloading works in a similar way, except that you write a script that loads the files you want to cache. The path to this script is set in PHP's opcache.preload setting, and PHP executes it every time the server starts. Since this script runs with the server, you have to make sure that any errors are handled properly; otherwise, the server would throw an error and quit. Now, you may want this script to load all the files in your application, but the opcache memory size is limited. This means that you should only preload files that are required by most requests. In many PHP applications, the list of preloaded files may even be hand-written to get the best ratio of memory usage and throughput.
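
For illustration, a hand-written preload script can be as simple as the sketch below; the specific file paths are only examples, and in practice the list is what you would want to generate (which is exactly what the module described next does).

<?php

// Hypothetical preload script, referenced from php.ini as:
//   opcache.preload=/var/www/html/preload.php
//   opcache.preload_user=www-data

$files = [
  '/var/www/html/web/core/lib/Drupal.php',
  '/var/www/html/web/core/lib/Drupal/Core/DrupalKernel.php',
  // ... the rest of the "hot" files used by most requests ...
];

foreach ($files as $file) {
  if (is_file($file)) {
    // Compile the file into opcache without executing it. Guarding each entry
    // keeps one bad path from aborting the server start-up.
    opcache_compile_file($file);
  }
}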

Preloading with Drupal

Now that we understand how preloading works, let’s see how it can be used with Drupal. Writing the preload script by hand is difficult as Drupal is a dynamic system. Even if you write a preload script for all the Drupal core files, it may be your contrib modules that are a better fit for this. This is the reason I have written a Drupal module called preloader to make this easy for you.

The module uses a PHP package called darkghosthunter/preloader which does most of the heavy lifting. The package checks the current opcache usage and generates the preload script. The Drupal module brings in Drupal-specific customization (removing files that shouldn’t be preloaded) and a user interface to generate the file. You still have to manually add the preload script to your PHP settings for the changes to take effect. Still, the difficult task of generating the script file listing all the important files is made easy for you.

The workflow is as follows:

  1. Install the module using composer as you normally would.
  2. Configure the module to ignore any additional directories or files if you want.
  3. Restart the webserver and then load some of the pages that are frequently hit on your site.
  4. Go back to the module configuration page and generate the script.
  5. Set the opcache.preload setting in your php.ini file to the generated script path.
  6. Restart your webserver and test.

Gotchas

Since the preload script is executed at the server start and the cache is permanent as long as the process is running, if you change any of the preloaded files, you have to restart the server. For this reason, it is not a good idea to use this functionality during development. The module may remain enabled but the opcache.preload setting should not be set on the developer machines.

Performance impact

In my early tests, I saw an average of 10% improvement with preloading enabled across different percentiles. This test is quite old and I should repeat this on a better machine but the result should still be indicative. Hopefully, I will be able to test this module again this month and even work on some of the issues that have been reported. I will share screenshots or even a video of the test when possible.

This is it for today’s DrupalFest post. See you tomorrow, hopefully.

Apr 03 2021
hw
Apr 03

This post will cover quickly setting up a Drupal website for testing, experimentation, or evaluating features on your local system. While I cover a different set of options briefly, I will mainly talk about a tool we have built to let us quickly scaffold Drupal sites with best practices built in. This post is a part of the DrupalFest series which celebrates 20 years of Drupal. Let’s get started.

Before I get to the main part of the article, let us look at the options you have to quickly set up a Drupal website for evaluation. While all of these ways are effective, each method comes with its own trade-offs. We'll start with the simplest way to experiment with a Drupal site and then keep going up in complexity, but also in the flexibility you get.

Someone else’s machine

The easiest way to set up a Drupal site is in the Cloud, aka, someone else’s machine. Broadly, there are two ways I can think of to do this. The first method involves one of the excellent Drupal hosting providers and signing up for a free account. Drupal.org lists some of the hosting supporters who offer a free account to try out Drupal. Go to this link to get started: https://www.drupal.org/try-drupal.

While the above method lets you try out not just Drupal but also a hosting platform where you might actually run your website, you may not want to create an account just to try out Drupal. In that case, there is SimplyTest.me. This site provides a time-bound sandbox environment where you can quickly try out Drupal or one of the Drupal distributions, with or without modules and themes, and even apply patches. The sandboxes are available for 24 hours and give you complete control of the Drupal site through the UI. You don't get command-line or SSH access this way, but for quickly testing out a module or a distribution, this is the easiest solution out there. You don't need an account. Just pick the modules or the distribution and you have a sandbox ready in moments.

Your machine

On your machine, you have a little bit more control over the site but you do need the software required to run Drupal. If you’re only interested in quickly evaluating Drupal on your machine and still have access to the files and command line, the Drupal evaluator guide gives you an option to run this with just PHP. More details and instructions are available at the link: https://www.drupal.org/docs/official_docs/en/_evaluator_guide.html.

The problem with the above method is that it won't run Drupal in an environment reasonably similar to where you would host it. This is good for evaluating Drupal and quickly editing a few files, but the experience of working with the PHP built-in server and SQLite would not help development. To take it a step further, you need Docker as a prerequisite.

If you have Docker and you already have a Drupal codebase downloaded, then tools such as Lando and DDEV would help you get started in no time at all. If you want to take it a step further, keep reading to the next section.

Finally, you have the classic method of installing Drupal on your machine by manually installing all the software required to run Drupal (that’s PHP, Apache/nginx, MySQL/MariaDB, etc). I don’t find many developers do this anymore and it is more trouble than it is worth. If you can’t run Docker for any reason, consider running it in a virtual machine using the excellent DrupalVM.

Axelerant’s template tool

Now to the main section of the post. At Axelerant, we wanted to automate the entire process of setting up a Drupal site quickly, along with the codebase, Lando configuration, some of the best practices we follow, and even CI configuration. I wrote a tool for that called axl-template, which is available via pip (yes, it's written in Python and needs at least Python 3.6). Once the tool is installed, you can set up a Drupal site along with all the configuration I mentioned in a matter of minutes. I have measured the time from running the first command to having Drupal running, and it comes in at minutes (the longest part being the Drupal installation itself). Here's a somewhat old video where I use this tool to set up Drupal 9 with Lando.

[embedded content]

I have added new features to this tool since I made the video, the most notable feature being able to specify modules and packages right in the first command. The idea is to have your codebase created from a single command and ready to run. I am planning to create a new video to cover these and many other features added to the tool. For now, the documentation is a good reference. Read more at https://github.com/axelerant/axl-template.

If you're averse to installing a Python package through pip and don't want to get it from the source code either, you can run it via Docker using whalebrew. Instructions for this are in the README file, but only one of the commands is supported in this way.

We’re continuing to improve this tool by adding support for more common development tools that are typically used. One example that comes to my mind right now is Acquia’s BLT. I can’t promise when that support will come out but it’s on the cards. In any case, this package is released under MIT license and contributions are very welcome.

This is it for today and I am just completing this post with less than four minutes to spare to midnight. Have a good DrupalFest everyone.

Mar 30 2021
Mar 30

Rain logo updated

The best open source distribution for Drupal just got better! The latest versions of Rain University and Rain CMS now ship with Layout Builder pre-configured to make page building faster and easier. So how does it work? Check it out below!

Editing Layouts

Now, when you navigate to any page with Layout Builder enabled, you can edit the layout by clicking on the "Layout" tab under Tasks. Alternatively, you can click on the same tab while editing a page.

Rain editor experience with layout builder
Rain CMS homepage

Rearranging Blocks

With layout builder you have an instant preview of any blocks added to the page. That being said, it’s usually easier to move blocks around with preview turned off. Drupal provides a checkbox that makes it simple to toggle preview on or off.

Rearranging blocks in Rain CMS

Adding Blocks

To add a block to the page, click the "Add block" link in any section. Rain CMS ships with 15 block types out of the box that you can easily drop onto the page. Each component has a preview wireframe and a label to help the author understand its look and function.

Adding blocks in Rain CMS

Layout Controls

One of the big benefits of Layout Builder is that you now have more control over the layout of a page. Editors can easily add new sections with various layouts where blocks can be placed. Layouts can be customized per project.

Adding sections in Rain CMS

Rain University CMS

The Mediacurrent team has also updated our RainU CMS to ship with Layout Builder. Same great experience, but tailored specifically for universities.

Rain University homepage layout

Want to Know More?

For developers, you can download and install Rain CMS with Layout Builder using our Drupal project template: https://bitbucket.org/mediacurrent/drupal-project/src/9.x/. Setup and installation remain the same, with detailed instructions provided in the project README file.

We are also happy to demo Rain University or Rain CMS for organizations interested in partnering with Mediacurrent for your next redesign project. To schedule a free demo, please visit our contact page or chat with us.

Mar 19 2021
Mar 19


DrupalCon North America 2021 is right around the corner! Check out the ever-growing schedule of sessions, industry summits, and special events on the official event site, and mark your calendar for these Mediacurrent sessions. 

Our Sessions 

The Mediacurrent team is proud to support this community event as a platinum sponsor. We’ll be presenting several sessions at this year’s online conference.

Whether you’re a site builder scaling up with multisite, a marketing leader in search of current guidance on open source security, or a Drupal community member of any kind looking for inspiring real-world case studies, we’ve got you covered. 

Here’s what we have in store for sessions and case studies in Drupal innovation:

Unlock The Power of Multisite 

DrupalCon program speaker card with Jay's headshot

Join Jay Callicott, Mediacurrent’s VP of Technical Operations, for a comprehensive approach to manage your Drupal sites at scale.

Interested in evaluating multisite options for your organization? Jay will cover several ways to scale your Drupal platform from one site to many dozens or even hundreds.

Register here to join the session and learn best practices for governing multiple sites from one codebase, how to configure a multisite installation, and considerations for your hosting solution. 

Open Source Security for CMOs

DrupalCon program speaker card with Mark and Krista's headshots

As open source software continues to become widely adopted, adhering to security standards is becoming more challenging. So what's a CMO to do?

Inspired by our ebook, The CMO’s Guide to Open Source Security, this session will help you navigate the terminology, expectations, and tools to ensure security is a priority for your web properties.

You can register here to join the session led by Mediacurrent’s resident Drupal security experts Mark Shropshire and Krista Trovato.

Case Study: Habitat for Humanity 

Imagine a world where everyone has a decent place to live. That’s the vision fueling Habitat for Humanity to create ambitious digital experiences with Drupal. 

This session will present a case study covering how Drupal is being used to bring mission-driven innovation to reality for this international nonprofit. Both Drupal site builders and non-technical roles are encouraged to attend. 

You can register here to join the session led by Mediacurrent Project Manager Vicky Walker and two members of Habitat for Humanity's web team.  

Drupal for Higher Education 

The year 2020 called for higher ed leaders to accelerate digital marketing strategies. For many, Drupal was a key part of the equation. This rang true among a spectrum of Mediacurrent’s higher education partners, including an Ivy League university that chose a decoupled architecture for its breakthrough knowledge platform.

Dan Polant, Director of Development at Mediacurrent, will share that story in a co-presented session at the Higher Education Summit. The session will explore the University’s driving mission to build toward a brighter financial future on a Drupal and React-based platform.

Join the session on April 20 at the Higher Education Summit.

Connect with Us

There's more to come! Check back for Rain CMS demos, Drupal 9 info sessions, giveaways, and more coming soon to our DrupalCon 2021 event page.

Mar 11 2021
Mar 11

Chris Zietlow from Mindgrub gave his new talk on AWS: How an Online Retailer Came to Conquer the Internet. He explores the genesis of Amazon Web Services, how it became widely adopted, and gives a bird's-eye view of some of the more common problems its services can solve.

[embedded content]

If you would like to join us, please check out our upcoming events on Meetup for meeting times, locations, and remote connection information.

We frequently use these sessions to practice new presentations, try out heavily revised versions, and test new ideas with a friendly audience. So if some of the content of these videos seems a bit rough, please understand that we are all learning all the time, and we are open to constructive feedback. If you want to see a polished version, check out our group members' talks at camps and cons.

If you are interested in giving a practice talk, leave me a comment here, contact me through Drupal.org, or find me on Drupal Slack. We’re excited to hear new voices and ideas. We want to support the community, and that means you.

Mar 03 2021
Mar 03

This post is an updated part of our Marketer's Guide to Drupal series. This guide will walk you through considerations for choosing an open source CMS, plus case studies and CMO advice to bring your site to the next level.

Supercharge SEO with Drupal

“Over the last 20 years, Drupal has grown into one of the largest enterprise content management systems in the world.” - Drupal Founder Dries Buytaert

Along with the growth of Drupal, the marketing landscape is vastly different now than it was 20 years ago. The techniques once used to reach the top of the search engine results no longer work, so marketers need to understand the powerful role a CMS like Drupal can play in building a successful SEO strategy.

The release of Drupal 8 marked a significant stride in giving content authors more control. That focus continues to evolve with Drupal 9. Notable editor-focused features include a templating engine called Twig for component variety, an API-first foundation to “write once, publish everywhere”, and an editor-friendly content block authoring experience. Layout Builder additionally provides a "drag and drop" module that lets editors add components to pages and add pages to the site with no code. Now, Drupal 9 has even more upgrades for marketers, tech-savvy or not.

This shift places the marketing team in the driver’s seat more often and allows them to get involved in the CMS decision. In this post, we’ll outline some ways you can up your SEO game with Drupal.

Traditional SEO is Dead

No longer will well-placed keywords alone get you to the top of the SERP ranks. Content is still King in the world of marketing, and it’s what helps you improve your SEO.

Every algorithm change Google has made has one thing in common: it aims to provide the best content based on what it "thinks" the user is trying to find. In other words, the user’s intent. If you want your rankings to stick, don't try to cheat the system. Attract your prospects with informative, entertaining pieces that they can use to take action. And avoid no-value posts that are keyword-stuffed with your industry and the word "best" 100 times. Google can see through it and so can all of your users.

That said, there are a few other factors that are critical to keeping your rankings high that can’t be ignored, including quick load times and mobile-friendliness. Drupal 9 is built with several of these factors in mind to help us make needed improvements quickly and effectively.

Mobile-First Mentality

Drupal 9 is created with responsive design capabilities built-in, so you can begin to address many problems immediately. That’s not to say all of your responsive problems will be solved. Content editors still need to think through their content and imagery, and themers will still need to do configuration to establish things like breakpoints. But Drupal 9 will set you on the right path, giving you and your team many of the tools you need.

You’ll also have the option to choose different images and content for desktop and mobile versions right from the WYSIWYG editor, making it easier to see the differences for every piece of content when you add it and before you publish. This means a solid visual of both versions in real-time for faster publishing and peace of mind knowing exactly what your users experience on any device. 

The Need for Speed

Another big factor that can affect your rankings is speed, on both desktop and mobile. Google places such high importance on it that they provide a PageSpeed Insights test to show where and how your website is slowing visitors down. Drupal 9 is "smart" in that it caches all entities and doesn't load JavaScript unless it has to. This means the same content won't be reloaded over and over and instead can be served quickly from the cache. It also uses a feature called velocity, which makes creating and publishing new dynamic content experiences significantly faster than in Drupal 7 and older Drupal versions.

Responsive design is a must-have in today’s digital landscape and speeding up your website on both desktop and mobile is a surprisingly effective way to contribute to your SEO efforts. In short, if your marketing team is focused (as you should be) on top rankings, Drupal 9 provides many of the tools to make that happen. 

Accessibility = Key for Search

Drupal 8 spurred an overall commitment to accessibility from the community, and the release of Drupal 9 brought another big push toward improving web accessibility.

This is important because, as we know, the relationship between web accessibility and SEO is closely intertwined. Although accessibility is not actually a Google ranking factor (yet), improving accessibility on your website has a high chance of improving your SEO rank.

SEO Friendly Modules for Drupal 9

There are thousands of modules available for Drupal 9, many of which are perfect for marketers. Whether you’re looking to try out something new or to find something that fits what you already know, you have your pick. Here are our favorite SEO modules to use when optimizing your site:

  1. Metatag - allows you to automatically provide metadata, aka "meta tags", that help search engines understand your content. This must-have module offers over 300 different meta tags for different purposes, so take a look and find the right ones for your site.
  2. Two Metatag submodules that we highly recommend are Twitter Cards and Open Graph. Connect your site to Facebook, LinkedIn, Slack, Twitter, and other social platforms and control how links will look when shared on them.
  3. Schema.org Metatag - provides a way of adding some of the hundreds of standardized metadata structures from the international schema.org project on a site, making it easier to clearly define metadata that Google et al can use to more accurately understand your site’s unique information.
  4. Pathauto - helps save you time from manually having to create URL path/aliases as new content is created.
  5. Sitemap - provides a site map that gives visitors an overview of your site. It can also display the RSS feeds for all blogs and categories.
  6. Redirect - Almost every new site needs to incorporate 301 redirects for old page URLs. This gives site admins an easy interface for creating those redirects in Drupal.
  7. Google Analytics - this simple module allows site admins the ability to easily configure Google Analytics in Drupal.
  8. Easy Breadcrumbs - uses the current URL (path alias) and the current page's title to automatically extract the breadcrumb's segments and its respective links.

Thankfully, because Drupal is open source, you’re not out of luck if you can’t find a module that works for you. There are many options for creating a new one that does, from building it yourself to enlisting help from a Drupal team like Mediacurrent.

Visualize SEO Success with Siteimprove

In addition to Drupal's SEO-friendly modules, an SEO optimization tool like Siteimprove can…

  • Give content editors information about their technical SEO so they can make more informed decisions
  • Help you understand how SEO and content intersect in your overall strategy
  • Flag potential issues before your content is published
  • Provide insights about the SEO impact of unpublishing a page

The Siteimprove module works directly in Drupal, giving editors access to these insights while they’re adding content. This means no more waiting to fix it post-publish. It is important to correctly set up Siteimprove in order to get the most out of it and effectively transform your strategy into a workable roadmap for your site.

SEO and Beyond

Drupal’s content management system is perfectly structured for search optimization and its core features support many of the critical SEO elements. But features only take you so far on their own. To manage and continuously improve your SEO, consider a dashboard like Siteimprove that you can customize to show you just how the data is processed and how it impacts your business's goals.

Setting those custom data points and interpreting the data that comes in can be time-consuming and difficult, so if you need any help, our team of Siteimprove certified experts can apply our knowledge to configuring your dashboards and making sense of the data. Get started with a Siteimprove tune up.

Feb 27 2021
Feb 27

Part of contributing to any open source project, or really of being a contributing member of any community, is sharing what you know. That can come in many forms. While many projects over-emphasize code, and most of us understand the value of conference talks, good how-to articles are some of the most critical contributions for any software platform. There isn’t much point to a tool if people cannot figure out how to use it.

Why do I write how-to articles

I’ve contributed code to Drupal, some of it even good and useful to others. But usually when I hear that someone noticed something I created, it’s a blog post about how to solve a problem.

When I’ve struggled to find the answer to a question, I treat it as a candidate for a how-to post. I am not so creative that I am often solving a problem no one else has, or will ever want to, solve for another project. And I am good enough at what I do to know that if I struggled to find an answer, it was probably harder to find than it could have been.

That helps me find topics for articles that are helpful to the community and benefit me.

How-to articles help others in the community use tools better

The goal of a good tutorial is to help accelerate another person’s learning process. The solution does not have to be perfect, and I know most people will have to adapt the answer to their project. I write them when I struggled to find a complete answer in one place, and so I’m hoping to provide one place that gives the reader enough to succeed.

Usually I combine practical experience earned after digging through several references at various levels of technical detail – including things like other people’s blog posts, API documentation, and even slogging through other people’s code. I then write one, hopefully coherent, reference to save others that digging and extra reading.

The less time people spend researching how to do something, the more time they have to do interesting work. Better yet, it can mean more time using the tools for their actual purpose.

How-to articles serve as documentation for me, colleagues, and even clients

The best articles serve as high level documentation I can refer back to later to help me repeat a solution instead of recreating it from scratch. When I first wrote how-to articles I was solidifying my own learning, and leaving a trail for later.

They also came to serve as documentation for colleagues. When I don’t have time to sit with them to talk through a solution, or know the person prefers reading, I can provide the link to get them off and running. Colleagues have given me feedback about clarity, typos, and errors to help me improve the writing.

I have even sent posts to clients to help explain how some part of their solution was, or will be, implemented. That additional documentation of their project can help them extend and maintain their own projects.

How-To articles give me practice explaining things

One of the reasons I started blogging in the first place was to keep my writing skills sharp. How-to articles in particular tend to be good at helping me refine my process in specific areas. The mere act of writing them gives me practice at explaining technology, and that practice pays off in trainings and future articles. If you compare my work on Drupal, Salesforce, and Electron you can see the clarity improve with experience.

How-To articles give me work samples to share

When I’ve been in job applicant mode those articles give me material to share with prospective employers. In addition to Github and Drupal.org, the how-to articles can help a hiring manager understand how I work. They show how I explain things to others, how I engage with the community, and serve as samples of my writing.

How-To articles help me control my public reputation

I maintain a blog, in part, to help make sure that I have control over my public reputation. To do that I need inbound links that help maintain page rank and win other similar basic SEO games.

From traffic statistics I know the most popular pages on this site are technical how-to articles. From personal anecdotes I know a few of my articles have become canonical descriptions of how to solve the problems.

When I first started my current job we had a client ask if I could implement a specific feature that he’d read about in a post on Planet Drupal. It turned out to be mine. Not only was I happy to agree to his request, it helped him trust our advice. My new colleagues better understood what this Drupal guy brought to the Salesforce team. Besides, let’s be honest, it’s fun when people cite your own work back at you.

Writing your own

You don’t have to maintain a whole blog to write useful how-to articles. Drupal, like most large open source projects, maintains public wiki-style documentation. Github pages allow anyone to freely publish simple articles and there are many examples of single-page articles out there. And of course there is no shortage of dedicated how-to sites that will also accept content.

The actual writing process isn’t that hard, but often people leave out steps, so I’ll share my process. This is similar to my general advice for writing instructions.

Pick your audience

It’ll be used more widely than whoever you think of, but have an audience in mind. Use that to help target a skill set. I often like to think of myself before I started whatever project inspired the article. The higher your skill set, the more you should adjust down, but it’s hard to adjust too far, so be careful when aiming for people with far less experience than you have – make sure you have a reviewer with less experience check your work. Me − 1 is fine; Me − 5 is really hard to do well.

Start from the beginning and go carefully step by step

Start with no code, no setup, nothing. Then walk forward through the project one step at a time, writing out each step. If you gloss over a detail because you assume your audience knows about it, add reference links. You can have a copy of a reference project open, but do not use it directly; it’s there to save you from having to re-research everything.

List your assumptions as you go

Anything that you need to have in place but don’t want to describe (like installing Drupal into a local environment, creating a basic module, installing Node, etc.), state as an explicit assumption so your reader starts in the same place you do. Provide links for any assumptions that are likely to be hard for your expected audience to complete. This is your first checkpoint – if there are no good references to share, start from where that article you cannot find should start (or consider writing that article too).

Provide detailed examples

Insert code samples, screenshots, or short videos as you progress. Depending on what you are doing in your article, the exact details of what works best will vary. Copy and paste as little reference code as possible. This helps you avoid accidentally copying details that could reveal the specifics of a particular project.

If you look at mine you’ll see a lot of places where I include comments in sample code that say things like “Do useful stuff”. That is usually a hint that whoever inspired the article had interesting, and perhaps proprietary, ideas in that section of code (or at least I worried they would think it was interesting). I also try to add quick little asides in the code samples to help people pay attention.

Test as you go

Make sure your directions work without that reference project you’re not sharing. This is both so your directions work properly and further insulation against accidentally sharing information you ought not share.

End with a full example

If you end up with a bunch of code that you’ve introduced piecemeal, provide a complete project repo or gist at the end. You’ll see some of my articles end in all the code being displayed from a gist, and others link to a full repository. Far too many people simply copy and paste code from samples and then either use it blindly or get stuck. Moving it to the end helps get people to at least scan the actual directions along the way.

Give credit where credit is due

If you found partial answers in several places during your initial work, thank those people with links to their articles. Everyone who publishes online likes a little link-love and if the article was helpful to you it may be helpful to others. Give them a slight boost.

Feb 17 2021
Feb 17

TL;DR: Use drupal.org's issue forks to make Drupal 9 compatibility fixes work with Composer.

While most software developers agree on the two hardest things in software development – cache invalidation, naming things, and off-by-one errors – not everyone agrees on how to rank the rest of the top ten challenges. One that must surely rank high is dependency management: the algorithms and tools that keep the different libraries and additions to a codebase up to date and their APIs compatible. For example, suppose a codebase has SquareWidget v2 and CircleWidget v2; if SquareWidget v3 comes out but is incompatible with CircleWidget v2, the codebase's dependency management tool would prevent updating to SquareWidget v3 until a compatible version of CircleWidget was available.

In the Drupal world we've historically avoided formal dependency management as we could just download a package from drupal.org and get running, occasionally realizing "Oh, I needed CTools too" and grabbing it. Along the way some folks built the Drush tool which, amongst other things, could download these dependencies automatically. It wasn't until Drupal 8 came along, with its heavy use of 3rd party libraries, that more formal dependency management became a thing, in large part thanks to the use of the Composer system. This tool came out of the wider PHP community's need for a generic, reliable dependency management system, and given the Drupal maintainers' drive to adopt more 3rd party libraries and tools it was the obvious choice. After an initial bumpy learning phase, almost all contemporary Drupal 8 and 9 website projects today are managed using Composer.

The drupal.org infrastructure provides a custom wrapper around Drupal core, module, and theme metadata so that it can be loaded with Composer via a Packagist-style endpoint (packages.drupal.org). Modules and themes which already include a composer.json file will have that made available as-is. However, a large portion of Drupal 8 and 9 contrib projects don't have this, so the drupal.org infrastructure maps the info.yml files into a format Composer can understand. This way a module that was initially ported to Drupal 8 a few years ago can still be added to a contemporary D8 project managed with Composer, even if the module hasn't been touched in years. It's awesome.

The World of Drupal 9 Updates

Back in October 2019, a new feature was added to core 8.7.8 which allowed modules to specify which core versions they were compatible with by using a new line in their info files. This new line became a requirement in Drupal 9, as modules & themes needed a way to indicate which core APIs they were compatible with. For most projects the new line makes the info file look like this – note how the old "core" value is now replaced with a "core_version_requirement" value:

name: Commerce Migrate
type: module
description: Provides various Commerce-specific Migrate handlers.
core_version_requirement: ^8.8 || ^9
package: Commerce (contrib)
dependencies:
 - drupal:migrate
 - drupal:telephone
 - migrate_plus:migrate_plus

A module (or theme) could use the new line to indicate that it was compatible with both Drupal 8 and 9 simultaneously, and the majority of the most popular modules have made the necessary changes.

Drupal 9 was the first major release of Drupal core that was such a low-impact update that it was possible to run many websites on either 8.9 or 9.0 just by swapping the core files out (and updating the dependencies). Gone are the days of having to rebuild a site from scratch for each major upgrade; instead we just have to keep our codebases fully up to date and swap to the new core release pretty quickly.

Except for that one line.

That one new line has to be in each info.yml file in the codebase (except for test files, but that's a different matter), so any under-maintained or unmaintained module or theme will need to be updated. Thanks to the wonders of modern development tools, it was estimated that almost a quarter of all Drupal 8 modules & themes could be upgraded to be compatible with Drupal 9 by just changing their info file! Over the course of 2020, thanks to contributions and collaborations from folks all over the world, a huge number of modules and themes were updated to be fully compatible with Drupal 9, and a large portion of those that don't have releases have patches available.

The Catch-22

The fact that there are patches to make Drupal 8 modules & themes compatible with Drupal 9 is great for maintainers or would-be maintainers - they don't need to go through the effort of making all of the changes themselves, they can just review what has been provided and, hopefully, commit it. Normally patches are great for end-users too because, again, they don't have to take the time to make the change themselves; someone has made it available for them.

Here is what happens when you apply a patch using Composer:

  1. Composer downloads the project's listing from Drupal's custom Packagist system (see above).
  2. It compares the dependencies from the listing against what's currently in the codebase.
  3. It deletes the existing copy of the underlying module or theme, if present.
  4. It downloads a fresh copy of the module/theme that matches the dependencies.
  5. It applies the patch.

Normally this patch process works great - you find a patch, add it to your codebase, and away you go, remembering to leave a comment for the patch creators about how well it works for you. However, there's a major limitation here - even if the patch contains changes to the composer.json dependencies, it's applied after Composer has already decided whether or not to install the project.
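For reference, in most Drupal codebases patches are applied by the cweagans/composer-patches plugin; a minimal sketch of that configuration (the description and issue URL below are placeholders) looks like this:

  "extra": {
      "patches": {
          "drupal/commerce_migrate": {
              "Drupal 9 compatibility": "https://www.drupal.org/files/issues/EXAMPLE.patch"
          }
      }
  }

Because the plugin runs in step 5, after the install decision made in step 2, any composer.json changes inside the patch arrive too late to influence which version gets downloaded.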

In short, you can't use a patch to tell Composer that a Drupal 8 module is compatible with Drupal 9.

Improved Code Collaboration Workflows

Drupal originally used CVS to manage its codebase. This was very limited by today's standards, but it was reasonable back in the early 2000s and provided a means to centrally manage the large codebases of both core and the ever-expanding array of contributed modules & themes. Proposed changes to these codebases were handled using patch files, which are simply text files that indicate which file is to be modified, which lines are to be removed, and which are to be added. It worked well enough, but there was a large learning curve for beginners to get past.

Over the years more flexible & functional replacements for CVS became common, including centralized systems like Subversion and decentralized systems like Mercurial and git. Rather than take the short jump to another centralized system, the effort was made to build out a replacement code management platform using git, under the umbrella project name of "The Great Git Migration". Completed in 2011, the effort was led by Sam Boyer, and the community has been all the better for it.

However, after the git migration was completed the community was still stuck with patch files. While GitHub had its pull request workflow, the Drupal community was left with a somewhat archaic collaboration workflow.

Skip ahead nine years and an awful lot of discussion and research, and in 2020 the community finally had a replacement code collaboration workflow in the form of merge requests via the GitLab system. This new workflow allows anyone to create a fork of a project, make changes, and then create a GitHub-pull-request-like change request, dubbed a "merge request", for others to review. Unlike GitHub's pull request system, it's also really easy for anyone to collaborate on the same merge request, which lends itself really nicely to collaboration with others rather than solo development. After some opt-in beta testing, the new system was launched community-wide for every single code project on drupal.org to use.

The new issue fork and merge request system is based upon the simple concept that each individual issue on drupal.org can have its own copy of that project's codebase, an issue fork, and that codebase can be downloaded individually using git. With an issue fork anyone can make whatever changes they need directly with git and others can then download those changes directly using git - no additional tools needed, no patch files flowing around.

This also means that it's possible to tell Composer to download the codebase from an issue fork instead of from the main repository.

This means that an issue fork can be used to get around Composer's patch-vs-dependencies catch-22!

Putting it All Together

First off, it should be noted that issue forks are, to all intents and purposes, a separate physical repository from the parent project they fork from. This means that you cannot just download the issue fork by telling Composer to use a specific branch of the main project; Composer has to be told to use a completely different repository.

It's also worth bearing in mind that, for a given Drupal project (module or theme), only one issue fork can be used at a time. Because an issue fork is a separate repository, it isn't possible to download two different versions of the same module/theme and magically have them squish together. Therefore, if multiple merge requests / issue forks are needed for a given project then the others have to be applied as patches; alternatively, a separate "meta" issue could be created that combines multiple changes into one merge request, but at that point it might be easier to just become a maintainer and commit the changes.

In this example, I'm going to use the merge request created for the Drupal 9 compatibility issue for the Commerce Migrate module.

  • First off, find the issue fork portion of the d.o issue, which should be right underneath the list of attachments & patches, which is underneath the issue summary.
    Issue fork options
  • Click the "Show commands" button to expand out the example git commands.
    Show commands
  • In both the "Add & fetch" sections there will be a "git remote add" line. Included in this is a URL that's needed to download the codebase from the issue fork - one starts with "git@git.drupalcode.org" while the other starts with "https://git.drupalcode.org". Copy the full line (click the "copy" icon to copy it to the clipboard) and extract the URL, e.g. git@git.drupalcode.org:issue/commerce_migrate-3150733.git.
  • Click the "commands" button again to hide them, and then get the branch name, which in the example above is "3150733-drupal-9-compatibility".
  • In the site's composer.json file, in the "repositories" section add a new item with two values: {"type": "git", "url": "URLFROMABOVE"} e.g.:
           {
               "type": "git",
               "url": "[email protected]:issue/commerce_migrate-3150733.git"
           }
  • Look for the item in the "repositories" section that has "type" set to "composer". If it doesn't exist already, add an item called "exclude" and make it a list. Add the Composer name of the module/theme you want to use, e.g. "drupal/commerce_migrate", so that it looks like this:
           {
               "type": "composer",
               "url": "https://packages.drupal.org/8",
               "exclude": ["drupal/commerce_migrate"]
           },
  • Change the listing for the project in the "require" (or "require-dev") section to point to the branch name identified above, e.g. "drupal/commerce_migrate": "dev-3150733-drupal-9-compatibility",
  • Save the changes to the file.
  • Update the project in composer, e.g. composer update drupal/commerce_migrate.

The last command will now download that project from the issue fork instead of the main codebase.
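Putting those edits together, the relevant pieces of the site's composer.json end up looking roughly like this (a sketch based on the example above, not a complete file):

    "repositories": [
        {
            "type": "composer",
            "url": "https://packages.drupal.org/8",
            "exclude": ["drupal/commerce_migrate"]
        },
        {
            "type": "git",
            "url": "git@git.drupalcode.org:issue/commerce_migrate-3150733.git"
        }
    ],
    "require": {
        "drupal/commerce_migrate": "dev-3150733-drupal-9-compatibility"
    }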

Note: these should only be used as a short term solution, the goal should always be to collaborate to get changes committed so that these steps aren't needed.

(there might be other ways of doing this using repository priorities, but this method works)

But it Didn't Work?

One problem that can arise is that Composer can't process the project, which it will tell you with this error message:

  [Composer\Repository\InvalidRepositoryException]
  No valid composer.json was found in any branch or tag of git@git.drupalcode.org:issue/commerce_migrate-3150733.git, could not load a package from it.

This simply means that the project doesn't have a "composer.json" file in it, so you can fix that by adding a composer.json file to the repository. Once that is created (make sure to run "composer validate" before saving it!) and uploaded to the issue fork, it'll be possible to download it to a site's codebase again.
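For reference, a minimal composer.json for a contrib module can be as small as the following sketch (the name and description are illustrative placeholders, so match them to the actual project):

  {
      "name": "drupal/some_module",
      "description": "A short description of the module.",
      "type": "drupal-module",
      "license": "GPL-2.0-or-later"
  }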

Put That Thing Back Where it Came From or So Help Me

Because they don't keep current with upstream changes and can fall out of date quickly, issue forks should be used sparingly in website projects. As it happens, the patch for Commerce Migrate I wrote this blog post around was committed between the time I started the blog post and it was published – "The Drop Is Always Moving", as they say.

When the day arrives and the project has its Drupal 9 fixes committed, there are a few steps to remove the issue hackery and make the website's codebase happy again.

  1. Remove the extra "repositories" item.
  2. Remove the "exclude" line from the "type":"composer" repository; if there aren't any remaining items in the "exclude" section it can be removed entirely.
  3. Change the "require" line (or "require-dev" line) back to point to the appropriate release that includes the Drupal 9 fixes.
  4. Run "composer validate" to make sure the composer.json file is correct.
  5. Run "composer update drupal/PROJECTNAME" to get the new, cleaner version of the project.
  6. Commit the changes.
  7. Celebrate.

That Was a Lot of Words, Do You Have a Picture?

This topic was covered in a recent Contrib Half Hour. Because I forgot to record that meeting (I'm a professional, honest) I repeated the steps the following week, so now there's a recording of me stepping through the process to create an issue fork to make a Drupal 9 fix for a Drupal 8 module work in Composer:

[embedded content]

Feb 11 2021
Feb 11

Last week one of our clients was asking me about how they should think about the myriad of options for website hosting, and it inspired me to share a few thoughts. 

The different kinds of hosting

I think about hosting for WordPress and Drupal websites as falling into one of three groups. We’re going to compare the options using an example of a fairly common size of website — one with traffic (as reported by Google Analytics) in the range of 50,000–100,000 visitors per month. Adjust accordingly for your situation. 

  • “Low cost/low frills” hosting — Inexpensive website hosting would cost in the range of $50–$1,000/yr for a site with our example amount of traffic. Examples of lower cost hosts include GoDaddy, Bluehost, etc.  Though inexpensive, these kinds of hosts have none of the infrastructure that’s needed to do ongoing web development in a safe/controlled way such as the ability to spin up a copy of the website at the click of a button, make a change, get approval from stakeholders, then deploy to the live site. Also, if you get a traffic spike, you will likely see much slower page loads. 
  • “Unmanaged”, “Bare metal”, or “DIY” hosting — Our example website will likely cost in the range of $500–$2,500/yr. Examples of this type of hosting include: AWS, Rackspace, etc. or just a computer in your closet. Here you get a server, but that’s it. You have to set up all the software, put security measures in place, and set up the workflow so that you can get stuff done. Then it’s your responsibility to keep that all maintained year over year, perhaps even to install and maintain firewalls for security purposes. 
  • “Serverless” hosting¹ — It’s not that there aren’t servers; they’re just transparent to you. Our example website would likely cost in the range of $2,500–$5,000/yr. Examples of this kind of hosting: Pantheon, WP Engine, Acquia, Platform.sh. These hosts are very specialized for WordPress and/or Drupal websites. You just plug in your code and database, and you’re off. Because they’re highly specialized, they have all the security/performance/workflow/operations in place that 90% of Drupal/WordPress websites need.

How to decide?

I recommend two guiding principles when it comes to these kinds of decisions:

  1. The cost of services (like hosting) is much cheaper than the cost of people, whether that’s the time your staff spends maintaining a server or, if you’re working with an agency like Advomatic, your monthly subscription with us. People can easily cost 10x more. So saving $1,000/yr on hosting is only worth it if it costs less than a handful of hours per year of someone’s time.
  2. Prioritize putting as much of your budget towards advancing your organization’s mission as possible. If two options have a similar cost, we should go with the option that will burn fewer brain cells doing “maintenance” and other manual tasks, and instead choose the option where we can spend more of our time thinking strategically and advancing the mission.

This means that you should probably disregard the “unmanaged/bare/DIY” group. Whoever manages the site will spend too much time running security updates, and doing other maintenance and monitoring tasks. 

We also encourage you to disregard the “low cost” group. Your team will waste too much time tripping over the limitations, and cleaning up mistakes that could be prevented on a more robust platform.

So that leaves the “serverless” group. With these, you’ll get the tools that will help streamline every change made to your website. Many of the rote tasks are also taken care of as part of the package. 

Doing vs. Thinking

It’s easy to get caught up in doing stuff. And it’s easy to make little decisions over time that mean you spend all your days just trying to keep up with the doing. The decision that you make about hosting is one way that you can get things back on track to be more focused on the strategy of how to make your website better. 

¹ The more technical members of the audience will know that “serverless” is technically a bit different.  You’d instead call this “platform-as-a-service” or “infrastructure-as-a-service”. But we said we’d avoid buzzwords.

Feb 11 2021
Feb 11

In our recent webinar, we set out to help higher ed marketing teams rethink their digital focus for the year ahead. 

We tapped a group of experts with years of experience helping some of the best-known colleges and universities deliver engaging digital experiences to discuss key web strategy takeaways. Members of the expert panel included:

  • Muzel Chen - Senior Digital Strategist, Mediacurrent 
  • Diane Kulseth - Senior SEO Consultant, Siteimprove 
  • Steve Persch - Technical Product Marketing Manager, Pantheon

If you missed our webinar or want to watch it again, check out the full recording: 

[embedded content]

The live audience came ready with their most pressing questions on topics from personalization and SEO best practices to harnessing analytics and breaking down the communication barriers between Marketing and Admission departments. Below is a summary of what was discussed on the webinar. 

Personalization 

Is personalization dead?

Diane Kulseth: I've seen some higher education institutions using personalization effectively by auto-populating forms. That can be really helpful for keeping the experience seamless for the student. At the same time, we're talking to a generation that's getting more and more skeptical of the internet.

There's a misconception that adopting new technology will automatically save you time. Technology, like personalization, can save you a lot of time but it will also take time to implement correctly.

Steve Persch: Technology buying decisions are often made on the assumption that by buying a tool, you will get to spend less time doing a certain task. I think a similar motivation leads a lot of people to buy personalization. They think getting a more powerful measuring tool will fix their problem, but it actually gives you more to measure and requires more of your time.

Muzel Chen: There’s also the issue of how do we use personalization effectively? Students want to get information right away and personalization can reduce these obstacles in navigation and process.

Personalization can mean many things, but in the context of higher ed, we can personalize content for an audience based on their common demographics and behaviors. But it also depends on your medium: are you trying to personalize in social media? Website? Email? Text message? Audiences have different expectations from each marketing channel, which is predicated on the amount of PII (Personally Identifiable Information) they have disclosed to those channels.

Depending on the context and level of personalization, they can range from convenient to creepy, and striking that right balance can be tricky. But when it works, visitors experience a higher level of engagement because of the convenience and they feel the institution cares about them.

SEO Best Practices for Subsites and Domains 

When thinking about a redesign, I know websites should have ONE main goal, but how do you handle a few different audiences without creating microsites? For example, prospective vs. current students?

Muzel Chen: Some of these issues are mitigated by dedicating an area in your architecture for those specific audiences. But there are situations where it feels the content is servicing all audiences or only one exclusively. These situations typically result in microsites which can be difficult to manage. 

An easy way to address those needs is to start with an audience-neutral page, have the audience self-identify, and place those audience-specific pages at a deeper level. For example, common pages where this occurs are: campus living, parking permits, and health services. These top-level pages will have content that speaks to all audiences and grows in detail as the visitor goes deeper into your site. The details can then begin to diverge by audience type. 

Do you see any trend of separate sites vs. subsites for academic departments (or other smaller units) within a university or college?

Muzel Chen: Over the last decade, it's been very common for higher ed to have subdomains (separate sites) for each individual department or school. But, these sites grow out of hand and become difficult to manage especially when there's not any established governance for tracking content updates across all of the subdomains. 

More institutions are starting to transition to the subfolder (or subsite) practice. It’s easier to see the big picture of all your sites and manage them all within their content management system. 

How can one build a culture of importance for SEO across a decentralized organization (i.e., schools, departments, and offices, like Admissions, all run their own websites)?

Diane Kulseth: I think the biggest part is tying it back to what it all means. Whether it's donor relations, increased revenue for the university, or admissions, tie it back to their goals. That's first and foremost and all of that helps the website and it helps your students. It helps your donors. It helps your prospective parents of students to be able to better navigate the website. 

SEO is great for SEO's sake and building great traffic, but it's also really helpful for all your other marketing initiatives to make sure that people are able to get where they want to go on a seamless web experience that loads properly and is easy to navigate with strong information architecture. 

Marketing Strategy and Analytics 

What are some of the challenges you've seen higher ed leaders face when implementing marketing technology?

Muzel Chen: Reporting. I often see data collection tools being misconfigured. Or collecting lots of data without really defining its purpose. Everybody wants to get access to analytics data, and once that's provided, they use it once and never touch it again. Suddenly, you have 100 users and nobody is providing insights. Without data, all planning comes down to guesswork.

Steve Persch: You almost need Google Analytics for your Google Analytics to track who's using it.

Which analytics data seems to be the most valuable to higher ed? I realize this is likely very organization or campaign dependent, but anything generally useful that may not be obvious?

Diane Kulseth: First and foremost, you want to have your reports on RFIs or requests for information and your applications, and then connect that to enrolled students within your CRM or admissions platform. 

Beyond that, once you start digging deeper, it's really important to start looking at insights like when people are coming to this page from an advertising campaign, are they actually engaging with the page for a substantial amount of time? Are they going elsewhere? How are they responding to your marketing messages? Can you get even more granular and start looking at the demographics behind them? Are these men, women, what are their age ranges? What other insights can you glean from your different tools? I think all of that can be really helpful, but again, start with your basics: RFIs, enrollments, and applications.

Overcoming Department Silos 

Do you have any tips for how Admissions and Marketing can work together?

Steve Persch: Shifting to an iterative or agile mindset for your website is especially difficult when so much of a university operates on a semester or year-long calendar. There's an expectation that you need the web plan for the next year or 10. Saying we're going to do small experiments and get feedback week by week is challenging to accomplish in the higher ed ecosystem. However, I think that you need to find a way to do it with strong cross-departmental relationships across those silos. 

Muzel Chen: We talked about analytics access as one of the technology challenges, but as far as a people challenge, what I see most often is communication. A lot of departments are still siloed — especially marketing and admissions.

A good starting point here is setting up a meeting cadence where you can share common challenges to solve and opportunities to pursue. For example, an ideal project is to link marketing and admissions data, which tracks the visitor as they leave the main site and enters the application process. Marketing could validate their campaigns by reviewing the application data, whereas admissions could personalize the student experience by using marketing’s data. 

Optimize Your Higher Ed Site 

Explore how RainU CMS can help your school launch on Drupal in 30 days or less with a Pantheon-powered hosting solution. Together with Siteimprove, Mediacurrent’s digital strategists can help you optimize your website for SEO, accessibility, and overall performance. 

Feb 09 2021
Feb 09

[embedded content]

It is not uncommon for links to be styled like buttons when building websites.  Although this is not frowned upon, it is important to understand which HTML element to use depending on the purpose of that element.  For example, a <button> element is frequently used to perform an action, such as submitting a form or interacting with an element to change its behavior, whereas an <a> (anchor) element is often used to direct the user to another page, site, or section of a page.  Visually it may not make much difference how links or buttons are styled, but choosing the right element matters from a semantic and even accessibility perspective.  Let’s go over how to dynamically render the right HTML element depending on the element’s purpose using Twig on a Drupal website.

Before we get into the details of the code, let’s understand how the two elements differ from each other, which will help us choose the right one to use.  Probably the easiest way to tell <a> and <button> apart is that an <a> contains an href attribute and a <button> does not.  This alone can help us determine whether we should use an <a> or <button> element on our site.  The href attribute will always be present when using an anchor element, so let’s use the href attribute to our advantage.

  1. Let's start by defining some key data elements we usually have at our disposal when building a button or anchor element in Twig or React:  url, text, class
  2. It’s possible we won’t always have url or class, but text will most likely be available so we can label the element (i.e. Read more, Submit, etc.)
  3. We know that an <a> will always need a url.  We also know a <button> will never have a url value.

Using the logic above, we can start writing some code.  Let’s start with Twig and then we will repeat the process with JavaScript if you are working on a React or Gatsby project.

{# If a url is available, render an anchor; otherwise render a button. #}
{% if url %}
  <a href="{{ url }}" class="{{ class }}">
    {{ text }}
  </a>
{% else %}
  <button class="{{ class }}">
    {{ text }}
  </button>
{% endif %}
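Elsewhere in the theme you can then include this template with whatever values you have on hand; here is a hypothetical include (the template path and variable names are placeholders):

{% include '@my_theme/components/button.html.twig' with {
  url: node_url,
  text: 'Read more'|t,
  class: 'cta-button'
} only %}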

JavaScript and/or Gatsby

The logic to follow in any other language will be the same. The syntax may vary, but ultimately we are checking for a URL: if there is one, we print an <a> element, otherwise we print a <button> element.  However, in Gatsby there is one more scenario to consider: is the URL internal, or is it going to an external page/site?  One of Gatsby’s powerful core components is <Link>.  This is used to prefetch pages in a Gatsby site, which results in incredible performance improvements.  So our logic just got a little more complex, because not only do we need to check whether there is a URL, we also need to determine if the URL is internal or external.  If it is internal, we will use a <Link> component; otherwise, we will use a traditional <a> element.  Let’s see how we can make this work.  In an effort to keep this example short and to the point, I am excluding some pieces you normally would see in a real Gatsby component.  This would typically be done in a button.js file.

import React from 'react';
import { Link } from 'gatsby';

const Button = ({
  children,
  to,
  href,
  ...props
}) => {

  // If the `to` prop exists, return a Gatsby <Link>.
  if (to) {
    return (
      <Link to={to} {...props}>
        {children}
      </Link>
    );
  }

  // If the `to` prop does not exist but `href` does, return an <a> element.
  if (href) {
    return (
      <a href={href} target="_blank" rel="noopener noreferrer" {...props}>
        {children}
      </a>
    );
  }

  // If neither the `to` nor `href` props exist, return a <button> element.
  return (
    <button {...props}>
      {children}
    </button>
  );
};

export default Button;

If a to prop is identified, we return a Gatsby <Link> component. As we can see this is a more complex piece of code, but ultimately it does the same thing as when we wrote it in Twig.  The difference here is that we are detecting three scenarios to determine which element to return (a usage sketch follows the list below):

  • If an href prop is detected instead of to, we return a traditional <a> element with target="_blank" and rel="noopener noreferrer". According to the Gatsby docs, routing to target="_blank" is the equivalent of an external page and the Gatsby <Link> cannot be used. Docs here: https://www.gatsbyjs.com/docs/linking-between-pages/#using-a-for-external-links
  • Finally, if neither to nor href is detected, we return a <button> element.
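Here is a quick, hypothetical usage sketch of the component above (the paths, URL, and the handleSubmit handler are placeholders):

// Internal route: renders a Gatsby <Link>.
<Button to="/about">Read more</Button>

// External URL: renders an <a> with target="_blank".
<Button href="https://example.com">Visit our partner site</Button>

// No url at all: renders a <button>.
<Button onClick={handleSubmit}>Submit</Button>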

In Closing

Some may not see the need to go to this extreme to simply decide whether we need a button or anchor element.  However, there are many reasons why this is something you should consider as this could impact accessibility, UX, and in the case of Gatsby, even your site performance.

Feb 09 2021
Feb 09

Now on Drupal 9, the community isn’t slowing down. This month, we continue our interview with Angie Byron, a.k.a. Webchick, a Drupal Core committer and product manager, Drupal Association Board Member, author, speaker, mentor, and Mom, and so much more. Currently, she works at Acquia on the Drupal acceleration team, where her primary role is to “Make Drupal awesome.” We talk about Drupal, coding, family, and her journey throughout the years.

This article was originally published in the February 2021 issue of php[architect] magazine. You can read the complete article at the following links. To see the full issue, please subscribe or purchase the complete issue.

Jan 27 2021
Jan 27

RainU logo

On the heels of our recent Drupal 9 release of Rain CMS, we are excited to officially announce the beta release of our Rain University platform, RainU. RainU CMS is a Drupal-based development platform made just for higher education. Colleges and universities can now launch new sites faster with full, flexible control over content.

The RainU CMS Theme

RainU for Drupal homepage with hero image shows a group of students in graduation caps

New RainU CMS theme homepage

The RainU theme is based on the main Rain base theme but adds additional features that are relevant for university websites. The navigation and content have been staged to help content authors get started quickly. Some of the new features added to RainU are the event and quote carousels, as well as more components for highlighting content.

Building pages

RainU CMS content field for frequently asked questions

Rain CMS admin content edit page

RainU content authoring

Paragraph browser dialog

With Rain University, we give content authors the freedom and flexibility to build robust pages using a library of pre-stocked components (called “paragraphs” in Drupal). The Rain Admin UX offers many improvements over the stock Drupal admin which makes the overall experience more intuitive for editors.

Find out more

Currently, Mediacurrent’s Rain University CMS is available to new and existing clients. Our team of strategists, designers and developers can work with your organization to migrate your website from a legacy CMS onto an enterprise, open source platform.

For more information on the benefits of Rain University CMS and to schedule a free demo, please visit the RainU page or chat with us right now (see bottom right corner of the page). We would be happy to talk more about your project or schedule a demonstration.

Jan 27 2021
Jan 27

I recently recorded a video series tutorial about progressive Drupal decoupling. In this series I take two of the official React app examples and turn them into widgets. Your Drupal editorial team can then embed those React applications (a calculator, and an emoji selector) as blocks in a page, as a field in a content type, as an embedded entity in the body field using the WYSIWYG, …

#1 Embed Any JavaScript Application

In this first video in the series we will take one of the official examples from React and turn it into a widget ready to be embedded in Drupal (or anywhere else).

Steps

  1. Create new repository from template.
  2. Migrate source to new repo.
    • Copy new source files.
    • Adapt index.js (including the render function; see the sketch after this list).
    • Combine package.json.
    • Find & replace «widget-example».
    • Remove / add specific features.
  3. Reformat and execute tests.
  4. Execute locally.
  5. Deploy application.
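To illustrate step 2, the app’s entry point exposes a render function that the host CMS can call to mount the application into a DOM element it controls. This is only a sketch with hypothetical names; the actual contract used in the series is shown in the video:

// index.js (names are illustrative)
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

// Expose a global render function so Drupal (or any host page) can mount
// the widget into whatever element it chooses, passing the page language.
window.renderWidgetExample = (element, language = 'en') => {
  ReactDOM.render(<App language={language} />, element);
};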
[embedded content]

#2 The Registry & the App Catalog

The widget registry is the place where you aggregate your widgets (and other people’s widgets you want to use) to make them discoverable to Drupal and other CMS integrations.

This piece plays a fundamental role in the governance of your project(s). You can choose to have a single registry for all your Drupal installations, or one registry per project. You can use the pull requests to gatekeep what versions are added to the registry and who can publish them. The idea is that the owner of the widget-registry project has the authority of accepting PRs to add/update widgets so they are available in the registry (and therefore in Drupal).

[embedded content]

#3 Set up Progressive Decoupled Drupal

In this video we will learn how to connect Drupal and the widget registry to let editors embed JS applications all over Drupal (that includes support for i18n!).

You can, for instance, embed JS applications as blocks, as a field for a content type, in the body field as an entity embed, …

[embedded content]

Photo by Shifaaz shamoon on Unsplash

Jan 26 2021
Jan 26

As we look ahead to the end of the academic year and what the future holds, we see uncertainty clouding what is typically admissions season for teams in higher education.

Recently, we asked our partners in higher education to share their digital challenges. We heard that admissions personnel, as well as marketing teams at the college and university level, are feeling the pressure. They need to make sure the expectations of stakeholders are met or exceeded despite the unpredictable path ahead. 

Even though teams may face challenges ahead, one thing is certain: rethinking digital strategy to set your path forward will set your team up for success and your institution apart from others. 

The website is the heart of your digital strategy, and both should be built to adapt. That’s why many higher education institutions choose Drupal for their organization’s CMS. 

Below are five areas to focus your digital strategy, with some of our favorite tips and tools to improve campaigns moving forward.

Reevaluate Your Content Strategy

Universities used to enjoy a steady flow of students enrolling in programs. However, the future is now uncertain because of the COVID-19 pandemic leading many students to forego education or to choose online courses over taking classes in a traditional, physical environment. 

The uncertainty affected not just marketing teams at universities, but students as well. When the McKinsey higher education survey was conducted in April 2020, 37% of responding students were confident that things would return to normal by the end of summer. However, the 2020-2021 school year has thus far reflected much of what the previous school year looked like.

Findings from our own client survey showed that uncertainty in the 2020 recruitment season led to several shifts in strategy to better support the user journey as students decide on programs, such as the following:

  • Virtual tours rather than in-person tours
  • Live video conferences rather than in-person sessions
  • Website content to supplement brochures and physical marketing materials

Changes in academia lead to a shift in messaging, so teams need to evaluate if their content strategy is still working or if more needs to be done to cater to today’s student and their new priorities. 

Some ways in which evaluating content strategy can be done include: 

Persona research

 Although you may have a general idea of who your target audience is, more thorough research that includes user surveys can help create a better understanding of who your content should speak to. For instance, you may learn from user surveys that students and parents are uncertain about returning to in-person learning because they want to know more about what is being done to keep people safe in the classroom. With this information in mind, you might develop more content about COVID-19 cleaning protocols to give them peace of mind.

Content audit

Is your content resulting in growth, and does it cater to your users? If you are not getting the most out of it, an audit can help address gaps and find opportunities. 

Dashboard creation

Making sense of data is an important responsibility of a university’s marketing team. User-friendly dashboards can simplify the process of reviewing data and making decisions based on findings. Working with a digital strategy team with experience in higher education to improve your approach can yield results that allow your content to better serve student needs.

Give Your Marketing Team The Tools to Succeed

Giving the university marketing team agency in creating content quickly and efficiently is a top priority of many agencies that work directly with these teams. However, finding a CMS that provides the flexibility they want and a user-friendly editorial experience they need can be a challenge.  

RainU CMS can improve the editorial experience for content editors looking for a solution that allows for easier workflows that match with your existing design standards. With the ability to launch sites in 30 days or less, Rain helps both content editors and developers create flexible websites fast.

If your site is on Drupal, creating a decoupled solution with Gatsby may be just what you need. The business case for decoupled Drupal with Gatsby can help you determine if the cost and benefits are right for your university. Our developers are well adept at providing guidance in setting up GatsbyJS.

Using Drupal with Gatsby is a great way to get an enterprise-quality CMS for free, paired with a great modern development experience and all the benefits of the JAMstack, like performance, scalability, and security.

Gatsbyjs.org

Make Smarter Use of Resources  

Unprecedented changes in higher education likely result in unexpected changes to budgets and priorities. Streamline the routine maintenance of your Drupal site to shift more budget toward new features. Here’s how Mediacurrent’s development solutions like Automated Updates and Multisite+ can help:

With Automated Updates, you can save hours of manual development work. Our automation services initiate pull requests, create JIRA issues, and stage updates within Pantheon’s multidev environment. Your team of project managers and developers can focus on productive work rather than administrative tasks when using Automated Updates for your project. 

Need to create multiple sites? Spin up new instances with Multisite+! With a shared codebase, your sites can go from concept to creation quickly and efficiently, and each has its own database, configuration, files, and base domain or URL to help with organizing your content. 

We have a wide variety of development services to meet your university marketing needs. 

Enhance Web Accessibility 

Universities cater to individuals of all abilities, so it’s important to make sure the digital experience is accessible to all. Using a tool like Siteimprove can help university marketing teams better understand how their site’s accessibility matches up to Web Content Accessibility Guidelines (WCAG) standards. 

SiteImprove dashboard with circular progress charts for QA, accessibility, SEO

SiteImprove's automated reports provide an easy way to measure and document progress toward accessibility goals

Failing to keep on top of website accessibility could lead universities to face warnings from the Department of Education as well as lawsuits. Mitigation measures such as using Siteimprove or working with a skilled accessibility team to audit and remediate your site allows your team to take steps that minimize the possibility of lawsuits as a result of your site’s digital experience. 

Launch New Campaigns Faster 

Colleges and departments within universities often need to launch campaigns quickly, and depending on the technology involved, expediency is integral. Teams must have a workable solution to accomplish the goals of an individual college or department.

Marketing teams can take advantage of RainU CMS to launch to market in 30 days or less. 

RainU CMS for Drupal homepage

Gain more control over your site with RainU CMS such as:  

  • Robust Security - Mitigate attacks from hackers - RainU CMS has several built-in security hardening features to help.
  • Flexible, Feature-Packed Content - Build pages with a flexible, easy-to-use interface in RainU CMS for rapid development.
  • Component-Based Theme - Like with other Drupal sites, RainU CMS has reusable components and a built-in style guide which reduces design time.

Demo RainU CMS 

Ready to take your higher ed site and marketing campaigns to the next level? Explore how RainU CMS can get you there and check out the demo below. 

[embedded content]

Jan 25 2021
Jan 25

Newly engineered opportunities have opened the doors for Higher Education institutions to pioneer student, researcher, and funding recruitment. From deeper data applications to mass-scale live debates, the Higher Education sector is going through a digital transformation, with varying rates and approaches.

New data and accessibility regulations, as well as pressure on student recruitment from COVID-19, have required Higher Education institutions to accelerate these 'digital transformation roadmaps'.

Entire organisations have had to react and re-evaluate everything across technology implementation, face-to-face education, student recruitment, and community satisfaction.

The forces of change are drawing in at an unprecedented rate. But are universities equipped to make the quality, long-term adjustments needed?

Senior stakeholders from the University of West London, Manchester Metropolitan University, and Oxford Saïd Business School sat down with Paul Johnson, our Drupal and HE Specialist at CTI Digital to discuss their digital challenges and opportunities during a panel at DrupalCon Europe. We received a unique perspective on various UK organisations' challenges with differing cohorts, scale and complexity, age and legacy systems.

Watch the full panel here, and use the time stamps below to navigate:

[embedded content]

00:00 - Introduction with:

  • Chair and top left: Paul Johnson, HE Specialist at CTI Digital
  • Bottom left: Adrian Ellison, Associate Pro-Vice-Chancellor & Chief Information Officer at the University of West London.
  • Top right: Nick Holland, Head of Digital at Manchester Metropolitan University.
  • Bottom right: Iain Harper, Head Of Digital Marketing, Saïd Business School, University of Oxford.

05:29 - Why The University of West London chose to upgrade and continue using Drupal.

09:50 - How Manchester Metropolitan University built the business case to move to Drupal.

13:29 - Oxford Saïd Business School's experience of using Drupal 8 for multiple years.

19:30 - Managing "HiPPO" (Highest Paid Person's Opinion) and different stakeholders' opinions.

22:20 - Data-driven decision making to changes of an existing platform at Oxford.

24:58 - Managing governance for an entire platform change at MMU.

26:58 - Managing change to projects and their teams over multi-year projects.

33:54 - Lockdown and adapting working with staff and students remotely.

37:04 - Content governance and consistency.

38:54 - Designing and building a website for diverse audiences.

41:22 - What features or capabilities for Drupal should Drupal develop for HE's future?

If you're looking for a digital partner to support your digital transformation, we're the team you're looking for. Our full-service team can take you through discovery and user research to plan and define the underlying requirements that meet your business goals. Our content strategy and development team will then be available to make your digital roadmap become a reality—all under one roof, with years of proven success.

Get in Touch

Jan 21 2021
Jan 21

Problem: The Power (and Pitfall) of Paragraphs

With great content modeling powers comes great responsibility for Drupal developers. 

While building a website, you may face the challenge of creating multiple paragraphs without much reuse. This can contribute to a poor content architecture that will burden you for the lifetime of your application. This issue can come about when a developer is building out paragraphs and decides to generate multiple paragraphs with similar fields or new fields that perform the same functionality as other fields. 

For example, let’s say a developer is tasked with building a normal Text Paragraph that is just one textarea field, simple enough. Now, let’s say a client asks if the Text Paragraph could also have a header field and a subheader field. A developer might decide to just create a new paragraph, maybe call it a Card Paragraph, with a new header field, subheader field, and a new body field. But what if you could reuse the body field from the Text Paragraph instead? There is a way to do that by using Form Displays and the default Paragraphs reference field widget. Of course, this example is simple; a more realistic case is explained below.

Solution: Reuse Paragraphs and Form Displays

Mediacurrent’s Rain CMS is an open source project that comes bundled with ready-to-use content types and paragraphs. One of those is the Card paragraph. The Card paragraph by itself might just have a title, media, body, and layout field. This is useful as its own componentized item that can be reused by other components. Now let’s say you have another paragraph called a Breaker, which is almost like a Card paragraph but with some variation. This Breaker paragraph might reuse the same title and body fields, but also have an extra field called right body for any text that needs to appear in the right column.

The way to keep reusing the Card paragraph within other paragraphs created in the future is to add more fields onto the Card paragraph itself.

Drupal manage fields tab

As displayed above, we have a ton of fields added onto the Card Paragraph. One might ask: why do this, and wouldn’t all these fields get displayed on the front end? Technically yes, at first, but there is a way to limit which fields get displayed on the Card paragraph itself, and then a step to get the Breaker paragraph to utilize the same fields coming from the Card paragraph.

This method uses what are called view modes in the Form Display configuration. Go to Manage form display in the CMS for the Card paragraph you just created and take a look at which displays are available.

Drupal backend - manage form display

As you can see above, we already have some displays created in the Rain distribution we are using: Default, Breaker, Carousel, Column, Card Compound, and Overlay Breaker. These are all Form Displays used for other Paragraph variations.

custom form display

Creating a new display is easy: go to the bottom of the Default view mode and hit Custom display settings, which should drop down the selection shown above. As displayed in the screenshot, some displays are enabled and some are disabled. You can create a new one by clicking Manage form modes.

Now that we have sidetracked a bit and explained how view modes work and how they are created, let's dive back in and talk about how to create this Card variation paragraph, called the Breaker. Since we already have a display enabled for the Breaker, click on Breaker and see which fields are enabled by default.

Drupal manage form display

Shown above, this display has the title, body, and the new body right field, which is technically still part of the Card Paragraph but uses a different form display. If you compare this display with the default display, you can see that different fields are enabled and disabled, which allows for flexible reuse. Now that the fields needed for the Breaker have been created or enabled, let’s go create the Breaker paragraph.

create breaker paragraph

We then add a description saying that this is a breaker that will have content in two columns. If you remember from above, we have a body field and a body right field; these are used to create the two columns.

When creating your Breaker paragraph, make sure to add all the fields that are unique to this breaker, like Line Break and Show Two Columns, which have functionality specific to the Breaker. You also want to create a Paragraph reference field called Card, which will allow you to reference a Card.

So how do you get this Card field to show the fields you want from the Card Paragraph? That’s why we set up the view mode on the Form Display for the Card Paragraph. Here is what to do.

Drupal manage form display card

Under the Breaker paragraph, go to Manage form display. Then, under the Card field, use any widget that has the Form display mode option available. This lets you target the display mode you want to use from the Card. Select the Breaker display mode.

select breaker display mode

Once this is done, hit Save. Now, whenever you go to create a Breaker and use the Card field, it should show the fields specified in the Card's Breaker view mode.

card breaker

Building Paragraphs the Smart Way 

Now whenever you as a developer get tasked with building out a paragraph, make sure to ask yourself: What can I do to reuse fields and displays as much as possible? I hope this guide will help you to make smart decisions for a better content architecture going forward.

Jan 13 2021
Jan 13
  • If it is the top level menu, we print a <ul> and we pass Drupal’s attributes in addition to our custom classes.
  • If the menu is not top level (a submenu), then we simply print a <ul> with its corresponding class.

Learn more about Macros in Twig Templates.

Looping Through the Menu Tree

Looping through the items array allows us to inspect each item and apply classes or attributes. A simplified sketch of the macro follows the list below.

  • First, we loop through the items array (for item in items), and for each item, we print a <li> to which we pass an array of CSS classes using the attributes.addClass() method. However, we do this only if we have access to Drupal's attributes ({% if attributes %}). If not, we manually print the same classes on the list item.
  • For each item and link, we pass a series of things such as:
    • The link title
    • The link url
    • Is the link in the active trail,
    • ... and others.
  • Inside the list item, we check to see if that particular item has other items or submenus ({% if item.below %}).
  • When submenus exist, we make use of the macro again to repeat the process of printing the menu. This is where the macro saves us from writing duplicate code.
  • For each level of submenus we are increasing the menu_level by 1 and printing it as part of the submenu class. This is handy if we want to target a specific submenu in the tree.
  • Finally, we close all previously open actions/tags (i.e. for loops, if statements, lists, and macro).
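
For reference, here is a trimmed-down sketch of what such a macro can look like inside navigation.twig. The macro, class names, and namespace below are illustrative placeholders rather than the exact component code; the pattern mirrors Drupal core's menu.html.twig:

{% import _self as menus %}
{{ menus.menu_links(items, attributes, 0) }}

{% macro menu_links(items, attributes, menu_level) %}
  {% import _self as menus %}
  {% if items %}
    {% if menu_level == 0 %}
      {# Top-level list: pass Drupal's attributes plus our own class. #}
      <ul{{ attributes.addClass('main-menu') }}>
    {% else %}
      {# Submenu list: print a class that carries the current depth. #}
      <ul class="main-menu__submenu main-menu__submenu--level-{{ menu_level }}">
    {% endif %}
    {% for item in items %}
      {# Each item gets its own classes, plus an active-trail flag when needed. #}
      <li{{ item.attributes.addClass('main-menu__item', item.in_active_trail ? 'is-active-trail') }}>
        {{ link(item.title, item.url) }}
        {% if item.below %}
          {# Recurse for submenus, increasing the depth by one. #}
          {{ menus.menu_links(item.below, attributes, menu_level + 1) }}
        {% endif %}
      </li>
    {% endfor %}
    </ul>
  {% endif %}
{% endmacro %}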

RESOURCE: I recently ran into a blog post by Tamas Hajas where he has a great way to address classes and states on a menu. Check it out: Drupal 8 Twig: add custom CSS classes to menus

Looking at the Rendered Menu after Macro Improvements

Now that we have a better understanding of how the macro works, and after making the improvements discussed above, the markup for the menu looks something like this:
Example of menu with good markup.

These styles are as bare as possible. They are simply to make the menu look presentable and could use a lot of improvements. This is what the multi-level menu looks like:

Example of an expanded menu

Create a Twig Template Suggestion for the Main Menu

Our ultimate goal is to update Drupal’s menu system to render using the navigation component. 

Follow these steps to enable Twig Debugging in your theme. Then, you can inspect your site, which will allow you to see the various template suggestions that Drupal uses to render the page; including your navigation menu. The debug information looks like this:

Example of twig debugging code.
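
In a typical setup, that debug output looks something like the following (the exact theme hook and template path depend on your site and base theme):

<!-- THEME DEBUG -->
<!-- THEME HOOK: 'menu__main' -->
<!-- FILE NAME SUGGESTIONS:
   * menu--main.html.twig
   x menu.html.twig
-->
<!-- BEGIN OUTPUT from 'core/themes/stable/templates/navigation/menu.html.twig' -->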

Under File Name Suggestions notice two template names: menu.html.twig & menu--main.html.twig. The X next to menu.html.twig indicates Drupal is using this file to render the Main Menu. The second file, menu--main.html.twig, is what Drupal is suggesting we create if we only want to alter the Main navigation and not other menus.  The word "main" in the file name represents the Drupal machine name for that menu.

Under Begin Output, notice the path where menu.html.twig can be found; the example above points to Drupal’s stable theme.

To ensure we only affect the main menu and not other menus on our site, we are going to make a copy of Drupal’s menu.html.twig in our theme and then override it. This is recommended rather than modifying Drupal core’s template. Let’s start:

  1. Copy the menu.html.twig template from core/themes/stable/templates/navigation/ into [site_root]/themes/custom/[your_theme]/templates/navigation/menu.html.twig (if these folders do not exist yet in your theme, go ahead and create them).
  2. Following the golden rule “Never hack core”, we want to make a copy of Drupal’s menu template in our own theme. Replace the core theme’s name with your core base theme if you are not using stable.
  3. Next, In your theme rename the newly copied template to menu--main.html.twig.
    Copying menu.html.twig into our theme will affect other menus. This is why we are renaming the template so it only affects the main menu (menu--main.html.twig). Replace ‘main’ with whatever your menu’s machine name is if you are not using Drupal’s Main Menu. This can be found in Drupal’s Menus page (admin/structure/menus).
  4. After clearing Drupal’s cache, inspect the menu again and you should see the X next to menu--main.html.twig which means Drupal is now using our custom twig template suggestion to render the menu.

Create a Multi-level Menu in Drupal

  • Let’s make sure we have a menu we can see in Drupal. Let’s create a multi-level menu using the Main Navigation menu (Structure | Menus | Main Navigation).
  • Add as many links as you wish. Be sure to add nested items so we end up with a multi-level menu.
  • In addition, ensure each of the submenu’s parent links are set to “show expanded”. You will see the option on the Add/Edit Link page.
  • Finally, go to Structure | Block Layout and click Configure next to Main Navigation
    • Change Number of levels to display to Unlimited. This will ensure dropdowns are rendered and display in your navigation.

Integrate the Main Menu component with Drupal’s Menu

The last part of the process is to integrate all the work we’ve done with Drupal, so our Navigation is rendered with the markup, styles, and behavior we implemented when we built the component.  

  1. In your editor, open [site_root]/themes/custom/[your_theme]/templates/navigation/menu--main.html.twig
  2. Delete all the content in the twig template except for the comments, and then paste the code below into it
{% include '@your-namespace-here/navigation/navigation.twig' %}
  3. Clear Drupal’s cache
  4. Reload Drupal’s page

Since we moved the macro to the component’s location, all we need to do in the menu--main.html.twig template is to call our Navigation component by using a Twig include statement. 

If we did our job right, Drupal’s menu should now look and behave the same way as our component. In addition, if you inspect the Drupal page, the menu should reflect the same markup as the main-menu component.

Drupal Libraries

As a best practice, we should create a Drupal library to attach any styles and JavaScript to our component. See this article for adding CSS and JS to a page or component.

If your navigation displays unstyled, that's because a Drupal library for it does not exist yet. Create one in your theme's *.libraries.yml file as follows:

navigation:
  css:
    component:
      dist/css/navigation.css: {}

Your compiled css path may be different.

This library has already been attached inside navigation.twig above.
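
If you need to attach it yourself, a single line in the component's Twig file does the job (the your_theme portion below is an assumption; use your theme's machine name):

{{ attach_library('your_theme/navigation') }}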

By doing this any styles we wrote for the component will apply to the main navigation when it’s rendered in Drupal. Drupal libraries have no effect in Pattern Lab.

After completing the steps above, clear Drupal’s cache and reload your Drupal site. You should see your Drupal menu being rendered with all the styles and markup we wrote throughout this post.

What about Accessibility?

You should always ensure your menus and navigations are accessible. It is the right thing to do. However, don't yell at me for not following my own advice. In an effort to keep this tutorial simple, I opted to omit anything related to accessibility. Perhaps part two can focus on that ;-)

Video Tutorials

[embedded content]

[embedded content]

In Closing

I’ll admit that I took the long way to get the main menu working in both Pattern Lab and Drupal. There are other ways to accomplish this using contrib modules such as Simplify Menu (full disclosure, I am a co-maintainer of the module) and perhaps Twig Tweak, among several others. However, the approach I took here gives you the most control, and that can make all the difference if you are dealing with an advanced or complicated menu.

Jan 08 2021
Jan 08

Now on Drupal 9, the community isn’t slowing down. This month, we sit down and talk with Angie Byron, a.k.a. webchick, a Drupal core committer and product manager, Drupal Association board member, author, speaker, mentor, Mom, and so much more. Currently, she works at Acquia on the Drupal Acceleration Team, where her primary role is to “Make Drupal awesome.” We talk about Drupal, coding, family, and her journey throughout the years.

This article was originally published in the January 2021 issue of php[architect] magazine. To read the complete article please subscribe or purchase the complete issue.

Jan 07 2021
Jan 07

student in cap and gown attending a virtual college graduation ceremony

Customer Stories on The Pivot to Digital 

In a year like no other, Mediacurrent’s higher education partners were challenged to shift strategies. We recently welcomed a customer panel of seven marketing strategists to share their stories at our company retreat. With voices ranging from two Ivy League universities to an online college, the panel reflected on how the pivot to virtual tours was an ambitious undertaking. So too was the need to rethink student communications and reassess the metrics that matter most.

The following is a transcript of a virtual roundtable by Michael Silverman of Mediacurrent with a panel of seven digital leaders in higher education. It was conducted in December 2020 at the Mediacurrent company retreat. Some of the questions have been edited for brevity and clarity.

How has your recruitment strategy changed with COVID-19? What works now for student enrollment marketing?

For this digital director at a private Catholic university, COVID-19 drove his team to imagine creative alternatives for events and approach their marketing funnel in a new way: 

There's been the need to be more creative to reach our target audience. We had to find different ways to engage prospective students. One thing we did was to host a socially distanced drive-in event where prospective students came to the college and watched all about our college. We’ve also moved to more virtual events (I can tell you because I entered over 300 events on our Drupal site!) for new and returning students. We look for different ways to connect with them, to make sure that they stay engaged with the university.

We had a habit of focusing on top of the funnel brand awareness and it was harder to get prospects into the funnel. So we had to make a more concerted effort to reach the students that were already in the funnel, getting them to apply and then put their deposit down. We were working with a smaller pipeline and we had to be more efficient in speaking to it. 

According to this director-level digital strategist for a major state university in the northeast, highlighting the value of a local education was a successful tactic:

When we were being able to get people onto the campus, 70% of people who came to visit ended up applying. Because we were missing out on that, moving to virtual visits, we had to change our messaging quite a bit. 

We found that people were more likely to want to stay at home or stay local. Because our biggest audiences are in-state, and we have 19 campuses across the state, that's been a big point for our message. We really focus on the campus and that local education rather than that “big brand” messaging. 

On a campus of 2,000 students in the Great Plains region, this marketing expert saw how small group tours are more personalized than before:

After shutting our campus down for the spring and summer, we were able to get back to a model that allows for individual family tours. That personal touch has helped us a lot. In October, we hosted more individual tours than any group tour of previous years. Our admissions counselors really take pride in fostering relationships with prospective students through personal interactions like texting, calling, or writing letters.

Aside from campus visits, what were some other leading indicators for applications? 

Prospective students had a high need for information about how the school was reacting to the pandemic. This state university (the same referenced above) saw the opportunity for retargeting campaigns:  

We’ve started to focus more on people coming to the admissions website and just reading through some of our COVID information. Our focus groups found that given the uncertainty, people wanted us to be able to proactively communicate what was going on and what we were planning to do with student enrollment moving forward. So we drove a lot of people to that information. If we saw that people were reading that and clicking over to one of our conversion actions, we would set up retargeting campaigns towards them and get them further down. This was a new strategy because we had never really used any of our PR materials within our enrollment advertising. 

We had never really done retargeting before for trying to get people to accept their offer of admission. We've started to build up some scores in our Slate CRM for the probability of enrollment. We've been able to figure out the people that are most likely to enroll and are able to retarget them at the beginning of the funnel with lookalike audiences and Facebook. Then we’re also retargeting accepted students who are still in the funnel. 

Where do you see the biggest change in measurable goals for your organization due to the changes brought on by COVID-19?

This CMO of an online college for holistic health was able to boost enrollment even as the competition for distance education was skyrocketing:  

We didn't see an enrollment drop-off at all. In fact, we've seen an increase in enrollment. Back in February, I pulled all of our pay-per-click marketing. I had a feeling that if this hit, every single on-campus entity would need to go online and we wouldn’t be able to compete. That strategy saved us. 

We stopped focusing on trying to attract people to enroll. We knew that everyone else was trying to attract them as a consumer. We started doing educational wellness webinars to help people to grow their skills, inviting them to engage with us on an entirely different platform. 

Has your institution been forced to cut costs or reallocate resources? If so, how has that affected your group?

This web strategist for a small university in the midwest weighed in that she faces uncertainties in the upcoming admissions cycle. Looking ahead, her department budget will be geared toward third-party community platforms:

We’re a small school and we were able to pivot pretty well...until now. Our students come because they get to co-op the entire time they're here as a part of the degree requirement. So we're now starting to see it going into this admissions cycle, but we're being very creative because obviously, you're not having your large scale visit events on campus.

For my role, which is running the main site, there probably will be some dollars pulled from me in order to focus on some third-party platforms that are focused on building a community with potential students. Not necessarily budget cuts, but I’ve seen the shifting of money to focus on some of these ongoing virtual things that will continue. 

Without the in-house IT resources to launch a new website, this project director at an Ivy League school relied on Mediacurrent for support:

We hired Mediacurrent prior to the onset of the pandemic to create an online platform that would be useful in the event of a future financial crisis. Two months later, we found ourselves potentially in the midst of that financial crisis.

Our IT department needed to focus on making it so that our students could all attend class online from all over the world. All of a sudden I was in the middle of a Drupal website project, and frankly, I'd never heard of Drupal before this. 

What are the biggest pain points from day to day related to the technology and management of your website?

The crisis forced us to go digital in many ways. It's incredibly important for our websites to stay accessible. Mediacurrent has done a good job of understanding what matters to our stakeholders and helping us navigate accessibility. That’s a huge priority.

This is where working at a school that has a lot of name recognition can bite you. We may not have had to do as much outreach or aggressive marketing as some other schools. So we were extremely behind the curve when we changed to virtual info sessions. We were able to get our information sessions up and running in a way that's decentralized so I didn't have to manage all of that. We could train other staff to get them up and running and host them on their own, which we do through Slate. 

Having virtual information sessions and other digital channels is something that definitely will continue going forward because it allows us to get our message to a broader audience. We're able to share what the school community is like, and what our financial aid can offer.

Marketers from a state school and a small private school both shared how Drupal empowered them to quickly adapt the digital experience: 

On our current site, the back end user experience is really difficult. We have no flexibility to change things when our strategy changes. It's a mess. So what we're building now in Drupal 8, we are very, very excited about. We've been working very closely with the team at Mediacurrent to improve the user experience for our authors and also being able to adapt to changes quickly. 

This year is nothing but pivot. I’m constantly making changes to the website. On our previous Drupal 7 site, I had a hard time adding blocks of content. Now, with Drupal 8 and Layout Builder, I’ve got everything that I need in my tool kit. I can go in on the pages to move around what I need to move around. I’m able to change the content up on a dime. 

What has been your experience working with Mediacurrent? 

All of our panelists agreed that finding the resources to launch new digital campaigns was a steep challenge. This Ivy League marketer summed it up best:

Staff in higher ed are stretched very, very thin. And at this moment, I'm finding that it's harder for us to be forward-looking. The availability and transparency with the Mediacurrent team have been wonderful. We’ve had many Mediacurrent developers working on our team over the past couple of years and as well as user experience and project managers. They’ve not only helped to find ways to improve our site and make the experience better for prospective students and current students but also to keep up with the necessary bugs and Drupal security features. 

The panelists also thanked the Mediacurrent team for being a reliable partner in uncertain times:

It was an enormous blessing not to worry about the development of our platform given everything else that was going for our school in response to the pandemic. I had complete trust in the Mediacurrent team and you didn't let us down.

I’ve needed things more quickly to adapt our strategy this year. As a digital agency partner, Mediacurrent did that. It’s made the difference between sinking and swimming.

What areas of your digital strategy do you see remaining post-pandemic?

An Ivy League project director reflected on how lessons learned from virtual learning may carry over to the classroom:

The particular program that I'm involved in is a specialized master's program targeted specifically overseas. In addition to all of the other travel-related concerns associated with the pandemic, there are also visa issues, et cetera. By necessity, we've been in this mode of, rather than bringing students to campus to sort of convince them to ultimately enroll, needing to do that in a virtual way. As with the classroom experience, we're hoping ultimately to get back to a non-virtual experience. But there are pieces of the virtual experience that we would think about trying to preserve, even in a nonvirtual world. 

We're anxious to get back into the classroom but there are pieces of the online experience that we've enjoyed and have started to think about how we can bring some of those elements into a physical classroom. Something as simple as the ability to interact with students via the chat function in Zoom. How do you think about taking that functionality and applying it in a physical classroom? And I don't know that we have any great answers yet. But it's very much something that we're thinking about.

This private Catholic college sees a data-driven website in its future: 

Partly because of COVID, my marketing department was moved into admissions. It's been great because we've had access to more data, so it's allowed us to be more targeted and granular in our advertising. Now I know down to zip codes where my most likely students are to come from. 

So it's been a real benefit in a time where we've had to be more efficient with what we're doing, what we're spending, what we're advertising. And it's kind of also the direction for where I want to go with the website. And that's making sure that all of my analytics, all of my CRM pieces, everything is hooked into Drupal and doing what it needs to do so that we can be efficient even after COVID.

Now What? Rethink Your Digital Focus 

Whether the goal is to boost enrollment, improve student retention, inspire, educate, or engage learners, your website plays a critical role. See how other institutions are adapting their digital experiences in our upcoming webinar. Join experts from Mediacurrent, Siteimprove, and Pantheon who have helped some of the best-known colleges and universities deliver engaging digital experiences. We hope to see you there! 

higher ed webinar banner

Save a spot for the webinar: Reimagining Your Higher Ed Web Strategy

Dec 28 2020
Dec 28

If you are deciding between a standard Drupal architecture vs. a decoupled one with a GatsbyJS front end (which I’ll call “Gatsby” from here on out), certainly the latter approach offers some compelling advantages. Decoupling Drupal may be easier than you think. A Gatsby front end offers a more flexible and compelling user experience across multiple channels, as well as excellent performance, security, and scalability. 

However, in developing an independent front end and back end, one must factor in the additional costs and financial benefits of doing so versus a traditional standalone approach. As described in our introductory blog post, Building a Business Case, a business case exercise can clarify costs and benefits in terms of the initial cost, ongoing costs, and revenues of each option. 

In this blog post we’ll perform this exercise for standalone vs. decoupled Drupal with Gatsby, performing the following steps: 

  1. Determine the costs of each option
  2. Determine the benefits for each option
  3. Recommend an option based on cost and benefits

Determine the Costs of Each Option

Two different types of costs need to be considered when evaluating options, the cost of the initial implementation, and the ongoing and operating costs once the implementation is complete. The following two tables approximate these costs. Note: these costs are highly theoretical approximations, only to be used as an illustration. Actual costs can only be calculated once detailed requirements are considered. 

Implementation Costs

The following table summarizes theoretical implementation costs for each of the two options. These entail the licensing and build estimates for each option. This illustration (again, theoretical numbers) assumes a simple implementation with little to no customization. 

Cost                 Standalone Drupal    Decoupled Drupal with Gatsby
Initial License      $0                   $0
Build                $100K                $180K
Hosting (average)    $30K                 $45K
Total                $130K                $240K

A few things stand out in this comparison:

  • Freedom from Licensing Fees: Of course, Drupal has no licensing fee, but if you use an open source front end technology like Gatsby, the licensing fee will be free for that as well.
  • Time and Effort Up Front: The initial effort for the decoupled build is greater than that for the standalone Drupal build. This is because front end technologies like Gatsby are newer, and in our experience, there are bumps in the road when working with them in an initial build.
  • The Stability of Drupal 8 and Drupal 9: Drupal 8 is stable and mature, having been at release level since 2016, and Drupal 9 is architecturally similar. With a standalone solution, there are fewer bumps in the road. 
  • Hosting: Hosting costs are greater with a decoupled solution as well. The cost of hosting services vary greatly and the figures we’re stating here are an illustration. Your actual hosting costs will certainly differ. That said, you will be paying for two hosting services with a decoupled solution, one to host the back end and another to host the front end, whereas a standalone solution requires just one.

Ongoing/Operating Costs

After implementation, each option incurs recurring costs. This is a critical consideration in evaluating options that often gets overlooked, and we recommend you always factor in these costs in your decision. For our example, the following table summarizes those costs (again, they are theoretical):

Cost                      Standalone Drupal    Decoupled Drupal with Gatsby
Ongoing License           $0                   $0
Maintenance and Support   $40K                 $20K
Hosting                   $30K                 $15K
Total                     $70K                 $80K

As with the initial cost, the ongoing costs of the decoupled option are higher, albeit by less; in our example, they’re almost equal. While hosting costs are again higher for the decoupled option, maintenance and support costs are lower. Although there are initial bumps in the road for a decoupled solution, our experience has also taught us that once those obstacles are overcome, the solution takes less effort to maintain because Gatsby, and the React framework it’s built upon, are simpler than Drupal’s front-end theming engine. They are easier to learn, and developers skilled in them are easier to find. 

Determine the Benefits of Each Option

For investment in any of these options, a benefit is expected, particularly in expenses saved or new revenue earned. Focusing on annual revenue, the following table summarizes the theoretical benefits of each option based on an assumption that the standalone Drupal solution would generate $1000 a day in revenue, or $365K per year:

                 Standalone Drupal    Decoupled Drupal With Gatsby
Annual revenue   $365K                $416K

In this example, and in many real-world circumstances, a decoupled option slightly improves a site’s revenue, all other things being equal. This is because, in addition to having a more powerful and flexible user interface, a decoupled solution is more performant for the end user. Further, a Gatsby front end typically runs as a static site on a CDN, with little to no database fetching occurring during a page load. According to sources like DomainRacer and MachMetrics, companies like Myntra, Amazon, Yahoo, Flipkart, and Mozilla saw a boost in business by bringing page load times under 2 seconds, and conversions are estimated to drop by 7% with a one-second delay and by a whopping 17% with a five-second delay. Further, page load time affects SEO rankings in Google, and slower websites have higher bounce rates. All of this impacts an organization’s reputation and customer loyalty.

Recommend an Option 

When we combine costs with benefits as described above, we are left with the following comparison: 

standalone vs Decoupled Drupal

A scenario for Standalone Drupal vs. Decoupled Drupal With Gatsby

In year 1, for example, the net amount earned by the standalone option is $365K - $130K, or $235K. In year 2, the cost is reduced to $70K, bringing the net amount to $295K. The above graph plots net revenue for the standalone option, in blue, and the decoupled with Gatsby option, in tan, over five years.
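
To make the arithmetic explicit, the yearly and five-year nets implied by the figures above can be written as follows (all amounts in $K; this assumes the ongoing costs apply from year 2 onward):

\[
\begin{aligned}
\text{Net}_{\text{year 1}} &= \text{Revenue} - \text{Implementation cost} \\
\text{Net}_{\text{year } n \ge 2} &= \text{Revenue} - \text{Ongoing cost} \\
\text{Standalone over 5 years} &= 5(365) - 130 - 4(70) = 1{,}415 \\
\text{Decoupled over 5 years} &= 5(416) - 240 - 4(80) = 1{,}520
\end{aligned}
\]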

So, in this example scenario, we can draw the following conclusions:

  • If your organization is prioritizing short term results, the standalone option is more attractive due to the stability and maturity of the Drupal platform. There are fewer early bumps in the road. 
  • If your organization is willing to make a greater up-front investment for greater long-term benefit, consider the decoupled option with Gatsby by virtue of its low long-term maintenance costs and revenue-boosting performance improvements. 

As in our previous blog post on building your CMS business case, our example scenario is overly simplistic. In reality, detailed requirements can radically alter both the costs and benefits of any option one considers. We at Mediacurrent have performed this type of analysis for some of our clients to help them with their technology investment decisions and can do the same for you. Please contact us to learn more!

Dec 24 2020
Dec 24

Happy holidays everyone. We’ve had three sites in the last two weeks that have reported reCAPTCHA problems. A captcha is the funny little thing at the end of forms that tries to prove you’re not a robot by having you spell out letters, or pick pictures with traffic lights. They’re annoying, but without them, many “contact us” forms and user registration forms can be hit with a crippling amount of spam submissions.

Newly discovered reCAPTCHA session validation errors (TL;DR)

One of our clients even recently called in for holiday support on this bug, so we’re pretty sure there are others dealing with this situation as well. They all have the same JS error and/or symptom:

CAPTCHA validation error: unknown CAPTCHA session ID. Contact the site administrator if the problem persists.

Diagnosing the session validation error

The root of this error, at least for one of our sites using simple_recaptcha, is pretty straightforward (but took us days to diagnose initially). There are two parts to the issue. A typical contact page has 2-3 forms on it: search, email sign-up, and the contact form. The submit button for each one has the same HTML ID. That's not valid HTML; IDs have to be unique. There's code in the reCAPTCHA, captcha, and simple_recaptcha modules that gets tripped up because of caching.

Each part (block) on the page is cached separately, so when someone visits the homepage and that gets cached, Drupal also just pulls the search form from its cache for other pages instead of rendering it from scratch.

If rendering the whole page from scratch, Drupal uses unique IDs in every form. One's submit button has the id #edit-submit, another gets #edit-submit--2, the other #edit-submit--3. But due to the caching behaviour with reusing existing blocks, we end up with:

  1. If you visit the homepage, which just has the search form, its button gets #edit-submit.
  2. When you visit the contact page, which has 2 forms that need to be rendered, the Sign-Up and contact form, Drupal uses ids #edit-submit and #edit-submit--2. But the cached search form also ends up in the header with an id of #edit-submit too.

The simple_recaptcha module does something quirky. When you click the submit button it runs some javascript to get a token from the reCAPTCHA service, which is fine, but then it has to re-click the submit button, and it does that by looking up the button by ID again. Looking up by ID gives the first element with that ID, the search form's submit button.

Similarly, the captcha and reCAPTCHA modules load the token, but it gets stored in the cache, which causes the same or a similar error to the one simple_recaptcha produces.

Helping reCAPTCHA modules deal with Drupal caching

The real bug is challenging and not fixable within our support scope of practice. More on the accessibility concerns and other issues below.

  • The fix for the simple_recaptcha module is to modify the JavaScript as described in this bug write-up.
  • If you’re using the captcha module, we recommend reviewing the discussion and various patches in #3089263.
  • For the reCAPTCHA module, perhaps consider using one of the other two modules instead.

Accessibility concerns

Even if you’re not using the reCAPTCHA module, duplicate IDs can still have an adverse effect on your site's accessibility, as well as confuse any JavaScript that relies on them. Here’s a quote from the linked accessibility issue:

“It may sound minor, but it's a major issue, as it is associating the search autocomplete functionality in the header with the views filters elements in the content. The end result is that JAWS thinks a regular select element in the views filters, is a combobox, which it is not, so it's pretty confusing to users.”

So it’s not just captcha related and is definitely going to need some real consideration from the caching experts that work on that piece of Drupal. Let us know on Twitter if you’re having the same issues and how you fixed them.


Dec 22 2020
Dec 22

Have you ever decided on investing in a product or service, but require approval from executive sponsors before proceeding with your purchase? Or do you like a product or service, but are hesitant to make the investment in it because it’s costly?

If either of these scenarios describes the situation you’re in, a business case is an excellent thinking tool that can help you with your decision. A business case articulates the rationale for investing in a product or service in terms of numbers. It is often presented in a formal document, but may also be presented through more informal methods like a spreadsheet or even a single slide in a slide presentation. It further provides a way to make your case to nontechnical decision-makers without needing to provide the specific intricacies about the method or approach to your desired product or service’s implementation. 

To build a business case, perform the following steps: 

  1. Define the options to evaluate
  2. Determine the costs of each option
  3. Determine the benefits for each option
  4. Recommend an option based on cost and benefits

Let’s look at each of these steps more closely.

Defining Your Options

When evaluating a new product or service, you instantly are faced with two options: purchase the new product or service, or stay with what you have. In many cases, decision-makers start with a bias toward the status quo because staying put avoids the cost of transitioning to a new technology and the business processes that go along with it. Further, if a new product or service is being considered, its competing options should be considered as well; for example, if you’re considering Drupal as a web content management platform, consider WordPress too.

Let’s illustrate with an example. Say you have a proprietary, licensed legacy content management system (CMS) that is poorly performing and difficult to maintain. You and your company are no longer happy with it and want to make a change. Your budget is limited, so an open-source solution is more attractive because it carries no licensing fees. Drupal is certainly an attractive option, but as just stated, add WordPress as an option. This brings our options to the following:

  • Drupal
  • WordPress
  • Status quo CMS

In a real-world scenario, it’s good practice to consider more options than this, including evaluating proprietary options like Sitecore and Adobe Experience Manager. For the sake of simplicity, however, let’s limit our example to the ones we’ve listed. 

Determine the Costs of Each Option

Two different types of costs need to be considered when evaluating options, the cost of the initial implementation, and the ongoing and operating costs once the implementation is complete. The following two tables approximate these costs. Note: these costs are approximations only to be used as an illustration. Actual costs can only be calculated once detailed requirements are considered. 

Implementation Costs

The following table summarizes theoretical implementation costs for each of the three options. These entail the licensing and build estimates for each option. This illustration assumes a simple implementation with little to no customization. Please note that these numbers are for illustrative purposes only. Real-world figures, driven by project requirements, will certainly differ.

Cost              Drupal      WordPress   Status Quo CMS
Initial License   $0          $0          $0 (already paid)
Build             $100,000    $50,000     $0
Total             $100,000    $50,000     $0

It comes as no surprise that the implementation cost of the status quo CMS is $0, because, after all, it has already been bought and configured. The licensing costs of all three options are $0 because Drupal and WordPress are open source offerings, and the license fee of the status quo CMS has already been paid. Had Adobe Experience Manager or Sitecore been considered, their license costs would be non-zero. In our example, we budget more for the Drupal build than that of WordPress because in our experience, with Drupal’s greater power and flexibility comes greater complexity, and that typically translates to greater investment in development hours. 

Ongoing/Operating Costs

After implementation, each option incurs recurring costs. This is a critical consideration in evaluating options that often gets overlooked, and we recommend you always factor in these costs in your decision. For our example, the following table summarizes those costs. To reiterate, these numbers are for illustrative purposes only. 

Cost/year                 Drupal      WordPress   Status Quo CMS
Maintenance and Support   $20,000     $10,000     $50,000
Ongoing License           $0          $0          $10,000
Total                     $20,000     $10,000     $60,000

Factoring ongoing costs makes it immediately apparent that the status quo CMS is costlier in the long term, both in maintenance hours spent and in annual licensing. As with their initial implementation costs, Drupal’s ongoing costs are greater than those of WordPress because, again, of Drupal’s greater complexity.

Determine the Benefits of Each Option

For investment in any of these options, a benefit is expected, particularly in expenses saved or new revenue earned. Focusing on annual revenue (again, just an illustration), the following table summarizes the benefits of each option. 

Benefit/year     Drupal      WordPress   Status Quo CMS
Annual revenue   $200,000    $150,000    $100,000

Finally, we see where the investment in Drupal’s complexity pays off; its greater functionality typically means a greater ability to meet ever-evolving needs. We translate this in our example to a greater annual revenue than that of WordPress, and both CMSs, being more functional than the status quo CMS, are capable of generating more revenue. 

Recommend an Option 

When we combine costs with benefits as described above, we are left with the following comparison:

Cost/benefit comparison chart for Drupal vs WordPress

The above graph makes the following assumptions:

  • In Year 1, the Drupal and WordPress options earn no revenue because they’re being built
  • Conversely, the status quo CMS is earning revenue in Year 1 because it has already been built (a worked five-year comparison follows below)
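
Under those assumptions, and further assuming ongoing costs only begin once a platform is live in year 2, a rough cumulative five-year picture from the tables above works out to:

\[
\begin{aligned}
\text{Drupal} &= -100{,}000 + 4 \times (200{,}000 - 20{,}000) = 620{,}000 \\
\text{WordPress} &= -50{,}000 + 4 \times (150{,}000 - 10{,}000) = 510{,}000 \\
\text{Status quo CMS} &= 5 \times (100{,}000 - 60{,}000) = 200{,}000
\end{aligned}
\]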

So, in this example scenario, we can draw the following conclusions:

  • If your organization is prioritizing short term results, staying with the status quo CMS is the best option.
  • If your organization is willing to make a moderate up-front investment for moderate long-term benefit, WordPress is the best choice.
  • If your organization is willing to make a greater up-front investment for greater long-term benefit, then Drupal is the way to go.

Admittedly, our example scenario is overly simplistic. In reality, detailed requirements can radically alter both the costs and benefits of any option one considers. We at Mediacurrent have performed this type of analysis for some of our clients to help them with their technology investment decisions and can do the same for you. Please contact us to learn more!

Dec 18 2020
Dec 18

Rain logo updated

The Mediacurrent team is proud to introduce the latest version of our open source Rain CMS, now for Drupal 9. The latest version ships (currently in beta) with a brand new theme and admin experience. Make your move to Drupal 9 with speed and confidence. 

The New 'Nimbus' Theme

Homepage design for Mediacurrent's Rain CMS for Drupal 9 Nimbus theme

New Rain CMS theme homepage

The new base theme comes with a full Pattern Lab style guide to make development easier using a component-based theming approach. The new Rain CMS theme also adds color module support for in-browser color configuration. We wanted a great out-of-the-box experience for content authors and administrators, so we pre-load the site with sample content to make it simple to get started.

Rain Admin 2.0

Rain CMS for Drupal authoring experience with jump navigation to manage page components

Rain CMS admin content edit page

With the latest version of Rain Admin, the content editing experience is more intuitive to use. A jump-navigation on the right-hand side helps authors move between different sections of the page. The UX for Paragraphs has also been improved to make it visually easier to manage components on the page.

Let’s Get Started

For developers, you can download and install Rain CMS 9 using our Drupal project template: https://bitbucket.org/mediacurrent/drupal-project/src/9.x/. Setup and installation remain the same as our previous version, with detailed instructions provided in the project README file.

For more information on the benefits of Rain CMS D9 and to schedule a free demo, please visit our contact page or chat with us right now (see bottom right corner of the page). We would be happy to talk more about your project or schedule a demonstration.

Dec 17 2020
Dec 17


Recently, Drupal has been on an update rampage. The introduction of the oh-so-beautiful Drupal 9 core has spurred a chain reaction of upgrades across the Drupal platform. Just this week, we’re getting a new default theme (which is hyper-minimalist and easy-on-the-eyes), a 20% reduction in install times, and automated lazy load for images. But let’s talk about the juiciest UI/UX update that came with Drupal 9 — the standardization of Drupal’s Layout Builder.

If you’ve built a pre-Drupal-9 website over the past few years, you probably dabbled with Panels/Panelizer, WYSIWYG templates, or even custom coding to set up your UX/UI. And that works. We did it for years. But you can throw that worn-out Panelizer module in the trash. The times, they are a-changing. Drupal’s new Layout Builder module combines the core functionality of Panelizer with an out-of-the-box WYSIWYG engine.

First hinted at in 2017, Drupal Layout Builder officially left the onerous Drupal testing pipeline last year as part of Drupal’s 8.7 updates. Despite circulating for a year now, this potent and flexible tool has been overshadowed by the chaos of 2020. So, let’s talk about it. Here’s what you need to know about Drupal’s Layout Builder.

What is Layout Builder?

Layout Builder is a WYSIWYG page editing engine that lets you manipulate back-end features via an easy-to-use drag-and-drop interface. It’s difficult to overstate just how valuable Layout Builder is when it comes to time-savings. You can create templates in minutes, immediately preview and create content changes, and tweak page-by-page UI/UX features to create more cohesive and on-the-fly websites and landing pages.

At its core, Layout Builder is a block-based layout builder. You can create layouts for either a single page or all content of a specific type. In addition, you can jump in and create rapid-fire landing pages based on your existing design theme. There are three “layers” that Layout Builder operates on to help you build out holistic websites.

  1. Layout templates: You can create a layout template for all content of a specific type. For example, you can make a layout template for your blog posts or a layout template for every product page. This template will be shared across all pages, so you don’t have to go in and rebuild for each content type.
  2. Customized layout templates: You can also go in and make granular changes to a specific layout template. So, if you want a certain product page to be different than the layout template, you can make granular changes to just that page.
  3. Landing pages: Finally, you can create one-off pages that aren’t tied to structured content — like landing pages.

Important: Founder of Drupal — Dries Buytaert — dropped a blog post with some use cases for each of these layers.

To be clear, Layout Builder isn’t a WYSIWYG template. It uses your existing template. Instead, it allows non-developers (and lazy-feeling developers) to quickly make per-page changes to the website without diving into code. But these aren’t just simple changes. You can create a layout template for every page type (e.g., creating a specific layout for all the shoes you sell), and you can also dive into each of these layout pages to make custom changes. So, it really lets you get granular with your editing without forcing you to completely retool and redesign pages for each type of content. This gives Layout Builder a massive advantage over WordPress’s Gutenberg — which requires you to go in and re-lay elements for every page individually.

Here’s the kicker: you get a live preview of all changes without bouncing between the layout and the front end. Every block and field you place and every change you make to taxonomies or content is visible the second you make the change. The entire process takes place on the front end, and changes are instantly visible. Remember, Layout Builder is part of Drupal’s core, so you don’t need to implement new entity types or dig into third-party elements. It’s an out-of-the-box experience.

Advantages of Layout Builder

Last year, we got a gorgeous, picture-perfect demo of how Layout Builder would work. It’s beautiful, fast, and packs a punch that other leading layout builders are indeed missing. So, to help unpack the value of Layout Builder, let’s look at some of the advantages of Layout Builder:

Customization

Beyond Layout Builder’s incredibly powerful and customizable block-based design engine, it offers customization in usage. Let’s say you want to create an amazing landing page. You can start with a blank page that’s untied to structured content, drop in some hero images, a few pieces of text, some content, and a video. Suddenly, you have a custom landing page (complete with modules, blocks, and taxonomy) that exists in a separate ecosystem from your website.

Simultaneously, you can create a template for every blog post, then dive into a specific blog post and make one-off changes to just that page while still being tied to your structured content. Remember, you can make these changes nearly instantly, without touching code. And you’ll see a live preview of every change immediately without switching between interfaces.

Accessibility

Drupal is committed to accessibility. The second principle of Drupal’s Values & Principles page reads, “build software everyone can use,” and this rings true. Layout Builder meets Drupal’s accessibility gate standards (i.e., conforms to WCAG 2.0 and ATAG 2.0, text color has sufficient contrast, JavaScript is keyboard-usable, etc.)

Ease-of-use

Like many WYSIWYG editors, Drupal Layout Builder is all about “blocks.” But these aren’t your run-of-the-mill blocks. There are inline blocks, field blocks, global blocks, and system blocks. Each of these has its own use case, and you can combine these block types to create stellar pages in minutes. For example, global blocks are used to create templates, and inline blocks are used to create page-specific changes that don’t impact the layout. The combination of these block types makes Layout Builder a hassle-free experience.

Additionally, there are plenty of ease-of-use features built into the core. Layout Builder works with the keyboard, has plenty of usability features that tie to Drupal’s value statements, and allows nit-picky setups for customized workflows.

Creating a Drupal Website is Easier Than Ever

With Layout Builder, users can generate valuable content and pages without needing to patch together various WYSIWYG tools or Panel/Panelizer. At Mobomo, we’re incredibly excited for our clients to dive into Drupal Layout Builder and make actionable and memorable changes to their templates based on their in-the-moment needs and experiences.

But Layout Builder isn’t a replacement for a well-designed and well-developed website. We can help you build your next world-class website. Once we’re done, Layout Builder gives you the freedom to make substantial changes without the headaches, back-and-forth, or unnecessary touchpoints. Are you ready to create a customer-centric, experience-driven digital space? Contact us.

Dec 17 2020
Dec 17

You started your ecommerce site with Shopify, and you obviously did it right, because you are continually seeing growth. That’s fantastic! But you are starting to see areas where you are outgrowing Shopify. What are the signs you should be looking for to make sure that your business keeps growing and not be held back by your ecommerce platform?

Seven areas to look at if you think you are outgrowing Shopify:

In this blog, we will take a look at a few things you can do if you think you are starting to outgrow Shopify.

Look at the time you spend on manual processes.

Are you spending too much time every month manually inputting information into your CRM, ERP, or inventory systems because there either isn’t an integration available for Shopify or it is not functional? Is there a platform that can automate some of those processes? The time you save can then be put back into important business tasks for strategic growth.

Look at all the 3rd party integrations your site needs.

Or will need. You’ll want to integrate with vendors to keep your ecommerce machine rolling. These integrations should enable you to more securely accept payments, provide more buying and delivery options for your customers, and more. Find an ecommerce platform that easily accepts these kinds of integrations and is securely developed so that you don’t wind up with a liability down the road.

Look at your user experience and how you want your brand to be perceived.

Does your site look and feel like the brand you want it to represent? Is there something missing that the Shopify themes are just not giving you? Does your brand stand out from the rest of the Shopify templated storefronts? Will you be able to make effective changes within the restrictions of Shopify’s templates?

Look at the reporting you are getting.

Are you getting the reporting you need? Limited reporting makes it hard to develop a long-term business strategy. Better reporting means better decisions and more growth.

Look at the costs of “renting” Shopify’s platform.

Adding plugins to your base software-as-a-service (SaaS) platform for customer-friendly and value-add features comes with a price. The costs of those modules add up and could be seriously eating away at your bottom line every month. Consider the long-term advantage of owning your ecommerce website by creating it with an open source platform like Drupal and Drupal Commerce.

Look at your data.

Did you know that your Shopify store is generating a ton of data, but that you don’t own it? Shopify ties up your data, known as service lock-in, so that transferring your data becomes a cumbersome process, especially if you have a lot of it.

Look at the effectiveness of your CMS.

Shopify is not meant to be a content management system, so when it comes to adding videos and rich media, linking pages with products, and managing large amounts of content like a blog, your customer engagement and search engine optimization (SEO) can suffer. Shopify can also make it hard to properly implement the SEO tactics that would contribute to your growth.

For small to medium-sized businesses, Shopify is the best in class for ease of getting online, standardization and generally doing the basics in ecommerce.

As your business grows you may quickly learn that the things that made Shopify so attractive in the first place are the elements that are holding you back. If you have taken the time to really look at the above items and find that Shopify still meets your needs, then great.

If you are in a position where you feel you may be starting to outgrow your ecommerce platform, the sooner you start looking at other options, the better. Moving away from a standardized platform only gets more complicated as you add more data, plugins and services to your current site.

Does all of this information leave you feeling a little overwhelmed? Don’t worry, we get it.

For something as crucial as your ecommerce platform, you want to make sure you are doing the right thing. That is where we come in. We live and breathe this stuff. Our ecommerce consultants and subject matter experts are open source commerce pros and are always happy to give as much advice as you need. Reach out today and let’s talk about what your next ecommerce platform should be after Shopify.

Don't struggle with a platform that can't adapt to your needs. Talk to the experts who help businesses facing the same problems every day. Have a no-obligation conversation today.

Nov 19 2020
Nov 19

Is Drupal’s open source platform secure?

When deciding on the best CMS to meet your organization’s digital vision, security is often one of the top concerns. 

Here’s the reality. ALL software (closed source, open source, or custom-developed) has the potential for security vulnerabilities. Web security is a fast and ever-changing world. What passes today as secure code may not stay the same tomorrow when new vulnerabilities surface.

There's peace of mind in knowing not only is Drupal a proven, secure CMS but that it's also in an active state of safeguarding against attacks. 

With proper planning, maintenance, and updating, open source software, like Drupal, can meet and even exceed the security standards of closed source. 

- Mediacurrent’s Mark Shropshire, Senior Director of Development, quoted in an excerpt from Setting the Record Straight on Drupal Myths: Acquia eBook 

Security and The Drupal Community

Open source software, like Drupal, has the bonus of having thousands of experts work on a particular problem. With entire teams and methodology devoted to ensuring its steadfast reputation as a secure CMS, it's comforting to know modules and code procured from the official Drupal site are as secure as possible.

Using Drupal means you never have to face these risks alone, let alone attempt to correct the problem by yourself. There's power in numbers when it comes to both discovering and fixing potential software flaws, and the Drupal community has those numbers.

Dedicated Security Team

The Drupal project has an approximately 32-person security team with a track record of professionally handling security advisories. 

Community Code Review

One of the largest developer communities in the world, Drupal.org clocked 100,000 contributors and 1.3 million users at the time of Drupal 9’s release. Having many eyes on the source code ensures more issues are discovered and resolved. 

Rapid Response Time

Defined processes around reporting and resolving security issues accelerate the response time to fix vulnerabilities and release patches. 

Core Security

Core API tools and techniques address common security risks. Community projects such as the Guardr distribution help educate the community on best practices around Drupal security.


The Guardr distribution was created to enhance a Drupal application's security and availability to meet enterprise security requirements.

Proven High Standards 

Drupal-based organizations around the world — including enterprise-level brands, major universities, government, and large non-profits — put Drupal’s high security standards to the test every day.

Drupal Security Throughout the Website Process

The Drupal community has built-in security measures to combat threats — reassuring for sure. To proactively protect your site, the concept of security needs to be at top of mind when campaigns are being launched, systems/applications are being integrated, or when software is deployed or updated. 

Security-First 

A security-first approach means going beyond compliance to better assess risk. There are two paths to achieve this approach:

1) Culture: Adopting a security mindset culture for your organization. 

2) Automation: Taking on a continuous development plan that’s rooted in process automation.

In other words, start planning for security early and often throughout the website development process. 

[Image: Phases of a web project (discovery, design, development, quality assurance, deployment, support)]

Don’t wait until the project is about to launch to think about security! Explore our Guide to Open Source Security eBook for tips and processes to consider when putting together a security-first maintenance plan for your website and marketing tech stack.

Developer Best Practices 

Here are three ways to safeguard your Drupal site: 

1. Choose the Right Modules

If you can dream up a feature for your site, chances are it can be found in the tens of thousands of community-contributed modules available for Drupal. With so many different options to pick from, how do you choose the most secure modules possible? Some steps to take are checking for how many sites are using the module, reviewing the issue queues, and avoiding deprecated or unsupported modules. 

Find more criteria for module decision-making in our guide to Drupal module evaluation.

2. Use Drupal APIs

Look to the Drupal API documentation to secure your contrib or custom code on a project. Drupal APIs have been nurtured by the community and have built-in protections for database security. If you do write new code, whether it’s a small amount or a completely new module, the “Writing Secure Code for Drupal” guide is a must-read reference. 
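
To make that concrete, here is a minimal sketch of what “using the APIs” looks like for a database query. The table, the $title variable, and the query itself are illustrative rather than taken from any specific module:

  <?php
  // Minimal sketch, not from any particular module: a query built with the
  // Database API's named placeholders instead of string concatenation.
  // Assumes a bootstrapped Drupal site (e.g. run from a custom module or
  // "drush php:script"), and that $title holds user-supplied input.

  // Risky: user input concatenated straight into the SQL string.
  // $nids = \Drupal::database()
  //   ->query("SELECT nid FROM {node_field_data} WHERE title = '" . $title . "'")
  //   ->fetchCol();

  // Safer: named placeholders let the API handle quoting and escaping.
  $nids = \Drupal::database()
    ->query('SELECT nid FROM {node_field_data} WHERE title = :title', [':title' => $title])
    ->fetchCol();

  // On output, let the render system escape user-supplied text.
  $build['title'] = ['#plain_text' => $title];

The same principle applies throughout: letting Drupal’s APIs handle queries, output escaping, and access checks keeps your custom code covered by protections the community has already hardened.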

3. Monitor Drupal Security Advisories 

The Drupal security team posts weekly security advisories to Drupal.org.

To keep up with security releases, you can sign up to receive email notifications through your Drupal.org profile options, follow the RSS feeds in your news reader (core, contrib, and public service announcements), follow @drupalsecurity on Twitter, or join the #security-questions channel in the Drupal Slack.
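
If you want to automate part of that monitoring, the advisories are also published as RSS. Here is a minimal sketch that polls the core advisory feed and prints recent entries; it assumes the core feed URL published on Drupal.org’s security pages, and the contrib or PSA feeds can be swapped in the same way:

  <?php
  // Minimal sketch: print recent items from the Drupal core security
  // advisory RSS feed. The URL is the core feed published on Drupal.org;
  // adjust it for the contrib or PSA feeds as needed.
  $feed_url = 'https://www.drupal.org/security/rss.xml';

  $xml = simplexml_load_file($feed_url);
  if ($xml === FALSE) {
    exit("Could not load the advisory feed.\n");
  }

  foreach ($xml->channel->item as $item) {
    printf("%s  %s\n", $item->pubDate, $item->title);
  }

A script like this can run on a schedule and feed a chat channel or ticket queue, so a Wednesday advisory never goes unnoticed.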

Sleep Better With a Secure Drupal Site 

For more best practices, check out the Mediacurrent presentation Sleep Better with a Secure Drupal Site: 


Are you ready to build a strong foundation for Drupal security but unsure where to start? There's a lot to consider in your security plan but it doesn't have to keep you up at night. Contact the Mediacurrent Security team for support. 
