Jun 03 2021

Editor’s note: This is the first post in a three-part Highlighting Accessibility series.

In recent years, accessibility has become an essential factor for teams tasked with bringing new products, services, and experiences to market. The reason for this is clear: design works best when it works for everyone. Historically, though, either out of ignorance or intention, we have often designed with the assumption that all users are alike in their abilities to perceive and interact, but more and more we’re recognizing that just like food & nutrition, humans vary considerably in their preferences and restrictions. 

To be fair, it is much easier to assume that our users are largely homogeneous; that they not only share the same preferences and goals but also the same ability to achieve those goals. As we learn more about the nature of human cognition and interaction, we come to appreciate how a whole host of physical, mental, and even social characteristics can influence the fundamental ways we perceive the world.

So what is accessibility?

Broadly speaking, accessibility describes the degree to which something can be entered or used. Architects and civil engineers have long incorporated things like wheelchair ramps and motion- or button-activated doors to improve accessibility in the physical world; businesses often offer telecommunications device for the deaf (TDD) capabilities on sales and customer service phone lines; car retailers offer vehicle modifications to make them wheelchair accessible and operable by persons with a whole range of physical disabilities. All of these enhancements are the result of painstaking work by affected groups and their advocates toward gradual recognition of, empathy toward, and laws protecting the varying needs of all human beings.

“The digital world has changed so many of our lives for the better and it’s crucial to make sure the same can be true for everyone.”

Troy Shields, Sr Front End Engineer, Zivtech

In the digital world, these enhancements are defined by the Web Content Accessibility Guidelines (WCAG), put forth by the World Wide Web Consortium’s (W3C) Web Accessibility Initiative (WAI). The WAI is one of the foremost advocates for the needs of all users, and the WCAG and related guidelines serve as a north star for organizations and project teams looking to justly serve users of all abilities. The guidelines are extensive and cover a wide range of topics and best practices, but the best place to start familiarizing yourself with accessibility is their four principles:


1. Perceivable

Information and user interface components must be presentable to users in ways they can perceive.

Naturally, this is a given for any interface we create, but it’s easy to overlook how the nature of perceivability varies by ability. For low-vision users, a simple thing like an icon, button, or image may not be perceivable at all. In this and similar scenarios, designers and builders need to account for the impairment with consistent use of alt text and other modifications. Now think of a chart or infographic: how much more difficult might it be for a low-vision user to consume that information?
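In Drupal terms, one small but concrete habit is always supplying alternative text when rendering images. A minimal render array sketch, where the element name, file URI, and alt text are purely illustrative:

$build['hero'] = [
  '#theme' => 'image',
  '#uri' => 'public://hero-banner.png',
  // Alt text gives screen reader users a way to perceive the image.
  '#alt' => t('Volunteers planting trees at the spring fundraiser'),
];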


2. Operable

User interface components and navigation must be operable.

Once we’ve made our user interface fully perceivable, we must then ensure interactive elements are usable. This can be really challenging because there are so many adaptations some of our users need to make in order to interact with the physical and digital world, most of which we take for granted. Users with visual impairments may have a hard time discerning and using things like mouse-over menus, confusing or unlabeled buttons, and the like. Moreover, users with physical disabilities may have a hard time completing form elements where multiple simultaneous maneuvers are required, e.g., CTRL + click.


3. Understandable

Information and the operation of user interface must be understandable.

Understandable design creates and maintains consistent conventions that help users anticipate how the interface will perform as they progress through their journeys. It’s predictable in its use of design patterns and information architecture. Amazon sells a wide range of products, but its layout for books is the same as it is for sporting goods, as it is for clothing. Imagine if every product or product category had a unique layout - how challenging might that be for a user to anticipate where they might find the product specs or customer reviews?


4. Robust

Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies.

As the movement to accommodate people of different abilities has gained steam, so has the development of assistive technologies to help them. Today, there is a wide range of tools and devices designed to deliver improved experiences for users with a wide range of physical abilities. Unfortunately, these assistive technologies do not all perform the same, so designers and builders must work to deliver a consistent experience. Consider web browsers like Chrome, Firefox, and Internet Explorer. All of these tools are built around standardized languages and frameworks like HTML (hypertext markup language), yet they all perform slightly differently. Imagine how broad the discrepancies might be for tools that aren’t based on any widely accepted standards.


BONUS: Receptive

Interfaces should offer ways for users to provide feedback on their experience.

We humbly offer this as a fifth principle for your consideration. In the spirit of continuous improvement, we should always be looking to enhance the experiences we create, and that often starts by capturing feedback and learning from those we serve. Even with many hours of accessibility training, and several successful projects under our belt, our team recognizes that we can always get better at delivering exceptional experiences. We should all strive to be open to learning and feedback, and more consistent in our willingness to apply what we learn.

...

Overall, the WCAG Accessibility Principles seek to inspire empathy in designers and builders who are committed to creating digital experiences that work for all. They help establish internal project standards that are more thoughtful, curious, and ultimately more inclusive. The principles should help organizations and project teams factor accessibility in at the start of the project instead of just running some tests at the end. Indeed, accessibility is best executed when it’s built into the core processes of how we work: standardizing up-front approaches to elements like navigation and information architecture, being consistent with the use of interactive elements, and so on.
 

"Accessibility should not be an afterthought, it should be an integral part of your design strategy right from the start. "

Barry Brooks, Creative Director, Zivtech

Indeed, delivering truly accessible experiences requires additional effort and forethought, but it’s absolutely worth it. When we enable more of our users to succeed in their journey, we succeed as well.

Apr 30 2021
hw

This DrupalFest series of posts is about to end and I thought that a fitting end would be to talk about the future of Drupal. I have already written a bit about this in my previous post: what I want to see in Drupal 10. However, this post is more aspirational and even dream-like. Some of these ideas may sound really far-fetched and it may not even be clear how to get there. For the purposes of this post, that’s fine. I will not worry about the how; just the what.

Modular code

Drupal is already modular; in fact, it is rather well designed. As Drupal grows and adds new features, the boundaries need to be redefined. The boundaries that made sense years ago for a set of features may need to be different today. The challenge is in maintaining backwards compatibility for the existing code ecosystem around Drupal. It also means fundamentally rethinking what Drupal’s constructs are. The transition from Drupal 7 to 8 was iterative and, in many cases, systems were simply translated from procedural to object-oriented code. We need to rethink this. We should not be afraid to use the newest trends in programming and features from PHP, and rethink what architecture makes sense.

If you’re thinking this sounds too much like rewriting Drupal, you’re right. And maybe the time isn’t right for that, yet. I do believe there will be a time when this will become important. This will also help with testability, which was the topic of my previous post about Drupal 10.

Assemblable code

Once we have code that’s modular, we can break it off into packages that can be reused widely. This is already on the minds of several core developers and is the reason why you see the Drupal\Component namespace. I want to see more systems being moved out, not just into a different namespace but into different packages entirely.

The direct upshot of this is that you can use Drupal’s API (or close to it) in other PHP frameworks and applications. But that is not my main interest here. My interest in seeing this happen is that forcing the code to be broken up this way will result in extremely simple and replaceable code. It will also result in highly understandable and discoverable code, and that is what I want. Drupal core is opaque to most people except those who directly contribute to it or to some of the complex contrib modules. I believe that it doesn’t have to be, and I hope we get there soon.

Microservices

Taking this even further, I would like to see some way to break down Drupal’s functions over different servers. I am thinking there could be a server dedicated to authentication and another for content storage. This is a niche use case and therefore a challenge to implement in a general-purpose CMS such as Drupal. It would need kernels that can run in a stateless way, and it would mean breaking up systems so that they can run independently using their own kernels. The challenge is that all of this will have to be done in a way that won’t increase the complexity of the existing runtime. This, as you might have guessed, is contradictory.

More themes

I might not have thought about this some time back. But with the renewed interest in the editor and site-builder, I think it makes sense to improve the theme offerings available to Drupal. There were many times when I built powerful functionality but then presented it with a completely unimpressive look. Olivero and Claro are big steps in this direction and I am hoping for more of the same quality. If there had been more, I wouldn’t have run Contrib Tracker for over three years with the stock Bootstrap theme (even the logo was from the theme until recently).

Unstructured Content

Drupal is great at structured content, but I see Drupal used more and more for unstructured pages. The way we do this now is to use structured building blocks such as Paragraphs to represent components, but this method has its shortcomings. Paragraphs and similar concepts quickly get too complicated for the editor. Editing such pages is very painful and managing multiple pages with such content is tricky.

We now have Layout Builder, which is a step forward in addressing this problem. The ecosystem around the Layout Builder modules is scattered right now, in my perception, and I would love to see where the pieces land (pun intended). I would like to see Drupal fundamentally recognize these pages as separate from structured data so that the editing workflows can be separate as well. More on that in the next section.

Better media and editor experience

I know media handling is a lot better today than it was before, but there’s still a long way to go. When working with pages (not structured content), I would like a filtered editing experience that lets me focus on writing. WordPress does this part well, but it has a specific target audience, which lets it make that decision. This is the reason I previously said I would want Drupal to treat unstructured pages differently.

With a more focused editor experience, I would want better media handling as well. I want to paste images directly into the editor without worrying about generating resized versions. I also want pasted images to be handled somewhat similarly to the current media elements so that we can still deduce some structure. Even tiny things like expanding tweets inline and formatting code correctly could be supported in the editor and that could make a big difference.

Starter kits

We need ways to spin up sites for varied purposes quickly, but I don’t think distributions are a good answer, at least not the way they are implemented right now. Once you install a distribution, you are stuck with it and have to rely on distribution maintainers for updates. I think that’s a waste of effort. I know it’s possible to update parts of a distribution without waiting for a release, but the process is harder. And if a distribution is going out of support, it is difficult to remove it; almost impossible if the distribution authors didn’t design for it. We already have challenges with migrations and upgrades; why add distributions on top of that?

Instead, I would like to see some support for starter kits within Drupal core. Instead of distributions, you would install Drupal from a starter kit which would point to modules and contain configuration. Once installed, it would be as if the site was installed manually. I believe a lot of such solutions are also possible outside Drupal core (using drush, for instance) but it would be nice to have some sort of official interface within the core.

What else?

I know that all of the things I mentioned are possible in part or have challenges that involve tradeoffs. It’s not easy to figure out which tradeoffs we should make considering that Drupal’s user base is massive. For example, what I wrote about today are my wishlist items for Drupal. They may be something completely different for you and that’s okay. I believe the above are already aligned with the vision set for Drupal by Dries and the core team (ambitious experiences, editor-first, site-builder focused, etc). With that as the North Star, we should be able to find a way forward.

That is it for now. I can dream more about the removal of features (such as multisite) but that’s for some other day. Let me know what things you would like to see in Drupal.

Apr 29 2021
hw

The configuration API is one of the major areas of progress in Drupal 8. It addresses many of the challenges of managing a site across environments that existed in Drupal 7 and before. It’s not perfect. After all, it’s just version 1 and there is work going on in CMI 2 to fix the problems in the current version. That is not the subject of this blog post, however. In this post, I want to talk about one of the lesser understood features of configuration management: overriding.

Most site builders and developers interact with the Drupal configuration API through “drush config-export” and “drush config-import”. With minor exceptions, Drupal configuration is an all-or-nothing deal, i.e., you apply the configuration as a whole or not at all. There are modules like config_ignore and config_split to work around these limitations; they allow ignoring parts of the configuration or splitting and combining configuration from different locations. Further, Drupal allows overriding specific configuration through settings.php. A combination of all of these makes Drupal configuration workable for most complex site-building needs.

The Configuration API

The configuration API provides ways to support all of the different scenarios described above. There are very simple constructs comparable to Drupal 7’s variable_set and variable_get functions. If you want to write a simple module that needs access to configuration, the documentation is enough to work out all of the details for you. Yet, there are some lesser-understood parts of how configuration handles cases such as overriding and other edge cases. I have seen this often when interviewing people: even though it is well documented, its lack of visibility explains why people are not aware of it.

As I said before, I will mainly talk about the overriding system here. You are welcome to read all about the configuration API from the documentation. Specific areas I would recommend reading about are:

The configuration API needs to handle overrides specifically because they could affect exported configuration. If you override configuration in the settings.php file and the config system didn’t know that, those overridden values would be exported to the configuration files. This is not what you want: if your intention was to export the value, there was no reason to set it in settings.php in the first place. That is why the configuration API needs a mechanism to know about these overrides.
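For instance, an override in settings.php looks like the following; system.site is a core configuration object and the value shown is purely an example:

// In settings.php (or settings.local.php): override the site name for this
// environment only. The override is applied at runtime and never appears in
// the exported configuration files.
$config['system.site']['name'] = 'My Site (staging)';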

Overriding configuration

The configuration API can handle these overridden values separately and also make them available to your code. This is why if you are exporting configuration after setting the values in settings.php, the overridden values won’t be present in the exported YML files. As a site builder, you won’t have to worry about this. But if you are a module developer, it helps to understand how to differentiate between these values.

Before I talk about the code, let’s consider the scenarios in which you would need this. As a module developer, you might use the configuration in different ways and, for each, you may need either the overridden data or the non-overridden (original) data.

  • For using the configured value in your logic, you would want the overridden data. Since you are using the value in your logic, you want to respect Drupal’s configuration system of overriding configuration.
  • For showing the value in a form, you might want to show the original data. You can choose to show overridden data but Drupal core shows the original data. There is an issue open to change this. Right now, there is a patch to indicate that the data is overridden and changes won’t take effect on the current environment.
  • While saving values to configuration, you will always be working with the original data. As a module developer, you cannot affect the overridden value through the configuration API. In fact, to save the configuration, you need to call getEditable, and that always returns the original data. When you set the new value and save it, you change the value in the configuration storage. Even then, the override from settings.php will take precedence at runtime.

Accessing overridden values

If you have built modules, you might already be aware that you can use a simple get call on the config service to read a value from the configuration (see example). This will return an immutable object which is only useful for reading. To save it, however, you would need to go through the config factory’s getEditable method (example). This will return an object where you can set the value and save it.

While it may seem that mutability is the only difference between the “get” and “getEditable” methods, there is a subtler one. The “get” method returns the overridden data whereas the “getEditable” method returns the original data. This means there is a chance that these two methods return different values. As a module developer, it is important to understand this difference.

What if you wanted to get an immutable object but with original data? There’s a method for that: getOriginal (see example). Most modules won’t need to worry about the original data in the configuration storage except when saving. For that reason, it is not common to see this method in use. Even if a module were to indicate the differences in original and overridden configuration, it can use getEditable in most cases.
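As a small sketch of the three calls, using the system.site configuration as an example:

// "get" returns immutable configuration with overrides (e.g. from
// settings.php) applied.
$site_name = \Drupal::config('system.site')->get('name');

// "getEditable" returns mutable configuration with the stored (original)
// values, suitable for saving changes.
$editable = \Drupal::configFactory()->getEditable('system.site');
$stored_name = $editable->get('name');

// "getOriginal" can read the original data from an immutable object; passing
// FALSE as the second argument skips the overrides.
$original_name = \Drupal::config('system.site')->getOriginal('name', FALSE);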

The difference between “get” and “getEditable” has been the trickiest interview question I have ever asked. Out of hundreds of interviews, I have only occasionally seen someone answer it. I hope this post helps you understand the difference and why the subtlety matters.

Apr 28 2021
hw

Just as I published yesterday’s article on the tech stack, I realized that I had missed a few important things. I had said that the list was only a start, so I think it is fitting to continue it today. As such, today’s post won’t be as long as yesterday’s or even as long as my usual posts. I should also add the caveat that today’s post won’t make the list complete. With the industry being what it is and the constant need to keep learning, I don’t think such posts stand the test of time; not from a completeness perspective, in any case.

Our target persona still remains the same as the last post. We are talking about a Drupal developer building sites and functionality for a complex Drupal website with rich content, workflows, editorial customizations, varying layouts, and more features typically found in a modern content-rich website. Since today’s post focuses on the fundamentals of working on the web, the skills may apply to more than a Drupal developer but we’ll focus on this persona for now.

These are also skills that are often hidden, a.k.a. non-functional requirements. It is very easy to forget about them (as was evidenced by yesterday’s post), but they are probably the most important of all the skills. Not understanding them sufficiently leads to significantly larger problems. Fortunately, Drupal is a robust framework and is able to handle such issues by default. It is still important to understand how to work along with Drupal.

Performance

Performance is a very interesting challenge for a large dynamic website. There are several layers to unravel here and the answer to poor performance could be at any of those layers. This means a developer needs to understand how the web works, how Drupal works, and how Drupal attempts to optimize for performance.

It is shocking how many people are unable to clearly articulate how the Internet works. To me, the lack of clear articulation is a sign of gaps in understanding the chain of systems that make the Internet work. To begin with, understand the sequence of events that takes place when someone visits a link. Understanding DNS is important too. I know no one really understands DNS fully, but you don’t need that. Get familiar with the interactions that take place between the client and the various systems such as DNS. Similarly, get familiar with CDNs and other caching techniques that you can use.

Scalability is a different problem from performance but still very important to understand. Understand what load balancers are and how they decide to route traffic. Also, understand how you can configure an application to work in such an environment. For this, you also have to understand the concept of state. Fortunately, PHP is already designed to be stateless and has a natural aptitude for such scalability (and the serverless paradigm).

On the application front, understand how JavaScript and other assets can impact the rendering of your page. How would you find out if the front-end code is a bottleneck and, if it is, what would you do about it? The same goes for back-end code. Understand how you would utilize caching at multiple layers to deliver performance to the user. Figure out which scenarios are better served by a reverse proxy cache like Varnish or an application cache such as Redis or memcached.
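On the Drupal side, one concrete building block is the cacheability metadata carried by render arrays. A hypothetical block might declare its cache contexts, tags, and max-age like this (the element name, tag, and values are only illustrative):

$build['popular'] = [
  '#markup' => t('Popular content goes here.'),
  '#cache' => [
    // Vary the cached copy by the viewer's roles.
    'contexts' => ['user.roles'],
    // Invalidate whenever node 1 is updated.
    'tags' => ['node:1'],
    // Expire after an hour regardless.
    'max-age' => 3600,
  ],
];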

Accessibility

Accessibility is not an afterthought. Unless you are willing to review each line of code later, start with accessibility in mind from the first day. Again, Drupal meets many of the accessibility standards when you use the themes that ship with Drupal. The framework also supports many of the constructs that make writing accessible interfaces easier. Drupal 7 onwards has committed to meeting WCAG 2.0 and ATAG 2.0. Read the Drupal documentation to know more about how you, as a module developer, can write accessible code.

Security

Again, security is not an afterthought, and many people who treat it as such have their fair share of horror stories. And just like before, Drupal provides constructs that make it very easy to write secure code. With Drupal 8 and the introduction of Twig, it became significantly harder for developers to write insecure code. However, it is important to understand what Drupal does for security; otherwise, you will be fighting against core to make your code work.

Understand Drupal’s philosophy of filtering content on output, not input. Understand how to use PDO and DBAL to work with databases safely. Learn how to use the Xss helpers to write HTML output safely. Understand how you would use Drupal’s permissions and roles system to provide access control for your code.
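As an illustration of a couple of those constructs (the variable names and query are only examples):

// Filter user-supplied markup down to a safe subset of HTML tags on output.
$safe_markup = \Drupal\Component\Utility\Xss::filter($user_input);

// Use placeholders with the database API so values are never concatenated
// into the query string.
$nids = \Drupal::database()->query(
  'SELECT nid FROM {node_field_data} WHERE title = :title',
  [':title' => $title]
)->fetchCol();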

Again, these are vast topics but I am only beginning to introduce them here. Do follow the links above to specific Drupal documentation pages to understand the topic better.

Apr 27 2021
hw

The title of this post is not really accurate, but I can’t think of another way to say it. This post is related to my earlier one on what a Drupal Developer does day-to-day. Here, I will talk about some of the skills required of a Drupal developer. I am not aiming for completeness in this post (that’s a goal for another time) but I will try to list all skills required to build a regular Drupal site, deploy it, and keep it running. Some of these skills are foundational whereas others may be only needed for specific requirements. Moreover, not all skills are required at a high expertise level. For some roles, even awareness is enough.

This post would be useless without a target persona. Different organizations have different needs and what works at one organization may not work at another at all. Similarly, different roles within an organization have different needs. For example, at Axelerant, we do not have a dedicated site-builder role, but this is something that is expected of a Drupal Developer. Such an arrangement may not work at all for your organization, nor can I say this arrangement will work forever for Axelerant. In fact, it is even weird to use the word “forever” here.

The target persona for this post is a Drupal Developer building sites and functionality for a complex Drupal website with rich content, workflows, editorial customizations, varying layouts, and more features typically found in a modern content-rich website. Since this is Drupal we are talking about, the above definition is actually quite broad. It includes a web presence for organizations from publishing, healthcare, higher-education, finance, government, and more. With that settled, let’s look at the skills.

Site-building

A Drupal developer would need to have the following skills related to site-building. Even if a developer does not build sites, there is a good chance they would be interfacing with these aspects from code. Hence, most of these are foundational.

  • Content modelling: Understand what makes a good content structure for a site. You should be able to determine if the content structure you are building is the relevant one for the requirements.
  • Content moderation: Most sites need this. I have seen this being used even on simple sites when there is more than one type of user. Understanding how the moderation workflow works and how it ties with the entity-field structure along with its revisions is important.
  • Multiple languages: It is important to understand how Drupal implements multilingual functionality at both interface and content (translation) level. Depending on the requirements, certain aspects of how translated content is stored in fields could be very relevant. At a basic level, you should also be aware of how Drupal determines the language when it renders a page or response.
  • Configuration Management: While this may seem more relevant to developers, it is important to understand it from a site-building perspective. You need to understand how the content types you create are stored in configuration and how that is different from how, for example, site information is stored. This means you need to understand configuration entities and simple configuration. You also need to understand how configuration can change depending on the environment and how it can be overridden.
  • Common solutions like views and webforms: Before you set about developing some functionality, you have to check whether there is a solution out there already. You also need to know how to pick modules for your problems.

Development

I don’t think this section needs any explanation as the entire post is about this. So let’s jump in.

  • PHP: This might seem like a no-brainer. Drupal is written in PHP so it is a good idea to be very comfortable with PHP. Understand object-oriented programming, how Drupal uses different constructs such as traits. Also, understand patterns such as dependency injection and observer pattern (hooks and event subscribers are based on this). Then, keep up with what’s new in PHP, if possible, and use it.
  • Composer: Arguably, this is a part of site-building as well, but there is a good reason to keep it here. Apart from using Composer to download modules, understand how it helps in writing the autoloader. Also, understand how Composer works to select a module or a package and why it might fail. Most of this comes with experience.
  • Caching: This is probably the most important part to understand about Drupal. Caching is deeply ingrained at multiple levels in Drupal and it is important to understand how it works. Almost all your code will need to interact with it even if you aren’t planning to cache your responses.
  • Other Drupal APIs: There are some common APIs you should be familiar with; for example, Form API and Entity API come to mind. But also be aware of other APIs such as Batch, Queue, Plugin, Field, TypedData, etc. You should know where to look when you need one for a problem.
  • Automated testing: Drupal core has a vast collection of test cases, which means you may only have to write functional test cases for your site. You should understand the different tools you can use to write them; see the sketch after this list.
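As a hedged sketch of such a functional test using core’s BrowserTestBase (the module and class names are hypothetical):

<?php

namespace Drupal\Tests\mymodule\Functional;

use Drupal\Tests\BrowserTestBase;

/**
 * Verifies that the front page responds for anonymous users.
 */
class FrontPageTest extends BrowserTestBase {

  /**
   * Modules to enable for this test.
   */
  protected static $modules = ['node'];

  /**
   * The theme to use during functional tests.
   */
  protected $defaultTheme = 'stark';

  public function testFrontPageLoads() {
    $this->drupalGet('<front>');
    $this->assertSession()->statusCodeEquals(200);
  }

}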

Deployment

You have to understand where your site is going to live in order to build it well. We no longer live in the age where you would write code and throw it over a wall to the ops team for them to deploy. You may not be deploying the site yourself, but you have to understand the concepts.

  • Servers: Understand how servers work. You don’t need to know enough to configure one securely (unless that’s your job), but you should know enough to understand the terms. For example, you should know what the common web servers are and how they may be configured to run PHP. Also, understand what kind of configuration Drupal needs to run.
  • High Availability: Given Drupal’s market, there is a good chance that the site you are building will need to run in a highly available environment. Understand what it means and what that implies for your Drupal site. Most of these issues may be resolved for you if you are using one of the PaaS providers but if you are building your own server(s), you should understand how to configure Drupal to run on multiple servers. Even if you are using PaaS, you should still understand how running on multiple servers could affect how you write code, e.g., how you work with temporary files.
  • CI/CD: Continuous Integration is very common today and it is important you understand it to be a productive developer. There are dozens of solutions to choose from and most have some kind of a free offering. Pick one and try it out.

And more…

Like I said before, I am not aiming for completeness here, only at making a start. Please let me know if I have missed something obvious. Given the nature of these posts, I can only spend 1-2 hours planning and writing them, so there is a very good chance I have missed something. Let me know.

Apr 26 2021
hw

Over the years, I have had significantly less time for development and an increasing need to move around. After dealing with back pain due to the weight of my Dell laptop while travelling for conferences, I bought a 15″ MacBook Pro. More recently, with the issues with Docker performance on Mac, I have been thinking of getting a Linux box. I was further motivated when I bought an iPad last year and wanted to use that for development. Now, with my old MacBook Pro failing because of its keyboard and hard disk, I have a new MBP with the M1 chip and just 8 GB of RAM. I am more and more interested in making remote development work efficiently for me.

It is not hard to set up a remote machine for development in itself. The problem I want to solve is to make it easy to spin up machines, maintain them, and tear them down. I also want to make it easy to set up a project along with the necessary tooling to get it running as quickly as possible. For now, the only problem I am solving is to set up the tooling quickly. I am calling this project Yakht (after yacht, as I wanted a sea-related metaphor).

Current workflow

While it’s far from the level of automation I am thinking of, it’s not too bad. I was able to set up a machine for use within a few minutes. This is what the process looked like:

  1. Create a 1 GB instance on DigitalOcean in a nearby region (for minimum latency).
  2. Add a wildcard DNS record for one of my domain names so that I can access my projects on ..
  3. Set the domain name and IP address in my Ansible playbook’s variable files and inventories.
  4. Run the Ansible playbook.

These steps had a machine ready for me with all the tools required for Drupal development using Docker (Lando in my case). The Ansible playbook installs a bunch of utilities and customizations along with the required software like PHP, Docker, composer, Lando, etc. Only the PHP CLI is installed because all development happens within Docker anyway. It also configures Lando for remote serving and sets the base domain to the domain I have configured, which means I can access the URLs generated by Lando (thanks to the wildcard DNS). With the Drupal-specific tooling we have written (some of which I have written about before), setting up a new project is fairly quick.

A lot of these tools are somewhat specific to me (such as fish-shell and starship). I need to make it customizable so that someone else using it can pick a different profile. That’s a problem for another day.

Current trials

I have been using this machine for a long time now, which is not how I intend for this to work. I am trying out a few tools and customizations before putting them in the Ansible playbook. Most notably, I am trying out cdr and using it as an online IDE, which is very similar to vscode. It took a little effort to serve it via Caddy, but it works well most of the time. It times out frequently, but I think this is because this instance only has 1 GB of RAM. When it times out, the connection breaks and you need to reload, which can get frustrating. Fortunately, this only happens frequently for a while and then it works fine for long periods of time. In any case, I doubt this will happen on an instance with a reasonable amount of RAM.

Screenshot of cdr running in Chrome

I know that VS Code has good support for remote development over SSH but I also want to be able to use the IDE over iPad and a browser-based IDE solves that. I am also considering trying out Theia and Projector but that’s for another day.

It’s also missing a few things I want in my development machines such as my dotfiles and configuration for some of the tools (such as custom fish commands). For now, all of this is set up manually. Of course, my intention is to automate all of these steps (even DNS).

Current problems

The general problem with these kinds of tools is to maintain a balance between flexibility and ease of use. By ease, I mean not having to configure the tool endlessly and frequently to make it do what you want. But that’s exactly what flexibility needs. For now, I am not trying hard to maintain flexibility. Once I have something that works with reasonable customization, I will figure out how to make it much more customizable.

Another problem is accessing remote servers from this machine. Right now, I am using SSH agent forwarding to access remote servers from my development instance without having the SSH key there (and it works). But this doesn’t work if I am using the terminal in cdr. I am still looking for a solution to this problem that doesn’t involve copying my keys over to the development instance. One idea, if forwarding is not possible at all, is to generate new keys for every instance and give you the public keys to paste into the services you want. This is more secure than copying the private key, but definitely a hurdle to get over.

I am excited by this project and hope to have more updates over time. For now, I hope you find the Ansible playbook useful. Also, I am looking for ideas and suggestions in solving this general problem of remote development. Please let me know via comments or social media what you think.

Apr 25 2021
hw

I thought I was done with the series of posts on object-oriented programming after my last one. Of course, there is a lot we can write about object-oriented programming and Drupal, but that post covers everything noteworthy from the example. There is a way which is old-school but works as it should and another which looks modern but comes with problems. Is there a middle-ground? Tim Plunkett responded on Twitter saying there is.

There's a third way! See https://t.co/cFizL60Loy and https://t.co/so2syXyXZS

— timplunkett (@timplunkett) April 24, 2021

At the end of the last post, I mentioned that the problem is not with object-oriented programming. It is with using something for the sake of using it. If you understand something and use it judiciously, that is more likely to result in a robust solution (also maintainable). Let’s look at the approach mentioned in the tweet in detail.

The fundamental difference

The problem with the version with objects in the last post was not because it used objects. It was because it overrode the entry point of the form callback to change it. Using classes has its advantages, most notably that we can use dependency injection to get our dependencies. For more complex alterations, it is also useful to encapsulate the code in a single class. But the approach with the entry point made the solution unworkable.

On the other hand, the form_alter hook is actually designed for this purpose. Yes, it cannot be used within classes and you have to dump it in a module file along with all the other functions. But it works, and that’s more important. There is no alternative designed for this purpose. So, in a way, the fundamental difference is that this method works whereas the other doesn’t. It doesn’t matter that we can’t use nice things like dependency injection here if it doesn’t even work.

Bringing them together

The two worlds are not so disjoint; PHP handles both brilliantly, after all. If you want to encapsulate your code in objects, the straightforward solution is to write your code in a class and instantiate it from your form_alter hook. Yes, you still have the hook but it is only a couple of lines long at most and all your logic is neatly handled in a class where it is easy to read and test. The class might look something like this.

<?php
 
namespace Drupal\mymodule;
 
use Drupal\Core\Form\FormStateInterface;
 
class MySiteInfoFormAlter {
  public function alterForm(array &$form, FormStateInterface $form_state, $form_id) {
    // Add siteapikey text box to site information group.
    $form['new_element'] = [
      '#type' => 'textfield',
      // ... more attributes
    ];
  }
}

And you can simply call it from your hook like so:

function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  $alter = new \Drupal\mymodule\MySiteInfoFormAlter();
  $alter->alterForm($form, $form_state, $form_id);
}

You can save the object instantiation if you make it static but let’s not go there (you lose all advantages of using objects if you do that).

Dependency Injection by Drupal

This is already looking better (and you don’t even need route subscribers). But let’s take it a step further to bring in dependency injection.

We can certainly pass in the dependencies we want from our hook when we create the object, but why not let Drupal do all that work? We have the class resolver service in Drupal that helps us create objects with dependencies. The class needs to implement ContainerInjectionInterface but that is a very common pattern in Drupal code. With such a class, you only need to create the object instance using the class resolver service to build it with dependencies.

function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  \Drupal::service('class_resolver')
    ->getInstanceFromDefinition(MySiteInfoFormAlter::class)
    ->alterForm($form, $form_state, $form_id);
}
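For completeness, here is a hedged sketch of what such a class could look like; the config.factory dependency is only an example to show the injection, not something the original class needed:

<?php

namespace Drupal\mymodule;

use Drupal\Core\Config\ConfigFactoryInterface;
use Drupal\Core\DependencyInjection\ContainerInjectionInterface;
use Drupal\Core\Form\FormStateInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

class MySiteInfoFormAlter implements ContainerInjectionInterface {

  /**
   * The config factory, injected as an example dependency.
   *
   * @var \Drupal\Core\Config\ConfigFactoryInterface
   */
  protected $configFactory;

  public function __construct(ConfigFactoryInterface $config_factory) {
    $this->configFactory = $config_factory;
  }

  public static function create(ContainerInterface $container) {
    return new static($container->get('config.factory'));
  }

  public function alterForm(array &$form, FormStateInterface $form_state, $form_id) {
    // The alteration can now use the injected service.
    $form['new_element'] = [
      '#type' => 'textfield',
      '#default_value' => $this->configFactory->get('system.site')->get('name'),
    ];
  }

}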

For better examples, look at the links Tim Plunkett mentioned in the tweet: the hook for form_alter and the method.

I hope you found the example useful and a workable middle-ground. Do let me know what you think.

Apr 24 2021
hw

Update: Read the follow-up to this post where I discuss a mixed approach combining both of the approaches here.

I previously wrote about how the object-oriented style of programming can be seen as a solution to all programming problems. There is a saying: if all you have is a hammer, everything looks like a nail. It is not a stretch to say that object-oriented programming is the hammer in this adage. That post was quite abstract and today I want to share a more specific example of what I mean. More specifically, I’ll talk about how using “objects” to alter forms without thinking it through can cause harm.

But first, some context: this example comes from my weekly reviews of code submitted as part of our interview test. This has happened frequently enough that I think it is actually a recommendation somewhere. Even if it is a case of people copying each other’s work, it certainly is evidence that this approach has not been thought through. In fact, it makes for a good interview question: where would this method fail? I am going to answer that in this post.

The traditional method

Let’s look at the traditional method first. Drupal provides hooks to intercept certain actions and events. Broadly, Drupal fires hooks in two kinds of situations: in response to events like saving a node, or to collect information about something (e.g., hook_help). You will find a lot more examples of the latter, and that is what we are going to talk about today.

Drupal fires a few different hooks when a form is built. Specifically, it gives the opportunity to all the enabled modules to alter the form in any way. It does this via a hook_form_alter hook and a specifically named hook_form_FORM_ID_alter. So, for example, to alter a system site information form, either of the functions below would work:

function mymodule_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  if ($form_id == "system_site_information_settings") {
    $form['new_element'] = [ /* attributes */ ];
  }
}
 
// ... OR ...
 
function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  $form['new_element'] = [ /* attributes */ ];
}

Adding elements or altering the form elements in any way is a simple affair. Just edit the $form array as you want and you can see the changes (with cache clear, of course). This is the old-school method and it still works as of Drupal 9.

The OOPS approach

More often than not, I see the form being altered in a much more involved way. Broadly, this is how it looks:

  1. Create a form using the new object-oriented way but extending from Drupal\system\Form\SiteInformationForm instead of the regular FormBase.
  2. Define an event subscriber that will alter the route using the alterRoutes method.
  3. In the event subscriber, override the form callback to your new form.

This gist contains the entire relevant portion of code.

After doing all this, you might expect that the code should at least work as expected. Many people do. But if you have been paying close attention, you might see the problem. If not, think about what would happen if two modules attempt to alter the same form this way. Only one of them would win out.

If two modules alter the same route, the last one to run will win and that module’s form changes will be used. The form controllers from the previous modules will never be executed. You could extend the first module’s form controller in the second module (so that changes from both modules take effect), but that is not reasonable to expect in the real world with varied combinations of modules.

So, we shouldn’t use objects?

I am not saying that. I am saying that we should think about how we are applying any programming paradigm to build a solution and where it might fail. In our example, if Drupal supported an object-oriented version of form alters, that would have been safe to use (there is an open issue about this). In fact, there is a discussion about using Symfony forms and also some attempts in the contrib space. Until one of those solutions gets implemented, the form_alter hook is the best way to alter forms. And there is a good chance that such hooks get replaced in time. After all, the event-based hooks did get replaced by events in most cases.

For now, and always, use the solution that fits your needs. Using objects or using functional programming doesn’t necessarily make a program better. It is using our skills and our judgement that makes a program better.

Update: Read the follow-up to this post where I discuss a mixed approach combining both of the approaches here.

Apr 23 2021
hw

I have been contributing to Drupal in a few different ways for a few years now. I started off by participating in meetups, and then contributing to Drupal core whenever I found time. Eventually, I was even contributing full-time courtesy of Axelerant, my employer. At the same time, I started participating in events outside my city and eventually in other countries as well. I was speaking at most of the events I attended and mentored at sprints in many of these events. I have written about this in detail before in different posts about my Drupal story and a recent update to that.

It was only the support of my wonderful family, and also of Axelerant in the early years, that enabled me to contribute in this way. As my responsibilities grew, I had to focus where I contributed. My kids were growing up and I wanted to spend a lot more time with them. At the same time, I started picking up managerial responsibilities at Axelerant and was responsible not just for my work, but for a team. I was quickly approaching burnout and something had to go. It was at this time that I rethought how to sustainably contribute to open source.

Long story, short…

The story is not interesting. Honestly, I barely remember those years myself. I know they were essential for my growth and they came at a significant price. But we know that nothing worth doing is easy. As a mentor to a team and even bordering on a reporting manager, I had the privilege to multiply my efforts through them. I am proud to see how many of them have built their own profiles in the community and continue to do so.

My recommendation to my team and myself is now to stop thinking of contributing as “contribution” and to treat it as a part of our work. The word “contribution” implies giving something externally. People are hesitant to make this external effort when they are already very busy with bugs, deliveries, and meetings. We all have a limited working memory for thinking about the code and all its complexity. Thinking about something external is very difficult in these circumstances.

Don’t hack core

One reason this feels so external to us is how we treat Drupal core and contrib. We drill into newcomers the notion that we should never hack core. While there is a good reason for this, it results in the perception that core (and contrib) cannot be touched. It is seen as something external, and woe befall anyone who dareth touch that code. It is no surprise that many people are intimidated by the thought of contributing to Drupal core.

My workflow

The trick is to not think of it as external. I use the word “upstream” instead of contrib projects when talking about Drupal core or modules. I find that some people think of “upstream” as a little closer to themselves than “community contribution”. Thinking about it this way makes the code more real, not something which is a black box that can’t be touched. We realize that this code was written by another team consisting of people who are humans just like us. We realize that they could have made mistakes just the way we do. And we realize that we can talk to them and fix those mistakes. It is no different than working in a large team.

Yes, this means that we don’t have people who are dedicating time to contribute. That is a worthy goal for another day. I am happy with this small victory of getting people familiar with the issue queues and the contribution process on drupal.org. I have seen these small acts bubble up to create contrib modules for use in client work (where relevant). Eventually, I see them having no resistance with the idea of contribution sprints because the most difficult part of the sprint is now easy: the process. They are familiar with the issue queue and how to work with patches (now merge requests); if it is a sprint, the only new thing they are doing is coding, which is not new to them at all.

I realize that this is not a replacement for the idea of a full-time contributor. We need someone who is dedicated to an initiative or a system. These contributors are needed to support the people who will occasionally come in to fix a bug. But we need to enable everybody to treat the core as just another piece of code across the fence and teach them how to open the gate when necessary.

Apr 22 2021
hw

We at Axelerant have been contributing to Drupal in our own ways for a long time. In fact, I worked as a full-time contributor to Drupal a few months after I joined. This was around the time Drupal 8 was almost done and it is thanks to Axelerant I could contribute what I could at that time. At the same time, there was community focus around incentivizing contributions and there were a few websites (like drupalcores) to track contributions.

Sometime later, in one of our internal hackathons, I built a basic mechanism to track contributions by our team (you can see it at contrib.axelerant.com). It was a weekend hackathon and what we built was very basic, but it laid the groundwork for future work. We adopted it internally and reached a stage where we had processes around it (it even fed our KPIs for a while). Over time, we expanded the functionality to (manually) track non-code contributions. More recently, we added support to track contributions from Github as well.

Tracking all contributions

Since this was a hackathon demonstrating possibilities using code and technology, we started with only tracking code contributions. Soon, we expanded this to track contributions to events and other non-code means. The latter was manual but this helped us build a central place where we could document all our contributions to the open-source world.

Today, we have tracking from drupal.org and Github and basic checks to determine if code was part of the contribution. On the non-code front, you can track contributions to websites like StackOverflow or Drupal Answers. And for events, you can track contributions such as volunteering, speaking, and even attending (yes, I think participating in an event counts as a contribution).

Now, we have a process for anyone joining Axelerant to set up an account on the contrib tracker. After this, all their contributions to Drupal and Github are tracked from that point on. We also remind people frequently to add details of any event they have attended.

Improving contrib-tracker

Contrib Tracker is open source but currently treated as an internal project at Axelerant. We initially set it up as a public repository on our Gitlab server, but we now realize that it was not practical for people to access it and help build it. Today, I moved Contrib Tracker to its new location on Github: contrib-tracker/backend. For a while, we had the thought of implementing the website as a decoupled service, and that’s why the repository is named “backend”. Right now, we may or may not go that route. If you have an opinion, please do let me know in the comments.

Moving to Github

The move to Github is still in progress. In fact, it just got started. The project is, right now, bespoke to how we host it and one of the lower priority items would be to decouple that. Once that is done, it should be possible for anyone to host their own instance of contrib tracker. But the more important task now is to move the CI and other assets to Github. I will be creating an issue to track this work. Our team at Axelerant has already started moving the CI definitions.

That’s it for today. Do check out the source code at contrib-tracker/backend and let us know there how we can improve it, or better yet, submit a PR. :)

Apr 21 2021
hw

Drupal 8 was a major revolution for the Drupal community in many ways, not least because of the complete rearchitecting of the codebase. We picked up the modernization of various frameworks and tools in the PHP community and applied it to Drupal. Of course, this makes complete sense because Drupal is written in PHP. One of the things we picked up was PHP’s shift to the object-oriented programming approach.

The shift to using objects enabled us to collaborate with the rest of the PHP community like never before. On the language front, we had PHP-FIG working on standards such as PSR-4, which was adopted by many libraries. And on the tooling front, we had Composer, which allows us to use other packages with minimal effort. These developments made it possible for us to build Drupal on top of the work from the larger PHP community. However, this meant that we had to follow the same style of programming as those libraries, and thus began a massive re-architecture effort within Drupal.

The refactoring story

Drupal is a product built and maintained by a diverse group of geographically distributed and mostly unpaid people. Further, Drupal’s value is not in just the core system, but also the tens of thousands of modules, themes, and distributions maintained by a similar group of people. All the modules, themes, and distributions depend on the core Drupal’s API which means that making any change to the core is very risky.

Because of this, it is simply not possible to isolate Drupal and refactor it. We had to incrementally refactor parts of Drupal while making sure there is sufficient documentation for people to upgrade their modules. This inadvertently, but predictably, led to significant overengineering in Drupal’s codebase. There are a lot of parts of Drupal, several layers down, which may seem messy, but there is a reason for them. The problem is that those reasons are very hard to surface. More importantly, they become a hurdle for core contributors looking to simplify the system.

I have to caveat this by pointing out that I am not an expert core developer myself. Not even close. The above is just my observation in working with the core. You may say that this is practically an outsider’s perspective and you are right. But I have learnt that an outsider’s perspective is often the most valuable one.

Objects, objects everywhere

The new structure has brought in dozens of new practices (such as the DIC). This often means that you need to augment the structure to support them. Soon, we have class names made of multiple adjectives and nouns everywhere, reminiscent of Java. This makes the code even more opaque, putting up more hurdles for people who are new to Drupal. Even experienced PHP developers may have trouble navigating the sometimes unintuitive names of various components because of the historical context.

As people manage to work in this new reality, I have found that their answer to every problem is simply more objects: copy-paste solutions from various places (aka StackOverflow) and make it work. At work, I have seen more and more code samples from people who are convinced they are presenting a superior solution simply because they are using classes with names like subscriber and controller, while not realising they are breaking functionality that would not have broken in Drupal 7.

This may seem vague at this point and I hope to present more specific examples in future posts. I will leave you with these abstract thoughts for today. I hope you found this interesting even if not useful.

Apr 20 2021
hw
Apr 20

Now that we have discussed some of the concerns with career growth, let’s talk about beginning a career in Drupal. I am not aiming for comprehensiveness but I hope to give a broad overview of what a Drupal developer role looks like in the real world. Needless to say, every company or agency is different. You might have a very different set of responsibilities depending on your organization. One of those responsibilities would be to build and develop Drupal websites and we are going to talk about that here.

Working in an organization essentially means working with other people. That is how you multiply your impact. All the roles involve some amount of dealing with people. Initially, you might only have to worry about collaborating with others, but as you grow, the scope increases. Eventually, you would have to mentor and maybe even manage people. All along this journey, you would also increase your knowledge and your area of technical impact.

Early roles

As someone with very little programming experience, you might start off with Drupal by building sites using the UI and simple CLI commands (Drush or Drupal Console). Your challenge would be to understand content types, taxonomy terms, users, roles, permissions, etc. You would need to understand how to design a content type that is actually useful for what you want to represent and add fields to it. Moreover, you might know enough PHP to find snippets to alter forms or pages and implement those as per requirements.

At this stage, you might not be expected to solve complex problems or scale websites with heavy traffic. You might be working under the supervision (and mentorship) of someone with more Drupal experience. You would be expected to collaborate with other people on the team, such as front-end and QA engineers, and complete your tasks the best you can.

You would need a strong set of fundamentals here such as git, basic PHP programming skills, and a working knowledge of HTML, CSS, and JavaScript. You would need more of an awareness of your software stack but wouldn’t have to go deep.

Intermediate roles

With around 2-5 years of experience, you will begin to pick up a more complex set of tasks. You would write more complex modules with proper object-oriented code. The site-building requirements would also be more complex: you would have to implement content types with complex relationships to other entities, and the code required to tie these things together would get more complex as well. You will also have to worry more about how to solve a requirement in the best way rather than just implementing something.

At this stage, you would be expected to participate in code reviews and share your feedback. In fact, some organizations (like mine) encourage engineers at all levels to participate in code reviews, so this might not be new. Take this opportunity to also seek mentorship from other folks to go deeper into Drupal and other technologies. While you only had an awareness of your stack earlier, you should now have a working knowledge of the various elements in the stack. You would also be involved in discussions with other teams to build the integrations your system needs. Essentially, you should understand how the pieces fit together and how they participate in making the system work.

You should also begin to question requirements and try to understand the business problem. Your value at this level is not just your technical skills but also solving business problems. In other words, it’s not just about solving the problem right but solving the right problem.

Advanced roles

After about 5-10 years of experience, you will be responsible for the entirety of a system. You may begin with simpler systems where Drupal plays a major role and then move on to more complex systems where Drupal is just one piece of the puzzle. At this stage, more people will turn to you to solve their problems, and you will have to figure out whether to solve those problems yourself, guide them, or delegate them to someone else.

You should also be much more familiar with the entire Drupal ecosystem and how it can fit in with other parts of your stack. You are close to approaching what could be called the "architect" role (but we won't go there today). As an individual contributor (IC), you will have a lot more impact, but you would also play a significant glue role in your team (or teams).

This is a vast topic and I might come back to this in a future post (or multiple posts). Like I said before, this is not meant to be a comprehensive guide, but just a broad overview.

Apr 19 2021
hw
Apr 19

As the Director of Drupal services at Axelerant, one of the things I often worry about is the growth of each of the members of my team. We are a Drupal agency, so there are a lot of options for people to choose from. Yet, there are different concerns with working on Drupal for a long time. Today, I'll talk about some of those concerns and what the mitigating factors are. I can't claim to present a perfect solution, even less so in a post written in less than an hour, but I hope to at least get the topic started.

Concern: Drupal is not cool anymore

When Drupal celebrated its 18th birthday, there were jokes around how Drupal is now old enough to drive (or something like that). This year, Drupal is 20. You might be surprised but a software’s life-cycle is not the same as a person’s life-cycle :). So, while 20 might be the age where a person is cool (I’m showing my age, aren’t I?), that’s not the same for software.

Yes, Drupal is now in the boring technology club. It has been so for years, with a definitive transition around the Drupal 8 period. This is not a bad thing. It's only a bad thing if you want your software stack to be cool, at the price of racing against deadlines to fix issues for everything that is even a little out of the ordinary. Drupal is mature, predictable, and boring. And that means you should not worry about Drupal going out of style anytime soon.

Fact: Drupal does valuable things; cool, but also valuable

However, the people who have this concern don't worry about the software living on. They are missing the fun that comes with working with cutting-edge technology. They're right: fun is important, and software development is hard enough without it also being a chore. The fact is that Drupal can still be used in cool new ways. For example, decoupled Drupal is one of the newest trends in the industry to offset Drupal's limitations in the front-end area. Another, subtler, way Drupal stays fun is by opening itself up to the PHP ecosystem through the "get off the island" movement in Drupal 8. You are no longer limited to Drupal's API and libraries to do what you want: the entire PHP ecosystem of packages is available for direct use within Drupal.

Further, I encourage people not to code just for the sake of coding. Solve a problem: any problem, even if it is a hypothetical issue or an imaginary challenge. Solve it and put it out there to help other people. You make that act of coding about people and, in turn, you help yourself. But I am digressing. Drupal solves the problem of quickly building a web presence for activities that help people. This tweet comes to my mind again over here. Even if Drupal's software stack is not cool, what it does is cool, and that's a big deal for me.

Concern: There is not much to do in Drupal

As developers grow and learn more, they find that they are learning less and less in their day-to-day work. This is closely tied to the first concern I talked about (Drupal is not cool). People may not phrase it as Drupal not being cool, but the underlying thought is the same: all tasks are of the same kind and I am not learning anything anymore. They're right as well, but there are ways around it.

I previously wrote that most requirements of building a website with Drupal can be achieved just by configuring it in the UI. You only have to write some glue code to get it all working. Opportunities to develop entirely new functionality in Drupal are now rare unless you are working on the Drupal product itself. This is when people start looking into other frameworks and languages. Again, I highly encourage that because the learning benefits both the person and the project, but that's beside the point of this post.

Fact: There is a lot to do around Drupal

The problem is in finding opportunities around Drupal. I have already hinted at one easy way out of this: work on the Drupal product. That is to say, contribute to the Drupal core and contrib space. These are the people solving the hard problems so that we don’t have to. If you want to solve hard problems, we need you there. I am sure your organization will support you if you are interested in contributing to Drupal.

Next, look at the integrations where you could help. Very few Drupal sites today work in isolation. Most are either completely decoupled or at least integrate with other complex systems, and these are still challenging areas to work in.

You will notice that the solutions here largely depend on the organization where you work. Yes, you would need the support of your organization, and there is a good chance you will get that support if you ask. If you don't, you can tackle this yourself, but that is the subject of another post.

Concern: No one respects Drupal or PHP

PHP (and Drupal) is often the subject of cruel jokes ranging from competence to security issues to just being old. Chances are that you have come across enough of them and so I won’t write more about this.

Fact: PHP and Drupal communities are some of the most respected ones

To be blunt, don't listen to these people. In my opinion, they are living 20 years in the past or are influenced by people who do. PHP has made a comeback with a robust feature set and a predictable release cycle. So has Drupal. Moreover, Drupal has rearchitected itself to be more compatible with the PHP ecosystem. Finally, code is just about getting the job done, and PHP commands a significant share of the web applications market, so you're in good company.

And it’s not just about code. PHP and Drupal communities are some of the most welcoming communities in the world. They are often talked about in various places (just search for podcasts on the topic) and people cite examples of how they found welcome and support in the communities.

I am going to leave this post here even though there is a lot more I can say. But these are my thoughts for today and I hope you enjoyed them. I hope to come back to this in another post some day.

Apr 18 2021
hw
Apr 18

As of this writing, there are 47,008 modules available on Drupal.org. Even if you filter for Drupal 8 or Drupal 9, there is still an impressive number of modules available (approximately 10,000 and 5,000 respectively). Chances are that you would find just the module you are looking for to build what you want. In fact, chances are that you will find more than one module to do what you want. How do you decide which module to pick or if even a particular module is a good candidate?

Like all things in life, the answer is "it depends". There are, however, a few checks that can help you make a better decision. Well, these are the checks I make when trying to pick a module, and I hope they can help you too. This is not something you should take strictly; just use it as a guideline to help you decide. Also, if a module fails one of these checks, it doesn't mean that the module is a bad choice. It only means that you might be making a trade-off. Software is all about making trade-offs, so this is nothing new.

Lastly, I’ll try to focus on modules here in this post but most of this advice is suitable for themes as well.

First-look checks

These are the most basic and easiest checks you can make on a module. Except for the first one, these are not strong indicators but they can quickly give you an initial impression of the module. There might be excellent modules that are perfectly suited to your needs but fail these checks, which is why you should only use these checks to differentiate between two modules. Yeah, the irony of the name is not lost on me.

Version

Is the module even available for your Drupal version? If you’re using Drupal 9, you would need the version that supports Drupal 9. Earlier, just reading the module version would tell you if it is compatible. For example, if the module version is “8.x-3.5”, you know that the module is for Drupal 8, not Drupal 7. You might think that the module is not for Drupal 9 either but that’s not so simple anymore.

Screenshot of version information on Drupal.org

As you can see in the above screenshot from the Chaos Tools module, the 8.x-3.5 version is for both Drupal 8 and Drupal 9. This changed after Drupal.org began supporting semantic versioning for contrib projects. These example release tags show different styles of version numbers you might see and this change record explains how a module may specify different core version requirements.
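
If you already have the module's code locally (via Composer or a download), another quick way to confirm compatibility is to look at the core_version_requirement key in its info file. This is only a sketch; the module, path, and output shown here are illustrative.

# Check the declared core compatibility of a module you already have locally.
$ grep core_version_requirement web/modules/contrib/ctools/ctools.info.yml
core_version_requirement: ^8.8 || ^9

# Or let Composer simulate the install and report any conflicts.
$ composer require 'drupal/ctools:^3.5' --dry-run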

Module page

Is the module page updated? Does the description make sense considering the Drupal version you are targeting? If you’re looking for the Drupal 9 version of the module and the text hardly talks about that, maybe the module is not completely ready for Drupal 9 or has features missing.

Screenshot of project page on Drupal.org

Another clue you may have is the timestamp when the page was last updated. It may just be that the maintainer has forgotten to update the page even if the module works perfectly fine. For this, we go to the next set of checks.

Recent commits

Has the module been updated recently? If it was last updated years ago, chances are it won’t work with the latest version of Drupal core. Even if it works with your version of Drupal core, would you be able to upgrade once you upgrade to the latest version of Drupal core?

Screenshot of commits block on a project page

The block on the module page gives a quick summary of the recent commits, but don't rely on it entirely. The block only shows current committers. If there are newer commits by a previous committer, or commits made elsewhere and pushed here (as would happen if someone were maintaining the module on Github, for example), they won't show up in the block. To be sure, click on the "View commits" link to see all the commits.

Issues

Is there a discussion going on about improving the module? The issues block on the project page gives a quick summary of what’s happening on the module. Be careful though, some module maintainers choose to maintain the module on Github or elsewhere. In those cases, the issues here would show no activity (or it may be entirely disabled).

Issues block on project page

Resources

Are there other resources for the module? Is there external documentation or documentation within the drupal.org guides? The Documentation and Resources block on the project page will point you to such links and other useful links.

Code structure check

These checks take a little more time than just quickly scanning the project page. These indicators are a little more reliable but are not the sole determinants of success. For all of these checks, first go to the code repository by following the "Browse code repository" link from the project page and then select the branch for the version you want.

README file

In a previous section, I mentioned that the project page may have an outdated or missing description. This happens over time when there are multiple versions and the maintainers find it difficult to keep the page updated. However, there is a good chance that the maintainers keep the README file in the repository updated. Go to the code repository (see screenshots above) and find the README file. The actual file may be named either README.txt or README.md.

If this file is maintained, there is a good chance the module is well maintained and documented and you would have fewer problems using it.

Code structure

You would need experience with Drupal development to make this check. Look at the module code and see how it's structured. Are the classes neatly separated from the rest of the code? Does the code maintain separation of concerns? Are there tests? Is everything dumped in the ".module" file or a bunch of ".inc" files (ignore this if you are checking Drupal 7 modules)? If the module stores something in config, is there a config schema?

There are metrics we can gather to understand this better, but not from the code repository browser alone. An experienced developer, however, can tell by looking at how well the module follows Drupal conventions. This is important because following these conventions makes it easier to keep the module updated for future versions of Drupal. You don't want to start with a module and be stuck on an older version of Drupal because of it.

Test

Here is where we actually test the module and see if it works. These are the strongest indicators but also the most time-consuming to check.

Simplytest.me

The easiest way to check a module is simplytest.me. This excellent community-run service lets you test contrib projects along with patches. Type the name of the module and click “Launch Sandbox”.

SimplyTest screenshot

There is also an "Advanced options" section that lets you add more projects (if you want to test this module along with another module) and patches. Select the module or combination of modules you want, click "Launch Sandbox", and you have a brand new Drupal installation to play with for 24 hours. This actual testing will help you determine if the module fits your needs.

Test locally

If you don't want to test on simplytest.me, or you simply want to evaluate the code locally, use one of the quick-start Drupal tools to get a Drupal installation with the module. For example, with axl-template, I can type this command to get a Drupal site with the Smart Date module and a few others.

$ init-drupal drupal/test -m smart_date -m admin_toolbar -m gin

You can also use this setup to evaluate the codebase more closely, and maybe even run some code quality checks on it if you're interested (one quick way to do that is sketched below).
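
For instance, one quick check I find useful is running PHP_CodeSniffer with the Drupal coding standards from the drupal/coder package. This is only a sketch; the module path shown is illustrative.

# Install the Drupal coding standards; the installer plugin registers them with phpcs.
$ composer require --dev drupal/coder dealerdirect/phpcodesniffer-composer-installer

# Run the Drupal and DrupalPractice standards against the module's code.
# The contrib path here is illustrative.
$ ./vendor/bin/phpcs --standard=Drupal,DrupalPractice \
    --extensions=php,module,inc,install,theme,yml \
    web/modules/contrib/smart_date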

I hope this quick post was useful to you. Like I said before, don’t treat this as a complete list but as a guide. In the end, your own experience and actual tests will tell you more. The above guidelines are only here to save you time in reaching there.

Apr 17 2021
hw
Apr 17

I picked up this topic from my ideas list for this #DrupalFest series of posts. I didn’t think I would want to write about this because I don’t think about features that way. One of the strengths of Drupal is its modular architecture and I can put in any feature I want from the contrib space. In fact, I prefer that, but that’s a different topic. I am going to write this very short post anyway because I am now thinking about this from a different point of view.

To write this post, I thought about the various things that bother me in my day-to-day work. I thought of how we could have simpler dependency injection, fewer (or clearer) hooks, or less boilerplate, but then I realized that none of those matter to me very much right now. Those can be improved, but they are inconveniences and I can get over them.

Easier testability is what I want from future versions of Drupal. I have realized that ease of testability is the strongest indicator of code quality. A system that is easily testable is implicitly modular along the right boundaries. It has to be; otherwise, it would not be easy to test. A good test suite comprises unit tests, integration tests, and feature tests. If it is easy to write unit tests, then the system is composed of components that can be easily reused (à la SOLID principles). Integration and feature tests are easier to write at first, but eventually they get harder unless the system is built well.

Drupal core testability

Now, Drupal is one of the most well-tested pieces of code out there. Each patch and every commit is accepted only after it passes tens of thousands of tests. Testing is also a formal gate in adding new features (and some bugfixes). That is to say, any new feature needs to have tests before it can be merged into Drupal core. Consequently, testing is common and core contributors are skilled in writing these tests.

This testability is not easily carried forward to most kinds of websites we build using Drupal. Building a typical Drupal website is mostly a site-building job of cross-connecting various elements. This happens either by configuring the site via the UI or by using hooks, which are essentially magically named functions that react to certain events. With Drupal 8, the hooks-based style of writing custom modules has changed significantly, with modern replacements such as events and plugins, but it remains principally the same. This style of code, where you have multiple functions reacting to events and altering small pieces of data, is very difficult to unit test. This means that the only really useful tests are feature tests (or end-to-end tests). Unfortunately, these tests are expensive to run, and failures do not always point to an isolated unit.

Testing tools

There have been several attempts to make writing tests easier generally and for Drupal. The package that stands out to me for this is drupal-test-traits. This package provides traits and base classes that make it easy to test sites with content. You would set the configuration to point to an installed site and run the tests. The traits provide methods to create common entities and work with them and the tear-down handlers will clean those entities up at the end. All you have to worry about is providing an installed site.
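
As a rough sketch of how this looks in practice (the site URL is a placeholder, and this assumes your phpunit.xml defines an existing-site test suite):

# Add the package to the project.
$ composer require --dev weitzman/drupal-test-traits

# Point the tests at an already installed site (URL is a placeholder) and run
# them with the fast bootstrap script the package ships with.
$ export DTT_BASE_URL=http://drupal.test
$ ./vendor/bin/phpunit \
    --bootstrap=vendor/weitzman/drupal-test-traits/src/bootstrap-fast.php \
    --configuration ./phpunit.xml --testsuite existing-site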

In fact, the common problem with testing Drupal is getting an installed site to test against. Most Drupal sites I have worked with are not easy to install, despite best efforts. Over time, you need to rely on the database to get a working copy of the site. This is true even if you follow the perfect configuration workflow. It is very common to have certain content required to run a site (common block content, terms, etc) and configuration workflows cannot restore such content. There are other solutions to these problems, but they are not worth the pain to maintain. We'll not go into that here.

Handling the database and assets

Sharing databases turns out to be the simplest way to create copies of a site, which shifts the problem to making such databases available to the testing environment (e.g., CI jobs). If you are running a visual regression test, you would also need other assets, such as images referenced in blocks and so on. At least for the database, we have prepared a solution involving Docker, implemented as a composer plugin. Read more about it at axelerant/db-docker. This plugin makes it easy for the team to manage database changes as a Docker image which can be used by the CI job (or another team member); a rough sketch of the general idea follows.
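
To illustrate the idea (this is a hand-rolled sketch, not the plugin's actual workflow; see the axelerant/db-docker documentation for that), packaging a database as a Docker image could look roughly like this. The image name and registry are placeholders.

# Dump the current database; drush appends .gz when --gzip is used.
$ ./vendor/drush/drush/drush sql:dump --gzip --result-file=/tmp/dump.sql

# Bake the dump into an image based on a database image that imports anything
# placed in /docker-entrypoint-initdb.d on first start (bitnami/mariadb does).
$ mkdir -p db && cp /tmp/dump.sql.gz db/
$ cat > db/Dockerfile <<'EOF'
FROM bitnami/mariadb:10.5
COPY dump.sql.gz /docker-entrypoint-initdb.d/
EOF
$ docker build -t registry.example.com/myproject/db:latest db/
$ docker push registry.example.com/myproject/db:latest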

Handling assets is a more complicated problem. We traditionally solved this with stage_file_proxy and I am planning to work on this more to build a seamless workflow like we did for databases. Any ideas and suggestions are very welcome. :)

The testing workflow

Drupal has long been said to be a solution for building "ambitious digital experiences". Such systems are built by teams, and any team needs a workflow on which everyone is aligned (otherwise it wouldn't be a team). We have seen many improvements in various aspects of Drupal over time that cater to a team workflow (configuration management comes to mind immediately). In my humble opinion, standard workflows for testing should be a priority now.

I will leave my post here as it is already much longer than I intended. I know I didn’t present any comprehensive solutions and that’s because I don’t have one. My interest was in sharing the problem here. There are only pieces of the solution here and I am interested in finding the gel that brings them together.

Apr 16 2021
hw
Apr 16

Drupal is a CMS. One might even say that Drupal is a good CMS, and they would be right about that, in my not-so-humble opinion. At its core, Drupal is able to define content really well. Sure, it needs to do better at making the content editor's experience pleasant, among other things. But defining content structures that are malleable to multiple surfaces has always been one of Drupal's strengths. This makes Drupal an excellent choice for building a Digital Experience Platform (DXP).

The concept of a DXP has been popular for a few years but it has peaked, not surprisingly, in the last year with organizations now forced to prioritize their digital presence above all else. Companies have now realized that, with the amount of information being thrown at each of us, they have to make sure their presence is felt through all media. Building a coherent content framework that can be used for all of this media is no easy task. Information architects need powerful tools to flexibly define how they would store their content. Drupal has been able to provide such tools for a long time.

Digital Experience is a strategy

You might have realized by now that DXP is not a product, but a collection of tools that can help you execute your strategy. With the proliferation of media, it is important that you convey a consistent message regardless of how someone consumes it. To be able to do that, you have to identify the various ways you would talk to your customers and build a strategy. This is highly subjective and customized for your needs, and I won’t go into much depth here but the output we want is a coherent content architecture. An architecture which can represent your messaging to your customers.

Once you’re able to formulate this strategy, that is when you begin implementing it within your DXP. The content architecture you have identified goes in the CMS within the DXP and it needs to be flexible enough. Drupal is a very capable CMS for such requirements. It supports complex content models with relationships among pieces of content, rich (semantic) fields, and multilingual capabilities. You can also build advanced workflows for content moderation. This enables Drupal to be the single source of truth for all content in an enterprise. This, too, should be important in your content strategy and Drupal makes it easy to implement.

Discovery

A lot of this may sound like a lot of theory and not enough practice. In a way, that's true. Think of this as the discovery stage for the problem you're solving. It's important that you spend enough time here so that you identify the problem clearly. Solving the wrong problem may not be very expensive from a technical point of view, but it is frustrating for your content team. Involve various stakeholders within the DXP to determine if the content model you are building will break their systems. For example, your long text field may be good for web and email but unusable for a text message. But if you break up your content into a lot of granular pieces, you have to figure out how to piece it together to build a landing page.

You also have to determine how your content can be served to more diverse channels (e.g., voice assistants or appliances). Depending on your domain, you have to make trade-offs and build a model that is workable for a variety of consumers. But that's only one side of the story. You also have to make sure that the content is easily discoverable (both internally and externally), easily modifiable, auditable (revisions), trackable (workflows), and reliably stored (security and integrity of the data). Typically, there is an ecosystem of tools to help you achieve this.

Integrations

Drupal already handles some of these things, and it can integrate well with other systems in your infrastructure. Drupal 8 began a decoupling movement which became hyped and is now being rationalized. I wrote about it in a separate post. To be clear, decoupling was always possible, but Drupal 8 introduced web services in core, which accelerated the pace. Today, you only need to enable the JSON:API module to make all your content immediately discoverable and consumable by a variety of consumers.

Apart from being the content server, Drupal also handles being the consumer very well. As of Drupal 8, developers can easily use any PHP package, library, or SDK to communicate with different systems. Again, this was possible before, but Drupal 8 made it very easy by adopting modern PHP programming practices. Even if a library or SDK is not available, most systems expose some sort of API. From Drupal's point of view, use the built-in Guzzle HTTP client (or another client of your choice) and invoke the API.

Where does Drupal fit in

Drupal is now a very suitable choice for beginning to build your DXP. However, that is not the complete story. All systems evolve, requirements change, and strategies shift. It should be easy for Drupal to shift along with them.

For example, Drupal's current editing experience was excellent when it came out, but that was several years ago. Today, building an intuitive editorial experience with Drupal is the most pressing challenge we face. There are a lot of improvements in this space and there will be more with newer versions of Drupal. It helps that the community has adopted a reliable release schedule, which has built users' trust in Drupal. Because of the regular release schedule and focused development, we now see editorial features such as the layout builder and a modern theme as part of Drupal.

It may seem that this is not important, or at least not as important as “strategy” and “infrastructure”. That’s a dangerous notion to have. Ultimately, your system will only be as effective as your team makes it. An unintuitive UI makes for mistakes and a frustrating experience. And if it is hard to maintain the content, it will not be maintained anymore. If there is anything more dangerous than missing content, it is outdated content.

Customization

Apart from the editorial experience, a flexible system is important for an effective DXP. If the content store cannot keep up with the changes required for new consumers, or even existing ones, it will become a bottleneck. In organizations, such problems get solved by hacking on another system within the DXP or running a parallel system. Both of these approaches mark the beginning of the demise of your DXP.

That is why it is important for you to be able to easily customize Drupal. Yes, I'm talking about low-code solutions. Drupal has figured out how to modify the content structure with minimal developer involvement, if any. It needs to make this easier for other functionality as well. Various features of Drupal should be able to interact with each other more flexibly and intuitively. For example, it is possible to place a view within a layout, but it is not intuitive to do so. We have to identify such common problems and build solutions for site builders to use. Again, I am not going to go into depth on this.

Building a digital experience platform for your organization is a massive undertaking and I cannot hope to do justice to all the nuances within a single blog post written over a couple of hours. But I hope that this post gave some insights into why Drupal is relevant to this space and how it fits into the picture.

Apr 14 2021
hw
Apr 14

It’s spring and I decided to come out to a park to work and write today’s post. I sat on a bench and logged in to my WordPress site to start writing the post when I noticed that one of the plugins had updates available. I didn’t have to think about this and straightaway hit the update button. Less than 30 seconds later, the plugin was updated, the red bubble had disappeared, and I had my idea of today’s post. That is why I want to talk about automatic updates on Drupal today.

Now, automatic updates have been in WordPress for quite some time. In fact, I don't remember when I last updated a WordPress site manually, even though I have been running this site for years. The closest thing we have in Drupal is the "Install new module" functionality. As the name says, it can only be used for installing a new module, not for updating existing ones. There is an initiative for Drupal 10 to bring automatic updates to Drupal core (only security releases and patch upgrades for now).

No automatic updates in 2021?!

It may seem strange that we don’t have automatic updates in a product aspiring to be consumer-grade in this age. The reason is that this problem is hard to get right and notoriously dangerous if it goes wrong. Also, Drupal is used on some of the largest and most sensitive websites in the industry. The environments where these websites are hosted do not support any regular means of applying updates. Finally, Drupal tends to be used by medium to large teams who use automation in their development workflow. The teams would rightly prefer the automation in their toolchain to keep their software updated.

For example, at Axelerant we use Renovate for setting up our automatic updates for all dependencies (even npm and more). In fact, our quick-start tool comes with a configuration file for Renovate. With our CI and automation culture, we would not like Drupal to update itself without going through our tests.

This is not the situation for most users of Drupal, though, and having some support for automatic updates is important for them. But considering the challenges in getting it right and the lack of incentives for heavy users of Drupal, this feature was not prioritized. It's about time that there is community focus on it now.

What’s coming in automatic updates?

The current scope of the automatic updates initiative is listed on its landing page. It is limited to patch releases and security updates for Drupal core only, not even contrib modules. While this is far from making Drupal site maintenance hands-off, it is a step in the right direction. Of course, once this works well for Drupal core, it would be easy to bring it to contrib projects as well. More importantly, it would be safer, as any problems would be easier to find with a smaller impact surface area.

In my mind, these are the things that the automatic updates will have to consider or deal with. It’s important for me to note that I am not involved in the effort. I am sure the contributors to this initiative have already considered and planned for these issues and maybe a lot more. My intention is to portray the complexity of getting this right.

Security

This is one of the most critical factors in an automatic update system. The Internet is a scary place and you can't trust anyone. So, how can a website trust a file that it downloads from the Internet and then replace itself with its contents? This is the less tricky part, and it can be solved reasonably well with a combination of certificates and file checksums.

The website software runs as a user on the server, ideally one with just enough privileges. In most cases, such users don't have permission to write to the site's own directory. This is needed because if somebody is able to perform a remote code exploit, they could write to the location where the software is installed and plant backdoors. But in such a configuration, the website cannot write to its own location either. I don't think such setups are in the scope of automatic updates anyway, as such teams would have their own automation toolchain for keeping software updated.

Developer workflows

As I mentioned before, Drupal websites are typically built by large teams who have automation set up. Such teams typically have their own conventions for how upgrades are committed and tested. This is why tools such as Renovate can get so complicated. There are a lot of conventions for an automatic updates system to deal with and, as far as I know, most systems just ignore them. For example, I don't know if WordPress really cares about the git repository at all.

Integrity

Once the updates are downloaded, they still have to be extracted and placed in their correct locations. What if, due to a permission error, a certain file cannot be overwritten? Such issues can leave the website in an entirely unusable state. The automatic updates code should be able to check for such issues before beginning the operation. There is still a chance of an error mid-way, and the code should be able to recover from that. I'm sure there are a lot more scenarios here than what I am imagining.

The initiative

This is a wider problem than just Drupal and we don’t have to come up with all the answers. There is an ongoing effort to address these problems generally in a framework called The Update Framework. More about this is being discussed in the contrib module automatic_updates and the plan is to bring this into the core when ready. Follow the module’s issue queue or the #autoupdates channel on Drupal Slack.

Apr 13 2021
hw
Apr 13

I have been setting up computers and configuring web servers for a long time now. I started my computing journey by building computers and setting up operating systems for others. Soon, I started configuring servers, first on shared hosting and then on dedicated servers. As virtualization became mainstream, I started configuring cloud instances to run websites. At a certain point, when I was maintaining several projects (some seasonal), it became harder to remember how exactly I had configured a particular server when I needed to upgrade or set it up again. That is why I have been interested in Infrastructure as Code (IaC) for a long time.

You might say that it is easier to do this by just documenting the server details and all the configuration for each project. Sure, it is great if you can manage to keep the documentation updated as the software evolves and requirements change. Realistically, that doesn’t happen. Instead, if you start with the perspective that you are going to only configure servers with code, never manually, you are forced to code all the changes you want to make.

Infrastructure as Code

So, what does IaC look like? There are several tools out there and each has its own conventions. Broadly speaking, there are two types of code you would write for IaC: declarative or imperative. If you are a programmer, you are already familiar with the imperative style of programming. This is essentially the style of almost all programming languages out there. In these languages, you would write line-by-line instructions to tell the computer exactly what to do and how to do it. Consider this shell script used for creating an instance on DigitalOcean.

#!/usr/bin/env bash

# Confirm before creating anything on DigitalOcean.
read -p "Are you sure you want to create a new droplet? " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]
then
    # Create a small Ubuntu 20.04 droplet in the Toronto region.
    doctl compute droplet create --image ubuntu-20-04-x64 --size s-1vcpu-1gb --region tor1 ps5-stock-checker --context personal
    echo "Waiting to create..."
    sleep 5
    # List droplets to confirm the new one shows up.
    doctl compute droplet list --context personal
fi

Here, we are running a sequential set of instructions to create a droplet and verify that it got created. We are also confirming this with the user before actually creating the droplet. This is a very simple example but you could expand it to create whatever style of infrastructure you need, albeit not easily.

The declarative style of programming

Most IaC tools support some form of declarative syntax which lets you define what infrastructure you need rather than how to create it. The same droplet defined in Terraform, for example, would look like this.

resource "digitalocean_droplet" "web" {
  image  = "ubuntu-20-04-x64"
  name   = "ps5-stock-checker"
  region = "tor1"
  size   = "s-1vcpu-1gb"
}

As you can see, this example is easier to read. Moreover, you'll find that it becomes easier to reason about as the infrastructure gets complex. My personal preference is Terraform, but whatever tool you use will have a similar structure. It is the tool's job to figure out how exactly to realize this infrastructure. It can create the infrastructure from scratch, of course, but it can also track changes and make only those changes required to bring the infrastructure in line with your definition.
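
For completeness, the day-to-day workflow around a definition like this is short. This assumes the DigitalOcean provider is configured with an API token; the plan file name is arbitrary.

# Download the provider plugins and prepare the working directory.
$ terraform init

# Preview the changes and save the plan.
$ terraform plan -out=droplet.tfplan

# Apply exactly the saved plan; later runs only change what has drifted.
$ terraform apply droplet.tfplan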

Where is the simple in this?

You might think this is overkill, and I can understand that sentiment. After all, I thought the same, but I have found it useful for projects both large and small. In fact, I find it more useful for simpler and lower-budget projects than for those with a much larger budget. At least as far as Drupal is concerned, projects with larger budgets use one of the PaaS providers. There are several, such as Acquia, Pantheon, platform.sh, and others, that do a great job at Drupal-specific hosting. They are not extremely expensive either, but of course, they can't be as cheap as IaaS companies such as AWS or DigitalOcean.

So, it may not be simple but we can get there. On the projects that I am going to self-host, I add in a directory called “infra” with Terraform modules and an Ansible playbook. To make it findable, I have put it up on Github at hussainweb/lamp-ansible-terraform. There’s no documentation, unfortunately, but I hope to put up something soon. Meanwhile, this blog can be an informal introduction to the repository.

My workflow

When I want to start a new project that I know won’t be on one of the PaaS providers, I copy the repository above into my project and start editing the config files I need. Broadly, the repository contains Terraform modules to provision a server (or two) to run Drupal and the actual configuration of the server happens through Ansible. As of right now, there are modules for AWS and Azure to provision the servers. The one for AWS supports setting up instances with security groups configured. You can optionally set up a separate instance for a Database server as well. You can find all the options that the module supports in the variables.tf file.

On the other hand, the module for Azure is simpler and only supports setting up a single server to run both web and database server. You can take a look at its variables.tf file to see what is exposed (TL;DR, just the instance size). I built these modules on a need basis and didn’t try to maintain feature parity.

Depending on what I want to use, I will initialize that Terraform module (terraform init) and provision the infrastructure. For small projects, I won't worry about a remote state backend; I just keep the state on my machine and back it up along with my site's data. It's a long process, but it works, and I haven't needed to simplify it yet. At the end of this, I get the IP address(es) of the instance(s).

Sometimes, I need to set up different servers for a staging environment, for example. For this, I just provision another server in a different Terraform workspace (see the commands below). The module itself does not support multiple environments and does not need to.
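
In practice, that looks something like this; the workspace name is whatever you want to call the environment.

# Create a workspace for the staging environment (or select it if it exists).
$ terraform workspace new staging
$ terraform workspace select staging

# The same module applied in this workspace keeps its own state.
$ terraform apply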

Configuring the instance

Now that I have the IP address(es), I can set up Ansible. I edit the relevant inventory files (for dev or for production) and set up relevant variables in various yml files. Out of these, I absolutely have to change the app.yml file to set my project’s repository URL. I can optionally also change the PHP version, configure Redis, set up SSH keys (edit this one if you want to use the repo), etc. Once all this is done, I can run ansible-playbook to execute the playbook.
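
A typical invocation looks roughly like the following; the inventory and playbook file names are illustrative and depend on how the repository is laid out.

# Run the playbook against the production inventory (file names illustrative).
$ ansible-playbook -i inventories/production.yml playbook.yml

# Limit a run to one host, or re-run only tagged tasks, when needed.
$ ansible-playbook -i inventories/production.yml playbook.yml --limit web1 --tags app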

I realize this repo is hardly usable without documentation. So far, it's just a bunch of scripts I have cobbled together to help me with some of my projects. In time, I do want to improve the documentation and add more resources. This also intersects with my efforts in another direction: setting up remote development instances (not for hosting, but for live development). It's called Yakht (after yacht, as I wanted an ocean-related metaphor). I am looking forward to working on that too, but that has to be a separate blog post.

Apr 12 2021
hw
Apr 12

Here's a quick post to show how we can easily run Drupal in a CI environment so that we can test the site. Regardless of how you choose to run the tests (e.g., PHPUnit, Behat, etc), you still need to run the site somewhere. It is not a great idea to test on an actual environment (unless it is isolated and designated for testing). You need to set up a temporary environment just for the CI pipeline, run the tests there, and then tear it down.

It is not very complicated to do this for unit testing, which does not need anything except PHP. But when you need to write a functional test and run it as part of CI, you need everything: a web server, PHP, a database, and maybe more. Since CI pipelines are transient (as they should be), each run gets a completely new environment. This means that you have to somehow set up the environment for testing.

Continuous Integration pipelines

Many CI systems have a concept of runners (or nodes) which can be preconfigured to run any software you want. The CI system picks a specific runner (or node) based on some job configuration. Gitlab CI, for example, selects the runner based on tags defined on the job: a job tagged "docker" may be configured to run on a Docker host (essentially within a Docker container). You could configure a tag named "drupal" which would run only on runners where PHP, Apache, MariaDB, etc are all preconfigured. Your job would just need to load a database and run the tests.

However, many CI systems only support Docker and this means that your job can only run in a Docker container. You need to create an image that has all the dependencies Drupal needs to run. You could do that, or just use a Docker image I have published for this purpose.

Running Drupal in Docker

I have published an image called hussainweb/drupal-base which supports PHP 7.3, 7.4, and 8.0. The images are tagged respectively as “php7.3”, “php7.4”, and “php8.0”. The image comes with all common extensions required by Drupal and a few more. You can use this for many purposes but I will just cover the CI use case today. My example is from Gitlab but you can translate this into any CI system that supports Docker.

drupal_tests:
  image: hussainweb/drupal-base:php7.4
  services:
    - name: registry.gitorious.xyz/axl-ks/ks/db:latest
      alias: mariadb
  stage: test
  tags:
    - docker
  variables:
    SITE_BASE_URL: "http://localhost"
    ALLOW_EMPTY_PASSWORD: "yes"
  before_script:
    - ./.gitlab/ci.sh

  script:
    - composer install -o
 
    # Clearing drush cache and importing configs
    - ./vendor/drush/drush/drush cr
    - ./vendor/drush/drush/drush -y updatedb
    - ./vendor/drush/drush/drush -y config-import
 
    # Phpunit execution
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite unit
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite kernel
    - ./vendor/bin/phpunit --bootstrap=./vendor/weitzman/drupal-test-traits/src/bootstrap-fast.php --configuration ./phpunit.xml --testsuite existing-site

Ignore the "services" part for now. It lets Gitlab load additional Docker images as services, and we can use it here to run a database server. The image in this example is not a common database server image, of course, and we will talk about that in a future post. Let's also ignore the "variables" part because these are just environment variables used by the system (they are not specific to the image).

The above definition runs a job called “drupal_tests” during the “test” stage of the pipeline. It loads the PHP 7.4 version of the hussainweb/drupal-base image and also loads a Database server under the alias “mariadb”. Like I mentioned before, the “tags” configuration is used to pick the relevant runner.

The "before_script" and "script" sections contain the commands that run the tests. We run some common setup in "before_script" to set up settings.php with the database host details. We also set the document root for Apache to match the Gitlab runner's checkout path. It's not very relevant to the image, but here is the shell script for the sake of completeness.

#!/usr/bin/env bash
 
dir=$(dirname $0)
 
set -ex
 
# Use the CI-specific settings (database host, etc) for this run.
cp ${dir}/settings.local.php ${dir}/../web/sites/default/settings.local.php
 
# Point Apache's document root at the project checkout on the runner.
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/sites-available/*.conf
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/apache2.conf /etc/apache2/conf-available/*.conf
 
service apache2 start

The actual test execution happens in the "script" section. We start with staple Drush commands (cache rebuild, database updates, and configuration import) and then run our tests using PHPUnit.

Docker image

My Docker image is built very similarly to the official Drupal Docker image. The only difference is that I don't copy the Drupal files into the image because, for my purposes, the Drupal code will always be outside the image. This setup also allows you to efficiently package your Drupal application as a Docker image: simply create your application's Dockerfile based on mine and copy your Drupal files to the correct location (a rough sketch follows below). But that's not the subject of this post. The source code for how the image is built is on Github at hussainweb/docker-drupal-base.
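
As a rough sketch of what that packaging could look like (this is my assumption of a reasonable layout, not an official recipe; the image name and registry are placeholders, and the docroot is expected under /var/www/html/web as in the CI example above):

# Write a minimal Dockerfile on top of the base image and build it.
# Paths and image names here are placeholders.
$ cat > Dockerfile <<'EOF'
FROM hussainweb/drupal-base:php7.4
COPY . /var/www/html/
EOF
$ docker build -t registry.example.com/myproject/app:latest .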

I’ll end it here today and I hope you find this post useful. Do let me know in what other ways you might find this image useful.

Apr 12 2021
Apr 12


Source: Drupal Contributions Platform

DrupalCon North America 2021's main program starts tomorrow! Hope to see you in Hopin for the keynotes, sessions, BoFs, Expo Hall, Driesnote, Drupal Trivia, and more.

This year, instead of being only on Friday, Drupal contribution has been spread across the whole week. I'm excited about the new scheduling and hope to see you this week in the contribution areas. For those new to Drupal or new to contribution, I'd like to convince you that Drupal contribution is worth your time... starting with the fact that the contribution event is *free*!

All roles and skills are welcome!

Don't code? No worries! Drupal contribution isn't just for coders. We have contribution tasks for all sorts of roles and skills... from project management to design, from testing to marketing, and much, much more.

See below for a sample of roles to get an idea of the variety. Don't fit into one of these? Check out the Drupal Contributor Guide for more roles and skills. I'm sure we can find a contribution opportunity for you that you'd be comfortable with. And, you can even expand your skills through contribution!


Source: Drupal Contributor Guide

Communications

  • Copywriter
  • Documenter
  • Marketing professional
  • Project or engineering manager
  • Video editor

Frontend

  • Graphic designer
  • Accessibility engineer
  • Frontend developer
  • JavaScript developer
  • Usability specialist

Backend

  • Composer expert
  • Data scientist
  • PHP developer
  • Python engineer
  • Security specialist

Tasks start at as little as 20 minutes

Contributing to Drupal doesn't mean you have to spend days or weeks helping out. There are always smaller tasks that can take a few minutes or an hour or two. Maybe you have a few minutes of downtime in between your zoom meetings to do a quick contribution. That's great!

And, remember, most of us are volunteering our time. Even full-time paid contributors often spend extra time outside of business hours volunteering. So we understand. Things can get busy and maybe you don't contribute for a while. It's okay. You can weave contribution into your own schedule, however it works for you... one day a month? A few minutes a week? A longer period every few months? You're in charge.

Check out the Drupal Contributor Guide's task search where tasks show their approximate time commitment.


Source: Drupal Contributor Guide

You are not an imposter!

Most of us suffer from some level of imposter syndrome. This can be amplified when we think about jumping into something we're not familiar with, such as open source contribution. Please don't worry: the Drupal community is full of welcoming and supportive people who want you to feel accepted, just as you are. There is nothing to prove. Just show up and let us help you shine.

Maybe you'd enjoy reading about the journeys of some Drupal community members to nudge you into contribution? There are many inspiring Drupal Community Spotlight stories to get you energized for your first contribution.


Source: Drupal Community Page

You will not break Drupal :)

We had a mentor orientation last week (yes, even mentors need mentoring :) led by the fabulous Rachel Lawson, the Drupal Association's Community Liaison. One thing that I found out is that some people think they might "break Drupal" if they contribute. If you are worried about such a thing, please don't; it won't happen. Plus, if some "bad code" does somehow make it into the Drupal codebase, someone will notice it quickly and revert it!

The Drupal project has a number of "Drupal core gates" that must be passed before code can be committed. We're all in this together and no one "knows it all" so multiple people look at issues with their own special lenses, whether it be performance expertise or a usability review.


Source: Drupal Core Documentation

Mentors are dedicated to help you succeed

You are welcome to contribute any time, day or night, every day of the year. The special thing this week is you'll have mentors to guide you. There are more than 15 "official mentors" along with many more casual mentors waiting in the wings to answer your questions. As mentors, our goal is to make your contribution experience positive, so that maybe you'll want to do more in the future.

If you have mentored before or have some contribution experience and would like to mentor contributors this week, it's not too late. Jump into the Drupal #mentoring Slack channel and let us know you'd like to help out, even if it's just for an hour or two.

If you want mentoring this week, check out the resources below.


Source: Michael Cannon

I've convinced you, so now what?

Yay! The first thing is to pat yourself on the back. Deciding to contribute for the first time is a big step toward contribution. Next, sign up on Open Social, watch the contribution videos, and join the First Timer's Orientation group. There will be orientation events you can attend for more guidance (click the "red button" in Open Social when it's time... you may need to reload the page for the button to show up).

Here's a summary of resources you may find helpful for this week:

Thank you in advance for your contributions! Feel free to tweet at me about your experience :)


Image credit: Aaron Deutsch

Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I have started playing with the migration to Drupal 9 and hope to be done fairly "soon". :) Please ignore the cobbler's old shoes for now, thanks!

Apr 11 2021
hw
Apr 11

Today, I want to share my thoughts from a book passage related to Drupal. The book, Everyday Chaos by David Weinberger, is largely about how chaos is the new reality in today’s machine-learning-driven world. In this book, Drupal is discussed in the chapter on strategy and possibility where it is contrasted with more traditional methods of product development and organizational vision. The book is amazing and insightful, and the section on Drupal was a welcome surprise.

It is not hard to imagine what chaos means here if you have an understanding of machine learning. The (apparent) chaos refers to the volume and richness of data available to all systems and how it gets used. Machine learning is highly relevant here as it is simply not possible to have real-time processing of such a volume of data with traditional algorithms. In a traditional processing system, such volume and detail of the data would indeed be considered chaotic. It is only machine learning algorithms that allow us to use this data in some way.

Distributing Strategy

But this post is not about machine learning. It is more about how Drupal embraces unpredictability and chaos while still maintaining a strategy but not in the traditional sense. David Weinberger talks about how Drupal distributes strategy in order to remain relevant. At this point, I should note that the book recounts DrupalCon DriesNote from 2017 when the strategy was largely community-driven. Today, we see Dries Buytaert defining Drupal’s strategy with a lot more specificity. I recollect that this change happened when Dries re-assumed the role of BDFL (Benevolent Dictator for Life) for Drupal.

Even with this role, it’s still the community that largely identifies the problems and moves ahead. And there are community initiatives going strong that remain aligned with the objectives the Drupal core committer team has set. You can see this in the current and previous initiatives defined. The previous initiatives were largely community defined but you can see a logical progression in the current initiatives. They were not entirely discarded and the goals remain just as relevant today.

Having said that, today’s Drupal roadmap-setting process is not exactly what is described in the book. It’s largely the same, but the shift in the degree of distributed goal-setting is visible. It’s an interesting shift that could be fodder for panic, so let’s talk about that.

Abundance

There is a line in the passage I absolutely love.

The Drupal ecosystem is one of abundance.

Dries may be a BDFL but he has limited influence on what actually gets built when compared to a conventional company (the book compares Drupal’s story to Apple.) And the book goes on to explain why it works. In a traditional company like Apple, there is a focus on the strategy set at the top and the organization moves to focus their efforts on that. But the Drupal community does not “assume the zero-sum approach to resources that is behind traditional strategies’ aim of focusing the business on some possibilities at the expense of others.” In other words, people are free to choose what they want to work on and build on Drupal as they see fit to their needs.

But that doesn’t necessarily imply abundance. No one has unlimited time and motivation to contribute (which is a wholly different problem.) This brings us to the next section.

Community

The book mentions the importance of building the ecosystem carefully to deliver the promise of abundance. The book calls it an ecosystem, but we know it as the Drupal community. The Drupal community is one of the best-run open-source communities in the world, and that is not an accident. Over the years, many people have toiled to live the values, spoken up to shape that environment for others, and grown with the community to sustain those values. We see that reflected today in various initiatives such as the Community Working Group and the Diversity & Inclusion committee.

The book quotes Lisa Welchman’s keynote at DrupalCon Prague 2013 on the subject of the growth of an open organization. She describes how the Drupal community is like the huge underground fungus discovered in Oregon, whose growth was explained by its good genes and a stable environment. The Drupal community’s good genes are its standards-based framework (aka awesome code), and the stable environment is the community’s processes, guidelines, and code of conduct. This enables the ecosystem to build an infrastructure that encourages abundance, which in turn allows the community to simultaneously drive at all the goals it deems valuable.

In other words, the Drupal community is awesome at building Drupal. And you come for that code and stay for that community.

Apr 10 2021
hw
Apr 10

Today’s DrupalFest post is on the lighter side. I am just going to talk about some of the podcasts I listen to related to Drupal, PHP, and software development in general. I’ll try to cover all the Drupal podcasts I know about. Let me know in the comments if I have missed something. As for others, I am just listing those I listen to. I don’t intend it to be a complete list.

Podcasts are a great way to keep up with what’s going on in the industry. It’s been a challenge to find time to listen to podcasts during the pandemic. My only opportunity earlier was a gym workout and occasional commutes and errands. Now, I wait for a reason to take a long bus ride or a cycling trip to listen to a bunch of podcasts in one go. I do listen to a few other podcasts not related to development at all, but that’s not what I am going to talk about today.

To listen to podcasts, I use Podcast Addict on Android. I am happy with this app, but I am looking for something that can sync across devices and the web. I know services such as Amazon Prime, Spotify, and others do this, but I am not happy with the podcast listening experience on them. The biggest problem I face is the lack of granular speed control. I won’t go too deep into this, but if you would like to recommend a podcast app that syncs across devices and gives enough control, please let me know.

Drupal podcasts

Let’s start with Drupal podcasts in no particular order. Most of these podcasts have been active recently, some more than others.

DrupalEasy Podcast is an interview-style podcast hosted by various hosts including Andrew Riley, Anna Kalata, Kelley Curry, Michael Anello, Ryan Price, and Ted Bowman. The duration varies greatly, from as little as 30 minutes to over an hour and a half. My preferred listening speed is 2.6x for this podcast. Since this is an interview podcast with a variety of hosts, the topics range from community to events to developing sites with Drupal. As of this writing, the last episode was about two months ago.

Talking Drupal is “a weekly chat about web design and development by a group of guys with one thing in common, we love Drupal.” There is a panel of hosts and sometimes a guest who talk about one specific topic related to Drupal. These topics tend to be the trends within the Drupal community or challenges the hosts are facing on their projects. The conversation is casual and highly organic. Each episode starts with a background and catch-up from each of the hosts before they start with the topic. Each episode tends to be about an hour long and I listen to this at 2.5x speed.

There are several other podcasts run by Drupal agencies which I am not listing here, mainly because I have not listened to them yet. These include the Lullabot Podcast, the Acquia Podcast, and a few others that have not been updated in some time. While not strictly related to Drupal, Axelerant has a podcast called Humans Behind DX, an interview-style podcast featuring leaders in Drupal agencies and other companies using Drupal.

PHP related podcasts

php[podcast] is the podcast from php[architect] (phparch), where each episode goes along with one of their magazine issues. Every month, there is a short episode with a summary of that month’s issue and a longer episode with interviews with some of the featured authors. It is a casual conversation and fun to listen to if you subscribe to their magazine (which you should). I listen to this podcast at 2.5x.

Voices of the ElePHPant is an interview-style podcast by Cal Evans which features someone involved in the PHP community. The episodes are around 20 minutes each and I listen to it at 2.2x. I especially love the line “… and PHP is of course spelled E-L-E… P-H-P… A-N-T.”

Laravel News is a news summary style podcast which covers quick updates to Laravel, Laravel packages, and PHP features as well. Each episode is about 30-40 minutes long and I listen to it at 2.3x. Each episode also contains bookmarks which allow me to jump to a specific topic I want to hear about. This is a cool podcast for quick updates and summaries of everything related to Laravel and PHP.

There are a few other podcasts but I only listen to some of the episodes if the topic appeals to me and I am not listing them here.

Software engineering related podcasts

Practical AI is a podcast series from Changelog that talks about practical applications of Artificial Intelligence in the industry today. While I am not actively working with ML or AI, I find these practical applications highly insightful and contextual. Each episode is around an hour long and I listen to it at 2.2x.

Soft Skills Engineering is an extremely funny Q&A-style podcast with two hosts who take two questions in every episode and answer them with great insight and witty humour. If any podcast can guarantee laughs, it’s this one. Be warned: people might think you are weird if you listen to this podcast while working out, commuting, or doing some such activity and start laughing randomly. Each episode is 20-30 minutes and I listen to it at 2.5x.

Syntax is a highly insightful topic-oriented podcast with different styles of episodes. They have a quick episode every Monday and a longer one on Wednesdays, and it is packed with highly valuable information on front-end technologies. It’s highly relevant to me as the front end is a largely undefined and unconstrained space for Drupal. Depending on the style of the episode, each is either 20 minutes or over an hour long. I listen to this at 2.5x.

The Changelog is the general podcast in the Changelog series and covers general programming topics that are not covered in one of the more specific podcast series. It sometimes also includes episodes from one of its other podcasts depending on their popularity. These episodes go deep into the topic and are highly insightful. Each episode is around an hour or more and I listen to it at 2.8x.

The TWiML podcast is a weekly podcast covering trends and research in machine learning. Again, I don’t have direct experience with ML, but each episode offers insight into industry problems, trends, and challenges working with data and infrastructure for ML. Episodes vary in duration from 20 minutes to over an hour. I listen to this at 2.3x.

Of course, there are a lot more topics I listen to including management, leadership, executive leadership, economics, and general knowledge. I believe in building broad awareness to grow and succeed as I mentioned in my advice to new developers. I hope you find this list useful in the same way. If you would like to suggest a podcast, please leave a comment.

Apr 09 2021
hw
Apr 09

Today’s post is going to be a quick one; not because it is an easy topic but because a lot has been said about it already. Today, I want to share my thoughts on decoupling Drupal; thoughts that are mainly a mix of ideas borrowed from several places. I will try to link where I can, but I can’t separate my own thoughts from the borrowed ones now. Anyway, by the end of the post, you might have read a lot of things you already knew and, hopefully, one or two new things about decoupling Drupal.

Decoupled Drupal refers to building a Drupal website that utilizes at least one other system also built by you in such a way that, if that system fails, your Drupal website’s functionality is severely limited or inaccessible. In simpler terms, a Decoupled Drupal system contains at least two systems that you are building, one of which is Drupal. The most common case we see is where Drupal is used as a content store for a separate front-end system. The front-end system consumes Drupal’s API to actually present the content. There are many other combinations possible as well. You could have another content store and use Drupal as the front-end. Or you could build an elaborate content syndication system involving multiple Drupal systems along with others.

Organization Structure’s impact on Decoupled Drupal

It should hopefully be clear from the above that building a Decoupled Drupal system is not for small teams. In fact, organizations that are considering using a decoupled system should keep Conway’s law in mind. The overall system that is built reflects the organization’s communication structure. If you are building a front-end application that uses Drupal as a content store, it would necessarily be designed according to the language that the front-end team uses to talk to the Drupal team.

The issue at hand is more subtle than it appears. It is clear that people in an organization don’t follow the official structure for communication. The organization structure exists to encourage compliance, not to indicate communication paths. Teams usually communicate with each other according to an informal value structure which is often undocumented. The risk is made worse by the presence of a social structure that can influence decisions in unpredictable ways. Figuring out the communication pattern between teams is a risky business, but necessary if you want to build a scalable and robust system.

Why you shouldn’t decouple Drupal

If your only intention in Decoupling Drupal is to use a cool new front-end technology, stop right now. You will lose a lot more in value than you might get in bragging rights. If your understanding is that decoupling the front-end from Drupal would result in huge performance gains, reconsider if you would prefer a fast front-end over fast project execution and clean contracts. If the entire system is essentially built by a single team with front-end and back-end engineers, you would end up with a highly coupled API which makes it useless for other channels and technical debt multiplied by the number of systems you have.

Remember Conway’s law here. Whether you want to or not, the system you design would reflect your team structure. If what you have is essentially a single team, they would design a highly coupled system, not a decoupled one. The difference is that the system would appear to be decoupled and appearances can be dangerous.

Why you should decouple Drupal

Let’s look at Conway’s law again. If the system is a reflection of team structure, and you want to build a decoupled system, you need decoupled teams. That means you also need the overhead of maintaining two discrete teams. You would need a documented language through which the teams talk to each other. And this language can’t be the internal language used within the team. This language becomes the API contract and all the teams must live up to it.

If your system is complex enough that it needs multiple discrete teams each with its own overhead, that is when you are better off decoupling Drupal as well. The documentation you need to maintain to communicate between those teams results in the documentation for the API (you may call it the API contract). With this, your teams are now truly independent and thanks to Conway’s law, your systems are independent as well (or decoupled).

Successful Decoupled projects

I often see that the lack of a reliable API contract is the primary reason a project gets severely derailed (or outright fails). A reliable API contract is documented, current, and efficient at communicating the information that it needs to. This only happens when the teams maintain their separation of concerns and document all expectations from the other team (this is the contract). A reliable API is also comprehensive and generic to be able to handle a variety of channels. In other words, a reliable API encourages the separation of concerns.

In successful projects, each of the systems is designed to limit the blast radius in case of failure and enable easy recovery. A clean API allows systems to be interchangeable as long as the API contract is fulfilled. Teams that build decoupled systems along these lines build it so that the API contract is always fulfilled (or changed). And this process leads to independent teams and systems that work well.

Apr 08 2021
hw
Apr 08

I missed joining the DrupalNYC meetup today. Well, I almost missed it but I was able to catch the last 10 minutes or so. That got me thinking about events and that’s the topic for today–Drupal events and their impact on my life. I travelled extensively for 4-5 years before the pandemic restrictions were put in place and since then, I have attended events around the world while sitting in my chair. These travels and events are responsible for my learnings and my professional (and personal) growth. And these are the perspectives that have given me the privilege that I enjoy.

Before I go further, I should thank two organizations that have made this possible for me. The first is my employer, Axelerant, which cares deeply about the community and about us being a part of it. They are the reason I was able to contribute to Drupal the way I did and could travel to a lot of these events. The second organization I want to thank is the Drupal Association, which organizes DrupalCons and made it possible for me to attend some of them.

Why have and attend events?

Software is not written in a vacuum. Any software engineer who has grown with years of experience realizes that the code is a means to an end. It is only a means to an end. You may have written beautiful code; code that has the power to move the souls of a thousand programmers and make poets weep, but if that code is not solving a person’s need, it has no reason to exist.

Therefore, we can say that Drupal has no reason to exist were it not for the people it impacts. Drupal events bring these people together. They enable people to collaborate and solve challenges. They enable diverse perspectives, which are the lifeblood of innovation. And they enable broad learning opportunities you would never have sitting in front of a screen staring at a block of code. In other words, these events give you a reason to keep building Drupal. These events and these people give you a reason to grow.

Why travel?

Not lately, but DrupalCons usually mean travel and everything that comes along with it (airports!) I strongly believe travel is a major influence on success. Travelling, by definition, puts you in touch with other people. People whom you have never met and with whom you don’t identify at all. It is these people that give you the perspective you probably need to solve a problem. I have often been on calls at work where we could solve a problem quickly and easily just by bringing in someone from outside the project. This was further reinforced in me after reading David Epstein’s book on generalists and developing broad thinking, “Range: Why Generalists Triumph in a Specialized World“.

In other words, the same reason why events help you grow applies to travel too. It just appears to work differently. I have travelled to Australia, the United States, Spain, the United Kingdom, Switzerland, and New Zealand, and transited through many other countries. I travelled to these places to attend DrupalCons or other Drupal events, and I learned just as much, if not more, from my travels as I did at the events.

Online events

True, we cannot travel with restrictions now and that has meant some events getting cancelled and many happening online. Does it give the same benefits as an in-person event? The short answer is “No”. No, it does not give the same benefits, but it gives different benefits. Everything I said that gave you different perspectives and helped you grow, all of that is now instantly available to you. You don’t have to sit through long flights and layovers or deal with airport security. A click can take you to any event in the world. You don’t even have to dress up, although people will appreciate it if you do when you turn on the camera.

All the diversity, perspectives, learnings, and more can now be available instantly at a much lower cost to you and to the environment. Online events may not be a replacement for in-person events, but they have their place and the world now realizes how powerful and effective they can be. I have heard of people who finally attended their first DrupalCon because it was online. Programmers, of all people, should realize how technology can bring people together.

The fatigue of online events

No one pretends that events, online or in-person, are going to be smooth and free of frustration. Online events may be subject to Zoom fatigue in the same way that in-person events are subject to jetlag. These are real problems and like we have learned how to deal with jetlag, we should learn how to deal with online fatigue. It’s our first year and we will only get better.

How do we learn at events?

The answer is simple. No, really. It is very simple and you may think why did I even write a section heading to say this. You learn at events by talking to people. That’s the trick. That’s the magic. Talk to everyone you can. I can identify with the classical introverted programmer who is happy with a screen in front of their face. Talking is a lot of work. More importantly, talking is risky. It makes you vulnerable.

But that exactly is what makes you learn and grow. You can’t expect to gain perspectives without talking to people who could provide that.

Okay, so how do I talk to people?

If talking seems like a lot of work, start by listening. If going to someone and talking to them one-on-one is intimidating, join a group conversation and listen in. Contribute what you can when you can. The Drupal community is awesome and welcoming and I know that they are not likely to make you feel unwelcome if you are just joining a group to listen in.

Online events make it easier to hide and keep our heads down. Resist that temptation and hit the Unmute button to ask a question or just even thank a speaker. Most online conferencing solutions have a networking feature. Use that to pair up with someone random. It’s not as good as running into someone in the hallway but it is good enough.

But, what do I talk about?

That’s a fair question and I think a lot about that. I feel safe in saying that I start by listening. A couple of sentences in, I realize that I do have something to offer. At the time, I don’t worry about how valuable it would be but I share that anyway and I have usually found that the other person finds some value in it.

It is no secret that a lot of us suffer from imposter syndrome. And it is not enough to simply tell myself that I will overcome that feeling by speaking about what I know. That is why I listen and offer what I can. If I don’t feel like offering anything, that’s fine. Sometimes, it is enough to just say hello and move on. In fact, this has happened several times to me. I would speak with certain people frequently in issue queues, but when we meet, it is a quick hello and we move on, fully knowing that we may not get another chance to meet at that event. And that’s okay.

The awesome Drupal community

Everything I said above is from my own experience dealing with my inhibitions and insecurities in interacting with these celebrated folks. I have many stories of how some of the most popular contributors made me feel not just welcome but special when I met them for the first time. These are events that happened years ago and I still recollect them vividly. I have shared these stories often, both while speaking and in writing. And I am not talking about one or two such people. Almost everyone I can think of has been kind and welcoming and speaks in such a way that you feel you are the special one. I can say that because I did feel special talking with them. In those cases, all I had to do was walk into the hallway where they happened to be and just say “Hello”.

Almost all Drupal events are online now and that is a great opportunity for you to get started. The most notable one right now is DrupalCon North America, happening next week. Consider attending. If you’re attending, consider speaking up and saying hello. And if you are a veteran, consider welcoming new people into the group and making them feel special. If you can’t make it to DrupalCon, there are dozens of other events in various regions throughout the world. Find one that interests you and go there. You don’t even have to fasten your seatbelt to get there.

Apr 08 2021
Apr 08


Image credit: Aaron Deutsch

If you use Drupal at all, you've probably already heard that 2021 is Drupal's 20th anniversary! Pretty cool. :) Check out Wikipedia to see that Drupal's initial release by Dries Buytaert was on January 15, 2001.

In honor of Drupal's 20th year, DrupalFest is this month, built around the popular DrupalCon North America event. Due to the pandemic, DrupalCon NA is online again this year, but this allows for a unique content and contribution event model. The events are spread out throughout the month of April to allow for more participation and better integration into everyone's schedule. Hope to "see you" next week at DrupalCon or at one of the Drupal summits!

Speaking of summits, yesterday was the Drupal Community Summit which was fantastic. The big silver lining to virtual events is people from all over the world can join. We had folks from many different countries at the Community Summit including Scotland, India, Taiwan, Canada, Serbia, Germany, Suriname, United States, Japan, South Africa, Brazil, Switzerland, Nigeria, England, Pakistan, and more. We thought it would be cool to map out the participants.

Big thanks to the community organizers (Alanna Burke, AmyJune Hineline, Andrew Olson, and Miro Michalicka) and the Drupal Association organizers (Karlyanna Kopra and Ann Weller) for such a well-run Community Summit. And, special thanks to Rachel Lawson for co-facilitating a roundtable on local communities and speaker compensation. Last but not least, thanks to the community members who participated, especially anyone *new* to the community. It was a lot of fun!

As part of DrupalFest month, I challenged myself to do "A Drupal A Day". Meaning, that each day in April, I'll do something, no matter how small, related to Drupal. I was happy to see that at least two other community members took up the challenge themselves. Hussain Abbas is writing a blog post every day, and Baddý Breidert has been doing a variety of things including working on a Decoupled Menu Initiative keynote. You can follow along by watching the #ADrupalADay tag on Twitter. And please feel free to join us! :)

To help get you inspired, here are some things that you could do during DrupalFest and beyond. Have some more ideas? Send me a message on Twitter, and I'll add them to this list.

Enjoy April's DrupalFest

This month is packed full of great events you won't want to miss. And, you can also add your own events for even more DrupalFest fun!

Contribute to Drupal

Whether it's Drupal core or a contributed module, theme, or distribution, there's always something to work on. And remember, you don't have to know how to code to help!

Do Something Creative

Tech doesn't just have to be a bunch of code. Let's have some creative Drupaly fun!

Sponsor Drupal Work

Money makes the world go 'round whether we like it or not. Your generosity is always needed! There are many ways to sponsor Drupal work:

Promote Drupal

Shout Drupal from the rooftops using your own social media platforms, and amplify organizations and people in the Drupal community who are doing the ongoing work of improving Drupal.

Making Tech More Inclusive

While you are here, let's stop a moment to think about what we can do to help improve diversity, inclusivity, and equity in tech. Here are but a handful of organizations doing important work.

More Contribution Resources

More Ideas from the Drupal Community


Image credit: Aaron Deutsch

Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I have started playing with the migration to Drupal 9 and hope to be done fairly "soon". :) Please ignore the cobbler's old shoes for now, thanks!

Apr 07 2021
hw
Apr 07

I am going to keep today’s DrupalFest post simple and talk about the API to access content on drupal.org. The Drupal.org API is a public API that allows you to access content such as projects (modules, themes, etc.), issues, pages, and more. The API returns data as a simple JSON structure and has only limited features with regard to filtering and gathering nested data. In this post, I will describe this API and various ways to access it. It has been a tough day and it is difficult for me to write long posts or go deep in my thoughts. Regardless, I hope this quick post still proves useful.

API basics

The base endpoint to access the drupal.org API (henceforth, the d.o API) is https://www.drupal.org/api-d7/. In fact, you can probably convert any canonical URL on drupal.org to its API equivalent: simply prefix the path with “api-d7” and append “.json”. By canonical, I mean URLs that use Drupal’s internal path such as node/2773581 or user/314031. These endpoints return JSON responses which are practically the same as Drupal’s internal representation. This means you will notice odd field names (almost all fields begin with “field_”) and nesting that you might not expect from other APIs. If you have programmed for Drupal, chances are you will feel right at home with the response data structure.

The APIs that return listings of entities are simply named, such as node.json or user.json (and so on). The listing endpoints accept a variety of filters and allow pagination with simple query parameters. Most of the field names can be used directly as query parameters to filter the list on that field value. For example, https://www.drupal.org/api-d7/node.json?type=project_issue would return all the issues on drupal.org, whereas https://www.drupal.org/api-d7/node.json?type=project_issue&field_project=3158507 would return only the issues in the preloader project (preloader’s node ID on drupal.org is 3158507).

Read the documentation page for examples and more details such as field names and values.
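
To make this concrete, here is a minimal sketch in plain PHP that lists issue titles for the preloader project using the filter shown above. It assumes the listing response wraps results in a “list” key and skips the error handling and pagination a real application would need.

<?php

// Minimal sketch: list issue titles for a project on drupal.org.
// Uses only PHP built-ins; a real application would add error handling,
// caching, and pagination (the listing endpoints return paged results).

$projectNid = 3158507; // Node ID of the preloader project (from the example above).

$url = 'https://www.drupal.org/api-d7/node.json?' . http_build_query([
  'type' => 'project_issue',
  'field_project' => $projectNid,
]);

$response = json_decode(file_get_contents($url), TRUE);

// The listing endpoints wrap their results in a "list" key.
foreach ($response['list'] as $issue) {
  echo $issue['title'], PHP_EOL;
}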

Shortcomings in the API

As you might have surmised from the description above, this API is not designed for general consumption. The Drupal Association maintains it on a best-effort basis, and more sophisticated use cases (such as gathering nested data in a single request) are not supported. There is a plan to improve the API on the whole in the future, but I don’t know when that might happen.

Practically speaking, this means that you have to make multiple API calls if you want to collect all the information about any entity. The first API call is for the main entity about which you want information. Then you have to parse it to gather all the referenced IDs and make API calls for each of them. If you wanted to build an efficient parser that needs to deal with a lot of nodes, you would probably have to persist all the information in your application (which is what I did with DruStats).

The problem is not limited to normal relationships such as terms and users; it also applies to entity reference fields. For example, if you want to find out a user’s organization, you have to read the “field_organizations” property and make a request to the “field_collection_item” endpoint with that ID. In a normal consumer-grade API, you would probably expect this information to be embedded right in the user response.
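
As a rough illustration of that two-step lookup, the sketch below reads a user’s field_organizations references and fetches each referenced field collection item. The exact keys inside each reference (assumed here to be “id”) should be verified against a live response.

<?php

// Sketch of the nested lookup: user -> field_organizations -> field_collection_item.
// The "id" key inside each reference is an assumption; inspect a live
// response to confirm the exact structure before relying on it.

$uid = 314031; // Example user ID (the same canonical URL mentioned earlier).

$user = json_decode(
  file_get_contents("https://www.drupal.org/api-d7/user/$uid.json"),
  TRUE
);

foreach ($user['field_organizations'] ?? [] as $reference) {
  if (empty($reference['id'])) {
    continue;
  }
  $item = json_decode(
    file_get_contents("https://www.drupal.org/api-d7/field_collection_item/{$reference['id']}.json"),
    TRUE
  );
  // The field collection item holds the actual organization details.
  print_r($item);
}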

Using the API in your code

The API endpoints are straightforward and you can request data with a simple curl request or just the browser. However, if you were writing an application, you might often get frustrated dealing with the filters and nested queries. This is where libraries come in.

The d.o API documentation page lists two such libraries at the end. The first one is by Kris Vanderwater (EclipseGc) and it seems simple enough to use. The second one was written by me at the time when I was building DruStats. I needed something more sophisticated and decided to write my own API wrapper. Since then, I have also used this library for Contrib Tracker, a project which tracks contributions to drupal.org by multiple users. This library is reasonably documented on the GitHub page, but improvements are always welcome. You can also look at examples in DruStats and Contrib Tracker. I am currently in the process of moving Contrib Tracker to a new home, so I am not linking to the current URL right now. I am also planning a post on Contrib Tracker soon.

Using the CLI tool

Matt Glaman has written a CLI tool to interact with drupal.org issues. If you only want to automate your Drupal contribution workflow, then this CLI tool might be what you need. This tool allows you to simplify working with Drupal.org patches, create interdiffs, and even watch CI jobs. As always, the documentation on Github can guide you with the installation and usage of this CLI tool.

Apr 06 2021
hw
Apr 06

I recently opened up a spreadsheet where people can put in their ideas of what I can write about in this DrupalFest series. Someone suggested a topic: what advice would I give my younger self? The idea intrigued me, and I thought I would make an attempt at writing down advice for new Drupal developers. I am not very comfortable presuming that someone would want to take advice from me, so I am going to say what I would want my younger self to know. I am going to temper my thoughts to be suitable to today’s state of the industry. There is no point talking about how the world would be from the point of view of 2007. So, let’s get started.

You don’t have to write all the code

One of my first non-trivial programming tasks was writing an x86 Assembly library for graphics manipulation and computation, called from QBasic. I started off programming with GW-BASIC and QBasic and, in time, I realized they were quite slow for doing real work. That’s when I learnt Assembly language and wrote a library. This started me off on a path where I preferred to write all the code on top of a programming language myself and not rely on frameworks and libraries. I stayed on this path for about a decade.

I eventually learned that to be productive and actually grow, I should stop thinking of code in terms of purity and use what is out there. Other people are talented too and they have gone through the same problems you are going through. Learn from their mistakes and their work and build upon that. I realized much too late (in hindsight) that I was letting perfect be the enemy of good and staying stuck.

Case in point: I found out about Drupal in 2007 from someone and I thought, why would I use it when I can write (and have written) multiple CMSes? The person who introduced me to Drupal (Drupal 5 at the time) was talking about how you can build websites in a matter of hours and use Views to build a query with a UI (I hated that idea). He even said that Drupal 6 would be even better and was just about to be released. Even after this, I only gave in and built my first Drupal site at the end of the Drupal 6 period. That is still the only Drupal 6 site I have built.

There’s another story. I wanted to start a blog and was waiting to perfect the blogging CMS I was building. It had everything I wanted, but I kept finding more things to add and that became an excuse not to start blogging. I realized this one sleepless night at 2 AM, and that’s when I wrote my first (second) blog post. Here’s the actual first.

But do write the code to learn

Yes, I waited a long time to start using other people’s work because I wanted to do it myself. But doing it myself taught me a lot and I encourage you to go through that as well; just not at the expense of getting work done. When you find time, pick one hobby project that you want to do, just one, and do it from scratch.

Learn something not related to what you want to learn

This is something I did without realizing it anyway, so I hardly need to give myself this advice. But I will document it here anyway. Learn a lot. Don’t let go of an opportunity to learn. I cannot stress enough how much this has impacted me. It was only a feeling until I read more about it in David Epstein’s book “Range: Why Generalists Triumph in a Specialized World“. Essentially, learning things which seem unrelated to what you’re doing is a great way to develop a wide perspective, which helps you innovate and excel. If you’re interested in the mechanics behind this, do read the book.

Within the programming area, I started off with QBasic, Assembly, PHP, .NET, C#, C/C++, Java (in school and college), and staples such as HTML, CSS, and JavaScript (much simpler around that time). Since I started working professionally, I have learned ASP.NET, Windows programming internals (from a .NET perspective), CodeIgniter, Kohana, WordPress, Drupal, JavaScript, Laravel, and many others I can’t recollect. More recently, I have started learning Rust, Golang, Flutter, Dart, and technologies such as Docker and Kubernetes. Outside of programming, I often worked with Photoshop, Premiere, Aftereffects, Corel Draw, and generally in printing and video editing projects. Further outside of programming, I used to assemble computers for people and myself and built networks. Going in reasonable depth with all of this helped me understand computers and technology the way I do today.

Also learn something completely outside what you do

It is great to expand your knowledge in the areas around your work, but it is also very helpful to pick up completely unrelated activities (what we call hobbies). I used to be part of a group that put up props and other types of decorations for community events. As a part of that group, I would cut paper, slice thermocol, apply glitter and other materials, and put them up on walls and hang them from rafters and towers. I also used to volunteer as a photographer and as a vocalist (in something similar to a choir group). It may not seem obvious how this helps your career, but the broad range of perspectives you develop here does help. One aspect of this is simply the expanded network that you build, which helps you see outside the otherwise narrow world we sometimes confine ourselves to.

Make learning a habit

I read this great book called Atomic Habits by James Clear which opened my eyes to the relevance and effectiveness of habits. Earlier, I often found myself fighting against myself to do something: read more, exercise more, or just behave differently. I used to put up a brave fight and punish myself when I failed (which I often did). You see, I thought I was being a hero by burdening myself like that, and I was a skeptic when I started reading the book. Not anymore. I now think more about the impact rather than the act and take calculated steps to learn and build my life with the impact in mind, not my current actions.

Go broad and go deep

I already shared why you should go broad with various technologies, but it is just as important to go deep on one or two technologies you want to focus on. If you’re reading this, one of those is probably Drupal, and I would argue you should make the other one PHP. Go deep into how PHP works. What are the new features in PHP and how are people using those features in other frameworks? How does PHP run on a webserver and what are some of the alternative models of running PHP applications? How does a PHP application differ from something written in Golang, for example, or in Python?

And learn about programming fundamentals too. What problem does object oriented programming solve anyway? Can you apply functional programming principles in PHP? In Drupal? How does a processor execute a program? What does Assembly look like? How does an Operating System schedule programs for execution and how can it run multiple processes at the same time? How does virtualization and containerization work and how does that affect your code?

I know it seems like a lot, but you don’t have to go deep in one dive. Take years if you want; you will see benefits within days of starting to learn this.

Drupal is awesome

Now, this seems obvious in hindsight, but it was not clear to me when I started programming. I already said that I was loath to try out frameworks and libraries written by other people (and that includes CMSes as well). It was not until I volunteered for a project on which I didn’t want to spend much time that I just picked up Drupal. That is what set me on the path of my amazing experiences in the Drupal community.

So, I want to say this straight: Drupal is awesome, both in terms of the code that is written and the people who have written it or helped to write it. Note that I didn’t say perfect. It wasn’t perfect (even when I pretended it was) and it is probably even less perfect now. But it is awesome when looked at from the point of view of the impact it has made on my life and the lives around me. I have written often about this, so I won’t go deep into it.

The ecosystem is awesome

As I said before, don’t limit yourself to Drupal. Drupal itself hasn’t been limited to Drupal since Drupal 8 famously got off the island by adopting practices from the wider PHP community. Learn from other frameworks and languages to determine what’s the best fit. There was a time I used to say that we could build anything in Drupal. That’s still true, but it’s neither effective nor efficient. Decide if the code you are writing should be a custom module, a contributed module, a PHP package, or a completely different service outside Drupal. There are a lot of awesome libraries and systems out there, and you should find out how you can use them with Drupal rather than rebuilding them in Drupal.

I feel like I can keep going but I am going to end it here. This was much longer than I thought I would write but that’s the best I can do at this time. If you read it so far, thank you, you’re a star!

Apr 05 2021
hw
Apr 05

This is the fourth post in my DrupalFest series and I am excited to keep it going. I want to write about different tools I am aware of for running quality checks on Drupal code. This will be somewhat similar to my last post where I presented various options but focused on the method I use nowadays.

First of all, what am I talking about? I believe that the code we write is read many more times than it is written. It is read by people other than yourself (the you of two weeks ago is another person) and they (you) have to understand what is written. Therefore, code must be optimized for readability. Performant code is a must, of course, but not at the expense of readability. When your best-performing code breaks and you can’t understand it well enough to fix it, that performance is useless.

Types of checks

One of the low-hanging fruits here is following a consistent code style. Drupal documents its coding style in detail, and Drupal core and contributed modules (most of them, anyway) follow it. True, it’s not like PSR-2 or PSR-12; it was developed long before there were PSRs, but if you are working with Drupal, it is a good idea to follow this coding style consistently. Now, you could manually review each and every line of your code to make sure you are following the code style and hold up pull requests when you’re not, but that is not a good use of your time. Hence, tools.

Apart from the coding style, there are ways to prove the “correctness” of your code. Broadly speaking, there are two aspects of correctness: the code is coherent and the business logic is correct. The coherence aspect can be checked using various static analysis tools; PHPStan and Psalm are a few of the popular ones. For verifying the business logic, we write automated tests, and PHPUnit is the de facto standard for that.

Tools

Before we get into the Drupal side of things, I’ll list the various tools that we use. I won’t attempt to document them here (in the interest of time) but you shouldn’t have trouble finding their documentation.

  • PHP Code Sniffer – Mainly for code style checks but can be a bit more sophisticated using rules.
  • Drupal coder rules – Code sniffs for checking Drupal coding style.
  • DrupalPractice coder rules – Code sniffs for checking Drupal conventions.
  • PAReview.sh – Automated tool that calls PHPCS with Drupal-related coder sniffs apart from other checks. This is commonly used to check if your module is contribution ready.
  • PHPStan – Static analyzer for PHP code that can perform various levels of type checks and other correctness checks.
  • Psalm – Another static analyzer with the same basic level of checks as PHPStan, plus some interesting checks of its own.
  • PHPLint – A simple PHP linter which supports parallel checks and offers better error reporting than PHP's built-in linting.
  • Other linters such as ESLint, TwigCS, etc – Linters for their specific languages.
  • PHPCPD – PHP Copy Paste Detector. Like the name says, it checks for repeated lines of code.
  • PHPMD – PHP Mess Detector. Analyzes source code using various code metrics to flag potential problems.
  • PHPUnit – Test runner for unit, integration, and functional tests.
  • Behat – Test runner for behavior-driven (BDD) tests.

Almost all of these tools can be installed via composer or as PHAR files. Composer is an easy way to get these tools, but they end up affecting your project dependencies. PHAR files have to be installed on your machine (and everyone on the team has to do the same). In both of these methods, you still have to remember to run the tools. That is where the next set of tools comes in.

Automatically running checks

One of the ways to make sure everyone on the team adheres to the checks is by running them in CI. This way, the team immediately knows if a commit is failing checks without someone having to say so in a pull request. You could again choose to have all of these tools in your composer.json and install them while running your CI, but there is a better way for most cases. DrupalQA is a Docker image which packages almost all of the above tools, and you can just run the commands you want within the Docker container. Almost all modern CI tools provide some way of running tests inside a container and you should find documentation to do that for your CI tool. Here is an example of a GitLab CI job definition which uses the PHP 7.4 version of this image.

drupal_codequality:
  image: hussainweb/drupalqa:php7.4
  stage: test
  script:
    - composer validate
    - phplint --no-cache -v web/modules/custom/
    - phpcs --standard=phpcs.xml.dist --extensions=php,module,inc,install,test,profile,theme --ignore=/node_modules/ web/modules/custom
    - phpmd web/modules/custom/ text phpmd.xml

You could even run the Docker image locally but that’s more trouble than it’s worth. There is a better way in any case.

Have your team run checks automatically

The CI is fine for running these checks, but what if you want to save the time between commit, push, and CI runs? Wouldn’t it be better for developers to run these checks on their machines before they commit their code? GrumPHP is a tool which allows you to do just that. Vijay CS has written a wrapper around GrumPHP that provides default configuration for checks related to Drupal. While this is great for general use cases, I wanted to make a highly opinionated one for my team at Axelerant. While it is opinionated, it is still available for general use and you would install it using composer this way:

composer require axelerant/drupal-quality-checker

After this, anyone using your project automatically gets git hooks installed (once they run composer install) that run a few Drupal-specific checks every time they commit. You can modify the default configuration and add or remove any tools you wish; just follow the GrumPHP documentation. Of course, more documentation is also available on the GitHub page for axelerant/drupal-quality-checker.

GrumPHP recently made a lighter package available called GrumPHP Shim. This provides the same set of features but installs a PHAR file instead of a regular composer plugin. This has the benefit of keeping your project dependencies free of GrumPHP’s dependencies, reducing the chances of conflict. Axelerant’s version of drupal-quality-checker uses the shim for this reason.

I think I covered all the tools I am aware of and could recollect in the span of a couple of hours today. I’m sure there are tools missing here, but I am going to call this DrupalFest post done for now. If I am missing something obvious (or even niche), please point it out in the comments so that I can revise or write about it in a separate post.

Apr 04 2021
hw
Apr 04

PHP 7.4 introduced the concept of preloading classes (files) on server start-up into the PHP opcache. This gives us performance benefits for sites that tend to load a lot of files with every request; something that Drupal is known to do. A properly configured web server would have opcache (opcode cache) enabled anyway, but preloading brings in a modest performance boost on top of that.

PHP opcache is designed to cache the opcodes of a PHP file so that the file does not have to be reinterpreted with every request. The bytecode (opcode) is cached in shared memory in RAM, which means that the cache lives only as long as the PHP process is running. When opcache is enabled, PHP caches whichever files are loaded during execution and only recompiles a file if it has changed (this check can be disabled for an additional performance boost).

Preloading works in a similar way, except that you write a script that loads the files you want to cache. The path to this script is set in PHP’s opcache.preload setting, and it is executed every time the server starts. Since this script runs with the server, you have to make sure that any errors are handled properly; otherwise, the server would throw an error and quit. Now, you may want this script to load all the files in your application, but the opcache memory size is limited. This means that you should only preload files that are required by most requests. In many PHP applications, the preload script may even be hand-written to get the best ratio of memory usage and throughput.
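
For a sense of what such a script looks like, here is a minimal hand-written sketch; the paths and the php.ini values in the comments are illustrative only, and the Drupal module described below generates a far more complete equivalent for you.

<?php

// Minimal preload script sketch (illustrative paths only).
// Referenced from php.ini, for example:
//   opcache.preload=/var/www/example/preload.php
//   opcache.preload_user=www-data
// Errors must not escape this script, or the server will refuse to start.

$files = [
  '/var/www/example/vendor/autoload.php',
  '/var/www/example/web/core/includes/bootstrap.inc',
];

foreach ($files as $file) {
  if (is_file($file)) {
    // Compiles the file into the shared opcache without executing it.
    opcache_compile_file($file);
  }
}

A preload script can also use require_once, which executes the files and resolves class dependencies, whereas opcache_compile_file() only compiles them into the cache.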

Preloading with Drupal

Now that we understand how preloading works, let’s see how it can be used with Drupal. Writing the preload script by hand is difficult as Drupal is a dynamic system. Even if you write a preload script for all the Drupal core files, it may be your contrib modules that are a better fit for this. This is the reason I have written a Drupal module called preloader to make this easy for you.

The module uses a PHP package called darkghosthunter/preloader which does most of the heavy lifting. The package checks the current opcache usage and generates the preload script. The Drupal module brings in Drupal-specific customization (removing files that shouldn’t be preloaded) and a user interface to generate the file. You still have to manually add the preload script to your PHP settings for the changes to take effect. Still, the difficult task of generating the script file listing all the important files is made easy for you.

The workflow is as follows:

  1. Install the module using composer as you normally would.
  2. Configure the module to ignore any additional directories or files if you want.
  3. Restart the webserver and then load some of the pages that are frequently hit on your site.
  4. Go back to the module configuration page and generate the script.
  5. Set the opcache.preload setting in your php.ini file to the generated script path.
  6. Restart your webserver and test.

Gotchas

Since the preload script is executed at the server start and the cache is permanent as long as the process is running, if you change any of the preloaded files, you have to restart the server. For this reason, it is not a good idea to use this functionality during development. The module may remain enabled but the opcache.preload setting should not be set on the developer machines.

Performance impact

In my early tests, I saw an average of 10% improvement with preloading enabled across different percentiles. This test is quite old and I should repeat this on a better machine but the result should still be indicative. Hopefully, I will be able to test this module again this month and even work on some of the issues that have been reported. I will share screenshots or even a video of the test when possible.

This is it for today’s DrupalFest post. See you tomorrow, hopefully.

Apr 03 2021
hw
Apr 03

This post will cover quickly setting up a Drupal website for testing, experimentation, or evaluating features on your local system. While I cover a different set of options briefly, I will mainly talk about a tool we have built to let us quickly scaffold Drupal sites with best practices built in. This post is a part of the DrupalFest series which celebrates 20 years of Drupal. Let’s get started.

Before I go to the main part of the article, let us look at what options you have to quickly set up a Drupal website for evaluation. While all the ways are effective, there are trade-offs typical of each method. We’ll start with the simplest way to experiment with a Drupal site and then keep going up in complexity, but also in the flexibility you get.

Someone else’s machine

The easiest way to set up a Drupal site is in the Cloud, aka, someone else’s machine. Broadly, there are two ways I can think of to do this. The first method involves one of the excellent Drupal hosting providers and signing up for a free account. Drupal.org lists some of the hosting supporters who offer a free account to try out Drupal. Go to this link to get started: https://www.drupal.org/try-drupal.

While the above method lets you try out not just Drupal but also a hosting platform where you might actually run your website, you may not want to create an account just to try out Drupal. In that case, there is SimplyTest.me. This site provides a time-bound sandbox environment where you can quickly try out Drupal or one of the Drupal distributions, with or without modules and themes, and even apply patches. The sandboxes are available for 24 hours and give you complete control of the Drupal site through the UI. You don’t get command-line or SSH access this way, but for quickly testing out a module or a distribution, this is the easiest solution out there. You don’t need an account. Just pick the modules or the distribution and you have a sandbox ready in moments.

Your machine

On your machine, you have a little bit more control over the site but you do need the software required to run Drupal. If you’re only interested in quickly evaluating Drupal on your machine and still have access to the files and command line, the Drupal evaluator guide gives you an option to run this with just PHP. More details and instructions are available at the link: https://www.drupal.org/docs/official_docs/en/_evaluator_guide.html.

The problem with the above method is that it won’t run Drupal in an environment reasonably similar to where you would host it. This is good for evaluating Drupal and quickly editing a few files, but the experience of working with the PHP built-in server and SQLite would not help development. To take it a step further, you need Docker as a prerequisite.

If you have Docker and you already have a Drupal codebase downloaded, then tools such as Lando and DDEV would help you get started in no time at all. If you want to take it a step further, keep reading to the next section.

Finally, you have the classic method of installing Drupal on your machine by manually installing all the software required to run it (that’s PHP, Apache/nginx, MySQL/MariaDB, etc.). I don’t find many developers doing this anymore and it is more trouble than it is worth. If you can’t run Docker for any reason, consider running Drupal in a virtual machine using the excellent DrupalVM.

Axelerant’s template tool

Now to the main section of the post. At Axelerant, we wanted to automate the entire process of setting up a Drupal site quickly, along with the codebase, Lando configuration, some of the best practices we follow, and even CI configuration. I wrote a tool for that called axl-template, which is available via pip (yes, it’s written in Python and needs at least Python 3.6). Once the tool is installed, you can set up a Drupal site along with all the configuration I mentioned in a matter of minutes; I have measured the time from running the first command to having Drupal running, and the longest part is the Drupal installation itself. Here’s a somewhat old video where I use this tool to set up Drupal 9 with Lando.


I have added new features to this tool since I made the video, the most notable feature being able to specify modules and packages right in the first command. The idea is to have your codebase created from a single command and ready to run. I am planning to create a new video to cover these and many other features added to the tool. For now, the documentation is a good reference. Read more at https://github.com/axelerant/axl-template.

If you’re averse to installing a Python package through pip and don’t want to get it from the source code either, you could run it via Docker using whalebrew. Instructions for this are in the README file, but only one of the commands is supported this way.

We’re continuing to improve this tool by adding support for more of the common development tools that are typically used. One example that comes to mind right now is Acquia’s BLT. I can’t promise when that support will come out, but it’s on the cards. In any case, this package is released under the MIT license and contributions are very welcome.

This is it for today and I am just completing this post with less than four minutes to spare to midnight. Have a good DrupalFest everyone.

Jan 25 2021
Jan 25

Newly engineered opportunities have opened the doors for Higher Education institutions to pioneer student, researcher, and funding recruitment. From deeper data applications to mass-scale live debates, the Higher Education sector is going through a digital transformation, with varying rates and approaches.

New data and accessibility regulations, as well as pressure on student recruitment from COVID-19, have required Higher Education institutions to accelerate these 'digital transformation roadmaps'.

Entire organisations have had to react and re-evaluate everything across technology implementation, face-to-face education, student recruitment, and community satisfaction.

The forces of change are gathering at an unprecedented rate. But are universities equipped to make the quality, long-term adjustments needed?

Senior stakeholders from the University of West London, Manchester Metropolitan University, and Oxford Saïd Business School sat down with Paul Johnson, our Drupal and HE Specialist at CTI Digital to discuss their digital challenges and opportunities during a panel at DrupalCon Europe. We received a unique perspective on various UK organisations' challenges with differing cohorts, scale and complexity, age and legacy systems.

Watch the full panel here, and use the time stamps below to navigate:

[embedded content]

00:00 - Introduction with:

  • Chair and top left: Paul Johnson, HE Specialist at CTI Digital
  • Bottom left: Adrian Ellison, Associate Pro-Vice-Chancellor & Chief Information Officer at the University of West London.
  • Top right: Nick Holland, Head of Digital at Manchester Metropolitan University.
  • Bottom right: Iain Harper, Head Of Digital Marketing, Saïd Business School, University of Oxford.

05:29 - Why The University of West London chose to upgrade and continue using Drupal.

09:50 - How Manchester Metropolitan University built the business case to move to Drupal.

13:29 - Oxford Saïd Business School's experience of using Drupal 8 for multiple years.

19:30 - Managing "HiPPO" (Highest Paid Person's Opinion) and different stakeholders' opinions.

22:20 - Data-driven decision making for changes to an existing platform at Oxford.

24:58 - Managing governance for an entire platform change at MMU.

26:58 - Managing change to projects and their teams over multi-year projects.

33:54 - Lockdown and adapting working with staff and students remotely.

37:04 - Content governance and consistency.

38:54 - Designing and building a website for diverse audiences.

41:22 - What features or capabilities should Drupal develop for HE's future?

If you're looking for a digital partner to support your digital transformation, we're the team you're looking for. Our full-service team can take you through discovery and user research to plan and define the underlying requirements that meet your business goals. Our content strategy and development teams will then be on hand to make your digital roadmap a reality, all under one roof, with years of proven success.

Get in Touch

Nov 18 2020

From the consumer perspective, there’s never been a better time to build a website. User-friendly website platforms like Squarespace allow amateur developers to bypass complex code and apply well-designed user interfaces to their digital projects. Modern site-building tools aren’t just easy to use—they’re actually fun.

Anyone who has managed a Drupal website knows the same can’t be said for their platform of choice. While rich with possibilities, the default editorial interface for Drupal feels technical, confusing, and even restrictive to users without a developer background. Consequently, designers and developers too often build a beautiful website while overlooking its backend CMS.

Drupal’s open-ended capabilities constitute a competitive advantage when it comes to developing an elegant, customer-facing website. But a lack of attention to the needs of those who maintain your website content contributes to a perception that Drupal is a developer-focused platform. By building a backend interface just as focused on your site editors as the frontend, you create a more empowering environment for internal teams. In the process, your website performs that much better as a whole.

UX principles matter for backend design as much as the frontend

Given Drupal’s inherent flexibilities, there are as many variations of CMS interfaces as there are websites on the platform. That uniqueness is part of what makes Drupal such a powerful tool, but it also constitutes a weakness.

The editorial workflow for every website is different, which opens an inevitable training gap in translating your site’s capabilities to your editorial team. Plus, despite Drupal’s open-source strengths, you’ll likely need to reinvent the wheel when designing CMS improvements specific to your organization.

For IT managers, this is a daunting situation because the broad possibilities of Drupal are often overwhelming. If you try to make changes to your interface, you can be frustrated when a seemingly easy fix requires 50 hours of development work. Too often, Drupal users will wind up working with an inefficient and confusing CMS because they’re afraid of the complexity that comes with building out a new interface.

Fortunately, redesigning your CMS doesn’t have to be a demanding undertaking. With the right expertise, you can develop custom user interfaces with little to no coding required. Personalized content dashboards and defined roles and permissions for each user go a long way toward creating a more intuitive experience.

Improving your backend design is often seen as an additional effort, but think of it as a baseline requirement. And, by sharing our user stories within the Drupal community, we also build a path toward improving the platform for the future.

Use Drupal’s Views module to customize user dashboards

One of the biggest issues with Drupal’s out-of-the-box editorial tools is that they don’t reflect the way any organization actually uses the CMS. Just as UX designers look to provide a positive experience for first-time visitors to your site, your team should aim for delivering a similarly strong first impression for those managing its content.

By default, Drupal takes users to their profile pages upon login, which is useful to . . . almost no one. Plus, the platform’s existing terminology uses cryptic terms such as “node,” “taxonomy,” and “paragraphs” to describe various content items. From the beginning, you should remove these abstract references from your CMS. Your editorial users shouldn’t have to understand how the site is built to own its content.

Powering Our Communities homepage

In the backend, every Drupal site has a content overview page, which shows the building blocks of your site. Offering a full list that includes cryptic timestamps and author details, this page constitutes a floodgate of information. Designing an effective CMS is as much an exercise in subtraction as addition. Whether your user’s role involves reviewing site metrics or new content, their first interaction with your CMS should display what they use most often.

Manage News interface

If one population of users is most interested in the last item they modified, you can transform their login screen to a custom dashboard to display those items. If another group of users works exclusively with SEO, you can create an interface that displays reports and other common tasks. Using Drupal’s Views module, dashboards like these are possible with a few clicks and minimal coding.

By tailoring your CMS to specific user habits, you allow your website teams to find what they need and get to work faster. The most dangerous approach to backend design is to try and build one interface to rule them all.
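To make this concrete, here is a minimal sketch of one way to send editors to a Views-based dashboard after they log in, instead of Drupal's default profile page. The module name mymodule and the route view.editorial_dashboard.page_1 are placeholders for your own custom module and dashboard view, so treat this as a starting point rather than a drop-in implementation.

<?php

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_FORM_ID_alter() for user_login_form.
 */
function mymodule_form_user_login_form_alter(&$form, FormStateInterface $form_state) {
  // Run our handler after core's so that our redirect wins.
  $form['#submit'][] = 'mymodule_user_login_redirect';
}

/**
 * Submit handler: send the user to a Views dashboard instead of their profile.
 */
function mymodule_user_login_redirect(array &$form, FormStateInterface $form_state) {
  // 'view.editorial_dashboard.page_1' is a hypothetical route name for a
  // Views page acting as the editorial dashboard.
  $form_state->setRedirect('view.editorial_dashboard.page_1');
}

Pair a few role-specific dashboards with a redirect like this, and each group of editors lands directly on the content they actually manage.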

Listen to your users and ease frustrations with a CMS that works

Through Drupal Views, you can modify lists of content and various actions to control how they display in your CMS. While Views provides many options to create custom interfaces, your users themselves are your organization’s most vital resource. By watching how people work on your site, you can recognize areas where your CMS is falling short.

Drupal content dashboard

Even if you’ve developed tools aimed at satisfying specific use cases, you might be surprised by the way those tools are actually used. Through user experience testing, you’ll often find the workarounds your site editors have developed to manage the site.

In one recent example, site editors needed to link to a site page from within the CMS. Without that functionality, they had to find the URL by viewing the source code in another tab and copying its node ID number. Anyone watching these users would find their process cumbersome, time-consuming, and frustrating. Fortunately, the Drupal module Linkit was implemented to eliminate this needless effort.

There are many useful modules in the Drupal ecosystem that can enhance the out-of-the-box editorial experience. Entity Clone expedites the content creation process. Views Bulk Operations and Bulk Edit simplify routine content update tasks. Computed Field and Automatic Entity Label take the guesswork out of derived or dependent content values. Using custom form modes and Field Groups can help bring order and streamline the content creation forms.

Most of the time, your developers don’t know what solutions teams have developed to overcome an ineffective editorial interface. And, for fear of the complexity required to create a solution, these supposed shortcuts too often go unresolved. Your backend users may not even be aware their efforts could be automated or otherwise streamlined. As a result, even the most beautiful, user-friendly website is bogged down by a poorly designed CMS.

Once these solutions are implemented, however, you and your users enjoy a shared win. And, through sharing your efforts with the Drupal community, you and your team build a more user-friendly future for the platform as well.

Sep 23 2020

Working in digital design and development, you grow accustomed to the rapid pace of technology. For example: After much anticipation, the latest version of Drupal was released this summer. Just months later, the next major version is in progress.

At July’s all-virtual DrupalCon Global, the open-source digital experience conference, platform founder Dries Buytaert announced Drupal 10 is aiming for a June 2022 release. Assuming those plans hold, Drupal 9 would have the shortest release lifetime of any recent major version.

For IT managers, platform changes generate stress and uncertainty. Considering the time-intensive migration process from Drupal 7 to 8, updating your organization’s website can be costly and complicated. Consequently, despite a longtime absence of new features, Drupal 7 still powers more websites than Drupal 8 and 9 combined. And, as technology marches on, the end of its life as a supported platform is approaching.

Fortunately, whatever version your website is running, Drupal is not running away from you. Drupal’s users and site builders may be accustomed to expending significant resources to update their website platform, but the plan for more frequent major releases alleviates the stress of the typical upgrade. And, for those whose websites are still on Drupal 7, Drupal 10 will continue offering a way forward.

The news that Drupal 10 is coming sooner rather than later might have been unexpected, but you still have no reason to panic just yet. However, your organization shouldn’t stand still, either.

Image via Dri.es

The End for Drupal 7 Is Still Coming, but Future Upgrades Will Be Easier

Considering upgrading to Drupal 8 involves the investment of building a new site and migrating its content, it’s no wonder so many organizations have been slow to update their platform. Drupal 7 is solid and has existed for nearly 10 years. And, fortunately, it’s not reaching its end of life just yet.

At the time of Drupal 9’s release, Drupal 7’s planned end of life was set to arrive late next year. This meant the community would no longer release security advisories or bug fixes for that version of the platform. Affected organizations would need to contact third-party vendors for their support needs. With the COVID-19 pandemic upending businesses and their budgets, the platform’s lifespan has been extended to November 28, 2022.

Drupal’s development team has retained its internal migration system through versions 8 and 9, and it remains part of the plan for the upcoming Drupal 10 as well. And the community continues to maintain and improve the system in an effort to make the transition easier. If your organization is still on Drupal 7 now, you can use the migration system to jump directly to version 9, or version 10 upon its release. Drupal has no plans to eliminate that system until Drupal 7 usage numbers drop significantly.

Once Drupal 10 is ready for release, Drupal 7 will finally reach its end of life. However, paid vendors will still offer support options that will allow your organization to maintain a secure website until you’re ready for an upgrade. But make a plan for that migration sooner rather than later. The longer you wait for this migration, the more new platform features you’ll have to integrate into your rebuilt website.

Initiatives for Drupal 10 Focus on Faster Updates, Third-Party Software

In delivering his opening keynote for DrupalCon Global, Dries Buytaert outlined five strategic goals for the next iteration of the platform. Like the work for Drupal 9 that began within the Drupal 8 platform, development of Drupal 10 has begun under the hood of version 9.

A Drupal 10 Readiness initiative focuses on upgrading third-party components that count as technological dependencies. One crucial component is Symfony, which is the PHP framework Drupal is based upon. Symfony operates on a major release schedule every two years, which requires that Drupal is also updated to stay current. The transition from Symfony 2 to Symfony 3 created challenges for core developers in creating the 8.4 release, which introduced changes that impacted many parts of Drupal’s software.

To avoid a repeat of those difficulties, it was determined that the breaking changes involved in a new Symfony major release warranted a new Drupal major release as well. While Drupal 9 is on Symfony 4, the Drupal team hopes to launch 10 on Symfony 6, which is a considerable technical challenge for the platform’s team of contributors. However, once complete, this initiative will extend the lifespan of Drupal 10 to as long as three or four years.

Other announced initiatives included greater ease of use through more out-of-the-box features, a new front-end theme, a decoupled menu component written in JavaScript, and, answering the platform's most requested feature, automated security updates that will make it as easy as possible to upgrade from 9 to 10 when the time comes. For those already on Drupal 9, these are some of the new features to anticipate in versions 9.1 through 9.4.

Less Time Between Drupal Versions Means an Easier Upgrade Path

The shift from Drupal 8 to this summer’s release of Drupal 9 was close to five years in the making. Fortunately for website managers, that update was a far cry from the full migration required from version 7. While there were challenges, such as ensuring custom code was updated to use the most recent APIs, the transition was doable with a good tech team at your side.

Still, the work that update required could generate a little anxiety given how comparatively fast another upgrade will arrive. But the shorter time frame will make the move to Drupal 10 easier for everybody. Less time between updates also translates to less deprecated code, especially if you’re already using version 9. But if you’re not there yet, the time to make a plan is now.

Mar 02 2020

As of Drupal 8.7, the Media and Media Library modules can be enabled and used out-of-box. Below, you'll find a quick tutorial on enabling and using these features.

Out-of-box before Media and Media Library

In the past there were two different ways to add an image to a page.

  1. An image could be added via a field, with the developer given control over its size and placement:

     Image field before media library

  2. An image could be added via the WYSIWYG editor, with the editor given some control over its size and placement:

     Image field upload choices screen

This was a very straightforward process, but these images could not be reused, as they were not part of a reusable media library.

Reusing uploaded media before Drupal 8.7

Overcoming image placement limitations in prior versions of Drupal required the use of several modules, a lot of configuration, and time. Sites could be set up to reference a media library that allowed editors to select and reuse images that had previously been uploaded, which we explained here.

This was a great time to be alive.

What is available with Media Library

Enabling the Media and Media Library modules extends a site's image functionality. First, ensure that the Media and Media Library core modules are enabled. 

Enable media library in drupal

A media entity reference field must be used with the Media Library; it will not work with a regular image field out of the box.

Image field on manage display page

On the Manage form display page, select the "Media library" widget.

Media library widget on manage display page

On the "Node Add" and "Node Edit" forms, you’ll see the below difference between a regular image field and a field connected to the media library.

Media library field on node edit

Click on “Add media” and you’ll see a popup with the ability to add a new image to the library or to select an image that is already in the library.

Media field grid

If the field is configured to allow multiple media types, you’ll see vertical tabs for each media type.

Media grid with multiple media types

WYSIWYG configuration

The WYSIWYG editor requires a few steps when configuring the media library for a specific text format. First, a new icon will appear with a musical note overlapping the image icon. This should be added to the active toolbar and the regular image icon should be moved to the available buttons.

wysiwyg toolbar configuration

Under “Enabled filters,” enable “Embed media.” In the filter settings, you can choose the allowed media types and view modes in the vertical tabs. Once that configuration is saved, the WYSIWYG editor will show the same popup dialog for adding a new image to the media library or selecting an already-uploaded image.

wysiwyg media configuration

Once you are on a "Node Add" or "Node Edit" page with a WYSIWYG element, you’ll see the media button (image icon plus musical note).

Media button on wysiwyg editor

Clicking on the media button brings up the same, familiar popup that we saw earlier from the image field:

media library grid

This article is an update to a previous explainer from last year. 

Jan 30 2020

Recently, we were asked if we could integrate some small, one-page websites into an existing Drupal website. This would not only make it easier to manage those different websites and their content, but also reduce the hosting and maintenance costs.

In this article, I will discuss how we tackled this problem, improved the content management experience and implemented this using the best Drupal practices.

First, some background information: America’s Promise is an organization that launches many national campaigns that focus on improving the lives and futures of America’s youth. Besides their main website (www.americaspromise.org), they also had separate websites and domain names for some of these campaigns, e.g. www.everyschoolhealthy.org.

We came up with a Drupal solution where they could easily configure and manage their campaigns. In addition to the convenience of managing all these campaigns from one admin panel, they could also easily reference content items from their main website or other campaigns by tagging the content with specific taxonomy terms (keywords).

We created a new content type, “Campaign,” with many custom paragraph types as the building blocks for creating a new campaign. We wanted this to be as easy as possible for the content editors, but also to give enough freedom that every campaign can have its own branding by selecting a font color and a background image, color, or video.

Below are some of the paragraph types we created:

  • Hero
  • Column Layout
  • WYSIWYG
  • Latest News
  • Newsletter Signup
  • Twitter Feed
  • Video Popup
  • Community Partners
  • Latest Resources
  • Grantee Spotlight
  • Statistics Map
  • Partner Spotlight
  • Media Mentions

These paragraphs offer lots of flexibility to create unique and interactive campaigns. By drag and drop, these paragraphs can be ordered however you’d like.

Below is a screenshot of some of these paragraph types in action, and of how easily they can be configured on the backend.

Every School Healthy paragraphs

Below you can see what the “Hero” paragraph looks like in the admin panel. The editor enters a tagline, chooses a font color, uploads a logo, adds an optional background image or video, and picks a background overlay color with optional opacity.

Campaign Builder Hero Backend

As you can see in the above screenshot, this is a very basic paragraph type, but it shows the flexibility in customizing the building blocks for the campaign. We also created more complex paragraph types that required quite a bit of custom development.

One of the more complicated paragraph types we created is a statistics map. America’s Promise uses national and state statistics to educate and strengthen its campaign causes.

Campaign Builder Statistics Map

The data for this map comes from a Google Sheet. All necessary settings can be configured in the backend system. Users can then view these state statistics by hovering over the map or see even more details by clicking on an individual state.

Campaign Builder Statistics Map Backend
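To illustrate the data flow (this is not the actual America’s Promise implementation), a Google Sheet that has been published to the web as CSV can be fetched with Drupal’s built-in HTTP client. The function name and the assumption that the first row holds column headers are both hypothetical.

<?php

/**
 * Fetches state statistics from a Google Sheet published as CSV.
 *
 * A minimal sketch only: caching and error handling are omitted, and
 * $published_csv_url would come from the paragraph's backend settings.
 */
function mymodule_fetch_state_statistics($published_csv_url) {
  $response = \Drupal::httpClient()->get($published_csv_url);
  $rows = array_map('str_getcsv', explode("\n", trim((string) $response->getBody())));

  // Treat the first row as column headers, e.g. "State" and "Value".
  $header = array_shift($rows);
  $statistics = [];
  foreach ($rows as $row) {
    if (count($row) === count($header)) {
      $statistics[] = array_combine($header, $row);
    }
  }
  return $statistics;
}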

Some other interesting paragraph types we created are:

  • Twitter Feed, where editors can specify a #hashtag and the matching tweets display in a nice masonry layout
  • Newsletter Signup, where editors can select which newsletter campaign the user signs up for
  • Latest News/Resources, where editors can select the taxonomy term used to filter the content

Time to dive into some of the more technical approaches we took. The campaign builder we developed for America’s Promise depends on several Drupal contrib modules:

  • paragraphs
  • bgimageformatter
  • color_field
  • video
  • masonry (used for the Twitter Feed)

Font color and background image/color/video don’t need any custom code; they can be handled by the above modules, with the correct CSS selectors configured on the paragraph display:

Campaign Builder Hero Display

In our custom campaign builder module, we have several custom Entities, Controllers, Services, Forms, REST resources and many twig template files. Still, the module mainly consists of custom field formatters and custom theme functions.

Example: the “Latest News” paragraph only has one field where the editor can select a taxonomy term. With a custom field formatter, we will display this field as a rendered view instead. We pass the selected term as an argument to the Latest News view, execute the view and display it with a custom #theme function.
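A stripped-down sketch of such a formatter is shown below. The plugin ID, the campaign_builder namespace, and the latest_news view and display names are illustrative assumptions, and this version embeds the view’s render array directly instead of wrapping it in the custom #theme function used in the real module.

<?php

namespace Drupal\campaign_builder\Plugin\Field\FieldFormatter;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\FormatterBase;
use Drupal\views\Views;

/**
 * Renders a referenced taxonomy term as the "Latest News" view it filters.
 *
 * @FieldFormatter(
 *   id = "campaign_latest_news",
 *   label = @Translation("Latest news view"),
 *   field_types = {"entity_reference"}
 * )
 */
class LatestNewsFormatter extends FormatterBase {

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];
    foreach ($items as $delta => $item) {
      // Pass the selected term ID as a contextual argument to the view.
      $view = Views::getView('latest_news');
      if ($view && $view->access('default')) {
        $elements[$delta] = $view->buildRenderable('default', [$item->target_id]);
      }
    }
    return $elements;
  }

}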

Conclusion

By leveraging the strength of Paragraphs, other contrib modules, and some custom code, we were able to create a reusable and intuitive campaign builder in which ease of content management was a priority, without limiting the design or branding of each campaign.

Several campaigns built with our campaign builder are currently live.

Could your organization benefit from having your own custom campaign builder and want to see more? Contact us for a demo.

Jan 23 2020

In the Drupal support world, working on Drupal 7 sites is a necessity. But switching between Drupal 7 and Drupal 8 development can be jarring, if only for the coding style.

Fortunately, I’ve got a solution that makes working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development:

  • Apply Xautoload to keep your PHP skills fresh, modern, and compatible with all frameworks and make your code more reusable and maintainable between projects. 
  • Use the Drupal Libraries API to use third-party libraries. 
  • Use the Composer template to push the boundaries of your programming design patterns. 

Applying Xautoload

Xautoload is simply a module that enables PSR-0/4 autoloading. Using Xautoload is as simple as downloading and enabling it. You can then start using use and namespace statements to write object-oriented programming (OOP) code.

For example:

xautoload.info

name = Xautoload Example
description = Example of using Xautoload to build a page
core = 7.x
package = Midcamp Fun

dependencies[] = xautoload:xautoload

xautoload_example.module

<?php

use Drupal\xautoload_example\SimpleObject;

function xautoload_example_menu() {
  $items['xautoload_example'] = array(
    'page callback' => 'xautoload_example_page_render',
    'access callback' => TRUE,
  );
  return $items;
}

function xautoload_example_page_render() {
  $obj = new SimpleObject();
  return $obj->render();
}

src/SimpleObject.php

<?php

namespace Drupal\xautoload_example;

class SimpleObject {

  public function render() {
    return array(
      '#markup' => "<p>Hello World</p>",
    );
  }

}

Enabling and running this code causes the URL /xautoload_example to spit out “Hello World”. 

You’re now ready to add in your own OOP!

Using Third-Party Libraries

Natively, Drupal 7 has a hard time autoloading third-party library files. But there are contributed modules (like Guzzle) out there that wrap third-party libraries. These modules wrap object-oriented libraries to provide a functional interface. Now that you have Xautoload in your repertoire, you can use its functionality to autoload libraries as well.

I’m going to show you how to use the Drupal Libraries API module with Xautoload to load a third-party library. You can find examples of all the different ways you can add a library in xautoload.api.php. I’ll demonstrate an easy example by using the php-loremipsum library:

1. Download your library and store it in sites/all/libraries. I named the folder php-loremipsum. 

2. Add a function implementing hook_libraries_info to your module by pulling in the namespace from Composer. This way, you don’t need to set up all the namespace rules that the library might contain.

function xautoload_example_libraries_info() {
  return array(
    'php-loremipsum' => array(
      'name' => 'PHP Lorem Ipsum',
      'xautoload' => function ($adapter) {
        $adapter->composerJson('composer.json');
      },
    ),
  );
}

3. Change the page render function to use the php-loremipsum library to build content.

use joshtronic\LoremIpsum;

function xautoload_example_page_render() {
  $library = libraries_load('php-loremipsum');
  if ($library['loaded'] === FALSE) {
    throw new \Exception("php-loremipsum didn't load!");
  }
  $lipsum = new LoremIpsum();
  return array(
    '#markup' => $lipsum->paragraph('p'),
  );
}

Note that I needed to tell the Libraries API to load the library, but I then have access to all the namespaces within the library. Keep in mind that the dependencies of some libraries are immense. You’ll very likely need to use Composer from within the library and commit it when you first start out. In such cases, you might need to make sure to include the Composer autoload.php file.

Another tip:  Abstract your libraries_load() functionality out in such a way that if the class you want already exists, you don’t call libraries_load() again. Doing so removes libraries as a hard dependency from your module and enables you to use Composer to load the library later on with no more work on your part. For example:

function xautoload_example_load_library() {
  if (!class_exists('\joshtronic\LoremIpsum', TRUE)) {
    if (!module_exists('libraries')) {
      throw new \Exception('Include php-loremipsum via composer or enable libraries.');
    }
    $library = libraries_load('php-loremipsum');
    if ($library['loaded'] === FALSE) {
      throw new \Exception("php-loremipsum didn't load!");
    }
  }
}

And with that, you’ve conquered the challenge of using third-party libraries!

Setting up a New Site with Composer

Speaking of Composer, you can use it to simplify the setup of a new Drupal 7 site. Just follow the instructions in the Readme for the Composer Template for Drupal Project. From the command line, run the following:

composer create-project drupal-composer/drupal-project:7.x-dev --no-interaction

This code gives you a basic site with a source repository (a repo that doesn’t commit contributed modules and libraries) to push up to your Git provider. (Note that migrating an existing site to Composer involves a few additional considerations and steps, so I won’t get into that now.)

If you’re generating a Pantheon site, check out the Pantheon-specific Drupal 7 Composer project. But wait: The instructions there advise you to use Terminus to create your site, and that approach attempts to do everything for you—including setting up the actual site. Instead, you can simply use composer create-project  to test your site in something like Lando. Make sure to run composer install if you copy down a repo.

From there, you need to enable the Composer Autoload module, which is automatically required in the composer.json you pulled in earlier. Then, add all your modules to the require portion of the file or use composer require drupal/module_name just as you would in Drupal 8.

You now have full access to all the Packagist libraries and can use them in your modules. To use the previous example, you could remove php-loremipsum from sites/all/libraries, and instead run composer require joshtronic/php-loremipsum. The code would then run the same as before.

Have fun!

From here on out, it’s up to your imagination. Code and implement with ease, using OOP design patterns and reusable code. You just might find that this new world of possibilities for integrating new technologies with your existing Drupal 7 sites increases your productivity as well.

Dec 09 2019

With Drupal 9 set to be released next year, upgrading to Drupal 8 may seem like a lost cause. However, beyond the fact that Drupal 8 is superior to its predecessors, it will also make the inevitable upgrade to Drupal 9, and future releases, much easier. 

Acquia puts it best in this eBook, where they cover common hangups that may prevent migration to Drupal 8 and the numerous reasons to push past them.

The Benefits of Drupal 8

To put it plainly, Drupal 8 is better. Upon its release, the upgrade shifted the way Drupal operates and has only improved through subsequent patches and iterations, most recently with the release of Drupal 8.8.0.

Some new features of Drupal 8 that surpass those of Drupal 7 include improved page building tools and content authoring, multilingual support, and the inclusion of JSON:API as part of Drupal core. We discussed some of these additions in a previous blog post.

Remaining on Drupal 7 means hanging on to a less capable CMS. Drupal 8 is simply more secure with better features.

What Does Any of This Have to Do With Drupal 9?

With an anticipated release date of June 3, 2020, Drupal 9 will see the CMS pivot to an iterative release model, moving away from the disruptive major-version rebuilds that have made full migrations necessary in the past. That means that migrating to Drupal 8 is the last major migration Drupal sites will have to undertake. As Acquia points out, one might think “Why can’t I just wait to upgrade to Drupal 9?” 

While migration from Drupal 7 or Drupal 8 to Drupal 9 would be essentially the same process, Drupal 7 goes out of support in November 2021. As that deadline approaches, upgrading will only become an increasingly pressing necessity. By migrating to Drupal 8 now, you avoid the complications that come with a hurried migration and can take on the process incrementally. 

So why wait? 

To get started with Drupal migration, be sure to check out our Drupal Development Services, and come back to our blog for more updates and other business insights. 
 

Oct 06 2019
hw

One of my sites has a listing of content shown as teaser. The teaser for this content type is defined to show the title, an image, and a few other fields. In the listing, the image is linked to the content so that the visitor may click on the image (or the title) to open the content. All this is easily achievable through regular Drupal site building.

Drupal Manage Fields Page

I wanted to change the functionality so that clicking on the image would open the content in a new tab. This is easy for the title, as the title is linked right from within the template (node--content-type--teaser.html.twig). I just have to add the target="_blank" attribute to the <a> tag for the title and I am done. Doing this for the image is not so easy.

Challenges with the Image Formatter

The reason it isn’t easy for the image is the image field formatter provided by the Drupal core. It provides an option to render the image as a link and link it to either the content or the file itself, but no option to open it in a new tab.

Image Formatter Options

Digging through the code for the image formatter, I found the template that drives it at image-formatter.html.twig. This template also does not help us much directly, as we cannot conditionally modify it only for teasers of certain content types, as I wanted. If I override this file, it will affect the image formatter everywhere.

I stumbled upon an approach while searching for solutions, hoping I would run across an issue in Drupal core that would add this feature. I would simply apply the patch and voila: my problem would be solved, I would give feedback on whether it worked, and open source wins. Unfortunately, there was no such issue, but I found something related: Image formatter does not support URL/Link options.

Sidenote: I could have created a feature request to add this feature and maybe I’ll do it when I get a chance. Today, I was just eager to get this working.

Back to the issue. I see there is a change in the template to use the link function rather than just writing an <a> tag. This is not very helpful to me on its own. But the issue summary described my problem, and it is strange that the patch (which was committed) does not fix it. So, I dug in more. I read the test in the patch, which showed me the approach I could use to implement it.

Solution

Since my solution depends on the above patch, it would work only with Drupal 8.7.4 or later. The test in the patch added attributes to the link by accessing the build array and directly setting the attribute on the URL object that is passed to the template. I realised I could do the same with template_preprocess_node.

Usage of template_preprocess_node
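A sketch along those lines, with mytheme, article, and field_image standing in for the real theme, content type, and field names, looks like this:

<?php

use Drupal\Core\Render\Element;
use Drupal\Core\Url;

/**
 * Implements hook_preprocess_node().
 */
function mytheme_preprocess_node(&$variables) {
  // Only act on teasers of the content type in question.
  if ($variables['node']->bundle() !== 'article' || $variables['view_mode'] !== 'teaser') {
    return;
  }

  // Loop through the image field items and set target="_blank" on the URL
  // object that the image formatter passes to its template.
  foreach (Element::children($variables['content']['field_image']) as $delta) {
    $url = $variables['content']['field_image'][$delta]['#url'] ?? NULL;
    if ($url instanceof Url) {
      $url->setOption('attributes', ['target' => '_blank']);
    }
  }
}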

In the code sample above, I am targeting the specific content type and display mode (teaser) I want to modify. I use Element::children to loop through all the items in the field and access the URL object. I directly call setOption on the URL object to set the attribute target="_blank".

It might seem a lot of work but this is the fastest way I know (and minimal code change). If you know of a better way, or of the Drupal core issue that would have given this option, please let me know in the comments. I hope this post was useful. Thanks for reading.

Jan 28 2019

Begin with the end in mind—defining our goals

Our collaboration with South Dakota State University’s (SDSU) outreach arm, SDSU Extension, began by defining the user experience and branding issues that the previous site had. The visual design was in need of an update, the team wanted to make information easier for people to find, and mobile users were forced to view the desktop version of the site.

With these issues defined, we put together a series of goals that fell into two major groups—user experience and branding. For the user experience goals, we defined a user-centered approach to ensure that the work we were doing was going to help people using the site engage more with the site and more easily find what they were looking for. For the branding goals, we wanted an improved, modern look and feel that felt like a part of the larger South Dakota State University brand.

Creating a palette to work from (e.g. creating Style Tiles)

Every design project at Four Kitchens starts with a visual alignment in the form of style tiles, a design deliverable showing colors, fonts, and elements that helps create a common visual language for the project.

These are presented to everyone using InVision Freehand so that as we discuss the options we can add notes directly on the style tiles. For SDSU Extension we had two rounds of style tiles, landing quickly on one that we all agreed was the right direction.

Figuring out what we’ll need (e.g. wire-framing all the things)

Design systems are all the rage in the industry and with good reason. They allow projects to move more quickly by having a library of reusable parts that are ready to go. So at this point in the process for SDSU Extension, it was time to define what those parts needed to be.

We did this by reviewing the current site and discovery document to suss out what was going to be important for the new site. As a group, Four Kitchens and SDSU Extension held discussions to detail what sorts of things would be vital and what would be nice-to-haves.

From there we worked up a series of wireframes that showed both a component library—a page with every possible thing on it, like cards, quotes, and video callouts—and a few samples of how the new pages could be assembled from these parts.

This process worked out the kinks for trickier components, like the many-levels-deep navigation on mobile, while minimizing effort. The cycle of posting, reviewing, and implementing feedback was quick, leading us to a final collection of wireframes.

Making it come to life (e.g. comps)

As soon as wireframes were approved we moved into the next step—breathing life into them. We took the visual language that was defined in the style tile and applied it to the wireframes. The designs included all of the components at small, medium, and large screen sizes.

These components were then quickly assembled into mock pages to show what they would look like when the site was done. Having a wealth of work already done in the form of style tiles and wireframes, we hit on the right direction quickly. Once the first few comps were finalized there was a flood of comps as we built them out faster and faster using previously approved components.

A great collaboration

Working with SDSU Extension on this project was marvelous and we’re happy that it is live and shared with the rest of the world.

Dec 10 2018

Zivtech is happy to be offering a series of public Drupal 8 trainings at our office in downtown Philadelphia in January 2019. 

Whether you consider yourself a beginner or expert Drupal developer, our training workshops have everything you need to take your Drupal skills to the next level. 

Our experience

The Zivtech team has many years of combined expertise in training and community involvement. We have traveled all over the world conducting training sessions for a diverse range of clients, including the United States Department of Justice, the Government of Canada, CERN, Howard Hughes Medical Institute, Harvard University, and more. 

We pride ourselves on educating others about open source, and attendees leave our trainings with the knowledge to build custom Drupal sites, solve technical issues, make design changes, and perform security updates all on their own. We also offer private, onsite trainings that are tailored to your organization's specific needs. 

Our public Drupal trainings for January 2019 include:

Interested in learning more about our upcoming trainings? Click here. You can also reach out to us regarding multi-training and nonprofit discounts, or personalized trainings. 

We hope to see you in January!
 
