Aug 30 2021

open waters podcast logo

In this episode, we're joined by Damien McKenna to talk about open source security and how that has changed over the years in Drupal core as well as in newer versions of Drupal. Damien directs internal initiatives that strengthen Mediacurrent’s commitment to open-source principles, collaboration, and sharing knowledge to strengthen the Drupal community and is regularly ranked as one of the ten most active contributors on drupal.org.

Episode Transcript

Mark Shropshire: Hi, I'm Mark Shropshire and with me is my cohost Mario Hernandez. We are very excited about this episode of the Open Waters podcast as we welcome Damien McKenna, a prolific Drupal contributor and community leader. We will be talking about open source security, specifically the ins and outs of Drupal security.

Mario Hernandez: Thanks, Shrop. Thank you very much, everyone, for joining us, and a special thanks to Damien for joining us today to talk about security and contribution to the Drupal project. So Damien, before we get into the technical aspects of your role, why don't you tell us a little bit about your role at Mediacurrent and your involvement in open source?

Damien McKenna: Thank you for having me. So I've been involved in Drupal since 2007, but open source as a whole since around about 2000. I got into it because I had been building my own websites, that expanded into building custom content management systems, and I realized I couldn't be the only person who wanted a simple CMS to manage their site's content or their clients' content. I discovered, hey, there's a whole bunch of people uploading complete open source PHP projects, as they were at the time, to the net on SourceForge. I could just download one and customize it how I needed.

Over the years, I saw a number of patterns emerge with open source communities. A big one was that for any given software project, if it didn't have a security team put together fairly early on, it would need one fairly soon. I saw so many projects that didn't have decent security practices, and they eventually ran into a major security problem and had to scramble together the processes to deal with it properly. So I've been very glad that Drupal started its security team fairly early on, and it has been continuing ever since. At Mediacurrent, I juggle being a backend developer and project lead along with leading, or cat herding, some internal initiatives around open source software and contributing. We try to have a contrib-first process so that we are contributing improvements back to open source communities and software when it's suitable, rather than leaving projects to have to maintain more and more custom code and documentation.

Mario: I'm gonna go with a follow-up question on that, because what you just said there is very important, Damien, the contrib first. So does that mean when you are tasked with analyzing the project that you're about to work on, you figure out ways in which maybe through this project you can contribute to the Drupal project or to open source in general? Or how does that work?

Damien: Exactly. So there are lots of different situations where it comes up. One example was a few years ago: one of our clients requested documentation on how to set up their site to be able to pull in tweets from their Twitter account. And it was just when Twitter had changed the API process. You used to be able to just pull an RSS feed of the tweets, and they changed it to needing to set up a developer account and a custom application in their system.

And it was a bit elaborate, and you had to do it from the account that was pulling the tweets because of security best practices. You didn't want to be sharing around passwords for everybody to log into your Twitter account and accidentally post a screenshot or a picture of a meal out or being with some friends to a business account. So we put together documentation on how to go through the steps, with screenshots and everything, but instead of putting it into a document and sending that to them, we uploaded it to the documentation section for the project that was being used to pull down the tweets so that everybody could have the same documentation. It was the same amount of work, it was just putting it in one place versus another. So the client appreciated it, they had all of the details they needed, and then everybody else in the world who ran into the same situation could benefit from the same work.

Shrop: Yeah, that's a fantastic example. I know you have a lot of examples of contrib first, Damien, and personally you're a mentor to me, and I appreciate all the guidance you've given me over the years, and continue to, on how to work in open source overall, but also in the Drupal community. So I just want to say that I appreciate you and everybody else that does that for the community. It's really great. I want to jump a little bit into your involvement with the Drupal security team, since we're talking about open source security today, and I think that's an interesting aspect here. What's it like being on the Drupal security team from a day-to-day standpoint? I know there are lots of things that you have to embargo until advisories come out, but what's it like day to day? What does it look like?

Damien: Sure. So we have different processes that we've, for the most part, documented on drupal.org/security-team. There are lots of details in there, but we have a team of, I forget if it's 20 or 30 odd people, and one step, I guess an initial starting point, is that we take turns being the initial point of contact. And as it happens, I'm on duty for the next two weeks, so that's timely. What that means is when somebody opens a security issue for a project, module, theme, or even Drupal core, we're the first person to respond to that, to review what is submitted, and to try and do a little bit of analysis: maybe the person just had one of the settings misconfigured. Maybe they have the PHP module enabled and they're inadvertently letting anybody write PHP code on their website.

Not that that has ever happened, but we do an initial bit of triage on the issues that are opened and then get the maintainers involved, presuming that it requires it, presuming it's a legitimate issue. And then we try to guide the conversation between the maintainers and the person who reported the issue and potentially anybody else who's pulled into the conversation. So we might have an issue that is focused on a contrib module, but it might, say, affect core, or affect somebody in the Backdrop community as well, because there's a lot of overlap between the two. Occasionally there might be an identical issue for a similar module, and we try to get the conversation focused on moving towards a solid, reliable fix, and then work with the maintainers to prep for release. We also have a mailing list where people can post questions.

And that is often used as a step towards having a proper issue in our security issue queue. So whoever's on duty is the first person to respond to questions when they come in.

Shrop: How many of those do you think you get in a given week?

Damien: On average, not too many new ones each week. One thing that I found is that sometimes, especially in the contrib world, somebody will spot, hey, there's this one situation where this module leaves a security vulnerability or vector open, and they'll go and check some other modules that have, say, similar functionality or similar approaches, and they might discover, say, five modules have basically the same bug. But that doesn't happen very often.
Mario: What would you say is the most common security issue in the Drupal open-source world?

Damien: So the most common one is cross-site scripting. Going back at least a few years, there was an analysis published by Greg Knaddison, who wrote the book Cracking Drupal for Wiley, and he went through and cataloged it all, and cross-site scripting was the most common by a good margin. Basically, the output from a field or some data input was not being properly filtered during display, so that would leave it open for somebody to, say, throw in some JavaScript that would then get executed when the page was viewed. So that one comes up quite often, and there are some relatively simple means of fixing those bugs. The biggest issue I've found is forgetting whether something might already be filtered elsewhere in your code by the time you get to this one piece.

So sometimes you've got a settings form, and it might be available to people who aren't administrators, and you might forget to filter the output, because somebody who's an administrator of the site is kind of assumed not to cause problems for that site. Generally, a company's employees don't go bananas and start throwing filing cabinets around. Likewise, they tend not to start throwing really bad JavaScript into a site for the purposes of stealing data, et cetera. It's technically possible, and I'm sure it has happened, but it tends not to happen. So when you're writing code for a site or for a project or for a module, you might forget that in this one situation you're outputting a variable or a configuration object or something, and you might forget to filter that one time, because when you're testing it, you never try throwing in JavaScript, because what silly person would do that? But from the point of view of trying to keep it secure, you need to cover the situations where somebody might want to do that.
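As a concrete illustration of the output filtering Damien is describing, here is a minimal sketch of how Drupal 8/9 render arrays handle escaping; the field and variable names are hypothetical, not from the episode:

<?php

use Drupal\Component\Utility\Html;

// Untrusted text, e.g. loaded from a field or a configuration object.
$value = $node->get('field_subtitle')->value;  // Hypothetical field name.

// Safe: '#plain_text' is escaped automatically during rendering.
$build['subtitle'] = ['#plain_text' => $value];

// Riskier: '#markup' is only run through a limited admin-tag XSS filter,
// so never hand it raw user input you have not sanitized yourself.
$build['subtitle'] = ['#markup' => $value];

// Outside render arrays, Html::escape() converts the text to HTML entities.
$escaped = Html::escape($value);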

Shrop: If you can execute JavaScript on a site, it's pretty powerful.

Damien: Yes. It's not as bad these days, because Drupal core in eight and nine doesn't have the PHP module in core anymore. It was a lot more dangerous when the PHP module was in core, because there were so many situations where people inadvertently had that available on all text fields, including, say, a comment box for submitting a comment to the webmaster or site owners. Yeah.

Shrop: Oh, wow. Yeah, you can think of all kinds of terrible situations that could come with that, like just reading the settings file in PHP or something. Well, switching gears just a bit, because I'm always interested in this: as you know, on our teams at Mediacurrent we have a security channel in Slack, and when a security advisory comes out, we're always wanting to read it and dissect it as a team and figure out, all right, what are our next steps? Are there mitigations, and things like that? I'd love to hear from your standpoint, Damien, with your expertise around this: tell us a bit about those Drupal security advisories and the details they contain. I know it's well-documented, but there is a lot of information in these advisories, right?

Damien: So there's always a balance between giving enough information and giving too much because you don't want to give people who want to write scripts to attack sites step-by-step instructions on how to do that. You just want to kind of give enough clues that people who are maintaining sites can tell whether or not this is going to affect them right now, and that they need to update right now ahead of anything else they're doing versus, say, it can be just included in the next scheduled deployment.

That could be a week or three from when it comes out. The security advisories will try to give at least some hints on what causes the vulnerability and then, if there are any available, some details on how it can be mitigated. Sometimes the security problem only happens if you turn off certain options, or if the person does not have a particular permission, so we'll indicate in the advisory any workarounds that might exist. Oftentimes there won't be obvious workarounds that are feasible, so the only step is to just update the module or theme or whatever it is, or core. We used to do one security advisory per project per week, that is, per week that there was a security update. So if there were multiple vulnerabilities, they would all be lumped into one security advisory. That policy was changed so that every individual security issue gets its own advisory, and that was to fit with industry standards, because Drupal was, I think, just about the only major project still doing it that way.

Shrop: But from my standpoint, it's made it easier for me to comprehend what's going on. I can, you know, read one, focus on it: all right, how does that apply? Are there any mitigations? And then you just move on to the next one. So that's been great from my standpoint.

Mario: Switching gears a little bit here, Damien. We know that with the release schedule that Drupal has in place, there are a lot of benefits, and people have seen things improve once the new release schedule was implemented. But can you tell us how this release schedule has helped with security in the Drupal project?
Damien: The schedule has really helped from the point of view of having a plan ahead of time, so that you know on certain days of the month or certain months of the year you're going to have security updates available.

Damien: So, every Wednesday. Starting with the simple thing: all of the security releases are done on Wednesdays, almost always. There can be the occasional out-of-schedule update when, say, a third-party library has an update and we have to scramble to do a core update on a different day, but almost always they're on Wednesdays. And the core updates are done on one set day, separate from contrib updates. So one day a month, there's a window for core to have security updates, and if there isn't one on that day, then there won't be one that month, again, unless there's some other issue with a third-party library. And so site maintainers or site owners have a bit of relief knowing that there almost certainly isn't going to be a major security release dropping at any time of the week, any time of the day, so they can rest easy and just go about their normal business the rest of the week, the rest of the month, and just plan that on a Wednesday afternoon, timezone depending, they will have to do some security updates.

Mario: Good. So what's the story with Drupal eight reaching end of life? What's going on there?

Damien: Well, it always has been a standard thing that when a new major release of Drupal core comes out, the previous release goes into a kind of security-updates-only mode and the version before that is marked as end of life, so there won't be any more security updates for it, and pretty much at that point the community stops maintaining core and contributed modules and themes for that version. In practice, it historically meant that when Drupal seven came out, Drupal five was no longer given security updates and support. When Drupal eight came out, I think it was like eight years after Drupal six had been released, and if I remember correctly, they gave an extra six months of support just to help bridge the gap of time for sites to upgrade, because the development cycle from seven to eight had been so long.

But then, when Drupal nine came out, Drupal seven was still around and still had support. When Drupal nine came out, the majority of Drupal sites were still on Drupal seven; I forget if it was like two thirds or so of the Drupal sites in the world that were still using Drupal seven. So we couldn't just say no, Drupal seven is not going to have any more support, so it was given extra support. Then the plan is that Drupal ten is going to be released next year, and so there's the knock-on effect that you need time to upgrade from core version to core version. And so Drupal eight will reach end of life this year. The nice thing is that upgrading from eight to nine is a very, very simple process, especially in comparison to previous versions.

We've done some upgrades where a site was fully up to date on contrib modules and was able to be upgraded from eight to nine with just a few hours of work, and most of that was just poking at Composer to get the right combination of dependencies.

Mario: You're saying that Drupal seven's end of life is actually going to happen after Drupal eight's end of life, yes?

Damien: Yes. So Drupal eight's end of life is tied to the dependencies it has, and its dependencies will run out this year. I think the big dependency that is causing it is Composer.

Shrop: I think Symfony is part of it, too.

Damien: Composer, Symfony. Yes.

Mario: So these end of life decisions are made mainly based on those dependencies. In the case of Drupal seven, it was probably because of the adoption that Drupal seven had gotten, yeah?

Damien: The reason for continuing to support Drupal seven was because of the volume of sites out there that had not upgraded yet, and because there wasn't the technical requirement; the dependencies for Drupal seven were still continuing for longer. Drupal seven had fewer third-party dependencies than Drupal eight, so it was easier to say, yes, we're just going to continue it.

Mario: And in the future, as the dependency lifecycles get shorter and shorter, the lifespans of future versions may become shorter than they are now, perhaps, especially if the upgrade path is so seamless that there comes a time when it's just a matter of installing the next update and now you're on the next version.

Damien: Yeah. So keeping a site up to date with the latest version of core for the major release that you're using, and keeping up with all of the contrib module updates, will ultimately mean that it doesn't matter if it's six months or three years between major versions; if it's up to date and you keep it current, it should be a relatively straightforward process. So Drupal eight will hit its end of life this November. Drupal seven has an extra year of support, to November '22, so there will be more time for sites to finish upgrading to Drupal nine at that point.

Shrop: I'm glad you mentioned that. That'll be the recommended path, just to go straight to nine. You won't even have to go through eight at that point, or actually even today. I think Drupal ten is in June next year, so it's moving fast. And I'm glad you mentioned, Damien, that the upgrade paths are smoother. That's not to say a complex site couldn't have some complications, especially if you haven't kept things up to date, but for any of us who've been in the community a long time, you really don't have to approach it like we're going to build a new site and then migrate all the content. Now they're actually true upgrades, and there are migration paths, and it's really great.

Damien: Yeah. So once you get onto Drupal eight plus, it's more of a traditional upgrade rather than a rebuild. That was the last major rebuild-style upgrade. Yeah.

Shrop: So get on the new Drupal train then. And I think a train is a good visual representation that the community's used to illustrate end of life for some of these: the train tracks end at a certain point, and you want to be on the track that keeps moving forward. Damien, what kinds of Drupal security resources do you think folks listening should know about?

Damien: So the most important one, I would say, is the security advisory newsletter that everybody can subscribe to from their profile on Drupal.org. There's, if I remember correctly, a nice checkbox for that. The security advisories are also available through an RSS feed, and they're also published on Twitter through the website. There are separate feeds for core, contrib, and public service announcements.

So depending on how you like to get your security updates, do all three. Then there's also, as mentioned before, documentation on the security team's processes, and under the Drupal eight documentation, there are pages that describe best practices for writing secure code. So you can avoid security problems while you're building the site rather than dealing with them later. There are a number of books out there, most notably the Cracking Drupal book by Greg Knaddison mentioned earlier. It was written before Drupal eight came out, so it's a bit out of date, but the general practices are still reasonable. And for every function it mentions, you can find the equivalent Drupal eight or nine replacement through documentation online.

Shrop: Well, Damien, we really appreciate your time on the podcast today and for coming on and talking specifically about Drupal security, but also about a lot of these broader concepts, like you were mentioning with Greg's Cracking Drupal book. Security is evergreen. It does change over time, and of course there are new things we learn about securing our systems, but some of those old things, like cross-site scripting, are still out there. They probably will be for a long time.

Mario: They don't go away.

Shrop: Yeah, exactly.

Mario: Well, Damien, thanks again for joining us today and sharing with us your expertise when it comes to security and the Drupal project. I also personally want to thank you for always being available and helping us with any security or contrib questions that we have. You recently helped me contribute to a project, and that was pretty rewarding to be able to do that.

Damien: You're very welcome.

Shrop: Thanks for listening. You can find us at mediacurrent.com/podcast and subscribe with your favorite podcast app. If you enjoyed this episode, share it with your friends and tag @Mediacurrent on Twitter. For more resources on open source technology, web design, and web strategy, visit us at mediacurrent.com.


Aug 26 2021

On our themes BizReview and Listiry, now available for Drupal 9, we had to solve the common problem of displaying nodes on maps.

Here is what we have used in our themes and have verified is working. Please see the result below:

Demo of maps on Drupal

Now please follow our tutorial on how to display maps in Drupal with the Geofield, Address, and Geocoder modules:

1. Install Drupal and Geofield

Install Drupal core:

composer create-project drupal/recommended-project my_site_name_dir

cd my_site_name_dir
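If you have not installed the site itself yet, you can run the web installer in your browser or, assuming Drush is added to the project, install from the command line (the database credentials below are placeholders):

composer require drush/drush

vendor/bin/drush site:install standard --db-url=mysql://dbuser:dbpass@localhost/drupal --account-name=admin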

Install the Geofield and Geofield Map modules:

composer require drupal/geofield drupal/geofield_map

Enable them all.
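You can enable them on the Extend admin page (/admin/modules) or, if you use Drush:

vendor/bin/drush en geofield geofield_map -y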

2. Create content types and fields

Now please create a new content type, "Listing" for example, and add a new field of type Geofield.

Go to Manage form display, and set the geofield's widget to Geofield Map

Set Geofield's manage form display to Geofield Map

Go to Manage display, and set the geofield's display to Geofield Google Map

Now you can create a test node and enter the latitude and longitude as below:

Lat: 40.703081
Long: -73.935658

Save it and you can view the newly created node like this.

Demo of maps on a Drupal node
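For reference, Geofield stores each value as WKT with longitude first, so if you ever need to create such a node in code, a rough sketch looks like this (assuming the content type's machine name is listing and the geofield is field_geofield):

<?php

use Drupal\node\Entity\Node;

// Create a Listing node with the same coordinates as above.
// Geofield expects WKT, and WKT puts longitude before latitude.
$node = Node::create([
  'type' => 'listing',
  'title' => 'Test listing',
  'field_geofield' => [
    'value' => 'POINT (-73.935658 40.703081)',
  ],
]);
$node->save();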

3. Autocomplete address:

As you can see, we can't just enter coordinate data (latitudes and longitudes) every time we create nodes. How can we know the coordinates of an address?

The user-friendly solution is that when we type an address, it is automatically translated to coordinates.

And that's where we need to use the Google Places API to achieve that.

You need to register a free Google API key; please follow the guide here: https://developers.google.com/maps/gmp-get-started#enable-api-sdk

Enable Place APIs on Google Maps APIs

Note: you need to enable the Geocoding, Maps, and Places APIs in your Google Cloud Console.

After you enter a valid key at /admin/config/system/geofield_map_settings, go to Manage form display and enable Address Geocoding on the geofield.

enable Address Geocoding of geofield

Then you can start to enter addresses and they will be converted to coordinates automatically.

Address autocomplete on Drupal

The address info you type in is only shown on the node edit form. So if you want to store it and display it on the node page, please do the following:

Create a new field of type plain text, field_geoaddress for example.

On Manage form display, edit the geofield's settings, set Geoaddress Field to field_geoaddress, and select the option to hide it from the node form.

Set geoaddress for Geofield

The new field_geoaddress will keep the address you type in, so you can display it as you wish.

4. A complete Address field

If you want a complete address field with Street, Neighborhood, City, State, and Country fields, you will need the Address module.

Install and enable Address module:

composer require drupal/address

Now create a new field of type Address on your content type. Then you can add complete address info to your nodes.

Address module on Drupal

5. Address to Geofield:

How can we automatically convert the address field to coordinates in the Geofield? We need the Geocoder module:

Install the module:

composer require drupal/geocoder

Install the Google Maps and ArcGIS Online PHP provider packages:

composer require geocoder-php/google-maps-provider geocoder-php/arcgis-online-provider

Tip: "geocoder-php/arcgis-online-provider" is free (no API key required), so it is a good provider for testing.
"geocoder-php/google-maps-provider" is good if you have an API key.

Enable all modules and submodules.
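Under the hood, these packages are providers for the framework-agnostic geocoder-php library, so you can also try them on their own; here is a rough sketch, assuming a PSR-18 HTTP client such as Guzzle 7 is installed alongside them (older geocoder-php releases expect an HTTPlug adapter instead):

<?php

use Geocoder\Provider\ArcGISOnline\ArcGISOnline;
use Geocoder\Query\GeocodeQuery;
use Geocoder\StatefulGeocoder;
use GuzzleHttp\Client;

// ArcGIS Online needs no API key, which makes it handy for a quick test.
$provider = new ArcGISOnline(new Client());
$geocoder = new StatefulGeocoder($provider, 'en');

$result = $geocoder->geocodeQuery(GeocodeQuery::create('1600 Pennsylvania Ave NW, Washington, DC'));
$coordinates = $result->first()->getCoordinates();
echo $coordinates->getLatitude() . ', ' . $coordinates->getLongitude();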

Configure providers at /admin/config/system/geocoder/geocoder-provider: add Google Maps and ArcGIS Online as the providers.

Config Geocoder providers

After that, please go to Manage fields, edit the geofield, and set it to geocode from the existing address field.

Config Geocoder for Geofield

When you are done with that, every time you create a node with address info, Geocoder will automatically convert it to coordinates so it can be displayed on maps.

Have fun!

Jul 27 2021

open waters podcast logo

Open Waters Podcast: Season 2, Episode 4 

In this episode, we're joined by Randy Fay from DDEV to discuss the features and benefits of DDEV, the future of the project, and how you can get involved with contributing to the Drupal community. Randy provides his perspective on DDEV and the ways in which both developers and non-developers can use it for projects.

Episode Transcript

Mario Hernandez: Hello, everybody, welcome. We are very excited here. Shrop and I are very excited to have with us Randy Fay, who will be talking to us about DDEV and what's coming down the pipe with that. We are very excited about that because we here at Mediacurrent use DDEV for most of our projects, and we are loving the product and looking forward to what's coming ahead. So thanks for joining us, Randy. We appreciate it. We just had you do a knowledge share for our development team not too long ago; it was just last week, I believe. And now you're back with us. So we appreciate you taking the time from your really busy schedule to talk about DDEV. There is some great news happening with DDEV, but before we get into that, tell us a little bit about your role today and how you became interested in open source. What brought you to this point as a prolific maintainer of a key piece of open-source software like DDEV?

Randy Fay: You bet, I'm glad to be here. Again, my name's Randy Fay, I'm rfay most places, Randy Fay some places, and I live in far western Colorado, right near the Utah border, in a little town called Palisade. And I ended up here because my wife and I took a long bicycle journey through the Americas from 2006 to 2009, and at the end of that time my parents were getting old and we decided to come take our turn helping them out. Our turn has ended up being quite long now, but that's where they were and that's where we are. But the open source thing: of course, when we took two and a half years out of our life to do bicycle touring, there was going to be a change when we got back. I've worked in various kinds of software development since the early eighties, but of course, like everybody else, I got more and more excited about web stuff, and I had benefited from open source over the years.

Randy Fay: I've done lots of Linux work over the years and all that kind of thing, so I benefited from open source but never contributed at all. And so one big thought was, well, how could I figure out what this open source thing is? And so when we were restarting our life after our bike tour, Drupal was there. I felt like I had some idea what was going on with Drupal, and I said, well, maybe I could figure out how to contribute to Drupal. And you know how Drupal is: if you stick your finger in there and you figure out who the players are, you will be sucked in, and you can spend the rest of your life just working on that. And I did that for a couple of years and had a great time, met wonderful people, and got very involved in the Drupal community. It was Drupal that brought me into the open-source world.

Randy Fay: So anyway, so I did that for a long time and then I burned out pretty good. I even did a whole series of sessions at a Drupalcon and stuff like that on burnout. And I got flamed out pretty good. Couldn't do it anymore. Worked on various things for a few years, but about maybe five and a half years ago, something like that, I went to work for a company named DRUD that was developing a Golang Kubernetes hosting product. And they also had DDEV going, although it was very much in its infancy, and I really liked DDEV. I liked it a lot more than the hosting part. Maybe because, I don't know, I just, I just liked it a lot. And I started to feel like it was useful and they were amazingly generous. They just let me go ahead and maintain DDEV for, it must have been at least four years and paid me to do it and rarely asked anything in return. Every so often they'd want some kind of integration with the process and that kind of thing. But that's how I ended up maintaining DDEV and got to continue doing that through their amazing generosity. That was a long answer to a short question.

Mark Shropshire: That's great. I love hearing these stories about how people get to where they are today, because it's also humbling to hear them. You know, Randy, when I've interacted with you, it just amazes me how quickly you respond and support the community, and you've just always had such a great positive attitude. You seem like this huge persona, you know, coming in as the maintainer of DDEV, and then you got real personal with Mario and Shrop and helped us with our problems and all that. It's great to hear the real human stories. There's a human behind the keyboard, right?

Mario Hernandez: I can definitely attest to Randy's generosity with helping. I mean, no matter what I ask him, he's always there. He never judges, is always willing to help and has got me out of some pretty big holes with things that I work on with DDEV. So that's excellent to hear.

Randy Fay: It's lovely being a member of a community where you feel like you have something useful to do. I mean, I, I love it when people ask questions and everybody is at a different place. I mean, there's so many things going on in our heads, trying to understand all the different technologies we're working on, and there's so many different layers of all those technologies. It is astonishing and crazy. And so we're all at a different place and how much we understand on those. And so it's, it's always great to work with people at whatever level they're at to try to go to the next level, even though it's...I can't, it's hard to believe anybody can do any of this stuff with all the layers of stuff you have to know. It's crazy.

Mark Shropshire: I agree, I agree. And some of what you just said is part of why I love open source so much. And I know we've said the word, if you want to call it a word, DDEV a lot so far, because that's really one of the things we're here to talk about, in addition to you, Randy, as a community member. While I know a lot of our listeners are really familiar with DDEV and what it means, and we love it here at Mediacurrent, it's our go-to environment, we made that choice a while back and have been really pleased with it, for those who don't know what DDEV is, can you give us a high-level overview of this really fascinating open-source project?

Randy Fay: You bet, you bet. So everybody needs, in web development or really in any other context, a developer has to be able to work in an environment that's their own, where they're not going to break something else or break somebody else's world or that kind of thing. And probably most of us, sometime in our shady past, actually edited files on a server and hoped that it worked out, but don't do that, right? If you're going to be a real developer, you're going to have your own environment to develop in, you're going to be able to see the results of what you do locally before you inflict it on somebody else, and you're going to have processes to make that work. So DDEV is a local development environment. It's a way for people to do web development. It works with PHP, it works with HTML and JavaScript, and you could probably do lots of other things with it, but most of the communities involved with it are PHP; Drupal and TYPO3 are the biggest ones. The big idea is that it's a wrapper, written in Go, on Docker Compose and Docker.

Randy Fay: And so it basically runs a web server and a database server inside containers. So they're actually running inside Linux containers, so it's more like a real-world situation, but it looks the same on every platform. In other words, the containers are the same on every platform; they're exactly the same. So whether you're working on Windows or WSL2, or macOS or a Mac M1, or, people even work on Chromebooks, and Chromebooks just offer Linux, so if you're working on any Linux distribution, everything works the same. It's using the same containers underneath.

Mark Shropshire: Sorry, I didn't mean to interrupt, but I was just thinking I'm pretty sure I got it working on my Raspberry Pi a while back, thanks to all the work from the community, too. I had to do it just to do it.

Randy Fay: It actually works pretty well on Raspberry Pi. The trick about Raspberry Pi is that it's ARM64. So I think you have to get the current version of Ubuntu; that's the easiest way. I wrote a blog post about it, so you could search for DDEV and Raspberry Pi, and it actually is decent. It could be a little faster, but it's decent.

Mario Hernandez: So you mentioned earlier, Randy, that you worked for DRUD, the company that was building this product and DDEV, and, as some of you may know, the company split up, and for a while the community was wondering what was going to happen with DDEV. But just this week we heard some great news, or it sounds like great news: DDEV has been acquired by Fruition. Is there something you can tell us about this? What does this mean for DDEV? What does it mean for the community? What should the community expect from this new acquisition?

Randy Fay: Yeah, you bet. So the company named DDEV, you know, companies come and go; it was called DDEV, it used to be called DRUD, it went by several names. But not everything always works out, and the money doesn't always hold out. It's just the way it is sometimes. And somebody pulled the plug. I don't know who, but somebody pulled the plug, and so the company named DDEV shut down. The DDEV project was never in any question. The DDEV Local project was never in any question, because I'm committed to it and it's an open-source project and it was going forward anyway, but there was a little bit of uncertainty about whether the project might have to be forked or the name might have to change, because there is a DDEV trademark.

Randy Fay: So there was some uncertainty about that. And the company named Fruition in Denver has bought, what would it be, the software that the DDEV company wanted to make money off of and didn't end up doing that. They bought that software, and they got the DDEV trademark name with it. And they've been very generous to say that they're going to make sure that the DDEV Local project remains open source and that it doesn't have to change its name. The thinking isn't fully thought out, but they're thinking that they would like to be the central place to try to organize financing for the project as it goes forward. So there are a number of companies that have stepped up and said that they'd like to support the project, and, you know, maybe even Mediacurrent will want to. These are things that we're still talking about, but Fruition may want to organize that financially, or maybe we'll figure out another way to do it. Anyway, they're just putting their foot in there and trying to make sure that it's clear to everybody that there's nothing wrong.

Randy Fay: Everything's good, and the name will continue, and there are no problems.

Mario Hernandez: Will you still be involved in maintaining the product, and how do you feel about this news of Fruition acquiring DDEV?

Randy Fay: I'm glad that they got it worked out. It doesn't change anything for me right now, because I'm just supporting the project. I'm the maintainer of the project, and I continue to be the maintainer of the project, so it doesn't change anything for me. I'm not becoming a Fruition employee. They've been through a lot getting this done, so they haven't even been able to think about the finances yet, I'm sure. But for me, the primary thing it does is resolve some ambiguity about whether we would have to fork or change the name or something like that. So it's all very good, and they're very nice people and have been very generous trying to sort this out. And I think they're taking on a big task with their ambitions with the hosting software as well. So they've got a lot going on right now, and we'll see how this all plays out. It's all good.

Mario Hernandez: It sounds like it's a great win for the community.

Mark Shropshire: Yeah, this is such great news. And it's kind of a testament to how open source can continue even through business changes and things that happen in that world. That's not to say it always works out; we've all read stories and heard things, you know, but it's great to hear this, and we're really happy that you're there and still supporting it, Randy. I'd love to talk more later in the podcast about what kind of support you need, too, because I'm sure you'd love to continue to have more PRs and people helping the project. But yeah, go ahead. Did you have anything immediate in mind on that, Randy?

Randy Fay: No, that's good. Like I say, the thinking isn't done and stuff, but it's all good. It's all good. It's gonna work out great. And there've been a couple of excellent things that have happened in the funding realm already. Tag1 has been supporting specific features, and that's an easy way to support the project. They just made me a contractor and have been paying me to do features that are important to some of their projects. And so I just work a few hours a week on their features, and those are big wins for the whole community. And I expect Fruition is going to do the same kind of thing.

Randy Fay: And I think there are other companies that may want to do that. And there may be other companies that just say, hey, we'll take five hours a week and just support the community, or maybe dedicate it to what they actually need as far as features, but it's all good. So Tag1 stepped up, now Fruition stepped up. There are other companies that have certainly made noise, but of course, as you know, it hasn't been clear how we would do that until this part was sorted out. So that's another reason this is really good: now that this is sorted out, we can sort out all the people who have wanted to financially support the project.

Mark Shropshire: Yeah. That's so great to hear. Thanks for the additional detail on that. So, jumping back to the tech a little bit on DDEV, what are some of the key advantages that you see around DDEV over other local development tech that's been around or is around currently?

Randy Fay: You bet. You bet. So the original way that you did local development with a website is you'd put Apache on your computer and you'd put MySQL on your computer and you would hook them up, and it actually would work great if you were a whiz at sysadmin. I was always a whiz at sysadmin; it was my thing, you know, I liked it. I do Linux and I do Windows and I do Mac, and I thought it was great. In fact, I was pretty skeptical of DDEV when I first started using it, because if you're a whiz, then you can just run that stuff; it all runs on every computer and it's great. But not everybody wants to spend their life configuring Apache or Nginx or MySQL, and the thing that amazed me when I started working on DDEV, that really shone for me, was that every project could have a different configuration.

Randy Fay: So back when I used to maintain everything on my own computer, that wasn't true, right? You have it set up for PHP 5.6 and that's it. You couldn't have one project on PHP 5.6 and another one on PHP 7, well, PHP 7 wasn't around then, but it wasn't something you could do. And so I started realizing that with DDEV, you could have every project be different and customized, and you could do it right. And that's because it's Docker-based. DDEV uses Docker underneath, and all the stuff underneath is per project and is configurable for that project. So that's the huge difference between the Docker generation and the on-the-machine generation. Then there's the difference between DDEV and other Docker-based local development environments like Lando or Docksal, which, by the way, are great products and have lots of supporters and lots of great things to say about them.

Randy Fay: DDEV has got full support across all these platforms, so many different platforms; you won't really find that level of support elsewhere. DDEV is actually tested on all the different platforms, so it runs the full test suite on all of them. DDEV is interoperable with many different versions of Docker. And we think the performance is great. On macOS it's always a challenge; I'm still working on new ways to improve that on macOS, so you can look for Mutagen integration coming up. I think the biggest thing, though, and this actually was playing out on Twitter today, is that DDEV doesn't assume that it owns your machine. So when you install DDEV, DDEV is going to be as respectful as it can. The only thing that I know of that requires you to use sudo is if you're not using the regular ddev.site URLs or hostnames, and then it has to edit the /etc/hosts file, but in general, DDEV tries not to do anything to your machine. It won't install Docker, it won't overwrite Docker, it won't kill off your Docker, all of those kinds of things. And I think that's a big deal. But maybe you can put it in the notes: I wrote a thing called "What's so different about DDEV Local?" Maybe you can put a link to that in the show notes.

Mario Hernandez: I can certainly personally attest to that. I originally got an M1 Mac and was trying to get other development environments set up, and I was unfortunately not able to, but DDEV just worked for me, you know, on the first try. So it's great to have that kind of support. But as far as the efficiencies that DDEV provides besides the support, what are the other efficiencies that DDEV provides to help teams? Is it something that non-developers can use to run their sites locally and test, or is this mainly just for developers?

Randy Fay: No, actually, anybody who needs to evaluate or test a site will often use it, all those kinds of things. There's actually an obscure feature where you can run websites using it. I call it casual web hosting, but I run my little websites using DDEV. That's not what DDEV was intended for, but people asked for it for years, and so it finally got done, and it actually works pretty well. But it is definitely not managed web hosting or anything like that, so you wouldn't put a big customer site on that under any circumstances. The biggest thing about teams is that you check in your configuration for the project. So for each project, you just check in the DDEV directory, the .ddev directory, and then everybody can check that out and DDEV is already configured for them.

Randy Fay: And so usually a team lead would configure and check in DDEV, and then all the other members of the team can just ddev start and ddev launch and get to work, and it's already configured, even if the team lead was on Windows and somebody else is on Mac or Linux. It'll just go. And obviously there are going to be caveats there, but really, it pretty much works that way. So we're going to do a session at Drupal Camp Colorado. I don't know whether you know Truls (Steenstrup Yggeseth), who's a Norwegian developer who is kind of an expert at setting up DDEV for teams. He's done it for two different companies; he's the guy that makes that work for two different companies. So he and I are doing an hour-long session at Drupal Camp Colorado, which is free, and it's about teams and DDEV. So we'd love to have you at that.
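As a rough sketch of the team workflow Randy describes, assuming a Drupal 9 codebase with its docroot in web/ and an example project name, the team lead would run:

ddev config --project-type=drupal9 --docroot=web --project-name=my-site

ddev start

ddev launch

Committing the generated .ddev directory means everyone else only needs the last two commands.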

Mark Shropshire: Oh, that's great. Yeah. And just to reinforce this, we'll definitely add a lot of these links and items to the show notes so that it's all available. So don't worry, and if you're listening, go Google for it or just check out the show notes. We've already got that link in our notes here from you, Randy, so thank you for that. So I know you mentioned earlier some of the platforms or frameworks that run on DDEV, including Drupal and TYPO3. What are some of the others of interest? And are there any others that are maybe coming up? I know you've got quite a few.

Randy Fay: Yeah, yeah. I mean, we've got really all versions of Drupal, six to nine, WordPress, Oracle, TYPO3, Shopware 6, Magento 1, Magento 2, and I might've forgotten about one or two. But really, because we have explicit support for these and we create settings files and stuff, it doesn't actually mean that much to anybody who understands how to do a settings file. All it means, basically, is that DDEV will generate some settings files for you so you can get up and running really, really fast. There's no PHP project that you can't run on DDEV, really no HTML, JavaScript, Node, whatever; you can run them all on DDEV. It just requires the same setup that you would do in any other environment. DDEV just makes, like, Drupal so easy.

Randy Fay: Because you do a ddev config and a ddev start, and it's already configured. You don't know where the database is, you don't care where the database is, you didn't do the settings file; it's just there. But that's just a settings file. We all know how to do a settings file, so we could all do that by hand. I'm pretty sure on Lando and Docksal you have to do the settings file yourself. Well, that's fine, there's nothing wrong with that. It just makes it, I don't know, a little addictive. So that's mostly what the framework stuff is. It's icing on the cake, but it's not the fundamentals, though it feels like the fundamentals because everybody gets used to it.

Mark Shropshire: It does, because I know when I'm in development mode and not DevOps mode, I want to just worry about the code and get into the code. And so that automation saves that time, and in an agency situation like Mario and I are in, we're popping into different projects all the time and spinning things up and down. I mean, it becomes even more of a benefit, even though you usually do that settings file once and then it's done. But yeah, totally agree. That's great insight. And I think, Mario, we've even had some projects where we've run Gatsby in DDEV and some other Node things like Randy mentioned.

Mario Hernandez: You know, shameless plug here: with the help of Randy, I actually wrote a blog post, I think it was late last year, where I configured an environment, since I do a lot of training workshops myself, where the students would just run ddev start and they would have everything: obviously Drupal, PHP, everything else, but also NVM, NPM, Node, and everything pre-configured and running from the containers, including Pattern Lab, which runs from the containers. And by sharing some ports, you can literally have an NPM watch task watch your changes, all in the containers. Pattern Lab will even reload the browser for you, all running from the container. So it is incredible what you can do with it, and I highly recommend looking into it if you are a developer. So, now that we've talked a little bit about DDEV, what it does, what it can do, and how cool it is, what can you tell us, Randy, about what's coming up for DDEV?

Randy Fay: Well, I'm really excited about what I've been working on the last few weeks; I already mentioned it. It's the Mutagen syncing speedup. The big problem on macOS, and to some extent on Windows, is that Docker is slow updating files between the host and the container. And so there's a workaround for that; you can call it caching, or you can call it asynchronous updates. There's a project called Mutagen, another open source project, that has been working on trying to solve that problem for some time, and it actually has pretty good potential. Basically what happens with it is when you make a change in the container, that change is immediately available to, like, the web server or PHP-FPM, and so it doesn't have to wait for that to sync before it is used.

Randy Fay: So that's not the way it is in current Docker setups. In current Docker setups, every file change requires everything to happen immediately. Anyway, Mutagen can be two to ten times faster at, like, a Drupal nine install or something like that. I've been doing some timing on it; very, very impressive. And that's faster than NFS, which is the other technique that we've been recommending for a long time. The complexity of Mutagen is that you can introduce conflicts by touching something on the Windows side or the Mac side and also inside the container. The consistency stuff is hard, and Mutagen is good at it, but there are a lot of workarounds that DDEV is having to do to make that go. I think you'll see an alpha of that in the next couple of weeks.

Randy Fay: And I hope, I'd love to have everybody try it and get their feedback. It is lovely to work with. The worry always is about consistency: would some files suddenly go away on you when you didn't mean for that to happen, that kind of thing. So consistency is the big deal. Also, developer Ofer Shaal has been working like crazy making Gitpod work with DDEV. So there's a whole project called ddev-gitpod (https://github.com/shaal/ddev-gitpod) that will just bring you up a whole Drupal nine setup, boom, there you are. By the way, Ofer likes to say boom, so I'm going to say boom a few times; you can put the boom into the release notes. So you can try that out, and it's just lovely. But as the next step, he's transforming that for Drupal contributions, and this project is called DrupalPod, capital D, capital P.

Randy Fay: So that's shaal/drupalpod, and he's got it to where you can visit an issue on drupal.org and you can do an issue fork or contribute to an issue fork right there in your browser. Now, I should have started by saying what Gitpod is. Gitpod is a way to run a Linux computer in the browser. It's basically giving you a free Linux computer that's fully configurable in the browser, with VS Code, so it's got a very nice IDE. So if you bring up ddev-gitpod, for example, you have all of DDEV right there in your browser, all of it, and you can immediately do debugging. Ofer has done a number of demonstrations of this. You just spin up, you go to any PR or any project on GitHub, click the Gitpod link, and boom, there's another boom in there for you, Ofer, you have it up and running and you can debug with Xdebug in VS Code without doing anything, nothing at all.

Randy Fay: So it's right there. It's got great potential for DrupalCon contribution days. Both of these have enormous potential for that, because if you've ever worked at one of the contribution days at DrupalCon, about half the day can be spent helping people get a local environment set up, and they've all got their own computer and it all has to be set up differently. With these various ideas that Ofer is spinning off so fast, all that goes away, and it can all be done consistently in Gitpod. It's an amazing demonstration. I don't think it probably works as well on a podcast.

Mark Shropshire: Yeah, yeah, that's right. This is audio only. But Ofer, it seems like he's on tour giving demos; he gave a demo at the Drupal Camp Asheville unconference, and it blew my mind. I was immediately posting links, and I've already got those listed now for the show notes, but I was posting links in our developers channel because I was so excited. I saw the potential in DrupalPod, but I don't think I realized there was ddev-gitpod. So thanks for mentioning that, Randy.

Randy Fay: Yeah, well, that's what DrupalPod is, too. DrupalPod just uses DDEV inside Gitpod, but it's focused on Drupal issues. So it's focused on Drupal, and ddev-gitpod is just DDEV in Gitpod.

Mark Shropshire: I see. Yeah, that makes sense. DrupalPod is for the issue queue side of things.

Randy Fay: Ofer and I are going to do a demo of DrupalPod, probably at Drupal Camp Colorado. I think, don't they always have like an IT day? I think we have a slot on that.

Mark Shropshire: That is fantastic. I guess we're at a point here where we can start to wrap up a bit. I mean, you've mentioned a few fantastic trainings already, so I would tell everyone to check out the Drupal Camp Colorado website; we'll get links in the show notes. And also, I think there's some training coming up with Mike Anello, too, Randy.

Randy Fay: Yeah, at Drupal Camp Colorado we're doing a half-day training with Mike Anello. If you haven't done one of Mike's classes, every one he does is wonderful, and he's been doing DDEV for years. He does this whole, like, year-long Drupal thing to take you from zero to expert in a year; he does all kinds of wonderful things. He also does Composer, and I highly recommend his Composer class, but he and I are doing a DDEV Local, beginner-to-expert half day at Drupal Camp Colorado. Yeah.

Mario Hernandez: Mike...I have worked with him before and attended his class, actually.

Mark Shropshire: He's fantastic, and a great community member. I highly encourage folks to check that out for sure. So I think that should do it for today. We really appreciate you, Randy, taking the time to join us again and give us all this great information. It looks like the future is very bright for DDEV, and that makes us all very happy here at Mediacurrent, and I'm sure a lot of the community members out there, so boom, there you go.

Mark Shropshire: I like it. I'll drop a boom in now. It's like, it's so much better.

Mario Hernandez: So thanks again, Randy. We appreciate it. And anything else, Shrop, you'd like to close with?

Mark Shropshire: No, no, just encourage everybody, if you haven't used DDEV, go check it out. We've got links for, well, not all of Randy's places on the internet, but at least the ones I know: his website, GitHub, and Twitter. So definitely check those out. And also, this is open source, you know, Randy can't do everything, so I encourage folks to help with the project in whatever way they can, as organizations or individuals.

Randy Fay: Yeah. Yeah. That's what I was going to follow up with, too. DDEV is a community of people. There's a whole bunch of different Slack groups: there's DDEV in the Drupal Slack and in the TYPO3 Slack. But DDEV depends on everybody learning new things and pioneering new approaches and helping each other more than anything else. And those communities where people are listening and helping people at all levels when they land are wonderful things. So remember that DDEV is a community project and you're part of it. And you're welcome there.

Mark Shropshire: Well, I can't say it any better than that. Thank you so much for your time today, Randy. We really appreciate it. And we know your time is precious, so thank you.


Apr 30 2021
hw

This DrupalFest series of posts is about to end and I thought that a fitting end would be to talk about the future of Drupal. I have already written a bit about this in my previous post: what I want to see in Drupal 10. However, this post is more aspirational and even dream-like. Some of these ideas may sound really far-fetched and it may not even be clear how to get there. For the purposes of this post, that’s fine. I will not worry about the how, only the what.

Modular code

Drupal is already modular; in fact, it is rather well designed. As Drupal grows and adds new features, the boundaries need to be redefined. The boundaries that made sense years ago for a set of features may need to be different today. The challenge is in maintaining backwards compatibility for the existing code ecosystem around Drupal. It also means fundamentally rethinking what Drupal’s constructs are. The transition from Drupal 7 to 8 was iterative and, in many cases, the systems were simply translated from procedural to object-oriented code. We need to rethink this. We should not be afraid to use the newest trends in programming and features from PHP, and rethink what architecture makes sense.

If you’re thinking this sounds too much like rewriting Drupal, you’re right. And maybe the time isn’t right for that, yet. I do believe there will be a time when this will become important. This will also help with testability which was the topic of my previous post about Drupal 10.

Assemblable code

Once we have code that’s modular, we can break it off into packages that can be reused widely. This is already in the minds of several core developers and is the reason why you see the Drupal\Component namespace. I want to see more systems being moved out, not just into a different namespace but into different packages entirely.

The direct upshot of this is that you can use Drupal’s API (or close to it) in other PHP frameworks and applications. But that is not my main interest here. My interest in seeing this happen is that forcing the code to be broken up this way will result in extremely simple and replaceable code. It will also result in highly understandable and discoverable code, and that is what I want. Drupal core is opaque to most people except those who directly contribute to it or to some of the complex contrib modules. I believe it doesn’t have to be that way, and I hope we get there soon.
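
As a small, concrete illustration of what already sits behind that boundary, classes under the Drupal\Component namespace can be used without bootstrapping Drupal at all. This is only a minimal sketch and assumes the class is autoloadable, for example through Drupal’s own vendor/autoload.php:

<?php

use Drupal\Component\Utility\Html;

// A minimal sketch: using a Drupal\Component class standalone, assuming an
// autoloader (e.g. Drupal's vendor/autoload.php) can find it.
require __DIR__ . '/vendor/autoload.php';

// Html::escape() does not depend on the rest of Drupal being installed.
echo Html::escape('<script>alert("hi")</script>'), PHP_EOL;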

Microservices

Taking this even further, I would like to see some way to break down Drupal’s functions across different servers. I am thinking there could be a server dedicated to authentication and another for content storage. This is a niche use case and therefore a challenge to implement in a general-purpose CMS such as Drupal. This would need kernels that can run in a stateless way, and also breaking up systems so that they can run independently using their own kernels. The challenge is that all of this will have to be done in a way that won’t increase the complexity of the existing runtime. This, as you might have guessed, is contradictory.

More themes

I might not have thought about this some time back. But with a renewed interest in the editor and site-builder experience, I think it makes sense to improve the theme offerings available to Drupal. There were many times when I built powerful functionality but then showed it off with a completely unimpressive look. Olivero and Claro are big steps in this direction and I am hoping for more of the same quality. If there were more, I wouldn’t have run Contrib Tracker for over 3 years with the stock Bootstrap theme (even the logo was from the theme until recently).

Unstructured Content

Drupal is great at structured content but I see Drupal used more and more for unstructured pages. The way we do this now is to use structured building blocks such as Paragraphs to represent components, but this method has its shortcomings. Paragraphs and similar concepts quickly get too complicated for the editor. Editing such pages is very painful and managing multiple pages with such content is tricky.

We now have the Layout Builder that is a step forward in addressing this problem. The ecosystem around the Layout Builder modules is scattered right now, in my perception, and I would love to see where the pieces land (pun intended). I would like to see Drupal fundamentally recognize these pages as separate from structured data so that the editing workflows can be separate as well. More in the next section.

Better media and editor experience

I know media is a lot better today than it was before but there’s still a long way to go. When working with pages (not structured content), I would like a filtered editing experience that lets me focus on writing. WordPress does this part well, but it has a specific target audience, which is what allows it to make that decision. This is the reason I previously said I would want Drupal to treat unstructured pages differently.

With a more focused editor experience, I would want better media handling as well. I want to paste images directly into the editor without worrying about generating resized versions. I also want pasted images to be handled somewhat similarly to the current media elements so that we can still deduce some structure. Even tiny things like expanding tweets inline and formatting code correctly could be supported in the editor and that could make a big difference.

Starter kits

We need ways to spin up sites for varied purposes quickly but I don’t think distributions are a good answer, at least not the way they are implemented right now. Once you install a distribution, you are stuck with it and have to rely on the distribution maintainers for updates. I think that’s a waste of effort. I know it’s possible to update parts of a distribution without waiting for a release, but the process is harder. And if a distribution goes out of support, it is difficult to remove it; it is almost impossible if the distribution authors don’t design for it. We already have challenges with migrations and upgrades, so why add a distribution on top of that?

Instead, I would like to see some support for starter kits within Drupal core. Instead of distributions, you would install Drupal from a starter kit which would point to modules and contain configuration. Once installed, it would be as if the site was installed manually. I believe a lot of such solutions are also possible outside Drupal core (using drush, for instance) but it would be nice to have some sort of official interface within the core.

What else?

I know that all of the things I mentioned are possible in part or have challenges that involve tradeoffs. It’s not easy to figure out which tradeoffs we should make considering that Drupal’s user base is massive. For example, what I wrote about today are my wishlist items for Drupal. They may be something completely different for you and that’s okay. I believe the above are already aligned with the vision set for Drupal by Dries and the core team (ambitious experiences, editor-first, site-builder focused, etc). With that as the North Star, we should be able to find a way forward.

That is it for now. I can dream more about the removal of features (such as multisite) but that’s for some other day. Let me know what things you would like to see in Drupal.

Apr 29 2021
hw

The configuration API is one of the major areas of progress made in Drupal 8. It addresses many challenges of managing a site across environments for Drupal 7 and before. It’s not perfect. After all, it’s just version 1 and there is work going on in CMI 2 to fix the problems in the current version. That is not the subject of this blog post, however. In this post, I want to talk about one of the lesser understood features of configuration management: overriding.

Most site builders and developers interact with the Drupal configuration API through “drush config-export” and “drush config-import”. Barring a minor exception, Drupal configuration is an all-or-nothing deal, i.e., you apply the configuration as a whole or not at all. There are modules like config_ignore and config_split to work around these limitations. These modules allow ignoring parts of the configuration or splitting and combining configuration from different locations. Further, Drupal allows overriding specific configuration through settings.php. A combination of all of these makes Drupal configuration workable for most complex site-building needs.

The Configuration API

The configuration API provides ways to support all of the different scenarios described above. There are very simple constructs that are comparable to Drupal 7’s variable_set and variable_get functions. If you want to write a simple module that needs access to configuration, the documentation is enough to work out all of the details for you. Yet, there are some lesser understood parts of how configuration handles cases such as overriding and other edge cases. I have seen this often when interviewing people: even though it is well documented, its lack of visibility means many are simply not aware of it.

As I said before, I will mainly talk about the overriding system here. You are welcome to read all about the configuration API in the documentation; there are a few specific areas there that I would recommend reading about.

The configuration API needs to handle overriding specifically because overrides could affect exported configuration. If you override configuration in the settings.php file and the config system didn’t know about it, those values would be exported to the configuration files. This is not what you want. If your intention was to export the value, there was no reason to set it in settings.php in the first place. That is why the configuration API needs a mechanism to know about overrides.
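
To make this concrete, here is what such an override typically looks like in settings.php. The configuration names are real core settings, but the values are just illustrative:

<?php

// In settings.php (or settings.local.php): override configuration per
// environment. These values take effect at runtime but are never written
// to the exported YAML files.
$config['system.site']['name'] = 'My Site (staging)';
$config['system.logging']['error_level'] = 'verbose';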

Overriding configuration

The configuration API can handle these overridden values separately and also make them available to your code. This is why if you are exporting configuration after setting the values in settings.php, the overridden values won’t be present in the exported YML files. As a site builder, you won’t have to worry about this. But if you are a module developer, it helps to understand how to differentiate between these values.

Before I talk about the code, let’s consider the scenarios why you would need this. As a module developer, you might use the configuration in different ways and for each, you may need either the overridden data or the non-overridden (original) data.

  • For using the configured value in your logic, you would want the overridden data. Since you are using the value in your logic, you want to respect Drupal’s configuration system of overriding configuration.
  • For showing the value in a form, you might want to show the original data. You can choose to show overridden data but Drupal core shows the original data. There is an issue open to change this. Right now, there is a patch to indicate that the data is overridden and changes won’t take effect on the current environment.
  • When saving values to the configuration, you will always be working with the original data. As a module developer, you cannot affect the overridden value through the configuration API. In fact, to save the configuration, you would need to call getEditable, and that always returns the original data. When you set a new value and save it, you change the value in the configuration storage; the override from settings.php will still take precedence.

Accessing overridden values

If you have built modules, you might already be aware that you can use a simple get call on the config service to read a value from the configuration (see example). This will return an immutable object which is only useful for reading. To save it, however, you would need to go through the config factory’s getEditable method (example). This will return an object where you can set the value and save it.

While it may seem obvious that this is the only difference between “get” and “getEditable” methods, there is a subtler difference. The “get” method returns the overridden data whereas the “getEditable” method returns the original data. This means, there is a chance that both of these methods might return different values. As a module developer, it is important to understand this difference.

What if you wanted to get an immutable object but with original data? There’s a method for that: getOriginal (see example). Most modules won’t need to worry about the original data in the configuration storage except when saving. For that reason, it is not common to see this method in use. Even if a module were to indicate the differences in original and overridden configuration, it can use getEditable in most cases.
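
Putting these together, here is a small sketch (assuming the system.site name is overridden in settings.php, as in the earlier example) of how the three methods differ:

<?php

// 'get' on an immutable config object applies overrides (settings.php, etc.).
$overridden = \Drupal::config('system.site')->get('name');

// 'getEditable' returns the original data as stored in configuration.
$original = \Drupal::configFactory()
  ->getEditable('system.site')
  ->get('name');

// An immutable object can also hand back the data without overrides applied.
$also_original = \Drupal::config('system.site')->getOriginal('name', FALSE);

// With an override in settings.php, $overridden differs from the other two.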

The difference between “get” and “getEditable” has been the trickiest interview question I have ever asked. Out of hundreds of interviews, I have only occasionally seen someone answer this. I hope this post helps you understand the difference and why the subtlety is important to understand.

Apr 28 2021
hw

Just as I published yesterday’s article on the tech stack, I realized that I had missed a few important things. I had said that the list was only a start, so I think it is fitting to continue it today. As such, today’s post won’t be as long as yesterday’s or even as long as my usual posts. I should also add the caveat that today’s post won’t make the list complete. With the industry being what it is and the need for constant learning, I don’t think such posts stand the test of time; not from a completeness perspective, in any case.

Our target persona still remains the same as the last post. We are talking about a Drupal developer building sites and functionality for a complex Drupal website with rich content, workflows, editorial customizations, varying layouts, and more features typically found in a modern content-rich website. Since today’s post focuses on the fundamentals of working on the web, the skills may apply to more than a Drupal developer but we’ll focus on this persona for now.

These are also skills that are often hidden, aka non-functional requirements. It is very easy to forget about them (as was evidenced by yesterday’s post) but they are probably the most important of all the skills. Not understanding them sufficiently leads to significantly larger problems. Fortunately, Drupal is a robust framework and system and is able to handle such issues by default. It is still important to understand how to work along with Drupal.

Performance

Performance is a very interesting challenge for a large dynamic website. There are several layers to unravel here and the answer for poor performance could be at any of those layers. This means a developer needs to understand how the web works, how Drupal works, and how Drupal attempts to optimize for performance.

It is shocking how many people are unable to clearly articulate how the Internet works. To me, the lack of clear articulation is a sign of gaps in understanding the chain of systems that make the Internet work. To begin with, understand the sequence of events that takes place when someone visits a link. Understanding DNS is important too. I know no one really understands DNS fully, but you don’t need that. Get familiar with the interactions that take place between the client and the various systems such as DNS. Similarly, get familiar with CDNs and other caching techniques that you can use.

Scalability is a different problem than performance but still very important to understand. Understand what load balancers are and how they decide to route traffic. Also, understand how you can configure an application to work in such an environment. For this, you also have to understand the concept of state. Fortunately, PHP is already designed to be stateless and has a natural aptitude for such scalability (and the serverless paradigm).

On the application front, understand how JavaScript and other assets can impact the rendering of your page. How would you find out if the front-end code is a bottleneck and, if it is, what would you do about it? The same goes for back-end code. Understand how you would utilize caching at multiple layers to deliver performance to the user. Figure out what scenarios are better served with a reverse proxy cache like Varnish or an application cache such as Redis or memcached.
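
One of those layers is Drupal’s own render cache, which you control through cache metadata on render arrays. A minimal sketch, where $expensive_output and the specific values are purely illustrative:

<?php

// A render array whose output varies by the user's roles, is invalidated
// whenever node 42 changes, and expires after an hour at most.
$build = [
  '#markup' => $expensive_output,
  '#cache' => [
    'contexts' => ['user.roles'],
    'tags' => ['node:42'],
    'max-age' => 3600,
  ],
];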

Accessibility

Accessibility is not an afterthought. Unless you are willing to review each line of code later, start with accessibility in mind from the first day. Again, Drupal meets many of the accessibility standards when you use the themes that ship with Drupal. The framework also supports many of the constructs that make writing accessible interfaces easier. Drupal 7 onwards has committed to meeting WCAG 2.0 and ATAG 2.0. Read the Drupal documentation to know more about how you, as a module developer, can write accessible code.

Security

Again, security is not an afterthought and many people who treat it as such have a fair share of horror stories. And just like before, Drupal provides constructs that make it very easy to write secure code. With Drupal 8 and the introduction of Twig, it became significantly harder for developers to write insecure code. However, it is important to understand what Drupal does for security, otherwise, you will be fighting against the core to make your code work.

Understand Drupal’s philosophy of content filtering: filter on output, not input. Understand how to use PDO and DBAL (the database abstraction layer) to work with databases safely. Learn how to use the Xss helpers to write HTML output safely. Understand how you would use Drupal’s permissions and roles system to provide access control for your code.
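
As a hedged sketch of what that looks like in practice, with hypothetical variables such as $title and $html standing in for untrusted user input:

<?php

use Drupal\Component\Utility\Xss;

// Let the database layer escape values by using placeholders instead of
// concatenating user input into the query.
$nids = \Drupal::database()->query(
  'SELECT nid FROM {node_field_data} WHERE title = :title',
  [':title' => $title]
)->fetchCol();

// Filter untrusted markup on output, not on input.
$safe_markup = Xss::filter($html);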

Again, these are vast topics but I am only beginning to introduce them here. Do follow the links above to specific Drupal documentation pages to understand the topic better.

Apr 27 2021
hw

The title of this post is not really accurate, but I can’t think of another way to say it. This post is related to my earlier one on what a Drupal Developer does day-to-day. Here, I will talk about some of the skills required of a Drupal developer. I am not aiming for completeness in this post (that’s a goal for another time) but I will try to list all skills required to build a regular Drupal site, deploy it, and keep it running. Some of these skills are foundational whereas others may be only needed for specific requirements. Moreover, not all skills are required at a high expertise level. For some roles, even awareness is enough.

This post would be useless without a target persona. Different organizations have different needs and what works at one organization may not work at another at all. Similarly, different roles within an organization have different needs. For example, at Axelerant, we do not have a dedicated site-builder role; that work is expected of a Drupal Developer. Such an arrangement may not work at all for your organization, nor can I say this arrangement will work forever for Axelerant. In fact, it is even weird to use the word “forever” here.

The target persona for this post is a Drupal Developer building sites and functionality for a complex Drupal website with rich content, workflows, editorial customizations, varying layouts, and more features typically found in a modern content-rich website. Since this is Drupal we are talking about, the above definition is actually quite broad. It includes a web presence for organizations from publishing, healthcare, higher-education, finance, government, and more. With that settled, let’s look at the skills.

Site-building

A Drupal developer would need to have the following skills related to site-building. Even if a developer does not build sites, there is a good chance they would be interfacing with these aspects from code. Hence, most of these are foundational.

  • Content modelling: Understand what makes a good content structure for a site. You should be able to determine if the content structure you are building is the relevant one for the requirements.
  • Content moderation: Most sites need this. I have seen this being used even on simple sites when there is more than one type of user. Understanding how the moderation workflow works and how it ties with the entity-field structure along with its revisions is important.
  • Multiple languages: It is important to understand how Drupal implements multilingual functionality at both interface and content (translation) level. Depending on the requirements, certain aspects of how translated content is stored in fields could be very relevant. At a basic level, you should also be aware of how Drupal determines the language when it renders a page or response.
  • Configuration Management: While this may seem more to do with developers, it is important to understand this from a site-building perspective. You need to understand how the content types you create are stored in configuration and how that is different from how, for example, site information is stored. This means you need to understand configuration entities and simple configuration. You also need to understand how configuration can change depending on the environment and how it can be overridden.
  • Common solutions like Views and Webforms: Before you set about developing some functionality, you have to check whether there is a solution out there already. You also need to know how to pick modules for your problems.

Development

I don’t think this section needs any explanation as the entire post is about this. So let’s jump in.

  • PHP: This might seem like a no-brainer. Drupal is written in PHP so it is a good idea to be very comfortable with PHP. Understand object-oriented programming and how Drupal uses constructs such as traits. Also, understand patterns such as dependency injection and the observer pattern (hooks and event subscribers are based on the latter; see the sketch after this list). Then, keep up with what’s new in PHP, if possible, and use it.
  • Composer: Arguably, this is a part of site-building as well but there is a good reason to keep it here. Apart from using Composer to download modules, understand how it helps in generating the autoloader. Also, understand how Composer works to select a module or a package and the reasons it might fail. Most of this comes with experience.
  • Caching: This is probably the most important part to understand about Drupal. Caching is deeply ingrained at multiple levels in Drupal and it is important to understand how it works. Almost all your code will need to interact with it even if you aren’t planning to cache your responses.
  • Other Drupal APIs: There are some common APIs you should be familiar with; for example, the Form API and the Entity API come to mind. But also be aware of other APIs such as Batch, Queue, plugins, Field, TypedData, etc. You should know where to look when you need them for a problem.
  • Automated testing: Drupal core has a very vast collection of test cases and that means you may only have to write functional test cases for your site. You should understand the different tools that you can use to write them.
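
To give a flavour of the observer pattern and dependency injection mentioned above, here is a minimal sketch of an event subscriber that reacts whenever configuration is saved. The module name “mymodule” is hypothetical, and the class would still need to be registered as a service tagged with event_subscriber (with a logger channel passed as an argument) in the module’s services.yml file:

<?php

namespace Drupal\mymodule\EventSubscriber;

use Drupal\Core\Config\ConfigCrudEvent;
use Drupal\Core\Config\ConfigEvents;
use Psr\Log\LoggerInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class ConfigSaveSubscriber implements EventSubscriberInterface {

  protected $logger;

  // The logger is injected through the service definition.
  public function __construct(LoggerInterface $logger) {
    $this->logger = $logger;
  }

  public static function getSubscribedEvents() {
    // Subscribe to the event fired after a configuration object is saved.
    return [ConfigEvents::SAVE => 'onConfigSave'];
  }

  public function onConfigSave(ConfigCrudEvent $event) {
    $this->logger->notice('Saved configuration @name.', [
      '@name' => $event->getConfig()->getName(),
    ]);
  }

}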

Deployment

You have to understand where your site is going to live to build it well. We no longer live in the age where you would write code and throw it over a wall to the ops team for them to deploy. You may not be deploying it yourself, but you have to understand the concepts.

  • Servers: Understand how servers work. You don’t need to know enough to configure one securely (unless that’s your job) but you should know enough to understand the terms. For example, you should know what the common web servers are and how they may be configured to run PHP. Also, understand what kind of configuration Drupal needs to run.
  • High Availability: Given Drupal’s market, there is a good chance that the site you are building will need to run in a highly available environment. Understand what it means and what that implies for your Drupal site. Most of these issues may be resolved for you if you are using one of the PaaS providers but if you are building your own server(s), you should understand how to configure Drupal to run on multiple servers. Even if you are using PaaS, you should still understand how running on multiple servers could affect how you write code, e.g., how you work with temporary files.
  • CI/CD: Continuous Integration is very common today and it is important you understand it to be a productive developer. There are dozens of solutions to choose from and most have some kind of a free offering. Pick one and try it out.

And more…

Like I said before, I am not aiming for completeness here, only at making a start. Please let me know if I have missed something obvious. Given the nature of these posts, I can only spend 1-2 hours planning and writing them, so there is a very good chance I have missed something. Let me know.

Apr 26 2021
hw

Over the years, I have had significantly less time for development and an increasing need to move around. After dealing with back pain due to the weight of my Dell laptop while travelling for conferences, I bought a 15″ MacBook Pro. More recently, with the issues with Docker performance on Mac, I have been thinking of getting a Linux box. I was further motivated when I bought an iPad last year and wanted to use that for development. Now, with my old MacBook Pro failing because of keyboard and hard disk issues, I have a new MBP with the M1 chip and just 8 GB RAM. I am more and more interested in making remote development work efficiently for me.

It is not hard to set up a remote machine for development in itself. The problem I want to solve is to make it easy to spin up machines, maintain them, and tear them down. I also want to make it easy to set up a project along with the necessary tooling to get it running as quickly as possible. For now, the only problem I am solving is to set up the tooling quickly. I am calling this project Yakht (after yacht, as I wanted a sea-related metaphor).

Current workflow

While it’s far from the level of automation I am thinking of, it’s not too bad. I was able to set up a machine for use within a few minutes. This is what the process looked like:

  1. Create a 1 GB instance on DigitalOcean in a nearby region (for minimum latency).
  2. Add a wildcard DNS record for one of my domain names so that I can access my projects on ..
  3. Set the domain name and IP address in my Ansible playbook’s variable files and inventories.
  4. Run the Ansible playbook.

These steps gave me a machine ready with all the tools required for Drupal development using Docker (Lando, in my case). The Ansible playbook installs a bunch of utilities and customizations along with the required software like PHP, Docker, Composer, Lando, etc. Only the PHP CLI is installed because all development happens within Docker anyway. It also configures Lando for remote serving and sets the base domain to the domain I have configured, which means I can access the URLs generated by Lando (thanks to the wildcard DNS). With the Drupal-specific tooling we have written (some of which I have written about before), setting up a new project is fairly quick.

A lot of these tools are somewhat specific to me (such as fish-shell and starship). I need to make it customizable so that someone else using it can pick a different profile. That’s a problem for another day.

Current trials

I have been using this machine for a long time now, which is not how I intend for this to be. I am trying out a few tools and customizations before putting them in the Ansible playbook. Most notably, I am trying out cdr and using it as an online IDE, which is very similar to VS Code. It took a little effort to serve it via Caddy, but it works well most of the time. It times out frequently, though I think this is because the instance only has 1 GB of RAM. When it times out, the connection breaks and you need to reload, which can get frustrating. Fortunately, this happens frequently for a while and then it works fine for long periods of time. In any case, I doubt this will happen on an instance with a reasonable amount of RAM.

Screenshot of cdr running in Chrome

I know that VS Code has good support for remote development over SSH but I also want to be able to use the IDE over iPad and a browser-based IDE solves that. I am also considering trying out Theia and Projector but that’s for another day.

It’s also missing a few things I want in my development machines such as my dotfiles and configuration for some of the tools (such as custom fish commands). For now, all of this is set up manually. Of course, my intention is to automate all of these steps (even DNS).

Current problems

The general problem with these kinds of tools is to maintain a balance between flexibility and ease of use. By ease, I mean not having to configure the tool endlessly and frequently to make it do what you want. But that’s exactly what flexibility needs. For now, I am not trying hard to maintain flexibility. Once I have something that works with reasonable customization, I will figure out how to make it much more customizable.

Another problem is accessing remote servers from this machine. Right now, I am using SSH agent forwarding to access remote servers from my development instance without having the SSH key there (and it works). But this doesn’t work if I am using the terminal in cdr. I am still looking for a solution to this problem that doesn’t involve copying my keys over to the development instance. One idea, if forwarding is not possible at all, is to generate new keys for every instance and give you public keys that you can paste into the services you want. This is more secure than copying the private key, but definitely a hurdle to get over.

I am excited by this project and hope to have more updates over time. For now, I hope you find the Ansible playbook useful. Also, I am looking for ideas and suggestions in solving this general problem of remote development. Please let me know via comments or social media what you think.

Apr 25 2021
hw

I thought I was done with the series of posts on object-oriented programming after my last one. Of course, there is a lot we can write about object-oriented programming and Drupal, but that post covers everything noteworthy from the example. There is a way which is old-school but works as it should and another which looks modern but comes with problems. Is there a middle-ground? Tim Plunkett responded on Twitter saying there is.

There's a third way! See https://t.co/cFizL60Loy and https://t.co/so2syXyXZS

— timplunkett (@timplunkett) April 24, 2021

At the end of the last post, I mentioned that the problem is not with object-oriented programming. It is with using something for the sake of using it. If you understand something and use it judiciously, that is more likely to result in a robust solution (also maintainable). Let’s look at the approach mentioned in the tweet in detail.

The fundamental difference

The problem with the version with objects in the last post was not because it used objects. It was because it overrode the entry point of the form callback to change it. Using classes has its advantages, most notably that we can use dependency injection to get our dependencies. For more complex alterations, it is also useful to encapsulate the code in a single class. But the approach with the entry point made the solution unworkable.

On the other hand, the form_alter hook is actually designed for this purpose. Yes, it cannot be used within classes and you have to dump it in a module file along with all the other functions. But it works, and that’s more important. There is no alternative designed for this purpose. So, in a way, the fundamental difference is that this method works whereas the other doesn’t. It doesn’t matter that we can’t use nice things like dependency injection here if it doesn’t even work.

Bringing them together

The two worlds are not so disjoint; PHP handles both brilliantly, after all. If you want to encapsulate your code in objects, the straightforward solution is to write your code in a class and instantiate it from your form_alter hook. Yes, you still have the hook but it is only a couple of lines long at most and all your logic is neatly handled in a class where it is easy to read and test. The class might look something like this.

<?php
 
namespace Drupal\mymodule;
 
use Drupal\Core\Form\FormStateInterface;
 
class MySiteInfoFormAlter {
  public function alterForm(array &$form, FormStateInterface $form_state, $form_id) {
    // Add siteapikey text box to site information group.
    $form['new_element'] = [
      '#type' => 'textfield',
      // ... more attributes
    ];
  }
}

And you can simply call it from your hook like so:

function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  $alter = new \Drupal\mymodule\MySiteInfoFormAlter();
  $alter->alterForm($form, $form_state, $form_id);
}

You could avoid instantiating the object by making the method static, but let’s not go there (you lose all the advantages of using objects if you do that).

Dependency Injection by Drupal

This is already looking better (and you don’t even need route subscribers). But let’s take it a step further to bring in dependency injection.

We can certainly pass in the dependencies we want from our hook when we create the object, but why not let Drupal do all that work? We have the class resolver service in Drupal that helps us create objects with dependencies. The class needs to implement ContainerInjectionInterface but that is a very common pattern in Drupal code. With such a class, you only need to create the object instance using the class resolver service to build it with dependencies.

function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  \Drupal::service('class_resolver')
    ->getInstanceFromDefinition(MySiteInfoFormAlter::class)
    ->alterForm($form, $form_state, $form_id);
}
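
For completeness, here is a minimal sketch (not from the original code) of what the class could look like once it implements ContainerInjectionInterface; the injected logger channel is just an illustrative dependency:

<?php

namespace Drupal\mymodule;

use Drupal\Core\DependencyInjection\ContainerInjectionInterface;
use Drupal\Core\Form\FormStateInterface;
use Psr\Log\LoggerInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

class MySiteInfoFormAlter implements ContainerInjectionInterface {

  protected $logger;

  public function __construct(LoggerInterface $logger) {
    $this->logger = $logger;
  }

  public static function create(ContainerInterface $container) {
    // The class resolver calls this method and hands us the container.
    return new static($container->get('logger.channel.default'));
  }

  public function alterForm(array &$form, FormStateInterface $form_state, $form_id) {
    $this->logger->notice('Altering the site information form.');
    // Add siteapikey text box to site information group.
    $form['new_element'] = [
      '#type' => 'textfield',
      // ... more attributes
    ];
  }

}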

For better examples, look at the links Tim Plunkett mentioned in the tweet: the hook for form_alter and the method.

I hope you found the example useful and a workable middle-ground. Do let me know what you think.

Apr 24 2021
hw

Update: Read the follow-up to this post where I discuss a mixed approach combining both of the approaches here.

I previously wrote about how the object-oriented style of programming can be seen as a solution to all programming problems. There is a saying: if all you have is a hammer, everything looks like a nail. It is not a stretch to say that object-oriented programming is the hammer in this adage. That post was quite abstract and today I want to share a more specific example of what I mean. More specifically, I’ll talk about how using “objects” to alter forms without thinking it through can cause harm.

But first some context: this example comes from my reviews every week of code submitted as part of our interview test. This has happened frequently enough that I think that this is actually a recommendation somewhere. Even if it is the case of people copying each other’s work, it certainly is evidence that this has not been thought out. In fact, this makes for a good interview question: where would this method fail? I am going to answer that in this post.

The traditional method

Let’s look at the traditional method first. Drupal provides hooks to intercept certain actions and events. For example, Drupal might fire hooks in two situations: in response to events like saving a node, or to collect information about something (e.g. hook_help). You will find a lot more examples about the latter and that is what we are going to talk about today.

Drupal fires a few different hooks when a form is built. Specifically, it gives the opportunity to all the enabled modules to alter the form in any way. It does this via a hook_form_alter hook and a specifically named hook_form_FORM_ID_alter. So, for example, to alter a system site information form, either of the functions below would work:

function mymodule_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  if ($form_id == "system_site_information_settings") {
    $form['new_element'] = [ /* attributes */ ];
  }
}
 
// ... OR ...
 
function mymodule_form_system_site_information_settings_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  $form['new_element'] = [ /* attributes */ ];
}

Adding elements or altering the form elements in any way is a simple affair. Just edit the $form array as you want and you can see the changes (with cache clear, of course). This is the old-school method and it still works as of Drupal 9.

The OOPS approach

More often than not, I see the form being altered in a much more involved way. Broadly, this is how it looks:

  1. Create a form using the new object-oriented way but extending from Drupal\system\Form\SiteInformationForm instead of the regular FormBase.
  2. Define an event subscriber that will alter the route using the alterRoutes method.
  3. In the event subscriber, override the form callback to your new form.

This gist contains the entire relevant portion of code.
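
Since the gist itself isn’t reproduced here, the route-altering part of that approach would look roughly like the sketch below (the module, namespace, and form class names are hypothetical):

<?php

namespace Drupal\mymodule\Routing;

use Drupal\Core\Routing\RouteSubscriberBase;
use Drupal\mymodule\Form\MySiteInformationForm;
use Symfony\Component\Routing\RouteCollection;

class RouteSubscriber extends RouteSubscriberBase {

  protected function alterRoutes(RouteCollection $collection) {
    // Point the site information route at our own form controller.
    if ($route = $collection->get('system.site_information_settings')) {
      $route->setDefault('_form', MySiteInformationForm::class);
    }
  }

}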

After doing all this, you might expect that the code should at least work as expected. Many people do. But if you have been paying close attention, you might see the problem. If not, think about what would happen if two modules attempt to alter the same form this way. Only one of them would win out.

If there are two modules altering the same route, the last one to run will win and that module’s form changes will be used. The form controllers from the previous modules will never be executed. You could extend the first module’s form controller in the second module (so that changes from both modules take effect) but that is not reasonable to expect in the real world with varied combinations of modules.

So, we shouldn’t use objects?

I am not saying that. I am saying that we should think about how we are applying any programming paradigm to build a solution and where it might fail. In our example, if Drupal supported an object-oriented version of form alters, that would have been safe to use (there is an open issue about this). In fact, there is discussion about using Symfony forms and there are also some attempts in the contrib space. Until one of those solutions gets implemented, the form_alter hook is the best way to alter forms. And there is a good chance that such hooks get replaced in time. After all, the event-based hooks did get replaced by events in most cases.

For now, and always, use the solution that fits your needs. Using objects or using functional programming doesn’t necessarily make a program better. It is using our skills and our judgement that makes a program better.

Update: Read the follow-up to this post where I discuss a mixed approach combining both of the approaches here.

Apr 23 2021
hw

I have been contributing to Drupal in a few different ways for a few years now. I started off by participating in meetups, and then contributing to Drupal core whenever I found time. Eventually, I was even contributing full-time courtesy of Axelerant, my employer. At the same time, I started participating in events outside my city and eventually in other countries as well. I was speaking at most of the events I attended and mentored at sprints in many of these events. I have written about this in detail before in different posts about my Drupal story and a recent update to that.

It was only with the support of my wonderful family, and also of Axelerant in the early years, that I was able to contribute in this way. As my responsibilities grew, I had to focus on where to contribute. My kids were growing up and I wanted to spend a lot more time with them. At the same time, I started picking up managerial responsibilities at Axelerant and was responsible not just for my work, but for a team. I was quickly approaching burnout and something had to go. It was at this time that I rethought how to sustainably contribute to open source.

Long story, short…

The story is not interesting. Honestly, I barely remember those years myself. I know they were essential for my growth and they came at a significant price. But we know that nothing worth doing is easy. As a mentor to a team and even bordering on a reporting manager, I had the privilege to multiply my efforts through them. I am proud to see how many of them have built their own profiles in the community and continue to do so.

My recommendation to my team and myself is now to stop thinking of contributing as “contribution” and to treat it as part of our work. The word “contribution” implies giving something externally. People are hesitant to take this external action when they are already very busy with bugs, deliveries, and meetings. We all have a limited working area in mind to think about the code and all the complexity. Thinking about something external is very difficult in these circumstances.

Don’t hack core

One reason this feels so external to us is because of how we treat Drupal core and contrib. We drill into newcomers the notion that we should never hack core. While there is a good reason for this, it results in the perception that core (and contrib) cannot be touched. It is seen as something external, and woe befall anyone who dareth touch that code. It is no surprise that many people are intimidated by the thought of contributing to Drupal core.

My workflow

The trick is to not think of it as external. I use the word “upstream” instead of contrib projects when talking about Drupal core or modules. I find that some people think of “upstream” as a little closer to themselves than “community contribution”. Thinking about it this way makes the code more real, not something which is a black box that can’t be touched. We realize that this code was written by another team consisting of people who are humans just like us. We realize that they could have made mistakes just the way we do. And we realize that we can talk to them and fix those mistakes. It is no different than working in a large team.

Yes, this means that we don’t have people who are dedicating time to contribute. That is a worthy goal for another day. I am happy with this small victory of getting people familiar with the issue queues and the contribution process on drupal.org. I have seen these small acts bubble up to create contrib modules for use in client work (where relevant). Eventually, I see them having no resistance with the idea of contribution sprints because the most difficult part of the sprint is now easy: the process. They are familiar with the issue queue and how to work with patches (now merge requests); if it is a sprint, the only new thing they are doing is coding, which is not new to them at all.

I realize that this is not a replacement for the idea of a full-time contributor. We need someone who is dedicated to an initiative or a system. These contributors are needed to support the people who will occasionally come in to fix a bug. But we need to enable everybody to treat the core as just another piece of code across the fence and teach them how to open the gate when necessary.

Apr 22 2021
hw

We at Axelerant have been contributing to Drupal in our own ways for a long time. In fact, I worked as a full-time contributor to Drupal a few months after I joined. This was around the time Drupal 8 was almost done and it is thanks to Axelerant I could contribute what I could at that time. At the same time, there was community focus around incentivizing contributions and there were a few websites (like drupalcores) to track contributions.

Sometime later, in one of our internal hackathons, I built a basic mechanism to track contributions by our team (you can see it at contrib.axelerant.com). It was a weekend hackathon and what we built was very basic, but it set the groundwork for future work. We adopted it internally and reached a stage where we had processes around it (it was even part of our KPIs for a while). In that time, we expanded the functionality to (manually) track non-code contributions. More recently, we added support to track contributions from Github as well.

Tracking all contributions

Since this was a hackathon demonstrating possibilities using code and technology, we started with only tracking code contributions. Soon, we expanded this to track contributions to events and other non-code means. The latter was manual but this helped us build a central place where we could document all our contributions to the open-source world.

Today, we have tracking from drupal.org and Github and basic checks to determine if code was part of the contribution. On the non-code front, you can track contributions to websites like StackOverflow or Drupal Answers. And for events, you can track contributions such as volunteering, speaking, and even attending (yes, I think participating in an event counts as a contribution).

Now, we have a process for anyone joining Axelerant to set up an account on the contrib tracker. After this, all their contributions to Drupal and Github begin to get tracked at that time. We also remind people frequently to add details of any event they have attended.

Improving contrib-tracker

Contrib Tracker is open source but currently treated as an internal project at Axelerant. We initially set it up as a public repository on our Gitlab server. Now we realize that it was not practical for people to access and help build it. Today, I moved Contrib Tracker to its new location on Github: contrib-tracker/backend. For a while, we had the thought of implementing the website as a decoupled service and that’s why the repository is named “backend”. Right now, we may or may not go that route. If you have an opinion, please do let me know in the comments.

Moving to Github

The move to Github is still in progress. In fact, it just got started. The project is, right now, bespoke to how we host it and one of the lower priority items would be to decouple that. Once that is done, it should be possible for anyone to host their own instance of contrib tracker. But the more important task now is to move the CI and other assets to Github. I will be creating an issue to track this work. Our team at Axelerant has already started moving the CI definitions.

That’s it for today. Do check out the source code at contrib-tracker/backend and let us know there how we can improve it, or better yet, submit a PR. :)

Apr 21 2021
hw

Drupal 8 was a major revolution for the Drupal community in many ways, not least of which was the complete rearchitecting of the codebase. We picked up the modernization of various frameworks and tools in the PHP community and applied it to Drupal. Of course, this makes complete sense because Drupal is written in PHP. One of the things we picked up was PHP’s shift to the object-oriented programming approach.

The shift to using objects enabled us to collaborate with the rest of the PHP community like never before. On the language front, we had PHP-FIG working on standards such as PSR-4 which was adopted by many libraries. And on the tooling front, we had composer which allows us to use other packages with minimal effort. These developments made it possible for us to build Drupal on top of the work from the larger PHP community. However, this meant that we had to follow the same style of programming as those libraries and thus began a massive re-architecture effort within Drupal.

The refactoring story

Drupal is a product built and maintained by a diverse group of geographically distributed and mostly unpaid people. Further, Drupal’s value is not in just the core system, but also the tens of thousands of modules, themes, and distributions maintained by a similar group of people. All the modules, themes, and distributions depend on the core Drupal’s API which means that making any change to the core is very risky.

Because of this, it is simply not possible to isolate Drupal and refactor it. We had to incrementally refactor parts of Drupal while making sure there was sufficient documentation for people to upgrade their modules. This inadvertently, but predictably, led to significant overengineering in Drupal’s codebase. There are a lot of parts of Drupal, several layers down, which may seem messy, but there is a reason for them. The problem is that those reasons are very hard to surface. More importantly, they become a hurdle for core contributors trying to simplify the system.

I have to caveat this by pointing out that I am not an expert core developer myself. Not even close. The above is just my observation in working with the core. You may say that this is practically an outsider’s perspective and you are right. But I have learnt that an outsider’s perspective is often the most valuable one.

Objects, objects everywhere

The new structure has brought in dozens of new practices (such as the DIC). This often means that you need to augment the structure to support them. Soon, we have classes with multiple adjectives and nouns everywhere, reminiscent of Java. This makes the code even more opaque, putting in more hurdles for people who are new to Drupal. Even experienced PHP developers may have trouble navigating the sometimes unintuitive names of various components because of the historical context.

With people managing to work in this new reality, I have found that their answer to all problems is simply more objects. Copy-paste solutions from various places (aka StackOverflow) and make it work. At work, I have seen more and more code samples from people who are convinced they are presenting a superior solution simply because they are using classes with names like subscriber and controller, while not realising they are breaking functionality that would not have broken in Drupal 7.

This may seem vague at this point and I hope to present more specific examples in future posts. I will leave you with these abstract thoughts for today. I hope you found this interesting even if not useful.

Apr 20 2021
hw

Now that we have discussed some of the concerns with career growth, let’s talk about beginning a career in Drupal. I am not aiming for comprehensiveness but I hope to give a broad overview of what a Drupal developer role looks like in the real world. Needless to say, every company or agency is different. You might have a very different set of responsibilities depending on your organization. One of those responsibilities would be to build and develop Drupal websites and we are going to talk about that here.

Working in an organization essentially means working with other people. That is how you multiply your impact. All roles involve dealing with people to some degree. Initially, you might only have to worry about collaborating with people, but as you grow, the scope increases. Eventually, you will have to mentor and maybe even manage people. All along this journey, you will also increase your knowledge and area of impact technically.

Early roles

As someone who has very little experience with programming, you might start off with Drupal by building sites using the UI and simple CLI commands (Drush or Drupal Console). Your challenge would be to understand content types, taxonomy terms, users, roles, permissions, etc. You would need to understand how to design a content type that is actually useful for what you want to represent and add fields to it. Moreover, you might know enough PHP to find snippets to alter forms or pages and implement those as per the requirements.

At this stage, you might not be expected to solve complex problems or scale websites with heavy traffic. You might be working under the supervision (and mentorship) of someone with more experience with Drupal. It would be expected of you to collaborate with other people on the team such as front-end and QA engineers if any and complete your tasks the best you can.

You would need a strong set of fundamentals here such as git, basic PHP programming skills, and a working knowledge of HTML, CSS, and JavaScript. You would need more of an awareness of your software stack but wouldn’t have to go deep.

Intermediate roles

With around 2-5 years of experience, you will begin to pick up a more complex set of tasks. You would write more complex modules with proper object-oriented code. The site-building requirements also would be more complex. You would have to implement content types with complex relationships with other entities. The code required to tie these things together would also get more complex. You will also have to worry more about how to solve a requirement in the best way rather than just implement something.

At this stage, you would be expected to participate in code reviews and share your feedback. In fact, some organizations (like mine) encourage all levels of engineers to participate in code reviews, so this might not be new. Take this opportunity to also seek mentorship from other folks to go deeper into Drupal and other technologies. While you only had an awareness of your stack earlier, you should now have a working knowledge of the various elements in the stack. You would also be involved in discussions with other teams to build the integrations your system needs. Essentially, you should understand how the pieces fit together and how they participate in making the system work.

You should also begin to question requirements and try to understand the business problem. Your value at this level is not just your technical skills but also solving business problems. In other words, it’s not just about solving the problem right but solving the right problem.

Advanced roles

After about 5-10 years of experience, you will now be responsible for the entirety of a system. You may begin with simpler systems where Drupal plays a major role and then move on to more complex systems where Drupal is just one piece of the puzzle. At this stage, more people will turn to you to solve their problems and you will have to figure out how to solve those problems yourself, guide them, or delegate it to someone else.

You should also be much more familiar with the entire Drupal ecosystem and how it can fit in with other parts of your stack. You are close to approaching what could be called the “architect” role (but we won’t go there today). As an IC, you will have a lot more impact but you would also play a significant glue role in your team (or teams).

This is a vast topic and I might come back to this in a future post (or multiple posts). Like I said before, this is not meant to be a comprehensive guide, but just a broad overview.

Apr 19 2021
hw

As the Director of Drupal services at Axelerant, one of the things I often worry about is the growth of each of the members of my team. We are a Drupal agency, so there are a lot of options for people to choose from. Yet, there are different concerns with working on Drupal for a long time. Today, I’ll talk about some of those concerns and what are the mitigating factors. I can’t claim to present a perfect solution, even less in a post written in less than an hour. But I hope to at least get the topic started.

Concern: Drupal is not cool anymore

When Drupal celebrated its 18th birthday, there were jokes around how Drupal is now old enough to drive (or something like that). This year, Drupal is 20. You might be surprised but a software’s life-cycle is not the same as a person’s life-cycle :). So, while 20 might be the age where a person is cool (I’m showing my age, aren’t I?), that’s not the same for software.

Yes, Drupal is now in the boring technology club. It has been so for years, with a definitive transition around the Drupal 8 period. This is not a bad thing. It’s only a bad thing if you want your software stack to be cool, which usually means scrambling against deadlines to fix issues with anything that is just a little out of the ordinary. Drupal is mature, predictable, and boring. And that means you should not worry about Drupal going out of style anytime soon.

Fact: Drupal does valuable things; cool, but also valuable

However, the people who have this concern don’t worry about the software living on. They are missing the fun that comes with working with cutting-edge technology. They’re right. Fun is important, and software development is hard enough without it not being fun. The fact is that Drupal can still be used in cool new ways. For example, Decoupled Drupal is one of the newest trends in the industry to offset Drupal’s limitations in the front-end area. Another subtle way Drupal is fun is how Drupal 8 opened it up to the wider PHP ecosystem with the “get off the island” movement. You don’t have to be limited by Drupal’s API and libraries to do what you want. The entire PHP ecosystem of packages is available for direct use within Drupal.

Further, I encourage people not to code just for the sake of coding. Solve a problem: any problem, even if it is a hypothetical issue or an imaginary challenge. Solve it and put it out there to help other people. You make the act of coding about people and, in turn, you help yourself. But I am digressing. Drupal solves the problem of quickly building a web presence for activities that help people. That tweet comes to my mind again here. Even if Drupal’s software stack is not cool, what it does is cool, and that’s a big deal for me.

Concern: There is not much to do in Drupal

As developers grow and learn more, they find that they are learning less and less in their day-to-day work. This is closely tied to the first concern I talked about (Drupal is not cool). People may not put it as “Drupal is not cool”, but the underlying thought is the same: all tasks are of the same kind and I am not learning anything anymore. They’re right as well, but there are ways around it.

I previously wrote that most requirements of building a website with Drupal can be achieved just by configuring it in the UI. You only have to write some glue code to get it all working. Opportunities to develop entirely new functionality in Drupal are now rare unless you are working on the Drupal product. This is when people start looking into other frameworks and languages. Again, I highly encourage that because the learning benefits both the person and the project, but that’s beside the point of this post.

Fact: There is a lot to do around Drupal

The problem is in finding opportunities around Drupal. I have already hinted at one easy way out of this: work on the Drupal product. That is to say, contribute to the Drupal core and contrib space. These are the people solving the hard problems so that we don’t have to. If you want to solve hard problems, we need you there. I am sure your organization will support you if you are interested in contributing to Drupal.

Next, look at the integrations where you could help. Very few Drupal sites today work in isolation. Either you have a completely decoupled site, or at least integrations with other complex systems, and these are still challenging areas to work in.

You will notice that the solutions here largely depend on the organization where you work. Yes, you would need the support of your organization, and there is a good chance you will get that support if you ask. If you don’t, you can tackle this yourself, but that is the subject of another post.

Concern: No one respects Drupal or PHP

PHP (and Drupal) is often the subject of cruel jokes ranging from competence to security issues to just being old. Chances are that you have come across enough of them and so I won’t write more about this.

Fact: PHP and Drupal communities are some of the most respected ones

To be blunt, don’t listen to these people. In my opinion, they are living 20 years in the past or are influenced by people who do. PHP has made a comeback with a robust feature set and a predictable release cycle. So has Drupal. Moreover, Drupal has rearchitected itself to be more compatible with the PHP ecosystem. Finally, code is just about getting the job done, and PHP commands a significant share of the web applications market, so you’re in good company.

And it’s not just about code. PHP and Drupal communities are some of the most welcoming communities in the world. They are often talked about in various places (just search for podcasts on the topic) and people cite examples of how they found welcome and support in the communities.

I am going to leave this post here even though there is a lot more I can say. But these are my thoughts for today and I hope you enjoyed them. I hope to come back to this in another post some day.

Apr 18 2021
hw
Apr 18

As of this writing, there are 47,008 modules available on Drupal.org. Even if you filter for Drupal 8 or Drupal 9, there is still an impressive number of modules available (approximately 10,000 and 5,000 respectively). Chances are that you would find just the module you are looking for to build what you want. In fact, chances are that you will find more than one module to do what you want. How do you decide which module to pick or if even a particular module is a good candidate?

Like all things in life, the answer is “it depends”. There are, however, a few checks you can run to make a better decision. Well, these are the checks I make when trying to pick a module, and I hope they can help you too. This is not something you should apply strictly; just use it as a guideline to help you decide. Also, if a module fails one of these checks, it doesn’t mean that the module is a bad choice. It only means that you might be making a trade-off. Software is all about making trade-offs, so this is nothing new.

Lastly, I’ll focus on modules in this post, but most of this advice applies to themes as well.

First-look checks

These are the most basic and easiest checks you can make on a module. Except for the first one, these are not strong indicators but they can quickly give you an initial impression of the module. There might be excellent modules that are perfectly suited to your needs but fail these checks, which is why you should only use these checks to differentiate between two modules. Yeah, the irony of the name is not lost on me.

Version

Is the module even available for your Drupal version? If you’re using Drupal 9, you need a version that supports Drupal 9. Earlier, just reading the module version would tell you if it was compatible. For example, if the module version is “8.x-3.5”, you know that the module is for Drupal 8, not Drupal 7. You might assume the module is not for Drupal 9 either, but that’s not so simple anymore.

Screenshot of version information on Drupal.org

As you can see in the above screenshot from the Chaos Tools module, the 8.x-3.5 version is for both Drupal 8 and Drupal 9. This changed after Drupal.org began supporting semantic versioning for contrib projects. These example release tags show different styles of version numbers you might see and this change record explains how a module may specify different core version requirements.

Module page

Is the module page updated? Does the description make sense considering the Drupal version you are targeting? If you’re looking for the Drupal 9 version of the module and the text hardly talks about that, maybe the module is not completely ready for Drupal 9 or has features missing.

Screenshot of project page on Drupal.org

Another clue you may have is the timestamp when the page was last updated. It may just be that the maintainer has forgotten to update the page even if the module works perfectly fine. For this, we go to the next set of checks.

Recent commits

Has the module been updated recently? If it was last updated years ago, chances are it won’t work with the latest version of Drupal core. Even if it works with your version of Drupal core, would you be able to keep using it once you upgrade to the latest version of Drupal core?

Screenshot of commits block on a project page

The block on the module page gives a quick summary of the recent commits, but don’t rely entirely on it. The block only shows activity from the current committers. If there are newer commits by a previous committer, or commits made elsewhere and pushed here (which would happen if someone were maintaining the module on Github, for example), they won’t show up. To be sure, click on the “View commits” link to see all the commits.

Issues

Is there a discussion going on about improving the module? The issues block on the project page gives a quick summary of what’s happening on the module. Be careful though, some module maintainers choose to maintain the module on Github or elsewhere. In those cases, the issues here would show no activity (or it may be entirely disabled).

Issues block on project page

Resources

Are there other resources for the module? Is there external documentation, or documentation within the drupal.org guides? The Documentation and Resources block on the project page will point you to these and other useful links.

Code structure check

These checks take a little more time than just quickly scanning the project page. These indicators are a little more reliable but still not the sole determinants of success. For all of these checks, first go to the code repository by following the “Browse code repository” link from the project page, and then select the branch for the version you want.

README file

In a previous section, I mentioned that the project page may have an outdated or missing description. This happens over time when there are multiple versions and the maintainers find it difficult to keep the page updated. However, chances are that the maintainers do keep the README file in the repository updated. Go to the code repository (see screenshots above) and find the README file. The actual file may be named either README.txt or README.md.

If this file is maintained, there is a good chance the module is well maintained and documented and you would have fewer problems using it.

Code structure

You would need experience with Drupal development to make this check. Look at the module code and see how it’s structured. Are the classes neatly separated from the rest of the code? Does the code maintain a separation of concerns? Are there tests? Is everything dumped in the “.module” file or a bunch of “.inc” files (ignore this if you are checking Drupal 7 modules)? If the module stores something in config, is there a config schema?

There are metrics we could gather to understand this better, but not from the code repository browser alone. An experienced developer, however, can tell by looking at how well the module follows Drupal conventions. This is important because following these conventions makes it easier to keep the module updated for future versions of Drupal. You don’t want to start with a module and be stuck on an older version of Drupal because of it.

Test

Here is where we actually test the module and see if it works. These are the strongest indicators but also the most time-consuming to check.

Simplytest.me

The easiest way to check a module is simplytest.me. This excellent community-run service lets you test contrib projects along with patches. Type the name of the module and click “Launch Sandbox”.

SimplyTest screenshot

There is also an “Advanced options” section that lets you add more projects (if you want to test this module along with another module) and patches. Select the module or combination of modules you want, click “Launch Sandbox”, and you have a brand new Drupal installation to play with for 24 hours. This actual testing will help you determine if the module fits your needs.

Test locally

If you don’t want to test on simplytest.me or you simply want to evaluate the code locally, use one of the quick-start Drupal tools to get a Drupal installation with the module. For example, with axl-template, I can type this command to get a Drupal site with the Smart Date module and a few others.

$ init-drupal drupal/test -m smart_date -m admin_toolbar -m gin

You can also use the same codebase to evaluate the code more closely, maybe even run some code quality checks on it if you’re interested.

I hope this quick post was useful to you. Like I said before, don’t treat this as a complete list but as a guide. In the end, your own experience and actual tests will tell you more. The above guidelines are only here to save you time getting there.

Apr 17 2021
hw
Apr 17

I picked up this topic from my ideas list for this #DrupalFest series of posts. I didn’t think I would want to write about this because I don’t think about features that way. One of the strengths of Drupal is its modular architecture and I can put in any feature I want from the contrib space. In fact, I prefer that, but that’s a different topic. I am going to write this very short post anyway because I am now thinking about this from a different point of view.

To write this post, I thought about the various things that bother me in my day-to-day work. I thought of how we could have simpler dependency injection, fewer (or clearer) hooks, or less boilerplate, but then I realized that none of those matter to me very much right now. Those can be improved, but they are inconveniences and I can get over them.

Easier testability is what I want from future versions of Drupal. I have realized that ease of testability is the strongest indicator of code quality. A system that is easily testable is implicitly modular along the right boundaries. It has to be; otherwise, it is not easy to test. A good test suite comprises unit tests, integration tests, and feature tests. If it is easy to write unit tests, then the system is composed of components that can be easily reused (à la SOLID principles). Integration and feature tests are easier to write at first, but eventually they get harder unless the system is built well.
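To make that concrete, here is a minimal sketch of the kind of unit test I have in mind. The SlugGenerator class and the my_module namespace are hypothetical; the point is that a small, self-contained class needs no database or container to be tested.

<?php

namespace Drupal\Tests\my_module\Unit;

use Drupal\my_module\SlugGenerator;
use Drupal\Tests\UnitTestCase;

/**
 * Tests a small, self-contained helper class (hypothetical example).
 */
class SlugGeneratorTest extends UnitTestCase {

  public function testGenerate() {
    // No container, no database: the class under test is a plain PHP object.
    $generator = new SlugGenerator();
    $this->assertEquals('hello-world', $generator->generate('Hello World!'));
  }

}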

Drupal core testability

Now, Drupal is one of the most well-tested pieces of code out there. Each patch and every commit is accepted only after it passes tens of thousands of tests. Testing is also a formal gate in adding new features (and some bugfixes). That is to say, any new feature needs to have tests before it can be merged into Drupal core. Consequently, testing is common and core contributors are skilled in writing these tests.

This testability is not easily carried forward to most kinds of websites we build using Drupal. Building a typical Drupal website is mostly a site-building job of cross-connecting various elements. This happens either by configuring the site via the UI or by using hooks, which are essentially magically named functions that react to certain events. With Drupal 8, the hooks-based style of writing custom modules changed significantly, with modern replacements for many hooks, but the principle remains the same. This style of code, where you have multiple functions reacting to events and altering small pieces of data, is very difficult to unit test. This means that the only useful tests possible are feature tests (or end-to-end tests). Unfortunately, these tests are expensive to run and failures do not always point to an isolated unit.
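One way to mitigate this, sketched below with hypothetical module and service names, is to keep the hook itself as thin as possible and delegate the actual logic to a plain service class. The hook remains hard to test, but the logic inside the service can be unit tested on its own.

<?php
// my_module.module (hypothetical): the hook stays thin and only hands the
// entity to a service.

/**
 * Implements hook_entity_presave().
 */
function my_module_entity_presave(\Drupal\Core\Entity\EntityInterface $entity) {
  \Drupal::service('my_module.title_cleaner')->clean($entity);
}

<?php
// src/TitleCleaner.php (hypothetical): the actual logic lives in a plain
// class that a unit test can instantiate directly, without bootstrapping Drupal.

namespace Drupal\my_module;

use Drupal\Core\Entity\EntityInterface;
use Drupal\node\NodeInterface;

class TitleCleaner {

  public function clean(EntityInterface $entity): void {
    // Example logic: normalize node titles before they are saved.
    if ($entity instanceof NodeInterface) {
      $entity->setTitle(trim($entity->getTitle()));
    }
  }

}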

Testing tools

There have been several attempts to make writing tests easier generally and for Drupal. The package that stands out to me for this is drupal-test-traits. This package provides traits and base classes that make it easy to test sites with content. You would set the configuration to point to an installed site and run the tests. The traits provide methods to create common entities and work with them and the tear-down handlers will clean those entities up at the end. All you have to worry about is providing an installed site.
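As a rough sketch of what such a test looks like (the test class, project namespace, and the assumption of an “article” content type are mine, modelled on the package’s documented usage):

<?php

namespace Drupal\Tests\my_project\ExistingSite;

use weitzman\DrupalTestTraits\ExistingSiteBase;

/**
 * Runs against an already installed site; no site install per test run.
 */
class ArticlePageTest extends ExistingSiteBase {

  public function testArticleIsVisible() {
    // createNode() comes from the traits; the entity is cleaned up
    // automatically when the test ends.
    $node = $this->createNode([
      'type' => 'article',
      'title' => 'A test article',
    ]);
    $node->setPublished()->save();

    $this->drupalGet($node->toUrl());
    $this->assertSession()->statusCodeEquals(200);
    $this->assertSession()->pageTextContains('A test article');
  }

}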

In fact, the common problem with testing Drupal is getting an installed site to test against. Most Drupal sites I have worked with are not easy to install from scratch, despite best efforts. Over time, you need to rely on the database to get a working copy of the site. This is true even if you follow the perfect configuration workflow. It is very common to have certain content required to run a site (common block content, terms, etc.), and configuration workflows cannot restore such content. There are other solutions to these problems but they are not worth the pain to maintain. We’ll not go into that here.

Handling the database and assets

Sharing databases turns out to be the simplest way to create copies of a site, which shifts the problem to making such databases available to the testing environment (e.g., CI jobs). If you are running a visual regression test, you would also need other assets such as images referenced in blocks and so on. At least for the database, we have prepared a solution involving Docker, implemented as a Composer plugin. Read more about it at axelerant/db-docker. This plugin makes it easy for the team to manage database changes as a Docker image which can be used by the CI job (or another team member).

Handling assets is a more complicated problem. We traditionally solved this with stage_file_proxy and I am planning to work on this more to build a seamless workflow like we did for databases. Any ideas and suggestions are very welcome. :)

The testing workflow

Drupal has long been said to be a solution to build “ambitious digital experiences”. Such systems are built by teams and any team needs a workflow on which everyone is aligned (otherwise it wouldn’t be a team). We have seen many improvements in various aspects of Drupal over time that cater to a team workflow (configuration management comes to mind clearly). In my humble opinion, I feel standard workflows for improving testing should be a priority now.

I will leave my post here as it is already much longer than I intended. I know I didn’t present any comprehensive solutions, and that’s because I don’t have one. My interest was in sharing the problem here. There are only pieces of the solution here and I am interested in finding the glue that brings them together.

Apr 16 2021
hw
Apr 16

Drupal is a CMS. One might even say that Drupal is a good CMS, and they would be right about that, in my not-so-humble opinion. At its core, Drupal is able to define content really well. Sure, it needs to do better at making the content editor’s experience pleasant, among other things. But defining content structures that are malleable to multiple surfaces has always been one of Drupal’s strengths. This makes Drupal an excellent choice for building a Digital Experience Platform (DXP).

The concept of a DXP has been popular for a few years but it has peaked, not surprisingly, in the last year with organizations now forced to prioritize their digital presence above all else. Companies have now realized that, with the amount of information being thrown at each of us, they have to make sure their presence is felt through all media. Building a coherent content framework that can be used for all of this media is no easy task. Information architects need powerful tools to flexibly define how they would store their content. Drupal has been able to provide such tools for a long time.

Digital Experience is a strategy

You might have realized by now that DXP is not a product, but a collection of tools that can help you execute your strategy. With the proliferation of media, it is important that you convey a consistent message regardless of how someone consumes it. To be able to do that, you have to identify the various ways you would talk to your customers and build a strategy. This is highly subjective and customized for your needs, and I won’t go into much depth here but the output we want is a coherent content architecture. An architecture which can represent your messaging to your customers.

Once you’re able to formulate this strategy, that is when you begin implementing it within your DXP. The content architecture you have identified goes in the CMS within the DXP and it needs to be flexible enough. Drupal is a very capable CMS for such requirements. It supports complex content models with relationships among pieces of content, rich (semantic) fields, and multilingual capabilities. You can also build advanced workflows for content moderation. This enables Drupal to be the single source of truth for all content in an enterprise. This, too, should be important in your content strategy and Drupal makes it easy to implement.

Discovery

A lot of this may sound like theory and not enough practice. In a way, that’s true. Think of this as a discovery stage for the problem you’re solving. It’s important you spend enough time here so that you identify the problem clearly. Solving the wrong problem may not be very expensive from a technical point of view, but it is frustrating to your content team. Involve various stakeholders within the DXP to determine if the content model you are building will break their systems. For example, your long text field may be good for web and email, but is unusable for a text message. But if you break up your content into a lot of granular pieces, you have to figure out how to piece it together to build a landing page.

You also have to determine how your content can be served to more diverse channels (e.g., voice assistants or appliances). Depending on your domain, you have to make trade-offs and build a model that is workable for a variety of consumers. But that’s only one side of the story. You also have to make sure that the content is easily discoverable (both internally and externally), easily modifiable, auditable (revisions), trackable (workflows), and reliably stored (security and integrity of the data). Typically, there is an ecosystem of tools to help you achieve this.

Integrations

Drupal already handles some of these things, and it can integrate well with other systems in your infrastructure. Drupal 8 began a decoupling movement which turned into a hype and is now being rationalized. I wrote about it in a separate post. To be clear, decoupling was always possible, but Drupal 8 introduced web services in core, which accelerated the pace. Today, you only need to enable the JSON:API module to make all your content immediately discoverable and consumable by a variety of consumers.

Apart from being the content server, Drupal also handles being the consumer very well. As of Drupal 8, developers can easily use any PHP package, library, or SDK to communicate with different systems. Again, this was possible before, but Drupal 8 made it very easy by adopting modern PHP programming practices. Even if a library or SDK is not available, most systems expose some sort of API. From Drupal’s point of view, use the built-in Guzzle HTTP client (or another HTTP client of your choice) and invoke the API.
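As a minimal sketch (the endpoint URL and logger channel here are made up), calling an external API from Drupal can be as short as this:

<?php

// In real code you would inject the 'http_client' service; the static
// wrapper is used here only to keep the sketch brief.
$client = \Drupal::httpClient();

try {
  $response = $client->get('https://api.example.com/v1/products', [
    'headers' => ['Accept' => 'application/json'],
    'timeout' => 10,
  ]);
  $products = json_decode((string) $response->getBody(), TRUE);
}
catch (\GuzzleHttp\Exception\GuzzleException $e) {
  \Drupal::logger('my_module')->error('API call failed: @message', ['@message' => $e->getMessage()]);
  $products = [];
}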

Where does Drupal fit in

Drupal is now a very suitable choice for beginning to build your DXP. However, that is not the complete story. All systems evolve, requirements change, and strategies shift. It should be easy for Drupal to shift along with them.

For example, Drupal’s current editing experience was excellent when it came out, but that was several years ago. Today, building an intuitive editorial experience with Drupal is the most pressing challenge we face. There are a lot of improvements in this space and there will be more with newer versions of Drupal. It helps that the community has settled into a reliable release schedule, and that has built users’ trust in Drupal. Because of the regular release schedule and focused development, we see editorial features such as the layout builder and a modern theme as part of Drupal.

It may seem that this is not important, or at least not as important as “strategy” and “infrastructure”. That’s a dangerous notion to have. Ultimately, your system will only be as effective as your team makes it. An unintuitive UI makes for mistakes and a frustrating experience. And if it is hard to maintain the content, it will not be maintained anymore. If there is anything more dangerous than missing content, it is outdated content.

Customization

Apart from the editorial experience, a flexible system is important for an effective DXP. If the content store cannot keep up with the changes required for new consumers, or even existing ones, it will become a bottleneck. In organizations, such problems are solved by hacking on another system within the DXP or running a parallel system. Both of these approaches mark the beginning of the demise of your DXP.

That is why it is important for you to be able to easily customize Drupal. Yes, I’m talking about low-code solutions. Drupal has figured out how to modify the content structure with minimal developer involvement, if any. It needs to make this easier for other functionality as well. Various features of Drupal should be able to interact with each other more flexibly and intuitively. For example, it is possible to place a view within a layout, but it is not intuitive to do so. We have to identify such common problems and build solutions for site-builders to use. Again, I am not going to go into depth on this here.

Building a digital experience platform for your organization is a massive undertaking and I cannot hope to do justice to all the nuances within a single blog post written over a couple of hours. But I hope that this post gave some insights into why Drupal is relevant to this space and how it fits into the picture.

Apr 14 2021
hw
Apr 14

It’s spring and I decided to come out to a park to work and write today’s post. I sat on a bench and logged in to my WordPress site to start writing the post when I noticed that one of the plugins had updates available. I didn’t have to think about this and straightaway hit the update button. Less than 30 seconds later, the plugin was updated, the red bubble had disappeared, and I had my idea of today’s post. That is why I want to talk about automatic updates on Drupal today.

Now, automatic updates have been in WordPress for quite some time. In fact, I don’t remember when I last updated a WordPress site manually, even though I have been running this site for years. The closest thing we have in Drupal is the “Install new module” functionality. As the name says, it can only be used for installing a new module, not for updating existing ones. There is an initiative for Drupal 10 to bring automatic updates to Drupal core (only security releases and patch upgrades for now).

No automatic updates in 2021?!

It may seem strange that we don’t have automatic updates in a product aspiring to be consumer-grade in this age. The reason is that this problem is hard to get right and notoriously dangerous if it goes wrong. Also, Drupal is used on some of the largest and most sensitive websites in the industry. The environments where these websites are hosted do not support any regular means of applying updates. Finally, Drupal tends to be used by medium to large teams who use automation in their development workflow. The teams would rightly prefer the automation in their toolchain to keep their software updated.

For example, at Axelerant we use Renovate for setting up our automatic updates for all dependencies (even npm and more). In fact, our quick-start tool comes with a configuration file for Renovate. With our CI and automation culture, we would not like Drupal to update itself without going through our tests.

This does not apply to most users of Drupal, and having some support for automatic updates is important for them. But considering the challenges in getting it right and the lack of incentives for heavy users of Drupal, this feature was not prioritized. It’s about time that there is community focus on it now.

What’s coming in automatic updates?

The current scope of the automatic updates initiative is listed on its landing page. It is limited to patch releases and security updates for Drupal core only; contrib modules are not covered yet. While this is far from making Drupal site maintenance hands-off, it is a step in the right direction. Of course, once this works well for Drupal core, it would be easy to bring it to contrib projects as well. More importantly, it would be safer, as any problems would be easier to find with a smaller impact surface area.

In my mind, these are the things that an automatic updates system will have to consider or deal with. It’s important to note that I am not involved in the effort. I am sure the contributors to this initiative have already considered and planned for these issues, and maybe a lot more. My intention is to portray the complexity of getting this right.

Security

This is one of the most critical factors in an automatic update system. The Internet is a scary place and you can’t trust anyone. So, how can a website trust a file that it downloads from the Internet and then replace itself with its contents? This is the less tricky part, and it can be solved reasonably well with a combination of certificates and file checksums.
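Purely as an illustration of the checksum half of that (this is not how the initiative actually implements it; that work builds on the framework mentioned later in this post), verifying a downloaded archive could look like this:

<?php

// Illustration only: compare the archive's actual SHA-256 hash against a
// checksum published over a trusted channel. The file path and checksum
// value are placeholders.
$archive = '/tmp/drupal-update.tar.gz';
$published_sha256 = 'replace-with-the-published-checksum';

$actual_sha256 = hash_file('sha256', $archive);
if (!hash_equals($published_sha256, $actual_sha256)) {
  throw new \RuntimeException('Checksum mismatch: refusing to apply the update.');
}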

The website software runs as a user on the server, ideally one with just enough privileges. In most cases, such users don’t have permission to write to the site’s own directory. This is needed because, if somebody were able to perform a remote code exploit, they could otherwise write to the location where the software is installed and plant backdoors. But in such a configuration, the website cannot write to its own location either. I don’t think such setups are in the scope of automatic updates anyway, as such teams would have their own automation toolchain for keeping software updated.

Developer workflows

As I mentioned before, Drupal websites are typically built by large teams who have automation set up. Such teams typically have their own conventions for how upgrades are committed and tested. This is why tools such as Renovate can get so complicated. There are a lot of conventions for an automatic update system to deal with and, as far as I know, most just ignore them. For example, I don’t know if WordPress really cares about the git repository at all.

Integrity

Once the updates are downloaded, they still have to be extracted and placed in their correct locations. What if, due to a permission error, a certain file cannot be overwritten? Such issues can leave the website in an entirely unusable state. The automatic updates code should be able to check for such issues before beginning the operation. There’s still a chance that an error would occur anyway, and the code should be able to recover from that. I’m sure there are a lot more scenarios here than what I am imagining.
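A pre-flight check of the kind described above might look something like the sketch below; the list of files is hypothetical, and a real implementation would derive it from the update package itself.

<?php

// Before touching anything, verify that every file the update would replace
// is actually writable, so we never start an update we cannot finish.
$files_to_replace = [
  'core/lib/Drupal.php',
  'core/includes/bootstrap.inc',
];

$blocked = [];
foreach ($files_to_replace as $relative_path) {
  $target = DRUPAL_ROOT . '/' . $relative_path;
  if (file_exists($target) && !is_writable($target)) {
    $blocked[] = $relative_path;
  }
}

if ($blocked) {
  throw new \RuntimeException('Not starting the update; files not writable: ' . implode(', ', $blocked));
}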

The initiative

This is a wider problem than just Drupal, and we don’t have to come up with all the answers. There is an ongoing effort to address these problems generally in a framework called The Update Framework. More of this is being discussed around the contrib module automatic_updates, and the plan is to bring it into core when ready. Follow the module’s issue queue or the #autoupdates channel on Drupal Slack.

Apr 13 2021
hw
Apr 13

I have been setting up computers and configuring web servers for a long time now. I started my computing journey by building computers and setting up operating systems for others. Soon, I started configuring servers, first using shared hosting and then dedicated servers. As virtualization became mainstream, I started configuring cloud instances to run websites. At a certain point, when I was maintaining several projects (some seasonal), it became harder to remember exactly how I had configured a particular server when I needed to upgrade it or set it up again. That is why I have been interested in Infrastructure as Code (IaC) for a long time.

You might say that it is easier to do this by just documenting the server details and all the configuration for each project. Sure, it is great if you can manage to keep the documentation updated as the software evolves and requirements change. Realistically, that doesn’t happen. Instead, if you start with the perspective that you are going to only configure servers with code, never manually, you are forced to code all the changes you want to make.

Infrastructure as Code

So, what does IaC look like? There are several tools out there and each has its own conventions. Broadly speaking, there are two types of code you would write for IaC: declarative or imperative. If you are a programmer, you are already familiar with the imperative style of programming. This is essentially the style of almost all programming languages out there. In these languages, you would write line-by-line instructions to tell the computer exactly what to do and how to do it. Consider this shell script used for creating an instance on DigitalOcean.

#!/usr/bin/env bash
 
read -p "Are you sure you want to create a new droplet? " -n 1 -r
if [[ $REPLY =~ ^[Yy]$ ]]
then
    doctl compute droplet create --image ubuntu-20-04-x64 --size s-1vcpu-1gb --region tor1 ps5-stock-checker --context personal
    echo "Waiting to create..."
    sleep 5
    doctl compute droplet list --context personal
fi

Here, we are running a sequential set of instructions to create a droplet and verify that it got created. We are also confirming this with the user before actually creating the droplet. This is a very simple example but you could expand it to create whatever style of infrastructure you need, albeit not easily.

The declarative style of programming

Most IaC tools support some form of declarative syntax which lets you define what infrastructure you need rather than how to create it. In Terraform, for example, the above infrastructure would look like this.

resource "digitalocean_droplet" "web" {
  image  = "ubuntu-20-04-x64"
  name   = "ps5-stock-checker"
  region = "tor1"
  size   = "s-1vcpu-1gb"
}

As you can see, this example is easier to read. Moreover, you’ll find that this becomes easier to reason about as the infrastructure gets complex. My personal preference is to use Terraform, but whatever you use would have a similar structure. It is the tool’s job to figure out exactly how to implement this infrastructure. It can create the infrastructure from scratch, of course, but it can also track changes and make only those changes required to bring the infrastructure in line with your definition.

Where is the simple in this?

You might think this is overkill and I can understand that sentiment. After all, I thought the same, but I have found it useful for projects both large and small. In fact, I find it more useful for simpler and lower-budget projects than for those with much larger budgets. At least as far as Drupal is concerned, projects with larger budgets use one of the PaaS providers. There are several providers such as Acquia, Pantheon, platform.sh, and others that do a great job at Drupal-specific hosting. They are not extremely expensive either, but of course, they can’t be as cheap as IaaS companies such as AWS or DigitalOcean.

So, it may not be simple but we can get there. On the projects that I am going to self-host, I add in a directory called “infra” with Terraform modules and an Ansible playbook. To make it findable, I have put it up on Github at hussainweb/lamp-ansible-terraform. There’s no documentation, unfortunately, but I hope to put up something soon. Meanwhile, this blog can be an informal introduction to the repository.

My workflow

When I want to start a new project that I know won’t be on one of the PaaS providers, I copy the repository above into my project and start editing the config files I need. Broadly, the repository contains Terraform modules to provision a server (or two) to run Drupal and the actual configuration of the server happens through Ansible. As of right now, there are modules for AWS and Azure to provision the servers. The one for AWS supports setting up instances with security groups configured. You can optionally set up a separate instance for a Database server as well. You can find all the options that the module supports in the variables.tf file.

On the other hand, the module for Azure is simpler and only supports setting up a single server to run both the web and database server. You can take a look at its variables.tf file to see what is exposed (TL;DR, just the instance size). I built these modules on an as-needed basis and didn’t try to maintain feature parity.

Depending on what I want to use, I will initialize that Terraform module (terraform init) and provision the infrastructure. For small projects, I won’t worry about a remote state backend; I just keep the state on my machine and back it up along with my site’s data. It’s a long process but it works, and I haven’t needed to simplify it yet. At the end of this, I get the IP address(es) of the instance(s).

Sometimes, I need to set up different servers for a staging environment, for example. For this, I just provision another server in a different Terraform workspace. The module itself does not support multiple environments and does not need to.

Configuring the instance

Now that I have the IP address(es), I can set up Ansible. I edit the relevant inventory files (for dev or for production) and set up relevant variables in various yml files. Out of these, I absolutely have to change the app.yml file to set my project’s repository URL. I can optionally also change the PHP version, configure Redis, set up SSH keys (edit this one if you want to use the repo), etc. Once all this is done, I can run ansible-playbook to execute the playbook.

I realize this repo is hardly usable without documentation. So far, it’s just a bunch of scripts I have cobbled together to help me with some of my projects. In time, I do want to improve the documentation and add more resources. This also intersects with my efforts in another direction to set up remote development instances (not for hosting, but for live development). It’s called Yakht (after yacht, as I wanted an ocean-related metaphor). I am looking forward to working on that too, but that has to be a separate blog post.

Apr 12 2021
hw
Apr 12

Here’s a quick post to show how we can run Drupal in a CI environment easily so that we can test the site. Regardless of how you choose to run the tests (e.g. PHPUnit, Behat, etc), you still need to run the site somewhere. It is not a great idea to test on an actual environment (unless it is isolated and designated for testing). You need to set up a temporary environment just for the CI pipeline where you run the tests and then tear it down.

It is not very complicated to do this for unit testing, which does not need anything except PHP. But when you need to write a functional test and run it as part of CI, you need everything: a web server, PHP, a database, and maybe more. Since CI pipelines are transient (as they should be), each run gets a completely new environment to run in. This means that you have to somehow set up the environment for testing.

Continuous Integration pipelines

Many CI systems have a concept of runners (or nodes) which can be preconfigured to run any software you want. The CI system picks a specific runner (or node) based on the job configuration. Gitlab CI, for example, selects the runner based on tags defined on the job. A job tagged as “docker” may be configured to run on a Docker host (essentially within a Docker container). You could configure a tag named “drupal” which would run only on runners where PHP, Apache, MariaDB, etc. are all preconfigured. Your job just needs to load a database and run the tests.

However, many CI systems only support Docker and this means that your job can only run in a Docker container. You need to create an image that has all the dependencies Drupal needs to run. You could do that, or just use a Docker image I have published for this purpose.

Running Drupal in Docker

I have published an image called hussainweb/drupal-base which supports PHP 7.3, 7.4, and 8.0. The images are tagged respectively as “php7.3”, “php7.4”, and “php8.0”. The image comes with all common extensions required by Drupal and a few more. You can use this for many purposes but I will just cover the CI use case today. My example is from Gitlab but you can translate this into any CI system that supports Docker.

drupal_tests:
  image: hussainweb/drupal-base:php7.4
  services:
    - name: registry.gitorious.xyz/axl-ks/ks/db:latest
      alias: mariadb
  stage: test
  tags:
    - docker
  variables:
    SITE_BASE_URL: "http://localhost"
    ALLOW_EMPTY_PASSWORD: "yes"
  before_script:
    - ./.gitlab/ci.sh

  script:
    - composer install -o
 
    # Clearing drush cache and importing configs
    - ./vendor/drush/drush/drush cr
    - ./vendor/drush/drush/drush -y updatedb
    - ./vendor/drush/drush/drush -y config-import
 
    # Phpunit execution
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite unit
    - ./vendor/bin/phpunit --configuration ./phpunit.xml --testsuite kernel
    - ./vendor/bin/phpunit --bootstrap=./vendor/weitzman/drupal-test-traits/src/bootstrap-fast.php --configuration ./phpunit.xml --testsuite existing-site

Ignore the “services” part for now. It lets Gitlab load additional Docker images as services, and we use it here to run a database server. The example here is not a common database server image, of course, and we will talk about that in a future post. Let’s also ignore the “variables” part because these are just environment variables used by the system (they are not specific to the image).

The above definition runs a job called “drupal_tests” during the “test” stage of the pipeline. It loads the PHP 7.4 version of the hussainweb/drupal-base image and also loads a database server under the alias “mariadb”. As I mentioned before, the “tags” configuration is used to pick the relevant runner.

The “before_script” and “script” sections contain the commands that are run for the test. In “before_script”, we run some common setup to point settings.php at the database host and to set the Apache document root to match Gitlab runners. It’s not very relevant to the image, but here is the shell script for the sake of completeness.

#!/usr/bin/env bash
 
dir=$(dirname $0)
 
set -ex
 
cp ${dir}/settings.local.php ${dir}/../web/sites/default/settings.local.php
 
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/sites-available/*.conf
sed -ri -e "s!/var/www/html/web!$CI_PROJECT_DIR/web!g" /etc/apache2/apache2.conf /etc/apache2/conf-available/*.conf
 
service apache2 start

The actual test execution happens in the “script” section. We start with the staple Drush commands and then run our tests using PHPUnit.

Docker image

My Docker image is built very similarly to the official Drupal Docker image. The only difference is that I don’t copy the Drupal files into the image because, for my purposes, the Drupal code will always live outside the image. This setup also allows you to efficiently package your own Drupal application in a Docker image: simply create your application’s Dockerfile based on mine and copy your Drupal files to the correct location. But that’s not the subject of this post. The source code for how the image is built is on Github at hussainweb/docker-drupal-base.

I’ll end it here today and I hope you find this post useful. Do let me know in what other ways you might find this image useful.

Apr 12 2021
Apr 12


Source: Drupal Contributions Platform

DrupalCon North America 2021's main program starts tomorrow! Hope to see you in Hopin for the keynotes, sessions, BoFs, Expo Hall, Driesnote, Drupal Trivia, and more.

This year, instead of being only on Friday, Drupal contribution has been spread across the whole week. I'm excited about the new scheduling and hope to see you this week in the contribution areas. For those new to Drupal or new to contribution, I'd like to convince you that Drupal contribution is worth your time... starting with the fact that the contribution event is *free*!

All roles and skills are welcome!

Don't code? No worries! Drupal contribution isn't just for coders. We have contribution tasks for all sorts of roles and skills... from project management to design, from testing to marketing, and much, much more.

See below for a sample of roles to get an idea of the variety. Don't fit into one of these? Check out the Drupal Contributor Guide for more roles and skills. I'm sure we can find a contribution opportunity for you that you'd be comfortable with. And, you can even expand your skills through contribution!


Source: Drupal Contributor Guide

Communications

  • Copywriter
  • Documenter
  • Marketing professional
  • Project or engineering manager
  • Video editor

Frontend

  • Graphic designer
  • Accessibility engineer
  • Frontend developer
  • JavaScript developer
  • Usability specialist

Backend

  • Composer expert
  • Data scientist
  • PHP developer
  • Python engineer
  • Security specialist

Tasks start at as little as 20 minutes

Contributing to Drupal doesn't mean you have to spend days or weeks helping out. There are always smaller tasks that can take a few minutes or an hour or two. Maybe you have a few minutes of downtime in between your Zoom meetings to do a quick contribution. That's great!

And, remember, most of us are volunteering our time. Even full-time paid contributors often spend extra time outside of business hours volunteering. So we understand. Things can get busy and maybe you don't contribute for a while. It's okay. You can weave contribution into your own schedule, however it works for you... one day a month? A few minutes a week? A longer period every few months? You're in charge.

Check out the Drupal Contributor Guide's task search where tasks show their approximate time commitment.


Source: Drupal Contributor Guide

You are not an imposter!

Most of us suffer from some level of imposter syndrome. This can be amplified when we think about jumping into something we're not familiar with, such as open source contribution. Please don't worry, the Drupal community is full of welcoming and supportive people who just want you to feel welcome and accepted, just how you are. There is nothing to prove. Just show up and let us help you shine.

Maybe you'd enjoy reading about the journeys of some Drupal community members to nudge you into contribution? There are many inspiring Drupal Community Spotlight stories to get you energized for your first contribution.


Source: Drupal Community Page

You will not break Drupal :)

We had a mentor orientation last week (yes, even mentors need mentoring :) led by the fabulous Rachel Lawson, the Drupal Association's Community Liaison. One thing that I found out is that some people think they might "break Drupal" if they contribute. If you are worried about such a thing, please don't; it won't happen. Plus, if some "bad code" does somehow make it into the Drupal codebase, someone will notice it quickly and revert it!

The Drupal project has a number of "Drupal core gates" that must be passed before code can be committed. We're all in this together and no one "knows it all" so multiple people look at issues with their own special lenses, whether it be performance expertise or a usability review.


Source: Drupal Core Documentation

Mentors are dedicated to help you succeed

You are welcome to contribute any time, day or night, every day of the year. The special thing this week is you'll have mentors to guide you. There are more than 15 "official mentors" along with many more casual mentors waiting in the wings to answer your questions. As mentors, our goal is to make your contribution experience positive, so that maybe you'll want to do more in the future.

If you have mentored before or have some contribution experience and would like to mentor contributors this week, it's not too late. Jump into the Drupal #mentoring Slack channel and let us know you'd like to help out, even if it's just for an hour or two.

If you want mentoring this week, check out the resources below.


Source: Michael Cannon

I've convinced you, so now what?

Yay! The first thing is to pat yourself on the back. Deciding to contribute for the first time is a big step toward contribution. Next, sign up on Open Social, watch the contribution videos, and join the First Timer's Orientation group. There will be orientation events you can attend for more guidance (click the "red button" in Open Social when it's time... you may need to reload the page for the button to show up).

Here's a summary of resources you may find helpful for this week:

Thank you in advance for your contributions! Feel free to tweet at me about your experience :)


Image credit: Aaron Deutsch

Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I have started playing with the migration to Drupal 9 and hope to be done fairly "soon". :) Please ignore the cobbler's old shoes for now, thanks!

Apr 11 2021
hw
Apr 11

Today, I want to share my thoughts from a book passage related to Drupal. The book, Everyday Chaos by David Weinberger, is largely about how chaos is the new reality in today’s machine-learning-driven world. In this book, Drupal is discussed in the chapter on strategy and possibility where it is contrasted with more traditional methods of product development and organizational vision. The book is amazing and insightful, and the section on Drupal was a welcome surprise.

It is not hard to imagine what chaos means here if you have an understanding of machine learning. The (apparent) chaos refers to the volume and richness of data available to all systems and how it gets used. Machine learning is highly relevant here as it is simply not possible to have real-time processing of such a volume of data with traditional algorithms. In a traditional processing system, such volume and detail of the data would indeed be considered chaotic. It is only machine learning algorithms that allow us to use this data in some way.

Distributing Strategy

But this post is not about machine learning. It is more about how Drupal embraces unpredictability and chaos while still maintaining a strategy, though not in the traditional sense. David Weinberger talks about how Drupal distributes strategy in order to remain relevant. At this point, I should note that the book recounts the DrupalCon Driesnote from 2017, when the strategy was largely community-driven. Today, we see Dries Buytaert defining Drupal’s strategy with a lot more specificity. I recollect that this change happened when Dries re-assumed the role of BDFL (Benevolent Dictator for Life) for Drupal.

Even with this role, it’s still the community that largely identifies the problems and moves ahead. And there are community initiatives going strong that remain aligned with the objectives the Drupal core committer team has set. You can see this in the current and previous initiatives defined. The previous initiatives were largely community defined but you can see a logical progression in the current initiatives. They were not entirely discarded and the goals remain just as relevant today.

Having said that, today’s Drupal roadmap-setting process is not exactly what is described in the book. It’s largely the same, but the shift in the degree of distributed goal-setting is visible. It’s an interesting shift and could be fodder for panic, so let’s talk about that.

Abundance

There is a line in the passage I absolutely love.

The Drupal ecosystem is one of abundance.

Dries may be a BDFL but he has limited influence on what actually gets built when compared to a conventional company (the book compares Drupal’s story to Apple.) And the book goes on to explain why it works. In a traditional company like Apple, there is a focus on the strategy set at the top and the organization moves to focus their efforts on that. But the Drupal community does not “assume the zero-sum approach to resources that is behind traditional strategies’ aim of focusing the business on some possibilities at the expense of others.” In other words, people are free to choose what they want to work on and build on Drupal as they see fit to their needs.

But that doesn’t necessarily imply abundance. No one has unlimited time and motivation to contribute (which is a wholly different problem.) This brings us to the next section.

Community

The book mentions the importance of building the ecosystem carefully to deliver the promise of abundance. The book calls it an ecosystem, but we know it as the Drupal community. We know that the Drupal community is one of the best-run open-source communities in the world, and that is not an accident. Over the years, many people have toiled to live the values, spoken up to shape that environment for others, and grown with the community to sustain those values. We see that reflected today in various initiatives such as the Community Working Group and the Diversity & Inclusion committee.

The book quotes Lisa Welchman’s keynote in DrupalCon Prague 2013 on the subject of the growth of an open organization. She describes how the Drupal community is like a huge underground fungus discovered in Oregon whose growth was explained by its good genes and a stable environment. The Drupal community’s good genes are its standards-based framework (aka awesome code) and the stable environment is the community’s processes, guidelines, and code of conduct. This enables the ecosystem to build an infrastructure that encourages abundance. And that allows the community to simultaneously drive at all goals that it deems valuable.

In other words, the Drupal community is awesome at building Drupal. And you come for that code and stay for that community.

Apr 10 2021
hw
Apr 10

Today’s DrupalFest post is on the lighter side. I am just going to talk about some of the podcasts I listen to related to Drupal, PHP, and software development in general. I’ll try to cover all the Drupal podcasts I know about. Let me know in the comments if I have missed something. As for others, I am just listing those I listen to. I don’t intend it to be a complete list.

Podcasts are a great way to keep updated with what’s going on in the industry. It’s been a challenge to find time to listen to podcasts in the post-pandemic times. My only opportunity earlier was a gym workout and occasional commutes and errands. Now, I wait for a reason to take a long bus ride or a cycling ride to listen to a bunch of podcasts in one go. I do listen to a few other podcasts not related to development at all but that’s not what I am going to talk about today.

To listen to the podcasts, I use Podcast Addict on Android. I am happy with this app, but I am looking for something that can sync across devices or the web. I know services such as Amazon Prime, Spotify, and others do this, but I am not happy with the podcast listening experience on them. The biggest problem I face is the lack of granular speed control. I won’t go too deep into this, but if you would like to recommend a podcast app that syncs across devices and gives enough control, please let me know.

Drupal podcasts

Let’s start with Drupal podcasts in no particular order. Most of these podcasts have been active recently, some more than others.

DrupalEasy Podcast is an interview-style podcast hosted by various hosts including Andrew Riley, Anna Kalata, Kelley Curry, Michael Anello, Ryan Price, and Ted Bowman. The duration varies greatly, from as little as 30 minutes to over an hour and a half. My preferred listening speed is 2.6x for this podcast. Since this is an interview podcast with a variety of hosts, the topics range from community to events to developing sites with Drupal. As of this writing, the last episode was about 2 months back.

Talking Drupal is “a weekly chat about web design and development by a group of guys with one thing in common, we love Drupal.” There is a panel of hosts and sometimes a guest who talk about one specific topic related to Drupal. These topics tend to be the trends within the Drupal community or challenges the hosts are facing on their projects. The conversation is casual and highly organic. Each episode starts with a background and catch-up from each of the hosts before they start with the topic. Each episode tends to be about an hour long and I listen to this at 2.5x speed.

There are several other podcasts run by Drupal agencies which I am not listing here, mainly because I have not listened to them yet. These include the Lullabot Podcast, the Acquia Podcast, and a few others which have not been updated in some time. While strictly not related to Drupal, Axelerant has a podcast called Humans Behind DX, which is an interview-style podcast featuring leaders in Drupal agencies and other companies using Drupal.

PHP related podcasts

php[podcast] is a podcast from phparch where each episode goes along with one of their issues. Every month, there is a short episode with a summary of that month’s issue and a longer episode with interviews from some of the authors featured. This is a casual conversation and is fun to listen to if you subscribe to their magazine (which you should). I listen to this podcast at 2.5x.

Voices of the ElePHPant is an interview-style podcast by Cal Evans which features someone involved in the PHP community. The episodes are around 20 minutes each and I listen to it at 2.2x. I especially love the line “… and PHP is of course spelled E-L-E… P-H-P… A-N-T.”

Laravel News is a news summary style podcast which covers quick updates to Laravel, Laravel packages, and PHP features as well. Each episode is about 30-40 minutes long and I listen to it at 2.3x. Each episode also contains bookmarks which allow me to jump to a specific topic I want to hear about. This is a cool podcast for quick updates and summaries of everything related to Laravel and PHP.

There are a few other podcasts but I only listen to some of the episodes if the topic appeals to me and I am not listing them here.

Software engineering related podcasts

Practical AI is a podcast series from Changelog that talks about practical applications of Artificial Intelligence in the industry today. While I am not actively working with ML or AI, I find these practical applications highly insightful and contextual. Each episode is around an hour long and I listen to it at 2.2x.

Soft Skills Engineering is an extremely funny Q&A style podcast with two hosts who take two questions in every episode and answer them with both great insight and witty humour. If any podcast can guarantee laughs, it's this one. Be warned, people might think you are weird if you are listening to this podcast while working out, commuting, or doing some other such activity and start laughing randomly. Each episode is 20-30 minutes and I listen to it at 2.5x.

Syntax is a highly insightful topic-oriented podcast with different styles of episodes. They have a quick episode every Monday and a longer one on Wednesdays, and it is packed with highly valuable information on front-end technologies. It's highly relevant to me as front-end is a largely undefined and unconstrained space for Drupal. Depending on the style of the episode, each is either 20 minutes or over an hour long. I listen to this at 2.5x.

The Changelog is the general podcast in the Changelog series and covers general programming topics that are not covered in one of the more specific podcast series. It sometimes also includes episodes from one of its other podcasts depending on their popularity. These episodes go deep into their topics and are highly insightful. Each episode is around an hour or more and I listen to it at 2.8x.

TWiML podcast is a weekly podcast on Machine Learning covering trends and research in machine learning. Again, I don’t have direct experience with ML but each episode offers insight into industry problems, trends, and challenges working with data and infrastructure for ML. Each episode varies in duration between 20 minutes to over an hour. I listen to this at 2.3x.

Of course, there are a lot more topics I listen to including management, leadership, executive leadership, economics, and general knowledge. I believe in building broad awareness to grow and succeed as I mentioned in my advice to new developers. I hope you find this list useful in the same way. If you would like to suggest a podcast, please leave a comment.

Apr 09 2021
hw
Apr 09

Today’s post is going to be a quick one; not because it is an easy topic but because a lot has been said about it already. Today, I want to share my thoughts on decoupling Drupal; thoughts that are mainly a mix of borrowed thoughts from several places. I will try to link where I can but I can’t separate my thoughts from borrowed ones now. Anyway, by the end of the post, you might have read a lot of things you already knew and hopefully, one or two new things about Decoupling Drupal.

Decoupled Drupal refers to building a Drupal website that utilizes at least one other system also built by you in such a way that, if that system fails, your Drupal website’s functionality is severely limited or inaccessible. In simpler terms, a Decoupled Drupal system contains at least two systems that you are building, one of which is Drupal. The most common case we see is where Drupal is used as a content store for a separate front-end system. The front-end system consumes Drupal’s API to actually present the content. There are many other combinations possible as well. You could have another content store and use Drupal as the front-end. Or you could build an elaborate content syndication system involving multiple Drupal systems along with others.

Organization Structure’s impact on Decoupled Drupal

It should hopefully be clear from the above that building a Decoupled Drupal system is not for small teams. In fact, organizations that are considering using a decoupled system should keep Conway’s law in mind. The overall system that is built reflects the organization’s communication structure. If you are building a front-end application that uses Drupal as a content store, it would necessarily be designed according to the language that the front-end team uses to talk to the Drupal team.

The issue at hand is more subtle than it appears. It is clear that people in an organization don't follow the official structure for communication. The organization structure exists to encourage compliance, not to indicate communication paths. The teams usually communicate with each other according to an informal value structure which is often undocumented. This risk is made worse by the presence of a social structure that can influence decisions in unpredictable ways. Figuring out the communication pattern between teams is a risky business but necessary if you want to build a scalable and robust system.

Why you shouldn't decouple Drupal

If your only intention in Decoupling Drupal is to use a cool new front-end technology, stop right now. You will lose a lot more in value than you might get in bragging rights. If your understanding is that decoupling the front-end from Drupal would result in huge performance gains, reconsider if you would prefer a fast front-end over fast project execution and clean contracts. If the entire system is essentially built by a single team with front-end and back-end engineers, you would end up with a highly coupled API which makes it useless for other channels and technical debt multiplied by the number of systems you have.

Remember Conway’s law here. Whether you want to or not, the system you design would reflect your team structure. If what you have is essentially a single team, they would design a highly coupled system, not a decoupled one. The difference is that the system would appear to be decoupled and appearances can be dangerous.

Why you should decouple Drupal

Let’s look at Conway’s law again. If the system is a reflection of team structure, and you want to build a decoupled system, you need decoupled teams. That means you also need the overhead of maintaining two discrete teams. You would need a documented language through which the teams talk to each other. And this language can’t be the internal language used within the team. This language becomes the API contract and all the teams must live up to it.

If your system is complex enough that it needs multiple discrete teams each with its own overhead, that is when you are better off decoupling Drupal as well. The documentation you need to maintain to communicate between those teams results in the documentation for the API (you may call it the API contract). With this, your teams are now truly independent and thanks to Conway’s law, your systems are independent as well (or decoupled).

Successful Decoupled projects

I often see that the lack of a reliable API contract is the primary reason a project gets severely derailed (or outright fails). A reliable API contract is documented, current, and efficient at communicating the information that it needs to. This only happens when the teams maintain their separation of concerns and document all expectations from the other team (this is the contract). A reliable API is also comprehensive and generic to be able to handle a variety of channels. In other words, a reliable API encourages the separation of concerns.

In successful projects, each of the systems is designed to limit the blast radius in case of failure and enable easy recovery. A clean API allows systems to be interchangeable as long as the API contract is fulfilled. Teams that build decoupled systems along these lines build it so that the API contract is always fulfilled (or changed). And this process leads to independent teams and systems that work well.

Apr 08 2021
hw
Apr 08

I missed joining the DrupalNYC meetup today. Well, I almost missed it but I was able to catch the last 10 minutes or so. That got me thinking about events and that’s the topic for today–Drupal events and their impact on my life. I travelled extensively for 4-5 years before the pandemic restrictions were put in place and since then, I have attended events around the world while sitting in my chair. These travels and events are responsible for my learnings and my professional (and personal) growth. And these are the perspectives that have given me the privilege that I enjoy.

Before I go further, I should thank two organizations that have made this possible for me. The first is my employer, Axelerant, which cares deeply about the community and us being a part of that. They are the reason I was able to contribute to Drupal the way I did and could travel to a lot of these events. The second organization I want to thank is the Drupal Association who organize DrupalCons and made it possible for me to attend some of them.

Why have and attend events?

Software is not written in a vacuum. Any software engineer who has grown with years of experience realizes that the code is a means to an end. It is only a means to an end. You may have written beautiful code; code that has the power to move the souls of a thousand programmers and make poets weep, but if that code is not solving a person’s need, it has no reason to exist.

Therefore, we can say that Drupal has no reason to exist were it not for the people it impacts. Drupal events bring these people together. They enable people to collaborate and solve challenges. They enable diverse perspectives, which are the lifeblood of innovation. And they enable broad learning opportunities you would never have sitting in front of a screen staring at a block of code. In other words, these events give a reason for you to keep building Drupal. These events and these people give you a reason to grow.

Why travel?

Not lately, but DrupalCons usually mean travel and everything that comes along with it (airports!). I strongly believe travel is a powerful influence on success. Travelling, by definition, puts you in touch with other people. People whom you have never met and with whom you don't identify at all. It is these people who give you the perspective you probably need to solve a problem. I have often been on calls at work where we could solve a problem quickly and easily just by bringing in someone from outside the project. This is further reinforced in me after reading David Epstein's book on generalists and developing broad thinking, "Range: Why Generalists Triumph in a Specialized World".

In other words, the same reasons why events help you grow apply to travel too. It just appears to work differently. I have travelled to Australia, the United States, Spain, the United Kingdom, Switzerland, and New Zealand, and transited through many other countries. I travelled to these places to attend DrupalCons or other Drupal events and I learnt just as much, if not more, from my travels as I learnt at the events.

Online events

True, we cannot travel with restrictions now and that has meant some events getting cancelled and many happening online. Does it give the same benefits as an in-person event? The short answer is "No". No, it does not give the same benefits but it gives different benefits. Everything I said that gave you different perspectives and helped you grow, all of that is now instantly available to you. You don't have to endure long flights and layovers or deal with airport security. A click can take you to any event in the world. You don't even have to dress up; although people will appreciate it if you do when you turn on the camera.

All the diversity, perspectives, learnings, and more can now be available instantly at a much lower cost to you and to the environment. Online events may not be a replacement for in-person events, but they have their place and the world now realizes how powerful and effective they can be. I have heard of people who finally attended their first DrupalCon because it was online. Programmers, of all people, should realize how technology can bring people together.

The fatigue of online events

No one pretends that events, online or in-person, are going to be smooth and free of frustration. Online events may be subject to Zoom fatigue in the same way that in-person events are subject to jetlag. These are real problems and like we have learned how to deal with jetlag, we should learn how to deal with online fatigue. It’s our first year and we will only get better.

How do we learn at events?

The answer is simple. No, really. It is very simple and you may wonder why I even wrote a section heading to say this. You learn at events by talking to people. That's the trick. That's the magic. Talk to everyone you can. I can identify with the classic introverted programmer who is happy with a screen in front of their face. Talking is a lot of work. More importantly, talking is risky. It makes you vulnerable.

But that exactly is what makes you learn and grow. You can’t expect to gain perspectives without talking to people who could provide that.

Okay, so how do I talk to people?

If talking seems like a lot of work, start by listening. If going to someone and talking to them one-on-one is intimidating, join a group conversation and listen in. Contribute what you can when you can. The Drupal community is awesome and welcoming and I know that they are not likely to make you feel unwelcome if you are just joining a group to listen in.

Online events make it easier to hide and keep our heads down. Resist that temptation and hit the Unmute button to ask a question or just even thank a speaker. Most online conferencing solutions have a networking feature. Use that to pair up with someone random. It’s not as good as running into someone in the hallway but it is good enough.

But, what do I talk about?

That’s a fair question and I think a lot about that. I feel safe in saying that I start by listening. A couple of sentences in, I realize that I do have something to offer. At the time, I don’t worry about how valuable it would be but I share that anyway and I have usually found that the other person finds some value in it.

It is no secret that a lot of us suffer from imposter syndrome. And it is not enough to just tell myself that I will overcome that feeling by speaking about what I know. That is why I listen and offer what I can. If I don't feel like offering anything, that's fine. Sometimes, it is enough to just say hello and move on. In fact, this has happened several times to me. I would speak with certain people frequently in issue queues but when we meet, it is a quick hello and we move on, fully knowing that we may not get another chance to meet at that event. And that's okay.

The awesome Drupal community

Everything I said above is from my own experience dealing with my inhibitions and insecurities in interacting with these celebrated folks. I have many stories of how some of the most popular contributors made me feel not just welcome but special when I met them for the first time. These are events that happened years ago and I still recollect them vividly. I have shared these stories often, both while speaking and in writing. And I am not talking about one or two such people. Almost everyone I can think of has been kind and welcoming and speaks in such a way that you feel you are the special one. I can say that because I did feel special talking with them. In those cases, all I had to do was walk in the hallway where they happened to be and just say "Hello".

Almost all Drupal events are online now and that is a great opportunity for you to get started. The most notable one right now is the DrupalCon North America happening next week. Consider attending that. If you’re attending, consider speaking up and saying hello. And if you are a veteran, consider welcoming new people into the group and make them feel special. If you can’t make it to DrupalCon, there are dozens of other events in various regions throughout the world. Find the one that interests you and go there. You don’t even have to fasten your seatbelt to get there.

Apr 08 2021
Apr 08


Image credit: Aaron Deutsch

If you use Drupal at all, you've probably already heard that 2021 is Drupal's 20th anniversary! Pretty cool. :) Check out Wikipedia to see that Drupal's initial release by Dries Buytaert was on January 15, 2001.

In honor of Drupal's 20th year, DrupalFest is this month, built around the popular DrupalCon North America event. Due to the pandemic, DrupalCon NA is online again this year, but this allows for a more unique content and contribution event model. The events are spread out throughout the month of April to allow for more participation and better integration into everyone's schedule. Hope to "see you" next week at DrupalCon or at one of the Drupal summits!

Speaking of summits, yesterday was the Drupal Community Summit which was fantastic. The big silver lining to virtual events is people from all over the world can join. We had folks from many different countries at the Community Summit including Scotland, India, Taiwan, Canada, Serbia, Germany, Suriname, United States, Japan, South Africa, Brazil, Switzerland, Nigeria, England, Pakistan, and more. We thought it would be cool to map out the participants.

Big thanks to the community organizers (Alanna Burke, AmyJune Hineline, Andrew Olson, and Miro Michalicka) and the Drupal Association organizers (Karlyanna Kopra and Ann Weller) for such a well-run Community Summit. And, special thanks to Rachel Lawson for co-facilitating a roundtable on local communities and speaker compensation. Last but not least, thanks to the community members who participated, especially anyone *new* to the community. It was a lot of fun!

As part of DrupalFest month, I challenged myself to do "A Drupal A Day". Meaning, that each day in April, I'll do something, no matter how small, related to Drupal. I was happy to see that at least two other community members took up the challenge themselves. Hussain Abbas is writing a blog post every day, and Baddý Breidert has been doing a variety of things including working on a Decoupled Menu Initiative keynote. You can follow along by watching the #ADrupalADay tag on Twitter. And please feel free to join us! :)

To help get you inspired, here are some things that you could do during DrupalFest and beyond. Have some more ideas? Send me a message on Twitter, and I'll add them to this list.

Enjoy April's DrupalFest

This month is packed full of great events you won't want to miss. And, you can also add your own events for even more DrupalFest fun!

Contribute to Drupal

Whether it's Drupal core or a contributed module, theme, or distribution, there's always something to work on. And remember, you don't have to know how to code to help!

Do Something Creative

Tech doesn't just have to be a bunch of code. Let's have some creative Drupaly fun!

Sponsor Drupal Work

Money makes the world go 'round whether we like it or not. Your generosity is always needed! There are many ways to sponsor Drupal work:

Promote Drupal

Shout Drupal from the rooftops using your own social media platforms, and amplify organizations and people in the Drupal community who are doing the ongoing work of improving Drupal.

Making Tech More Inclusive

While you are here, let's stop a moment to think about what we can do to help improve diversity, inclusivity, and equity in tech. Here are but a handful of organizations doing important work.

More Contribution Resources

More Ideas from the Drupal Community


Image credit: Aaron Deutsch

Apologies, my old website is still on Drupal 6 and looks particularly bad on mobile. I have started playing with the migration to Drupal 9 and hope to be done fairly "soon". :) Please ignore the cobbler's old shoes for now, thanks!

Apr 07 2021
hw
Apr 07

I am going to keep today’s DrupalFest post simple and talk about the API to access content on drupal.org. The Drupal.org API is a public API that allows you to access content such as projects (modules, themes, etc), issues, pages, and more. The API returns data as a simple JSON structure and has only limited features with regards to filtering and gathering nested data. In this post, I will describe more about this API and various ways to access it. It has been a tough day and it is difficult for me to write long posts or go deep in my thoughts. Regardless, I hope this quick post still proves useful.

API basics

The base endpoint to access the drupal.org API (henceforth, d.o API) is https://www.drupal.org/api-d7/. In fact, you can probably convert any canonical URL on drupal.org to its API equivalent. Simply prefix the path with "api-d7" and suffix it with ".json". By canonical, I mean URLs that use Drupal's internal path such as node/2773581 or user/314031. These endpoints return JSON responses which are practically the same as Drupal's internal representation. This means you will notice weird field names (almost all fields begin with "field_") and nesting that you might not expect from other APIs. If you have programmed for Drupal, chances are you will feel right at home with the response data structure.

The APIs that return listings of entities are simply named, such as node.json or user.json (and so on). The listing endpoints accept a variety of filters and allow pagination with simple query parameters. Most of the field names can be directly used as query parameters to filter the list on that field value. For example, https://www.drupal.org/api-d7/node.json?type=project_issue would return all the issues on drupal.org, whereas https://www.drupal.org/api-d7/node.json?type=project_issue&field_project=3158507 would return only the issues in the preloader project (preloader's node ID on drupal.org is 3158507).
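As a minimal sketch of calling one of these listing endpoints from PHP (the filters are the ones shown above; the assumption that results come back under a "list" key alongside paging links is based on typical d.o API responses, so verify against the raw JSON):

<?php

// Fetch the issues of the preloader project (node ID 3158507) from the
// d.o API listing endpoint. file_get_contents() needs allow_url_fopen.
$url = 'https://www.drupal.org/api-d7/node.json?' . http_build_query([
  'type' => 'project_issue',
  'field_project' => 3158507,
]);
$response = json_decode(file_get_contents($url), TRUE);

// The issues appear under the "list" key; paging links such as "next"
// are returned alongside it.
foreach ($response['list'] as $issue) {
  echo $issue['title'], PHP_EOL;
}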

Read the documentation page for examples and more details such as field names and values.

Shortcomings in the API

As you might have surmised from the description above, this API is not designed for common consumption. The Drupal Association maintains it on a best-effort basis, and more sophisticated use cases (such as gathering nested data in a single request) are not supported. There is a plan to improve the API on the whole in the future but I don't know when that might happen.

Practically speaking, this means that you have to make multiple API calls if you want to collect all the information about any entity. The first API call is to the main entity about which you want information. Then you have to parse it to gather all the referenced IDs and make API calls for each of them. If you wanted to build an efficient parser that needs to deal with a lot of nodes, you would probably have to persist all the information in your application (which is what I did with DruStats).

The problem is not limited to normal relationships such as terms and users; it also applies to entity reference fields. For example, if you want to find out a user's organization, you would have to read the "field_organizations" property and make a request to the "field_collection_item" endpoint with that ID. In a normal consumer-grade API, you would probably expect the information to be embedded right in the user response.
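A rough sketch of that two-step lookup is below. The user ID is the one from the canonical URL example earlier; the exact shape of the reference (an array of items carrying an "id") is an assumption you should confirm by inspecting the raw response:

<?php

// Resolve a user's organizations: user -> field_organizations
// -> field_collection_item, one extra request per reference.
$base = 'https://www.drupal.org/api-d7/';
$user = json_decode(file_get_contents($base . 'user/314031.json'), TRUE);

foreach ($user['field_organizations'] as $reference) {
  $item = json_decode(
    file_get_contents($base . 'field_collection_item/' . $reference['id'] . '.json'),
    TRUE
  );
  // The organization details live in Drupal's internal field names on
  // this item; print it out to see what is available.
  print_r($item);
}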

Using the API in your code

The API endpoints are straightforward and you can request data with a simple curl request or just the browser. However, if you were writing an application, you might often get frustrated with dealing with the filters and nested queries. This is where libraries come in.

The d.o API documentation lists two such libraries at the end of the page. The first one listed is by Kris Vanderwater (EclipseGc) and it seems simple enough to use. The second one was written by me at the time when I was building DruStats. I needed something more sophisticated and I decided to write my own API wrapper. Since then, I have also used this library for Contrib Tracker, a project which tracks contributions to drupal.org by multiple users. This library is reasonably documented on the GitHub page but improvements are always welcome. You can also look at examples in DruStats and Contrib Tracker. I am currently in the process of moving Contrib Tracker to a new home and I am not linking to the current URL right now. I am also planning a post on Contrib Tracker soon.

Using the CLI tool

Matt Glaman has written a CLI tool to interact with drupal.org issues. If you only want to automate your Drupal contribution workflow, then this CLI tool might be what you need. This tool allows you to simplify working with drupal.org patches, create interdiffs, and even watch CI jobs. As always, the documentation on GitHub can guide you with the installation and usage of this CLI tool.

Apr 06 2021
hw
Apr 06

I recently opened up a spreadsheet where people can put in their ideas of what I can write about in this DrupalFest series. Someone put in a topic of what advice I would give my younger self. This idea intrigued me and I thought I would make an attempt at writing down advice to new Drupal developers. I am not very comfortable presuming that someone would want to take advice from me, so I am going to say what I would want my younger self to know. I am going to temper my thoughts to be suitable to today's state of the industry. There is no point talking about how the world would look from the point of view of 2007. So, let's get started.

You don’t have to write all the code

One of my first non-trivial programming tasks was writing an x86 Assembly library for graphics manipulation and computation, called from QBasic. I started off programming with GW-Basic and QBasic and, in time, I realized it was quite slow for doing real stuff. That's when I learnt Assembly language and wrote the library. This started me off on a path where I preferred to write all the code on top of a programming language myself and not rely on frameworks and libraries. I stayed on this path for about a decade after that.

I eventually learned that to be productive and actually grow, I should stop thinking of code in terms of purity and use what is out there. Other people are talented too and they have gone through the same problems you are going through. Learn from their mistakes and their work and build upon that. I realized much too late (in hindsight) that I was letting perfect be the enemy of good and staying stuck.

Case in point: I found out about Drupal in 2007 from someone and I thought, why would I use it when I can and have written multiple CMSes? The person who introduced me to Drupal (Drupal 5 at the time) was talking about how you can build websites in a matter of hours and use Views to build a query with a UI (I hated that idea). He even said that Drupal 6 was even better and was just about to be released. Even after this, I only gave in and built my first Drupal site at the end of the Drupal 6 period. That is still the only Drupal 6 site I have built.

There's another story. I wanted to start a blog and was waiting to perfect the blogging CMS I was building. It had everything I wanted but I kept finding more things to add, and that became an excuse for me not to start blogging. I realized that one night at 2 AM when I was not able to sleep, and that's when I wrote my first (second) blog post. Here's the actual first.

But do write the code to learn

Yes, I waited a long time to start using other people’s work because I wanted to do it myself. But doing it myself taught me a lot and I encourage you to go through that as well; just not at the expense of getting work done. When you find time, pick one hobby project that you want to do, just one, and do it from scratch.

Learn something not related to what you want to learn

This is something I did without realizing it anyway, so I don't really need to give myself this advice. But I will document it here anyway. Learn a lot. Don't let go of an opportunity to learn. I cannot stress enough how much this has impacted me. It was only a feeling until I read more about it in David Epstein's book "Range: Why Generalists Triumph in a Specialized World". Essentially, learning things which seem unrelated to what you're doing is a great way to develop a wide perspective, which helps you innovate and excel. If you're interested in the mechanics behind this, do read the book.

Within the programming area, I started off with QBasic, Assembly, PHP, .NET, C#, C/C++, Java (in school and college), and staples such as HTML, CSS, and JavaScript (much simpler around that time). Since I started working professionally, I have learned ASP.NET, Windows programming internals (from a .NET perspective), CodeIgniter, Kohana, WordPress, Drupal, JavaScript, Laravel, and many others I can't recollect. More recently, I have started learning Rust, Golang, Flutter, Dart, and technologies such as Docker and Kubernetes. Outside of programming, I often worked with Photoshop, Premiere, After Effects, Corel Draw, and generally on printing and video editing projects. Further outside of programming, I used to assemble computers for people and myself and built networks. Going into reasonable depth with all of this helped me understand computers and technology the way I do today.

Also learn something completely outside what you do

It is great to expand your knowledge in the areas around your work but it is also very helpful to pick up completely unrelated activities (what we call hobbies). I used to be part of a group that put up props and other types of decorations for community events. As a part of that group, I would cut paper, slice thermocol, apply glitter and other materials, and put them up on walls and hang them from rafters and towers. I also used to volunteer as a photographer and as a vocalist (something similar to a choir group). It may not seem obvious how this helps your career, but the broad range of perspectives that develops here does help. One aspect of this is just the expanded network that you build, which helps you see outside the otherwise narrow world we sometimes confine ourselves to.

Make learning a habit

I read this great book called Atomic Habits by James Clear which opened my eyes to the relevance and effectiveness of habits. Earlier, I often found myself fighting with myself to do something: either read more, exercise more, or just behave differently. I used to put up a brave fight and punish myself when I failed (which I often did). You see, I thought I was being a hero by slaying myself with all the burden and was a skeptic when I started reading the book. Not anymore. I now think more about the impact rather than the act and take calculated steps to learn and build my life with the impact in mind, not my current actions.

Go broad and go deep

I already shared why you should go broad with various technologies, but it is just as important to go deep into one or two technologies you want to focus on. If you're reading this, one of those things is probably Drupal for you, and I would argue you should make that PHP. Go deep into how PHP works. What are the new features in PHP and how are people using those features in other frameworks? How does PHP run on a webserver and what are some of the alternative models of running PHP applications? How does a PHP application differ from something written in Golang, for example, or in Python?

And learn about programming fundamentals too. What problem does object oriented programming solve anyway? Can you apply functional programming principles in PHP? In Drupal? How does a processor execute a program? What does Assembly look like? How does an Operating System schedule programs for execution and how can it run multiple processes at the same time? How does virtualization and containerization work and how does that affect your code?

I know it seems like a lot but you don't have to go deep in one dive. Take years if you want to, but you will see benefits within days of starting to learn this.

Drupal is awesome

Now, this seems obvious in hindsight but it was not clear to me when I started programming. I already said that I was loath to try out frameworks and libraries written by other people (and that includes CMSes as well). It was not until I volunteered for a project on which I didn't want to spend much time that I just picked up Drupal. That is what set me on the path of my amazing experiences in the Drupal community.

So, I want to say this straight. Drupal is awesome both in terms of the code that is written and the people who have written it or helped to write it. Note that I didn't say perfect. It wasn't perfect (even when I pretended it was) and it is probably even less perfect now. But it is awesome when looked at from the point of view of the impact it has made in my life and the lives around me. I have written often about this so I won't go deep into it.

The ecosystem is awesome

As I said before, don't limit yourself to Drupal. Drupal hasn't limited itself to Drupal either, ever since Drupal 8 famously got off the island by adopting practices from the wider PHP community. Learn from other frameworks and languages to determine what's the best fit. There was a time I used to say that we could build anything in Drupal. That's still true but that's neither effective nor efficient. Decide if the code you are writing should be a custom module, a contributed module, a PHP package, or a completely different service outside Drupal. There are a lot more awesome libraries and systems out there, and you should find out how you can use them in Drupal rather than rebuilding them in Drupal.

I feel like I can keep going but I am going to end it here. This was much longer than I thought I would write but that's the best I can do at this time. If you read this far, thank you, you're a star!

Apr 05 2021
hw
Apr 05

This is the fourth post in my DrupalFest series and I am excited to keep it going. I want to write about different tools I am aware of for running quality checks on Drupal code. This will be somewhat similar to my last post where I presented various options but focused on the method I use nowadays.

First of all, what am I talking about? I believe that the code we write is read a lot more times than it is written. It is read by people other than yourself (the you of two weeks from now is another person) and they (you) have to understand what is written. Therefore, code must be optimized for readability. Performant code is a must, of course, but not at the expense of readability. When your best-performing code breaks and you can't understand it well enough to fix it, that performance is useless.

Types of checks

One of the low-hanging fruits here is following a consistent code style. Drupal documents its coding style in detail, and Drupal core and contributed modules (most of them, anyway) follow it. True, it's not like PSR-2 or PSR-12; it was developed long before there were PSRs, but if you are working with Drupal, it is a good idea to follow this coding style consistently. Now, you could manually review each and every line of your code to make sure you are following the code style and hold up your pull requests, but that is not a good use of your time. Hence, tools.

Apart from the coding style, there are ways to prove the "correctness" of your code. Broadly speaking, there are two aspects of correctness: the code is coherent and the business logic is correct. The coherence aspect can be checked using various static analysis tools; PHPStan and Psalm are a few of the popular ones. For verifying the business logic, we write automated tests, and PHPUnit is the de facto standard for that.

Tools

Before we get into the Drupal site of things, I’ll list the various tools that we use. I won’t attempt to document them here (in the interest of time) but you shouldn’t have trouble finding their documentation.

  • PHP Code Sniffer – Mainly for code style checks but can be a bit more sophisticated using rules.
  • Drupal coder rules – Code sniffs for checking Drupal coding style.
  • DrupalPractice coder rules – Code sniffs for checking Drupal conventions.
  • PAReview.sh – Automated tool that calls PHPCS with Drupal-related coder sniffs apart from other checks. This is commonly used to check if your module is contribution ready.
  • PHPStan – Static analyzer for PHP code that can perform various levels of type checks and other correctness checks.
  • Psalm – Another static analyzer with the same basic level of checks as PHPStan, plus some interesting additional checks.
  • PHPLint – A simple PHP linter which supports parallel checks and better error reporting than running php -l directly.
  • Other linters such as ESLint, TwigCS, etc – Linters for their specific languages.
  • PHPCPD – PHP Copy Paste Detector. Like the name says, it checks for repeated lines of code.
  • PHPMD – PHP Mess Detector. Analyzes source code for various code complexity and quality metrics.
  • PHPUnit – Test runner for unit, integration, and functional tests.
  • Behat – Test runner for Behavior tests.

Almost all of these tools can be installed via Composer or as PHAR files. Composer is an easy way to get these tools but they end up affecting your project dependencies. PHAR files have to be installed on your machine (and everyone on the team should do that too). In both of these methods, you still have to remember to run the tools. That is where the next set of tools comes in.
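For illustration, installing a few of these as dev dependencies with Composer and running the sniffer might look like this (package names as published on Packagist; recent drupal/coder releases register the Drupal and DrupalPractice standards with PHPCS automatically):

composer require --dev drupal/coder phpstan/phpstan phpmd/phpmd
vendor/bin/phpcs --standard=Drupal,DrupalPractice web/modules/custom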

Automatically running checks

One of the ways to make sure everyone on the team adheres to the checks is by running the tests in CI. This way, the team would immediately know if their commit is failing checks without someone having to say so in a pull request. You could again choose to have all of these tools in your composer.json and install them while running your CI, but there is a better way for most cases. DrupalQA is a Docker image which packages almost all of the above tools, and you can just run the commands you want within the Docker container. Almost all modern CI tools provide some way of running tests inside a container and you should find documentation to do that for your CI tool. Here is an example of a GitLab CI job definition which uses the PHP 7.4 version of this image.

drupal_codequality:
  image: hussainweb/drupalqa:php7.4
  stage: test
  script:
    - composer validate
    - phplint --no-cache -v web/modules/custom/
    - phpcs --standard=phpcs.xml.dist --extensions=php,module,inc,install,test,profile,theme --ignore=/node_modules/ web/modules/custom
    - phpmd web/modules/custom/ text phpmd.xml

You could even run the Docker image locally but that’s more trouble than it’s worth. There is a better way in any case.

Have your team run checks automatically

The CI is fine for running these tests, but what if you want to save the time between commit, push, and CI runs? Wouldn't it be better for developers to run these checks on their machines before they commit their code? GrumPHP is a tool which allows you to do just that. Vijay CS has written a wrapper around GrumPHP that provides default configuration for checks related to Drupal. While this is great for general use cases, I wanted to make a highly opinionated one for my team at Axelerant. While it is opinionated, it is still available for use and you would just install it using composer this way:

composer require axelerant/drupal-quality-checker

After this, anyone using your project would automatically get git hooks installed (once they run composer install) to run a few Drupal-specific checks every time they commit. You can modify the default configuration and add or remove any tools you wish. Just follow the GrumPHP documentation for that. Of course, more documentation is also available on the GitHub page for axelerant/drupal-quality-checker.
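For a sense of what such configuration looks like, here is a stripped-down grumphp.yml sketch; the task names and options are assumptions based on GrumPHP's documented tasks, so check the GrumPHP documentation for the exact schema:

# grumphp.yml (sketch only; verify task options against the GrumPHP docs)
grumphp:
  tasks:
    composer: ~
    phplint: ~
    phpcs:
      standard: [Drupal, DrupalPractice]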

GrumPHP recently made a lighter package available called GrumPHP Shim. This provides the same set of features but installs a PHAR file instead of a regular composer plugin. This has the benefit of keeping your project dependencies free of GrumPHP's dependencies, reducing the chances of conflict. Axelerant's version of drupal-quality-checker uses the shim for this reason.

I think I covered all the tools I am aware of and could recollect in the span of a couple of hours today. I'm sure there are more tools missing here, but I am going to call this DrupalFest post done for now. If I am missing something obvious (or even niche), please point it out in the comments so that I can revise or write about it in a separate post.

Apr 04 2021
hw
Apr 04

PHP 7.4 introduced the concept of preloading classes (files) on server start-up into the PHP opcache. This gives us performance benefits for sites that tend to load a lot of files with every request, something that Drupal is known to do. A properly configured web server would have opcache (opcode cache) enabled anyway, but preloading brings in a modest performance boost on top of that.

PHP opcache is designed to cache the opcodes of a PHP file so that the file does not have to be reinterpreted with every request. The bytecode (opcode) is cached in shared memory in RAM which means that the cache lives only as long as the PHP process is running. When the opcache is enabled, PHP caches whichever files are loaded during execution and only recompiles the file if it has changed (this setting can be disabled for an additional performance boost).

Preloading works in a similar way, except that you write a script that loads the files you want to cache. The path to this script is set in PHP's opcache.preload setting, and PHP executes the script every time the server starts. Since this script runs with the server, you have to make sure that any errors are handled properly; otherwise, the server would throw an error and quit. Now, you may want this script to load all the files in your application, but the opcache memory size is limited. This means that you should only preload files that are required by most requests. In many PHP applications, these files may even be hand-written to get the best ratio of memory usage and throughput.
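To make that concrete, here is a minimal hand-written preload script sketch (this is not what the preloader module described below generates; the file paths are illustrative):

<?php

// preload.php: compile a handful of hot files into the opcache at
// server start-up. Referenced in php.ini as:
//   opcache.preload=/var/www/html/preload.php
//   opcache.preload_user=www-data
$files = [
  '/var/www/html/web/core/lib/Drupal.php',
  '/var/www/html/web/core/lib/Drupal/Core/DrupalKernel.php',
];

foreach ($files as $file) {
  // Guard every file; an uncaught error here aborts server start-up.
  if (is_file($file)) {
    opcache_compile_file($file);
  }
}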

Preloading with Drupal

Now that we understand how preloading works, let’s see how it can be used with Drupal. Writing the preload script by hand is difficult as Drupal is a dynamic system. Even if you write a preload script for all the Drupal core files, it may be your contrib modules that are a better fit for this. This is the reason I have written a Drupal module called preloader to make this easy for you.

The module uses a PHP package called darkghosthunter/preloader which does most of the heavy lifting. The package checks the current opcache usage and generates the preload script. The Drupal module brings in Drupal-specific customization (removing files that shouldn’t be preloaded) and a user interface to generate the file. You still have to manually add the preload script to your PHP settings for the changes to take effect. Still, the difficult task of generating the script file listing all the important files is made easy for you.

The workflow is as follows:

  1. Install the module using composer as you normally would.
  2. Configure the module to ignore any additional directories or files if you want.
  3. Restart the webserver and then load some of the pages that are frequently hit on your site.
  4. Go back to the module configuration page and generate the script.
  5. Set the opcache.preload setting in your php.ini file to the generated script path.
  6. Restart your webserver and test.

Gotchas

Since the preload script is executed at the server start and the cache is permanent as long as the process is running, if you change any of the preloaded files, you have to restart the server. For this reason, it is not a good idea to use this functionality during development. The module may remain enabled but the opcache.preload setting should not be set on the developer machines.

Performance impact

In my early tests, I saw an average of 10% improvement with preloading enabled across different percentiles. This test is quite old and I should repeat this on a better machine but the result should still be indicative. Hopefully, I will be able to test this module again this month and even work on some of the issues that have been reported. I will share screenshots or even a video of the test when possible.

This is it for today’s DrupalFest post. See you tomorrow, hopefully.

Apr 03 2021
hw
Apr 03

This post will cover quickly setting up a Drupal website for testing, experimentation, or evaluating features on your local system. While I cover a different set of options briefly, I will mainly talk about a tool we have built to let us quickly scaffold Drupal sites with best practices built in. This post is a part of the DrupalFest series which celebrates 20 years of Drupal. Let’s get started.

Before I go to the main part of the article, let us look at what options you have to quickly set up a Drupal website for evaluation. While all the ways are effective, there are trade-offs typical of each method. We'll start with the simplest way to experiment with a Drupal site and then keep going up in complexity, but also in the flexibility you get.

Someone else’s machine

The easiest way to set up a Drupal site is in the Cloud, aka, someone else’s machine. Broadly, there are two ways I can think of to do this. The first method involves one of the excellent Drupal hosting providers and signing up for a free account. Drupal.org lists some of the hosting supporters who offer a free account to try out Drupal. Go to this link to get started: https://www.drupal.org/try-drupal.

While the above method lets you try out not just Drupal but also a hosting platform where you might actually run your website, you may not want to create an account to just try out Drupal. In that case, there is SimplyTest.me. This site provides a time-bound sandbox environment where you can quickly try out Drupal, one of the Drupal distributions, with or without modules and themes, and even apply patches. The sandboxes are available for 24 hours and give you complete control of the Drupal site through the UI. You don’t get command-line or SSH access this way, but for quickly testing out a module or a distribution, this is the easiest solution out there. You don’t need an account. Just pick the modules or the distribution and you have a sandbox ready in moments.

Your machine

On your machine, you have a little bit more control over the site but you do need the software required to run Drupal. If you’re only interested in quickly evaluating Drupal on your machine and still have access to the files and command line, the Drupal evaluator guide gives you an option to run this with just PHP. More details and instructions are available at the link: https://www.drupal.org/docs/official_docs/en/_evaluator_guide.html.
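As one example of that PHP-only route (the commands below are a sketch based on the common Composer workflow; the folder name and the demo_umami profile are just placeholders), a throwaway site can look roughly like this:

composer create-project drupal/recommended-project my-eval-site
cd my-eval-site
php web/core/scripts/drupal quick-start demo_umami

The quick-start command runs Drupal on PHP's built-in web server with SQLite, which is exactly the limitation the next paragraph talks about.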

The problem with the above method is that it won't run Drupal in an environment reasonably similar to where you would host it. This is good for evaluating Drupal and quickly editing a few files, but the experience of working with PHP's built-in server and SQLite would not help development. To take it a step further, you would need Docker as a prerequisite.

If you have Docker and you already have a Drupal codebase downloaded, then tools such as Lando and DDEV would help you get started in no time at all. If you want to take it a step further, keep reading to the next section.
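If you go the Lando route, a minimal .lando.yml in the project root is roughly all it takes (a sketch based on Lando's Drupal recipe; adjust the name, recipe version, and webroot to your codebase):

name: my-eval-site
recipe: drupal9
config:
  webroot: web
  php: '7.4'

After that, lando start brings up the site's containers.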

Finally, you have the classic method of installing Drupal on your machine by manually installing all the software required to run Drupal (that's PHP, Apache/nginx, MySQL/MariaDB, etc). I don't find many developers doing this anymore and it is more trouble than it is worth. If you can't run Docker for any reason, consider running it in a virtual machine using the excellent DrupalVM.

Axelerant’s template tool

Now to the main section of the post. At Axelerant, we wanted to automate the entire process of setting up a Drupal site quickly, along with the codebase, Lando configuration, some of the best practices we follow, and even CI configuration. I wrote a tool for that called axl-template which is available via pip (yes, it's written in Python and needs at least Python 3.6). Once the tool is installed, you can set up a Drupal site along with all the configuration I mentioned in a matter of minutes. I have measured the time from running the first command to having Drupal running, and it comes to just a few minutes (with the longest chunk taken by the Drupal installation itself). Here's a somewhat old video where I use this tool to set up Drupal 9 with Lando.

[embedded content]

I have added new features to this tool since I made the video, the most notable feature being able to specify modules and packages right in the first command. The idea is to have your codebase created from a single command and ready to run. I am planning to create a new video to cover these and many other features added to the tool. For now, the documentation is a good reference. Read more at https://github.com/axelerant/axl-template.

If you're averse to installing a Python package through pip and don't want to get it from the source code either, you could run it via Docker using whalebrew. Instructions for this are in the README file, but only one of the commands is supported this way.

We’re continuing to improve this tool by adding support for more common development tools that are typically used. One example that comes to my mind right now is Acquia’s BLT. I can’t promise when that support will come out but it’s on the cards. In any case, this package is released under MIT license and contributions are very welcome.

This is it for today and I am just completing this post with less than four minutes to spare to midnight. Have a good DrupalFest everyone.

Jan 25 2021
Jan 25

Newly engineered opportunities have opened the doors for Higher Education institutions to pioneer student, researcher, and funding recruitment. From deeper data applications to mass-scale live debates, the Higher Education sector is going through a digital transformation, with varying rates and approaches.

New data and accessibility regulations, as well as pressure on student recruitment from COVID-19, have required Higher Education institutions to accelerate these 'digital transformation roadmaps'.

Entire organisations have had to react and re-evaluate everything across technology implementation, face-to-face education, student recruitment, and community satisfaction.

The forces of change are drawing in at an unprecedented rate. But are universities equipped to make the quality, long-term adjustments needed?

Senior stakeholders from the University of West London, Manchester Metropolitan University, and Oxford Saïd Business School sat down with Paul Johnson, our Drupal and HE Specialist at CTI Digital to discuss their digital challenges and opportunities during a panel at DrupalCon Europe. We received a unique perspective on various UK organisations' challenges with differing cohorts, scale and complexity, age and legacy systems.

Watch the full panel here, and use the time stamps below to navigate:

[embedded content]

00:00 - Introduction with:

  • Chair and top left: Paul Johnson, HE Specialist at CTI Digital
  • Bottom left: Adrian Ellison, Associate Pro-Vice-Chancellor & Chief Information Officer at the University of West London.
  • Top right: Nick Holland, Head of Digital at Manchester Metropolitan University.
  • Bottom right: Iain Harper, Head Of Digital Marketing, Saïd Business School, University of Oxford.

05:29 - Why The University of West London chose to upgrade and continue using Drupal.

09:50 - How Manchester Metropolitan University built the business case to move to Drupal.

13:29 - Oxford Saïd Business School's experience of using Drupal 8 for multiple years.

19:30 - Managing "HiPPO" (Highest Paid Person's Opinion) and different stakeholders' opinions.

22:20 - Data-driven decision making to changes of an existing platform at Oxford.

24:58 - Managing governance for an entire platform change at MMU.

26:58 - Managing change to projects and their teams over multi-year projects.

33:54 - Lockdown and adapting working with staff and students remotely.

37:04 - Content governance and consistency.

38:54 - Designing and building a website for diverse audiences.

41:22 - What features or capabilities for Drupal should Drupal develop for HE's future?

If you're looking for a digital partner to support your digital transformation, we're the team you're looking for. Our full-service team can take you through discovery and user research to plan and define the underlying requirements that meet your business goals. Our content strategy and development teams will then be available to make your digital roadmap become a reality—all under one roof, with years of proven success.

Get in Touch

Nov 18 2020
Nov 18

From the consumer perspective, there’s never been a better time to build a website. User-friendly website platforms like Squarespace allow amateur developers to bypass complex code and apply well-designed user interfaces to their digital projects. Modern site-building tools aren’t just easy to use—they’re actually fun.

For anyone who has managed a Drupal website, you know the same can’t be said for your platform of choice. While rich with possibilities, the default editorial interface for Drupal feels technical, confusing, and even restrictive to users without a developer background. Consequently, designers and developers too often build a beautiful website while overlooking its backend CMS.

Drupal’s open-ended capabilities constitute a competitive advantage when it comes to developing an elegant, customer-facing website. But a lack of attention to the needs of those who maintain your website content contributes to a perception that Drupal is a developer-focused platform. By building a backend interface just as focused on your site editors as the frontend, you create a more empowering environment for internal teams. In the process, your website performs that much better as a whole.

UX principles matter for backend design as much as the frontend

Given Drupal’s inherent flexibilities, there are as many variations of CMS interfaces as there are websites on the platform. That uniqueness is part of what makes Drupal such a powerful tool, but it also constitutes a weakness.

The editorial workflow for every website is different, which opens an inevitable training gap in translating your site’s capabilities to your editorial team. Plus, despite Drupal’s open-source strengths, you’ll likely need to reinvent the wheel when designing CMS improvements specific to your organization.

For IT managers, this is a daunting situation because the broad possibilities of Drupal are often overwhelming. If you try to make changes to your interface, you can be frustrated when a seemingly easy fix requires 50 hours of development work. Too often, Drupal users will wind up working with an inefficient and confusing CMS because they’re afraid of the complexity that comes with building out a new interface.

Fortunately, redesigning your CMS doesn’t have to be a demanding undertaking. With the right expertise, you can develop custom user interfaces with little to no coding required. Personalized content dashboards and defined roles and permissions for each user go a long way toward creating a more intuitive experience.

Improving your backend design is often seen as an additional effort, but think of it as a baseline requirement. And, by sharing our user stories within the Drupal community, we also build a path toward improving the platform for the future.

Use Drupal’s Views module to customize user dashboards

One of the biggest issues with Drupal’s out-of-the-box editorial tools is that they don’t reflect the way any organization actually uses the CMS. Just as UX designers look to provide a positive experience for first-time visitors to your site, your team should aim for delivering a similarly strong first impression for those managing its content.

By default, Drupal takes users to their profile pages upon login, which is useful to . . . almost no one. Plus, the platform’s existing terminology uses cryptic terms such as “node,” “taxonomy,” and “paragraphs” to describe various content items. From the beginning, you should remove these abstract references from your CMS. Your editorial users shouldn’t have to understand how the site is built to own its content.

Powering Our Communities homepage

In the backend, every Drupal site has a content overview page, which shows the building blocks of your site. Offering a full list that includes cryptic timestamps and author details, this page constitutes a floodgate of information. Designing an effective CMS is as much an exercise in subtraction as addition. Whether your user’s role involves reviewing site metrics or new content, their first interaction with your CMS should display what they use most often.

Manage News interface

If one population of users is most interested in the last item they modified, you can transform their login screen to a custom dashboard to display those items. If another group of users works exclusively with SEO, you can create an interface that displays reports and other common tasks. Using Drupal’s Views module, dashboards like these are possible with a few clicks and minimal coding.
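Where a little code does help, a common companion step is sending editors to that dashboard at login instead of their profile page. Below is a minimal sketch, assuming a hypothetical custom module named mymodule and a Views page whose route is view.editor_dashboard.page_1 (both placeholder names):

<?php

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_FORM_ID_alter() for user_login_form.
 */
function mymodule_form_user_login_form_alter(array &$form, FormStateInterface $form_state, $form_id) {
  // Add a submit handler so editors are redirected after logging in.
  $form['#submit'][] = 'mymodule_user_login_form_submit';
}

/**
 * Submit handler: send the user to the (placeholder) dashboard Views page.
 */
function mymodule_user_login_form_submit(array &$form, FormStateInterface $form_state) {
  $form_state->setRedirect('view.editor_dashboard.page_1');
}

Pair a redirect like this with a handful of role-specific Views pages, and each team lands directly on the content it works with.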

By tailoring your CMS to specific user habits, you allow your website teams to find what they need and get to work faster. The most dangerous approach to backend design is to try and build one interface to rule them all.

Listen to your users and ease frustrations with a CMS that works

Through Drupal Views, you can modify lists of content and various actions to control how they display in your CMS. While Views provides many options to create custom interfaces, your users themselves are your organization’s most vital resource. By watching how people work on your site, you can recognize areas where your CMS is falling short.

Drupal content dashboard

Even if you’ve developed tools that aimed to satisfy specific use cases, you might be surprised the way your tools are used. Through user experience testing, you’ll often find the workarounds your site editors have developed to manage the site.

In one recent example, site editors needed to link to another page of the site from within the CMS. Without that functionality, they would either find the URL by viewing the source code in another tab or copy its node ID number by hand. Anyone watching these users would find their process cumbersome, time-consuming, and frustrating. Fortunately, there's a Drupal module called Linkit that we implemented to easily eliminate this needless effort.

There are many useful modules in the Drupal ecosystem that can enhance the out-of-the-box editorial experience. Entity Clone expedites the content creation process. Views Bulk Operations and Bulk Edit simplify routine content update tasks. Computed Field and Automatic Entity Label take the guesswork out of derived or dependent content values. Custom form modes and Field Groups can bring order to content creation forms and streamline them.

Most of the time, your developers don’t know what solutions teams have developed to overcome an ineffective editorial interface. And, for fear of the complexity required to create a solution, these supposed shortcuts too often go unresolved. Your backend users may not even be aware their efforts could be automated or otherwise streamlined. As a result, even the most beautiful, user-friendly website is bogged down by a poorly designed CMS.

Once these solutions are implemented, however, you and your users enjoy a shared win. And, through sharing your efforts with the Drupal community, you and your team build a more user-friendly future for the platform as well.

Sep 23 2020
Sep 23

Working in digital design and development, you grow accustomed to the rapid pace of technology. For example: After much anticipation, the latest version of Drupal was released this summer. Just months later, the next major version is in progress.

At July’s all-virtual DrupalCon Global, the open-source digital experience conference, platform founder Dries Buytaert announced Drupal 10 is aiming for a June 2022 release. Assuming those plans hold, Drupal 9 would have the shortest release lifetime of any recent major version.

For IT managers, platform changes generate stress and uncertainty. Considering the time-intensive migration process from Drupal 7 to 8, updating your organization’s website can be costly and complicated. Consequently, despite a longtime absence of new features, Drupal 7 still powers more websites than Drupal 8 and 9 combined. And, as technology marches on, the end of its life as a supported platform is approaching.

Fortunately, whatever version your website is running, Drupal is not running away from you. Drupal’s users and site builders may be accustomed to expending significant resources to update their website platform, but the plan for more frequent major releases alleviates the stress of the typical upgrade. And, for those whose websites are still on Drupal 7, Drupal 10 will continue offering a way forward.

The news that Drupal 10 is coming sooner rather than later might have been unexpected, but you still have no reason to panic just yet. However, your organization shouldn’t stand still, either.

Image via Dri.es

The End for Drupal 7 Is Still Coming, but Future Upgrades Will Be Easier

Considering that upgrading to Drupal 8 involves the investment of building a new site and migrating its content, it's no wonder so many organizations have been slow to update their platform. Drupal 7 is solid and has existed for nearly 10 years. And, fortunately, it's not reaching its end of life just yet.

At the time of Drupal 9’s release, Drupal 7’s planned end of life was set to arrive late next year. This meant the community would no longer release security advisories or bug fixes for that version of the platform. Affected organizations would need to contact third-party vendors for their support needs. With the COVID-19 pandemic upending businesses and their budgets, the platform’s lifespan has been extended to November 28, 2022.

Drupal’s development team has retained its internal migration system through versions 8 and 9, and it remains part of the plan for the upcoming Drupal 10 as well. And the community continues to maintain and improve the system in an effort to make the transition easier. If your organization is still on Drupal 7 now, you can use the migration system to jump directly to version 9, or version 10 upon its release. Drupal has no plans to eliminate that system until Drupal 7 usage numbers drop significantly.

Once Drupal 10 is ready for release, Drupal 7 will finally reach its end of life. However, paid vendors will still offer support options that will allow your organization to maintain a secure website until you’re ready for an upgrade. But make a plan for that migration sooner rather than later. The longer you wait for this migration, the more new platform features you’ll have to integrate into your rebuilt website.

Initiatives for Drupal 10 Focus on Faster Updates, Third-Party Software

In delivering his opening keynote for DrupalCon Global, Dries Buytaert outlined five strategic goals for the next iteration of the platform. Like the work for Drupal 9 that began within the Drupal 8 platform, development of Drupal 10 has begun under the hood of version 9.

A Drupal 10 Readiness initiative focuses on upgrading third-party components that count as technological dependencies. One crucial component is Symfony, which is the PHP framework Drupal is based upon. Symfony operates on a major release schedule every two years, which requires that Drupal is also updated to stay current. The transition from Symfony 2 to Symfony 3 created challenges for core developers in creating the 8.4 release, which introduced changes that impacted many parts of Drupal’s software.

To avoid a repeat of those difficulties, it was determined that the breaking changes involved in a new Symfony major release warranted a new Drupal major release as well. While Drupal 9 is on Symfony 4, the Drupal team hopes to launch 10 on Symfony 6, which is a considerable technical challenge for the platform’s team of contributors. However, once complete, this initiative will extend the lifespan of Drupal 10 to as long as three or four years.

Other announced initiatives included greater ease of use through more out-of-the-box features, a new front-end theme, a decoupled menu component written in JavaScript, and, in response to the platform's most requested feature, automated security updates that will make it as easy as possible to upgrade from 9 to 10 when the time comes. For those already on Drupal 9, these are some of the new features to anticipate in versions 9.1 through 9.4.

Less Time Between Drupal Versions Means an Easier Upgrade Path

The shift from Drupal 8 to this summer’s release of Drupal 9 was close to five years in the making. Fortunately for website managers, that update was a far cry from the full migration required from version 7. While there are challenges such as ensuring your custom code is updated to use the most recent APIs, the transition was doable with a good tech team at your side.
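Much of that custom-code work comes down to swapping deprecated calls for their supported replacements before the old APIs disappear. A small, hypothetical example of the kind of change involved:

<?php

// Before: drupal_set_message() was deprecated in Drupal 8.5 and removed in 9.0.
drupal_set_message(t('Settings saved.'));

// After: use the messenger service instead.
\Drupal::messenger()->addStatus(t('Settings saved.'));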

Still, the work that update required could generate a little anxiety given how comparatively fast another upgrade will arrive. But the shorter time frame will make the move to Drupal 10 easier for everybody. Less time between updates also translates to less deprecated code, especially if you’re already using version 9. But if you’re not there yet, the time to make a plan is now.

Mar 02 2020
Mar 02

As of Drupal 8.7, the Media and Media Library modules can be enabled and used out-of-box. Below, you'll find a quick tutorial on enabling and using these features.

Out-of-box before Media and Media Library

In the past there were two different ways to add an image to a page.

  1. An image could be added via a field, with the developer given control over its size and placement:

     Image field before media library

  2. An image could be added via the WYSIWYG editor, with the editor given some control over its size and placement:

     Image field upload choices screen

A very straightforward process, but these images could not be reused, as they were not part of a reusable media library.

Reusing uploaded media before Drupal 8.7

Overcoming image placement limitations in prior versions of Drupal required the use of several modules, a lot of configuration, and time. Sites could be set up to reference a media library that allowed editors to select and reuse images that had previously been uploaded, which we explained here.

This was a great time to be alive.

What is available with Media Library

Enabling the Media and Media Library modules extends a site's image functionality. First, ensure that the Media and Media Library core modules are enabled. 

Enable media library in drupal

A media entity reference field must be used with the Media Library. It will not work with a regular image field out-of-box.

Image field on manage display page

On the Manage form display page, select the "Media library" widget.

Media library widget on manage display page
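If you'd rather set this up in code, for example in an install or deploy hook, the sketch below shows one way to do it; the field name, bundle, and media type used here are placeholders to adapt to your site:

<?php

use Drupal\field\Entity\FieldConfig;
use Drupal\field\Entity\FieldStorageConfig;

// Field storage for a media reference field on nodes (placeholder name).
FieldStorageConfig::create([
  'field_name' => 'field_media_image',
  'entity_type' => 'node',
  'type' => 'entity_reference',
  'settings' => ['target_type' => 'media'],
])->save();

// Attach the field to the "article" bundle, limited to image media.
FieldConfig::create([
  'field_name' => 'field_media_image',
  'entity_type' => 'node',
  'bundle' => 'article',
  'label' => 'Image',
  'settings' => [
    'handler' => 'default:media',
    'handler_settings' => ['target_bundles' => ['image' => 'image']],
  ],
])->save();

// Use the Media Library widget on the article form display.
\Drupal::service('entity_display.repository')
  ->getFormDisplay('node', 'article')
  ->setComponent('field_media_image', ['type' => 'media_library_widget'])
  ->save();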

On the "Node Add" and "Node Edit" forms, you’ll see the below difference between a regular image field and a field connected to the media library.

Media library field on node edit

Click on “Add media” and you’ll see a popup with the ability to add a new image to the library or to select an image that is already in the library.

Media field grid

With a simple field configuration, if multiple media types are allowed, you'll see vertical tabs for each media type.

Media grid with multiple media types

WYSIWYG configuration

The WYSIWYG editor requires a few steps when configuring the media library for a specific text format. First, a new icon will appear with a musical note overlapping the image icon. This should be added to the active toolbar and the regular image icon should be moved to the available buttons.

wysiwyg toolbar configuration

Under “Enabled filters,” enable “Embed media.” In the filter settings, you can choose the allowed media types and view modes. Once that configuration is saved, the WYSIWYG editor offers the same popup dialog for adding a new image to the media library or selecting an already-uploaded one.

wysiwyg media configuration

Once you are on a “Node Add” or “Node Edit” page with a WYSIWYG element, you’ll see the media button (image icon plus musical note).

Media button on wysiwyg editor

Clicking on the media button brings up the same, familiar popup that we saw earlier from the image field:

media library grid

This article is an update to a previous explainer from last year. 

Jan 30 2020
Jan 30

Recently, we were asked if we could integrate some small, one-page websites into an existing Drupal website. This would not only make it easier to manage those different websites and their content, but also reduce the hosting and maintenance costs.

In this article, I will discuss how we tackled this problem, improved the content management experience, and implemented it following Drupal best practices.

First, some background information: America’s Promise is an organization that launches many national campaigns that focus on improving the lives and futures of America’s youth. Besides their main website (www.americaspromise.org), they also had separate websites and domain names for some of these campaigns, e.g. www.everyschoolhealthy.org.

We came up with a Drupal solution where they could easily configure and manage their campaigns. Besides the convenience of managing all these campaigns from one admin panel, they could also easily reference content items from their main website or other campaigns by tagging the content with specific taxonomy terms (keywords).

We created a new content type, “Campaign,” with many custom paragraph types as the building blocks for creating a new campaign. We wanted this to be as easy as possible for the content editors while still giving each campaign the freedom to have its own branding through a font color and a background image, color, or video.

Below are some of the paragraph types we created:

  • Hero
  • Column Layout
  • WYSIWYG
  • Latest News
  • Newsletter Signup
  • Twitter Feed
  • Video Popup
  • Community Partners
  • Latest Resources
  • Grantee Spotlight
  • Statistics Map
  • Partner Spotlight
  • Media Mentions

These paragraphs offer lots of flexibility to create unique and interactive campaigns, and they can be reordered by drag and drop however you’d like.

Below is a screenshot of some of these paragraph types in action, and of how easily they can be configured on the backend.

Every School Healthy paragraphs

Below you can see how the “Hero” paragraph looks in the admin panel. The editor enters a tagline, chooses a font color, uploads a logo and an optional background image or video, and sets a background overlay color with optional opacity.

Campaign Builder Hero Backend

As you can see in the above screenshot, this is a very basic paragraph type, but it shows the flexibility in customizing the building blocks for the campaign. We also created more complex paragraph types that required quite a bit of custom development.

One of the more complicated paragraph types we created is a statistics map. America’s Promise uses national and state statistics to educate and strengthen its campaign causes.

Campaign Builder Statistics Map

The data for this map comes from a Google Sheet. All necessary settings can be configured in the backend system. Users can then view these state statistics by hovering over the map or see even more details by clicking on an individual state.

Campaign Builder Statistics Map Backend
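As a general illustration only, and not the project's actual integration code, pulling rows from a Google Sheet that has been published to the web as CSV can be as simple as one request through Drupal's HTTP client; the sheet URL below is a placeholder:

<?php

// Placeholder publish-to-web CSV link for the Google Sheet.
$url = 'https://docs.google.com/spreadsheets/d/e/EXAMPLE_ID/pub?output=csv';

// Drupal's HTTP client is a Guzzle instance, so a plain GET works.
$response = \Drupal::httpClient()->get($url);

// Split the CSV body into rows for the map component to consume.
$rows = array_map('str_getcsv', array_filter(explode("\n", (string) $response->getBody())));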

Some other interesting paragraph types we created are:

  • Twitter Feed, where editors can specify a #hashtag and the matching tweets display in a nice masonry layout
  • Newsletter Signup, where editors select which newsletter campaign the user signs up for
  • Latest News/Resources, where editors select the taxonomy term used to filter the content

Time to dive into some of the more technical approaches we took. The campaign builder we developed for America’s Promise depends on several Drupal contrib modules:

  • paragraphs
  • bgimageformatter
  • color_field
  • video
  • masonry (used for the Twitter Feed)

Font color and background image/color/video don’t need any custom code; they can be handled with the above modules by configuring the correct CSS selectors on the paragraph display:

Campaign Builder Hero Display

In our custom campaign builder module, we have several custom Entities, Controllers, Services, Forms, REST resources and many twig template files. Still, the module mainly consists of custom field formatters and custom theme functions.

Example: the “Latest News” paragraph only has one field where the editor can select a taxonomy term. With a custom field formatter, we will display this field as a rendered view instead. We pass the selected term as an argument to the Latest News view, execute the view and display it with a custom #theme function.
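As a rough sketch of that pattern (the module name, plugin ID, class, view name, and display ID below are placeholders rather than the project's actual code), such a field formatter might look like this:

<?php

namespace Drupal\campaign_builder\Plugin\Field\FieldFormatter;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\FormatterBase;
use Drupal\views\Views;

/**
 * Renders a selected taxonomy term as an embedded "Latest News" view.
 *
 * @FieldFormatter(
 *   id = "campaign_latest_news",
 *   label = @Translation("Latest News view"),
 *   field_types = {
 *     "entity_reference"
 *   }
 * )
 */
class LatestNewsFormatter extends FormatterBase {

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];

    foreach ($items as $delta => $item) {
      // Pass the selected term ID as a contextual argument to the view
      // and hand back its render array.
      $view = Views::getView('latest_news');
      if ($view && $view->access('default')) {
        $elements[$delta] = $view->buildRenderable('default', [$item->target_id]);
      }
    }

    return $elements;
  }

}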

Conclusion

By leveraging the strength of Paragraphs, other contrib modules, and some custom code, we were able to create a reusable and intuitive campaign builder in which ease of content management was a priority, without limiting the design or branding of each campaign.

Several campaigns that are currently live and built with our campaign builder:

Could your organization benefit from having your own custom campaign builder and want to see more? Contact us for a demo.

Jan 23 2020
Jan 23

Solutions exist to make working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development.

The post Make Drupal 7 Development Fun appeared first on Four Kitchens.

Dec 09 2019
Dec 09

With Drupal 9 set to be released next year, upgrading to Drupal 8 may seem like a lost cause. However, beyond the fact that Drupal 8 is superior to its predecessors, it will also make the inevitable upgrade to Drupal 9, and future releases, much easier.

Acquia puts it best in this eBook, where they cover common hangups that may prevent migration to Drupal 8 and the numerous reasons to push past them.

The Benefits of Drupal 8

To put it plainly, Drupal 8 is better. Upon its release, the upgrade shifted the way Drupal operates and has only improved through subsequent patches and iterations, most recently with the release of Drupal 8.8.0.

Some new features of Drupal 8 that surpass those of Drupal 7 include improved page building tools and content authoring, multilingual support, and the inclusion of JSON:API as part of Drupal core. We discussed some of these additions in a previous blog post.

Remaining on Drupal 7 means hanging on to a less capable CMS. Drupal 8 is simply more secure with better features.

What Does Any of This Have to Do With Drupal 9?

With an anticipated release date of June 3, 2020, Drupal 9 will see the CMS pivot to a continuous, incremental release model, moving away from the disruptive, rebuild-style major upgrades of the past. That means migrating to Drupal 8 is the last major migration Drupal sites will have to undertake. As Acquia points out, one might think, “Why can’t I just wait to upgrade to Drupal 9?”

While migrating from Drupal 7 to Drupal 9 will be essentially the same process as migrating to Drupal 8, Drupal 7 goes out of support in November 2021. As that deadline approaches, upgrading will only become an increasingly pressing necessity. By migrating to Drupal 8 now, you avoid the complications that come with a hurried migration and can take on the process incrementally.

So why wait? 

To get started with Drupal migration, be sure to check out our Drupal Development Services, and come back to our blog for more updates and other business insights. 
 

Oct 06 2019
Oct 06

One of my sites has a listing of content shown as teasers. The teaser for this content type is defined to show the title, an image, and a few other fields. In the listing, the image is linked to the content so that the visitor may click on the image (or the title) to open the content. All this is easily achievable through regular Drupal site building.

Drupal Manage Fields Page

I wanted to change the functionality so that clicking on the image would open the content in a new tab. This is easy for the title, as the title is linked right from within the template (node--content-type--teaser.html.twig). I just have to add the target="_blank" attribute to the anchor tag for the title and I am done. Doing this for the image is not so easy.

Challenges with the Image Formatter

The reason it isn’t easy for the image is the image field formatter provided by the Drupal core. It provides an option to render the image as a link and link it to either the content or the file itself, but no option to open it in a new tab.

Image Formatter Options

Digging through the code for the image formatter, I found the template that drives it: image-formatter.html.twig. This template does not help much directly either, as we cannot conditionally modify it only for teasers of certain content types, as I wanted. If I override this file, it will affect the image formatter everywhere.

I stumbled upon an approach while searching for solutions to this, hoping I would run across an issue in Drupal core that would add this feature. I would simply apply the patch and voila, I would solve my problem, give feedback if it worked, and open source wins. Unfortunately, there was no such issue, but I found something related: Image formatter does not support URL/Link options.

Sidenote: I could have created a feature request to add this feature and maybe I’ll do it when I get a chance. Today, I was just eager to get this working.

Back to the issue. I see there is a change in the template to use the link function rather than just writing an anchor tag. This is not very helpful to me by itself. But the issue summary described my problem, and it is strange that the patch (which was committed) does not fix it. So, I dug in more. I read the test in the patch, which showed me how I could implement what I needed.

Solution

Since my solution depends on the above patch, it would work only with Drupal 8.7.4 or later. The test in the patch added attributes to the link by accessing the build array and directly setting the attribute on the URL object that is passed to the template. I realised I could do the same with template_preprocess_node.

Usage of template_preprocess_node
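A minimal sketch of that preprocess implementation, assuming a custom module named mymodule, a content type named article, and an image field named field_image (all placeholder names):

<?php

use Drupal\Core\Render\Element;

/**
 * Implements hook_preprocess_node().
 */
function mymodule_preprocess_node(array &$variables) {
  // Only touch teasers of the one content type we care about.
  if ($variables['node']->bundle() !== 'article' || $variables['view_mode'] !== 'teaser') {
    return;
  }

  if (empty($variables['content']['field_image'])) {
    return;
  }

  // Each delta of the rendered image field carries the Url object that the
  // image formatter passes to its template; add target="_blank" to it.
  foreach (Element::children($variables['content']['field_image']) as $delta) {
    if (isset($variables['content']['field_image'][$delta]['#url'])) {
      $variables['content']['field_image'][$delta]['#url']
        ->setOption('attributes', ['target' => '_blank']);
    }
  }
}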

In the code sample above, I am targeting the specific content type and display mode (teaser) I want to modify. I use Element::children to loop through all the items in the field and access the URL object. I directly call setOption on the URL object to set the attribute target="_blank".

It might seem like a lot of work, but this is the fastest way I know (and the smallest code change). If you know of a better way, or of the Drupal core issue that would have given this option, please let me know in the comments. I hope this post was useful. Thanks for reading.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
