Jan 16 2021

Drupal's Problem With Decorative Images

Decorative images are those that should not be described to assistive technology like screen readers. A simple example is a fancy image of a decorative border that you insert at the end of a news article.

Supporting decorative images in Drupal is quite straightforward. Image fields can be set to collect alternative text but not require it. When left blank, Drupal will output an empty alt attribute for the image element, indicating to screen readers that the image should be ignored. Editors who want to convey that an image is decorative can then simply leave the alt text blank.

The problem with this approach is that it might encourage editors to leave alt text blank even if the image is not really decorative. It's easy to pass right by the alt text field without even thinking about it. This is especially true for those who are completely unfamiliar with how important alternative text is for accessibility.

There's a small effort underway to resolve this problem in Drupal core. After some great discussions, particularly with Andrew Macpherson, a leading voice in Drupal accessibility, we landed on a potential solution (thanks Andrew!).

A Potential Solution

Instead of simply leaving the alt text field blank to indicate the image is decorative, a checkbox labeled "Decorative" is introduced beneath the field. This checkbox must be checked if the editor wants to leave the alt text blank. This effectively makes the choice explicit rather than implicit, forcing the editor to pause for a second and consciously affirm that the image is truly decorative.

To let this solution incubate and evolve a bit, I contributed it as a small module called Decorative Image Widget. It's quite flexible: instead of providing a brand new image widget, it modifies any image widget that is based on the default one provided by Drupal core.

Check it out and please provide feedback in the module's issue queue!

Jan 15 2021

20 years ago, the Drupal project started in a dorm room—today it is a billion dollar industry.

PORTLAND, Ore., U.S.A and LOCAL AREA HERE, January 15, 2021—Drupal, the world’s leading open source digital experience platform (DXP), celebrates 20 years of community-driven innovation. Since its founding 20 years ago, Drupal has touched millions of lives. One in 30 sites on the web is powered by Drupal, and that means most users of the web have experienced Drupal—even if they don't know it. 

Drupal has pioneered the evolution of content delivery across multiple channels. Whether powering conversational user interfaces (CUI) for smart devices, pushing content to digital signage for New York Metropolitan Transportation Authority (MTA), or serving as the core content store for augmented reality experiences, Drupal’s sophisticated architecture and platform stand ready for the future of digital content. 

Redefining digital experiences

7 years ago—on the eve of Drupal's birthday—Drupal founder and project lead, Dries Buytaert, laid out his belief that the web was entering a new era. 

“Mobile had transformed the web, but I believed this was just the beginning. The mobile web was the first example of a new web defined by digital experiences that conform to a user's context and devices,” says Buytaert. “Since then, Drupal has defined itself as the leading platform for ambitious digital experiences, and as channels and devices proliferate, Drupal will continue to lead the open source DXP market.”

Powered by a global community of innovation

“As part of this 20 year milestone, we celebrate our community of more than 100,000 contributors who made Drupal what it is today,” says Heather Rocker, executive director of the Drupal Association. “Success at this scale is possible because the Drupal community exemplifies the values of open source and proves that innovation is sustained by healthy communities. One of our key goals at the Drupal Association is to convene the resources necessary for continued project innovation, and we do that through the collaboration of a global community that continues to grow year after year.”

In fact, Drupal contribution increased by 13% in a year when many industries contracted due to the COVID-19 pandemic. Drupal's open source model has built a robust and thriving ecosystem of individual contributors, professional service providers, and end-user organizations that is well positioned to capitalize on the next 20 years of digital innovation. 

Drupal continues to evolve, serving needs around the globe and expanding into new markets.  Future-looking priorities include a continued positive impact on the Open Web, the cultivation of a diverse and inclusive open source community, and an increased focus on editorial experience and usability—to make the power of the Drupal digital experience platform even more accessible.  

2021 will be marked with year-long celebrations happening around the world with particular focus at DrupalCon in April. Related 20th birthday events can be found on social media through the hashtag #CelebrateDrupal and at CelebrateDrupal.org.  

About Drupal and the Drupal Association

Drupal is the open source digital experience platform utilized by millions of people and organizations around the world, made possible by a community of 100,000-plus contributors and enabling more than 1.3 million users on Drupal.org. The Drupal Association is the non-profit organization focused on accelerating Drupal, fostering the growth of the Drupal community, and supporting the Project’s vision to create a safe, secure, and open web for everyone.

 
###
 
For more information or interview requests contact Heather Rocker,  [email protected]
 

Jan 15 2021
Birthday cupcakes

On January 15, 2001, exactly 20 years ago, I released Drupal 1.0.0 into the world. I was 22 years old and had just finished college. At the time, I had no idea that Drupal would someday power 1 in 35 websites, and impact so many people globally.

As with anything, there are things Drupal did right, and things we could have done differently. I recently spoke about this in my DrupalCon Europe 2020 keynote, but I'll summarize some thoughts here.

Why I'm still working on Drupal after 20 years

Me, twenty years ago, in the dorm room where I started Drupal. I'd work on Drupal sitting in that chair.

I started Drupal to build something for myself. As Drupal grew, my "why", or reasons for working on Drupal, evolved. I began to care more about its impact on end users and even non-users of Drupal. Today, I care about everyone on the Open Web.

Optimizing for impact means creating software that works for everyone. In recent years, our community has prioritized accessibility for users with disabilities, and features like lazy loading of images that help users with slower internet connections. Drupal's priority is to continue to foster diversity and inclusion within our community so all voices are represented in building an Open Web.

Three birthday wishes for Drupal

Me in 2004, giving my first ever Drupal presentation, wearing my first ever Drupal t-shirt.

Drupal's 20th birthday got me thinking about things I'm hoping for in the future. Here are a few of those birthday wishes.

Birthday wish 1: Never stop evolving

Only 7% of the world's population had internet access when I released Drupal 1 in 2001. Smartphones and the mobile web didn't exist. Many of the largest and most prominent internet companies were either startups (e.g. Google) or had not launched yet (e.g. Facebook, Twitter).

A list of technology events that came after Drupal (the first mobile browser, social media, etc.), and that directly or indirectly impacted Drupal. To stay relevant, Drupal had to adjust to many of them.

Why has Drupal stayed relevant and thrived all these years?

First and foremost, we've been focused on a problem that existed 20 years ago, exists today, and will exist 20 years from now: people and organizations need to manage content. Working on a long-lasting problem certainly helps you stay relevant.

Second, we made Drupal easy to adopt (which is inherent to Open Source), and kept up with the ebbs and flows of technology trends (e.g. the mobile web, being API-first, supporting multiple channels of interaction, etc).

The great thing about Drupal is that we will never stop evolving and innovating.

Birthday wish 2: Continue our growing focus on ease-of-use

For the longest time I was focused on the technical purity of Drupal and neglected its user experience. My focus attracted more like-minded people. This resulted in Drupal's developer-heavy user experience, and poor usability for less technical people, such as content authors.

I wish I had spent more time thinking about the less technical end user from the start. Today, we've made the transition, and are much more focused on Drupal's ease-of-use, out-of-the-box experience, and more. We will continue to focus on this.

Birthday wish 3: Economic systems to sustain and scale Open Source

In the early years of the Open Source movement, commercial involvement was often frowned upon, or even banned. Today it's easy to see the positive impacts of sponsored contributions on Drupal's growth: two-thirds of all contributions come from Drupal's roughly 1,200 commercial contributors.

I believe we need to do more than just accept commercial involvement. We need to embrace it, encourage it, and promote it. As I've discussed before, we need to reward Makers to maximize contributions to Drupal. No Open Source community, Drupal included, does this really well today.

Why is that important?

In many ways, Open Source has won. Open Source provides better quality software, at a lower cost, without vendor lock-in. Drupal has helped Open Source win.

That said, scaling and sustaining Open Source projects remains hard. If we want to create Open Source projects that thrive for decades to come, we need to create economic systems that support the creation, growth and sustainability of Open Source projects.

The alternative is that we are stuck in the world we live in today, where proprietary software dominates most facets of our lives.

In another decade, I predict Drupal's incentive models for Makers will be a world-class example of Open Source sustainability. We will help figure out how to make Open Source more sustainable, more fair, more egalitarian, and more cooperative. And in doing so, Drupal will help remove the last hurdle that prevents Open Source from taking over the world.

Thank you

A group photo taken at DrupalCon Seattle in 2019.

Drupal wouldn't be where it is today without the Drupal community. The community and its growth continues to energize and inspire me. I'd like to thank everyone who helped improve and build Drupal over the past two decades. I continue to learn from you all. Happy 20th birthday Drupal!

Jan 15 2021

Open-source has the power to change the world, but, as we depend on it for democratic innovation, open-source also depends on us to thrive. At Axelerant, we know and own this; hence we’re constantly engaging in different open web communities, including Drupal’s.

Why are we writing this? First of all, we are always keen to shine a light on our team members, because our people-first culture is what makes Axelerant succeed. Second, in a knowledge-sharing spirit, we want to put out what has worked for us (and what we struggle with) regarding contributing and our community involvement.

We are celebrating Drupal’s 20th Anniversary, and we are proud to have been part of that history for over a decade. What better way to celebrate than recognizing and sharing the stories of the people involved, the makers who keep the ball rolling?


Hussain Abbas
Director of Drupal Services

"Celebrating our people and the community has been among our values since the beginning. Drupal’s 20th anniversary is one of those occasions where both of these values come together in demonstrating Axelerant’s commitment to be a productive part of the amazing Drupal community through its team."

Here, we want to share a few stories from team members who recently contributed and inspired us with their Drupal journey.

Lessons learned in our Monthly Contribution Meetups

We started Monthly Contribution Meetups in 2019 to foster a culture of mentoring and giving back. Our goal is to get more people contributing to Drupal consistently and to provide the tools to those who want to do it for the first time. These meetings are an excellent space to seek out support, share findings, and learn, and they are an opportunity to get to know other team members, their Drupal journeys, and their motivations. From these sessions, we continue to grasp the familiar obstacles people encounter when contributing, ideas on how to overcome them, and the benefits that come with getting involved.


November’s monthly contribution meetup

Thirst for learning overcomes time constraints


Hansa Pandit
Frontend Engineer - L2

“I was first introduced to Olivero reading about it on different blogs. That caught my eye. I read the documentation, got my set up ready, jumped right into a coding sprint, and assigned myself an issue. I wanted to work on a feature, so when the theme went into the core, I would be able to say: that is the part I built.”

Hansa has been on Drupal.org for over two years, and besides other contributions, she’s been actively involved with the Olivero theme initiative.

Time management was a big challenge for Hansa, especially since she gave Olivero the same priority level as other work-related projects. But the logic was clear; she knew that if she was investing her time towards contribution, she needed to benefit from it by learning.

And she declares the experience made her technically stronger, “I learned a lot of new skills. Other projects I worked on supported specific client's needs. Still, for Olivero, we had to make sure we were theming every single module supported by Drupal while making sure we met all the accessibility standards.”

Olivero is now in core; we are proud of Hansa, and we celebrate her and everyone involved in this achievement.

Find the right initiative, and don’t do it for the credit


Mohit Aghera 
PHP/Drupal Architect - L1

It is important to focus on learning and exploring instead of doing it for the credits: “I decided to focus on this initiative because I was interested in learning about writing test cases for Drupal Core. It was a quick way to get introduced to this, and also a great opportunity to explore almost every feature of the core, instead of focusing on a specific module.”

Mohit is one of our most experienced Drupal engineers and contributors; hence he’s continuously mentoring the team. In our last meetup, he explained his motivations and experience with the Bug Smash Initiative; “it’s a great initiative to devote energy to, because it is well managed. Maintainers do an excellent job triaging issues,” he argued. We often hear that not knowing where to start or feeling overwhelmed by the issue queue translates into demotivation within weeks. Counting on careful planning and mentoring makes life easier for everyone, which is why finding the right initiative becomes essential.  

A second factor to consider while contributing is the right motivation. We always remind ourselves of the opportunities that come with contributing: personal branding, sharing your work, and showcasing a visible portfolio. And “ultimately if you want to learn Drupal Core, contributing is the best way to do it,” he insists.

Clear expectations help first-time contributors


Abhay Saraf
PHP/Drupal Engineer - L2

When asked what could be done differently to motivate others to join these sprints, he told us, “being clear about expectations and providing resources that display a step by step before the event would make the experience less intimidating.”

As founding members of the Drupal India Association, we also look to align our mentoring and contribution efforts with India’s larger Drupal community. Organizing and hosting monthly contribution weekends is one way to boost a sustainable contribution culture, and Abhay recently joined this initiative for the first time. From his experience, we confirmed that meeting folks, running into smiling faces, and having the space to give back without the pressure of getting lost or making a mistake is fundamental to onboarding newcomers. “I had a good experience because I already had a list of prioritized issues. I could work with a free mind since I knew that I'd get the guidance needed if I had any doubts. Also, I liked the flexibility of this event, it goes on for a day, you can dedicate any amount of time you can, even if it is just an hour, it would still be worthwhile,” he shared.

Contribution = Recognition = More contribution


Gaurav Kapoor 
PHP/Drupal Engineer - L2

Gaurav's efforts were rewarded with a scholarship to attend DrupalCon Amsterdam 2019. Through this contribution journey, he gained vast Drupal knowledge, “now I focus on mentoring and sharing knowledge, so others can also leverage all you can gain from contributing,” he says. 

Gaurav’s Drupal journey started right after college, when he decided to leverage his spare time by joining a two-person startup. After learning Drupal, he soon realized that contributing to the community would build the company’s reputation as trusted experts, and that was the initial driver. Eventually, what sparked a community spirit was getting noticed and recognized. He’s been ranked among the top 30 contributors and recognized in Dries’ post about Who sponsors Drupal development? for the past three years.

Events and the power of networking


Kunal Kursija
PHP/Drupal Engineer - L3

Kunal has the habit of surfing through different channels that list upcoming events (Drupical, Drupal.org, Drupal Slack), so when he found out about BADCamp 2020’s call for papers, he decided to go for it. A two-way process started: “I began to review everything I had learned recently or topics I wanted to learn about.” From there, Kunal came up with a list of topics and submitted them.

 

Speaking at events has many benefits, especially to those interested in being seen as an authority in their fields. Presenting sessions nourishes the community with knowledge and best practices and builds the speaker’s reputation and network. That was certainly the case for Kunal. “I first heard about BADCamp while attending DrupalCamp London. Someone I met there told me BADCamp is one of the best Drupal events. That image struck me and has stayed with me since then.” 

 “Of course, it was exciting to learn my session had been selected. I was disappointed I couldn’t attend the event in person. However, I enjoyed getting introduced to other BADCamp speakers, and it was great to participate in such a big and important event.”

To many more years of Drupal

We recognize that our monthly meetups serve to keep an ongoing conversation around contributions, inspire and support team members, and promote those who actively get involved. Our team works with a contribution-first approach, and this practice grants us a place at the top of the ranking of organizations supporting Drupal. And yet, there's more we need to do to build up a sustainable contributing culture. We still find that most people who haven't contributed before judge the onboarding process as too arduous, with time constraints following close behind. Even with mentorship support available, the steep learning curve remains a hurdle to conquer.

We are continually discussing and exploring initiatives to encourage contribution, from creating a role for a full-time contributor to gamification aspects around tracking contributions or mentoring team members on the bench between projects. 

Today we introduced a select few stories, evidence that sustains again and again that people are the key ingredient and the strength of this 20-year-old open-source project.

We are excited to be part of this celebration and would love to hear about your contribution strategies and ideas. What’s your preferred way to give back to Drupal?

Don’t forget to join the celebration on social media!

P.S. See you at the Global Contribution Weekend happening 29-31 January 2021.

Jan 15 2021

As expected, Drupal 9.1 was released on schedule at the close of 2020. We have already talked about the Drupal 9 release and how it’s a testament to the predictable and reliable nature of the Drupal release cycle. Drupal 9.1 takes a step forward by adding more features and releasing them as predicted.

In this blog, we will be discussing the new improvements and more that will follow. 

Is it worth upgrading?

The Drupal 9.1 stable release was out as expected on Dec 2nd, 2020. We previously advocated that if you are on Drupal 8.9, you needn’t hurry to upgrade to Drupal 9.0 as you would not see many new features. But that’s changed.

Drupal 9.1 adds exciting features and updates along with support for PHP 8 (we have previously written about making Drupal 9 compatible with PHP 8).

It’s also worth upgrading as Drupal 9.1 brings significant changes in the user interface for both sighted users and assistive technology.

New features

Olivero theme

The much-awaited beta experimental frontend theme Olivero has been added to Drupal core. As a replacement for Bartik, this is a modern and clear theme planned to become the new default Drupal theme later.

This particular theme is named after Rachel Olivero (1982-2019), the head of the organizational technology group at the National Federation of the Blind. She was a well-known accessibility expert and a Drupal community contributor.

Additions to the Claro theme

Claro was added as an experimental theme in Drupal 8.8. Now, Drupal 9.1 has added designs for various key pages like the extensions administration page, views administration, and the status report. The media library received Claro-styled designs too.

Composer 2 and PHP 8 support

Drupal 9 fully works with Composer 2 and it is strongly recommended to update. Many of the popular plugins have also been updated. If the one you use doesn’t have updates, please help the plugin author with a PR to add the support (it’s quite easy). The new release comes with a significant improvement in performance and also reduces memory usage.
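If you are still on Composer 1, the switch is a single self-update away; as a quick sketch (run it wherever your Composer binary lives, it moves you onto the 2.x release channel):

$ composer self-update --2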

Drupal 9.1 has added support for PHP 8. PHP 8 brings in a lot of new language features, and even though Drupal core isn’t using any of them yet (it still supports PHP 7.3), you could use features like union types and more in your custom code. Further, it’s likely that PHP 8 could be a requirement for Drupal 10, due for release in 2022.

Additionally, the user experience has been improved by making the content load faster as the images rendered by Drupal with known dimensions will now be set to lazy-load automatically. 

How to update from a previous version of Drupal

Now, this begs an important question: how will current users of Drupal 7 or 8 migrate to Drupal 9.1? And for users who have already migrated to Drupal 9, is there anything they need to do for this release?

Every version of Drupal demands a different approach to migration. The idea is to pick the right Drupal migration strategy. Let’s look at how to migrate from different versions in this section. 

Upgrade from Drupal 7

Drupal 7 users can easily migrate to Drupal 8.9, or migrate directly to 9.0 or 9.1. Migrating directly to Drupal 9/9.1 will help them skip a step. The upgrade path for multilingual sites remains stable in Drupal 8.9, 9.0, and 9.1!

For more on how to upgrade from Drupal 7, check out the ultimate guide to Drupal migration

Upgrade from Drupal 8

For Drupal 8 users, there’s still time to step up to the latest 8.9 version until the end of Drupal 8, i.e., in November 2021. The bug fixes will continue and the next one is scheduled for January 6, 2021. 

Sites on Drupal 8.8 will no longer receive security coverage. This means moving to Drupal 8.9/9 becomes crucial from this update onwards. 

According to Drupal.org, of the top 1000 most used drupal.org projects, 85 percent are updated for Drupal 9, so there is a high likelihood that most of the modules and themes you rely on are compatible.

Upgrade from Drupal 9

Drupal 9.1 is a minor release of Drupal 9. Sites already on Drupal 9 can update to it and use these new features without breaking backward compatibility (BC) for public APIs. While Drupal 9 will keep requiring Symfony 4, Drupal 9.1 already includes the adjustments required to support Symfony 5.

All these updates are underway to make Drupal 9 forward-compatible with Symfony 5 and 6 (the latter not yet released). And with Drupal 10 planned for mid-2022, these upgrades set an excellent growth curve.

Running the update

We will only talk about updating from Drupal 8.9 or Drupal 9 in this section. Updating multiple versions is possible but needs additional care and consideration, which we won’t cover in this section.

  • First of all, if you are already using the Olivero theme in your project, remove that by running this command. We need to do this as Drupal 9.1 includes Olivero in the core.

$ composer remove drupal/olivero

  • To begin an upgrade from Drupal 8.9 or Drupal 9, run the following command:

$ composer require drupal/core:^9.1 drupal/core-composer-scaffold:^9.1 --update-with-dependencies

  • If your project is using drupal/core-recommended, use that instead of drupal/core in the command above. Also, for the above to work, your project must be using the recommended Drupal Composer template. It is quite likely that the command might throw some dependency-related errors. 

Since there are a wide variety of possible dependency issues, we won’t cover everything here. But to get started, try replacing the --update-with-dependencies flag with the --update-with-all-dependencies flag in the command above and try again.
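For example, the broader variant of the command above would look like this (same packages as before; swap in drupal/core-recommended if that is what your project uses):

$ composer require drupal/core:^9.1 drupal/core-composer-scaffold:^9.1 --update-with-all-dependencies

Once Composer finishes, run Drupal's database updates (for example with drush updatedb) as you would after any core update.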

Drupal 9.1 seems to be a promising update for users ready to take the plunge. If you are still not sure, give us a chance to convince you why upgrading to Drupal 9 is crucial now.

Share your Drupal 9 experience with us and watch this space for more insights!

Jan 15 2021

Twenty years ago today, Drupal was founded by Dries Buytaert. Looking back twenty years, to a dorm room at the University of Antwerpen where the Drupal story began, a lot has been achieved.

In the world of technology and open source, where changes are inevitable and happen fast, it's a great achievement to reach the age of twenty. Over the years the Drupal project has developed from a simple CMS into an open source digital transformation system that millions of people around the world use on an everyday basis.

One of every thirty websites in the world is built with Drupal

In the last 20 years Drupal has become one of the leading open source frameworks in the world, where the Drupal community, companies and end users have created a story that is nowhere near ending. Throughout Drupal's history it has always been a priority to solve problems for the end user, reacting fast to new technologies and adapting them into the Drupal project.

Looking at Drupal as an open source project where the focus is on making it easy for new users to get started, it has proved its case. In a modern world, where companies and governments want to own their projects and have the freedom of choosing whom to work with, it is necessary to provide good quality software with the aim of lowering costs by avoiding vendor lock-in.

At 1xINTERNET everything revolves around Drupal. Our business model is based on Drupal the technology, and our employees are truly dedicated Drupalists. We take part in the community work, and we do not only have ambitions for 1xINTERNET but also for Drupal as a project. We care about the future of the Drupal ecosystem, and it's our wish that together we can build an even stronger community with a growing number of contributors, companies like us at 1xINTERNET, and the end users that we service on a daily basis. With all of us becoming makers, it will be interesting to see where the next decade takes us.

We strongly believe in the power of the extraordinarily innovative Drupal community, and we are proud every day, when meeting new clients or onboarding new employees, to be part of the Drupal universe.

Celebrating 20 years of Drupal

During the year 2021 we will be focusing on highlighting our work at 1xINTERNET and our solutions made with Drupal. We call this series “Celebrating 20 years of Drupal” where we will highlight 20 projects through the year that we have been involved in and where Drupal has been used. Every month we will feature case studies, introducing our latest projects as well as launching product lines from 1xINTERNET. So make sure you follow us on our social media to get the news hot and fresh! 

Transgourmet

Today, to kick off the year of celebrating 20 years of Drupal, we are highlighting a case study for Transgourmet, one of our great clients that we built a multi-site solution for last year. Transgourmet is a food wholesaler in Germany for gastronomy, hotels and bulk consumers.

1xINTERNET was asked to review all websites of Transgourmet and their brand Selgros Cash and Carry, and to establish the same technical requirements and standards for all of them. The task was to create a consistent and flexible solution that would enable the relaunch of all websites of the company. Therefore a robust, secure and efficient CMS needed to be used, whose development and maintenance can be decoupled in time.

For a company like Transgourmet, a project like this is an essential part of completing the digital strategy. 1xINTERNET created a Drupal distribution on which all their websites can run smoothly. The front end was integrated into a design system from Patternlab.

This case study won the Enterprise category at the German Splash Awards 2020 which was held in October 2020.

Jan 15 2021

Last year, Mariano had a proposal: let’s try to automatically deploy after successful testing on Travis. We never had anything like that before; all we had was a bunch of shell scripts that assisted the process. TL;DR: upgrading to CD level was easier than we thought, and we have introduced it for more and more client projects.

This is a deep dive into the journey we had, and where we are now. But if you’d like to jump right into the code, our Drupal Starter Kit has everything already wired in - so you can use everything yourself without too much effort.

Prerequisites

At the moment the site is assembled and tests are run on every push. To be able to perform a deployment as well, a few pre-conditions must be met:

  • a solid hosting architecture, either a PaaS solution like Pantheon or Platform.sh, just to name a few that are popular in the Drupal world, or of course a custom Kubernetes cluster would do the job as well
  • a well-defined way to execute a deployment

Let me elaborate a bit on these points. In our case, as we use fully managed hosting tailored to Drupal, a git push is enough to perform a deployment. Doing that from a shell script on Travis is not especially complicated. The second point is much trickier. It means that you must not rely on manual steps during deployments. That can be hard sometimes. For example, while Drupal 8 manages configuration well, sometimes there’s a need to alter content right after the code change. The same is true when it comes to translations - it can be tricky as well. For every single non-trivial thing that you’d like to manage, you need to think about a fully automated way, so Travis can execute it for you.

In our case, even when we did no deployments from Travis at all, we used semi-automatic shell scripts to do deployments by hand. Still, we had exceptions, so we prepared “deployment notes” from release to release.

Authorization & Authentication

What works from your computer, for instance pushing to a Git repository, won’t work from Travis or from an empty containerized environment. These days, whatever hosting you use, where you have the ability for password-less authentication, it’s via a public/private key pair, so you need a new one, dedicated to your new buddy, the deployment robot.

ssh-keygen  -f deployment-robot-key -N "" -C "[email protected]"

Then you can add the public key to your hosting platform, so the key is authorized to perform deployments (if you raise the bar, even to the live environment). Ideally you should create a new dummy user on the hosting platform so it’s evident in the logs that a particular deployment comes from a robot, not from a human.

So what’s next? Copying the private key to the Git repository? I hope not, as you probably wouldn’t like to open a security hole and allow anyone to perform a deployment anytime, right? Travis, like most CI platforms, provides a nice way to encrypt such things that are not for every coworker. So bootstrap the CLI tool and, from the repository root, issue:

travis encrypt-file deployment-robot-key

Then follow the instructions on what to commit to the repository and what not.
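As a rough sketch, the pattern those instructions follow is: commit the encrypted key and the updated Travis config, never the plain key. The file names here come from the ssh-keygen command above, and the .travis.yml change is whatever the CLI added or told you to add, so treat this as an illustration rather than a recipe:

# Keep the plain private key out of version control.
echo "deployment-robot-key" >> .gitignore
# Commit only the encrypted key and the Travis configuration.
git add deployment-robot-key.enc .gitignore .travis.yml
git commit -m "Add encrypted deployment key for the robot user"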

Now you can deploy both from localhost and from Travis as well.

Old-Fashioned, Half-Automated Deployments

Let’s see a sample snippet from a project that has existed since 2007 for Drupal 7:

cd "$PANTHEON_DIR"
echo -e "${GREEN}Git commit new code.${NORMAL}\n"
git add . --all

echo -e "${YELLOW}Sleeping for 5 seconds, you can abort the process before push by hitting Ctrl-C.${NORMAL}\n"
git status
sleep 5
git commit -am "Site update from $ORIGIN_BRANCH"
git push

A little background to understand what’s going on above: for almost all the projects we have, there are two repositories. One is hosted on GitHub and has all the bells and whistles, the CI integration, all the side scripts and the whole source code, but typically not the compiled parts. The Pantheon Git repository, on the other hand, can be considered an artefact repository, where all the compiled things are committed, like the CSS from the SCSS. On that repo we also have some non-site-related scripts.

So a human being sits in front of their computer, has the working copies of the two repositories, the script is able to deploy to the proper environment based on the branch. After the git push, it’s up to Pantheon to do the heavy lifting.

We would like to do the same, minus the part of having a human in the middle of the process.

Satisfying the Prerequisites

All the developers (of that project) had various tools installed like Robo (natively or in that Docker container), the Pantheon repository was cloned locally, the SSH keys were configured and tested, but inside Travis, you have nothing more than just the working copy of the GitHub repository and an empty Ubuntu image.

We ended up with a few shell scripts, for instance ci-scripts/prepare_deploy.sh:

#!/bin/bash

set -e

cd "$TRAVIS_BUILD_DIR" || exit 1

# Make Git operations possible.
cp travis-key ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa

# Authenticate with Terminus.
ddev auth pantheon "$TERMINUS_TOKEN"

ssh-keyscan -p 2222 "$GIT_HOST" >> ~/.ssh/known_hosts

git clone "$PANTHEON_GIT_URL" .pantheon

# Make the DDEV container aware of your ssh.
ddev auth ssh

And another one that installs DDEV inside Travis.

That’s right, we heavily rely on DDEV to be able to use Drush and Terminus cleanly. It also ensures that what Travis does is identically replicable on localhost. The trickiest part is doing an ssh-keyscan before the cloning; otherwise Git would complain about the authenticity of the remote party. But how do you ensure the authenticity this way? One possible improvement is to use the https protocol, so the root certificates would provide some sort of check. For the record, it was a lengthy hassle to figure out why the cloning failed even though the private key of our “robot” was exposed correctly: the known_hosts file was not yet populated.

Re-Shape the Travis Integration

Let’s say we’d like to deploy to the qa Pantheon environment every time master is updated. First of all, only successful builds should be propagated for deployments. Travis offers stages that we can use to accomplish that. In practice, in our case: if linting fails, do not waste time on the tests. If the tests fail, do not propagate the bogus state of the site to Pantheon. Here’s Drupal-starter’s .travis.yml file.

We execute linting first, for coding standards and for clean SCSS, then our beloved PHPUnit tests follow, and finally we deploy to Pantheon for QA.

Machine-Friendly, Actualized Script

So what’s inside that Robo command in the end? More or less just the same as previously, the shell script became tidier and smarter, and after a while, we migrated to Robo to get managed command execution, formatted output and error handling out of the box. And also an important part is that all PHP devs can feel comfortable with that code, even if they are not Bash gurus. Here’s the deploy function. Then your new robot buddy can join the deployment party:

An automated deploy in the Pantheon dashboard

Do It Yourself

If you would like to try this out in action, follow the setup steps in our Drupal Starter Kit - this is essentially how we set up a new client project these days.

Do You Need It?

Making Drupal deployments work in a fully automated way takes time. We invested about 80 hours polishing our deployment pipeline, but we estimate that we saved about 700 hours of deployment time with this tool. Should you invest that time? Dan North, in his talk Decisions, Decisions, says:

Do not automate something until it gets boring…

So, automate your deployments! But don’t rush until you learn how to do this perfectly. And if you decide to automate, we encourage you to build it on top of our work, and save yourself a lot of time!

Jan 15 2021

Exactly 20 years ago today, the first version of Drupal, Drupal 1, was released. We join in a loud (and somewhat off-key) "Happy Birthday" in our minds for its 20th birthday. Great that Drupal and its wonderful community have been part of the web world for so long!

20 years Drupal around the world

20 years, yes that is right, TWENTY, is a really long lifetime for a content management system. Many technologies have emerged in these 20 years, become relevant, lost their relevance again and disappeared. But not Drupal. The open source technology has been continuously enhanced by the large community, is still state of the art and will continue to shape the future of the web.

Drupal was there

“We were blogging before blogging was cool.” This is how Dries started his technology review at the last DrupalCon. And it's true - vividly illustrating the ambition and development history of Drupal. 

Who remembers the days before everyone could use the Internet? It feels like an eternity ago; it was even in a previous millennium. For a large part of the development within and around the Internet, Drupal was there:

  • Drupal witnessed the birth of the browsers Safari and Firefox.
  • Drupal accompanied the emergence of the social web and with it, among others, Myspace, Facebook and YouTube from their beginnings.
  • When the iPhone became the first device with mobile browsers, responsive designs were implemented in Drupal. Thus, Drupal played an important role in the mobile web from the very beginning.
  • Drupal was the first CMS to adopt jQuery, validating it and ultimately promoting the spread of JavaScript early on (and significantly).
  • Drupal is also developing new approaches to current developments (the "Digital Experience Area") around new channels and approaches such as voice interaction, digital machine learning and integrated solutions.

A wonderful community

We know that Drupal is great. And more and more people are agreeing with us. The community of Drupal supporters is growing.

For more than 10 years, we as undpaul have been part of the Drupal community. Some of us have been part of it for some time longer. And here is a personal thank you to Drupal: Thank you that undpaul has come into being at all due to the lovely people of the community. Even though everything was a bit different last year, DrupalCon is one of our annual highlights. We are even more looking forward to the next live event with the community - with our friends and colleagues!

I will always remember my first DrupalCon, back in 2006 in Brussels. It was great to be welcomed into the community so easily. And of course the currywurst with Dries and a few other Drupal veterans in the restaurant of a sports club.

Stefan

Jan 14 2021

Drupal Birthday Cake

Today, on Drupal's 20th birthday, we are kicking off celebrations that will last throughout 2021. Together, let’s celebrate 20 years of Drupal and our Community - the inspired makers that keep Drupal innovative. 

“As part of this 20-year milestone, we celebrate our community of more than 100,000 contributors who made Drupal what it is today,” says Heather Rocker, executive director of the Drupal Association. “Success at this scale is possible because the Drupal community exemplifies the values of open source and proves that innovation is sustained by healthy communities.”

To kick things off, we have a few ways for you to get involved:

  • Promote the official Press Release of Drupal's 20th birthday to your local tech media
  • Submit your Drupal birthday celebration to Community Events 
  • Post a selfie of your celebration on Celebrate Drupal
  • Submit Drupal milestones to the 20 Years of Drupal History timeline
  • Share your excitement and what you’re doing to celebrate on social media - and be sure to add the hashtag #CelebrateDrupal
  • Participate in our 'Drupal Doodle' event - where we're looking for celebratory banners to feature on Drupal.org
  • Propose content for DrupalCon North America 2021 that showcase the ambitious digital experiences you’ve created with Drupal
  • Register for DrupalCon

With so much to celebrate, today's activities are only the beginning. Keep an eye on this blog, @drupalassoc on Twitter, and the Drupal Association on LinkedIn for more activities throughout 2021.


Jan 14 2021

Serving on the Drupal Community Working Group has been one of the most fulfilling experiences of my career in open source. Since 2013, I’ve had both the honor and the privilege to work alongside some of the most thoughtful, patient, and devoted members of our community to help develop processes and structures for community governance, resolve conflicts, and help make the Drupal project and community a more friendly and welcoming place for everyone. That work has at times been challenging, but it has also provided many opportunities for learning and growth.

All good things must come to an end, however, and as announced at DrupalCon Europe last month, I’ve been working with the other members of the CWG over the last year on a plan to step down from my current position on the Conflict Resolution Team and make way for fresh talent and leadership. 

As our Code of Conduct states, “When somebody leaves or disengages from the project, in whole or in part, we ask that they do so in a way that minimizes disruption to the project. This means they should tell people they are leaving and take the proper steps to ensure that others can pick up where they left off.”

In my case, this means that while I will no longer be one of the people responsible for fielding incident reports or acting as a facilitator to help community members resolve conflicts, I will continue to be available to the current members of the team on an as-needed basis to help provide background and context for past issues. I will also continue to serve as a member of the CWG’s Community Health Team, working on projects to proactively improve community health, such as updating our Community Code of Conduct. I also plan to spend more time advocating within the broader open source community for improved community management structures and processes.

With this transition comes an opening within the Conflict Resolution Team, which is currently engaged in a search for new members, led by Tara King (sparklingrobots). You can learn more about the kinds of folks we are looking for in our last call for members from 2018; additionally, all members are expected to abide by the CWG’s Code of Ethics.

As per the CWG's charter, new members of the CWG’s Conflict Resolution Team are appointed to up to two 3-year terms by the group’s Review Panel, which consists of the two community-elected Drupal Association board members, plus an independent representative appointed by the board as a whole.

If you are interested in being considered, please reach out to Tara or email [email protected]. In addition to the openings on the Conflict Resolution Team, we are looking to fill several roles on our Community Health Team for people looking to help make a positive difference in our community.

In closing, I want to thank all of my past colleagues on the CWG: Donna, Angie, Roel, Adam, Mike, Emma, Rachel, Jordana, and Alex, as well as the countless community members who have helped us out in various ways over the years. Drupal is better because of you and your contributions.

Jan 14 2021

I once sat on a mountain and deeply contemplated the mysteries of Drupal development. Actually, I live on a mountain, so I do this every day, and the title of this post isn't a Zen revelation, I stole that from Elon Musk.

I'm not trying to build rockets and send humans to Mars, and I don't want to draw too many parallels between what I do and the complexity of that enterprise, but we do solve some complex problems of critical importance to our clients. Every efficiency we can gain improves outcomes.

The Challenge: Theming Efficiently

When I work on a project, or a problem, I try to distil it down to its most important elements, and ruthlessly eliminate that which is not efficient or providing real value. As we trend more towards Javascript front ends and I've had more opportunity to get deeply back into that side of Drupal development, I find myself being more critical of inefficiency, over-engineering, bloat and unnecessary "features". I believe the Developer Experience (DX) is as important as the User Experience (UX). Happy, efficient devs make quality software.

To that end, I want to simplify and reduce the friction in my workflow. My particular goal for this post is to make a theme cleaner, easier to build and maintain, and perhaps get even better performance.

I'm afraid this might be controversial for some reason, but I've dropped the compiler and task runner from my themes. Every theme I've encountered since I can't remember when, has used SCSS and an often unnecessary JS Compiler. Why run simple jQuery based behaviors through a compiler?

There was a time when I was excited about this approach and could see the benefits of SCSS. Today, I'm not so sure it's necessary. As for Javascript, if you're just doing simple Drupal behaviors, and you're not into Typescript or something, you don't need a compiler for that. Why not avoid the complexity and drop it all from the theme? Of course, decoupled front ends or other advanced Javascript requirements is a different story.

What was great about SASS/SCSS, but maybe isn't so much any more?

  • It's CSS Syntax friendly so you already know how to get started with it.
    • CSS is CSS Syntax friendly too. :p
  • Variables
    • CSS has that now, and it's "catching up" in other ways.
  • Nested Syntax
    • I often see this causing bloat and hard to read code. While they say it's "easier to read/write", I'd say this is only true when there's a disciplined hand crafting well structured SCSS. This is true for CSS as well, and I don't see the advantage of nested syntax.
  • Mixins
    • Component driven Atomic Design reduces the value of this, and new features of CSS reduce it even further.
  • Modularity via @import compiling down to a single file.
    • Again I'll mention Component based Atomic Design, as well as Drupal's Aggregation and libraries.yml.
  • Popularity and a large SASS Community
    • Literally every front end dev uses CSS. ;)

note: 6 of the 7 benefits of SASS over CSS from this 2018 post.

Another thing we got from the compiler was code linting, which can be done at some or all of three other points in the development/delivery pipeline: in the editor, in a git pre-commit hook, or in automated test runs. There are even CSS linters written in PHP that we can include in the composer.json we're using anyway.

Now that all that's out of the way...

How I Built My Last Project

  • Atomic Design
  • Components
  • Drupal libraries
  • Drupal CSS/JS Aggregation

Every front end Drupal developer is probably fully aware of these things. We're doing component based front ends already. I won't belabor each of these points. I'll just share a few thoughts about why I think this is often enough to make fast, reliable and robust front ends, and eliminate some cruft from the process.

Atomic Design

A quick book to read online

Breaking our user interfaces down into clearly defined patterns gives us consistency, re-usability and a clear hierarchy of components. I kind of assume everyone has encountered this, or some similar methodology, but if not, I highly recommend this book.

Identifying these clear and reusable patterns in our UI allows us to build each one out as a discrete and self contained component that can be pieced together to make more complex features on a page.

Components

Building these Atomic patterns into usable templates creates our components. The self-contained components include the twig template and the CSS specific to that component. If there's some discrete JS functionality that goes along with it, it can be included here as well.

 - my_theme
   - components
     - hero
       - paragraph--hero.html.twig
       - hero.css
     - card
       - paragraph--card.html.twig
       - card.css
     - ...

We can install the components contrib module to allow our theme templates to be discovered somewhere other than the /templates directory.
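If you want to follow along, pulling that module in is a quick sketch like this (assuming a Composer-managed site with Drush available; components is the project's machine name on drupal.org):

composer require drupal/components
drush en components -y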

My preference for this project was to use the paragraphs module to create "stackable" components for node content, but you can create or use components in node templates, for blocks, header, nav, etc.

You might even abstract your components out to something more like a pattern library, allowing you to include them in different drupal templates, i.e., in my paragraph--hero.html.twig file:

{% include "@my-pattern-library/hero/hero.twig" with { ...

In my case I have a set of components I reuse across multiple projects, defined in a custom module. The Components module allows me to access them in this way from my theme templates.

In the theme or module's info.yml file:

component-libraries:
  my-components:
    paths:
      - components

Makes the components in this project's components path available to include with @my-components/....

You can use this with something like Pattern Lab too, and seamlessly integrate components from another resource. If your project is at a level where it needs something like that, then keeping the compilation tooling in a separate pattern library project still allows us to run without it in the Drupal theme.

Drupal Libraries

Now in our component template files, we can directly attach the library we want associated with the component.

{# Optionally include the custom module's default component styles. #}
{{ attach_library('my_component_module/hero') }}
{# Optionally include this theme's component styles, if any. #}
{{ attach_library('my_theme/hero') }}

{% include "@my_component_module/hero/hero.twig" with {
  'image': content.field_background_image,
  'heading': content.field_heading,
  'text': content.field_text,
  'link': content.field_link
}%}

These libraries are, of course, defined in their respective module's or theme's libraries.yml file.

By attaching them to the component's templates we allow Drupal to only load them when that particular component is used on a page, instead of one huge compiled CSS or JS file for everything.

Drupal CSS/JS Aggregation

Now when we turn on aggregation, any libraries loaded with components are pulled together by Drupal and cached as needed. Each page's aggregated CSS is only the size it needs to be to deliver its components.

Using the Advanced CSS/JS Aggregation contrib module can give us additional improvements, including minification, which is something we would otherwise lose without a processor in the development pipeline.
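As a rough command-line sketch of both steps (assuming Drush is available; the config keys are core's system.performance settings, and advagg is the Advanced CSS/JS Aggregation project's machine name):

# Turn on core CSS/JS aggregation.
drush config:set system.performance css.preprocess 1 -y
drush config:set system.performance js.preprocess 1 -y
# Optionally add Advanced CSS/JS Aggregation; its sub-modules handle minification.
composer require drupal/advagg
drush en advagg -y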

Back to My Mountain Meditation

The component approach gives us an innate modularity to our CSS which vastly improves maintainability and suppresses the opportunity for bloat. The clear structure makes the CSS understandable.

Removing compilation means we can forget maintaining another package list, and github telling us we have vulnerabilities in our package.json all the time. We don't need watchers and browser sync and we can avoid any debate on whether to include compiled assets in the working repo! (hint...you should not.) ;)

We've also just sped up our CI build/test and removed more than one potential point of failure in that process.

I'll be pushing and testing the limits of performance, scalability and maintainability of this project as I prepare it for the wild and help it grow. If I find I've made a terrible mistake, I'll be sure to report back here. But right now, it feels like spring cleaning on the mountain.

"Code like nobody is watching... but then refactor. ;)"

Jan 14 2021

What devices are Drupal sites tested on in 2021?

Desktop is still number one but there is only one respondent who does not test on mobile. I wonder if there is a future where Desktop will become less relevant than other device types for Drupal "applications". Maybe the question should be "when" and not "if".

What version(s) of Drupal are we using in 2021?

No surprise that Drupal 9 has the highest score, because all new projects should be starting on Drupal 9. The majority of Drupal sites currently online run Drupal 7, according to data that depends on the "update module" being turned on, so it's no surprise to see that over half of the respondents are still working on Drupal 7 in 2021.

The same data I linked above, together with the data from this survey, suggests that the upgrade from Drupal 8 to 9 is not really as trivial as advertised.

Bonus question: What Drupal hosting platforms are you working with, in 2021?

It's no surprise to me that Pantheon.io is the most popular choice here, and I'm also happy to see they are getting more competition. Amazee.io with their Lagoon platform is a (relative) newcomer to the Drupal platforms market and they are getting traction.

I had not heard about OpenShift before. Apparently, it's an on-premise cloud platform. An interesting concept considering increased pressure on data security and end-user privacy. 

Who filled out this survey?

This survey was mostly filled out by agency-side Drupal professionals. 14% of respondents are client-side, i.e. working at a company that uses Drupal. 

Thank you

Thank you for helping us and our readers make better decisions about what Drupal environments to support!

Jan 13 2021

TLDR: Drupal 7 has a much longer lifespan than the (already pushed back) official date, and Drupal 8 has an essentially infinite lifespan because it updates in-place to Drupal 9 easily (and the same will be true of Drupal 10, 11, ∞.). There's no reason to rush an upgrade— but there's no reason to wait either.

That's the short version.

A client recently wrote to Agaric about Drupal 7 / Drupal 8 / Drupal 9 project planning questions:

With the EOL for Drupal 7 in Nov of 2022, and the EOL for Drupal 8 in Nov 2021, is there a reason we should move a D7 site to D8 at all this year? Seems like we might want to move directly to D9? We don’t want to feel pushed up against a wall with a “new” site build in Drupal 8, if we can limp along in D7 for a couple more years while we develop a D9 site with a longer lifespan. I’m wondering if you might have time to discuss pros and cons briefly so we can get a good plan together for moving forward.

I started typing and almost did not stop:

  1. No one believes me when i say this, but i repeat my assurance that Drupal 7 will be well-supported commercially until 2030 or later (Drupal 6, released in 2008, still has semi-official long-term support until at least February 24th, 2022— and Drupal 7 has a larger install base than Drupal 6 ever did, and currently has the largest install base of any version of Drupal by far, with more than half a million tracked installs).

    Drupal 7 will be supported by the community for a long time. You do not have to feel pushed to a new version, like, ever.

Stacked area chart showing Drupal 7 with more than half of all currently tracked Drupal core installs, which is more than half a million.
  2. We do recommend moving directly to Drupal 9 (which was released on June 3rd of 2020), however:

  3. Moving to Drupal 8 or to Drupal 9 is much the same. Drupal 8 starts what i call the "modern Drupal" era. Whereas going from Drupal 5 to 6, 6 to 7, or 7 to 8 broke backward compatibility and might as well be a full rebuild (so we would often recommend hopping a version, say, staying on Drupal 6 and waiting for Drupal 8 to be ready), going from Drupal 8 to 9 is closer to going from Drupal 8.8 to 8.9— an in-place upgrade from 8.9 to 9.0. Going from 9 to 10 will work the same, and that's the plan and promise for Drupal 8 on out.

  4. All that said, if anything significant needs fixing on your current Drupal 7 site, or you are looking to make any improvements, you'll want to do that on Drupal 8+ ("Drupal 8/9" as we phrased it back when Drupal 9 was still a pretty recent release, but now we can just say Drupal 9)— or, as i call it to emphasize the decreased importance of major version numbers, modern Drupal.

Agaric is always happy to discuss more! Mostly what i'm saying here is the useful things to talk about are the specific goals for the sites—when you want to accomplish what—because the official support cycles are a distraction in the current context of Drupal. So make sure your current site is maintained, but take your time to get clear on your objectives, and contact Agaric or the Drupal professionals of your choice when you think it might make sense to upgrade your site into the era of modern Drupal.

Jan 13 2021
Jan 13

The Digital Powerhouse:

Within our group, we have the honour of working alongside a number of other awesome agencies.
 

CTI Digital:

The chair of our digital powerhouse. 

Their team runs a full-service digital agency, working across all areas of digital experience for their clients' needs. Working with some huge names, CTI Digital creates flexible solutions for a variety of industries from aircraft carriers to roller coasters. 

They’re in the driver's seat of this incredible team and have been pushing boundaries since 2003. We’re excited to see what they bring to the table in helping us to grow and creating a successful future in this partnership. 

Nublue:

The Nublue team are dedicated to working in eCommerce to provide the best online shopping experience for customers to enjoy and manage with ease. 

They have produced and hosted hundreds of online stores that are both beautiful and reliable using platforms such as Magento, ShopWare and Shopify.

Stardotstar:

Stardotstar is a design agency that works to produce innovative, fresh designs and content strategies for businesses across many different sectors.

They balance business wants with users' needs.

They’ve worked on some outstanding projects with clients like the BBC, Channel 4, the Royal Navy and a long list of others.

Supercharged:

The Manchester-based award-winning commerce agency, Supercharged, is another big part of the digital powerhouse CTI has built.

They bring big commerce ideas to life for clients, using leading software to create brilliant and beautiful solutions across their clients’ sites.

From product management to one-page checkouts, their experts combine all the digital marketing and e-commerce features needed to be the best on the market and outpace the competition. 

Jan 12 2021
Jan 12

Chosen

Chosen uses the Chosen jQuery plugin to make your select elements more user-friendly.

Jan 12 2021
Jan 12

I presented "The Drupal 10 readiness initiative - here we go" at DrupalCon Europe a month ago. I published my slides with plenty of speaker notes right away, but the session videos have only just become public. Although the live presentation was a month ago, most of the content is still up to date.

Drupal 9 is expected to have the shortest major release lifetime in recent Drupal history, with Drupal 10 planned for release in the middle of 2022 (next year!) and Drupal 9 reaching end of life by the end of 2023. In this session, we discussed what it takes to get from Drupal 9 to Drupal 10 and how we are going to manage this transition. We also covered what we learned from the Drupal 8 to 9 transition (so far) and how we plan to make it better for 10.

Check out the recording:

Jan 12 2021
Jan 12

The term Drupal invokes different feelings among different people, based on their professional background or on what they have heard or learned about the CMS. Over the years, Drupal CMS has evolved from a simple tool for hobbyists to a powerful digital experience platform for global enterprises. While Dries describes Drupal as a platform for "ambitious digital experiences", it is commonly referred to as a content management framework that allows for extensibility & scalability through the addition of various user-created modules that build upon its core framework.

Since Drupal 8 and its adoption of continuous innovation, new features and modern libraries have been added with every twice-yearly minor release. Drupal 9 is out already, with its first feature (minor) release, Drupal 9.1.0, last month, and we are already seeing the fulfillment of some of Drupal 9's strategic initiatives. With Drupal, promises are always delivered. And this is something every organization, big or small, looks for in a CMS. 

Still confused about moving to Drupal 8 now or Drupal 9 later?
 

Drupal for everyone - ease of use

Popularity

Drupal is one of the most popular CMSes currently available & is the preferred choice for government agencies, large associations, non-profits & numerous Fortune 500 companies. Currently, over 1,738,777 websites around the world are built on Drupal. 

This graph compares the trend for the term "Drupal" with another popular CMS "Joomla" over a period of 3 years & it clearly depicts the growth of popularity of Drupal.


But why do large enterprises prefer Drupal? Is it as "easy to use" as they say? Let us find out.

Features for Everyone

One of the toughest challenges that Drupal adopters face, whether they are new site owners or novice developers, is trying to figure out what is difficult & what is easy with Drupal. Most of their questions revolve around the ease of use that Drupal as a platform brings to the table. Let us look at some of the basic (yet important) features that Drupal CMS provides to website owners.

Installation

Drupal installation calls less for deep “technical knowledge” and more for knowing how to connect through FTP and set up a database. Once you are ready with the prerequisites for a Drupal installation, it will hardly take minutes to complete the entire setup. Installer performance was improved by 20% in Drupal 9.1.0, which makes installation even faster and easier. Would you believe me if I said that the installation time of Drupal for a new user familiar with installing other systems might be less than one and a half minutes?
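For a rough sense of what that looks like from the command line, here is a sketch of a typical Composer-based install (a generic example, not part of the original article; it assumes Composer and Drush are available and a database already exists, and the project name, credentials and database name are placeholders):

composer create-project drupal/recommended-project mysite
cd mysite
composer require drush/drush
./vendor/bin/drush site:install --db-url=mysql://dbuser:dbpass@localhost/drupaldb --account-name=admin -y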


Maintenance and Upgrades

One of the main focus areas for Drupal 9.0.0 was to improve the upgrade experience, and so it has. Upgrading to a new major version is now as easy as upgrading to a minor version, as clearly witnessed in the Drupal 8.9 to Drupal 9 upgrade. 

Drupal CMS ensures that the maintenance and upgrades are easy to handle by the site administrators. The procedures for updating your website include backing up the website and then replacing the files using a web update interface.

Backing up the website takes minimal effort as the site administrator can back up the whole website by downloading only one file which contains the assets of the website.

Drupal also notifies the site admin every time an update is available, ensuring that the website never misses an opportunity to stay up to date. However, if the administrator does not wish to move to a new major version, Drupal also provides security updates for previous versions. For example, even though Drupal 9 was released in June 2020, Drupal 7 still continues to receive support from the community.
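On a Composer-based install, the command-line equivalent of that update flow looks roughly like this (a hedged sketch, not from the original article; always back up the database first):

composer update "drupal/core-*" --with-all-dependencies
./vendor/bin/drush updatedb -y
./vendor/bin/drush cache:rebuild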

Community Strength and Contribution

The unofficial tagline of Drupal - “Come for the software, stay for the community” - speaks volumes about the strength of the community. Going strong since 2001, the Drupal community is known for its dedicated developers and contributors who use, build, teach, document and market best practices in Drupal. You can find their amazing work on Drupal.org.

Usability

Drupal allows administrators to access any page, or a section of a page, in visitor mode and edit it by clicking on edit. Since Drupal 8, core ships with a WYSIWYG editor (CKEditor), and other editor integration modules are available if you need a different one. Drupal makes editing pages or sections of a page easy by creating a simplified experience for editors and administrators.

Scalability

Drupal CMS is highly scalable, with high traffic handling capabilities. Its web pages are cached permanently by default, but caching can also be configured manually for a specific time. Moreover, individual functional blocks can be cached, allowing better traffic handling for your websites.

Whether it is the extreme traffic spikes on certain occasions or the constant web traffic, Drupal handles all of that with utmost ease. Did you know that the digital experience of Australian Open 2019 was powered by Drupal? Like Dries said “When the world is watching an event, there is no room for error!”
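To make the caching behaviour a little more concrete, here is a minimal sketch (illustrative only, not from the original article) of how a developer can attach cache metadata to a render array so Drupal caches a piece of output per URL path for an hour; the markup and timings are hypothetical:

// Hypothetical render array built in a custom module or block plugin.
$build = [
  '#markup' => t('Trending articles'),
  '#cache' => [
    'contexts' => ['url.path'],  // Vary the cached copy per path.
    'max-age' => 3600,           // Keep it for an hour.
  ],
];
return $build;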

Web 2.0 Features

Drupal CMS is an excellent community platform provider and it outperforms all other options in this particular area. The platform allows a website administrator to set permissions for site visitors to comment on any content on the website.

Drupal also lets administrators set permissions on who can create, edit or delete various content types. Whether it is articles, pictures, videos or other media files, everything is managed by the admin.

Security

Security is a major concern for web properties these days and Drupal leaves no stone unturned to ensure that your website is secure from any possible security breach. Security updates are published on drupal.org and the users are provided a notice every time a new update is released. Drupal’s active community is alert and any security loopholes are remedied very quickly. They also provide references to guide the user in making a site more secure. 

When it comes to security, Drupal wins hands down when compared to other open source CMSes in the market today. Check out the statistics below, which compare the number of sites built on these popular CMSes that were compromised in 2016. Drupal accounts for only 2% of the hacked websites according to this research.

Drupal security comparison

User Roles and Workflow

The greatest asset of Drupal CMS is its ability to create any number of user roles and assign different permissions to each. While Drupal core includes two default roles, anonymous user and authenticated user, it allows you to create multiple user roles depending upon your needs. Granular permissions can also be assigned per content section using taxonomy.
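As a small illustration (an assumption-laden sketch, not from the original article), custom code can check one of these granular permissions before allowing an action; the permission name here is hypothetical:

// Check a hypothetical granular permission in a custom module.
if (\Drupal::currentUser()->hasPermission('edit own recipe content')) {
  // Show the edit link or perform the privileged action.
}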


How Does Drupal Make Things Easy?

  • Advanced Control of URLs: Drupal provides precise control over the URL structure of a page. Each content item (called a node in Drupal) can be given a custom URL. Also, the Pathauto module can automate a custom URL structure for each content type.
  • Custom Content Types and Views: Using Views and the Content Construction Kit (CCK), Drupal allows you to create new content types without having to write a single line of code! Yes, any number of custom content types can be created and displayed in many different ways without any code! Some examples of content types that you can create are forum posts, tutorials, blog posts, news stories, classified ads, podcasts, videos and more.
  • Theming and Templates: PHP knowledge for theming? No, not anymore! Theming in Drupal can be done with little to no PHP knowledge. Drupal 7 uses the PHPTemplate theme engine by default, while Drupal 8 and later use Twig.
  • Hook System: This system enables you to hook new modules into Drupal easily. A hook is invoked when a particular activity is performed in Drupal, allowing core to call, at specific places, functions defined in modules and so extend core's functionality. Hooks make it possible for a module to define new URLs and pages within the site (hook_menu), add content to pages (hook_block, hook_footer, etc.), set up custom database tables (hook_schema), and more; see the sketch after this list.
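As a quick illustration of the hook system in modern Drupal (8/9), here is a minimal sketch of a hypothetical custom module called mymodule implementing hook_help() to add text to its help page; the module name and text are made up for this example:

<?php
// mymodule.module - Drupal discovers and calls this function automatically
// because its name follows the hook naming convention (MODULE_help).
use Drupal\Core\Routing\RouteMatchInterface;

function mymodule_help($route_name, RouteMatchInterface $route_match) {
  if ($route_name === 'help.page.mymodule') {
    return '<p>' . t('Help text injected through the hook system.') . '</p>';
  }
}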

I completely agree with Dries when he said that Boris nailed it!

Drupal sucks less - Dries Buytaert

Jan 12 2021
Jan 12

2020 was hard.

At Promet Source, we’re planning for and counting on 2021 being easier and better in many ways.

We realized last year that there actually was something we could do to raise the bar for 2021 and make life easier and better for everyone who manages a Drupal website. 

We developed Provus.
 

What is Provus?

The brainchild of Aaron Couch, Promet’s Lead Solutions Architect, Provus is Promet’s newly launched Drupal platform. Utilizing Atomic Design principles, Provus combines the latest drag-and-drop page building tools in Drupal with a curated library of design components, enabling content editors to easily layer designs, add functionality, and rearrange layouts.  

An essential differentiator from other drag-and-drop tools is the degree to which Provus empowers content creators, while at the same time adhering to an organization’s brand guidelines to ensure consistency and aesthetic alignment. 

From a development perspective, Provus allows for vast new efficiencies as we work toward eliminating the wall that previously existed between easy-to-create-and-manage SaaS solutions and scalable Drupal solutions for websites with complex data models and a depth of content.
 

New Perspectives and Possibilities

Provus was inspired by the realization that nearly every website consists of various combinations of roughly 15-20 types of features or patterns. By organizing a library of high-quality components that can be repurposed for low-code, no-code site building, we create a foundation for:

  • Easier content editing capabilities with drag and drop functionality
  • Greater design flexibility within defined brand standards
  • Streamlined development using Drupal’s proven content models

The Provus Technology Stack

Promet’s open source Provus starter kit for component-based Drupal sites is based on Atomic Design principles, using Emulsify as the base theme and leveraging Storybook to create a library from which newly themed components are mapped into Drupal Layout Builder for a flexible, dynamic, drag-and-drop CMS. 

Provus in Action

Traditional Drupal theming includes CSS and JavaScript selectors that are intertwined with their context, connecting them to the backend implementation. The result of this “theme for the page” approach is assets that can’t be repurposed across projects.
Having identified that component-based theming tools are key to next-level efficiencies in website building, our next step was to single out an optimal approach for delivering reusable components. 

Promet’s strategy for achieving this new UI and content management paradigm incorporates the Emulsify® design system, a component-driven Drupal theme that gives us a huge lift in building repurposable components. Emulsify functions as a starter component library paired with Storybook, which houses the Atomic Design library and is a tool for building user interface components. Storybook can be turned on from within the Emulsify theme, resulting in a highly efficient new workflow.
 
With Provus, components built using JavaScript and CSS are curated into a library. If the backend implementation changes or we want to move it to another project, the component itself is not changed, allowing us to efficiently redesign and reuse it.

What Sets Provus Apart?

Content editor empowerment, combined with robust guidance and governance, is the key factor fueling the success of Provus. More specifically:

  • Self-adjusting features within components create a foundation for both readability and ADA accessibility, by ensuring, for example, adequate contrast between fonts and background colors. 
  • Design governance offers the assurance that content editor empowerment does not translate into mismatched, crowded, or sub-par page designs. Customization options are presented within an expertly calibrated design framework for ensuring the highest quality designs and user experiences on all devices, without breaking layouts or straying from an organization’s brand guidelines. 
  • Content editors are able to seamlessly edit components and change patterns within the view mode, eliminating time-consuming processes of reentering content and switching back and forth between edit and publish modes.

As a thought leader on how humans interact with technology, Promet Source has enthusiastically pursued component-based design systems for their potential to deliver high-velocity capabilities that drive consistency and collaboration. 

While Provus provides for game-changing advantages on multiple levels, we’re most excited about the amazing new capabilities that we are now able to offer our clients. In blending a formal design system that ensures brand consistency across the site with the flexibility of drag-and-drop site building tools within Drupal core, we are reducing the cost of ownership and empowering clients with a site that’s designed to flex and expand to fit evolving needs and new priorities. 

Interested in exploring new possibilities with Provus or seeing a demo of Provus in action? Let me know and I’ll be in touch!


 

Jan 11 2021
Jan 11

When developing or maintaining a Drupal theme, there is often a need to understand why a theme needed to override a given template. A diff of the template in the active theme compared to the template in the base theme would make this easy to understand. A drush command that found the template files for you and output a diff would solve this quite nicely.

Template Diff

The Template Diff module does just that. It provides a drush command that accepts a template name and will display the diff between two specified themes. If no theme is specified it defaults to comparing the active theme and its base theme.

Examples

Compare the active theme and its base theme:

drush template_diff:show views-view

Compare "foo_theme" vs "bar_theme":

drush template_diff:show views-view foo_theme bar_theme

Compare "some_theme" and its base theme:

drush template_diff:show views-view some_theme

The output will look something like this:

$ drush template_diff:show views-view
 [notice] Comparing chromatic (active theme) and stable (base theme).
- stable
+ chromatic
@@ @@
 {#
 /**
  * @file
- * Theme override for main view template.
+ * Default theme implementation for main view template.
  *
  * Available variables:
  * - attributes: Remaining HTML attributes for the element.
- * - css_name: A CSS-safe version of the view name.
+ * - css_name: A css-safe version of the view name.
  * - css_class: The user-specified classes names, if any.
  * - header: The optional header.
  * - footer: The optional footer.
…

If you have ideas on how to improve this, submit an issue so we can make understanding template overrides even easier.

Jan 11 2021
Jan 11

Lynette has been part of the Drupal community since Drupalcon Brussels in 2006. She comes from a technical support background, from front-line to developer liaison, giving her a strong understanding of the user experience. She took the next step by writing the majority of Drupal's Building Blocks, focused on some of the most popular Drupal modules at the time. From there, she moved on to working as a professional technical writer, spending seven years at Acquia, working with nearly every product offering. As a writer, her mantra is "Make your documentation so good your users never need to call you."

Lynette lives in San Jose, California where she is a knitter, occasionally a brewer, a newly-minted 3D printing enthusiast, and has too many other hobbies. She also homeschools her two children, and has a house cat, two porch cats, and two rabbits.

Jan 11 2021
Jan 11

I like to think of this module as something you don't realize you need until you understand exactly what it does. With that in mind, let's start with an example…

Imagine you have a "Document" content type (or media entity) that you use to upload PDF files to your site. Document entities are then used as part of various other entities (often content types) on your site via reference fields. Now for the important bit: Document entities are not meant to be viewed on their own - they are only meant to be available as a part of another entity via a reference field.

When a site design calls for this type of situation, what happens to the "Full display" view (/node/[nid] or /media/[mid]) mode of the Document entity? Often it is ignored; not even styled for display. Under normal circumstances the full display view mode has no reason to ever be requested, but if developers never had to worry about edge cases, then our lives would be much easier.

This is where the Rabbit Hole module enters the picture - it allows us to specify (via the bundle's "Edit" page) what happens when someone tries to load the full display view mode: Rabbit Hole kicks in and directs the user to a specified path.

Rabbit Hole module screenshot

So, if you have entities on your site that aren't meant to be displayed on their own, it's best to use the Rabbit Hole module to ensure your site visitors don't end up on a page you're not expecting.

Jan 11 2021
Jan 11

Whether you are running your business in the B2B space or the B2C space, the need for agility and speed in workflow management is indispensable, because clients ultimately expect faster delivery of the project/application to keep up with their customers’ requirements.

However, if developers do not use any standard tools, it can add unnecessary overhead and eat away at their development time. Also, given that developers come from different backgrounds and skill sets, it becomes difficult for stakeholders to set up projects, onboard developers, troubleshoot, and even train them, as large-scale projects come with complex requirements.

That is why it’s critical to have a standardized development environment across teams. This blog guides you through using Lando (an open source tool that provides a single local development environment for all of a developer's projects) with Drupal 9's Composer setup, PHP and SCSS linters, and a multisite architecture scenario.

How Does Lando Provide a Standard Development Environment?

Setting up a project from the ground up, managing configurations, and distributing them to each developer, frontend and backend alike, becomes tedious due to various factors: different machines, different machine configurations, and different operating systems.

And that’s where Lando software comes into the picture.

What is Lando Software?

It is an open-source, cross-platform, local development environment, and DevOps tool built on Docker container technology. Its flexibility to work with most of the major languages, frameworks, and services helps developers of all skill sets and levels to specify simple or complex requirements for their projects and then quickly get to work on them.

Some of the benefits of Lando include-

  1. Maintaining standardization across projects/applications.
  2. Offering speedy development (prebuilt configuration of Composer, Drush).
  3. Adding tooling to extend its services. 
  4. Recommending out-of-the-box settings that are customizable.
  5. Automating the complex steps involved in unit testing, linting, compiling, or other recurring workflows.

How to Use Lando With Drupal 9’s Composer.json for Faster Development?

Consider a scenario where a developer on an existing Drupal project is replaced by a new developer. The new developer might not be familiar with the OS that the others are using, which would make it difficult to install Composer quickly and would delay the onboarding process.

However, if the team is already using Lando for development, it takes care of the operating system bottleneck itself. In fact, Composer is already built into the Drupal 9 recipe (recipes in Lando are the highest-level abstraction and contain common combinations of routing, services, and tooling) and works the same across operating systems. The only thing developers need to know is how to use it.

code written in maroon background

Steps to Use Lando with Drupal 9’s composer.json

The prerequisite for this setup is that Docker and Lando are installed successfully, without any glitches, on a compatible local development machine. When running the Docker setup, make sure no other ports conflict with the Lando setup.

Here are the steps to be followed-

  1. You need to clone this Drupal 9 open source git repository. (Ex:

    git clone git@github.com:AbhayPai/drupal9.git)

  2. Change the directory to the cloned repository. (Ex: cd drupal9)

  3. Start your app using the lando start command. Before you begin, you can change some parameters in .lando.yml as per the need of your application.

This repository gives you some common tools, including linting of PHP, linting of SCSS, linting of JS files, and compiling of SCSS files, plus services like Node.js and npm that connect directly to the Lando app. You do not need to go inside any container after starting your application. By default, this repository only lints custom themes, but it is flexible enough to extend to custom modules and profiles.
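Condensed, the getting-started flow described above looks roughly like this (a sketch based on the steps; your .lando.yml tweaks may differ):

git clone git@github.com:AbhayPai/drupal9.git
cd drupal9
lando start   # builds the containers defined in .lando.yml
lando         # lists the recipe and custom tooling commands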

How to Use PHP Linters With Drupal in Lando

Drupal is one of the largest open source communities, with millions of developers contributing and offering coding solutions in different ways. To standardize coding practices and keep modules readable and easy to maintain - covering everything from indentation and whitespace to operators, casting, line length, and more - Drupal has a core package that enforces these standards automatically when configured in the project. In general, tools that do this are called PHP linters.

Following are the steps to configure the PHP linter in the project-

  1. Download the Drupal Coder dependency package using `lando composer require drupal/coder`.
  2. Define a file for the linter standard, or copy the file from Drupal core into your project folder; all standards are predefined in that XML file, which resides at core/phpcs.xml.dist.
  3. Configure a `lint:php` tool within the .lando.yml file, as in the example below-

code written in black background

4. Confirm the tooling is configured correctly by using the ‘lando’ command to list all tooling.

code written in white background

5. Use the newly configured tool in your project using ‘lando <tool name>’. In this case, it is ‘lando lint:php:themes’.

code written in maroon background

This automated tooling, configured with Lando, will help developers save time finding and fixing these issues and will also ensure best practices are followed in the project repository.
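Under the hood, a lint:php tooling entry typically wraps a PHP_CodeSniffer call. A rough sketch of the equivalent commands is below; the exact paths, ruleset location, and tool name are assumptions that depend on your repository layout:

lando composer require drupal/coder
lando php vendor/bin/phpcs --standard=core/phpcs.xml.dist --extensions=php,module,inc,install,theme themes/custom
lando lint:php:themes   # the tooling command configured in .lando.yml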

How to Use SCSS Linters With Drupal in Lando

SCSS is a preprocessor syntax used for writing CSS in most modern projects. SCSS is used because it helps developers write less code and removes redundancy by avoiding repetition of class names and other properties that are frequently used in the project.

The purpose of using an SCSS linter in the project is to ensure that the code is of high quality and easily maintainable for future enhancements. Further, it saves development time and leads to faster delivery of projects.

Following are the steps for configuring the SCSS linter in your project-

  1. Configure the Node service and install gulp inside that service within the .lando.yml file.
    code written in black background
  2. Configure a tool for using npm with Lando within the .lando.yml file.
    code written in black background
  3. Confirm the tooling is configured correctly by using the ‘lando’ command to list all tooling.
    code written in white background
  4. Create a package.json file, then install and configure the stylelint package in the project.
  5. Create a new script in the package.json file for triggering stylelint.
    code written in black background
  6. Configure a Lando tool to trigger this script.
    code written in black background
  7. Confirm the tooling is configured correctly by using the ‘lando’ command to list all tooling.
    code written in white background
  8. Run this tooling command and Lando will lint the SCSS automatically.
    code written in maroon background

This automation, integrated with Lando for the SCSS linter, will ensure that best practices and code hygiene are followed in the project repository.
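For orientation, the npm side of this setup looks roughly like the following (package names, the script name, and paths are assumptions rather than the repository's actual configuration):

lando npm install --save-dev stylelint stylelint-config-standard-scss
lando npm run lint:scss    # assumes package.json defines "lint:scss": "stylelint 'themes/custom/**/*.scss'"
lando lint:scss:themes     # a hypothetical Lando tooling wrapper around the npm script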


How Can Lando Help in Reducing Developers’ Efforts While Building Drupal Multisite Architecture?

Let’s take a scenario where your project ( client’s website) is live now and running smoothly. Now the client wants to create multiple new sites in alignment with the existing site. For instance, the new sites should have custom modules, themes, profiles, etc. to ensure brand consistency. 

Here, Drupal would come in handy as it would simplify the multisite architecture and speed up the local development setup with Lando through some minor tweaks in configuration files.

For setting up a multisite architecture in an existing project, you need to follow the steps below- 

  1. Configure the .lando.yml file to set up the app server URL for the new website
    code written in black background
  2. Configure a database service for the new website
    code written in black background
  3. Configure Drupal settings like sites.php and the folder structure for site2 to leverage this Lando configuration (see the sketch after this list)
    code written in black background
    code written in black background
  4. Rebuild the Lando configuration to set up the new website.
    code written in maroon background
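For reference, the sites.php mapping mentioned in step 3 can be as small as this (a sketch with hypothetical names and paths that depend on your docroot; Lando proxies apps on *.lndo.site domains):

<?php
// web/sites/sites.php - map the Lando proxy URL of the second site
// to its site directory (web/sites/site2).
$sites['site2.drupal9.lndo.site'] = 'site2';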

These minor tweaks to the existing project will help you extend existing Lando projects/websites to a multisite architecture in local development and accelerate the delivery process for the client.

Conclusion

If you have come this far, Dhanyavaad (thank you). I hope this article helps you speed up the development process (and hence deliver projects faster), transfer knowledge about your Drupal application/project, and leverage Lando at its best by using its built-in Composer for automation in local development environments.

Now that you are armed with the knowledge and Lando’s benefits, what are you waiting for? Get started now!

Jan 09 2021
Jan 09
“A thorough website audit can clue you into the necessary changes and will help you drive significant results.”

Websites are complex beasts, and issues inevitably arise. Being oblivious to these issues is quite common when you don’t conduct site audits properly and regularly. What happens next is quite obvious - you fail to identify a wide range of website issues that prevent potential users from accessing your website, acting as a major barrier to the growth of your business. 

So, the question that arises here is - what is the best possible way to optimize your website in order to hit the predetermined goals?

Unless you have been living under a rock, you already know that website audit is the resonating answer for the same. Don’t you? 

Well, a website audit is the most common yet most efficient approach undertaken by every organization that wishes to achieve goals associated with traffic and performance. As a matter of fact, a good website audit takes into account all the factors, including performance issues, security vulnerabilities, general site maintenance, and site changes and upgrades, that can undoubtedly influence your website’s success. 

Have you ever audited your website? No? Then, now is the right time!

A comprehensive Drupal website audit is a necessity today and is highly recommended to make sure that your website is up to date and performing well. Whether you are a small business trying to optimize your site for organic search, or an agency doing the same for a client, it can be a bit difficult to know where to begin and how in-depth your analysis should go. No need to worry, we have got you covered.

In this blog we have put together several parameters that are of great importance when it comes to carrying out an in-depth analysis of your website. Subsequently, we will be providing you the tools that will help you glean the most useful information throughout the audit process. 

Illustration explaining Drupal website audit checklists with 'Drupal website audit' written at top and several boxes containing textual information below it


Before we get into the on-page audit components, let's start with a few basic but important domain-level checks that every organization, irrespective of size and nature, should keep up to date.

Site Map

A site map is basically a blueprint of your website that helps search engines (Google, Yahoo, and Bing) to find, crawl and index all of your website’s content. Site maps can be good for SEO as they allow search engines to quickly find pages and files that are important to your site. 

SSL Certificate

SSL certificate is the backbone of the website that enables encrypted communication between a web browser and a web server. Websites need to have a validated SSL certificate in order to keep user data secure, verify ownership of the website, prevent attackers from creating a fake version of the site, and gain user trust. 

WWW resolution

WWW resolution assesses whether your website redirects to the same page with or without WWW (World Wide Web). It is better and more convenient for users when it does. 

Robots.txt 

Robots.txt is a file that lives at the root of your website and tells search engines your crawling preferences. Not to mention, a Robots.txt file allows you to lock away areas of your website that you may not want crawlers to find.

On-page Site Audit Components

Apart from the aforementioned basic domain checks, there are several other components that can influence the outcome of the website audit. These can either have a positive effect on the quality of the website or have serious repercussions for its reputation in the eyes of direct clients as well as end users.

1. Drupal’s Best Practices

Creating and maintaining a website on a Content Management System (CMS) like Drupal takes both time and effort. Furthermore, you are required to follow some basic web development practices that can help you protect that investment and simultaneously provide a great user experience. 

With that being said, the following pointers outline some best practices that are required to program with Drupal.

Drupal Architecture

  • The content structure must include all the fields and content types. 
  • Choose limited content types and files in your development plan to avoid confusion among content creators.
  • Use new entity type and single entity type for different and similar data types respectively.

Check the code

  • Use an indent of 2 spaces, with no tabs.
  • All binary operators should have a space before and after the operator, for readability.
  • To distinguish control statements from function calls, there should be one space between the control keyword and the opening parenthesis.
  • Lines of code should not exceed 80 characters.
  • Use short array syntax and format arrays with a space separating each element (after the comma). 
  • Use require_once() and include_once() respectively when unconditionally and conditionally including a class file. 

Infrastructure

  • The stack size should be neither too large nor too small. 
  • Dive into logs to detect errors and prepare for growth and spikes. 
  • For security, it’s crucial to configure protection against internal as well as external attacks.

Optimise the front-end

  • Define component elements using their own classes. 
  • Exercise and test your site rigorously to resolve PHP errors, if any. 
  • Use a stable administrative theme during development. 
  • Use DRY CSS and group reusable CSS properties together and name these groups logically. 
  • Name components using design semantics. 
  • In order to keep your designs more organized, use SASS.

Test, error, repeat

  • Get your site reviewed by peers to get an additional idea on what to do next. 
  • Set up a testing environment to get your website tested easily and quickly. 

SEO Practices

  • Use Robots.txt, so the right pages and information are indexed. 
  • Bring navigational drop-down menus into action; they silently contribute to search engine optimization.
  • Enable URL aliasing with Pathauto to ensure search engines understand what the web page is about. 

Security Practices

  • Always keep your core updated. 
  • Arm yourself with some additional security modules.   
  • Make sure you only use modules approved by the security team. 
  • Don’t forget to keep your backup ready to face any uncertain events.

Maintenance Practices 

  • Keep your code under version control.
  • Maintain and update separate environments for the different stages of the site.
  • Limit access to the production site for all but the most trusted users.
  • Review all logs every now and again, including Apache, Drupal, and MySQL.
  • Review and assess your architecture frequently and make plans for the future.

To go through a detailed explanation of Drupal's best practices, read here.

2. Mobile Usability

Mobile usability testing helps you identify the potential issues/problems that are hindering a mobile friendly user-experience on your website. The need to conduct a mobile usability audit is extremely important because with the advancement in smartphone browsers, more people are visiting sites using their mobile phones. 

Below are some common yet important elements that can help you to produce great mobile-friendly sites. 

Responsive Design

It allows page elements to reshuffle as the viewport grows or shrinks. Responsive design plays a pivotal role as it allows you to create dynamic changes to the appearance of your website when there is a change in screen size and orientation of the device it is being viewed on. 

AMP URL

Originally developed by Google, Accelerated Mobile Pages (AMP) is an initiative to speed up the loading time of web pages on mobile devices. The biggest advantage that AMP URL offers is faster and simpler web pages that can be displayed equally well on all device types, including mobile and desktop.

Mobile Pages Audit Tools

There are a number of tools that can help you perfectly optimize your site for mobile. Here are a few tools that you should have in your bookmarks-

  • Screenfly
  • Google Resizer
  • Browserstack
  • Ghostlab
  • Crossbrowser Testing

Check out this guide on mobile-first design approach to know more.

3. Speed

Performing a website speed audit is important as it helps you evaluate the speed and responsiveness of the website and further identify the areas that need quick improvement. 

Page load speed

It refers to the time taken by the website to fully display the content on a specific page, which directly impacts user engagement and a business’s bottom line. Page load speed is important to users for the obvious reason - faster pages result in a more efficient and better on-page user experience. An ideal page load time is between 2 and 5 seconds. 

Page Speed Audit Tools:

The market is flooded with a variety of tools that can be used to test page load and improve website speed. Following is a handpicked list of some common tools- 

  • Pingdom
  • Google pagespeed insights
  • Google analytics site speed
  • GTmetrix
  • Dareboost
  • YSlow

4. Performance

Website Performance Testing refers to a software testing process used to determine how a particular website behaves and responds during various situations. Conducting a website performance audit is incredibly important for websites because it helps you to identify and eliminate the performance bottlenecks in the software application.

Take a look at the following list of performance elements that contribute to the response time of the website and overall end-user experience.   

HTML/CSS/JS 

  • JS and CSS count: Delivering a massive amount of CSS and JS to the browser can result in more work for the browser when parsing the CSS/JS against the HTML and that makes the rendering slower. Try to send only the CSS/JS that is used on that page and remove CSS rules when they aren't used anymore.
  • CSS Size: Delivering a massive amount of CSS to the browser can result in more work for the browser when parsing the CSS against the HTML and that makes the rendering slower. Try to send only the CSS that is used on that page and remove CSS rules when they aren't used anymore.
  • Image Size: Avoid having too many large images on the page. The images will not affect the first paint of the page, but it will eat bandwidth for the user.
  • Page Size: Avoid having pages that have a transfer size over the wire of more than 2 MB on desktop and 1 MB on mobile.
  • Image scaling: Scaling images in the browser take extra CPU time and will hurt performance on mobile. So, make sure you create multiple versions of the same image server-side and serve the appropriate one.
  • Document Redirects: You should never redirect the main document, because it will make the page load slower for the user. The one exception is redirecting the user from HTTP to the HTTPS version of the page. 
  • Charset Declaration: The Unicode Standard (UTF-8) covers (almost) all the characters, punctuations, and symbols in the world. It is highly recommended to use that.

Header performance

  • Cached Header: Setting a cache header on your server response will tell the browser that it doesn't need to download the asset again during the configured cache time! 
  • Cached Header Length: Setting a long cache header (at least 30 days) is better as it promises to stay long in the browser cache. 

Servers

  • Fast render speed: Avoid loading JavaScript synchronously inside of the head, request files from the same domain as the main document (to avoid DNS lookups) and inline CSS or use server push for really fast rendering and a short rendering path.
  • CPU rendering time: You need to be able to render the page fast, which is highly dependent on which computer/device you run on. Note that the limit here is high, i.e., spending more than 500 ms will trigger this advice.
  • No. of requests per domain: Avoid having too many requests per domain. The reason being, browsers have a limit on how many concurrent requests they can do per domain when using HTTP/1. 
  • CPU scripting time: Do not run too much JavaScript, as it will slow down the page for your user. Again, this metric depends on which computer/device you run on, but the limit here is high, i.e., spending more than 1000 ms will trigger this advice.

Performance Audit Tools

Here are some common tools that you can use to run website performance tests in order to achieve optimal performance. 

  • GT Metrix
  • Webpage Test

Read this comprehensive guide on Drupal performance optimisation techniques to know more.

5. Accessibility 

An accessibility audit is a comprehensive evaluation of how well your digital properties meet the needs of people with any limited ability. It is important to conduct the accessibility audit as it provides a detailed look at how and where you can enhance your digital products/services to improve digital accessibility.

Here are some of the first steps you can take to check the type of experience your website delivers for people with digital access needs:

Check your page title

  • Make sure that every page has a descriptive title. 
  • This can usually be checked through the 'view source' option available in most modern browsers.

Turn images on and off

  • This can be done using an advanced browser option. For example, Google Chrome lets you turn images on and off, which makes it easy to look for ‘disappearing’ text. 
  • Subsequently, check your image alt text for issues such as the missing or incorrect description of the image contents.

Turn sound on and off

Using the computer's sound options, turn off sound to make sure that your website is conveying the same meaningful information, with or without sound.

Manage plug-ins

  • Using special plugins, you can easily apply different views on the top of the page. 
  • For example- you can test grayscale to ensure that people who are color blind have access to each and every information available on a particular page.

Keyboard accessibility 

  • Try to operate and navigate your website without a mouse or trackpad. 
  • Check if all the functions are operable using keyboard navigation alone. 

Check Zoom in 

  • People with visual impairments often enlarge the elements to see what is present on the screen. 
  • Therefore, zoom to 200 or even 300% to check if anything pixelates or not. 

Check-up page structure and hierarchy

  • Your heading text should be H1, followed by various subheadings i.e., H2, H3, and so on. 
  • For example - follow the order 1-3, so H2 cannot come before H1 and H3 cannot come before H2.

Check multimedia elements

As per the information issued by web content accessibility guidelines (WCAG), websites must specify important information contained within multimedia elements (video/audio/photo) in a text-based alternative.

Accessibility Auditing Tools

There are some free online tools that you can use to uncover the accessibilities issues that are present in your site. 

  • WAVE Evaluation Tool
  • Google Lighthouse
  • SortSite
  • Pa11y
  • Stark contrast checker  

Take a look at this web accessibility planning guide and how Drupal is ensuring web accessibility standards to know more.

6. Security 

Conducting a security audit helps you examine and identify existing and potential threats that could jeopardize the website. It also involves improving the security of the website to make doing business online safer.

Following is a quick and easy list of elements you can evaluate to detect the security risks lurking in your website.

Ascertain the assets to focus on 

  • List out the high priority assets required to monitor and scan, including sensitive customer and company data, internal documentation and IT infrastructure.
  • Do not forget to set out a security perimeter 

Checklist your potential threats

  • Name your threats to ensure what to look for and how to adapt your future security measures. 
  • Some common security threats you might put on your list include weak passwords protecting sensitive company data, malware, phishing attacks, denial-of-service attacks, and malicious insiders.

Determine the current security performance

Evaluate the current security performance of the website to keep at bay hackers trying to invade the company’s systems. 

Establish configuration scans

  • Setting up a higher-end scanner will help you detect security vulnerabilities.
  • Run some configuration scans to detect configuration mistakes made.

Look out for reports

Do not forget to give a detailed look at the reports generated by your auditing tools.

Monitor DNS for unforeseen events

Always keep track of the credentials used for your domain. 

Scrutinize your website

This is a must when you wish to spot hard-to-access files and directories on your website.

Carry out internal vulnerability scan

  • Install an agent on each computer in your organization to monitor the vulnerability level.
  • Performing an internal vulnerability scan every month or every three months is a great option.

Perform phishing tests

  • Perform cybersecurity training by sending out fake phishing emails to team members.
  • Running such tests would give a close-to-real-life experience of what a phishing attack is. 

Security Auditing Tools

Now that you have a plan, you might need some tools to put your plan into action. For your convenience, we have listed down a few tools that you can use-

  • OWASP Testing Guide
  • Burp suite
  • Nessus
  • Qualys web apps scan
  • Rapid7 

Get a thorough understanding of Drupal security by going through why Drupal is the most secure CMS, its provision for open source security, importance of security modules for Drupal website and Drupal website's data security strategies.

7. Search Engine Optimization (SEO) 

An SEO audit is an important facet of website upkeep that identifies and analyzes the foundational issues affecting the organic search performance of the website. Conducting an SEO audit is essential for any website, as it allows you to analyze the current SEO efforts (regardless of how prolific or sparse they are) and take immediate action on those insights.

Below are some of the most important areas that an SEO audit covers to maximize optimization-

Find and fix indexation issues

  • Make sure your site is well-indexed in Google.
  • Look for the number of pages that Google has indexed for your domain.

Conduct on-page SEO check

  • Keywords: While auditing your on-page SEO, start with the keywords. Make sure both long & short-tail keywords are incorporated seamlessly throughout the content. Moreover, adding LSI keywords help improve organic visibility eventually.
  • Optimization of headers: Use keywords in the headers. It is to be remembered that search engines including Google use H1 tags to understand the primary topic of a page.  
  • Call to actions: Curate content with the right CTAs for the maximum conversions. Good CTAs make a site look more structured and professional, attracting visitors’ attention. 
  • Optimized URL: It’s also crucial to have keyword-rich URLs for the website to improve the organic click-through-rate (CTR). The shorter the URL, the better is the ranking.
  • Meta description: The meta description plays an imperative role in the SERP, as Google uses description tags to generate a search result snippet. Hence, every page on your site needs a meta description of up to 160 characters containing a primary keyword. 
  • Internal links: Using internal links to publish new content is a must. Internal links are instrumental in establishing site architecture and spreading link equity (ranking power) at large. It is recommended to use descriptive keywords in anchor texts to give readers a sense of the topics. 
  • Schema markup: Furthermore, use Schema markup, an advanced level on-page SEO technique to help the search engine bots crawl relevant information for users. The Schema markup uses a unique semantic vocabulary (code) in microdata.
  • Image optimization: Lastly, the optimization of images with keywords in the image alt text also carries weight. This practice increases the potential to rank in image search apart from boosting the SEO efforts of webpages. 

Detect and delete broken links 

  • Check for the broken links list and find which link has the most inbound links.
  • Work through this list and either delete or replace the errors found. 

Duplicate and thin content pages

  • Check for duplicate pages, as they have an adverse effect on SEO. 
  • Pages should have a decent word count; otherwise they will be considered thin content and may rank poorly on the SERP or not get indexed at all.

SEO Auditing Tools

Following are the tools that you can use to track and detect errors that are hindering your site from achieving the top spot on Google. 

  • Google Analytics
  • Google Search Console
  • SEMrush
  • WooRank
  • Moz
  • Ahrefs
  • SpyFu

Access this ultimate guide to Drupal SEO to know more.

8. Consent Management

Consent management is a process that allows websites to meet legal regulations such as GDPR and CCPA by obtaining user consent for collecting their data or information. With a good consent management platform (CMP) in place, websites are able to create better customer experience and further deepen relationships with their consumers.

9. Hosting Infrastructure 

Having a quality web hosting infrastructure is essential for any website as it helps you determine the loading speed, downtime, bandwidth, and SEO factors of the website.

If you use a free or cheap web host, it will create a lot of hosting problems like frequent downtime issues for you in the future. 

Here is a list of some important things that you should consider before you choose a web hosting plan.

Fast servers 

  • Profits are directly proportional to the speed that webpages load, therefore make sure your web host offers at least a T3 internet connection.
  • Internet users lack patience and need quick results. Make sure your web host does not exceed the 30 seconds time frame. 

Unrestricted CGI Access 

CGI programs are put-to-use by many professional sites at some point or the other, therefore look for a web host that can provide you with CGI-bin access.

SSH and FTP Access

  • You can easily encrypt the data moving between your computer and your website server with the help of SSH. Doing so helps you reduce the burden on your programming development time.
  • A good web host should also let you use FTP to transfer files back and forth between your local computer and your web server.

Access to Raw Server Logs

This feature allows you to gain access to data relating to your website’s traffic, including how much traffic you get per week, how long visitors stay on your site, etc. 

Server Backups

  • Server backups ensure that you don’t lose out on anything at the time of uninvited events. 
  • Not all web hosting services provide automatic database or server backups, in such situations you are required to pay an additional amount to create full backups for your whole sites.

Services, Scripts, and Software

  • A good web host should offer a vast library of scripts wherein you can add forms, statistics, and other extras to your website.
  • Besides this, the scripts should also provide some e-commerce features including shopping cart software, real-time processing availability and much more. 

Tech Support

A web host of good quality should provide technical support to the website. 

Conclusion

To conclude, conducting website audits may seem like a strenuous task, but it is an important responsibility that helps you identify issues that can hinder the growth of your website. The entire process may sound a bit nerve-wracking; however, the end results are worth the hard work. If you want to maximize the business benefits of your website, then a website audit is exactly what you need to put into effect. 
 
Furthermore, a website audit is not a one-off process that you conduct once in a blue moon. In other words, conducting a website audit is a mindset that helps you gain deeper insights into your website which further helps you stay on top of your website maintenance before it gets too late. Being successful in the digital market space requires some degree of agility and adaptability, and guess what this goes for websites too. 

Would you like to put yourself way ahead of your less-informed competitors? Feel free to contact us at [email protected] and our industry experts will help you conduct a comprehensive site audit the right way.

Jan 08 2021
Jan 08

Now on Drupal 9, the community isn’t slowing down. This month, we sit down and talk with Angie Byron, a.k.a. Webchick, a Drupal core committer and product manager, Drupal Association board member, author, speaker, mentor, Mom, and so much more. Currently, she works at Acquia on the Drupal Acceleration Team, where her primary role is to “make Drupal awesome.” We talk about Drupal, coding, family, and her journey throughout the years.

This article was originally published in the January 2021 issue of php[architect] magazine. To read the complete article please subscribe or purchase the complete issue.

Jan 08 2021
Jan 08

(Available as freelancer)

Joris Snoek

Business Consultant
/ Drupal Developer

Last week we released 'group chats' in the Drupal distribution OpenLucius, a social productivity platform. At first sight it may look like a small module, but it took quite some effort to get to this release, partly because I wanted to release it open source: no concessions in (code) quality and maintainability.

Our 'group chat journey' started around 3 years ago, when we kicked off building OpenLucius on Drupal 8. We thought it was best to implement the realtime functionality with MongoDB, because of its NoSQL character and its speed. Also, a lot of chat examples were using the MEAN stack.

ReactJS / VueJS (not needed)

Also, JavaScript frameworks like ReactJS and VueJS were (and are) hyped. Some developers like to jump on them without looking into native Drupal alternatives; I also fell into that trap, and it's a dangerous area:

Hello over-engineering, complexity and project misery.

We thought we needed it for frontend interactivity, but that implementation added even more unnecessary complexity to the project.

Drupal 8

After a long struggle, that Drupal 8 version never saw the light of day. We did use it internally for a while, but it was not suitable for the outside world, and that's a euphemism right there.

Native Drupal 9 group chat, a PoC

So about a year ago I started a proof of concept to see if realtime group chat was possible using just MySQL and jQuery, and it turned out it was! That meant the group chat module could be implemented natively in Drupal.

That was a huge relief and paved the way to an open source release, because I wanted the installation process to be as simple as possible for everybody, not complicated by installing ReactJS / MongoDB and whatnot.

Just click through the Drupal install wizard and you're done: that was the goal, and we reached it.

Well... full disclosure: one piece of external tech is required: Node.js (with Socket.io). Without it, realtime emitting/pushing of messages in 'rooms' (group chats) just isn't going to work, since Drupal core has no WebSocket tech built in.

But installing the Node.js chat engine is also a few-click operation after installing OpenLucius. And it's optional, so a basic Drupal wizard install is the only thing required to get OpenLucius up and running.

Fast forward to 2021: Drupal-native MySQL and jQuery FTW!

So, after the successful proof of concept in 2020, it is safe to say:

be very deliberate when implementing external tech. Drupal core has excellent native tech for facilitating speed and interactive UIs.

Of course the tech you choose in the end depends on your requirements/user stories. But just make sure you invest enough time in analysing Drupal core and contrib modules, before integrating external tech.

Especially if you want to release open source.

Code for performance

So Drupal-native MySQL and jQuery can work great... as long as you code it right. And by 'right' I mean, in the case of our group chat, that the code needs to be as lean as possible: chat messages must end up being delivered in realtime.

So I implemented custom Drupal entities and custom code that only does the thing it needs to do, nothing more and certainly nothing less (so for example the Drupal core node system is obviously not handy in this case).
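To give a feel for what 'lean' means here, a custom chat message entity can be boiled down to just a handful of base fields. The sketch below is illustrative only and not the actual OpenLucius code; module, class and field names are made up.

namespace Drupal\my_chat\Entity;

use Drupal\Core\Entity\ContentEntityBase;
use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * A deliberately lean chat message entity: only what the chat needs.
 *
 * @ContentEntityType(
 *   id = "chat_message",
 *   label = @Translation("Chat message"),
 *   base_table = "chat_message",
 *   entity_keys = {
 *     "id" = "id",
 *     "uuid" = "uuid",
 *   },
 * )
 */
class ChatMessage extends ContentEntityBase {

  /**
   * {@inheritdoc}
   */
  public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
    $fields = parent::baseFieldDefinitions($entity_type);

    // The message body, its author and a timestamp: nothing more.
    $fields['message'] = BaseFieldDefinition::create('string_long')
      ->setLabel(t('Message'));
    $fields['uid'] = BaseFieldDefinition::create('entity_reference')
      ->setLabel(t('Author'))
      ->setSetting('target_type', 'user');
    $fields['created'] = BaseFieldDefinition::create('created')
      ->setLabel(t('Created'));

    return $fields;
  }

}

No revisions, no field UI overhead, no render pipeline baggage from the node system: queries against a table like this stay fast enough for realtime use.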

To wrap it up, it turned out that these rules prevailed:

It's hard to make things simple. Experience is the grandmaster of all skills.

Technical details in next blog

In my follow up blog I will get into the tech behind the group chat in Drupal: the use cases and how I implemented them, all without page refresh / Drupal AJAX based:

  • Chat screen initialisation
  • Adding messages and realtime emitting them to all chat users
  • Files uploads via AJAX
  • Dynamic theme libraries, for socket.io connection
  • Node.js / Socket.io implementation
  • @mentions javascript library, with auto suggest
  • Mailing the people that were @mentioned
  • Security and permissions
  • Security hardening with group UUIDs
  • How to handle CORS if socket.io runs on external server
  • If connection drops, make sure no messages are missed
  • Edit messages via AJAX
  • Deleting files from the chat via AJAX
  • Dependency injection

So stay tuned, y'all!

Try now or download open source

If you want to test the group chat in OpenLucius this instant, that is of course possible: click here to get started and hit the 'Try now' button. Or download and install OpenLucius yourself via the project page on Drupal.org.

Jan 07 2021
Jan 07

Since June 2020, we have heard a lot about Drupal 9. And you may wonder: should we update to Drupal 9?

Depending on your current infrastructure, the consequences won't be the same. That's why our experts put together this infographic to give you some insights for your next Drupal project.

A project? A question? Don't hesitate to contact us: https://www.smile.eu/en/contact

Infographic: Drupal 9

Jan 07 2021
Jan 07

In a recent project, we had a requirement from one of our clients to validate data in CSV files based on custom rules. The validated CSV files then needed to be imported into various content types in Drupal 8.

In this article, we will look at the requirement, the library, the architecture of the custom module, and the different components of the module with some code samples, and finally share some ideas on how this module can be made more reusable and even contributed.

Introduction

Our client is a well known international NGO with offices worldwide, each with different types of data management systems and frameworks. They wanted a centralized system to manage the data from each of these offices. Having concluded that Drupal 8 was the ideal solution to implement that centralized system, the challenge was to set up a migration pipeline to bring in data from all of the offices and their varying frameworks. Consequently, the files generated by these systems needed to be validated for specific constraints before being imported into our Drupal system.

Challenges and Goals

Following are the goals that the system should meet:  

  1. The CSV files were in a custom format, and there were multiple files with different structures that needed to be handled accordingly. Each column needed its own validator.
  2. The files needed to be validated for errors before they could be imported and the errors needed to be logged with line numbers and relevant error messages. 
  3. The validation had to be triggered automatically when the files were downloaded from a central location. 
  4. Notification emails had to be sent on successful and failed validation to the IT admins. 
  5. After successfully validating the files, the validator needed to trigger the next step of the process, which is importing the files.

The main challenges

  1. The validation had to cross-reference the incoming data with existing data and also with data in different files (referential integrity checks). 
  2. We also had to check the uniqueness of certain columns in the CSV files. Doing this in a database is pretty easy and straightforward, but here it had to be done before the data was inserted into the database.

Step 1: Choosing a CSV reader library

The first step was to figure out a PHP-based CSV reader library. League CSV was found to be the best option for the reasons below (a minimal usage sketch follows the list):

  1. It is managed by Composer and was already being used by the Migrate module in Drupal core, so no additional code was needed for the library to work.
  2. The library covered many common scenarios like iterating through rows of the CSV, getting the field values and headers, and streaming large CSV files.
  3. And finally, it was implemented in an object-oriented way.
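As a rough illustration of how little glue code the library needs, reading a file with League CSV looks something like this (the file path and header layout are assumptions, not the client's actual data):

use League\Csv\Reader;

// Open the CSV and treat the first row as the header.
$reader = Reader::createFromPath('/path/to/offices.csv', 'r');
$reader->setHeaderOffset(0);

// Each record is an array keyed by the header names; the offset is handy
// for logging errors against a specific line.
foreach ($reader->getRecords() as $offset => $record) {
  // Hand $record to the validator for this file type.
}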

Step 2: Architectural requirements

Below are the requirements we had concerning the architecture of the code:

  1. The code needs to work as an independent service to call it at different locations of code and thereby invoke validation wherever required.
  2. The validations need to be as generic as possible so that the same validation rule can be reused for different fields in the same CSV or in others.
  3. We need to have an extensible way to specify the validation to be done for each field. For example, whether a specific field can be allowed to be blank.

Step 3: Designing the components of the validator

To satisfy the above architectural requirements, we designed the validator module into the following sub-components:

The main service class

Below are the main responsibilities of this class (a condensed skeleton follows the list):

  1. Load the CSV library and loop through each of the files in a particular folder.
  2. Use the methods supplied by the CSV league to read the file into our variables. For example, each row of the file will be stored in an array with an index containing each column data.
  3. During processing, the filename is taken in and checked to see if the validator method in the Validator class matching the filename exists.  
  4. If the method exists, then validation is done for the file and errors are logged into the error log table.
  5. If there are no errors, the class triggers the next event, which is migration using a predefined custom event via the Event API of Drupal. 
  6. This also passes the status of the import to the calling class so that emails can be triggered to the site admins.
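A heavily condensed sketch of such a service is shown below. The Validators class, the ValidationPassedEvent and the naming convention are placeholders to show the flow, not the actual project code.

namespace Drupal\csv_validator;

use League\Csv\Reader;
use Symfony\Component\EventDispatcher\EventDispatcherInterface;

/**
 * Loops over CSV files in a folder and validates them one by one.
 */
class CsvValidatorService {

  protected $validators;
  protected $eventDispatcher;

  public function __construct(Validators $validators, EventDispatcherInterface $event_dispatcher) {
    $this->validators = $validators;
    $this->eventDispatcher = $event_dispatcher;
  }

  /**
   * Validates every CSV in the folder; returns TRUE when all files pass.
   */
  public function validateFolder($folder) {
    $all_valid = TRUE;
    foreach (glob($folder . '/*.csv') as $file) {
      // E.g. "offices.csv" is handled by Validators::validateOffices().
      $method = 'validate' . ucfirst(basename($file, '.csv'));
      if (!method_exists($this->validators, $method)) {
        continue;
      }
      $reader = Reader::createFromPath($file, 'r');
      $reader->setHeaderOffset(0);
      foreach ($reader->getRecords() as $offset => $record) {
        // The validator logs errors with the file name and offset itself.
        $all_valid = $this->validators->{$method}($record, $file, $offset) && $all_valid;
      }
    }
    if ($all_valid) {
      // Drupal 8-era dispatch signature; trigger the next step (the import).
      $this->eventDispatcher->dispatch(ValidationPassedEvent::NAME, new ValidationPassedEvent($folder));
    }
    return $all_valid;
  }

}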

The Validators class

Here, we basically assign constraints for each file type in a method. The input to the validator class would be a complete row.  

The Constraints class

This class contains the individual constraints that check whether a particular type of column meets the required criteria. These constraints are methods that take the column value as a parameter and return an error message if it does not meet the criteria for that column type. This class is invoked from the Validators class for each column in every row.
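As a small sketch of the idea, constraints can be simple static methods that return NULL when the value passes and an error message otherwise (names and rules below are illustrative):

namespace Drupal\csv_validator;

/**
 * Small, reusable checks for individual column values.
 */
class Constraints {

  public static function notBlank($value) {
    return trim((string) $value) === '' ? 'Value may not be blank.' : NULL;
  }

  public static function isDate($value) {
    // Expect ISO dates, e.g. 2021-01-07.
    $date = \DateTime::createFromFormat('Y-m-d', $value);
    return ($date && $date->format('Y-m-d') === $value) ? NULL : 'Value is not a valid date (expected YYYY-MM-DD).';
  }

  public static function maxLength($value, $max = 255) {
    return mb_strlen((string) $value) > $max ? "Value is longer than {$max} characters." : NULL;
  }

}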

The Error log

As its name suggests, the validator needed to capture the errors and log them somewhere. We defined a custom table using the database hooks provided by Drupal. A custom view was defined in code to read the data from this table. The errors captured by the constraint class were logged into the database using this logger.
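The custom table itself is declared with a schema hook roughly along these lines (the column names are assumptions):

/**
 * Implements hook_schema().
 */
function csv_validator_schema() {
  $schema['csv_validator_error_log'] = [
    'description' => 'Validation errors per CSV file and line.',
    'fields' => [
      'id' => ['type' => 'serial', 'not null' => TRUE],
      'filename' => ['type' => 'varchar', 'length' => 255, 'not null' => TRUE],
      'line' => ['type' => 'int', 'not null' => TRUE, 'default' => 0],
      'message' => ['type' => 'text', 'not null' => TRUE],
      'created' => ['type' => 'int', 'not null' => TRUE, 'default' => 0],
    ],
    'primary key' => ['id'],
  ];
  return $schema;
}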

Eventsubscriber and mail notification

We needed the validation to be auto-triggered when the files were downloaded. To achieve this, we tapped into Drupal’s EventSubscriber and Response APIs. 
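In broad strokes, the wiring looks like the subscriber below. The event name and the getFolder() accessor are hypothetical; the real code listens to whatever event the download step dispatches, and the class still needs to be registered as a tagged event_subscriber service in the module's services file.

namespace Drupal\csv_validator\EventSubscriber;

use Drupal\csv_validator\CsvValidatorService;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Kicks off CSV validation once the download step announces it is done.
 */
class CsvValidationSubscriber implements EventSubscriberInterface {

  protected $validator;

  public function __construct(CsvValidatorService $validator) {
    $this->validator = $validator;
  }

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    return [
      'csv_validator.files_downloaded' => 'onFilesDownloaded',
    ];
  }

  public function onFilesDownloaded($event) {
    $this->validator->validateFolder($event->getFolder());
  }

}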

Referential Integrity checks

Most of the columns did not have any relation to existing data and could be validated on the fly. However, some of the data had to be validated against corresponding references, either in the database or in another CSV file. We did this as follows (a query sketch follows the list):

  1. For those values which act as a parent, dump them into a temporary table, which will be cleared after validation is completed.
  2. When we arrive at another CSV with a column that references values dumped above, then we query the above table to check if the value is present. If yes, return TRUE.
  3. If the value is not present in the temporary table, then we search the Drupal database as the value might have been imported as part of the previous import. If not, then we throw a referential error for that row in the CSV.
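Sketching steps 2 and 3 with Drupal's database and entity query APIs (the table, column and field names are made up for the example):

use Drupal\Core\Database\Database;

/**
 * Checks whether a referenced value exists, either in the temporary table
 * filled from the "parent" CSV or in data imported on a previous run.
 */
function csv_validator_reference_exists($value) {
  // 1. Values dumped from the parent CSV earlier in this run.
  $count = Database::getConnection()
    ->select('csv_validator_temp_refs', 't')
    ->fields('t', ['value'])
    ->condition('t.value', $value)
    ->countQuery()
    ->execute()
    ->fetchField();
  if ($count > 0) {
    return TRUE;
  }

  // 2. Fall back to content imported in earlier runs, e.g. by title.
  $ids = \Drupal::entityQuery('node')
    ->condition('title', $value)
    ->range(0, 1)
    ->accessCheck(FALSE)
    ->execute();
  return !empty($ids);
}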

The code snippets are available here.

We used the migrated data as a source for a headless backend using REST. For more details on the specifications, refer to our blog on how to validate API response using OpenAPI3.

Future scope and ideas to extend this as a module by itself

We have written the module with an architecture where the validators can be reused, but reuse currently requires some coding effort. Below are the changes that could make this module a contribution.

  1. Add configurations to have a list of files that need to be validated.
  2. Each file will have an option to add the fields that need to be validated and the type of data (similar to what you have when creating content type).
  3. Based on the above list of files and field types, we can validate any number of CSVs with any number of columns. 
  4. We would need to modify the above classes to fetch each column's data type and call the respective constraints for each CSV.

As a result of doing the above changes, anyone will be able to use this module to validate CSV files with their own columns.

Hope this blog helped you with this module and how it can be made more reusable and even contributed. Share your experience in the comments below! 

Jan 07 2021
Jan 07

In the last article, we discussed the changes required to get Drupal 9.1 running on PHP 8. At that time, we got the Drupal 9.1 dev release working on PHP 8.0.0 RC4 with a few patches. Since then, a lot has changed, with many of those patches committed and Drupal 9.2 dev open for development. But we’ll talk about all of that at a later date. Today, let’s look at installing some of the common PHP extensions and configuring them to work with Drupal.

We left off at a point where we had plain Drupal 9.1 running on a plain PHP 8 RC4 setup. Drupal doesn’t require any extensions that are not in PHP core, which means we only had to enable extensions like gd, MySQL, and others to have Drupal 9.1 running. With that, we were able to install Umami and use the site without any problems at all. To enable those extensions, we only needed the docker-php-ext-enable script, which is part of the PHP base Docker image. See the Dockerfile in the reference repository for the source code (lines 41-52). Installing other extensions that are not part of the PHP core is not quite that simple. Think of it this way: if a module is present in Drupal core, you can install it right after downloading Drupal. But if it is a contrib module, you have to download and install it separately. It’s the same thing with PHP extensions.

Why test with extensions?

Just as you probably wouldn’t have a Drupal site without at least one contrib module, you probably wouldn’t have a PHP installation without a few of the common extensions. Drupal core utilizes some of these extensions when they are available (such as APCu and YAML), which yields better performance. This means that even though the extensions are not technically required, you would most likely have them.

I started with the extensions I almost always install on sites I work on. These are APCu, YAML, and Redis. Drupal core doesn’t use Redis, but I almost always install the Redis module for caching, which requires this extension. It made sense to test if it worked on PHP 8 (both the module and the extension). As for the other two, Drupal core uses the APCu and YAML extensions for better performance if they are available. Again, it is a good idea to test Drupal with these extensions installed.

Installing extensions

Typically, we would use PECL to install any extensions we needed. PECL is a repository for PHP extensions, very much like Composer is for PHP packages. With PECL, we would just need to run a command such as pecl install redis to install the extension. You can see this being used in lines 53-58 in the Dockerfile.

pecl install apcu redis yaml

This is not as simple with PHP 8. PHP 7.4 removed default support for PECL and the official Docker image removed the command in PHP 8 images (it applied an explicit option to keep it for PHP 7.4).

Alternative tool to build extensions

I found another tool called pickle, which was intended to replace PECL but became dormant as well. I noticed some activity on the project, including a relatively recent release, and I tried that first.

The tool worked very well for APCu and Redis extensions. However, for YAML, it failed because it could not parse YAML's beta version number (2.2.0b2). I found that this was fixed in a recent commit but that would have meant that I would need to build pickle in my Docker image rather than just downloading and using it. I was not looking to go that route.

Building the extension manually

This left me with only one option: building the extensions myself. Fortunately, this turned out to be much simpler than I thought. You can see the steps required for each extension in lines 54-67 in the reference repository’s Dockerfile. For each extension, there are essentially just two steps:

  1. Clone their source code of the extension
  2. Run phpize, make, and make install to build the extension

We need the PHP source available to use the above tools and this is easily achieved using a helper script in the Docker image. You can see it being used in line 39 in the reference repository. Once we build the extensions, we clean up the PHP source to keep our Docker image size small. This is what the complete step looks like:

docker-php-source extract;
git clone https://github.com/krakjoe/apcu.git; cd apcu;
phpize; ./configure; make; make install; cd ..;
# ... Install the other extensions the same way
docker-php-source delete;

Enabling extensions

Now that the extensions are installed, we can use the same script (docker-php-ext-enable) as earlier to enable the extensions. In our reference repository, you can see this done on lines 69-72. Thanks to these helper scripts, we now have our extensions enabled and configured for both the PHP module (for Apache) and the CLI. This can be verified by running the following command:

php -m

The above command will list all the enabled PHP extensions (internal ones as well) and you should be able to find apcu, redis, and yaml in the list.

Bringing it together

Now, we need to make sure that Drupal works with the above extensions. Since APCu and YAML extensions are used by the core, we should see any issues immediately. We can even verify that Redis is connected and APCu is being used by looking at the status report page, as shown in the following screenshot. 

Tweet from Hussainweb

For Redis, we also need to install the Drupal module, since Drupal core doesn’t use it directly. We will discuss installing modules in another post.

PHP 8 is an important milestone in PHP's history, not just because of the cool new features but also because it continues the trusted release cycle promised at the release of PHP 7. A predictable release cycle helps build trust and consistently brings new features and innovation to the product. We saw that with Drupal 8’s regular six-monthly release cycle, and we will see it with PHP as well.

Jan 07 2021
Jan 07

Drupal is a popular web-based content management system designed for small to large enterprises with needs such as complex workflows, multilingual content, and enterprise integrations. An increasing number of organizations move to Drupal from their current systems every year and with richer features being added to Drupal 9 and planned for 10, the growth will only accelerate. This means that migrations to Drupal remain an ever-popular topic.

Drupal provides a powerful and flexible migration framework that allows us to “write” migrations in a declarative fashion.

The migration framework supports a variety of sources and the ability to specify custom sources and destinations. Furthermore, the framework provides a powerful pipelined transformation process that allows us to map source content to destination fields declaratively.

Thanks to this framework, migration is more of a business challenge rather than a technical one. The overall process (or workflow) of the migration may differ depending on various business needs and attributes of the current (source) system. Depending on the type of migration, we may plan to reuse in-built migrations (in core or contrib), selectively adapt migrations from different sources, or entirely write new migrations. Further, depending on the source, we may choose to migrate incrementally or at one-time.

Many similar decisions go into planning an overall migration strategy and we’ll talk about the following here:
 

01. Migration Concepts

02. Understanding the content

03. Drupal to Drupal migration

04. Migration to Drupal from another system

05. Migration from unconventional sources
 

Migration Concepts

The Drupal migration framework is composable, which is why it can be used flexibly in many scenarios. The basic building unit (not to be confused with Drupal entities) is called a migration. Each migration is responsible for bringing over one discrete piece of content from the source to the destination. This definition is more technical than a business one, as a “discrete piece” of content is determined by Drupal’s internal content model and may not match what you might expect as an editor.

For example, an editor may see a page as a discrete piece of content, but the page content type may have multiple files or paragraph fields or term references, each of which has to be migrated separately. In this case, we would have a separate migration for files, paragraph fields, and so on, and then eventually for the page itself. The benefit of defining migrations this way is that it allows the migrate framework to handle each of these pieces of the content itself, providing features like mapping IDs and handling rollbacks.

Correspondingly, a migration specifies a source, a destination, and a series of process mappings that define the transformations that a piece of content may go through while being mapped from a source field to a destination field. These are called plugins (because of their internal implementation). We might use different source plugins depending on the source system with the common ones provided by Drupal core (for sources such as Drupal, SQL databases, etc.).

There are dozens of contributed modules available for other systems such as WordPress, CSV files, etc. Similarly, process plugins are diverse and powerful, allowing a variety of transformations on the content within the declarative framework. On the other hand, destination plugins are limited because they only deal with Drupal entities.
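For example, a custom process plugin that can be dropped into any migration's process pipeline is just a small annotated class. The plugin ID and the behaviour below are invented purely for illustration:

namespace Drupal\my_migrate\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Strips a legacy prefix from incoming values.
 *
 * @MigrateProcessPlugin(
 *   id = "strip_legacy_prefix"
 * )
 */
class StripLegacyPrefix extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    // "LEGACY: My page" becomes "My page"; anything else passes through.
    return preg_replace('/^LEGACY:\s*/', '', (string) $value);
  }

}

In a migration's process section, such a plugin is then referenced by its ID, just like the plugins shipped with core.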

Incremental Migrations

The Drupal migrate framework supports incremental migrations as long as the source system can identify a certain “highwater mark” which indicates if the content has changed since a recent migration.

A common “highwater mark” is a timestamp indicating when the content was last updated.

If such a field is not present in the source, we could devise another such field as long as it indicates (linearly) that a source content has changed. If such a field cannot be found, then the migration cannot be run incrementally, but other optimizations are still available to avoid a repeat migration.

Handling dependencies and updates

The migrate framework does support dependencies between different migrations, but there are instances where there might be dependencies between two content entities in the same migration. In most cases, the migrate framework can transparently handle this by creating what are known as “stubs.” In more complex cases, we can override this behavior to gain more control over stub creation.

As discussed in the previous section, it is better to use “highwater marks” to handle updates, but they may not be available in some cases. For these, the migrate framework stores a hash of the source content to track if the migration should be run. Again, this is handled transparently in most cases but can be overridden when required.

Rollbacks and error management

As long as we follow the defined best practices for the migrate framework, it handles fancier features such as rollbacks, migration lookups, and error handling. Migrate maintains a record of each content piece migrated for every migration, its status, hashes, and highwater marks. It uses this information to direct future migrations (and updates) and even allow rollbacks of migrations.
 

Understanding the content

Another important part of the equation is the way content is generated. Is it multilingual? Is it user-generated content? Can content be frozen/paused during migration? Do we need to migrate the revision history, if available? Should we be cleaning up the content? Should we ignore certain content?

Most of these requirements may not be simple to implement, depending on the content source. For example, the source content may not have any field to indicate how the content is updated and in those cases, an incremental migration may not be possible. Further, if it’s impossible to track updates to source content using simple hashes, we may have to either ignore updates or update all content on every migration. Depending on the size of the source and transformations on the content, this may not be possible and we have to fall back to a one-time migration.

The capabilities of the source dictate the overall migration strategy.

Filtering content is relatively easy. Regardless of the source, we can filter or restructure the content within the migration process or in a custom wrapper on a source plugin. These requirements may not significantly impact the migration strategy.

Refactoring the content structure

A migration can, of course, be a straightforward activity where we map the source content to the destination content types. However, a migration is often a wonderful opportunity to rethink the content strategy and information flow from the perspective of end-users, editors, and other stakeholders. As business needs change, there is a good chance that the current representation of the content may not provide for an ideal workflow for editors and publishers.

Therefore, it is essential to look at the intent of the site and the user experience it provides, and to redefine what content types make sense now and in the near future. At this stage, we also look at common traits that distinguish the content we want to refactor and write mappings accordingly. With this, we can alter the content structure to split or combine content types, split or combine fields, transform free-flowing content to have more structure, and so on. The possibilities are endless, and most of these are simple to implement.

Furthermore, in many cases, the effort involved in actually writing the migration is not significantly different.
 

Drupal to Drupal migration

This is usually the most straightforward scenario. The core Drupal migrate framework already includes source plugins necessary for reading the database of an older version of Drupal (6 or 7). In fact, if the intention is to upgrade to Drupal 8 or 9 from Drupal 6 or 7, then the core provides migrations to migrate configuration as well as content. This means that we don’t even need to build a new Drupal site in many simple cases. It is simply a question of setting up a new Drupal 8 (or 9) website and running the upgrade.

However, Drupal is often not used for simple cases, and any non-trivial site needs some rebuilding.

A typical example is “views,” which are not covered by migrations. Similarly, page manager pages, panels, etc., need to be rebuilt as they cannot be migrated. Further, Drupal 8 has brought improvements, and updated techniques to build sites and the only option, in that case, is to build the functionality with the new techniques.

In some cases, it is possible to bring over the configuration selectively and remove the features you want to rebuild using a different system (or remove them altogether). This mix-and-match approach enables us to rebuild the Drupal site rapidly and also use the migrations provided in core to migrate the content itself. Furthermore, many contributed modules augment or support the core migration, which means that Drupal core can transparently migrate certain content belonging to contributed modules as well (this often happens in the case of field migrations). If the modules don’t support a migration path at all, this would need to be considered separately, similar to migration from another system (as explained in the next section).

Incremental migrations are simpler to write in case of Drupal-to-Drupal migration as the source system is Drupal and it supports all the metadata fields such as timestamps of content creation or updates. This information is available to the migrate framework, which can use it to enable incremental migrations. If the content is stored in a custom source within the legacy Drupal system and it does not have timestamps, a one-time migration may have to be written in that case. See the previous section on incremental migrations for more details.

While Drupal-to-Drupal migrations can be very straightforward and even simple, it is worth looking into refactoring the content structure to reflect the current business needs and editorial workflow in a better way. See the section on “Refactoring the content structure” for more details.
 

Migration to Drupal from another system

Migrating from another popular system (such as WordPress) is often accomplished by using popular contrib modules. For instance, there are multiple contrib modules for migrating from WordPress, each of which migrates from a different source or provides different functionalities. Similarly, contrib modules for other systems may offer a simple way to define migration sources.

Alternatively, the migrate framework can use the source database directly to retrieve the content. Drupal core provides a source plugin that can read any MySQL-compatible database, and there are contributed modules that allow reading from other databases such as MSSQL.

Similar to the Drupal migration source, features such as incremental migrations, dependencies, and update tracking may be used here as long as their conditions are satisfied. These are covered in detail in earlier sections. 

Check out the case study that outlines migrating millions of content items to Drupal from another system
 

Migration from unconventional sources

Some systems are complex enough to present a challenge during migration, even with the sophistication of source plugins and custom queries. Or there may be times when the content is not conventionally accessible. In such scenarios, it may be more convenient to have an intermediate format for content such as a CSV file, XML file, or similar formats. These source plugins may not be as flexible as a SQL source plugin (as advanced queries or filtering may not be possible over a CSV data source). However, with migrate’s other features, it is still possible to write powerful migrations.

Due to limitations of such sources, some of the strategies such as incremental migration may not be as seamless; nevertheless, in most cases, they are still possible with some work and automation.

An extreme case is when content is not available in any structured format at all, even as CSVs. One common scenario is when the source is not a system per se, but just a collection of HTML files or an existing web site. These cases are even more challenging as extracting the content from the HTML could be difficult. These migrations need higher resilience and extensive testing. In fact, if the HTML files vary significantly in their markup (it’s expected when the files are hand-edited), it may not be worth trying to automate this. Most people prefer manual migration in this case.
 

Picking a strategy

Wherever possible, we would like to pick all the bells and whistles afforded to us by the migrate framework, but, as discussed previously, a lot of it depends on the source. We would like every migration to be discrete, efficient with incremental migration support, track updates, and directly use the actual source of the content. However, for all of this to be possible, certain attributes of the source content system must be met as explained in the “Understanding the content” section.

The good news is that we often find a way to solve the problem and it is almost always possible to find workarounds using the Drupal migrate framework.

Jan 07 2021
Jan 07

Drupal 9 Module Development

Before we get our hands dirty with menus and menu links, let's talk a bit about the general architecture behind the menu system. To this end, we will look at its main components, some of its key players, and the classes you should be looking at. As always, no great developer has ever relied solely on a book or documentation to figure out complex systems.

Menus

Menus are configuration entities represented by the following class: Drupal\system\Entity\Menu. I previously mentioned that we have something called configuration entities in Drupal, which we explore in detail later in this book. However, for now, it's enough to understand that menus can be created through the UI and become an exportable configuration. Additionally, this exported configuration can also be included inside a module so that it gets imported when the module is first installed. This way, a module can ship with its own menus. We will see how this latter aspect works when we talk about the different kinds of storage in Drupal. For now, we will work with the menus that come with Drupal core.
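As a small illustration of the "menus are config entities" point, a menu can also be created in code, for example from a module's install hook; the IDs below are made up:

use Drupal\system\Entity\Menu;

// Create and save a new menu as a configuration entity.
Menu::create([
  'id' => 'my_module_footer',
  'label' => 'My module footer',
  'description' => 'Links shipped by my_module.',
])->save();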

Each menu can have multiple menu links, structured hierarchically in a tree with a maximum depth of 9 links. The ordering of the menu links can be done easily through the UI or via the weighting of the menu links, if defined in code.

Menu links

At their most basic level, menu links are YAML-based plugins. To this end, regular menu links are defined inside a module_name.links.menu.yml file and can be altered by other modules by implementing hook_menu_links_discovered_alter(). When I say regular, I mean those links that go into menus. We will see shortly that there are also a few other types.
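To make the altering part concrete, an implementation could look roughly like this; the module and link names are placeholders:

/**
 * Implements hook_menu_links_discovered_alter().
 */
function my_module_menu_links_discovered_alter(&$links) {
  // Rename and re-weight a link defined by another module, if it exists.
  if (isset($links['some_module.overview'])) {
    $links['some_module.overview']['title'] = t('Overview');
    $links['some_module.overview']['weight'] = 10;
  }
}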

There are a number of important classes you should check out in this architecture though: MenuLinkManager (the plugin manager) and MenuLinkBase (the menu link plugins base class that implements MenuLinkInterface).

Menu links can, however, also be content entities. The links created via the UI are stored as entities because they are considered content. The way this works is that for each created MenuLinkContent entity, a plugin derivative is created. We are getting dangerously close to advanced topics that are too early to cover. But in a nutshell, via these derivatives, it's as if a new menu link plugin is created for each MenuLinkContent entity, making the latter behave as any other menu link plugin. This is a very powerful system in Drupal.

Menu links have a number of properties, among which is a path or route. When created via the UI, the path can be external or internal or can reference an existing resource (such as a user or piece of content). When created programmatically, you'll typically use a route.
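For instance, creating a menu link programmatically as content, pointing at a route, could look roughly like this (the route, menu and weight are assumptions):

use Drupal\menu_link_content\Entity\MenuLinkContent;

// Add a link to the main menu that points to a route.
MenuLinkContent::create([
  'title' => 'Latest articles',
  'link' => ['uri' => 'route:view.frontpage.page_1'],
  'menu_name' => 'main',
  'weight' => 5,
])->save();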

Multiple types of menu links

The menu links we've been talking about so far are the links that show up in menus. There are also a few different kinds of links that show up elsewhere but are still considered menu links and work similarly.

Local tasks

Local tasks, otherwise known as tabs, are grouped links that usually show up above the main content of a page (depending on the region where the tabs block is placed). They are usually used to group together related links that deal with the current page. For example, on an entity page, such as the node detail page, you can have two tabs: one for viewing the node and one for editing it (and maybe one for deleting it). In other words, local tasks.

Local tasks take access rules into account, so if the current user does not have access to the route of a given tab, the link is not rendered. Moreover, if that means only one link in the set remains accessible, that link doesn't get rendered as there is no point. So, for tabs, a minimum of two links are needed for them to show up.

Modules can define local task links inside a module_name.links.task.yml file, whereas other modules can alter them by implementing hook_menu_local_tasks_alter().

Local actions

Local actions are links that relate to a given route and are typically used for operations. For example, on a list page, you might have a local action link to create a new list item, which will take you to the relevant form page.

Modules can define local action links inside a module_name.links.action.yml file, whereas other modules can alter them by implementing hook_menu_local_actions_alter().

Contextual links

Contextual links are used by the Contextual module to provide handy links next to a given component (a render array). You probably encountered this when hovering over a block, for example, and getting that little icon with a dropdown that has the Configure block link.

Contextual links are tied to render arrays. In fact, any render array can show a group of contextual links that have previously been defined.

Modules can define contextual links inside a module_name.links.contextual.yml file, whereas other modules can alter them by implementing hook_contextual_links_alter().

For more on the menu system and to see how the twist unfolds, do check out my book, Drupal 9 Module Development.

Thanks for the support.

Jan 06 2021
Jan 06
Jan 06, 2021 Product

Nowadays every organization needs (or has) some kind of group chat, like Slack or Microsoft Teams. But those chats are often detached from your other team activity: project management, social posts, files and folders. Communication, work and documentation easily get fragmented - very frustrating.

Often, group chat is just another island with a lot of distraction.

It doesn't have to be like that: our brand new and integrated group chats cause none of the above, but they áre:

Integrated with all other teamwork

Integrated in groups, use our group chat together with all your other team activity like:

  • Projects and tasks
  • Messages
  • Stories
  • Notebooks
  • Files and folders
  • Social posts
  • Polls
  • Culture/social questions
  • Check-ins
  • Shout-outs
  • Icebreakers
  • A task board (kanban) that's usable for users of all digital levels

Customizable, brandable

Just like the rest of the Lucius features, our group chat is:

  • Customizable to your organization's needs and workflows
  • ‘Brandable’ to your company's house style
  • Because: 100% open source

And last, but not least:

  • Multilingual
  • Also works easily with clients
  • Extendable with extra needed modules, integrations, functions or permissions
  • OpenSaas
  • Private hosting is possible

Basic features

group chat open source

Test our Group chat now, or install it yourself open source.

If you want to test our group chat this instant, that is of course possible: click below to get started. Or download and install OpenLucius open source yourself.

All-in-one toolkit for remote work -and culture.   Try for free now

Download and install OpenLucius open source yourself.   Download and install

Jan 06 2021
Jan 06

This function will remove the hook_query_entity_query_alter() implementation of the Group contrib module. If it doesn't, you could check the module weight of your custom module in the core.extension.yml file. There you can set

my_module: 0

to

my_module: 10

The module weight must be higher than the module weight of the contrib module which you want to modify.
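The removal function itself is not shown in this excerpt; as a rough sketch, it is typically a hook_module_implements_alter() implementation along these lines (my_module stands in for your custom module):

/**
 * Implements hook_module_implements_alter().
 */
function my_module_module_implements_alter(&$implementations, $hook) {
  // Drop the Group module's query alter so our own implementation wins.
  if ($hook === 'query_entity_query_alter' && isset($implementations['group'])) {
    unset($implementations['group']);
  }
}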

Jan 06 2021
Jan 06

Every day, millions of new web pages are added to the internet. Most of them are unstructured, uncategorized, and nearly impossible for software to understand. It irks me.

Look no further than Sir Tim Berners-Lee's Wikipedia page:

What Wikipedia editors write (source): the markup for Tim Berners-Lee's Wikipedia page, which is complex and inconsistent. What visitors of Wikipedia see: the rendered page in the browser.

At first glance, there is no rhyme or reason to Wikipedia's markup. (Wikipedia also has custom markup for hieroglyphs, which admittedly is pretty cool.)

The problem? Wikipedia is the world's largest source of knowledge. It's a top 10 website in the world. Yet, Wikipedia's markup language is nearly impossible to parse, Tim Berners-Lee's Wikipedia page has almost 100 HTML validation errors, and the page's generated HTML output is not very semantic. It's hard to use or re-use with other software.

I bet it irks Sir Tim Berners-Lee too.

What Wikipedia editors write (source): the markup for Tim Berners-Lee's Wikipedia page. What the browser sees: the HTML code Wikipedia (MediaWiki) generates, which could be more semantic.

It's not just Wikipedia. Every site is still messing around with custom <div>s for a table of contents, footnotes, logos, and more. I could think of a dozen new HTML tags that would make web pages, including Wikipedia, easier to write and reuse: tags for a table of contents, footnotes, logos, and many more.

A good approach would be to take the most successful Schema.org schemas, Microformats and Web Components, and incorporate their functionality into the official HTML specification.

Adding new semantic markup options to the HTML specification is the surest way to improve the semantic web, improve content reuse, and advance content authoring tools.

Unfortunately, I don't see new tags being introduced. I don't see experiments with Web Components being promoted to official standards. I hope I'm wrong! (Cunningham's Law states that the best way to get the right answer on the internet is not to ask a question; it's to post the wrong answer. If I'm wrong, I'll update this post.)

If you want to help make the web better, you could literally start with Sir Tim Berners-Lee's Wikipedia page, and use it as the basis to spend a decade pushing for HTML markup improvements. It could be the start of a long and successful career.

— Dries Buytaert


Jan 05 2021
Jan 05

How can we leverage Open Source contribution (in particular to Drupal) to maximize value for our customers? In this article, I would like to share the results of a recent workshop we held on this question as part of our internal gathering LiipConf.

Discover more about the CMS services our digital agency has to offer.

Together with a few colleagues we met for a brainstorming session. The goals set for this session were:

  • Share experiences about open source contribution at Liip and together with customers
  • Reflect on added value we can generate when contributing to Open Source
  • Mention any blockers, uncertainties or difficulties that you encounter when it comes to Open Source contribution
  • Come up with ways of including Open Source contribution into our workflows
  • Brainstorm what our customers would find valuable to know about Open Source contribution

Check-in

In our check-in, we asked which topics attracted people to come to the workshop. We had a good mix of engineers, product owners and UX folks from Drupal and Symfony in our meeting. The topics of interest ranged from “motivating clients to pay to create reusable solutions” and “sharing experiences in the context of contributions” to “getting started with contributions in 2021”, “listening in”, and “finding ways to give back”.

Method

Thanks to Emilie’s suggestion and facilitation, we used the Customer Forces Canvas to structure the discussion.

Open Source contribution board based on Miro.com and the Customer Forces Canvas

The canvas allowed us to capture different aspects of adopting contribution practices by asking structured questions:

  1. Triggering Event - What were those events that led to your decision to contribute back to Open Source?
  2. Desired Outcome - What outcome were you looking for?
  3. Old Solution - What solution were you using that was already in place?
  4. Consideration Set - What were alternative solutions that were considered?
  5. New Solution - What solution was selected? Why?
  6. Inertia - What were some concerns/anxieties you had before starting to contribute?
  7. Friction - What were some concerns after you started contributing?
  8. Actual Outcome - What was the actual outcome after starting to contribute? Did it meet your expectations?
  9. Next Summit - What would you like to see next for contribution? Why?

Discussion points

Examples mentioned were finding issues in existing Open Source solutions. Another key triggering event was that when the client understood how Open Source works, they would be much more motivated to fund contributions. Often it is the motivation of an individual or the team striving to create better solutions without the need to maintain custom code individually for a customer project.

Goals we are striving for when contributing to Open Source include externalizing maintenance efforts to the community at large as well as doing good. By contributing back we are fueling the ecosystem that keeps our software up to date and innovative. We create more sustainable solutions when we are able to use standardized building blocks and follow community best practices.

When facing contribution opportunities, we are often presented with various ways to solve the issue. Fix the issue in custom code (miss the chance of contribution), fix the issue in a contributed module or fix the issue in Drupal core. Depending on the layer of abstraction, we can shoot for quick solutions or spend more time working on a generic solution. Alternatives to fixing the issues ourselves also include that we sponsor other maintainers to work on a sustainable solution that includes the resolution of the current issue.

We have also encountered issues where relying on too much abstract code created a risk for the project over time, especially when you deviate from the standard components. In such cases it might become easier to internalize the functionality into the custom project’s code base so that it can be adapted without context switching, but at the cost of having to maintain the functionality without community support.

Even non-perfect code or work-in-progress can be released as Open Source so that others are able to build on it and eventually these building blocks will be further evolved. Sandbox projects or alpha releases can serve well as incubators for contributed code. Over time, when the project gets more mature, the semantic versioning approach with alpha & beta releases allows to specify well what users of the module can expect.

When discussing what was holding us back from contributing, many reasons can apply. Contributing to Drupal core takes more time than writing custom code. Sometimes it is just that folks involved don’t understand how Open Source works or what it is good for. When we create quick & dirty solutions, we sometimes don’t feel quite ready to Open Source them. Sometimes, we just don’t feel a need to contribute back because we can achieve our short term goals without doing so. Family folks mentioned that they can’t commit private time and focus on getting the job done during work time.

When discussing what was holding us back when making a contribution, we found that sometimes the effort invested doesn’t match the outcome. We need to invest more time than what we think is worth solving the problem. This can be especially driven by the fact that contributed code may imply higher quality standards enforced by peer-review from the community. It’s also the urge that once a solution is Open Source, we feel like we need to maintain it and invest more time continuously. If a custom solution is cheaper, why should the client pay for it when they cannot reuse it themselves? Sometimes we are not certain if anyone else will be willing to make use of our custom code.

We talked about the benefits that folks experienced once contribution was adopted as a practice: getting good community feedback on their solutions, and having their solutions improved and evolved further to match new use cases. Giving talks at conferences was also found to be valuable. As a next step for contribution, folks mentioned that they would like to get help promoting their contributed modules so that they get adopted by a wider audience.

We also identified some USPs (Unique Selling Propositions) for contribution during the discussion. Clients would not need to pay for improvements contributed by the community. The maintenance of solutions based on contribution becomes more reliable. Contribution elevated self-esteem for clients and teams and helped increase visibility. It also serves as a sales argument for agencies towards clients and helps engineers get hired by a Drupal agency like Liip. Some folks even manage to make money on platforms like GitHub Sponsors or Open Collective.

Takeaways

We closed our meeting to collect some takeaways and what’s next for folks in contribution. Here’s a list of the key takeaways:

  • A “contrib-first approach” that incorporates the contribution mindset
  • Adding contribution checkpoints into the definition of ready/done
  • Inviting for cross-community contribution between Symfony and Drupal
  • Raising contribution in daily meetings, motivating each other to speak at conferences
  • Making sure that our contributions are used by others
  • Helping to find areas of contribution for non-developers
  • Balancing being a taker vs. a maker
  • Evolving a plan to communicate our efforts around contribution

What’s next for you in contribution? Have you experimented with the Customer Forces Canvas? Thank you for improving Open Source & let us know in the comments.

Image credit: Customer Forces Canvas (c) LEANSTACK
https://leanstack.com/customer-forces-canvas

Jan 05 2021
Jan 05

MidCamp 2021 is going to be a camp like none other. We’re tired of virtual sessions and we’re reimagining our camp to make it more community-oriented and interactive! 

In early March 2020 when we made the decision to take MidCamp 2020 virtual, we had very little idea of how the year would unfold. Our 2020 camp was a blast, but 9-months deep into the pandemic in the US, we needed to reassess our situation for 2021. We did some digging, and tried to get back to our roots. Here’s where we landed.

Tl;dr

  • MidCamp is happening virtually March 24-27, 2021.
  • It’s going to be something totally different—focused on humans and providing personal growth through collaboration.
  • Tickets will be pay-what-you-wish.
  • Keep an eye out for sponsor info, coming soon.

WHY are we doing this?

  • We want to sustain the Drupal project & community. Local events are often considered the best onramp to the Drupal project, and we need to keep camps healthy to keep Drupal healthy.
  • We want to maintain our presence as a brand and a team. MidCamp is now 7-years old, we have a great team of organizers, and we want to keep the gang together.

WHAT are we doing?

MidCamp 2021 will be a four-day event, but this year we’re designing the program (and the tickets) to be much more freeform and drop-in/drop-out. None of us have the time or the energy for four full days of Zooming anymore.

  • Wednesday: Community Day. A new concept we’re piloting, Community Day is meant to onboard attendees to the Drupal Community. Attendees will be presented with a range of introductory presentations, interspersed with small-group mentoring sessions.
  • Thursday: Opening Ceremonies. After learning about Drupal and the Community, attendees will enjoy a day of lightly structured activities to decompress, have fun, and have some human time. Twitch party? Gather town? D&D quest? They’re all possible.
  • Friday: Unconference. Instead of formal sessions, we’ll do Drupally things in a one-day un-conference format. If you’re unfamiliar with the format, read about organizing and attending.
  • Saturday: Contribution Day. Our traditional day to give back to the Drupal project. All experience levels are welcome, and you might even get your first core commit!

WHO is this for?

MidCamp is for “people who use, develop, design, and support the Web’s leading content management platform, Drupal.” We wanted to elaborate on that statement this year.

MidCamp 2021 is for:

  • People who are completely new to Drupal, that could be:
    • a technology professional who has never used Drupal,
    • a recent dev bootcamp or computer science / information science graduate, 
    • a job-seeker or career-changer with an interest in being involved in a vibrant and supportive web development community.
  • Mid to Senior-level professionals who work with Drupal in any fashion.
  • Drupal contributors of any kind.

WHERE is it?

The Internet. We’ll host activities across a variety of platforms, but all will be accessible from anywhere in the world.

WHEN is it?

March 24-27, 2021. Most activities will occur during business hours, Central Time.

HOW (much) is it?

As our event will be non-traditional and much less costly to run than an in-person event, MidCamp 2021 will be pay-what-you-wish. Individual and corporate sponsorship information is coming soon.

In Conclusion

Thanks for sticking around. We’re excited for what 2021 has in store. Join the conversation on Slack, listen in on Twitter, or subscribe to the email list.

Jan 05 2021
Jan 05

Even though 2020 came to a close with an overwhelming sense of “good riddance,” the year was not all bad. It was filled with as many surprises as it was filled with opportunities for growth, learning, and many new developments.

The realities of remote work revealed new levels of resilience and flexibility, Drupal 9 was released right on time, and here at Promet Source, we pulled together a lot of collective brainpower to introduce new possibilities for empowering content editors while streamlining web development. 

Our weekly blog posts reflect our commitment to draw upon a depth and breadth of our team’s expertise to convey best practices, new insights, innovations, and thought leadership for the Drupal and web development communities.

Here are Promet's 10 blog posts that grabbed the most attention.  

 

1. Drupal Enabled Drag-and-Drop Content Management, by Chris O’Donnell

Drupal enabled drag and drop blog image

Leading up to the end-of-year launch of Provus, which offers a new approach to designing, developing, and managing Drupal sites with intuitive, no-code, drag-and-drop page-building tools, this post explained the foundations of component-based web design systems and the accompanying leaps forward for efficiency and content editor empowerment. Read Drupal Enabled Drag and Drop Content Management.

 

2. Provus! Drupal Content Editing Reimagined, by Mindy League

Provus Feature Image

Signaling new directions and game-changing possibilities for 2021, this final post of the year sparked a surge of interest in Provus, Promet’s new platform for better content editing in Drupal, and presented insight into the kind of thinking that drove the development of this new platform. Read Provus! Drupal Content Editing Reimagined.
 

3. How to Master Entity Access in Drupal, by Bryan Manalo

How to Master Entity Access banner

The first in a two-part series on Entity Access, this how-to provided an in-depth tutorial on hook entity access, along with a discussion of when and how to use it. Read How to Master Entity Access in Drupal.

4. How to Facilitate an Innovative Remote Meeting, by Mindy League

remote work illustration

Early into the pandemic, as many began looking for new ways to enhance engagement, Promet offered a new approach for breathing new life into remote meetings by applying the techniques of design thinking and human-centered design. Read How to Facilitate an Innovative Remote Meeting.

5. Anticipating Post Pandemic Web Design Trends, by Mindy League 

Post pandemic design trends

As Covid-19 heads for the history books, “normal” stands to look a lot different than how we remembered it. Pointing to design changes that have been sparked by global upheaval in past decades, this post looked at what’s next and cited upcoming trends for web design. Read Anticipating Post Pandemic Web Design Trends.

6. Remote Work Success in a Time of Caution and Quarantine, by Pamela Ross

Promet's Pamela Ross

With a track record of attracting talent from all over the world and effectively collaborating via Zoom, Promet Source entered the pandemic with an edge over companies that were scrambling to adjust to working remotely. This post shared some of Promet’s expertise on the topic with five key strategies for optimizing the remote work opportunities. Read Remote Work Success in a Time of Caution and Quarantine.

7. Drupal 9 Has Dropped! What to Do Now, by Aaron Couch

Drupal 8 to Drupal 9 migration

Despite a global pandemic, Drupal 9 was released on time, as promised, on June 3, 2020. This post covers the key features of Drupal 9 and lays out a strategy for assessing migration readiness. Read Drupal 9 Has Dropped! What to Do Now.

8. Pros and Cons of Five Web Accessibility Tools, by Denise Erazmus 

scales for weighing pros and cons

A wide range of tools is available to support ADA web accessibility compliance, but they vary in the number and types of errors they detect and the degree to which they can help ensure compliance. To help sort through the options, this post covers the five most popular tools and extensions, along with the key pros and cons of each. Read Pros and Cons of Five Web Accessibility Tools.

9. Always Be Optimizing for SEO, by Ishmael Fusilero

Optimize for SEO

This post explains why and how organizations need to approach SEO as an ongoing activity, consistently monitoring metrics, along with a strategy to leverage the intelligence hidden within the data. Read Always Be Optimizing for SEO.

10. Drupal 8 Load Testing with Locust, by Josh Estep

Load Testing with Locust

Load testing is an essential step both during development and prior to launch: it quantifies the amount of traffic a site can sustain. This post provides a how-to on using Locust as an open source load testing tool for Drupal 8. Read Drupal 8 Load Testing with Locust.

With a diverse talent base, Promet Source is well positioned to share expertise and insights that connect, engage, inform, and spark new ideas. Do you have big plans for your website in 2021? Let us know what we can do to help you achieve your goals!
 


Jan 05 2021
Jan 05

The following is a step-by-step guide to implementing a "reading minutes left" indicator for an article, blog post, or similar content, just like the one we see on medium.com.

The JS file

  • I have used this JS library.
  • Place the code below in a JS file named read-remaining-minutes.js and put it in the corresponding theme.
(function ($) {
  $.fn.readingTimeLeft = function (options) {

    var s = $.extend({}, {
      stepSelector: '*',
      wordPerMinute: 100,
      eventName: 'timechange'
    }, options);

    var $this   = $(this)
      , $window = $(window)
      , $steps  = $this.find(s.stepSelector);

    // For each step element, store the quantity of words to come
    $steps.each(function (i, el) {
      var textAhead = $steps.slice(i, $steps.length).text();
      $(el).data('words-left', textAhead.trim().split(/\s+/g).length);
    });

    // Filters elements that are in viewport
    $.fn.filterVisible = function () {
      var wW = $window.width(), wH = $window.height();
      return this.filter(function (i, e) {
        var rect = e.getBoundingClientRect();
        return rect.top >= 0 && rect.right <= wW &&
          rect.bottom <= wH && rect.left >= 0;
      });
    };

    function throttle(fn, limit) {
      var wait = false;
      return function () {
        if (wait) return;
        fn.call(); wait = true;
        setTimeout(function () { wait = false; }, limit);
      };
    }

    var triggerOn = 'scroll.' + s.eventName + ' resize.' + s.eventName;

    // Throttle updating to 50ms
    $(window).on(triggerOn, throttle(function (e) {
      var wordsLeft = $steps.filterVisible().last().data('words-left');
      $this.trigger(s.eventName, wordsLeft / s.wordPerMinute);
    }, 50));

    // Destroy function
    $this.on('destroy.readingTimeLeft', function (e) {
      $(window).off(triggerOn);
      $steps.removeData('words-left');
    });

    return $this;
  };
}(jQuery));
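  • By default the plugin counts every descendant element, assumes 100 words per minute, and fires a timechange event. These defaults can be overridden by passing options when the plugin is initialized; a minimal sketch (the selector and values here are only examples):
// Optional: tweak the plugin defaults at initialization time.
$('#calculatable-content').readingTimeLeft({
  stepSelector: 'p, li, h2, h3', // only count these elements
  wordPerMinute: 200             // assume a faster reading speed
});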
  • Next, in the template.php file of your theme, add the above JS file for the particular content type (for example, inside a preprocess function where $vars['node'] is available). You can do something like this:
if ($vars['node']->type == 'article') {
  drupal_add_js(drupal_get_path('theme', 'my_theme') . '/js/read-remaining-minutes.js');
}
  • After you have told Drupal to add the JS file through the above code, the plugin will be ready for the page.
  • Now you need to specify where you want to apply this functionality.
  • For that, I have a custom JS file named “my-custom-js-code.js” in the same theme, where I usually write all my custom JS. That is where the following code goes.
// Reading time left for a blog post
// #calculatable-content is the id of the content on which we want to
// apply the calculation for reading time
$('#calculatable-content').readingTimeLeft()
  .on('timechange', function (e, minutesLeft) {
    if (isNaN(minutesLeft)) {
      // .time-left is the class belonging to the read remaining div
      $('.time-left').hide();
    }
    else {
      // If less than 1 min remains then display "Content Finished",
      // else show the minutes left
      if (Math.round(minutesLeft) < 1) {
        $('.time-left').text('Content Finished');
      }
      else {
        $('.time-left').text(Math.round(minutesLeft) + ' min left');
      }
      $('.time-left').show();
    }
  });

$(window).trigger('scroll');
  • Here, once less than a minute of reading remains, the block shows “Content Finished”. I will explain the id and the class used below.
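  • If you prefer to follow Drupal's JavaScript conventions, the same initialization can also live inside a Drupal behavior, so it runs whenever content is attached. A minimal sketch for Drupal 7, assuming the same id and class as above:
(function ($) {
  Drupal.behaviors.readingTimeLeft = {
    attach: function (context, settings) {
      var $content = $('#calculatable-content', context);
      // Bail out if the content is missing or already processed.
      if (!$content.length || $content.hasClass('reading-time-processed')) {
        return;
      }
      $content.addClass('reading-time-processed');

      $content.readingTimeLeft()
        .on('timechange', function (e, minutesLeft) {
          if (isNaN(minutesLeft)) {
            $('.time-left').hide();
          }
          else {
            $('.time-left')
              .text(Math.round(minutesLeft) < 1 ? 'Content Finished' : Math.round(minutesLeft) + ' min left')
              .show();
          }
        });

      // Kick off an initial calculation.
      $(window).trigger('scroll');
    }
  };
})(jQuery);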

Modifying .tpl.php

  • We have placed the JS code we need. Now we need to link it to markup so that the block appears on the page.
  • I have a .tpl.php file named “custom-template.tpl.php” that is responsible for rendering all the HTML content for the particular page.
  • In this .tpl.php file, at the place where you want the "read remaining minutes" block of text to appear, you have to specify the HTML for it.
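  • As a minimal sketch (your template will differ; only the id and the class described below matter), the addition to custom-template.tpl.php could look like this:
<!-- The block that displays the minutes left. -->
<div class="time-left"></div>

<!-- Wrap the article body so the plugin knows what to measure. -->
<div id="calculatable-content">
  <?php print render($content); // or whatever this template already prints ?>
</div>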
 
  • The time-left class is the wrapper class for the block, that is the entire block of the text itself.
  • The id calculatable-content is what we are using to calculate the time left, which will dynamically change while you scroll through the page.

Implementing CSS

  • We need to add decent enough styling so that the block appears on the page without hurting the eyes!
  • You can use the following styles (written with SCSS-style nesting, so flatten the nesting if you are working in plain CSS); they place the block at the top right of the page.
.time-left {
  position: fixed;
  right: 0;
  top: 176px;
  padding: 10px 10px 10px 40px;
  background: #068bb8;
  color: #fff;
  font-size: 15px;
  line-height: 19px;
  cursor: default;
  border-bottom: 0px;
  z-index: 999;
  &:before {
    content: url('../../../../../sites/all/themes/my_theme/images/time-left-white.png');
    position: absolute;
    top: 12px;
    left: 15px;
    @media screen and (max-width: 767px) {
      top: 8px;
      left: 10px;
    }
  }
  @media screen and (max-width: 767px) {
    padding: 6px 6px 6px 35px;
    font-size: 12px;
  }
}

Final approach

Now you just need to clear Drupal's caches (for example with drush cc all), sit back, and enjoy. Observe how the time changes as you scroll through the page!

Jan 05 2021
Jan 05

You might want to add a Magnific Popup where there are multiple items, say images and videos, which on clicking open up in a popup that you can scroll through. Something like this: https://dimsemenov.com/plugins/magnific-popup/.

Worry not! You do not need to go through the entire documentation at the above link. I have done the hard work for you so that you can get it done in the blink of an eye!

Initialization and modification in custom JS

  • First you need to include the JS library in your theme.
  • The minified file is quite big, so I am not providing it here.
  • You can find the minified JS file here: https://github.com/dimsemenov/Magnific-Popup/blob/master/dist/jquery.magnific-popup.min.js.
  • Place this JS file in the theme you wish to use.
  • Next, in the template.php file of your theme, add the above JS file for the particular content type (for example, inside a preprocess function where $vars['node'] is available). You can do something like this:
if ($vars['node']->type == 'article') {
  drupal_add_js(drupal_get_path('theme', 'my_theme') . '/js/jquery.magnific-popup.min.js');
}
  • Once done, you need to write the custom JS where you want this Magnific Popup to be triggered.
  • The custom JS should look something like this:
// Gallery section magnific popup
if ($('.gallery-section .tab-content').length) {
  // magnificPopup for tab 1
  if ($('.gallery-section .tab-content .tab1').length) {
    $('.gallery-section .tab1').magnificPopup({
      delegate: 'a',
      type: 'image',
      gallery: {
        enabled: true
      }
    });
  }
}

Some things to note:

  • I had a tabbed gallery section. Each of the tabs contained a video as the first element and then the rest were images.
  • Here first I check if the gallery section exists. If so, then I again check if the particular gallery tab exists. If so, then for that particular gallery tab I implement the magnific popup.
  • “delegate: 'a'” means the popup functionality is attached to the “a” tags inside the container.
  • I have specified the type as image. You might then wonder how it would work for the video; I will cover that in a later section.
  • Finally, we enable the gallery option so that you can cycle through the items.

Implement the custom HTML

  • Implement the custom HTML as you like; in my case, a tabbed gallery section.
  • Let us see an example of the html I have used:


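A sketch of what the markup for one tab can look like, based on the notes that follow (one video link marked with mfp-iframe, then the image links; all paths and the video URL here are placeholders):
<div class="gallery-section">
  <div class="tab-content">
    <div class="tab1">
      <!-- The first item is a video, so its link gets the mfp-iframe class (see below). -->
      <a class="mfp-iframe" href="https://www.youtube.com/watch?v=XXXXXXXX">
        <img src="/sites/all/themes/my_theme/images/video-thumb.jpg" alt="tab video" />
      </a>
      <!-- The remaining items are plain image links. -->
      <a href="/sites/default/files/gallery/image-1.jpg"><img src="/sites/default/files/gallery/thumb-1.jpg" alt="" /></a>
      <a href="/sites/default/files/gallery/image-2.jpg"><img src="/sites/default/files/gallery/thumb-2.jpg" alt="" /></a>
      <a href="/sites/default/files/gallery/image-3.jpg"><img src="/sites/default/files/gallery/thumb-3.jpg" alt="" /></a>
      <a href="/sites/default/files/gallery/image-4.jpg"><img src="/sites/default/files/gallery/thumb-4.jpg" alt="" /></a>
    </div>
  </div>
</div>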



  • Now comes the fun part! All of the above are images, except for the first one, which is a video. For the video to work properly, you simply need to add the class “mfp-iframe” to that video’s “a” tag.
  • Here I have 1 video and 4 images, a total of 5 elements. So when you cycle through them, the counter below shows a total count of 5.
  • For sections where you have multiple tabs, you need to repeat the JS
$('.gallery-section .tab1').magnificPopup({
  delegate: 'a',
  type: 'image',
  gallery: {
    enabled: true
  }
});

for each of the tabs, with their corresponding classes or ids. Otherwise it will take the total count of all the elements in that particular section, not just that particular tab, and cycle through, say, 100 elements (however many the entire section contains) instead of the 5 elements in that particular tab. A loop-based alternative is sketched below.
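If the tab panes follow a predictable class pattern, you can avoid the repetition by looping over them, which also keeps each popup's item count scoped to its own tab. A sketch, assuming panes with classes like tab1, tab2, and so on:
// Initialize one gallery per tab pane so each popup only cycles
// through the items inside that pane.
$('.gallery-section .tab-content > div[class*="tab"]').each(function () {
  $(this).magnificPopup({
    delegate: 'a',
    type: 'image',
    gallery: {
      enabled: true
    }
  });
});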

I am not providing the CSS, as that depends entirely on how your section looks. Enjoy!

Jan 05 2021
Jan 05

10 Years! We're kicking off our 10th year of Drupal Career Online - the longest-running long-form Drupal training program in existence. To help mark the occasion, we thought it would be fun to share some of the things our students (both DCO and private training clients) have shared with us over the past 10 years that made us think, "yeah, you really should enroll in Drupal Career Online..."

  1. Not using Composer yet - this is more of a recent (Drupal 8+) development, but we're still surprised when we see folks not using Composer to manage their Drupal 8 codebase. The DCO teaches best practices for using Composer and the drupal/recommended-project core Composer template.
  2. Using the "Full HTML" text format for everything everywhere - it is just plain scary when we see this, as it usually indicates a lack of understanding of both Drupal core text formats and basic security practices. The DCO provides both instructor-led and independent-study lessons on text formats.
  3. Relying on a single layout tool - in Drupal 8+, there are multiple ways to layout a page. This includes block placement, custom templates, Panels, Paragraphs, and Layout Builder. Not understanding the strengths and weaknesses of each of the more widely used solutions can lead to "everything looks like a nail, so I'll use a hammer everywhere" solution, which can result in a poor implementation. The DCO covers the basics of each of these layout techniques.
  4. Fear of Drupal versions greater than 7 - "the drop is always moving" - Drupal is continually evolving (and so is the DCO!). Embracing emerging versions of Drupal, like 8+, keeps you current, makes you more employable, and introduces you to modern web development techniques.
  5. Modules are enabled and you have no idea why - one of the primary skills the DCO teaches is how to find answers, mainly by helping you create and grow your Drupal network. From classmates to the active DrupalEasy learning community, community mentors, and online Drupal etiquette, we show you how and where to efficiently find answers.
  6. Your site always has errors on the Status Report page - the DCO's "site maintenance" lesson begins with the Status Report page. We provide a step-by-step approach to troubleshooting Status Report (and other) issues that may appear on sites you maintain.
  7. Your available updates page has more red than green - updating modules can be scary. Git, Composer, database updates, and testing methodologies can sometimes make the seemingly simple task of updating a module arduous. Maybe you're the type that "updates all the things at once" and then crosses your fingers and hopes everything works. The DCO provides a step-by-step methodology for updating both Drupal core and contributed modules.
  8. Your site has one content type that is used for everything (aka, "I have no idea what entities, bundles, and fields are") - this is often a red flag that the site's information architecture (IA) isn't quite what it should be. Our site-building lessons include a healthy dose of IA, focusing on Drupal core entities, bundles and fields and how to efficiently map an organization's data to Drupal.
  9. Pathauto isn't installed nor enabled - maybe you're not the type to get up every morning and scour Twitter for the latest Drupal news. Luckily, we are, and much of the best-practice-y stuff we find goes directly into Drupal Career Online. We'll talk about contributed modules that most sites should absolutely be using.
  10. You have no idea what cron is (or if it is running) - when we perform site audits, this is normally one of the first things we look for on the Status Report page. The DCO covers this and other topics focused on Drupal best practices. 

If you're reading this and it is hitting close to home, consider joining us at one of our upcoming Taste of Drupal webinars, where we'll spend an hour talking and answering questions about the next semester of Drupal Career Online.
 


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web