Nov 08 2019

Each month, we do a roundup of all the blog posts we wrote in the previous month. So, no need to worry if you missed some of our posts from October - here’s an overview of all of them!

Understanding the job of an IT Project Manager

Our first post in October was authored by Espie Vidal, writer for the productivity and time management app Time Doctor. In it, she takes a detailed look at the job of an IT project manager: how it differs from that of an IT manager, what their educational requirements, responsibilities and daily tasks look like, etc. 

She points out that the job of an IT project manager is project-based, but it also entails taking care of a business’ IT requirements. As far as education goes, the role demands both managerial and technical experience, and a degree in Information Technology is desired by most hiring managers.

Read more

Microcopy: What is it and why is it important

The next blog post we published last month discusses an often overlooked, though no less important, aspect of user experience: microcopy, or UX copywriting. As opposed to marketing copy, which targets the prospective customer, microcopy addresses the customer turned user, who needs help with using the product or service; the copy is intended as a guide that facilitates that use.

One of the areas where good microcopy is vital is negative interactions with your brand, as it has the potential to transform them into positive ones. Most importantly, however, the microcopy should be well aligned with the overall design - it is the two together that provide the desired user experience.

Read more

Interview with Sascha Eggenberger of Unic: Designing Claro, Drupal's new admin theme

We continued with another post with a UX feel to it - a Drupal Community Interview with Unic’s Sascha Eggenberger, a designer and front-end developer who is also involved in the Drupal Admin UI and JavaScript Modernization initiative.

The initiative has been working tirelessly on Drupal’s new back-end theme Claro, but other exciting things are coming for Drupal’s admin user experience as well - among them a new front-end theme named Olivero, in homage to Rachel Olivero, who recently passed away.

Sascha, together with Cristina Chumillas and Archita Arora, also gave a session at DrupalCon Amsterdam on the future of the Drupal administration UI - conveniently, we published our interview right before the event, hopefully boosting the session’s attendance at least a little bit.

Read more

Why creative agencies should partner with a development agency

The last post we wrote in October was more business-oriented and touched upon a dilemma creative agencies face: whether to do all their development in-house or instead partner with a development agency.

In our experience, the second option yields better results for a number of reasons, among which are access to the latest technology trends and to top industry talent, as well as a smooth workflow and scalability without any overhead. 

On top of that, a strategic agency partnership also lets you focus on the things you excel at, while keeping complete control over your project with minimal input required on your side.

Read more

We hope you enjoyed revisiting our content from October. Follow us on Twitter if you don’t want to miss any of our upcoming posts!

Nov 08 2019

The API that the JSON:API module makes available is centered on Drupal's entity types and bundles. Every bundle receives its own, unique URL path, which all follow a shared pattern: /jsonapi/{entity_type_id}/{bundle_id} (for example, /jsonapi/node/article).

HTTP Methods

JSON:API specifies which HTTP methods to accept: GET, POST, PATCH, and DELETE. Notably, PUT is not included.

  • GET - Retrieve data, can be a collection of resources or an individual resource
  • POST - Create a new resource
  • PATCH - Update an existing resource
  • DELETE - Remove an existing resource

Authentication

Typically, some form of authentication is used for POST, PATCH, and DELETE requests. The examples below all use Basic Authentication: enable the HTTP Basic Authentication module, set the permissions for the API user (and role), and send the Base64-encoded username and password in the 'Authorization' request header.

Headers

Make sure to use the 'Content-Type' and 'Accept' headers when appropriate:

Accept: application/vnd.api+json
Content-Type: application/vnd.api+json
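As an illustrative sketch (the site URL, username, and password here are hypothetical), the requests described above can be assembled with Python's standard library; note the header values and the JSON:API document structure:

```python
import base64
import json
import urllib.request

# Hypothetical site and credentials -- substitute your own.
BASE = "https://example.com"
creds = base64.b64encode(b"apiuser:secret").decode()
headers = {
    "Accept": "application/vnd.api+json",
    "Authorization": f"Basic {creds}",
}

# GET: retrieve the collection of article nodes.
get_req = urllib.request.Request(f"{BASE}/jsonapi/node/article", headers=headers)

# POST: create a new resource; note the Content-Type header and the
# JSON:API document shape ("data" with "type" and "attributes").
payload = {"data": {"type": "node--article",
                    "attributes": {"title": "Hello JSON:API"}}}
post_req = urllib.request.Request(
    f"{BASE}/jsonapi/node/article",
    data=json.dumps(payload).encode(),
    headers={**headers, "Content-Type": "application/vnd.api+json"},
    method="POST",
)

# Sending either request with urllib.request.urlopen(...) should yield
# 200 OK for the GET and 201 Created for the POST.
```

A successful POST response body contains the newly created resource, including its server-assigned UUID.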

Response codes

The JSON:API Specification also dictates acceptable responses. The Drupal implementation uses a subset of those. The module can respond with the following codes:

  • 200 OK - All successful GET and PATCH requests
  • 201 Created - All successful POST requests (response includes the newly created resource)
  • 204 No Content - All successful DELETE requests

Detailed documentation: https://www.drupal.org/docs/8/modules/jsonapi/get-post-patch-and-delete

Sample Postman collection: https://gitlab.com/heykarthikwithu/drupal-jsonapi---sample-api-calls

Thanks :)

Nov 07 2019

6 minute read. Published: 7 Nov 2019. Author: Derek Laventure
Drupal Planet, Drupal 8, Lando, Drumkit

Over the last 2 or 3 years, the Drupal community has been converging around a solid set of Docker-based workflows to manage local development environments, and there are a number of worthy tools that make life easier.

My personal favourite is Lando, not only because of the Star Wars geekery, but also because it makes easy things easy and hard things possible (a lot like Drupal). I appreciate that a “standard” Lando config file is only a few lines long, but that it’s relatively easy to configure and customize a much more complex setup by simply adding the appropriate lines to the config.

In this post I want to focus on an additional tool I’ve come to lean on heavily that complements Lando quite nicely, and that ultimately boils down to good ol’ fashioned Makefiles. Last summer at DrupalNorth I gave a talk that was primarily about the benefits of Lando, and I only mentioned Drumkit in passing. Here I want to illustrate in more detail how and why this collection of Makefile tools is a valuable addition to my localdev toolbox.

The key benefits provided by adding a Drumkit environment are:

  • consistent make <target>-based workflow to tie various dev tasks together
  • ease onboarding of new devs (make help)
  • make multistep tasks easier (make tests)
  • make tasks the same in Lando and CI environments (i.e. make install && make tests)

Drumkit is not just for Drupal!

This example is using Drumkit for a Drupal 8 localdev environment, but there’s no reason you couldn’t use it for other purposes (and in fact, we at Consensus have lately been doing just that).

Basic Setup

As an example, suppose you’re setting up a new D8 project from scratch. Following this slide from my Lando talk, you would do the basic Lando D8 project steps:

  1. Create codebase with Composer (composer create-project drupal-composer/drupal-project:8.x-dev code --stability dev --no-interaction)
  2. Initialize Git repository (git init etc.)
  3. Initialize Lando (lando init)

For now, leave out the lando start step, which we’ll let Drumkit handle momentarily. We should also customize the .lando.yml a little with custom database credentials, which we’ll tell Drumkit about later. Append the following to your .lando.yml:

services:
  database:
    creds:
      user: chewie_dbuser
      password: chewie_dbpass
      database: chewie_db

Add Drumkit

To insert Drumkit into this setup, we add it as a git submodule to our project using the helper install.sh script, and bootstrap Drumkit:

wget -O - https://gitlab.com/consensus.enterprises/drumkit/raw/master/scripts/install.sh | /bin/bash
. d  # Use 'source d' if you're not using Bash

The install script checks that you are in the root of a git repository, and pulls in Drumkit as a submodule, then initializes a top-level Makefile for you.

Finally, we initialize the Drumkit environment by sourcing the d script (itself a symlink to .mk/drumkit) into our shell.

Drumkit modifies the (shell) environment!

Note that Drumkit will modify your PATH and BIN_PATH variables to add the project-specific .mk/.local/bin directory, which is where Drumkit installs any tools you request (e.g. with make selenium). This means if you have multiple Drumkit-enabled projects on the go, you’re best to work on them in separate shell instances, to keep these environment variables distinct.

Note that you can take advantage of this environment-specific setup to customize the bootstrap script to (for example) inject project credentials for external services into the shell environment. Typically we would achieve this by creating a scripts/bootstrap.sh that in turn calls the main .mk/drumkit, and re-point the d symlink there.

Set up your kit

Because we’re using Composer to manage our codebase, we also add a COMPOSER_CACHE_DIR environment variable, using the standard .env file, which Drumkit’s stock bootstrap script will pull into your environment:

echo "COMPOSER_CACHE_DIR=tmp/composer-cache/" >> .env
. d # Bootstrap Drumkit again to have this take effect

From here, we can start customizing for Drupal-specific dev with Lando. First, we make a place in our repo for some Makefile snippets to be included:

mkdir -p scripts/makefiles
echo "include scripts/makefiles/*.mk" >> Makefile

Now we can start creating make targets for our project (click the links below to see the file contents in an example Chewie project). For modularity, we create a series of “snippet” makefiles to provide the targets mentioned above:
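The real snippet files are linked from the original post; as a rough, hypothetical sketch, they might look something like this (DB credentials matching the .lando.yml above; recipe lines must be indented with a literal tab):

```make
# variables.mk -- project settings (customize per project)
DB_USER = chewie_dbuser
DB_PASS = chewie_dbpass
DB_NAME = chewie_db
PROFILE = standard

# lando.mk -- wrap the Lando lifecycle in make targets
.PHONY: start stop
start:
	lando start
stop:
	lando stop

# build.mk / install.mk -- build the codebase, then install the site
.PHONY: build install
build:
	composer install
install:
	lando drush site:install $(PROFILE) -y
```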

NB You’ll need to customize the variables.mk file with the DB credentials you set above in your .lando.yml as well as your site name, admin user/password, install profile, etc.

Now our initial workflow to setup the project looks like this:

git clone --recursive <project-repo>
cd <project-repo>
. d # or "source d" if you're not using Bash
make start
make build
make install

This will get a new developer up and running quickly, and can be customized to add whatever project-specific steps are needed along the way.

But wait - it gets even better! If I want to make things really easy on fellow developers (or even just myself), I can consolidate common steps into a single target within the top-level Makefile. For example, append the make all target to your Makefile:

.PHONY: all

all:
        @$(MAKE-QUIET) start
        @$(MAKE-QUIET) build
        @$(MAKE-QUIET) install

Now, the above workflow for a developer getting bootstrapped into the project simplifies down to this:

git clone --recursive <project-repo>
cd <project-repo>
. d
make all

Customize your kit

At this point, you can start adding your own project-specific targets to make common workflow tasks easier. For example, on a recent migration project, we had a custom Features module (ingredients) and a corresponding migration module (ingredients_migrate) that both needed to be enabled before migrations could run.

I created the following make targets to facilitate that workflow:
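The actual targets were linked in the original post; a hypothetical sketch of what they might look like (assuming Drush with the Migrate Tools module providing the migrate:import command, and tab-indented recipes):

```make
# migrate.mk -- enable the custom modules, then run all migrations
.PHONY: migrate-setup migrate
migrate-setup:
	lando drush en -y ingredients ingredients_migrate
migrate: migrate-setup
	lando drush migrate:import --all
```

With this in place, a single make migrate brings the site to a fully migrated state.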

We often take this further, adding a make tests target to setup and run our test suite, for example. This in turn allows us to automate the build/install/test process within our CI environment, which can call exactly the same make targets as we do locally.

Ultimately, Drumkit is a very simple idea: superimpose a modular Makefile-driven system on top of Lando to provide some syntactic sugar that eases developer workflow, makes consistent targets that CI can use, and consolidates multi-step tasks into a single command.

There’s lots more that Drumkit can do, and plenty of ideas we have yet to implement, so if you like this idea, feel free to jump in and contribute!

The article Lando and Drumkit for Drupal 8 Localdev first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Nov 07 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

DrupalCon Amsterdam Driesnote presentation

Last week, many Drupalists came together for DrupalCon Amsterdam.

As a matter of tradition, I presented my State of Drupal keynote. You can watch a recording of my keynote (starting at 20:44 minutes), or download a copy of my slides (149 MB).

Drupal 8 innovation update

I kicked off my keynote with an update on Drupal 8. Drupal 8.8 is expected to ship on December 4th, and will come with many exciting improvements.

Drupal 8.7 shipped with a Media Library to allow editors to reuse images, videos and other media assets. In Drupal 8.8, Media Library has been marked as stable, and features a way to easily embed media assets using a WYSIWYG text editor.

I'm even more proud to say that Drupal has never looked better, nor been more accessible. I showed our progress on Claro, a new administration UI for Drupal. Once Claro is stable, Drupal will look more modern and appealing out-of-the-box.

The Composer Initiative has also made significant progress. Drupal 8.8 will be the first Drupal release with proper, official support for Composer out-of-the-box. Composer helps solve the problem of Drupal being difficult to install and update. With Composer, developers can update Drupal in one step, as Composer will take care of updating all the dependencies (e.g. third party code).

What is better than one-step updates? Zero-step updates. We also showed progress on the Automated Updates Initiative.

Finally, Drupal 8.8 marks significant progress with our API-first Initiative, with several new improvements to JSON:API support in the contributed space, including an interactive query builder called JSON:API Explorer. This work solidifies Drupal's leadership position as a leading headless or decoupled solution.

Drupal 9 will be the easiest major update

Next, I gave an update on Drupal 9, as we're just eight months from the target release date. We have been working hard to make Drupal 9 the easiest major update in the last decade. In my keynote at 42:25, I showed how to upgrade your site to Drupal 9.0.0's development release.

Drupal 9 product strategy

I am proud of all the progress we made on Drupal 8. Nevertheless, it's also time to start thinking about our strategic priorities for Drupal 9. With that in mind, I proposed four strategic tracks for Drupal 9 (and three initial initiatives):

Strategic track 1: reduce cost and effort

Users want site development to be low-cost and zero-maintenance. As a result, we'll need to continue to focus on initiatives such as automated updates, configuration management, and more.

Strategic track 2: prioritizing the beginner experience

As we saw in a survey Acquia's UX team conducted, most people have a relatively poor initial impression of Drupal, though if they stick with Drupal long enough, their impression grows significantly over time. This is unlike any of Drupal's competitors, whose impression decreases as experience is gained. Drupal 9 should focus on attracting new users, and decreasing beginners' barriers to entry so they can fall in love with Drupal much sooner.

Beginners struggle with Drupal while experts love Drupal.

Drupal's sentiment curve goes in the opposite direction of WordPress', AEM's and Sitecore's. This presents both a big challenge and opportunity for Drupal.

We also officially launched the first initiative on this track: a new front-end theme for Drupal called "Olivero". This new default theme will give new users a much better first impression of Drupal, as well as reflect the modern backend that Drupal sports under the hood.

Strategic track 3: drive the Open Web

As you may know, 1 out of 40 websites runs on Drupal. With that comes a responsibility to help drive the future of the Open Web. By 2022-2025, 4 billion new people will join the internet. We want all people to have access to the Open Web, and as a result should focus on accessibility, inclusiveness, security, privacy, and interoperability.

Strategic track 4: be the best structured data engine

We've already seen the beginnings of a content explosion, and will experience 300 billion new devices coming online by 2030. By continuing to make Drupal a better and better content repository with a flexible API, we'll be ready for a future with more content, more integrations, more devices, and more channels.

Over the next six months, we'll be opening up these proposed tracks to the community for discussion, and introducing surveys to define the 10 inaugural initiatives for Drupal 9. So far the feedback at DrupalCon Amsterdam has been very positive, but I'm looking forward to much more feedback!

Growing sponsored contributions

In a previous blog post, Balancing Makers and Takers to scale and sustain Open Source, I covered a number of topics related to organizational contribution. Around 1:19:44, my keynote goes into more details, including interviews with several prominent business owners and corporate contributors in the Drupal community.

You can find the different interview snippets below:

  • Baddy Sonja Breidert, co-founder of 1xINTERNET, on why it is important to help convert Takers into Makers.
  • Tiffany Farriss, CEO of Palantir, on what it would take for her organization to contribute substantially more to Drupal.
  • Mike Lamb, Vice President of Global Digital Platforms at Pfizer, announcing that we are establishing the Contribution Recognition Committee to govern and improve Drupal's contribution credit system.

Thank you

Thank you to everyone who attended DrupalCon Amsterdam and contributed to the event's success. I'm always amazed by the vibrant community that makes Drupal so unique. I'm proud to showcase the impressive work of contributors in my presentations, and congratulate all of the hardworking people that are crucial to building Drupal 8 and 9 behind the scenes. I'm excited to continue to celebrate our work and friendships at future events.

Thanks to the 641 individuals who worked on Drupal 8.8 so far.

Thanks to the 243 different organizations who contributed to Drupal 8.8 to date.

Nov 07 2019

tl;dr: You can now sponsor my open source development work via GitHub Sponsors.

GitHub sponsors geerlingguy

GitHub Sponsors is the latest foray into building a more sustainable future for open source software development. There have been many attempts before, a few of which I tried (Gratipay, Patreon, etc.), but most of them never reached a critical mass, and at most you'd end up getting maybe $20-50/month out of the platform. Another prolific open source contributor I've long followed wrote about the topic of open source support and developer burnout in a post this year, Webform, Drupal, and Open Source...Where are we going?.

For a job (software development) where that's what someone makes in one hour, it's not worth putting in the hours to try to promote ongoing sponsorship when you'd get back only a few hours' worth of pay in a year. I also don't have an explicit goal of monetizing my work, so I often feel guilty even asking about monetary support. On the flip side, I would be more likely to persist through some of the stress of maintainership if I knew it was helping support my growing family.

I have long been involved in the Drupal and Ansible communities, and have submitted PRs, documentation fixes, and code reviews to hundreds of open source projects my work has touched. And if I could find a way to sustain some of my financial needs through open source work, I'd love to devote more time to:

  • Maintaining and improving Drupal VM, and modules like Honeypot (both of which are still seeing increased usage, but have been more or less in maintenance mode of late).
  • Improving my suite of Ansible roles, and building new Collections, to help solve many automation pain points I've encountered with Kubernetes, Drupal, PHP, and other applications.
  • Continuing revisions of my book Ansible for DevOps for another four years (at least!).
  • Attending more Drupal events in person (I've had to skip most of the camps I could've attended this year since I'm no longer sponsored by a particularly-large Drupal company).

I've had a Patreon account for some time, and it's good enough for a fast food meal or two every month, but I think Patreon is geared more towards the 'creative' community (YouTubers, podcasting, art, etc.). Maybe GitHub Sponsors will do better for open source contributors... and maybe not.

I'd be humbled and grateful if you could support me (geerlingguy) on GitHub Sponsors.

Nov 07 2019

Most software projects in the PHP ecosystem, including Drupal, can be installed and managed by using Composer, the PHP dependency manager. 

A dependency manager is a software tool that reads a configuration file for a project. Then it determines all of the underlying software that the project needs in order to work, along with what versions of those applications are compatible with all parts of the project. This sort of work would be difficult to do manually for a project with more than a handful of dependencies.

Why Does Drupal 8 Need Composer?

Since adopting the idea of “Proudly Found Elsewhere”, the Drupal project has incorporated many Symfony components and other libraries as dependencies. Doing this allows Drupal to focus more energy on the things that make Drupal unique. The basic things that all CMSs do, like routing, HTTP requests, and validation, can be handled by Symfony components. Composer simplifies the process of determining what the dependencies are and what versions should be used, then downloading the dependencies and placing them in the project where they are expected.

Tradeoffs

The most obvious “con” of using Composer is needing to learn a new thing. For developers, this is not so much a “con” because learning new technologies is what we do for a living. But for someone who usually downloads WordPress and FTPs it into a hosting account, the hurdle of Composer is significant.

Before any of this will work, you will need to have PHP and Composer itself installed.

Folder Structure

If you’ve never installed Drupal using drupal-composer/drupal-project, it’s important to know that the folder structure is different. The folder that you create your project within is the project root or repo root (I’m assuming that you will be using version control like Git). This is not your Drupal root (webroot), so any files and folders at this level are not directly addressable with a URL. The exception is the “web” folder which IS the Drupal root. You’ll need to configure your hosting environment to use the ‘web’ folder as the webroot.

  • Composer & Git commands need to be run from the top project folder.
  • Drush, however, may need to be run from within /web.

Additionally, this Drupal-project scaffolding creates two folders (‘contrib’ & ‘custom’) under each of the following folders (web/profiles, web/modules, & web/themes).  

  • Any profiles, modules, or themes installed via Composer will be placed inside the respective ‘contrib’ folders, which are ignored by Git version control. These don’t need to be committed to version control because they can easily be obtained by Composer anytime you clone the repo (see the ‘composer install’ command).
  • Your custom code should be placed inside a ‘custom’ folder which will be included in your repository when you make a commit.

Adding Modules and Other Drupal Dependencies

  1. Run composer require drupal/<module>:<version>
  2. Then enable the module in Drupal or with Drush

Package Versions

  • ~ : will only increase the last part of the version number (after the rightmost ‘.’)
  • ^ : increases based on semver (semantic versioning) rules
  • 1.x-dev or dev-1.x : latest dev release for 1.x branch

Instead of setting specific version numbers, I prefer to let my versions update within limits.

  • For Drupal Core, I use the tilde(~) with all three version places (~8.7.0) so Composer will update core automatically but not move to a new minor version without intervention.
  • For Drupal modules, I use the tilde(~) with only two version places (~2.0) so Composer will update with minor releases but not move to a new version of the module that could contain breaking updates like 2.x to 3.x.
  • For non-Drupal packages, I still tend to use tilde(~) but you’ll need to look at the history of the project and see how updates are handled to see where breaking changes are allowed in order to best prevent them without intervention.
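In composer.json, that policy might look something like this (the package names and version numbers below are purely illustrative):

```json
{
    "require": {
        "drupal/core": "~8.7.0",
        "drupal/pathauto": "~1.4",
        "symfony/yaml": "~4.2.0"
    }
}
```

Here ~8.7.0 permits 8.7.1, 8.7.2, and so on, but not 8.8.0, while ~1.4 permits 1.5, 1.6, and so on, but not 2.0.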

Removing Modules and Other Dependencies

  1. ALWAYS UNINSTALL THE MODULE IN DRUPAL FIRST.
  2. To remove the code from the project, run composer remove drupal/<module>. For non-Drupal packages, use the full package name as it appears on packagist.org or another packagist that you may have added (see Bonus Tip).

Checking for Packages That Need to Be Updated

This command will output a list of the dependencies you’ve added that aren’t up to date. It won’t display dependencies of dependencies, but that’s useful here because we can’t fix those directly: the maintainer of the project requiring the outdated dependency will need to update their project first. If you want to contribute to those projects, run the command without --direct and then report the issues.

  • Run composer outdated --direct

Performing Updates 

Composer will update the code base, but be sure to run the database updates.  Also, I always run Steps 1-4 in a local or development environment:

  1. To update the code base, run composer update --with-dependencies
  2. To update any database schema, enable ‘Maintenance Mode’ then run
    drush updb OR go to https://<yourDEVsite>/update.php from your Drupal backend
  3. Test
  4. Commit to the repository
  5. Switch to the Live environment and get the latest code from the repo.  This is usually a ‘git pull’ command but may be more complex depending on your environment and DevOps setup.
  6. To get all of the same package updates on the Live site that were added through step #1, run composer install on the Live environment.
  7. To update the database schema on the Live site as in step #2, enable ‘Maintenance Mode’ then run drush updb on the Live Environment.

Note: ‘--with-dependencies’ tells Composer to update everything: not just your dependencies, but also the dependencies of dependencies, ad infinitum.

Note 2: While ‘composer update’ checks for new versions and resolves all the dependencies, ‘composer install’ will only download and deploy the previously resolved dependencies. That’s what the composer.lock file is for. Composer.lock is rewritten on every update. Many times it will be the only changed file in your git commit.

Memory Issues

Composer can use a lot of memory, so sometimes you need to let it use all the memory.  This can be achieved on a per-command basis like:

  • Run php -d memory_limit=-1 composer update --with-dependencies

Or permanently in your environment:

  • Create/Edit the /etc/php-cli.ini file
  • Modify the line:
    • memory_limit = -1  

Bonus Tip: Install NPM and Bower Packages With Composer

When a Drupal module needs you to install a library, especially a JS or CSS library, you can do it with Composer by setting up Asset Packagist. My suggestion is to use this only for libraries that are used when rendering a page for your end user. This is not a good way to install the SASS or JS build system that you may use during development.

Follow the instructions at https://asset-packagist.org/
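As a hedged sketch of what those instructions boil down to, the key composer.json additions are a repository entry and installer paths for the asset packages; the exact keys depend on the installer plugin you use, and the library in the example require below is hypothetical:

```json
{
    "repositories": [
        {
            "type": "composer",
            "url": "https://asset-packagist.org"
        }
    ],
    "extra": {
        "installer-types": ["npm-asset", "bower-asset"],
        "installer-paths": {
            "web/libraries/{$name}": ["type:npm-asset", "type:bower-asset"]
        }
    }
}
```

After that, a library can be required like any other package, e.g. composer require npm-asset/select2, and it will land in web/libraries/.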

Nov 07 2019

All beautiful websites have one thing in common — a good theme. It should be not only beautiful but also mobile responsive, accessible, and ready to support modern website functionality.

This is all true of Olivero — the future main front-end theme for Drupal 9. And this future is just around the corner because Drupal 9 is coming! With it is coming the era of beautiful websites.

Why is a website theme important?

A website theme defines what the site looks like to visitors, which includes layout, styles, color schemes, typography, buttons, and much more. Beautiful themes make beautiful websites, creating visual appeal and your brand’s identity. But this is not all.

The theme’s responsibility goes beyond being beautiful. It can also greatly contribute to the smooth user’s interaction with the website, including on various mobile devices. So if beautiful website design goes hand-in-hand with the best usability practices, this eventually translates into more conversions.

The new Olivero theme in Drupal 9

The new theme’s official overviews say “Drupal 9: The Most Beautiful CMS Ever!” And you immediately know it’s true when you see Olivero.

New main front-end theme in Drupal 9 — Olivero

Olivero is a beautiful implementation of the community’s idea to create a new default front-end theme. The current default theme, Bartik, is ten years old already and was initially created for D7.

Bartik has served us as an amazing and very adaptable theme. In addition, it is mobile responsive out-of-the-box to meet D8’s mobile-first philosophy. However, Bartik’s design looks a bit outdated. So Drupal needed a new main front-end theme that would better showcase Drupal’s strength.

Olivero is expected to come in Drupal 9.1. The community keeps working on it, and anyone who wants to contribute to it is welcome.

Theme named in honor of Rachel Olivero

There is a nice tradition in Drupal to name main themes after outstanding programmers. For example, Bartik got its name from Jean Bartik, one of the original ENIAC programmers.

The tradition goes on with Olivero - it is named after Rachel Olivero, who was an outstanding advocate of website accessibility. Sadly, Rachel passed away in 2019. To honor her, the community kept her beautiful name alive in a beautiful theme. Patient, generous, and inclusive - these were Rachel Olivero’s qualities that they wanted the theme to embody.

Key features of Olivero theme in Drupal 9

Beautiful web design, simplicity with no clutter, a professional look, accessibility, flexibility in options, responsiveness on mobile devices — the list of its virtues could go on. Let’s outline at least a few of its numerous features for websites.

WCAG AA compliance

One of the top priorities for the theme’s creators was to make Olivero compliant with Web Content Accessibility Guidelines (WCAG), so they worked with website accessibility experts. All users, including those with various impairments, should find it easy to use websites with Olivero.

Support for the latest Drupal functionality

The Olivero creators have prioritized making it supportive of the recent Drupal features for websites such as Layout Builder, media embeds, second-level navigation, and more.

A modern and bright color palette

Websites will look beautiful with Olivero’s color scheme, with a base color of bright blue. In addition to being attractive, it supports Drupal’s branding. Combinations of darker and lighter colors help provide website accessibility.

Color palette of the new Olivero theme in Drupal 9

Simple but modern forms and buttons

Forms and buttons in Olivero are user-friendly, recognizable, and accessible. Forms have a left color bar, and their labels are placed above the fields to meet website accessibility requirements. Buttons are presented in two types: primary and secondary.

Forms and buttons in the new Drupal 9 front-end theme Olivero

A convenient sidebar region

Olivero features a beautiful sidebar to the right of the website’s main content area that allows you, for example, to add related content. The region proportions allow the content area to look more prominent, while the sidebar gives it good support.

Sidebar regions in new main Drupal 9 theme Olivero

Flexible header and navigation options

Navigation options can accommodate any website’s needs. The header is able to collapse into a hamburger-button menu when scrolling, which allows the menu to be accessible on long pages. The secondary dropdown menus are also supported by Olivero.

Navigation in new main Drupal 9 theme Olivero

RTL (right-to-left) text direction support

Drupal 8 already raised the bar for multilingual support very high. See why Drupal 8 is the best choice for multilingual websites, including those with RTL languages. The new theme in Drupal 9 raises it even higher: just take a look at the beautiful RTL implementation.

RTL support in new main Drupal 9 theme Olivero

Time for beautiful websites and other Drupal 9 benefits!

It looks like Olivero is one of the beautiful answers to the question of what’s new in Drupal 9, coming in June 2020!

As a web agency that creates beautiful websites with great functionality and optimizes existing ones, we would like to offer you our assistance in:

  • getting ready for Drupal 9, which involves a thorough clean-up of deprecated code
  • moving your website to Drupal, which is becoming “The Most Beautiful CMS Ever!” (and beauty is just a small part of Drupal’s strength)
  • creating a custom theme, or otherwise improving your website’s design to make it more beautiful, user-friendly, and accessible

Drop a line to our Drupal development team!

Nov 07 2019
Nov 07
Tom Thorp

Hi there! I'm Tom Thorp, an IT Consultant living in Miami on Queensland's Gold Coast. My life's work has always been around Information Technology. Hosting and designing websites, administering databases, workstations, laptops, tablets & servers - there isn't much I have not touched in my 30 years in the industry.

I'm a person who likes to take on a project and run with it. I'm always learning better ways to implement methods while on the job, whether it be computer code, fixing a configuration issue, or a process that might already be in place. My past work includes working for Pacific Pines State High School, RCI Australia and Classic Holidays.

At the moment I am exploring employment opportunities. If I have piqued your interest in what I have to offer, then I would love to speak with you.

Video Resume Icon
Nov 07 2019
Nov 07

A recently patched security flaw in modern versions of the PHP programming language is being exploited in the wild to take over servers, ZDNet has learned from threat intelligence firm Bad Packets.

The vulnerability is a remote code execution (RCE) in PHP 7, the newer branch of PHP, the most common programming language used to build websites.

The issue, tracked as CVE-2019-11043, lets attackers run commands on servers just by accessing a specially-crafted URL.

Exploiting the bug is trivial, and public proof-of-concept exploit code was published on GitHub earlier this week.

"The PoC script included in the GitHub repository can query a target web server to identify whether or not it is vulnerable by sending specially crafted requests," says Satnam Narang, Senior Security Response Manager at Tenable. "Once a vulnerable target has been identified, attackers can send specially crafted requests by appending '?a=' in the URL to a vulnerable web server."

ONLY NGINX SERVERS AFFECTED

Fortunately, not all PHP-capable web servers are impacted. Only NGINX servers with PHP-FPM enabled are vulnerable. PHP-FPM, or FastCGI Process Manager, is an alternative PHP FastCGI implementation with some additional features.

However, while PHP-FPM is not a standard component of Nginx installs, some web hosting providers include it as part of their standard PHP hosting environments.

One such case is Nextcloud, which issued a security advisory to its users on Thursday, October 24, urging them to update PHP to the latest releases, versions 7.3.11 and 7.2.24, which had been released the same day and included fixes for CVE-2019-11043. Many other web hosting providers are also suspected of running the vulnerable Nginx+PHP-FPM combo.

But there are also website owners who cannot update PHP or can't switch from PHP-FPM to another CGI processor due to technical constraints.

This blog post from Wallarm, the company that found the PHP7 RCE, includes instructions on how webmasters can use the standard mod_security firewall utility to block %0a (newline) bytes in website URLs, and prevent any incoming attacks.
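As a rough sketch of that idea (ModSecurity 2.x syntax; the rule id, phase and status below are placeholders, and Wallarm's post has the actual recommended rule):

```apacheconf
# Hypothetical mitigation sketch: deny any request whose URI contains an
# encoded newline (%0a / %0A), which this exploit relies on.
SecRule REQUEST_URI "@rx %0(a|A)" "id:10001,phase:1,deny,status:403"
```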

Wallarm credited its security researcher Andrew Danau for discovering the bug during a Capture The Flag (CTF) competition last month.

Due to the availability of public PoC code and the simplicity of exploiting this bug, website owners are advised to check server settings and update PHP as soon as possible if they run the vulnerable configuration.

Nov 06 2019
Nov 06
Shopping cart usability & Commerce Cart Flyout Drupal module

The shopping cart is a vital place in your online store where the moment of truth often happens — the customer either proceeds to checkout or abandons the purchase.

The situation can be tipped in favor of option 1 if you increase shopping cart usability. Among the nice Drupal 8 shopping cart modules, there is one we would like to discuss today: the Commerce Cart Flyout module. Let’s see what it does in the usability arena!

Read more
Nov 06 2019
Nov 06

The hardest thing I find with tests is understanding errors. Every time I think I've got debugging output sorted, I find a new layer where it doesn't work, I've got nothing, and I'm in the dark with something that's crashing and I don't know why.

The first layer is simple: errors in your test code itself. For example, make a typo in your tests/src/Functional/MyTest.php and PHPUnit crashes and you see the error in the terminal.

But when it's site code that's crashing, you're dealing with a system that is being driven by code, and therefore, you can't see it. And that's a major obstacle to figuring out a problem.

The HTML output that Drupal's Functional and Functional Javascript tests produce is a huge help: every time your test code makes a request to the test site, an HTML file is written to the test files directory. If your site crashes when your test makes a request, you'll see the error and the backtrace there.

However, there's no such output when an AJAX request happens in a Functional Javascript test. And while you can create a screenshot of what the page looks like after the request, or another HTML file of the page (see https://www.drupal.org/project/drupal/issues/3090498 for instructions on how; the issue is about how to make that automatic, but I have no idea how that might be possible), you can't see the actual error, because AJAX requests that fail just sit there doing nothing. There's nothing useful to see in the browser.

So we need to see logs. When a real site has an AJAX crash, with a human-controlled web browser making the request, you can go and look in the logs. With a test site, the log table is zapped when the test is completed.

Fortunately, Drupal 8's pluggable logging means there are other ways of getting hold of them, more permanent ways.

I first tried the log_stdout module. This outputs log errors to STDOUT. If you're running on Docksal, as I am, you have an extra layer to get through to see that. You can monitor the cli container with fin logs -f cli, and with that module, add a | ag WATCHDOG to filter.

However, I wasn't seeing backtraces in this output, and I gave up figuring out why.

So I tried the filelog module instead, which, as the name implies, writes logs to a simple text file. This needs a little more work, as by default it writes to 'public://logs'. This means that each run of the test gets its own log file, which is perhaps what you want, but for my own use I wanted a single log file I could tail -f in a terminal window for continual monitoring.

A quick bit of config setting in the test's setUp() does the trick:

$this->config('filelog.settings')
  ->set('location', '/var/www/docroot/sites/simpletest/logs')
  ->save();

And I think that's me at last sorted.

Nov 06 2019
Nov 06

We’ve recently given the dimplex.co.uk site a new facelift and stumbled across an interesting problem during development when it came to implementing the new product carousel on their product pages: more specifically, the handling of the images themselves.

The design brief stipulated that the transparent studio style product images required a light grey background behind them, giving the impression of a product floating nicely in the middle of a light grey surrounding.

We had 2 problems here:

1. A lot of the studio-style product images didn’t have sufficient transparent space around them, and we ended up with unsightly results; images would be touching the edges of their container and weren’t sitting in the middle as intended. We needed to cater for these types of images.

2. We have a mix of studio and lifestyle shots. We couldn't just apply the same image style to both types of image; we would have to come up with something magic/clever to distinguish a studio shot from a lifestyle shot and then apply an image style accordingly.

Given that we are ComputerMinds (we massively care about our clients and we love a challenge), and knowing that the client would otherwise have to manually go back, edit and re-upload thousands of images, we decided it would be cool if we could come up with a code solution for them. Below is a brief outline of how we achieved a code solution to a design/content problem. Enjoy!

Lifestyle and studio shots

Our concept was pretty clear: for images that had at least one, and no more than three, edges containing transparent pixels, apply a Drupal image style with a custom effect that would “pad” out the image. The idea was that if an image had four transparent edges, it already had sufficient space around it. If it had no transparent edges, we knew it was a lifestyle product shot, e.g. a radiator in a lounge.

I started looking at the possibilities of detecting transparent pixels with PHP and came across a handy function from the PHP GD image library called imagecolorat(). Using this function and some hexadecimal conversion madness (please don't ask me to explain!), we can detect what type of pixel we are looking at, given a set of coordinates.

// The result of $transparency will be an integer between 0 and 127 
// 127 is transparent, 0 is opaque/solid.
$rgba = imagecolorat($image_resource, $x, $y);
$transparency = ($rgba >> 24) & 0x7F;

Now we needed to run this function for every pixel along all four edges. So, first things first, grab the image width and height and subtract 1 from each. Subtracting 1 ensures you won’t hit a PHP notice about being “out of bounds” - we’re not playing a round of golf here :). Next, we need to sort out the coordinate ranges for us to loop over:

// Top side
X = 0 to $image_width
Y = 0

// Bottom side
X = 0 to $image_width
Y = $image_height

// Left side
X = 0
Y = 0 to $image_height

// Right side
X = $image_width
Y = 0 to $image_height

For each permutation of coordinates (x=0, y=750; x=1, y=750; x=2, y=750; etc.), we simply check each pixel result from imagecolorat() and save the result to an array to check later on. Once we have detected both a transparent and a non-transparent pixel (remembering our magic number 127), we break out, because we have all the information we need from this particular edge.

Once this process has been completed for all four sides, we do a simple check to see if we have a mix of transparent and non-transparent pixel edges. If we do, we pass the image along to our custom image style for processing.
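Putting the scanning logic together, here is a simplified, self-contained sketch (the function names are ours for illustration, not the actual project code; the real implementation also records non-transparent pixels and breaks out of the loop early):

```php
// Illustrative helper (not the actual project code): walk one edge of a GD
// image resource and report whether it contains any transparent pixels.
// $coords is a list of array($x, $y) pairs describing the edge.
function edge_has_transparency($image_resource, array $coords) {
  foreach ($coords as $coord) {
    list($x, $y) = $coord;
    $rgba = imagecolorat($image_resource, $x, $y);
    // The alpha channel lives in bits 24-30: 127 = fully transparent, 0 = opaque.
    if ((($rgba >> 24) & 0x7F) === 127) {
      return TRUE;
    }
  }
  return FALSE;
}

// Build the coordinate pairs for the top edge of a $width-pixel-wide image,
// remembering the "subtract 1" rule to avoid out-of-bounds notices.
function top_edge_coords($width) {
  $coords = array();
  for ($x = 0; $x <= $width - 1; $x++) {
    $coords[] = array($x, 0);
  }
  return $coords;
}
```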

Our custom Drupal image style uses the “define canvas” image processor from the contrib module imagecache actions. We define our own image style effect but use imagecache_actions’ “define canvas” processor to transform our image. This is where we add X amount of pixels to increase the “padding” around the image.

In order to create a custom Drupal image style, we would need to implement hook_image_effect_info() and place any effect code into the "effect callback". See below for an example.

/**
 * Implements hook_image_effect_info().
 */
function MY_MODULE_image_effect_info() {
  $effects = array();
  
  // Remember to replace MY_MODULE with the name of your module (in lower case).
  $effects['MY_MODULE_transparent_padding'] = array(
    'label' => t('Transparent padding'),
    'help' => t('Applies padding if an image has a mix of transparent and solid pixels around the edges of an image.'),
    'effect callback' => 'MY_MODULE_transparent_padding_effect_callback',
    'form callback' => 'MY_MODULE_transparent_padding_form',
    'summary theme' => 'transparent_padding_summary',
  );
  
  return $effects;
}

/**
 * Callback for MY_MODULE_transparent_padding image effect.
 */
function MY_MODULE_transparent_padding_effect_callback(stdClass $image, array $data) {
  if ($image->info['mime_type'] == 'image/png') {
    $image_height = ($image->info['height'] - 1);
    $image_width = ($image->info['width'] - 1);
    $image_resource = $image->resource;

    // Do the coordinate magic to determine transparent and non-transparent pixels.
    // Build up the $data array (this will contain the background colour and the
    // number of pixels to pad out the image).

    // Use imagecache_actions' "define canvas" effect and pass in the $data array.
    $success = image_toolkit_invoke('definecanvas', $image, array($data));
    return $success;
  }
}

Below is the result of our custom image style effect settings form, as defined above in hook_image_effect_info() via the "form callback". (The form was taken directly from imagecache_actions' definecanvas effect, with a couple of edits - so credit goes to them.)
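For reference, a bare-bones sketch of what such a form callback might look like (the field name, default value and wording here are invented for illustration, and t() is Drupal's translation function; the real form was adapted from imagecache_actions):

```php
/**
 * Hypothetical form callback for the transparent padding effect.
 */
function MY_MODULE_transparent_padding_form(array $data) {
  $form = array();
  $form['padding'] = array(
    '#type' => 'textfield',
    '#title' => t('Padding (pixels)'),
    '#description' => t('Number of transparent pixels to add around the image.'),
    '#default_value' => isset($data['padding']) ? $data['padding'] : 50,
  );
  return $form;
}
```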

Custom image style settings page

And the result is shown below: a before and after. Much better! Furthermore, we’ve saved the client from having to manually edit thousands of images and then manually re-upload them. Win win!

Image before processing

Image after processing

Nov 06 2019
Nov 06

What are your goals for marketing automation? Every organization has different targets to meet with its marketing automation strategy. According to a study, the most prominent objective is to optimize productivity. Other significant goals are to increase marketing ROI, acquire more leads, analyze performance, and align marketing and sales. With tons of marketing automation platforms available on the market today, it is hard to choose the tools that will work best for you.

Drupal CMS lets you create compelling digital experiences. Its ease of use and powerful features make it a great platform for a marketer. Integrating Drupal with marketing automation tools will not just enable you to enhance the user experience but it can turn into an intelligent lead generation and nurturing tool. Drupal 8 offers seamless integration modules with top marketing automation tools like Hubspot, Mailchimp, Google Analytics and many more.

Why do I need a Marketing Automation Tool, you ask?

 “Good marketing makes the company look smart; Great marketing makes the customers feel smart” - Joe Chernov (VP Marketing @ Pendo). 
Great marketing needs more than just a physical team who can manually manage your marketing workflows. Great marketing needs a Marketing Automation tool to help you nurture your leads and expedite your whole marketing process. Did you know lead nurturing can increase your revenue growth by 85%?

  • Marketing Automation tools help convert raw leads into nurtured, marketing-qualified leads that are turned over to the Sales team for further customer relationship management. This applies to both B2B and B2C organizations.
  • It helps track your prospective customer’s activities right from when they visit your website to reading your blogs or filling out a form. 
  • You can also schedule and track your marketing campaigns via email or social media or any other communication.
  • Prospective customers can be easily segmented into suitable mailing lists based on their interests or preferences and will receive/view only relevant content, thus giving you happier (also read ‘less annoyed’) leads. 
  • A ‘drip campaign’ (email campaign) can be scheduled depending on the right time and right interest shown by your leads that will help you stay on top of their minds thereby nurturing your leads. 
  • You will also be able to see the reports/analytics and analyze the success of your marketing campaigns.

Drupal 8 Modules for Marketing Automation 

Drupal is not only known for its robustness and scalability but also for its huge active community of contributors. You will find various contributed marketing automation modules or plug-ins within Drupal CMS. They are free and work out of the box, but sometimes might not provide as many features as third-party tools. There are a huge number of third-party Marketing Automation software products that integrate seamlessly with your Drupal website.

Webform

No, the Webform module is not a marketing automation module, but it is one of the most vital building blocks of an integration between Drupal and third-party marketing automation software. The Drupal Webform module is used to collect user data via forms. You can build forms like surveys, simple newsletter signup forms or contact forms. This submitted data is then pushed to third-party marketing automation systems. The Drupal 8 Webform module adds many more features, like an object-oriented API, source editing, new and improved form elements, extendable plugins, better documentation and more.

Webform Module for Drupal 8 – Form elements

Marketo MA

Marketo is a very popular and widely used marketing automation tool that automates activities such as lead tracking and nurturing, personalization, analytics, advertisements, social marketing, automated campaigns and much more across multiple devices. The Marketo MA Drupal module helps you capture lead data during form submission and adds tracking capabilities to your Drupal website via the Marketo Munchkin tracking codes and API. It also integrates with the Webform module.

Marketo MA Module for Drupal 8 – Setting up

Eloqua 

The Eloqua marketing automation tool by Oracle allows for marketing campaign management, email automation and tracking, lead management and engagement, and more. The Eloqua Drupal module integrates Eloqua tracking and the Eloqua Webforms submodule with your Drupal website. In Drupal 8, the Eloqua module has been split into separate modules: the Eloqua Tracking module, the Webform Eloqua module, and Eloqua API Redux (a Drupal 8 integration with the Eloqua REST API).

Eloqua Module for Drupal 8 – High-level design

HubSpot 

HubSpot marketing automation is well-known for providing users with a wide range of inbound automation tools: the marketing hub, the sales hub and the service hub. Each of them can function alone or all together. The HubSpot Drupal module integrates with Webform and the HubSpot API. Once a user enters their information via Drupal 8 Webforms, the records are sent to HubSpot’s lead management tool, where the leads can be tracked and nurtured.

HubSpot Module for Drupal 8 – Setting up

Pardot

Pardot, the marketing automation service by Salesforce, offers marketers a host of activities like digital marketing campaign management, customer behavior tracking, SEO, website tracking, lead generation and nurturing, etc. The Pardot Drupal 8 module adds web analytics to your Drupal website, allowing marketing and sales departments to create, deploy and manage online marketing campaigns.

Pardot Module for Drupal – Configuring

Mautic

By now you must have heard about the acquisition of Mautic, the first open-source marketing automation platform, by Acquia. Mautic allows for multi-channel communications and campaign management, visitor tracking, email marketing, content customization and personalization and more. The best way to integrate Mautic with your Drupal website is with the Webform Mautic module. It lets you add Mautic handlers to your Webforms, allowing submitted data to be sent to your Mautic list. To track website contacts, the Mautic tracking code should be installed on your Drupal website.

Mautic Module for Drupal – Setting up

Google Analytics

Fun fact: did you know the co-founder of Urchin (now Google Analytics) sold his company to Google on his wedding day (April 2005)? Google Analytics is the most widely used website analytics tool in the world today. No marketing automation is complete without analytics, and GA does a great job at it. The Google Analytics Drupal module lets you add a web statistics tracking system to your website. It can track domains, users, files, site search, AdSense, custom code snippets and much more. To generate reports for all your tracked data, the Drupal module for Google Analytics Reports can be installed.

Google Analytics Module for Drupal – Setting up

MailChimp

MailChimp is a very popular email automation platform that does more than just send emails. You can create custom campaigns, send and schedule automated emails based on predefined triggers, track and monitor customer behavior, personalize content, generate reports, etc. The Drupal MailChimp module integrates your Drupal website with MailChimp, helping you track visitors and build mailing lists. These lists are sent to MailChimp, where email automation and other features can be accessed. You can also create signup forms and campaigns, and track activity from your Drupal website. Your visitors also have control over which of your email lists they want to be on or off.

MailChimp Module for Drupal – Sign-up form


 

There are a few Marketing Automation tools that include a collection of marketing components:

IBM Marketing Cloud is a hybrid marketing automation tool based on Silverpop’s cloud-based marketing automation software, DemandTec’s cloud analytics and Xtify’s mobile messaging models and enhanced by IBM marketing software.

Adobe Marketing Cloud includes components like Adobe Target, Adobe Social, Adobe Campaign, Adobe Media Optimizer and Adobe Analytics. Adobe Analytics (previously SiteCatalyst) is a module that you can integrate with your Drupal website that helps provide detailed statistics about website traffic and can categorize customers based on their locations, preferences and behavioral traits.

Nov 06 2019
Nov 06

Skpr provides a compelling command line workflow for developers.

In this blog post we will be demonstrating Skpr by going through the fundamental commands: package, deploy and config.

Package

Modern applications require a set of steps to prepare the application for deployment. These steps might include:

  • Installing dependencies, e.g. with Composer
  • Building the theme, e.g. with Gulp
  • Installing extra packages

The outcome of this preparation then needs to be stored so it can be deployed onto the platform. This process is known as packaging and is accomplished in Skpr by running the command:

skpr package <version>

As you can see in the diagram below, this command does a lot of heavy lifting. It not only compiles your code, it also splits the application into individually scalable components and pushes them to the Skpr platform, ready for deployment.

Diagram describing that the package command builds 3 artefacts. Nginx, FPM and CLI.

Also of note, this command can be run by developers and automation alike. Typically this command would be run as part of a pipeline in conjunction with the deploy command for a continuous deployment workflow.
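As an illustration of that workflow (the CI config format, job name and environment are hypothetical; adapt them to your own pipeline system):

```yaml
# Hypothetical pipeline job: package the current commit, then deploy it to a
# dev environment using the two commands covered in this post.
deploy-dev:
  script:
    - VERSION=$(git rev-parse --short HEAD)
    - skpr package "${VERSION}"
    - skpr deploy dev "${VERSION}"
```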

Deploy

Now that our application is packaged, let’s deploy it!

Deploying the application is as simple as running the below command. Seriously, it’s that easy.

skpr deploy <environment> <version>

While simple on the surface, Skpr is actually orchestrating a catalog of cloud managed services.

  • CDN / Cache
  • Certificates
  • Database
  • Storage
  • Search
  • SMTP

Diagram of how the Skpr deploy command interacts with the API and AWS Cloud Services.

These services are then exposed to the application through Skpr’s configuration system.

Config

The Twelve-Factor app manifesto calls for strict separation of configuration from code. This approach provides several advantages:

  • Sensitive values such as API tokens and private keys will not be leaked if the codebase is ever exposed.
  • There is no need for a switch statement for each environment defining different variables for dev, staging, etc.
  • Feature toggles can be used to dynamically enable functionality without a deployment.

Out of the box, Skpr provides configuration for:

  • Database connection details
  • SMTP credentials
  • File storage locations, e.g. public / private / temporary

A terminal showing the output from Skpr config list.

As a developer, you can also add your own custom configuration, e.g. API keys for an integration.

In this example we are adding an API key for MailChimp and flagging it as a secret to prevent the key from being accidentally exposed (see the [secret] in the command line image above).

skpr config set --secret dev mailchimp.key xxxxxxxxxxxxxxxxxxx

Details on how to configure your application to consume these configuration key/values can be found here.

Conclusion

Skpr provides a simple set of commands for developers to "get the job done".

If you would like to dive into more of the Skpr commands check out our documentation site, or contact us for a demo via the skpr.io website.

Photo of Nick Schuch

Posted by Nick Schuch
Sys Ops Lead

Dated 6 November 2019

Add new comment

Nov 05 2019
Nov 05

Dries also spoke about the importance of diversity, the need to streamline processes, and the need for good tooling. He noted that Drupal needs more contributors and, last but not least, more sponsored contributors. You can read about this in his blog post “Balancing Makers and Takers to scale and sustain Open Source”, published this summer. The post is a great read on how we can grow and scale Open Source with “Makers” and “Takers”. It highlights how we should try to turn Takers into Makers and then promote the Makers, so that in the end both the Makers and Drupal benefit.

In the DriesNote, Dries interviewed Baddý, our CEO, about her vision of contribution, and took 1xINTERNET as an example of a company that has become one of Drupal’s top contributors. During the interview, Baddý explained that it took her seven years to realise that she could contribute to Drupal as a non-coder, and that there was a community behind the software she was using. Seven years is a long time in a competitive environment. How can we make sure our users find it worthwhile to become a Maker? And how can we make sure users find a way to take part, and realise that it is appreciated by our community? A Maker might get a project and, possibly at the same time, buy a sponsoring package at DrupalCon, help develop Drupal by sending in bug fixes and new features, and perhaps organise a local event. This is a very valuable Drupal user. Meanwhile, a Taker might get a project and possibly sponsor some part of DrupalCon, which is great, but certainly not as impactful as when a Maker gets the same project.

Nov 05 2019
Nov 05

Shawn McCabe, Acro Media’s CTO, recently made waves when he proclaimed through our blog that Ubercart is dead. We received both praise and criticism from the Drupal community for saying it, but the truth of the matter is that Ubercart, once the primary module businesses relied on for adding ecommerce functionality into the Drupal CMS, has yet to have a stable Drupal 8 release (even though Drupal 8 was released 4 years ago in November, 2015). It’s currently stuck in “alpha” and overall usage has been steadily declining for years. Read the initial post for more information.

We put out that post as an attempt to inform businesses that are currently using Ubercart that they should be planning their migration to something else ASAP, before Drupal 7 reaches end-of-life. Our suggestion for these businesses is to move to the Drupal Commerce module for Drupal 8. Drupal Commerce is the successor to Ubercart and was founded by one of the Ubercart creators. It’s the natural choice for these businesses and overall it’s a much better platform in every way.

Of course, when you tell a business that they need to replatform because their ecommerce software is “dying,” that’s not an easy thing for business owners to hear. Many flat-out ignore it to be honest, but those who understand the warning want to know more about how it will affect their business. From the reaction we received to the initial post, we understood that more needed to be said. Businesses using Ubercart now have questions that need to be answered. Because of this, we held an “Ubercart is Dead Roundtable” webinar-style discussion where we put Shawn in the spotlight to answer the questions that have come in. The goal of this discussion was to be both informative and demystifying, a general discussion instead of a sales pitch.

So without further ado, here is the roundtable recap video. A list of timestamped discussion topics is shown below the video. If you have any other questions not mentioned here, send us a message. We would be happy to answer them.

Watch the roundtable

Host: Jace Anderson
Specialist: Shawn McCabe, CTO

00:00 - Introduction
00:45 - Who is Shawn McCabe
01:55 - Why do you [Shawn] think Ubercart is Dead?
03:07 - Why is Drupal Commerce the next platform of choice?
04:02 - Why should I move off of Ubercart when our business is currently operating fine?
05:58 - Is there a performance difference between Ubercart and Drupal Commerce?
08:06 - Is it possible to move off of Ubercart but stay on Drupal 7?
09:29 - How do we know Drupal Commerce won’t see the same fate as Ubercart?
11:00 - Is there a big difference in the features from Ubercart to Drupal Commerce? Is Drupal Commerce more robust?
13:35 - Is there a big learning curve for the backend administrators when using Drupal Commerce?
15:21 - How big of an undertaking is the migration from Ubercart to Drupal Commerce? Can an IT team of 5 complete it?
16:44 - What website components add to the complexity of a migration?
18:00 - Would a migration interrupt my business? Will it affect the customer experience?
18:54 - How would a migration impact my internal operations?
20:25 - How do we know Drupal Commerce won’t see the same fate as Ubercart (second part)?
21:26 - Currently we use multi-currency. Does Drupal commerce support this too?
22:41 - We use MailChimp for abandoned cart recovery. Can it still be used with Drupal Commerce?
23:10 - Are there other alternatives to Drupal Commerce? Is it the only option to continue using Drupal?
24:04 - How does Drupal Commerce perform on mobile?
25:02 - From your blog post, there looks to be companies using Ubercart on Drupal 8. What would prompt this?
25:57 - Can Drupal Commerce be used for custom customer experiences?
27:20 - Based on my research, Drupal Commerce is defined as having a difficult user interface. How can we ensure our team will be able to manage the backend?
28:28 - Can I manage my orders from my mobile device?
29:19 - What does Drupal Commerce offer for legacy software integration?
30:51 - What are the key specifications in a migration that contribute to an increased cost when doing a migration?
32:31 - Is my data migrated automatically? Can I also move order history, receipts and customer data?
33:40 - For a migration, where does one find support?
34:52 - What process is involved in managing coupons and promotions?
37:01 - How does bundling differ from Ubercart to Drupal Commerce?
38:00 - Does Drupal Commerce have subscription payment functionality?
40:05 - Is Drupal Commerce catalog taxonomy based?
41:10 - Shawn’s final words to those still on Ubercart who are not planning their move away from it yet.

Click to contact one of our ecommerce consultants

Nov 05 2019
Nov 05

With developers under constant pressure to complete the software development process expeditiously, more and more facets of the process are compelled to make a "shift-left" and move earlier in the software development lifecycle (SDLC).

Given this circumstance, security can no longer be treated as a casual job, especially when code is being updated and delivered every few minutes.

That is where the "AppSec shift-left" movement comes into the spotlight: a strategy to audit code by discovering and eliminating software vulnerabilities without hampering the development process.

This blog will elucidate the need for the AppSec shift-left approach and the application security tools that can be leveraged to address these issues.

The Need for a Shift-Left Approach

The idea behind the shift-left approach is to find vulnerabilities at an early stage in the SDLC in a fast and efficient manner. The earlier development teams find bugs, the less rework they'll have to do later. This is why enterprises are making their developers responsible for application security.

As a result, developers will have to embed this approach as part of their responsibility to keep security in check and deliver applications on time. When errors occur, they can fix them early rather than throwing them over the fence for someone else to take care of.

How Application Security Tools Can Support Developers 

Generally, developers share the common goal of producing secure, functional code within a deadline. To ensure security and functionality, they typically perform a code review process to debug their code.

Debugging code is not among the hopes and dreams of most developers. Plus, lengthy debugging sessions can delay projects. So the ideal application security tools should help developers debug their code swiftly to boost their productivity and help them meet their deadlines.

All of this will encourage developers to use the tool to remove software vulnerabilities.

Additionally, when developers embrace these app security tools as a means to enhance their productivity, the tools are far more likely to have a material impact on vulnerability remediation.

Simply put, these application security tools reduce the time developers take to debug their code. However, this is no easy task! To help developers produce secure, functional software on time, these solutions must:

  1. Integrate into daily developer workflows. They shouldn't interrupt development processes geared towards complying with the next deadline.
  2. Produce accurate and actionable results, so that developers can fix vulnerabilities quickly once they have been identified.

Implement Shift-Left Approach With These Tools

The tools below, when integrated into a CI/CD pipeline, empower developers to find security loopholes, if any, at the right time.

  1. Fortify Static Code Analyzer (SCA) - 

The Micro Focus Fortify Static Code Analyzer (SCA) can identify, analyze, and resolve complex issues efficiently: it scans massive amounts of code in a flash and returns immediate, actionable results, making it convenient for developers to create secure code.

SCA plays an essential role in creating secure software by identifying vulnerabilities in software security architecture and application code with minimal effort and in negligible time, without compromising the quality of the code.

  2. Black Duck - 

Black Duck software composition analysis solutions and open source audits give you the insight you need to track the open source in your code, mitigate security & license compliance risks, and automatically enforce open source policies using your existing DevOps tools and processes.

Watch this video to learn more about the AppSec shift-left approach:

[embedded content]

Open Source and Third-Party Software Audits

No matter what your organization's business is, you are almost certainly using open source in one way or another. The questions that arise with the use of open source are: do you know how your organization is using it, what kind of licenses are in play, and whether you can meet all of your license requirements?

To answer all these questions, an audit is conducted to find what kind of open-source software (OSS) is present within your code and what licenses that OSS falls under.

Black Duck, an open-source library analyzer, comprises the following features:

  • Open Source and Third-Party Code Audit

Provides you with a complete open source bill of materials (BOM) for the target codebase, showing all open source components along with associated license obligations and conflict analysis.

  • Open Source Risk Assessment

It offers a detailed view of open source risks in the codebase, including known security vulnerabilities, using Black Duck Enhanced Vulnerability Data. It can serve as a high-level action plan to prioritize research and potential remediation actions.

  • Web Services and API Risk Audit

Lists the external web services used by an application, with insight into potential legal and data privacy risks. It allows you to quickly evaluate web services risks across three key categories, i.e., governance, data privacy, and quality.

Conclusion

The software development life cycle (SDLC) is constantly picking up pace and becoming more automated.

Developers must keep up with that pace without leaving security behind, and the shift-left approach makes that possible: it can be easily integrated into DevOps pipelines to analyze code and build security into digital SDLCs without compromising on innovation!

Srijan treats security issues as a serious threat to organizations' valuable assets and progress, and to mitigate that risk, it has provided its clients with solutions to deal with it efficiently. You too can reach out to us. Contact us now!

Nov 05 2019
Nov 05

DrupalCon Amsterdam Driesnote presentation

Last week, many Drupalists came together for Drupalcon Amsterdam.

As a matter of tradition, I presented my State of Drupal keynote. You can watch a recording of my keynote (starting at 20:44 minutes), or download a copy of my slides (149 MB).

Drupal 8 innovation update

I kicked off my keynote with an update on Drupal 8. Drupal 8.8 is expected to ship on December 4th, and will come with many exciting improvements.

Drupal 8.7 shipped with a Media Library to allow editors to reuse images, videos and other media assets. In Drupal 8.8, Media Library has been marked as stable, and features a way to easily embed media assets using a WYSIWYG text editor.

I'm even more proud to say that Drupal has never looked better, nor been more accessible. I showed our progress on Claro, a new administration UI for Drupal. Once Claro is stable, Drupal will look more modern and appealing out-of-the-box.

The Composer Initiative has also made significant progress. Drupal 8.8 will be the first Drupal release with proper, official support for Composer out-of-the-box. Composer helps solve the problem of Drupal being difficult to install and update. With Composer, developers can update Drupal in one step, as Composer will take care of updating all the dependencies (e.g. third party code).
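As an illustration, on a Composer-managed Drupal site a core update can look something like this (a sketch; the exact package name depends on how the site was created, and drush is assumed to be installed via Composer):

```shell
# Update Drupal core along with its third-party dependencies.
composer update drupal/core --with-dependencies

# Apply any pending database updates.
vendor/bin/drush updatedb
```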

What is better than one-step updates? Zero-step updates. We also showed progress on the Automated Updates Initiative.

Finally, Drupal 8.8 marks significant progress with our API-first Initiative, with several new improvements to JSON:API support in the contributed space, including an interactive query builder called JSON:API Explorer. This work solidifies Drupal's leadership position as a leading headless or decoupled solution.

Drupal 9 will be the easiest major update

A couple stares off into the distant sunrise, which has a '9' imposed on the rising sun.

Next, I gave an update on Drupal 9, as we're just eight months from the target release date. We have been working hard to make Drupal 9 the easiest major update in the last decade. In my keynote at 42:25, I showed how to upgrade your site to Drupal 9.0.0's development release.

Drupal 9 product strategy

I am proud of all the progress we made on Drupal 8. Nevertheless, it's also time to start thinking about our strategic priorities for Drupal 9. With that in mind, I proposed four strategic tracks for Drupal 9 (and three initial initiatives):

A mountain with a Drupal 9 flag at the top. Four strategic product tracks lead to the summit.

Strategic track 1: reduce cost and effort

Users want site development to be low-cost and zero-maintenance. As a result, we'll need to continue to focus on initiatives such as automated updates, configuration management, and more.

Strategic track 2: prioritizing the beginner experience

As we saw in a survey Acquia's UX team conducted, most people have a relatively poor initial impression of Drupal, though if they stick with Drupal long enough, their impression grows significantly over time. This is unlike any of its competitors, whose impression decreases as experience is gained. Drupal 9 should focus on attracting new users, and decreasing beginners' barriers to entry so they can fall in love with Drupal much sooner.

A graph that shows how Drupal is perceived by beginners, intermediate users and expert users. Beginners struggle with Drupal while experts love Drupal. A graph that shows how Drupal, WordPress, AEM and Sitecore are perceived by beginners, intermediate users and experts. Drupal's sentiment curve goes in the opposite direction of WordPress', AEM's and Sitecore's. This presents both a big challenge and opportunity for Drupal.

We also officially launched the first initiative on this track; a new front-end theme for Drupal called "Olivero". This new default theme will give new users a much better first impression of Drupal, as well as reflect the modern backend that Drupal sports under the hood.

Strategic track 3: drive the Open Web

As you may know, 1 out of 40 websites run on Drupal. With that comes a responsibility to help drive the future of the Open Web. By 2022-2025, 4 billion new people will join the internet. We want all people to have access to the Open Web, and as a result should focus on accessibility, inclusiveness, security, privacy, and interoperability.

A person dressed in a space suit is headed towards a colorful vortex with a glowing Druplicon at the center.

Strategic track 4: be the best structured data engine

A slide that makes the point that Drupal needs to manage more diverse content and integrate with more different platforms.

We've already seen the beginnings of a content explosion, and will experience 300 billion new devices coming online by 2030. By continuing to make Drupal a better and better content repository with a flexible API, we'll be ready for a future with more content, more integrations, more devices, and more channels.

An overview of the four proposed Drupal 9 strategic product tracks

Over the next six months, we'll be opening up these proposed tracks to the community for discussion, and introducing surveys to define the 10 inaugural initiatives for Drupal 9. So far the feedback at DrupalCon Amsterdam has been very positive, but I'm looking forward to much more feedback!

Growing sponsored contributions

In a previous blog post, Balancing Makers and Takers to scale and sustain Open Source, I covered a number of topics related to organizational contribution. Around 1:19:44, my keynote goes into more details, including interviews with several prominent business owners and corporate contributors in the Drupal community.

You can find the different interview snippets below:

  • Baddy Sonja Breidert, co-founder of 1xINTERNET, on why it is important to help convert Takers into Makers.
  • Tiffany Farriss, CEO of Palantir, on what it would take for her organization to contribute substantially more to Drupal.
  • Mike Lamb, Vice President of Global Digital Platforms at Pfizer, announcing that we are establishing the Contribution Recognition Committee to govern and improve Drupal's contribution credit system.

Thank you

Thank you to everyone who attended Drupalcon Amsterdam and contributed to the event's success. I'm always amazed by the vibrant community that makes Drupal so unique. I'm proud to showcase the impressive work of contributors in my presentations, and congratulate all of the hardworking people that are crucial to building Drupal 8 and 9 behind the scenes. I'm excited to continue to celebrate our work and friendships at future events.

A word cloud of all the individuals who contributed to Drupal 8.8. Thanks to the 641 individuals who worked on Drupal 8.8 so far. A word cloud of all the organizational contributors who contributed to Drupal 8.8. Thanks to the 243 different organizations who contributed to Drupal 8.8 to date.


Nov 05 2019
Nov 05

The Drupal Pitch deck has been created, as part of the Promote Drupal initiative, to support and strengthen the position of those in sales roles who present Drupal to decision makers and buyers. We want to help elevate Drupal via agencies, co-ops, and independents. It is an initiative I have led, in close partnership with Suzanne Dergacheva and Ricardo Amaro and involving over 100 contributors.

The Drupal Pitch Deck slides

Born out of a meeting of European Drupal leaders in Darmstadt just 12 months ago, the Drupal Pitch Deck has come a long way. Version 2.0 of the deck is available now; it contains 108 slides in all, including 73 case studies representing the best Drupal projects from agencies across the globe.

It was fantastic to see the project featured during the introductions ahead of the DriesNote, and to see Suzanne Dergacheva present the story of how the pitch deck came to be at the Drupal Initiatives Keynote at DrupalCon Amsterdam. Seeing non-code contributions from so many people celebrated on this global stage was a real high point of the conference for me.

The call for content remains open; you can Add Your Slide Here. With the latest and greatest Drupal projects celebrated at the International Splash Awards, we are calling on all agencies to submit their incredible new case studies.

In preparation for DrupalCon, the team at CTI Digital worked to deliver a solution which allowed all deck content to be transferred into Drupal 8. It is integrated with Lingotek, a translation management system, which enables us to start the internationalisation of the pitch deck. Special thanks must go to Richard Roberts at Lingotek, who worked tirelessly over the week of DrupalCon onboarding community translators.

Now that we have systems in place we are actively inviting people to volunteer to translate slides into their language here.

Stages of development

Call for front end developers

All but stage 7 are complete now. We are keen to attract front-end developers to join the effort to produce the presentation layer, a Drupal theme. What we have in mind is detailed here.

Promote Drupal Pitch Deck Version 2.0 released

During the Birds of a Feather (BoF) meeting we released Version 2.0 of the Pitch Deck, which you can download now.

Pitch Deck BoF

Photo: Alex Moreno

Contribution day

Non-code contributions to Drupal are a theme close to my heart, so I was delighted to see two tables full of people translating and creating new content at the Contribution Day.

Contribution day table
 

Not only this, some of the team were first-time contributors who had been registered on Drupal.org for only three weeks. For me, this is what success looks like.

Issue queue recognising non code contributions

In addition to translation we saw several subject experts drafting new slides providing supporting evidence as to why Drupal is a prime choice in Higher Education and for Government.

New slides

We still need volunteers in far greater numbers to achieve the translation into 16 languages which is underway. Right now we are calling for people speaking English and the following languages to volunteer.

  • Arabic
  • Bulgarian
  • Catalan
  • Dutch
  • English
  • Farsi (Persian)
  • French
  • German
  • Hebrew
  • Hindi
  • Hungarian
  • Italian
  • Japanese
  • Polish
  • Portuguese
  • Slovak
  • Spanish
  • Thai
  • Turkish

In conclusion, the Pitch Deck Initiative has achieved far more than we could ever have imagined on that day in Darmstadt last year. With the right support from community members, we will soon deliver decks in 19 languages. It really is down to you! It's an ideal opportunity to contribute your skills in bite-sized chunks. As you've already seen, there are plenty of ways to join our movement.

On behalf of Ricardo, Suzanne and myself many thanks to all who have participated. We look forward to meeting you too, everyone is welcome!

Nov 05 2019
Nov 05

November has arrived, and, with another successful European DrupalCon in the books, it’s time to start planning for the upcoming big things in Drupal and further developing on ideas the ‘Con has inspired. Before you do, however, take a look at our recap of last month’s top Drupal-related posts for some additional inspiration.

Decoupling the front-end without APIs and JavaScript

Let’s begin with a blog post about a trend in web development that has been gaining a lot of ground recently: a “decoupled” approach to developing websites and applications, which Drupal is perfect for. 

Yet this blog post by Aleksi Peebles doesn’t touch upon decoupling in the traditional sense, that is, by using APIs and JavaScript, but instead proposes a different, lesser known way to decouple your website’s front end from Drupal. 

Aleksi points out that component-based theming with decoupled Twig components allows the theme layer to be largely developed independently of Drupal. Although component-based theming is typically used for bigger projects, it is also perfectly suitable for smaller Drupal projects.

Read more

A decade of Drupal

Next up, we have ComputerMinds' James Williams' recollections of his ten years of working with Drupal. He takes a look back at the digital landscape when he started out with Drupal and notes how significantly things have changed over the decade that has passed in the meantime.

One of the most obvious changes that he has witnessed is the transition from smaller to enterprise-level projects. Even though (or perhaps especially because) the word ‘enterprise’ holds a slightly negative connotation in the Drupalverse, the best thing for the community is to embrace this change and be honest about it. 

Read more

Improving Speaker Diversity in Drupal Events

The Drupal community is known for being one of the most inclusive and diverse in tech, but the speakers at Drupal events still predominantly come from privileged groups, and as such, there's still a lot more we can do.

As a very community-centric organization, Pantheon has teamed up with Drupal’s Diversity & Inclusion group and organized workshops aimed at improving the public speaking skills of people from underrepresented groups. 

In her blog post, Pantheon’s Community Developer Manager Tara King invites all Drupal event organizers who want their events to be more diverse to attend a training workshop which will help empower these underrepresented people in their communities.

Read more

Contribution Credit Tune-up

For those who haven’t been following - Drupal’s issue credit system, although an excellent way to recognize and reward active contributors, can be (and has been) exploited to amass a huge number of credits and thus unfairly raise the visibility of both the individual contributor and the company they represent.

This blog post by Tim Lehnen of the Drupal Association then announces a tune-up of the contribution credit system: issues on the more highly used projects are now weighted higher, while those on lesser used projects are worth less. This is just a minor update, and a more complete overhaul of the credit system is planned for the future. 

Read more

Why Kanopi is a Value-Driven Organization

While not exactly a Drupal-specific post, this next one, written by Kanopi Studios' Anne Stefanyk, offers such a great insight into a successful value-driven organization that we just had to include it as an inspiration.

Kanopi Studios is based around seven fundamental values which aim to provide the best possible experience to both their employees and the clients they work with, truly showcasing how clearly defined - and followed - values result in a successful business. 

It seems, then, that Kanopi has managed to accomplish at least half of the mission initially envisioned by Anne. We hope the other half - the treehouse - can be realized one day as well!

Read more

Should You Jump Ship Before Drupal 9?

Moving on, this post by Third and Grove’s Justin Emond addresses and appeases some of the main concerns regarding next year’s Drupal 9 release. Since an upgrade to a new major version of a CMS is considered a difficult endeavor, many people might be considering just migrating to a different CMS altogether - but Justin advises against doing so.

According to him, as well as several other sources (including Dries), the upgrade from 8 to 9 and all subsequent versions will be significantly easier than upgrades between any previous versions. Justin suggests thoroughly assessing your needs before deciding on a new CMS.

Read more

Drupal Migration Does Not Have to Be Scary

Similarly to the previous post by Third and Grove, this one also addresses the concerns readers might have with regards to migrating to a newer version of Drupal. 

In this post, Promet Source’s Gena Wilson outlines which six steps to take to execute a migration as smoothly as possible. These are: auditing existing content; taking analytics into account; auditing modules; assessing the site’s theme; identifying any complexities; and, lastly, determining if an automatic migration tool is a good choice for the website.

Since the Drupal 8 to 9 migration will be focused on backwards compatibility, Gena argues that already migrating to 8 now is definitely worth it.

Read more

What the Heck is ARIA? A Beginner's Guide to ARIA for Accessibility

In October's last post, Kat Shaw of Lullabot takes on the concept of ARIA (Accessible Rich Internet Applications) and discusses how the effective implementation of the WAI-ARIA specification can improve accessibility.

She outlines ARIA's capabilities and some of its most common use cases, as well as five specific rules for using ARIA.

The second half of Kat's post then focuses on Drupal's use of ARIA and accessibility, also listing some useful contributed accessibility modules for Drupal. She finishes by addressing some common confusions around ARIA use and providing additional resources to learn more about ARIA.

Read more

This concludes our list of October’s top Drupal blog posts. If you enjoyed the read, feel free to further explore our blog - we cover a lot of different topics, from everything Drupal-related to business, content, UX and more, so you’ll definitely find something for you.
 

Nov 05 2019
Nov 05
A list of the presidential candidates based on the accessibility of their sites, along with images concerning the candidate's sites. Explanatory text follows the infographic.

Based on our conviction at Promet Source that web accessibility matters, we evaluated the websites of leading Republican and Democratic 2020 presidential candidates for accessibility. The web.dev scans were conducted on November 4, 2019, and the above infographic offers a glimpse into how the candidates' websites rank in terms of accessibility factors that include keyboard navigation, use of ARIA assistive technologies, semantic markup support, and the presence of alternative text on images.

The scans revealed an average site accessibility score of 67 for the Republican candidates and 86 for the Democratic candidates (out of a possible score of 100). This is depicted with an outlined image of an elephant that's 69% filled in with red and an outline of a donkey that's 87% filled in with blue.

The candidates' accessibility scores are depicted as a horizontal bars with directional, racing imagery, each with the forward momentum icon of the person in a wheelchair.

Horizontal blue bars for nine Democratic candidates are depicted, followed by four red bars for Republican candidates. Among the Democratic candidates, Cory Booker ranked the highest with a score of 99; followed by Julian Castro (96); Bernie Sanders (94); Elizabeth Warren (93); Kamala Harris (92); Joe Biden (90); Andrew Yang (89); Pete Buttigieg (64); and Amy Klobuchar (60). Among Republican candidates, Joe Walsh ranked the highest with a web.dev score of 87, followed by Bill Weld (78); Mark Sanford (74); and Donald Trump (27).

Three outlined or partially outlined circles follow the bar chart. The first one is 100% outlined in red, indicating that 100% of the candidates' websites from both parties have a DONATE button on the home page. The second circle is 29% outlined in red, with a caption indicating that's the percentage of websites that have dual language capabilities. The third circle is 21% outlined, indicating the percentage of 2020 presidential candidate sites that have alt text on images.

Curious about the degree to which your site measures up to current accessibility standards? Go to web.dev to run it through an accessibility scan, and contact us today for a conversation about getting your site into compliance.

Nov 05 2019
Nov 05

On a client project we were using a custom Drupal content entity to model some lightweight reusable content.

The content entity was originally single use and did not support bundles (e.g. node entities have node-type bundles).

As the project evolved, we needed to add bundle support for the custom entity-type, despite it already being in production use.

Read on to find out how we achieved this.

In this example, let's call the content entity a 'set' and the bundles 'set types'.

Create the bundle configuration entity

As we wanted this content entity to support adding new bundles via the UI, a configuration entity makes sense to allow site-builders to create the various bundles as required. So we created a new configuration entity called 'set type' as per the examples, although we used a route provider instead of a routing file. We made sure to add the bundle_of annotation to the config entity.

bundle_of = "set",

Updating the content entity's annotation and fields

Once this was done, the next step was to update the content entity's annotation. We added the 'bundle' entity key and the 'bundle_entity_type' annotation:

bundle_entity_type = "set_type",
*   entity_keys = {
*     "id" = "id",
*     "label" = "name",
*     "uuid" = "uuid",
*     "uid" = "user_id",
*     "bundle" = "type",
*     "langcode" = "langcode",
*   },

We didn't need to add a new field definition for the 'type' field to our content entity's baseFieldDefinitions, because we just deferred to the parent implementation. But we made sure to match up the description, label etc. as desired, and that we called setInitialValue. As we're planning to add a new column to the entity's tables in the database, we need to populate the type column for existing records.

Now, with entities that don't support bundles, Drupal defaults to the entity type ID for the bundle. E.g. for the 'user' entity, the bundle is always 'user', because User entities don't support bundles. So we knew our existing 'set' entities would have to have a bundle of 'set' too, but our new ones could have whatever we liked. This is why our field definition for 'type' had to look like so:

$fields['type']->setInitialValue('set')
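Putting that together, the relevant part of the content entity's baseFieldDefinitions() override might look roughly like this (a sketch reconstructed from the steps above; the label and description strings are illustrative):

```php
public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
  // The parent implementation creates the 'type' bundle field for us,
  // because 'type' is declared as the 'bundle' entity key.
  $fields = parent::baseFieldDefinitions($entity_type);

  // Tweak the generated field and seed existing rows with the 'set' bundle.
  $fields['type']
    ->setLabel(t('Set type'))
    ->setDescription(t('The set type.'))
    ->setInitialValue('set');

  // ... other base fields ...

  return $fields;
}
```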

Update hooks to get everything in place

Since Drupal 8.7, support for automatic entity updates has been removed. So whilst adding the field, entity keys and updated annotation works for a new install (hint: there won't be one), it doesn't help our existing production and QA sites. We need an update hook to bring our existing entity-type and field definitions into sync with the code versions, which also takes care of the required database table changes.

So the steps we need to do here are:

  1. install the config entity type
  2. create a new instance of it for the existing entities
  3. add the new field definition for the type field to the content entity
  4. update the content entity definition

Installing the config entity type

The docs for installing a new entity type make it clear what we need to do. Our code ended up something like this:

/**
 * Adds the set type.
 */
function your_module_update_8001() {
  \Drupal::entityDefinitionUpdateManager()
    ->installEntityType(new ConfigEntityType([
      'id' => 'set_type',
      'label' => new TranslatableMarkup('Set type'),
      'label_collection' => new TranslatableMarkup('Set types'),
      'label_singular' => new TranslatableMarkup('set type'),
      'label_plural' => new TranslatableMarkup('set types'),
      'label_count' => [
        'singular' => '@count set type',
        'plural' => '@count set types',
      ],
      'handlers' => [
        'list_builder' => 'Drupal\your_module\SetTypeListBuilder',
        'form' => [
          'default' => 'Drupal\your_module\Form\SetTypeForm',
          'delete' => 'Drupal\Core\Entity\EntityDeleteForm',
        ],
        'route_provider' => [
          'html' => 'Drupal\Core\Entity\Routing\AdminHtmlRouteProvider',
        ],
      ],
      'admin_permission' => 'administer set type entities',
      'entity_keys' => [
        'id' => 'id',
        'label' => 'name',
      ],
      'links' => [
        'add-form' => '/admin/structure/sets/add',
        'delete-form' => '/admin/structure/sets/manage/{set_type}/delete',
        'reset-form' => '/admin/structure/sets/manage/{set_type}/reset',
        'overview-form' => '/admin/structure/sets/manage/{set_type}/overview',
        'edit-form' => '/admin/structure/sets/manage/{set_type}',
        'collection' => '/admin/structure/sets',
      ],
      'config_export' => [
        'name',
        'id',
        'description',
      ],
    ]));
}

Creating the first bundle

In our first update hook we installed the config entity; now we need to make one for the existing entities. Because bundle-less entities use the entity type ID as the bundle, we make sure our new type has the same ID as the entity type.

/**
 * Adds a new config entity for the default set type.
 */
function your_module_update_8002() {
  $type = SetType::create([
    'id' => 'set',
    'name' => 'Set',
    'description' => 'Provides set panes',
  ]);
  $type->save();
}

Adding the new field definition and updating the entity definition

The documentation for adding a new field definition is again very useful here, so we follow along to install our new field definition. Similarly, the documentation for updating an entity type helps here, so our final update hook looks like this:

/**
 * Updates definition for the set entity.
 */
function your_module_update_8003() {
  $updates = \Drupal::entityDefinitionUpdateManager();
  $definition = BaseFieldDefinition::create('entity_reference')
    ->setLabel('Set type')
    ->setSetting('target_type', 'set_type')
    ->setRequired(TRUE)
    ->setReadOnly(TRUE)
    ->setInitialValue('set')
    ->setDefaultValue('set');
  $updates->installFieldStorageDefinition('type', 'set', 'your_module', $definition);
  $type = $updates->getEntityType('set');
  $keys = $type->getKeys();
  $keys['bundle'] = 'type';
  $type->set('entity_keys', $keys);
  $type->set('bundle_entity_type', 'set_type');
  $updates->updateEntityType($type);
}

And that's it, we're done.

Wrapping up

Kudos to those who created the documentation for this, as well as my colleagues Sam Becker, Jibran Ijaz and Daniel Phin who helped me along the way. Hopefully, you find this post useful if you're ever in the same boat.

Photo of Lee Rowlands

Posted by Lee Rowlands
Senior Drupal Developer

Dated 5 November 2019

Comments

Hello. Thanks for this blog post. Very useful to have real life examples :-) Just may be you have a typo (last lines of the post) : "pane_set_type" shouldn't be "set_type" ?

thanks, fixed


Nov 04 2019
Nov 04
A proactive approach for cleaner Drupal coding


So you are stuck in the cruft, struggling to create some semblance of sanity within a sea of code-rot. Code standards sound like a great idea for your project, but perhaps automated enforcement tools look like more of a pain than they're worth. This post is intended for Drupal developers using PhpStorm who need fast, flexible, standards enforcement tools.

Maintaining a stringent standard for your codebase is a battle. On one hand, your code is cleaner, more unified, and easier to maintain. On the other hand, these little formatting rules cause frustration and time loss - especially if a tiny slip causes you to waste a full pipeline cycle just to pass an automated standards check. As they say, the best defence is a strong offence, and the tools proposed here will help you find and fix standards violations before they reach a pipeline.

Drupal recommends a tool called PHP Code Sniffer, aka phpcs, to scan your files for Drupal Code Standards violations. Thankfully, it also comes with a companion tool called PHP Code Beautifier and Fixer, aka phpcbf, which fixes the small, tedious violations for you.

The goal of this post is to get phpcs and phpcbf under your fingertips and into your habits. Only once you have hotkeys set up to run these tools while coding will they become useful — instead of just annoying.

The steps are as follows:

  1. Install and set up phpcs
  2. Create a custom combination of rulesets
  3. Integrate with PhpStorm for hotkeys and syntax highlighting

1. Install and set up phpcs

It may seem straightforward to install phpcs globally via Composer or apt, or to simply require it in your current composer project. However, a global install is not easy to customize and share. Instead, I recommend using a standalone repo that is specifically for your code standards tools. When your standards stand alone, they are easier to edit, share with teammates, and transfer to new work environments.

Here’s a simple repo to get you started:
https://github.com/nilswloewen/drupal_code_standards

  1. If you currently have phpcs or phpcbf installed globally, uninstall them before proceeding.
  2. Quick install with example repo:
    git clone [email protected]:nilswloewen/drupal_code_standards.git
    cd drupal_code_standards
    composer install
  3. Once composer has installed phpcs for you, add it to your global path with:

    export PATH=$PATH:~/PATH/TO/drupal_code_standards/vendor/bin
    NOTE: Adjust accordingly for your shell and OS of choice.
  4. Next, you must tell phpcs where the rulesets you want to use are installed.

    The optional tool phpcodesniffer-composer-installer will automatically detect rulesets in your composer package and set your phpcs & phpcbf installed_paths for you. This is part of the example repo and the next step should have been done for you during "composer install".

    However, to set installed paths to rulesets manually run:

    phpcs --config-set installed_paths vendor/drupal/coder/coder_sniffer,vendor/phpcompatibility/php-compatibility/PHPCompatibility
    

    Then confirm that phpcs knows about the rulesets within the installed paths with:

    phpcs -i

    You should see this list that confirms your rulesets:

    The installed coding standards are ... PHPCompatibility, Drupal and DrupalPractice
    

    You may need to set installed paths for phpcbf as well using the same process.

2. Create a custom combination of rulesets

Out of the box, phpcs can only run one standard at a time. This is a problem when working with Drupal because we have two standards to follow. For this post I have added a third standard, PHPCompatibility, which is helpful when upgrading PHP versions on legacy projects.

  1. To combine standards we first create a custom ruleset that references multiple rulesets. Note that this is already included in the example repo as phpcs-custom-standards.xml.
    <?xml version="1.0"?>
    <ruleset name="Custom code standards">
      <rule ref="Drupal"/>
      <rule ref="DrupalPractice"/>
      <rule ref="PHPCompatibility"/>
    </ruleset>
  2. Then set this standard as your default. Use an absolute path to ensure your standard will be found no matter what context phpcs is called from.
    phpcs --config-set default_standard ~/PATH/TO/drupal_code_standards/phpcs-custom-standards.xml
    
    See the example standard for a few other helpful settings.

3. Integrate with PhpStorm for hotkeys and syntax highlighting

There are two levels of integration with PhpStorm: Passive and Active.

Passive

Passive code analysis with PhpStorm Inspections will give you syntax highlighting and hover-over explanations of the file you are currently working on.

PhpStorm passive integration

This is quite helpful when dealing with one file at a time, but when you need to get an entire directory to pass standards, you need a way to hunt for violations.

Active

Active analysis is when you use phpcs to scan many files at once. You can do this through the terminal with a command like:

phpcs ~/module # Scans all applicable files in dir.
phpcs ~/module/example.php # Scans only a specific file.

However, it’s a pain to open a terminal window, navigate to the file you are working on, and then type a command. You’ll probably forget or neglect to check your work because of the extra steps involved. A better way to run phpcs is to set up hotkeys within PhpStorm to scan your files instantly.

Configure PhpStorm inspections

  1. Register phpcs and phpcbf as PHP Quality Tools.

    Settings | Languages and Frameworks | PHP | Quality Tools | Code Sniffer

    PhpStorm PHP Quality Tools settings
  2. Enable the inspection.

    Settings | Editor | Inspection | PHP | Quality Tools

    PhpStorm PHP Inspections settings

  • Set the extension list to match what the Drupal standard sets:
    php,module,inc,install,test,profile,theme,css,info,txt,md,yml
    
  • DO NOT set the "Installed standard paths", as this overrides what you have previously set in the command line.
  • Refreshing the list under "Coding Standard" should mirror what "phpcs -i" shows. Choose the "Custom" coding standard and then click the ellipsis to choose the path to your custom standards file (i.e. phpcs-custom-standards.xml).
  • Click OK and your inspections should be working!

Configure hotkeys

  1. Register phpcs and phpcbf as external tools.

    Settings | Tools | External Tools

    PhpStorm External Tools settings

    The "$FilePath$" argument runs the tool against the file you are currently working on, or against a selected folder when in project view.
  2. Double check that this step works by running the tool from the main menu.

    Tools | External Tools | phpcs

    Running phpcs external tool

  3. This is the special sauce. Configure a keyboard shortcut for your new tools.

    Settings | Keymap | External Tools

    PhpStorm Keymap settings

  4. Right click on the external tool you just registered and add a keyboard shortcut. "Ctrl+Alt+Shift+Comma" was simply a combination that was not used anywhere else in my setup.

Bringing it all together

Now you can actively use phpcs and phpcbf while you code! I frequently use the phpcbf hotkey while writing new code to do the tedious stuff for me, such as creating doc blocks and pushing whitespace around. Here's an example:

Use phpcbf in PhpStorm with a hotkey

With phpcs and phpcbf now under your fingertips you are set to be much more assertive in your application of code standards!

Taking it to the next level

If you are using Gitlab for CI/CD, which I hope you are, another great strategy for enforcing standards is to create a pre-deployment job that scans your custom code for violations. This will keep your team (and you) in check by stopping standards violations from being auto-deployed.
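As a sketch, such a pre-deployment job could look like this in .gitlab-ci.yml (the job name, stage, and custom-code paths are assumptions to adapt to your project):

```yaml
# Hypothetical .gitlab-ci.yml job: fail the pipeline when custom code
# violates the standards in phpcs-custom-standards.xml.
code_standards:
  stage: test  # assumed stage name; match your pipeline's stages
  script:
    - composer install
    - vendor/bin/phpcs --standard=phpcs-custom-standards.xml web/modules/custom web/themes/custom
```

phpcs exits non-zero when violations are found, which is what makes the job fail the pipeline.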

After a few super annoying pipeline failures for minor syntax errors, you will want this next level of enforcement — git pre-commit hooks. I highly recommend using grumphp to manage this for you.
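A minimal grumphp.yml for this could look like the sketch below (the standard path and triggering file extensions are assumptions to adapt):

```yaml
# Hypothetical grumphp.yml: run phpcs against staged files before each commit.
grumphp:
  tasks:
    phpcs:
      standard: ~/PATH/TO/drupal_code_standards/phpcs-custom-standards.xml
      triggered_by: [php, module, inc, install, theme]
```

With this in place, a commit containing standards violations is rejected before it ever reaches a pipeline.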

Best of luck keeping your code readable and up to snuff!

End-to-end Drupal services

As a full service Drupal agency, Acro Media has significant expertise in digital commerce architecture, ecommerce design, customer experience, software development and hosting architecture. If you’re looking for a Drupal agency dedicated to code and project quality, check us out. We would love the opportunity to talk.

View Our Drupal Commerce Services

Nov 04 2019
Nov 04

The Webform module for Drupal 8 provides support for basic, advanced, and composite inputs/elements. Developers can also create custom elements and composites via code. Sometimes end-users need an interactive visual mechanism to select a complex option like a map location. The Image select element is limited to allowing users to select horizontally-aligned images.

What if an end-user needs to select a seat on a plane, a part of the human body, or a room at a conference?

HTML5 and SVG allow designers and site builders to build visually rich and interactive elements. To convert rich interactive HTML/SVG elements into a form input, there needs to be some PHP and JavaScript glue code that captures a user's selection and sets it as a webform submission value.

Besides capturing the end-user's selection, the PHP and JavaScript code must also ensure that the interactive HTML/SVG element is accessible to users with disabilities.

Allow site builders to create a custom 'options' element using HTML/SVG markup.

The general concept for creating a custom 'options' element is to allow site builders to provide HTML/SVG markup that is dynamically converted into a selectable and accessible Webform element.

Below is a straightforward example of some HTML markup that can be enhanced.

SVG support requires the same data-id and data-name attributes being added to the HTML markup. Below is an example of a single clickable element in SVG.
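As a sketch of both, using the data-id ('value') and data-name ('text') attributes mentioned above, the markup might look something like this (the element names and values are made up for illustration):

```html
<!-- Hypothetical HTML markup: each clickable option carries a value and a label. -->
<div class="options">
  <div data-id="room_a" data-name="Room A">Room A</div>
  <div data-id="room_b" data-name="Room B">Room B</div>
</div>

<!-- Hypothetical SVG markup: a single clickable region with the same attributes. -->
<svg viewBox="0 0 200 100" xmlns="http://www.w3.org/2000/svg">
  <rect data-id="seat_1a" data-name="Seat 1A" x="10" y="10" width="40" height="40"/>
</svg>
```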

Implementation

The HTML/SVG markup must provide an option value or text attribute that can be used to create a list of available options. The default recommended 'value' attribute names are 'data-option-value', 'data-value', 'data-id', or 'id'. The default recommended 'text' attribute names are 'data-option-text', 'data-text', 'data-name', 'name', or 'title'. The Webform module uses these value/text attribute names to parse and detect the custom element's available options.

Accessibility

Once the available options are detected, JavaScript is used to make each option focusable, clickable, and accessible using ARIA (Accessible Rich Internet Applications) and related HTML attributes. To allow the user's selected option to post back to the server, a select menu is provided to capture the selection. The select menu supports HTML5 validation and provides a fully accessible alternative input for users relying on a screen reader.

Here is an example of a clickable map of U.S. states.

Clickable U.S. States


More about custom 'options' elements

Only webform administrators can define custom 'options' elements. Since these elements support any HTML/SVG markup, including JavaScript, only trusted users should be allowed to administer the Webform module and create custom 'options' elements.

HTML/SVG markup can be loaded from a URL or cut-and-pasted into the code editor. The markup supports Twig, which can be used to enhance it and even render custom options.

To begin building custom 'options' elements, install the Webform Custom Options module (/admin/modules) and go to the Custom options configuration page (/admin/structure/webform/config/options_custom/manage).

Custom options configuration page


Translation

Besides parsing option value and text from the HTML/SVG markup, the option value, text, and description can also be provided via the element edit form. Options entered via the element edit form are available for translation.

Enhancements

To make it easier for end-users to see more information about the available options, tooltip support is provided. SVG graphics can be panned and zoomed using the SVG Pan & Zoom library.

Integration

Custom 'options' elements are available to webforms as simple elements and entity references. Custom 'options' elements which support entity references must have their option values match the entity reference IDs, or their option text match the entity reference titles.

What is next for custom 'options' elements?

The Webform Custom Options module is experimental. If you extend, alter, or override any code, please test your changes when updating the Webform module. The module's code still needs to be finalized. We also need to determine if any additional hooks or templates are required.

Currently, the only example is a map of U.S. states. More HTML/SVG examples and possibly a dedicated 'Webform Custom Options Examples' module are needed. A dedicated contrib module which provides SVG maps is also something that would be great to have. All examples need to use freely available SVGs with proper attribution to the source.

As a community, we need to explore what is and isn't possible using HTML and SVG graphics when creating a custom 'options' element.

Documentation

Documentation always needs improvement. Solving the challenge of clickable custom options requires a fairly complex solution. Please help me clarify how things work and how to create the necessary custom HTML/SVG markup.

Accessibility

Accessibility should never be ignored, assumed, or taken for granted. Custom 'options' elements need to be fully keyboard accessible and provide the right information to screen readers. The SVG Pan & Zoom library does need some accessibility improvements, especially around keyboard navigation.

Who sponsored this feature?

Two years ago, Madhan K'Samy (MkSamy) created Issue #2907663: SVG image parts as selectable options element, and we discussed this feature. I was able to build a simple prototype. Madhan K'Samy was unable to sponsor this feature. However, for the past two years when people asked for similar functionality I would demo the prototype.

Steve Babbitt from Mile3 (https://www.mile3.com/) approached me about sponsoring the new Webform Options Limit submodule, mentioning that his client needed a clickable SVG graphic. I showed him the prototype and he spoke with his client, who was willing to sponsor and contribute this feature, which allowed us to scope out the feature request using Issue #3081795: Allow custom HTML/SVG markup to be used as options.

Steve and clients have sponsored and contributed a feature that has changed what is possible using the Webform module. This feature allows an airline to convert an airplane's seating chart into an SVG graphic, which then can be used to create a seat selection Webform element. That’s an amazing feat and I’m confident we can find even more ways to make the best use out of this feature.

This sponsored feature demonstrates the flexibility, openness, and collaboration that makes any idea or business requirement a possibility using Drupal.

If you want to sponsor a feature, please read my blog post and create a ticket in the Webform module's issue queue.

Backing the Webform module

Open Collective is providing us, Drupal, and Open Source, with a platform to experiment and improve Open Source sustainability. If you appreciate and value what you are getting from the Webform module, please consider becoming a backer of the Webform module's Open Collective. More support means more input with regard to enhancing accessibility and that is always a good thing.


Nov 04 2019
Nov 04

I noticed the 'Mismatched entity' error on my status screen after I performed an update to Drupal 8.7.7. Although it didn't adversely affect the running of my website, it was an error that wasn't going to go away, so I was determined to find a solution.

Solution

The solution in my instance related to the GoogleTagManager module. According to the issue queue, the problem is described as follows:

"Drupal prefers each entity_type to be officially 'installed' which simply means entity type listeners are given an opportunity to act on the new arrival. Afterwards, the entity type is recorded in the database. Absent this 'install' action, the status report page (at 'admin/reports/status') will include an 'error' indicating the 'entity type needs to be installed'."

At the time of writing, the current release of the GoogleTagManager module was 8.x-1.2. With this release, GoogleTagManager now allows for multiple containers per page request. However, the module is missing the following code in 'google_tag.install' to make the module compliant.

/**
 * Install the container configuration entity type.
 */
function google_tag_update_8102(&$sandbox) {
  $type_manager = \Drupal::entityTypeManager();
  $type_manager->clearCachedDefinitions();
  $entity_type = $type_manager->getDefinition('google_tag_container');
  \Drupal::entityDefinitionUpdateManager()->installEntityType($entity_type);

  return t('Installed the google_tag_container entity type');
}

To add this code, you can either copy the code to the end of 'google_tag.install', or install the development version of the module (which contains the additional code). Then, simply run a database update to commit the entity type to the database.

The above code will be incorporated into v8.x-1.3 when it is released.

Nov 03 2019
Nov 03

One of the most useful components of Drupal is the Block system. This system allows for pieces of content to be created and reused throughout a site in various regions of the page structure. You can also select which bundles or content types this piece of content should appear on, making it so that a blog post gets a certain Call to Action section while an article gets something similar, but different.

When we use GatsbyJS for a decoupled front-end solution with a Drupal back-end, we lose out on quite a bit, if not everything, that the theme layer of Drupal provides. This means that Drupal regions mean nothing to Gatsby, nor does block placement. In fact, any functionality that would go into a .theme file is no longer available on the Drupal side.

Before I get into the "how" of using Drupal blocks in Gatsby, I want to cover a little bit of the "why".

Why Use Drupal Blocks for Content in a Gatsby Front End?

The main advantage of blocks in Drupal is that the content is created in one place, but can be placed in several spots. I have worked with solutions that use Paragraphs (from the Paragraphs contrib module) for similar functionality, but the problem remains where the content needs to be recreated on every new parent entity. For example, I can create a Call to Action (CTA) Paragraph and place fields that reference them on every content type, but the Paragraphs themselves remain separate. A Paragraph is more like a collection of fields to be filled out than a reusable entity, and that's okay.

In contrast, a Block can be created in one place, usually using the Block UI, and the content of the Block remains the same no matter where it is placed. A CTA block on a blog post will have identical content to the CTA block on an article. This makes content changes extremely fast compared to having to update every article and blog post if the wording or link need to change.

It is entirely possible to create these types of entities in Gatsby by defining a reusable component, however the editing experience doesn't really pan out. It may require a developer to go in and edit a React component, adding a middle-person to the process. Using reusable Drupal blocks can save time and budget when set up appropriately.

How to Use Drupal Blocks for Content in a Gatsby Front End

This section is going to make a few assumptions.

  1. You have a basic understanding of Gatsby and React components.
  2. You understand the Drupal theme layer.
  3. You have a basic understanding of the Drupal block system.
  4. Your decoupled application is already set up and pulling content from Drupal (not necessary, but it helps if you can try it out)
  5. You have the JSON:API module enabled on Drupal
  6. You're sourcing content from Drupal.

If you're having problems with any of these, feel free to reach out to me on Drupal Slack, I go by Dorf there.

Now the "how".  We're going to start by creating a Custom Block Type. Let's keep going with the CTA theme and call it "CTA Block". We do this by logging into our site as someone with permissions to access the Block UI and going to admin/structure/block/block-content/types. Once there, select "Add custom block type".

Let's label this CTA Block and begin creation. After we create it, we need to add some fields, so let's add the following fields:

  • CTA Heading
    • A plain text field, max 255 chars
  • CTA Link
    • A Link field using URL and label, internal and external links, Label required.
  • CTA Content Type
    • This is the field that will make all the difference. Create this field as Reference: Other... to begin with.
    • Label it CTA Content Type.
    • Under "Type of entity to reference" choose "Content type" from the select list, under Configuration.
    • Set unlimited cardinality.
    • Go through the remaining screens to create this field. Once you're back on the field list, select "Manage form display"
    • From here, we're going to change the widget from Autocomplete to Check boxes/radio buttons.
    • Save your changes

Now we're going to create a new block using this Custom Block Type. When we create the block under block/add/cta_block the form should look something like this:

Screenshot of Block form

Now, add whatever text you want to the fields, but only select a single content type in the CTA Content Type field.  Save the block and spin up your Gatsby dev environment. We're going to switch over there for a bit.

Let's fire up our develop environment and take a look at GraphiQL to see what we have going on.

GraphiQL query

As you can see, we now have access to allBlockContentCtaBlock in GraphiQL, but what are we going to do with it? Well, we are first going to create the CTA Block React component. We'll do that by creating the file src/components/blocks/CtaBlock.js and adding the following:

import React from 'react'
import { graphql } from 'gatsby'

const CtaBlock = ({ data }) => {

  return <div className='ctaBlock'>
    <h3>CTA Heading</h3>
      <p>CTA Text goes here and here and here.</p>
      <a href="http://example.com">Link Text</a>
    </div>
}

export default CtaBlock

This is pretty simple and doesn't include anything having to do with our GraphQL query yet, but we have the structure in place. Now, let's look at the data we can pull from GraphQL. We want to get the heading, body, link, and content type, so our query is going to look something like this:

query MyQuery {
  allBlockContentCtaBlock {
    nodes {
      field_cta_heading
      field_cta_link {
        title
        uri
      }
      body {
        value
      }
      relationships {
        field_cta_content_type {
          name
        }
      }
    }
  }
}

Which will give us back:

{
  "data": {
    "allBlockContentCtaBlock": {
      "nodes": [
        {
          "field_cta_heading": "Heading for the CTA Block",
          "field_cta_link": {
            "title": "Learn more!",
            "uri": "http://google.com"
          },
          "body": {
            "value": "<p>Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Vestibulum facilisis, purus nec pulvinar iaculis, ligula mi congue nunc, vitae euismod ligula urna in dolor. Aenean massa.</p>\r\n\r\n<p>Praesent nec nisl a purus blandit viverra. Nullam nulla eros, ultricies sit amet, nonummy id, imperdiet feugiat, pede. Phasellus dolor.</p>\r\n"
          },
          "relationships": {
            "field_cta_content_type": [
              {
                "name": "Blog Post"
              }
            ]
          }
        }
      ]
    }
  }
}

Perfect! Now I want you to notice a couple of things here. First, we're querying ALL of the CTA blocks here. Second, we're using the content type in the query. Let's get this into our component. Add in the query to our CtaBlock.

import React from 'react'
import { graphql } from 'gatsby'

const CtaBlock = ({ data }) => {

  return <div className='ctaBlock'>
    <h3>CTA Heading</h3>
      <p>CTA Text goes here and here and here.</p>
      <a href="http://example.com">Link Text</a>
    </div>
}

export default CtaBlock

export const CtaBlockQuery = graphql`
  allBlockContentCtaBlock {
    nodes {
      field_cta_heading
      field_cta_link {
        title
        uri
      }
      body {
        value
      }
      relationships {
        field_cta_content_type {
          name
        }
      }
    }
  }`

But wait, this isn't a page template, and it shouldn't be. What's the problem then? We have a query in a non-page template, and that's not good. What we need to do is figure out which of our templates are going to use this block and add it in there. Since we chose to have the block show on the blog post content type, we're going to use the blog post template. This will probably differ for your setup.

Let's open up our page template and take a look:

  import React from 'react'
  import { graphql } from 'gatsby'

  import Layout from '../layouts'

  const BlogPostTemplate = ({ data }) => {

    const blogBody = data.nodeBlogPost.body.processed

    return <Layout>
      <h3>{data.nodeBlogPost.title}</h3>
      {data.nodeBlogPost.body.processed}
    </Layout>
  }

  export default BlogPostTemplate

  export const query = graphql`
  query($BlogPostID: String!){
  nodeBlogPost(id: { eq: $BlogPostID }) {
    id
    title
    body {
      processed
    }
  }
}
`

Now let's update this to have the block appear.

import React from 'react'
import { graphql } from 'gatsby'

import Layout from '../layouts'
import CtaBlock from '../blocks/CtaBlock'

const BlogPostTemplate = ({ data }) => {

  const blogBody = data.nodeBlogPost.body.processed

  return <Layout>
    <h3>{data.nodeBlogPost.title}</h3>
    {data.nodeBlogPost.body.processed}
    <CtaBlock />
  </Layout>
}

export default BlogPostTemplate

export const query = graphql`
  query($BlogPostID: String!){
  nodeBlogPost(id: { eq: $BlogPostID }) {
    id
    title
    body {
      processed
    }
  }
}
`

But we still need to include the query for the block itself. To do this, we're going to make a few edits to both of our components. We'll start with the CtaBlock component and convert the query into a fragment that we can reuse in multiple places if need be.

import React from 'react'
import { graphql } from 'gatsby'

const CtaBlock = ({ data }) => {

  return <div className='ctaBlock'>
    <h3>CTA Heading</h3>
      <p>CTA Text goes here and here and here.</p>
      <a href="http://example.com">Link Text</a>
    </div>
}

export default CtaBlock

export const CtaBlockQuery = graphql`
  fragment CtaBlockQuery on block_content__cta_block {
  field_cta_heading
  field_cta_link {
    title
    uri
  }
  body {
    value
  }
  relationships {
    field_cta_content_type {
      name
    }
  }
}`

If you have a keen eye, you'll notice that we've removed the nodes section of the query. This is because we're now going to query for a single block. However, there is some risk to this. In the event that a content creator creates a new CTA block on Drupal for this content type instead of editing the existing one, the old one will remain in place because a single item query will only return a single item.

Now, let's move back over to our page template and use this fragment to query for our block.

import React from 'react'
import { graphql } from 'gatsby'

import Layout from '../layouts'
import CtaBlock from '../blocks/CtaBlock'

const BlogPostTemplate = ({ data }) => {

  const blogBody = data.nodeBlogPost.body.processed

  return <Layout>
    <h3>{data.nodeBlogPost.title}</h3>
    {data.nodeBlogPost.body.processed}
    <CtaBlock />
  </Layout>
}

export default BlogPostTemplate

export const query = graphql`
  query($BlogPostID: String!){
  nodeBlogPost(id: { eq: $BlogPostID }) {
    id
    title
    body {
      processed
    }
  }
  blockContentCtaBlock (relationships:
    {field_cta_content_type:
      {elemMatch:
        {name:
          {eq: "Blog Post"}
          }
        }
      }
    ){
    ...CtaBlockQuery
  }
}
`

Take a look at what we've done here. We've added a query for the CtaBlock and we're filtering it by the content type it's attached to. After that, we're pulling in everything from the query on our component. This is exactly what we wanted to do, but there's another step we need to take to actually use the data in our component.

If you look at the JSX element for

<CtaBlock />


you'll notice we aren't passing anything to it, so we've got to make sure it has data to work with or we're going to end up rendering nothing.  Edit that line to be 

<CtaBlock data={data} />

In case you're not familiar, this is a React concept known as passing props, or properties, to a child component. We're passing the data object that was returned from our GraphQL query to the CtaBlock component so that it can use the included data. Since this is just a demo, we're passing the entire thing along, but it's easy enough to only pass the relevant parts of the object.
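As a sketch, passing only the relevant slice could look like this (a hypothetical variant, not the approach the rest of this post uses; the component would then read fields such as data.field_cta_heading directly off its prop):

```jsx
// Hypothetical variant: pass only the CTA block object instead of the whole query result.
<CtaBlock data={data.blockContentCtaBlock} />
```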

Now back in our CtaBlock component we can use the data to render out our block's content.

import React from 'react'
import { graphql } from 'gatsby'

const CtaBlock = ({ data }) => {

  return <div className='ctaBlock'>
    <h3>{data.blockContentCtaBlock.field_cta_heading}</h3>
      <p dangerouslySetInnerHTML={{
        __html: data.blockContentCtaBlock.body.value}} />
      <a href={data.blockContentCtaBlock.field_cta_link.uri}>{data.blockContentCtaBlock.field_cta_link.title}</a>
    </div>
}

export default CtaBlock

export const CtaBlockQuery = graphql`
  fragment CtaBlockQuery on block_content__cta_block {
  field_cta_heading
  field_cta_link {
    title
    uri
  }
  body {
    value
  }
  relationships {
    field_cta_content_type {
      name
    }
  }
}`

Now we have our block based on content type rendering within our content type's Gatsby template. Note that I've left out a few things that should be noted for decoupled sites. 

  1. An internal link should use the Gatsby <Link /> component.
  2. Drupal needs some love to pass the correct alias over for an internal link so that it renders correctly.
  3. This is a very basic example. YMMV.

Anyway, I hope that this helps someone out there who ran into the same problems that I did. Please feel free to reach out to me on Twitter @jddoesdev, on Slack where I'm usually Dorf, or just leave a comment here if you have questions, concerns, comments, or just something nice to say.

Also, please feel free to support my efforts in speaking on mental health in tech or creating blog posts and tutorials like these by checking out my gofundme and patreon campaigns in the sidebar.

Thanks for reading!

Nov 01 2019
Nov 01

It was during his 2015 DrupalCon Europe keynote that Dries Buytaert first discussed the ‘free-rider problem’. Dries proposed improvements to the way Drupal tracks contribution on a scale of “social capital”. This would better recognise all types of work and how contributions can lead to a bigger impact in the world.

We’ve come a long way since that day. Fast forward to DrupalCon Amsterdam 2019 and we can see the Drupal Association has invested in tools on Drupal.org to track all types of contribution.

Event organisers, conference speakers, designers, and marketers are amongst the many non-code contributors who are participating in ever-rising numbers and, critically, their efforts are tracked the same way as code commits. We are winning!

Comment attribution for Drupal Association

 

As it's a topic I am passionate about, and with DrupalCon hosted in the same venue five years on from Dries’s talk, it was the perfect occasion to present my session on “How to start contributing to Drupal without code”.

Excellent ways to contribute *and* benefit for agencies and non-tech people by @pdjohnson @DrupalConEur #DrupalCon #promotedrupal pic.twitter.com/L286A5tpD2

— Imre Gmelig Meijling (@imregmelig) October 28, 2019

Photo: Imre Gmelig Meijling @imregmelig

 

How to start contributing to Drupal without code

Whether you’re a designer, writer, sales person, agency owner, project manager or end user, I have a contribution idea for you. Even if you’re time-poor or believe you have little experience, watch the session. If you still need help finding how to start your contribution journey, email me or ping me on Twitter @pdjohnson.

[embedded content]

My session on non-code contribution on YouTube

In the spirit of open source and scaling Drupal, my slides are available under a Creative Commons BY 2.0 license for you to take and make your own presentation if you wish. 

Get the slides

Nov 01 2019
Nov 01

The International Splash Awards 2019 announced its winners during the awards ceremony in DrupalCon Amsterdam.

Nov 01 2019
Nov 01

The PWA module is an out-of-the-box solution that provides a service worker for caching and offline capabilities. Once the service worker is active, pages load faster. It also serves as a knowledge repository for PWA best practices, providing the meta tags needed for a perfect Lighthouse score.

There is functionality in the module’s service worker that provides unique solutions to Drupal-specific behavior; some of these solutions can be applied to apps outside the Drupal world as well, and we’ll discuss them below. When developing the D8 module, it was decided to mirror the existing functionality of the D7 version as a starting point. This meant we were able to use the same service worker for D8, and the two modules can now be developed in parallel, with patches for one easily rolled back into the other.

 

Offline Caching

In Workbox, a precache manifest is generated by a command-line tool that scans your directory and adds assets to the precache manifest file (not to be confused with manifest.json). This is impossible in Drupal because the CSS/JS filenames change after compression. Théodore "nod_" Biadala provided the solution in D7 to prepopulate the service worker with known assets, by internally requesting the URLs set in the admin panel and extracting assets out of the DOM. This allows the install event to fetch all CSS/JS and images from these pages to store in the browser Cache API for offline rendering, the complete pages will then be viewable offline even if we never visit them first.
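Conceptually, the extraction step can be sketched in plain JavaScript as well. This is a simplified illustration using regular expressions; the module itself does this server-side in PHP with DOMDocument and XPath, as shown below:

```javascript
// Simplified sketch of the asset-extraction idea: given a page's HTML,
// collect the script/stylesheet/image URLs the service worker should
// precache. (The PWA module does this server-side in PHP.)
function extractAssets(html) {
  const resources = [];
  // Match src/href attributes on <script>, <link rel="stylesheet">, <img>.
  const patterns = [
    /<script[^>]+src="([^"]+)"/g,
    /<link[^>]+rel="stylesheet"[^>]+href="([^"]+)"/g,
    /<img[^>]+src="([^"]+)"/g,
  ];
  for (const re of patterns) {
    let m;
    while ((m = re.exec(html)) !== null) {
      resources.push(m[1]);
    }
  }
  return resources;
}
```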

Below we fetch all the assets from the URLs set in the admin panel to inject later into the service worker precache assets array.  In D8 we change our request to use Drupal::httpClient(), which is the updated version of drupal_http_request() in D7 and is basically a wrapper for the PHP Guzzle library.

 

foreach ($pages as $page) {
  try {
    // URL is validated as internal in ConfigurationForm.php.
    $url = Url::fromUserInput($page, ['absolute' => TRUE])->toString(TRUE);
    $url_string = $url->getGeneratedUrl();
    $response = \Drupal::httpClient()->get($url_string, array('headers' => array('Accept' => 'text/plain')));

 

Next, we match all the assets we need.

 

// Get all DOM data.
$dom = new \DOMDocument();
@$dom->loadHTML($data);

$xpath = new \DOMXPath($dom);
foreach ($xpath->query('//script[@src]') as $script) {
  $resources[] = $script->getAttribute('src');
}
foreach ($xpath->query('//link[@rel="stylesheet"][@href]') as $stylesheet) {
  $resources[] = $stylesheet->getAttribute('href');
}
foreach ($xpath->query('//style[@media="all" or @media="screen"]') as $stylesheets) {
  preg_match_all(
    "#(/(\S*?\.\S*?))(\s|\;|\)|\]|\[|\{|\}|,|\"|'|:|\<|$|\.\s)#ie",
    ' ' . $stylesheets->textContent,
    $matches
  );
  $resources = array_merge($resources, $matches[0]);
}
foreach ($xpath->query('//img[@src]') as $image) {
  $resources[] = $image->getAttribute('src');
}

 

Below, you can see the final result in the processed serviceworker.js file that is output in the browser. The variables in the service worker are replaced when the file is processed by the Drupal backend, and the precached assets are then written into it.
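The replacement itself is plain string substitution. Here's a sketch of the idea; the placeholder token and function name are hypothetical, not necessarily the module's actual ones:

```javascript
// Illustrative sketch of injecting server-side values into a service
// worker template. The token name is hypothetical; the real module
// replaces its own placeholders when serving serviceworker.js.
function processServiceWorker(template, assets) {
  // Swap the placeholder array for the real precache list as JSON.
  return template.replace('[/*cacheUrls*/]', JSON.stringify(assets));
}
```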

 

Phone Home uninstall

Another clever piece of functionality the module provides is responsible cleanup when it is uninstalled. The service worker sends a request back to a URL created by the module; if the URL does not exist, it means the module has been uninstalled. The service worker then unregisters itself and deletes all related caches left in the user's browser.

 

// Fetch phone-home URL and process response.
let phoneHomeUrl = fetch(PWA_PHONE_HOME_URL)
  .then(function (response) {
    // If no network, don't try to phone-home.
    if (!navigator.onLine) {
      console.debug('PWA: Phone-home - Network not detected.');
    }

    // If network + 200, do nothing.
    if (response.status === 200) {
      console.debug('PWA: Phone-home - Network detected, module detected.');
    }

    // If network + 404, uninstall.
    if (response.status === 404) {
      console.debug('PWA: Phone-home - Network detected, module NOT detected. UNINSTALLING.');
      // Let SW attempt to unregister itself.
      Promise.resolve(pwaUninstallServiceWorker());
    }

    return Promise.resolve();
  })
  .catch(function(error) {
    console.error('PWA: Phone-home - ', error);
  });

Credit to co-maintainer Chris Rupl for this solution. 

Workbox Broadcast Update

Beyond this core functionality of the module, there are endless possibilities with service workers. One branch currently in local testing is Broadcast Update.

According to the Workbox docs, Broadcast Update is “a helper library that uses the Broadcast Channel API to announce when a cache entry is updated with a new response, allowing your web app to listen for these updates and react to them.” We utilize it in our module as follows: first, instantly display the cached page. This page may be stale, so in the background we compare headers with the version on the server, and if a newer version is available, we render a button with help text giving the user the option to refresh the page for newer content.

 

We can also enable this functionality only on specific routes.

 

Broadcast Update works by default by checking the Content-Length, ETag, and Last-Modified headers. You can set it to compare your own custom headers as well, but make sure these headers are enabled on your server if you use the default implementation.
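As an illustration, the default comparison boils down to something like this sketch of the idea (not Workbox's internal code):

```javascript
// Sketch of Broadcast Update's default idea: two responses count as
// "different" if any of these headers differ. Workbox implements this
// internally; this is just an illustration.
const DEFAULT_HEADERS = ['content-length', 'etag', 'last-modified'];

function responsesDiffer(oldHeaders, newHeaders, headersToCheck = DEFAULT_HEADERS) {
  // Compare each tracked header; any mismatch means the cached copy is stale.
  return headersToCheck.some(
    (name) => oldHeaders[name] !== newHeaders[name]
  );
}
```

When the cached and fresh responses differ by this test, a message is broadcast on the channel so the page can offer a refresh.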

Here is a view of our working Broadcast Update implementation in the unprocessed serviceworker.js:

const REGULAR_EXPRESSION = [/*regular_expression*/]; // This will get replaced by the values set in the Drupal admin when the file is processed by the module.

REGULAR_EXPRESSION.forEach(function(url) {
  console.log(url);
  // Register on route input on Drupal config.
  workbox.routing.registerRoute(
    new RegExp(url + '$'),
    new workbox.strategies.StaleWhileRevalidate({
      plugins: [
        new workbox.broadcastUpdate.Plugin({
          channelName: 'dashboard-updates',
        })
      ]
    })
  );
});

We display the help message and button with this added JavaScript, which is only included on the page if Broadcast Update is enabled in the admin config.

(function ($, Drupal) {
  Drupal.behaviors.pwa = {
    attach: function (context, settings) {

      var displayMessageOnce = true;
      var alertText = drupalSettings.pwa.alert_text;

      var message = "<div class='broadcastcache'>" + alertText + "<br/><button class='reload-page'>Reload</button></div>";

      const updatesChannel = new BroadcastChannel('dashboard-updates');
      updatesChannel.addEventListener('message', async (event) => {
        const {cacheName, updatedUrl} = event.data.payload;

        // Do something with cacheName and updatedUrl...
        // Get the cached content and update the content on the page.       

        if (displayMessageOnce) {
          displayMessageOnce = false;

          $( "body" ).after( $(message) );

          $('.reload-page').click(function () {
            location.reload();
          });
        }

      });

    }
  }
})(jQuery, Drupal);

Though still a bleeding-edge technology, Progressive Web Apps have support from some of the biggest tech companies, including strong support from Google and Microsoft, with iOS trailing behind yet catching up. PWAs improve website efficiency, development time, and SEO, and they're extremely developer-friendly.

For more background on the PWA module, Drupal, and service workers, see the slides by Christoph Weber: https://www.drupalcampatlanta.com/sites/default/files/slides/Meet%20the%20PWA%20Module%20%281%29.pdf

Nov 01 2019
Nov 01

The breakthrough in technology has brought a whole new range of tools for developers to make the software development process more efficient. Create React App is one of them: a prominent tool recommended by the React community for creating single-page applications (SPAs) and for getting familiar with React.

Create React App ensures that the development process is refined enough to let developers leverage the latest JavaScript features for better experiences and optimized production builds.

For one of our clients, a giant retail travel outlet that wanted to give travelers a realistic travel budget so they could plan ahead and avoid spending shocks along the way, we built a budget planner.

Built with React.js on top of Drupal, it is a dynamic feature that can be added anywhere on the website (under blogs, services) without coding.

Create React App doesn't require configuration of webpack (a module bundler) or Babel (a compiler); they come built in, so developers can start coding right away. However, the drawback is that they won't get an idea of what's happening in the background.

If we set up a React app without using Create React App, we will be able to see which NPM packages are needed to make a React app work.

About React App

Create React App was built by Joe Haddad and Dan Abramov. The GitHub repository is well-maintained by the creators, who fix errors and deliver updates frequently.

It is a prominent toolchain for building apps quickly and efficiently. A toolchain is a set of software development tools optimized to perform specific functions. For example, the C++ development process requires a compiler to compile the code and a build system, say CMake, to manage all the dependencies. Similarly, Create React App bundles the tools needed to go from zero to a running “hello world” application.

This blog will showcase how to create a React app from scratch. The prerequisite is to have the NPM package manager installed in the system.

Below are the steps:

Step 1- Create an app directory

mkdir myApp

Step 2- Access the myApp folder and run

npm init

This, in turn, will create a package.json file for which you can provide the name and version.

Or

npm init -y

This will create a package.json file with a default package name and version.

Step 3- Install react and react-dom packages

npm install react react-dom

This will create a node_modules folder with all dependent libraries and add the dependencies inside the package.json file.

Step 4- Create a .gitignore file, to avoid pushing unnecessary files to GitHub

vi .gitignore

Under the files section, add all the files which you don’t wish to be tracked by Git:

  1. node_modules
  2. dist
  3. ._

dist (distribution folder): This is an auto-generated build directory. We don’t need to track this folder because it will be generated by the compiler.

Step 5- Create an app folder

mkdir app

Access the app directory and then create three files:

touch index.js index.css index.html

Step 6- Edit index.html and add below snippet

<!DOCTYPE html>
<html>
<head>
  <title>my-app</title>
</head>
<body>
  <div id="app"></div>
</body>
</html>

(No need to add any styling inside the index.css file as of now.)

Step 7- Edit index.js file and add below snippet

import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';

class App extends React.Component {
  render() {
    return (
      <div>Hello World</div>
    )
  }
}

ReactDOM.render(<App />, document.getElementById('app'))

Running this JSX code (an XML/HTML-like syntax used by React that extends ECMAScript) in the browser will throw an error, because the browser doesn't understand JSX. That is where Babel and webpack come in.
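For context, what Babel actually does to that JSX is compile it into plain React.createElement calls. A minimal sketch with a stubbed createElement (so the shape is visible without installing React):

```javascript
// What Babel's JSX transform produces: <div>Hello World</div> becomes a
// React.createElement call. A minimal stub shows the resulting shape.
const React = {
  createElement: (type, props, ...children) => ({ type, props, children }),
};

// JSX:  <div>Hello World</div>  compiles to roughly:
const element = React.createElement('div', null, 'Hello World');
```

The browser only ever sees these function calls, never JSX syntax, which is why the Babel step is required.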

npm install --save-dev @babel/core @babel/preset-env @babel/preset-react webpack webpack-cli webpack-dev-server babel-loader css-loader style-loader html-webpack-plugin

npm install takes three exclusive, optional flags that determine where the package version is saved in your package.json:

1. -S, --save: The package appears in your dependencies.

2. -D, --save-dev: The package appears in your devDependencies.

3. -O, --save-optional: The package appears in your optionalDependencies.

We will use the --save-dev flag to differentiate between build dependencies and app dependencies.

Once installed successfully, you can inspect the package.json file to see the differences.

Webpack Configuration

Webpack, as stated, is a module bundler that primarily focuses on bundling JavaScript files for usage in a browser, though it is also capable of transforming, bundling, or packaging just about any resource or asset.

Check the steps below for webpack configuration-

touch webpack.config.js

Step 1- Add below snippet in this file

var path = require('path');
var HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: './app/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'index_bundle.js'
  },
  module: {
    rules: [
      { test: /\.(js)$/, use: 'babel-loader' },
      { test: /\.css$/, use: ['style-loader', 'css-loader'] }
    ]
  },
  mode: 'development',
  plugins: [
    new HtmlWebpackPlugin({
      template: 'app/index.html'
    })
  ]
}

Step 2- To allow babel-loader to work properly, we have to add the Babel preset config to package.json

"main": "index.js",

"babel":{

"presets" : [

"@babel/preset-env",

"@babel/preset-react"

]

}

Step 3- To run the build, we need to add webpack to the scripts section of package.json

"main": "index.js",

"babel":{

"presets" : [

"@babel/preset-env",

"@babel/preset-react"

]

},

"scripts": {

"create": "webpack"

},

Step 4- Run below command

npm run create

With this, webpack will run the build, which in turn creates a dist folder containing our bundle file and index.html.

Watch this video to learn more about it

[embedded content]

Step 5- Now, to start the webpack dev server, add the below snippet inside the package.json file

"scripts": {
  "start": "webpack-dev-server --open"
}

It will start building our code as soon as we run npm start.

Step 6- All setup is done. Now run the `npm run start` command to see the result in the browser.

The final directory structure will look somewhat like the one shown in the picture:

Note: You might observe some Storybook-related extra files in the picture. However, these files won’t be visible in your setup. If you want to know more about Storybook, stay tuned for our next blog.

Conclusion

I hope that now you have a better understanding of the fundamentals of Create React App. If yes, then implement the given steps right away and start building your awesome ideas!

Stay tuned for my next blog in which I will discuss Storybook. 

Happy Coding!

Nov 01 2019
Nov 01

After three days of some informative and inspiring talks, it was time for contribution day, so everyone got their editors out, put on their documentation hats, and got those issue queues down!

Amazee Labs Contribution Room

On day four, the contribution events started at 9 am, and folks had a few options: the First-Time Contributor Workshop, Mentored Contribution or General Contribution. The DrupalCon Amsterdam website had a handy little flowchart to help people decide which session would be best for them.

DCA Flow Graph. Credit: https://events.drupal.org/amsterdam2019/contribution-events

First-Time Contributor Workshop

This was a fully guided workshop, run by Brian Gilbert and Jordana Fung, as an opportunity for people who hadn’t contributed before to get up and running with the drupal.org issue queue and the tools needed to contribute. It seemed like a fantastic opportunity for people working with Drupal who find the idea of contributing overwhelming or are unsure where to start. I spoke to someone who attended the workshop, and they said the contribution process was broken down well. Unfortunately, they ran into issues getting a development environment working (always the hardest bit!), so they didn’t quite get to solving an issue, but they said the slides provided will be an invaluable resource to refer back to, and they now have more confidence with contributing. What I’d been hearing throughout the week at DrupalCon was reiterated in this workshop: “Even if you can’t make a patch, you can still contribute!” This is a great encouragement for non-developers to contribute by other means, such as creating translations, updating drupal.org documentation or volunteering at events, and highlights the diversity of the community.

Mentored Contribution

This session was aimed at those who were already comfortable with the issue queue and contributing, but still wanted some direction from the mentors that kindly gave up their time to guide groups of around eight. The focus was on Drupal core, where attendees were pointed to issues tagged with “Novice”. There had clearly been some good triage and tagging in preparation for the DrupalCon contribution day, as these issues were ready to go and usually had an additional comment to help clarify the issue. I think for a lot of us in this session, it introduced us to a different type of contribution, where we were working on issues that didn’t directly affect us, or our current project, and it felt good! Hopefully, there are some converted Drupal users that might devote some time to working through the core issue queue.

General Contribution

This was for those with clear intentions and a sense of purpose! This session seemed fairly freeform, but tables were grouped by topic e.g. search, Drupal 9, admin UI and frontend. Having previous experience contributing to a project/topic wasn’t necessary, so as long as you had experience with the issue queue, appropriate skills and an interest in a particular topic, you could rock up and chat to others at that table to get working on an appropriate issue.

Out of these options, I chose to join the Mentored Contribution session and immediately after arriving, I was allocated a table and mentor, got a development environment up and running, and we were tasked with working on Drupal core issues. It was helpful to have the mentor on-hand to discuss possible solutions and it felt great to be part of such a big contribution event - imagine how many comments and commits were made that day!

Amazee Labs Team

We also had an Amazee success in the General Contribution session, with Amazee’s own Philipp Melab releasing a stable 3.0 version of the GraphQL Drupal module, so congratulations to Philipp and everyone else who worked hard at the contribution day to get that finished.

I’m looking forward to contributing more outside my project-specific needs after being inspired to at DrupalCon and the contribution day was a great insight into the effort and hours that go into making Drupal what it is.

Nov 01 2019
Nov 01

If someone were to break into your room, they would probably learn less about you than if they hacked you on the internet. Our efforts towards security have to be as intensive online as they are offline.
An extremely popular content management system, Drupal is at the top of the list for many.
Importantly, Drupal has a dedicated security team that makes sure the Drupal core is largely free of loopholes or vulnerabilities that may compromise website security.

Drupal powers more than 700,000 sites across the internet, which makes any individual site a more attractive target for cyber attacks.
Although all third-party modules are heavily vetted, a little extra security is always a good idea.


Let’s find out how to enhance Drupal’s security and make it risk-free! 

Keep up with internet trends, update!


Drupal’s team works consistently to offer timely and effective updates to enhance your security. Understand what the new updates fix and bring to the table, even though we’ve been conditioned for decades to overlook such information. With all the other things you need to remain updated with, this one is right up there with Instagram trends! 

Spam prevention modules


We know how our lives have become easier with Truecaller preventing spam calls. Our websites breathe a similar sigh of relief when using anti-spam modules like BOTCHA or Honeypot.

Even though they each play a different role, they are key to a spam-free website. They deploy simple but effective means to keep a check on bots, for instance.
Get your line of knights ready to go!

Password policies


Next time a website makes you enter a password with too many conditions, you should know you’re in safe hands.
Drupal offers a range of password policies that you should opt for and that users should adhere to: for instance, automated log-out when a session remains idle for some time, a minimum character length for passwords, and other security measures.
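As an illustration of the kind of rules such a policy enforces, here's a hypothetical checker (not the Drupal Password Policy module's API):

```javascript
// Illustration of typical password-policy rules. This is a hypothetical
// checker, not the Drupal Password Policy API.
function checkPassword(password, minLength = 10) {
  const failures = [];
  if (password.length < minLength) failures.push('too short');
  if (!/[A-Z]/.test(password)) failures.push('needs an uppercase letter');
  if (!/[0-9]/.test(password)) failures.push('needs a digit');
  // An empty list means the password satisfies the policy.
  return failures;
}
```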
 

Set your permission boundaries right.


Let’s do the obvious, and not give hackers permission, shall we?

You can restrict access to sensitive files so that only those who need to can read, write or modify them. This determines how compromising a breach of your website can be.
Make important files such as authorize.php accessible only to the author and developer. Confidential configuration should not be kept in the version control system and must be configured and secured directly on the particular instance it is used for.
 

Implement HTTP Security Headers

This one is easy, effective, and shouldn’t be left off the security checklist! Easy to configure, security headers let the browser know how to handle your site’s content and bring your risk factor down several notches.
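As an illustration, here are a few widely recommended security headers with common values; how you set them (web server config, a Drupal module, etc.) is up to you:

```javascript
// A few widely recommended HTTP security headers with common values.
const securityHeaders = {
  'X-Content-Type-Options': 'nosniff',             // block MIME-type sniffing
  'X-Frame-Options': 'SAMEORIGIN',                 // mitigate clickjacking
  'Strict-Transport-Security': 'max-age=31536000', // force HTTPS
  'Referrer-Policy': 'strict-origin-when-cross-origin',
};
```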


Pro Tips:

  • Clear the carts! Always disable, uninstall and remove unused modules to keep your site healthy and low-risk!
  • Take regular back-ups of all of your code, database, and files to prevent any loss of crucial data in case of a cyber attack or a different threat to your website.
  • Remember to keep an eye on your roles and permissions and modify them as and when required to optimize your security.
  • Make sure you get an SSL certificate with a good rating.
  • Report violations like CSP violations and Expect-CT failures.
  • Configure trusted host settings properly: https://www.drupal.org/docs/8/modules/skilling/installation/trusted-host...

All set to keep out the dementors then? 

Oct 31 2019
Oct 31

 

St. Catherine University

St. Kate’s Website Redesign: A Fresh Site and an Empowered Team

St. Catherine University (known as St. Kate’s) is a private Catholic liberal arts university in St. Paul and Minneapolis, Minnesota. St. Kate’s was one of the first colleges specifically for women in the Midwest but today also welcomes men into graduate and associate programs. It has a student population of around 5,000 annually and welcomes students of all ethnicities and ages.

St. Kate’s website is their biggest marketing tool, and it needs to give site visitors—from prospective students to internal St Kate’s workers—a great user experience on desktop and increasingly, on mobile.

One of the biggest issues with the website was that it was effectively two sites, one internal (Kateway) and one external. People didn’t always know where to go to find information, and information duplicated in both sites could get out of sync. Beyond that, the search functionality of the site was poor—when users couldn’t find what they needed, they resorted to using Google to search the site, or started helpdesk tickets.

But bigger process issues lurked behind the scenes. The site was built on ModX, a content management system that was not supported internally. This required working with an outside vendor when more than simple content updates were needed. In addition, any content updates, regardless of department, had to be approved and made by the Marketing & Communications department, who already had their hands full with their own work, so there was often a backlog of requests.

Our Assignment: Build a New Website and a Self-Sufficient Dev Team

The site redesign had several high-level goals: create a new site in Drupal 8 from the ground up that was more flexible and able to be supported in-house, combine the internal and external sites to one site with a single sign on, and refresh the site design and content to better support more robust marketing efforts. Once the site was supported in-house, it would be easier to implement another goal: share ownership of the site content across campus, instead of under Marketing & Communications. 

We spent a great deal of time on the content audit and content strategy on this project, in order to know what content to keep and how to organize old and new content effectively. This process can seem long if you’re not used to it, but it’s absolutely essential in a site redesign, where we need to tailor content to specific audiences who all need to be communicated to in different ways. St. Kate’s really did their due diligence to gather information and ensure all voices were heard; they sent out surveys, did presentations, met with folks. We performed user interviews with all levels of stakeholders and users, from members of the web advisory group to department heads, teachers and students. 

We believe in getting stakeholders (IT, marketing and business units) involved as early as possible in the review process, so as soon as we’d constructed the new site’s navigation, we tested it with both internal and external audiences and made changes based on those results.

We made the following changes to the St. Kate’s site:

  • CMS & Infrastructure: We moved the site hosting to Pantheon, a development platform tailored for Drupal, which also supports Apache Solr, the solution we implemented to improve St. Kate’s search functionality. We also liked that Pantheon’s workflow enforces the development > test > live workflow.
  • One Site, Home for All Content: Instead of the old internal site and the old external site being separate, there is now one integrated site with a single sign-on. All content can be reached from the home page.
  • Content Migration and Streamlining: Most of the content had to be recreated manually on the new site, because there wasn’t an easy way to get most of the data out of the old site (we did manage to get news articles out). We consolidated information where possible. For example, there used to be five mission and impact pages, now there is just one, and we streamlined the news and events content together.
  • Content Types: We implemented Drupal paragraphs to enter content, and whittled the number of content types from 60-80 on the old site to a more manageable 10-15.
  • Improved Search: We implemented Apache Solr, which allows us to index content in different ways, and to assign higher weight to different fields of data. For example, we can give the TITLE field a higher weight, so if someone searches for “admissions,” the Admissions page is likely to be served up higher in search results.
  • Integration Work: We integrated many external systems into the site. For example, faculty content used to be hosted on the old site, but now there’s BEPRESS integration that uses the Drupal Feeds module to bring data on to the site. We also integrated BANNER, the Student Information System (SIS), D2DL (Desire to Learn), and SOPHIA (library software).

Empowering Developers

When St. Kate’s was looking for someone to do the redesign, they were really looking for a partner, a vendor who would collaborate with the in-house dev team, and help them become self-sufficient in supporting the site after the project was complete. “We didn’t have any dev process. Nobody understood how ModX worked,” said St. Kate's Austin Allman, Senior Programmer Analyst. “We didn't have any practices or prejudices. We were very malleable to TEN7’s practices.” We got them used to the Agile software process, including the use of version control, JIRA issue tracking, and GitFlow to test their feature branches before going into production.

“Any major site like this has a major learning curve, especially if you’re dropping a new platform on them,” said Les Lim, TEN7 Technical Project Manager. “We tried to be as transparent as possible with what was being built in Drupal 8, so the St. Kate’s dev team could see things evolve over time, see the work in progress. That’s an instructive thing, more so than just seeing the finished product.”

Outcome: A Fresh Website and Positive Intangible Changes

The new website launched in the summer of 2019, and the feedback has been very positive. People say that the site is so much easier to navigate, all the information is in one place, and they like the visuals. There is now site content for every audience, accessible with their own set of links. The number of help desk tickets looking for information has gone down significantly, meaning people are finding what they need.

Departments can now make content changes to their own site area, freeing the Marketing & Communications Department to focus on their core responsibility, promotion of the University, and raising money through donor programs.

The website redesign process has also changed their dev team. “I’ve seen this dev team go from group of people who were scared to death to make a decision to seeing leadership come out,” said Lisa Gariepy, Information Technology Consultant with St. Kate’s. “It’s a whole new mindset of being able to see a problem, and know that they should take action, and know they have the power to do that—they don’t have to send triplicate emails and wait for a thumbs up. It was helpful for them to keep hearing Les and Jason [Cote, TEN7 Front End Developer] saying ‘LET’S JUST DO IT,’ and accepting that sometimes they’ll make the wrong choice. And if that happens, all we have to do is fix it!”

This has also changed the dynamic between the dev team and the user base. “Before, when users would ask for changes, we would have said ‘Nope we can’t do it,’ because we didn't know how to do it,” said Austin. “Now, users are more emboldened to ask for changes, because we know we can we make a change.” 

Oct 31 2019
Oct 31

It’s important that all users, including non-tech marketers and business owners, are able to easily work with a website. When it comes to Drupal, it continues to make great strides towards being more user-friendly, which is one of the key priorities and benefits of Drupal 8.

We are happy to announce another great advancement in Drupal user-friendliness — the new core Help Topics module. Let’s see how this module raises usability even higher.

A user-friendly CMS is a competitive CMS

First, we should note why usability is a priority for Drupal today. One of the key reasons is that it helps Drupal stay competitive among other CMSs: to become a platform of choice for more business owners, it needs to be as user-friendly as possible.

For example, much ink has been spilled over the rivalry between Drupal and WordPress in website development.

  • Among the key strengths of the WordPress CMS is a high level of user-friendliness.
  • Drupal has always been regarded as a platform that allows you to build more advanced functionality while being a bit more complicated and having a steeper learning curve.

Drupal creator Dries Buytaert said he was passionate about making Drupal more user-friendly for the day-to-day users.

Drupal creator Dries Buytaert strives to make it easy to use for everyone

The achievements in making Drupal the most user-friendly CMS are already huge, and the work goes on. Drupal 8 can boast:

  • easy content creation experiences with the CKEditor
  • the Quick Edit feature to edit content on the fly
  • a convenient and attractive Media Library and media embedding
  • a user-friendly drag-and-drop Layout Builder
  • the forthcoming Claro admin theme that follows all modern UX design guidelines
  • adherence to WCAG and ATAG in web accessibility standards
  • convenient admin UIs to do almost anything
  • flexible workflows based on roles

and much more.

Why the new Help Topics module in Drupal was needed

One of the key aspects of user-friendly website administration experiences is knowing how the website’s modules work. Their user interfaces, settings, and work peculiarities may range from simple to challenging. Getting help with them greatly improves admin usability.

So Drupal needed a unified way for modules and themes to add their help topics. Drupal core already has the Help module, but it only lets module developers create overview help pages via hook_help().
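For reference, a hook_help() implementation is just a PHP function in a module file. A minimal sketch for a hypothetical module named "mymodule" might look like this (real code would wrap the text in t() for translation; it is omitted here only to keep the sketch self-contained):

```php
<?php

/**
 * Implements hook_help() for a hypothetical module named "mymodule".
 *
 * Drupal calls this on help pages; returning markup for the
 * 'help.page.mymodule' route produces the module's single overview
 * page at admin/help.
 */
function mymodule_help($route_name, $route_match) {
  if ($route_name === 'help.page.mymodule') {
    return '<p>This module provides a single overview help page.</p>';
  }
  return NULL;
}
```

As the article notes, one overview page per module is all this hook gives you, which is exactly the limitation Help Topics addresses.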

According to the Help Topics maintainer Andypost, it was not easy for everyone to add help topics with the existing Help module. To solve this, the community created the Drupal 8 User Guide and even translated it into several languages within the Drupal Documentation Initiative.

Next, as Andypost tells us, the Initiative’s leader Jennifer Hodgdon (jhodgdon) created a Configurable Help module. It is a sandbox for now but it is going to reach stability and allow adding help topics directly from the browser.

The decision was made to add parts of this new module to Drupal core as an experimental module Help Topics, as well as convert the Drupal 8 User Guide into the help topics about the core modules’ work. The idea to add the Help Topics module to Drupal core was successful!

The new Help Topics module overview and roadmap

The Help Topics module allows the developers of core and contributed modules, themes, and distributions to create help topics as Twig files. An additional contributed module will allow anyone to do it from the browser.

There can be as many topics per module as needed for the sake of making Drupal more user-friendly. The topics will be found on a website’s Help page at admin/help.

Help Topics module coming to Drupal 8.8 core

Some will be listed there directly if they are marked as “top-level.” Others will be listed as “related.”

Based on the tasks the users are supposed to do, the help topics can be single or grouped:

  • One task makes a “Task topic.”
  • Multiple task topics can be grouped into a section and make a “Section topic.”

As “The Drop is Always Moving” tweeted, Help Topics is the result of the wonderful work of 38 people over several years. It will be included in Drupal 8.8 as an experimental module and should provide a useful help solution to Drupal users.

Help Topics experimental module included in Drupal 8.8 dev

Furthermore, the Help Topics module roadmap is to reach stability and merge with the existing core Help module. Another core module in this area will also remain in place — the Tour module that makes Drupal more user-friendly through tooltip help.

Help Topics module to come to Drupal 8.8 core

The structure of Help Topics

Each topic will be a Twig file that lives in a particular module’s subdirectory called help_topics. The files should be named like this: modulename.topic_id.html.twig.

They will have the "front matter" metadata and the HTML body. A single Task topic should have metadata with this information:

  • the topic’s title
  • whether they are “top-level” or “related”
  • the goal
  • the optional explanation “what is/are”
  • the required steps to perform the task (wrapped with an H2 heading)
  • the optional additional resources (also H2)

Section topics have similar metadata, but they can only be “top-level” and should list an overview of related tasks rather than required steps. You can see more information in the Help Topics standards.
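Putting the naming and front matter rules together, a single Task topic file (the module name, topic ID, and wording are all hypothetical here) might look roughly like this, saved as help_topics/mymodule.configuring_example.html.twig:

```twig
---
label: 'Configuring the example feature'
top_level: true
---
<h2>{% trans %}Goal{% endtrans %}</h2>
<p>{% trans %}Configure the hypothetical example feature of a site.{% endtrans %}</p>
<h2>{% trans %}Steps{% endtrans %}</h2>
<ol>
  <li>{% trans %}Go to the feature's settings page.{% endtrans %}</li>
  <li>{% trans %}Adjust the options and save.{% endtrans %}</li>
</ol>
```

Core topics run their text through Twig's trans tags, as above, so that the help is translatable. A Section topic would use the same front matter but list links to its related Task topics instead of steps.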

Enjoy Drupal user-friendliness with all its new tools

Considering the giant steps with which Drupal is becoming a more user-friendly CMS, it’s important to keep your website updated. Only then will you and your users take advantage of the new usability features.

Think, for example, about the fact that the Layout Builder only became stable in Drupal 8.7, and the Media Library got a new, more user-friendly and beautiful interface in the same release.

You can always rely on our development team who will smoothly update your website to the latest core versions, configure the user-friendliness-related modules such as Help Topics or any others, and assist you in preparing for Drupal 9.

And, in light of today’s topic, we can offer plenty of additional UX secrets to make your website more user-friendly.

Talk to our Drupal team!

Oct 31 2019

More content for smart speakers

The steady rise of smart speakers has paved the way for new opportunities for businesses. Right now, a staggering number of searches are done through smart speakers, which means people are going to rely more and more on audio feedback. With this in mind, creating good content and features for home speakers like Alexa and Google Home can keep you in the game. Everything is an opportunity if you work hard enough, especially adapting to new digital marketing trends.

Drupal, a proven solution for headless applications, is already capable of serving as a platform for connecting your existing content directly to voice APIs.

Voice search optimization

voice search digital marketing trends

Technology is rapidly changing and shaping the way we are interacting with our surroundings, including what digital marketing trends develop. A trend that was growing in the past year is going to grow in 2020 as well. That trend is, of course, voice search. With the rise in adoption for voice assistants and smart speakers like Alexa and Google Home, voice search has seen an incredible increase in usage over the past years. 


With this in mind, you can start adapting your business to better make use of this trend in order for your business to grow. It’s important to remain ahead of the competition if you want your business to flourish. This is why I’m going to give you some tips on how to better optimize your website for voice search:

Tips for Voice Search Optimization

1. Understand the Language: People using voice search usually aren’t searching for just one keyword. Instead, they use long sentences that describe what they are looking for. If you want to make your content more relevant for voice search, you will have to adapt and use longer phrases that are in tune with what customers actually say when searching for content similar to what you’re offering.


2. Be Conversational: To improve the likelihood of being found when people voice search for your content, keep your key phrases at a conversational level. The person searching for content by voice is also going to keep it conversational, so matching that tone increases your chances of being found.


3. Answer Questions: When people use voice search to find things on the internet, they ask questions. If you can pinpoint the questions people are likely to ask when looking for content like yours, you can adapt your key phrases accordingly. This will certainly increase your chances of reaching the curious reader.
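One practical way to make those question-and-answer pairs machine-readable for search engines (and, by extension, voice assistants) is schema.org FAQ markup. A minimal JSON-LD sketch, with a hypothetical question, could look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How do I optimize my website for voice search?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Use conversational, long-tail phrases that match the way people actually speak their searches."
    }
  }]
}
```

Embedding a block like this in a script tag of type application/ld+json lets crawlers pick up the questions your page answers.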

More chatbots

A.I. Digital Marketing Trends

Another one of the digital marketing trends that shows no sign of slowing down is the adoption of chatbots. Chatbots have seen an astounding amount of limelight in the past year and remain part of the digital marketing trends of 2020. Of course, chatbots still have a long way to go before they can perform more complex tasks, but as with every technology in its early stages, they are going to get better with time.


Juniper Research has studied the adoption of chatbots and their projected growth. Chatbots are really good at cutting costs for businesses. Not only that, but they can also increase the revenue a business generates.


On top of that, a chatbot is extremely useful for answering basic questions that used to require a human. Chatbots definitely cut costs and time for your support team, which can now focus on answering more complex questions instead of sitting through the most basic ones a customer might have. This also leads to more satisfied customers, as the queues they wait in for an answer from the support team are drastically shortened.

Increase the amount of automation

Automation Digital Marketing Trends

Since today's world is incredibly fast-paced, you will have to find ways to increase your efficiency and decrease your workload while keeping up with the digital marketing trends. Start by automating tasks that are repetitive and time-consuming. Instead of sending 100 emails by hand, for example, use software that automates that for you. This frees up time to invest in the other, more important tasks you have to take care of.

No-code website design

In the past years, we have seen the rise of no-code website design software, and in the upcoming year its popularity is only going to grow. The reason is that these tools speed up the process and cut down the costs of designing and modifying a web page. In other words, they lower the barrier to entry for people and businesses who wish to design visually appealing content for the web but lack the technical knowledge of coding.

Marketing departments will finally be able to edit a webpage to match their creative concepts without having to involve the IT department, thanks to the magic of WYSIWYG (What You See Is What You Get). In the near future, designing a website and building effective multimedia content for the web will become accessible to a larger pool of people. So, if you have always wanted to design your own website but lacked the knowledge of how to do it, you can pick up a tool like our visual Drupal-based drag-and-drop page builder Glazed Builder and start building your own visually stunning website more easily than ever before.

Hyper-targeted Ads

Target Digital Marketing Trends

Learn to use the data that your website has gathered from your visitors. Not only that but also the data that you can gather from Google. This will make it much easier for you to find the people that are actually interested in what you have to offer. Not only will it cut the cost of your advertisements, but it will also increase the quality of the leads that are going to come to your website intrigued by your product or service. In a nutshell, start analyzing and start targeting. 

What digital marketing trends are you watching?

Did we miss any important trends or do you want to share your insights on some of the above mentioned digital marketing trends? Drop a comment and let us know!

Oct 31 2019

After a night full of singing Karaoke with the fellow Amazees, the day started a bit slower and later than usual. I made my way down to the lobby to catch a ride to the venue. 

We arrived just in time for the Dries Q&A, where Drupal founder Dries Buytaert answered the community’s questions about future plans, roadblocks in development, and the challenges open source software faces in competing with licensed software (basically because the latter can offer a wider range of toolsets). He also spent a lot of time emphasizing what makes the Drupal community so special.

amazee.io, as a Platinum sponsor, had a short Amazecapades performance in the exhibit hall at lunchtime. For a change, this session wasn’t filled with facts about Lagoon, tech talk, or sales-y lines; instead it consisted of short lightning talks and activities with fun life tips. 

amazee.io Michael Open Mic

Michael Schmid shared his secrets to relax and fall asleep fast: the 4-7-8 breathing method, which means breathing in for four seconds, holding your breath for seven, and breathing out for eight. After that, try counting down from one hundred. Sounds too simple, right? You should try it, but maybe not at work. ;-) 

Amazee Labs Felix Open Mic

Felix Morgan spoiled movies for everyone (JK) as she explained that every single movie follows the exact same dramatic composition at 25%, 50% and 75% of the way through.

amazee.io Nicole Open Mic

Nicole DeAngelis then finished off with a small yoga session to help loosen stiff necks and move the body after long hours of sitting in talks.

Completely relaxed, I headed to a very promising keynote by Sue Black. Her talk was very inspiring, as she described how she built a successful career out of the challenges in her life. With dedication, passion and education, everyone can achieve their goals.

After this motivating session, we took a short stroll to the park behind the RAI for a quick moment to enjoy the Sun and get some fresh air.

The afternoon was mostly spent with the Amazee team, getting to know each other better and in person. I had great chats with a lot of folks and also learned about rubber duck debugging, then actually solved a client issue by applying it - double success!!! :celebrate:

The conference day concluded with Trivia night. After being in the Drupal community for my second year now, I felt confident enough to join a team and compete for the cool prizes.

Quiz Night

Quiz Night Q&A

Sadly, my input didn’t make the difference, as we came in at rank 33 out of at least 50. Nonetheless, it was a lot of fun. Besides the friendly competition, I gained more background knowledge about the CMS and community I’ve been working with for some time now.

All in all, being at a conference with roughly 1,600 like-minded people broadened my horizons about the culture that can stand behind such a framework and how everything clicks together within the open source community.

I’m already looking forward to attending next year’s DrupalCon!

P.S. #thanksdan for supporting my poor blog writing!

Oct 30 2019

We recently started using Vale to help automate the tedious task of enforcing our style guide. Doing so has helped make reviews faster, and reduced any hard feelings between us. Emotions can run high when you feel someone is being overly scrupulous in their review of something you've worked really hard to create.

Everything gets reviewed

Every content item we publish goes through a rigorous review process to ensure we're always putting our best foot forward. This review consists of a number of different steps:

  • Technical review: Is the content technically correct? Do all the code samples work?
  • Copy editing: Does it meet our style guide? Does it use Chicago Manual of Style formatting guidelines? Does it use proper grammar, spelling, etc.?
  • Check for broken links and images
  • Apply consistent Markdown formatting

Some of these things are objective. For example, we always use Drupal, never drupal. We always use italics for filenames and paths. And we always format lists in Markdown using a - followed by a single space, never a *. These things are simply not up for debate. You either did it right or you didn't. Most tutorials have at least a handful of these fixes that need to be made.

Other style guidelines are more subjective. For example, we try to not use passive voice, but there are exceptions. A technical review might point out multiple ways of accomplishing the same task, and we'll generally only cover one. Avoid cliches. Don't use superlatives and hyperbole. A single tutorial usually has 10+ of these suggestions. These are by far the more important things to focus on in the review as they can have a real impact on the usefulness of the content.

No one wants to be the jerk who points out dozens of formatting errors. And no one enjoys having their work nit-picked by their peers.

We've been talking for a long time about the utility of a tool to help with automating some of the steps in the review process -- specifically, the objective ones. Similarly, Drupal developers use PHPCS to ensure their PHP code follows the Drupal coding standards, and JavaScript developers use Prettier to ensure consistent formatting.

Without a tool, we spend a lot of time in the review process commenting on, and fixing, non-substantive things. That's a distraction from the more important work of providing a critique of the content itself.

Let the robots do the nit-picking

Amber recently introduced me to Vale, a tool she learned about while attending the Write the Docs conference in Portland. We've since introduced it into our review workflow, and are loving it, along with remark for linting Markdown formatting.
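Our exact remark setup isn't reproduced here, but a minimal .remarkrc that enforces objective Markdown rules like the dash-marker convention mentioned above could look like this (the plugin names come from the remark-lint ecosystem; your choice of preset may differ):

```json
{
  "plugins": [
    "remark-preset-lint-consistent",
    ["remark-lint-unordered-list-marker-style", "-"]
  ]
}
```

Running remark with a config like this flags any list formatted with * instead of -.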

Side note: Check out this lightning talk from the conference. It's not Vale, but gives a great overview of the types of things we're doing.

While there were numerous other tools we evaluated, in the end we chose Vale. We've found that it's easier for non-technical users to configure, and it allows us to differentiate between objective and subjective suggestions through the use of different error levels.

YAML configuration files

When using Vale you implement your styles as YAML files.

Example:

extends: substitution
message: Use '%s' instead of '%s'
level: warning
ignorecase: false
# Maps tokens in form of bad: good
swap:
  "contrib": "contributed"
  "D6": "Drupal 6"
  "D7": "Drupal 7"
  "D8": "Drupal 8"
  "D9": "Drupal 9"
  "[Dd]rupalize.me": "Drupalize.Me"
  "Drupal to Drupal migration": "Drupal-to-Drupal migration"
  "drush": "Drush"
  "github": "GitHub"
  "in core": "in Drupal core"
  "internet": "Internet"
  "java[ -]?scripts?": JavaScript
...

The above configuration file provides a list of common typos and their correction. Because this is a YAML file it's relatively easy for anyone to edit and add additional substitutions. For these suggestions we've set the error level to warning. When we run Vale we can tell it to skip warnings and only report errors.
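The styles themselves are wired up through a .vale.ini file at the project root. A minimal sketch (the style name and paths here are hypothetical) might look like this; running vale with --minAlertLevel=error then reports only errors and skips warnings:

```ini
# Hypothetical layout: custom styles live in a "styles" directory,
# grouped under a style name such as "DrupalizeMe".
StylesPath = styles
MinAlertLevel = warning

[*.md]
BasedOnStyles = DrupalizeMe
```

Each YAML rule file like the substitution example above would live inside styles/DrupalizeMe/.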

In another example we've got a style that enforces use of the Chicago Manual of Style for determining how to capitalize a tutorial's title.

extends: capitalization
message: "Tutorial title '%s' should be in title case"
level: error
scope: heading.h1
style: Chicago
# $title, $sentence, $lower, $upper, or a pattern.
match: $title

This is configured as an error.

Running it locally

Everyone authoring, or reviewing, content can install Vale locally and run it with our specific styles. Doing so outputs a list of all the errors and warnings that Vale caught.

Example:

Output from running our review linting tool in a CLI. Shows examples of various errors and warnings.

As a content author this is great because it can help me fix things before sending the content off for review. I don't have to worry about the disappointment of having someone send a tutorial back with endless nit-picks over my failure to remember every last detail of our style guide.

As a content reviewer I get a good list of places to start looking for possible improvements, as well as feel confident I can spend more time focusing on substantive review rather than looking for incorrect use of Javascript vs. JavaScript.

Automating it with Circle CI

Screenshot of CircleCI integration in GitHub

Once we got an initial set of styles in place we were able to set up a CircleCI job that executes against each new pull request (the canonical version of all our content is stored in Git). The result is that at the bottom of every pull request you can see two checks: one for Vale rules, and one for Markdown formatting. If either detects an error it is revealed quickly and can be fixed.

When we run Vale in Circle CI we suppress all non-error suggestions. So it'll only mark a PR as failing if there's something objectively wrong. These are usually quick to fix.

Because we can switch a rule from warning to error by editing the configuration file we can trial new rules. We can also set up rules that are useful for us to have while reviewing but don't need to block a piece of content from being published.
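Our actual pipeline configuration isn't reproduced here, but a CircleCI job along these lines (the Docker image and content path are assumptions) is enough to run Vale in errors-only mode on every pull request:

```yaml
version: 2.1
jobs:
  vale:
    docker:
      # jdkato/vale is the community image for Vale; any image with the
      # vale binary on the PATH would work.
      - image: jdkato/vale:latest
    steps:
      - checkout
      - run: vale --minAlertLevel=error content/
workflows:
  lint:
    jobs:
      - vale
```

Because only errors fail the job, warning-level rules stay advisory and never block a merge.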

Recap

In order to ensure that content reviewers can spend their time focused on the substance of a tutorial and not on enforcing the style guide, we use Vale to help automate the process of content review. It's helped us have more meaningful conversations about the content, and has also reduced the animosity that can occur as the result of feeling like someone is being hypercritical of your work.

If you work with a style guide I highly recommend you check out Vale as a tool to help enforce it.

Oct 30 2019

Working with databases can be challenging and cumbersome. Luckily, there are useful Drupal modules that make complicated tasks easy (or even possible at all). Of course, the best way to get the results you want is to rely on Drupal development and DevOps services by Drudesk. In the meantime, we are sharing great modules for working with databases in Drupal with you.

Some challenges with databases in Drupal

When you are working with databases, you may need to:

  • store your data in a database that Drupal does not support by default (i.e. other than MySQL, PostgreSQL, and SQLite)
  • connect to multiple external databases
  • quickly back up, restore, or migrate your database between the environments
  • set up the database search

and much more.    
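As a concrete example of the second point, Drupal lets you declare additional database connections in settings.php under extra keys. The key name and credentials below are placeholders:

```php
<?php

// In settings.php: a second connection under a hypothetical "external"
// key, alongside the usual $databases['default'].
$databases['external']['default'] = [
  'driver'   => 'mysql',
  'database' => 'legacy_data',
  'username' => 'dbuser',
  'password' => 'secret',
  'host'     => 'db.example.com',
  'port'     => '3306',
];

// Module code can then request that connection explicitly:
// $connection = \Drupal\Core\Database\Database::getConnection('default', 'external');
```

The modules below build on mechanisms like this to connect Drupal to external and non-default databases.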

Modules for working with databases in Drupal

MongoDB

The MongoDB module allows you to store your Drupal data not in the “classic” SQL database but in MongoDB — a popular, document-based distributed database. 

The module offers website administrators a user-friendly interface with database logs. It provides fast database logging and integrates easily with external logging mechanisms (for example, the Elastic Stack). 

MongoDB takes the load off your SQL database server and does not require complicated settings or coding from site admins. It is also a joy for developers to work with, thanks to a Drupal-standard API, handy Drush and Drupal Console commands, a PSR-3 implementation, detailed documentation, and much more.

MongoDB Drupal module

Drupal driver for SQL Server and SQL Azure

Here is a Drupal driver for working with the database engines of Microsoft SQL Server — a famous relational database management system. The module supports:

  • SQL Server of version 2008 and newer
  • Azure SQL of version 12 and newer

The Drupal driver module is modern in everything, including the fact that it only supports PHP 7 or later, with older versions used at your own risk. The MSSQL PDO extension is compatible with Windows and Linux.

The module’s Drupal 8 version is fresh and new, with the latest update in September. The 8.2 branch of the SQL Server driver is tested via the AppVeyor service.

Drupal Driver for SQL Server and SQL Azure

Oracle Driver

Another popular relational database management system, Oracle, can become the primary backend of your Drupal website thanks to the Oracle Driver Drupal module. 

The module allows you to create Drupal nodes from Oracle tables, store your website’s files in Oracle, and more. 

It should be noted that the module is minimally maintained and its Drupal 8 version is a release candidate. However, it can be a nice example of a module connecting Drupal to a particular third-party database system.

Oracle Driver Drupal module

Search API Database Search

The drupal.org website itself uses the Search API Database Search module, which is the best “promo” possible. However, please note that the module is only ready for Drupal 7.

The module provides a backend for the Search API module known as a great foundation for creating the search setup on Drupal websites. Search API Database Search uses a classic database for data indexing. 

This is a simple and cheap option for websites that are not so large and/or do not need a super-powerful search backend such as the Solr engine.

Views Database Connector

It can be incredibly useful to be able to pull data from external databases and display it to your users as a Drupal view. It is possible with the Views Database Connector module.

The module gives Drupal Views full access to the external database tables defined in your Drupal installation’s settings. For effective data pulling, the module needs an information_schema table for MySQL or PostgreSQL, and sqlite_master for SQLite.

Thanks to this module, you get a new view type on the “Show” menu to select. All items generated by the module have a [VDC] prefix.

Views Database Connector Drupal module

Backup and Migrate

Everyone knows about the importance of regular backups. Restoring your Drupal database from a backup can become a rescue in critical situations. 

Luckily, the Backup and Migrate module backs up and restores databases, files, and code. The module downloads your database as a file or saves it on the server. It supports various compression types (zip, gzip, and bzip). 

It can back up all the database tables or the ones you select. Automatic scheduled backups are also supported.

Backup and Migrate Drupal module

Let us help you manage your databases in Drupal

The modules described above — and others — can be helpful for working with databases in Drupal. However, this area requires a technical background from the person who performs the tasks. 

Contact our Drupal development and support team for any help in managing your databases in Drupal. If this requires add-on modules, we will select and configure them for you, or create custom modules that will match your requirements more exactly. 

We want to make sure your databases in Drupal are perfect and serve you well.

Oct 30 2019

Automatic updates for Drupal core and modules have always been in high demand. Happily, at DrupalCon Amsterdam 2019, the Automatic Updates module was introduced. It is a great module, though still at the beginning of its journey, and automatic updates are a great feature for Drupal.

It has been introduced as a contrib module, but it will eventually move into Drupal core.

As you can see on its project page, it will be developed in two phases.

Phase 1 of the Automatic Updates Initiative

The first phase of Automatic Updates is separated into two Work Packages, generously sponsored by the European Commission:

Work Package A

  • Providing a JSON feed of PSAs from Drupal.org - Stable as of Alpha 1.
  • Consuming the PSA JSON feed as alerts in the Drupal admin interface - Stable as of Alpha 1.
  • Creating the extensible 'Update readiness check' system and implementing the first readiness checks - Stable as of Alpha 1.

Work Package B

  • Creating the 'quasi-patch' system which will generate the automatic update packages - Experimental as of Alpha 2.
  • Generating the 'quasi-patches' from Drupal.org - Experimental as of Alpha 2.
  • Ensuring the automatic update system can apply the 'quasi-patches' - Experimental as of Alpha 2.
  • Securing the automatic update packages from Drupal.org with a hashing and signing architecture - Experimental as of Alpha 2.

I will update this article with more details.

Additional link:  https://youtu.be/Apqd4ff0NRI?list=PLpeDXSh4nHjSZET8xL2RyK3_2WeXxyWkY&t=1968

Oct 30 2019

[This is an old post that I wrote for System Seed's blog and meant to put on my own too but it fell off my radar until now. It's also about Drupal 7, but the general principle still applies.]

Handling clients with more than one site involves lots of decisions. And yet, it can sometimes seem like ultimately all that doesn't matter a hill of beans to the end-user, the site visitor. They won't care whether you use Domain module, multi-site, separate sites with common codebase, and so on. Because most people don't notice what's in their URL bar. They want ease of login, and ease of navigation. That translates into things such as the single sign-on that drupal.org uses, and common menus and headers, and also site search: they don’t care that it’s actually sites search, plural, they just want to find stuff.

For the University of North Carolina, who have a network of sites running on a range of different platforms, a unified search system was a key way of giving visitors the experience of a cohesive whole. The hub site, an existing Drupal 7 installation, needed to provide search results from across the whole family of sites.

This presented a few challenges. Naturally, I turned to Apache Solr. Hitherto, I've always considered Solr to be some sort of black magic, from the way in which it requires its own separate server (http not good enough for you?) to the mysteries of its configuration (both Drupal modules that integrate with it require you to dump a bunch of configuration files into your Solr installation). But Solr excels at what it sets out to do, and the Drupal modules around it are now mature enough that things just work out of the box. Even better, Search API module allows you to plug in a different search back-end, so you can develop locally using Drupal's own database as your search provider, with the intention of plugging it all into Solr when you deploy to servers.

One possible setup would have been to have the various sites each send their data into Solr directly. However, with the Pantheon platform this didn't look to be possible: in order to achieve close integration between Drupal and Solr, Pantheon locks down your Solr instance.

That left talking to Solr via Drupal.

Search API lets you define different datasources for your search data, and comes with one for each entity type on your site. In a datasource handler class, you can define how the datasource gets a list of IDs of things to index, and how it gets the content. So writing a custom datasource was one possibility.

Enter the next problem: the external sites that needed to be indexed only exposed their content to us in one format: RSS. In theory, you could have a Search API datasource which pulls in data from an RSS feed. But then you need to write a SearchAPI datasource class which knows how to parse RSS and extract the fields from it.

That sounded like reinventing Feeds, so I turned to that to see what I could do with it. Feeds normally saves data into Drupal entities, but maybe (I thought) there was a way to have the data be passed into SearchAPI for indexing, by writing a custom Feeds plugin?

However, this revealed a funny problem of the sort that you don’t consider the existence of until you stumble on it: Feeds works on cron runs, pulling in data from a remote source and saving it into Drupal somehow. But SearchAPI also works on cron runs, pulling data in, usually entities. How do you get two processes to communicate when they both want to be the active participant?

With time pressing, I took the simple option: define a custom entity type for Feeds to put its data into and for Search API to read its data from. (I could have just used a node type, but then there would have been the ongoing burden of ensuring that type was excluded from any kind of interaction with normal nodes.)
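Defining such a bare-bones entity type in Drupal 7 is mostly a matter of hook_entity_info() plus a base table. A minimal sketch, assuming the Entity API module is available and using invented names ('external_item', mymodule):

```php
/**
 * Implements hook_entity_info().
 *
 * Sketch of the "bucket" entity type for imported feed items.
 */
function mymodule_entity_info() {
  return array(
    'external_item' => array(
      'label' => t('External feed item'),
      'controller class' => 'EntityAPIController',
      'base table' => 'external_item',
      'entity keys' => array(
        'id' => 'id',
        'label' => 'title',
      ),
      // No UI, no node integration: this type exists purely as a
      // cache that Feeds writes into and Search API reads out of.
      'fieldable' => FALSE,
    ),
  );
}
```

You would also need a Feeds processor that knows how to save into this type, for instance by subclassing FeedsProcessor.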

Essentially, this custom entity type acted like a bucket: Feeds dumps data in, Search API picks data out. As solutions go, not the most massively elegant, at first glance. But consider the alternative: if Search API had fetched from the RSS directly, re-indexing would have been a really lengthy process, and could have had performance consequences for the sites whose content was being slurped up. A sensible approach would then have been to implement some sort of caching on our server, either of the RSS feeds as files or of the processed RSS data. And suddenly our custom entity bucket doesn't look so inelegant after all: it's basically a cache that both Feeds and Search API can talk to easily.

There were a few pitfalls. With Search API, our search index needed to cover two entity types (nodes and the custom bucket entities), and while Search API on Drupal 7 allows this, its multiple-entity-type datasource handler had a few issues to iron out or learn to live with. The good news is that the Drupal 8 version of Search API has multi-entity-type search indexes at its core, rather than as a side feature: every index can handle multiple entity types, and there's no such thing as a datasource for a single entity type.

With Feeds, I found that not all of the configuration is exportable to Features for easy deployment. Everything about parsing the RSS feed into entities can be exported, except the actual feed URL, which is a separate piece of setup and isn't exportable. So I had to add a hook_update_N() implementation to take care of setting that up.
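For a standalone (non-node-attached) importer, that update hook can use the Feeds 7.x feeds_source() API. A sketch, with the importer ID and URL as placeholders rather than the article's real values:

```php
/**
 * Sketch of a hook_update_N() implementation that sets the source URL
 * for a standalone Feeds importer.
 */
function mymodule_update_7001() {
  // feeds_source() with no feed nid returns the standalone source.
  $source = feeds_source('external_items');
  $source->addConfig(array(
    'FeedsHTTPFetcher' => array(
      'source' => 'https://example.com/feed.rss',
    ),
  ));
  $source->save();
}
```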

The end result though was a site search that seamlessly returns results from multiple sites, allowing users to work with a network of disparate sites built on different technologies as if they were all the same thing. Which is what they were probably thinking they were all along anyway.
