Mar 16 2020

The World Wide Web was created more than 30 years ago — but it’s only in recent years that some fundamental practices are becoming standard. One of them is website accessibility.

When thinking about how to make a website accessible, you should know that some CMSs are accessibility-focused from the start. We mean Drupal, and we invite you on a tour of its built-in accessibility features and add-on accessibility modules.

Drupal accessibility: introductory notes

Accessibility (a11y for short) is in the “source code” of Drupal's philosophy, which is strongly focused on inclusion and strives to make websites accessible to all users regardless of disabilities (auditory impairments, visual impairments, difficulty holding a mouse, and so on).

The key accessibility standards by the World Wide Web Consortium (W3C) — WCAG and ATAG — have compliance levels from A to AAA. According to Mike Gifford, Drupal 8 Core Accessibility Maintainer, the community began implementing WCAG 2.0 AA compliance in Drupal 7. In Drupal 8, they also began incorporating elements of ATAG 2.0 AA and finding ways to comply with the new WCAG 2.1 AA.

The accessibility work has progressed enormously in Drupal 8 and addressed a number of very important issues. This makes the 8th version inherently accessible, which is among the top benefits of Drupal 8 that inspire many businesses to upgrade to Drupal 8 or hire a web development team to build a site from scratch.

The key built-in Drupal accessibility features

So how exactly does Drupal remove the barriers in users’ interaction with websites? What makes it friendly to screen readers and other assistive software? Let’s review this right now.

New accessible front-end theme Olivero

In addition to the accessible themes in Drupal 8, the new default front-end theme Olivero, created for Drupal 9, is an example of accessibility in everything — colors, contrasts, buttons, and so on. The theme is WCAG compliant, was created in cooperation with accessibility experts, and has been thoroughly tested based on accessibility feedback. It is named after Rachel Olivero, a programmer and well-known advocate of website accessibility. The theme is expected in the Drupal 9.1 core, so it makes sense to get ready for Drupal 9, which is forthcoming and promises an easy upgrade.

New accessible front-end theme Olivero for Drupal 9

Better semantics with HTML5 and WAI-ARIA

It’s easier in D8 to make the purpose and behavior of all web page sections and components clear to screen readers. This makes navigation more user-friendly.

  • Developers can use the semantic HTML5 elements that Drupal 8 is equipped with. The latest version of the markup language offers especially clean code and a high level of screen-reader friendliness.
  • When native HTML5 markup is not enough, WAI-ARIA attributes (roles, states, and properties) can be added to it. They will provide even more information to screen readers. WAI-ARIA is especially helpful with interactive UIs.
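
As a generic illustration of the pattern (hand-written sample markup, not output generated by Drupal core), a collapsible menu might combine an HTML5 landmark element with ARIA state attributes:

```html
<!-- HTML5 landmark element: announced as "navigation" by screen readers. -->
<nav aria-label="Main menu">
  <!-- ARIA state: tells assistive software whether the menu is expanded. -->
  <button aria-expanded="false" aria-controls="main-menu">Menu</button>
  <ul id="main-menu" hidden>
    <li><a href="/">Home</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</nav>
```

A script would flip aria-expanded to "true" when the menu opens, so the state change reaches screen-reader users as well as sighted ones.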

Required ALT text for images

Making images accessible to visually impaired audiences is vital. A huge role here belongs to the ALT text (aka alternative text). It describes what an image shows, so the image can be understood by screen readers.

ALT is required by default in D8, so it’s impossible to add an image without describing it — the image will refuse to save and will display a reminder. This is a great example of Drupal 8 a11y. The requirement can be overridden in the image field or CKEditor settings, but why would someone want to do so?

The defaults match the accessibility standards. It should also be noted that, just like other accessibility practices, ALT tags are very useful in terms of SEO.
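
In the rendered page, the requirement simply guarantees that every image carries a meaningful alt attribute (illustrative markup, not Drupal's exact output):

```html
<!-- Good: describes what the image shows, for screen readers and search engines. -->
<img src="/sites/default/files/team.jpg"
     alt="Five team members planning a sprint at a whiteboard">

<!-- An empty alt is only appropriate for purely decorative images:
     it tells screen readers to skip the image entirely. -->
<img src="/themes/custom/swirl.png" alt="">
```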

Required ALT text for images in Drupal 8

Accessible inline form errors

When users submit forms, they can potentially enter some information in the wrong way. Form errors are usually listed at the top of the form, with the wrongly filled fields highlighted in red.

However, visually impaired users may not see the highlights, and screen readers cannot help them relate the particular error message to the particular form field.

To solve this issue and improve form accessibility, the D8 core has the Inline Form Errors module. Thanks to it, errors are displayed next to specific fields so it’s easy to understand what needs to be fixed. The module is not enabled by default — you can choose to enable it.

Aural alerts

When changes that users should know about take place on the page, they may go unnoticed by screen readers. The Aural Alerts feature is what you need here. It allows you to inform screen readers about these changes with a message that they read out. This feature in the Drupal 8 core uses the JavaScript method Drupal.announce() and follows the WAI-ARIA practices.
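
To illustrate the idea, here is a hypothetical standalone sketch (not Drupal core's actual implementation, which writes to an aria-live region in the DOM): announcements are queued and flushed in one batch, so a screen reader is not interrupted mid-message.

```javascript
// Hypothetical sketch of an announce() helper in the spirit of Drupal.announce().
// Messages are queued; flush() combines them into a single live-region update,
// using aria-live="assertive" if any queued message requested it.
function createAnnouncer() {
  const queue = [];
  return {
    announce(text, priority = 'polite') {
      queue.push({ text, priority });
    },
    flush() {
      const assertive = queue.some((m) => m.priority === 'assertive');
      const message = queue.map((m) => m.text).join('\n');
      queue.length = 0; // Empty the queue after flushing.
      // In a browser, `message` would be written into an element with
      // aria-live set to the returned politeness level.
      return { message, live: assertive ? 'assertive' : 'polite' };
    },
  };
}
```

The real API takes the same two arguments, a message string and an optional priority.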

Controlled tabbing

Many people prefer using the Tab key on their keyboard instead of the mouse to move around the page (or this is their only option). Drupal 8 has a JavaScript feature called TabbingManager that allows you to guide these users through the important page elements in a logical order. With its help, you can control exactly where the user can tab to. It is another important accessibility feature.
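
Conceptually, a tabbing manager keeps a stack of constraints: while a constraint is active (say, a modal dialog is open), only the constrained elements are reachable by Tab. A simplified standalone sketch (not Drupal core's implementation, which works with real DOM elements and tabindex attributes):

```javascript
// Minimal sketch of the constrain/release idea behind a tabbing manager.
function createTabbingManager() {
  const stack = [];
  return {
    // Restrict tabbing to the given elements; returns a release function.
    constrain(elements) {
      stack.push(elements);
      return () => stack.pop();
    },
    // Elements the user can currently tab to.
    tabbable(allElements) {
      return stack.length ? stack[stack.length - 1] : allElements;
    },
  };
}
```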

Better contrasts

Sufficient contrast, both in headings and regular text, is crucial for people with low vision or other visual impairments. In Drupal 8, the core themes have improved contrasts. More good news: the Olivero theme follows the best design principles of high contrast, saturated colors, and negative space to draw the eye to the most important things.
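
For reference, WCAG 2.x defines contrast as a ratio of relative luminances, with 4.5:1 as the AA minimum for normal text. A minimal JavaScript sketch of the check:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.x definition.
function luminance([r, g, b]) {
  const c = [r, g, b].map((v) => {
    v /= 255;
    return v <= 0.03928 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * c[0] + 0.7152 * c[1] + 0.0722 * c[2];
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
// Ranges from 1:1 (identical) to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}
```

Light gray text on a white background fails the 4.5:1 AA threshold, which is exactly the kind of combination the improved core themes avoid.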

CSS Display Options

Drupal 8 introduces a set of CSS classes that allow you to control how your content is hidden in situations where that is needed. For example, content can be hidden from sighted users but stay visible to screen readers. These CSS classes are: (1) hidden, (2) visually-hidden, (3) visually-hidden focusable, and (4) invisible.
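
For reference, the widely used pattern behind a visually-hidden class looks roughly like this (a generic sketch of the technique; Drupal core's actual rules differ slightly):

```css
/* Hidden from sighted users, but still read out by screen readers. */
.visually-hidden {
  position: absolute !important;
  width: 1px;
  height: 1px;
  overflow: hidden;
  clip: rect(1px, 1px, 1px, 1px);
}

/* Becomes visible again when focused, e.g. for "skip to content" links. */
.visually-hidden.focusable:focus {
  position: static !important;
  width: auto;
  height: auto;
  overflow: visible;
  clip: auto;
}
```

By contrast, display: none (the hidden class) removes content from screen readers too, and visibility: hidden (invisible) hides it while preserving its layout space.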

Keyboard accessibility in the Media Library

Drupal 8 pays a lot of attention to keyboard accessibility for users who cannot hold a mouse or cannot see the screen properly. A brilliant example is keyboard accessibility in the Media Library, introduced in D8.8 after the Media Library’s full integration with CKEditor. Users can jump to the Library while creating or editing content, select media, and embed them — all with the keyboard only.

Keyboard accessibility in Drupal 8's Media Library

Contributed Drupal accessibility modules

There are plenty of useful add-on modules that are able to extend Drupal’s accessibility features even further.

Automatic Alternative Text

Screen readers cannot understand what is on an image unless you add ALT tags. Artificial intelligence software can help you with image recognition and automatic ALT generation. The Automatic Alternative Text module in D8 does this task using the Microsoft Azure Cognitive Services API. It generates image descriptions in a human-readable language. This is especially useful on sites with a large number of images.

Automatic Alternative Text Drupal module for accessibility

CKEditor Accessibility Checker

The CKEditor is the default WYSIWYG editor in the Drupal 8 core. To follow the best practices of accessible content publishing, you can install the CKEditor Accessibility Checker contributed module in Drupal 8. It uses the Accessibility Checker plugin, which inspects your content and helps you discover and resolve accessibility issues.

CKEditor Accessibility Checker Drupal module for accessibility

CKEditor Abbreviation

Making content easy to understand is part of the best web accessibility practices. The CKEditor Abbreviation module adds a button to CKEditor that lets you insert and edit abbreviations via a handy context menu on your D7 or D8 site.

CKEditor Abbreviation Drupal module for accessibility

Siteimprove

The Siteimprove module connects your Drupal 7 or Drupal 8 site to the Siteimprove intelligence platform. It analyzes the quality of your content, including accessibility issues, and gives you valuable improvement hints. The tool allows you to check your content pages and then recheck them when the discovered issues have been addressed to confirm that no further action is needed.

Siteimprove Drupal module for accessibility

Style Switcher

The Style Switcher module in D7 and D8 allows themers to provide alternate stylesheets that can be added directly in the admin dashboard. Users can then choose the style in which they want to view the page. The module provides this choice as a list of links in a block. It also uses cookies to serve returning users the stylesheet they once selected.
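
The cookie part of that idea can be sketched in a few lines of plain JavaScript (the cookie name and fallback here are illustrative, not the module's actual code):

```javascript
// Read the user's previously chosen stylesheet name from a cookie header
// string; fall back to the default style when no choice was saved.
function parseStyleCookie(cookieHeader, fallback = 'default') {
  const match = /(?:^|;\s*)styleSwitcher=([^;]+)/.exec(cookieHeader || '');
  return match ? decodeURIComponent(match[1]) : fallback;
}
```

In the browser, the returned name would be used to enable the matching alternate stylesheet link element.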

Style Switcher Drupal module for accessibility

High contrast

The Drupal 7 and Drupal 8 High contrast module allows your users to switch to the high-contrast version of the theme. They can do it by just pressing the Tab key once or twice, after which the "Toggle high contrast" link appears. When they click it or press Enter, they are in high contrast mode.

High contrast Drupal module for accessibility

Text Resize

Enable your users with low eyesight or other visual problems to easily adjust the text size to one that is comfortable for them. This is one of the key accessibility demands. The Text Resize module for D7 and D8 helps you here. It creates a block with two buttons to increase and decrease the font size. The block is built using jQuery and the jQuery Cookie plugin and is themeable.
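
The resizing logic itself boils down to stepping a font size up or down while clamping it between sensible limits. A hypothetical sketch (the limits and step are illustrative, not the module's settings):

```javascript
// Return the next font size in pixels after an increase/decrease click,
// clamped so text never becomes unreadably small or absurdly large.
function resizeText(current, direction, min = 12, max = 24, step = 2) {
  const next = current + (direction === 'increase' ? step : -step);
  return Math.min(max, Math.max(min, next));
}
```

The chosen size would then be applied to the page and persisted in a cookie for returning visitors.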

Text Resize Drupal module for accessibility

Text Size

Here is another module that takes care of text size adjustment. Text Size is a popular Drupal 7 module that serves a similar mission. It provides convenient text size changing options and a zoom feature. The module is able to resize variable media objects, pixel images, and vector images.

Text Size Drupal module for accessibility

Block ARIA Landmark Roles

Both Drupal 7 and Drupal 8 websites can enhance their use of the best WAI-ARIA practices for accessibility with the Block ARIA Landmark Roles module. It adds elements to your block configuration forms that enable you to assign an ARIA landmark role or label to each block.

Block ARIA Landmark Roles Drupal module for accessibility

htmLawed

Clean HTML is always more screen-reader friendly. The htmLawed Drupal module uses the htmLawed PHP library to make the HTML cleaner on your Drupal 7 or Drupal 8 site. The module works fast, is highly configurable, and covers all HTML elements.

htmLawed Drupal module for accessibility

HTML Purifier

Another module that takes care of clean HTML for web accessibility is HTML Purifier. It exists for Drupal 7 and 8. The module uses a standards-compliant HTML filter library — HTML Purifier. It removes malicious code and helps you make your HTML comply with the W3C's specifications.

Let us help you get the most out of Drupal accessibility

It’s time to make your website accessible to all users. Our web agency specialists are ready to help you if you want to:

  • install and configure Drupal accessibility modules
  • migrate your site to the more accessible Drupal 8 (and prepare it for Drupal 9)
  • move your site from another CMS to Drupal due to its accessibility features
  • comprehensively test your website accessibility
  • create any accessibility features that are needed for your site

Contact us and be accessible!

Mar 16 2020

Category 1: Web development

Government organizations want to modernize and build web applications that make it easier for constituents to access services and information. Vendors in this category might work on improving the functionality of search.mass.gov, creating benefits calculators using React, adding new React components to the Commonwealth’s design system, making changes to existing static sites, or building interactive data stories.

Category 2: Drupal

Mass.gov, the official website of the Commonwealth of Massachusetts, is a Drupal 8 site that links hundreds of thousands of weekly visitors to key information, services, and other transactional applications. You’ll develop modules to enhance and stabilize the site; build out major new features; and iterate on content types so that content authors can more easily create innovative, constituent-centered services.

Category 3: Data architecture and engineering

State organizations need access to large amounts of data that’s been prepared and cleaned for decision-makers and analysts. You’ll take in data from web APIs and government organizations, move and transform it to meet agency requirements using technology such as Airflow and SQL, and store and manage it in PostgreSQL databases. Your work will be integral in helping agencies access and use data in their decision making.

Category 4: Data analytics

Increasingly, Commonwealth agencies are using data to inform their decisions and processes. You’ll analyze data with languages such as Python and R, visualize it for stakeholders in business intelligence tools like Tableau, and present your findings in reports for both technical and non-technical audiences. You’ll also contribute to the state’s use of web analytics to improve online applications and develop new performance metrics.

Category 5: Design, research, and content strategy

Government services can be complex, but we have a vision for making access to those services as easy as possible. Bidders for this category may work with partner agencies to envision improvements to digital services using journey mapping, user research, and design prototyping; reshape complex information architecture; help transform technical language into clear, public-facing content; and translate constituent feedback into new and improved website and service designs.

Category 6: Operations

You’ll monitor the system health for our existing digital tools to maintain uptime and minimize time-to-recovery. Your DevOps work will also create automated tests and alerts so that technical interventions can happen before issues disrupt constituents and agencies. You’ll also provide expert site reliability engineering advice for keeping sites maintainable and building new infrastructure. Examples of applications you’ll work on include Mass.gov, search.mass.gov, our analytics dashboarding platform, and our logging tool.

Mar 15 2020

Very recently I relaunched this blog using Gatsby.js, which is in the category of static page generators. Having comments on a static webpage is a common requirement, and a popular way to do so is to use a third party service, like Disqus.

I have used Disqus on my blog for a long, long time. The first time I went from using Drupal comments to using Disqus was when I migrated to Drupal 8 (in July 2014, while Drupal 8 was in an alpha version). I could be mentioning this to sound very distinguished, but the main reason I am mentioning it is this: Back then, for security reasons, I was running Drupal 8 more or less like a static page, since I explicitly disallowed PHP sessions. This meant it was not possible to use the Drupal comment system. After migrating to Gatsby.js it was only natural for me to keep using Disqus, since all the comments were already there, and it still works with the "static site builder" Gatsby.js.

What changed?

There are several trade-offs with using a third-party solution like Disqus. One of them is privacy, and another is not owning your own data. For example, you have no logs. So while I have comments on my blog, and many posts have comments on them, I do not have any insight into how many people are trying to add comments but failing. This was the case on my blog post on Geography in web page performance. Shortly after I published it, I got an email from a fellow Drupal community member, ressa. The email was related to the blog post, so I asked them "Why did you not submit a comment?". After all, questions and answers could potentially benefit others. Well, the reason was that they tried to comment, but had to give up after fighting captcha pictures from Disqus for several minutes. And although I had thought of it before, this was the final push to look into Disqus alternatives.

Requirements

Keeping the comments and not starting from scratch

The first requirement was that I should be able to keep all of the comments from Disqus, even when using a new platform. This can either be done by migrating the comment data to a new platform, or by "archiving" Disqus comments and embedding them in my pages. With that thought process in place, let's go ahead and look at technical solutions.

Technical options and their consequences

My preferred option would be to use something open source, but not host it myself. I tried out Commento, which is open source with a hosted version. To be honest, it seems like a very good option, but it is not free (free as in $0). I could host it myself for free, but that would require me to maintain the server for it. They also provide an import of Disqus comments, which would satisfy my requirement to keep the existing comments. In the end, I decided to not go with this since I had to either pay for it or host myself.

Since self-hosting was out of the picture I then went in another rather radical direction: Commenting through Drupal.org(!). Drupal.org provides an API, so in theory that should for sure work. Instead of repeating it here, I will now just post a quote from my email to the Drupal infra team asking for permission:

I want to replace disqus as comment provider on my blog, so I had this idea of letting every blog post be an issue on a project on d o, and you would comment there to get it to show up as a comment on the blog post. This way I would get account creation, spam prevention and email alerts "for free". I could theoretically also incentivize commenting by handing out issue credits for valuable comments (which in my opinion serves as community contribution anyway).

So there you have it. I would get loads of stuff "for free". This seems like a great deal for me. So I guess I am asking kindly for permission technically to do this, as well as asking if that would be an OK use of issue credits?

As you probably can see from the quote, I felt I had a fantastic idea for myself, but realized that this might not be the intended use of Drupal.org. After a great conversation with Tim Lehnen (CTO at the Drupal Association) we both agreed that this is a bit outside the intended use of a community project, especially considering that members and supporting partners would be paying for parts of my blog infrastructure. Although this was not the option I went for, the option would make it possible to not self-host. And Drupal.org is open source. However I would not own my own data (which technically would be the same as Disqus). I also would not be able to import the comment history into the new platform.

Now that I was already down the road of using issue comments, my next step was to research Github as the comment hosting provider. Also here I would get account creation, spam prevention and email alerts "for free". In addition one can lock a thread for more comments.

My initial research led me to many other people doing the same thing. So the idea is for sure not new. There is even a service wrapping the whole thing called utteranc.es (adding embedded Github login and auto creation of issues when configured). Again, this would mean not owning my own data. And not being able to import comments into a new platform.

Reflections on choices available

Thinking about how your comments are stored is an interesting exercise. Ideally, I would want them stored where I own them. I would want to control the method of storing them (for example using an open source CMS). I would also want to control the policy and thinking behind privacy and moderation. If I saved comments with a third-party provider, it would be able to delete all of my comments if it somehow disagreed with me. Or it could compromise the privacy of users wanting to comment on my blog in an intrusive or abusive way. These are real questions to ask.

Even with that in the back of my mind, I ended up moving to comments via Github. The convenience of storage, spam prevention and email alerts trumped the need for freedom, at least for now. To be fair, Github already hosts the code for this blog, and runs the deployments of the blog, so the trust is already there. I did however do a couple of twists to make things smoother, safer and more SEO friendly.

Short technical walkthrough

I promise I will write a longer blog post on this (with code examples), but the current set-up looks something like this:

I wanted to keep my Disqus comments, but as mentioned, I can not import Disqus into github issue comments. But, Disqus provides a dump of all my data. Fortunately there is also a Gatsby source plugin for this format, called gatsby-source-disqus-xml. Using this approach has the added bonus of Disqus comments now being printed on all pages (instead of loaded with JavaScript), so they are now indexable and searchable through Google! Quick SEO win!

I wanted the new comment system to be transparently appended to archived comments. So I also import issue comments for blog posts using the same method. Basically that involves writing a quick source plugin using Gatsby's Node APIs.
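
A sketch of the normalization such a plugin might do: mapping a Disqus-export comment and a GitHub issue comment (REST API shape) onto one common structure, so downstream rendering doesn't care where a comment came from. Field names on the Disqus side are illustrative, not the plugin's exact output.

```javascript
// Normalize comments from two sources into one shape.
function normalizeComment(raw, source) {
  if (source === 'disqus') {
    // Illustrative shape of a parsed Disqus XML export entry.
    return {
      author: raw.author.name,
      body: raw.message,
      createdAt: raw.createdAt,
      source: 'disqus',
    };
  }
  // GitHub issue comment, as returned by the GitHub REST API.
  return {
    author: raw.user.login,
    body: raw.body,
    createdAt: raw.created_at,
    source: 'github',
  };
}
```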

Now, comment data is more or less the same no matter where it first appeared (Github or Disqus), so I can use the same React component to render imported comments from Disqus alongside the new (dynamic) comments from Github. And so now it can look like this:

Disqus and Github side by side

I also wanted comments to feel real-time. Since they are rendered through building the codebase, I would not want to wait for a full build when people are posting comments. They should be available right away! But since Gatsby has a client-side layer as well, I can update the state of the blog post client side. To make this philosophy a bit more quote friendly:  "it's built eventually, but visible immediately".
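
One way to sketch that "built eventually, but visible immediately" idea: merge the comments baked in at build time with comments fetched client-side, deduplicating by id so a comment that has since made it into a build is not shown twice. This is a hypothetical sketch, not my actual code.

```javascript
// Combine build-time comments with freshly fetched ones, preferring the
// built-in copy when both contain the same comment id.
function mergeComments(builtIn, fetched) {
  const seen = new Set(builtIn.map((c) => c.id));
  return builtIn.concat(fetched.filter((c) => !seen.has(c.id)));
}
```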

Expanding on this, I also plan to be able to have a copy of the comments stored somewhere else than Github. But that sounds like another blog post, and this journey has already made the blog post long enough. Let's instead finish off with an animated GIF of the comment system in action. And of course, if you want to test it yourself, feel free to comment on this post!

Mar 14 2020

With Drupal 9 all set for release in June 2020 and Drupal 7 & Drupal 8 reaching their end of life in 2021, the crucial question that pops up is whether you should wait and upgrade to Drupal 9 or start moving to Drupal 8 now.

Whatever the case, enterprises must start figuring out their approach to becoming Drupal 9 ready instead of lingering in the “why should we be upgrading to Drupal 9 after all?” phase.

This blog will answer all your questions and provide you with a way to keep up with the latest version of Drupal.

Reasons To Migrate To Drupal 9

With the release of Drupal 9 nearing, maintainers will put less and less focus on providing resources for Drupal 7. Simply stated, a lot of contributed modules and themes will be left unmaintained, and freshly exposed security loopholes will directly result in the withdrawal of downloadable releases of these projects.

Drupal logo showcasing upgrade (Source: IndiaMART)

 

Here are a few questions that you need to ask yourself before considering migration to D9:

 

1.   Does your business have seasonality?

For the uninitiated, seasonality means that your business sees predictable peaks and troughs over the year (holiday shopping, tax season, academic enrollment periods, and so on), with website traffic and revenue following those cycles.

Now, if your business is seasonal, it is probably ideal for you to wait for the release of Drupal 9. This way, you can pay complete attention to your seasonal peaks and still have enough time to migrate to D9 when it launches.

If you have been planning to undertake a project in 2020, you should postpone it by six months to start in Q3 2020 and switch directly to D9. Roll out a plan elucidating how to handle site refinements in D9 so you avoid modules that never got D8 support. With this, you can skip the repeated hassle and move directly to a new platform in a single stroke.

 

2.   Are you facing challenges in D7?

Drupal 7 still has official support, so there are several agencies and system integration partners who are proficient in working on the platform. However, there is undoubtedly a multitude of functionalities and improvements that make the move to version 8 rewarding. So, if you are facing any challenges with Drupal 7, you should consider moving to D8 immediately.

One of the real problems that D7 sites have is a confusing editorial experience. For example, the lack of a built-in WYSIWYG editor made it difficult for editors to preview content before pushing it live. Thus, instead of empowering teams, it increased their efforts.

On the other hand, D8 encompasses key improvements for layout management, including many other techniques to manage pages.

Migration to Drupal 8 will serve you better now and later.

 

3.   Features that you are missing out in Drupal 7

Drupal 8 is altogether a new CMS that comprises quintessential features for everyone on board in the company.

So, if you are using D7 and missing out on some really important features, then you should definitely plan an upgrade to D8 or D9.

Besides, Drupal 9, as mentioned, is going to be quite similar to D8, except with deprecated code removed and third-party dependencies updated, thereby offering similar features:

 

  1. Configuration management allows you to deploy configuration between environments
  2. Extended security coverage from one month after the next minor release to six months.
  3. HTML 5 changes Drupal’s default markup to meet HTML5 standards, including new semantic elements.
  4. The layout feature allows site builders to build pages intuitively, change layouts and add & rearrange blocks with real-time preview
  5. Mobile-centric for a wonderful experience on mobile devices
  6. Multilingual capabilities incorporated into Drupal core
  7. Allows content authors and site builders to implement basic settings from the front-end of their site
  8. PHPUnit converts all legacy SimpleTest tests to the standard PHP testing framework
  9. Spark improves Drupal 8’s experience for content editors through improvements such as WYSIWYG editing, a mobile-friendly toolbar, and in-place editing

Which Approach Should Be Taken to Migrate to Drupal 9?

 

1.   Migrate from Drupal 7 to 9, dropping D8

 

Drupal 7 is durable and has enough shelf life as of now to keep your site up and running through the community’s constant support.

So, unless there's a specific module that only Drupal 8 can offer, you can feel assured that your Drupal 7 site will remain active until its end-of-life.

In addition, staying on D7 can give you some time to secure funding, and get all the stakeholders on the same page for the upcoming upgrade.

However, it isn’t that easy. Postponing the upgrade by skipping D8 won’t make the problems fade away; in fact, Drupal 9 will still require the same level of rework and investment.

And in the meantime, updates to Drupal 7 will continue, such as requiring a more up-to-date version of PHP. These maintenance costs will come in addition to the Drupal 9 rebuild.

Still, jumping straight from Drupal 7 to Drupal 9 gives you room to operate.

 

2.   Upgrade from Drupal 7 to 8 to 9

Another option that can be considered is following the “straight line”, i.e., moving from Drupal 7 to 8 to 9, instead of trying to skip version 8 completely.

Also, it is paramount to take a unique feature of D9 into account: it is designed to be backward compatible. This means that, unlike going from Drupal 7 to Drupal 8, it is much easier to go from Drupal 8 to Drupal 9.

Enterprises are already building Drupal 8 sites that will be Drupal 9 compatible by eliminating deprecated APIs and constantly running tests to verify. These sites will be upgradable like a regular quarterly update.

Thus, you can upgrade your D7 modules to D8 with the help of modules such as Drupal Module Upgrader, migrate the website content and code to Drupal 8 by checking for availability of modules in Drupal 8 through the Upgrade Status module, and finally upgrade to Drupal 9.

Upgrading from Drupal 8’s latest version to Drupal 9 is straightforward.

How Should You Start?

Although Drupal 9 hasn’t embarked on its journey yet, the time to administer and refine your site has come. It’s recommended to take an incremental approach instead of focusing on a wholesale rebuild. Here are some factors that you should consider as you move further into 2020:

 

  • Analyze website strategy

Presuming that your site was built (or redesigned) no more than 5 years ago, keeping the business goals and current business strategy in mind is crucial. Have your goals shifted? Does your site still help you achieve your ultimate objective?

Redefine your strategy to incorporate changes and align them on the right path for success.

 

  • Audit content

When managing enormous amounts of content on the site, especially with multiple authors and editors, the line of governance gets blurry. Make sure that you keep archiving or deleting unnecessary content on a timely basis. Evaluate it against your authoritative voice and well-defined strategy.

 

  • Evaluate SEO

SEO written on a grey background

Apart from keeping track of keywords, ensure that your content is mobile-friendly, your URL structures are meaningful, and existing schemas are properly used to illustrate the content of a page.

 

  • Code Quality

Auditing code should ascertain that:

  1. Code standards are being met as outlined by Drupal
  2. Code is well-structured and easy to extend
  3. Proper documentation exists
  4. Code is reusable as much as possible for future projects

 

  • Optimize user experience

Check if the user experience and flow are making sense besides running a usability test on your interactive features. Use Google Analytics to see where your users are clicking and scrolling, and then tweak accordingly.

 

  • Active maintenance:

Make sure that your contributed modules are actively maintained so that they keep working accurately, and if you find it necessary to replace a module, choose a replacement with an upgrade path.

 

  • Assess New Features: 

Examine new features carefully while being mindful of the scope of your upcoming rebuild. Figure out whether each one can wait or is an urgent necessity.

So, When Should I Upgrade?

Enterprises should start planning out their upgrade to Drupal 9 without any further delays. It will be similar to the final D8 release, however, with deprecated code removed and third-party dependencies updated. Upgrading to D8 will eventually make it easier to hop on to D9.

 

Summing Up

Confused about which approach to take? Beginners should evaluate whether an upgrade benefits them in the immediate term. They should gather more information about Drupal 8, audit their site with our website checklist, and, if they still don’t feel sure, contact us!
We offer Drupal 7 & 8 support and can help you work out a strategy for an upgrade from Drupal 7 to Drupal 8 and Drupal 9.

Whichever way you pick to upgrade to D9, we’ve got your back!

Mar 13 2020

tl;dr: Docker's default bind mount performance for projects requiring lots of I/O on macOS is abysmal. It's acceptable (but still very slow) if you use the cached or delegated option. But it's actually fairly performant using the barely-documented NFS option!

Ever since Docker for Mac was released, shared volume performance has been a major pain point. It was painfully slow, and the community finally got a cached mode that offered a 20-30x speedup for common disk access patterns around 2017. Since then, the File system performance improvements issue has been a common place to gripe about the lack of improvements to the underlying osxfs filesystem.

Since around 2016, support has been around (albeit barely documented) for NFS volumes in Docker (see Docker local volume driver-specific options).

As part of my site migration project, I've been testing different local development environments, and as a subset of that testing, I decided to test different volume/sync options for Docker to see what's the fastest—and easiest—to configure and maintain.

Before I drone on any further, here are some benchmarks:

Time to install Drupal 8 - different Docker volume sync methods

Time to first Drupal 8 page load - different Docker volume sync methods

Benchmarks explained

The first benchmark installs Drupal, using the JeffGeerling.com codebase. The operation requires loading thousands of code files from the shared volume, writes a number of files back to the filesystem (code, generated templates, and some media assets), and does a decent amount of database work. The database is stored on a separate Docker volume, and not shared, so it is plenty fast on its own (and doesn't affect the results).

The second benchmark loads the home page (/) immediately after the installation; this page load is entirely uncached, so Drupal again reads all the thousands of files from the filesystem and loads them into PHP's opcache, then finishes its operations.

Both benchmarks were run four times, and nothing else was open on my 2016 MacBook Pro while running the benchmarks.

Using the different sync methods

NFS

To use NFS, I had to do the following (note: this was on macOS Catalina—other systems and macOS major versions may require modifications):

I edited my Mac's NFS exports file (which was initially empty):

sudo nano /etc/exports

I added the following line (to allow sharing any directories in the Home directory—under older macOS versions, this would be /Users instead):

/System/Volumes/Data -alldirs -mapall=501:20 localhost

(When I saved the file macOS popped a permissions prompt which I had to accept to allow Terminal access to write to this file.)

I also edited my NFS config file:

sudo nano /etc/nfs.conf

I added the following line (to tell the NFS daemon to allow connections from any port—this is required otherwise Docker's NFS connections may be blocked):

nfs.server.mount.require_resv_port = 0

Then I restarted nfsd so the changes would take effect:

sudo nfsd restart

Then, to make sure my Docker Compose service could use an NFS-mounted volume, I added the following to my docker-compose.yml:

---
version: '3'

services:
  drupal:
    [...]
    volumes:
      - 'nfsmount:/var/www/html'

volumes:
  nfsmount:
    driver: local
    driver_opts:
      type: nfs
      o: addr=host.docker.internal,rw,nolock,hard,nointr,nfsvers=3
      device: ":${PWD}"

Note that I have my project in ~/Sites, which is covered under the /System/Volumes/Data umbrella... for older macOS versions you would use /Users instead, and for locations outside of your home directory, you have to grant 'Full Disk Access' in the privacy system preference pane to nfsd.

Some of this info I picked up from this gist and its comments, especially the comment from egobude about the changes required for Catalina.

So, for NFS, there are a few annoying steps, like having to manually add an entry to your /etc/exports, modify the NFS configuration, and restart NFS. But at least on macOS, everything is built-in, and you don't have to install anything extra, or run any extra containers to be able to get the performance benefit.

docker-sync.io

docker-sync is a Ruby gem (installed via gem install docker-sync) which requires an additional configuration file (docker-sync.yml) alongside your docker-compose.yml file, and which requires you to start docker-sync prior to starting your docker-compose setup. It also ships with extra wrapper functions that can do it all for you, but overall, it felt a bit annoying to have to manage a second tool on top of Docker itself in order to get syncing working.

It also took almost two minutes (with CPU at full bore) the first time I started the environment for an initial sync of all my local files into the volume docker-sync created that was mounted into my Drupal container.

It was faster for most operations (sometimes by 2x) than NFS (which was 2-3x faster than :cached/:delegated), but for some reason the initial Drupal install was actually a few seconds slower than NFS. Not sure the reason, but might have to do with the way unison sync works.

docker bg-sync

bg-sync is a container that syncs files between two directories. For my Drupal site, since there are almost 40,000 files (I know... that's Drupal for you), I had to give this container privileged access (which I'm leery of doing in general, even though I trust bg-sync's maintainer).

It works with a volume shared from your Mac to it, then it syncs the data from there into your destination container using a separate (faster) local volume. The configuration is a little clunky (IMO), and requires some differences between Compose v2 and v3 formats, but it felt a little cleaner to manage than docker-sync, because I didn't have to install a rubygem and start a separate process—instead, all the configuration is managed inside my docker-compose.yml file.

bg-sync offered around the same performance as docker-sync (they both use the same unison-based sync method, so that's not a surprise), though for some reason, the initial sync took closer to three minutes, which was a bit annoying.

Summary

I wanted to write this post after spending a few hours testing all these different volume mount and sync tools, because so many of the guides I've found online are either written for older macOS versions or are otherwise unreliable.

In the end, I've decided to stick to using an NFS volume for my personal project, because it offers nearly native performance (certainly a major improvement over the Docker for Mac osxfs filesystem), is not difficult to configure, and doesn't require any extra utilities or major configuration changes in my project.

What about Linux?

I'm glad you asked! I use the exact same Docker Compose config for Linux—all the NFS configuration is stored in a docker-compose.override.yml file I use for my Mac. For Linux, since normal bind mounts offer native performance already (Docker for Linux doesn't use a slow translation layer like osxfs on macOS), I have a separate docker-compose.override.yml file which configures a standard shared volume.
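As a sketch of that setup (the file names follow Docker Compose's override convention; the paths are assumptions about a typical project layout), the Linux override can be a plain bind mount:

```yaml
# docker-compose.override.yml (Linux): a standard bind mount is already
# fast here, so no NFS volume is needed.
version: '3'

services:
  drupal:
    volumes:
      - './:/var/www/html'
```

The Mac counterpart carries the NFS volume definition shown earlier, so the base docker-compose.yml stays identical on both platforms.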

And in production, I bake my complete Docker image (with the codebase inside the image)—I don't use a shared volume at all.

Mar 13 2020
Mar 13

The Drupal 9 release date has been pinned to June 3, 2020, and it's coming up super fast. What does that mean for your site?

First of all, don't panic. Drupal 7 and 8 end of life is scheduled for November 2021, so there is plenty of time to upgrade. However, it is always good to plan ahead and take advantage of the new features and security releases that come with the new version.

If you are on D7

Moving to Drupal 9 will be very similar to moving to Drupal 8, and in fact, there is no reason to wait: the recommendation is to move to D8 as soon as possible, incorporating the tools described in the next section to search for possible incompatibilities.

Coming from D7, the greatest challenge might be the availability (or not) of the modules you already have installed, and finding and implementing replacements wherever needed. Take this as a chance to audit your site and plan a migration with a new architecture that fits your needs.

Also try out the Upgrade Status module, as it "checks the list of projects you have installed and shows their availability for newer versions of Drupal core".

If you are on Drupal 8

At its core, Drupal 9 will be the same as the latest release of Drupal 8, minus the deprecated components removed, and third party dependencies updated. This means that an upgrade from D8 should be fairly easy as it only involves making sure your codebase isn't making use of deprecated code.
Checking your site for readiness is easy using the mglaman/drupal-check utility, a simple CLI tool that generates a report of deprecation errors. Install it as a development package on your site and use:

# Install:
composer require mglaman/drupal-check --dev

# Run on a directory:
drupal-check web/modules
Drupal Check CLI tool example

Some things to keep in mind while checking deprecation notices:

  • Update all modules to the latest development version, to ensure testing against the latest code.
  • Don't just check the contrib modules, run it against themes, profiles and custom code.
  • If your project has continuous integration, aim to incorporate this tool into the workflow to verify readiness and avoid regressions.
  • Don't run this tool in a production environment :)
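To sketch the continuous integration suggestion above, a hypothetical GitLab CI job could look like the following (the job name, image, and directory paths are my own assumptions, not from the original post):

```yaml
# .gitlab-ci.yml: fail the pipeline when deprecated API usage is detected.
deprecation-check:
  image: composer:1
  script:
    - composer install --no-progress
    - vendor/bin/drupal-check web/modules/custom web/themes/custom
```

Running this on every merge request catches new deprecated calls before they land, which is the regression guard the list item describes.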

If CLI tools aren't your thing, or you'd like a nicer UI to show to project managers and clients, the Upgrade Status module provides a nice dashboard with a summary and detailed information for each module of your site. It uses drupal-check as its underlying tool.

Upgrade Status module report

Fixing deprecations

Now that you've got a report of deprecated code usage, it's time to fix it. The deprecation notices should clearly state what is deprecated and suggest changes. I also like to look at the source code of the deprecated function to see what Drupal core uses inside it, as it shows unequivocally what needs to be done.

Let's take an example:

Call to deprecated constant REQUEST_TIME: Deprecated in drupal:8.3.0 and is removed from drupal:9.0.0. Use \Drupal::time()->getRequestTime();

Fixing the error can be as simple as replacing REQUEST_TIME with:

\Drupal::time()->getRequestTime()

In fact, a tool called drupal-rector is under development to help automate this process. A handy list of deprecation fixes can be found in the drupal-check documentation as well.

However be aware that \Drupal calls should be avoided in classes whenever possible, and dependency injection used instead. So for the example above, if REQUEST_TIME was used inside a service class, we'd inject the 'datetime.time' service into it (the service returned by \Drupal::time()) and then call getRequestTime() on it. For more in-depth information on how to call services using dependency injection, read Accessing services from the drupal.org docs.
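To illustrate the injection pattern described above, here is a minimal sketch (the module, class, and method names are placeholders; only 'datetime.time' and its interface come from Drupal core):

```php
<?php

namespace Drupal\my_module;

use Drupal\Component\Datetime\TimeInterface;

/**
 * Example service that needs the request time.
 */
class MyService {

  /**
   * The injected 'datetime.time' service.
   *
   * @var \Drupal\Component\Datetime\TimeInterface
   */
  protected $time;

  public function __construct(TimeInterface $time) {
    $this->time = $time;
  }

  public function doSomething() {
    // Replaces the deprecated REQUEST_TIME constant.
    $timestamp = $this->time->getRequestTime();
    // ...
  }

}
```

With a matching entry in my_module.services.yml wiring in the core service:

```yaml
services:
  my_module.my_service:
    class: Drupal\my_module\MyService
    arguments: ['@datetime.time']
```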

Mark your modules as D9 ready

If you have fixed all deprecation notices, and are a module maintainer or have custom modules in your site, mark them as compatible with Drupal 8 and 9 in the info.yml file:

name: My Module
type: module
core_version_requirement: ^8.8 || ^9

A note on third party dependencies

Drupal 9 will have its third-party dependencies updated, most notably the Symfony 4.4 components. Be sure to test your site in a D9 beta using these dependencies to avoid potential conflicts when D9 is released. Make sure you are running with the recommended dependency versions by using the 9.0.x branch of drupal/core-recommended.

Finally, if you are starting a new build

Start with the latest D8 release! As mentioned, Drupal 9 is D8 at its core, so it is safe to start development with Drupal 8 and wait for the release date to upgrade. Just keep an eye on module deprecation notices using the suggested tools from above.

Mar 13 2020
Mar 13

Read our roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community. You can also review the Drupal project roadmap.

Project News

Drupal 9 beta is closer than ever!

At the time of writing this post, there are fewer than three beta-blockers for Drupal 9. This hopefully means that we'll be seeing a beta release of Drupal 9 very soon. 

What does this mean for you?

Now's the time to get familiar with what's coming in Drupal 9, and to check your contributed or custom modules to see if you're ready to go. The community has put together a number of tools that you can use: the upgrade status module, the Drupal Check command line tool, and Drupal Rector.

We also need your help! We're looking for more individuals and organizations to participate in the Drupal Beta Test program. It's a great way to contribute to Drupal.

Call for Sponsors & Contributors: Automatic Updates

We're really proud of the work we accomplished in the first phase of the automatic updates initiative; in Drupal 7 and Drupal 8, sites that don't depend on Composer workflows now have complete support for securely and automatically updating Drupal Core. In the second phase of this work we want to extend that support to contributed projects, and to support Composer-based site installations. 

We need your help to make the second phase happen. Will you contribute?

Learn more on our call for sponsors & contributors post.

Drupal.org Updates

DrupalCon Minneapolis Program Update

In preparation for releasing the full DrupalCon Minneapolis speaker schedule, we've made some updates to the accepted sessions page. 

The newly redesigned page now highlights our excellent keynote speakers (including Mitchell Baker from Mozilla!) as well as other featured speakers for this year's event. On top of that, you can filter the list of sessions by track, to get a jumpstart on finding your favorite sessions before the full schedule is released. 

Ready to enable Semantic Versioning for Contributed Projects

We've rearchitected the version management for contributed projects, so that they can begin using Semantic Versioning as we enter the Drupal 9 era. You can see an example of this in practice on this sample project: semver_example. 

We're coordinating with the Drupal core maintainers to select a window for enabling the new semver functionality across all projects. We want to ensure that Drupal end-users will still be able to find and easily understand which projects they can use once projects are able to be compatible with both D8 and D9, and are using semver version numbering. 

Not familiar with semantic versioning?

The three digit numbering scheme (MAJOR.MINOR.PATCH) is designed to provide guide rails around API breaking changes. In Drupal core for example, patch releases are incremented whenever there are bug fixes or security releases. Minor releases indicate that new features have been introduced. And the Major version only changes when deprecated APIs are removed and fundamental architectural changes have been introduced.  Contributed project maintainers are encouraged to adopt the same pattern.

Updated display of releases

Speaking of releases - we've recently updated the display of releases to provide a cleaner view of release metadata. This should make it much easier to understand the history of recent releases, and to see at a glance which ones were bug fixes vs. feature releases vs. security releases. 

New Release Meta Data

You can see a detailed example by looking at the release history for Drupal core.

Drupal usage stats by branch

Because of the six-month minor release cycle, it's become much more important to have more granular insight into what minor versions of Drupal are in use in the wild. 

Usage stats by branch

As you can see above, we've updated the usage stats for Drupal to display usage by branch. This is mostly useful for Drupal Core, but may be valuable for contrib maintainers as well as they look to understand which versions of their projects are in highest demand. 

Coming soon: An updated UX for project browsing

With the release of Drupal 9, it will be possible for contributed projects to be compatible with both major versions of Drupal. Perhaps more interestingly, because of the release of new features with minor versions, there are some projects that may only be compatible with a certain range of minor versions (e.g: 8.6.x - 9.2.x). 

This is a powerful improvement in ensuring that key modules are ready to use with Drupal 9 on day one, but it also has the potential to be confusing for Drupal site owners and evaluators who are trying to discover what projects they can use. We're looking to update the project browsing on Drupal.org to make sure discoverability doesn't suffer with this change. If you have good ideas about this user experience, please feel free to share them on the issue!

Drupal 9 Readiness

Packaging enhancements

Beginning with Drupal 8.8.0, Drupal needed to be packaged from the output of Composer create project, rather than as the simple contents of a git clone. These changes to packaging have additional ramifications for how we manage tagged releases for Drupal core, and in particular for how we manage security releases. We've been making a variety of updates to the Packaging pipeline since Drupal 8.8 to make the process more transparent, resilient, and performant, and that work continues. 

DrupalCI

DrupalCI: Support for new Postgres environments

Because minimum requirements are changing with Drupal 9, we've added new test environments for both Postgres 10 and Postgres 12.

DrupalCI: Updated SQLite version

SQLite has also been updated within the DrupalCI test environment to version 3.26, so that testing runs on a correctly supported version. 

DrupalCI: Support for MariaDB environments

MariaDB forked from MySQL after the acquisition by Oracle, but at first remained fairly consistent with it. However, recent versions of MariaDB have diverged, so we are now providing explicit testing support for MariaDB, with test environments for versions 10.2.7 and 10.3.22. 

———

As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular, we want to thank: 

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Mar 13 2020
Mar 13

There’s bad data everywhere, but nowhere is it more insidious than when you encounter it for the first time while trying to migrate a website. Bad data, in this case, is defined as any data that doesn’t conform to the expectations of the migrator and their code, whether that’s because of bad data entry, surprising contextual requirements, or historical changes in value patterns. It can come at you from a variety of directions and can have surprising origins. Discovering your bad data early, and the reasons for it, can point to issues that may exist in other parts of the migration and can also give you clues on how to effectively deal with it.

Forewarned is forearmed, so identifying bad or surprising data early allows your migration to go more smoothly and will make your client a lot happier. Here are some approaches and tips to make that migration go more smoothly.

Finding Bad Data

You may have already created a set of spreadsheets for the new site that indicate where the content in the old site is coming from, and what transformations may be needed to get it there. Just assuming that you know what the data in the old site looks like, however, is a pitfall waiting to snare you. Even if you’ve looked at a handful of records for each source object or field, you may not find the data that’s in there waiting to bite you and your migration code.

For many fields, it can be useful to run queries on a snapshot of the old database. Fields that should have a limited number of distinct values should have a distinct() query run against them. For example, a field that should have a 'Y' or 'N' value may also have 'y' or 'n' values, or even 'true', 'false', 0, 1, '', NULL, 't', 'f', and so on. Find all the outliers before you begin coding, so you can take them into account. For a yes/no field like this, you may want to code a process plugin that can interpret all of them into a single pair of values that you can then reuse on other fields in your migration.
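Such a normalizing process plugin might look like this minimal sketch (the plugin ID, module name, and set of accepted spellings are assumptions; the transform() signature comes from Drupal's Migrate API):

```php
<?php

namespace Drupal\my_migration\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Normalizes the many truthy/falsy spellings found in legacy data.
 *
 * @MigrateProcessPlugin(
 *   id = "legacy_boolean"
 * )
 */
class LegacyBoolean extends ProcessPluginBase {

  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    $truthy = ['y', 'yes', 'true', 't', '1'];
    // NULL, '', 'n', 'f', 'false', 0 and friends all fall through to 0.
    return in_array(strtolower(trim((string) $value)), $truthy, TRUE) ? 1 : 0;
  }

}
```

Once written, the same plugin can be reused in the process section of every yes/no field mapping in your migration YAML.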

Date Data

Dates are particularly troublesome fields. If the old database field is not in ISO date format, you may find that dates are recorded in all sorts of ways. Converting them all correctly for use in a MySQL date or datetime field can be difficult. Often the easiest thing to do is not to attempt it yourself, but rather use PHP's strtotime() function. But that may not be good enough if your dates might be outside of the timestamp epoch (roughly 1970 - 2038). Then you'll have to fall back on PHP's DateTime class.
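A sketch of that fallback chain might look like the following (the function name and the extra date formats are assumptions about what a particular legacy dataset might contain):

```php
<?php

/**
 * Tries strtotime() first, then DateTime::createFromFormat() for values
 * strtotime() cannot handle. Returns an ISO date string, or NULL.
 */
function my_migration_parse_date($raw) {
  $timestamp = strtotime($raw);
  if ($timestamp !== FALSE) {
    return date('Y-m-d', $timestamp);
  }
  // Fall back to DateTime with the formats observed in the legacy data.
  foreach (['d/m/Y', 'm-d-Y', 'Y.m.d'] as $format) {
    $date = \DateTime::createFromFormat($format, $raw);
    if ($date !== FALSE) {
      return $date->format('Y-m-d');
    }
  }
  // Nothing matched: flag the row for manual review rather than guessing.
  return NULL;
}
```

Returning NULL instead of a guessed value makes the bad rows easy to find with a simple query after a test run of the migration.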

This brings up the next issue with dates: they may have contextual problems. Is it OK if you find a date that's 100 years in the future or 100 years in the past? Often, dates need to be range-checked somehow.

Another problem dates can have is timezones. Often, dates don’t have an associated timezone in a database. You may need to know the default timezone for the old system, or perhaps where the client is located. Timezone date calculations are surprisingly finicky to get right.

Deeper Waters

If there are unstructured text fields in the old data, you may need to watch out for data that is outside the “normal” 7 bit ASCII code page. Even Western European languages have codes outside of this range, things like ø, ç, ü and so on, including currency signs like £ and €. Then there is multibyte string data, used for representing characters on the vast number of code pages for languages that use other alphabets or other special-purpose glyphs, like for math and physics.  

There can also be problems with characters that have been entered for various symbols: emojis that employ character codes that are specific to Windows or the Mac and cause problems in their native forms.

If your MySQL database is not set up right, it might reject inserting such data. Choosing a “reasonable” format can be a challenge sometimes, but usually, the better answer is to detect these situations and write some custom code to fix them, to convert the wacky characters into something acceptable and displayable.
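A minimal sketch of that kind of detect-and-fix step, assuming the legacy text is Windows-1252 with a few "smart" punctuation characters (the function name and replacement map are illustrative, not exhaustive):

```php
<?php

/**
 * Re-encodes legacy text as UTF-8 and replaces a few known
 * Windows-specific characters with displayable equivalents.
 */
function my_migration_clean_text($text) {
  // Re-encode if the source wasn't valid UTF-8 to begin with.
  if (!mb_check_encoding($text, 'UTF-8')) {
    $text = mb_convert_encoding($text, 'UTF-8', 'Windows-1252');
  }
  // Replace a few known troublemakers (curly quotes, en dash).
  $map = [
    "\u{201C}" => '"',
    "\u{201D}" => '"',
    "\u{2019}" => "'",
    "\u{2013}" => '-',
  ];
  return strtr($text, $map);
}
```

Make sure the destination MySQL columns use utf8mb4 as well; otherwise four-byte characters such as emoji will still be rejected on insert.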

Unstructured Text

A common problem often seen when migrating older Drupal sites, but one that can happen with other technologies as well, is that text fields have self-referencing data encoded in them. A very common situation is that images on the site are inserted into the text with a tag like <a href="/sites/default/files/my_file_name.jpg">, for example. This may not work correctly on the new site, especially if, as a part of the migration, the existing files have been reorganized.  

In Drupal 7 sites, you may instead see a blob of JSON that is a media entity reference. You will have to find these and use the mapping tables created when the media entities were migrated to change them. You will probably have to find another way to deal with them, as media entities in Drupal 8 are not yet integrated into the default CKEditor WYSIWYG fields.

Getting Cozy with Your Data

While investigating the existing data, be sensitive to changes in patterns. Maybe you’re dealing with an e-commerce system and the states that an order goes through start having a different pattern at some point in time in the data. This can indicate a point when the old system was modified or upgraded, and your assumptions about what data you’ll see and how to migrate it across to the new system need to take this into account.

This is true especially if you're migrating off a system that no longer has any technical support, maybe because the company that implemented it isn't around anymore or is otherwise unavailable, or because the software in use is obsolete. You may have to guess how data objects are constructed out of the database tables. There may be extra tables no longer in use - but do they have historic information that you should be capturing and moving? You will need to have several sessions with the client to ask them about particular examples in their data. Puzzling this out when the clients themselves don't know how their system works can keep you up at night.

You’ll have to try to rebuild the original designer’s mindset to understand how they structured the data. If the data has any complexity to it, you will have to learn how related data objects are referenced, and rebuild these connections correctly in Drupal.

Sometimes references are conditional on the state of one or more fields. For example, if there’s an original order field in the order table, maybe that is only filled in when this order is actually a return. What does the order total field represent in this case? Should it be positive or negative? These are the kinds of issues to be sensitive to. Often an initial audit of an old site won’t find these items.

Data Issues at Many Levels

These are just a few areas where problems pop up when migrating into Drupal. Issues pop up in the raw data, in the context around the data and in the actual meaning of the data. Finding and fixing these can require a great deal of careful testing and creative solution development. Being aware of the issues beforehand and following through during development can lead to a smoother, faster migration process, and a fuller understanding of what problems the client is trying to solve.

Remember to use your favorite database tool to investigate your incoming data fields like the ones above: 

  • Look at select and checkbox lists to verify that the values are consistent throughout.  
  • Check date fields for good and consistent formatting and “reasonable” data ranges, and consider if there are any timezone issues. 
  • Look at the content of unstructured text.  Are there character set issues?  Internal HTML references to the old site, especially regarding images?
  • Be sure to deeply investigate how the old system is structured, especially if you don’t have access to a technical resource for the old system. Verify that you understand how, and why, the data in the old system are structured the way they are.
     
Mar 13 2020
Mar 13

This month’s SC DUG featured Chris from MindGrub and Kaylan from Pantheon talking about Load Testing.

Launching a website can be a nerve-wracking experience, oftentimes with developers working right down to the wire trying to finish that one last feature. If only there were a crystal ball that would show you a vision of how your site would fare when the masses were set loose upon it.

Good news for you, there is! Load testing.

[embedded content]

View the slides from this talk.

We frequently use these presentations to practice new talks, try out heavily revised versions, and test new ideas with a friendly audience, so if some of the content of these videos seems a bit rough, please understand we are all learning all the time and are open to constructive feedback. If you want to see a polished version, check out our group members' talks at camps and cons.

If you would like to join us, please check out our upcoming events on MeetUp for meeting times, locations, and remote connection information.

Mar 13 2020
Mar 13

Your company has a complex website to create, and Drupal was chosen as the main technology. None of your PHP developers have done anything in Drupal before, or they have little experience with it. If this is your case, read this text before starting the implementation of your project. You will learn the risks and be better prepared to start the work. 

Below you will find 9 important things to take a look at before development starts. If you take proper care of each of these areas, you will be able to implement the project efficiently. 

1. Ensure the right workflow in the team

Creating Drupal-based websites alone is a completely different kettle of fish compared with working as part of a team. If you build a team of freelancers who have previously worked alone with Drupal, you should expect problems. If it is a team that knows PHP but has no experience with Drupal, you should expect even bigger problems. 

Before starting, you must establish common principles and a common way of working. You should establish the following: 

  • Working rules in the code repository (Git or others)
  • Coding standards
  • Code flow
  • Database and files flow
  • Distribution of tasks in such a way that they do not interfere with each other and that no works are being performed on the same pieces of code. If it is not possible, communicate often so as not to duplicate the works being performed.

Thanks to these arrangements, the team will know how to work together and can focus on implementing the programming tasks.

2. Implement the tasks adopting a single standard 

It often happens with Drupal that every programming task can be done in many ways. It is important to establish common rules.

One example is the way of creating content. It is worth determining whether the content will be created using the Layout Builder, Paragraphs, the Gutenberg module, or another similar solution. Using several of these modules at the same time may result in problems. 

There are many more such decisions to be made when using Drupal. It is good to establish common standards. 

A system for supervising the application of these rules is also necessary; e.g. if the deadline of a sprint or stage is drawing near, you should not take any shortcuts but rather stick to the standards.

3. Get to know the configuration options in Drupal well 

It is a common problem when you employ PHP developers for Drupal work: they start solving many tasks by writing code. Sometimes they spend dozens of hours creating functionality that Drupal already provides.

In many cases, you just have to go to the right place in the administration panel, click around a few times, and that is it. 

It is the extensive configuration options of Drupal's core and its modules that constitute the strength of Drupal. They allow you to save hundreds or even thousands of man-hours of programming work. 

Use the ready-made Drupal options in your project and save the client's money.

4. Get to know Drupal's API well 

Getting to know Drupal's API helps you accomplish many tasks faster than with plain PHP alone. 

If someone is not familiar with Drupal's API, they sometimes create functions or classes that already exist in Drupal's core. This can turn out to be a waste of time.

The lack of knowledge concerning the API may also cause errors that are difficult to detect. An example would be adding the common PHP redirect header('Location: ' . $newURL); in hook_entity_insert(). If there are many modules in the system using this hook, it is possible that not all of them will be executed after adding such a redirect. 
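If you do need a redirect, the safer pattern is to return a response object and let Drupal's kernel send it, so that every hook and event subscriber still runs. A minimal sketch (the module, class, and route names are my own placeholders):

```php
<?php

namespace Drupal\my_module\Controller;

use Drupal\Core\Controller\ControllerBase;
use Drupal\Core\Url;
use Symfony\Component\HttpFoundation\RedirectResponse;

/**
 * Returns a response object instead of calling header() directly,
 * so the request finishes cleanly through Drupal's kernel.
 */
class RedirectController extends ControllerBase {

  public function redirectToFront() {
    return new RedirectResponse(Url::fromRoute('<front>')->toString());
  }

}
```

In form classes, $form_state->setRedirect() serves the same purpose without touching raw HTTP headers at all.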

You can find more examples like this. Before starting the work, it is good to familiarise yourself with how Drupal operates. The bare minimum would be to read one of the books describing the creation of modules for Drupal. 

The Examples module is another important source of knowledge. Here you will find a huge number of examples. It is good to look through the codes of this module.

5. Use the View-mode 

Entity view modes are a very useful feature of the system. They allow you to customise the view of the content depending on where you want to display it. You can define any number of modes and use them many times. 

Beginner developers often write custom code for every view instead of defining several different view modes and configuring them. It is not worth it. It is better to use the ready-made mechanism in Drupal and complete the tasks faster. 

6. Create entities if you plan to create your own tables in the database

Some business requirements force you to create your own tables in the database. When working with Drupal, it is good to add such tables to the database while at the same time creating Entities. 

As a result, you get a table that is operated via Entity classes and you can easily integrate the data from such a table with the whole Drupal, e.g. with the Views module.
If you do not use Entities, you will have to program everything concerning this table yourself, e.g. the forms or the data-displaying pages. 

7. Write automated tests

Drupal is usually used to create medium- and large-sized websites that are developed over many months or years. When extending the functionality, it may happen that you will break something that was already working correctly. 

If you do not want to perform regular manual checks to ensure that everything works correctly, you need automated tests.

Drupal ships with the PHPUnit framework for automated testing. You can also use other frameworks, e.g. Codeception and its VisualCeption extension.
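A minimal unit-test sketch (the PriceCalculator class is hypothetical); tests like this live in a module's tests/src/Unit directory and run via core's PHPUnit configuration:

```php
<?php

namespace Drupal\Tests\mymodule\Unit;

use Drupal\Tests\UnitTestCase;

/**
 * Verifies a simple calculation without bootstrapping a full Drupal site.
 */
class PriceCalculatorTest extends UnitTestCase {

  public function testVatIsAdded() {
    // Assumed class under test: \Drupal\mymodule\PriceCalculator.
    $calculator = new \Drupal\mymodule\PriceCalculator();
    $this->assertEquals(123.0, $calculator->withVat(100.0, 0.23));
  }

}
```

For tests that need the database or a browser, Drupal also provides KernelTestBase and BrowserTestBase as heavier-weight base classes.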

8. Automate application builds and deployment to the production server

The process of deploying new changes to the production server, and the process of building the website within the local developer environment, should both be automated. Everyone on the team should use the same scripts to build the application.

You then eliminate the problems caused by differences between the environments, e.g. when someone has changed something in the configuration of one instance but not the other. Otherwise you do not know what should work and how, misunderstandings arise, and time is wasted unnecessarily.

All changes should be introduced via code. The team should use a code repository (e.g. Git). You can be sure then that you have the same configuration on every instance of the application. 
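A minimal deployment-script sketch, assuming Composer and Drush are available on the target (paths and flags are illustrative, not a prescribed setup); the point is that every developer and the CI pipeline run the identical steps:

```shell
#!/usr/bin/env sh
set -e

# Rebuild the codebase exactly from the locked manifest in the repository.
composer install --no-dev --optimize-autoloader

# Apply pending database updates, then import the exported configuration
# so every environment converges on the same config.
vendor/bin/drush updatedb -y
vendor/bin/drush config:import -y

# Rebuild caches so the new code and configuration take effect.
vendor/bin/drush cache:rebuild
```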

9. Advise the client if you can do something a little differently but much faster

Drupal offers a huge number of functionalities and additional modules. With them, you can create really large-sized websites. 

Sometimes a ready-made module does not meet 100% of the specification's requirements, but it is enough to talk it over with the client; most often, the client will agree to a small change in order to save time.

Final thoughts 

Above, I have listed the most important things to pay attention to when creating your first major website based on the Drupal CMS.

The Drupal development team at Droptica knows all about these issues. We save our clients' money and provide a lot of solid code in a short time. We do not have to focus on eliminating problems but only on providing value to the clients. 

If you plan to build a Drupal-based website, be sure to analyse the potential problems mentioned above, and you will achieve satisfactory results. 

Mar 12 2020
Mar 12

You might have noticed some changes on Drupalize.Me lately. We've just wrapped up a huge content archiving project and I'd like to share what we did and how this will help us move forward with a library of tutorials for Drupal learners that we're committed to keeping up-to-date.

Drupalize.Me has a long history of Drupal training that started with in-person workshops, DVDs, and even a multi-year conference (Do It With Drupal or DIWD) from Lullabot. Those DVDs on site building, module development, theming, jQuery, SEO, and more -- they were the start of the library of Drupal training videos on Drupalize.Me. And they've been on the site for a very long time.

During Drupal 7's life cycle (up to Drupal 8's release), we produced videos on core competencies such as module development, theming, and site building. We also covered a number of contributed modules including Views, Panels, and Webform.

When Drupal 8 was released in November of 2015, we were already daunted by the burden of our outdated content. Video-only tutorials made updates that much more cumbersome. We wanted a developer-friendly, code-copy-pasteable format as well as a feasible way to keep Drupal 8 content up-to-date with the new scheduled minor releases. So, we switched to a written-first format and augmented with video.

While this allowed us to move forward more rapidly with Drupal 8 tutorials and keep them updated with every minor release, we still had the baggage of the Drupal 6 and 7 (and other non-Drupal) video tutorials.

As the primary content manager for Drupalize.Me, I keenly felt the pain of trying to manage approximately 1,900 published tutorial and video nodes. I felt that if we were going to effectively move forward with new content for Drupal 8 and 9, we needed to address the old content that was intermingling with the new, misleading learners and causing confusion. Frankly, it was overwhelming.

So what did we do? First, we inventoried our content. I was able to divide our material into manageable buckets by content (Drupal 6, Drupal 7, Drupal 8, and non-Drupal), and by format (written+video and video only). I then created a policy -- an outdated content flowchart -- that would help me decide what to do with different categories of outdated content. I presented my policy recommendation and flowchart to the team and got the "green light" to move forward with an audit focused on identifying outdated or misleading content.

One key takeaway during this point in the process was we decided to provide 2 levels of "archiving":

  1. Archive with a disclaimer and provide alternative resources if possible.
  2. Unpublish and redirect to a relevant page if possible.

I audited every single last one of our videos, tutorials, and collection pages and decided whether they should be archived, and at which level. In the process, I dug up alternative resources, updated pertinent topic pages, and basically went a little crazy with spreadsheets. I even tinkered a bit with Google Data Studio.

After our tech team implemented some new archiving and alternative resources fields on our content types, I got to work editing nodes and marking old content as archived, providing alternative resources where possible, and unpublishing the whole of our Drupal 6 and DIWD videos (except jQuery videos that also pertained to Drupal 7). It was amazingly tedious, but it's now done!

  • Drupal 6 content has been removed. Some of it was redirected to Drupal 7 counterparts.
  • Drupal 7 content has not been removed. We know there are still a lot of Drupal 7 users and site maintainers out there. This content has been marked as archived and you will see a banner across the top indicating so. Where possible, alternative resources are listed to point you to Drupal 8 material.
  • Non-Drupal content was archived or unpublished on a case-by-case basis. The bulk of it was marked as archived and remains on the site.
  • Drupal 8 content is here to stay for the time being. We will be forking our tutorial repository and maintaining Drupal 8 and Drupal 9 versions of our tutorials through Drupal 8's lifetime. Given how major releases now work in Drupal, these branches will be the same for a while and will diverge over time.

With content archiving complete, we hope this will provide clarity to our members about which content we are actively committed to keeping up-to-date and which content we consider archived and provided as-is. We also hope in many cases you will find more pertinent Drupal 8 content in the additional resources listed for much of our archived content.

So much for the past. What about the future of Drupalize.Me content? Here are a few of our content goals for this year.

  1. Our #1 priority is to update our Drupal 8 content with each minor version release. We are currently up-to-date with 8.8.0.
  2. Currently undergoing final stage peer review: a revamp of our popular React and Drupal tutorial series!
  3. Add videos to more of our Drupal 8 written tutorials. We will be starting on this immediately, creating videos for both our Content Moderation and Views series.
  4. Review and update Drupal 8 series, including contributing updates to the Drupal 8 User Guide community project, of which we host a fork.
  5. Create new tutorials for Layout Builder, now in core.
  6. Create new tutorials for Media, now in core.

We're excited to move forward with new videos and written tutorials on Drupal 8 (and 9). We'll be focusing the blog on #d9readiness posts in anticipation of Drupal 9's release sometime in 2020, inspired by this State of Drupal 9 presentation (check out the open-sourced slide deck). Sign up for our newsletter (see link in the footer) to get an email when the blog is updated.

Mar 12 2020
Mar 12

2 minute read Published: 12 Mar, 2020 Author: Colan Schwartz
Drupal Planet , Composer , Aegir , DevOps , Automation , Drupal 8

Best practices for building websites in the Drupal framework (for major versions 8 and above) dictate that codebases should be built with the Composer package manager for PHP. That is, the code repository for any sites relying on it should not contain any upstream code; it should only contain a manifest with instructions for assembling it.

However, there are some prominent Drupal hosting companies that don’t support Composer natively. That is, after receiving updates to Composer-controlled Git repositories, they don’t automatically rebuild the codebase, which should result in changes to the deployed code.

If you’re hosting your site(s) at one of these companies, and you have this problem, why not consider the obvious alternative?

Aegir, the one-and-only open-source hosting system for Drupal that’s been around for over 10 years, has had native Composer support for over 2 years. That is, on each and every platform deployment (“platform” is Aegir-speak for a Drupal codebase), Aegir reassembles the upstream code assets by running the following automatically:

composer create-project --no-dev --no-interaction --no-progress

As a result, any sites created on that platform (or migrated/upgraded to it) will have access to all of the assets built by Composer.
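For illustration, a stripped-down composer.json of the kind such a platform repository might contain (the project name and version constraints are examples only, not Aegir's required layout):

```json
{
    "name": "example/my-drupal-site",
    "require": {
        "drupal/core-recommended": "^8.8",
        "drupal/core-composer-scaffold": "^8.8"
    },
    "extra": {
        "drupal-scaffold": {
            "locations": { "web-root": "web/" }
        }
    }
}
```

The repository stays tiny; Composer pulls in Drupal core and all contributed code at build time.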

Additionally, Aegir now ships with the Aegir Deploy module, which enhances the platform creation process. It allows for the following types of deployment:

  • Classic/None/Manual/Unmanaged
  • Drush Makefile deployment
  • Pure Git
  • Composer deployment from a Git repository
  • Composer deployment from a Packagist repository

For more information, please read the Deployment Strategies section of the documentation.

If you’d like to get started with Aegir, the best option would be to spin up an Aegir Development VM, which allows you to run it easily, play with it, and get familiar with the concepts. Naturally, reading the documentation helps with this too.

Afterwards, review the installation guide for more permanent options, and take advantage of our Ansible roles. We have a policy role that configures the main role using our favoured approach.

For help, contact the community, or get in touch with us directly. We provide the following Aegir services:

  • Installation & maintenance in corporate/enterprise (or other) environments
  • Architectural and technical support
  • Hosting guidance
  • Coaching
  • Audits
  • Upgrades
  • Conversion to best practices

The article Does your Drupal hosting company lack native Composer support? first appeared on the Consensus Enterprises blog.

We've disabled blog comments to prevent spam, but if you have questions or comments about this post, get in touch!

Mar 12 2020
Mar 12


If you played a word-association game, one of the first responses people would give to "open source" is that it's free. If any of those people are in the position of purchasing software licenses for a business or organization, that makes open source (a.k.a. free) a benefit definitely worth exploring. Open source has the potential to save thousands of dollars or more, depending on the software and the size of the organization. 

Even though eliminating a budget line item for licensing costs may be enough to convince some organizations that open source is the way to go, it’s actually only one of several compelling reasons to migrate from proprietary platforms to open-source architecture. 

In a debate on open-source vs. licensed platforms, the affirmative argument will include these four, additional points: 

Development Freedom

When businesses provide workstations for their employees, they choose (often inadvertently) the framework on which their organizations operate. For example, if a business buys Dell computers, it will operate within the Microsoft Windows framework. This isn’t necessarily a bad thing. A business with limited IT and development resources won’t have to worry about how to keep its operating system working or whether business applications or security solutions are available. Microsoft has a line of solutions and partnerships that can provide what they’re looking for. 

With a system built on an open-source platform, on the other hand, it may take more resources and work to keep it running and secure, but it gives developers the freedom to do exactly what the end user needs. You aren’t limited by what a commercial platform enables you to do. 

In some markets, foregoing the status quo for developmental freedom sounds like risk. It’s a major reason that government users lag behind the commercial space in technology. They’re committed to the old systems that they know are robust, secure, and predictable at budget time — even though they’re outdated. When those organizations take a closer look, however, they quickly realize they can negate development costs through greater visibility, efficiency, and productivity that a platform that specifically supports their operations can provide. 

Open-source platforms are also hardware agnostic, giving organizations more latitude when it comes to the computers, mobile devices, and tools they can use, rather than being locked into limited, sometimes expensive, options for hardware. 

Moreover, development freedom delivers more ROI than merely decreasing current costs. Open-source platforms give developers the freedom to customize systems and innovate. If your system enabled you to expand your reach, better control labor costs, and support new revenue streams, what impact could that have on your business?

Interoperability

Enterprises and manufacturers have traditionally guarded their proprietary systems, which gave them an edge in their markets and control over complementary solutions and peripherals end users needed. Those same proprietary systems, however, could now be a business liability. Many markets are moving toward open source to provide greater interoperability, and businesses continuing to use proprietary platforms will increasingly be viewed as less desirable partners. 

Military avionics is a prime example. This industry is migrating to the Future Airborne Capability Environment (FACE) Technical Standard. Administered by the FACE Consortium, this open standard aims to give the U.S. Department of Defense the ability to acquire systems more easily and affordably and to integrate them more quickly and efficiently.  

You’ll also find a preference for open-source architecture in some segments of the tech industry, such as robotics. The Robot Operating System (ROS) is an open set of tools, libraries, and conventions that standardizes how robots communicate and share information. ROS simplifies the time-consuming work of creating robotic behaviors, and ROS 2 takes that objective further by giving industrial robot developers support for multirobot systems, safety, and security. 

As Internet of Things (IoT) technology adoption grows, more operations are experiencing roadblocks connecting legacy equipment and enabling the free flow of data, roadblocks that open-source architecture can overcome. Furthermore, IoT based on open-source components allows networks to expand beyond the four walls of a facility to connect with business partners, the supply chain, and end users. The Linux Foundation's Zephyr Project, for example, develops an open-source, real-time operating system (RTOS) that enables developers to build secure, manageable systems more easily and quickly. 

Faster Time to Market

Open-source projects can also move more quickly than development on a proprietary platform, where you may be at the mercy of the vendor whenever you require assistance, and where certifying hardware or applications happens on the vendor's timeline. 

That process moves much more quickly in an open source community. Additionally, members of the community share. Some of the best developers in the industry work on these platforms and often make their work available to other developers so they don’t need to start from scratch to include a feature or function their end user requires. A modular system can include components that these developers have created, tested, and proven — and that have fewer bugs than a newly developed prototype. 

Developers, using prebuilt components and leveraging an open source community’s expertise, can help you deploy your next system more quickly than starting from ground zero. 

Business Flexibility

Open-source architecture also gives a business or organization advantages beyond the IT department. With open source, you have more options. The manager of a chain of resorts facing budget cuts, for example, could more easily find ways to decrease operating expenses if her organization’s system runs on an open-source platform. A chain that operates on a commercial platform, however, may have to find other options, such as reducing staff with lay-offs.  

Open-source architecture also decreases vendor lock-in. In a world that's changing at a faster and faster pace, basing your systems on open-source architecture gives you options if a vendor is acquired and product quality, customer service, or prices change. It also gives you flexibility if industry standards or regulations require you to add new features or capabilities that your vendor doesn't provide, decreasing the chances you'll need to rip and replace your IT system.

The Price of Open Source

To be perfectly honest in the open source vs. commercial platform debate, we have to admit there is a cost associated with using these platforms. They can’t exist without their communities’ contributions of time, talent, and support. 

At Mobomo, for example, we’re an active part of the Drupal open-source content management system (CMS) platform. Our developers are among the more than 1 million members of this community that have contributed more than 30,000 modules. We also take the opportunity to speak at Drupal community events and give back to the community in other ways. 

Regardless of how much we contribute to the community, however, it’s never exceeded the payback. It’s enabled lower total cost of ownership (TCO) for us and our clients, saving millions of dollars in operating expenses. It has ramped up our ability to create and innovate. It’s also allowed us to help build more viable organizations and valuable partnerships. 

The majority of our industry agrees with us. Red Hat's 2019 State of Enterprise Open Source report asked nearly 1,000 IT leaders around the world how strategically important open source is to an enterprise's infrastructure software plans. Among respondents, 69 percent reported that it is extremely important, citing lower TCO, access to innovation, security, higher-quality software, support, and the freedom to customize as its top benefits. 

Only 1 percent of survey respondents said it wasn’t important at all. 

Which side of the open-source vs. commercial platforms argument do you come down on?

Contact us to drop us a line and tell us about your project.

Mar 12 2020
Mar 12

Sooperthemes will continue as DXPR at dxpr.com (website not yet available). Sooperthemes has been a household name in the Drupal community for over 10 years, providing premium Drupal themes and in recent years, a no-code Drupal layout builder for content authors and marketers. 

Our new brand name will reflect our shift from premium theme development to developing digital marketing products for Drupal content authors and marketers. 

 

Product changes, effective starting from DXPR launch day

  • Our layout builder module will continue to be actively supported and developed and will remain the focus of our company

  • The sidebar elements were soft-deprecated (hidden by default) in a previous update in December. With the major update they will be completely deprecated. 

  • Our theme will be open-sourced, and will continue to be actively maintained and developed

  • Our layout builder and theme will receive a major update along with getting DXPR product branding

  • Our CMS Drupal distribution and demo profiles will be superseded by a new distribution that is completely composer-based. It will be Drupal 8 and 9 only, because it doesn’t make sense to start new websites on Drupal 7. The CMS profile components will be minimally supported.

  • Our Portfolio module will be deprecated and will not be supported or developed anymore

At the time of release we will provide an upgrade path for our major updates to the layout builder module and our theme. We understand not everyone can upgrade their website right away. We will support the current Glazed Theme and Glazed Builder products for 90 days after launching our new, DXPR branded releases. All the above applies to our Drupal 8 and Drupal 7 software.

When will DXPR launch

We aim to launch DXPR approximately 30 days from now. However, our launch date may change, as it depends on work with a design agency on our new visual identity and web design.

Support Services

We replaced our support forum system with a private ticketing set-up. We are launching changes to our support system bit by bit, before and after our DXPR launch. As part of the changes, all support tickets created more than one month ago are not available in the new support dashboard. If some of your removed tickets are still important to you, you can recreate them in the new system (starting today).

Thank you for being a part of our story! If you have any questions don’t hesitate to contact us.


Kind Regards,
Jurriaan Roelofs
Founder, Sooperthemes
Mar 11 2020
Mar 11

There’s nothing like the threat of a global pandemic to bring the topic of working remotely to the forefront. 

This week, in response to the rapid spread of the coronavirus disease (COVID-19), companies from all over the world are scrambling to get systems and policies in place to ensure that work can continue in the event that quarantines are imposed or decisions are made to exercise caution and curtail the threat of workplace transmission of the disease. 

Remote work options are inherent to the Promet Source culture. We’ve benefitted for years from a culturally diverse staff and the opportunity to source the best talent without bias to location. As other organizations are rapidly moving in this direction, here are five strategies that we've learned for optimizing the remote work opportunity.

1. Communicate Often and Communicate Well

Compensating for the fact that you are not engaging with co-workers in the hallways, over lunch, or during daily stand-ups requires excellent, intentional communication. In fact, don't hesitate to over-communicate with both your team and your supervisor. Assume nothing. Set clear expectations. Stay in touch, and be sure not to overlook the importance of casual conversations and humor. A productive work environment is not all work and no play, and you should get to know your remote colleagues just as you would those who work in the office or workspace next to yours.

2. Maintain Face-to-Face Connections

Promet’s president, Andrew Kucharski, is insistent on the use of Zoom video conferencing for all meetings -- even ad hoc check-ins. This serves multiple purposes. Of course, it keeps us on our A game and mitigates any work-at-home temptation to stay in PJs and slippers all day. More importantly, we are inherently more connected when we see each other’s faces and facial expressions. We’re also more prone to converse, to know what’s going on in each other’s lives, and to remain accountable to each other.

Demonstrating another big advantage of working remotely from home, Pamela Ross, Promet Source PMO Director, takes a break with her dog Lumen at her side.

3. Leverage Technology

This one is huge. The concept of telecommuting has been around for a decade or two, but more so than ever before, we have access to tools that make us wonder why we’d ever waste a potentially productive hour or so every day sitting in traffic or taking public transportation to an office. Teleconferencing, shared calendars, collaborative document authoring tools and Slack are among the multitude of tech resources that enable a global staff to thrive.

4. Acknowledge and Affirm

Remember that all of the same principles of leadership and human dynamics apply when working remotely. Who among us isn’t energized by public acknowledgement, appreciation, and various forms of affirmation? At Promet Source, we encourage shout-outs on the companywide Slack channel and through software that is specifically designed to engage employees and emphasize the company culture. New team members are welcomed and introduced with a “two-truths-and-a-lie” activity during companywide video conference calls, and managers are coached to consistently acknowledge team members for their contributions.

5. Recognize Relevant Strengths

Working remotely is a privilege that requires maturity and an excellent work ethic. Not everyone fares well in an environment that lacks the structure of a traditional office. Leadership needs to hire accordingly and cultivate an environment in which the responsibilities and advantages of working remotely are emphasized and built into expectations for every role.

At Promet Source, we do a lot more than develop and design accessible websites that win awards. Our Human-Centered Design workshops help define the dynamics and the direction for success in any organizational endeavor. Contact us today to learn more. 

Mar 11 2020
Mar 11

Today, we're launching a membership drive. You can help by sharing our message and showing your support.

Help grow membership by visiting our campaign page and sharing with your network. We're supporting the global Drupal project and community with your help!  Together, we make the open source community stronger.

Get your 2020 Membership Certificate

Active members can download a personalized certificate now, and one way to contribute to this membership drive is to print out your certificate (or display it on a screen) and share your selfie with us. Tag it with #joinDrupalAssoc, #OpenSource, and #supportOpenSource to show you care.

Our heartfelt thanks to you—members and supporters—for contributing and participating as individuals and organizations.

Ready to help? Share a post like this or craft your own. 

Mar 11 2020
Mar 11

In this short article I wanted to draw your attention to a neat little feature introduced in Drupal 8.8 related to migrations. And I mean the ability to force entity validation whenever Migrate saves destination entities.

As we already know, entities can be validated using their typed data wrappers like so:

$violations = $entity->validate();

This calls the general validation over the entity and all its fields. Very handy.

Something that you may or may not know is that when we create and save an entity programmatically, this validation is not run by default. So, for example:

$entity = Node::create([
  'type' => 'page',
  'title' => 'My title',
]);

$entity->save();

If we had some validation on the title field, it would not run, and potentially bad data would be saved in the field. Sometimes this is fine, other times it's bad, and in many cases it's critical, as things can break spectacularly. So it is always good to run validation.
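Putting the two snippets above together, a minimal sketch of validating before saving (how you report the violations is up to you):

```php
<?php

use Drupal\node\Entity\Node;

$entity = Node::create([
  'type' => 'page',
  'title' => 'My title',
]);

// Run the typed-data validation over the entity and all of its fields.
$violations = $entity->validate();

if ($violations->count() > 0) {
  foreach ($violations as $violation) {
    // Each violation knows which field failed and why, e.g. log
    // $violation->getPropertyPath() and $violation->getMessage().
  }
}
else {
  // Only persist the entity when it passed validation.
  $entity->save();
}
```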

When it comes to the Migrate entity destination, there was no validation being run. But with Drupal 8.8, we have a destination plugin option that indicates we want to run the validation when saving the entity. Like so:

destination:
  plugin: 'entity:node'
  validate: true

This will ensure the nodes are validated before being saved by the migration.

Moreover, apart from the option on the migration's destination plugin, you can also control this at the entity level if you define the entity type yourself. You can do this by implementing the Drupal\Core\Entity\FieldableEntityInterface::isValidationRequired() method in the entity class. Do note, however, that most entity types do not implement it, nor is this method checked before regular entity saves. I expect its use will be extended, but so far it is only honored within the migration context.
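A minimal sketch of opting in at the entity level (the class and namespace are illustrative; ContentEntityBase already provides a default implementation that this overrides):

```php
<?php

namespace Drupal\mymodule\Entity;

use Drupal\Core\Entity\ContentEntityBase;

/**
 * An entity type that always requires validation.
 */
class MyEntity extends ContentEntityBase {

  /**
   * {@inheritdoc}
   */
  public function isValidationRequired() {
    // Signal that this entity type should always be validated; as of
    // Drupal 8.8 the Migrate entity destination respects this flag.
    return TRUE;
  }

}
```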

Hope this helps.

Mar 11 2020
Mar 11

Table of Contents

Accessibility in Claro
--Color
--Size and spacing
--Enabling better accessibility reviews
Scope
What makes Claro unique
--PostCSS
Conclusion

Throughout Drupal's existence, no other changes have made as much of an impression on users as refreshes of the administrative interface. Content editors and site builders have a long list of expectations when it comes to the editorial experience, and surveys and tests had shown that Seven was showing its age. Now, thanks to Claro, the new administration theme for Drupal 8, user interfaces for editors in Drupal are optimized for accessibility and usability and have a realistic roadmap for the future.

Recently, Cristina Chumillas (Claro maintainer and Front-End Developer at Lullabot), Fabian Franz (Senior Technical Architect and Performance Lead at Tag1) and Michael Meyers (Managing Director at Tag1) sat down with me (Preston So, Editor in Chief at Tag1 and author of Decoupled Drupal in Practice) for a Tag1 Team Talks episode about the Claro administration theme and its bright future. In this multi-part blog series, we track the journey of Claro from beginnings to its current state. In this third installment, we uncover some of the ways in which Claro has improved outcomes for accessibility and stayed innovative in a fast-changing front-end landscape.

Accessibility in Claro

If you haven't yet read the first and second installments of this blog series, it's a good idea to go back to better understand where Claro came from and how it came to be. In this section, we examine some of the ways in which Claro prizes accessibility and ensures that Drupal users with disabilities can easily operate Drupal.

Color

During our conversation, Fabian asked Cristina: "Will there be a dark mode (in Claro)?" Though a dark mode toggle isn't part of the minimum viable product (MVP) for Claro, Cristina agrees that it could be a valuable addition to the theme, initially as part of the contributed theme ecosystem and eventually as part of core. The notion of a dark mode brings up one of the central considerations for accessibility: the use of color.

Currently, an issue in the Drupal Ideas queue proposes removing the Color module from Drupal core. The module has been a mainstay of Drupal for many years and allows the customization of Drupal themes through color variables. The proposal sparked a wide-ranging discussion, with people wishing to customize the administration interface colors for each customer. Though this is undeniably an interesting feature, the problem with offering color customization is that it grants far too much control to users, who may configure color combinations that are inaccessible.

Size and spacing

Another piece of feedback that Claro received concerned what some perceived as too much additional space between form elements. Commenters wanted not only to see the elements displayed in a more compact manner but also to be able to customize the extent to which elements were spaced from one another, in addition to font sizes across the theme.

All of this has raised the important question of user customization of elements like size, spacing, and color and whether this sort of customization should be done at the theme level instead of in the user interface. Accessibility, after all, is hampered if a user sets a color scheme with insufficient contrast for certain users. One thing that Claro also implemented in Drupal's administrative interface was an increase in the font size across the board, as smaller font sizes are less accessible.

Enabling better accessibility reviews

A few months into development, the Claro team worked with Andrew McPherson, one of Drupal's accessibility maintainers, to review the designs for the administration theme and found that important changes were necessary for elements like the aforementioned text field. An important discovery the Claro team made was that providing a PNG or PDF file to a design or accessibility reviewer is far less useful than a working implementation, which allows interaction testing as well.

Scope

Another key finding from Claro's development revolves around scope. In open source, of course, one of the fundamental constraints on project scope is always the number of contributor resources available to implement a given design. Having a tight release date also encourages a tighter scope, and Claro's team decided that having Claro ready for Drupal 8.8 was of paramount importance as otherwise Claro would possibly not have been eligible to become a stable theme in Drupal 8.8 or Drupal 8.9.

In the end, the team created a list of components that would be necessary to migrate from one theme to another, and though some of these continue to rely on Seven styles, they remain usable in Claro. By drawing a line in the sand and focusing on a revamp of the design first and foremost, without the risk incurred by engaging overly ambitious ideas like wildly new interfaces and layouts, Claro successfully entered Drupal core thanks in part to the constraints imposed by a narrow scope.

What makes Claro unique

Claro has been pushing the envelope when it comes to Drupal administration themes. Many common patterns now found on the web today are employed in Claro in lieu of the more "classic" components that Seven provided. Cristina says that Claro was "a chance to modernize the interface and get in a lot of patterns common around the web that make lives easier for users and I would say also for front-end developers."

PostCSS

One of the key ways in which Claro demonstrates a high degree of innovation with regard to front-end development is the implementation of PostCSS for the administration theme. The Claro team decided explicitly not to utilize Sass as a Cascading Style Sheets (CSS) preprocessor, as is common in other Drupal themes. Instead, Claro uses PostCSS, a postprocessor for CSS, to provide features such as cross-browser support for CSS variables. Because Claro leverages PostCSS to process CSS written normally, it will remain compatible well into the future.

Although Sass offered many features that normal CSS lacked several years ago, many of those features have now been integrated into CSS3 and modern browsers, which receive regular updates and adopt new specifications much more rapidly than before. This means that you can already use many of the features Sass provided, including CSS variables, directly in browsers: admittedly without Sass's unique extras, but also without the need for JavaScript to run a preprocessor.

Postprocessors, meanwhile, allow developers to write modern CSS and automatically add the vendor prefixes needed to support modern browsers. They turn variables into functional CSS and convert the CSS you write into styles that work in all browsers, including Internet Explorer. In Fabian's opinion, PostCSS can be considered a bridge from the old to the new, much like a JavaScript shim that lets you use new language features while generating old JavaScript from the file.
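
To illustrate the idea (this is a hypothetical configuration, not Claro's actual build setup; it assumes the widely used postcss-preset-env plugin is installed), a minimal PostCSS configuration might look like this:

```javascript
// postcss.config.js (hypothetical example, not Claro's actual build setup).
// postcss-preset-env transpiles modern CSS features for older browsers and
// adds vendor prefixes through its bundled Autoprefixer.
module.exports = {
  plugins: [
    require('postcss-preset-env')({
      stage: 2, // enable reasonably stable CSS features
      features: {
        'custom-properties': true, // generate fallbacks for var() usage
      },
    }),
  ],
};
```

Running PostCSS over plain CSS with a config like this is what lets a theme write standards-based styles today while still shipping output that works in older browsers.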

Conclusion

Claro is the new administration theme for Drupal 8, and it looks to be one of the most exciting developments for content editors and site builders in recent years, thanks to the design and development team's emphasis on accessibility, usability, and a narrow scope on implementing a moderate redesign rather than a fundamental rework. But Claro also demonstrates some of the key ways in which Drupal themes can be both accessible and innovative. Thanks to Claro's focus on providing accessible defaults, all users can successfully make use of the administration theme. In addition, with modern approaches like the use of PostCSS, a postprocessor that offers features like vendor prefixes for all modern browsers, Claro shows an innovative front end is possible for Drupal's administration layer.

In this multi-part blog series about Claro for Drupal 8, we have explored the history and origins of the Claro theme, some of the ways in which Claro addressed usability concerns, and how Claro has exemplified accessibility and front-end innovation. In this third installment, we spotlighted some of the ways in which Claro has allowed Drupal's administrative interfaces to become more accessible, including through font size, spacing between elements, and color contrast. We also inspected some of the additional features the Claro theme has added such as PostCSS, which obviates some of the need for CSS preprocessors thanks to the availability of features like CSS variables in modern browsers, and provides postprocessing to ensure vendor prefixes are present. In the following installment of this Claro blog series, we look forward to possible improvements to Claro and Cristina's dream vision for Drupal's new flagship administration theme.

Special thanks to Cristina Chumillas, Fabian Franz, and Michael Meyers for their feedback during the writing process.

Photo by Hulki Okan Tabak on Unsplash

Mar 11 2020
Mar 11

DrupalCon Minneapolis is two months away, and that means it's time for the 2020 Drupal Local Development Survey.

2019 results - Local Drupal development environments
Local development environment usage results from 2019's survey.

If you do any Drupal development work, no matter how much or how little, we would love to hear from you. This survey is not attached to any Drupal organization; it is simply a community survey to help highlight some of the most widely used tools that Drupalists use for their projects.

Take the 2020 Drupal Local Development Survey

The results will be released around DrupalCon Minneapolis and discussed at the panel: Local development environments panel discussion.

See results from previous years:

Mar 10 2020
Mar 10

A Little Bit of Background

In 2017, I graduated with a PhD in history from The George Washington University, where I studied German colonialism in Africa. After graduation, I ended up teaching at a private secondary school part-time to get my foot in the door. Some days were enjoyable, some days were painful, and I never fully adjusted to life in the classroom at that level. With no opportunity to transition to full-time in my near future, and little joy coming from the hard work I was putting into my lessons, I realized that I needed to make a change. I weighed my options and ultimately landed on the most off-the-wall route compared to where I was starting from: pursuing a web development career.

What Led Me To Web Development

The web development world is bewildering. Like most fields, it has its jargon, communities, and prejudices. However, web development is rife with opportunities, including opportunities for a developer position without a degree in computer science, provided you can build a portfolio. Before I could do that, however, I had to figure out what to study, which was a challenge in itself. Phrases such as front-end or back-end stacks, and the myriad of oddly specific yet generic proper nouns, were downright confusing: SASS, JavaScript, Gulp, Node, CSS, Angular, React, and MongoDB to name a few. What were these words and what did they represent? If I simply wanted to make a website, what did I need to study first?

After a lot of research, I decided to self-study, which eventually led me to enroll in Debug Academy’s Drupal Developer course.

There is a nearly infinite number of articles that discuss how someone went from no experience to coder in three, six, nine, or twelve months. Some are more helpful than others. Some recommend coding every day for 8 hours, learning a number of new web technologies, and essentially losing yourself in the material. I found that a certain coding sub-culture dovetailed with a "work hard, play hard" mentality. This approach is what I call Burnout 101. It was certainly not for me.

The second approach is more sustainable. After a considerable amount of research, I decided to learn HyperText Markup Language (HTML), Cascading Style Sheets (CSS), and JavaScript (JS) through a free online service, and it was one of the best decisions I made when starting web development. While I had a background in HTML (I learned the basic syntax in high school), the service slowly walked me through the material and ended with some basic projects before launching into advanced topics. Most importantly, it gave me consistency and daily practice.

Most of my time studying occurred after coming home from teaching with the occasional weekend workout when I was feeling extra ambitious. Some days I felt like I was making great progress; other days I felt like this career transition was rather silly. What I was firm on was that I wasn’t going to code every day and that I would be able to go at my own pace. 

My goal was to learn material sustainably and practice it before I moved onto the next chunk of material, and it worked for me. Over the course of 4 months, I had learned the basics of HTML, CSS, and JavaScript and was ready to pursue the next step. Figuring that out, again, was a challenge that had me spinning in circles. I attended Meetups, free training, Drupal in DC, Code and Coffee...you name it, I tried it out. Then, by a chance comment mentioned in a group, I saw a free intro course by Debug Academy and the rest is history. (Pun very much intended)

My Debug Academy Journey

When Saturday arrived, I metro’ed to the location in Tyson’s Corner and found a few other people waiting for class to begin. After chatting, I learned that some were advanced learners in other languages, such as Python or Ruby, and others were new to web development, learning HTML for the first time. What I found most interesting was that a handful of these people did the Meetup circuit. They had a list of free trainings, talks, and code-alongs in the DMV. I was heartened to realize that I felt comfortable with the material I had been learning since February.

The teacher introduced himself as Ashraf Abed. He pitched a three-month course on Drupal, which would be starting in a week. Alumni, many of whom were at established companies in the area, effused about Ashraf’s course. You would code a website for a non-profit and learn the necessary skills for a web developer position. The best part for me was the last three weeks he would help place you at a company. Debug Academy’s course ticked off many of the boxes about filling out my portfolio as a developer. It would seem obtuse to not pursue this opportunity. 

I came home, excited. I enrolled the next day, looking forward to being a student again for the next three months. On the first day of class, I noticed an array of people from different backgrounds, vocations, and ages. I was nearly the youngest person present at 32. Students included web designers, coders from outside the web development world, a few experienced Drupalers who wanted a more formal training, and then, me, the humanities outcast. 

The class went swimmingly. While I had worked on the initial skills, Drupal and Debug Academy’s course brought them together in a package in which I could build complete, sophisticated websites. I learned about those oddly specific, generic proper nouns and acronyms like Twig, SASS, PHP. I became familiar with others, such as Gulp (really?), Less (who comes up with these names?) or Bourbon (why bring my favorite spirit into this?). 

Yes, my head did spin after a few class sessions, and, yes, I was frustrated with some of the advanced material such as custom module development or object-oriented programming. However, I could see the picture of my career developing. By learning these skills, not mastering them, I was seeing a direct path to a position as a Drupal developer. Seeing this path was encouraging; looking at job postings online was downright awe-inspiring, and landing that job within two weeks of my job hunt was euphoric. 

The Abbreviated Job Search

Early September arrived and the class was finishing up. To be honest, I was a bit sad. I enjoyed being a student again: no lesson-planning worries, everything organized for me, and 6 hours a week of lectures and code-alongs. But the class had to come to an end so my next chapter could begin. The job search, once again, reared its ugly head.
    
I was rather surprised when I searched a number of job listing websites for Drupal in the DMV and found a considerable number of openings. What a refreshing change! In the end, I applied to only 6 positions, had 2 requests for interviews, and accepted a position at Debug Academy itself. What I found was that the skills I had learned in my doctoral program (teaching, synthesizing complex material, and knowing how to learn quickly) would all be useful attributes in the web development world.

Since joining Debug Academy 18 months ago, I have built upon my skills by working on real client projects. I was one of the principal front-end developers for the Drupal 8 redesign of a federal agency, consulted on a variety of Drupal projects, and worked to revamp the current debugacademy.com. I have also acquired two Acquia certifications: developer and front-end specialist. It all started with a free Git seminar offered by Debug Academy!

It Was Scary, But I Wouldn’t Change A Thing

Looking back on this career change, I realize how scary it all initially seemed. How could I do this? Was I turning my back on all my previous studies? I would like to think we always bring our previous employments and skills to our current position, so there had to be some value to all that hard work I had done prior to my switch. In the end, what was most important was I found a better work-life balance being a Drupal developer, and it has made many other aspects of my life easier to enjoy.

There is much more to the story that is left untold, so if you have any questions send me an email. As always, if you have any questions about transitioning into a career in Drupal or enhancing your current Drupal capabilities, and how Debug Academy can help, reach out to Ashraf to learn about upcoming classes.

Mar 10 2020
Mar 10

Softrip is a boutique software company that makes some of the back-office plumbing that companies in the travel business need to support their unique use cases. It operates in a niche market, but its platform is absolutely crucial for this sector.
 
Because Drupal is a powerful platform that shines when it comes to integrations, we coupled it with Softrip to build compelling digital experiences for prospective travel customers. Here are our tips on how to make the integration shine.

Tip #1: Build your API integration the right way

Like most modern web services, Softrip offers a REST API. Bundled into Drupal is Guzzle, the popular PHP HTTP client. Use Guzzle to make a Softrip client that includes authentication, configurable endpoints, timeout settings, etc. You should also consider using the Symfony serializer and a custom normalizer to make the more complex API requests more manageable in your business logic.
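
As a sketch of this approach (the SoftripClient class, endpoint path, and configuration keys below are our own illustrations, not Softrip's actual API), a thin Guzzle-based client service might look like this:

```php
<?php
// Hypothetical sketch of a Softrip API client; class name, endpoint
// path, and credential handling are illustrative only.

namespace Drupal\my_softrip;

use GuzzleHttp\ClientInterface;

class SoftripClient {

  /** @var \GuzzleHttp\ClientInterface Drupal's injected 'http_client' service. */
  protected $httpClient;

  /** @var string Configurable API base URL. */
  protected $baseUrl;

  /** @var string API credential, loaded from config or a secrets store. */
  protected $apiKey;

  public function __construct(ClientInterface $http_client, $base_url, $api_key) {
    $this->httpClient = $http_client;
    $this->baseUrl = $base_url;
    $this->apiKey = $api_key;
  }

  /**
   * Fetches a reservation and decodes the JSON response into an array.
   */
  public function getReservation($id) {
    $response = $this->httpClient->request('GET', $this->baseUrl . '/reservations/' . $id, [
      'headers' => ['Authorization' => 'Bearer ' . $this->apiKey],
      'timeout' => 10,
    ]);
    return json_decode((string) $response->getBody(), TRUE);
  }

}
```

Centralizing authentication, timeouts, and decoding in one service like this keeps the rest of your business logic free of HTTP details.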

Tip #2: Represent Softrip users with Drupal users

It is useful for the Softrip user to be represented by a Drupal user. Why? Because you can use the normal permissions API to have better control of access to the site. You can also use the Drupal PrivateTempStore for key/value data. Debugging is easier when the data is referenced to a user, rather than an anonymous session. A good way to implement this is to combine the Externalauth contrib module with a custom Authentication Provider.

Tip #3: A shopping cart as a mental model

It is helpful to think of the reservation object returned by the Softrip API as a shopping cart on a commerce site: as the customer moves through the reservation process, items are added, removed, and changed in the cart. Use this mental model in your code and build a PHP object to represent a reservation. The user's private storage (PrivateTempStore) is a good Drupal API for saving the cart state object.
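
In practice, saving and restoring that cart object per user is only a few lines (a sketch; the 'my_softrip' collection name is illustrative, and in real code you would inject the 'tempstore.private' service rather than use the static \Drupal wrapper):

```php
<?php
// Sketch: persisting a per-user reservation "cart" with PrivateTempStore.
// The 'my_softrip' collection name is an illustrative assumption.
$temp_store = \Drupal::service('tempstore.private')->get('my_softrip');

// Persist the reservation object after each step of the process.
$temp_store->set('reservation', $reservation);

// Later requests can restore it (returns NULL if no cart exists yet).
$reservation = $temp_store->get('reservation');
```

Because PrivateTempStore is keyed to the authenticated user, the stored cart also benefits from the user-based debugging advantage described in Tip #2.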

Tip #4: Use Form API… carefully

A typical travel reservation process involves a series of decisions where a form is submitted at each step. Drupal offers a flexible and powerful framework for managing forms called The Drupal Form API. Along with the core framework, it also includes a powerful set of helpers to quickly build Ajax functionality. However, with the power of the form API you get complexity, so be prepared to learn the details and gotchas in the deep internals of the framework.
 
You will find the framework works very well, provided that you strictly follow the rules. The most important rules: if the triggering element is a submit button, that element must have a #submit function where the important logic happens, and the Ajax callback function should do nothing but return a partial form array or an AjaxResponse.
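
Those rules can be sketched inside a form class like this (element and method names are illustrative, not from any real Softrip integration):

```php
<?php
// Sketch of the rules above: the #submit handler carries the business
// logic, while the Ajax callback only returns the part of the form to
// re-render. Element and method names are hypothetical.

public function buildForm(array $form, FormStateInterface $form_state) {
  $form['summary'] = [
    '#type' => 'container',
    '#attributes' => ['id' => 'reservation-summary'],
    // ... summary render array built from the cart state ...
  ];
  $form['choose_room'] = [
    '#type' => 'submit',
    '#value' => $this->t('Choose room'),
    '#submit' => ['::chooseRoomSubmit'],   // the important logic happens here
    '#ajax' => [
      'callback' => '::refreshSummary',
      'wrapper' => 'reservation-summary',
    ],
  ];
  return $form;
}

public function refreshSummary(array &$form, FormStateInterface $form_state) {
  // No state changes here: just return the partial form array to replace.
  return $form['summary'];
}
```

Keeping state changes out of the Ajax callback avoids one of the most common sources of confusing Form API bugs.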

Tip #5: Handle your errors or they will handle you

The only thing worse than writing code to log error messages is not writing it. 
 
Catch and record any API errors you encounter while communicating with Softrip. These will show up as exceptions thrown by Guzzle. Certain errors are flagged as "friendly." These tend to be the result of invalid data entry rather than a technical error. These errors should be displayed so that the user can correct them and resubmit the form. Of course, most data entry errors should be caught by your form validation before being submitted to Softrip.
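
The pattern looks roughly like this (a hedged sketch: the client call and the "friendly" error handling are our illustrations of the approach, not Softrip's actual interface):

```php
<?php
// Sketch of catching and recording Guzzle exceptions from an API call;
// the $client->getReservation() call is a hypothetical client method.

use GuzzleHttp\Exception\RequestException;

try {
  $reservation = $client->getReservation($id);
}
catch (RequestException $e) {
  // Always record the failure for later debugging.
  \Drupal::logger('my_softrip')->error('Softrip API error: @message', [
    '@message' => $e->getMessage(),
  ]);
  // Surface only user-correctable ("friendly") errors on the form so the
  // customer can fix their input and resubmit.
  $form_state->setErrorByName('booking', $this->t('We could not complete your reservation. Please check your details and try again.'));
}
```

Logging every failure while only surfacing correctable errors keeps the user experience clean without losing diagnostic information.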

Tip #6: Embrace change

Warren Buffett once said that one of the best lessons he ever learned was to get used to—and learn to like— things you can’t change. The travel reservation process is not a perfect, indivisible operation. Travel component availability, options, and extras can change between the start of the reservation process and when a reservation is created. Be sure to check immediately after each step and before submitting the final book request for any conflicts between what the user has selected and what is available at that moment in time.
 

Mar 10 2020
Mar 10
Drupal Career Online, Spring 2020

The Spring 2020 class of our Drupal Career Online course is off to a great start with a (sold out!) group of ten amazing students from across the United States. During the 12 weeks, we'll be diving deep into Git, Composer, information architecture, site-building, module and theme development, development workflows, and so, so, so much more!

This semester's class includes students with a wide range of technical backgrounds and experiences. Some quick stats:

  • Includes students from all four U.S. time zones.
  • Includes Windows, Mac OS, and Linux users.
  • Some students with Drupal 7 experience and some who are new to Drupal. 
  • Varying levels of command line and PHP experience.
  • All students are using Docker-based local development environments (Lando and DDEV).
  • One student is returning for their second DCO (they previously took the Drupal 7 version of the course).
  • Three students are associated with the same company.
  • One student is the child of a long-time active Drupal community member. 

Each of our students has different goals and expectations for the course; over its duration, we will be working to personalize their experience via office hours, community mentors, and the high level of student/teacher interaction that the DCO has come to be known for.

In addition to the 7 hours of class time each week, students have access to 4 hours of open "office hours" each week with DrupalEasy instructors where they can ask about anything Drupal- (or Drupal-adjacent) related. Half of the office hours coincide with DrupalEasy's perpetual office hours for all DCO alumni - this serves as a way to help introduce new students to our ongoing DrupalEasy learning community. 

In the next week or so, we'll be looking for community mentors for each of our current students. If you're interested in getting involved, please let us know- this is a great opportunity to give back to the community while growing your own Drupal network. We always try to match up mentors and mentees based on similar Drupal interests and/or technical backgrounds. 

If you, or someone you know, is interested in learning more about the DCO, the next semester begins August 31 - learn more by attending one of our free, 1-hour Taste of Drupal information seminars.

Mar 10 2020
Mar 10

As an application, Federated Search needs to keep pace with changes in React and Drupal. On 1 January 2020, the Search API Solr module deprecated its 1.x branch in favor of 3.x, which supports Solr 7. Similarly, Acquia will be discontinuing Solr 4 in favor of Solr 7 throughout 2020.

In order to best support sites using the Federated Search application, we have created a 3.x version of the application that is compatible with the latest changes in Drupal, React, and Solr. Moving forward, there will be two versions of the application:

  • 3.x - The current feature release. Future work, including Drupal 9 compatibility, will be done on this branch.
  • 2.x - The legacy stable release. No further development work will be done on this release.

Our intention is to allow sites to continue using the 2.x version of the application as long as it suits their needs. We also need to be prepared for upgrading sites to Solr 7 and Drupal 9.

The 3.x version does not add any new features at this time. See the section on Major Changes for information about upgrading your installation.

Requirements

The following software elements are required for each release:

  • 3.x Requirements
    • Solr 6.4 or higher
    • Federated Search React 3.0.0
      • This application is automatically added by the module.
    • Drupal 7.69 or higher
      • Search API Solr 7.x-1.15
      • Search API Federated Solr 7.x-3.0 or higher
    • Drupal 8.8.2 or higher
      • Search API Solr 8.x-3.9 or higher
      • Search API Federated Solr 8.x-3.0 or higher
      • Search API Field Map 8.x-3.0 or higher
  • 2.x Requirements
    • Solr 4.5 - Solr 6.3
    • Federated Search React 2.1.4
      • This application is automatically added by the module.
    • Drupal 7.69 or higher
      • Search API Solr 7.x-1.15
      • Search API Federated Solr 7.x-2.6
    • Drupal 8.8.2 or higher
      • Search API Solr 8.x-1.4
      • Search API Federated Solr 8.x-2.9
      • Search API Field Map 8.x-1.5 (for Drupal 8 only)

Major Changes

When you migrate to a new version of Solr, your schema will need to change. There are detailed instructions for these changes in the README files shipped with the module.

In addition to upgraded support for Solr 7 and 8, the 3.x branch of the application features improved CSS with better namespacing to avoid collisions with existing site code.

Upgrade Steps

  • Update the modules and their dependencies.
  • Configure search indexes for Solr 7.
  • Re-index your site content.
  • Rename any custom CSS to use the new ‘fs-’ namespace prefix. 
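
For a Composer-managed Drupal 8 site with Drush available, the steps above might translate into commands like the following (a sketch; exact project names and commands depend on your setup):

```shell
# Update the modules and their dependencies (Drupal 8, Composer-based).
composer update drupal/search_api_solr drupal/search_api_federated_solr \
  drupal/search_api_field_map --with-dependencies

# Apply any pending database updates.
drush updatedb -y

# After pointing the server/index configuration at Solr 7,
# clear and rebuild the search index.
drush search-api:clear
drush search-api:index
```

Drupal 7 sites would use the equivalent drush pm-update and Search API index commands for that platform.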

Maintenance Plan

Moving forward, the next release will address Drupal 9 compatibility. We are then looking at adding additional features to the application.

365-211 by Canned Muffins licensed under CC BY 2.0.

Mar 10 2020
Mar 10

Do you think first impressions last? Will Rogers once said – “You never get a second chance to make a first impression”. Drupal 8 offers some fantastic themes to help every business create a lasting impression. Let’s delve into some of the top Drupal themes that are most suited (but not exclusively) for Media and publishing websites.

According to research, it takes 50 milliseconds for your website to make a first impression, which means you have 50 milliseconds to visually appeal to your website's users. And 94% of first impressions are related to a website's design. Sure, content is king, but "Queen Design" steals the show. At its core, Drupal is a framework that is excellent at managing content. However, the Drupal community has built and designed a large number of themes to fit every business.

Drupal themes consist of HTML markup and CSS files that help define the visual appeal of your website. Drupal 8 core comes packed with several basic but solid themes like the Claro theme (administrative theme), Bartik theme, Seven theme, Classy theme and more. Also look out for Drupal 9's (coming soon) new sleek and intuitive default theme, Olivero, which is still a work in progress, but the style looks amazing already!

 

What should media and publishing websites look for in a Drupal Theme

If none of the default Drupal 8 themes fit your media and publishing business, you can choose from a wide variety of third-party themes, which can be categorized into free, paid, and custom Drupal themes. Custom themes are chosen when none of the third-party themes fit a business requirement or when certain CSS styles need to be tweaked. So what makes a good Drupal theme?

  • Mobile Responsive – We all know how important it is to have a responsive website today. Not only does it make a great impression given the growing number of mobile users; it also appeals to Google, which in turn can boost a website's SEO ranking.

  • Improved content readability - with well-marked layouts and clean HTML 5 markup

  • Friendly Editorial experience design – especially important for media and publishing websites

  • Choice of Layouts to fit your needs – 2 column or 3 column layouts to fit content and ads especially for media websites.

  • Customization options – the theme should be easy to customize and offer plenty of configuration options

  • Lightweight – Helps in faster page loading

Top Drupal 8 Themes for Media and publishing websites (actively maintained)

  1. Bootstrap

    The very popular and widely used Bootstrap framework is a front-end component library that is ideal for any type of website. The Bootstrap Drupal theme is responsive and acts as a bridge between Drupal and the Bootstrap framework. Many themes are based on the Bootstrap framework, some of which we cover next. It offers slick styles and super-fast page load times with the help of the jsDelivr CDN.

  2. Barrio

    This Bootstrap 4 based theme offers flexible layouts that are particularly useful for media and publishing websites. The free Barrio Drupal 8 theme offers 1, 2 and 3 column layouts that are easily configurable. It acts as a base theme on which you can add styles. To completely leverage the power of Bootstrap 4, it is recommended to use the Bootstrap 4 Barrio SASS subtheme. It brings along a variety of styles for every type of element including Color management, dropdown menus, predefined Google fonts and more.

  3. Showcase Lite

    The Showcase Lite Drupal 8 theme is a free theme based on Bootstrap 3 offering a mobile-first layout. It comes with 34 configurable block regions and 1, 2 and 3 column layouts to choose from. Its Superfish menu offers intuitive navigation with mobile-friendly, touch-enabled and keyboard-accessible drop-down menus. A premium version of this theme is also available.

Image source – https://www.drupal.org/project/showcase_lite

 

  4. Creative Responsive Theme

    This is a minimalistic yet stunning Drupal 8 theme that does not depend on any core theme. It is responsive, lightweight and comes with clean HTML5 markup and CSS code. It offers 1 and 2 column layout options, more than 16 block regions and multi-level dropdown menus. Its Nivo slider helps showcase featured content in the banner, allowing administrators to add slides to the slideshow as they please.

Image source – https://www.drupal.org/project/creative_responsive_theme

 

  5. Newsplus Lite

    The Newsplus Lite Drupal 8 theme is a free Bootstrap 3 based theme and as its name suggests, a very lightweight theme. It is great for styling media websites like News sites and magazine sites. Offering 3 column layouts, it is a responsive theme that is simple yet great looking. It comes with clean HTML5 and CSS3 codebase, a rich footer and plenty of other modern features. They also offer a premium version of this theme.

Image source – https://www.drupal.org/project/newsplus_lite

 

  6. Magazine Lite


As the name suggests, this Drupal 8 theme is lightweight and best suited for magazine websites. Based on Bootstrap 3 with mobile-first layouts, it offers clean 1, 2 and 3 column layouts and a Superfish menu for a sleek and stylish news or magazine website.

Image source – https://www.drupal.org/project/magazine_lite

 

  7. Photographer theme

    The Photographer theme is a free Drupal 8 one-page portfolio theme that is chic and classy at the same time. It offers a fully responsive layout that is best suited to showcasing creative and colorful portfolios, and comes with jQuery dropdown menus for modern and easy navigation. It lets you add more than 10 social network profiles and is integrated with Font Awesome.

Image source – https://www.drupal.org/project/photographer

 

  8. Pixel Theme
 

This is a classic, modern and lightweight Drupal 8 theme that is great for media and publishing websites as well as corporate websites. It is fully responsive and offers 3 column layouts to showcase your work. You can configure up to 20 block regions, and it comes with a responsive Superfish menu.

Image source – https://www.drupal.org/project/pixels

 

Mar 10 2020
Mar 10

Landing pages are great for product presentation and customer engagement.

They are a must for today's marketing campaigns, mobile advertising and sales development.


There is no easy way to build a simple landing page in Drupal.

You can use custom themes or layout modules such as Parade, but it is not that simple, and layout options are limited. For instance, we used Parade for the front page of the EK demo application. The module does the job: you can build a simple landing page without custom development, but it requires a lot of dependencies for a simple page, and you may still have to do some CSS editing.

In this article, we will explain how our landing page was constructed within a Drupal 8 website using a separate dedicated theme and a custom module with Twig templates.

The original page, which was a stand-alone one-page theme, is now fully integrated into the website.

It may not be the best method, but it can be easily replicated and gives more room for creativity and extra flexibility compared to a layout module or a full landing-page theme.

Part 1: the custom theme

To achieve this, we created a custom theme with only one region for content. When building a one-page theme, you usually do not want side columns or extra headers and footers.

To create your theme, you only need one file, saved under your custom theme folder in the Drupal 8 "themes" folder: myTheme.info.yml.

type: theme
base theme: false
name: 'EK'
description: 'Built to use land page'
version: VERSION
core: '8.x'
regions:
  content: Content
  footer: Footer

This is what is needed to create the basic theme that will be used in our landing page. We keep a region "footer" to insert hidden blocks or content.

This theme will be invoked on specific route names and will replace the site's default theme.
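
As a preview of how such a per-route switch can work (a hedged sketch: the module name, route name, and class are our own illustrations; the full module is covered in the next part), Drupal 8 lets a module pick a theme per route through a theme negotiator:

```php
<?php
// Hypothetical sketch: activate the 'ek' landing theme on one route.
// The 'my_module.landing' route name and theme machine name are assumptions.

namespace Drupal\my_module\Theme;

use Drupal\Core\Routing\RouteMatchInterface;
use Drupal\Core\Theme\ThemeNegotiatorInterface;

class LandingThemeNegotiator implements ThemeNegotiatorInterface {

  public function applies(RouteMatchInterface $route_match) {
    // Only act on our landing page route.
    return $route_match->getRouteName() === 'my_module.landing';
  }

  public function determineActiveTheme(RouteMatchInterface $route_match) {
    // Machine name of the custom one-page theme created above.
    return 'ek';
  }

}
```

The class is then registered in my_module.services.yml as a service tagged with theme_negotiator so Drupal consults it when resolving the active theme.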

You can add a screenshot image also in the theme folder if you want to enhance your admin view.


In the next step, we will explain how our custom module switches the theme for a dedicated URL and builds the landing page with a Twig template. For that step, you will need some knowledge of creating a simple module, inserting libraries, and making a Twig template.

Mar 09 2020
Mar 09

This is Part 2 of our article on how to tweak BLT so that it can work with Pantheon. If you want more information about BLT, Pantheon, and our setup, go to How to Set Up BLT and Gitlab CI to Work with Pantheon Hosting (Part 1). In this part of the article, we'll go into the details of setting up the CI/CD system in a Drupal project with BLT and Gitlab CI.  

Part 2: Use BLT with Pantheon

We'll set up a development workflow with:

  • BLT: a suite of tools that wraps around your Drupal project
  • Gitlab CI: the pipeline to validate and build artifacts
  • Pantheon: hosts the Drupal site

In this tutorial, we'll set up a Drupal website with the name drupal-books-api.

The Setup 

0. Create the project on Pantheon 

Create the Drupal project on Pantheon, then switch the connection mode to Git and copy the Git URL.

1. Create project with BLT 

On your computer, create the project by running

composer create-project --no-interaction acquia/blt-project drupal-books-api

Open the file blt/blt.yml and update git.remotes with the Git URL

git:
   default_branch: master
   remotes:
      - ssh://[email protected]xxx.drush.in:2222/~/repository.git

2. Add Gitlab CI 

Gitlab CI, as its name suggests, is a service from Gitlab that provides CI/CD pipelines to help build, test, and deploy applications.

We'll start by adding a file named .gitlab-ci.yml to the project's root.

## 1. use docker image composer
image: composer
    
## 2. define custom variables
variables:
    BLT: ./vendor/bin/blt
    
## 3. cache: reuse packages fetched by a previous job
cache:
    paths:
        - $HOME/.npm
        - $HOME/.nvm
        - vendor
        - docroot/core
        - docroot/modules/contrib
        - docroot/themes/contrib
        - docroot/profiles/contrib
        - docroot/libraries
    
## 4. additional setup, prior to running the main tasks
before_script:
    # Setup SSH key to push artifact to deploy to server
    - mkdir -p ~/.ssh
    - eval $(ssh-agent -s)
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config
    - echo "$DEPLOY_PRIVATE_KEY" > ~/.ssh/id_rsa && chmod 0600 ~/.ssh/id_rsa
    - ssh-add ~/.ssh/id_rsa
    # Install rsync as needed by BLT
    - apk add rsync --no-cache
    
## 5. define stages
stages:
    - validate
    - build_deploy
    
## 6. stage validate
validate:
    stage: validate
    script:
        # Install required dependencies using composer
        - composer install --ignore-platform-reqs
        # Validate composer, phpcs ...
        - $BLT validate
    
## 7. stage build and deploy
build_deploy:
    stage: build_deploy
    only:
        - master
    script:
        - $BLT artifact:deploy --commit-msg "$CI_COMMIT_TITLE" --branch "master" --ignore-dirty --ignore-platform-reqs --no-interaction --verbose

Explanations: 

1. Use Docker image composer: This defines the Docker image that the executor will run to perform CI tasks. This setup uses the composer image. If your CI/CD pipeline has specific requirements, e.g. a specific PHP version, you can always find another prebuilt Docker image from DockerHub. Or if none suits your purpose, you can create and push a custom Docker image for your team. 

2. Define custom variables: These variables will later be used during the CI tasks. In this example, we defined a variable named BLT equal to ./vendor/bin/blt. Then in step 6, $BLT validate simply means running the script ./vendor/bin/blt validate.

3. Cache: Cache the downloaded dependencies so that in the next stage, it doesn't have to download them again from the Internet. This helps speed up the running time of the jobs. 

4. Before script: additional tasks to run prior to the main tasks. At the end of the build process, we want Gitlab CI to push the latest version to Pantheon. Since we set up Pantheon in the first step to accept Git pushes via the SSH protocol, the container in which Gitlab CI runs our tasks needs to be able to identify itself with Pantheon. In this step, we simply create an SSH key ~/.ssh/id_rsa with content from $DEPLOY_PRIVATE_KEY. $DEPLOY_PRIVATE_KEY is a predefined environment variable that we set up in our Gitlab project by going to Gitlab > Your Project > Settings > CI/CD > Variables.

The value of DEPLOY_PRIVATE_KEY is the private key, which can be retrieved by 

pbcopy < ~/.ssh/id_rsa
# Then paste it into CI/CD variables

Or if you don't have pbcopy, simply copy the contents of ~/.ssh/id_rsa


Note 1: Make sure that the SSH key id_rsa.pub is added to your Pantheon account, otherwise the Pantheon server has no idea who is pushing code. 

Note 2: It's a good practice to set up a separate SSH key for deployment for each team. 
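Following Note 2, creating a dedicated deployment key is a one-liner; a minimal sketch (the file name pantheon_deploy is just an example) could look like:

```shell
# Generate a dedicated deploy key with no passphrase
ssh-keygen -t rsa -b 4096 -f ~/.ssh/pantheon_deploy -N "" -q
# Add the public half to the Pantheon account, and paste the private half
# into the Gitlab CI/CD variable DEPLOY_PRIVATE_KEY
cat ~/.ssh/pantheon_deploy.pub
```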

5. Define stages: Define how many stages you should have in your pipeline. In our simple project, we have two stages: validate code and build the artifact, then deploy it. 

Steps 6 and 7 include tasks to run in each of the stages. 

6. Install dependencies and perform validation: Although BLT will run composer install during the process of creating an artifact, this step is required to make sure BLT and its dependencies are present and up-to-date before running any BLT commands afterward. $BLT validate runs a group of commands below: 

blt tests:composer:validate
blt tests:php:lint
blt tests:phpcs:sniff:all
blt tests:yaml:lint:all
blt tests:twig:lint:all

7. Build the artifact and deploy to Pantheon's server: If all is good, Gitlab CI jumps to the second stage by running the command 

blt artifact:deploy 
  --commit-msg "$CI_COMMIT_TITLE" 
  --branch "master" 
  --ignore-dirty 
  --ignore-platform-reqs 
  --no-interaction --verbose

This will create an artifact, then push it to the branch master of the remotes defined in file blt.yml mentioned in step 1. At the end of the process, you'll see the commit passed through to Pantheon's dashboard.

Notice that in step 7, there is an only declaration that accepts master. This means the build_deploy step triggers only when an action is made on the branch master.

Let's say your team follows the Gitflow workflow. When a developer pushes a feature branch (feature/00000-change-header-color), Gitlab CI should run the validate stage to verify and sniff code, but should not deploy it right away to Pantheon. Instead, the developer would create a Merge Request against master. Once approved and merged, CI is triggered to validate again, then build and deploy to Pantheon.

So, with the addition of the file .gitlab-ci.yml, we are able to orchestrate a Gitlab CI instance to validate code and push it to Pantheon when things are right. 

3. Update settings.php 

At this point, we have the CI set up and configured, but that's not enough for our small project to run on Pantheon. We need the database connection.

Drupal projects created by Pantheon come with a modified version of settings.php and an additional file, settings.pantheon.php. These files allow your project, when running on Pantheon, to read the database credentials from the JSON data in $_SERVER['PRESSFLOW_SETTINGS'] and use them to connect your Drupal site to the correct database.

A project created with BLT doesn't come with this setup by default, so we need to update settings.php and settings.pantheon.php from the Pantheon repo. 

Note: Make sure to have hash_salt in your settings.php 

$settings['hash_salt'] = '41kFdvIe95v0tbqQWoxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-JJwoEW4iQeHer-wkMB3rgAXkVFQ';
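If you need to generate a fresh value for hash_salt, one possible approach (among others, such as drush's random-bytes helpers) is:

```shell
# Generate a random base64 string suitable for use as hash_salt
openssl rand -base64 55 | tr -d '\n'
```

Paste the output into the $settings['hash_salt'] line of settings.php.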

4. Add the pantheon.yml file 

While Acquia prefers to put Drupal code in the docroot directory, Pantheon nests the docroot in a directory called web. So we need one final tweak: add the file pantheon.yml to the project root:

# Put overrides to your pantheon.upstream.yml file here.
# For more information, see: https://pantheon.io/docs/pantheon-yml/
api_version: 1

# PHP Version:
# https://pantheon.io/docs/pantheon-yml#php-version
# Set site's PHP version to 7.2
php_version: 7.2

# Nested Docroot
# https://pantheon.io/docs/pantheon-yml#nested-docroot
web_docroot: true

If your project doesn't have this file, Pantheon will create one. Since we want to specify web_docroot: true, creating a pantheon.yml tells Pantheon not to overwrite our custom setup. 

And lastly, we need a symlink 

ln -s docroot/ web

Now, commit the changes to Gitlab and wait for the green check marks in Gitlab CI/CD > Pipelines. 

Where to Go From Here 

Now that everything is in place, development workflows in teams are simpler and more manageable: 

  • Developers can just worry about their code and dependencies, not the deployment. With BLT and Gitlab CI, only SASS files make their way to the Git repo, so there's no more resolving of CSS conflicts, no more commits of type "Recompile CSS" 
  • Vendor/, core/, modules/contrib/, libraries/ can be excluded from the Git repo 
  • Deployment is more consistent as the build process runs through the same list of predefined tasks 

A more complex setup can include: 

  • Custom Docker image to run Gitlab CI tasks 
  • More code validating 
  • Automated testing 
  • Complex approval process 
  • Post deployment notifications

This is one way to make BLT work with Pantheon. Let us know if you have comments or questions!

Mar 09 2020
Mar 09

Due to the worldwide Coronavirus/COVID-19 response and in consideration of the health of our community, we have reconsidered the implications of moving forward with business as usual for MidCamp. While there are currently no travel or gathering warnings for Chicago, we need to consider that historically, >40% of our attendees travel to the event. Additionally, current advice and historical evidence suggest that proactive steps are essential in containing the spread of the virus.

Fortunately, we have the tools and ability to share our information in a remote manner, and as such we’re moving MidCamp to a 100% remote format this year. There are many details to work out, but our aim is to deliver the same quality content on the same schedule as we would have done in person.

What does this mean for attendees?

  • All social events will be cancelled.

  • All Wednesday trainings will be cancelled.

  • All tickets will be refunded.

    • We are still accepting individual sponsorship donations. Donations will go toward defraying cancellation costs, the costs of the virtual event, and our 2021 event.

  • If you’ve booked accommodations with the Midcamp 2020 group at Hotel Versey, call (773) 525-7010 to cancel. 

Our goal is to maintain the high-quality content you've come to expect from MidCamp without adding any risk to the community we care so deeply about. We appreciate your patience and flexibility with this change of format. If you have any questions at all, feel free to contact us at [email protected], or hop into our Slack https://mid.camp/slack.

Thanks again for your support!

Mar 09 2020
Mar 09

As developers, we all want to maximize the use of the tools we've become familiar with. For me, this is BLT by Acquia. Since BLT is built around Acquia's workflows and Acquia Cloud, it's not designed to work with Pantheon, which has a different setup and development workflow.

This article will help you make tweaks so that BLT can work with Pantheon. The setup in this article is deliberately simple in order to explain the approach and concept. The real world setup is usually much more complex and depends on the needs of each project.

Also in this article, we will implement a simple CI/CD pipeline to conduct a seamless flow of continuous integration from the developer's code to the phase where code is delivered to the deployment server.

To be as thorough as possible, we will break the article into two parts:

  • Part 1: We'll explain what parts make up the system and why they're important.
  • Part 2: We'll show you how to implement the setup.

TL;DR: if you're already familiar with BLT and Pantheon, go to How to Set Up BLT and Gitlab CI to Work with Pantheon Hosting (Part 2). Otherwise, read on.

Part 1: What and Why

The CI/CD Pipelines

A typical Drupal project includes a Drupal code base, version control software (Git), dependencies management tools (Composer, NPM, Bower), and development and production environments.

The pipelines connect all parts of the project together and orchestrate them to do their jobs at the right moment, creating a flow of updates from developers to the deployment server.

In our setup, the workflow looks like this:

  1. Developers push code to Gitlab.
  2. Gitlab CI gets triggered and spins up a Gitlab runner, which is a build instance. Gitlab runners run each and every task defined in .gitlab-ci.yml. Tasks include: 
    • Sniffing code 
    • Installing composer dependencies 
    • Running npm or gulp tasks to build front-end assets 
    • Running tests if needed 
    • Sanitizing and building artifacts
  3. Push the artifact above to Pantheon.

Pipelines done right can increase product quality thanks to the layers of validation. They can also increase the speed of deployment, which means that new features and bug fixes will reach production faster.

Pantheon

Pantheon is one of the biggest names in Drupal hosting. Pantheon's main workflow includes three main environments (Dev, Test, Live) plus Multidev, which creates extra development environments that teams can use to test feature branches before merging back to Dev.

Acquia BLT

In some big Drupal projects where many developers are involved, automation is required and managing configurations across many Dev-Stage-Preprod-Prod environments becomes complicated. Things can easily become chaotic due to the different ways each member of the team approaches problems.

Acquia BLT is a suite of tools that wraps around your Drupal project, which helps you manage team projects in a more standardized way. It also adds an automation layer on top of Drupal, making the implementation of continuous integration and deployment easier.

Some notable features of BLT:

  • Local Git hooks: Sniffing coding standards on each commit, or even enforcing a standard pattern for git commit messages. This is very useful; for example, when we set the git commit pattern to 00000 - description of commit, it helps track down the commit and the task in PM tools such as Jira or Redmine.
  • Automation tasks: Tasks such as compiling front-end assets and composer install.
  • Artifact generation: Creating production-only artifacts. Generating a safe artifact to deploy to another git repo involves a lot of steps. During the process of generating artifacts, BLT goes into the directory ${project.root}/deploy, checks out the main code base, installs dependencies, builds front-end assets, sanitizes the code base, and creates an artifact ready to deploy. This process is complex given the number of tasks to run and the number of files to include in or exclude from the artifact. BLT is built on a set of best practices in Drupal development and does the heavy lifting in sanitizing and preparing the code.
  • CI/CD: Works with Acquia Cloud pipelines and Travis CI or Gitlab CI.

An Example

In some projects, we had to integrate Drupal with Okta authentication using the simplesamlphp_auth module. The Okta configuration needed to be declared in vendor/simplesamlphp/simplesamlphp/config/config.php, which could be erased when running composer install. So we needed to add extra composer scripts (post-install-cmd or post-update-cmd) to re-apply our configuration to the SimpleSAMLphp package.

It's manageable, but when a new developer comes onboard, you'll need to explain all these details again and again.

With BLT, it's simple to set up simplesamlphp_auth: enable SimpleSAMLphp in blt/blt.yml, and BLT will remember to include your config from ${project.root}/simplesamlphp/config during deployment.
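In the BLT versions we used, this boils down to a one-line flag in blt/blt.yml (check the docs for your BLT release, as configuration keys move between versions):

```yaml
# blt/blt.yml
simplesamlphp: true
```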

This is a simple use case that we like from BLT. Even if you're experienced and can handle all DevOps and automation tasks by writing your own scripts, it's still hard to keep your whole team in sync. This setup automatically includes your config during deployment, so you don't have to go through all the steps again.

Implementation of the Setup

In How to Set Up BLT and Gitlab CI to Work with Pantheon Hosting (Part 2), we'll go into the details of the setup and show you how to run a Drupal site wrapped by BLT on a Pantheon server.

Mar 09 2020
Mar 09

Drupal 8 migration is an increasingly popular decision among Drupal 7 website owners. This is explained by the benefits of Drupal 8, as well as by the fact that migration to D8 is the best way of getting ready for Drupal 9.

Migration involves plenty of aspects. Today, we will review a very important, but often overlooked one — SEO. Discover the best SEO practices in planning a Drupal site migration to the 8th version.

A little introductory note: is Drupal good for SEO?

Before we begin diving into the migration and SEO peculiarities, we want to answer one of the most popular questions of our customers: is Drupal good for SEO? The answer is a definite yes. By means of both built-in and contributed Drupal SEO modules, your website can be provided with everything for better positions in SERP:

  • SEO-friendly descriptive page URLs
  • meta tags for your content
  • effective content categorization via taxonomy
  • proper URL redirects
  • helpful analytics
  • XML sitemap creation
  • robots.txt creation
  • broken link checking
  • real-time SEO tracking

and much more.

How does Drupal 8 migration impact SEO?

The difference in architecture between D7 and D8 has created a challenging Drupal upgrade path (although migrations have definitely become much easier today). There are classic migration steps like creating a fresh D8 instance and moving the content, users, and configuration to it. This is done via automation, or recreated manually in some cases.

The choice depends on the complexity of the migration, the amount of content, and more. For example, when you need to migrate content from Drupal 7, it can be more feasible to simply republish the nodes if there are fewer than 50.

In any case, Drupal 8 migration should be performed with respect to SEO, otherwise various issues may arise. They may lead to losses in website traffic, SEO rankings, brand visibility, customer satisfaction, and conversions. Of course, no website owner wants this to happen, because they have invested their money, time, and effort into gaining it all through high-quality content.

Consider, for example, a very common SEO issue such as broken links, which appear if your pages change their URLs during a careless content migration. In this case:

  • search engines find missing pages and decrease your rankings
  • users access your content via bookmarks and become disappointed
  • backlinks to your site on third-party websites and social media are broken as well

This looks like a total SEO disaster! However, this is just one example — read on to discover more. This proves how important the role of an SEO expert is in a Drupal 8 website migration plan.

The workflow of an SEO expert during a Drupal 8 migration

For SEO success during migration, an SEO expert should be involved from start to finish:

  1. Do an SEO audit before the migration to discover what needs improvement, what brings the most value, what should definitely stay unchanged during the migration, etc.
  2. Review the content to see what needs to be rewritten or maybe what obsolete content needs to be deleted.
  3. Create the SEO requirements for developers for what should be achieved so they can use them as a guide to plan the necessary steps and choose the right tools.
  4. Stay in touch during the migration so developers can consult the SEO expert with any questions that arise.
  5. Thoroughly check the migration results in every SEO-related detail before the new Drupal 8 website is deployed to live.

What’s important: SEO checklist for a smooth Drupal 7 to Drupal 8 migration

The necessary things to do differ from website to website and are based on its SEO audit. However, we have asked our SEO experts for a common Drupal migration checklist of important SEO issues to consider. Here it goes:

Web page URLs

  • The website should be available to users at the same address as was used for Drupal 7. Take care to redirect from the WWW to the non-WWW version, as well as from the version with a trailing slash (https://my-website.com/). If the website uses HTTP, it is highly recommended to switch to HTTPS.
  • Website structure and navigation should be preserved if no changes are planned. Sticking to the same structure is highly recommended in order to save a lot of trouble with URLs.
  • All page URLs should be preserved (because even the slightest URL change can cause broken links!). If the website structure changes so that you cannot keep the same URLs for some pages, make 301 redirects for them. The Redirect module can take care of the redirects to the new addresses.
  • The logic of adding new nodes in the future should also be kept the same. The Pathauto module will help you generate new URLs according to the necessary logic patterns (e.g. your-website/category/blog-title).
  • You need a well-designed 404 page in place. If you would like your website to search for some content instead of just showing the 404 page, try the Search 404 Drupal module.
  • It is recommended to make all website URLs relative.
  • The use of the Canonical tag should be preserved.
  • Provide a rel="nofollow" attribute for external links.
  • Image URL paths used in content should be preserved.
  • Internal linking should be preserved.
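Alongside the Redirect module, server-level 301 redirects are a common safety net for changed URLs; a minimal Apache sketch (with hypothetical old and new paths) could look like:

```apacheconf
# .htaccess: permanent redirects for pages whose URLs changed during migration
Redirect 301 /old-blog/my-post /blog/my-post
Redirect 301 /old-category /category
```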

Metadata & tags

  • Meta tags and the way they are generated should be preserved. You can use the Metatag module to provide automatic meta tag generation for content according to the specified patterns.
  • The ALT and TITLE tags for media should also be carefully migrated. Please note that the ALT tag is required by default in Drupal 8, and changing this is not recommended for web accessibility reasons.
  • H1-H6 tags and the way they are generated need to be kept.
  • Twitter Cards metadata and the way it is generated need to be kept.
  • Open Graph metadata and the way it is generated need to be kept.
  • Structured data needs to be carefully migrated. The Schema.org Metatag module can be of great help.
  • HTML lang attributes should be preserved.
  • Meta viewport should be preserved.

Configuration & other important SEO elements

  • You need to make sure all the webforms have the proper configuration and are working so you can get your leads, customers, subscribers, etc.
  • XML Sitemaps should be preserved (if the website structure changes, it’s necessary to create recommendations as to its configuration with further checking). The XML Sitemap Drupal module can be of great help.
  • The Robots.txt should be carefully migrated (if the website structure changes, it’s necessary to create recommendations as to its configuration with further checking). The RobotsTxt Drupal module can take care of that.
  • All important scripts for third-party services should remain fully functional on your website.
  • Content categories, node publication dates, pagination, page tags, etc. need careful migration.
  • The website’s favicon should be preserved.
  • And there is more, depending on a particular website!

Website loading speed

It’s necessary to take care of the page loading speed because it is one of the SEO ranking factors and also has a direct influence on website traffic and conversions.

Mobile-friendliness

A key part of the Drupal 8 migration and SEO checklist is the proper site display across all desktop and mobile devices. Mobile-friendliness influences your rankings according to Google’s mobile-first approach and boosts your traffic.


Let us perform your Drupal 8 migration with best SEO practices

If you want to be sure you migrate to Drupal 8 without losing SEO, we have great news for you. Our Drupal support and maintenance team specializes in Drupal 8 migrations and also has SEO experts on staff. They will consider all your website needs and ensure an SEO-friendly upgrade process.

Plan your migration with us and welcome to the innovative Drupal 8 with all your SEO gains retained!

Mar 09 2020
Mar 09

What is the future?

Our future user experience with computers is going to be significantly based on the intersection of personalization and voice assistants. Simply put, voice assistants are going to completely understand every question being asked, find the perfect personalized answer to every question, and empathetically return an appropriate response.

I have become slightly obsessed with voice assistants. In my limited free time, I am exploring both Alexa and Google Assistant. Both technologies have their strengths, weaknesses, and differences, but their underlying user interaction and even their back-end code are glorified if/then statements. For example, “if” an end-user asks a question like, “What is the weather?”, “then” the back-end code looks up the weather for the user’s current location and returns a response.

One of the key challenges for building useful voice assistant applications is conversational design. Although we all know how to have a conversation, designers and developers will need to discuss the problem and figure out the solution. A secondary challenge that I am noticing for creating engaging voice user experiences is providing the data behind the voice. Organizations are going to have to restructure their data to be more omnichannel and consumable by voice applications. For example, Mayo Clinic recently discussed how they had to rethink their editorial process to create content that is more distributable to voice channels.

As we begin to develop voice assistant applications and strive to build personalized user experience, everyone is going to come to the realization that we need to rethink how we structure, share, and consume data. If we collectively want to succeed, we need to collaborate and work together to define and implement standardized data structures.

Defining, standardizing, and structuring our data using Schema.org

The internet and modern computing exist because collectively we have become good at collaboratively creating and sharing open standards and open-source software. Schema.org is the most recent standard to emerge and the blog post excerpt below established who is behind Schema.org and why these organizations are working together.

“On June 2nd (2011) we announced a collaboration between Bing, Google and Yahoo to create and support a standard set of schemas for structured data markup on web pages. Although our companies compete in many ways, it was evident to us that collaboration in this space would be good for each search engine individually and for the industry as a whole.”

-- http://blog.schema.org/2011/07/on-june-2-nd-we-announced-collaboration.html

The homepage of Schema.org succinctly defines what Schema.org is.

“Schema.org is a collaborative, community activity with a mission to create, maintain, and promote schemas for structured data on the Internet, on web pages, in email messages, and beyond.”

-- https://schema.org/

Based on the popularity of Schema.org, there is little need for any further explanation or even examples because everyone is implementing it…we are just doing it wrong. “Wrong” is a very harsh and critical word, and I hesitate to ever use it. At the same time, in this context I am now required to justify its usage and illustrate the mistake that we are collectively making when thinking about Schema.org.

First off, for many existing websites and internet applications, schema is an afterthought that is being implemented on top of our existing web pages and content. Fortunately, adding schema to our web pages usually results in improved SEO and Google page ranking. If we spend the time implementing speakable schema, we can also have our content promoted in voice applications.

To address the growing need to include schema within our webpages, every enterprise CMS offers some mechanism to add schema to generated content. Drupal has a Schema Metatag module. WordPress has a Schema plugin. Adobe Experience Manager has the ability to create, add, and manage metadata schemas. Laying schema on top of our existing content architecture feels more like a workaround or band-aid to the challenge of collectively defining, standardizing, and structuring our data. Everyone is still defining their unique content architecture.

We are failing to address one of the biggest challenges in computer science... naming things

If we step back and think about it, there are many different ways to describe something as simple as an image object. For example, should the text associated with the image be called title, caption, label, or text, and then how do we want to describe the image’s URI and metadata?

Would it be possible for everyone to standardize on one canonical definition of an image object? What would happen if a content management system adopted a Schema-First approach?
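As an illustration, Schema.org already offers one canonical answer to this naming question; a minimal JSON-LD sketch of an ImageObject (all values hypothetical) might look like:

```json
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "name": "Sunset over Lake Michigan",
  "caption": "A sunset photographed from the Chicago lakefront",
  "contentUrl": "https://example.com/images/sunset.jpg",
  "width": "1200",
  "height": "800"
}
```

Whatever a CMS calls these fields internally, agreeing on name, caption, and contentUrl at the schema level is what makes the data portable.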

A Schema-First approach

Several people have begun talking about the benefit of a well-defined schema at the very beginning of a project.

“I think it would be fantastic if most vendors would be able to use schema.org as a starting point.”

-- Using schema.org as a starting point for (headless) WCM

“What’s the solution, then? Establishing a single source of truth through the use of schema-first design can help align software development. Identifying a common goal, and establishing a source for teams and processes to align themselves with is incredibly important, and in this piece, we’ll give you the understanding, and some tools, to do exactly that.”

-- Using A Schema-First Design As Your Single Source of Truth

As we start creating decoupled, headless content management solutions, it becomes more critical that the front-end teams get the expected data in the expected format. “Communication” is the single word that best describes the overarching benefit to a Schema-First approach. The idea that our websites and applications communicate using the same data structures would make the concept “Omnichannel publishing” a thing of the past and change how we syndicate and aggregate information.

Besides “Communication,” there are several secondary benefits worth highlighting and using as arguments with stakeholders who need to understand the benefits of going Schema-First.

SEO

“Google uses structured data that it finds on the web to understand the content of the page, as well as to gather information about the web and the world in general.”
-- https://developers.google.com/search/docs/guides/intro-structured-data

Everyone, especially site owners and marketers, wants a website with excellent SEO, which increases the website's overall Google page ranking. Schema.org’s structured data is specifically designed to help search engines better understand shared content. SEO is the reason most organizations have added schema to their web pages. Now, we are just taking it one step further and building web pages on top of a well-defined shared schema.

Omnichannel

Omnichannel is a cross-channel content strategy that organizations use to improve their user experience and drive better relationships with their audience across points of contact.
-- https://en.wikipedia.org/wiki/Omnichannel

COPE (Create Once, Publish Everywhere), an approach to content management that has been rebranded as “Omnichannel,” has helped people understand the importance of creating and distributing content. COPE and Omnichannel are still focused on reaching users across an organization’s contact points. A Schema-First approach would make the “everywhere” in COPE mean “everyone”: everyone inside and outside an organization would be able to share and consume data.

Syndication & Aggregation

Any organization that has implemented schema on its web content has seen the benefit: its content is more accessible within search results and voice applications. It is still hard for people to conceptualize websites and applications seamlessly pushing and pulling data from one site to another, or to imagine that every single webpage with biographical or location information could structure that data in the same way.

Even though we are creating API-First content management systems, developers still have to document the data being syndicated, and the developers aggregating that data need to understand, transform, and consume it. It pains me to admit that many organizations’ internal teams still struggle with sharing and consuming data.

Shouldn’t an organization press release be automatically consumable by any application?

An amazing proof-of-concept would be a Schema-First CMS able to pull in any NewsArticle from a website, like The New York Times, which implements Schema.org for its articles. A content manager should be able to cut and paste a URI and, with absolutely no code to normalize or massage the data, the external content should be available within the CMS.
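As a sketch of what the “no code to normalize” half of that proof-of-concept might build on, the snippet below (hypothetical helper names, standard library only) extracts `application/ld+json` blocks from an HTML page and keeps those typed as NewsArticle; this is roughly the first step a Schema-First CMS would take before mapping the data straight onto its own fields:

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect the parsed contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_json_ld = False
        self._buf = []
        self.items = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_json_ld = True

    def handle_data(self, data):
        if self._in_json_ld:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_json_ld:
            self.items.append(json.loads("".join(self._buf)))
            self._buf = []
            self._in_json_ld = False

def news_articles(html):
    """Return every JSON-LD item in the page whose @type is NewsArticle."""
    parser = JsonLdExtractor()
    parser.feed(html)
    return [item for item in parser.items if item.get("@type") == "NewsArticle"]

page = """<html><head><script type="application/ld+json">
{"@context": "https://schema.org", "@type": "NewsArticle", "headline": "Example"}
</script></head><body></body></html>"""
print(news_articles(page)[0]["headline"])  # Example
```

Because the publisher and the consumer share the same schema, no per-site mapping layer is needed; that is the whole point of the approach.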

Limitations

“Now, one thing that schema.org won’t do for you is map to your product strategy, content strategy or layout.”

-- https://markdemeny.com/2019/09/using-schema-org-as-a-starting-point-for-headless-wcm/

A Schema-First approach is not a complete solution, but it provides a rock-solid foundation for building out an organization’s digital strategy and user experience. We also have to recognize that Schema.org is continually evolving, improving, and even deprecating some entities and properties. Fortunately, as one of my next steps, I want to explore prototyping a Schema-First Content Hub/Repository using Drupal, whose community, in the upgrade from Drupal 8 to Drupal 9, is learning how to evolve and properly deprecate code and data structures.

Next steps

Analyze, Prototype, Evangelize

The concept of taking a Schema-First approach to the Information Architecture behind a Content Management System is not going to be difficult to sell; the challenge is going to be implementing it successfully.

Schema.org is an evolving standard, and its shortcomings need to be analyzed and discussed. Examining how we transform our legacy content structures to conform to Schema.org’s specifications could show us what is missing from those specifications. We also need to determine what data should and should not be modeled via schema; for example, we should think about how to manage presentation information and where it should live.
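To make that transformation exercise concrete, here is a minimal sketch (the legacy field names are invented; the Schema.org property names are real) of mapping a legacy article to Schema.org properties while surfacing, rather than silently dropping, fields such as presentation hints that have no schema home:

```python
# Hypothetical legacy field names mapped to real Schema.org property names.
LEGACY_TO_SCHEMA = {
    "title": "headline",
    "summary": "description",
    "posted_on": "datePublished",
    "writer": "author",
}

def to_schema_org(legacy_node):
    """Map legacy fields onto a Schema.org Article.

    Returns (mapped, unmapped): anything without a schema equivalent,
    such as layout or presentation hints, is reported back so the team
    can decide where that data should live."""
    mapped = {"@context": "https://schema.org", "@type": "Article"}
    unmapped = {}
    for field, value in legacy_node.items():
        if field in LEGACY_TO_SCHEMA:
            mapped[LEGACY_TO_SCHEMA[field]] = value
        else:
            unmapped[field] = value
    return mapped, unmapped

mapped, unmapped = to_schema_org(
    {"title": "Hello", "posted_on": "2020-03-16", "hero_color": "#336699"})
```

Running the audit across a real content model would show, field by field, where Schema.org falls short and where presentation data needs a separate home.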

I am more of an implementation person than a specification person; everyone has their strengths and weaknesses. This is why I frequently use the word “collaboration” throughout my blog posts, to encourage everyone to contribute. The best feedback I will be able to provide will come from implementing and prototyping a Schema-First content management solution. Fortunately, there are people like Peter F. Patel-Schneider at Nuance Communications, Inc. analyzing Schema.org (Article - Video).

Of course, I am guilty of optimistically assuming that Drupal and its community can solve any problem. At the very least, Drupal is the open-source leader for enterprise content management and user experiences. And fortunately, I am not alone in thinking that Drupal could be a forerunner for a Schema-First CMS.

“So that’s why I’ve been thinking about how it would be if we just had one CMS which would, obviously, be perfect. The perfect CMS, or PCMS, for short...PCMS doesn’t exist, but many of the concepts mentioned in this article do. Drupal has something called the Content Construction Kit which brings ‘custom fields’ to a new level.”
-- A proposal for a perfect CMS

I want to follow up this post with a proposal for prototyping a Schema-First implementation of a decoupled Drupal application, or maybe even more specifically, a “Content Hub/Repository.”

Evangelize

For open source and open standards to succeed, we collectively need to evangelize our thoughts and ideas. I know some organizations have invested time and resources in building better content models based on Schema.org. These organizations need to share their experience and get involved by encouraging clients to use and improve these standards.

For now, as your organization layers Schema.org on top of its existing websites, think about the possibility of a single way for everyone to organize, structure, and share data. I believe there is one, and Schema.org might be the solution.

Mar 09 2020
Mar 09
Many production websites out there are still happily running on Drupal 7, while newer ones have been started on Drupal 8, but the question keeps coming up: should we wait for Drupal 9, and if so, when is it due for release?
Mar 08 2020
Mar 08
Drupal 8 will be released on November 19 | Wunderkraut

Coincidence?

We're ready to celebrate and build (even more) amazing Drupal 8 websites. 
On November 19 we'll put our Drupal 8 websites in the spotlight...be sure to come back and check out our website.

By Michèle Weisz


Want to know more?

Contact us today

or call us +32 (0)3 298 69 98

© 2015 Wunderkraut Benelux

Mar 08 2020
Mar 08
77 of us are going | Wunderkraut

Drupalcon 2015

People from across the globe who use, develop, design and support the Drupal platform will be brought together during a full week dedicated to networking, Drupal 8 and sharing and growing Drupal skills.

As we have active hiring plans we’ve decided that this year’s approach should have a focus on meeting people who might want to work for Wunderkraut and getting Drupal 8 out into the world.
As Signature Supporting Partner, we wanted as many people as possible to attend the event. We managed to get 77 Wunderkrauts on the plane to Barcelona! From Belgium alone we have an attendance of 17 people.
The majority of our developers will be participating in sprints (get-togethers for focused development work on a Drupal project), giving it their all alongside the other contributors at DrupalCon.

We look forward to an active DrupalCon week.  
If you're at DrupalCon and feel like talking to us, just look for the folks in Wunderkraut carrot t-shirts or give Jo a call on his cell phone at +32 476 945 176.


Mar 08 2020
Mar 08
Watch our epic Drupal 8 promo video | Wunderkraut

How Wunderkraut feels about Drupal 8

Drupal 8 is coming and everyone is sprinting hard to get it over the finish line. To boost contributor morale we’ve made a motivational Drupal 8 video that will get them into the zone and tackling those last critical issues in no time.

[embedded content]


Mar 08 2020
Mar 08

Once again, Heritage Day was a huge success.

About 400 000 people visited Flanders' monuments and heritage sites last Sunday. The Open Monumentendag website received more than double last year's number of visitors.

Visitors to the website organised their day out using the powerful search tool we built, which allowed them to search for activities and sights at their desired location. Not only could they search by location (province, zip code, city name, km range) but also by activity type, keywords, category and accessibility. Each search request was added as a (removable) filter for finding the perfect activity.

By clicking the heart icon next to each activity, visitors could draw up a favourites list, ready for printing and taking along as a route map.

Our support team monitored the website making sure visitors had a great digital experience for a good start to the day's activities.

Did you experience the ease of use of the Open Monumentendag website?  Are you curious about the know-how we applied for this project?  Read our Open Monumentendag case.

Mar 08 2020
Mar 08
Very proud to be a part of it | Wunderkraut

Breaking ground as Drupal's first Signature Supporting Partner

Drupal Association Executive Director Holly Ross is thrilled that Wunderkraut is joining as first and says: "Their support for the Association and the project is, and has always been, top-notch. This is another great expression of how much Wunderkraut believes in the incredible work our community does."

As Drupal Signature Supporting Partner, we commit ourselves to advancing the Drupal project and empowering the Drupal community. We're very proud to be a part of it, as we enjoy contributing to the Drupal ecosystem (especially when we can be quirky and fun, as CEO Vesa Palmu states).

Our contribution allowed the Drupal Association to:

  • Complete Drupal.org's D7 upgrade, so new features can now be built on top of it
  • Hire a full engineering team committed to improving Drupal.org's infrastructure
  • Set the roadmap for Drupal.org's success.

First Signature Partner announcement in the Drupal Newsletter

By Michèle Weisz


Mar 08 2020
Mar 08

But in this post I'd like to talk about one of the disadvantages that here at Wunderkraut we pay close attention to.

A consequence of being able to build features in more than one way is that it's difficult to predict how different people interact (or want to interact) with them. As a result, companies end up delivering solutions that, although they seem perfect, turn out in time to be less than ideal and sometimes outright counterproductive.

Great communication with the client and interest in their problems go a long way towards minimising this effect. But sometimes clients realise that certain implementations are not perfect and could be made better. And when that happens, we are there to listen, adapt and reshape future solutions by taking these experiences into account.

One such recent example involved the use of a certain WYSIWYG library from our toolkit on a client website. Content editors were initially happy with the implementation, before they actually started using it to the full extent. Problems began to emerge, leading to editors spending far more time than they should have on editing tasks. The client signalled this problem to us, which we then proceeded to correct by replacing said library. This resulted in our client becoming happier with the solution, much more productive, and less frustrated with their experience on the site.

We learned an important lesson in this process and started using the new library on other sites as well. Polling our other clients on the performance of the new library confirmed that it was indeed a good change to make.

Mar 08 2020
Mar 08

A few years ago most of the requests started with: "Dear Wunderkraut, we want to build a new website and ..." Nowadays we are addressed with: "Dear Wunderkraut, we have x websites in Drupal and are very happy with them, but we are now looking for a reliable partner to support and host ..."

By the year 2011 Drupal had been around for just about 10 years. It was growing and changing at a fast pace. More and more websites were being built with it. Increasing numbers of people were requesting help and support with their website. And though there were a number of companies flourishing in Drupal business, few considered specific Drupal support an interesting market segment. Throughout 2011 Wunderkraut Benelux (formerly known as Krimson) was tinkering with the idea of offering support, but it was only when Drupal newbie Jurgen Verhasselt arrived at the company in 2012 that the idea really took shape.

Before his arrival, six different people, all with different profiles, handled customer support in a weekly rotation system. This worked poorly: a developer trying to get his own job done while also dealing with a customer issue got neither job done properly. Tickets got lost or forgotten, customers felt frustrated, and problems were not always fixed. We knew we could do better. The job required uninterrupted dedication and constant follow-up.

That’s where Jurgen came into the picture. After years of day-job experience in the graphic sector and nights spent on Drupal, he came to work at Wunderkraut and seized the opportunity to dedicate himself entirely to Drupal support. Within a couple of weeks his coworkers had handed over all their cases. They were relieved, he was excited! And most importantly, our customers were being assisted on a constant and reliable basis.

By the end of 2012 the first important change was brought about: Jurgen began working closely with colleague Stijn Vanden Brande, our sysadmin. This team of two ensured that many of the problems that arose could be solved extremely efficiently. With Wunderkraut being both the hosting party and the Drupal party, no needless discussions with an external host took place, and the hosting environment was well known. This meant we could find solutions with little loss of time, and we know that time is an important factor when a customer is under pressure to deliver.

In the course of 2013 our support system went from a well-meaning but improvised attempt to help customers in need to a fully qualified division within our company. What changed? We decided to classify customer support issues into: questions, incidents/problems and change requests and incorporated ITIL based best practices. In this way we created a dedicated Service Desk which acts as a Single Point of Contact after Warranty. This enabled us to offer clearly differing support models based on the diverse needs of our customers (more details about this here). In addition, we adopted customer support software and industry standard monitoring tools. We’ve been improving ever since, thanks to the large amount of input we receive from our trusted customers. Since 2013, Danny and Tim have joined our superb support squad and we’re looking to grow more in the months to come.

When customers call us for support, we do quite a bit more than just fix the problem at hand. First and foremost, we listen carefully and double-check everything to ensure that we understand the customer correctly. This helps take the edge off the huge pressure our customer may be experiencing. Beyond that, we have a list of do's and don'ts for valuable support.

  • Do a quick scan of possible causes by getting a clear understanding of the symptoms
  • Do look for the cause of course, but also assess possible quick-fixes and workarounds to give yourself time to solve the underlying issue
  • Do check if it's a PEBKAC (problem exists between keyboard and chair)
  • and finally, do test everything within the realm of reason.

The most basic don'ts that we swear by are:

  • Never, ever apply changes to the foundation of a project.
  • Support never covers a problem that takes more than two days to fix; at that point we escalate to development.

We are so dedicated to offering superior support that, on explicit request, we cater to our customers' customers. Needless to say, our commitment to support has yielded remarkable results and plenty of customer satisfaction (which makes us happy, too).

Mar 08 2020
Mar 08

If your website is running Drupal 6, chances are it's between 3 and 6 years old now, and once Drupal 8 comes out, support for Drupal 6 will drop. Luckily, the support window has recently been prolonged to three months after Drupal 8's release. But still, that leaves you only a small window of time to migrate to the latest and greatest. But why would you?

There are many great things about Drupal 8, with something for everyone to love, but that should not be the only reason to upgrade. The tool itself will not magically improve the traffic to your site, nor convert its users into buyers; what matters is how you use the tool.

So if your site is running Drupal 6 and hasn't had large improvements in recent years, it might be time to investigate whether it needs a major overhaul to be up to par with the competition. If that's the case, think about brand, concept, design, UX and all of that first to understand how your site should work and what it should look like; only then can we decide whether to go for Drupal 7 or Drupal 8.

If your site is still running well you might not even need to upgrade! Although community support for Drupal 6 will end a few months after Drupal 8 release, we will continue to support Drupal 6 sites and work with you to fix any security issues we encounter and collaborate with the Drupal Security Team to provide patches.

My rule of thumb is that if your site uses only core Drupal and a small set of contributed modules, it's fine to build a new website on Drupal 8 once it comes out. But if you have a complex website running on many contributed and custom modules, it might be better to wait a few months, maybe a year, until everything becomes stable.

Mar 08 2020
Mar 08

So how does customer journey mapping work?

In this somewhat simplified example, we map the customer journey of somebody signing up for an online course. If you want to follow along with your own use case, pick an important target audience and a customer journey that you know is problematic for the customer.

1. Plot the customer steps in the journey

customer journey map 1

Write down the series of steps a client takes to complete this journey. For example “requests brochure”, “receives brochure”, “visits the website for more information”, etc. Put each step on a coloured sticky note.

2. Define the interactions with your organisation

customer journey map 2

Next, for each step, determine which people and groups the customer interacts with, like the marketing department, copywriter and designer, customer service agent, etc. Do the same for all objects and systems that the client encounters, like the brochure, website and email messages. You’ve now mapped out all people, groups, systems and objects that the customer interacts with during this particular journey.

3. Draw the line

customer journey map 3

Draw a line under the sticky notes. Everything above the line is “on stage”, visible to your customers.

4. Map what happens behind the curtains

customer journey map 4

Now we’ll plot the backstage parts. Use sticky notes of a different colour and collect the people, groups, actions, objects and systems that support the on-stage part of the journey. In this example these would be the marketing team that produces the brochure, the printer, the mail delivery partner, the website content team, IT departments, etc. This backstage part is usually more complex than the on-stage part.

5. How do people feel about this?

Customer journey map 5

Now we get to the crucial part. Mark the parts that work well, from the perspective of the person interacting with them, with green dots. Mark the parts where people start to feel unhappy with yellow dots. Mark the parts where people get really frustrated with red dots. What you’ll probably see is that your client starts to feel unhappy much sooner than employees or partners do. It could well be that on the inside people are perfectly happy with how things work while the customer gets frustrated.

What does this give you?

Through this process you can immediately start discovering and solving customer experience issues because you now have:

  • A user-centred perspective on your entire service/product offering
  • A good view of opportunities for innovation and improvement
  • Clarity about which parts of the organisation can be made responsible for producing those improvements
  • All of it in a shareable format that is easy to understand

Mapping your customer journey is an important first step towards customer-centred thinking and acting. The challenge is learning to see things from your customer's perspective, and that's exactly what a customer journey map enables you to do. Based on the opportunities you identify from the customer journey map, you'll want to start integrating the multitude of digital channels, tools and technology already in use into a cohesive platform. In short: a platform for digital experience management! That's the topic for our next post.
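For teams that want to digitize the sticky-note exercise afterwards, the five steps above can be sketched as a tiny data model (the example journey steps and the helper function are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    on_stage: bool   # visible to the customer (above the line)
    sentiment: str   # "green", "yellow" or "red" dot

# One possible journey, mirroring the brochure example in the post.
journey = [
    Step("requests brochure", on_stage=True, sentiment="green"),
    Step("print brochure", on_stage=False, sentiment="green"),
    Step("receives brochure", on_stage=True, sentiment="yellow"),
    Step("visits website", on_stage=True, sentiment="red"),
]

def first_friction(steps):
    """Return the first customer-visible step marked yellow or red,
    i.e. the earliest point where the customer starts to feel unhappy."""
    for step in steps:
        if step.on_stage and step.sentiment in ("yellow", "red"):
            return step.name
    return None

print(first_friction(journey))  # receives brochure
```

The sticky notes remain the right tool for the workshop itself; a structure like this only helps when sharing the map or tracking improvements over time.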


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
