Mar 07 2019
Mar 07

In our last meetup here on Long Island, we reviewed Preston So's recent book "Decoupled Drupal in Practice". We had the opportunity to record the meetup, figured it can't hurt to post it here!

We also demonstrate how to set up a super simple REST server and connect it to a React component.


[embedded content]

Mar 07 2019
Mar 07

At Agiledrop, we do what we can to support and grow the local Drupal community as well as contribute on a more global scale. As such, we were honored to be one of the Gold sponsors for this year’s DrupalCamp London after being Tote Bag and Tea and Coffee sponsors last year. Coincidentally, a part of our team held a free Drupal course in our Ljubljana offices this weekend, while a few of us were up north representing Agiledrop at the DrupalCamp. 

The DrupalCamp in London last weekend was not just my first DrupalCamp, but actually my first Drupal event of any kind. Having just joined Agiledrop in December of last year with very little experience in marketing and development, I was a bit nervous about going to the event and meeting some of the brightest tech minds of today - but I knew I was in for an unforgettable weekend.

Aleš and Iztok at our booth. You can even see my reflection in the background!

After setting up our booth at the venue and getting some much-needed coffee, it was time for me to mingle and start meeting people. I almost immediately learned that my nervousness was completely unjustified, since everyone I talked to was super friendly and inclusive. This is a natural reflection of the essence of the Drupal community: it is one based on inclusivity and acceptance, giving everyone who wants to contribute the chance to do so within their own capacity.

Nonetheless, even with this awareness, it was not a piece of cake to approach and introduce myself to people whose blog posts I’d read or whose videos I’d watched. One of the first people I got the courage to talk to was Helena of Lullabot - despite not being a developer or a designer, I absolutely loved her presentation on accessibility at DrupalCamp Florida, and I just had to tell her that! 

For me, this welcoming attitude of people within the community is the best thing about Drupal and consequently it was the best aspect of the event itself. It is this attitude that gives us newcomers the motivation to strive to make Drupal better for everyone and makes us realize that our fresh and inexperienced perspective can actually benefit Drupal and its community.

Here I’d like to give a shout out to Paul Johnson of CTI Digital. He was actually the one person who motivated me to start thinking of ways I can contribute without worrying about my lack of knowledge. I was very excited to meet Paul and I attended both of his Saturday sessions. The afternoon one was a presentation of the Promote Drupal initiative and it also included a short workshop where the group brainstormed ideas on Drupal’s unique selling point.

This meant that I actually got to contribute to Drupal at my first Drupal event ever! And the way in which the workshop was conducted ensured that even a seemingly insignificant idea was given equal attention and perhaps served as inspiration for someone else’s idea. In this way, every voice was heard, and the combination of different mindsets and skill levels yielded a much better result than someone trying to tackle the issue on their own. 

Paul Johnson giving his talk on the Promote Drupal initiative

The next morning, I attended Preston So's keynote speech on decontextualizing content in order to keep up with new and emerging technologies. I must say I was completely blown away by Preston’s mastery of language and the abundance of experience he has on the subject. His speech encompassed all aspects of Drupal, from development to marketing and sales, and contained meaningful insights on the future of Drupal as a decoupled system.

Being a linguist myself, I couldn’t resist running after Preston when his lecture was finished and introducing myself to him, babbling about how I’m a total Drupal newbie, but how his talk was just completely inspiring (I actually got goosebumps at some point during his lecture). Thanks to Preston, I believe I now have a much better understanding of Drupal and all its capabilities, consequently being more aware of my place within the community and the extent to which I’m able to get involved.

Preston So giving his keynote speech on decontextualized content and decoupled Drupal

It’s not easy to honestly relate such an unforgettable experience, so these were just some of the highlights from my first Drupal event. The best thing about the weekend was definitely getting first-hand experience of what it’s like to be part of a community as welcoming and accepting as Drupal is. I’m sure that my next Drupal event will give me a chance to do and learn even more, and I’m already looking forward to it. Big thanks to the DrupalCamp London team and to everyone there for ensuring a great experience for all of us!

Mar 07 2019
Mar 07

Linking patterns allows us to give our users a real feel for how the website is going to work, on real devices, something tools like InVision can never do. Here are some simple approaches.

When using PatternLab, you can link to a pattern by creating a variable such as {{ url }}. Then, in your corresponding JSON or YML file, you can set this variable to something like
url: link.pages-contact
or
url: link.pages-homepage.

We often use this when creating menu items, since in Drupal our menu items template looks for two parts to the menu link: title and url, something like this:


menu:
  items:
    item_1:
      title: 'About Us'
      url: link.sample-pages-basic-page
    item_2:
      title: 'Contact Us'
      url: link.sample-pages-basic-page-contact-us

This works great when working with a template that has a specific variable for the URL, such as the link to a node in node.html.twig, so we can link the title in our teaser template in PL to our sample blog pattern, for example.

But if we have a link field, such as a Call to Action in a paragraph bundle we might have something like this in our pattern:


<div class="cta__link">
  {{ cta_link }}
</div>

and this in our corresponding YML file:


cta_link: '<a href="#">Click Me!</a>'

We don't have PL paths in those links, because if we swap `#` for a `link.sample-pages-basic-page` it'll just render that as a string. And we don't want to break the variable into two parts, because in the Drupal template, we want to be able to {% set cta_link = content.field_cta %} and let Drupal do all its render magic.

The solution? Don't break the variable into two parts; instead, concatenate what you want in the YML, which lets us link to specific patterns:


cta_link:
  join():
    - '<a href="'
    - link.sample-pages-basic-page-with-quote
    - '">See Ways to Help</a>'

Now, the first part will render as a string, the second as a variable to the pattern you want to link to, and the third part as a string.

We could also create a link pattern, and do something like this:


cta_link:
  include():
    pattern: 'organisms-link'
    with:
      url: 'link.sample-page-homepage'

I don't, because, in general, I don't like patterns to depend on other patterns. This is simply so I can drag and drop them from project to project without any friction. Each component has everything it needs contained within it. It also means that, in the case of something like a link field, we can let Drupal do as much of the heavy lifting as possible.

Mar 07 2019
Mar 07

Last night saw the popular EU Cookie Compliance module fall from grace, as the Drupal community discovered that numerous inputs in the admin form were not being sanitised.

To me, this shows some serious failings in how our community is handling security awareness. Let's do some fixing :)

1) We need to make this OBVIOUS, with clear examples

One of the most important things when trying to get people to write secure code is making them aware of the issues. We need Drupalers of all levels of experience to know and understand the risks posed by unsanitised input, where they come up and how to fix / avoid them.
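To make the risk concrete: the fix for unsanitised output is almost always escaping at the point of output. Drupal 7's check_plain() is essentially an HTML special-character escaper; here is a rough sketch of the same idea in Python (illustrative only, not Drupal API code):

```python
import html

def sanitize(user_input: str) -> str:
    # Escape HTML special characters so user input renders as text,
    # roughly what Drupal 7's check_plain() does via htmlspecialchars().
    return html.escape(user_input, quote=True)

# An admin-form label stored and printed unescaped becomes live markup;
# escaped, it renders harmlessly as inert text.
print(sanitize('<script>alert("xss")</script>'))
```

The point isn't the one-liner itself; it's that an author who has never seen this pattern spelled out won't know to reach for it.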

I did a little internet searching, and found that there's actually a great guide to writing Drupal modules on Drupal.org. It covers a whole bunch of things, and is compiled really nicely.

I noticed that it says how to do forms, but it manages to NOT mention security anywhere. This should be a key thought right now, no? There is a guide to handling text securely, but it's just there and isn't really linked to.

Similarly, the page of Drupal 7 sanitize functions is easily findable, but only if you know to look for it in the first place.

Guys and girls, if we're going to help our Drupalers to write secure code we simply have to make it obvious. We shouldn't be assuming young new Drupalers will think to browse around the internet looking for sanitization functions. We shouldn't be assuming they know all the niggly bits that present security issues. We shouldn't be assuming that anyone really knows when to use tokens in URLs and when not to. We should be putting all these things right there, saying "hey! don't forget to use these! here's why!". We should have articles and guides for writing forms that take the time to cover how to handle the security side of things.

In that vein, surely the Form API reference should have a reminder link? A little sidebar with some links to all these guides and articles on writing secure code?

I'm going to go start some conversations and some edits - Drupal documentation is maintained by us, the Drupalers, after all.
Who else out there wants to help move things in the right direction? :)

Update: Security in Drupal 7 - Handle text in a secure fashion is looking a good bit better. Input still welcome though!

2) We need to be aware of what we're installing

81,086 sites report using the EU Cookie Compliance module. That's a whole bunch of blind installs! Nobody thought to check through the code? Nobody noticed the missing check_plain calls?

Well, you don't, do you? It's far too easy to assume that things are just fine. Our wonderful Open Source world, protected by our numbers, means that code is safe because it has a thousand people keeping eyes on it. Unless, of course, we're all assuming that somebody else is looking. In that case, as evident here, nobody really takes responsibility - and that's why we end up with module maintainers burning out trying to fight battles alone. In the presence of other people who we know could also do something, humans are significantly less likely to take responsibility.

I've said this before in my previous article discussing security risks to Drupal as we mature - if we took a little more of a moment to check through the modules that we install, we might catch a whole bunch of missed bugs!

I must make explicit that this call isn't just to the big bods and the experienced Drupalers. This task is for you, too, freelancers and small Drupal shops. We all have unique perspectives and unique opportunities that will allow us to see what others have missed - but if nobody is looking then nobody will see anything.

3) Contrib security reviews need help

Unless we're going to go through every module by hand, we need to think about writing some tool to do a basic sanity check on our contrib modules. How hard can it be to see if there's even one instance of a check_plain in a .admin.inc file?
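As a sketch of what such a sanity check might look like, here's a crude Python heuristic that flags .admin.inc files containing no call to a known Drupal 7 sanitization function. The function names are real; the heuristic itself is obviously no substitute for a human review:

```python
import re
from pathlib import Path

# A few well-known Drupal 7 output-sanitization functions.
SANITIZERS = re.compile(r'\b(check_plain|filter_xss|check_markup)\s*\(')

def unsanitized_admin_files(module_dir: str) -> list:
    """Flag .admin.inc files that never call a known sanitization
    function. A crude first-pass signal, not a real security audit."""
    flagged = []
    for path in Path(module_dir).rglob('*.admin.inc'):
        if not SANITIZERS.search(path.read_text(errors='ignore')):
            flagged.append(str(path))
    return flagged
```

Something this naive would produce plenty of false positives, but even a noisy list of "files to look at first" beats reviewing every module line by line.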

It's admirable and encouraging to see the Drupal Security Team making huge progress on really key modules. Well done guys :) But, as far as I can guess, they're going through modules by hand, line by line. What other way is there?

If I had £50k going spare, I'd put a huge bounty out for anyone that can write an automated tool for spotting missing check_plains. Alas, I really don't have that! But I reckon there must be a decent tool for at least getting a start?

If we can solve this problem for contrib, then we can also solve it for every site's custom modules. And that will be of huge security benefit for Drupalers worldwide.

Huge publicity awaits whoever solves this problem, I'm sure.
Inventors and innovators in the Drupal world, this is your moment!

Mar 06 2019
Mar 06

by David Snopek on March 6, 2019 - 1:56pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Critical security release for the EU Cookie Compliance module to fix a Cross Site Scripting (XSS) vulnerability.

The module provides a banner where you can gather consent from the user when the website stores cookies.

The module doesn't sufficiently sanitize data for some interface labels and strings shown in the cookie policy banner.

This vulnerability is mitigated by the fact that an attacker must have a role with the permission "Administer EU Cookie Compliance banner".

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the EU Cookie Compliance module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mar 06 2019
Mar 06

by David Snopek on March 6, 2019 - 1:51pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Moderately Critical security release for the Ubercart module to fix a CSRF vulnerability.

The Ubercart module provides a shopping cart and e-commerce features for Drupal.

The taxes module doesn't sufficiently protect the tax rate cloning feature.

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the Ubercart module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mar 06 2019
Mar 06

Platform.sh, like any good PaaS, exposes a lot of useful information to applications via environment variables. The obvious parts, of course, are database credentials, but there's far more that we make available to allow an application to introspect its environment.

Sometimes those environment variables aren't as obvious to use as we'd like. Environment variables have their limits, such as only being able to store strings. For that reason, many of the most important environment variables are offered as JSON values, which are then base64-encoded so they fit nicely into environment variables. Those are not always the easiest to read.
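To see the layering concretely, reading one of these blobs by hand looks something like this in Python (PLATFORM_RELATIONSHIPS is a real Platform.sh variable; the decoding shown is exactly the base64-then-JSON wrapping described above):

```python
import base64
import json
import os

def decode_platform_var(name: str):
    # Platform.sh packs JSON into env vars as base64 so the value
    # survives as a plain string; reverse both layers here.
    raw = os.environ.get(name)
    if raw is None:
        return None  # not running on Platform.sh
    return json.loads(base64.b64decode(raw))

relationships = decode_platform_var('PLATFORM_RELATIONSHIPS')
```

Workable, but not pleasant to sprinkle through an application, which is exactly the boilerplate the new libraries remove.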

That's why we're happy to announce all new, completely revamped client libraries for PHP, Python, and Node.js to make inspecting the environment as dead-simple as possible.

Installation

All of the libraries are available through their respective language package managers:

PHP:

composer require platformsh/config-reader

Python:

pip install platformshconfig

Node.js:

npm install platformsh-config --save

That's it, you're done.

Usage

All three libraries work the same way, but are flavored for their own language. All of them start by instantiating a "config" object. That object then offers methods to introspect the environment in intelligent ways.

For instance, it's easy to tell if a project is running on Platform.sh, in the build hook or not, or if it's in a Platform.sh Enterprise environment. In PHP:

$config = new \Platformsh\ConfigReader\Config();

$config->isValidPlatform(); // True if env vars are available at all.
$config->inBuild();
$config->inRuntime();
$config->onEnterprise();
$config->onProduction();

// Individual Platform.sh environment variables are available as their own properties, too.
$config->applicationName;
$config->port;
$config->project;
// ...

The onProduction() method already takes care of the differences between Platform.sh Professional and Platform.sh Enterprise and will return true in either case.

What about the common case of accessing relationships to get credentials for connecting to a database? Currently, that requires deserializing and introspecting the environment blob yourself. But with the new libraries, it's reduced to a single method call. In Python:

config = platformshconfig.Config()

creds = config.credentials('database')

This will return the access credentials to connect to the database relationship. Any relationship listed in .platform.app.yaml is valid there.
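Under the hood, a relationship entry is just a dictionary of connection details. As a rough sketch (the key names follow the PLATFORM_RELATIONSHIPS layout; the helper itself is hypothetical), turning one into a DSN string looks like:

```python
def dsn_from_credentials(creds: dict) -> str:
    # Assemble a connection string from one relationship entry.
    return "{scheme}://{username}:{password}@{host}:{port}/{path}".format(**creds)

# Example entry shaped like a Platform.sh database relationship.
creds = {
    'scheme': 'pgsql', 'username': 'main', 'password': 'secret',
    'host': 'database.internal', 'port': 5432, 'path': 'main',
}
print(dsn_from_credentials(creds))  # pgsql://main:secret@database.internal:5432/main
```

This hand-rolled formatting is precisely what the "credential formatters" described next let you package up and reuse.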

What if you need the credentials formatted a particular way for a third-party library? Fortunately, the new clients are extensible. They support "credential formatters", which are simply functions (or callables, or lambdas, or whatever the language of your choice calls them) that take a relationship definition and format it for a particular service library.

For example, one of the most popular Node.js libraries for connecting to Apache Solr, solr-node wants the name of a collection as its own string. The Platform.sh relationship provides a path, since there are other libraries that use a path to connect. Rather than reformat that string inline, the Node.js library includes a formatter specifically for solr-node:

const solr = require('solr-node');
const config = require("platformsh-config").config();

let client = new solr(config.formattedCredentials('solr-relationship-name', 'solr-node'));

Et voilà. client is now a solr-node client and is ready to be used. It's entirely possible to register your own formatters, too, and third-party libraries can include them as well:

config.registerFormatter('my-client-library', (creds) => {
  // Do something here to return a string, struct, dictionary, array, or whatever.
});

We've included a few common formatters in each library to cover some common libraries. We'll be adding more as time goes by, and, of course, PRs are always extremely welcome to add more!

But what about my language?

We wanted to get these three client libraries out the door and into your hands as soon as possible. But don't worry; Go and Ruby versions are already in the works and will be released soon.

We'll continue to evolve these new libraries, keeping the API roughly in sync between all languages, but allowing each to feel as natural as possible for each language.

Mar 06 2019
Mar 06

Using Paragraphs to define components in Drupal 8 is a common approach to providing a flexible page building experience for your content creators. With the addition of Acquia Lift and Content Hub, you can now not only build intricate pages – you can personalize the content experience for site visitors.

Personalization with Acquia Lift and Content Hub

Acquia Lift is a personalization tool optimized for use with Drupal. The combination of Acquia Lift and Content Hub allows for entities created in Drupal to be published out to Content Hub and be made available through Lift to create a personalized experience for site visitors. In many instances, the personalized content used in Lift is created by adding new Blocks containing the personalized content, but not all Drupal sites utilize Blocks for content creation and page layout.

Personalizing paragraph components

To personalize a Paragraph component on a page, we’ll need to create a new derivative of that component with the personalized content for export to Content Hub. That means creating duplicate content somewhere within the Drupal site. This could be on a different content type specifically meant for personalization.

To make this process easier on our content creators, we developed a different approach. We added an additional Paragraphs reference to the content types we wanted to enable personalization on. This “Personalized Components” field can be used to add derivatives of components for each segment in Acquia Lift. The field is hidden from display on the resulting page, but the personalized Paragraph entities are published to Content Hub and available for use in Lift. This allows the content team to create and edit these derivatives in the same context as the content they’re personalizing. In addition, because Paragraphs do not have a title of their own, we can derive a title for them from a combination of the title of their parent page and the type of component being added. This makes it easy for the personalization team to find the relevant content in Acquia Lift’s Experience Builder.

In addition to all of this, we also added a “Personalization” tab. If a page has personalized components, this tab will appear for the content team allowing them to review the personalized components for that page.

Keeping the personalized experience in the context of the original page makes it easier for the entire team to build and maintain personalized content.

The technical bits

There were a few hurdles in getting this all working. As mentioned above, Paragraph entities do not have a title property of their own. This means that when their data is exported to Content Hub, they all appear as “Untitled”. Clearly this doesn’t make for a very good user experience. To get around this limitation we leveraged one of the API hooks in the Acquia Content Hub module.

<?php

// Needed for the instanceof check in _get_parent_node() below.
use Drupal\node\Entity\Node;

/**
 * Implements hook_acquia_contenthub_cdf_from_drupal_alter().
 */
function mymodule_acquia_contenthub_cdf_from_drupal_alter(ContentHubEntity $cdf) {
  $paragraph = \Drupal::service('entity.repository')->loadEntityByUuid($cdf->getType(), $cdf->getUuid());
  $paragraph = \Drupal::service('entity.repository')->loadEntityByUuid($cdf->getType(), $cdf->getUuid());

  /** @var \Drupal\node\Entity\Node $node */
  $node = _get_parent_node($paragraph);
  $node_title = $node->label();

  $paragraph_bundle = $paragraph->bundle();
  $paragraph_id = $paragraph->id();

  $personalization_title = $node_title . ' - ' . $paragraph_bundle . ':' . $paragraph_id;

  if ($cdf->getAttribute('title') == FALSE) {
    $cdf->setAttributeValue('title', $personalization_title, 'en');
  }
}

/**
 * Helper function for components to identify the current node/entity.
 */
function _get_parent_node($entity) {
  // Recursively look for a non-paragraph parent.
  $parent = $entity->getParentEntity();
  if ($parent instanceof Node) {
    return $parent;
  }
  // Guard against orphaned paragraphs with no remaining parent.
  return $parent ? _get_parent_node($parent) : NULL;
}

This allows us to generate a title for use in Content Hub based on the title of the page we’re personalizing the component on and the type of Paragraph being created.

In addition to this, we also added a local task and NodeViewController to allow for viewing the personalized components. The local task is created by adding a mymodule.links.task.yml and mymodule.routing.yml to your custom module.

*.links.task.yml:

personalization.content:
  route_name: personalization.content
  title: 'Personalization'
  base_route: entity.node.canonical
  weight: 100


*.routing.yml:

personalization.content:
  path: '/node/{node}/personalization'
  defaults:
    _controller: '\Drupal\mymodule\Controller\PersonalizationController::view'
    _title: 'Personalized components'
  requirements:
    _custom_access: '\Drupal\mymodule\Controller\PersonalizationController::access'
    node: \d+

The route is attached to our custom NodeViewController. This controller loads the latest revision of the current Node entity for the route and builds rendered output of a view mode which shows any personalized components.

<?php

namespace Drupal\mymodule\Controller;

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Entity\EntityInterface;
use Drupal\node\Controller\NodeViewController;
use Drupal\Core\Session\AccountInterface;

/**
 * Defines a controller to render a single node.
 */
class PersonalizationController extends NodeViewController {

  /**
   * {@inheritdoc}
   */
  public function view(EntityInterface $node, $view_mode = 'personalization', $langcode = NULL) {
    // Make sure we're working from the latest revision.
    $revision_ids = $this->entityManager->getStorage('node')
      ->revisionIds($node);
    $last_revision_id = end($revision_ids);
    if ($node->getLoadedRevisionId() <> $last_revision_id) {
      $node = $this->entityManager->getStorage('node')
        ->loadRevision($last_revision_id);
    }
    $build = parent::view($node, $view_mode, $langcode);
    return $build;
  }

  /**
   * Custom access controller for personalized content.
   */
  public function access(AccountInterface $account, EntityInterface $node) {
    /** @var \Drupal\node\Entity\Node $node */
    $personalized = FALSE;
    if ($account->hasPermission('access content overview')) {
      if ($node->hasField('field_personalized_components')) {
        $revision_ids = $this->entityManager->getStorage('node')
          ->revisionIds($node);
        $last_revision_id = end($revision_ids);
        if ($node->getLoadedRevisionId() <> $last_revision_id) {
          $node = $this->entityManager->getStorage('node')
            ->loadRevision($last_revision_id);
        }
        if (!empty($node->get('field_personalized_components')->getValue())) {
          $personalized = TRUE;
        }
      }
    }
    return AccessResult::allowedIf($personalized);
  }
}

The controller provides the rendered output of our “Personalization” view mode, while its access check ensures that the page actually has personalized components. If no components have been added, the “Personalization” tab will not be shown on the page.

Mar 06 2019
Mar 06

I recently wrote an article about Flexible Authoring with Structured Content. In this follow-up post, I'm going to dig into more detail on one specific approach we've been working on: Entity Reference with Layout.

If you use the Paragraphs module and wish there was a way to more easily control the layout of referenced paragraphs on a particular node, check out Entity Reference with Layout, a new module for Drupal 8. Entity Reference with Layout combines structured content (a la Paragraphs) with expressive layout control (a la Layout Discovery in core). Now you can quickly add new sections without leaving the content edit screen, choose from available layouts, add text or media to specific regions, drag them around, edit them, delete them, add more, and so on. The experience is easy-to-use, fast, and expressive.

Background

Structured Content FTW.

We’ve been working with Drupal for a very long time: since version 4.7, way back in 2006. We love the way Drupal handles structured content – something that has only improved over the years with support for important concepts like “fieldable entities” and “entity references.” Integration with flexible rendering systems like Views, and in more recent years the ability to quickly expose content to services for headless, decoupled applications, relies largely on structured content. With structured content, editors can “Create Once, Publish Everywhere (COPE),” a widely-recognized need for modern web authoring. Drupal’s support for structured content is an important advantage.

Drupal, We Have a Problem.

But Drupal’s interface for creating structured content – the part that editors use daily, often many times per day – is lagging. In the era of SquareSpace, WIX, and Gutenberg, Drupal’s clunky authoring interface leaves much to be desired and is quickly becoming a disadvantage.

Complex form for adding different types of content called paragraphs

Paragraphs to the Rescue. Sort Of.

There have been a number of really interesting steps forward for Drupal’s authoring interface as of late. Layout Builder is powerful and flexible and soon to be a full-fledged part of Drupal core. Gutenberg, an expressive authoring experience first developed for WordPress, now offers a Drupal version. The Paragraphs module solves similar problems, providing a way for authors to create structured content that is incredibly flexible.

We started using Paragraphs years ago, soon after it was first introduced in Drupal 7. We liked the way it combined structure (Paragraphs are fieldable entities) with flexibility (Paragraphs can be dragged up and down and reordered). We used nested Paragraphs to give authors more control over layout. The approach was promising; it seemed flexible, powerful, and easy-to-use.

For more complex applications, though, nested Paragraphs proved anything but easy-to-use. They could be profoundly complicated. Managing intricate layouts with nested Paragraphs was downright difficult.

If only there was a way to have it both ways: Drupal Paragraphs plus easy layout control. Well of course, now there is.

Introducing Entity Reference with Layout

We created Entity Reference with Layout to give authors an expressive environment for writing structured content. As the name suggests, Entity Reference with Layout is an entity reference field type that adds an important element to the equation: layout. It leverages the layout discovery system in Drupal Core, allowing editors to quickly add new paragraphs into specific regions. The authoring experience is expressive and easy, with drag-and-drop layout controls.

Give Entity Reference with Layout a Whirl

Entity Reference with Layout is available on Drupal.org. Installation is quick and easy (we recommend composer, but you can also just download the files). The module is experimental and still under active development; check it out and let us know what you think. We’d love to hear feedback, bug reports, or feature requests in the issue queue. And if you think your organization’s web editors might benefit from this approach and want to learn more, drop us a line and we’ll follow up!

Mar 06 2019
Mar 06

Drupal 8 has dozens of useful performance optimization modules. We have already reviewed the core BigPipe module and the contributed Subrequests module. Today, we are pleased to discuss a new tool to speed up Drupal sites — the Quicklink module in Drupal 8. Using the example of its work, we will see that a fast website is always a step ahead of the users’ intentions. Let’s go.

The essence of the Quicklink module in Drupal 8

The Quicklink module is meant to speed up Drupal 8 sites through the mechanism of link prefetching.

First, links in the user’s viewport are detected. These are links the user might want to visit next. When the browser goes idle, the content behind those links starts being fetched into the cache. If the user then navigates to one of these links, the content is already there.

The Quicklink module is based on GoogleChromeLabs’ Quicklink library. This lightweight JavaScript library weighs less than 1 KB when compressed.

To detect the links, the module uses the Intersection Observer API. The requestIdleCallback method is responsible for waiting until the browser goes idle. Quicklink also discovers slow connections, for which it does no prefetching.

The module follows good Drupal performance improvement practices. More details are coming below.

How the Quicklink module in Drupal 8 works

Installing the Quicklink module

The module can be installed manually or via Composer; its creators recommend Composer, in which case you will also need to update your composer.json file. By default, the Quicklink library is loaded from a CDN, but you can choose to store it locally instead.

Configuring the Quicklink module

When the module is installed and enabled, its settings are available in the site admin dashboard at Configuration — Development — Quicklink Configuration.

The Quicklink module’s default settings are suitable for most websites. Let’s take a closer look at its options, which are presented in 5 tabs.

1) In the “Prefetch Ignore Settings” tab, we can choose which links to ignore (not prefetch). To follow Drupal performance practices, the module ignores these links by default:

  • admin links (that have /admin, edit/, or are otherwise known to be admin links)
  • AJAX-enabled links (the ones that have a use-ajax class or end in /ajax)
  • links that have hashes (#)
  • links that have file extensions (so we don’t prefetch PDFs, MP3s etc.)

It is also possible to add particular URL patterns to ignore.
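To make the default rules above concrete, they roughly amount to a URL filter like the hypothetical helper below. This is an illustration only, not the module’s real code, and it covers only the URL-based checks (the class-based `use-ajax` detection would need the DOM):

```javascript
// Illustrative sketch of the default ignore rules described above.
// shouldIgnorePrefetch is a hypothetical helper, not part of the module.
function shouldIgnorePrefetch(href) {
  if (/\/admin|\/edit\//.test(href)) return true;               // admin links
  if (/\/ajax$/.test(href)) return true;                        // AJAX endpoints
  if (href.includes('#')) return true;                          // in-page anchors
  if (/\.(pdf|mp3|zip|docx?)(\?|$)/i.test(href)) return true;   // file downloads
  return false;
}
```

A link such as `/node/1` would pass the filter and be prefetched, while `/admin/config`, `/page#section`, or `/files/report.pdf` would be skipped.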

2) In the “Optional Overrides” tab, we can:

  • override the CSS selector where the module looks for hyperlinks (the default is the whole document)
  • override the domains allowed for prefetching
  • provide the specific paths allowed for prefetching

3) In the “When to Load Library” tab, we can control in which context the Quicklink library will load. The recommended Drupal defaults are:

  • load the library for anonymous users only
  • disable prefetching during sessions (for example, with an active Drupal Commerce cart)

We can select the content types for which the library will not load.

4) In the “Extended Browser Support” tab, we can choose whether to use the Intersection Observer polyfill from polyfill.io. This allows Quicklink to work in browsers such as Safari or Microsoft Edge.

5) In the “Debug” tab, we can turn on debug mode, which writes Quicklink logs to the Developer Tools console. This gives us detailed information when we want to analyze performance and see why prefetching was skipped for certain links.

We can watch the prefetching process on the Network tab of Chrome Developer Tools, where the “Initiator” column lists the request initiator as “Quicklink.”

Performance improvement for your Drupal 8 website

The Quicklink module in Drupal 8 expands developers’ arsenal of speed optimization options. Our Drupal team is ready to help you use it, as well as examine your website from different angles and improve its speed.

Mar 06 2019
Luc
Mar 06

You may have heard: Google+ is game over.

Officially, this is directly related to a leak of data that potentially impacted 500,000 Google+ accounts.

Businesswise, Google’s attempt to build a social media platform was not successful. If you had a personal account on Google+ and want to save your data before the shutdown, you can download it here.

The shutdown will happen in two phases:

Your website can be impacted by the shutdown of the Google+ API. This API was used by many providers and libraries to be able to login to a website using Google.

Hybridauth on Drupal 7

If your Drupal site uses Google for logins, it is possible that you are still using the Google+ API. A very popular module for social logins in D7 is Hybridauth (for logging in with LinkedIn and Facebook, as well as Google). You need to update that library and your Google key.

Here are the steps to keep your Hybridauth working in D7:

Mar 06 2019
Mar 06

There is never too much discussion of Drupal and JavaScript frameworks. We have already taken a glimpse at the Drupal 8 and Vue.js combination and know that Vue.js is a candidate for Drupal core. Today it is time to review Nightwatch.js, an automated testing framework that is already part of Drupal 8. We will see how Nightwatch.js provides automated JavaScript testing in Drupal 8.

A look at Nightwatch.js and its benefits

Nightwatch.js is an automated testing and continuous integration framework meant to streamline and simplify these processes. It is an integrated end-to-end (E2E) browser testing solution, and it is also suitable for Node.js unit tests.

Nightwatch.js is written in Node.js, a cross-platform JavaScript runtime environment. To execute commands and assertions on DOM elements, Nightwatch relies on the W3C WebDriver API (formerly known as Selenium WebDriver). To identify elements, it uses JavaScript (Node.js) together with CSS or XPath selectors.


Currently, Nightwatch.js has over 148,900 weekly npm downloads, thanks to numerous benefits:

  • ease of use due to clear syntax and detailed documentation

  • ability to perform custom commands and assertions

  • command-line test runner Nightwatch Runner included

  • its own NightCloud.io cloud testing platform

  • compatibility with other testing platforms (SauceLabs, BrowserStack etc.)

  • real-time tests possible in the browser

  • parallel test execution possible

  • screenshots and videos of tests

  • automatic management of Selenium or WebDriver services

  • support for Page Object Model that ensures better organization of elements

  • ability to integrate with CI systems (Jenkins, Hudson, Teamcity etc.) thanks to JUnit XML

Nightwatch.js in Drupal 8

Previously, JavaScript functionality on Drupal sites was often tested via PHPUnit. That required front-end developers to study the ins and outs of the PHP language and the PHPUnit testing framework, and it was generally cumbersome and error-prone to test JavaScript with PHP.

Things changed when the Drupal community decided to officially use Nightwatch.js for browser testing. Inspired by the JavaScript Modernization Initiative, Nightwatch was included in Drupal core in 2018, with the 8.6 release.


Nightwatch.js makes it possible to test JavaScript with JavaScript. Developers can write custom tests for browser interactions directly in JS, and execute them in different browsers.
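As a sketch of what such a test looks like, here is a minimal, hypothetical Nightwatch test for a Drupal site (the test name, URL, and selectors are made up for illustration; the `browser` command chain uses standard Nightwatch commands):

```javascript
// Hypothetical Nightwatch test for a Drupal site. Nightwatch tests export
// an object whose keys are test names and whose values are functions that
// receive the browser API.
const tests = {
  'front page shows the site title': function (browser) {
    browser
      .url('http://localhost/')            // assumed local site URL
      .waitForElementVisible('body', 1000) // wait up to 1s for the page body
      .assert.titleContains('Drupal')      // assert on the document <title>
      .end();                              // close the browser session
  },
};

module.exports = tests;
```

Such a file would typically be picked up by the Nightwatch test runner and executed against a real browser via WebDriver.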

The vast community of JavaScript developers now has its own browser testing tool in Drupal core. In addition, Nightwatch.js overcomes some PHPUnit limitations.

Nightwatch.js in Drupal 8 ships with Chromedriver, which works on a standalone basis and requires no Selenium or Java installation. However, it is possible to drive other browsers (Firefox, Internet Explorer, Safari, and more) via Selenium.

To work with Nightwatch.js, we will need three basic dependencies: Node.js, Yarn, and Chrome. Thanks to detailed documentation, it is easy to create and run tests for Drupal websites, modules, and themes. Helpful links include:

and more!

Get assistance with Nightwatch.js

Our development team follows the latest JavaScript trends. They implement them in Drupal projects and give speeches on them at conferences. One of our Drupal developers even took part in fixing the JS coding standards for Nightwatch.js tests in Drupal 8.6 core.

So you can trust our Drupal team if you need any assistance with Nightwatch.js in Drupal 8. We can install and configure the framework, create and run Nightwatch.js tests according to your requirements, help you integrate Nightwatch with your CI systems (Jenkins, Hudson, Teamcity), and so on.

Rely on us for any other small or big tasks that involve Drupal and JS frameworks!

Mar 06 2019
Mar 06

What is inbound marketing?

So what is inbound marketing exactly? Some professionals call it “new marketing.” This type of marketing focuses on delivering value upfront to your potential customers, so that they benefit from your marketing efforts before giving anything in return. Inbound marketing is about delivering value, creating trust, and developing a loyal, supportive customer base for your brand.

What are the main pillars of inbound marketing?

So, what is the framework on which inbound marketing operates? It rests on three main pillars:

  • Attract: The first pillar is attraction. Inbound marketing focuses not just on attracting leads, but on attracting the right type of leads: the ones most likely to convert, buy or use your services and products, and ultimately become your happy customers. Instead of preying on potential customers’ attention, it lets them come to your business of their own accord and on their own terms. To create this attraction, you use a content marketing strategy, which means creating relevant content for your ideal customer. The content can come in various forms such as blog posts, informational videos, articles, books, and e-books.

  • Engage: Now that you have attracted the right kind of leads, you cannot sit back and relax with a piña colada in your hand. The next step is to engage further with your leads through different mediums such as Facebook, Instagram, Twitter, email, bots, and live chat. This is an important step because it creates the opportunity to foster relationships with your audience. Next, you capture your prospects’ information using conversion tools such as CTAs, forms, and lead flows. This helps you create a personalized experience for prospects on your website; when people feel their experience was optimized for them, it builds trust and brand loyalty.

  • Delight: The final pillar of inbound marketing is delight. How do you delight your prospects, you might wonder? It’s easy: now that you have collected relevant information about your leads, you can engage your audience further using marketing automation and conversations. This ensures your emails target the right people with the right information every time. On top of that, create content your leads will be happy to share with their family, friends, pets, and so on. Bonus points if you deliver it in your audience’s favorite content format, like video.

Inbound marketing vs. outbound marketing

Now that you have a clearer understanding of what inbound marketing means, let’s take a look at what outbound marketing is and how the two of them compare.

Outbound marketing refers to any kind of marketing where a company sends out a message to the audience. In outbound marketing, companies compete for customers’ attention, usually by interrupting the audience at a moment when they are focused on something else, with an ad that is irrelevant to them. Outbound marketing uses a “spray and pray” strategy: show the message to as many people as possible and hope to get noticed. Because of this approach, however, people have grown desensitized to outbound marketing and have learned to ignore traditional advertising. On top of that, the advent of ad blockers has made it increasingly difficult for marketers’ messages to reach their audience. The consequence is that outbound marketing becomes increasingly expensive while generating low yields.

But how do inbound and outbound marketing compare in terms of statistics? According to the numbers, inbound leads cost 61% less than outbound leads. Businesses that rely on inbound marketing save more than $14 per newly acquired customer, and 79% of businesses that have a blog report positive ROI for inbound marketing. Around 80% of business decision makers prefer to get their information from a series of articles rather than advertisements. A properly implemented inbound marketing strategy is 10 times more effective for lead conversion than outbound marketing, and the average cost per lead drops by 80% after 5 months of consistent inbound marketing. Inbound marketing is 62% cheaper than outbound marketing and triples the leads.

Taking these statistics into account, it’s clear that inbound marketing is more efficient than traditional marketing.

How can inbound marketing influence your Drupal Business?

But how is this relevant to my Drupal business, you might ask? First of all, by creating great content for your business you will attract the right type of leads: the ones already interested in your business or in the field it operates in. These leads are more likely to convert into happy customers. On top of that, you will significantly reduce your marketing costs, all while generating more quality leads and creating real value for your prospects.

Another cool aspect of inbound marketing for your Drupal business is that it keeps generating leads long after the content was posted, as long as it is updated from time to time. Think of inbound marketing as a long-term investment: you make the investment now and reap the benefits over a longer period than with, for example, outbound marketing. On top of that, an inbound marketing strategy won’t stop working if you run out of budget; blogs, articles, and videos posted on your website will continue to generate traffic, which is another great perk of inbound marketing.

Another great aspect of adopting an inbound marketing strategy for your Drupal business is that it provides a clear overview of the return on investment. Traditional outbound marketing is ineffective when it comes to measuring how many people see your advert; inbound marketing, on the other hand, provides a transparent view of the results it generates and the impact those leads have on your business’s ROI.

Conclusion

Marketing is an ever-evolving profession, and people have to keep up with the latest trends and emerging technologies to stay ahead of the game. The same goes for inbound marketing: it is a new way of doing marketing, essentially the opposite of “old school marketing.” Companies and people now have to adapt to the new “cool kid on the block,” but for the best results, as with all things in life, a balance has to be struck between outbound and inbound marketing.

Mar 06 2019
Mar 06

Imagine a pizza box (yeah, I know it’s really tempting, but just picture it). You can serve the pizza in different ways: on a tray, in a box, on a plate, in pieces, or even as a platter.

Whatever the situation, the taste of the pizza and its ingredients remain the same, but it is served to customers in different ways. This is done to reach a wider base of customers and create situations in which they will consume it.

 Image of a pizza cut into pieces, placed beside a pizza box

Similarly, a publication can be served to the customer in different ways. People look for information in different places because they prefer different ways of receiving it.

It is important to publish the content to several channels to let the user access it as quickly as possible. 

So how can this be done? 

Let’s find out!

Multichannel Publishing has endless possibilities

Brian Solis describes Digital Darwinism as the phenomenon in which technology and society evolve faster than an organization can adapt. Society accepts these changes and adapts to digital communication, the main avenue for customer experience. Out of all this upheaval, a business necessity has emerged: multichannel publishing.

The idea is to get similar or related content onto multiple platforms in order to reach more people. In other words, Multichannel Publishing helps the user to publish the content to different channels. 

With multichannel publishing, your article reaches the audience on your website, in your app, and on social media. Channels don’t have to be specific platforms like your website or social media accounts; they can also refer to different types of audiences or users. The possibilities are endless.

It helps you find your audience on multiple channels and increases your “findability” by boosting your SEO with targeted content.

  • Multiple platforms provide opportunities to promote your content just the way your audience wants it. Your content should be visually appealing, easy to share, and able to attract a new audience.
  • Once the audience is attracted to your website, the content encourages deeper interaction with your organization: it presents consistent content on a regular basis, concentrates on education, and encourages further action by the reader.
  • Conversion is most likely when content and platform work together to enable a reader to become a member or follower.
 Image of different media widgets that are placed in a bundle with different colors

One system for the whole channel mix 

What is the Content hub? What role does DC-X play in it?

The digital collection, or digital environment, in a publishing context brings a different set of features and characteristics to the publishing process. It helps in creating a final formatted version of a title that is suitable for many display devices, giving rise to the term “content hub.”

A content hub is a collection of digital assets that are housed on an organization’s website or externally. It is a centralized target point for a brand’s ‘best in show’ digital assets. The organization can learn from the target audiences and prove their chops as authorities in the industry.

One of the biggest examples of a content hub is DC-X, and a session at Drupal Europe in Germany talked explicitly about the DC-X content hub. The session offers a plethora of cases and solutions to help users with their digital asset integration.

DC-X is a cross-editorial and cross-national repository for all Ringier content. It is used to manage all sorts of text, image, video, and audio files within one central content hub.

Some of the features provided by the DC-X content hub are:

  • Semantic search
  • Rights management
  • Content sharing
  • Workflow management
Image of a laptop, phone, tablet and a newspaper in a line, with several media icons at the top, connected by arrows to the DC-X content hub. Source: digital collections

One of the biggest advantages of DC-X is that it is connected via APIs to third-party editorial systems as well as to content management systems for print and online activities.

Suppose an editor wishes to publish an article on a Drupal-based channel. They would assign it to the channel, and Drupal would then be triggered to fetch the article using the DC-X JSON API.

The stored XML and the extra metadata let Drupal determine how and where the article is to be published. The interface between Drupal and DC-X works bidirectionally: if the article is updated in either DC-X or Drupal, both systems are synchronized in real time.

Legacy: building online memorials with Drupal 

Legacy.com is the global leader in online memorial content. With product ownership shared between two organizations, Legacy was launched on Drupal. The platform provided excellent authoring workflow and editorial layout control, with multichannel publishing of content across its global obituary network of over 1,400 branded sites.

After choosing a RESTful API, which is essential for the Drupal setup, the REST services module was brought forward. With an object-oriented architecture, full control of the API and its performance was achieved. The enterprise also created a single API resource to redirect URL paths (alongside granular caching, metatags, Panels integration, and more), which helped them leverage strong SEO tools and functional value.

Screenshot of the home screen of legacy.com


Imagine Canada Grant Connect 

Imagine Canada has been providing plans and resources to Canadian charities for a while now, ensuring sectoral growth, aid, and progress over its 50-year legacy. Grant Connect is now a functional, scalable web-based solution for seeking out and managing the fundraising pipeline.

The website is built on Drupal, as it gives them an unparalleled ability to model complex content relationships and user structures in a way that can easily be maintained.

Drupal provides them with core support for developing RESTful services, handles authentication, and readily supports decoupled and multichannel publishing. The CMS also grants them the Contenta decoupled distribution, with an impressive contributor toolkit for out-of-the-box services and best-in-class technology.

When it comes to challenges and iteration, rapid MVP development helps the organization conquer them. All they ever wanted was to deliver a modern and extremely usable end-user experience with the flexibility to evolve at any time. With the back end fully taken care of by Drupal 8, the task is easily achieved.

Image of a laptop and a mobile phone showing imagine connect data and schedule

Conclusion  

At the end of the day, whether it is a large organization or a small enterprise, customers and a good user experience are what they strive for. Your customers want to find you and your content in a number of ways: in print, on the web, via social media, and on their smartphones and tablets. So when you create content, it’s important that it adapts quickly and efficiently to those output channels.

At OpenSense Labs, we help you achieve such endeavors. We can collaborate with you to develop the best content management services and solutions and help you find the right functionalities.

Ping us at [email protected] today. 

Mar 06 2019
Mar 06

We all have heard about Didier Claude Deschamps, right?

He is a retired French footballer who has been the manager of the France national team since 2012. He played as a defensive midfielder for several clubs, including Marseille, Juventus, Chelsea, Valencia, Nantes, and Bordeaux.

Why was he famous? Mainly because he was a silent performer who did his best for his team and earned a special place in the minds of his fans and supporters.

His biggest USP: being unique in a way that was relevant and appealing to everyone around him.

An image divided into 2 parts. The right side has Didier Claude Deschamps in a blue jersey . In the left side he is wearing a black suit.


The role played by Didier Claude Deschamps, both as a coach and as a team member, was exquisite. He not only served as a striking ray of hope by bagging exciting titles but also became an overnight hero.

And Drupal is just like Didier Claude Deschamps for large organizations. Powerful, all-rounder and robust. 

With roughly 1.2 million websites using Drupal across the world, it is clearly a strong content management system capable of supporting large organizations. 

Let’s take a look at technical and business reasons as to why large organizations rely on Drupal to achieve their goals. 

Open Source Has its Own Perks

Open source gives users access to source code. The code is open to the general public for use and for modifications to the original design.

Open-source code is expected to be a collaborative effort, where the programmers fix or change the source code and share it within the community. 

The growing concept of open source has influenced social and political views, and the open-source movement has played a much larger role in the development of new information-sharing practices.

The open-source movement has not only enhanced transparency in biotechnology; research methodologies have also benefited from the application of open-source principles.

One of the main advantages of open source is that it is not limiting: any organization can build a safe and secure online presence with its capabilities. Some of the major benefits provided by open source are:

  • Since open source provides open code, its quality can be easily and greatly improved as it is passed around, tested, and fixed.
  • Open source provides a valuable learning opportunity for programmers, who can apply their skills to the most popular programs available today.
  • Open source is more secure than proprietary software because bugs are identified and fixed quickly.
  • Since it is in the public domain and constantly subject to updates, there is little chance it can become unavailable or quickly outmoded, an important plus for long-term projects.
A circular graph in shape of a flower whose background is yellow and the petals have all the open source practices in it

Presenting Drupal for Large Enterprises

Large organizations understand that their website is the foundation of their online presence. It is the structure on which their business (and marketing) is based.

These organizations require a seamless and fully functional website, so they opt for a CMS that provides them with the features and functionality they need.

Drupal is one of those open source CMSes which is suitable for any type of digital presence, with a strong focus on personalization, community building, and social tools. 

Drupal provides enterprises with:

Excellent Security

Drupal’s ability to limit security vulnerabilities is one of the most important features of the CMS, and one of the principal reasons why large websites work with the platform.

Due to its excellent protection of sensitive data, Drupal is often chosen over other available CMSes. Drupal meets the Open Web Application Security Project (OWASP) security standards and addresses critical security risks. The platform has a dedicated security team that provides information to project maintainers, trains the Drupal community on security topics, and makes security-related improvements in core and contributed projects.

A flexible and scalable CMS

When building a professional website, the main thing to take into consideration is the flexibility and scalability of the software that runs it. And yes, Drupal is one of the most flexible and scalable CMSes for constructing any kind of website. 

Whether the user is planning to create a news, government, higher education, enterprise, or NGO website, Drupal creatively combines the right modules and custom code to construct a truly distinctive experience for visitors. 

Highly customized websites that need scalability and serve huge amounts of data will find Drupal absolutely capable of handling the workload.

Provides easy content authoring 

Drupal presents an intuitive tool for creating content, maintaining workflow, and securely publishing all online content. 

The CMS provides easy authoring to website administrators, marketers, and content managers, and administrators can grant other staff members permission to perform administrative tasks. 

Has a dedicated community 

The Drupal community is one of its largest and most important assets. 

As one of the largest open-source online communities, more than a million developers, designers, trainers, strategists, coordinators, editors, and sponsors work together toward one goal: making the web a better place for everyone.

Cost efficiency

Drupal is free, written in PHP, and distributed under the GNU General Public License. An installation of Drupal core can provide a simple website, an internet forum, a single-user or multi-user blog, or a community-based website.

Image of the Drupal logo in between with six sub-pictures of its features around it

Is Drupal Right for My Sector?

Having this question in mind?

It is perfectly normal: every large organization, in every sector, wants a CMS that matches its needs and requirements.

Whether it is government and public administration or healthcare and medicine, Drupal is a platform suitable for every sector. 

A horizontal bar graph of 10 different sectors in which Drupal clients operateSource: Drupal.org

According to the Drupal Business Survey conducted in 2018, Drupal businesses have clients in diverse industries. More than half of the respondents (59.3%) stated having Drupal clients in charities and non-profit organizations. 

Among other industries were Government and Public Administration (about 54.8%), Arts & Culture (41.5%), Healthcare & Medicine (47.4%), and IT (40.7%). 

The survey reveals that Media and Banking & Insurance saw the biggest drop compared to the previous year’s survey, while Healthcare & Medicine and Consulting developed the most.

Decision-Making Model for Large Organizations 

One of the sessions at Drupal Europe, on compelling USPs for Drupal in large organizations (conducted by digitalization and innovation specialist Rouven Volk), discussed how large enterprises define value as an increase in revenue and a decrease in cost and risk. These enterprises look for CMS solutions that offer:

  • A responsive and SEO-friendly platform
  • Marketing integrations
  • A flexible solution
  • The necessary modules and additional features
  • Proven scalability and flexibility
  • Excellent user experiences
  • Minimized cost
  • Multitenancy
 Image of the values in a triangle showing revenue, cost, and risk, with CMS, suite, and framework connected to a bulb with a logo. Source: Drupal Europe

Some of the Challenges that Might Occur and Their Solutions 

As organizations continue to embrace digital transformation, they are finding that digital business is not as simple as buying the latest technology; it requires significant changes to both culture and systems. To sustain digital transformation, an organization has to understand technology and data. 

This also includes understanding your customers and unifying information to enable easy interaction. As customers continue to sit in the driver’s seat, choosing where they want to go, how they will get there, and what the purpose will be, large enterprises must follow that journey and deliver the right customer experience. 

Once you develop your USP and outline your customer journey, it’s time to give your strategy a voice. This is done by mapping how you will communicate your USP through educational content that creates awareness, education, trust building, and easy conversions.

A CMS that offers multi-site management functionality can help you manage these content properties and social communication with ease. For example, if you have 50 brands in 20 regions, running separate websites would require 1,000 separate teams to manage them, which is not a practical solution. 

Drupal has a feature that enables separate, independent sites to be served from a single codebase. Each site has its own database, configuration, files, and base domain or URL. The main reason to use a multisite Drupal setup is to save time: the single codebase helps a large enterprise manage multiple sites in one go, even if there are 1,000 websites. 

To achieve that quality user experience you might consider transitioning to a responsive web design, only to find that your current content simply won't integrate well with other devices. So what do you do? You migrate it, challenging though that may be.

Yes, migration can be a time-consuming and costly affair. One of the biggest difficulties with site migrations is that success largely depends on the quantity and quality of the pages being migrated. Conventional monolithic applications attempt to resolve all challenges in one system, which puts large companies on a complex migration path. Typical pitfalls include security, scaling, management, and compliance.

Drupal is a CMS that can import data from a variety of sources seamlessly. It provides holistic data lifecycle management, especially with regard to sensitive or confidential data. And a microservices approach in Drupal gives development a lifecycle with faster testing, greater quality, and more frequent releases. Selecting a microservice architecture for Drupal-based websites is a pleasant and extremely productive choice.   
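As a rough illustration of how such an import can be described declaratively, here is a minimal sketch of a Drupal 8 Migrate API definition. The migration id, file path and column names are hypothetical, and the CSV source plugin comes from the contrib migrate_source_csv module:

```yaml
# Hypothetical migration definition (sketch, not a drop-in file).
id: legacy_articles
label: 'Legacy articles from the old site'
source:
  plugin: csv                # provided by the migrate_source_csv module
  path: /tmp/legacy-articles.csv
  ids: [id]                  # column(s) uniquely identifying each row
process:
  title: title               # map the CSV "title" column to the node title
  body/value: body
destination:
  plugin: 'entity:node'
  default_bundle: article    # create nodes of the "article" content type
```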

Drupal also helps you survive an ever-changing industry:

  • It is open to anything
  • It evolves continuously and provides scheduled releases 

The ability to grow and innovate is bound to the ability to standardize, automate and integrate. Drupal, as one of the pioneering content management systems (CMS), empowers digital innovation and helps enterprises in their endeavors toward digital transformation. Drupal 8 provides APIs for creating solutions, and it is no longer limited to being a website platform. 

Concluding with a broader view 

As we come to an end, we now know that Drupal is a strong backbone for enterprises big and small, offering benefits like:

  • Freedom to innovate 
  • Ease of integration
  • Time to market 
  • Future proofing solutions 
  • Building an innovative culture

OpenSense Labs is a Drupal agency that treats every organization like its own, and the services we provide follow all the USPs of the Drupal CMS. Contact us now at [email protected]

Mar 06 2019

“Should I stay or should I go?” Should you stick to an all-too-familiar traditional CMS and “reap” the benefit of getting loads of much-needed functionality out-of-the-box? Or should you bid on flexibility, top speed, and versatility instead? In a headless CMS vs traditional CMS “debate”, which system best suits your specific needs?

Now, let me try and “guess” some of the CMS requirements on your wishlist:
 

  • to have all the needed functionality “under the same hood” (a predefined theme, robust database, a user-friendly admin dashboard...)
  • to be developer friendly
  • to integrate easily and seamlessly with any modern JS front-end of your choice
  • to “fuel” your website/app with high speed
     

Needless to add that:

You can't have them all in one CMS, either traditional or headless.

What you can actually do is:
 

  • set up a hierarchy with all your feature needs and requirements
  • set it against each of these two types of CMSs' advantages and limitations 
     

Just see which one of them “checks off” the most requirements on your list.

Then, you'd have yourself a “winner”.

So, let's do precisely that:

A headless CMS vs traditional CMS comparison to help you determine which one's... a better fit for you.
 

1. Traditional CMS: Benefits and Challenges

Everything in one place...

That would be a concise, yet fully comprehensive definition for the traditional CMS.

Just imagine a content management system that provides you with all the critical features and functionality, all the needed elements straight from the box:
 

  • a generic theme
  • a dashboard for easily managing your own content
  • a predefined database
  • PHP code for retrieving the requested content from your database and serving it to the theme layout
     

The back-end and front-end, meaning the code, database, and the layout/design, are “under the same hood”, strongly coupled. 

It's all there, pre-built, at hand... “Convenience” must be another word for “traditional CMS”.
 

Security & Performance: A Few Challenges to Consider 

Getting all that critical functionality out-of-the-box does translate into... code. Lots and lots of code, lots and lots of files.

Which also means lots and lots of potential vulnerabilities to be exploited.

There could be an error in any one of that heavy load of files that you get. Or a query string parameter that could be turned into “free access” to your database...

Therefore, the convenience of built-in functionality does come with its own security risks. 

Also, whenever you make a “headless CMS vs traditional CMS” comparison, always be mindful of the maintenance aspect:

Of the upgrading that you'll need to perform with every new security patch that gets released.

Now, as regards the performance “pumped” into your traditional CMS-based website/application, just think: compiling files.

That's right! Consider all those custom files, in addition to the pre-defined ones that you'll be provided with, that you'll pile up for... customizing your website. 

All these files, all the new libraries that you'll want to integrate, will need to get compiled. Which can only mean:
 

  • more stress put on your server memory 
  • copying code of functionalities that you might not even use
  • a poor page loading time, with an impact on the user experience provided on your website
     

2. A Traditional CMS Is the Best Choice for You If...

Now, you must be asking yourself: “How do I know if a traditional CMS is the best fit for my own use case?”

My answer is:

You go through the scenarios listed here and see if any of them matches your own.
 

  • you already have a team of PHP experts with hands-on experience working with a particular CMS (Drupal, WordPress...)
  • it's a stand-alone website that you need; no other applications and tech stack that might depend on a CMS's provided functionality
  • you're not opinionated on the technology that your website will get built on
     

3. Headless CMS: What Is an API-Based Website, More Precisely?

“It's a CMS that gives you the flexibility and freedom to build your own front-end — Angular, Rails, Node.js-based, you name it — and integrate it with content management tools via an API."

In short: your headless CMS can then serve raw content —  images, text values —  via an API, to a whole “ecosystem” of internet-connected devices: wearables, websites, mobile apps. 

And it'll be those content-consuming devices' responsibility to provide the layout and design of the content delivered to the end-users.
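To make that division of labour concrete, here is a minimal sketch of a consuming device turning raw JSON into markup. The field names (title, summary, imageUrl) are made up for illustration, not any real CMS's schema:

```javascript
// The headless CMS ships raw values only; this layer owns the layout.
// Field names below are assumptions for illustration, not a real API.
function renderTeaser(item) {
  // A watch, phone app or website could each implement this differently
  // against the exact same raw content.
  const image = item.imageUrl ? `<img src="${item.imageUrl}" alt="">` : '';
  return `<article>${image}<h2>${item.title}</h2><p>${item.summary}</p></article>`;
}

const raw = { title: 'Hello', summary: 'Raw text from the CMS', imageUrl: null };
console.log(renderTeaser(raw));
```

The same raw object could just as easily be fed into a React component or a native mobile view; only this presentation layer changes, never the CMS.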

What's in it for you?
 

  • it dramatically streamlines the development cycle of your API-based website; you can get a new project up and running in no time
  • there's no need to pile up lots and lots of files and the code of out-of-the-box functionalities that you might not even need
  • if there's a particular service that you need — store form submissions or a weather forecast —  there's always a specific service with an API that you could integrate to have that particular content served on your website
     

A headless approach gives you the freedom to integrate exclusively the functionalities that you need into your website.

Moreover, you still get a dashboard for easily managing your content. Your headless CMS has you covered on this.

No code gets “forced” into your website or mobile app, and there's no extra performance “routine” to run for this: you get it by default.
 

Security and Performance: Main Considerations

In terms of security, a short sentence could sum up all the advantages that you can “reap” from having an API-based website:

There's no database...

Therefore, there are no database vulnerabilities, no unknown gateway that a hacker could exploit. 

Furthermore, in a “headless CMS vs traditional CMS” debate, it's important to outline that the first one doesn't call for an administration service. 

Meaning that you get to configure all those components delivering content to your website as you're building it. Except for that, the rest of the dynamic content gets safely stored and managed in your headless CMS.

“But can't anyone just query the service endpoints delivering content on my API-based website?”

True. And yet, there are ways that you can secure those channels:
 

  • use double-authentication for sensitive content 
  • be extra cautious when handling sensitive data; be mindful of the fact that anyone can query the JS implementation 
     

Now, when it comes to performance, keep in mind that:

It's just assets that your web server will provide. As for the content coming from all those third-party services that your headless CMS is connected with, it will get delivered... asynchronously.
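As a sketch of that asynchronous delivery (the service functions below are stand-ins for hypothetical third-party endpoints, not real APIs), each content source is queried independently rather than as one blocking server transaction:

```javascript
// Stand-ins for third-party content services; in practice these would
// be HTTP calls to hypothetical cloud-hosted endpoints.
const fetchWeather = () => Promise.resolve({ tempC: 18 });
const fetchArticles = () => Promise.resolve([{ title: 'Headless 101' }]);

async function loadPageContent() {
  // The static HTML shell is already on screen; these requests resolve
  // independently and fill the page in as they arrive.
  const [weather, articles] = await Promise.all([fetchWeather(), fetchArticles()]);
  return { weather, articles };
}

loadPageContent().then((data) => console.log(data.articles[0].title, data.weather.tempC));
```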

Now, considering that:
 

  • most of those endpoints are hosted in the cloud and highly flexible 
  • the first response — the first static HTML file that gets served  — is instant
  • you could go with a headless CMS that leverages a CDN for delivery
  • in a traditional CMS scenario the website visitor has to wait until the server has finished ALL the transactions (so, there is a bit of waiting involved in there)
     

… you can't but conclude that in a “headless CMS vs traditional CMS” debate, the first one's way faster.
 

4. Use a Headless Approach If...
 

  • you already have your existing website built on a specific modern tech stack (Django, React, Node.js, Ruby on Rails) and you need to integrate it with a content management system, quick and easy
  • you don't want your team to spend too much time “force-fitting” your existing tech stack into the traditional CMS's technology (React with... WordPress, for instance)
  • you need your content to load quickly, but you don't want a heavy codebase, specific to traditional CMSs, as well
  • you want full control over where and how your content gets displayed across the whole ecosystem of devices (tablets, phones, any device connected to the IoT...)
  • you don't want to handle all the hassle that traditional CMS-based websites involve: scaling, hosting, continuous maintenance 
     

5. Headless CMS vs Traditional CMS: Final Countdown

Now, if we are to sum up the two types of CMSs' pros and cons, here's what we'd get:
 

Traditional CMS

It comes with a repository for your content, as well as a UI for editing it and a theme/app for displaying it to your website visitors.

While being more resource-demanding than a headless CMS, it provides you with more built-in functionality.
 

Headless CMS

It, too, provides you with a way to store content and an admin dashboard for managing it, but no front-end. No presentation layer for displaying it to the end user.

Its main “luring” points?
 

  • it's faster
  • it's more secure
  • more cost-effective (no hosting costs)
  • it helps you deliver a better user experience (you get to choose whatever modern JS framework you want for your website's/app's “storefront”)
     

It's true, though, that you don't get all that functionality, right out-of-the-box, as you do when you opt for a traditional CMS and that you still need to invest in building your front-end.

In the end, in a “headless CMS vs traditional CMS” debate, it's:
 

  • your own functionality/feature needs
  • your versatility requirements 
  • the level of control that you wish to have over your CMS
  • your development's team familiarity with particular technologies
     

… that will influence your final choice.

Photo by rawpixel on Unsplash

Mar 06 2019

Help us set the stage for future configuration management improvements in core.

The configuration management initiative 2.0 needs your help.

I will be presenting updates of CMI 2.0 at the upcoming Drupalcon Seattle together with Mike Potter.

Some of the highlights of the CMI 2.0 road map for inclusion in Drupal core are an improved version of the concept of Config Filter (in the form of the config storage transformation api) and a simplified version of Config Split (in the form of config environments).
Unfortunately those big things will not make it into 8.7, but we could lay the groundwork for 8.8.

But the deadline for some patches for Drupal 8.7 is this Friday!

It would be great if you could help us get the following issues to RTBC and committed:

#3036193: Add ExportStorageFactory to allow config export in third party tools
This would allow us to add a service to core that drush, drupal console and other tools and modules can use to export configuration. If this lands in 8.7 we will be able to patch Drush and Drupal Console between 8.7 and 8.8 and make improvements to configuration management such as adding a Config Environment module to core without then patching the cli tools again afterwards.

#3016429: Add a config storage copy utility trait
This adds a new utility trait that would make dealing with configuration storages easier. Currently there are a bunch of modules that implement this logic by themselves and not all of them do it correctly. This led to bugs in Drush and Drupal Console, and even Drupal core has a bug (which is fixed by this issue).

Thanks already in advance.

PS: If you are interested in more CMI 2.0 issues to review or work on check the CMI 2.0 candidate issue tag.

Mar 06 2019

Thank you for supporting Midwest Drupal Camp 2019. Here are a few resources to help spread the word about this year’s MidCamp.

MidCamp in a Nutshell

  • What: MidCamp 2019 - The sixth annual Chicago-area event that brings together designers, developers, users, and evaluators of the open source Drupal content management software. Attendees come for four days of presentations, professional training, contribution sprints, and socials while brushing shoulders with Drupal service providers, hosting vendors, and other members of the broader web development community.

  • Purpose: Increase Drupal knowledge through networking, contribution sprints, training, and community.

  • When: Wednesday, March 20 - Saturday, March 23, 2019

  • Where: DePaul University - Lincoln Park Student Center, 2250 N Sheffield Ave., Chicago, IL 60614

  • Registration: https://www.eventbrite.com/e/midcamp-2019-midwest-drupal-camp-chicago-il...

  • Who Attends: Anyone who uses Drupal, or is responsible for designing, building, developing, and supporting Drupal in any capacity. Also, anyone currently evaluating Drupal or simply looking to learn more about it. Also also, any students who are technology-curious and looking to learn more about future opportunities.

Places Where You Can Help Build Buzz Around MidCamp

  • Event Calendars: Add MidCamp to your organization’s internal and external event calendars.

  • Newsletters: Announce MidCamp in your internal and external newsletters.

  • Social Media: See sample posts below.

  • Association Partners: Ask your professional associations to announce MidCamp to your peers.

  • Relevant Organizations: Ask organizations with aligned interests to include information about MidCamp in their internal and external communications. This includes any local organizations or institutions who might send students who are interested in learning more about web development or technology.

  • Meetups: Hosting a local event? Meeting up with other Drupal-enthusiasts? Share it with your peers.

Outreach Sample Email

Hi [name],

I’ll be attending MidCamp in March 2019 at DePaul University in Chicago, and would like to invite you to join me. MidCamp is a great place to learn more about the Drupal community and what it has to offer.

If it sounds interesting to you, you can register for MidCamp on Eventbrite.

 

Let me know if you have any questions. I hope to see you there!

Sincerely,

[your name]

Social Media

Let everyone know that you’ll be attending MidCamp (a.k.a. #midcamp)!

Photos

If you'd like to share photos from past events, check out what is available on the MidCamp Flickr and Instagram.

Press Releases

We encourage speakers, attendees, exhibitors, volunteers, and others to generate their own press release(s) highlighting their involvement with MidCamp. Please use messaging from the MidCamp website when drafting a press release.

Contact

If you’re looking for a quote or more information about the event, please email [email protected].

Mar 05 2019

Running a business is demanding. To be successful, leadership must be equipped with a broad range of skills, from financial astuteness to empathy for staff. Whilst developers have ample resources from which to draw reference on best practice, for managers and business leaders such knowledge is often deemed competitive advantage, and so is kept secret or accessed only through expensive training or courses.

Working in open source brings many benefits, including the fostering of knowledge transfer that transcends mere code. It is to the benefit of all that business leaders in Drupal share this openness and are willing to reveal lessons learnt or formulae for success that in other industries would remain behind closed doors. A fine example of this mindset is DrupalCamp London CXO, and this year's incarnation was no exception.

Prof. Costas Andriopoulos, Cass Business School, spoke about leadership and innovation in scaling enterprises. He explained that it's far wiser to sort out your business early, when you are small and well ahead of scaling, because what kills businesses is success, age and size.

Prof. Costas Andriopoulos

 

Success: breeds complacency, overstretching, even arrogance. All of these can be the downfall of your business.

Age: of leadership and team leads to them becoming slower and more stuck in their ways. Andriopoulos stated that curiosity drops with age — a child asks over 400 questions per day; by adulthood and towards later life this drops dramatically.

Size: brings bureaucracy, slowing the pace at which information disseminates. Layers of management become more risk averse. Humans are natural hoarders: it's normal, he says, for people to add, but we hold on to things too long. This slows businesses down.

To maintain momentum in decision making he recommended that all meetings and teams be kept to a manageable size — four or five people, with the best team being two. There's nowhere to hide there; you have to participate. In large meetings people often repeat one another, or may say nothing at all.

Andriopoulos recommended doing a pre-mortem when facing challenging projects. Split the team in two: half of them imagine the plan has been put in motion and failed terribly, then write a story of what happened. The other half imagine that it succeeded and write their story of how that happened. Doing so equips you with a variety of scenarios to consider before the work begins.

Rasmus Lerdorf founder of the programming language PHP

 

Rasmus Lerdorf, founder of the programming language PHP, gave a potted history of how the language came to be and prospered. What struck me was how innovation and breaking free of the norm were key drivers. In the early days, when hardware and networks were far slower than we know today, Rasmus questioned the merit of querying databases without the ability to reduce the verbosity of responses. He introduced the “LIMIT” clause, something we all take for granted now, to bring efficiency gains to early internet applications.
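For readers who have only ever had LIMIT available, a one-line reminder of what it does (the table and column names here are invented for illustration):

```sql
-- Without LIMIT, the server would serialise and send every matching row;
-- with it, only the ten newest articles cross the wire.
SELECT title FROM articles ORDER BY created DESC LIMIT 10;
```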

Upgrading to PHP 7 across the web would remove 7.5 BN Kg carbon dioxide emissions

Rasmus Lerdorf

 

This ethos remains today. Lerdorf stressed the importance of upgrading to PHP 7 or above, as the dramatic performance improvements significantly reduce the physical hardware required to support PHP applications. Since PHP powers over 70% of internet sites, he estimates our combined efforts would contribute 15 billion kWh in energy savings and 7.5 billion kg less in carbon dioxide emissions.

Michel Van Velde, founder of One Shoe agency

 

Michel Van Velde, founder of One Shoe agency, spoke openly of the challenges his business faced in 2017 and how a combination of reading and hiring a personal coach helped him evolve his approach to leadership, behaviour and in doing so the actions of his staff.

His presentation was a shining example of how business leaders in open source act differently. Whilst on the face of it counterintuitive, by sharing how he overcame adversity in his life with his potential competitors, what Michel was actually doing was helping his peers to avoid these pains meaning we all rise. Doing so he is contributing to a virtuous circle.

Van Velde put his success in 2018 down to a combination of three factors, rooted in knowledge of three leadership models and an appreciation of how to apply them to his circumstances.

The Drama Triangle: defines any conflictual situation to have victim, rescuer, persecutor. An oversimplification is to say a victim typically takes the “poor me!” stance, Rescuers are those who might choose to say “Let me help you!”, Persecutor adopts the “It’s all your fault!” stance.

Radical Candor: is the ability to Challenge Directly and show you Care Personally at the same time. “Radical Candor really just means saying what you think while also giving a damn about the person you’re saying it to”

Transactional Analysis: considers that we each have internal models of parents, children and also adults, and we play these roles with one another in our relationships. If we grow an appreciation in our daily conversations, meetings and conflicts what state we and others are in (parents, children, adults) we can begin to realise how to avoid or deal with conflict.

Van Velde explained that by rewiring how he dealt with his staff not meeting expectation, dealing with situations in such a way to offer his team the opportunity to realise their shortcomings themselves, providing opportunities to address their behaviour he was creating a positive environment in which his staff could grow.

Melissa Van Der Hecht’s presenting on “Why we need to be more open about diversity in tech”

 

Melissa Van Der Hecht’s presentation on “Why we need to be more open about diversity in tech” was a breath of fresh air. I can never hear enough on this topic, and I found her angle refreshing.

Rather than specifying diversity through gender, race or religion, she saw diversity as that which makes us stand out, what makes us special. She talked about the fact that as a female in tech you have to work harder, a lot harder, to convince men you are worthy of respect and to have your ideas recognised as having merit. Van Der Hecht said this is unrelenting: at best exhausting, at worst leading to burnout, with those from minority groups reporting double the burnout rates of those in the majority.

Van Der Hecht went on to explain that unconscious bias is really hard to adjust. She spoke of the “Surgeon’s dilemma”, a test for unconscious bias, and admitted she fell for this riddle. I compel you to take the test. How did you fare?

Watch this short video, as a further example used in the presentation illustrating the point. For me, rather than despair, it actually gave hope that generations soon entering the workplace could bring a tidal wave of impressive minds.

[embedded content]

 

Van Der Hecht highlighted that diverse teams are more productive, more innovative and creative. There is a strong correlation between diversity and increased innovation.

According to Forbes.com companies with more diverse teams reported 19% higher revenue due to innovation

 

I always remember Erynn Petersen, Executive Director of Outercurve an OSS foundation, speaking at DrupalCon Austin. She cited data showing that diversity leads to better performance in business. It’s hard to ignore these facts, unwise not to act upon the evidence.

I couldn’t help but notice that while Melissa was speaking to an audience of ~100 people, only 3 were female and few were of mixed race. True, they were from across Europe, but the male dominance alone was striking. The Drupal community is committed to diversity, and during the weekend event it was striking how much more diverse the attendee mix was. There is clearly a way to go in terms of fostering diversity in management and agency leadership. We should all consider how we create cultures in our businesses which foster diversity. We all have a lot to gain from that.

I’ve strived in our business to create a culture which embraces all. It takes time and we are constantly learning, listening and evolving. These things don’t happen overnight and take commitment and a willingness to change.

We are fortunate in Drupal to have a vast community with many inspiring contributors from diverse backgrounds. Next time you are on Slack, at a meetup, DrupalCamp or Con why not take time out to open a conversation with someone quite different to you. It’s quite possible you’ll begin to realise being different is what makes them special. Thanks Melissa!

 

Mar 05 2019

Promet's acquisition of a team focused on user experience and strategy has sparked a new spectrum of conversations that we are now having with clients. 

The former DAHU Agency’s Human-Centered Design expertise has given rise to many questions and within a relatively short span of time, has driven the delivery of expectation-exceeding results for clients from a range of sectors. 

As the name suggests, Human-Centered Design occurs within a framework for creating solutions that incorporate the perspectives, desires, context, and behaviors for the audiences that our clients want to reach. It factors into every aspect of development, messaging and delivery, and calls for:

  1. A deep and continuous questioning of all assumptions,
  2. A willingness to look beyond the “best practices” that others have established,
  3. An eagerness to find inspiration from anywhere and everywhere,
  4. The involvement and ideas of multiple stakeholders, from different disciplines, along with a process for ongoing testing, iterating and integration of feedback, and
  5. Constant emphasis on the concerns, goals and relevant behaviors of targeted cohort groups.


Within less than a year, this specialized approach has become fundamentally integrated into the ways that Promet thinks, works and engages with clients. We intentionally practice design techniques that combine inputs from our UXperts, the client, and the end user--bringing empathy and human experience to the forefront of our process.

How Does this Approach Differ?

In contrast to traditional product-centered design, where the appeal, color, size, weight, features and functionality of the product itself serves as the primary focus, Human-Centered Design creates solutions that understand audiences from a deeper perspective.  We try to meet more than the basic needs of a captivating design. To do this, we must fulfill greater and more engaging purpose and meaning expressed within the designs we create.


Among the approaches that we’ve found particularly useful is that of Abstraction Laddering, in which we guide interdisciplinary teams through the process of stating a challenge or a goal in many different ways, continuing to answer “how” and “why” for purposes of advancing toward greater clarity and specificity. 


Human-Centered Design fuels simplicity, collaborative energies, and a far greater likelihood that launched products will be adopted and embraced. When practiced in its entirety it helps to ensure success. As such, it benefits everyone and is perfectly aligned with Promet's User Experience (UX) Design practice.

Design that Delivers

As we engage with clients in the process of deepening our understanding of their customers, we draw upon the expertise of our highly skilled and creative team members, and leverage expertise at the leading edge of the digital landscape.


The addition of this new Human-Centered Design team to the Promet Source core of web developers has helped us to proactively approach new websites with a holistic mindset combining our technology expertise with great design and function, along with an essential empathy of how humans interact with technology.  

Contact us today to schedule a workshop or to start a conversation concerning Human-Centered Design as a strategy to accelerate your business goals. 

Mar 04 2019

It’s been a lot of hard work and the time has finally come to launch your new website. Congratulations! But before you push that launch button, take a minute to think; are you REALLY ready to launch your website?

  • Multiple rounds of quality assurance testing? CHECK!
  • Cross browser and responsive testing? CHECK!

But is there something else you might have missed?

The items above are some of the more obvious steps a team may go through when preparing a site to launch, but there are some lesser known or sometimes forgotten steps that are just as important to take when launching a new website. So what are they?

  • Set up redirects
  • Check links: Absolute vs Relative
  • Accessibility checks
  • Decide what to do with your old site
  • Decide who will maintain your new site

Set up redirects

Over the years you may have amassed a great deal of content on your old website, and chances are that in the course of creating your new website you’ve changed how that content is organized. This can lead to content revisions during the process of migrating that content to the new system. Any team that has gone through this process can tell you that it is a massive effort, even if you’re automating the migration of content in some way. During this flurry of activity in moving content from point A to point B, it’s easy to forget one simple matter: how will users find the same or similar content on the new website?

Creating Redirects ensures that users who arrive at the site via an outdated URL, say from a bookmark or external site, are automatically sent to the appropriate content. Setting up redirects is incredibly important to creating a solid User Experience and it’s good for SEO. Just about every URL on your old site should have a redirect if the URL has changed. This may seem like a herculean effort, but it actually pairs well with the process of moving content from the old to new website.
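At its core, a redirect setup is just a mapping from old URLs to new ones, returned with an HTTP 301 status so browsers and search engines update their records. Here is a minimal sketch of that idea (the paths are hypothetical, and a real CMS or web server would handle this via its own redirect module or configuration):

```python
# Hypothetical old-path -> new-path map, assembled while migrating content.
REDIRECTS = {
    "/about.php": "/about-us",
    "/news/2018/big-announcement": "/blog/big-announcement",
}

def resolve(path):
    """Return (status, location): 301 to the new URL if known, else serve as-is."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/about.php"))  # (301, '/about-us')
```

Building this map pairs naturally with the content migration itself: as each page moves, record its old and new URL in one place.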

Check links: Absolute v. Relative

First off, a brief explanation of Absolute versus Relative URLs. An Absolute URL encompasses a URL in its entirety, e.g. https://www.kanopistudios.com/about-us. A Relative URL is just the portion of the URL that occurs after the domain in the example above, e.g. /about-us. In the course of preparing a new website by loading copy and uploading images, you are most likely working from a temporary Development URL. When the time comes to launch the new website, the Development URL will change. When the URL changes, any links pointing to the Absolute Development URL will break. This is a common mistake, and one that can have disastrous results once your new website goes live.

As a general rule of thumb, try to avoid Absolute URLs when loading content to any environment. This ensures that if the core URL ever changes, your links won’t break. Leading up to launch, work with your developer to identify and rectify any Absolute URLs.
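To see why relative links survive a domain change, consider how a browser resolves them against the current page. Python’s urljoin follows the same resolution rules and makes the difference easy to demonstrate (the domains below are illustrative):

```python
from urllib.parse import urljoin

# A site-root-relative link resolves against whatever domain serves the page,
# so the same markup works on the development site and on the live site.
print(urljoin("https://dev.example.com/blog/post", "/about-us"))
# -> https://dev.example.com/about-us

# An absolute link ignores the current page entirely and keeps pointing at
# the (soon to be retired) development domain.
print(urljoin("https://www.kanopistudios.com/blog/post",
              "https://dev.example.com/about-us"))
# -> https://dev.example.com/about-us
```

The second case is exactly the broken-link scenario described above: the content went live, but the link still targets the development environment.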

Accessibility checks

Accessibility was not exactly a top priority of early website development; as technology catches up, supporting users with impairments is becoming an ever-increasing need for any modern website. Accessibility starts early in a project’s planning, and should be discussed early and often. From design to development there are many touch points where a project team can ensure that the site is compliant with standards.

But what if your site is about to go live and you haven’t considered this? Luckily there are tools like Site Improve that allow you to run automated tests to see where your site may need remediation before it can be compliant. Not only is it good for SEO, but making your site accessible to the widest range of users ensures you reach a wider audience and that they have the best user experience possible.
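Tools like Site Improve run hundreds of checks, but as a sketch of what a single automated accessibility check looks like, here is a minimal scan for img tags missing alt text (illustrative only; the HTML snippet and class name are made up):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of any <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing_alt.append(attrs.get("src", "(unknown)"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="hero.jpg">')
print(checker.missing_alt)  # ['hero.jpg']
```

Real accessibility auditing goes far beyond alt text (contrast, focus order, ARIA roles, and more), but the pattern is the same: parse the rendered markup and flag deviations from the standard.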

Decide what to do with your old site

In the activity leading up to the launch of your new website, it’s easy to overlook this question. Regardless of how confident you are in the new website, it’s important to have a plan in place for the old one. Here are some important questions to weigh when considering the fate of your old website:

Will you need to reference your old site at any point in the future? Perhaps you weren’t able to move all the content to the new site before launch, or maybe there is old content that won’t be migrated but that you still need to reference in the future. Whatever the reason may be, if the answer to this question is yes, you’ll want to keep your site up in some capacity.

Can you afford to host two websites at the same time? This one is a little less straightforward; depending on the size, state, and makeup of your old website, you have options. From a budgetary standpoint, paying for a website that no one will really visit is probably not going to look all that great to accounting. The good news is that with no traffic visiting the old website you probably don’t need all that expensive infrastructure; many enterprise level hosting providers have a free tier that is great for storing a legacy site on.    

Regardless of your situation, you can always find options. What’s most important is that you have a plan.

Decide who will maintain your new site

Building a website is a process, one that requires regular upkeep and ongoing maintenance. Understand that your website is a tool; built right, it should be designed to grow and adapt to the changing needs of your business. This is the philosophy we at Kanopi believe in and try to instill in our projects. So with that in mind, it’s important to consider who will be responsible for ongoing improvements, maintenance, updates, and bug fixes when the need arises.

While it’s not uncommon for teams to try to take this on internally, it’s important to consider whether you have the right skill sets, let alone the bandwidth, for this to be a viable option. Another solution is to work with an agency like Kanopi to provide ongoing support for your site. An agency will have access to a wider range of expertise and can ensure maximum flexibility for the future growth of your site.

Check these off your list, and you’re good to launch!

These items may seem like big additions to your plate leading up to launch, but they pale in comparison to what could occur if you leave them out. Plan for these early on, and it will ensure your launch goes off with one less hitch.

Mar 04 2019
Mar 04

Mar 04 2019
Mar 04


Bitbucket Pipelines is a CI/CD service built into Bitbucket that offers an easy solution for building and deploying to Acquia Cloud for projects whose repositories live in Bitbucket and whose teams opt out of using Acquia’s own Pipelines service. Configuration of Bitbucket Pipelines begins with creating a bitbucket-pipelines.yml file and adding it to the root of your repository. This configuration file details how Bitbucket Pipelines will construct the CI/CD environment and what tasks it will perform given a state change in your repository.

Let’s walk through an example of this configuration file built for one of our clients.

bitbucket-pipelines.yml

image: geerlingguy/drupal-vm:4.8.1
clone:
  depth: full
pipelines:
  branches:
    develop:
      - step:
         script:
           - scripts/ci/build.sh
           - scripts/ci/test.sh
           - scripts/ci/deploy.sh
         services:
           - docker
           - mysql
         caches:
           - docker
           - node
           - composer
    test/*:
      - step:
         script:
           - scripts/ci/build.sh
           - scripts/ci/test.sh
         services:
           - docker
           - mysql
         caches:
           - docker
           - node
           - composer
  tags:
    release-*:
      - step:
          name: "Release deployment"
          script:
            - scripts/ci/build.sh
            - scripts/ci/test.sh
            - scripts/ci/deploy.sh
          services:
            - docker
            - mysql
          caches:
            - docker
            - node
            - composer
definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: 'drupal'
        MYSQL_USER: 'drupal'
        MYSQL_ROOT_PASSWORD: 'root'
        MYSQL_PASSWORD: 'drupal'

The top section of bitbucket-pipelines.yml outlines the basic configuration for the CI/CD environment. Bitbucket Pipelines uses Docker at its foundation, so each pipeline will be built up from a Docker image and then your defined scripts will be executed in order, in that container.

image: geerlingguy/drupal-vm:4.8.1
clone:
  depth: full

This documents the image we’ll use to build the container. Here we’re using the Docker version of Drupal VM. We use the original Vagrant version of Drupal VM in Acquia BLT for local development. Setting the clone depth to full ensures we pull the entire history of the repository; this was found to be necessary during the initial implementation.

The “pipelines” section of the configuration defines all of the pipelines configured to run for your repository. Pipelines can be set to run on updates to branches, tags or pull requests. For our purposes we’ve created three pipeline definitions.

pipelines:
  branches:
    develop:
      - step:
         script:
           - scripts/ci/build.sh
           - scripts/ci/test.sh
           - scripts/ci/deploy.sh
         services:
           - docker
           - mysql
         caches:
           - docker
           - node
           - composer
    test/*:
      - step:
         script:
           - scripts/ci/build.sh
           - scripts/ci/test.sh
         services:
           - docker
           - mysql
         caches:
           - docker
           - node
           - composer

Under branches we have two pipelines defined. The first, “develop”, defines the pipeline configuration for updates to the develop branch of the repository. This pipeline is executed whenever a pull-request is merged into the develop branch. At the end of execution, the deploy.sh script builds an artifact and deploys that to the Acquia Cloud repository. That artifact is automatically deployed and integrated into the Dev instance on Acquia Cloud.

The second definition, “test/*”, provides a pipeline definition that can be used for testing updates to the repository. This pipeline is run whenever a branch named ‘test/*’ is pushed to the repository. This allows you to create local feature branches prefixed with “test/” and push them forward to verify how they will build in the CI environment. The ‘test/*’ definition will only execute the build.sh and test.sh scripts and will not deploy code to Acquia Cloud. This just gives us a handy way of doing additional testing for larger updates to ensure that they will build cleanly.

The next section of the pipelines definition is set to execute when commits in the repository are tagged.

tags:
  release-*:
    - step:
        name: "Release deployment"
        script:
          - scripts/ci/build.sh
          - scripts/ci/test.sh
          - scripts/ci/deploy.sh
        services:
          - docker
          - mysql
        caches:
          - docker
          - node
          - composer

This pipeline is configured to be executed whenever a commit is tagged with the name pattern of “release-*”. Tagging a commit for release will run the CI/CD process and push the tag out to the Acquia Cloud repository. That tag can then be selected for deployment to the Stage or Production environments.
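Bitbucket matches branch and tag names against these definitions using glob-style patterns. As a quick sanity check of which pipeline a given ref would trigger, Python’s fnmatch follows the same basic wildcard rules (the ref names below are hypothetical, and Bitbucket’s matcher has a few extra features not modeled here):

```python
from fnmatch import fnmatch

# Which pipeline definition would each ref trigger?
print(fnmatch("release-2019.03.1", "release-*"))  # True: build, test, and deploy
print(fnmatch("test/new-header", "test/*"))       # True: build and test only
print(fnmatch("develop", "release-*"))            # False: no match for this definition
```

So pushing a tag like release-2019.03.1 runs the full release pipeline, while a branch named test/new-header runs only the build and test steps.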

The final section of the pipelines configuration defines services built and added to the docker environment during execution.

definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: 'drupal'
        MYSQL_USER: 'drupal'
        MYSQL_ROOT_PASSWORD: 'root'
        MYSQL_PASSWORD: 'drupal'

This section allows us to add a MySQL instance to Docker, allowing our test scripts to do a complete build and installation of the Drupal environment as defined by the repository.

Additional resources on Bitbucket Pipelines and bitbucket-pipelines.yml:

Scripts

The bitbucket-pipelines.yml file defines the pipelines that can be run, and in each definition it outlines scripts to run during the pipeline’s execution. In our implementation we’ve split these scripts up into three parts:

  1. build.sh – Sets up the environment and prepares us for the rest of the pipeline execution.
  2. test.sh – Runs processes to test the codebase.
  3. deploy.sh – Contains the code that builds the deployment artifact and pushes it to Acquia Cloud.

Let’s review each of these scripts in more detail.

build.sh

#!/bin/bash
apt-get update && apt-get install -o Dpkg::Options::="--force-confold" -y php7.1-bz2 curl && apt-get autoremove
curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -
apt-get install -y nodejs
apt-get install -y npm
cd hive
npm install
npm install -g gulp
cd ..
composer install
mysql -u root -proot -h 127.0.0.1 -e "CREATE DATABASE IF NOT EXISTS drupal"
export PIPELINES_ENV=PIPELINES_ENV

This script takes our base container, built from our prescribed image, and starts to expand upon it. Here we make sure the container is up-to-date, install dependencies such as nodejs and npm, run npm in our frontend library to build our node_modules dependencies, and instantiate an empty database that will be used later when we perform a test install from our codebase.

test.sh

#!/bin/bash
vendor/acquia/blt/bin/blt validate:phpcs --no-interaction --ansi --define environment=ci
vendor/acquia/blt/bin/blt setup --yes  --define environment=ci --no-interaction --ansi -vvv

The test.sh file contains two simple commands. The first runs a PHP code sniffer to validate our custom code follows prescribed standards. This command also runs as a pre-commit hook during any code commit in our local environments, but we execute it again here as an additional safeguard. If code makes it into the repository that doesn’t follow the prescribed standards, a failure will be generated and the pipeline will halt execution. The second command takes our codebase and does a complete Drupal installation from it, instantiating a copy of Drupal 8 and importing the configuration contained in our repository. If invalid or conflicting configuration makes it into the repository, it will be picked up here and the pipeline will exit with a failure. This script is also where additional testing could be added, such as running Behat or other test suites to verify our evolving codebase doesn’t produce regressions.

deploy.sh

#!/bin/bash
set -x
set -e

if [ -n "${BITBUCKET_REPO_SLUG}" ] ; then

    git config user.email "[email protected]"
    git config user.name "Bitbucket Pipelines"

    git remote add deploy $DEPLOY_URL;

    # If the module is -dev, a .git file comes down.
    find docroot -name .git -print0 | xargs -0 rm -rf
    find vendor -name .git -print0 | xargs -0 rm -rf
    find vendor -name .gitignore -print0 | xargs -0 rm -rf

    SHA=$(git rev-parse HEAD)
    GIT_MESSAGE="Deploying ${SHA}: $(git log -1 --pretty=%B)"

    git add --force --all

    # Exclusions:
    git status
    git commit -qm "${GIT_MESSAGE}" --no-verify

    if [ $BITBUCKET_TAG ];
      then
        git tag --force -m "Deploying tag: ${BITBUCKET_TAG}" ${BITBUCKET_TAG}
        git push deploy refs/tags/${BITBUCKET_TAG}
    fi;

    if [ $BITBUCKET_BRANCH ];
      then
        git push deploy -v --force refs/heads/$BITBUCKET_BRANCH;
    fi;

    git reset --mixed $SHA;
fi;

The deploy.sh script takes the product of our repository and creates an artifact in the form of a separate, fully-merged Git repository. That temporary repository then adds the Acquia Cloud repository as a deploy origin and pushes the artifact to the appropriate branch or tag in Acquia Cloud. The use of environment variables allows us to use this script both to deploy the Develop branch to the Acquia Cloud repository as well as deploying any tags created on the Master branch so that those tags appear in our Acquia Cloud console for use in the final deployment to our live environments. For those using BLT for local development, this script could be re-worked to use BLT’s internal artifact generation and deployment commands.

Configuring the cloud environments

The final piece of the puzzle is ensuring that everything is in-place for the pipelines to process successfully and deploy code. This includes ensuring that environment variables used by the deploy.sh script exist in Bitbucket and that a user with appropriate permissions and SSH keys exists in your Acquia Cloud environment, allowing the pipelines process to deploy the code artifact to Acquia Cloud.

Bitbucket configuration

DEPLOY_URL environment variable

Configure the DEPLOY_URL environment variable. This is the URL to your Acquia Cloud repository.

  1. Log in to your Bitbucket repository.
  2. In the left-hand menu, locate and click on “Settings.”
  3. In your repository settings, locate the “Pipelines section” and click on “Repository variables.”
  4. Add a Repository variable:
    1. Name: DEPLOY_URL
    2. Value: The URL to your Acquia Cloud repository. You’ll find the correct value in your Acquia Cloud Dashboard.

SSH keys

Deploying to Acquia Cloud will also require giving your Bitbucket Pipelines processes access to your Acquia Cloud repository. This is done in the form of an SSH key. To configure an SSH key for the Pipelines process:

  1. In the “Pipelines” section of your repository settings we navigated to in steps 1-3 above, locate the “SSH keys” option and click through.
  2. On the SSH keys page click the “Generate keys” button.
  3. The generated “public key” will be used to provide access to Bitbucket in the next section.

Acquia Cloud configuration

For deployment to work, your Bitbucket Pipelines process will need to be able to push to your Acquia Cloud Git repository. This means creating a user account in Acquia Cloud and adding the key generated in Bitbucket above. You can create a new user or use an existing user. You can find more information on adding SSH keys to your Acquia Cloud accounts here: Adding a public key to an Acquia profile.

To finish the configuration, log back into your Bitbucket repository and retrieve the Known hosts fingerprint.

Mar 04 2019
Mar 04

I headed to City University with a slight hangover; meeting up with a friend in SE London the night before had done the damage. But after some food I was ready for the opening keynote, delivered by Rowan Merewood, a developer for Google Chrome, on user experience and website optimisation.

Rowan highlighted the need to deliver fast web pages, especially on mobile devices, which now account for over 50% of all web traffic. Tools such as Google Lighthouse can be run in Chrome DevTools, from the command line, or as a Node module; they can audit your site and provide lots of data on ways to improve your site’s performance.

Rowan had a few slides that made the audience cheer: one showed that the proportion of Drupal sites delivering content over HTTPS is marginally above the web average, and another that site speed for Drupal sites is around the average too, not as slow as you might think! Plus some slides that made us wince, showing that Drupal sites have a larger percentage of JavaScript vulnerabilities.

Radically Candid, the modern psychology that leads to effective leadership

Michel van Velde shared his insights on transactional analysis and the drama triangle. He explained how you can help move people out of the victim position, and yourself out of the persecutor/rescuer position, to take the adult position and empower the other person to find the fix themselves.

Michel was very open about how working with a business coach impacted his day-to-day life and transformed his business. Stepping out of the drama triangle and conflict, and changing the way you speak to colleagues from an adult perspective, can defuse tensions. It can help the person raising the issue find the answer, empowering them to take control and deliver the change required.

The talk was insightful, funny, and enjoyable.

Drupal 8 SEO

Jaro from Droptica discussed his approach to SEO, including the use of modules such as SEO Checklist, Ahrefs for data analytics, SEO MOFO for SERP snippet optimisation and Backlinko for an extensive list of Google ranking factors. Lots of info to help any SEO strategy.

State of Layout Management in Drupal

Anton Staroverov spoke about the variety of modules available for building layouts and their purposes. Sadly it did highlight a known bug in the Drupal 8 Layout Builder whereby a quick change of layout didn’t take effect. Anton also explained Bricks and its use, which looks like quite a neat module. This is one area where Drupal still lags behind other systems when it comes to quick manipulation of layouts and content that either the customer or a non-techy can handle.


Browser Wars 2019 - Implementing a Content Security Policy

Security is high on most developers’ agendas, and George Boobyer delivered a good session on Content Security Policies (CSP) and their use. George took us back to the old days of Netscape, a blast from the past, when we fought against browsers, and contrasted it with the present, when we work together with browsers to deliver a better user experience.

He covered implementing security headers and creating a CSP, then monitoring it going forward, all in order to get a better grasp of what scripts your website delivers; you’d be surprised what you find. Resources included the Mozilla Web Security Cheat Sheet and the SecKit and CSP modules, and you can check your website’s security headers with https://securityheaders.com

No Monkey Business Static Progressive Web Apps

Eli_t and Dominic Fallows of Interactive Investor gave a session on the use of decoupled Drupal, GatsbyJS, ReactJS and AWS to deliver rich content without making Google cry. A jam-packed session full of great info, based on their move from Drupal 6 to a decoupled D8 site.

Gatsby sits at the heart of the pipeline taking data from Drupal and other sources to create static content, that gets spiced up with React to provide a dynamic look and feel. They mentioned the issue of previewing content, potentially having to wait whilst Gatsby generates the full site in order for editors to see changes, whereas their own custom solution allows previews of individual pages within seconds. Plenty of food for thought.

A great day rounded off at the pub, although we timed each round of drinks just moments before more money was put behind the bar, probably a good thing after my exploits the night before!

Drupal ducks: some of the booty from DrupalCamp London 2019
Mar 03 2019
Mar 03

Consistency is a key element of both success and web development, and the two go hand in hand when building a website, whether from scratch or when revamping an existing one. What aids this consistency is the ability to recognise a pattern that is often repeated in the process, which can then be followed to achieve a common result.

Raindrops on a patterned fence


Ensuring a consistent and easy-to-maintain website is one of the biggest headaches faced by large organisations. This is exactly the gap where a pattern library comes into the picture as a solution.

But what are pattern libraries? And how can they be put to use? Let’s find out! 

Pattern libraries make it easy to reuse elements and styles across a project, documenting the visual language of a site, promoting consistency, providing user control and reducing cognitive load.

What is a Pattern Library?

The primary attraction of a pattern library is that the time-consuming process of building new features and pages is reduced to a minimum

The importance of pattern libraries gained attention in the tech space when developers started understanding the benefits of having ready-made components for projects. In this era of quick and easy fixes, the primary attraction of a pattern library is that the time-consuming process of building new features and pages is reduced to a minimum. Thus, the main purpose is to help create consistent websites that are easy to maintain, and to become a solid part of the design and development process. A pattern library documents all ‘patterns’ (also known as modules), defining what they look like, how they behave, and how they are coded.

Style Guides, Pattern Libraries, Design Systems

Style guides, pattern libraries and design systems may hold similar implications for designers and developers, but they exist as individual entities. Style guides and pattern libraries (also known as component libraries) may also co-exist to form complete and coherent design systems for a product. Let’s explore the differences between them in detail.

Typically encompassing a company’s branding guidelines, including components like logo usage, designated color palettes, and editorial tone, a Style Guide is a collection of pre-designed elements to be followed to ensure consistency and a cohesive experience. It is often packaged by the company as a deliverable for working with vendors and partners. Style Guides can directly influence the look and feel of a Pattern Library, with the basic difference being that Style Guides can stand without data, while Pattern Libraries do rely on some data to function.

A Pattern Library, on the other hand, often confines itself to static UI elements, serving as storage for your components like articles, headers, galleries, dropdown menus, accordions, and even common web page layouts like grids. Though style guides do not always worry about context and relationships with data, UI elements and their application in the overall user experience depend largely on context and the interplay with content. Thus, Pattern Libraries focus on interface design, and would not include rules that apply globally to print or other mediums.

This brings us to the Design System, which joins the dots between a style guide and a pattern library to define the principles governing the way components should work together. It defines how a layout should work, serving as a form of product documentation that contains everything needed to help deliver the outcome.
 
Often influenced by a style guide, a pattern library usually includes HTML snippets or living documentation for website components which are well-documented and responsive. For instance, pattern libraries can include –

  • Buttons
  • Images
  • Hero elements
  • Sliders
  • Galleries
  • Navigation
  • Articles

Why do you need a pattern library?

We have reached an understanding that pattern libraries escalate productivity, but how do they make this possible at ground level? The benefits of pattern libraries are three-fold:

Consistency

Big sites are developed over a prolonged period by a group of developers and need to be revised regularly. This leads to a fragmented user interface unless everything is in place to ensure consistency.

From navigation shifting position to form elements, everything ends up with a different format and approach. A pattern library offers a straightforward way to duplicate existing design and functionality on any page of the site, for a steady user interface in a fixed frame.

Reusability

If multiple web teams work on multiple sites for different departments in a company, they might end up reinventing the same styles at considerable cost.

In such cases, a central pattern library can be created for reusing functionality and design. A pattern for a particular requirement in one area of responsibility can then be shared with the whole group and also be available for future projects.

This makes building a new site or subsection a mere matter of combining these patterns, in much the same way you build something out of Lego bricks.
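The Lego-brick idea can be sketched in a few lines: pages are assembled from a shared library of component functions. This is a toy illustration only; the component names and markup below are made up, and a real pattern library would use templates (e.g. Twig) rather than Python.

```python
# Toy sketch: each "pattern" is a function that renders one reusable component.
def hero(title):
    return f'<section class="hero"><h1>{title}</h1></section>'

def gallery(images):
    items = "".join(f'<li><img src="{src}" alt=""></li>' for src in images)
    return f'<ul class="gallery">{items}</ul>'

def page(*components):
    # A new page or subsection is just a combination of existing patterns.
    return "\n".join(components)

print(page(hero("Welcome"), gallery(["a.jpg", "b.jpg"])))
```

Because every page pulls from the same component library, a fix to one pattern propagates everywhere it is used, which is exactly the maintenance benefit discussed next.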

Easy Maintenance

Having a consistent pattern library that everybody pools from makes maintenance easier: seeing all of the pieces in one place makes the task effortless.

Having coded elements in the same way from the very beginning makes it much easier for a developer to work on somebody else’s code. A new developer can also get up to speed quickly by studying the existing pattern library in use and building the site primarily on it.

Who is it For?

End User:

From the user’s perspective, websites and products that are familiar and consistent provide a smooth experience and reduce cognitive load.

Development Team:

Pattern libraries help teams focus on the bigger picture without worrying about pushing pixels, shipping products faster, ensuring greater efficiency in internal processes and allowing engineers to re-use existing code.

Organization:

By providing longevity to big sites that are developed by different people over a prolonged period and revised regularly, pattern libraries increase the productivity of the organisation at large.

One of the more popular Pattern Libraries, a static site generator called Pattern Lab, is based on Brad Frost’s Atomic Design concept. There are many others to choose from, but this blog will focus on Pattern Lab being a dynamic prototyping and organization tool.

Pattern Lab is available for download on GitHub and can be used as part of your existing or new projects.

Pattern Lab + Drupal 8 = Emulsify

Logo of Pattern Lab and Drupal = ‘Emulsify’ on a blue background


Emulsify is a component-driven prototyping tool that uses atomic design principles and modern frontend practices to develop a living style guide. It can be easily implemented into any CMS that renders Twig files, as it adopts the methodology where the smallest components are atoms, which are assembled into molecules, organisms, templates, and finally pages.

With the shift for templating in Drupal 8 to Twig, a whole new range of tools are now available for theming.

Emulsify enables you to assemble and manage components in a way that enhances your workflow by integrating Pattern Lab. An Emulsify-based project works with custom template names that are specific to the project, developers, and clients. This segregates patterns (modules) by category and increases the efficiency of the process.

Emulsify enables you to assemble and manage components in a way that enhances your workflow by integrating Pattern Lab

When the templates are ready for production, Emulsify connects them to Drupal in a straightforward way: a Twig function (include, extends, or embed) connects the Drupal templates to the component files.

Emulsify swears by a "living style guide" approach, where the style guide components are the same ones in use on the live site. You don’t have to worry about the components becoming obsolete or drifting out of sync with the style guide.

Also, the components constructed with Emulsify can be used on any project, with or without Drupal. In simpler terms, it works with any CMS that renders content with Twig, including WordPress. This lets any frontend expert on a development team contribute, as they will only be working with familiar technologies. And if your project doesn’t use Twig, Emulsify can still be used by designers and front-end developers to build a style guide that backend developers then carry forward.

In Conclusion

Though building a pattern library demands a lot of work, once set up it eases the process for all future projects. You can always take baby steps and start small, with just a lightweight overview of the main patterns and modules, without any detailed documentation. Later, you can progressively refactor and upgrade the pattern library over time, adding features according to the team’s needs.

Aiming for a foolproof pattern library that solves all problems at once might take a year-long project’s worth of time without yielding immediate, tangible benefits.

We at OpenSense Labs provide the best of Drupal services, enhancing your development in line with industry standards. Mail us at [email protected] to connect and know more.

Mar 02 2019
Mar 02


Here is a pretty simple way to maintain the hook_help() of a Drupal project, straight from the readme file.

Markdown is preferred here over .txt so the file can be reused on a GitHub, GitLab, ... repository as well.
The downsides of this approach are that we lose several capabilities:

  • the Drupal translation and routing system
  • conditional help, e.g. when another related module is installed

Anyway, in most cases, it can be used as a good fallback.

Require the Parsedown library in your composer.json

"require": {
  "erusev/parsedown": "^2.0"
}

Create the README.md file at the root directory of the module.

Then, in the hook_help()

use Drupal\Core\Routing\RouteMatchInterface;
use Drupal\Component\Utility\Xss;

/**
 * Implements hook_help().
 */
function my_module_help($route_name, RouteMatchInterface $route_match) {
  switch ($route_name) {
    case 'help.page.my_module':
      $parsedown = new Parsedown();
      $readme = file_get_contents(drupal_get_path('module', 'my_module') . '/README.md');
      // Stay permissive here, with filterAdmin.
      $output = Xss::filterAdmin($parsedown->parse($readme));
      return $output;
  }
}



Mar 01 2019
Mar 01

We’re featuring some of the people in the Drupalverse! This Q&A series highlights individuals you could meet at DrupalCon.

Every year, DrupalCon is the largest gathering of people who belong to this community. To celebrate and take note of what DrupalCon means to them, we’re featuring an array of perspectives and fun facts to help you get to know your community.
 

Mar 01 2019
Mar 01

Which of those Drupal modules that are crucial for almost any project make you want to... pull your hair out? 

For, let's face it, with all the “improving the developer experience” initiatives in Drupal 8:
 

  • BigPipe enabled by default
  • the Layout Builder
  • Public Media API
  • and so on
     

… there still are modules of the “can't-live-without-type” that are well-known among Drupal 8 developers for the headaches that they cause.

And their drawbacks, which negatively impact the developer experience, include:
 

  • a lacking or poor interface
  • a bad UI for configuration
  • hard-to-read code
  • too much boilerplate code and verbosity
  • a discouragingly high learning curve for just one-time operations
     

Now, we've conducted our research and come up with 4 of the commonly used Drupal modules that developers have a... love/hate relationship with:
 

1. Paragraphs, One of the Heavily Used Drupal Modules 

It's one of the “rock star” modules in Drupal 8, a dream come true for content editors, yet, there are 2 issues that affect developer experience:
 

Developers are dreaming of... better translation support for the Paragraphs module. And of that day when deleted pieces of content with paragraphs data no longer linger in their databases.
 

2. Views

Here's another module with its own star on Drupal modules' “hall of fame” that...  well... is still causing developers a bit of frustration:

You might want to write a query yourself, to provide a custom report. In short, to go beyond the simple Views lists or joins. It's then that the module starts to show its limitations.

And things get a bit more challenging than expected. 

It all depends on how “sophisticated” your solution for setting up/modifying your custom query is and on the very structure of the Drupal data.

Luckily, there's hope.

One of the scheduled sessions for the DrupalCon Seattle 2019 promises to tackle precisely this issue: how to create big, custom reports in Drupal without getting your MySQL to... freeze.
 

3. Migrate 

There are plenty of Drupal developers who find this module perfectly suited to small, simple website migration projects. And yet, they would also tell you that it's not so developer friendly when it comes to migrating heavier, more complex websites.

Would you agree on this or not quite?


4. Rules 

Another popular Drupal module, highly appreciated for its flexibility and robustness, yet some developers still have a thing or two against it:

It doesn't enable them to add their own documentation: comments, naming etc.

And the list could go on since there are plenty of developers frustrated with the core or with the Commerce Drupal module...

The END!

What do you think of this list of Drupal modules that give developers the most headaches? Would you have added other ones, as well?

What modules do you find critical for your projects, yet... far from perfect to work with?

Mar 01 2019
Mar 01

Do you remember Scrooge McDuck? He was the uncle of that most famous and beloved character, Donald Duck, and most notable for the piles of shiny, golden coins stacked in his cartoon mansion. 
 
His favorite pastimes: Pinching pennies, counting gold and swimming around in his mountains of money. 
 
While we can’t all have Scrooge McDuck’s limitless riches, we’re still like him in a few important ways. Guarding our riches in every possible manner is one of them.


New technologies and approaches are creating massive changes that have forever altered the way consumers and businesses interact. Amid these technological changes, our e-mail accounts and other social media handles play a role similar to Scrooge McDuck’s riches. And having to log in to each of these treasures one by one is something we would rather avoid.

Right?

Thus, here is one of the most trustworthy applications for your software systems. 

Presenting Single Sign-On (SSO) 

Single sign-on (SSO) is a session and user authentication service that allows a user to use a single set of login credentials (like a username and password) to access multiple applications. In an SSO system, a third-party service offers the identification and authorization mechanism and is responsible for asserting the user's identity. 

This identification and authorization are handled with the help of Federated Identity. 

Federated Identity refers to the standards, tools, and use-cases that enable the users to access multiple applications using the same access credentials. 


So now the question is - how is the authorized data exchanged?

Well, Federated Identity builds on the OASIS Security Assertion Markup Language (SAML) specification (it may involve other open source technologies as well). This specification enables a standard exchange of data between security domains, with the main focus on providing support for:

SAML 2.0 as an Identity Provider: the system or administrative domain that authenticates the user and holds the associated attributes. In SAML, Identity Providers are also called SAML authorities and Asserting Parties.

SAML 2.0 as a Service Provider: the system or administrative domain that relies on the information supplied by the Identity Provider. 

Diagram of the SAML process: the user at the top connects to both the service provider and the identity provider. Source: blog.imaginea

Security and Privacy in SAML 2.0

This protocol brings no security by itself and relies heavily on secure communications (SSL/TLS) or a pre-existing trust relationship, which in turn typically relies on PKI or asymmetric cryptography.

It specifies a wide variety of security mechanisms to identify and guard the data against attacks. The relying party and asserting party should have a pre-existing trust relationship, which typically depends on a Public Key Infrastructure (PKI). 

When a party demands an assertion from another party, bi-lateral authentication is needed; SSL or TLS with mutual authentication, or authentication via digital signatures, are the recommended mechanisms.
 
In terms of privacy, SAML 2.0 also promotes the establishment of pseudonyms between an identity provider and a service provider. The authentication context mechanism enables a user to be authenticated at a sufficiently assured level, appropriate to the resource they are attempting to access at the service provider.

Diagram of the security process in SAML 2.0: the user on the left connects to both the identity provider and the service provider. Source: Medium

SimpleSAMLphp for Implementing the standards of SAML 2.0

What is SimpleSAMLphp?

It is an application written in PHP that implements SAML 2.0. SimpleSAMLphp is a really easy way of integrating web-based PHP applications into a federation. 

SimpleSAMLphp also supports non-PHP scenarios by using the Auth MemCookie approach (a special cookie is added to Memcache that the Apache module Auth MemCookie understands).

It supports two scenarios:

  • SimpleSAMLphp as a Service Provider 
  • SimpleSAMLphp as an Identity Provider 

Service Provider Scenario 

It is important to know that the Service Provider API provides only basic functionality:

  • Checking whether the user is authenticated
  • Requiring authentication
  • Login and logout
  • Retrieving the user's attributes
  • Retrieving the URLs for login and logout

For authentication, SimpleSAMLphp connects to an identity provider (which is defined in configuration files). This way, the Service Provider can be reconfigured to connect to other Identity Providers without modifying anything in the web application.

If you want to use SimpleSAMLphp as a Service Provider in a web application, you call its classes through this API. Once authentication is complete, you can easily access the user’s attributes.
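As a sketch of that API (assuming a standard SimpleSAMLphp installation and an SP authentication source named `default-sp` defined in `authsources.php`; the autoloader path is an example):

```php
<?php
// Load SimpleSAMLphp's autoloader (path depends on your installation).
require_once '/var/simplesamlphp/lib/_autoload.php';

// Wrap the "default-sp" authentication source.
$auth = new \SimpleSAML\Auth\Simple('default-sp');

// Redirect the user to the IdP unless they are already authenticated.
$auth->requireAuth();

// After authentication, the SAML attributes and logout URL are available.
$attributes = $auth->getAttributes();
$logoutUrl = $auth->getLogoutURL();
```

This fragment only runs inside a configured SimpleSAMLphp install, but it shows the whole basic API surface: check/require authentication, then read attributes and the login/logout URLs.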

Identity Provider Scenario

The identity provider in SimpleSAMLphp is configured to validate users against various sources: static credentials, LDAP, SQL, RADIUS, OpenID, Facebook, or Twitter. 

For setting up the Identity Provider, the configuration files need to be changed so that the authentication module to be used is specified (along with additional information and the list of Service Providers). When several Service Providers use the same Identity Provider to verify users, a user needs to log in only once, because the session information is stored by the Identity Provider. 

The Identity Provider also requires a certificate so that it can prove its identity to the Service Provider.

Diagram of the flow between the three parties: the Service Provider, the user, and the Identity Provider. Source: JulianZhu

How Do Sessions Work in SimpleSAMLphp?

SimpleSAMLphp includes an abstraction layer for session management. That means it is possible to select between different kinds of session stores, as well as write new session store plugins.
There are five ways in which sessions can be stored:

PHP: If you want to use the PHP session handler, set the store.type configuration option in config.php. Keep in mind that PHP does not allow two sessions to be open at the same time: if both the application and SimpleSAMLphp use PHP sessions, they need to have different session names.

SQL: To store sessions in a SQL database, set the store.type option to sql. SimpleSAMLphp uses PDO (PHP Data Objects) when accessing the database server, so the database source is configured with a DSN (Data Source Name). The required tables are generated automatically. If you want to store data from multiple separate SimpleSAMLphp installations in the same database, you can use the store.sql.prefix option to prevent conflicts.

Memcache: To use the Memcache session handler, set the store.type parameter in config.php to memcache. This enables you to store multiple redundant copies of sessions on different Memcache servers. Every server group is an array of servers, and data items are load-balanced between all servers in each server group.

Redis: To save sessions in Redis, set the store.type option to redis. By default, SimpleSAMLphp will try to connect to Redis on localhost at port 6379; this can be changed with the store.redis.host and store.redis.port options.

Writing your own plugin: SimpleSAMLphp has an excellent open source community, and every type of user is welcome to join. The forums are open for everyone to ask questions, provide answers, request improvements, or contribute code or plugins of their own.
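As an illustration, the SQL store options above would look like this in SimpleSAMLphp's `config/config.php` (the DSN, credentials, and prefix are placeholder values):

```php
<?php
// config/config.php (fragment): store sessions in a SQL database.
$config['store.type'] = 'sql';
// PDO DSN pointing at the session database.
$config['store.sql.dsn'] = 'mysql:host=localhost;dbname=simplesaml';
$config['store.sql.username'] = 'simplesaml';
$config['store.sql.password'] = 'secret';
// Table prefix, useful when several installations share one database.
$config['store.sql.prefix'] = 'site1_';
```

Switching to Memcache or Redis is just a matter of changing `store.type` and the corresponding `store.*` options.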

Drupal in the picture 

DrupalCamp 2018 featured a session on the SimpleSAMLphp Drupal 8 module. The session was all about installing and configuring SimpleSAMLphp as an IdP and SP, integrating SimpleSAMLphp into Drupal 8, and creating an SSO network.

[embedded content]


The Drupal SimpleSAMLphp module is one of the most robust modules available, backed by a comprehensive implementation of SAML in PHP. 

This module not only makes it possible for Drupal to communicate with SAML identity providers (IdPs) to authenticate users, but also lets the Drupal site act as a SAML or Shibboleth service provider (SP). Some of the features it provides are:

  • Just-in-time provisioning of Drupal user accounts based on SAML attributes.
  • Automatic role assignment based on SAML attributes.
  • A dual mode that supports traditional Drupal accounts and SAML-authenticated accounts at the same time.
  • Support, through the underlying library, for multiple authentication protocols like OpenID (e.g., Google, Yahoo), Facebook, OAuth (e.g., Twitter), RADIUS, etc.

Conclusion 

SimpleSAMLphp is very valuable for implementing an SSO mechanism in web applications. It is developed in native PHP and supports integration with any SAML provider.

Yes, the library is very flexible: it comes with many authentication modules, and they can easily be adapted to third-party applications. 

The technology has become very popular especially with the rise of concepts like Web 2.0 and the continuous development of social networks websites like Facebook, MySpace, and others. 

At OpenSense Labs, we believe that security is the number one concern of any organization, and we try to provide services that help in the longer run. Ping us now at [email protected]; our professionals will provide suitable answers to all your queries and questions.  

Feb 28 2019
Feb 28

“Oh snap”, said the project manager. “The client has this whole range of rich articles they probably are expecting to still work after the migration!”

The project was a relaunch of a Drupal / Commerce 1 site, redone for Drupal 8 and Commerce 2. A couple of weeks before the relaunch, and literally days before the client was allowed in to see the staging site, we found out we had forgotten a whole range of rich articles where the client had carefully crafted landing pages, campaign pages and “inspiration” pages (this is an interior design type of store). The pages were panel nodes, using a handful of different (all custom) panel panes.

In the new site we had made Layout builder available to make such pages.

We had 2 options:

  • Redo all of them manually with copy paste.
  • Migrate panel nodes into layout builder enabled nodes.

“Is that even possible?”, said the project manager.

Well, we just have to try, don’t we?

Creating the destination node type

First off, I went ahead and created a new node type called “inspiration page”. And then I enabled layout builder for individual entities for this node type.

Now I was able to create “inspiration page” landing pages. Great!

Creating the migration

Next, I went ahead and wrote a migration plugin for the panel nodes. It ended up looking like this:

id: mymodule_inspiration
label: mymodule inspiration
migration_group: mymodule_migrate
migration_tags:
  - mymodule
source:
  # This is the source plugin, that we will create.
  plugin: mymodule_inspiration
  track_changes: TRUE
  # This is the key in the database array.
  key: d7
  # This means something to the d7_node plugin, that we inherit from.
  node_type: panel
  # This is used to create a path (not covered in this article).
  constants:
    slash: '/'
process:
  type:
    plugin: default_value
    # This is the destination node type
    default_value: inspiration_page
  # Copy over some values
  title: title
  changed: changed
  created: created
  # This is the important part!
  layout_builder__layout: layout
  path:
    plugin: concat
    source:
      - constants/slash
      - path
destination:
  plugin: entity:node
  # This is the destination node type
  default_bundle: inspiration_page
dependencies:
  enforced:
    module:
      - mymodule_migrate

As mentioned in the annotated configuration, we need a custom source plugin for this. So, let’s take a look at how we make that:

Creating the migration plugin

If you have a module called “mymodule”, you create a folder structure like so, inside it (just like other plugins):

src/Plugin/migrate/source

And let’s go ahead and create the “Inspiration” plugin, a file called Inspiration.php:

<?php

namespace Drupal\mymodule_migrate\Plugin\migrate\source;

use Drupal\Component\Uuid\UuidInterface;
use Drupal\Core\Entity\EntityManagerInterface;
use Drupal\Core\Extension\ModuleHandlerInterface;
use Drupal\Core\State\StateInterface;
use Drupal\layout_builder\Section;
use Drupal\layout_builder\SectionComponent;
use Drupal\migrate\Plugin\MigrationInterface;
use Drupal\migrate\Row;
use Drupal\node\Plugin\migrate\source\d7\Node;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Panel node source, based on panes inside a panel page.
 *
 * @MigrateSource(
 *   id = "mymodule_inspiration"
 * )
 */
class Inspiration extends Node {

  /**
   * Uuid generator.
   *
   * @var \Drupal\Component\Uuid\UuidInterface
   */
  protected $uuid;

  /**
   * Inspiration constructor.
   */
  public function __construct(
    array $configuration,
    $plugin_id,
    $plugin_definition,
    MigrationInterface $migration,
    StateInterface $state,
    EntityManagerInterface $entity_manager,
    ModuleHandlerInterface $module_handler,
    UuidInterface $uuid
  ) {
    parent::__construct($configuration, $plugin_id, $plugin_definition,
      $migration, $state, $entity_manager, $module_handler);
    $this->uuid = $uuid;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition, MigrationInterface $migration = NULL) {
    return new static(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $migration,
      $container->get('state'),
      $container->get('entity.manager'),
      $container->get('module_handler'),
      $container->get('uuid')
    );
  }

}

Ok, so this is the setup for the plugin. For this specific migration, there were some weird conditions determining which of the panel nodes were actually inspiration pages. If I copy-pasted the query here, you would think I was insane, so for now I will just mention that we were overriding the query() method. You may or may not need to do the same.

So, after getting the query right, we are going to do some work inside of the prepareRow function:

  /**
   * {@inheritdoc}
   */
  public function prepareRow(Row $row) {
    $result = parent::prepareRow($row);
    if (!$result) {
      return $result;
    }
    // Get all the panes for this nid.
    $did = $this->select('panels_node', 'pn')
      ->fields('pn', ['did'])
      ->condition('pn.nid', $row->getSourceProperty('nid'))
      ->execute()
      ->fetchField();
    // Find all the panel panes.
    $panes = $this->getPanelPanes($did);
    $sections = [];
    $section = new Section('layout_onecol');
    $sections[] = $section;
    foreach ($panes as $delta => $pane) {
      if (!$components = $this->getComponents($pane)) {
        // You must decide what you want to do when a panel pane can not be
        // converted.
        continue;
      }
      // Here we used to have some code dealing with changing section if this
      // and that. You may or may not need this.
      foreach ($components as $component) {
        $section->appendComponent($component);
      }
    }
    $row->setSourceProperty('layout', $sections);
    // Don't forget to migrate the "path" part. This is left out for this
    // article.
    return $result;
  }

Now you may notice there are some helper methods there. They look something like this:

  /**
   * Helper.
   */
  protected function getPanelPanes($did) {
    $q = $this->select('panels_pane', 'pp');
    $q->fields('pp');
    $q->condition('pp.did', $did);
    $q->orderBy('pp.position');
    return $q->execute();
  }

  /**
   * Helper to get components back, based on pane configuration.
   */
  protected function getComponents($pane) {
    $configuration = @unserialize($pane["configuration"]);
    if (empty($configuration)) {
      return FALSE;
    }
    $region = 'content';
    // Here would be the different conversions between panel panes and blocks.
    // This would be very varying based on the panes, but here is one simple
    // example:
    switch ($pane['type']) {
      case 'custom':
        // This is the block plugin id.
        $plugin_id = 'my_custom_content_block';
        $component = new SectionComponent($this->uuid->generate(), $region, [
          'id' => $plugin_id,
          // This is the title of the block.
          'title' => $configuration['title'],
          // The following are configuration options for this block.
          'image' => '',
          'text' => [
            // These values come from the configuration of the panel pane.
            'value' => $configuration["body"],
            'format' => 'full_html',
          ],
          'url' => $configuration["url"],
        ]);
        return [$component];

      default:
        return FALSE;
    }
  }
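With the source plugin and migration configuration in place, the migration can be run and verified from the command line. A sketch, assuming Drush and the Migrate Tools contrib module are available on the site:

```shell
# Check the status of the migration group.
drush migrate:status --group=mymodule_migrate

# Run the inspiration migration.
drush migrate:import mymodule_inspiration

# If the resulting layouts look wrong, roll back and try again.
drush migrate:rollback mymodule_inspiration
```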

So there you have it! Since we now have amazing tools in Drupal 8 (namely Layout Builder and Migrate), there is no task that deserves the question “Is that even possible?”.

To finish off, let's have an animated gif called "inspiration". And I hope this will give some inspiration to other people migrating landing pages into layout builder.

Feb 28 2019
Feb 28
In this final installment of our series on conversational usability, we dig into a case study that brings together all of the ideas and best practices we have discussed so far: the Ask GeorgiaGov skill built by the Acquia Labs team for Digital Services Georgia.
Feb 28 2019
Feb 28

It's that time of year again! Leading up to DrupalCon Seattle, Chris Urban and I are working on a presentation on Local Development environments for Drupal, and we have just opened up the 2019 Drupal Local Development Survey.

Local development environments - 2018 usage stats
Local development environment usage results from 2018's survey.

If you do any Drupal development work, no matter how much or how little, we would love to hear from you. This survey is not attached to any Drupal organization; it is simply a community survey to help highlight some of the most widely-used tools that Drupalists use for their projects.

Take the 2019 Drupal Local Development Survey

Chris and I will present the results of the survey at our DrupalCon Seattle session What Should I Use? 2019 Developer Tool Survey Results.

We will also be comparing this year's results to those from last year—see our presentation from MidCamp 2018, Local Dev Environments for Dummies.

Feb 28 2019
Feb 28

Drupal 8.7.0-alpha1 will be released the week of March 11

In preparation for the minor release, Drupal 8.7.x will enter the alpha phase the week of March 11, 2019. Core developers should plan to complete changes that are only allowed in minor releases prior to the alpha release. The 8.7.0-alpha1 deadline for most core patches is March 8. (More information on alpha and beta releases.)

  • Developers and site owners can begin testing the alpha after its release.

  • The 8.8.x branch of core will be created, and future feature and API additions will be targeted against that branch instead of 8.7.x. All outstanding issues filed against 8.7.x will be automatically migrated to 8.8.x.

  • All issues filed against 8.6.x will then be migrated to 8.7.x, and subsequent bug reports should be targeted against the 8.7.x branch.

  • During the alpha phase, core issues will be committed according to the following policy:

    1. Most issues that are allowed for patch releases will be committed to 8.7.x and 8.8.x.
    2. Most issues that are only allowed in minor releases will be committed to 8.8.x only. A few strategic issues may be backported to 8.7.x, but only at committer discretion after the issue is fixed in 8.8.x (so leave them set to 8.8.x unless you are a committer), and only up until the beta deadline.

Drupal 8.7.0-beta1 will be released the week of March 25

Roughly two weeks after the alpha release, the first beta release will be created. All the restrictions of the alpha release apply to beta releases as well. The release of the first beta is a firm deadline for all feature and API additions. Even if an issue is pending in the Reviewed & Tested by the Community (RTBC) queue when the commit freeze for the beta begins, it will be committed to the next minor release only.

The release candidate phase will begin the week of April 15, and we will post further details at that time. See the summarized key dates in the release cycle, allowed changes during the Drupal 8 release cycle, and Drupal 8 backwards compatibility and internal API policy for more information.

Bugfixes and security support of Drupal 8.5.x and 8.6.x

Since September 2018, we have been providing security coverage for the previous minor release as well as the newest minor release.

So, in accordance with our policy, security releases for Drupal 8.6.x will be made available until December 4, 2019 when Drupal 8.8.0 is released. Bugfixes that are not security-related will only be committed until Drupal 8.6.x's final bugfix window on April 3.

Normal bugfix support for Drupal 8.5.x ended in August 2018. However, security support is provided for 8.5.x until the release of Drupal 8.7.0 on May 1, 2019.

Feb 28 2019
Feb 28

Change is the only constant. That’s the lesson we need to adopt when it comes to embracing Drupal 8 and migrating from Drupal 7. Since the launch of Drupal 8 in 2015, many new challenges have emerged among developers, and one of them is forking Drupal.


Quoting Dries' opinion on embracing change:

“The reason Drupal has been successful is because we always made big, forward-looking changes. It’s a cliché, but change has always been the only constant in Drupal. The result is that Drupal has stayed relevant, unlike nearly every other Open Source CMS over the years. The biggest risk for our project is that we don't embrace change.”

What is Backdrop CMS?


Backdrop is a Content Management System (CMS) that can be used to build a wide variety of websites, from a single administrator's personal blog to an intricate, multi-role business e-commerce site. It is a good fit for comprehensive non-profit, educational, corporate, or government websites.

As a tool for structuring websites, the core Backdrop CMS package aims to include many useful features, but only those necessary for the majority of sites using it. Backdrop can easily be extended with the addition of modules, themes, and layouts.

In a way, it allows non-technical users to manage a wide variety of content. It is feature-compatible with Drupal 8 (containing things like the Configuration Management Initiative (CMI), WYSIWYG, and Views in core), but is built on APIs more similar to those found in Drupal 7.

Evolution of Backdrop CMS

Backdrop CMS started its existence as an offshoot of Drupal. Although Backdrop originates from a common codebase with Drupal, its philosophy and organisation are distinct. Backdrop follows a policy of focused releases that account for feedback from the community. 

Essentially, for the small to medium sized businesses, non-profits, educational institutions, or any other organisations, who are in need of a comprehensive website on a budget, Backdrop CMS is easy to build and extend. 

Advantages of Backdrop CMS

The Backdrop and Drupal projects have different end goals, but since they emerged from the same original code base, there are areas in which collaboration can benefit both projects. 

  • Along with easier updates, Backdrop is backwards compatible. Backdrop attempts to keep API changes to a minimum so that contributed code can be maintained easily, and existing sites can be updated affordably.
  • Being simple in its structure, Backdrop lets you write code for the majority. It aims to be easy to learn and build upon, even for those with a minimal amount of technical knowledge. Direct implementations are chosen over abstraction, so how things work is immediately clear and easily documentable.
  • The focus is to include features for the majority. Backdrop core only includes features and tools that benefit the majority of sites running it. Backdrop also aims to include opinions from individuals who attend trainings, meetups, and camps, as well as from real-world engagements with consumers.
  • Backdrop can be extended. Backdrop aims to provide a powerful core that can be readily extended through custom or publicly available contributed modules. These additional modules provide desired features that are not incorporated in core due to their complexity or overly specific use cases.
  • Rendering great performance, Backdrop has low system requirements. Backdrop runs on affordable hosting with very basic requirements. This means not chasing popular trends in technology, but instead adopting common, proven, and learnable systems.
  • Backdrop lets you plan and schedule releases. Each release contains a planned set of features and is released on time. If a feature is not ready in time for a specific release, the feature gets postponed, but the release is delivered on time. 
  • It has the freedom to remain free and open source. All code included with Backdrop is under an Open Source license that allows anyone to use it for free, regardless of their beliefs or intentions.

Why fork Drupal?

There are lots of reasons why Drupal was forked to create Backdrop. These are the most notable ones:

Technical Gap

Though many features in Drupal 8 are identical to those in Drupal 7, the code underneath bears little resemblance to the Drupal of yesteryear. Developers value maintaining code with a proven success rate rather than drifting from that track record of success.

Coding Principles

The Backdrop community may differ from the Drupal community in which issues they regard most highly, and vice versa. As the principles diverge, so does the code. This justifies the existence of Backdrop.

Niche Audience

Backdrop CMS is dedicatedly targeted at small to medium-sized businesses, non-profits, and education. It best serves the kinds of organisations that need complex functionality, but on a budget.

Graph of Drupal's evolution from Drupal 6 to Drupal 8, with Backdrop emerging alongside Drupal 8. Source: Quora

Case Studies

The NorCal Hunter Jumper Association is a not-for-profit recreational sports organization that was looking for a better website viewing experience for its membership, mainly on tablets and mobile devices. The new site also needed to be easy for the board and administrators to update and manage. Further, they planned to move board membership nominations, voting, general surveys, and other forms onto the website in the future, including forms that may need credit card processing. Backdrop was chosen as the platform to integrate all these requirements, with the following results:

  • A finer viewing experience for the members on tablets and mobile devices.
  • Easier updates and management for the board and administrators.
  • Flexibility to add features as the needs of the organization grow.
  • Easy to integrate with other web services.
  • Affordable maintenance and long-term development costs.
Screenshot: the homepage of the NorCal Hunter Jumper Association website


BGP Site Solutions is a group of business sites showcasing web publishing experience.

Founded in 2003, BGP Site Solutions has managed nearly 100 web properties with vast experience in performance-based online marketing (Cost per Lead, Cost per Acquisition, Cost per Click), white-hat organic search engine optimization, and web publishing in the marketing verticals of post-secondary education, home services, insurance (auto/health), wine, diet/weight loss/health, financial services, dating, and eldercare/senior services. 

Authority Media, on the other hand, formed in 2011, is a leading publisher of career training web properties. Its goal was to be the most authoritative source of information in each post-secondary education category in which it operates.

These sites were formerly separate WordPress sites and had been hacked multiple times, so website security was the top priority.

Since these are both fairly small sites, combining them into a single codebase site offered savings in terms of hosting and maintenance costs. And the multi-site feature offered by Backdrop CMS seemed like the perfect fit.

Screenshot: the homepage of the BGP Site Solutions website


Final Thoughts

Drupal 8 is a huge departure from anything the Drupal community has released in the past and it’s a move towards the enterprise.

Backdrop is aimed not only at users but also at developers who face challenges in adapting to the new platform and investing their time to improve it further. That's the gap Backdrop aims to fill, while attempting to maintain connectivity and cohesiveness with the larger Drupal community.

Thus, both Drupal 8 and Backdrop are trying to address the same problem, in fundamentally different ways.

Still confused? OpenSense Labs provides steadfast solutions and services to empower digital innovation for all enterprises.
Ping us at [email protected] and let us know how we can help you achieve your digital goals.

Feb 28 2019
Feb 28

A client asks about yet another hosting option:

The VPS-2000HA-S includes the following resources:

6GB RAM (burstable)

150GB SSD Disk space

5TB Monthly Bandwidth

4 free dedicated IP's

options to configure the server for particular versions of PHP

2 hours of Launch Assist to help migrate and configure the server with the Managed Hosting team (one on one Tier 3 support)

... what do you think?

Sounds good, right? This is at what I'm going to call a "traditional" host. Cost for this package, if you pay for a year up front, is just over $40/month. Seems reasonable. But... Digging a little deeper, I'm skeptical that this is a good value. The "Choice of OS" offered is CentOS 7, Ubuntu 16.04, or Debian 8. While CentOS 7 is current, Ubuntu 16.04 is 3 years old and Debian 8 is 4 years old -- why don't they support Ubuntu 18.04 or Debian 9, both of which are 1 year old and have far more service life remaining?

And, how exactly can you manage backups on this hosting platform?

My sense is this client is really attached to having Cpanel or a similar control panel so they can add/remove sites through a point-and-click web interface. Sounds great, right? Not when DevOps enters the picture.

DevOps vs Hand Built

Control panels are great for learning what options and functionality are available. They are great for people who don't spend their days managing sites and servers. They are great for sites that are disposable, and not that important. But when I go in to figure out yet another control panel, I cringe and reach for another cup of coffee.

What's the alternative? Automation, and configuration management. The tools of DevOps.

Yesterday I spun up a brand new server and moved over a large existing Drupal site to it in under an hour. And most of that time I wasn't paying much attention to the process.

The steps were basically:

  • Start up a bare server on a cloud provider
  • Install the configuration management client, and point it at our master
  • Accept its key on our master
  • Create a new server configuration file from our template, filling in the blanks with things like API keys for the backup service, which PHP version to deploy, how to route their outgoing email
  • Create a new site configuration file from a different template, specifying the git repository path, all domain name variations, specific platform
  • Tell it GO!

That was 5 or 6 minutes of actual attention, and it churned away for another 25 - 30 minutes. When it was done, the site code was on the server, most everything was configured -- all that was left was to import the database and copy over the image assets, and it was ready for the DNS changeover.

"Ok," you say. "I can do all that in my favorite control panel in less than 30 minutes."

That may well be true -- but we haven't gotten to the point yet.

The point is, my config is completely self-documented, completely reproducible, and easily portable to other services. Did you keep track of every change you made in the control panel? Can you save those control panel settings, and apply them on an entirely different service if you need to? How fast can you recover if the hardware your site is running on fails? Or worse, your site gets hacked?

When you're using DevOps tools and practices, you treat your configuration like code -- it is managed, versioned, constantly improved.

If you want to change the PHP version, it's a single line in a config file, and apply.

If something drastic changes on the server, the configuration management alerts you and changes it back.

If the server dies, you spin up a new one and tell it to use the previous config -- and, bonus, this can work extremely well even across new operating system versions! Which means under our server maintenance plan, we cover upgrades to new "Ubuntu Long Term Support (LTS)" versions as part of our regular service, no extra charge -- we just point your config at the new server and restore your content.

If your host isn't reliable enough, pick a new one -- similar effort on our end, we just point your config at a new one and restore your content.

Evergreen vs Set it and Forget it

This all largely boils down to attitude and world view. Is your website critical to your business, something that should be tended to, kept current, and constantly changed to keep delivering value to your business? Or is it something you create once and then ignore it for years until your marketing consultant persuades you to update it?

If you don't think your website is important, then sticking it in a control-panel-driven host may be fine. But if it's any kind of application that, you know, does anything, then if you don't pay attention to it, an attacker might.

Drupal has now had 3 quite serious security vulnerabilities over the past year. WordPress in many ways is even worse -- its huge ecosystem of plugins gets very little review or coverage, and we're getting more and more business cleaning up sites that have been hacked. Leaving a CMS untouched for any length of time is asking for somebody to come mess with it.

Now, I'm not saying you need to make your website the full focus of your business. Different businesses have different goals, and for many, "set it and forget it" is all they want to do -- have a contact page so people can reach them but otherwise do nothing with it. That's totally fine -- but I might suggest creating a static website that has nothing to attack, instead of a CMS that does.

So really, we're talking about a spectrum here. At one end, a static site is a collection of files sitting on a server somewhere. If the server is relatively secure, an attacker can't really do anything.

In the middle, you have these control panels, and the scads of people running WordPress in them. That's really asking for trouble -- what I find appalling is how many sites we're seeing where the site owners get hacked and they have no backup of the site, nothing telling them they have been hacked, no warnings, no nothing. We end up using Archive.org to try to at least recover some of their content.

And then you have the DevOps end of the scale -- what I'm calling Evergreen. We are constantly applying updates -- not just to your site, but to your servers, Docker containers, the overall environment. We are constantly solving tiny problems as minor upgrades break little things that can be easily solved one at a time.

This means we keep you at the forefront across the board -- you're never so far behind that you need to do a big risky expensive upgrade project.

This also gives you the opportunity to try out different things on your website, things that might drive big changes in your business, things that you can't do if your server is too old -- we make sure your server is never too old.

For the past year or so, we're seeing a big generational change to PHP, the language that powers a lot of popular platforms including both Drupal and WordPress. PHP 5 is now obsolete as of January 2019, no longer getting any security coverage after a 15-year run. PHP 7.2 is the current release, but there are lots of incompatibilities we're seeing between PHP 7.0 and 7.2 -- this has proved to be some of the stickiest upgrade issues we've had to resolve. And a huge number of hosts still only support PHP 5 or PHP 7.0!

Cloud provider vs Traditional host

... which brings me back to the original question. Why is a VPS from a traditional host not the same as a Cloud provider?

Here are some of our favored cloud providers:

  • Digital Ocean
  • Upcloud
  • Linode
  • Amazon Web Services (AWS)
  • Google Cloud Engine (GCE)
  • Microsoft Azure

... those examples break down into roughly two categories: flat-rate packaged services, and entirely infrastructure-as-a-service. The first 3 on the list, you pretty much pay a flat rate for a server of a certain size, by the hour or month, and it includes disk and a certain amount of bandwidth. The others are more commodity, with a vast range of server sizes, and you pay per GB for disk space and bandwidth on top of that. Right now, I think of Amazon, Google, and Microsoft as "the big 3", which all have similar services and similar pricing -- more expensive than the flat-rate services, but all of these offer deep discounts if you commit to 1 or 3 years of "reserved instances", especially if you pay some or all up front.

There are many, many other providers that fall into one or both of these "cloud provider" categories. However, we see lots of "traditional hosts" that offer "Virtual Private Servers" that do not stack up. Here are some of the warning signs, deficiencies, and drawbacks of these traditional hosts, compared to any of these cloud hosts.

  • Higher costs. A 2GB server, for a basic site, costs around $10/month. An 8GB server around $40 at the packaged places, around $80 - $100 before discounts at the big 3 (can be brought down to ~$35/month or less for 3 years paid up front!). If you are seeing prices higher than that for equivalent hardware, you have to ask why.
  • Only older operating system versions. If you can't get a Long Term Support version of a major Linux distribution within a month of its release, I would look elsewhere.
  • Lack of entire-disk snapshots, or ability to spin up new servers from a snapshot. This is the killer feature of Cloud servers -- if something goes wrong with your server, spin up a new one on last night's snapshot and you're back up in minutes. And -- you don't need to wait for their support to do it for you.
  • "Elastic" or "Floating" IP addresses -- reserve an IP address you can point at a replacement server and not have to update your DNS.
  • No ability to attach extra disk. Cloud providers pretty much all have "elastic block storage" or "object storage" available -- you can create a new volume and attach it to your existing server if you need more space.
  • Restricted kernels, or limits on software installation. We have a client with a full VPS and root access at a major reputable host, but we cannot install Docker on it due to the way they manage the kernel. Docker is a must-have tool for us -- it is what gives us the ability to reliably manage software with different versions on pretty much any host -- and yet we've found several hosts where it's not possible to run Docker. Which takes us back to hand-built servers instead of letting us use our DevOps tools.
  • No console access. (AWS lacks this too, but most cloud providers will let you see the console and boot screens through a web interface, to help recover from corrupt disks or mistakes adding additional software. AWS does give you logs -- and you can easily attach disks from one VPS inside another to repair...)
  • CPanel or Plesk -- if you see these, run away! They actively interfere with our ability to manage configuration in code, and often prevent us from locking down and properly securing a server. And they usually depend on specific, ancient software versions, making it impossible to stay evergreen.

Doing Evergreen Safely

There's a couple more pieces to this puzzle: monitoring, and change management.

How do we know that a server upgrade didn't break your site?

How do we know that a site update succeeded?

How do we know whether you got hacked?

How do we know if you made a change in production that might get overwritten by an update?

What do we do if something goes wrong?

Those are questions we can answer with our site and server maintenance plans -- you'll have a very hard time getting answers to these anywhere else!

No hosting company does this kind of site monitoring or management for you. Very few services do this for you. This is where we excel.

We use industry-leading monitoring software, the various DevOps tools we mentioned, mostly the same ones that cutting-edge technology startups and leading edge enterprises use -- but we use them for mid-market companies and organizations that don't otherwise have the expertise to put them in place.

Every minute or two, our monitoring systems check every site we manage, measuring response times, checking for error codes, warning about upcoming certificate expirations, checking for specific text on your page.
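A per-site check like the one just described can be sketched as a small Python function. This is a minimal illustration, not our actual monitoring stack; the names and thresholds are made up, and real tooling would also verify certificate expiry:

```python
import time
import urllib.request

def check_site(url, expected_text, max_seconds=5.0, opener=urllib.request.urlopen):
    """Run one health check; return a list of problems (empty list = healthy)."""
    start = time.monotonic()
    try:
        with opener(url, timeout=max_seconds) as resp:
            status = resp.status
            body = resp.read().decode("utf-8", errors="replace")
    except Exception as exc:          # network error, timeout, DNS failure...
        return [f"request failed: {exc}"]
    elapsed = time.monotonic() - start

    problems = []
    if status != 200:                 # error codes
        problems.append(f"unexpected status code {status}")
    if elapsed > max_seconds:         # response time
        problems.append(f"slow response: {elapsed:.1f}s")
    if expected_text not in body:     # specific text on the page
        problems.append(f"expected text {expected_text!r} not found")
    return problems
```

The `opener` parameter is injectable so the check can be exercised against a stub in tests instead of a live site.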

Every night our bots check every site to see if any code has changed, whether the code matches exactly what is tracked, whether there have been any configuration changes on the site that could get clobbered or reverted.
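One way such a nightly "has the code changed?" check can work is by hashing every file and comparing against a manifest saved from the previous run. A minimal sketch, assuming a simple file-hash approach (our actual bots also compare against the tracked git state):

```python
import hashlib
from pathlib import Path

def build_manifest(root):
    """Map each file under root to its SHA-256 hash."""
    root = Path(root)
    return {
        str(path.relative_to(root)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in sorted(root.rglob("*"))
        if path.is_file()
    }

def diff_manifests(old, new):
    """Report files added, removed, or modified since the previous run."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    modified = sorted(f for f in set(old) & set(new) if old[f] != new[f])
    return added, removed, modified
```

Any non-empty diff that doesn't correspond to a deliberate deployment is worth an alert -- it could be a hack, or an ad-hoc change in production that an update would clobber.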

Every time we release updates to a website we get a full backup of the site before and after, we send a notification of exactly what's going to change before we release, and another notification when it has actually released.

If something goes wrong, we have a full trail of everything that has changed, and a process for recovering -- worst case, we can roll you back to the previous day's backup -- but more often, we can identify the single thing causing trouble and revert just that. And many of our clients with historical backups are surprised that we can go retrieve data that was deleted weeks ago.

And one final point here. We are far from perfect. We have made lots and lots, and lots, and lots of mistakes over the years. Every one of these things we do is something we've done because we've made a mistake, or seen somebody else make a mistake. We will continue to make mistakes -- but what all these systems, all these tools, all these processes do is make it so a mistake doesn't hurt very much.

And you can use our mistake-rectifying service at a very affordable rate! Go make all the mistakes you want, we've got you covered!

Feel free to reach out to us via email [email protected] or use our contact form.

Feb 27 2019
Feb 27

Our company is hosting a 2019 Drupal Community DevOps Survey to determine the community’s use of DevOps currently and tracking progress over time. We’re excited to utilize these results in our presentations this year, especially at DrupalCon Seattle this April!

This survey, hosted on measure.team, was created by Last Call Media.

To see some current progress shots of the survey, please visit our DevOps Results blog.

The survey will collect responses until Last Call Media presents at DrupalCon Seattle, and the results update as soon as you take it.

How does your involvement align with the community’s?

Feb 27 2019
Feb 27

by David Snopek on February 27, 2019 - 12:54pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Moderately Critical security release for the Context module to fix a Cross Site Scripting (XSS) vulnerability.

The Context module enables site builders to set up conditions and reactions for different parts of the site.

The module doesn't sufficiently sanitize user output when displayed leading to a Cross Site Scripting (XSS) vulnerability.

This vulnerability is mitigated by the fact that an attacker must have the ability to store malicious markup in the site (e.g. permission to create a node with a field that accepts "filtered html").

See the security advisory for Drupal 7 for more information.

Here you can download:

If you have a Drupal 6 site using the Context module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Feb 27 2019
Feb 27

If you’ve spent time looking for a website support partner, you’ll quickly realize that while there are a lot of options out there, they’re not all created equal. Keeping your goals in mind will help you find an agency with an approach that best meets your needs.

If you’re simply looking for software updates and security patches, there are a lot of options out there. But if you’re looking for a strategic partner to support your site, the search for the right fit can be a bit more challenging.

At Kanopi Studios, we cover the basics, but that’s just the beginning. Our support team focuses on continuous improvement and growth-driven design, ensuring long-term growth for your website. We can jump in at any stage of your site’s lifecycle to make sure you’re meeting your goals and getting the most out of your investment. And when it’s finally time for an upgrade, we can help with that too!

Here are a few details that set Kanopi’s support services apart:

Customer service is our #1 priority.

Our team goes the extra mile to provide stellar customer service. We’re here to make your life easier, regardless of the size of your account.  

Added value and strategic guidance

As part of your monthly support budget, you’ll gain access to experienced designers, user experience strategists, developers and more. When it’s time to go beyond bug fixes, you’ll have experts in your corner to help your site respond to changes in the market or shifts in your business priorities.

You’ll work with real humans!

Our full-time support team manages every detail of your account. We analyze incoming requests, make sure we have the details needed to get the job done right, and respond within an hour, all without a single bot in sight.  

A dedicated, senior-level team

Our support team focuses on support. We know that it takes a different set of skills, energy, and dedication to handle rapidly changing priorities and keep the issue queue clear. Our experienced team has identified and resolved nearly every issue imaginable. We encourage you to check out their bios so you can see their qualifications for yourself!

A partner you can trust

Kanopi Studios supports more than 135 active websites. Due to the great relationships we’ve built, we’re still working with some of the very first clients that signed on for our services. In fact, most of our work comes through referrals from happy customers. We welcome you to check out our five-star reviews and get in touch to learn more about ensuring long-term growth for your website.

Feb 27 2019
Feb 27

DrupalCamp London 2019 is approaching fast. Are you ready for another great time with Drupal? This year, 42 sessions on Drupal and related topics are scheduled. We hope that, with our help, you will choose the most promising lectures. Below are a few sessions you should definitely visit. We've picked topics for experienced coders and beginners alike, as well as something for business owners, editors, marketers and others.

1. Visual regression testing  

https://drupalcamp.london/session/visual-regression-testing-patterns

This talk will cover:

  • BackstopJS for visual regression testing on Pattern Lab patterns on a Drupal 8 theme. 
  • How to set up regression testing for each pattern and for the entire pattern library and the problems you could run into when setting up regression testing for patterns. 
  • The benefits of using this approach.

To take part in this session, it’s best to have basic knowledge of how the integration of Drupal and Pattern Lab works. 

2. No Monkey Business Static Progressive Web Apps

https://drupalcamp.london/session/no-monkey-business-static-progressive-web-apps

This talk will cover:

  • An overview of the architecture used to deliver ii.co.uk.
  • How the GatsbyJS was used to generate static content from Drupal and other dynamic sources.
  • How these pages were further hydrated with React for dynamic content after the initial page load.
  • The custom cache handling implemented to keep the content build pipelines fast.
  • Division of the responsibilities for content generation between GatsbyJS and Drupal.
  • Resolving the real-time preview issue without waiting for Gatsby's upcoming hosted paid preview service.

3. Layouts in Drupal: past, present and future

https://drupalcamp.london/session/layouts-drupal-past-present-and-future

This talk will cover:

  • The history of building layouts in Drupal. 
  • Using Node reference (CCK), Nodequeue and custom template to build newspaper and magazine style layouts in Drupal 5. 
  • Having a look at "page builders" like Panels, Context and Block Visibility Groups. 
  • Diving into CSS Grid layouts and using plugins like Masonry and GridStack for more advanced grid-style layouts. 
  • Alternative approaches like Paragraphs, ECK/IEF and Bricks to create custom layouts. 
  • The pros and cons of these layout approaches and if and why they are now outdated.
  • New Layout Builder and some possible new approaches for building layouts in Drupal.

4. Creating an enterprise level editorial experience for Drupal 8 using React

https://drupalcamp.london/session/creating-enterprise-level-editorial-experience-drupal-8-using-react

This talk will cover:

  • Results from a recent project in which a decoupled application was built with React, allowing content to be edited directly in the frontend and using the possibilities of React to the fullest. 
  • Sharing an editorial experience with 'in-place' editing, 'context-sensitive' editing, 'drag-n-drop' content placement and creation, and much more.
  • Presentation of the application and vision of what an enterprise level editorial experience should look like.
  • What to expect when going fully decoupled with editorial experience and how this approach fits into the development of Drupal.

5. Migrate to Drupal

https://drupalcamp.london/session/migrate-drupal

The talk will cover:

  • Migrations in Drupal 7 and Drupal 8
  • Effective communication to project stakeholders
  • Writing and running efficient migrations

To take part in this session, it’s best to have basic PHP coding skills and an understanding of Drupal site building.

6. Droopler distribution - How can you save even 100 man-days during the development of a new website with Drupal

https://drupalcamp.london/session/droopler-distribution-how-can-you-save-even-100-man-days-during-development-new-website

Maciej Lukianski will show you that you don't need a budget of over ten thousand dollars to build a website with Drupal 8.

This talk will cover:

  • Droopler modules.
  • What paragraphs Droopler can offer to build your new page fast.
  • How simple it is to build a new landing page with Droopler in a live demo.
  • What ideas we have for the future functionalities of Droopler.

7. Drupal 8 SEO

https://drupalcamp.london/session/drupal-8-seo

This talk will cover:

  • Drupal modules, Google tools and external tools and how to use them to prepare an SEO strategy.
  • How to plan an SEO strategy for your website and how to compare your website with the competition.

8. Out of the Box Initiative Update

https://drupalcamp.london/session/out-box-initiative-update

This talk will cover:

  • The project in the past and present state of the initiative.
  • Targets for inclusion in Drupal 8.7.0
  • Ways to contribute to the project.
  • Plans for the more distant future.

9. Scrum everywhere - how we implemented Scrum not only in software development projects

https://drupalcamp.london/session/scrum-everywhere-how-we-implemented-scrum-not-only-software-development-projects

This talk will cover:

  • Using Scrum in the marketing team.
  • Using Scrum in the QA team to improve software testing in the whole company.
  • Using Scrum for company management.
  • Using Scrum for the training of junior developers.

10. Accessibility in UX Design: How we redesigned The University of West London’s website for everyone

https://drupalcamp.london/session/accessibility-ux-design-how-we-redesigned-university-west-londons-website-everyone

This talk will cover:

  • Importance of accessibility in design, showcasing examples from industry giants such as Microsoft.
  • Highlights of accessible design.

11. How to start contributing to Drupal without code

https://drupalcamp.london/session/how-start-contributing-drupal-without-code

This talk will cover:

  • Non-code contributions and impactful ways to get involved in the Drupal project.
  • How to get started.

Summary

The talks presented above are just a small part of what you will learn during DrupalCamp London 2019. Selecting the most important presentations out of 42 proposals is a real challenge, and perhaps this list will help you choose. As always, the level of the conference will be high, so everyone should leave London with a huge dose of knowledge about Drupal innovations.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web