Feb 15 2019

As a writer or editor for your organization’s website, you should be able to quickly write articles or build pages that are collections of smaller elements. You should be able to write some text, add a slideshow, write some more text, perhaps list a few tweets, and finish things off with a list of related content. Or maybe you paste in a pull quote, add a couple full-width images with captions, or even put together an interactive timeline. Your content management system should let you do all that, easily. But chances are, it won’t be with the WYSIWYG you’re used to right now.

What You See Isn’t What You Get

WYSIWYG editors still fall short when it comes to doing much more than simple formatting and embedding a few images. Anything beyond that, and the underlying technology has to leverage some kind of proprietary “smart code” or “token” and do some find-and-replace magic that makes slideshows, media players, or other more complex blocks of content show up correctly for the editor. These tokens aren’t typically based on any adopted standard. It’s just a custom, arbitrary formatting shortcut that programmers decided to use to tell the CMS, “Replace this snippet with that other piece of content.”

If it sounds complicated, that’s because it is. It’s hard to get right. It’s hard to build in a sustainable way. It’s hard – impossible, really – to make it look right and work well for authors. It’s REALLY hard to migrate.

Here’s an example: In earlier versions of Drupal, Node Embed was a way to embed one piece of content (say, an image) inside the body of another (like an article). The “smart code” [[nid: 123]] tells Drupal, “replace this with the piece of content that has an ID of 123.” It worked, but the authoring experience was janky. And it really wasn’t structured content, since your markup would end up littered with these proprietary snippets referencing objects in the CMS. Somewhere down the line, someone would inevitably have to migrate all of that and write regular expressions and processors to parse it back into a sane structure for the new system. That gets expensive.
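To make that concrete, here is a minimal, hypothetical sketch (plain PHP, not actual Drupal migration code) of the kind of one-off processor such a migration ends up needing. The token pattern matches the Node Embed example above; $render_node stands in for whatever the new system uses to render a referenced piece of content:

  <?php

  // Hypothetical helper: swap [[nid: 123]] tokens for the markup of the
  // referenced content. $render_node is whatever callable the new system
  // provides for rendering a node by ID.
  function replace_embed_tokens($body, callable $render_node) {
    return preg_replace_callback(
      '/\[\[nid:\s*(\d+)\]\]/',
      function (array $matches) use ($render_node) {
        return $render_node((int) $matches[1]);
      },
      $body
    );
  }

  // Illustrative usage during a migration row callback:
  // $clean_body = replace_embed_tokens($row_body, $renderer);

And that is the easy case; real bodies accumulate malformed tokens, nested markup and edge cases that each need their own handling.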

Simple form with input, select, and textarea

Fieldable Entities and Structured Content

The thing that lets you, the web editor, write content that is both manageable and flexible is breaking your content into discrete, single-purpose fields. In Drupal it’s called “fieldable entities.” You don’t dump everything into the WYSIWYG (which would be hard to do anyway). Instead, there’s a field to add the author’s name, a field for attaching images, and a field for the text (that last part gets the WYSIWYG). More generally, this serves an important concept called “structured content.” Content is stored in sensible chunks. It adapts to a variety of contexts, like a mobile app or a news aggregator or (of course) your website. In the case of your website, your CMS pushes all those fields through a template, and voila, the page is published beautifully and your readers eat it up.
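As a toy illustration (the field names here are invented), structured content is really just discrete, labeled chunks that any channel can consume; the website runs them through a template, while an API can hand the very same record to a mobile app:

  <?php

  // One article, stored as discrete fields rather than a single WYSIWYG blob.
  $article = [
    'title'  => 'Spring Marathon Recap',
    'author' => 'Jane Doe',
    'images' => ['hero.jpg', 'finish-line.jpg'],
    'body'   => '<p>The race began at dawn...</p>', // only this field needs a WYSIWYG
  ];

  // The website pushes these fields through a template; other contexts
  // (a mobile app, a news aggregator) can consume the same chunks as JSON.
  echo json_encode($article);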

Complex form with input, select, textarea, and multiple fields nested inside

What If My Fields Have Fields?

Here’s where it gets interesting. Back to our earlier example: let’s say your article has a couple slideshows. Each slideshow has a few images, captions, and links. Suddenly your discrete, single-purpose field (slideshow) has its own fields (images, captions, links). And, you may want to add a slideshow virtually anywhere in the flow of the page. Perhaps the page goes text, slideshow, text. Or maybe it’s text, slideshow, text, some tweets, another slideshow. And now you want to swap some things around. Again, you should be able to do all that, easily.

Drupal Paragraphs

Enter the Drupal Paragraphs module. Paragraphs takes the approach of creating content bundles, or collections of fields, that can be mixed and matched on a given page in virtually countless configurations. They’re called “Paragraphs” because they are flexible, structured building blocks for pages. The name is a little misleading; in fact, they are 100% configurable groups of fields that can be added, edited, and rearranged however you want on a given article. You can have paragraph types for slideshows, pull quotes, tweets, lists of related content, or virtually anything else. Paragraphs are building blocks: smaller elements that can be combined to build a page. And like I said earlier, you should be able to easily make pages from collections of smaller elements.

Complex form for adding different types of content called paragraphs

Drupal Paragraphs is Sort of Easy

We use Drupal Paragraphs whenever a particular type of content (a news article, blog post, etc.) is really built up of smaller, interchangeable collections of other fields (text, slideshows, videos, etc.). Drupal Paragraphs are flexible and organized. They let authors create whatever kinds of pages they want, while storing content in a way that is structured and adaptable. Migrations with Paragraphs are generally easier than migrations with special, proprietary embed codes. Breaking content types into Paragraphs gives authors the flexibility they need, without sacrificing structure. You don’t end up with a bunch of garbage pasted into an open WYSIWYG field.

So what’s the catch? Well, the interface isn’t awesome. Using Drupal Paragraphs can add a lot of complexity to the authoring experience. Forms will have nested forms. It can be overwhelming.

Alternatives to Drupal Paragraphs

As I’m writing this, another approach to page building is gathering momentum in the Drupal universe. Layout Builder is currently an experimental module in core, and slated to ship as a stable release with Drupal 8.7. Layout Builder provides a slick drag-and-drop interface for editors to build pages from blocks and fields. We’re excited to see how Layout Builder develops, and to see how well it performs for large editorial websites. For websites with hundreds or thousands of articles, managing pages with Layout Builder may be difficult. As Drupal’s founder, Dries Buytaert, pointed out in a post late last year, “On large sites, the free-form page creation is almost certainly going to be a scalability, maintenance and governance challenge.”

Other open source CMS communities are seeing a similar rise in the demand to provide authors with flexible page-building tools. WordPress released Gutenberg, a powerful drag-and-drop editing experience that lets authors quickly build incredibly flexible pages from a massive library of components. It’s worth noting Gutenberg is not without challenges. It poses accessibility issues. Antithetical to the themes in this post, it does not necessarily produce structured content. It relies on proprietary tokens for referencing embedded blocks of content. But it is very flexible, and offers an expressive interface for authors. For Drupal users, there’s a Drupal port for Gutenberg.

For us at Aten, the balance comes back to making sure content is stored in a way that is structured, can be adaptive, is reusable, and is relatively easy to migrate. And that you, the writer, can easily build flexible web pages.

Structured and Adaptable: Drupal Paragraphs with Layout Control

We’ve been working on an approach that keeps Paragraphs in place as the primary way content is managed and stored, but also gives authors the ability to easily control layout. Using Drupal’s core Layout Discovery system, Entity Reference with Layout is a custom field type that combines layouts and Paragraphs. It’s still in very early experimental development, but we’re excited about the impact this approach might have on making it even easier to create flexible pages. And it uses Paragraphs for content storage, with the benefits we’ve already touched on: content is well-structured and relatively easy to migrate. It’s not as flexible or robust as Layout Builder, but might be a great option for authoring flexible pages with Paragraphs. (More on this in a future post.)
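For the curious, a custom field type in Drupal 8 is just a plugin. The following is a minimal sketch of that extension point, not the actual Entity Reference with Layout code; the plugin ID, property and column names are invented, but they show the core idea of storing a Paragraph reference and a layout choice side by side:

  <?php

  namespace Drupal\my_module\Plugin\Field\FieldType;

  use Drupal\Core\Field\FieldItemBase;
  use Drupal\Core\Field\FieldStorageDefinitionInterface;
  use Drupal\Core\TypedData\DataDefinition;

  /**
   * @FieldType(
   *   id = "my_reference_with_layout",
   *   label = @Translation("Reference with layout"),
   * )
   */
  class ReferenceWithLayoutItem extends FieldItemBase {

    public static function schema(FieldStorageDefinitionInterface $definition) {
      return [
        'columns' => [
          'target_id' => ['type' => 'int', 'unsigned' => TRUE], // the referenced Paragraph
          'layout' => ['type' => 'varchar', 'length' => 255],   // a Layout Discovery plugin ID
        ],
      ];
    }

    public static function propertyDefinitions(FieldStorageDefinitionInterface $definition) {
      $properties['target_id'] = DataDefinition::create('integer')
        ->setLabel(t('Paragraph ID'));
      $properties['layout'] = DataDefinition::create('string')
        ->setLabel(t('Layout plugin ID'));
      return $properties;
    }

  }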

Reusable and Flexible: Advanced Paragraphs

Since Drupal Paragraphs are themselves collections of flexible fields, there are all kinds of interesting ways they can be applied to building complex publishing features. We’re working with a client in publishing who needs the ability to completely customize the way content appears on their home page. They would like to promote existing content to the homepage, but they may want to override article titles, images, and summaries. Since the article authors aren’t the same people editing the home page and other key listing pages, the client didn’t want authors to have to think about all of those variations; the way content is presented on an article page isn’t always best suited for the homepage and other contexts. We used Paragraphs to give home page editors the ability to drop articles onto the page, with fields for overriding everything they need to.

Where to Go From Here

Your CMS should make it easy to write content and build pages. If you’re interested in seeing a demo of Drupal Paragraphs, Layout Builder, or Gutenberg, drop us a line. We’d love to help.

Feb 15 2019

The recent post on Dries’ blog about REST, JSON:API and GraphQL caused a bigger shockwave in the community than we anticipated. A lot of community members asked for our opinion, so we decided to join the conversation.

Apples and Oranges

Comparing GraphQL and JSON:API is very similar to the never-ending stream of blog posts that compare Drupal and WordPress. They simply don’t aim to do the same thing.

While REST and JSON:API are built around the HTTP architecture, GraphQL is not concerned with its transport layer. Sending a GraphQL query over HTTP is one way to use it (and, unfortunately, the one that got stuck in everybody’s minds), but by far not the only one. That is what we are trying to prove with the GraphQL Twig module, which allows you to separate your Twig templates from Drupal’s internal structures and therefore makes them easier to maintain and reuse. No HTTP requests involved. If this sparks your interest, watch our two webinars and the Drupal Europe talk on that topic.

So GraphQL is a way to provide typed, implementation-agnostic contracts between systems, and therefore achieve decoupling. But REST and JSON:API are about decoupling too, aren’t they?

What does “decoupling” mean?

The term “decoupling” has been re-purposed for content management systems that don’t necessarily generate the user-facing output themselves (in a “coupled” way) but instead allow stored information to be retrieved through an API exposed over HTTP.

So when you build a website using Drupal with its REST, JSON:API or GraphQL 3.x extension and put a React frontend on top, you achieve decoupling in terms of technologies: you swap Drupal’s rendering layer for React. This might bring performance improvements (although our friends at Lullabot showed that decoupling is not the only way to achieve those), and it allows you to implement more interactive and engaging user interfaces. But it also comes at a cost.

What you don’t achieve is decoupling, or loose coupling, in the software-architecture sense. Information in Drupal might be accessible to arbitrary clients, but those clients still have to maintain deep knowledge of Drupal’s data structures and conventions (entities, bundles, fields, relations…). You might be able to attach multiple frontends, but you will never be able to replace the Drupal backend. So you reach the same state of coupling that Drupal has had for years through its ability to run different themes at the same time.

The real purpose of GraphQL

Back when we finished the automatically generated GraphQL schema for Drupal and this huge relation graph would just pop up after you installed the module, we were very proud of ourselves. After all, anybody was able to query for any kind of entity, field, block, menu item or relation between them. And all that with autocompletion!

The harsh reality is that 99.5% of the world doesn’t care what entities, fields or blocks are. Or even worse, they have a completely different understanding of those terms. And a content management system is just one puzzle piece in our client’s business case: technology should not be the focus, it’s just a means of achieving the goal.

The real strength of GraphQL is that it allows us to adapt Drupal to the world around it, instead of having to teach everybody how Drupal thinks.

Some of you have already noticed that there is a 4.x branch of the GraphQL module lingering around, and there have been a lot of questions about what it is. This new version has been developed in parallel over the last year (mainly sponsored by our friendly neighbourhood car manufacturer, Daimler) with an emphasis on GraphQL schema definitions.

Instead of just exposing everything Drupal can offer, it allows us to craft a tailored schema that becomes the single source of truth for all information, operations, and interactions that happen within the system. This contract is not imposed by Drupal, but by the business needs that have to be met.
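As an illustration only (this uses the underlying webonyx/graphql-php library, not the Drupal module’s own API), a tailored schema speaks the language of the business domain rather than Drupal’s entities and bundles. The types below are invented for the example, and resolvers are omitted:

  <?php

  use GraphQL\Utils\BuildSchema; // from webonyx/graphql-php

  // A hand-written contract: "Recipe" and "Chef" are business concepts,
  // not Drupal entities, bundles or fields. Resolvers would map these
  // types onto whatever storage actually backs them.
  $schema = BuildSchema::build('
    type Chef {
      name: String!
    }

    type Recipe {
      title: String!
      difficulty: Int
      chef: Chef
    }

    type Query {
      recipe(id: ID!): Recipe
    }
  ');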

A bright future

So, GraphQL is not a recommendation for Drupal core. What does that mean? Not a lot, since there is not even an issue on drupal.org to pursue that. GraphQL is an advanced tool that requires a certain amount of professionalism (and budget) to reap its benefits. Drupal aims to be usable by everyone, and Drupal core should not burden itself with complexity that is not to the benefit of everyone. That’s what the contrib space is there for.

The GraphQL module is not going anywhere. Usage statistics are still climbing, and the 3.x branch will remain maintained until we can provide the same out-of-the-box experience and an upgrade path for version 4. If you have questions or opinions you would like to share, please reach out in the #graphql channel on drupal.slack.com or contact us on Twitter.
 

Feb 14 2019

A commercial came on the radio recently advertising a software application that would, basically, revolutionize data management and enable employees to be more efficient. My first thought was, “How can they possibly promise that when they don’t know their customers’ data management processes?” Then, it became clear. The business processes would have to be changed in order to accommodate the software.

Is that appropriate? Is it right that an organization should be required to change the way it conducts business in order to implement a software application? 

Sometimes yes. Sometimes no. 

The following four factors can serve as an essential guide for evaluating the need for change and ensuring a successful transition to a new solution. 

  1. Process Analysis
  2. Process Improvement
  3. Process Support
  4. Change Management

1. Process Analysis

Before committing to the latest and greatest software that promises to revolutionize the way you do business, make sure you’ve already documented current processes and have a clear understanding of what’s working well and what can be improved. Over the last few decades, several methodologies have been created and used to analyze and improve processes. Key among them: Six Sigma, Lean Manufacturing, and Total Quality Management (TQM).

How to Streamline Process Analysis

In simple terms, document the way you create, edit, and manage your data. Task-by-task, step-by-step, what does your team do to get the job done? If you see a problem with a process, keep documenting. Improvements, changes, redesigns come in the next step.

Steps to Document Processes

At the core of all process documentation is the process flowchart: an illustration with various boxes and decision diamonds connected with arrows from one step to the next. Collect as much data and as many contingencies as possible to make this flowchart as meaningful and robust as possible.

Diving Deeper

Integrated Definition (IDEF) methods look at more than just the steps taken to complete a task within a process. IDEF takes the following into account:

  • Input - What is needed to complete the steps in the task? Understanding this can reveal deficiencies and flaws that stand in the way of effectively completing a particular step. 
  • Output - What does the task produce? Is the outcome what it needs to be?
  • Mechanisms - What tools are needed to perform the steps in the task? Can new technology make a difference?
  • Controls - What ensures that the steps within the task are being performed correctly?

Swimlane or Functional Flowcharts

In addition to the four items from IDEF data, knowing who performs a task is also important. As you draw the boxes and connect them with arrows, place them in a lane that represents the “who” of the process. Knowing who does what can reveal possible over-tasking and/or bottleneck issues in production and efficiency.

Getting Started

Whiteboards and Post-it notes can be an easy place to start process documentation. With today’s high-definition smartphone cameras, you can sketch flowcharts, take a picture, and then share it with stakeholders for their review and input. Once you have collected all relevant information, tools such as Visio or Lucidchart can make complicated process flows easier to create and visualize.

2. Process Improvement

When you know how your current processes are performed, you can start to make improvements, whether incremental or a complete redesign (a.k.a. reengineering).

Low-hanging Fruit

Some needed improvements will likely be obvious. For example, someone on the process team says, “I didn’t know that you were doing that. I do it, too.” Duplication of effort doesn’t require metrics to be collected to determine that steps can be taken to enhance efficiency.

However, most processes require some form of metric collection to determine where improvements can be made. Metrics can include research costs, development time frames, quality assurance oversights, or frequency of downtime.

Knowing What to Change

If the needed improvement isn’t obvious, metrics can help guide the decision-making process. For example, how much time does it take to complete a process today? Could that be improved? Brainstorm ideas on how to shorten the time. Test the change and remeasure. 

Too often, data is collected with the intent of measuring the success of the process, but the data being collected does not reflect the objectives. For example, if the goal of a website or a particular page is to teach, metrics concerning the number of page visits do not indicate the degree to which that goal is being achieved. A more telling metric would be the amount of time a visitor spends on the lesson pages. Even more revealing would be the number of quizzes passed after a visitor engaged with the lessons.

Before choosing metrics to be collected, understand your goal, determine the level of improvement you want to see, then start measuring that which will actually inform your goal.

3. Process Support

Key to process improvement is an analysis of how technology can enhance productivity and efficiency.

For example: “We can cut three days’ worth of effort for three employees if we integrate the XYZ software into our process.” This kind of statement sounds like a worthy goal, assuming the cost of the transition doesn’t exceed the long-term savings it can help realize.

Calculating the Cost of Change

Open source software applications start out with a great price tag: Free. However, unless you use the application as is, there is always a cost associated with turning it into what you need. There are many more factors to consider than the upfront cost of the software. 

Costs can include:

  • Training on new processes
  • Training on the new software application acquired to support the new process
  • A drop in productivity and efficiency until the new process/application is adopted and accepted by your staff
  • Technology needed to support the new application

Keep in mind: the list of ancillary costs involved in the transition is unlikely to appear at the top of the salesperson’s brochure.

Return on Investment

ROI assessments will vary. Two values are needed to make this computation: cost and benefits. 

Once you've computed the short-term and long-term costs, you need to determine the benefits you gain. In a situation where investment directly translates into profit, this calculation can be straightforward.

However, sometimes the benefits are cost savings: “We used to spend this. Now we spend that.” When this is the case, instead of a cost/benefit ratio, a break-even calculation might be required to determine how long it will take to recoup costs. For example, if a $60,000 transition saves $5,000 a month, it breaks even after a year.

The most complicated analysis will result from benefits such as an increase in customer satisfaction in which the benefit does not have a direct monetary value.

4. Change Management

When a team cannot see the benefit or resists change, new initiatives face an uphill battle. That’s why circling back to the process analysis phase and ensuring the buy-in of those who are being counted on to use the new application is a critical success factor.

Keep in mind that employees may not have access to the big-picture business goals that management sees, but change has the greatest chance of being realized if those who will be required to support it are informed as to why it needs to happen and included, at some level, in the decision-making process. 

Indeed, change management doesn’t start when the new application is ready to be implemented. Change management starts when:

  • All the costs of change are considered;
  • The full scope of process changes are identified;
  • Training is planned and delivered; and 
  • The campaign for change acceptance is designed and initiated.

Conclusion

You might be the boss. You might believe that the software application you just discovered is exactly what your company or your department needs. As much as that matters, you need buy-in. You need the support of the people who make your business possible. You also need to engage in a disciplined analysis of the processes that will be impacted, along with the anticipated improvements and the level of support that will be deployed.

From ensuring that clients’ entire online presence is compliant with ADA accessibility guidelines to web solutions that optimize impact in the ever-evolving digital environment, we at Promet Source serve as a powerful source of transformative digital possibilities. Contact us today to learn what we can do for you.

Feb 14 2019

Communication is the heart of all human interactions, and the media is like the blood pumping all the necessary ideas and expressions through it.

Media provides the essential link between the individual and the demands of the technological society.
-Jacques Ellul

We as individuals view hundreds of advertisements each day, digging through our phones with our eyes glued to those tabs. People like us have produced a substantial rise in marketing tactics.

Marketing tactics such as social media, video, search engine optimization, mobile paid media, and email marketing have stimulated the need for good-quality content. What our minds decide to pay attention to depends on how interesting and compelling the advertisement or piece of content is.

Image: a blue cloud labeled “content” showering rain on a crowd of Lego figures standing beneath it

It is necessary for organizations to understand their target personas and serve up content that will bust through the clutter and hit home with their customers.

Drupal’s media handling can serve this task beautifully and can do almost anything by gracefully blending digital assets into your content.

You ask how?

I say - let’s find out!

The Evolution of Media Management in Drupal 8 

Drupal 8 version | When was it introduced? | What was offered?
Drupal 8.2 | 5 October 2016 | Basic out-of-the-box media handling
Drupal 8.3 | 6 April 2017 | Enhanced media handling in Drupal 8; migration of Drupal 7 file entities to Drupal 8 media entities
Drupal 8.4 | 4 October 2017 | A new Media API in core; for site builders, Drupal 8.4 ships with the new Media module (a base media entity)
Drupal 8.5 | 7 March 2018 | Support for remote video using the oEmbed format
Drupal 8.6 | 5 September 2018 | For content creators, richer image and media integration and digital asset management

Media Type and Best Solutions to Handle Them

Media types, as we know, are generally categorized by the kind of data they carry: an application, audio content, an image, a text message, a video stream and so on. A media type tells an application what kind of processing the content needs. Media types like pictures, graphics, icons and video are handled beautifully with the help of Drupal modules.

Media types can be handled with the help of some practices :

  • Media Module Maintenance 

Module maintenance in Drupal is achieved with the help of distinct features and functionalities. The status report screen (which checks for a variety of issues), cron (which automates a website’s recurring tasks), caching and deployment are some of the pieces of the whole module-maintenance picture.

The Media module provides a “base” entity for assets. This allows users to standardize their Drupal site’s interactions with media resources. Local files, images, YouTube videos and tweets can all be handled with the help of the Media module.

  • Building Distributions 

Setting up a Drupal site typically means downloading and configuring various contributed modules (media and non-media). To make the whole process easier, there are a variety of “pre-configured” versions of Drupal that can be downloaded and used for a specific kind of site. These pre-configured versions of Drupal are called distributions. With these full-featured distributions you can easily and quickly set up a site for a specialized purpose.

  • Site Building 

Drupal 8 comes with the most popular text editor and image uploader modules. Together they provide users with basic HTML controls and the power to edit content. Modules like Paragraphs grant the user a cleaner data structure, and the Environment Indicator module helps keep the scope for mistakes next to null.

  • Custom Development 

Drupal is a collection of modules and distributions. With more and more organizations looking to build engaging digital experiences for their stakeholders, the Drupal CMS has seen custom development flourish on its platform. Each version brings significant changes in modules that improve the user experience and the efficiency of the website.

Media expectations of content authors and site builders

The State of Drupal 2016 survey, in which 2,900 people participated, surfaced the top two most requested features for the content creator persona.

The two most requested features were

  • Richer media 
  • Media integration

Thus, the “media initiative” for Drupal 8 was introduced to provide extensible base functionality: support in core for reusable assets, media browsing and remote video, with further extensibility left to contributed modules.

In Drupal 7 the Media module was jam-packed with functionality. In Drupal 8 it has been recreated and split into separate modules. The three major modules which handle media entities beautifully are:

Media Entity 

To store media assets, the Media Entity module was introduced. It provides a base entity for media, a very basic entity which can refer to all kinds of media objects. Media Entity thereby establishes the relation between Drupal and a media resource.

Entity Embed

The Entity Embed module brings WYSIWYG embed support (within the text area) to Drupal 8, building on the core Editor and Filter modules. It allows a site builder to construct a button that lets an editor embed entities inside the text area, hence the name “entity embed”.

Entity Browser

The Entity Browser module provides flexible and generic entity browsing and selection tools. It can be used in any context where one needs to select a number of entities and do something with them. The Inline Entity Form module also provides integration with media.

Site builders want every type of media usage to be easily tracked and presented to them. These three modules help them achieve that.

Third-party integrations for media solutions

DAM (Digital Asset Management)

A digital asset is any text or media that is formatted into a binary source and includes the right to use it; digital files that do not include this right are not considered digital assets. Digital assets are categorized into images and multimedia, called media assets, and textual content, and the management of these types of assets is known as digital asset management. Modules like Acquia DAM, the Bynder integration module, EMBridge, S3 File Sync, QBank, Asset Bank, MediaValet and Elvis contribute to the integration of DAM and Drupal media.

Image of a laptop with a chart picture on the screen and five sub-images are connected around it via dots

CDN (content delivery network)

A CDN is a globally distributed network of proxy servers. It offloads static assets like images, videos, CSS and JS files.

CDNs like Cloudflare offer image compression and are great for content delivery services. A CDN provides several advantages over serving the traffic directly:

  1. Assets can be cached in a proxy which is geographically closer to the end users, which usually leads to higher download speeds.
  2. Each page response is shared between the origin server and the CDN.
  3. Some CDNs provide a page optimization service which further enhances performance and the user experience.

To make the integration easier, Drupal has a CDN module that helps speed up the process and make it more agile.

Image: a cloud labeled CDN, with three nodes connected to it by dotted lines on each side


External Storage 

It is not uncommon for large files and folders to get in the way of website speed. Large files are usually not cached, which makes every request load the website slowly. Drupal modules like S3 File System, Storage API and AmazonS3 contribute highly to integrating external storage. These modules manage storage and files through their APIs, providing an additional file system for your Drupal sites.

Image: a silver cylinder with two back-and-forth arrows pointing at file icons

Infrastructure 

One of the most prominent examples of infrastructure integration is Cloudflare, one of the biggest networks operating on the Internet. People use Cloudflare services to increase the security and performance of their websites.

The number of different solutions implemented at customers’ facilities is rather large today. Often the subsystems of a seemingly unified IT landscape are either loosely connected to each other, or the interaction between them is ensured by file and data transfer via e-mail or from hand to hand.

When content becomes media 

Content on your website starts acting like media because, let’s face it, the content repository (the digital content stored in your database) is an associated set of data management, search and access methods that allow content to be accessed. It includes:

Content Pooling 

Content pooling involves storing learning material in the form of objects and meta-data, as well as the relations between them. It is the grouping of resources (assets, resources etc.) for the purpose of maximizing profit and minimizing risk.

Content Syndication 

Content syndication is about building up a suite of Drupal sites that consume content from a central Drupal source. The CMS provides a number of splendid tools to simplify content creation and moderation: users can create content once and make it available everywhere, pushing content and media items from any site for publication on any targeted remote site.

Deploy

This Drupal 8 module allows the user to easily stage and preview content across Drupal sites. It automatically manages dependencies between entities and is specially designed with a rich API which can easily be extended.

Contenta CMS 

The main agenda of Contenta CMS is to make content happy. It is a community-driven, API-first distribution for Drupal 8 which provides users with a standard platform along with the content. Contenta CMS is all about easing the pain of its users: it is used to build decoupled applications and websites.

Beyond version 8 

Drupal 8 was launched without support for a media library, so adding one is now planned. The developers are currently working on a media library for Drupal 8 so that content authors can select pre-existing media from a library and easily embed it in their posts. Once the media library becomes stable, the old file upload functionality can be deprecated and the new media library made the default experience.

Instead of being developed as a separate code base, Drupal 9 is being built in Drupal 8, which means new functionality is added as backward-compatible code alongside experimental features. For contributed module authors, this work on compatibility means that users can already update their media modules for the new media library before the release of Drupal 9.

Image: a timeline marking 2019, 2020 and 2021, with Drupal versions written along the top. Source: Dries Buytaert’s blog

Conclusion

As the world gets more and more involved with media, the need to handle it well has become really important.

Media is important because it allows people to transmit information to a larger audience, over a greater length of time. The importance of the media today is immense. Never before in mankind’s history has the media had such an influence.

Yes, Drupal has come a long way in this sector. Contact us on [email protected] to know more about the media handling in your Drupal sites and the services which are provided by us.

Feb 14 2019
Fixing SEMRush-reported duplicate content rules through Metatag, Views and Drupal tokens.
Feb 14 2019

The trend of using JavaScript frameworks with Drupal keeps gaining popularity. They are used for creating rich, fast, and interactive web interfaces. One of the hot use areas is the decoupled (headless) Drupal 8 architecture, with the front-end part completely entrusted to a JS framework. There are many JS frameworks and libraries that pair well with Drupal 8: React, Angular, Gatsby, Ember, Elm etc. Today, we will review one of them, Vue.js. Let’s take a closer look at the Drupal 8 and Vue.js combination.

Vue.js: candidate for Drupal core

Adopting a JavaScript framework in Drupal core would improve the admin UX; that’s what Drupal creator Dries Buytaert said at DrupalCon Vienna in 2017. Core committers and developers with the relevant experience agreed they needed to think it over and choose the optimal JavaScript framework.

The main proposal was to choose React. However, there soon emerged an idea to consider another candidate for Drupal core — Vue.js. Let’s take a closer look at this rising star.

Why Vue.js? The framework and its benefits

Vue.js is a progressive open-source JavaScript framework for building interfaces and single-page applications. It is often called a library, though the official wording on the Vue.js website is “framework”.

Vue.js was created by then-Google employee Evan You while he was working with Angular. Evan says he wanted to take the things he liked about Angular and create “something really lightweight.”

Since its creation in 2014, Vue.js has reached 127,500+ stars on GitHub and recently overtook its counterpart React (122,000+ stars). The top 3 trending JS tools are completed by Angular (59,300+ stars). As we can see, Vue demonstrates the most rapid growth.

Image: Vue, React and Angular star ratings

The number of NPM downloads for Vue is 796,700+ each week. The growing popularity of Vue.js is explained by its benefits:

  • lightweight and easy to learn for new developers
  • clear and detailed documentation
  • adoption within the PHP community (for example, Laravel framework has an out-of-box Vue support)
  • active and helpful Vue.js community
  • used by giants (Xiaomi, Netflix, Alibaba, Adobe, GitLab, partly Facebook, WizzAir, EuroNews, Grammarly, Laracasts, Behance, Codeship, Reuters etc.)
  • high performance due to two-way data binding and virtual DOM (like in Angular and React)
  • even large apps can be built from self-contained and often reusable Vue components

Drupal 8 and Vue.js combination

With the Drupal 8 and Vue.js combination, developers can enrich Drupal interfaces with reactive features without jQuery, use ready-made Vue components, or build single-page applications that consume Drupal 8 data.

On the Drupal side, we need to prepare the content for REST export. For example, we can create a View, click “provide a REST export” in the View settings, and add a path. This works when the web services set of modules is enabled in Drupal core.

To set up a Vue.js application, we need to make sure we have Node and NPM installed, and then use the Vue CLI 3 tool. It does a lot of work for developers, which speeds up the process. It automatically plugs in the selected libraries, configures Webpack to optimize all files, provides live app updates every time you save changes, and so on. Finally, it offers a graphical user interface, the Vue Project Manager.

The commands for starting a Vue.js application are:

  • npm install -g @vue/cli
  • vue init webpack projectname
  • cd projectname
  • npm install
  • npm run dev
Image: a Vue.js application

Then the Vue router and Vue resource packages need to be added to the project. The main.js file is then configured to use them, the app.js is modified to work with the router, and app components and routes are set up.

Drupal 8 modules for working with Vue.js

There is an ecosystem of contributed Drupal modules for the Drupal 8 and Vue.js combination:

  • The Vue.js module helps connect Drupal and Vue.js. It can be used in a Twig template, attached programmatically, or added as a dependency to a YAML file.
  • Vue Views adds a new format to Drupal views so the results are displayed in Vue.js, making it possible to use Vue features like asynchronous loading.
  • Webpack Vuejs provides Vue configuration for the Webpack Bundler Drupal module, which helps developers bundle Drupal libraries with Webpack.
  • Decoupled blocks: Vue.js is the Vue.js version of the Decoupled Blocks module, which adds Vue.js blocks to websites via modules or themes.

Let’s combine Drupal 8 and Vue.js for you

We have taken a glimpse at the Drupal 8 and Vue.js combination. Our Drupal team is always aware of the latest JavaScript technologies. Contact us and we will help you choose the optimal one for you and bring all your ideas to life.

Feb 14 2019

Our Glazed framework theme allows users to have control over every aspect of a Drupal site: from typography to colors, grid design and navigation. Combine this with our Drag and Drop builder and everything you need on a professional website can be designed and developed directly in the browser. This empowers your marketing and design staff to work efficiently without incurring heavy IT costs. 

When you take a look at Drupal competitors such as WordPress and cloud-based solutions like Squarespace, one of the main reasons they successfully skyrocketed in the web development industry is the simple front-end editing experience and the value this experience brings to users. Glazed Builder brings this modern site-building experience to the Drupal world by combining the power and unique aspects of Drupal with the simplicity and intuitiveness of drag-and-drop technology.

Glazed Builder is different from Wix, Squarespace, or any other Drag and Drop builders: it's made for Drupal and deeply integrated with Drupal APIs. It acts as a Drupal field formatter and you can have multiple instances per page, for your footer, main content, blocks, and even custom entity types. It automatically understands Drupal's revision system, language systems, workflow states, and permissions. This makes it one of the most advanced visual page builders in the world from a website architecture perspective.

How Sooperthemes products create a better Drupal experience 

Drag and Drop tools have evolved to be more powerful, produce better code, and leverage frontend frameworks to create a fluent authoring experience that runs in the browser. In Glazed Builder this experience is integrated with Views and the block systems: you can create highly dynamic pages and even dashboards with Drag and Drop, without losing reusability of the components you build. It is available for both Drupal 8 and Drupal 7, and provides the tools to easily perform difficult customization tasks. It lets the user focus on creating value for the customers and leave the technical aspects behind. It's intuitive and easy to use out of the box. 

Adding Glazed Builder on top of your existing Drupal 8 stack

Feb 14 2019

How do you run a page speed audit from a user experience standpoint? For, let's face it: website performance is user experience! 

What are the easiest and most effective ways to measure your Drupal website's performance? What auditing tools should you be using? How do you identify the critical metrics to focus your audit on?

And, once identified, how do you turn the collected data into speed optimization decisions? Into targeted performance improvement solutions...

Also, how fast is “ideally fast”, in the context of your competition's highest scores and of your users' expectations?

Here are the easiest steps of an effective page performance audit, with a focus on the prompt actions you could take for improving it.
 

1. Front-End vs Back-End Performance

They both have their share of impact on the overall user experience:

Long response times will discourage, frustrate and eventually get your website visitors to switch over to your competition.
 

Front-End Performance 

It's made of all those elements included in the page loading process, as executed by the browser: images, HTML, CSS and JavaScript files, third-party scripts...

The whole process boils down to:

Downloading all these elements and putting them together to render the web page that the user will interact with.
 

Back-End Performance 

It covers all those operations performed by your server to build page content.

And here, the key metric to measure is TTFB (Time To First Byte).

It's made of 3 main elements:
 

  1. connection latency
  2. connection speed
  3. the time needed for the server to render and serve the content
     

2. What Should You Measure More Precisely? 5 Most Important Metrics

What metrics should you focus your page speed audit on? Here's a list of the 5 most relevant ones:
 

a. Speed index

The essential indicator that will help you determine the perceived performance on your Drupal website:

How long does it take for the content within the viewport to become fully visible?

When it comes to optimization techniques targeting the speed index, “battle-tested” techniques, like lazyloading and Critical CSS, are still the most effective ones.
 

b. Time to first byte

As previously mentioned, the back-end performance:

Measures the time passed from the user's HTTP request to the first byte of the page being received by the browser.
 

c. Start render

The time required to render the first non-white content in the client's browser.

Note: the subsequent requests are “the usual suspects” here, so you'd better ask yourself how you can reduce, defer or relocate them. Maybe you'd consider a service worker?
 

d. Load time

How long does it take for the browser to trigger the window load event? For the content on the requested page to get loaded?

Note: consider enabling HTTP/2, with a dramatic impact on individual page requests.
 

e. Fully loaded

It measures the time until... zero network activity, when even the JavaScript files have all been fired.

Note: make sure your third-party scripts are “tamed” enough. They're the main “responsible” factors for high fully loaded times.
 

3. How to Perform a Page Speed Audit: 5 Useful Tools

Now that you know what precisely to analyze and evaluate, the next question is:

“How do I measure these key metrics?”

And here are some of the easiest to use and most effective tools to rely on when running your page performance audit:
 

Use them to:
 

  • collect your valuable data on all the above-mentioned metrics
  • get an insight into the page rendering process performed by the browser
  • identify the “sore spots” to work on
  • automate repeated page speed tests
  • keep monitoring your website (SpeedCurve) across multiple devices and in relation to your competition's performance 
  • get a peek into your web page's structure and into the process of downloading resources over the network (Chrome DevTools)


4. 3 Key Benchmarks to Evaluate Your Website's Performance

So, now that you've got your “target metrics” and your toolbox ready, you wonder: 

“What should I measure those metrics against?”

And here, there are 3 major benchmark-setting processes to include in your page speed audit:
 

  • determine your competition: your current production site before its rebuild, your direct and indirect “rivaling” websites
  • determine the key pages on your Drupal website: homepage, product listing page, product detail page etc.
  • measure your competition's web page performance
     

5. Most Common Performance Bottlenecks & Handiest Solutions

Here are the most “predictable” culprits that you'll identify during your page speed audit, along with the targeted performance-boosting measures to take:
 

Factors Impacting the Front-End Performance & Solutions

a. Too many embedded resources

Too many embedded stylesheets, JavaScript and images are an extra challenge for your page loading time. They'll just block the rendering of the page.

Each connection setup, DNS lookup and queuing translates into... overhead, with an impact on your site's perceived performance.

The solution: consider caching, minification, aggregation, compression... A sketch of the Drupal-specific settings follows.
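
For Drupal 8 specifically, the aggregation toggles live in the system.performance configuration, so one minimal sketch (assuming a standard settings.php) is to pin them on in code:

  <?php

  // settings.php: force core CSS/JS aggregation and page caching on,
  // regardless of what is saved through the admin UI.
  $config['system.performance']['css']['preprocess'] = TRUE;
  $config['system.performance']['js']['preprocess'] = TRUE;
  $config['system.performance']['cache']['page']['max_age'] = 3600; // one hour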
 

b. Oversized files

And images (along with stylesheets and JavaScript) sure are the main “culprits” for long response times on any Drupal website.

The solution: consider aggregating/compressing them, turning on caching, lazyloading, resizing etc.
 

c. Wrongly configured cache

Is your website properly cached? Have you optimized your cache settings? Or is it possible that your Drupal cache could be broken? 

If so, then it will have no power to reduce latency or to eliminate unnecessary rendering.

The solution: look into your response headers, URL/pattern configuration, expiration and fix the problem cache settings.
 

d. Non-optimized fonts

Your heavy fonts, too, might play their part in dragging down your website.

The solution: consider caching, delivery, compression, and character sets.

In conclusion: do re-evaluate all your modal windows, third-party scripts and image carousels. Is their positive impact on the user experience worth the price you pay: a negative impact on your page loading time?        

Word of caution on caching:

Mind you don't overuse caching as a performance boosting technique. If there are serious back-end performance issues on your website, address them; using caching as the solution to mask them is not the answer. It works as a performance improvement technique on already working systems only.
 

Factors Impacting the Back-End Performance & Solutions

And there are some handy, straightforward measures that you could take for addressing back-end performance issues, as well:

  • Consider optimizing the page rendering process directly in the CMS.
     
  • Upgrade your server's hardware infrastructure (e.g. load balancing, RAM, disk space, MySQL tuning, moving to PHP7).
     
  • Keep the number of redirects to a minimum (since each one of them would only add another round trip, which bubbles up to the TTFB).
     
  • Reconfigure those software components that are lower in the server stack (caching back-end, application container).
     
  • Consider using a CDN; it won't serve everything, plus it will lower the distance of a round trip.
     
  • Consider using Redis, Varnish.
     

6. Final Word(s) of Advice

Here are some... extra tips, if you wish, to perform a page speed audit easily and effectively:
 

  • remember to run your audit both before and after you apply your targeted performance-improving techniques
  • grow a habit of tracking weekly results
  • define the goal of your Drupal website performance test: what precisely should it test and measure and under which circumstances?
  • … for instance, you could be analyzing your site's back-end performance only: the time requested for generating the most popular web pages, under a load of 700 concurrent visitors, let's say (so, you won't be testing connection speed or the browser rendering process)
  • pay great attention to the way you configure your page speed audit system if you aim for accurate and representative results
     

The END!

This is the easy, yet effective way of performing a website speed and optimization audit. What other tools and techniques have you been using so far?

Photo by Veri Ivanova on Unsplash

Feb 14 2019

I've been thinking about the performance of my site and how it affects the user experience. There are real, ethical concerns to poor web performance. These include accessibility, inclusion, waste and environmental concerns.

A faster site is more accessible, and therefore more inclusive for people visiting from a mobile device, or from areas in the world with slow or expensive internet.

For those reasons, I decided to see if I could improve the performance of my site. I used the excellent https://webpagetest.org to benchmark a simple blog post https://dri.es/relentlessly-eliminating-barriers-to-growth.

A diagram that shows page load times for dri.es before making performance improvements

The image above shows that it took a browser 0.722 seconds to download and render the page (see blue vertical line):

  • The first 210 milliseconds are used to set up the connection, which includes the DNS lookup, TCP handshake and the SSL negotiation.
  • The next 260 milliseconds (from 0.21 seconds to 0.47 seconds) are spent downloading the rendered HTML file, two CSS files and one JavaScript file.
  • After everything is downloaded, the final 330 milliseconds (from 0.475 seconds to 0.8 seconds) are used to lay out the page, execute the JavaScript code and download the favicon.

By most standards, 0.722 seconds is pretty fast. In fact, according to HTTP Archive, it takes more than 2.4 seconds to download and render the average web page on a desktop.

Regardless, I noticed that the length of the horizontal green bars and the horizontal yellow bar was relatively long compared to that of the blue bar. In other words, a lot of time is spent downloading JavaScript (yellow horizontal bar) and CSS (two green horizontal bars) instead of the HTML, including the actual content of the blog post (blue bar).

To fix, I did two things:

  1. Use vanilla JavaScript. I replaced my jQuery-based JavaScript with vanilla JavaScript. Without impacting the functionality of my site, the amount of JavaScript went from almost 45 KB to 699 bytes, a reduction of more than 98 percent.
  2. Conditionally include CSS. For example, I use Prism.js for syntax highlighting code snippets in blog posts. prism.css was downloaded for every page request, even when there were no code snippets to highlight. Using Drupal's render system, it's easy to conditionally include CSS. By taking advantage of that, I was able to reduce the amount of CSS downloaded by 90 percent, from 4.7 KB to 2.5 KB. (A sketch of that pattern follows this list.)
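
Here is a minimal sketch of that conditional-attach pattern using a theme preprocess hook; the theme name, library name and the "<code" check are placeholders, and in a real theme the library itself would be defined in a *.libraries.yml file:

  <?php

  // mytheme.theme: attach the syntax-highlighting library only when the
  // body actually contains a code snippet to highlight.
  function mytheme_preprocess_node(array &$variables) {
    $node = $variables['node'];
    if (strpos($node->get('body')->value, '<code') !== FALSE) {
      $variables['#attached']['library'][] = 'mytheme/prism';
    }
  }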

According to the January 1st, 2019 run of HTTP Archive, the median page requires 396 KB of JavaScript and 60 KB of CSS. I'm proud that my site is well under these medians.

File type | Dri.es before | Dri.es after | World-wide median
JavaScript | 45 KB | 699 bytes | 396 KB
CSS | 4.7 KB | 2.5 KB | 60 KB

Because the new JavaScript and CSS files are significantly smaller, it takes the browser less time to download, parse and render them. As a result, the same blog post is now available in 0.465 seconds instead of 0.722 seconds, or 35% faster.

After a new https://webpagetest.org test run, you can clearly see that the bars for the CSS and JavaScript files became visually shorter:

A diagram that shows page load times for dri.es after making performance improvements

To optimize the user experience of my site, I want it to be fast. I hope that others will see that bloated websites can come at a great cost, and will consider using tools like https://webpagetest.org to make their sites more performant.

I'll keep working on making my website even faster. As a next step, I plan to make pages with images faster by using lazy image loading.


Feb 13 2019

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

An abstract image of three boxes

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or following a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the overview below, and we discuss each of these four categories in more depth afterwards. If you aggregate the ratings, you see that we rank JSON:API above GraphQL and GraphQL above REST for Drupal core's needs.

Request efficiency
  • REST: Poor; multiple requests are needed to satisfy common needs, and responses are bloated.
  • JSON:API: Excellent; a single request is usually sufficient for most needs, and responses can be tailored to return only what is required.
  • GraphQL: Excellent; a single request is usually sufficient for most needs, and responses include exactly what was requested.

Documentation, API explorability and schema
  • REST: Poor; no schema, not explorable.
  • JSON:API: Acceptable; generic schema only, but links and error messages are self-documenting.
  • GraphQL: Excellent; precise schema, with excellent tooling for exploration and documentation.

Operational simplicity
  • REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
  • JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
  • GraphQL: Poor; extra infrastructure is often necessary, client-side libraries are a practical necessity, and specific patterns are required to benefit from CDNs and browser caches.

Writing data
  • REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
  • JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification.
  • GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it is possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article responses (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.
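For example (a sketch using curl; the JSON:API paths assume Drupal's implementation, where "uid" is the author relationship, and the GraphQL query shape is purely illustrative since every GraphQL schema defines its own):

# JSON:API: one request returns a set of articles with each author
# embedded, via the spec's "include" parameter.
curl 'https://example.com/jsonapi/node/article?include=uid'

# GraphQL: one request, shaped entirely by the query document.
curl -X POST 'https://example.com/graphql' \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ articles { title author { name } } }"}'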

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data, but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's field selections; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, sparse fieldsets can be omitted so that the request remains cacheable.
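For example, the sketch below asks for only a few fields on each article and its author; drop the fields parameters and the same URL returns full resources. The resource type names (node--article, user--user) assume the Drupal JSON:API module's defaults.

curl 'https://example.com/jsonapi/node/article?include=uid&fields[node--article]=title,created,uid&fields[user--user]=name'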

Multiple data objects in a single response
  • REST: Usually; but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, but developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; only if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links rarely available.
  • JSON:API: Acceptable; observing available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocomplete, instant results or compilation errors, and complete, contextual documentation.

Validatable and programmable schema
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching the way you can with REST or JSON:API unless you use persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and they also mean the architecture is no longer fully decoupled: if a client wants to retrieve different data, server-side changes are required.
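A minimal sketch of what such a lookup can look like (endpoint and parameter names vary by server and are illustrative here):

# The server already knows the query stored under "articleTeasers";
# the variables parameter is URL-encoded JSON ({"limit":10}).
curl 'https://example.com/graphql?queryId=articleTeasers&variables=%7B%22limit%22%3A10%7D'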

Scalability: additional infrastructure requirements
  • REST: Excellent; same as a regular website (Varnish, CDN, etc.).
  • JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
  • GraphQL: Usually poor; only the simplest queries can use GET requests; to reap the full benefit of GraphQL, servers need their own tooling.

Tooling ecosystem
  • REST: Acceptable; lots of developer tools available, but for the best experience they need to be customized for the implementation.
  • JSON:API: Excellent; lots of developer tools available; tools don't need to be implementation-specific.
  • GraphQL: Excellent; lots of developer tools available; tools don't need to be implementation-specific.

Typical points of failure
  • REST: Fewer; server, client.
  • JSON:API: Fewer; server, client.
  • GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of using the GET HTTP method, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better, generic tooling and less time spent on server-side details.
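For example, updating an article's title over JSON:API looks just like reading it, with only the method and body changed (a sketch assuming Drupal's JSON:API module, which addresses resources by UUID; authentication is omitted):

curl -X PATCH 'https://example.com/jsonapi/node/article/ARTICLE-UUID' \
  -H 'Content-Type: application/vnd.api+json' \
  -d '{"data": {"type": "node--article", "id": "ARTICLE-UUID", "attributes": {"title": "Updated title"}}}'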

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling writes to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.
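By way of contrast, here is what a GraphQL write might look like; the createArticle mutation and its input shape are purely illustrative, precisely because each GraphQL server defines its own:

curl -X POST 'https://example.com/graphql' \
  -H 'Content-Type: application/json' \
  -d '{"query": "mutation { createArticle(input: {title: \"Hello\"}) { article { id } } }"}'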

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests: one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

Writing data
  • REST: Acceptable; every implementation is different. No bulk support.
  • JSON:API: Excellent; JSON:API prescribes a complete solution for handling writes. Bulk operations are coming soon.
  • GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement. There are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
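For example, with Drupal's JSON:API module a consumer can assemble its own listing of the ten newest published articles with no server-side configuration; filter, sort and page are all defined by the JSON:API specification:

curl 'https://example.com/jsonapi/node/article?filter[status]=1&sort=-created&page[limit]=10'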

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

Ease of installation and configuration
  • REST: Poor; requires the contributed REST UI module; easy to break clients by changing configuration.
  • JSON:API: Excellent; zero configuration!
  • GraphQL: Poor; more complex to use; may require additional permissions, configuration or custom code.

Automatically generated documentation
  • REST: Acceptable; requires the contributed OpenAPI module.
  • JSON:API: Acceptable; requires the contributed OpenAPI module.
  • GraphQL: Excellent; GraphQL Voyager included.

Security: content-level access control (entity and field access)
  • REST: Excellent; content-level access control respected.
  • JSON:API: Excellent; content-level access control respected, even in queries.
  • GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.

Decoupled filtering (client can craft queries without server-side intervention)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, for Drupal core's needs, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. That said, Drupal's GraphQL implementation is fantastic, especially when you have the developer capacity to build a bespoke API for your project.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.


Feb 13 2019
Feb 13

Both Drush and Console have built-in help. If you type drush, you will get a long list of available commands. If you type drupal, again you will get a long list of available console commands. If you can’t remember the exact command to use to do what you need, you can scroll through the long list of commands to find the one you want.

But scrolling through the long list can be a real pain. Wouldn’t it be easier to see commands that relate to a particular activity without all the other commands in the same list?

Target the Drush or Console commands by keyword

Fortunately, you can isolate what you need by piping (|) the output of the drush or drupal command through the grep command. This allows you to filter the results by a keyword.

Here are a couple of examples.

Show watchdog Drush commands

You can use the Drush watchdog command to list messages in the database log, delete records and more.

drush | grep watchdog

This will return the following watchdog commands:

Drush grep watchdog
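On a typical Drush 8 installation, the filtered output looks something like this (exact command names and aliases vary by version):

 watchdog-delete (wd-del, wd)  Delete watchdog log records from the database.
 watchdog-list (wd-list)       Interactively filter the watchdog message listing.
 watchdog-show (wd-show, ws)   Show watchdog messages.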

Show theme related Drupal console commands

You can use Drupal console to generate a new theme, download a theme, uninstall a theme and more.

drupal | grep theme

This will return the following:

drupal grep theme

As you can see, this is a list of Drupal console theme commands.

Show generate console commands

Drupal console comes with a set of generate commands that you can use to generate the module controller, routes, menus and blocks for a custom module.

drupal | grep generate

This will return the following list of generate commands:

drupal grep generate

There are more generate commands than the above graphic shows. You can filter the list down even further by making your search more specific. For example, if you need to generate a new plugin, you can search for generate:plugin with:

drupal | grep generate:plugin

This will return the following list of generate commands:

drupal grep generate plugin

The solution explained

What is grep?

Grep is a Unix command that searches its input and returns only the lines that match a given pattern.

What is pipe?

The pipe operator allows you to connect commands together: the output of the first command is passed to the second command as its input.

This means the following:

  • Running the drush or drupal command returns the full list of available commands, with a brief description of how to use each one
  • Adding the pipe (|) after the drush or drupal command connects a second command
  • Adding grep and a keyword after the pipe passes the output of the drush or drupal command as input to grep. In other words, grep searches the full list and returns just the lines that match the keyword (see the example below)
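For example:

# The full drush output is piped through grep, which keeps only the
# matching lines; pipes can even be chained to narrow the results further:
drush | grep watchdog | grep delete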
Feb 13 2019
Feb 13

The events that every year bring us together all over the World are the lifeblood of our vibrant and diverse community. From BADCamp in California, USA, through Global Training Days in Omsk, Russia to Contribution Weekends in Leeds, UK, we have a combined wealth of knowledge and experiences but also challenges and opportunities.

At the Drupal Association, we value highly the commitment made by those who volunteer their time to make these events happen. If I wasn’t British, and therefore terribly understated, at this point I would probably say “You rock!”

As an event organiser, I wanted to take the opportunity to bring to your attention a few things happening in the community that we hope will help you in your efforts.

The Event Organizers’ Group

We were very pleased to see the creation of a growing group of event organizers forming and beginning to have regular meetings to share experiences and resources. One of its founders, Kaleem Clarkson, has blogged about this and their plans for the future.

To help with their formation, we helped them become one of the first groups to create one of the new Community Group Sections at Drupal.org/community/event-organizers.

One thing that is super important, though, is that this group has representation by event organizers from all around the World. Wherever you live, do join this group, make your voice heard and ensure that it meets in a way that works for you.

The Event Organizers’ Round Table with Dries Buytaert

One of the requests we had from the group was to be able to spend more time discussing their aims with the Project founder, Dries Buytaert.

Dries agreed that spending quality time with representatives of such a key part of our community was highly valuable, so we have created a round table at DrupalCon Seattle 2019 to which event organizers will be invited to attend and participate.

The invites will be sent to the mailing segment introduced below - make sure you read on!

Image of Dries taking notes at a previous Round Table, courtesy of Baddý Sonja Breidert

A mailing list segment especially for event organizers

To ensure that we can alert event organizers to activities like the above, we have created a special segment in our mailing list system that allows us to send updates to only those that want to hear about them.

If you are an Event Organizer, or looking to become one in the future, I highly recommend you visit your user profile on Drupal.org and check the box “Contact me regarding local organizing (camps, groups, etc.)”

For the above-mentioned Round Table with Dries Buytaert, we will be sending invites to members of this mailing list, so make sure you are on it!

screenshot of user account settings

Feb 13 2019
Feb 13
In this fifth installment of our series on conversational usability, our focus shifts to conversational usability and the process of evaluating and improving conversational interfaces that often differ significantly from the visual and physical interfaces we normally use to test with end users.
Feb 13 2019
Feb 13

The Micro Site module allows you to set up a Drupal web factory: from a single Drupal 8 instance, it can power a multitude of sites, of different types if necessary. Micro Site provides a new content entity that allows you to customize these different site types at will, using the APIs available in Drupal 8, and to modify their behavior according to business needs. Let's detail how we can proceed, through a basic example: providing a modular footer on the different powered sites and displaying it on all the pages that make up a site.

This operation boils down to a little bit of site building, by adding and configuring a few fields, and a little bit of theming, by adapting the general page template of a site to display this footer.

Site building

Add field on site type

For example, on a site type called Generic we can create a new Paragraphs field called Site Footer.

field site footer

Then, when editing the micro site, we can fill in the footer of the page from a number of configured paragraph types (a links type to add a list of links, a text type for free text, or why not a paragraph that loads a registration form, etc.).

Footer example

In this example we use paragraphs to feed the footer of the micro site, giving each micro site great modularity to customize its footer according to its needs. But we could just as easily have used simpler, more constrained fields to limit what can be filled in the footer. It is up to us here to assess the business needs of each type of site we want to set up.

Theming

We are not going to announce a great innovation or a thundering trick here. Once the field (or fields) that will feed the micro site's footer is created, all that remains is to display it at the template level: in page.html.twig, in a site-specific override (page--site.html.twig), or in a more specific override for the site type, for example page--site--generic.html.twig.

First, we implement a preprocess function on the site pages to provide the template with the site_footer variable.

use Drupal\micro_site\Entity\SiteInterface;
// Assumed namespace: adjust this to wherever your MicroDrupalConstant
// constants class actually lives in your code base.
use Drupal\micro_drupal\MicroDrupalConstant;

/**
 * Implements hook_preprocess_HOOK().
 *
 * Provide site variables to override footer, menu in the page template
 * dedicated to site entities.
 */
function micro_drupal_preprocess_page(&$variables) {
  $negotiator = \Drupal::service('micro_site.negotiator');
  /** @var \Drupal\micro_site\Entity\SiteInterface $site */
  $site = $negotiator->getActiveSite();
  if ($site instanceof SiteInterface) {
    if ($site->hasField(MicroDrupalConstant::SITE_FOOTER)) {
      if (!$site->get(MicroDrupalConstant::SITE_FOOTER)->isEmpty()) {
        $viewBuilder = \Drupal::entityTypeManager()->getViewBuilder('site');
        $field_footer = $site->get(MicroDrupalConstant::SITE_FOOTER);
        $site_footer = $viewBuilder->viewField($field_footer, 'footer');
        $variables['page']['site_footer'] = $site_footer;
      }
    }
  }
}

Then, in the page--site.html.twig template, we display this new variable.

{% if page.site_footer %}
  <!-- #site-footer -->
  <footer id="site-footer" class="clearfix site-footer-wrapper">
    <div class="site-footer-wrapper-inside clearfix">
      <div class="container">
        <div class="row">
          <div class="col-xs-12">
            <div class="site-footer">
              {{ page.site_footer }}
            </div>
          </div>
        </div>
      </div>
    </div>
  </footer>
  <!-- EOF: #site-footer -->
{% endif %}

And that's it. The rest is just a little bit of styling with some magical CSS rules.

What about a sidebar?

Using the Drupal 8 block positioning system, we can place as many blocks as necessary in a sidebar region, and control visibility by site, or by site type, as needed. But this configuration must then be done at the administrator level of the Drupal Master instance and not at the level of each micro site.

Following the same logic, we can very well set up dedicated fields (modular or not) to feed a sidebar present on each micro site, and then allow the content of these sidebars to be administered and managed at the level of each micro site, with their visibility controlled at that level. The implementation will be slightly more complex, to give an administrator of a micro site full control over the visibility rules of the created content, but the basic principle remains the same: a touch of site building and a zest of theming to customize a micro site as much as necessary.

Feb 13 2019
Feb 13

Various forms are at the heart of a website's interaction with its users. They are vital for usability, conversions, marketing analytics, and more. So form building tools are in demand — the Drupal Webform module ranks 7th among the 42,000+ contributed modules. The Webform module recently got its stable version for Drupal 8. We will give it an overview and show an example of creating a simple form with the Webform module.

Webform module in Drupal 8: finally stable

The Webform module lets us create any kinds of forms — from simple to multi-page ones. It also adds settings and features to them like statistics collection, email notifications, confirmations, data export in various formats, fine-tuned access, conditional logic, filtering and sorting, pushing results to CRM, downloading results as CSV, and more.

The stable version for Drupal 8 finally arrived on December 25, 2018. Drupal creator Dries Buytaert congratulated the Webform module creator Jacob Rockowitz via Twitter and called the release a big milestone.

The release was soon followed by Webform 8.x-5.1 on January 1 and Webform 8.x-5.2-beta1 on January 25.

Defining forms in a YAML file or admin interface

One of the most remarkable new features of the Webform module in Drupal 8 is the ability to define a form in a YAML file. Webform module creator Jacob Rockowitz originally built the YAML Form module specifically for Drupal 8, and as of Webform 8.x-5.x, YAML Form has been merged into Webform. Developers can now view a form's YAML source by clicking the "Source" tab.
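If you use Drush, you can also inspect a form's full YAML configuration from the command line; this sketch assumes a webform with the machine name contact:

drush config:get webform.webform.contact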

But there is also a user interface for form creation that is intuitively understandable even for website administrators. The admin dashboard contains examples and videos, as well as a built-in contact form.

The Webform ecosystem of modules in Drupal 8

The Webform module for Drupal 8 is a complex module with 20 submodules that demonstrate its varied capabilities. The minimum to enable are the Webform and Webform UI modules.

There is also a number of contributed modules that work together with Webform and extend its capabilities still further: Webform Views, Webform REST, Webform Analysis, Webform Encrypt, Webform Composite Tools, Webform Invitation, CAPTCHA, Honeypot, MailSystem, SMTP, and more.

Creating a simple webform with the Webform 8.x-5.1 module

We will create a feedback form for customers from scratch. With the Webform and Webform UI modules enabled, we go to Structure — Webforms, click “Add webform”, give it a name “Let us know your thoughts”, and save it.

The form then needs to be completed with fields that are called “elements” in Webform. The “Add element” button leads us to the list of 60+ available fields, both simple and complex.

We will need a field for the customer’s name. For this purpose, there is a complex “name” element with sub-elements (First name, Last name etc.) that can be checked or unchecked to be included or not into the form.

If no complex name structure is needed, we can use a simple “Text field” element for the name. Let’s call it “Your name (optional)” in the “Title.”

For the email field, we select an "Email" element and add a placeholder for email addresses.

An element of “Checkboxes” type will let customers check some options. For our feedback form, we list the company’s services as “option values”.

To let customers give their votes, we will use an element of the "Rating" type and set the value range from 1 to 10.

A “Text area” element will host customers’ comments.

On the list of all fields for our form, we can drag and drop the fields to change their order, or check them to quickly make them required or optional.

Let’s choose where the submitted forms will go. On the form’s dashboard, we select Settings — Emails/Handlers and click “Add email.” There, we configure the “sent to” email address (in our case, it’s the default [site:mail] is OK). We also add a custom subject “You received a new form submission.”

The “View” tab takes us to our form on the website. If we submit something and check the admin email, we see new messages about form submissions.

Let’s create the forms your Drupal website needs

This was just a very simple form example where we have not even touched the advanced options — conditional logic, access, submission handling, JS effects, and so on. The Webform module in Drupal 8 has unlimited capabilities, and our Drupal team can create the forms that will work in accordance with all your requirements.

Feb 13 2019
Feb 13

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

We had a delightful talk with Taco Potze, co-founder of GoalGorilla, Open Social and THX. Taco revealed to us why his team decided for Drupal among the various possible CMS choices and what Drupal initiative they are most excited about. He thinks open source has the potential to balance out the power of tech giants and give people all over the world equal opportunities. Read on to find out more about his projects and his contributions to the community.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My name is Taco Potze, or just Taco usually does it ;). I am the co-Founder of the Drupalshop GoalGorilla and co-Founder of the Open Social and THX projects. I have been on the board of the Dutch Drupal Association for four years and active in organizing various local events such as the DrupalJam. My day to day business focuses on business development for Open Social and getting our latest THX Project up and running. Other than that, I love to travel and take care of our 1-year-old son.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I really started working with Drupal when one of our early clients asked us to build their new website. We were mainly working on online marketing, analytics and UX improvement in those days. 

My co-Founder and I have an industrial engineering background, not in coding per se. We searched for an out-of-the-box CMS that was open source, and Drupal made it to our shortlist. The winning reason for doing the project with Drupal 6 was its multi-language capabilities: the project had to be done in English and Chinese. Adding Chinese menus, blocks and content to the websites still gives me nightmares sometimes, over 10 years later ;). 

Jaap Jan Koster and I, now our VP of product, got the project done over summer within time and budget and ended up with a very happy client. That triggered us to offer more web development services and soon we were doing lots of projects. We used a variety of open-source CMSs until in 2010 we decided to do projects only in Drupal. 

For us Drupal provided the best framework to do challenging projects and working with only one CMS meant we could really become experts. The early years did not include many Drupal projects, I have to admit. We did not fully understand how important contributions (on all levels) are and lacked some of the skills to make worthwhile contributions. This changed over time when we started contributing modules back and became mature with the Open Social distribution where we have invested 10,000s of hours.

3. What impact has Drupal made on you? Is there a particular moment you remember?

One of the best aspects of the community is the Dutch Drupal Community. We have excellent thought leaders such as Bert Boerland and Baris Wanschers that relentlessly push the Drupal community forward.

We’ve had many successful events such as the DrupalJam, Frontend United and Splash Awards. There are informal meetings with developers or members of the board, and cooperation exists in distribution projects such as Dimpact WIM or DVG. Instead of competing with negative sentiment, we are competing but also working together to push our projects and companies forward.

A while ago, I even helped pitch an Open Social project for another Drupal agency (which we won). When I tell other companies about this ecosystem, at times they are skeptical and think that I am overselling or that we don't really compete or cooperate. However, with over 10 years of experience as a community, we have proven we can. The community is growing, Drupal is still winning market share, and companies are flourishing. I think this has made a profound impact on me as an entrepreneur.

4. How do you explain what Drupal is to other, non-Drupal people?

It depends who you are talking to. At a birthday party, you might want to simplify more than when talking to a potential client that hasn't heard of Drupal yet. I always amplify the message that it's a huge global community all working on the same IT project contributing code, sharing information and meeting at events all around the world.

I might share some of my worries about the power of big tech companies (Facebook tends to be a good example) and how we are trying to balance the scale by being completely open and transparent. I love sharing the idea that work we have done on Open Social gives people all around the world, say in developing countries, the same opportunities to organize and engage and drive their missions as companies with larger budgets.

For me working on open-source is a principled choice. Drupal is one of the projects where the importance of the open-source comes first. If I can make somebody aware of that and the choice they might have in that one day, then it was a good conversation.

5. How did you see Drupal evolving over the years? What do you think the future will bring?

These next few questions about Drupal I answered with the help from my team.

Our team sees Drupal evolving into an API-first platform, something we definitely applaud when looking at the possibilities out there that are related to this innovation (e.g. Internet of Things). We see Drupal being more open to integrations with other systems so we can provide an amazing cross-channel user experience.

6. What are some of the contributions to open source code or to the community that you are most proud of?

Our team works hard to contribute back to the Drupal distribution. It’s actually hard to pick which contributions we are most proud of since every single one of them is something to be proud of. 

However, the contributions we would highlight are all the commits done to Open Social. The fact that we are able to provide a ready solution for everybody to use is very motivating, especially since we can work together with developers from the community who help to make our distribution even better!

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Drupal has many initiatives that we look forward to. One of our developers, Ronald, especially highlighted the Layout Builder:

“I’m really looking forward to using the Layout Builder. We have always struggled with creating a good solution for custom one-off pages with unstructured content, which would provide a lot of flexibility for content managers using Drupal. I think this initiative will produce the “wow factor” to our users and give us the ease of mind by not needing to create difficult custom solutions.” - Ronald te Brake

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

Blockchain technology has been a passion for a while and we are making great steps adding this exciting technology as part of Open Social and beyond with our THX Project. It's important to be able to improve engagement in social communities. 

With THX you can reward your users for valuable actions they take: for example, filling out their profile, adding content, contributing code and patches to a community such as Drupal, and much more. It also helps transfer reputation from one community to the next and gives a model to measure the value of online communities. If you are interested, we have written a white paper and various blogs on the matter and will publish more information on the project and our DAICO in the upcoming months.
 

Feb 12 2019
Feb 12

One of the most popular articles on our blog is an article I wrote a year and a half ago about how to install CiviCRM on Drupal 8.

The method described there worked (and still more-or-less works), but it's... a mess.

It involves running a dozen or so commands, and is pretty easy to get wrong. All of this is just to get code assembled in such a way that you CAN install it.

I'm happy to announce that you can now do this in just a single command!

There's still some little issues and bugs with running CiviCRM on Drupal 8 that need to be manually worked around, but getting over that first hurdle of simply allowing you to install it in the first place should be significantly easier with this new method.

Read the full article to find out how!

Background: Why is this hard?

The biggest problem is that both Drupal 8 and CiviCRM depend on some of the same PHP libraries, for example, Symfony.

In PHP, you can't define the same class twice (this would be a fatal error) and you certainly can't define the same class twice at two different versions (for example, one version for Drupal and one for CiviCRM).

So, you couldn't just copy CiviCRM with all its dependencies into an instance of Drupal, because some of those dependencies would conflict with what's already in Drupal.

And, unfortunately, you couldn't just make a special CiviCRM bundle that's "optimized" for a particular version of Drupal 8, because each Drupal 8 site is potentially unique: you can update the PHP libraries used by a Drupal 8 site (for example, upgrading Symfony to a newer version) or add new PHP libraries that could conflict.

The Magic of Composer

Composer is a tool that's used by PHP applications and libraries to find and download a compatible set of dependencies.

For example, CiviCRM needs Symfony 2 (version 2.8.44 or greater) or any version of Symfony 3. Drupal 8.6.9 needs Symfony 3.4 (version 3.4.14 or newer), although, soon it will be possible to use Drupal 8 with Symfony 4 as well.

Using composer, you can say, "I need Drupal 8.6.9 and CiviCRM 5.10.0" and it can pull in a version of Symfony that works for both.

If your Drupal site used Symfony 4 (because some Drupal module needed it), it would error out and say you need to either remove that module (so Symfony can be downgraded to Symfony 3) or remove CiviCRM.
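Composer can even tell you in advance which of your installed packages would block such an install (the package name and version here are illustrative):

composer prohibits civicrm/civicrm-drupal-8 5.10.0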

That's why composer is so heavily involved in getting Drupal and CiviCRM to work together!

The peculiarities of CiviCRM

Of course, that's not the whole problem, because otherwise it would have been possible to solve this with a single composer command years ago.

CiviCRM has been around for over a decade (which is a lifetime in the PHP ecosystem), and it still has some legacy peculiarities that need to be accounted for.

Namely, CiviCRM depends on going through a non-composer-y build process to generate a "release" that is actually usable.

Some of those things that need a build process could be reworked in a way so that they didn't need it, and others could be done in a composer-compatible way, such that CiviCRM would work like any composer library. Work is being done on that, but those are hard problems which will take time to solve.

TL;DR - What are the commands?

Heh, alright! Time for the actual actionable steps. :-)

First, you need to have Composer installed.

Then, to create a new Drupal 8 site with CiviCRM:

# replace 'some-dir' with the directory to create
composer create-project roundearth/drupal-civicrm-project:8.x-dev some-dir --no-interaction

Or, to add to an existing Drupal 8 site (assuming you used the best practice method of starting from the drupal-project composer template):

composer require roundearth/civicrm-composer-plugin civicrm/civicrm-drupal-8

If you have a Drupal 8 site that isn't based on the drupal-project composer template, well, you're going to need to convert your site.

I'd recommend creating a new codebase (using the 'composer create-project' command above), adding all the modules/themes/libraries from the old codebase, and then switching the site to the new codebase. Assuming you didn't forget to copy anything over, that should work, but definitely keep a backup.

Installing CiviCRM

Once you have the code in place using the commands above, you'll actually need to install CiviCRM. Due to some bugs in CiviCRM for Drupal 8, there are a bunch of caveats and extra steps associated with that.

I've documented some of the steps on the project page for 'roundearth/drupal-civicrm-project':

https://gitlab.com/roundearth/drupal-civicrm-project#installing-civicrm

Hopefully, all these bugs will be fixed in time.

The good news is that the composer project template will always pull in the latest versions of CiviCRM and its Drupal 8 integration, so these steps should be able to remain the same, and will just start working better as those bugs get fixed. :-)

How does it work?

There are two parts to this: the roundearth/drupal-civicrm-project composer template, which assembles a new codebase, and the roundearth/civicrm-composer-plugin, which does the heavy lifting whenever composer installs or updates CiviCRM.

The plugin does the following:

  1. Runs Bower to pull in JavaScript and CSS dependencies
  2. Downloads a release of CiviCRM and copies in the missing files generated by the usual build process
  3. Downloads any requested CiviCRM extensions
  4. Copies any web assets from CiviCRM (which is installed in the vendor directory, which isn't web accessible) into the web root

In case you're curious, or want to help improve it, here's where all the most interesting code lives:

https://gitlab.com/roundearth/civicrm-composer-plugin/blob/master/src/Handler.php

Conclusion

One of the main challenges in improving CiviCRM on Drupal 8 is simply getting it installed for developers who might want to help.

We've been involved in the effort to port the Webform CiviCRM module to Drupal 8, which has also faced this same challenge.

That's why a small part of the development of this new method was funded via the donations that were made to that effort. The rest was just part of our efforts in maintaining Roundearth, a Drupal 8 and CiviCRM platform for non-profits.

We're hoping this will help, not only with getting CiviCRM installed on Drupal 8, but also help to accelerate development. :-)

Feb 12 2019
Feb 12

The only way to get someone to contribute to an open source project is to ask them.

At the beginning of the New Year, I set up the Webform module's Open Collective. I knew that persuading organizations to back the Webform module, or any Drupal-related Open Collective, would require directly asking organizations to join. I also needed to listen to what an organization might want to get out of their financial contribution to an Open Collective.

It is reasonable for organizations to ask why they should back an Open Collective and what they should expect in return.

At DrupalCampNJ, I paid a visit to the event's sponsors that I was friendly with and asked them to join the Webform module's Open Collective. I also decided to broaden my pitch to include asking people to consider backing any Drupal related Open Collective. The Simplytest.me and Drupal Recording Initiative collectives provide invaluable services that our community needs and everyone should help support them.

Everyone was familiar with the Webform module, and most people knew that I was maintaining the Drupal 8 version, but no one knew what an "Open Collective" was. Gradually, as I spoke to people, I managed to put together a concise explanation for the question, "What is Open Collective?"

Open Collective is a service which allows Open Source projects to transparently collect and distribute funds. Organizations who back an Open Collective will get a receipt for their financial contributions and be able to see exactly how the collected money is being spent.

The above explanation leads to the next relevant question, which is "How is this money going to be spent?" My response is this: spending the collected funds is going to be determined by what the backers want and what the Webform module needs.

As the maintainer of the Webform module, I feel we need a logo. A logo will help distinguish the Webform module from the massive sea of online form builders. For some projects, the Webform module is a significant part of a proposal and a logo would help make the software feel more professional. In the near future, I am going to post an RFP for a Webform logo. Still, what I want is not nearly as important as what organizations need.

Discussing, Explaining, Listening and Delivering

I want to hear what people want or need from the Webform module's Open Collective. Since most people did not know what an Open Collective is, it was hard to expect them to know what they might need from one. As I spoke with people at DrupalCampNJ, I came up with two anecdotes that explore some potential use cases for an Open Collective.

My first anecdote: just after I finished setting up the Webform module's Open Collective, someone emailed me asking for help with debugging a broken image file upload, and they offered to pay me for my time. These small private transactions are complicated to coordinate, so I suggested that they make a donation to the Webform module's Open Collective and then create a ticket in the Webform issue queue. I managed to resolve their problem quickly. Everyone felt that this was a successful transaction.

Another related anecdote: While brainstorming about getting people to help pay for open source development, a fellow software developer daydreamed that his company would happily pay to see a bug fixed or merely a patch committed. One of the most common frustrations in open source issue queues is seeing a patch sit there for months and sometimes years.

The above stories highlight the fact that besides using Open Collective to back an open source project, a collective could be used to execute transparent transactions using a common platform.

Would enterprise companies and organizations be more comfortable paying for open source work through an Open Collective instead of a private transaction with an individual developer?

Persuading

At DrupalCampNJ, no one rejected the concept of backing a Drupal-related Open Collective. I was able to collect several email addresses which I am going to use to continue persuading and listening to potential backers. My next step is to write a short and persuasive email which hopefully will inspire organizations to back Drupal-related Open Collective.

Continuing

In the spirit of full transparency, I will publish this email in my next blog post. For now, any feedback or thoughts are always appreciated. Hey, maybe my two anecdotes might have persuaded you to back the Webform module's Open Collective.


Feb 12 2019
Feb 12

Amazee Labs is proud to sponsor Drupal Mountain Camp in Davos, Switzerland 7-10 March 2019.

Come by and see us in the exhibit area or at one of the social events, and be sure to check out these Amazee sessions: 

On Friday, from 14:40 till 15:00, join Maria Comas for GraphQL 101: What, Why, How. This session is aimed at anyone who might have heard or read about "GraphQL" and is curious to know more about it. The session will give a basic overview and try to answer questions like:

  • What is GraphQL?

  • Is GraphQL only for decoupled projects?

  • Advantages to using GraphQL with Drupal

  • Getting started with GraphQL

Follow this up on Friday from 15:00 till 16:00 with Daniel Lemon, who will present Mob Programming: An interactive session. The basic concept of mob programming is simple: the entire team works together on one task at a time. That is one team – one (active) keyboard – one screen (a projector, of course). It's just like doing full-team pair programming. In this session you'll learn:

  • What are the benefits to a team?

  • How this could potentially be integrated into your current workflow

  • The disadvantages to Mob Programming and why it might not work for certain types of companies (such as a web agency).

Additionally, don’t forget to check out this talk from Michael Schmid of amazee.io Best Practices: How We Run Decoupled Websites with 110 Million Hits per Month. This session will lift the curtain on the biggest Decoupled Websites run by amazee.io and will cover:

  • How the project is set up in terms of Infrastructure, Code, Platform and People

  • How it is hosted on AWS with Kubernetes, and what we specifically learned from hosting Decoupled within Docker & Kubernetes

  • Other things we learned running such a big website

Hope to see you in Davos soon! 

Feb 11 2019
xjm
Feb 11

Last fall, we adjusted our minor release date for Drupal 8.7.0 from March 6 to May 1. This was done as part of moving Drupal's minor release schedule toward a consistent schedule that will have minor releases in the first weeks of June and December each year. (See Plan for Drupal 9 for more information on why we are making this change.)

However, the change to the 8.7.0 release date means that DrupalCon Seattle now falls in the middle of important preparation phases for the minor release. In order to ensure community members have adequate time to prepare and test the release without interfering with DrupalCon Seattle events, we've moved the alpha and beta phases for the release one week earlier:

  • 8.7.0-alpha1 will now be released the week of March 11. The alpha phase will last two weeks until the release window for beta1.
  • 8.7.0-beta1 will now be released the week of March 25. The beta phase will now last three weeks (including the week of DrupalCon) instead of two. The beta phase will still end when the release candidate window begins.
  • The release candidate (RC) and release dates are unchanged. The RC window still begins April 15 and the scheduled release date is still May 1.

(Read more about alpha, beta, and RC phases.)

Feb 11 2019
Feb 11
In this fourth installment of our series on conversational usability, we're turning our attention to conversational content strategy, an underserved area of conversational interface design that is rapidly growing due to the number of enterprises eager to convert the text trapped in their websites into content that can be consumed through voice assistants and chatbots.
Feb 11 2019
Feb 11

We compare REST, JSON:API and GraphQL — three different web services implementations — based on request efficiency, operational simplicity, API discoverability, and more.

An abstract image of three boxes

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or following a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below, but we discuss each of these four categories (or rows in the table) in more depth below. If you aggregate the colors in the table, you see that we rank JSON:API above GraphQL and GraphQL above REST.

REST JSON:API GraphQL Request efficiency Poor; multiple requests are needed to satisfy common needs. Responses are bloated. Excellent; a single request is usually sufficient for most needs. Responses can be tailored to return only what is required. Excellent; a single request is usually sufficient for most needs. Responses only include exactly what was requested. Documentation, API explorability and schema Poor; no schema, not explorable. Acceptable; generic schema only; links and error messages are self-documenting. Excellent; precise schema; excellent tooling for exploration and documentation. Operational simplicity Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required. Excellent; works out of the box with CDNs and reverse proxies, no client-side libraries needed, but many are available and useful. Poor; extra infrastructure is often necessary client side libraries are a practical necessity, specific patterns required to benefit from CDNs and browser caches. Writing data Acceptable; HTTP semantics give some guidance but how specifics left to each implementation, one write per request. Excellent; how writes are handled is clearly defined by the spec, one write per request, but multiple writes is being added to the specification. Poor; how writes are handled is left to each implementation and there are competing best practices, it's possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article requests (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's field selection; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, the sparse fieldsets can be omitted so that the request remains cacheable.
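For example, a sparse fieldset is nothing more than a query parameter. A sketch using the default resource names of Drupal's JSON:API module:

```
# Return only the title, creation date and author reference of each article,
# plus only the name of each included author.
GET /jsonapi/node/article?include=uid&fields[node--article]=title,created,uid&fields[user--user]=name
```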

Multiple data objects in a single response
  • REST: Usually, but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, and developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; only if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links rarely available.
  • JSON:API: Acceptable; observing the available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocompletion, instant results or compilation errors, and complete, contextual documentation.

Validatable and programmable schema
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.
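For instance, a first exploratory request needs nothing more than cURL (assuming the Drupal module's default /jsonapi path):

```
curl -H "Accept: application/vnd.api+json" https://example.com/jsonapi/node/article
```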

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as with REST or JSON:API unless you use persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and they also mean the architecture is no longer fully decoupled — if a client wants to retrieve different data, server-side changes are required.
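A sketch of the idea (endpoint and parameter names vary by implementation, since persisted queries are a convention rather than part of the specification):

```
# Without persisted queries: the full query travels in a POST body,
# which CDNs and reverse proxies won't cache.
POST /graphql
{"query": "query { articles { title author { name } } }"}

# With persisted queries: the query text is stored server side under an ID,
# and the client issues a cacheable GET for that ID.
GET /graphql?queryId=a1b2c3
```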

Scalability: additional infrastructure requirements
  • REST: Excellent; same as a regular website (Varnish, CDN, etc.).
  • JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
  • GraphQL: Usually poor; only the simplest queries can use GET requests, and to reap the full benefit of GraphQL, servers need their own tooling.

Tooling ecosystem
  • REST: Acceptable; lots of developer tools are available, but for the best experience they need to be customized for the implementation.
  • JSON:API: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.
  • GraphQL: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.

Typical points of failure
  • REST: Fewer; server, client.
  • JSON:API: Fewer; server, client.
  • GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of the GET HTTP method, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better generic tooling and less time spent on server-side details.
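For example, creating a resource is fully prescribed by the JSON:API specification. A sketch using the default resource names of Drupal's JSON:API module:

```
POST /jsonapi/node/article
Content-Type: application/vnd.api+json

{
  "data": {
    "type": "node--article",
    "attributes": {
      "title": "Hello world"
    }
  }
}
```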

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling writes to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.
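A sketch of a typical mutation (the mutation and input field names are defined by each server's schema, not by the GraphQL specification):

```
mutation {
  createArticle(input: { title: "Hello world" }) {
    article {
      id
      title
    }
  }
}
```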

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests: one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

Writing data
  • REST: Acceptable; every implementation is different, and there is no bulk support.
  • JSON:API: Excellent; the specification prescribes a complete solution for handling writes, with bulk operations coming soon.
  • GraphQL: Poor; writes can be tricky to design and implement and there are competing conventions, though bulk/batch operations are supported.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these must be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
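For example, with Drupal's JSON:API module a consumer can assemble its own collection query out of standard query parameters, with no server-side configuration:

```
# Published articles only, newest first, ten per page.
GET /jsonapi/node/article?filter[status]=1&sort=-created&page[limit]=10
```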

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

Ease of installation and configuration
  • REST: Poor; requires the contributed REST UI module, and it's easy to break clients by changing configuration.
  • JSON:API: Excellent; zero configuration!
  • GraphQL: Poor; more complex to use, and may require additional permissions, configuration or custom code.

Automatically generated documentation
  • REST: Acceptable; requires the contributed OpenAPI module.
  • JSON:API: Acceptable; requires the contributed OpenAPI module.
  • GraphQL: Excellent; GraphQL Voyager included.

Security: content-level access control (entity and field access)
  • REST: Excellent; content-level access control is respected.
  • JSON:API: Excellent; content-level access control is respected, even in queries.
  • GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.

Decoupled filtering (client can craft queries without server-side intervention)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world, and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. While Drupal's GraphQL implementation is fantastic, I no longer recommend that we add GraphQL to Drupal 8 core.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it goes in the coming months and years. As a first next step, we're preparing the JSON:API to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.


Feb 11 2019
Feb 11

We’re off to a great start of the new year! In January, we wrote some really interesting blog posts; in case you missed any of them, we’ve prepared this overview where you can find all of our posts from last month, neatly compiled in one place. Enjoy!

2018 in review

Our first post in 2019, published just a few days into the new year, was a review of our achievements in the previous year. Not only did 2018 mark the 5-year anniversary of Agiledrop, it will also remain in our memories as the year when we upped our game, scaled our team very successfully and optimized our strategy for the future. 

Of course, we still found the time to give back to the Drupal community, whether it be through open-source contributions or any of our educational events, such as our free Drupal courses. 

Read more

Interview with Shawn McCabe, CTO of Acro Media

We couldn't properly start the year without continuing our Community Interviews series. Mere days after our yearly review, we published an interview with Shawn McCabe, CTO of the Canadian agency Acro Media.

Shawn’s love for open source was something that was immediately obvious from our talk and it was extremely interesting to get to know his story about discovering and working with Drupal. Our favorite part is almost definitely how he first met Dries - but you’ll just have to check out the entire post if you’re curious about that! 

Read more

Best Drupal 8 Security Modules

To also cater to the more tech-oriented audience, and to highlight one of the foremost advantages of Drupal (yes, of course it’s security!), we wrote a post about the 5 Drupal security modules that we’ve so far found to be the most useful. 

Even though Drupal is known for being a very secure CMS out-of-the-box, it still never hurts to take some additional security measures. Better safe than sorry, they say, especially with so many cyber security threats reported recently!

Read more

Interview with Gabriele Maira of Manifesto Digital

Next up came another Community Interview - this time we talked with Manifesto Digital's Gambry, an avid proponent of contribution sprints (definitely not just because he's responsible for running local Contribution Sprints in London!). He thinks every Drupal developer should attend a sprint at least once in their life, and he provides some really on-point reasons for this.

There’s one sentence from the interview that’s really remained with us and fills us with warmth every time we read it: “And instead of being a mortal between gods, I found friends. I found the wonderful Drupal Community.” Ahh … Isn’t it great? Can you feel the warmth? We know we sure do.

Read more

The Story of Agiledrop: Cultivating Strong Relationships with Clients

Our final blog post from January was the 3rd chapter in our latest series of posts, The Story of Agiledrop. In this extensive post, we talked about the steps we take to ensure that the relationships with our clients are always as healthy and strong as possible.

Admittedly, due to our unique workflow, this has proved to be quite challenging. But, because we’ve understood the importance of this from the get-go and have hence made it one of our top priorities, we’re proud to say that our approach is very effective. The result is two-fold: happy clients and a motivated team.

Read more

That’s it for our posts from January - but, don’t worry, we’ll be back very soon with new content, and, if you happen to miss any of our upcoming blog posts, we’ll be doing the overview again in March. So, keep warm and stay tuned! 

Feb 11 2019
Feb 11
Wisdom doesn’t automatically come with old age. Nothing does - except wrinkles. It’s true that some wines improve by age but only if the grapes were good in the first place. 
-Abigail Van Buren

Reflecting on life experiences adds generously to one's store of wisdom because, let's face it, wisdom and savvy can come from anyone and anywhere. So yes, the famous quote "Age is just a number" does justice to the whole idea of erudition.

Much like the common misconception that "bigger is better," which small agencies keep disproving by handling bigger projects. Gone are the days when large enterprises ruled the market kingdom, bagging all the big projects. Today, small agencies are winning big-name accounts and cool projects far more often, and the trend is forecast to continue.

Two fish bowls, one small and one big with water where a fish is jumping from small bowl to the bigger bowl


For a Drupal agency with big aspirations, deciding which projects to opt for can be a bit of a task, but earning the trust of the CxOs of big organizations is an even bigger one.

So, to help you solve this issue of winning and handling big work, here are some ways to seize those big projects.

First things First - How to meet big clients?

Just because you are a small agency or organization doesn't mean your clients have to be small. Landing a large organization not only boosts a small business's revenue but also increases efficiency across your team and organization.

  • Use client references to introduce your process

Big companies may seem like one grand entity, but you should not forget that they are made up of hundreds or thousands of individuals, some of whom have the power to make decisions.

So it is really important that your research is top-notch and accurate, telling you exactly who to contact within the company you've targeted. References and mutual contacts may help with this. Apart from that, some companies also list details of at least one senior employee on their websites.

But you need to be really creative to figure out exactly who the right person is. Look through some of the company's publications or newspaper mentions to see whose name comes up. You can also tag along with people who can introduce you to the big tech giants.

  • Indulge in cold calling

Telemarketing and cold calling continue to be essential disciplines for a sales role. In many business sales organizations, old-school "door knocking" might not be that productive, but when it comes to big organizations, especially those with large territory assignments, cold calling becomes everyone's hero. Prospecting via phone calls continues to be a great complement to your overall lead-generation efforts.

  • Be an expert and then try to be a solution to their needs. 

If you want the big giants to trust you with their projects, then a sense of "what the work means to you" must be established, along with a clearer vision for the future. In fact, according to the Employee Job Satisfaction and Engagement survey, nearly 77% of employees said it was important to their job satisfaction and engagement to have a clear understanding of their organization's vision and mission.

Start with your team 

Now that you have big names on your list, start by developing a strong team and skill set:

  • A team of Generalists 

Generalists are people who have a particular skill but are flexible enough to mold themselves to any situation and are ready to learn new skills. In the case of Drupal websites, a generalist should be able to handle both the back end and the front end.

In other words, having a generalist on board is beneficial for your organization: he or she can effectively handle many tasks.

 Image of 5 people. The middle guy is magnified with the help of magnifying glass which is held by a hand
  • Services are important 

Focus on the set of services and assistance you will be providing to the client. Your team will become specialists with time and experience. Treat a big enterprise like royalty.

A big enterprise is a customer that always expects great service and will not put up with waiting or with poor responses from your representatives.

Be honest about your projects and their goals. If your customers find that you are dishonest about your services, they will lose faith in you and may even spread negative feedback about your business.

  • Categorizing your projects

To get a handle on a project's complexity, categorize it as follows:

Small projects: These can easily be tracked just by getting updates. A project is classified as small when the relationships between tasks are basic and detailed planning or organization is not required.

Charter required projects: These are projects that require some level of approval other than the first line manager, but do not include significant financial investment. A summary of major deliverables is usually enough for management approval.

Large projects: The project network is broad and complicated. There are many task interdependencies. With these projects, simplification where possible is everything. 

  • Planning 

Planning a project helps in achieving objectives and meeting deadlines on time. It pushes team members to keep working hard until the goals are reached. Planning also helps point the organization in the right direction.

Increases efficiency: Planning helps in the maximum utilization of all available resources. It reduces the wastage of precious resources and avoids duplication, aiming for the greatest returns at the lowest possible cost.

Reduces risks: Large projects carry many risks. Planning helps forecast these risks and take the necessary precautions to avoid them.
 
Facilitates coordination: The plans of all departments of an organization should be well coordinated with each other. Similarly, the short-term, medium-term and long-term plans of an organization should be coordinated with each other.
 
Aids in organizing: Organizing intends to bring together all available resources, and it is not possible without planning, since planning tells us how many resources are needed and when they are needed. In this way, planning aids effective organizing.
 
Keeps good control: An employee's actual performance is compared with the plans, and deviations (if any) are found and corrected. It is impossible to achieve such control without the right planning, so planning is necessary for keeping good control.

 Image of text saying plan in red color with two robotic hands

 

  • The scope of the Project 

Perhaps the most difficult part of managing a large project with a small team is distinguishing between a task and an actual project. For a small project team to be successful with a large project, the manager should always know the project's status and how much of its scope has been achieved.

  • Excellent Relationship with the vendor

The most important part of managing big projects with small teams is to establish a meaningful relationship across the organization.

A solid relationship can be the difference between a project that becomes actualized and one that remains conceptual. If your business doesn't concentrate on a product or service that is important for reaching your clientele, you need a vendor that does.

Next comes the Methodologies 

Large organizations usually follow classical methodologies, which involve a lot of unnecessary documentation. For small agencies, though, a few methodologies help greatly in handling large projects:

  • Agile

Agile was developed for projects that require both speed and flexibility. The method is split into sprints: short cycles for producing certain features.

Agile is highly interactive, allowing for fast adjustments throughout a project. It is mostly applied in software development projects, in large part because it makes it simpler to identify issues quickly.

Agile is essential because it allows changes to be made early in the development process, rather than having to wait until testing is complete.

A circular infographic with six parts that says test, evaluate, meet, plan, design and develop

  • Scrum

Scrum is a variation of the agile framework; it is iterative in nature and relies on scrum sessions for evaluating priorities. "Scrum assemblies" or "30-day sprints" are used to limit prioritized tasks.

Small teams may be gathered to concentrate on a particular task independently and then meet with the scrum master to assess progress or results and reprioritize backlogged tasks.

Image of a circular infographic with five parts saying commitment, focus, openness, respect, and courage

 

  • Waterfall 

This is a basic, sequential methodology from which Agile and similar concepts evolved. Waterfall has been a staple of project management for years and is still used by many project managers across many industries, especially software development. It consists of static phases (analysis, design, testing, implementation, and maintenance) that are executed in a specific order.

  • Critical Path Method 

CPM is an orderly, systematic method that breaks down project development into specific but related actions. 

This methodology can be used to map out a project's activities, assess risks and allocate resources accordingly. It encourages teams to identify milestones, task dependencies, and deadlines efficiently.

A critical path is the sequence of critical tasks (dependent or floating) in a project: the longest chain of tasks that have to be completed on time for the project to meet its deadlines.

Culture is Fundamental to Success

How do you explain to your client that the team won't work this week because of DrupalCon, DrupalCamp or any other event happening around?

You can only explain it by being clear with your thoughts and ideas. The community here plays a vital role in everything. 

Explain to your team members that it is beneficial for them to improve Drupal as a platform, and introduce them to the team culture. Help your team members create pages on drupal.org and give them credit for their work on patches and modules.

Closing the project 

Yes, project closing might look like an insignificant and unimportant task in your project management journey, but in fact it is a critical part of delivering a successful project. To help you get this step right, here are four things you need to know about how to close a project effectively.

Trace project deliverables: Effective closure means that you have completed all the deliverables to the satisfaction of the project's sponsor.

Reward team members: As your project comes to a close, always make sure to acknowledge, recognize and appreciate the contributions of your team members.

Close-out reports: A detailed close-out report should contain details about the process used during the project, the mistakes made, the lessons learned, and how successful the project was in achieving its initial goals.

Finance: Big clients are usually slow to pay, so try to adopt an agile budget for large projects.

Turning from a Technical Provider into a Strategic Solution Partner

As with any investment portfolio, an organization's investment in Run, Optimise and Innovate initiatives must be balanced and aligned with the organization's risk tolerance and the role expected of IT. If an organization considers itself more conservative, you can expect to see a higher ratio of Run to Optimise and Innovate spending. More progressive organizations will have more Optimise spending, and "leading edge" organizations will have more Innovate spending.

Conclusion 

Yes, Goliath the Gittite is, and always will be, the best-known giant in the Bible. He is described as "a champion out of the camp of the Philistines, whose height was six cubits and a span."

Befriending Goliath not only gave a sense of power to anyone beside him but also granted them security.

Taking on large enterprises with big projects is the very first step to success. With the right steps and upkeep, that success will follow soon enough.

Opensense Labs' development methodologies focus specifically on approaches involving Drupal development, enhancing efficiency, and improving project delivery.

Contact us at [email protected] to accomplish those large projects you have always desired.

Feb 10 2019
Feb 10

A night watchman earns his reputation as an alert guard when he correctly senses that something does not look right. Imagine a man dressed like a stockbroker, carrying a soft leather briefcase and a valise. His walking style is tentative and squirrelish, which is not the way a stockbroker walks. When the watchman asks him for ID, he scurries off towards the entry gate of a building, dropping his valise on the floor before finally being captured by the watchman, who later finds out that he was trying to rob someone in the building.

A security guard posing for the camera in broad daylight


Much like the night watchman, whose brilliant judgement keeps the building safe from such robbers, there is a solution in the digital landscape that helps in writing browser tests easily and safely. NightwatchJS is an exceptional option for running browser tests, and its inclusion in Drupal core has only made things easier for Drupalists.

Understanding NightwatchJS

Logo of nightwatchjs with an icon representing an owl


NightwatchJS is an automated testing framework for web applications and websites. It is written in NodeJS and uses the W3C WebDriver API (formerly known as Selenium WebDriver) to perform commands and assertions on DOM elements.

Nightwatch.js is an integrated, easy to use End-to-End testing solution for browser-based apps and websites, written on Node.js. - Nightwatchjs.org

As a complete browser (end-to-end) testing solution, NightwatchJS aims to streamline the process of setting up continuous integration and writing automated tests. It can also be used for writing NodeJS unit tests. Its clean syntax helps you write tests rapidly, using NodeJS and CSS or XPath selectors. Its out-of-the-box command-line test runner can run tests in sequence or in parallel, together, by group, by tag or singly. It also supports the Mocha runner out of the box.
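To give a feel for that syntax, here is a minimal sketch of a Nightwatch test (the URL, selector and expected title are hypothetical):

```js
// tests/frontPage.js — a minimal NightwatchJS end-to-end test.
module.exports = {
  'front page shows the site name': (browser) => {
    browser
      .url('https://example.com')      // open the page under test
      .waitForElementVisible('body')   // wait until the page has rendered
      .assert.titleContains('Example') // assert on the document title
      .end();                          // close the browser session
  },
};
```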

NightwatchJS has its own cloud testing platform, NightCloud.io, in addition to supporting other cloud testing providers like SauceLabs and BrowserStack. It manages Selenium and WebDriver services automatically in a separate child process and has great support for working with the Page Object Model. Moreover, with its out-of-the-box JUnit XML reporting, it is possible to incorporate your tests into the build process of systems like Jenkins.

NightwatchJS in Drupal

The JavaScript Modernisation Initiative paved the way for the addition of NightwatchJS to Drupal core (in version 8.6) as the new standard framework for unit and functional testing of JavaScript. It helps ensure that alterations made to the system do not break expected functionality, and it also lets you write tests for your contributed modules and themes. It can be included in the build process to ensure that regressions rarely creep into production.

You can try NightwatchJS in Drupal 8.6 by following the instructions on GitHub. They show how to test core functionality, and also how to test your existing sites, modules and themes by writing your own custom commands, assertions and tests. It is worth checking out the Nightwatch API documentation and the NightwatchJS developer guide for creating custom commands and assertions.

NightwatchJS tests will be run by Drupal CI and will be viewable in the test log for core developers and module authors. For your own projects, tests can easily be run in, for instance, CircleCI, which can give you access to artefacts like screenshots and console logs.

Conclusion

While Drupal 8 has extensive back-end test coverage, NightwatchJS offers a more modern platform that will make Drupal more familiar to PHP and JavaScript developers. 

Offering an amazing digital experience has been our biggest objective, and we have been doing that with a suite of services.

Contact us at [email protected] and let us know how we can help you achieve your digital transformation dreams.

Feb 10 2019
Feb 10

Virtual private servers are fantastic for running your own cloud applications and give you authority over your private data. That data can potentially be leaked when you communicate via services like text messaging. One way to ensure greater privacy is to host your own messaging system. This is where Rocket.Chat comes into the picture.

person typing on laptop computer


Rocket.Chat has the provision for an actual open source implementation of an HTTP chat solution that provides convenience and gives greater freedom at the same time. It can be a marvellous solution for remote communications, especially for open source communities.

Rocket Chat: A close look



The official site of Rocket.Chat states that it is open source team communication software which offers an alternative for remote communications, replacing the likes of email, HipChat and Slack. It helps improve productivity through effective team communication and collaboration: sharing files with the team, chatting in real time, or even holding audio or video conference calls with screen sharing. Being an open source solution, it gives you the option of customising, extending or adding new functionality to meet your expectations.

Rocket.Chat is an open source team communication software which offers an alternative to remote communications by replacing the likes of email, HipChat and Slack

 

Logo of rocketchat with an icon representing rocket


With Rocket.Chat, you can do away with cc/bcc and use Rocket.Chat channels and private groups instead, bringing more transparency to team communication. By using @username, you can pull in relevant participants and notify them instantly; when you need to tell all the members of a group about an important matter, @all can be used. Participants can join or leave at any time, with access to the full chat history. Moreover, Rocket.Chat offers a secure workspace with restrictions on usernames and greater transparency for admins. You can join leading blockchain proponents like Hyperledger, Brave and Aragon in migrating from Slack and Atlassian to Rocket.Chat.

Essential features

Following are some of the major features of Rocket.Chat:

  • Unlimited: Rocket.Chat has the provision for unlimited users, channels, searches, messages, guests and file uploads.
  • Authentication mechanism: It offers different authentication mechanisms like LDAP Group Sync, 2-factor authentication, end-to-end encryption, single sign-on and dozens of OAuth providers.
  • Real-time chat: The LiveChat feature allows you to add real-time chat widgets to any site or mobile application. This brings more efficiency to team communication and also ensures top-notch customer service.
  • Message translation: It utilises machine learning to automatically translate messages in real time.
  • Use across platforms: It can be used on all platforms through its web, desktop and mobile applications, LiveChat clients, and SDK (Software Development Kit).
  • Marvellous customisation: You can alter it to meet your requirements. Incoming and outgoing WebHook integrations can be added, and the user interface can be personalised by overriding any of the built-in styles. You also get the benefits of its REST API, LiveChat API and Real-time API.

Rocket.Chat in Drupal

The Rocket.Chat module, available for both Drupal 7 and Drupal 8, helps a Drupal site integrate Rocket.Chat. It consists of a base module that holds the configuration and a LiveChat submodule that provides a block for enabling the LiveChat widget on a page.
 
The maintainers of this module recommend running Drupal and Rocket.Chat behind a TLS (Transport Layer Security) proxy or a web server with TLS capabilities. HTTPS and HTTP crossovers should also be taken care of. Finally, enabling LiveChat on your Rocket.Chat instance lets you use the LiveChat feature.

Conclusion

One of the quintessential aspects of open source communities is remote communication. Rocket.Chat offers a great alternative to the likes of Slack, and the module above gives you a way to enable it in Drupal.
 
We have a strong inclination towards digital innovation and have been delivering it with our expertise in Drupal development.

Contact us at [email protected] to learn more about Rocket.Chat and transform your team communication and collaboration.

Feb 09 2019
Feb 09

Ever attended an art gallery and witnessed how modern artists use canvas to speak their thoughts?

By looking at the art and paying attention to its creation, a lot of "Ohhs" and "Ahhs" and expressions of awe and wonder tell us whether it has sufficiently aroused engagement or not.

But at the core of this practice, the gallery owner is always thinking about the endless number of ways to draw visitors to the gallery.

More visitors mean more conversions, which results in better progress.

Customer experience and engagement - two things every art gallery owner strives for.

Right?

Image of 7 people in an art gallery looking at the painting in front of them


Today most marketing teams are structured to drive traffic towards websites, hoping to generate leads and, ultimately, even more profit.

Yes, it might be an oversimplification of the trend, but that’s the standard marketing playbook. That’s where Conversion Rate Optimization (CRO) comes in. 

But, What exactly is CRO?

Let's discover. 

Everything about CRO

In internet marketing, conversion optimization, or conversion rate optimization, is a system for increasing the percentage of visitors to a website who convert into customers or, more generally, take any desired action on a webpage. The whole process involves understanding how users move through a particular website, what actions they take, and what's stopping them from completing their goals.
 

Images of ROI, targeting, usability, A/B testing, multivariate testing, credibility and landing pages, with conversion rate optimization in the middle


Importance of CRO 

  • Pay per click

The general idea of pay-per-click advertising was to target the audience fast by selecting what they could see. And you would agree that nowadays pay-per-click (Google AdWords) prices have risen to an extent where they directly affect conversions. Also, with the increase in digital devices and people engaging more with technology, more businesses have gone digital.

  • Rising online competition

Now that more people are becoming tech-savvy, competition among retailers has increased a lot; some big players are simply eating away at small retailers. That means if you want to convert your visitors into customers, you need a website that is easy to use and easy to customize.

You can take the help of Drupal as your website platform. It is one such CMS that allows you to set up your website easily and customize it as desired. 

Conversion optimization has the added benefit of letting you stay ahead of the competition by providing insights into what's going on with competitors' websites.
  • Combating the rising cost of digital marketing

Let's face it: pay-per-click isn't the only thing getting more expensive. Digital marketing costs are rising across virtually every "traffic source" ever known.

The whole point of marketing is to direct users towards your website. But how do you make sure that most of them actually make a purchase?

This is where CRO comes to the rescue.

By increasing the number of page visitors who make purchases, CRO helps improve conversion rates and thereby offsets the cost of digital marketing.
  • Streamlining the business 

A website that is continuously being optimized looks more legitimate than one that isn't.

Why?

Well, maybe because sites that aren't being optimized don't provide a clear path to their landing pages. For an online retailer, clear landing pages mean an inventory that can easily be searched and a clear view of the categories.

  • Saving a large amount of money 

So how can spending a large amount of money on your website result in saving money? 

You'll find that spending less money on each customer actually produces more money. Maybe you are not necessarily saving a lot, but you are definitely making a lot more, which eventually balances things out.

  • Improving the efficiency and layouts 

If you work with affiliate organizations or marketers, you'll find that many online retailers see CRO as a good way to get the news out about their products through a platform that already has an engaged audience. CRO makes your website more valuable to your affiliates and to any other marketing platforms. When a higher number of users who click through to your webpage actually make a purchase, your affiliates, pay-per-click advertisers, social media marketing campaigns, etc., make more because you are making more.

Common misconceptions related to CRO 

Some businesses see CRO as an unnecessary expense that doesn't really move their business ahead, whereas others see it as a silver bullet for their marketing woes: a key to high traffic and more leads.

Whether you use it or not, it is better to avoid the misconceptions that misguide teams and waste resources and time. Some of them are:

  • CRO is a single skill 

One of the biggest misconceptions among businesses is the entrenched belief that CRO is a single skill set. In reality, CRO is a broad practice that encompasses a wide range of skills. To be effective at conversion rate optimization, you need three skill sets:

Copywriting: Whether or not you can write persuasive copy has a great impact on conversion rates.

Design: Everything from the UI/UX to the choice of graphics strongly influences the rate of conversions.

Analytics: It is important to have someone with the specialized skills necessary to analyze your results.

  • It is all about best practices

Stumbling upon blog posts and articles that promise best practices for boosting your conversion rate has become standard now, and implementing those practices exactly as written is conventional. But do these tricks really work?

The truth is that there are no one-size-fits-all best practices that can lead you to the path of better conversions. Your focus should always be on removing the barriers that hinder the flow of conversions.

  • Making small changes that lead to big and better rewards
     
Image of two poster having a girl divided inside a square. The first is having short content with 102.5% green up arrow and the other is having big text with 22.7% red down arrow

 

The pictures above clearly show how changing the content length affected conversions and resulted in 90% more clicks.

Going by this case study, you might be tempted to look for the silver bullet: a minor change that reaps great rewards.

In truth, case studies like these are misleading and provide only partial information. They don't tell you:

How long were the tests run?

Did the traffic remain constant throughout the testing period?

What other changes were made to the website?

  • It is all about split testing 

Most people think that CRO is all about split testing site elements.

The truth is that CRO is about measuring your customers' actions and removing conversion barriers.

To do this, start with basic user needs and an understanding of customer psychology. This model will help you focus on the things that should be worked on:

A pyramid with five sections where the text is written as functional, accessible, usable, intuitive and persuasive

 

  • Focusing on CRO alone builds a successful business 

Due to the immense love showered on CRO in digital marketing, some companies believe it alone wins the game of online business.

True, CRO might increase your conversion rate from 1% to 2%, which does have a great impact on sales, but to reach the real heights of success you also need to take a closer look at your traffic, brand, and customers.

So What is the Structured Process in CRO?

  • Defining your conversion actions 

Conversion actions can be defined based on business goals and then implemented in web analytics. Promoting and producing content is one of the actions that should be implemented as soon as possible. The content technique will require you to engage in practices like:

  1. Targeted email marketing
  2. Marketing automation 
  3. Producing demo
  4. Live events
  5. Case studies

The content at this stage revolves around customer-relationship management through segmentation. When you segment your audience based on age, gender, geographical position, professional role, etc., you are better equipped to offer them targeted content that interests them.

  • Understand the prospects

A better understanding of your prospects helps you convert more offers. This is the stage where you look for indirect customer acquisition and brand awareness. Begin by mapping out the current situation and forming a clearer idea of your target audience, objectives, market, KPIs and current results. Knowing in advance where you stand and what you want to achieve gives you more context.

  • Research and Analytics

Once you have insights into your current situation and objectives, it is time to analyze them. In the analysis phase, you employ web analytics and other sources of information to form a hypothesis that can then be tested. This could include heatmaps, tests, insights, etc.

This ensures that your insights are based on the actual behavior of users and not on superstition.

  • Implementing a challenger and testing the hypothesis

To implement a good challenger, you need to choose a suitable testing method and then run a test. This involves turning backlogs and metrics into a measurement plan, and it is one of the most crucial parts of CRO.

Examining the hypothesis through multiple tests requires a good number of visitors and pages.

  • Validating 

After setting up and running the tests, you analyze the results. This generates insights that you then subsequently test. CRO is a continuous process of learning and refining.

Testing for CRO 

Conversion rate optimization tests refer to the various testing methodologies used to identify the version of a site that brings in the most valuable traffic.

  • Principles of CRO 

Speed

Back in 2009, Google ran experiments showing that slowing down the search results page by under half a second resulted in 0.2% to 0.6% fewer searches.

Those might not sound like big numbers, but for Google (which processed roughly 40,000 searches every second) they meant 14,400 fewer searches every minute, and according to Google itself, people have only become more impatient since. That makes speed an important factor.
 
Singularity 
 
A single takeaway, "less is more," is just about the right mantra for every website owner.
Many landing pages contain multiple offers.
 
They shouldn’t.
 
Why?
 
Having just one offer improves campaign clicks and helps increase conversion rates.

Identification

Identifying your audience and their aspirations and desires means high conversions. In other words, what they want, what matters to them, and what their sources of friction are all fall under identification.

To identify people, you must know them. After all, sales and marketing will only see those conversions when you identify your own customers.

Landing Pages

Take some advice and do not clutter your pages or emails with "what if" moments.

What if the user likes my page?

What if more of the audience likes the information being served on the page?

What if they want to read my testimonials and case studies?

If the goal of creating a particular page is to get likes and subscribers, then that is all you should focus on. Your landing pages should be as precise and simple as they can be. This gives your audience and your customers a clearer idea of what you are selling.

  • Testing Methods 

A/B testing

Businesses want visitors to take an action (a conversion) on the website, and the rate at which a website is able to drive that action is called its "conversion rate."

A/B testing is the practice of showing 2 variants of the same webpage to different segments of website visitors at the same time and comparing which variation drives more conversions. 
 
The one that gives higher conversions wins!
 
The metric for conversion is different for every site. For commerce it might be a product sale, whereas for B2B it might be generating qualified leads. Well-planned, data-driven A/B testing makes your marketing plan more profitable by narrowing it down to its most important elements, testing them, and combining them.

Note that every element on your website that influences visitor behavior and conversion rate should be A/B tested.
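To illustrate the mechanics (a minimal sketch, not any particular tool's implementation), an A/B test needs a stable way to assign each visitor to a variant:

```js
// Deterministic A/B bucketing: hashing the visitor ID means the same
// visitor always sees the same variant on every page load.
function assignVariant(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple unsigned hash
  }
  return hash % 2 === 0 ? 'A' : 'B';
}

console.log(assignVariant('visitor-12345')); // same bucket on every call
```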

Image of two laptops, where laptop A has a 23% red bar on the screen and laptop B has a 37% green bar on the screen


Multivariate Tests 

In a multivariate test, you identify a few key areas/sections of a page and then create variations for those sections specifically (as opposed to creating variations of the whole page in an A/B split test). So for example, in a multivariate test, you choose to create different variations for 2 different sections: headline and image. A multivariate testing software will combine all these section specific variations to generate unique versions of the page to be tested and then simply split traffic amongst those versions.

Multivariate testing looks to provide the solution: you can change a title and an image at the same time. With multivariate tests, you test a hypothesis in which several variables are modified and determine which combination among all possible variants performed the best. If you create 3 different versions of 2 specific variables, you have nine combinations in total (the number of variants of the first variable times the number of variants of the second).

Flow chart of a multivariate test, starting from "Buy now" and dividing into two parts, one labeled original and the other variation

 

  • Some Testing tools 

Now we know that conversion rate optimization (CRO) focuses on simple tests that let you compare and contrast layouts, calls to action, design features, content and even personalized marketing features to nudge your conversion rate to new heights.

Therefore, here are some wicked yet sweet CRO tools to help you with testing.

Optimizely: Optimizely requires a single line of lightweight code added to any website. The user then has the power to change any on-page element of the website.

Google Analytics: Most websites have this tool built in, and it is one of the most popular testing tools. Google Analytics helps you split your traffic between the two pages you have developed and lets you know which version performs best.
  
Visual Website Optimizer: This tool specifically helps you figure out what your testing windows will look like after you plug in a few variables. Visual Website Optimizer is great for company projects and client work.

Adobe Target: Adobe Target is a popular enterprise tool that combines targeting, testing and personalization. It walks you through a three-step workflow where you first create a variant, then target a variant base, and lastly customize your goals.

Apptimize: A testing tool that focuses entirely on mobile optimization, making it a perfect choice for mobile apps and businesses. It offers full control over the visual editor and rapidly creates new variants and targets.

Conclusions 

Now we know that the most important goal for an organization is to create conversions. Conversions are how you measure your progress and growth.

Opensense Labs is aware of how important it is to understand visitors' preferences and interests. Our services in Drupal personalization and CRO connect us with our clients and help accelerate their conversions.

Ping us at [email protected] and let us help you navigate your successes and hurdles.

Feb 08 2019
Feb 08

Welcome to the latest version of Lullabot.com! Over the years (since 2006!), the site has gone through at least seven iterations, with the most recent launching last week at the 2019 Lullabot team retreat in Palm Springs, California.

Back to a more traditional Drupal architecture

Our previous version of the site was one of the first (and probably the first) decoupled Drupal ReactJS websites. It launched in 2015.

Decoupling the front end of Drupal gives many benefits including easier multi-channel publishing, independent upgrades, and less reliance on Drupal specialists. However, in our case, we don’t need multi-channel publishing, and we don’t lack Drupal expertise.

One of the downsides of a decoupled architecture is increased complexity. Building blocks of our decoupled architecture included a Drupal 7 back end, a CouchDB middle-layer, ReactJS front end, and Node.js server-side application. Contrast this with a standard Drupal architecture where we only need to support a single Drupal 8 site.

The complexity engendered by decoupling a Drupal site means developers take longer to contribute certain types of features and fixes to the site. In the end, that was the catalyst for the re-platforming. Our developers only work on the site between client projects so they need to be able to easily understand the overall architecture and quickly spin up copies of the site.

Highlights of the new site

In addition to easily swapping in and out developers, the primary goals of the website were ease of use for our non-technical marketing team (hi Ellie!), a slight redesign, and to maintain or improve overall site speed.

Quickly rolling developers on and off

To aid developers quickly rolling on and off the project, we chose a traditional Drupal architecture and utilized as little custom back-end code as possible. When we found holes in functionality, we wrote modules and contributed them back to the Drupal ecosystem. 

We also standardized to Lando and created in-depth documentation on how to create a local environment. 

Ease of use

To enable our marketing team to easily build landing pages, we implemented Drupal’s new experimental Layout Builder module. This enables a slick drag-and-drop interface to quickly compose and control layouts and content.

We also simplified Drupal’s content-entry forms by removing and reorganizing fields (making heavy use of the Field Group module), providing useful descriptions for fields and content types, and sub-theming the Seven theme to make minor styling adjustments where necessary.

Making the front end lean and fast 

Normally, 80% of the delay between navigating to a webpage and being able to use the webpage is attributed to the front end. Browsers are optimized to quickly identify and pull in critical resources to render the page as soon as possible, but there are many enhancements that can be made to help it do so. To that end, we made a significant number of front-end performance optimizations to enable the rendering of the page in a half-second or less.

  • Using vanilla JavaScript instead of a framework such as jQuery enables the JS bundle size to be less than 27kb uncompressed (to compare, the previous version’s bundle size was over 1MB). Byte for byte, JavaScript impacts the performance of a webpage more than any other type of asset. 
  • We heavily componentize our stylesheets and load them only when necessary. Combined with the use of lean, semantic HTML, the browser can quickly generate the render-tree—a critical precursor to laying out the content.
  • We use HTTP2 to enable multiplexed downloads of assets while still keeping the number of HTTP requests low. Used with a CDN, this dramatically lowers the time-to-first-byte metric and time to download additional page assets.
  • We heavily utilize resource hints to tell the browser to download render-blocking resources first, as well as to connect to third-party services immediately (see the sketch after this list).
  • We use the Quicklink module to pre-fetch linked pages when the browser is idle. This makes subsequent page loads nearly instantaneous.
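
As a concrete illustration of the resource-hints bullet above, here is a minimal sketch of how a Drupal module can emit preconnect and preload hints. The module name and URLs are assumptions for illustration, not code from this site; html_head_link is the core attachment key that renders link tags.

<?php

/**
 * Implements hook_page_attachments_alter().
 *
 * A hedged sketch: the module name "perf_hints" and the origins below
 * are placeholders, not this site's actual code.
 */
function perf_hints_page_attachments_alter(array &$attachments) {
  // Open a connection to a third-party origin as early as possible.
  $attachments['#attached']['html_head_link'][] = [
    ['rel' => 'preconnect', 'href' => 'https://cdn.example.com', 'crossorigin' => TRUE],
    TRUE,
  ];
  // Ask the browser to fetch a render-blocking stylesheet at high priority.
  $attachments['#attached']['html_head_link'][] = [
    ['rel' => 'preload', 'href' => '/themes/custom/example/css/critical.css', 'as' => 'style'],
    TRUE,
  ];
}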

There are still some performance @todos for us, including integrating WebP images (now supported by Chrome and Firefox) and lazy-loading images.

Contributing modules back to the Drupal ecosystem

During development, we aimed to make use of contributed modules whenever it made sense, which allowed us to implement almost all of the features we needed. Only a tiny fraction of our needs was not covered by existing modules. One of Lullabot's core values is to Collaborate Openly, which is why we decided to spend a bit more time on our solutions so we could share them with the rest of the community as contributed modules.

Using Layouts with Views

Layout Builder builds upon the concept of layout regions. These regions are defined in custom modules; editors use Layout Builder to insert a layout into a page and then place content into its regions.

Early on, we realized that the Views module lacked the ability to output content into these layouts. Lullabot’s Director of Technology, Karen Stevenson, created the Views Layout module to solve this issue. This module creates a new Views row plugin that enables the Drupal site builder to easily select the layout they want to use, and select which regions to populate within that layout.
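
For context, here is the core mechanism such a row plugin builds on: a layout plugin accepts render arrays keyed by region and returns a render array for the whole layout. This is a hedged sketch, not code from the Views Layout module itself; 'layout_twocol' and its 'first'/'second' regions ship with Drupal core, while the region content is made up.

<?php

// A minimal sketch of rendering a core layout programmatically.
$layout_manager = \Drupal::service('plugin.manager.core.layout');
$layout = $layout_manager->createInstance('layout_twocol', []);

// Keys are regions defined by the layout; values are render arrays.
$regions = [
  'first' => ['#markup' => '<p>Main column content.</p>'],
  'second' => ['#markup' => '<p>Sidebar content.</p>'],
];

// The result is itself a render array that Drupal can print in a template.
$build = $layout->build($regions);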

Generating podcast feeds with Drupal 8

Drupal can generate RSS feeds out of the box, but podcast feeds are not supported. To get around this limitation, Senior Developer Mateu Aguiló Bosch created the Podcast module, which complies with podcast standards and iTunes requirements.

This module utilizes the Views interface to map your site's custom Drupal fields to the necessary podcast and iTunes fields. For more information, check out Mateu's tutorial video here.

Speeding up Layout Builder’s user interface

As stated earlier, Layout Builder still has “experimental” status. One of the issues we identified is that the settings tray can take a long time to appear when adding a block in Layout Builder.

Lullabot Hawkeye Tenderwolf identified the bottleneck as the time it takes Drupal to iterate through the complete list of blocks in the system. To work around this, Karen Stevenson created the Block Blacklist module, which lets you specify blocks that should never be loaded. The result is a dramatically improved load time for the list of blocks.
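
As a rough idea of what such a configuration looks like, here is a hedged settings.php sketch of blacklisting a few core blocks. The exact settings key and format are assumptions on our part; consult the module's README for the authoritative syntax.

<?php

// Hypothetical settings.php snippet: list block plugin IDs that should
// never be loaded into block listings. The key name is an assumption.
$settings['block_blacklist'] = [
  'help_block',
  'search_form_block',
  'user_login_block',
];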

Making subsequent page loads instantaneous 

A newer pattern on the web (called the PRPL pattern) includes pre-fetching linked pages and storing them in a browser cache. As a result, subsequent page requests return almost instantly, making for an amazing user experience. 

Bringing this pattern into Drupal, Senior Front-end Dev Mike Herchel created the Quicklink module using Google’s Quicklink JavaScript library. You can view the result of this by viewing this site’s network requests in your developer tool of choice. 

Keeping users in sync using the Simple LDAP module

Lullabot stores employee credentials in an internal LDAP server. We want all the new hires to gain immediate access to as many services as possible, including Lullabot.com. To facilitate this, we use the Simple LDAP module (which several bots maintain) to keep our website in sync with our LDAP directory.

This iteration of Lullabot.com required the development of some new features and some performance improvements for the D8 version of the module.

Want to learn more?

Built by Bots

While the site was definitely a team effort, special thanks go to Karen Stevenson, Mike Herchel, Wes Ruvalcaba, Putra Bonaccorsi, David Burns, Mateu Aguiló Bosch, James Sansbury, and, last but not least, Jared Ponchot for the beautiful designs.

Feb 08 2019
Feb 08

Drupal 8 is known for the extensive third-party integration opportunities it gives to websites. One of the tools for this is the contributed Drupal JSON:API module. It helps developers build high-performance APIs for various purposes, including multi-channel content and decoupled Drupal setups (one of our Drupal team’s areas of expertise). This winter has seen a new release — Drupal JSON:API 2.x. Let’s take a look at what the module does, what makes it useful, and how it has changed in the 2.x version.

JSON:API: principle and benefits

JSON API is a specification, or a set of rules, for REST APIs. It defines how data is exchanged between the server and the client in the JSON format: how the client requests resources, how they are fetched, which HTTP methods are used, and so on.

JSON (JavaScript Object Notation), in turn, is the most popular format for APIs. It is very lightweight, consistent in structure, intuitively understandable, human-readable, and easily consumable by machines. At the root of every request is a JSON object.

The JSON API specification optimizes HTTP requests, giving you better performance and productivity. JSON API eliminates unnecessary server requests and reduces the size of the data packages.

The specification supports the standard CRUD operations that let users create, read, update, or delete the resources. It is also accepted by all programming languages and frameworks.

crud operations
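
To make this concrete, here is a hedged sketch of a typical read request against a Drupal JSON:API endpoint, using the Guzzle HTTP client. The host and bundle name are assumptions; the sparse-fieldsets, sort, and paging parameters come straight from the JSON:API specification.

<?php

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'https://example.com']);

// Read a collection of articles, fetching only the fields we need.
$response = $client->get('/jsonapi/node/article', [
  'headers' => ['Accept' => 'application/vnd.api+json'],
  'query' => [
    'fields[node--article]' => 'title,created', // Sparse fieldsets.
    'sort' => '-created',                       // Newest first.
    'page[limit]' => 10,                        // Ten items per page.
  ],
]);

$document = json_decode((string) $response->getBody(), TRUE);
// Each item in $document['data'] is one article resource object.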

A glimpse at Drupal JSON:API module’s work

The Drupal JSON:API module offers Drupal’s implementation of the JSON API specification. The module provides an API in compliance with the JSON:API standards for accessing the content and configuration entities of Drupal websites.

The JSON:API module is part of Drupal 8’s ecosystem of web services and an alternative to Drupal’s core REST. JSON:API resolves some of core REST’s limitations (for example, complex setup, confusing URLs, and hard-to-configure collections of entities). At the same time, it only works with entities.

Let’s mark some important points and benefits of the JSON:API work:

  • no configuration needed (enabling the module is enough to get a full REST API)
  • instant access to all Drupal entities
  • URLs provided dynamically for entity types and bundles so they are accessible via standard HTTP methods (GET, POST, PATCH, DELETE etc.)
  • support for Drupal entity relationships
  • support for complex sorting and pagination
  • access controlled by Drupal core’s role and permission system
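
Since the URLs above accept standard HTTP methods, writes look much like reads. Below is a hedged sketch of creating an article via POST; the host, credentials, and bundle are assumptions, and, as the last bullet notes, the request still has to pass Drupal's permission checks.

<?php

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'https://example.com']);

// Create an article node via the dynamically provided collection URL.
$response = $client->post('/jsonapi/node/article', [
  'auth' => ['api_editor', 'secret'], // Assumed account with permission.
  'headers' => [
    'Content-Type' => 'application/vnd.api+json',
    'Accept' => 'application/vnd.api+json',
  ],
  'body' => json_encode([
    'data' => [
      'type' => 'node--article',
      'attributes' => ['title' => 'Created over JSON:API'],
    ],
  ]),
]);

// A successful create returns 201 with the new resource in the body.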

As we see, the main philosophy of the module is to be production-ready out of the box. For configuration, there is the related contributed JSON:API Extras module, which lets developers set up every detail they need.

Drupal JSON:API 2.x: what’s new?

The Drupal JSON:API 2.x module is getting ready to become part of Drupal 8 core in the near future. Thanks to this, the ecosystem of web services for building high-performance APIs in Drupal 8 core will become more complete and diverse. It is also great that, instead of a NIH (not invented here) API, Drupal will have an API in core that follows a very popular specification.

The creators of the JSON API module have had a busy time preparing the 2.x version for Drupal 8 websites. Overall, 63 contributors took part, and there is still a big roadmap ahead.

They issued two beta versions in August and September, then three release candidates from October to December of 2018. 

Finally, the stable 2.x version came — JSON:API 2.0 on January 7, 2019, bringing big changes. Websites will benefit from:

  • serious performance improvements (including sparse fieldsets)
  • better compatibility with JSON:API clients
  • more comprehensive test case coverage (including edge cases)
  • backwards compatibility with JSON:API 1.x
Drupal JSON API 2.x improvements

As well as:

  • the ability to see labels of the inaccessible entities
  • information about the user available via “meta.links.me”
  • error response cacheability
  • the config entity mutation feature moved to the JSON:API Extras module
  • final farewell to the _format parameter in the URL
  • custom "URL" field to no longer added to file entities
  • filter paths closely match JSON:API output structure
  • URLs become objects with `href` keys

and more

Drupal JSON:API 2.0 was soon followed by a new release — JSON:API 8.x-2.1 on January 21, 2019. This is now the latest stable version. Drupal JSON:API 2.1 added two new features:

  1. support for file entity creation from binary data
  2. support for fetching non-default entity revisions (sketched below).
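
Here is a hedged sketch of the second feature. The UUID and host are placeholders, and resourceVersion=rel:working-copy asks for the latest revision rather than the published default; the exact negotiator values are best confirmed in the module's documentation.

<?php

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'https://example.com']);

// Fetch a non-default (e.g. draft) revision of a single article.
$response = $client->get(
  '/jsonapi/node/article/6e1d2f3a-1111-2222-3333-444455556666',
  [
    'headers' => ['Accept' => 'application/vnd.api+json'],
    'query' => ['resourceVersion' => 'rel:working-copy'],
  ]
);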

We should also note that the Drupal JSON:API 2.x module is part of the Contenta CMS decoupled distribution, which uses the best practices of decoupled architecture. JSON:API is immediately available with all new installs of Contenta.

The related JSON API Extras module for customizing APIs is also fresh and updated: it had a new 8.x-3.3 release on January 21.

Let’s build the optimal JSON:API setup

As we see, there are plenty of means and tools to build high-performance APIs. Our Drupal developers are skilled in this area and will select the optimal ones for your website, be it the Drupal JSON:API 2.x module, core RESTful web services, GraphQL, or something else. Contact our Drupal team!

Feb 08 2019
Feb 08

Today, our interactions with the digital world have come to outnumber our human interactions, so an appealing and friendly user interface plays an important role in an organization’s progress.

Government websites are no different.

The first connection they have with their citizens is likely to be an engaging website, one of the most essential tools for meeting the needs of the people.

So, it has to be the best. Right?

Image of a .gov text where the dot is red in color and the text is blue in color


Creating a functional website with easy navigation not only helps officials connect better with their constituents but also ensures that the public stays well informed at all times.

Drupal is one such platform which helps you achieve all of this in one go. 

How is Drupal performing in the government sector?

Drupal is gaining popularity in the government sector all over the world. The solidity and flexibility of the platform are the primary reasons why governments are moving their online portals to Drupal. Government websites like The White House, the Federal IT Spending Dashboard, and Data.gov have specifically chosen Drupal for its efficiency. The reason most government websites choose Drupal over other platforms is that it is:

  • Secure

Drupal has an excellent track record when it comes to solving and maintaining security issues. The Drupal security team, working together with the wider community, keeps watch and ensures that users get the best security practices.

The fact that The White House entrusts Drupal as its platform is enough to prove that it is a highly secure CMS. Several security modules make it a highly reliable platform, including:

Login Security: For Drupal sites available over both HTTP and HTTPS, this locks down the login page and submits other forms securely via HTTPS, preventing passwords and other sensitive user data from being transmitted in the clear.

Password Policy: This module forces users to create strong passwords.

Captcha: A challenge-response test constructed to determine whether an action is being performed by a human rather than a bot.

Security Kit: It provides Drupal users with various security-hardening options, making it harder for attackers to compromise the site’s data. Security Kit offers mitigations for cross-site request forgery, cross-site scripting, and clickjacking, among other issues.

Two-factor verification: A procedure in which two authentication steps are performed one after the other to verify the user requesting access.

Image of a lock, shield, monitor and a cloud placed on a blue background. The text on the image is Drupal 8, A secure way for web development
  • Accessible

Government websites are used by everyone, and by everyone, I mean the visually impaired too. Each citizen should be able to access a government website quickly and seamlessly, and according to WCAG 2.0 (the Web Content Accessibility Guidelines), every website should provide equal access to all people.
Drupal is one such platform: it adheres to every WCAG guideline through its various modules and provides accessibility to everyone.

Alt text: This is one of the most important modules when it comes to making a website accessible. With this module, search engines understand what an image or page contains, and screen readers read it aloud to the user.

Content accessibility: This module checks all field types where users can enter formatted content. Below the content section of the accessibility page, users can turn the individual tests on or off.

Accessibility in WYSIWYG: This module integrates the accessibility checks with the WYSIWYG or CKEditor modules, giving users a button that checks their work for accessibility issues while they edit content.
 

Image of the blue Drupal drop which has stick images of a person on a wheelchair and a person spreading its arms

 

  • Economical

A government’s budget for software development cannot be compared to what large enterprises spend for the same purpose. Governments need high-quality solutions that do not cost a fortune, and high development and maintenance costs can be an obstacle to finding them.

That Drupal meets these and many other requirements is why governments choose it as their CMS. The cost of Drupal development is relatively low compared with other CMSs on the market.

Another great feature of Drupal 8 is its scalability. The CMS is suitable both for small survey websites and for content-rich websites holding gigabytes of information.

At the same time, Drupal is capable of handling high-traffic situations. Web solutions built on it remain available even when traffic volume jumps sky-high; the UNESCO and CERN web channels are great examples. These Drupal-based websites offer a great user experience to the thousands of people who use them on a daily basis.

  • Easily Customized

When we say government websites, we automatically imagine something grey, black, white, or otherwise really boring. But remember that these websites serve both political and non-political purposes, so user interaction and engagement become crucial.

Better user engagement is made possible by Drupal and its modules. Administrators can add blogs, articles, and write-ups that open up opportunities and offer solutions. Drupal also gives its users the power to personalize their website according to their needs and requirements, making it a flexible and reliable CMS.

  • Has superb integration capabilities 

A powerful web solution should integrate seamlessly with third-party applications. Publishing tools, data repositories, and other features belong on the list of necessary integrations.

The integration capabilities offered by Drupal are enormous. It provides numerous options so that users can integrate almost any type of third-party service.

Drupal Distribution: DeGov to the rescue 

DeGov is the first Drupal 8 open-source distribution focusing entirely on the needs of governmental organizations. It is intended to provide a comprehensive set of functionalities commonly used in government-related applications.

What is DeGov not?

The DeGov distribution is not a commercial CMS (developed and owned by a single company, with users usually needing to buy a license), and it is not a finalized product. In other words, DeGov is not a ready-made solution: there are a lot of functionalities and ideas in the backlog, so it is a continuous work in progress.

DeGov is an attempt to realize the benefits of open source in the public sector.
 
Image of the logo of Degov in form of a circle where the text is DeGov in the middle

Use case related to DeGov

DeGov covers 6 use cases and extends its valuable functions to meet specific scenarios:

  • Publishing information websites for government organizations of all levels.
  • Service-oriented e-government portals that close the gap between citizens and the administration.
  • Citizen engagement portals to debate and discuss online.
  • Open311 portals for civic issue tracking.
  • Open Data portals for publishing data and building communities.
  • Intranets and extranets for government employees.

An infographic with 6 parts describing the use cases of DeGov.

Beneath the Canopy of DeGov

DeGov is a sub-profile of the Lightning distribution.

Lightning allows you to create sub-profiles based on the default Lightning distribution profile. Creating a sub-profile lets you customize the installation process to meet specific needs and requirements.

By building on top of Lightning, DeGov delivers configuration specialized for starting new projects, leveraging Lightning in a true open-source manner while focusing on functionality for the public sector.

Which problems does DeGov solve?

Editor 

DeGov solves issues related to complex back ends, missing workflows, painful multi-language support, and a lack of modernity. It gives editors the highest design flexibility in maintaining the website, along with simple content editing.

With the help of the “Drag and Drop” function, you can easily structure your page without any programming. Predefined page types help with easy maintenance, and the central media library manages all types of media, no matter the form (pictures, PDFs, videos, posts).

The DeGov distribution also makes integrating social media easy. You can let users share content in the form of articles, blogs, and other write-ups, and DeGov uses a privacy-compliant integration of social media share buttons.

Customers and clients 

DeGov solves issues related to feature lock-in and the high cost of ownership that come with old, proprietary technologies. It offers a user-friendly web front end with attractive designs; it is responsive and thus adapts to any device and screen size. The use of clean HTML/CSS makes a website accessible, and pages can be translated into several languages, with a language switcher and automatic screen-reader support easing the workflow.

DeGov websites receive a powerful search function covering both website content and documents. This is particularly convenient for the authorities: the administration’s search engine in NRW is connected to the service so that users can also search content outside their own area of responsibility.

Companies and developers 

It solves issues related to high-cost updates and the functionality tied to them. The distribution is based on open-source software; as a result, you pay neither for a license nor for the product itself.

Drupal is used in numerous projects, so the basic functions and functionalities are constantly updated and expanded, and bugs and vulnerabilities are fixed quickly.

Implementations with DeGov

Websites

The DeGov distribution can realize federal and state portals, as well as internet sites for ministries, authorities, districts, and municipalities. Web pages and target-group-specific portals (or topic pages) can be created easily with DeGov.

Participation Portals

With participation portals that let citizens take part in decisions and proposed measures, the DeGov distribution has opened doors in terms of communication. Here you can notify citizens, discuss with users, gather valuable suggestions, or create polls. From participatory budgeting through bills to construction projects, portals built with DeGov give citizens a voice.
 
E-Government Portals

The DeGov distribution is a great way to implement entire e-government portals, where the emphasis lies less on editorial content than on complete specialist procedures. In this way, DeGov allows the digital processing of administrative workflows. The project Gewerbe.NRW was implemented with the help of the DeGov distribution.

Why DeGov with Drupal 8?

DeGov modules are Drupal modules that fit carefully together, so the whole system benefits from their combined functionality. The reasons why Drupal 8 belongs with DeGov:

  • It is based on Symfony and uses Composer.
  • Its config files are accessible via Git.
  • It is cloud-ready and runs like a modern PHP solution.

Better projects with DeGov

Image of a table having the list of all media types

A case study on Gewerbe-Service-Portal.NRW

The new "Gewerbe-Service-Portal.NRW" has been providing citizen friendly services by allowing company founder in German federal state North Rhine-Westphalia (NRW) to electorally register a business from home. Its main aim was to provide aid of a clearly arranged online form, commercial registration and that can be transmitted to responsible citizens.

In addition to business registration, the portal provides information on the topic of founding an enterprise. Furthermore, all users had access to the Einheitliche Ansprechpartner NRW, and the online service supported specialized staff in taking up a service occupation.

The portal was developed on the basis of the Drupal-based content management system DeGov and nrwGOV. Drupal was chosen because it was cost-effective and because new technologies could be adopted from the Drupal community. Apart from this, Drupal provided:

  • Higher safety and better quality
  • Independence
  • Comprehensive options
  • Accessibility

The portal aimed at giving more flexibility to entrepreneurs eligible to start their own businesses by saving time through digitization. The electronic forwarding and processing of applications by the authorities ensured they were handled effectively. The result was effective and user-friendly communication between citizens and authorities, and in the near future the Gewerbe-Service-Portal.NRW will develop into a comprehensive service platform so that administrative processes can be carried out from home.

In a Nutshell

So how important is the government website to you?

The answer might have been crystal clear by now. 

As important as it is to have a website, maintaining it is a whole different task. Yes, Drupal makes that easy through its functionality and distributions, but the art of maintaining a website is as important as creating it.

Ping us at [email protected]; our services would not only get the best out of Drupal but would also help in enhancing your development and industry standards.

Feb 08 2019
Feb 08

This Episode's Guests

Karen Stevenson


Karen is one of Drupal's great pioneers, co-creating the Content Construction Kit (CCK) which has become Field UI, part of Drupal core.

Mateu Aguiló Bosch


Loves to be under the sun. Very passionate about programming.

Wes Ruvalcaba


Wes is a designer turned front-end dev with a strong eye for UX.

Putra Bonaccorsi


An expert in content management systems like Drupal, Putra Bonaccorsi creates dynamic, user friendly sites and applications.

Feb 08 2019
Feb 08

Innovation within Canadian healthcare continues to provide better care experiences for those using the system.  As the population ages and strains our facilities to care for those nearing their end-of-life, hospitals are looking at technological solutions to ease the burden on emergency rooms and give people access to accurate and timely healthcare.   Nextide partnered with uCarenet, a Toronto-based e-health company, to create an innovative health and wellness application to monitor the condition of palliative care patients for a major Canadian hospital.

The Requirements:

The hospital required an application that could gather critical patient data through forms, compile the results daily, report those results to the patient’s healthcare team and alert the team in the event the results warranted intervention.  The application had to be accessible to all participants, easy to use and mimic the existing paper-based data collection mechanism the palliative care team employed. The hospital staff required administrative logins to create users and monitor the entered data in a simple-to-use and uncluttered dashboard.  Doctors and nurses should be able to determine who has symptoms which require immediate attention, and be able to view the discrete data if required.

The Solution:

At Nextide, we look at Drupal as a platform for building applications, not purely as a feature-rich content management system for websites. We were confident that Drupal was the right selection for this application. Through careful design, our team created an interactive application that gathered patient data, compiled and weighted the results based on current and historical data, and sent the palliative care team notifications when warranted.

First, the palliative care user’s experience:

User's view when ability to enter data is open or closed

The image above shows two versions of the palliative care user’s home page.  The left side is the home page when they are able to complete their daily forms.  On the right, if the user completed their forms within the last 24 hours, they’re locked out of completing a new set of forms until the 24 hour window has passed.

Once daily the palliative care users are able to fill in the three required forms.  An example of one of the forms is shown here:

Showing the Brief Pain Inventory Form

On any device the patient or caregiver has, patient data can be recorded, with Drupal storing all form data in custom entities. Themed form buttons and select lists are presented as easy-to-use clickable human bodies, sliders, and radio buttons.

Upon completion of the patient information forms, the results are compiled according to the hospital’s predefined requirements. The three forms have at least 50 discrete data points in total. We take the data entered, compare it to the last three to five days of the patient’s data, and determine trends within those fields that require an alert to be triggered. If a user is deemed to be “alerting”, we send a notification to the healthcare team and flag the user and the fields as alerting.
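
The hospital's actual weighting rules are proprietary, so the following is only a simplified, hypothetical sketch of the shape of such a trend check for a single data point.

<?php

/**
 * Hypothetical illustration: flag one data point as "alerting" when
 * today's value drifts too far above the patient's recent baseline.
 * The real application weights roughly 50 data points against the
 * hospital's predefined requirements.
 */
function example_is_alerting(float $today, array $recent_days, float $ratio = 1.5): bool {
  // Require at least three days of history to establish a trend.
  if (count($recent_days) < 3) {
    return FALSE;
  }
  $baseline = array_sum($recent_days) / count($recent_days);
  return $baseline > 0 && ($today / $baseline) >= $ratio;
}

// Example: pain scores of 3, 4, and 3 over recent days, then 8 today.
$alerting = example_is_alerting(8.0, [3.0, 4.0, 3.0]); // TRUE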

The Healthcare User’s Experience:

When a healthcare user enters the system, their homepage consists of a simple-to-use dashboard. The dashboard presents the user with an easy-to-read listing of patients, with their status plainly visible.

Healthcare Dashboard

The healthcare user is able to drill into any patient’s data, giving them a graphical breakdown of the historical patient data.  The nurse or doctor is able to see the trends in the patient data and is able to take appropriate action. Healthcare users are able to acknowledge the alerts, thereby clearing the status of the patient, and they’re able to silence alarms for any given patient so they do not receive alerts for 24, 48 or 72 hours.

Drilldown into stats

We generate on-the-fly graphs of patient data.  Healthcare professionals can also drill even further into each entity where they’re able to view each form’s entry and mute any alerting fields.

Alerts panel

Themed Drupal entities give the healthcare professional a quick breakdown of patient entered values.  If a field is alerting, it’s highlighted and gives the user the option to silence the alarm by clicking on the alerting symbol.  Users are able to scan back and forth through daily entries. The interface is easy to use, uncluttered and responsive in design.


The Outcomes:

The initial pilot phase of the project had between 15 and 20 patients entering data at any given time.  In the first 3 months of the pilot:

  • 105 critical alerts were generated by the system and sent to the patients’ circle of care.
  • 46 of the 105 critical alerting patients received follow-up intervention.
  • 7 patients who had not triggered an alert received telephone intervention after the healthcare staff noticed trends in their data.
  • Saved an estimated $62,000. That’s right! This single application, over the course of three months, saved over sixty thousand dollars in unnecessary Emergency Room care!

Every time a patient receives proactive care, the likelihood that they require an Emergency Room visit is reduced, thereby reducing the load on the hospital.  When you consider the small pilot size and the number of interventions required, each intervention could have eliminated a potential Emergency Room visit. A resounding success!

When you look at Drupal as “just a content management system”, you lose sight of what it really is:  a platform to produce applications that matter. At Nextide, we have been focused on Drupal since 2009. Our experience in Drupal ensures that we utilize the most effective modules to construct sites and applications that are optimized for performance and functionality. We continue to leverage Drupal in many more ways than just content management.  Our core strengths are focused on the design and build of creative sites, and web based business applications, that mirror the way people think and work. We can custom create any business function or create the ideal work environment for people to perform at their best. The capability to deliver robust custom applications to our customers allows us to deliver solutions from healthcare, to HR, and everything in-between.

Need Help?

Do you have a Healthcare initiative we can help with?

Feb 08 2019
Feb 08

Web monetization is a browser API that allows the creation of (micro)payments between the reader (user agent) and the content provider (website).

This is one way of getting paid for valuable content.

Today, Coil provides a web monetization service using the Interledger Protocol (ILP).

We have built a simple module to integrate Coil monetization with a Drupal website.

The simple settings for adding a payment pointer are demonstrated in this video:

[Video: adding a payment pointer in the module’s settings]
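
Under the hood, the Web Monetization draft spec is driven by a single meta tag carrying the payment pointer. Here is a hedged sketch of how a Drupal module might emit it; the module name and pointer value are assumptions, not our module's actual code.

<?php

/**
 * Implements hook_page_attachments().
 *
 * Hypothetical sketch: attach the Web Monetization meta tag to every
 * page. The "monetization" meta name comes from the draft spec; the
 * payment pointer below is a placeholder.
 */
function example_monetization_page_attachments(array &$attachments) {
  $attachments['#attached']['html_head'][] = [
    [
      '#tag' => 'meta',
      '#attributes' => [
        'name' => 'monetization',
        'content' => '$wallet.example.com/demo',
      ],
    ],
    'web_monetization_pointer',
  ];
}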

Feb 07 2019
Feb 07

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

I'm frequently sent examples of how Drupal has changed the lives of developers, business owners and end users. Recently, I received a very different story of how Drupal had helped in a rescue operation that saved a man's life.

The Snowdonia Ultra Marathon website

In early 2018, Race Director Mike Jones was looking to build a new website for the Ultra-Trail Snowdonia ultra marathon. He reached out to a good friend and developer, Rob Edwards, to lead the development of the website.

A photo of a runner at the Ultra-trail Snowdonia ultramarathon

© Ultra-trail Snowdonia and No Limits Photography

Rob chose Drupal for its flexibility and extensibility. As an organization supported heavily by volunteers, open source also fit the Snowdonia team's belief in community.

The resulting website, https://apexrunning.co/, included a custom-built timing module. This module allowed volunteers to register each runner and their time at every aid stop.

A runner goes missing

Rob attended the first day of Ultra-Trail Snowdonia to ensure the website ran smoothly. He also monitored the runners at the end of the race to certify they were all accounted for.

Monitoring the system into the early hours of the morning, Rob noticed one runner, after successfully completing checkpoints one and two, hadn't passed through the third checkpoint.

A photo of a runner at the Ultra-trail Snowdonia ultramarathon

© Ultra-trail Snowdonia and No Limits Photography

Each runner carried a mobile phone with them for emergencies. Mike attempted to make contact with the runner via phone to ensure he was safe. However, this specific area was known for its poor signal and the connection was too weak to get through.

After some more time eagerly watching the live updates, it was clear the runner hadn't reached checkpoint four and more likely hadn't ever made it past checkpoint three. The Ogwen Mountain Rescue were called to action.

Due to the terrain and temperature, searching for the lost runner on foot would be too slow. Instead, the mountain rescue volunteers used a helicopter to scan the area and locate the runner.

How Drupal came to the rescue

The area covered by runners in an ultra marathon like this one is vast. The custom-built timing module helped rescuers narrow down the search area; they knew the runner passed the second checkpoint but never made it to the third.

After following the fluorescent orange markers in the area pinpointed by the Drupal website, the team quickly found the individual. He had fallen and become too injured to carry on. A mild case of hypothermia had set in. The runner was airlifted to the hospital for appropriate care. The good news: the runner survived.

Without Drupal, it might have taken much longer to notify anyone that a runner had gone missing, and there would have been no way to tell when he had dropped off.

NFC and GPS devices are now being explored for these ultra marathon runners to carry with them to provide location data as an extra safety precaution. The Drupal system will be used alongside these devices for more accurate time readings, and Rob is looking into an API to pull this additional data into the Drupal website.

Stories about Drupal having an impact on organizations and individuals, or even helping out in emergencies, drive my sense of purpose. Feel free to keep sending them my way!

Special thanks to Rob Edwards, Poppy Heap (CTI Digital) and Paul Johnson (CTI Digital) for their help with this blog post.

Feb 07 2019
Feb 07
by Shefali on 07-02-2019

“It takes time to save time”, said astrophysicist Joseph Hooton Taylor Jr., and how right he was! The good folks at Acquia have thoughtfully curated a package that can save businesses tons of time spent on marketing and editorial efforts. Acquia Lightning is an open-source Drupal 8 distribution designed to enhance, speed up, and simplify the process of creating powerful (and responsive!) digital experiences.

Drupal 8 gives developers and content authors full flexibility to shape websites and applications that meet their vision. It is packed with thousands of powerful features, which it needs to support a wide variety of content-rich applications. Acquia Lightning is a lean, ready-to-use starter kit that encompasses just the tools needed to develop and manage your enterprise-grade digital experiences. Lightning is built specifically to empower your marketing and editorial teams to build better, easier, and faster. Here are 5 reasons why choosing Acquia Lightning could be a great decision for your organization’s editorial and content teams.

1. Speed of Lightning

This aggressively competitive era calls for quick formulas that can produce epic digital experiences. Acquia Lightning enables organizations to develop new applications in Drupal 8 faster. According to Chris Stone, former Chief Product Officer at Acquia, “Lightning removes 20 percent of the time required for every new Drupal 8 development project.” With Acquia Lightning, Drupal developers can empower business users and non-technology-savvy users with powerful tools that let them build on their websites faster.

2. Drag This and Drop That

Lightning enables you to easily design and customize new website layouts yourself, without a Drupal developer’s assistance; content editors won’t need any code to design a layout. With the help of Panels, Lightning layouts let you drag and drop content blocks (elements) wherever you like! This powerful feature lets you create compelling user experiences, as you can visually design the layout of your website the way you want. With just a few clicks (and zero code), you can add rich text, media, content references, slideshows, Google Maps, and just about everything you need to create an interactive and engaging website. Not just that: your Drupal developers can also create custom components, which you can then use from your library.

3. Effortless Workflows

Having everything together, without rummaging through the place, makes life simpler and easier – don’t you think? Editorial workflows call for tons of significant activities like creating, editing, reviewing and publishing content. Acquia Lightning’s easy all-in-one-place workflow management system, lets you easily tackle the chaos. It provides you with an approval dashboard meant just for your editorial workflow. Here you can easily check the status of the content you are reviewing - change, delete or publish it at your convenience. The version tracker allows you to check for changes and updates between different revisions, giving you the freedom to revise, compare and revert your content. You can review them, pick the most appropriate version, schedule and publish the content, all in the dashboard.

4. Manage your Media

You cannot have an engaging website without leveraging the power of Media. Lightning offers an easy and effective approach to add, embed and manage your media like Videos, Audio files, Images, Social Media Widgets, Documents, Maps and DAM (Digital Asset Management). All added Media will be stored in a Media library, giving you easy and quick access to all your media. Using the CKEditor (rich text editor), you can attach or embed any media into any content type. Cropping and resizing your images is easy as pie.

With the power of Drupal 8, Acquia Lightning can give your business that edge over your competitors. Learn how and why.

With Acquia Lightning, designing interactive digital experiences is simply effortless.

5. Sneak Peek – Experience Preview

You’ve got to think like your customer if you want to sell better, and to provide great digital experiences, you should be able to see exactly what your customer is seeing. Acquia Lightning for Drupal 8 (upgrade now!) enables you to conduct a comprehensive preview at every stage, so you can view what you publish before it goes live.


Acquia Lightning Features

Right out of the box, Acquia Lightning is a slick, lightweight, all-you-need Drupal 8 solution that will meet all your web initiatives’ demands. It is one of the best ways to jump-start your Drupal projects and can accelerate your time to market. Headless Lightning lets you create beautiful and effective decoupled web applications with ease. Security is almost synonymous with the Drupal CMS, and Acquia Lightning conforms to security best practices: it undergoes regular security audits and releases frequent patches to keep your website’s health in check. Acquia Lightning makes development on Drupal simpler, faster, and more powerful.

Feb 07 2019
Feb 07

A while ago, we wrote a post on the history of the Druplicon. As we pointed out in this post, our beloved Drupal logo, the drop, went through quite a few iterations to arrive at the point where it is today, known by everyone in the community. On top of that, because of the same prolific community, various versions of the logo have been created for special occasions, such as new releases and events, and for different topics and regions.

So, in the 18 years since Drupal’s conception, the community has seen a wide range of different Druplicons. But, unlike for other Drupal-related material, such as modules, there was no unified platform where one could get an overview of the Druplicon’s evolution and all its variations throughout the years.

Druplicon version Delta from 2013

The Official Origin Story 

Officially, this is what sparked the idea of druplicon.org for Vesna, one of our Drupal developers. She was impressed by the large number of Druplicons and wanted to create a website that would display all existing Druplicons in one place. 

In order to bring the Druplicon closer to the community, she decided to gather all the diverse versions of the Druplicon and turn their discovery into something fun and interactive. 
 

... And the Actual One

Well, the origin story above is not exactly untrue; there is more to it, though. Vesna actually revealed the whole story behind what drove her to create druplicon.org - and it’s super fascinating! 

At Drupal Developer Days Milan 2016 she caught a glimpse of someone wearing a T-shirt with what appeared to be a Druplicon in the style of Joan Miró i Ferrà, one of her favorite painters. She isn’t completely sure, but she thinks the logo was that of a Drupal-related event in Barcelona (Joan Miró was a Spanish painter, sculptor, and ceramicist born in Barcelona, so there’s the connection). 

Of course, it’s not always easy to approach someone you don’t know and just spark up a conversation. It’s totally understandable, then, that she didn’t go to him to find out about his shirt and ask him if she can take a photo of the Druplicon for Instagram - and it’s even more understandable that she immediately regretted not doing so! 

She went on to scour Google relentlessly, trying to find the lost Druplicon - but, sadly, to no avail. So, she decided to make a database of all existing Druplicons, with the hidden agenda of maybe eventually finding this mythical one.

Memory Game

Visitors to druplicon.org can thus easily explore the icons by different categories, get additional information about them, clearly see which are the newest ones and even play an interactive memory game with the icons.

Due to the abundance of different Druplicons, the game is not exactly an easy one - even when opting for the “easy” mode. Playing it regularly, however, will quickly improve your knowledge of existing Druplicons. 

And, since the game is designed to be educational as well as fun to play, it greatly helps with remembering which Drupal event or aspect of the larger community a specific icon is connected to. 

Whenever you discover a matching pair, you get information about the event or topic that the icon has been used for, together with a link to the event’s or the community’s page on drupal.org.

So, attend Drupal events, memorize their logos, get to know the community, then play the game regularly and become a true Druplicon master! (Pro tip: tackling the harder levels on a big screen will make the game much easier).

Druplicon.org memory game example

Staying True to the Spirit of Drupal

There’s another very Drupal-esque aspect of the site: in the spirit of Drupal contribution, users who register on the platform get the opportunity to submit new Druplicons. Because, let’s face it, there are so many versions of Drupal’s logo that it would almost be megalomaniacal to think we’ve caught them all.

So, if you discover or think of any Druplicons we might have missed, you’re more than welcome to join druplicon.org and add them to the platform - especially if one of them is the elusive Joan Miró Druplicon!

The Power of the Community

Druplicon.org is thus a collective effort in the true sense of the word. Staying true to the all-encompassing nature of Drupal, it’s a site anyone can contribute to and thus connect with members of the Drupal community no matter where they hail from. 

And, fittingly, the site itself is of course built in Drupal 8; it was actually built by our freshly recruited developers as part of their onboarding project (if you want to know more about how we onboard our new developers, take a look at this post on our effective training program). 

Doing so, they got hands-on experience with both crucial aspects of Drupal: coding and giving back to the community. As such, druplicon.org is truly a site by and for the Drupal community, showcasing the power of said community. 

Call to Action

This, then, is the perfect opportunity to try to get the community involved in our search for the lost Druplicon. Remember how we mentioned before that (a major!) part of the motivation behind the creation of the site was finding the Joan Miró Druplicon?

Well, we know that the Druplicon is somewhere out there, just waiting to be discovered. So, now we’re calling on all of you, especially those who attended Drupal Developer Days Milan in 2016, to spread the word, explore druplicon.org and contribute with any missing Druplicons. Together, we can surely find the lost Druplicon and make Vesna happy!

Joan Miró i Ferrà: Zephyr Bird

(If anyone has any information on the icon, but doesn’t want to join the platform for any reason whatsoever, please give us a shout out - any info is helpful!)


 

 


Feb 06 2019
Feb 06

If you are a programmer looking to improve your professional craft, there are many resources toward which you will be tempted to turn. Books and classes on programming languages, design patterns, performance, testing, and algorithms are some obvious places to look. Many are worth your time and investment.

Despite the job of a programmer often being couched in technical terms, you will certainly be working for and with other people, so you might also seek to improve in other ways. Communication skills, both spoken and written, are obvious candidates for improvement. Creative thinking and learning how to ask proper questions are critical when honing requirements and rooting out bugs, and time can always be managed better. These are not easily placed in the silo of “software engineering,” but are inevitable requirements of the job. For these less-technical skills, you will also find a plethora of resources claiming to help you in your quest for improvement. And again, many are worthwhile.

For all of your attempts at improvement, however, you will be tempted to remain in the non-fiction section of your favorite bookstore. This would be a mistake. You should be spending some of your time immersed in good fiction. Why fiction? Why made-up stories about imaginary characters? How will that help you be better at your job? There are at least four ways.

Exercise your imagination

Programming is as much a creative endeavor as it is technical mastery, and creativity requires a functioning imagination. To quote Einstein:

Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.

You can own a hammer, be really proficient with it, and even have years of experience using it, but it takes imagination to design a house and know when to use that hammer for that house. It takes imagination to get beyond your own limited viewpoint. This can make it easier to make connections and analogies between things that might not have seemed related, which is a compelling definition of creativity itself.

Your imagination works like any muscle. Use it or lose it. And just like any other kind of training, it helps to have an experienced guide. Good authors of fiction are ready to be your personal trainers.

Understanding and empathy

The best writers can craft characters so real that they feel like flesh and blood, and many of those people can be similar to actual people you know. Great writers are, first and foremost, astute observers of life, and their insight into minds and motivations can become your insight. Good fiction can help you navigate real life.

One meta-study suggests that reading fiction, even just a single story, can help improve someone’s social awareness and reactions to other people. For any difficult situation or person you come across in your profession, there has probably been a writer that has explored that exact same dynamic. The external trappings and particulars will certainly be different, but the motivations and foibles will ring true.

In one example from Jane Austen’s Mansfield Park, Sir Thomas is a father who puts too much faith in the proper appearances, and after sternly talking to his son about an ill-advised scheme, the narrator of the books says, “He did not enter into any remonstrance with his other children: he was more willing to believe they felt their error than to run the risk of investigation.”

You have probably met a person like this. You might have dealt with a project manager like this, one who will limit communication rather than “run the risk of investigation”. No news is good news. Austen has a lot to teach you about how one might maneuver around this type of personality. Or, you might be better equipped to recognize such tendencies in yourself and snuff them out before they cause trouble for yourself and others.

Remember, all software problems are really people problems at their core. Software is written by people, and the requirements are determined by other people. None of the people involved in this process are automatons. Sometimes, how one system interfaces with another has more to do with the relationship between two managers than any technical considerations.

Navigating people is just as much a part of a programmer’s job as navigating an IDE. Good fiction provides good landmarks.

Truth and relevance

This is related to the previous point but deserves its own section. Good fiction can tell the truth with imaginary facts. This is opposed to much of the news today, which can lie with the right facts, either by omitting some or through misinterpretation.

Plato, in his ideal republic, wanted to kick out all of the poets because, in his mind, they did nothing but tell lies. On the other hand, Philip Sidney, in his Defence of Poesy, said that poets lie the least. The latter is closer to the truth, even though it might betray a pessimistic view of humanity.

Jane Austen’s novels are some of the most insightful reflections on human nature. Shakespeare’s plays continue to last because they tap into something higher than “facts”. N.N. Taleb writes in his conversation on literature:

...Fiction is a certain packaging of the truth, or higher truths. Indeed I find that there is more truth in Proust, albeit it is officially fictional, than in the babbling analyses of the New York Times that give us the illusions of understanding what’s going on.

Homer, in The Iliad, gives us a powerful portrait of the pride of men reflected in the capriciousness of his gods. And, look at how he describes anger (from the Robert Fagles translation):

...bitter gall, sweeter than dripping streams of honey, that swarms in people's chests and blinds like smoke.

That is a description of anger that rings true and sticks. And maybe, just maybe, after you have witnessed example after vivid example of the phenomenon in The Iliad, you will be better equipped to stop your own anger from blinding you like smoke.

How many times will modern pundits get things wrong, or focus on things that won’t matter in another month? How many technical books will be outdated after two years? Homer will always be right and relevant.

You also get the benefit of aspirational truths. Who doesn’t want to be a faithful friend, like Samwise Gamgee, to help shoulder the heaviest burdens of those you love? Sam is a made up character. Literally does not exist in this mortal realm. Yet he is real. He is true.

Your acts of friendship might not save the world from unspeakable evil, but each one reaches for those lofty heights. Your acts of friendship are made a little bit nobler because you know that they do, in some way, push back the darkness.

Fictional truths give the world new depth to the reader. C.S. Lewis, in defending the idea of fairy tales, wrote:

He does not despise real woods because he has read of enchanted woods: the reading makes all real woods a little enchanted.

Likewise, to paraphrase G.K. Chesterton, fairy tales are more than true — not because they tell us dragons exist, but because they tell us dragons can be beaten.

The right words

One of the hardest problems in programming is naming things. For variables, functions, and classes, the right name can bring clarity to code like a brisk summer breeze, while the wrong name brings pain accompanied by the wailing and gnashing of teeth.

Sometimes, the difference between the right name and the wrong name is thin and small, but represents a vast distance, like the difference between “lightning” and “lightning bug,” or the difference between “right” and “write”.  
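
To see how much distance a name can cover, here is a contrived sketch; the function names and the tax rate are invented for illustration:

  <?php

  // The wrong name: process what? Return what? The reader has to dig.
  function process($d) {
    return $d * 0.0825;
  }

  // The right name: now every call site reads like a sentence.
  function calculateSalesTax(float $subtotal): float {
    return $subtotal * 0.0825;
  }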

Do you know who else struggles with finding the right words? Great authors. And particularly, great poets. Samuel Taylor Coleridge once said:

Prose = words in their best order; — poetry = the best words in the best order. 

"The best words in the best order" could also be a definition of good, clean code. If you are a programmer, you are a freaking poet.

Well, maybe not, but it does mean that a subset of the fiction you read should be poetry. Any good fiction will help you increase your vocabulary; poetry just intensifies the effect. And when you increase your vocabulary, you increase your ability to think clearly and precisely.

While this still won’t necessarily make it easy to name things properly (even the best poets struggle and bleed over the page before they find what they are looking for), it might make it easier.

What to read

Notice the qualifier “good”. That’s important. There were over 200,000 new works of fiction published in 2015 alone. Life is too short to spend time reading bad books, especially when there are too many good ones to read for a single lifetime. I don’t mean to be a snob, just realistic.

Bad fiction will, at best, be a waste of your time. At worst, it can lie to you in ways that twist your expectations about reality by twisting what is good and beautiful. It can warp the lens through which you view life. The stories we tell ourselves and repeat about ourselves shape our consciousness, and so we want to tell ourselves good ones.

So how do you find good fiction? One heuristic is to let time be your filter. Read older stuff. Most of the stuff published today will not last and will not be the least bit relevant twenty years from now. But some of it will. Some will rise to the top and become part of the lasting legacy of our culture, shining brighter and brighter as the years pass by and scrub away the dross. But it's hard to spot the jewels in advance, so let time do the work for you.

The other way is to listen to people you trust and get recommendations. In that spirit, here are some recommendations from me and my fellow Lullabots:

Feb 06 2019
Feb 06

by David Snopek on February 6, 2019 - 1:04pm

As you may know, Drupal 6 has reached End-of-Life (EOL), which means the Drupal Security Team is no longer publishing Security Advisories or working on security patches for Drupal 6 core or contrib modules. But the Drupal 6 LTS vendors are, and we're one of them!

Today, there is a Less Critical security release for the Public Download Count module to fix an Open Redirect vulnerability.

Public Download Count keeps track of file download counts, even for public files.

The module did not verify that the links provided to its intermediate download page were actually present in the Drupal site’s content, so an attacker could craft links that redirect visitors to an arbitrary external site.
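
The general shape of the fix for this class of bug looks something like the following. This is a hedged sketch of the pattern only, not the module’s actual patch, and the function name is hypothetical:

  <?php

  // Hypothetical sketch: only redirect to local paths, never to an
  // arbitrary URL taken from user input.
  function mymodule_safe_redirect($destination) {
    // A local path has neither a scheme nor a host. Absolute URLs
    // ("http://evil.example") and protocol-relative URLs ("//evil.example")
    // both fail this check.
    $parts = parse_url($destination);
    if (empty($parts['scheme']) && empty($parts['host'])) {
      drupal_goto($destination);
    }
    // Otherwise, fall back to the front page instead of leaving the site.
    drupal_goto('<front>');
  }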

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the Public Download Count module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Feb 06 2019
Feb 06

Mass.gov dev team releases open source project

by Moshe Weitzman

The Mass.gov development team is proud to release a new open source project, Drupal Test Traits (DTT). DTT enables you to run PHPUnit tests against your Drupal web site, without wiping your database after each test class. That is, you test with your usual content-filled database, not an empty one. We hope lots of Drupal sites will use DTT and contribute back their improvements. Thanks to PreviousNext and Phase2 for being early adopters.

Mass.gov is a large, content-centric site. Most of our tests click around and assert that content is laid out properly, the corresponding icons are showing, etc. In order to best verify this, we need the Mass.gov database; testing on an empty site won’t suffice. The traditional tool for testing a site using an existing database is Behat. So we used Behat for over a year and found it getting more and more awkward. Behat is great for facilitating conversations between business managers and developers. Those are useful conversations, but many organizations are like ours — we don’t write product specs in Gherkin. In fact, we don’t do anything in Gherkin besides Behat.

Meanwhile, the test framework inside Drupal core improved a lot in the last couple of years (mea culpa). Before Drupal Test Traits, this framework was impossible to use without wiping the site’s database after each test. DTT lets you keep your database and still test using the features of Drupal’s BrowserTestBase and friends. See DrupalTrait::setUp() for details (the bootstrap is inspired by Drush, a different open source project that I maintain).

Zakim Bridge at Night, North End Boston. Photo by David Fox.

Using DTT in a Test

  • Our test cases extend ExistingSiteBase, a convenience class from DTT that imports all the test traits (a sketch of such a test follows this list). We will eventually create our own base class and import the traits there.
  • Notice calls to $this->createNode(). This convenience method wraps Drupal’s method of the same name. DTT deletes each created node during tearDown().
  • Note how we call Vocabulary::load(). This is an important point — the full Drupal and Mink APIs are available during a test. The abstraction of Behat is happily removed. Writing test classes more resembles writing module code.
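
Here is a minimal sketch of what such a test can look like. The namespace and content details are hypothetical, while ExistingSiteBase, createNode(), Vocabulary::load(), and the Mink assertions are the real APIs described in the list above:

  <?php

  namespace Drupal\Tests\my_module\ExistingSite;

  use Drupal\taxonomy\Entity\Vocabulary;
  use weitzman\DrupalTestTraits\ExistingSiteBase;

  class ExampleTest extends ExistingSiteBase {

    public function testContentAppears() {
      // createNode() wraps Drupal's helper of the same name; DTT deletes
      // the created node automatically during tearDown().
      $node = $this->createNode([
        'title' => 'Llama',
        'type' => 'page',
      ]);
      $node->setPublished()->save();

      // The full Drupal API is available mid-test, e.g. loading a config
      // entity directly (assumes a "tags" vocabulary exists on the site).
      $vocabulary = Vocabulary::load('tags');
      $this->assertNotNull($vocabulary);

      // Mink assertions work just as they do in BrowserTestBase.
      $this->drupalGet($node->toUrl());
      $this->assertSession()->statusCodeEquals(200);
      $this->assertSession()->pageTextContains('Llama');
    }

  }

Because the test runs against the existing, content-filled database, there is no site-install step before each class; that is the whole point of DTT.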

Misc

  • See the DTT repo for details on how to install and run tests
  • Typically, one does not run tests against a live web site. Tests can fail and leave sites in a “dirty” state, so it’s helpful to occasionally refresh to a pristine database.

If you have questions or comments about DTT, please comment below or submit issues/PRs in our repository.

More from Moshe: Our modern development environment at Mass.gov

Interested in a career in civic tech? Find job openings at Digital Services.

