Feb 20 2019

With growth come changes, and today we're introducing changes to our legal terms and pricing. The Basic subscription remains the same at $78 USD per year, while the Professional subscription was bumped from $249 to $360 USD per year. The Enterprise subscription now starts at $3,000 USD with a $1,000 USD set-up fee, which covers the one-time job of collecting brand logos and brand names and setting up our scripts to produce the white-labeled/re-branded products automatically. The Enterprise subscription will now be charged per month rather than per year.

New terms of service: https://www.sooperthemes.com/legal/terms
Services catalog:  https://www.sooperthemes.com/legal/services-catalog

Drupal is becoming more valuable but more expensive

Three years after Drupal 8's release, the results are in: Drupal is still relevant, but Drupal 8 is more expensive to implement and Drupal's adoption curve has tapered off. I don't think this is necessarily bad. Drupal's leadership made a decision to make Drupal the best enterprise-grade CMS, not the best everyman's CMS. The result is that Drupal's steep learning curve became steeper still, and the costs of training and hiring Drupal developers increased accordingly. Our price bump is not merely a reaction to a decrease in the volume of Drupal websites in need of our solutions; it is also part of our learning process.

Sooperthemes is growing but it is not growing enough

Since our Drupal 8 launch last year, business at Sooperthemes has been better than ever. But with our growing popularity comes a big increase in workload from the customer support forum, customer success tasks, and simple administrative work like account management, taxes, and sales questions. It adds up to a lot of work. Currently our prices are too low for the increase in customers to pay for new staff to take on the additional workload. We have been investing a lot of effort in training interns, but the time has come to move to a more sustainable solution.

Without changes Sooperthemes is not ready for the future. This price increase in the Professional subscription is one part of our strategy for sustainable growth.

Another change is getting better at charging big clients more than small clients. We want to keep our products accessible to the entire Drupal community. While we love our enterprise clients, we don't want to develop an amazing product just for the Drupal elite who can afford hundreds or thousands of dollars per month per site. Therefore we're introducing new licensing terms that charge users based on the scale of their usage of our flagship product, Glazed Builder.

We updated our terms so that we can charge websites fairly, not just by the number of domain (site) licenses, but also by the number of users who are using our Glazed Builder product. Here are some examples to illustrate why I think this is fair:

  1. Freelance Music teacher's website with 1 domain license: $78 USD per year including updates and support.
  2. A Drupal agency with currently 10 clients on our products: $360 USD per year.
  3. A fast-moving consumer goods enterprise with 40 enterprise domain licenses: ~3000 USD per month.
  4. If Tesla.com used our products for their marketing content, job portal, community forum, online stores, and online tools, in 37 languages: $78 USD per year, or 6 dollars and 50 cents per month.

I think the last example illustrates why it makes sense to introduce this new lever to help Sooperthemes grow sustainably. To learn exactly how our new licensing terms work, make sure to read our services catalog.

Provide More Value To Enterprise Clients

In order for Sooperthemes to be successful in the future, we will need to sign on more Enterprise clients. We're going to work on adding more features that are important to enterprise clients. Starting today we offer better support options and dedicated support developers for cases in the Enterprise tier. If you want to share ideas on what should differentiate the Enterprise subscription tier from the other tiers, don't hesitate to send me an email here: https://www.sooperthemes.com/contact

I would be especially interested in hearing what it would take for your business to purchase an enterprise subscription.

Feb 20 2019

A month ago, Matt Mullenweg, co-founder of WordPress and founder of Automattic, visited me in Antwerp. While I currently live in Boston, I was born and raised in Antwerp, and also started Drupal there.

We spent the morning together walking around Antwerp and visited the Plantin Moretus Museum.

The museum is the old house of Christophe Plantin, where he lived and worked around 1575. At the time, Plantin had the largest printing shop in the world, with 56 employees and 16 printing presses. These presses printed 1,250 sheets per day.

Today, the museum hosts the two oldest printing presses in the world. In addition, the museum has original lead types of fonts such as Garamond and hundreds of ancient manuscripts that tell the story of how writing evolved into the art of printing.

The old house, printing business, presses and lead types are the earliest witnesses of a landmark moment in history: the invention of printing, and by extension, the democratization of publishing, long before our digital age. It was nice to visit that together with Matt as a break from our day-to-day focus on web publishing.

An old printing press at the Plantin Moretus Museum

Dries and Matt in front of the oldest printing presses in the world

An old globe at the Plantin Moretus Museum

Feb 20 2019

You don’t want people to treat your website as an outcast. You don’t want it to be the ugly duckling in this sharp, serious and extremely competitive world.

Correct?

Thus, owning a professional-looking website becomes important for all sorts of businesses. It doesn’t really matter whether you are planning to make money or not; treating your website just like your employees is a must.

Why?

Well, because it creates an impression of your business, a place where people come to see who you are and what you offer. Whether it is a big e-commerce site or a one-pager, a good website will always bring value to you and your company.



As important as the website is for you, the theme contributes heavily to the user experience and functionality of a particular site.

Your theme is the overall look, feel and styling of your website, and the very first thing that your audience sees. And nothing can beat Drupal in this area.

Beginning with the Zurb Foundation 

So here is Zurb Foundation for you.

Zurb Foundation is a prototyping framework that lets you prototype directly in the browser. It allows you to rapidly create a website or application while leveraging well-tested mobile and responsive technology.

A front-end framework is a collection of HTML, CSS, and JavaScript design patterns. These design patterns save the user time by helping them avoid writing boring boilerplate code. Sites built on Foundation work great across multiple screens, including laptops, mobile phones, PCs, and iPads. The reason is that Zurb Foundation is a responsive framework that uses CSS media queries and a mobile-first approach.



Different versions of Zurb 
 



Zurb Foundation 3

If the primary goal for your website is rapid prototyping, then Zurb Foundation 3 is for you. This theme is developed in Sass, a powerful CSS preprocessor that helps you write clean, organized CSS that can be easily maintained over time.

One of the biggest advantages of Zurb Foundation 3 was the shift of the framework's development to Sass and Compass rather than pure CSS.

Sass gives the user variables, functions, and powerful mixins that speed up development of the framework and make the code more concise.



Zurb Foundation 4 

This version of Zurb Foundation brought many new functionalities and changes to its framework. Starting out as a mobile-friendly theme, Zurb Foundation 4 added support for complex layouts, grids, and media queries.

This version brought flexible and powerful built-in tools, with the advantage of being adaptable to different screen sizes (and a new, simpler syntax that better reflects how the grid works).

Apart from this, Zurb Foundation 4 was all about semantics. Users were given the power to remove all presentational classes from the markup and write their own with built-in Sass mixins and extensions.

Zurb Foundation 4 also presented the users with some splendid plugins that worked well with AJAX. 



Zurb Foundation 5

Fast, strong and a better foundation: Zurb Foundation 5 is great for designers and developers using it in their workflow. This version specifically focused on smart coding and introduced users to better coding practices. For a team, it provides a common starting point. The advantage: it helps them put together all interactions and workflows in a shorter period of time.


Zurb Foundation 6

Foundation 6 has been designed to give users production efficiency in their projects. It includes a wide range of modular and flexible components that are not only lightweight but easy to maintain as well.

Foundation 6 is also half the size of Foundation 5; in other words, there was a 50% reduction in code. All of this code comes with ARIA attributes and roles, alongside instructions.

The base styles in Foundation 6 act as a coded framework, making the user's work even easier and more flexible. Simpler CSS styles allow users to easily modify them to fit their needs.



Zurb Foundation or Bootstrap?

When talking about Zurb Foundation and its different versions, Bootstrap tends to make its way into the conversation.

How?

Well, because Bootstrap and Zurb Foundation are the major players when it comes to web design frameworks. Designers and developers often seem to get lost in the mist when choosing between the two.

  • Community: Foundation's community is smaller than Bootstrap's, but it is growing and offers decent technical support. Bootstrap has the larger community, with extensive support and resources.
  • CSS preprocessor: Foundation supports Sass. Bootstrap also supports Sass, but it historically belonged to the Less camp.
  • Customization: Foundation takes a minimalist approach to its pre-built UI components, giving designers room to create new things. Bootstrap offers a basic GUI customizer which, most of the time, doesn't allow users to create something new.
  • Browser support: Foundation supports Chrome, Mozilla Firefox, Safari, Opera and Android. Bootstrap supports Chrome, Mozilla Firefox, Safari, Opera, Android and IE8.
  • Grid system: Foundation was the first to go mobile-friendly and adopt the grid system. Bootstrap has had ample time to bridge the gap in terms of features and functionalities.
  • Advanced feature elements: Foundation offers the XY grid, responsive tabs, off-canvas and more. Bootstrap is customizable with a variety of designs.
  • Releases: Foundation has more releases, driven by development requirements: six so far, with the seventh yet to come. Bootstrap has four releases; the first came out on August 19, 2011.

Why choose Drupal for Zurb foundation?

When building a website what is the very first thing which the user sees?

Content? Functionalities? Information? Or Layout?

Selecting the theme and functionality of a website is one of the first decisions a website owner has to make, and Drupal is a CMS that can help you achieve this task. It has about 1,316 themes, each with its own strengths.

Among these, the Zurb Foundation theme is a grid-based, mobile-first CSS framework. When used with Drupal, Zurb Foundation brings efficiency to the front-end developer and helps you achieve various functionalities.

  • The Foundation XY grid allows the user to easily create and lay out Drupal content.
  • Foundation's top bar integrates with Drupal's menu navigation and works well with submenus.
  • The off-canvas regions in the Zurb Foundation theme are available as Drupal blocks, and menus placed in these regions are automatically themed.

Creating Sub Themes 

It is imperative that the user create a sub-theme, which allows Zurb Foundation to be applied to any website without modifying the base theme.

There are two ways of creating a sub-theme:

Drush: Drush basically provides command-line access to common Drupal administrative tasks and configuration. The user can create the sub-theme in their chosen directory with the help of Drush.

Manually: The user can also create a sub-theme manually. They can complete the task by copying the starter files out of the base theme's folder and adapting them.

Contributed Drupal modules that can be used with Zurb

The Zurb Foundation theme aims to do theming without the help of any dependencies, but several contributed modules make theming easier, such as Panels, Block Class, Display Suite, and Special Menu Items.

Panels: The Panels module allows a site administrator to create customized layouts for multiple uses. It provides an API that allows the configuration and placement of blocks into regions.

Block Class: Block class allows the user to add classes to any block through the block’s configuration interface. 

Display Suite: Display Suite allows users to take full control over content using a drag-and-drop interface.

Special Menu Items: This module allows the user to create dropdown dividers and all sorts of headers in the navigation menu.

Case Study on MIT Press CogNet 

MIT Press CogNet is an essential online resource for students and scholars in the cognitive sciences.

The objective of this project was to provide a completely responsive design across all devices. An agency worked closely with the CogNet team to develop basic wireframes, and a custom Drupal theme based on Zurb Foundation was built.

The Zurb Foundation theme and Sass for CSS preprocessing were used to rework the existing theme. To guarantee a seamless experience on any type of screen, the developers used jQuery to construct slick navigation, scrolling, and content exploration.

The results were eye-catching. From being a desktop-only website, MIT Press CogNet was transformed into an accessible one that users can view on any device. The biggest achievement was that the whole project was completed within the budget provided by the organization.

Future of Zurb Foundation

Zurb is yet to launch another version, built on a new architecture (ITCSS + SMACSS).

Zurb Foundation 7 separates the view layer from the logic layer to make it easier to work with and more reliable.

It would dramatically improve your freedom to shift between JavaScript frameworks with a super-powerful pluggable JavaScript architecture. In short, there are two major changes coming in Zurb Foundation 7.

The first, as mentioned, would dramatically let users shift between JavaScript frameworks through the pluggable JavaScript architecture. UI frameworks today tend to either commit to one framework or maintain separate, independent ports for different JS frameworks.

And yes, the second major change in Foundation is the ITCSS-based architecture with the usage of SMACSS. This would make it easier to build and maintain scalable sites and themes.
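The idea of separating the logic layer from the view layer can be sketched in a few lines of framework-agnostic JavaScript. This is a hypothetical illustration of the pattern, not actual Foundation 7 code: the component's state lives in plain functions, and each JS framework only needs a thin rendering adapter.

```javascript
// Hypothetical sketch of a logic/view split: the tab component's state
// is plain JavaScript, so any framework can plug in its own renderer.

// Logic layer: framework-agnostic state for a tab bar.
function createTabs(labels) {
  let active = 0;
  return {
    labels,
    activeIndex: () => active,
    select: (i) => {
      if (i >= 0 && i < labels.length) active = i;
    },
  };
}

// View layer: a thin adapter per target. Here a plain-string renderer
// stands in for what would be a React, Vue, or vanilla DOM adapter.
function renderAsText(tabs) {
  return tabs.labels
    .map((label, i) => (i === tabs.activeIndex() ? `[${label}]` : ` ${label} `))
    .join('|');
}

const tabs = createTabs(['Home', 'Docs', 'About']);
tabs.select(1);
console.log(renderAsText(tabs)); // the active tab is marked with brackets
```

Because the logic layer never touches the DOM, swapping frameworks only means replacing the adapter, which is exactly the freedom a pluggable architecture promises.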

Conclusion 

Remember that themes are connected to the visitor's journey. It is not just the visuals; it is also the user experience they will have while going through your website.

At OpenSense Labs, we understand how important it is to create a website that matches your goals and objectives. Ping us at [email protected] so that we can arrange services to make your website what you have always hoped for.

Feb 19 2019
Performance and scale for all levels of digital commerce


Drupal Commerce is a fantastic open source ecommerce platform, but there is a common misconception that it is lacking when it comes to performance and scalability. This is not true! Drupal Commerce is extremely fast and is more than capable of scaling from small business all the way to enterprise-level ecommerce. We have proof, and it's right here for you to view.

Download the Drupal 8 Commerce Performance Benchmarks Report (PDF)

About the report

Shawn McCabe, Acro Media’s CTO, put Drupal Commerce to the test to see how it performed on a number of different AWS configurations, ranging from single server setups all the way up to multi-server configurations.

He ran simulated traffic through Drupal Commerce, mimicking actual traffic as closely as possible, and tested concurrent users, site speed, transactions per second, and a number of other useful technical metrics.

The smallest server configuration tested was capable of handling 130 concurrent users flawlessly, with a throughput of 13.59 transactions per second. On the other hand, the largest configuration could handle 52,000 concurrent users with a throughput of 1,305.85 transactions per second.

The report goes further and includes how the tests were set up, their limitations and methodology, all of the server configuration details and, of course, the test results. This testing puts the performance and scalability question to rest, backed by hard data that anyone can reproduce. Drupal Commerce is a viable option for ecommerce that businesses of any size can use and grow with in the future.

Feb 19 2019
Date: 2019-February-19
Security risk: Highly critical 20∕25 AC:None/A:None/CI:All/II:All/E:Theoretical/TD:Uncommon
Vulnerability: Critical Release
Description:

There will be a security release of 8.5.x and 8.6.x on February 20th 2019, between 1PM and 5PM America/New York (1800 to 2200 UTC). (To see this in your local timezone, refer to the Drupal Core Calendar.) The risk on this is currently rated at 20/25 (Highly critical) AC:None/A:None/CI:All/II:All/E:Theoretical/TD:Uncommon.

Not all configurations are affected. Reserve time on February 20 during the release window to determine whether your sites are affected and in need of an immediate update. Mitigation information will be included in the advisory.

Contributed module security updates may also be required.

If you are running Drupal 7, no core update is required, but you may need to update contributed modules if you are using an affected module. We are unable to provide the list of those modules at this time.

Neither the Security Team nor any other party is able to release any more information about this vulnerability until the announcement is made. The announcement will be made public at https://www.drupal.org/security, over Twitter, and in email for those who have subscribed to our email list. To subscribe to the email list: log in on Drupal.org, go to your user profile page and subscribe to the security newsletter on the Edit » My newsletters tab.

Security release announcements will appear on the Drupal.org security advisory page.

Feb 19 2019

An article from ComputerMinds - Building with Drupal in the UK since 2005.

Drupal empowers site builders and editors to configure their sites in settings forms. Configuration management lets developers push changes up to live sites to be imported. But developers have to be considerate to ensure imports will not wipe out those changes made directly through the live sites' settings forms. At the least, they have to export the changes before making further tweaks. But admins may make further changes in the meantime too, so developers can end up frequently pulling irrelevant changes back from live, which seems unnecessary.

Here are some examples of the kind of config that I'm thinking of:

  • The site email and Google Analytics account are usually managed by site admins, not developers. So developers should not be the ones to manage those settings.
  • Marketers may like tweaking the site name or slogan. That doesn't need to affect developers.
  • Contact forms contain labels and other text which may be key to the communication between a client and their customers.
  • Permissions - sometimes it's not clear where the lines are between editors/admins/etc, so why not allow some flexibility to reassign permissions directly on live without needing to touch the codebase?

We need an approach that allows specific settings to be considered 'unmanaged', so that an import wouldn't touch whatever values are on live. The Config Ignore project claims to solve this, but we already use Config Split, which is more powerful, more flexible and has a better user interface. (Although Config Ignore does allow targeting parts of config rather than whole config items.)

Config split is often used to create environment-specific sets of configuration, but its design means it can be used for separating config for other purposes. In this scenario, what's needed is a split that represents settings to be protected, which can be exported immediately before any import. Then when importing, Drupal only sees the preserved version of the settings, so won't change them, regardless of what is in the main configuration files.

The split, which I've called 'Unmanaged', needs to be set up as follows (see screenshot):

  • Use a folder (directory) which already exists and is writable. I use ../config/unmanaged, so it matches the split name and is outside the webroot.
  • Set to active. I usually set all other splits to inactive, and only make them active in an environment's settings.php, but this split exists for the sake of workflow, not environment. For example, it can actually be useful locally, so I can tweak things for development without affecting what ends up in version control.
  • Have the largest weight of any split, so that it overrides any other exported version of config it contains.
  • Use the Conditional split section, not Complete split, to pick configuration to protect.
  • Do not tick either of the checkboxes in the conditional split section.

Once the split has been created, the container needs rebuilding for it to work. Run the following, which includes exporting the split for the first time:

drush cache-rebuild
drush -y config-split-export unmanaged

Now that it is exported, a .htaccess file will have been added to the directory alongside the config. Add the following line to your project's .gitignore file, adjusting the directory location as appropriate. This ensures the directory will get created on live when changes are pulled from git (containing .htaccess), but deliberately without the exported config:

config/unmanaged/*.yml

So now before running any imports, make sure to export the split:

drush -y config-split-export unmanaged
drush -y config-import

With this split and the export step in place in your workflow, you can be confident in allowing your developers and site admins to get on with their respective work, without getting in each other's way. This puts configuration splits to use for something beyond environment-specific overrides, which I think is exciting. I wonder what other useful purposes they may have?

Photo by Sascha Hormel from Pexels

Feb 19 2019

Websites will run into problems. Whether you're using Drupal or any other software, there will be problems at some point.

Drupal runs on PHP and when PHP has problems, it reports them to you. 

However, often these errors will appear on your site and will be visible to visitors, as in the image below:

PHP Notices on Your Drupal Site

In this tutorial, we're going to give you a quick introduction to these errors. We'll explain the different types that might appear on your site and how you can stop them from showing.

There are three main ways in which PHP will report problems: notices, warnings and errors.

What are PHP Notices?

These are the least important. According to the official PHP website, notices are generated when:

"the script encountered something that could indicate an error, but could also happen in the normal course of running a script."

What are PHP Warnings?

Warnings are more serious, but probably won't break your site. According to the official PHP website, warnings are:

"non-fatal errors. Execution of the script is not halted."

What are PHP Errors?

Errors are the most serious type of problem and may break your site. According to the official PHP website, errors are:

"Fatal run-time errors. These indicate errors that can not be recovered from, such as a memory allocation problem. Execution of the script is halted."

Solution #1: Disable Error Reporting on Your Drupal Site

One solution is to simply stop the errors from showing.

  • Go to Configuration > Logging and Errors.
  • You have three choices:
    • None will disable all error reporting.
    • Errors and warnings will display only the most serious problems.
    • All messages will display all problems and is probably only useful for developers.
Disabling Error Reporting on Your Drupal Site

Solution #2: Fix the Problem

Yes, yes, I know this is a controversial idea. Fixing a problem is definitely harder than hiding a problem.

Here are some suggestions to help you fix the problem. Please backup your site before trying any of these.

  • Make sure your Drupal site and all your modules and themes are up-to-date.
  • Search Google and Drupal.org for anyone who has reported the same message. See if they have found a solution.
  • Read the message itself for hints about the problem. For example, the problem in the image at the top of this tutorial is in all/modules/calendar/includes/calendar_plugin_display_page.inc on line 47. This tells us that the problem may well be with the Calendar module, because the error is coming from the Calendar module's folder. If the problem is serious, you might consider disabling the problematic module or theme.

About the author

Steve is the founder of OSTraining. Originally from the UK, he now lives in Sarasota in the USA. Steve's work straddles the line between teaching and web development.
Feb 18 2019

Last week I published Pino: a Member Management web app built as a custom Drupal 8 distribution. With Pino, you can manage members and their metadata, send email notifications to the members and more.

I have been thinking about doing this project for some time now. I am personally a board member of several associations, and I have seen several different member management solutions. The most common of them all, honestly, is just Excel.

Excel is a simple and alright solution for managing a small content sheet, but at some point its limitations will have to be faced. A major example is emailing: pretty much all of the associations I am involved in just copied the emails from an Excel sheet and sent email notifications using their personal mailboxes. Better tools for this are available, and I wanted to build something like that, in an open source way of course.

I think dogfooding is the only way to make a meaningful SaaS product in this day and age. That way the specs and requirements are more meaningful. I stop occasionally while developing a feature and ask myself: "Would I use this feature? How would I use it? How would I make it faster/easier to use?"

The product is now up and ready to be used. I would love to hear any feedback on it: installation, daily usage, importing/exporting, etc.

Homepage: https://pinomembers.com

Documentation: https://pinomembers.com/documentation

Gitlab: https://gitlab.com/risse/pino

Drupal.org: https://drupal.org/project/pino

Feb 17 2019

A metalsmith sees potential where others might see trash. In his vision, a plastic bag can become a ring, the zipper can turn into a bracelet and brass platters can be metamorphosed into a striking hollow vessel. In the digital spectrum, there’s another Metalsmith which can shape a great web presence for your organisation with its amazing web development capabilities.

A metalsmith working in a room full of metal objects


A combination of Metalsmith and Drupal is even more fruitful, with Drupal's spectacular backend that can feed data to all sorts of clients and Metalsmith's exceptional capabilities as a static site generator.

Forging an understanding

Flowchart showing Metalsmith's working principle as a static site generator. Source: Werner Glinka | Twin Cities Drupal Camp 2018

Metalsmith is not aimed at particular project types such as blogs. It supports a wide array of templates and data format options, offers a simple plug-in architecture, and is easy to get started with. It uses a modular structure, and its lightweight nature and few dependencies make it a magnificent solution.

Metalsmith is an extremely simple, pluggable static site generator - Metalsmith.io

Static site generators create HTML code locally on the developer's computer, and all the required components are stored in a well-structured directory characterised by a strict separation of layout and content. That means a static site generator produces static build files, generated from source files, for deployment to a web server. This is exactly the reasoning on which Metalsmith was built. Metalsmith is more than just a static site generator: everything is a plugin, and the Metalsmith core is an abstraction for manipulating the file directory. It can be used for a plethora of use cases, like a project scaffolder, an ebook generator, or even technical docs.
 
Its working principle is simple. It takes information from the source files in a source directory and then writes the manipulated information to files in a destination directory. The manipulations can be translating templates, replacing variables, grouping files, or moving files, among others.
 
All the manipulations are done by plugins. The only things Metalsmith has to take care of in its core are the underlying logic of how manipulations are chained and a defined interface for the plugins. Moreover, source files are converted into JavaScript objects at the very start, so plugins only ever modify JavaScript objects. Dividing Metalsmith into a core and several plugins minimises complexity, by giving the user complete authority to use only the plugins that are needed and by delegating maintenance of the Metalsmith core to the Metalsmith community. Being programmed in JavaScript, it eliminates the need for another language like Python or Ruby, and its simple plugin interface makes for a simple workflow.
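The working principle described above (files become JavaScript objects, and every manipulation is a plugin transforming that set of objects) can be sketched without the library itself. This is a simplified, hypothetical model of the idea, not Metalsmith's actual API:

```javascript
// Simplified model of the Metalsmith idea: a source "directory" is a plain
// object of file objects, plugins are functions over that object, and the
// core does nothing but chain the plugins in order.

// Source files as JavaScript objects (contents plus arbitrary metadata).
const files = {
  'index.md': { contents: '# Hello', title: 'Home' },
  'about.md': { contents: '# About', title: 'About' },
};

// Plugin 1: rename .md files to .html (a typical manipulation step).
const mdToHtml = (files) => {
  for (const path of Object.keys(files)) {
    if (path.endsWith('.md')) {
      files[path.replace(/\.md$/, '.html')] = files[path];
      delete files[path];
    }
  }
  return files;
};

// Plugin 2: append a footer to every file's contents.
const addFooter = (files) => {
  for (const file of Object.values(files)) {
    file.contents += '\n<!-- built statically -->';
  }
  return files;
};

// The "core": chain the plugins over the files object.
const build = (files, plugins) => plugins.reduce((f, p) => p(f), files);

const output = build(files, [mdToHtml, addFooter]);
console.log(Object.keys(output)); // destination file names after the pipeline
```

In real Metalsmith the core also reads the source directory from disk and writes the destination directory, but the plugin contract is essentially this: take the files object, change it, pass it on.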

Metalsmith with Drupal

Twin Cities Drupal Camp 2018 had a session that demonstrated how a headless Drupal 8 installation serves content to a local Metalsmith-based static website build process. Drupal 8 turned out to be a fantastic backend CMS for serving data to all kinds of clients. Here, simply put, the local environment is the Metalsmith process, which uploads everything to GitHub. Once uploaded, Netlify loads it into its own process and publishes it.

Diagram of the publishing stack. Source: Werner Glinka | Twin Cities Drupal Camp 2018

Server data is leveraged to build pages dynamically at build time, with Metalsmith as the static site generator and Nunjucks as the template engine. As a result, you get the merits of a static website as well as the magnificent backend data governance of Drupal. Drupal is used only to govern content in the backend, and serves that content to the build process through an API.
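A sketch of what that build-time handoff can look like: a function that maps a JSON:API-style payload from the Drupal backend into file objects the static build can render. The field names and the `article.njk` template name here are illustrative, not Drupal's exact output.

```javascript
// Sketch: turn a JSON:API-style payload from a headless Drupal backend
// into file objects a static build can render. Field names are illustrative.
function pagesFromJsonApi(payload) {
  const files = {};
  for (const item of payload.data) {
    const { title, path, body } = item.attributes;
    files[`${path.replace(/^\//, '')}.html`] = {
      title,
      contents: body,          // handed to the template engine later
      layout: 'article.njk'    // e.g. a Nunjucks template
    };
  }
  return files;
}

const payload = {
  data: [
    { type: 'node--article',
      attributes: { title: 'Hello', path: '/blog/hello', body: '<p>Hi</p>' } }
  ]
};
const files = pagesFromJsonApi(payload);
```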

Conclusion

Metalsmith, combined with Drupal's prowess in content governance, can be a great solution for building static sites.
 
We believe in empowering digital firms and making their digital transformation dreams come true with our suite of services.

Let us know how you want us to help you build an astounding web presence at [email protected]

Feb 17 2019

"Content is king! SEO is so 2017." Sure, why not.

Where is that king's throne? Where do you find content?

Content is found on web and mobile pages across the internet, among millions upon millions of pages. Search Engine Optimization is the process of being found by your content consumers and readers.

Contrary to a common and prevalent misconception, Search Engine Optimization is not all about keywords. You still need to create content that is relevant to the person searching for it,

and, of course, follow Google's rules and learn their ranking factors.

Basic stuff really.

Google SEO

"SEO? Gotta follow the Google machine's rules bro"

 

A critical consideration usually escapes the decision-making process about what to use to publish that content: which CMS suits my needs the most?

A CMS (Content Management System) is the beating heart of your digital platform, website or experience. It is where you develop, manage and publish your content. Most websites look like Ferraris but drive like an Austin Allegro.

Long story short, the CMS you built your site on will play a key role in your SEO efforts.

 

4 Ways Drupal CMS Enhances Your SEO

I am not here to shamelessly promote Drupal CMS (I really am, though), but if you are looking to create the best digital experience for your users without agonizing over on-site SEO, then look no further than Drupal.

At its core, Drupal was built with SEO in mind. It has the power, flexibility, and tools needed to optimize every facet of your website for search engines, and in its huge kit of modules, there are quite a few that are dedicated to giving you an easier time when it comes to improving the optimization of your website. 

 

1. Implementing Meta tags

Meta tags | Vardot

Meta tags are bits of text that are integral to improving your website’s search ranking because, in a way, they tell search engines what the content is on each page of your website. They range from the titles of your pages to the little descriptions you see underneath the website links on a Google results page. You and your search engine need these bits of information to properly present and index your site on the search results page.

Usually, you can leave it up to the search engine to generate your page’s metadata for you, but with the Drupal Metatag module you can customize the metadata yourself. Set your own information, such as page titles and descriptions, to present your site properly and correctly to search engines and the online world.
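To make those "bits of text" concrete, here is roughly what ends up in a page's head section. In Drupal, the Metatag module assembles this for you; the code below is a hand-rolled sketch of the output format, not Metatag's actual implementation.

```javascript
// Sketch: render basic meta tags from a page's metadata. This is the kind
// of markup that lands in the <head>; Metatag manages it for you in Drupal.
const esc = (s) =>
  s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/"/g, '&quot;');

function renderMetaTags({ title, description }) {
  return [
    `<title>${esc(title)}</title>`,
    `<meta name="description" content="${esc(description)}">`
  ].join('\n');
}

const head = renderMetaTags({
  title: 'Clean URLs in Drupal',
  description: 'How Drupal helps your pages rank.'
});
```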

 

 

2. Cleaning up Your URLs

SEO URLs | Vardot

 

Having bad, messy-looking links is a no-no when it comes to SEO. You want links that are easy to read and not just a jumble of letters and numbers, so that they look more attractive to prospective visitors and to your search engine, which may be looking at your URL for keywords when it determines your site’s ranking.

Many web developers never realize the implications of messy URLs and leave their link syntax as-is, but going through each and every page on your website and manually setting the URLs isn’t an attractive option either. Luckily, Drupal generates clean URLs by default, improving the readability of your links and making things a bit easier on you.

If you want your links to be even better and easier on the eyes, the popular Pathauto module is a configurable system that automatically creates clean, extremely readable links that are perfect for your site’s optimization.
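Pathauto builds aliases from configurable token patterns; the slugging step it performs looks roughly like the sketch below (a simplified illustration, not Pathauto's actual implementation).

```javascript
// Simplified sketch of the "clean URL" transformation Pathauto applies:
// lowercase the title, collapse punctuation and whitespace into hyphens.
function slug(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // collapse punctuation/whitespace runs
    .replace(/^-+|-+$/g, '');     // trim leading/trailing hyphens
}

// A node titled "10 SEO Tips (2019 Edition)" gets a readable alias:
const alias = `/blog/${slug('10 SEO Tips (2019 Edition)')}`;
```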

Another thing to keep in mind is making sure that your links actually go somewhere. Nothing sours the user experience more than clicking a link and being presented with a 404 page, and this in turn negatively affects your search rankings.

You can prevent this by using the Redirect module. If you changed a page’s URL after Google indexed it, or moved the content to a different URL, this module allows you to create 301 redirects from the old link to the new one, quickly and painlessly, without the headache of cleaning up after yourself and fixing broken links.

 

3. Improving Page Speed

Improve page speed | Vardot

Google has been using page load speed as an influencing factor in search rankings for years at this point. As they point out, sites that load faster have users that stay on for much longer, so it’s not only Google that you please by speeding up your website.

You might have to spend a little to bring your website up to speed, but Drupal comes with several measures to help pages load faster, such as BigPipe.

However, it’s not only desktop users you have to keep in mind, but mobile users, too. Given the leaps and bounds that technology has undergone in the last couple of years, you now find more and more people browsing the web on their smartphones and tablets. It’s important to make sure that your site experience is just as friendly and accessible on mobile devices as it is on desktop computers. As anyone who has used a desktop site on a mobile device knows, it’s not a pleasant experience.

Drupal’s default theme is responsive by design, which means it will display well on mobile screens of any size without having to do complicated rewrites of code or having to juggle multiple URLs to make sure your site displays correctly. With Google now also looking at the page speed of mobile sites, it’s now more important than ever to focus on delivering a good, well-optimized mobile experience to improve your SEO.

 

 

4. Talking to Your Search Engine

Search engine optimization tips | Vardot

Optimizing your website can be a little tough when you don’t even know basic things such as where your site traffic is coming from. Installing modules like Google Analytics makes you privy to such information, and for someone with their finger on the pulse of the site’s SEO, it’s perhaps one of the most important tools they can have.

With Google Analytics, you get to know things about your site visitors: where in the world they come from, which links they followed to get to your site, which pages they visit and how much time they spend on those pages, what keywords they searched to find your page, and more. If you’re concerned about SEO, then information about your website straight from Google, the most popular search engine in the world, is valuable to have, and it can help you decide what to improve next.

And while you’re pulling information from Google about your website, you can also provide information about your website to Google in the form of an XML sitemap. These are specially formatted, condensed summaries of the pages of content on your website that you can submit to Google to help them find your site and let their bots crawl through your pages. Google can crawl through your site without an XML sitemap, but you take on the risk of them possibly missing pages.

With Drupal, generating an XML sitemap is as easy as installing the XML sitemap module, which creates one for you, and cron can automatically make sure your sitemap is kept up to date with the latest information from your website.
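For reference, an XML sitemap is just a list of url entries. This hand-rolled sketch shows the shape of the file the XML sitemap module generates for you; it is an illustration of the format, not the module's code.

```javascript
// Sketch: build a minimal XML sitemap from a list of page URLs,
// following the sitemaps.org <urlset>/<url>/<loc> structure.
function buildSitemap(urls) {
  const entries = urls
    .map((u) => `  <url><loc>${u}</loc></url>`)
    .join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`;
}

const xml = buildSitemap(['https://example.com/', 'https://example.com/blog']);
```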

Drupal is inherently optimized for search engines. After all, the whole idea behind Drupal was to enable the creation of digital experiences that are user-centric. That user includes the development team, who are aided by an open-source community of experts providing awesome SEO tools such as the SEO Checklist.

You see, friend... SEO is not dead. Content's prominence just made SEO eternal.

Want to boost your site’s traffic and rank #1 on Google with Drupal? Message us through our Contact Us page, or via email at [email protected].

Feb 15 2019

As a writer or editor for your organization’s website, you should be able to quickly write articles or build pages that are collections of smaller elements. You should be able to write some text, add a slideshow, write some more text, perhaps list a few tweets, and finish things off with a list of related content. Or maybe you paste in a pull quote, add a couple full-width images with captions, or even put together an interactive timeline. Your content management system should let you do all that, easily. But chances are, it won’t be with the WYSIWYG you’re used to right now.

What You See Isn’t What You Get

WYSIWYG editors still fall short when it comes to doing much more than simple formatting and embedding a few images. Anything beyond that, and the underlying technology has to leverage some kind of proprietary “smart code” or “token” and do some find-and-replace magic that makes slideshows, media players, or other more complex blocks of content show up correctly for the editor. These tokens aren’t typically based on any adopted standard. It’s just a custom, arbitrary formatting shortcut that programmers decided to use that tells the CMS, “Replace this snippet with that other piece of content.”

If it sounds complicated, that’s because it is. It’s hard to get right. It’s hard to build in a sustainable way. It’s hard – impossible, really – to make it look right and work well for authors. It’s REALLY hard to migrate.

Here’s an example: In earlier versions of Drupal, Node Embed was a way to embed one piece of content (say, an image) inside the body of another (like an article). The “smart code” [[nid: 123]] tells Drupal, “replace this with the piece of content that has an ID of 123.” It worked, but the authoring experience was janky. And it really wasn’t structured content, since your markup would end up littered with these proprietary snippets referencing objects in the CMS. Somewhere down the line, someone would inevitably have to migrate all of that and write regular expressions and processors to parse it back into a sane structure for the new system. That gets expensive.
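To see why those migrations get expensive, here is a sketch of the kind of regular-expression processing a migration has to run over every stored body field just to locate the tokens. (Resolving and re-embedding each referenced entity is still extra work on top of this.)

```javascript
// Sketch: find proprietary [[nid: 123]] tokens in stored markup, the way
// a migration processor would, before replacing them with real references.
function findEmbeds(body) {
  const tokens = [];
  for (const m of body.matchAll(/\[\[nid:\s*(\d+)\]\]/g)) {
    tokens.push({ token: m[0], nid: Number(m[1]) });
  }
  return tokens;
}

const body = '<p>Intro</p>[[nid: 123]]<p>More text</p>[[nid:456]]';
const embeds = findEmbeds(body);
```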

Simple form with input, select, and textarea

Fieldable Entities and Structured Content

The thing that lets you, the web editor, write content that is both manageable and flexible is breaking your content into discrete, single-purpose fields. In Drupal it’s called “fieldable entities.” You don’t dump everything into the WYSIWYG (which would be hard to do anyway). Instead, there’s a field to add the author’s name, a field for attaching images, and a field for the text (that last part gets the WYSIWYG). More generally, this serves an important concept called “structured content.” Content is stored in sensible chunks. It adapts to a variety of contexts, like a mobile app or a news aggregator or (of course) your website. In the case of your website, your CMS pushes all those fields through a template, and voila, the page is published beautifully and your readers eat it up.
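As a rough sketch of the idea (field names invented for the example), structured content means storing discrete fields and pushing them through a template, rather than storing one pre-rendered blob:

```javascript
// Sketch: an article stored as discrete, single-purpose fields
// (structured content), rendered through a template at publish time.
const article = {
  author: 'A. Writer',
  image: '/files/hero.jpg',
  body: '<p>Story text.</p>'   // only this field gets the WYSIWYG
};

// The same fields can feed a web template, a mobile app, or a feed.
function renderWeb({ author, image, body }) {
  return `<article><img src="${image}"><p>By ${author}</p>${body}</article>`;
}

const html = renderWeb(article);
```

Because the fields stay separate in storage, a different context (a news aggregator, a mobile app) can render the very same chunks through a different template.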

Complex form with input, select, textarea, and multiple fields nested inside

What If My Fields Have Fields?

Here’s where it gets interesting. Back to our earlier example: let’s say your article has a couple slideshows. Each slideshow has a few images, captions, and links. Suddenly your discrete, single-purpose field (slideshow) has its own fields (images, captions, links). And, you may want to add a slideshow virtually anywhere in the flow of the page. Perhaps the page goes text, slideshow, text. Or maybe it’s text, slideshow, text, some tweets, another slideshow. And now you want to swap some things around. Again, you should be able to do all that, easily.

Drupal Paragraphs

Enter the Drupal Paragraphs module. Paragraphs takes the approach of creating content bundles, or collections of fields, that can be mixed and matched on a given page in virtually countless configurations. They’re called “Paragraphs” because they are flexible, structured building blocks for pages. The name is a little misleading; in fact, they are 100% configurable groups of fields that can be added, edited, and rearranged however you want on a given article. You can have paragraph types for slideshows, pull quotes, tweets, lists of related content, or virtually anything else. Paragraphs are building blocks: smaller elements that can be combined to build a page. And like I said earlier, you should be able to easily make pages from collections of smaller elements.
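Conceptually, a page built with Paragraphs is an ordered list of typed field bundles, and rendering just dispatches on the type. The sketch below shows the shape of the idea, not Drupal's actual storage format:

```javascript
// Sketch: a page as an ordered list of typed "paragraph" bundles.
// Reordering the page is just reordering this array.
const page = [
  { type: 'text', text: '<p>Intro</p>' },
  { type: 'slideshow', images: ['a.jpg', 'b.jpg'], caption: 'Our trip' },
  { type: 'text', text: '<p>Wrap-up</p>' }
];

// One renderer per paragraph type; rendering dispatches on `type`.
const renderers = {
  text: (p) => p.text,
  slideshow: (p) =>
    `<div class="slideshow" title="${p.caption}">` +
    p.images.map((src) => `<img src="${src}">`).join('') + '</div>'
};

const html = page.map((p) => renderers[p.type](p)).join('\n');
```

Inserting or moving a slideshow is just an array operation, which is exactly the flexibility authors get from the Paragraphs UI while the content stays structured.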

Complex form for adding different types of content called paragraphs

Drupal Paragraphs is Sort of Easy

We use Drupal Paragraphs whenever a particular type of content (a news article, blog post, etc.) is really built up of smaller, interchangeable collections of other fields (text, slideshows, videos, etc.). Drupal Paragraphs are flexible and organized. They let authors create whatever kinds of pages they want, while storing content in a way that is structured and adaptable. Migrations with Paragraphs are generally easier than migrations with special, proprietary embed codes. Breaking content types into Paragraphs gives authors the flexibility they need, without sacrificing structure. You don’t end up with a bunch of garbage pasted into an open WYSIWYG field.

So what’s the catch? Well, the interface isn’t awesome. Using Drupal Paragraphs can add a lot of complexity to the authoring experience. Forms will have nested forms. It can be overwhelming.

Alternatives to Drupal Paragraphs

As I’m writing this, another approach to page building is gathering momentum in the Drupal universe. Layout Builder is currently an experimental module in core, and slated to ship as a stable release with Drupal 8.7. Layout Builder provides a slick drag-and-drop interface for editors to build pages from blocks and fields. We’re excited to see how Layout Builder develops, and to see how well it performs for large editorial websites. For websites with hundreds or thousands of articles, managing pages with Layout Builder may be difficult. As Drupal’s founder, Dries Buytaert, pointed out in a post late last year, “On large sites, the free-form page creation is almost certainly going to be a scalability, maintenance and governance challenge.”

Other open source CMS communities are seeing a similar rise in the demand to provide authors with flexible page-building tools. WordPress released Gutenberg, a powerful drag-and-drop editing experience that lets authors quickly build incredibly flexible pages from a massive library of components. It’s worth noting Gutenberg is not without challenges. It poses accessibility issues. Antithetical to the themes in this post, it does not necessarily produce structured content. It relies on proprietary tokens for referencing embedded blocks of content. But it is very flexible, and offers an expressive interface for authors. For Drupal users, there’s a Drupal port for Gutenberg.

For us at Aten, the balance comes back to making sure content is stored in a way that is structured, can be adaptive, is reusable, and is relatively easy to migrate. And that you, the writer, can easily build flexible web pages.

Structured and Adaptable: Drupal Paragraphs with Layout Control

We’ve been working on an approach that keeps Paragraphs in place as the primary way content is managed and stored, but also gives authors the ability to easily control layout. Using Drupal’s core Layout Discovery system, Entity Reference with Layout is a custom field type that combines layouts and Paragraphs. It’s still in very early experimental development, but we’re excited about the impact this approach might have on making it even easier to create flexible pages. And it uses Paragraphs for content storage, with the benefits we’ve already touched on: content is well-structured and relatively easy to migrate. It’s not as flexible or robust as Layout Builder, but might be a great option for authoring flexible pages with Paragraphs. (More on this in a future post.)

Reusable and Flexible: Advanced Paragraphs

Since Drupal Paragraphs are themselves collections of flexible fields, there are all kinds of interesting ways they can be applied to building complex publishing features. We’re working with a client in publishing who needs the ability to completely customize the way content appears on their home page. They would like to promote existing content to the homepage, but they may want to override article titles, images, and summaries. Since the article authors aren’t the same people editing the home page and other key listing pages, they didn’t want authors to have to think about all of those variations. The way content is presented on an article page isn’t always the best-suited for the homepage and other contexts. We used paragraphs to give home page editors the ability to drop articles onto the page, with fields for overriding everything they need to.

Where to Go From Here

Your CMS should make it easy to write content and build pages. If you’re interested in seeing a demo of Drupal Paragraphs, Layout Builder, or Gutenberg, drop us a line. We’d love to help.

Feb 15 2019

The recent post on Dries’ blog about REST, JSON:API and GraphQL caused a bigger shockwave in the community than we anticipated. A lot of community members asked for our opinion, so we decided to join the conversation.

Apples and Oranges

Comparing GraphQL and JSON:API is very similar to the never-ending stream of blog posts that compare Drupal and WordPress. They simply don’t aim to do the same thing.

While REST and JSON:API are built around the HTTP architecture, GraphQL is not concerned with its transport layer. Sending a GraphQL query over HTTP is one way to use it - and unfortunately the one that got stuck in everybody’s minds - but far from the only one. This is what we are trying to prove with the GraphQL Twig module. It allows you to separate your Twig templates from Drupal’s internal structures and therefore make them easier to maintain and reuse. No HTTP requests involved. If this sparks your interest, watch our two webinars and the Drupal Europe talk on that topic.

So GraphQL is a way to provide typed, implementation-agnostic contracts between systems, and therefore achieve decoupling. REST and JSON:API are about decoupling too, are they not?

What does “decoupling” mean?

The term “decoupling” has been re-purposed for content management systems that don’t necessarily generate the user-facing output themselves (in a “coupled” way) but instead allow clients to retrieve the stored information through an API exposed over HTTP.

So when you build a website using Drupal with its REST, JSON:API or GraphQL 3.x extension and smash a React frontend on top, you achieve decoupling in terms of technologies. You swap Drupal’s rendering layer for React. This might bring performance improvements - our friends at Lullabot showed that decoupling is not the only way to achieve that - and allows you to implement more interactive and engaging user interfaces. But it also comes at a cost.

What you don’t achieve is decoupling, or loose coupling, in the software-architecture sense. Information in Drupal might be accessible to arbitrary clients, but those clients still have to maintain deep knowledge of Drupal’s data structures and conventions (entities, bundles, fields, relations…). You might be able to attach multiple frontends, but you will never be able to replace the Drupal backend. So you have reached the same state of coupling Drupal has offered for years through its ability to run different themes at the same time.

The real purpose of GraphQL

Back when we finished the automatically generated GraphQL schema for Drupal and this huge relation graph would just pop up after you installed the module, we were very proud of ourselves. After all, anybody was able to query for any kind of entity, field, block, menu item or relation between them, and all that with autocompletion!

The harsh reality is that 99.5% of the world doesn’t care what entities, fields or blocks are. Or even worse, they have a completely different understanding of them. A content management system is just one puzzle piece in our clients’ business cases - technology should not be the focus, it’s just there to help achieve the goal.

The real strength of GraphQL is that it allows us to adapt Drupal to the world around it, instead of having to teach everybody how it thinks of it.

Some of you have already noticed that there is a 4.x branch of the GraphQL module lingering, and there have been a lot of questions about what it is. This new version has been developed in parallel over the last year (mainly sponsored by our friendly neighbourhood car manufacturer Daimler) with an emphasis on GraphQL schema definitions.

Instead of just exposing everything Drupal has to offer, it allows us to craft a tailored schema that becomes the single source of truth for all information, operations, and interactions that happen within the system. This contract is not imposed by Drupal, but by the business needs that have to be met.
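As an illustration of what such a tailored contract can look like (all names below are invented for the example), the schema speaks the business's vocabulary, while a resolver quietly maps it onto Drupal's entities, bundles and fields:

```javascript
// Sketch: a tailored contract in the business's vocabulary. The GraphQL
// SDL string below is illustrative; clients only ever see this contract.
const schema = `
  type Article { title: String! url: String! }
  type Query { articles(limit: Int = 10): [Article!]! }
`;

// Behind the contract, a resolver maps business terms onto Drupal
// structures (entity type, bundle, fields) so clients never see them.
function resolveArticles(drupalNodes, limit = 10) {
  return drupalNodes
    .filter((n) => n.type === 'node--article')
    .slice(0, limit)
    .map((n) => ({ title: n.attributes.title, url: n.attributes.path }));
}

const nodes = [
  { type: 'node--article', attributes: { title: 'Hi', path: '/blog/hi' } },
  { type: 'node--page', attributes: { title: 'About', path: '/about' } }
];
const articles = resolveArticles(nodes, 5);
```

If the backend is ever restructured, only the resolver changes; the contract, and everything built on it, stays put.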

A bright future

So, GraphQL is not a recommendation for Drupal Core. What does that mean? Not a lot, since there is not even an issue on drupal.org to pursue that. GraphQL is an advanced tool that requires a certain amount of professionalism (and budget) to reap its benefits. Drupal aims to be used by everyone, and Drupal Core should not burden itself with complexity that is not to the benefit of everyone. That's what contrib space is there for.

The GraphQL module is not going anywhere. Usage statistics are still climbing up and the 3.x branch will remain maintained until we can provide the same out-of-the-box experience and an upgrade path for version 4. If you have questions or opinions you would like to share, please reach out in the #graphql channel on drupal.slack.com or contact us on Twitter.
 

Feb 14 2019

A commercial came on the radio recently advertising a software application that would, basically, revolutionize data management and enable employees to be more efficient. My first thought was, “How can they possibly promise that when they don’t know their customers’ data management processes?” Then, it became clear. The business processes would have to be changed in order to accommodate the software.

Is that appropriate? Is it right that an organization should be required to change the way it conducts business in order to implement a software application? 

Sometimes yes. Sometimes no. 

The following four factors can serve as an essential guide for evaluating the need for change and ensuring a successful transition to a new solution. 

  1. Process Analysis
  2. Process Improvement
  3. Process Support
  4. Change Management

1. Process Analysis

Before committing to the latest and greatest software that promises to revolutionize the way you do business, make sure you’ve already documented current processes and have a clear understanding of what’s working well and what can be improved. Over the last few decades, several methodologies have been created and used to analyze and improve processes. Key among them: Six Sigma, Lean Manufacturing, and Total Quality Management (TQM).

How to Streamline Process Analysis

In simple terms, document the way you create, edit, and manage your data. Task by task, step by step, what does your team do to get the job done? If you see a problem with a process, keep documenting. Improvements, changes, and redesigns come in the next step.

Steps to Document Processes

At the core of all process documentation is the process flowchart: an illustration with various boxes and decision diamonds connected with arrows from one step to the next. Collect as much data and as many contingencies as possible to make this flowchart as meaningful and robust as possible.

Diving Deeper

Integrated Definition (IDEF) methods look at more than just the steps taken to complete a task within a process. IDEF takes the following into account:

  • Input - What is needed to complete the steps in the task? Understanding this can highlight deficiencies that hinder effective completion of a particular step. 
  • Output - What does the task produce? Is the outcome what it needs to be?
  • Mechanisms - What tools are needed to perform the steps in the task? Can new technology make a difference?
  • Controls - What ensures that the steps within the task are being performed correctly?

Swimlane or Functional Flowcharts

In addition to the four items of IDEF data, knowing who performs a task is also important. As you draw the boxes and connect them with arrows, place them in a lane that represents the “who” of the process. Knowing who does what can reveal possible over-tasking and/or bottlenecks in production and efficiency.

Getting Started

Whiteboards and Post-its can be an easy place to start process documentation. With today’s high-definition smartphone cameras, you can sketch flowcharts, take a picture, and then share with stakeholders for their review and input. Once you have collected all relevant information, tools such as Visio or Lucidchart can make complicated process flows easier to create and visualize. 

2. Process Improvement

When you know how your current processes are performed, you can start to make improvements, whether incremental or a complete redesign (a.k.a. reengineering). 

Low-hanging Fruit

Some needed improvements will likely be obvious. For example, someone on the process team says, “I didn’t know you were doing that. I do it, too.” Duplication of effort doesn’t require metrics to determine that steps can be taken to enhance efficiency.

However, most processes require some form of metric collection to determine where improvements can be made. Metrics can include research costs, development time frames, quality assurance oversights, or frequency of downtime.

Knowing What to Change

If the needed improvement isn’t obvious, metrics can help guide the decision-making process. For example, how much time does it take to complete a process today? Could that be improved? Brainstorm ideas on how to shorten the time. Test the change and remeasure. 

Too often, data is collected with the intent of measuring the success of the process, but the data being collected does not reflect objectives. For example, if the goal of a website or a particular page is to teach, metrics concerning the number of page visits do not indicate the degree to which that goal is being achieved. A more telling metric would be the amount of time a visitor spends on the lesson pages. Even more revealing would be the number of quizzes passed after a visitor engaged with the lessons.

Before choosing metrics to be collected, understand your goal, determine the level of improvement you want to see, then start measuring that which will actually inform your goal.
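With some made-up visit data, the difference between a vanity metric and a goal-aligned one looks like this:

```javascript
// Sketch with hypothetical data: raw page visits vs. a metric aligned
// with a teaching goal (quiz pass rate among visitors who took a quiz).
const visits = [
  { secondsOnLesson: 12, tookQuiz: false, passed: false },
  { secondsOnLesson: 340, tookQuiz: true, passed: true },
  { secondsOnLesson: 280, tookQuiz: true, passed: false },
  { secondsOnLesson: 400, tookQuiz: true, passed: true }
];

const pageVisits = visits.length;                      // vanity metric
const quizTakers = visits.filter((v) => v.tookQuiz);
const passRate =
  quizTakers.filter((v) => v.passed).length / quizTakers.length;
```

Four visits sounds fine; a pass rate of two out of three quiz takers is the number that actually speaks to the teaching goal.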

3. Process Support

Key to process improvement is an analysis of how technology can enhance productivity and efficiency. 

For example: “We can cut three days' worth of effort for three employees if we integrate the XYZ software into our process.” This kind of statement sounds like a worthy goal, assuming the cost of the transition doesn’t exceed the long-term savings it can help realize. 

Calculating the Cost of Change

Open source software applications start out with a great price tag: Free. However, unless you use the application as is, there is always a cost associated with turning it into what you need. There are many more factors to consider than the upfront cost of the software. 

Costs can include:

  • Training on new processes
  • Training on the new software application acquired to support the new process
  • A drop in productivity and efficiency until the new process/application is adopted and accepted by your staff
  • Technology needed to support the new application

Keep in mind: the list of ancillary costs involved in the transition is unlikely to appear at the top of the salesperson’s brochure.

Return on Investment

ROI assessments will vary. Two values are needed to make this computation: cost and benefits. 

Once you've computed the short-term and long-term costs, you need to determine the benefits you gain. In a situation where investment directly translates into profit, this calculation can be straightforward.

However, sometimes the benefits are cost savings: “We used to spend this. Now we spend that.” When this is the case, instead of a cost/benefit ratio, a break-even calculation might be required to determine how long it will take to recoup costs. 

The most complicated analysis will result from benefits such as an increase in customer satisfaction in which the benefit does not have a direct monetary value.
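For the cost-savings case, the break-even arithmetic is simple. The numbers below are hypothetical:

```javascript
// Sketch: months to recoup a one-time transition cost from monthly savings.
function monthsToBreakEven(transitionCost, oldMonthlyCost, newMonthlyCost) {
  const monthlySavings = oldMonthlyCost - newMonthlyCost;
  if (monthlySavings <= 0) return Infinity;  // cost is never recouped
  return Math.ceil(transitionCost / monthlySavings);
}

// Hypothetical: $24,000 to transition, spend drops from $10k to $8k/month.
const months = monthsToBreakEven(24000, 10000, 8000);
```

At $2,000 saved per month, the hypothetical $24,000 transition pays for itself in a year; benefits like customer satisfaction have no such direct monetary line.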

4. Change Management

When a team cannot see the benefit or resists change, new initiatives face an uphill battle. That’s why circling back to the process analysis phase and ensuring the buy-in of those who are being counted on to use the new application is a critical success factor.

Keep in mind that employees may not have access to the big-picture business goals that management sees, but change has the greatest chance of being realized if those who will be required to support it are informed as to why it needs to happen and included, at some level, in the decision-making process. 

Indeed, change management doesn’t start when the new application is ready to be implemented. Change management starts when:

  • All the costs of change are considered;
  • The full scope of process changes are identified;
  • Training is planned and delivered; and 
  • The campaign for change acceptance is designed and initiated.

Conclusion

You might be the boss. You might believe that the software application you just discovered is exactly what your company or your department needs. As much as that matters, you need buy-in. You need the support of the people who make your business possible. You also need to engage in a disciplined analysis of the processes that will be impacted, along with anticipated improvements and the level of support that will be deployed.

From ensuring that clients’ entire online presence is compliant with ADA accessibility guidelines to web solutions that optimize impact in the ever-evolving digital environment, we at Promet Source serve as a powerful source of transformative digital possibilities. Contact us today to learn what we can do for you.

Feb 14 2019

Communication is the heart of all human interactions and the media is like the blood pumping all the necessary ideas and expressions. 

Media provides the essential link between the individual and the demands of the technological society.
-Jacques Ellul

We as individuals view hundreds of advertisements each day, digging through our phones with our eyes glued to those tabs. People like us have driven a substantial rise in marketing tactics. 

Marketing tactics such as social media, videos, search engine optimization, mobile paid media, and email marketing have stimulated the need for good-quality content. What our minds decide to pay attention to depends on our interest and on how compelling the advertisement or piece of content is.

Image of a blue cloud that says content under which 28 lego humans are standing and rain is showering upon them through that cloud

It is necessary for organizations to understand their target persona and serve up content that will bust through the clutter and hit home with their customers.

Drupal Media can handle this task beautifully, and can do almost anything by gracefully blending digital assets into content.

You ask how?

I say - let’s find out!

The Evolution of Media Management in Drupal 8 

  • Drupal 8.2 (5 October 2016): basic out-of-the-box media handling.
  • Drupal 8.3 (6 April 2017): enhanced media handling in Drupal 8; migration of Drupal 7 File Entities to Drupal 8 Media Entities.
  • Drupal 8.4 (4 October 2017): introduction of a new Media API in core; for site builders, Drupal 8.4 ships with the new Media module, i.e. the base media entity.
  • Drupal 8.5 (7 March 2018): support for remote video using the oEmbed format.
  • Drupal 8.6 (5 September 2018): for content creators, richer image and media integration and digital asset management.

Media Type and Best Solutions to Handle Them

A media type, as we know, generally categorizes data content: an application, audio content, an image, a text message, a video stream, and so on. The media type tells applications what kind of handler is needed to process the content. Media types like pictures, graphics, icons, and video are handled beautifully with the help of Drupal modules.

Media types can be handled with the help of some practices:

  • Media Module Maintenance 

Module maintenance in Drupal can be achieved with the help of distinct features and functionalities. The status report screen (which checks for a variety of issues), cron (which automates website tasks every N hours), caching, and deployment are some of the pieces of the whole module maintenance picture.

The Media module provides a “base” entity for assets. This allows users to standardize Drupal site interactions with media resources. Local files, images, YouTube videos, and tweets can all be handled with the help of the Media module.

  • Building Distributions 

Setting up a Drupal site typically means downloading and configuring various contributed modules (media and non-media). To make the whole process easier, there are a variety of “pre-configured” versions of Drupal that can be downloaded and used for a specific site. These pre-configured versions of Drupal are called distributions. With these “full-featured” distributions you can easily and quickly set up a site for a specialized purpose.

  • Site Building 

Drupal 8 comes with the most popular text editor and image uploader modules. Both provide users with basic HTML controls and the power to edit content. Modules like Paragraphs give the user a cleaner data structure, and the Environment Indicator module reduces the scope for mistakes by making it clear which environment is being edited.

  • Custom Development 

Drupal is a collection of modules and distributions. With more and more organizations looking to build engaging digital experiences for their stakeholders, the Drupal CMS supports custom development on its platform. Each version brings significant changes to modules that improve the user experience and the efficiency of the website.

Media expectations of content authors and site builders

The State of Drupal 2016 survey, in which 2,900 people participated, identified the top two most requested features for the content creator persona.

The two features most in demand were:

  • Richer media 
  • Media integration

Thus, the Drupal 8 Media Initiative was introduced to provide extensible base functionality for media handling in core: support for reusable assets, media browsing, and remote video, with clean extension points for contributed modules.

In Drupal 7 the Media module was jam-packed with many functionalities. In Drupal 8 it has been recreated and split into separate modules. The three major modules that handle media entities beautifully are:

Media Entity 

To store media assets, the Media Entity module was introduced. It provides a base entity for media, a very basic entity that can refer to all kinds of media objects, and establishes the relation between Drupal and the media resource.

Entity Embed

The Entity Embed module allows entities to be embedded within a text area via the WYSIWYG editor, building on the Editor and Filter modules in core. It lets a site builder add a button to the editor so that editors can embed entities in the text area, hence the name “entity embed”.

Entity Browser

The Entity Browser module provides flexible and generic entity browsing and selection tools. It can be used in any context where one needs to select a number of entities and do something with them. The Inline Entity Form module also integrates with it for media.

Site builders want every type of media usage to be easily tracked and presented to them. These three modules help them achieve this.

Third Party integrations for media solutions 

DAM (Digital Asset Management)

A digital asset is any text or media that is formatted into a binary source and includes the right to use it; digital files that do not include this right are not considered digital assets. Digital assets are categorized into images and multimedia, called media assets, and textual content, and the management of these types of assets is known as Digital Asset Management. Modules like Acquia DAM, Bynder Integration, EMBridge, S3 File Sync, QBank, Asset Bank, MediaValet, and Elvis contribute to the integration of DAM and Drupal media.

Image of a laptop with a chart picture on the screen and five sub-images are connected around it via dots

CDN (content delivery network)

A CDN is a globally distributed network of proxy servers. It offloads static assets like images, videos, CSS, and JS files.

CDNs like Cloudflare offer image compression and are great for content delivery. A CDN provides several advantages over serving the traffic directly:

  1. Assets can be cached in a proxy that is geographically closer to the end users, which usually leads to higher download speeds.
  2. The load of each page response is shared between the origin server and the CDN.
  3. Some CDNs provide page optimization services that further enhance performance and the user experience.

To make the integration easier, Drupal has a CDN module that helps speed up the process and makes it more agile.

Image of the cloud saying CDN, where it has three parts on the left side of the cloud-connected with dots and same is on the right side too


External Storage 

It is not uncommon for large files and folders to get in the way of website speed. Large files are usually not cached, causing every request to load the website slowly. Drupal modules like S3 File System, Storage API, and AmazonS3 contribute greatly to integrating external storage. These modules manage storage and files through their APIs, providing an additional file system for your Drupal sites.

Image of a silver tube that is indicated by two back and forth arrow pointing files sign

Infrastructure 

One of the most prominent examples of integrating infrastructure is Cloudflare. It is one of the biggest networks operating on the Internet. People use Cloudflare services for the purposes of increasing the security and performance of their websites. 

The number of solutions implemented at customers' facilities is rather large today. Often the subsystems of a seemingly unified IT landscape are only loosely connected to each other, or the interaction between them is ensured by file and data transfer via e-mail or from hand to hand.

When content becomes media 

Content on your website starts acting like media because, let's face it, a content repository (the digital content stored in the database) is an associated set of data management, search, and access methods that allow content to be accessed. It includes:

Content Pooling 

Content pooling involves storing material in the form of objects and metadata, as well as the relations between them. Resources (assets, media, etc.) are grouped together for the purpose of maximizing value and minimizing risk.

Content Syndication 

Content syndication comes into play when a suite of Drupal sites needs a way to consume content from a central Drupal source. The CMS provides a number of splendid tools to simplify content creation and moderation: users can create content once and make it available everywhere, pushing content and media items to publish them on any targeted remote site.

Deploy

This Drupal 8 module allows the user to easily stage and preview content across Drupal sites. It automatically manages dependencies between entities and is built on a rich API that can easily be extended.

Contenta CMS 

The main agenda of Contenta CMS is to make content happy. It is a community-driven, API-first distribution for Drupal 8 that provides users with a standard platform along with demo content. Contenta CMS is all about easing the pain of its users when building decoupled applications and websites.

Beyond version 8 

Drupal 8 was launched without support for a media library, so one is now planned for addition to Drupal 8. Developers are currently working on adding a media library so that content authors can select pre-existing media from a library and easily embed it in their posts. Once the media library becomes stable, the old file upload functionality can be deprecated and the new media library made the default experience.

Instead of being developed as a separate code base, Drupal 9 is being built within Drupal 8, which means new functionality is added as backward-compatible code alongside experimental features. For contributed module authors, Drupal 9 compatibility is being worked on (before the release of Drupal 9, users are able to update their media modules for the new media library).

Image of a timeline showing 2019, 2020 and 2021 with an arrow at the start; the versions are written at the top of the line on a blue background. Source: Dries Buytaert's blog

Conclusion

As the world is getting more and more involved in the act of media, the need for handling it has become really important. 

Media is important because it allows the people to transmit information to a larger audience, over a greater length of time. The importance of the media today is immense. Never before in mankind's history have the media had such an influence. 

Yes, Drupal has come a long way in this sector. Contact us at [email protected] to learn more about media handling in your Drupal sites and the services we provide.

Feb 14 2019
Feb 14
Fixing SEMrush-reported duplicate content issues through the Metatag module, Views, and Drupal tokens.
Feb 14 2019
Feb 14

The trend of using JavaScript frameworks with Drupal keeps gaining popularity. It is used for creating rich, fast, and interactive web interfaces. One of the hot use areas is decoupled (headless Drupal 8) architecture, with the front-end part completely entrusted to a JS framework. There are many JS frameworks and libraries that pair well with Drupal 8 — React, Angular, Gatsby, Ember, Elm etc. Today, we will review one of them, Vue.js. Let’s take a closer look at Drupal 8 and Vue.js combination.

Vue.js: candidate for Drupal core

Adopting JavaScript framework in Drupal core would improve the admin UX, according to Drupal creator Dries Buytaert, who spoke at DrupalCon Vienna in 2017. Core committers and developers with the relevant experience agreed they needed to think and choose the optimal JavaScript framework.

The main proposal was to choose React. However, another strong contender soon emerged as a candidate for Drupal core — Vue.js. Let’s take a closer look at this rising star.

Why Vue.js? The framework and its benefits

Vue.js is a progressive open-source JavaScript framework for building interfaces and single-page applications. It is often called a library, but the official term on the Vue.js website is “framework”.

Vue.js was created by Google employee Evan You when he was working with Angular. Evan says he wanted to take the things he liked about Angular and create “something really lightweight.”

Since its creation in 2014, Vue.js has reached 127,500+ stars on GitHub and recently surpassed its counterpart React (122,000+ stars). Angular (59,300+ stars) completes the top three trending JS tools. As we can see, Vue demonstrates the most rapid growth.

Image of Vue, React, and Angular GitHub star ratings

The number of NPM downloads for Vue is 796,700+ each week. The growing popularity of Vue.js is explained by its benefits:

  • lightweight and easy to learn for new developers
  • clear and detailed documentation
  • adoption within the PHP community (for example, Laravel framework has an out-of-box Vue support)
  • active and helpful Vue.js community
  • used by giants (Xiaomi, Netflix, Alibaba, Adobe, GitLab, parts of Facebook, WizzAir, EuroNews, Grammarly, Laracasts, Behance, Codeship, Reuters etc.)
  • high performance due to two-way data binding and virtual DOM (like in Angular and React)
  • even large apps can be built with self-contained and often reusable Vue components

Drupal 8 and Vue.js combination

With the Drupal 8 and Vue.js combination, developers can enrich Drupal interfaces with reactive features with no jQuery, use ready Vue components, or build single-page applications that consume Drupal 8 data.

On the Drupal side, we need to prepare the content for REST export. For example, we can create a View, click “provide a REST export” in the View settings, and add a path. This works when the Web Services set of modules is enabled in Drupal core.
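Once the REST export is in place, a Vue (or plain JavaScript) front end can consume it with a plain fetch call. A minimal sketch, assuming the View's export path was set to /api/articles and that the export serializes rows as a JSON array (both are assumptions; adjust to your configuration):

```javascript
// Hypothetical REST export path; must match the path configured on the View.
const EXPORT_PATH = '/api/articles';

// Build the absolute URL of the export against a Drupal host.
function exportUrl(base, path) {
  return new URL(path, base).toString();
}

// Fetch and unwrap the exported rows. By default, a Views REST export
// returns a plain JSON array of row objects.
async function fetchArticles(base) {
  const res = await fetch(exportUrl(base, EXPORT_PATH));
  if (!res.ok) throw new Error(`REST export request failed: ${res.status}`);
  return res.json();
}
```

In a Vue component, fetchArticles() would typically be called from the created() or mounted() hook, with the result assigned to a data property that the template iterates over.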

To set up a Vue.js application, we need to make sure Node and NPM are installed, and then use the Vue CLI 3 tool. It does a lot of work for developers, which speeds up the process: it automatically plugs in the selected libraries, configures Webpack to optimize all files, provides live app updates every time you save changes, and so on. Finally, it offers a graphical user interface, the Vue Project Manager.

The commands for starting a Vue.js application are:

  • npm install -g @vue/cli
  • vue create projectname
  • cd projectname
  • npm run serve
Image of a Vue.js application

Then Vue router and Vue resource need to be added to the project. The main.js file is then configured to use Vue routers and Vue resources, the app.js is modified to work with the router, and app components and routers are set up.
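As a rough illustration of that wiring, the route table in main.js is just an array mapping paths to components. The sketch below uses string placeholders instead of imported components so it stays self-contained; in a real main.js these would be Vue single-file components passed to new VueRouter({ routes }):

```javascript
// Route definitions (component names are placeholders for imported components).
const routes = [
  { path: '/', name: 'home', component: 'App' },
  { path: '/articles', name: 'articles', component: 'ArticleList' },
];

// What the router does conceptually: resolve a path to its component.
function componentFor(path) {
  const match = routes.find((r) => r.path === path);
  return match ? match.component : null;
}
```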

Drupal 8 modules for working with Vue.js

There is an ecosystem of contributed Drupal modules for the Drupal 8 and Vue.js combination:

  • Vue.js builds helps connect Drupal and Vue.js. It can be used in a Twig template, attached programmatically, or added as a dependency to a YAML file.
  • Vue Views adds a new format to Drupal views so the results are displayed in Vue.js, which makes it possible to use Vue features like asynchronous loading.
  • Webpack Vuejs provides Vue configuration for the Webpack Bundler Drupal module, which helps developers bundle Drupal libraries with Webpack.
  • Decoupled Blocks: Vue.js is the Vue.js version of the Decoupled Blocks module, which adds Vue.js blocks to websites via modules or themes.

Let’s combine Drupal 8 and Vue.js for you

We have taken a glimpse at the possibilities of a Drupal 8 and Vue.js combination. Our Drupal team is always on top of the latest JavaScript technologies. Contact us and we will help you choose the optimal one for you, and bring all your ideas to life.

Feb 14 2019
Feb 14

Our Glazed framework theme allows users to have control over every aspect of a Drupal site: from typography to colors, grid design and navigation. Combine this with our Drag and Drop builder and everything you need on a professional website can be designed and developed directly in the browser. This empowers your marketing and design staff to work efficiently without incurring heavy IT costs. 

When you take a look at Drupal competitors such as WordPress and cloud-based solutions like Squarespace, one of the main reasons they successfully skyrocketed in the web development industry is the simple front-end editing experience and the value this experience brings to users. Glazed Builder brings this modern site-building experience to the Drupal world by combining the power and unique aspects of Drupal with the simplicity and intuitiveness of Drag and Drop technology.

Glazed Builder is different from Wix, Squarespace, or any other Drag and Drop builders: it's made for Drupal and deeply integrated with Drupal APIs. It acts as a Drupal field formatter and you can have multiple instances per page, for your footer, main content, blocks, and even custom entity types. It automatically understands Drupal's revision system, language systems, workflow states, and permissions. This makes it one of the most advanced visual page builders in the world from a website architecture perspective.

How Sooperthemes products create a better Drupal experience 

Drag and Drop tools have evolved to be more powerful, produce better code, and leverage frontend frameworks to create a fluent authoring experience that runs in the browser. In Glazed Builder this experience is integrated with Views and the block systems: you can create highly dynamic pages and even dashboards with Drag and Drop, without losing reusability of the components you build. It is available for both Drupal 8 and Drupal 7, and provides the tools to easily perform difficult customization tasks. It lets the user focus on creating value for the customers and leave the technical aspects behind. It's intuitive and easy to use out of the box. 

Adding Glazed Builder on top of your existing Drupal 8 stack

Feb 14 2019
Feb 14

How do you run a page speed audit from a user experience standpoint? For, let's face it: website performance is user experience! 

What are the easiest and most effective ways to measure your Drupal website's performance? What auditing tools should you be using? How do you identify the critical metrics to focus your audit on?

And, once identified, how do you turn the collected data into speed optimization decisions? Into targeted performance improvement solutions...

Also, how fast is “ideally fast”, in the context of your competition's highest scores and of your users' expectations?

Here are the easiest steps of an effective page performance audit, with a focus on the prompt actions you could take for improving it.
 

1. Front-End vs Back-End Performance

They both have their share of impact on the overall user experience:

Long response times will discourage, frustrate and eventually get your website visitors to switch over to your competition.
 

Front-End Performance 

It's made up of all the elements included in the page loading process, as executed by the browser: images, HTML, CSS and JavaScript files, third-party scripts...

The whole process boils down to:

Downloading all these elements and putting them together to render the web page that the user will interact with.
 

Back-End Performance 

It covers all those operations performed by your server to build page content.

And here, the key metric to measure is TTFB (Time To First Byte).

It's made of 3 main elements:
 

  1. connection latency
  2. connection speed
  3. the time needed for the server to render and serve the content
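In the browser, TTFB can be read from the Navigation Timing API. A minimal sketch (field names follow the PerformanceNavigationTiming interface; the wiring at the bottom is browser-only):

```javascript
// TTFB: milliseconds from the start of navigation until the first byte of the
// response arrives. responseStart is measured from navigation start, so the
// value includes DNS lookup, connection setup, and server processing time.
function ttfb(entry) {
  return entry.responseStart - entry.startTime;
}

// Browser-only wiring:
// const [nav] = performance.getEntriesByType('navigation');
// console.log(`TTFB: ${ttfb(nav).toFixed(0)} ms`);
```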
     

2. What Should You Measure More Precisely? 5 Most Important Metrics

What metrics should you focus your page speed audit on? Here's a list of the 5 most relevant ones:
 

a. Speed index

The essential indicator that will help you determine the perceived performance on your Drupal website:

How long does it take for the content within the viewport to become fully visible?

When it comes to optimization techniques targeting the speed index, “battle-tested” techniques, like lazyloading and Critical CSS, are still the most effective ones.
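As a sketch of the lazy-loading half of that advice, images can be written with a data-src attribute and upgraded only when they approach the viewport. The IntersectionObserver wiring is browser-only; upgrade() is the pure part:

```javascript
// Swap an image's deferred data-src into src to start the real download.
function upgrade(img) {
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src; // mark as upgraded
  }
  return img;
}

// Browser-only wiring: upgrade images ~200px before they enter the viewport.
// const io = new IntersectionObserver((entries) => {
//   entries.forEach((e) => {
//     if (e.isIntersecting) { upgrade(e.target); io.unobserve(e.target); }
//   });
// }, { rootMargin: '200px' });
// document.querySelectorAll('img[data-src]').forEach((img) => io.observe(img));
```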
 

b. Time to first byte

As previously mentioned, the back-end performance:

Measures the time passed from the user's HTTP request to the first byte of the page being received by the browser.
 

c. Start render

The time required to render the first non-white content in the client's browser.

Note: the subsequent requests are “the usual suspects” here, so you'd better ask yourself how you can reduce, defer or relocate them. Maybe you'd consider a service worker?
 

d. Load time

How long does it take for the browser to trigger the window load event? For the content on the requested page to get loaded?

Note: consider enabling HTTP/2, with a dramatic impact on individual page requests.
 

e. Fully loaded

It measures the time until... zero network activity, when even the JavaScript files have all finished executing.

Note: make sure your third-party scripts are “tamed”. They're the factors chiefly responsible for high fully-loaded times.
 

3. How to Perform a Page Speed Audit: 5 Useful Tools

Now that you know what precisely to analyze and evaluate, the next question is:

“How do I measure these key metrics?”

And here are some of the easiest to use and most effective tools to rely on when running your page performance audit:
 

Use them to:
 

  • collect your valuable data on all the above-mentioned metrics
  • get an insight into the page rendering process performed by the browser
  • identify the “sore spots” to work on
  • automate repeated page speed tests
  • keep monitoring your website (SpeedCurve) across multiple devices and in relation to your competition's performance 
  • get a peek into your web page's structure and into the process of downloading resources over the network (Chrome DevTools)


4. 3 Key Benchmarks to Evaluate Your Website's Performance

So, now that you've got your “target metrics” and your toolbox ready, you wonder: 

“What should I measure those metrics against?”

And here, there are 3 major benchmark-setting processes to include in your page speed audit:
 

  • determine your competition: your current production site before its rebuild, your direct and indirect “rivaling” websites
  • determine the key pages on your Drupal website: homepage, product listing page, product detail page etc.
  • measure your competition's web page performance
     

5. Most Common Performance Bottlenecks & Handiest Solutions

Here are the most “predictable” culprits that you'll identify during your page speed audit, along with the targeted performance-boosting measures to take:
 

Factors Impacting the Front-End Performance & Solutions

a. Too many embedded resources

Too many embedded stylesheets, JavaScript files and images pose an extra challenge for your page loading time: they block the rendering of the page.

Each connection setup, DNS lookup and queuing translates into... overhead, with an impact on your site's perceived performance.

The solution: consider caching, minification, aggregation, compression...
 

b. Oversized files

Oversized images (and stylesheets and JavaScript files) sure are the main “culprits” for long response times on any Drupal website.

The solution: consider aggregating/compressing them, turning on caching, lazyloading, resizing etc.
 

c. Wrongly configured cache

Is your website properly cached? Have you optimized your cache settings? Or is it possible that your Drupal cache could be broken? 

If so, then it will have no power to reduce latency, to eliminate unnecessary rendering.

The solution: look into your response headers, URL/pattern configuration and expiration settings, and fix the problematic cache settings.
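Checking those headers can be scripted. A small sketch (fetch is available in browsers and Node 18+; X-Drupal-Cache is the header Drupal's internal page cache sends, and maxAge() is a hypothetical helper for reading Cache-Control):

```javascript
// Extract max-age (in seconds) from a Cache-Control header value.
function maxAge(cacheControl) {
  const m = /max-age=(\d+)/.exec(cacheControl || '');
  return m ? Number(m[1]) : null;
}

// Fetch a page's cache-related response headers.
async function cacheHeaders(url) {
  const res = await fetch(url, { method: 'HEAD' });
  return {
    maxAge: maxAge(res.headers.get('cache-control')),
    expires: res.headers.get('expires'),
    // 'HIT' here means Drupal's internal page cache answered the request.
    drupalCache: res.headers.get('x-drupal-cache'),
  };
}
```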
 

d. Non-optimized fonts

Your heavy fonts, too, might play their part in dragging down your website.

The solution: consider caching, delivery, compression, and character sets.
 

In conclusion: do re-evaluate all your modal windows, third-party scripts and image carousels. Is their positive impact on the user experience worth the price you pay: a negative impact on your page loading time?  
     

Word of caution on caching:

Take care not to overuse caching as a performance-boosting technique. If there are serious back-end performance issues on your website, address them; using caching to mask them is not the answer. Caching works as a performance improvement technique only on already-working systems.
 

Factors Impacting the Back-End Performance & Solutions

And there are some handy, straightforward measures that you could take for addressing back-end performance issues, as well:

  • Consider optimizing the page rendering process directly in the CMS.
     
  • Upgrade your server's hardware infrastructure (e.g. load balancing, RAM, disk space, MySQL tuning, moving to PHP7).
     
  • Keep the number of redirects to a minimum (since each one of them would only add another round trip, which bubbles up to the TTFB).
     
  • Reconfigure those software components that are lower in the server stack (caching back-end, application container).
     
  • Consider using a CDN; it won't serve everything, but it will shorten the round-trip distance.
     
  • Consider using Redis, Varnish.
     

6. Final Word(s) of Advice

Here are some... extra tips, if you wish, to perform a page speed audit easily and effectively:
 

  • remember to run your audit both before and after you apply your targeted performance-improvement techniques
  • grow a habit of tracking weekly results
  • define the goal of your Drupal website performance test: what precisely should it test and measure and under which circumstances?
  • … for instance, you could be analyzing your site's back-end performance only: the time requested for generating the most popular web pages, under a load of 700 concurrent visitors, let's say (so, you won't be testing connection speed or the browser rendering process)
  • pay great attention to the way you configure your page speed audit system if you aim for accurate and representative results
     

The END!

This is the easy, yet effective way of performing a website speed and optimization audit. What other tools and techniques have you been using so far?

Photo by Veri Ivanova on Unsplash

Feb 14 2019
Feb 14

I've been thinking about the performance of my site and how it affects the user experience. There are real, ethical concerns to poor web performance. These include accessibility, inclusion, waste and environmental concerns.

A faster site is more accessible, and therefore more inclusive for people visiting from a mobile device, or from areas in the world with slow or expensive internet.

For those reasons, I decided to see if I could improve the performance of my site. I used the excellent https://webpagetest.org to benchmark a simple blog post https://dri.es/relentlessly-eliminating-barriers-to-growth.

A diagram that shows page load times for dri.es before making performance improvements

The image above shows that it took a browser 0.722 seconds to download and render the page (see blue vertical line):

  • The first 210 milliseconds are used to set up the connection, which includes the DNS lookup, TCP handshake and the SSL negotiation.
  • The next 260 milliseconds (from 0.21 seconds to 0.47 seconds) are spent downloading the rendered HTML file, two CSS files and one JavaScript file.
  • After everything is downloaded, the final 330 milliseconds (from 0.475 seconds to 0.8 seconds) are used to layout the page and execute the JavaScript code.

By most standards, 0.722 seconds is pretty fast. In fact, according to HTTP Archive, it takes more than 2.4 seconds to download and render the average web page on a laptop or desktop computer.

Regardless, I noticed that the length of the horizontal green bars and the horizontal yellow bar was relatively long compared to that of the blue bar. In other words, a lot of time is spent downloading JavaScript (yellow horizontal bar) and CSS (two green horizontal bars) instead of the HTML, including the actual content of the blog post (blue bar).

To fix, I did two things:

  1. Use vanilla JavaScript. I replaced my jQuery-based JavaScript with vanilla JavaScript. Without impacting the functionality of my site, the amount of JavaScript went from almost 45 KB to 699 bytes, good for a savings of over 6,000 percent.
  2. Conditionally include CSS. For example, I use Prism.js for syntax highlighting code snippets in blog posts. prism.css was downloaded for every page request, even when there were no code snippets to highlight. Using Drupal's render system, it's easy to conditionally include CSS. By taking advantage of that, I was able to reduce the amount of CSS downloaded by 47 percent — from 4.7 KB to 2.5 KB.

According to the January 1st, 2019 run of HTTP Archive, the median page requires 396 KB of JavaScript and 60 KB of CSS. I'm proud that my site is well under these medians.

  • JavaScript: 45 KB before, 699 bytes after (world-wide median: 396 KB)
  • CSS: 4.7 KB before, 2.5 KB after (world-wide median: 60 KB)

Because the new JavaScript and CSS files are significantly smaller, it takes the browser less time to download, parse and render them. As a result, the same blog post is now available in 0.465 seconds instead of 0.722 seconds, or 35% faster.

After a new https://webpagetest.org test run, you can clearly see that the bars for the CSS and JavaScript files became visually shorter:

A diagram that shows page load times for dri.es after making performance improvements

To optimize the user experience of my site, I want it to be fast. I hope that others will see that bloated websites can come at a great cost, and will consider using tools like https://webpagetest.org to make their sites more performant.

I'll keep working on making my website even faster. As a next step, I plan to make pages with images faster by using lazy image loading.

February 13, 2019

2 min read time

Feb 13 2019
Feb 13

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

An abstract image of three boxes

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or follows a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below, but we discuss each of these four categories (or rows in the table) in more depth below. If you aggregate the ratings in the table, you see that we rank JSON:API above GraphQL and GraphQL above REST for Drupal core's needs.

Request efficiency
  • REST: Poor; multiple requests are needed to satisfy common needs. Responses are bloated.
  • JSON:API: Excellent; a single request is usually sufficient for most needs. Responses can be tailored to return only what is required.
  • GraphQL: Excellent; a single request is usually sufficient for most needs. Responses only include exactly what was requested.

Documentation, API explorability and schema
  • REST: Poor; no schema, not explorable.
  • JSON:API: Acceptable; generic schema only; links and error messages are self-documenting.
  • GraphQL: Excellent; precise schema; excellent tooling for exploration and documentation.

Operational simplicity
  • REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
  • JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
  • GraphQL: Poor; extra infrastructure is often necessary; client-side libraries are a practical necessity; specific patterns are required to benefit from CDNs and browser caches.

Writing data
  • REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
  • JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, but support for multiple writes is being added to the specification.
  • GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it's possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article responses (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data, but easily leads to very large GraphQL queries that make (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's field selection; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, the sparse fieldsets can be omitted so that the request remains cacheable.
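As a sketch of how sparse fieldsets and includes look in practice, such a request can be expressed purely as URL query parameters. The /jsonapi path and the node--article / user--user resource type names are Drupal's defaults; the domain and fields are hypothetical:

```shell
# A sketch of a JSON:API request combining an include with sparse fieldsets.
BASE='https://example.com/jsonapi/node/article'
# include=uid embeds each article's author in the same response;
# fields[...] trims each resource type to only the listed fields.
QUERY='include=uid&fields[node--article]=title,created&fields[user--user]=name'
URL="$BASE?$QUERY"
echo "$URL"
```

Dropping the fields[...] parameters keeps the request valid (JSON:API then simply returns all fields), which is the fallback described above when a query grows too large.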

Multiple data objects in a single response
  • REST: Usually; but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, but developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; only if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links rarely available.
  • JSON:API: Acceptable; observing available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocomplete, instant results or compilation errors, complete and contextual documentation.

Validatable and programmable schema
  • REST: Depends; only if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout, when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as with REST or JSON:API without persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and they also mean the architecture is no longer fully decoupled — if a client wants to retrieve different data, server-side changes are required.

Scalability: additional infrastructure requirements
  • REST: Excellent; same as a regular website (Varnish, CDN, etc.).
  • JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
  • GraphQL: Usually poor; only the simplest queries can use GET requests; to reap the full benefit of GraphQL, servers need their own tooling.

Tooling ecosystem
  • REST: Acceptable; lots of developer tools available, but for the best experience they need to be customized for the implementation.
  • JSON:API: Excellent; lots of developer tools available; tools don't need to be implementation-specific.
  • GraphQL: Excellent; lots of developer tools available; tools don't need to be implementation-specific.

Typical points of failure
  • REST: Fewer; server, client.
  • JSON:API: Fewer; server, client.
  • GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of the GET HTTP method, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better, generic tooling and less time spent on server-side details.

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests; one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

Writing data
  • REST: Acceptable; every implementation is different; no bulk support.
  • JSON:API: Excellent; JSON:API prescribes a complete solution for handling writes; bulk operations are coming soon.
  • GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement, and there are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means that they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
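As a sketch of such a client-crafted query (the parameter names follow Drupal's JSON:API module; the domain is a placeholder), a consumer could request the ten newest published articles without any server-side change:

```shell
# A client-built JSON:API collection query: filter, sort and page entirely
# from the consumer side, with no Drupal configuration change required.
FILTER='filter[status][value]=1'   # only published content
SORT='sort=-created'               # newest first
PAGE='page[limit]=10'              # first ten results
URL="https://example.com/jsonapi/node/article?$FILTER&$SORT&$PAGE"
echo "$URL"
```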

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

Ease of installation and configuration
  • REST: Poor; requires the contributed REST UI module; easy to break clients by changing configuration.
  • JSON:API: Excellent; zero configuration!
  • GraphQL: Poor; more complex to use; may require additional permissions, configuration or custom code.

Automatically generated documentation
  • REST: Acceptable; requires the contributed OpenAPI module.
  • JSON:API: Acceptable; requires the contributed OpenAPI module.
  • GraphQL: Excellent; GraphQL Voyager included.

Security: content-level access control (entity and field access)
  • REST: Excellent; content-level access control respected.
  • JSON:API: Excellent; content-level access control respected, even in queries.
  • GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.

Decoupled filtering (client can craft queries without server-side intervention)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, for Drupal core's needs, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. That said, Drupal's GraphQL implementation is fantastic, especially when you have the developer capacity to build a bespoke API for your project.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

 February 11, 2019

 11 min read time


Feb 13 2019

Both Drush and Drupal Console have built-in help. If you type drush, you will get a long list of available commands; if you type drupal, you will likewise get a long list of available console commands. If you can’t remember the exact command for what you need, you can scroll through the long list of commands to find the one you want.

But scrolling through the long list can be a real pain. Wouldn’t it be easier to see commands that relate to a particular activity without all the other commands in the same list?

Target the Drush or Console commands by keyword

Fortunately you can isolate what you need by adding the grep command to the drush or drupal console command, along with a pipe (|). This will allow you to filter the results by a keyword.

Here are a couple of examples.

Show watchdog Drush commands

You can use the Drush watchdog command to list messages in the database log, delete records and more.

drush | grep watchdog

This will return the following watchdog commands:

Drush grep watchdog

Show theme related Drupal console commands

You can use Drupal console to generate a new theme, download a theme, uninstall a theme and more.

drupal | grep theme

This will return the following:

drupal grep theme

As you can see, this is a list of Drupal console theme commands.

Show generate console commands

Drupal console comes with a set of generate commands that you can use to generate the module controller, routes, menus and blocks for a custom module.

drupal | grep generate

This will return the following list of generate commands:

drupal grep generate

There are more generate commands than the above graphic. You can filter this down even further by making your search more specific. For example, if you need to generate a new plugin, you can search for generate:plugin with:

drupal | grep generate:plugin

This will return the following list of generate commands:

drupal grep generate plugin

The solution explained

What is grep?

Grep is a Unix command that will search and match based on a particular criteria.

What is pipe?

The pipe command allows you to connect commands together. The output from the first command will be passed through the second command.

This means that the following:

  • Running the drush or drupal command returns the full list of available commands with a brief description of how to use each one
  • Adding the pipe (|) after the drush or drupal command connects a second command
  • Adding grep and a keyword after the pipe passes the output of the drush or drupal command as input to the second command, grep; in other words, grep searches the full list and returns just the lines that match the keyword
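The same pattern works with any command that prints lines. Here is a self-contained sketch, using printf to stand in for drush so it runs even without Drush installed:

```shell
# printf stands in for `drush`, emitting a small command list.
printf 'watchdog:show\ncache:rebuild\nwatchdog:delete\n' | grep watchdog

# Two handy grep variations:
#   -i  makes the match case-insensitive
#   -c  prints the number of matching lines instead of the lines themselves
printf 'Watchdog:show\ncache:rebuild\n' | grep -i watchdog
printf 'watchdog:show\nwatchdog:delete\n' | grep -c watchdog  # prints 2
```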
Feb 13 2019

The events that every year bring us together all over the World are the lifeblood of our vibrant and diverse community. From BADCamp in California, USA, through Global Training Days in Omsk, Russia to Contribution Weekends in Leeds, UK, we have a combined wealth of knowledge and experiences but also challenges and opportunities.

At the Drupal Association, we value highly the commitment made by those who volunteer their time to make these events happen. If I wasn’t British, and therefore terribly understated, at this point I would probably say “You rock!”

As an event organiser, I wanted to take the opportunity to bring to your attention a few things happening in the community that we hope will help you in your efforts.

The Event Organizers’ Group

We were very pleased to see the creation of a growing group of event organizers forming and beginning to have regular meetings to share experiences and resources. One of its founders, Kaleem Clarkson, has blogged about this and their plans for the future.

To help with their formation, we helped them become one of the first groups to create one of the new Community Group Sections at Drupal.org/community/event-organizers.

One thing that is super important, though, is that this group has representation by event organizers from all around the World. Wherever you live, do join this group, make your voice heard and ensure that it meets in a way that works for you.

The Event Organizers’ Round Table with Dries Buytaert

One of the requests we had from the group was to be able to spend more time discussing their aims with the Project founder, Dries Buytaert.

Dries agreed that spending quality time with representatives of such a key part of our community was highly valuable, so we have created a round table at DrupalCon Seattle 2019 to which event organizers will be invited to attend and participate.

The invites will be sent to the mailing segment introduced below - make sure you read on!

Image of Dries taking notes from a previous Round Table, courtesy Baddý Sonja Breidert

A mailing list segment especially for event organizers

To ensure that we can alert event organizers to activities like the above, we have created a special segment in our mailing list system that allows us to send updates to only those that want to hear about them.

If you are an Event Organizer, or looking to become one in the future, I highly recommend you visit your user profile on Drupal.org and check the box “Contact me regarding local organizing (camps, groups, etc.)”

For the above-mentioned Round Table with Dries Buytaert, we will be sending invites to members of this mailing list, so make sure you are on it!

screenshot of user account settings

Feb 13 2019
In this fifth installment of our series on conversational usability, our focus shifts to the process of evaluating and improving conversational interfaces, which often differ significantly from the visual and physical interfaces we normally test with end users.
Feb 13 2019

The Micro Site module allows you to set up a Drupal web factory: from a single Drupal 8 instance, it powers a multitude of sites, of different natures if necessary. Micro Site provides a new content entity that lets you customize these different site types at will, using the APIs available in Drupal 8, and modify their behavior according to business needs. Let's detail how to proceed, through a basic example: providing a modular footer on different powered sites and displaying it on all the pages that make up each site.

This operation can be summed up as a little bit of site building, by adding and configuring a few fields, and a little bit of theming by adapting the general template of the pages of a site to be able to display this footer.

Site building

Add field on site type

For example, on a site type called Generic we can create a new Paragraphs field called Site Footer.

field site footer

Then, when editing the micro site, we can fill in the footer of the page from a number of configured paragraph types (a links type to add a list of links, a text type for free text, or even a paragraph that loads a registration form, etc.).

Footer example

In this example we use paragraphs to feed the footer of the micro site, allowing each micro site great modularity to customize its footer according to its needs. But we could just as easily have used simpler, more constrained fields to limit what can be filled in the footer. The point here is to assess the business needs of each type of site we want to set up.

Theming

No great innovation or thundering trick here. Once the field (or fields) feeding the micro site's footer is created, all that remains is to display it in the page.html.twig template, in a site-specific override (page--site.html.twig), or in an override specific to the site type, for example page--site--generic.html.twig.

First, we implement a preprocess function on the site pages to provide the template with the site_footer variable.

use Drupal\micro_site\Entity\SiteInterface;

/**
 * Implements hook_preprocess_HOOK().
 *
 * Provide site variables to override the footer and menu in the page
 * template dedicated to site entities. MicroDrupalConstant is a custom
 * constants class from this example module that holds the machine names
 * of the fields created above.
 */
function micro_drupal_preprocess_page(&$variables) {
  $negotiator = \Drupal::service('micro_site.negotiator');
  /** @var \Drupal\micro_site\Entity\SiteInterface $site */
  $site = $negotiator->getActiveSite();
  if ($site instanceof SiteInterface) {
    if ($site->hasField(MicroDrupalConstant::SITE_FOOTER)) {
      if (!$site->get(MicroDrupalConstant::SITE_FOOTER)->isEmpty()) {
        $viewBuilder = \Drupal::entityTypeManager()->getViewBuilder('site');
        $field_footer = $site->get(MicroDrupalConstant::SITE_FOOTER);
        $site_footer = $viewBuilder->viewField($field_footer, 'footer');
        $variables['page']['site_footer'] = $site_footer;
      }
    }
  }
}

Then, in the page--site.html.twig template, we display this new variable.

{% if page.site_footer %}
  <!-- #site-footer -->
  <footer id="site-footer" class="clearfix site-footer-wrapper">
    <div class="site-footer-wrapper-inside clearfix">
      <div class="container">
        <div class="row">
          <div class="col-xs-12">
            <div class="site-footer">
              {{ page.site_footer }}
            </div>
          </div>
        </div>
      </div>
    </div>
  </footer>
  <!-- EOF: #site-footer -->
{% endif %}

And that's it. The rest is just a little styling with some magical CSS rules.

What about a sidebar?

Using the Drupal 8 block positioning system, we can place as many blocks as necessary in a sidebar region, and control visibility by site, or by site type, as needed. But this configuration must then be done at the administrator level of the Drupal Master instance and not at the level of each micro site.

Following the same logic, we can very well set up dedicated fields (modular or not) to feed a sidebar present on each micro site, and then allow this time the administration and management, at each level of a micro site, of the content of these sidebars and control their visibility at this level. The implementation will be slightly more complex, to give full control to an administrator of a micro site on the visibility rules of the created content, but the basic principle remains the same: a touch of site building and a zest of theming to customize as much as necessary a micro site.

Feb 13 2019

Various forms are at the heart of a website’s interaction with its users. They are vital for usability, conversions, marketing analytics, and more. So form building tools are in demand — the Drupal Webform module ranks 7th among the 42,000+ contributed modules. The Webform module has recently received its stable version for Drupal 8. We will give it an overview and show an example of creating a simple form with the Webform module.

Webform module in Drupal 8: finally stable

The Webform module lets us create any kind of forms from simple to complicated multi-page ones. It also adds settings and features to them like statistics collection, email notifications, confirmations, data export in various formats, fine-tuned access, conditional logic, filtering and sorting, pushing results to CRM, downloading results as CSV, and more.

The stable version for Drupal 8 finally arrived on December 25, 2018. Drupal creator Dries Buytaert congratulated the Webform module creator Jacob Rockowitz via Twitter and called the release a big milestone.

The release was soon followed by Webform 8.x-5.1 on January 1 and Webform 8.x-5.2-beta1 on January 25.

Defining forms in a YAML file or admin interface

One of the most remarkable new features of the Webform module in Drupal 8 is the ability to define a form in a YAML file. Webform module creator Jacob Rockowitz originally built the YAML Form module specifically for Drupal 8; as of Webform 8.x-5.x, YAML Form has been merged into Webform. Developers can now view a form's YAML source by clicking the “Source” tab.

But there is also a user interface for form creation that is intuitively understandable even for website administrators. The admin dashboard contains examples and videos, as well as a built-in contact form.

The Webform ecosystem of modules in Drupal 8

The Webform module for Drupal 8 is a complex module with 20 submodules that demonstrate its varied capabilities. The minimum ones to enable are Webform and Webform UI modules.
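As a hedged sketch (assuming Drush is available on the site; webform and webform_ui are the standard machine names of those two modules), the minimum set can be enabled in one command:

```shell
# Enable the minimum Webform module set with Drush.
drush en webform webform_ui -y
```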

There is also a number of contributed modules that work together with Webform and extend its capabilities still further: Webform Views, Webform REST, Webform Analysis, Webform Encrypt, Webform Composite Tools, Webform Invitation, CAPTCHA, Honeypot, MailSystem, SMTP, and more.

Creating a simple webform with the Webform 8.x-5.1 module

We will create a feedback form for customers from scratch. With the Webform and Webform UI modules enabled, we go to Structure — Webforms, click “Add webform,” name it “Let us know your thoughts,” and save it.

The form then needs to be completed with fields that are called “elements” in Webform. The “Add element” button leads us to the list of 60+ available fields, both simple and complex.

We will need a field for the customer’s name. For this purpose, there is a complex “name” element with sub-elements (First name, Last name etc.) that can be checked or unchecked to be included in the form or not.

If no complex name structure is needed, we can use a simple “Text field” element for the name. Let’s call it “Your name (optional)” in the “Title.”

For the email field, we select an “Email” element and add a placeholder for email addresses.

An element of “Checkboxes” type will let customers check some options. For our feedback form, we list the company’s services as “option values”.

To let customers give their votes, we will use an element of the “Rating” type and set the value range from 1 to 10.

A “Text area” element will host customers’ comments.

On the list of all fields for our form, we can drag and drop the fields to change their order, or check them to quickly make them required/optional.

Let’s choose where the submitted forms will go. On the form’s dashboard, we select Settings — Emails/Handlers and click “Add email.” There, we configure the “sent to” email address (in our case, the default [site:mail] is OK). We also add a custom subject: “You received a new form submission.”

The “View” tab takes us to our form on the website. If we submit something and check the admin email, we see new messages about form submissions.

Let’s create the forms your Drupal website needs

This was just a very simple form example where we have not even touched advanced options — conditional logic, access, submission handling, JS effects, and so on. The Webform module in Drupal 8 has unlimited capabilities, and our Drupal team can create the forms that will work in accordance with all your requirements.

Feb 13 2019

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

We had a delightful talk with Taco Potze, co-founder of GoalGorilla, Open Social and THX. Taco revealed to us why his team decided on Drupal among the various possible CMS choices and which Drupal initiative they are most excited about. He thinks open source has the potential to balance out the power of tech giants and give people all over the world equal opportunities. Read on to find out more about his projects and his contributions to the community.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My name is Taco Potze, or just Taco usually does it ;). I am the co-Founder of the Drupalshop GoalGorilla and co-Founder of the Open Social and THX projects. I have been on the board of the Dutch Drupal Association for four years and active in organizing various local events such as the DrupalJam. My day to day business focuses on business development for Open Social and getting our latest THX Project up and running. Other than that, I love to travel and take care of our 1-year-old son.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I really started working with Drupal when one of our early clients asked us to build their new website. We were mainly working on online marketing, analytics and UX improvement in those days. 

My co-Founder and I have an industrial engineering background, not in coding per se. We searched for an out-of-the-box CMS that was open source, and Drupal made it to our shortlist. The winning reason for doing the project with Drupal 6 was its multi-language capabilities: the project had to be done in English and Chinese. Adding Chinese menus, blocks and content to the website still gives me nightmares sometimes, over 10 years later ;).

Jaap Jan Koster, now our VP of product, and I got the project done over the summer, within time and budget, and ended up with a very happy client. That triggered us to offer more web development services, and soon we were doing lots of projects. We used a variety of open-source CMSs until, in 2010, we decided to do projects only in Drupal.

For us, Drupal provided the best framework for challenging projects, and working with only one CMS meant we could really become experts. The early years did not include many Drupal contributions, I have to admit. We did not fully understand how important contributions (on all levels) are, and we lacked some of the skills to make worthwhile contributions. This changed over time as we started contributing modules back and matured with the Open Social distribution, in which we have invested tens of thousands of hours.

3. What impact has Drupal made on you? Is there a particular moment you remember?

One of the best aspects of the community is the Dutch Drupal community. We have excellent thought leaders, such as Bert Boerland and Baris Wanschers, who relentlessly push the Drupal community forward.

We’ve had many successful events such as the DrupalJam, Frontend United and Splash Awards. There are informal meetings with developers or members of the board, and cooperation exists in distribution projects such as Dimpact WIM or DVG. Instead of competing with negative sentiment, we are competing but also working together to push our projects and companies forward.

A while ago, I even helped pitch an Open Social project for another Drupal agency (which we won). When I tell other companies about this ecosystem, at times they are skeptical and think that I am overselling or that we don't really compete or cooperate. However, with over 10 years of experience as a community, we have proven we can. The community is growing, Drupal is still winning market share, and companies are flourishing. I think this has made a profound impact on me as an entrepreneur.

4. How do you explain what Drupal is to other, non-Drupal people?

It depends who you are talking to. At a birthday party, you might want to simplify more than when talking to a potential client that hasn't heard of Drupal yet. I always amplify the message that it's a huge global community all working on the same IT project contributing code, sharing information and meeting at events all around the world.

I might share some of my worries about the power of big tech companies (Facebook tends to be a good example) and how we are trying to balance the scale by being completely open and transparent. I love sharing the idea that work we have done on Open Social gives people all around the world, say in developing countries, the same opportunities to organize and engage and drive their missions as companies with larger budgets.

For me, working on open source is a principled choice. Drupal is one of the projects where the importance of open source comes first. If I can make somebody aware of that, and of the choice they might have in that one day, then it was a good conversation.

5. How did you see Drupal evolving over the years? What do you think the future will bring?

These next few questions about Drupal I answered with the help from my team.

Our team sees Drupal evolving into an API-first platform, something we definitely applaud when looking at the possibilities out there that are related to this innovation (e.g. Internet of Things). We see Drupal being more open to integrations with other systems so we can provide an amazing cross-channel user experience.

6. What are some of the contributions to open source code or to the community that you are most proud of?

Our team works hard to contribute back to the Drupal distribution. It’s actually hard to pick which contributions we are most proud of since every single one of them is something to be proud of. 

However, the contributions we would highlight are all the commits done to Open Social. The fact that we are able to provide a ready solution for everybody to use is very motivating, especially since we can work together with developers from the community who help to make our distribution even better!

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Drupal has many initiatives that we look forward to. One of our developers, Ronald, especially highlighted the Layout Builder:

“I’m really looking forward to using the Layout Builder. We have always struggled with creating a good solution for custom one-off pages with unstructured content, which would provide a lot of flexibility for content managers using Drupal. I think this initiative will produce the “wow factor” to our users and give us the ease of mind by not needing to create difficult custom solutions.” - Ronald te Brake

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor.

Blockchain technology has been a passion of mine for a while, and we are making great strides in adding this exciting technology to Open Social and beyond with our THX Project. It's important to be able to improve engagement in social communities.

With THX you can reward your users for valuable actions they take: for example, filling out their profile, adding content, or contributing code and patches to a community such as Drupal. It also helps transfer reputation from one community to the next and gives us a model for measuring the value of online communities. If you are interested, we have written a white paper and various blog posts on the matter and will publish more information on the project and our DAICO in the upcoming months.
 

Feb 12 2019
Feb 12

One of the most popular articles on our blog is one I wrote a year and a half ago about how to install CiviCRM on Drupal 8.

The method described there worked (and still more-or-less works), but it's... a mess.

It involves running a dozen or so commands, and is pretty easy to get wrong. All of this is just to get code assembled in such a way that you CAN install it.

I'm happy to announce that you can now do this in just a single command!

There are still some small issues and bugs with running CiviCRM on Drupal 8 that need to be manually worked around, but getting over that first hurdle of simply installing it should be significantly easier with this new method.

Read the full article to find out how!

Background: Why is this hard?

The biggest problem is that both Drupal 8 and CiviCRM depend on some of the same PHP libraries, for example, Symfony.

In PHP, you can't define the same class twice (this would be a fatal error) and you certainly can't define the same class twice at two different versions (for example, one version for Drupal and one for CiviCRM).

So, you couldn't just copy CiviCRM with all its dependencies into an instance of Drupal, because some of those dependencies would conflict with what's already in Drupal.

And, unfortunately, you couldn't just make a special CiviCRM bundle that's "optimized" for a particular version of Drupal 8, because each Drupal 8 site is potentially unique: you can update the PHP libraries used by a Drupal 8 site (for example, upgrading Symfony to a newer version) or add new PHP libraries that could conflict.

The Magic of Composer

Composer is a tool that's used by PHP applications and libraries to find and download a compatible set of dependencies.

For example, CiviCRM needs Symfony 2 (version 2.8.44 or greater) or any version of Symfony 3. Drupal 8.6.9 needs Symfony 3.4 (version 3.4.14 or newer), although, soon it will be possible to use Drupal 8 with Symfony 4 as well.

Using composer, you can say, "I need Drupal 8.6.9 and CiviCRM 5.10.0" and it can pull in a version of Symfony that works for both.

If your Drupal site used Symfony 4 (because some Drupal module needed it), it would error out and say you need to either remove that module (so Symfony can be downgraded to Symfony 3) or remove CiviCRM.
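In practice, Composer can explain such conflicts for you. A quick diagnostic sketch (the package names and version constraint are illustrative — run these from your project root):

```shell
# Ask Composer which requirements prevent installing Symfony 4:
composer why-not symfony/symfony '^4.0'

# And which installed packages depend on a given Symfony component:
composer why symfony/dependency-injection
```

The output lists each package whose constraints block (or require) the version in question, which makes it much easier to decide what to remove or upgrade.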

That's why composer is so heavily involved in getting Drupal and CiviCRM to work together!

The peculiarities of CiviCRM

Of course, that's not the whole problem, because otherwise it would have been possible to solve this with a single composer command years ago.

CiviCRM has been around for over a decade (a lifetime in the PHP ecosystem) and still has some legacy peculiarities that need to be accounted for.

Namely, CiviCRM depends on going through a non-composer-y build process to generate a "release" that is actually usable.

Some of those things that need a build process could be reworked in a way so that they didn't need it, and others could be done in a composer-compatible way, such that CiviCRM would work like any composer library. Work is being done on that, but those are hard problems which will take time to solve.

TL;DR - What are the commands?

Heh, alright! Time for the actual actionable steps. :-)

First, you need to have the following installed:

Then, to create a new Drupal 8 site with CiviCRM:

# replace 'some-dir' with the directory to create
composer create-project roundearth/drupal-civicrm-project:8.x-dev some-dir --no-interaction

Or, to add to an existing Drupal 8 site (assuming you used the best practice method of starting from the drupal-project composer template):

composer require roundearth/civicrm-composer-plugin civicrm/civicrm-drupal-8

If you have a Drupal 8 site that isn't based on the drupal-project composer template, well, you're going to need to convert your site.

I'd recommend creating a new codebase (using the 'composer create-project' command above), adding all the modules/themes/libraries from the old codebase, and then switching the site to the new codebase. Assuming you didn't forget to copy anything over, that should work, but definitely keep a backup.
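A hedged sketch of that conversion; the directory names and module list below are placeholders for your own, and paths may differ if your old docroot isn't `web/`:

```shell
# 1. Build a fresh codebase from the template:
composer create-project roundearth/drupal-civicrm-project:8.x-dev new-site --no-interaction
cd new-site

# 2. Re-require the contrib modules and themes your old codebase used (placeholders):
composer require drupal/pathauto drupal/admin_toolbar

# 3. Copy settings and user-uploaded files across from the old site:
cp ../old-site/web/sites/default/settings.php web/sites/default/
cp -r ../old-site/web/sites/default/files web/sites/default/
```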

Installing CiviCRM

Once you have the code in place using the commands above, you'll actually need to install CiviCRM. Due to some bugs in CiviCRM for Drupal 8, there's a bunch of caveats and extra steps associated with that.

I've documented some of the steps on the project page for 'roundearth/drupal-civicrm-project':

https://gitlab.com/roundearth/drupal-civicrm-project#installing-civicrm

Hopefully, all these bugs will be fixed in time.

The good news is that the composer project template will always pull in the latest versions of CiviCRM and its Drupal 8 integration, so these steps should be able to remain the same, and will just start working better as those bugs get fixed. :-)

How does it work?

There are two parts to this: the roundearth/drupal-civicrm-project template, which assembles the full codebase, and the roundearth/civicrm-composer-plugin, which performs the special build steps CiviCRM needs. The plugin does the following:

  1. Runs Bower to pull in JavaScript and CSS dependencies
  2. Downloads a release of CiviCRM and copies in the missing files generated by the usual build process
  3. Downloads any requested CiviCRM extensions
  4. Copies any web assets from CiviCRM (which is installed in the vendor directory, which isn't web accessible) into the web root

In case you're curious, or want to help improve it, here's where all the most interesting code lives:

https://gitlab.com/roundearth/civicrm-composer-plugin/blob/master/src/Handler.php

Conclusion

One of the main challenges in improving CiviCRM on Drupal 8 is just getting it installed for developers who might want to help.

We've been involved in the effort to port the Webform CiviCRM module to Drupal 8, which has also faced this same challenge.

That's why a small part of the development of this new method was funded via the donations that were made to that effort. The rest was just part of our efforts in maintaining Roundearth, a Drupal 8 and CiviCRM platform for non-profits.

We're hoping this will help, not only with getting CiviCRM installed on Drupal 8, but also help to accelerate development. :-)

Feb 12 2019
Feb 12

The only way to get someone to contribute to an open source project is to ask them.

At the beginning of the New Year, I set up the Webform module's Open Collective. I knew that persuading organizations to back the Webform module, or any Drupal-related Open Collective, would require directly asking them to join. Additionally, I needed to listen to what an organization might want to get out of their financial contribution to an Open Collective.

It is reasonable for organizations to ask why they should back an Open Collective and what they should expect in return.

At DrupalCampNJ, I paid a visit to the event's sponsors that I was friendly with and asked them to join the Webform module's Open Collective. I also decided to broaden my pitch and ask people to consider backing any Drupal-related Open Collective. The Simplytest.me and Drupal Recording Initiative collectives provide invaluable services that our community needs, and everyone should help support them.

Everyone was familiar with the Webform module, and most people knew that I was maintaining the Drupal 8 version, but no one knew what an "Open Collective" is. Gradually, as I spoke to people, I managed to put together a concise explanation for the question, "What is Open Collective?"

Open Collective is a service which allows Open Source projects to transparently collect and distribute funds. Organizations who back an Open Collective will get a receipt for their financial contributions and be able to see exactly how the collected money is being spent.

The above explanation leads to the next relevant question: "How is this money going to be spent?" My response is this: spending the collected funds will be determined by what the backers want and what the Webform module needs.

As the maintainer of the Webform module, I feel we need a logo. A logo will help distinguish the Webform module from the massive sea of online form builders. For some projects, the Webform module is a significant part of a proposal and a logo would help make the software feel more professional. In the near future, I am going to post an RFP for a Webform logo. Still, what I want is not nearly as important as what organizations need.

Discussing, Explaining, Listening and Delivering

I want to hear what people want or need from the Webform module's Open Collective. Since most people did not know what an Open Collective is, it was hard to expect them to know what they might need from one. As I spoke with people at DrupalCampNJ, I came up with two anecdotes that explore some potential use cases for an Open Collective.

My first anecdote: just as I finished setting up the Webform module's Open Collective, someone emailed me asking for help with debugging a broken image file upload, and they offered to pay me for my time. These small private transactions are complicated to coordinate, so I suggested that they make a donation to the Webform module's Open Collective and then create a ticket in the Webform issue queue. I managed to resolve their problem quickly, and everyone felt that this was a successful transaction.

Another related anecdote: While brainstorming about getting people to help pay for open source development, a fellow software developer daydreamed that his company would happily pay to see a bug fixed or merely a patch committed. One of the most common frustrations in open source issue queues is seeing a patch sit there for months and sometimes years.

The above stories highlight the fact that besides using Open Collective to back an open source project, a collective could be used to execute transparent transactions using a common platform.

Would enterprise companies and organizations be more comfortable paying for open source work through an Open Collective instead of a private transaction with an individual developer?

Persuading

At DrupalCampNJ, no one rejected the concept of backing a Drupal-related Open Collective. I collected several email addresses, which I am going to use to continue persuading and listening to potential backers. My next step is to write a short, persuasive email that will hopefully inspire organizations to back a Drupal-related Open Collective.

Continuing

In the spirit of full transparency, I will publish this email in my next blog post. For now, any feedback or thoughts are always appreciated. Hey, maybe my two anecdotes might have persuaded you to back the Webform module's Open Collective.


Feb 12 2019
Feb 12

Amazee Labs is proud to sponsor Drupal Mountain Camp in Davos, Switzerland 7-10 March 2019.

Come by and see us in the exhibit area or at one of the social events, and be sure to check out these Amazee sessions: 

On Friday, from 14:40 till 15:00, join Maria Comas for GraphQL 101: What, Why, How. This session is aimed at anyone who might have heard or read about “GraphQL” and is curious to know more. It will give a basic overview and try to answer questions like:

  • What is GraphQL?

  • Is GraphQL only for decoupled projects?

  • What are the advantages of using GraphQL with Drupal?

  • How do you get started with GraphQL?

Follow this up on Friday from 15:00 till 16:00 with Daniel Lemon, who will present Mob Programming: An Interactive Session. The basic concept of mob programming is simple: the entire team works together on one task at a time. That is: one team, one (active) keyboard, one screen (a projector, of course). It’s just like doing full-team pair programming. In this session you’ll learn:

  • What are the benefits to a team?

  • How could mob programming be integrated into your current workflow?

  • What are the disadvantages of mob programming, and why might it not work for certain types of companies (such as a web agency)?

Additionally, don’t forget to check out the talk from Michael Schmid of amazee.io, Best Practices: How We Run Decoupled Websites with 110 Million Hits per Month. This session will lift the curtain on the biggest decoupled websites run by amazee.io and will cover:

  • How the project is set up in terms of Infrastructure, Code, Platform and People

  • How it is hosted on AWS with Kubernetes, and what we specifically learned from hosting Decoupled within Docker & Kubernetes

  • Other things we learned running such a big website

Hope to see you in Davos soon! 

Feb 11 2019
xjm
Feb 11

Last fall, we adjusted our minor release date for Drupal 8.7.0 from March 6 to May 1. This was done as part of moving Drupal's minor release schedule toward a consistent schedule that will have minor releases in the first weeks of June and December each year. (See Plan for Drupal 9 for more information on why we are making this change.)

However, the change to the 8.7.0 release date means that DrupalCon Seattle now falls in the middle of important preparation phases for the minor release. In order to ensure community members have adequate time to prepare and test the release without interfering with DrupalCon Seattle events, we've moved the alpha and beta phases for the release one week earlier:

  • 8.7.0-alpha1 will now be released the week of March 11. The alpha phase will last two weeks until the release window for beta1.
  • 8.7.0-beta1 will now be released the week of March 25. The beta phase will now last three weeks (including the week of DrupalCon) instead of two. The beta phase will still end when the release candidate window begins.
  • The release candidate (RC) and release dates are unchanged. The RC window still begins April 15 and the scheduled release date is still May 1.

(Read more about alpha, beta, and RC phases.)

Feb 11 2019
Feb 11
In this fourth installment of our series on conversational usability, we're turning our attention to conversational content strategy, an underserved area of conversational interface design that is rapidly growing due to the number of enterprises eager to convert the text trapped in their websites into content that can be consumed through voice assistants and chatbots.
Feb 11 2019
Feb 11

We compare REST, JSON:API and GraphQL — three different web services implementations — based on request efficiency, operational simplicity, API discoverability, and more.

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are, of course, lots of intricacies and "it depends" caveats when comparing these three approaches. When we discuss REST, we mean the "typical REST API", as opposed to one that is extremely well designed or follows a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice; formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the comparison below, and we discuss each of the four categories in more depth afterwards. Overall, we rank JSON:API above GraphQL, and GraphQL above REST, for Drupal core's needs.

Request efficiency
  • REST: Poor; multiple requests are needed to satisfy common needs, and responses are bloated.
  • JSON:API: Excellent; a single request is usually sufficient for most needs, and responses can be tailored to return only what is required.
  • GraphQL: Excellent; a single request is usually sufficient for most needs, and responses include exactly what was requested.

Documentation, API explorability and schema
  • REST: Poor; no schema, not explorable.
  • JSON:API: Acceptable; generic schema only; links and error messages are self-documenting.
  • GraphQL: Excellent; precise schema; excellent tooling for exploration and documentation.

Operational simplicity
  • REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
  • JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
  • GraphQL: Poor; extra infrastructure is often necessary, client-side libraries are a practical necessity, and specific patterns are required to benefit from CDNs and browser caches.

Writing data
  • REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
  • JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification.
  • GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it's possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send the author requests until you've retrieved and parsed the article responses (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.
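To illustrate with Drupal's JSON:API implementation (the domain and UUID below are made up; on a Drupal site the article's author reference field is typically `uid`):

```shell
# Typical REST: four dependent round trips
curl https://example.com/article/42
curl https://example.com/article/72
curl https://example.com/author/3
curl https://example.com/author/7

# JSON:API: one request that returns the article and embeds its author,
# using the `include` query parameter defined by the specification
curl 'https://example.com/jsonapi/node/article/6f33f1f4-0000-4000-8000-000000000000?include=uid'
```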

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's explicit field selection; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, the sparse fieldsets can be omitted so that the request remains cacheable.
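For example (with illustrative Drupal field names), sparse fieldsets are expressed with the `fields` query parameter, keyed by resource type:

```shell
# Fetch articles with only their title, creation date and author reference,
# and for the embedded author return only the name field:
curl 'https://example.com/jsonapi/node/article?include=uid&fields[node--article]=title,created,uid&fields[user--user]=name'
```

Note that the `uid` relationship is listed in the article's fieldset so that the `include` can still resolve the embedded author.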

Multiple data objects in a single response
  • REST: Usually, but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
  • JSON:API: Yes.
  • GraphQL: Yes.

Embed related data (e.g. the author of each article)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Yes.

Only needed fields of a data object
  • REST: No.
  • JSON:API: Yes; servers may choose sensible defaults, but developers must be diligent to prevent over-fetching.
  • GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

Auto-generated documentation
  • REST: Depends; yes if using the OpenAPI standard.
  • JSON:API: Depends; yes if using the OpenAPI standard (formerly Swagger).
  • GraphQL: Yes; various tools available.

Interactivity
  • REST: Poor; navigable links rarely available.
  • JSON:API: Acceptable; observing the available fields and links in its responses enables exploration of the API.
  • GraphQL: Excellent; autocomplete, instant results or compilation errors, and complete, contextual documentation.

Validatable and programmable schema
  • REST: Depends; yes if using the OpenAPI standard.
  • JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
  • GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout, when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can navigate from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above), a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table below should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as you can with REST or JSON:API, unless you use persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID, and permit the client to get the result of the query using a GET request with only that ID. Persisted queries add operational complexity, and they also mean the architecture is no longer fully decoupled: if a client wants to retrieve different data, server-side changes are required.
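The difference can be sketched as follows; the endpoint and query ID are hypothetical, and the exact persisted-query protocol varies between GraphQL servers:

```shell
# An ordinary GraphQL query sent via GET; real-world queries quickly exceed URL limits:
curl 'https://example.com/graphql?query={article(id:42){title,author{name}}}'

# A persisted query: the query text lives on the server and the client sends only its ID,
# so the request is small, stable, and cacheable by CDNs and browser caches:
curl 'https://example.com/graphql?queryId=a1b2c3&variables={"id":42}'
```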

Scalability: additional infrastructure requirements
  • REST: Excellent; same as a regular website (Varnish, CDN, etc.).
  • JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
  • GraphQL: Usually poor; only the simplest queries can use GET requests, and to reap the full benefit of GraphQL, servers need their own tooling.

Tooling ecosystem
  • REST: Acceptable; lots of developer tools are available, but for the best experience they need to be customized for the implementation.
  • JSON:API: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.
  • GraphQL: Excellent; lots of developer tools are available, and they don't need to be implementation-specific.

Typical points of failure
  • REST: Fewer; server, client.
  • JSON:API: Fewer; server, client.
  • GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of using the GET HTTP request type you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better generic tooling and less time spent on server-side details.

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests: one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.
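
As a sketch of those two requests, the bodies might look as follows; the type names, field name and UUIDs are hypothetical, not taken from a real site:

```javascript
// Request 1: POST /jsonapi/taxonomy_term/tags — create the new tag.
function buildCreateTagBody(name) {
  return {
    data: {
      type: 'taxonomy_term--tags',
      attributes: { name },
    },
  };
}

// Request 2: PATCH /jsonapi/node/article/{uuid} — attach the tag created in
// request 1 (whose UUID came back in the first response) to the article.
function buildAddTagBody(articleUuid, tagUuid) {
  return {
    data: {
      type: 'node--article',
      id: articleUuid,
      relationships: {
        field_tags: {
          data: [{ type: 'taxonomy_term--tags', id: tagUuid }],
        },
      },
    },
  };
}
```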

Writing data
  • REST: Acceptable; every implementation is different. No bulk support.
  • JSON:API: Excellent; JSON:API prescribes a complete solution for handling writes. Bulk operations are coming soon.
  • GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement. There are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.
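
As a sketch, a consumer could assemble such a filtered, sorted collection query entirely on its own; the path and field names below are hypothetical:

```javascript
// Build a hypothetical JSON:API collection URL: published articles,
// newest first, ten per page — no server-side configuration involved.
function buildCollectionUrl(base) {
  const params = new URLSearchParams({
    'filter[status]': '1',   // only published content
    'sort': '-created',      // newest first
    'page[limit]': '10',     // ten results per page
  });
  return `${base}/jsonapi/node/article?${params}`;
}
```

Changing the listing — say, sorting by title instead — is then a one-line change in the consumer, not a configuration change on the server.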

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means that they can be truly decoupled: changes to the front end don't always require a back-end configuration change.

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

Ease of installation and configuration
  • REST: Poor; requires the contributed REST UI module; easy to break clients by changing configuration.
  • JSON:API: Excellent; zero configuration!
  • GraphQL: Poor; more complex to use; may require additional permissions, configuration or custom code.

Automatically generated documentation
  • REST: Acceptable; requires the contributed OpenAPI module.
  • JSON:API: Acceptable; requires the contributed OpenAPI module.
  • GraphQL: Excellent; GraphQL Voyager included.

Security: content-level access control (entity and field access)
  • REST: Excellent; content-level access control respected.
  • JSON:API: Excellent; content-level access control respected, even in queries.
  • GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.

Decoupled filtering (client can craft queries without server-side intervention)
  • REST: No.
  • JSON:API: Yes.
  • GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, for Drupal core's needs, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. That said, Drupal's GraphQL implementation is fantastic, especially when you have the developer capacity to build a bespoke API for your project.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

February 11, 2019

11 min read time

Feb 11 2019
Feb 11

We’re off to a great start of the new year! In January, we wrote some really interesting blog posts; in case you missed any of them, we’ve prepared this overview where you can find all of our posts from last month, neatly compiled in one place. Enjoy!

2018 in review

Our first post in 2019, published just a few days into the new year, was a review of our achievements in the previous year. Not only did 2018 mark the 5-year anniversary of Agiledrop, it will also remain in our memories as the year when we upped our game, scaled our team very successfully and optimized our strategy for the future. 

Of course, we still found the time to give back to the Drupal community, whether it be through open-source contributions or any of our educational events, such as our free Drupal courses. 

Read more

Interview with Shawn McCabe, CTO of Acro Media

We couldn’t properly start the year without continuing our Community Interviews series. Mere days after our yearly review, we published an interview with Shawn McCabe, CTO of the Canadian company Acro Media.

Shawn’s love for open source was something that was immediately obvious from our talk and it was extremely interesting to get to know his story about discovering and working with Drupal. Our favorite part is almost definitely how he first met Dries - but you’ll just have to check out the entire post if you’re curious about that! 

Read more

Best Drupal 8 Security Modules

To also cater to the more tech-oriented audience, and to highlight one of the foremost advantages of Drupal (yes, of course it’s security!), we wrote a post about the 5 Drupal security modules that we’ve so far found to be the most useful. 

Even though Drupal is known for being a very secure CMS out-of-the-box, it still never hurts to take some additional security measures. Better safe than sorry, they say, especially with so many cyber security threats reported recently!

Read more

Interview with Gabriele Maira of Manifesto Digital

Next up came another Community Interview - this time we talked with Manifesto Digital’s Gambry, an avid proponent of contribution sprints (definitely not just because he’s responsible for running local Contribution Sprints in London!). He thinks every Drupal developer should attend a sprint at least once in their life, and provides really on-point reasons for this.

There’s one sentence from the interview that’s really remained with us and fills us with warmth every time we read it: “And instead of being a mortal between gods, I found friends. I found the wonderful Drupal Community.” Ahh … Isn’t it great? Can you feel the warmth? We know we sure do.

Read more

The Story of Agiledrop: Cultivating Strong Relationships with Clients

Our final blog post from January was the 3rd chapter in our latest series of posts, The Story of Agiledrop. In this extensive post, we talked about the steps we take to ensure that the relationships with our clients are always as healthy and strong as possible.

Admittedly, due to our unique workflow, this has proved to be quite challenging. But, because we’ve understood the importance of this from the get-go and have hence made it one of our top priorities, we’re proud to say that our approach is very effective. The result is two-fold: happy clients and a motivated team.

Read more

That’s it for our posts from January - but, don’t worry, we’ll be back very soon with new content, and, if you happen to miss any of our upcoming blog posts, we’ll be doing the overview again in March. So, keep warm and stay tuned! 

Feb 11 2019
Feb 11
Wisdom doesn’t automatically come with old age. Nothing does - except wrinkles. It’s true that some wines improve by age but only if the grapes were good in the first place. 
-Abigail Van Buren

Reflecting on life experiences adds generously to one’s store of wisdom because, let’s face it, being wise and savvy can come from anyone and anywhere. So yes, the famous quote “Age is just a number” does justice to the whole scenario of erudition.

In the same way, the common misconception that “bigger is better” is being disproved by small agencies handling bigger projects. Gone are the days when large enterprises ruled the market kingdom, bagging all the big projects. Today, small agencies are winning big-name accounts and cool projects far more often, and the trend is forecast to continue.

[Image: Two fish bowls, one small and one big, with a fish jumping from the small bowl into the bigger one]


For a Drupal agency with big aspirations, deciding which projects to pursue can be a bit of a task, but earning the trust of the CxOs of big organizations is an even bigger one.

To solve this issue of winning and handling big projects, here are some ways to help you seize them.

First things First - How to meet big clients?

Just because you are a small agency or organization does not mean your clients have to be small. Landing a large organization not only boosts a small business's revenue but also increases efficiency among your team members and across the organization.

  • Use client references to introduce your process

Big companies may seem like a grand entity, but you should not forget that they are made up of hundreds or thousands of individuals, some of whom have the power to make decisions.

So it is really important for your research to be top-notch and accurate, telling you exactly who to contact within the company you've targeted. References may help with this, and some companies also publish details of at least one senior employee on their websites.

But you need to be really creative to figure out exactly who the right person is. Look through the company’s publications or newspaper mentions to see whose name comes up. You can also tag along with people who can introduce you to the big tech giants.

  • Indulge in cold calling

Telemarketing and cold calling continue to be essential disciplines for the sales role. In many sales organizations, old-school "door knocking" might not be that productive, and when it comes to big organizations, especially those with large territory assignments, cold calling becomes the hero. Prospecting via phone continues to be a great complement to your overall lead generation efforts.

  • Be an expert and then try to be a solution to their needs. 

If you want the big giants to trust you with their projects, then a sense of “what the work means to you” must be established, with a clear vision for the future. In fact, according to the Employee Job Satisfaction and Engagement survey, nearly 77% of employees said that having a clear understanding of their organization’s vision and mission was important to their job satisfaction and engagement.

Start with your team 

Now that you have big names on your list, start by developing a strong team and skill set, beginning with:

  • A team of Generalists 

Generalists are people who have a particular skill but are flexible enough to mold themselves to any situation and are ready to learn new skills. In the case of Drupal websites, a generalist should be able to handle both the backend and the frontend.

In other words, having generalists on board is beneficial for your organization: one person can effectively handle many tasks.

[Image: Five people, with the person in the middle magnified by a magnifying glass held by a hand]

  • Services are important 

Focus on the set of services and assistance you will be providing to the client; your team will become specialists with time and experience. And treat a big enterprise like royalty.

A giant enterprise is a customer that always expects great service and will not put up with waiting or with poor responses from your representatives.

Be honest about your projects and their goals. If your customers find that you are dishonest about your services, they will lose faith in you and may even spread negative feedback about your business.

  • Categorizing your projects

To ensure that the complexity of the project is achieved, categorize the project into the following:

Small projects: These can easily be tracked just by getting updates. A project is classified as small when the relationships between tasks are basic and detailed planning or organization is not required.

Charter required projects: These are projects that require some level of approval other than the first line manager, but do not include significant financial investment. A summary of major deliverables is usually enough for management approval.

Large projects: The project network is broad and complicated. There are many task interdependencies. With these projects, simplification where possible is everything. 

  • Planning 

Planning a project helps in achieving objectives and deadlines on time. It pushes the team members to keep working hard until the goals are conquered. Planning also helps in setting the right directions for the organization.

Increases efficiency: Planning helps in the maximum utilization of all the available resources. It helps reduce the wastage of precious resources and avoids their duplication. It also aims to deliver the greatest returns at the lowest possible cost.

Reduces risks: Large projects carry many associated risks. Planning serves to forecast these risks and to take the necessary precautions to avoid them.
 
Facilitates coordination: The plans of all departments of an organization should be well coordinated with each other. Similarly, the short-term, medium-term and long-term plans of an organization should be coordinated with each other. 
 
Aids in Organizing: Organizing intends to bring together all possible resources, and organizing is not possible without planning, since planning tells us how many resources are needed and when they are needed. In this way, planning aids effective organizing.
 
Keeps good control: An employee's actual performance is compared with the plans, and deviations (if any) are found and corrected. It is impossible to achieve such control without the right planning, so planning is necessary to keep good control.

[Image: The word "plan" in red, with two robotic hands]

 

  • The scope of the Project 

Perhaps the most difficult part of managing a large project with a small team is distinguishing between a task and an actual project. For small project teams to be successful with large projects, the manager should always know the status of the project and the scope at which it is being delivered.

  • Excellent Relationship with the vendor

The most important part of managing big projects with small teams is to establish a meaningful relationship across the organization.

A solid relationship may be the difference between a project that becomes actualized and one that remains conceptual. If your business doesn't concentrate on a product or service that is important to your clientele, you need a vendor that does.

Next comes the Methodologies 

Large organizations usually follow classical methodologies, which involve a lot of unnecessary documentation. For small agencies, however, a few methodologies help greatly in handling large projects:

  • Agile

Agile was developed for projects that require both speed and flexibility. The method is split into sprints: short cycles for producing certain features.

Agile is highly interactive, allowing for fast adjustments throughout a project. It is mostly applied in software development projects, in large part because it makes it simpler to identify issues quickly.

Agile is essential because it allows making changes early in the development process, rather than having to wait until testing is complete.

[Image: A circular infographic with six parts: test, evaluate, meet, plan, design and develop]

  • Scrum

Scrum is a variation of the agile framework, iterative in nature, which relies on scrum sessions for evaluating priorities. "Scrum assemblies" or "30-day sprints" are used to work through prioritized tasks.

Small teams may be gathered to concentrate on particular tasks independently and then meet with the scrum master to assess progress or results and reprioritize backlogged tasks.

[Image: A circular infographic with five parts: commitment, focus, openness, respect and courage]

 

  • Waterfall 

This is a basic, sequential methodology from which Agile and similar concepts evolved. Waterfall has been a staple project management methodology for years and is still used by many project managers, especially in software development. It consists of static phases (analysis, design, testing, implementation, and maintenance) that are executed in a specific order.

  • Critical Path Method 

CPM is an orderly, systematic method that breaks down project development into specific but related activities.

This methodology can be used to map out a project's activities, assess risks and allocate resources accordingly. It encourages teams to identify milestones, task dependencies, and deadlines efficiently.

A critical path refers to the sequence of critical tasks (dependent or floating) in a project: the longest chain of tasks that must be completed on time for the project to meet its deadline.
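
The heart of CPM — finding the longest chain of dependent tasks — can be sketched in a few lines; the task names and durations here are invented for illustration:

```javascript
// Length of the critical path through a dependency graph.
// tasks maps a name to { duration, deps: [names it depends on] }.
// Assumes the graph has no cycles.
function criticalPathLength(tasks) {
  const memo = {};
  function finish(name) {
    if (memo[name] !== undefined) return memo[name];
    const { duration, deps } = tasks[name];
    const start = deps.length ? Math.max(...deps.map(finish)) : 0;
    return (memo[name] = start + duration);
  }
  return Math.max(...Object.keys(tasks).map(finish));
}

// Example: build and docs wait for design; test waits for build.
const plan = {
  design: { duration: 3, deps: [] },
  build:  { duration: 5, deps: ['design'] },
  test:   { duration: 2, deps: ['build'] },
  docs:   { duration: 1, deps: ['design'] },
};
// Here the critical path is design → build → test, 10 units long: those
// three tasks have no slack, so delaying any of them delays the project.
```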

Culture is Fundamental to Succeed

How do you explain to your client that the team won’t work this week for DrupalCon, DrupalCamp or any other events happening around?

You can only explain it by being clear about your thoughts and ideas. The community plays a vital role in everything.

Explain to your team members that it is beneficial for them to improve Drupal as a platform, and introduce them to the community culture. Help your team members create profiles on drupal.org and make sure they get credit for their work on patches and modules.

Closing the project 

Yes, project closing might seem like an insignificant task in your project management journey, but it is in fact a critical part of delivering a successful project. To help you get this step right, here are 4 things you need to know about how to close a project effectively.

Trace Project Deliverables: Effective closure means that you have completed all the deliverables to the satisfaction of the project’s sponsor.

Reward Team Members: As your project comes to a close, always make sure to acknowledge, recognize and appreciate the contribution of your team members

Closeout Reports: A detailed close-out report should contain details about the process used during the project, the mistakes, the lessons learned, and how successful the project was in achieving the initial goals

Finance: Big clients are usually slow to pay, so try to adopt an agile budget for large projects.

Turning From Technical provider to strategic solution partner 

As with any investment portfolio, an organization’s investment in Run, Optimise and Innovate initiatives must be balanced and aligned with the organization’s risk tolerance and the role expected of IT. In a more conservative organization, you can expect to see a higher ratio of Run to Optimise and Innovate spending. More progressive organizations will have more Optimise spending, and “leading edge” organizations will have more Innovate spending.

Conclusion 

Yes, Goliath the Gittite is and always will be the best-known giant in the Bible, described as 'a champion out of the camp of the Philistines, whose height was six cubits and a span'.

Befriending Goliath not only gave a sense of power to anyone with him but also granted them security.

Landing large enterprises with big projects is the very first step to success; the right steps and upkeep will carry you to that success soon enough.

OpenSense Labs' development methodologies focus specifically on approaches involving Drupal development, enhancing efficiency and improving project delivery.

Contact us at [email protected] to accomplish those large projects you have always desired.

Feb 10 2019
Feb 10

A night watchman earns a reputation as an alert guard if he correctly senses that something does not look right. Imagine a man dressed like a stockbroker, carrying a soft leather briefcase and a valise. His walk is tentative and squirrelish, which is not how a stockbroker walks. When the watchman asks him for ID, he scurries off towards the entry gate of a building, dropping his valise on the floor before finally being captured by the watchman, who later finds out that he was trying to rob someone in the building.

[Image: A security guard posing for the camera in broad daylight]


Much like the night watchman, whose good judgement keeps the building safe from such robbers, there is a solution in the digital landscape that helps in writing browser tests easily and safely. NightwatchJS is an exceptional option for running browser tests, and its inclusion in Drupal core has only made things easier for Drupalists.

Understanding NightwatchJS

[Image: NightwatchJS logo with an owl icon]


NightwatchJS is an automated testing framework for web applications and websites. It is written on NodeJS and uses the W3C WebDriver API (formerly known as Selenium WebDriver) for performing commands and assertions on DOM elements.

Nightwatch.js is an integrated, easy to use End-to-End testing solution for browser-based apps and websites, written on Node.js. - Nightwatchjs.org

As a complete browser (end-to-end) testing solution, NightwatchJS aims to streamline the process of setting up continuous integration and writing automated tests. It can also be utilised for writing NodeJS unit tests. It has a clean syntax that helps in writing tests rapidly with NodeJS and CSS or XPath selectors. Its out-of-the-box command line test runner can run tests sequentially or in parallel, by group, by tag or one at a time. It also supports the Mocha runner out of the box.

NightwatchJS has its own cloud testing platform called NightCloud.io, in addition to support for other cloud testing providers like SauceLabs and BrowserStack. It manages Selenium and WebDriver services automatically in a separate child process and has great support for working with the Page Object Model. Moreover, with its out-of-the-box JUnit XML reporting, it is possible to incorporate your tests into the build process with systems like Jenkins. 

NightwatchJS in Drupal

The JavaScript Modernisation Initiative paved the way for the addition of NightwatchJS to Drupal core (in version 8.6) as the new standard framework for unit and functional testing of JavaScript. It makes sure that alterations made to the system do not break expected functionality, and it also lets you write tests for your contributed modules and themes. It can be included in the build process to ensure that regressions hardly ever creep into production.

You can try NightwatchJS in Drupal 8.6 by following the instructions on GitHub. They show how to test core functionality, and also how to test your existing sites, modules and themes by writing your own custom commands, assertions and tests. It is worth checking out the Nightwatch API documentation and the NightwatchJS developer guide for creating custom commands and assertions.

NightwatchJS tests will be run by Drupal CI and will be viewable in the test log for core developers and module authors. For your own projects, tests can easily be run in, for instance, CircleCI, which can give you access to artefacts like screenshots and console logs.

Conclusion

While Drupal 8 has extensive back-end test coverage, NightwatchJS offers a more modern platform that will make Drupal more familiar to PHP and JavaScript developers. 

Offering amazing digital experience has been our biggest objective and we have been doing that with a suite of services.

Contact us at [email protected] and let us know how can we help you achieve your digital transformation dreams.

Feb 10 2019
Feb 10

Virtual private servers are fantastic for running your own cloud applications and give you authority over your private data. That private data can potentially be leaked when you communicate via services like text messaging. One way to ensure greater privacy is to host your own messaging system, and this is where Rocket.Chat comes into the picture.

[Image: Person typing on a laptop computer]


Rocket.Chat provides an actual open source implementation of an HTTP chat solution, offering convenience and greater freedom at the same time. It can be a marvellous solution for remote communications, especially for open source communities.

Rocket Chat: A close look


The official site of Rocket.Chat states that it’s an open source team communication software which offers an alternative to remote communications by replacing the likes of email, HipChat and Slack. It aids in improving productivity via efficacious team communication and collaboration. It helps in sharing files with the team, chatting in real time, or even leveraging audio or video conference calls with screen sharing. Being an open source solution, it gives you the option of customising, extending or adding new functionality to meet your expectations.


[Image: Rocket.Chat logo with a rocket icon]


With Rocket.Chat, you can do away with cc/bcc and use Rocket.Chat channels and private groups instead, bringing more transparency to team communication. By using @username, you can include relevant participants and apprise them swiftly, and @all can be used when an important matter concerns all the members of a group. Participants can join or leave at any time, with access to the full chat history. Moreover, Rocket.Chat offers a secure workspace with restrictions on usernames and greater transparency for admins. You can join leading blockchain propellants like Hyperledger, Brave and Aragon, among others, in migrating from Slack and Atlassian to Rocket.Chat.

Essential features

Following are some of the major features of Rocket.Chat:

  • Unlimited: Rocket.Chat has the provision for unlimited users, channels, searches, messages, guests and file uploads.
  • Authentication mechanism: It offers different authentication mechanisms like LDAP Group Sync, 2-factor authentication, end-to-end encryption, single sign-on and dozens of OAuth providers.
  • Real-time Chat: Its Live Chat feature allows you to add real-time chat widgets to any site or mobile applications. This brings more efficacy in team communication and also ensures top-notch customer service.
  • Message translation: It utilises machine learning for automatically translating messages in real-time.
  • Use across platforms: It can be used on all platforms via its web, desktop and mobile applications, LiveChat clients, and SDK (Software Development Kit).
  • Marvellous customisation: You can alter it to meet your requirements. Incoming and outgoing WebHook integrations can be added to it. The personalisation of user interface can be performed by overriding any of the built-in styles. You get to extract the benefits of its REST API, LiveChat API or Real-time API.
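
As a small sketch of that REST API, posting a message boils down to a single authenticated POST to the chat.postMessage endpoint; the host, token and user ID below are placeholders (real values come from a prior login call):

```javascript
// Assemble the pieces of a Rocket.Chat chat.postMessage REST call.
// host, authToken and userId are placeholders obtained at login time.
function buildPostMessageRequest(host, authToken, userId, channel, text) {
  return {
    url: `${host}/api/v1/chat.postMessage`,
    method: 'POST',
    headers: {
      'X-Auth-Token': authToken,     // session token from login
      'X-User-Id': userId,           // user id from login
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ channel, text }),
  };
}
```

The resulting object can be passed to any HTTP client (fetch, axios, and so on) to send the message.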

Rocket.Chat in Drupal

The Rocket.Chat module, available for both Drupal 7 and Drupal 8, helps a Drupal site integrate Rocket.Chat. It consists of a base module that holds the configuration and a LiveChat submodule that provides a block for enabling the LiveChat widget on a page.
 
The maintainers of this module recommend running Drupal and Rocket.Chat behind a TLS (Transport Layer Security) proxy or a web server with TLS capabilities, and taking care of HTTPS and HTTP crossovers. The LiveChat feature also requires LiveChat to be enabled on the Rocket.Chat instance.

Conclusion

One of the quintessential aspects of open source communities is remote communication. Rocket.Chat offers a great alternative to the likes of Slack and gives you the module to enable it in Drupal.
 
We have a strong inclination towards digital innovation and have been delivering it with our expertise in Drupal development.

Contact us at [email protected] to understand more on Rocket.Chat and transform your team communication and collaboration.

Feb 09 2019
Feb 09

Ever attended an art gallery and witnessed how modern artists use canvas to speak their thoughts?

By looking at the art and paying attention to its creation, a lot of “Ohhs” and “Ahhs” and expressions of awe and wonder tell us whether it has sufficiently aroused engagement or not.

But at the core of this practice, the owner's mind always dwells on the endless number of ways to get visitors to attend the gallery.

More visitors means more conversions, which results in better progression.

Customer experience and engagement: two things every art gallery owner strives for.

Right?

Image of 7 people in an art gallery looking at the painting in front of them


Today most marketing teams are structured to drive traffic towards websites in the hope of converting that traffic into leads and, ultimately, profit.

Yes, it might be an oversimplification of the trend, but that’s the standard marketing playbook. That’s where Conversion Rate Optimization (CRO) comes in. 

But what exactly is CRO?

Let's discover. 

Everything about CRO

In internet marketing, conversion optimization, or conversion rate optimization, is a system for increasing the percentage of visitors to a website who convert into customers or, more generally, take any desired action on a webpage. The whole process involves understanding how users move through a particular website, what actions they take, and what's stopping them from completing their goals.
 

Image of ROI, targeting, usability, A/B testing, multivariate, credibility and landing pages, with conversion rate optimization in the middle


Importance of CRO 

  • Pay per click

The general idea of pay-per-click advertising was to target the audience quickly by selecting what they could see. Nowadays, pay-per-click (Google AdWords) prices have risen to an extent where they directly affect the return on conversions. Also, with the increase in digital devices and people's growing reliance on technology, more businesses have gone digital.

  • Enhancing online competition 

Now that more people are becoming tech-savvy, competition among retailers has increased a lot. Some large retailers are simply eating away at smaller ones. That means if you want to convert your visitors into customers, you need a website that is easy to use and easy to customize.

You can take the help of Drupal as your website platform. It is one such CMS that allows you to set up your website easily and customize it as desired. 

Conversion optimization has the benefit of allowing you to stay ahead of the curve in terms of competition by providing you with insights into what's going on with competitors' websites.

  • Combating the rising cost of digital marketing

Let’s face it: pay-per-click isn’t the only cost that is rising. Costs are climbing across virtually every traffic source digital marketing has ever known.

The whole point of marketing is to direct the users towards your website.  But how would you make sure that most of them actually make a purchase?

This is where CRO comes to the rescue.

By increasing the number of page visitors who make purchases, CRO improves conversion rates and offsets the cost of digital marketing.

  • Streamlining the business

A website that is continuously being optimized looks more legitimate than one that is not.

Why?

Well, maybe due to the fact that the ones which are not being optimized do not provide a clear path to the landing pages. Clear landing pages for an online retailer mean having an inventory that can be easily searched and a clear view of the categories.

  • Saving a large amount of money 

So how can spending a large amount of money on your website result in saving money? 

You’ll find that spending less money on each customer actually produces more money. Maybe you are not necessarily saving a lot, but you are definitely making a lot more, which eventually balances out.

  • Improving the efficiency and layouts 

If you are working with affiliate organizations or marketers, you will find that many online retailers consider CRO a good way to get news out about their products through a platform that already has an engaged audience. CRO makes your website more valuable to your affiliates and to any other marketing platform.

When a higher number of users who click through to your webpage actually make a purchase, your affiliates, pay-per-click advertisers, social media marketing campaigns, etc., make more because you are making more.

Common misconceptions related to CRO 

There are some businesses that see CRO as an unnecessary expense, one that doesn’t really move their business ahead.

Others see it as a silver bullet for their marketing woes, a key to high traffic and more leads.

Whichever camp you are in, either stance is better than holding misconceptions that misguide you and waste resources and time. Some of those misconceptions are:

  • CRO is a single skill 

One of the biggest misconceptions among businesses is the entrenched belief that CRO is a single skill. In reality, CRO is a broad practice that encompasses a wide range of skills. To be effective at conversion rate optimization, you require three skill sets:

Copywriting: Whether you can write a persuasive copy or not would have a great impact on conversion rates.

Design: Everything from the UI/UX to the choice of graphics has a strong influence on conversion rates.

Analytics: It is important to have someone with special and necessary skills to analyze your result. 

  • It is all about best practices

Stumbling upon blog posts and articles that tell you the best practices for boosting your conversion rate has become standard now.

And going ahead and implementing those practices exactly as written is conventional. But do these tricks really work?

The truth is that there are no one-size-fits-all best practices that can lead you down the path to better conversions. Your focus should always be on removing the barriers that hinder the flow of conversions.

  • Making small changes that lead to big and better rewards
     
Image of two posters, each with a girl inside a square. The first has short content with a 102.5% green up arrow and the other has long text with a 22.7% red down arrow

 

The pictures above clearly describe how changing the content length affected the conversions and resulted in 90% more clicks.

Going by this case study, you might be tempted to look for the silver bullet where making a minor change reaps great rewards.

In truth, case studies like these are entirely misleading and provide you with only partial information. They don’t tell you:

How long were the tests run?

Did the traffic remain constant throughout the testing period?

What other changes were made on the website?

  • It is all about split testing 

Most people think that CRO is all about split testing site elements. 

The truth is that CRO is all about measuring your customers' actions and removing the conversion barriers.

To do this, start with basic user needs and an understanding of the psychology of your customers. This model will help you focus on the things that should be worked on:

A pyramid with five sections where the text is written as functional, accessible, usable, intuitive and persuasive

 

  • Focusing on CRO alone builds a successful business 

Because of the immense love showered on CRO in digital marketing, companies tend to believe it alone means they are winning the game of online business.

True, CRO might increase your conversion rate from 1% to 2%, which does have a great impact on sales, but to reach real heights of success you also need to take a closer look at traffic, brand, and customers.

So What is the Structured Process in CRO?

  • Defining your conversion actions 

The conversion actions can be defined based on the business goals and then implemented in web analytics. Promoting and producing content is one of the actions that should be implemented as soon as possible. The content technique requires you to engage in practices like:

  1. Targeting email marketing
  2. Marketing automation 
  3. Producing demo
  4. Live events
  5. Case studies

The content at this stage revolves around customer-relationship management through segmentation. When you segment your audience based on age, gender, geographical position, professional role, etc., you are better equipped to offer them targeted content that interests them.

  • Understand the prospects

A better understanding of the prospects helps you convert offers better. This is the stage where you look at indirect customer acquisition and brand awareness. Begin by mapping out the current situation and forming a clearer idea of your target audience, objectives, market, KPIs and current results. Knowing in advance where you stand and what you want to achieve provides you with more context.

  • Research and Analytics

Once you have insights into your current situation and objectives, it is time to analyze them. In the analysis phase, you employ web analytics and other sources of information to form a hypothesis that can then be tested. This could include heatmaps, user tests, insights, etc.

This ensures that your insights are based on the actual behavior of the users and not on guesswork.

  • Implementing a challenger and Testing the hypothesis 

To implement a good challenger, you need to choose a suitable testing method and then run a test. This involves turning your backlog and metrics into a measurement plan, and it is one of the most crucial parts of CRO.

Examining the hypothesis across multiple tests requires a good volume of visitors to produce reliable results.

  • Validating 

After setting up and running the tests, you analyze the results. This generates insights that you then subsequently test. CRO is a continuous process of learning and refining.

Testing for CRO 

Conversion rate optimization tests refer to various types of testing methodologies. They are used to identify the best possible version of a certain site, the one that brings in the most valuable traffic.

  • Principles of CRO 

Speed

Back in 2009, Google conducted experiments which showed that slowing down the search results page by under half a second resulted in 0.2% to 0.6% fewer searches.

These might not sound like big numbers, but for Google (which processed roughly 40,000 searches every second) the upper bound translates to 14,400 fewer searches every minute, and according to Google itself, people have only become more impatient since. That makes speed an important factor.
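
A quick back-of-the-envelope check of that figure, using the rates quoted above:

```javascript
// 40,000 searches per second, 60 seconds per minute, and a 0.6% drop
// (the upper bound of Google's 0.2%-0.6% range).
const searchesPerMinute = 40000 * 60;               // 2,400,000
const lostPerMinute = searchesPerMinute * 6 / 1000; // 0.6% = 6/1000
console.log(lostPerMinute);                         // 14400 fewer searches/minute
```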
 
Singularity 
 
A single takeaway, “less is more”, is just about the right mantra for every website owner to follow.

Many landing pages contain multiple offers.
 
They shouldn’t.
 
Why?
 
Having just one clear offer per page improves campaign clicks and increases conversion rates.

Identification

Identifying your audience and their aspirations and desires means higher conversions. In other words, what they want, what matters to them, and what the sources of friction are for them all come under identification.

To identify people, you must know them. After all, sales and marketing will only see those conversions when you identify your own customers.

Landing Pages

Take this advice: do not clutter your pages or emails with “what if” moments.

What if the user likes my page?

What if more of the audience likes the information being served on the page?

What if they want to read my testimonials and case studies?

If the goal of creating a particular page is to get likes and visitors to subscribe to it, then all you should do is focus on that. Your landing pages should therefore be as precise and simple as they can be. This gives your audience and customers a clearer idea of what you are selling.

  • Testing Methods 

A/B testing

Businesses want visitors to take an action (conversion) on the website, and the rate at which a website is able to drive these actions is called its "conversion rate."

A/B testing is the practice of showing 2 variants of the same webpage to different segments of website visitors at the same time and comparing which variation drives more conversions. 
 
The one that gives higher conversions wins!
 
The metric of conversion is different for every site. For e-commerce, it might be product sales, whereas for B2B it might be generating qualified leads. Well-planned, data-driven A/B testing makes your marketing plan more profitable by narrowing it down to its most important elements, testing them, and combining the winners.

Note that every element on your website that influences visitor behavior and conversion rate should be A/B tested.

Image of two laptops where one (A) has a 23% red bar on the screen and the other has a 37% green bar on the screen
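
The mechanics described above can be sketched in a few lines: deterministically bucket each visitor into one of two variants and compare the resulting conversion rates. The hashing scheme and function names here are illustrative, not taken from any particular testing tool:

```javascript
// Assign a visitor to variant 'A' or 'B' based on a hash of their id,
// so the same visitor always sees the same variant.
function assignVariant(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit string hash
  }
  return hash % 2 === 0 ? 'A' : 'B';
}

// Conversion rate = conversions / visitors.
function conversionRate(conversions, visitors) {
  return visitors === 0 ? 0 : conversions / visitors;
}

// Compare the two variants once enough traffic has been collected.
function winner(statsA, statsB) {
  const rateA = conversionRate(statsA.conversions, statsA.visitors);
  const rateB = conversionRate(statsB.conversions, statsB.visitors);
  return rateA >= rateB ? 'A' : 'B';
}
```

In a real test you would also check statistical significance before declaring a winner.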


Multivariate Tests 

In a multivariate test, you identify a few key areas/sections of a page and then create variations for those sections specifically (as opposed to creating variations of the whole page in an A/B split test). So for example, in a multivariate test, you choose to create different variations for 2 different sections: headline and image. A multivariate testing software will combine all these section specific variations to generate unique versions of the page to be tested and then simply split traffic amongst those versions.

Multivariate testing aims to provide the solution: you can change a title and an image at the same time. With multivariate tests, you test a hypothesis for which several variables are modified and determine which combination from among all possible solutions performs the best. If you create 3 different versions of 2 specific variables, you then have nine combinations in total (the number of variants of the first variable × the number of variants of the second).
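
The combination count above is easy to verify with a small sketch that generates every page version from two sets of section variants (a hypothetical helper, not a feature of any specific tool):

```javascript
// Build every page version from the Cartesian product of
// headline variants and image variants.
function pageVersions(headlines, images) {
  const versions = [];
  for (const headline of headlines) {
    for (const image of images) {
      versions.push({ headline, image });
    }
  }
  return versions;
}

// 3 headline variants x 3 image variants = 9 versions to split traffic over.
const versions = pageVersions(
  ['H1', 'H2', 'H3'],
  ['img-a', 'img-b', 'img-c']
);
console.log(versions.length); // 9
```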

Flow chart of multivariate test. Starting from buying now & then getting divided into two parts, one part says original other says variation

 

  • Some Testing tools 

Now we know that Conversion Rate Optimization (CRO) focuses on simple tests that allow you to compare and contrast layouts, calls to action, design features, content and even personalized marketing features to nudge your conversion rate to new heights.

Therefore here are some wicked yet sweet CRO tools that would help you with the testing.

Optimizely: The Optimizely tool requires a single line of lightweight code added to any website. The user then has the power to change any on-page element of the website.

Google Analytics: Most websites have this tool already in place, and it is one of the most popular testing tools. Google Analytics helps you split your traffic between two pages that you have developed and lets you know which version performs best.
  
Visual Website Optimizer: This tool specifically helps in figuring out what your testing windows will look like after you plug in a few variables. Visual Website Optimizer is great for company projects and client work.

Adobe Target: Adobe Target is a popular enterprise tool that combines testing, targeting, and personalization. It walks you through a three-step workflow where you first create a variant, then target a variant base, and lastly customize your goals.

Apptimize: This testing tool focuses entirely on mobile optimization and is a perfect choice for mobile businesses. It offers a full-control visual editor and rapidly creates new variants and targets.

Conclusions 

Now we know that one of the most important goals for an organization is to create conversions. Conversions are how you measure your progress and growth.

Opensense Labs is aware of how important it is to apprehend visitors' preferences and interests. Our services in Drupal personalization and CRO bring us together with our clients and help accelerate their conversions.

Ping us at [email protected] and let us help you navigate the road to success and its hurdles.

Feb 08 2019
Feb 08

Welcome to the latest version of Lullabot.com! Over the years (since 2006!), the site has gone through at least seven iterations, with the most recent launching last week at the 2019 Lullabot team retreat in Palm Springs, California.

Back to a more traditional Drupal architecture

Our previous version of the site was one of the first (and probably the first) decoupled Drupal ReactJS websites. It launched in 2015.

Decoupling the front end of Drupal gives many benefits including easier multi-channel publishing, independent upgrades, and less reliance on Drupal specialists. However, in our case, we don’t need multi-channel publishing, and we don’t lack Drupal expertise.

One of the downsides of a decoupled architecture is increased complexity. Building blocks of our decoupled architecture included a Drupal 7 back end, a CouchDB middle-layer, ReactJS front end, and Node.js server-side application. Contrast this with a standard Drupal architecture where we only need to support a single Drupal 8 site.

The complexity engendered by decoupling a Drupal site means developers take longer to contribute certain types of features and fixes to the site. In the end, that was the catalyst for the re-platforming. Our developers only work on the site between client projects so they need to be able to easily understand the overall architecture and quickly spin up copies of the site.

Highlights of the new site

In addition to easily swapping in and out developers, the primary goals of the website were ease of use for our non-technical marketing team (hi Ellie!), a slight redesign, and to maintain or improve overall site speed.

Quickly rolling developers on and off

To aid developers quickly rolling on and off the project, we chose a traditional Drupal architecture and utilized as little custom back-end code as possible. When we found holes in functionality, we wrote modules and contributed them back to the Drupal ecosystem. 

We also standardized to Lando and created in-depth documentation on how to create a local environment. 

Ease of use

To enable our marketing team to easily build landing pages, we implemented Drupal’s new experimental Layout Builder module. This enables a slick drag-and-drop interface to quickly compose and control layouts and content.

We also simplified Drupal’s content-entry forms by removing and reorganizing fields (making heavy use of the Field Group module), providing useful descriptions for fields and content types, and sub-theming the Seven theme to make minor styling adjustments where necessary.

Making the front end lean and fast 

Normally, 80% of the delay between navigating to a webpage and being able to use the webpage is attributed to the front end. Browsers are optimized to quickly identify and pull in critical resources to render the page as soon as possible, but there are many enhancements that can be made to help it do so. To that end, we made a significant number of front-end performance optimizations to enable the rendering of the page in a half-second or less.

  • Using vanilla JavaScript instead of a framework such as jQuery enables the JS bundle size to be less than 27kb uncompressed (to compare, the previous version’s bundle size was over 1MB). Byte for byte, JavaScript impacts the performance of a webpage more than any other type of asset. 
  • We heavily componentize our stylesheets and load them only when necessary. Combined with the use of lean, semantic HTML, the browser can quickly generate the render-tree—a critical precursor to laying out the content.
  • We use HTTP2 to enable multiplexed downloads of assets while still keeping the number of HTTP requests low. Used with a CDN, this dramatically lowers the time-to-first-byte metric and time to download additional page assets.
  • We heavily utilize resource-hints to tell the browser to download render-blocking resources first, as well as instructing the browser to connect third-party services immediately.
  • We use the Quicklink module to pre-fetch linked pages when the browser is idle. This makes subsequent page loads nearly instantaneous.
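
As an illustration of the resource hints mentioned above, here is a small sketch that renders link-hint tags as strings. This is a hypothetical helper, not part of Drupal or Quicklink; on a Drupal site these tags typically end up in the page head via the theme or a module:

```javascript
// Render a <link> resource-hint tag. 'rel' must be one of the
// standard hint types; 'as' is only emitted for preload hints.
function resourceHint(rel, href, as) {
  const allowed = ['dns-prefetch', 'preconnect', 'prefetch', 'preload'];
  if (!allowed.includes(rel)) {
    throw new Error(`unknown resource hint: ${rel}`);
  }
  const asAttr = rel === 'preload' && as ? ` as="${as}"` : '';
  return `<link rel="${rel}" href="${href}"${asAttr}>`;
}

// Connect to a third-party origin early, and prioritize a critical stylesheet.
const hints = [
  resourceHint('preconnect', 'https://fonts.gstatic.com'),
  resourceHint('preload', '/css/critical.css', 'style'),
];
```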

There are still some performance @todos for us, including integrating WebP images (now supported by Chrome and Firefox) and lazy-loading images.

Contributing modules back to the Drupal ecosystem

During the development, we aimed to make use of contributed modules whenever it made sense. This allowed us to implement almost all of the features we needed. Only a tiny fraction of our needs was not covered by existing modules. One of Lullabot’s core values is to Collaborate Openly which is why we decided to spend a bit more time on our solutions so we could share them with the rest of the community as contributed modules.

Using Layouts with Views

Layout Builder builds upon the concept of layout regions. These layout regions are defined in custom modules and enable editors to use layout builder to insert these regions, and then insert content into them.

Early on, we realized that the Views module lacked the ability to output content into these layouts. Lullabot’s Director of Technology, Karen Stevenson, created the Views Layout module to solve this issue. This module creates a new Views row plugin that enables the Drupal site builder to easily select the layout they want to use, and select which regions to populate within that layout.

Generating podcast feeds with Drupal 8

Drupal can generate RSS feeds out of the box, but podcast feeds are not supported. To get around this limitation, Senior Developer Mateu Aguiló Bosch created the Podcast module, which complies with podcast standards and iTunes requirements.

This module utilizes the Views interface to map your site’s custom Drupal fields to the necessary podcast and iTunes fields. For more information, check out Mateu’s tutorial video here.

Speeding up Layout Builder’s user interface

As stated earlier, Layout Builder still has “experimental” status. One of the issues that we identified is that the settings tray can take a long time to appear when adding a block into layout builder.

Lullabot Hawkeye Tenderwolf identified the bottleneck as the time it takes Drupal to iterate through the complete list of blocks in the system. To work around this, Karen Stevenson created the Block Blacklist module, in which you can specify which blocks to remove from loading. The result is a dramatically improved load time for the list of blocks.

Making subsequent page loads instantaneous 

A newer pattern on the web (called the PRPL pattern) includes pre-fetching linked pages and storing them in a browser cache. As a result, subsequent page requests return almost instantly, making for an amazing user experience. 

Bringing this pattern into Drupal, Senior Front-end Dev Mike Herchel created the Quicklink module using Google’s Quicklink JavaScript library. You can view the result of this by viewing this site’s network requests in your developer tool of choice. 

Keeping users in sync using the Simple LDAP module

Lullabot stores employee credentials in an internal LDAP server. We want all the new hires to gain immediate access to as many services as possible, including Lullabot.com. To facilitate this, we use the Simple LDAP module (which several bots maintain) to keep our website in sync with our LDAP directory.

This iteration of Lullabot.com required the development of some new features and some performance improvements for the D8 version of the module.

Want to learn more?

Built by Bots

While the site was definitely a team effort, special thanks go to Karen Stevenson, Mike Herchel, Wes Ruvalcaba, Putra Bonaccorsi, David Burns, Mateu Aguiló Bosch, James Sansbury, and, last but not least, Jared Ponchot for the beautiful designs.

Feb 08 2019
Feb 08

Drupal 8 is known for the extensive third-party integration opportunities it gives to websites. One of the tools for this is the contributed Drupal module JSON:API. It helps developers build high-performance APIs for various purposes, including multi-channel content or decoupled Drupal and JSON API setups (which is one of our Drupal team’s areas of expertise). This winter has seen a new release — Drupal JSON:API 2.x. Let’s take a look at what the module does, what makes it useful, and how it has changed in the 2.x version.

JSON:API: principle and benefits

JSON API is a specification, or a set of rules, for REST APIs. It defines how data is exchanged between the server and the client in the JSON format. This includes how the client requests the resources, how they are fetched, which HTTP methods are used, and so on.

JSON (JavaScript Object Notation), in its turn, is the most popular format for APIs. It is very lightweight, consistent in structure, intuitively understandable, human-readable, and easily consumable by machines. At the root of all the requests is a JavaScript object.

The JSON API specification optimizes the HTTP requests and gives you better performance and productivity. JSON API eliminates unnecessary server requests and reduces the size of the data packages.

The specification supports the standard CRUD operations that let users create, read, update, or delete the resources. It is also accepted by all programming languages and frameworks.

Image of CRUD operations

A glimpse at Drupal JSON:API module’s work

The Drupal JSON:API module offers Drupal’s implementation of the JSON API specification. The module provides an API in compliance with the JSON:API standards for accessing the content and configuration entities of Drupal websites.

The JSON:API module is part of Drupal 8’s ecosystem of web services, and also an alternative to Drupal’s core REST. JSON:API resolves some of the core REST limitations (for example, complex setup, confusing URLs, hard-to-configure collections of entities etc.). At the same time, it only works with entities.

Let’s note some important points about the benefits of the JSON:API work:

  • no configuration is needed (enabling the module is enough to get a full REST API)
  • instant access to all Drupal entities
  • URLs provided dynamically for entity types and bundles so they are accessible via standard HTTP methods (GET, POST, PATCH, DELETE etc.)
  • support for Drupal entity relationships
  • support for complex sorting and pagination
  • access configured in Drupal core role and permission system

As we see, the main philosophy of the module is to be production-ready out of the box. For configurations, there is a related contributed module JSON:API Extras module that lets developers set up every detail they need.
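
To make the "no configuration" point concrete, here is a sketch of assembling a JSON:API collection URL with sparse fieldsets, sorting and pagination. The query parameter names (fields[...], sort, page[limit]) come from the JSON:API specification; the base URL and resource path are placeholders:

```javascript
// Build a JSON:API collection URL. resourceType is the JSON:API type
// (e.g. 'node--article'); path is the route exposed by the module.
function jsonApiUrl(base, path, resourceType, { fields, sort, limit } = {}) {
  const params = new URLSearchParams();
  if (fields) params.set(`fields[${resourceType}]`, fields.join(','));
  if (sort) params.set('sort', sort);
  if (limit !== undefined) params.set('page[limit]', String(limit));
  const query = params.toString();
  return `${base}${path}` + (query ? `?${query}` : '');
}

// Ten newest articles, fetching only the title and created fields
// (sparse fieldsets keep the response payload small).
const url = jsonApiUrl(
  'https://example.com', '/jsonapi/node/article', 'node--article',
  { fields: ['title', 'created'], sort: '-created', limit: 10 }
);
```

A real client would pass this URL to fetch() with the Accept: application/vnd.api+json header.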

Drupal JSON:API 2.x: what’s new?

The Drupal JSON:API 2.x module is getting ready to become part of Drupal 8 core in the near future. Thanks to this, the ecosystem of web services for building high-performance APIs in Drupal 8 core will soon be more complete and diverse. It is also great that Drupal will sidestep NIH (not invented here) syndrome by shipping a core API that follows a very popular specification.

The creators of the JSON API module have had a busy time preparing the 2.x module version for Drupal 8 websites. Overall, 63 contributors took part in that. And there is still a big roadmap ahead.

They issued two beta versions in August and September, then three release candidates from October to December of 2018.

Finally, the stable 2.x version arrived: JSON:API 2.0 on January 7, 2019, bringing big changes. Websites will benefit from:

  • serious performance improvements (including sparse fieldsets)
  • better compatibility with JSON:API clients
  • more comprehensive test case coverage (including edge cases)
  • backwards compatibility with JSON:API 1.x

Image: Drupal JSON:API 2.x improvements

As well as:

  • the ability to see labels of the inaccessible entities
  • information about the user available via “meta.links.me”
  • error response cacheability
  • the config entity mutation feature moved to the JSON:API Extras module
  • a final farewell to the _format parameter in the URL
  • a custom "url" field is no longer added to file entities
  • filter paths closely match JSON:API output structure
  • URLs become objects with `href` keys

and more

Drupal JSON:API 2.0 was soon followed by a new release, JSON:API 8.x-2.1, on January 21, 2019. This is now the latest stable version. Drupal JSON:API 2.1 added two new features:

  1. support for file entity creation from binary data
  2. support for fetching non-default entity revisions.

We should also note that the Drupal JSON:API 2.x module is part of the Contenta CMS decoupled distribution, which uses best practices of decoupled architecture. JSON:API is immediately available with all new installs of Contenta.

The related JSON:API Extras module for customizing APIs is also fresh and updated, with a new 8.x-3.3 release on January 21.

Let’s build the optimal JSON:API setup

As we see, there are plenty of means and tools to build high-performance APIs. Our Drupal developers are skilled in this area and will select the optimal ones for your website: the Drupal JSON:API 2.x module, core RESTful Web Services, GraphQL, and more. Contact our Drupal team!

Feb 08 2019
Feb 08

Today, our interactions with the digital world have come to outnumber our human interactions so much that an appealing, friendly user interface plays an important role in progression.

Government websites are no different.

The first connection they have with their citizens is most likely an engaging website, one of the most essential tools for meeting the needs of the people.

So, it has to be the best. Right?

Image of a .gov text where the dot is red in color and the text is blue in color


Creating a functional website with easy navigation not only helps officials connect better with their constituents but also ensures that the public stays well informed at all times.

Drupal is one such platform which helps you achieve all of this in one go. 

How is Drupal performing in the government sector?

Drupal is gaining popularity in the government sector all over the world. The solidity and flexibility of the platform are the primary reasons why governments are moving their online portals to Drupal. Government websites like the White House, the Federal IT Spending Dashboard, and Data.gov have specifically chosen Drupal for its efficiency. The reason most government websites choose Drupal over any other platform also comes down to the fact that it is:

  • Secure

Drupal has an excellent track record when it comes to solving and maintaining security issues. The security team (the Drupal community) works together with other councils to watch over the platform and ensure that its users are getting the best security practices.

The fact that the White House entrusts Drupal as its platform is enough to prove that it is a highly secure CMS. There are several security modules that make it a highly reliable platform. Modules like:

Login Security: For Drupal sites available over both HTTP and HTTPS, this locks down the login page and submits other forms securely via HTTPS, preventing passwords and other sensitive user data from being transmitted in the clear.

Password Policy: This module enforces constraints that require users to create strong passwords.

Captcha: A challenge-response test specifically constructed to determine whether a request comes from a human rather than a bot.

Security Kit: This module provides various security-hardening options that make it harder for attackers to compromise a site's data. Security Kit offers mitigations for cross-site request forgery, cross-site scripting, and clickjacking, among other issues.

Two-factor verification: A procedure involving two authentication steps, performed one after another, to verify the user requesting access.

[Image: a lock, shield, monitor, and cloud on a blue background, with the text "Drupal 8, a secure way for web development"]
  • Accessible

Government websites are used by everyone, and by everyone, I mean the visually impaired too. Every citizen should be able to access a government website quickly and seamlessly, and according to WCAG 2.0 (the Web Content Accessibility Guidelines), every website should provide equal access to all people.
Drupal adheres to the WCAG guidelines through its various modules and provides accessibility to everyone.

Alt text: One of the most important features when it comes to making a website accessible. With alternative text, search engines understand what an image shows, and screen readers can read it aloud to the user.

Content accessibility: This module checks all field types where the user can enter formatted text. Below the content section of the accessibility page, the user can turn individual tests on or off.

Accessibility in WYSIWYG: This module integrates accessibility checking with the WYSIWYG or CKEditor modules, providing a new button in the content editor that checks the work for accessibility issues.
 

[Image: the blue Drupal drop with stick figures of a person in a wheelchair and a person spreading their arms]

 

  • Economical

Government budgets for software development cannot be compared with those of large enterprises. Governments need high-quality solutions that do not cost a fortune, and high development and maintenance costs are a real obstacle to finding them.

Drupal meets these and many other requirements, which is why governments choose it as their CMS. The cost of Drupal development is relatively low compared with other CMSs on the market.

Another great feature of Drupal 8 is its scalability. The CMS suits both small survey websites and content-rich websites holding gigabytes of information.

At the same time, Drupal is capable of handling high traffic. Web solutions built on it remain available even when traffic volumes jump sky high. Great examples are the UNESCO and CERN websites: these Drupal-based sites offer a great user experience to the thousands of people who use them daily.

  • Easily Customized

When we say government websites, we automatically imagine something grey, black, white, or otherwise really boring. But remember that these websites serve both political and non-political purposes, so user interaction and engagement become crucial.

Drupal and its modules make better user engagement possible. Administrators can add blogs, articles, and other write-ups that highlight opportunities and solutions. Beyond that, Drupal gives its users the power to personalize their website according to their needs and requirements, making it a flexible and reliable CMS.

  • Has superb integration capabilities 

A powerful web solution should integrate seamlessly with third-party applications. Publishing tools, data repositories, and other features belong on the list of necessary integrations.

The integration capabilities offered by Drupal are enormous. It provides numerous options so that users can integrate virtually any third-party service.

Drupal Distribution: DeGov to the rescue 

DeGov is the first Drupal 8 open-source distribution focusing entirely on the needs of governmental organizations. It is intended to provide a comprehensive set of functionalities commonly used in government-related applications.

What is DeGov not?

The DeGov distribution is not a commercial CMS (developed and owned by a single company, with users typically needing to buy a license), and it is not a finalized product. In other words, DeGov is not a ready-made solution: many functionalities and ideas remain in the backlog, so it is a continuous work in progress.

DeGov is an attempt to bring the benefits of open source to the public sector.
 
[Image: the DeGov logo, a circle with the text "DeGov" in the middle]

Use cases related to DeGov

DeGov covers six use cases and extends its functions to meet specific scenarios:

  • Publishing information websites for government organizations at all levels.
  • Service-oriented e-government portals that close the gap between citizens and the administration.
  • Citizen engagement portals for discussing and deciding online.
  • Open311 portals for civic issue tracking.
  • Open Data portals for publishing data and building communities around it.
  • Intranets and extranets for government employees.

[Infographic: the six use cases of DeGov]

Beneath the Canopy of DeGov

DeGov is a sub-profile of the Lightning distribution.

Lightning allows you to create sub-profiles based on the default Lightning distribution profile. Creating a sub-profile lets you customize the installation process to meet specific needs and requirements.

By combining DeGov with the Lightning distribution, you get configuration specialized for starting new projects. Building on top of Lightning lets DeGov, in true open-source manner, focus on functionality for the public sector.
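As a rough sketch of what wiring up such a sub-profile looks like on disk (the profile name "myagency" is a hypothetical example, the `base profile` key relies on Drupal core's experimental profile-inheritance support introduced in 8.6, and Lightning also provides its own tooling to generate sub-profiles):

```shell
# Minimal sketch of a Lightning sub-profile, assuming Drupal core's
# (experimental) "base profile" inheritance. "myagency" is a
# hypothetical name, not part of DeGov or Lightning.
mkdir -p profiles/custom/myagency
cat > profiles/custom/myagency/myagency.info.yml <<'EOF'
name: My Agency
type: profile
description: 'Sub-profile of Lightning with agency-specific defaults.'
core: 8.x
# Inherit installation steps and default configuration from Lightning.
base profile: lightning
EOF
```

During installation, Drupal would then offer "My Agency" as an install profile while reusing Lightning's defaults underneath.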

Which problems does DeGov solve?

Editors

For editors, DeGov solves problems such as complex backends, missing workflows, painful multi-language support, and a dated feel. It gives editors the greatest design flexibility in maintaining the website, along with simple content editing.

With the drag-and-drop function, you can structure your page easily, even without programming. Predefined page types make maintenance easy. A central media library manages all media, whatever the form or type (pictures, PDFs, videos, posts).

The DeGov distribution also makes social media integration easy: you can let users share content such as articles and blog posts. DeGov uses a privacy-compliant integration of social media share buttons.

Customers and clients 

DeGov addresses the high cost of ownership that comes with old, proprietary technologies. It offers a user-friendly web front end with attractive designs; it is responsive, so it adapts to any device and screen size. Semantic HTML/CSS keeps the website accessible, and pages can be translated into several languages, with a language switcher and screen-reader support easing the workflow.

DeGov websites get a powerful search function covering both website content and documents. This is particularly convenient for the authorities: the NRW administration search engine can be connected to the service so that users can also search content outside their own area of responsibility.

Companies and developers 

For companies and developers, DeGov solves the problem of high-cost updates and the functionality tied to them. The distribution is based on open-source software, so you pay neither for a license nor for the product itself.

Drupal is used in numerous projects, so the basic functions are constantly updated and expanded, and bugs and vulnerabilities are fixed quickly.

Implementations with DeGov

Websites

The DeGov distribution can power federal and state portals, as well as internet sites for ministries, authorities, districts, and municipalities. Web pages and target-group-specific portals (or topic pages) can be created easily with DeGov.

Participation Portals

With participation portals that let citizens weigh in on decisions and proposed measures, the DeGov distribution has opened new doors in communication. Here you can inform citizens, discuss with users, gather valuable suggestions, or create polls. From participatory budgeting through bills to construction projects, portals built with DeGov give citizens a voice.
 
E-Government Portals

The DeGov distribution is a great way to implement entire e-government portals. The focus here is less on editorial content than on complete specialist procedures; in this way, DeGov allows the digital processing of administrative workflows. The project Gewerbe.NRW was implemented with the help of the DeGov distribution.

Why DeGov with Drupal 8?

DeGov modules are Drupal modules that are carefully fitted together, so the whole system benefits from their combined functionality. The reasons Drupal 8 is the right base for DeGov:

  • It is based on Symfony and uses Composer.
  • Its configuration files are Git-friendly.
  • It is cloud-ready and runs like a modern PHP solution.
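To illustrate the Composer point, here is a minimal, hypothetical composer.json pulling in the distribution as a dependency; the package name "degov/degov" and the "^1.0" version constraint are assumptions to verify against Packagist before use:

```shell
# Sketch only: declare DeGov as a Composer dependency.
# The "degov/degov" package name and "^1.0" constraint are assumptions.
cat > composer.json <<'EOF'
{
    "name": "example/degov-site",
    "description": "Example site built on the DeGov distribution.",
    "require": {
        "degov/degov": "^1.0"
    }
}
EOF
# A real project would then fetch dependencies with: composer install
```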

Better projects with DeGov

[Image: a table listing all supported media types]

A case study on Gewerbe-Service-Portal.NRW

The new "Gewerbe-Service-Portal.NRW" provides citizen-friendly services by allowing company founders in the German federal state of North Rhine-Westphalia (NRW) to register a business electronically from home. Its main aim was to provide a clearly arranged online form for commercial registration that can be transmitted to the responsible authorities.

In addition to business registration, the portal provides information on founding an enterprise. Furthermore, all users have access to the Einheitliche Ansprechpartner NRW (the single point of contact for NRW). The online service also supports specialized staff taking up a service occupation.

The portal was developed on the basis of the Drupal-based content management system DeGov and nrwGOV. Drupal was chosen because it is cost-effective and new technologies can be adopted from the Drupal community. Apart from this, Drupal provides:

  • Higher safety and better quality
  • Independence
  • Comprehensive options
  • Accessibility

The portal aims to give entrepreneurs who are eligible to start their own business more flexibility by saving time through digitization. Electronic forwarding and processing of applications by the authorities ensures effective handling, resulting in effective, user-friendly communication between citizens and authorities. In the near future, Gewerbe-Service-Portal.NRW will grow into a comprehensive service platform so that administrative processes can be carried out from home.

In a Nutshell

So how important is a government website to you?

The answer should be crystal clear by now.

As important as it is to have a website, maintaining it is a whole different task. Yes, Drupal makes this easy with its functionality and distributions, but the art of maintaining a website is as important as creating it.

Ping us at [email protected]: our services would not only get the best out of Drupal but would also help enhance your development and industry standards.
