Aug 23 2019

Marketing departments these days are feeling the heat to make processes faster, more agile, and more efficient in a fast-paced digital world. That’s where Drupal comes into the picture!

Since its inception, Drupal has been considered crucial for companies trying to get on the digital transformation bandwagon in order to provide a faster and more nimble digital experience to customers.

Drupal comes equipped with a set of marketing tools to target the right audience and increase overall ROI for businesses. Additionally, its suite of tools and solutions builds a solid foundation for marketers to unearth its martech capabilities, making it one of the most powerful and widely adopted platforms for marketing needs.

This blog gives you insights into how Drupal can be a perfect choice for marketers.

Leveraging Drupal For Marketing

Drupal has become an inseparable part of the marketing space. It not only drives business revenue but also strikes a remarkable balance between marketing technologies and their ecosystem with its content-first, commerce-first, and community-first marketing solutions.

Creative Freedom 

Dependence on the IT team for implementation is the best-known problem for digital marketers. With a traditional CMS, it takes added effort, time, and resources for the design, development, and marketing departments to work in sync.

By the time changes are implemented, the model already looks outdated for the new situation.

“Drupal integrates seamlessly with an enterprise’s existing marketing and sales technologies”


Cutting-edge Drupal modules and distributions empower different teams with the creative freedom to manage the development of a project at their own pace and convenience.

With Drupal’s architecture, organizations have a platform where they can dynamically launch their website. Marketing teams can curate the structure, segmented content, and visuals to lay the foundation of a strong digital strategy.

Drupal turns out to be a useful asset for organizations looking to implement it in their digital business, as it integrates seamlessly with their existing marketing and sales technologies.

Balancing Marketing Ecosystem

CRM systems are important in running businesses and boosting sales management. It is important for organizations to integrate and be able to customize the martech stack to their unique needs. 

Drupal offers features that help you strike a fine balance between your CRM and marketing ecosystem with content-first, commerce-first, and community-first marketing solutions. Business, technology, and marketing professionals use Drupal to create agile solutions that are easy to use and offer a wide reach across the web. Built fully responsive, Drupal sites let customers and users discover products and solutions from any device.

It possesses immense potential with native features and module extensions, including integration with third-party digital marketing tools.

In all, it’s a platform to help you push your strategy for the upcoming phase of digital customer engagement and digital business.

Responsive Customer Experience

As a system of engagement, Drupal provides a solid foundation for every single interaction with customers and people within the organization alike, delivering a unified experience. Streamlining your business operations and aligning your digital strategies, it delivers on the mobile-first approach. Enterprise marketing teams can manage the website and administrative pages with ease across multiple platforms.

“Personalized customer experience is evolving at a blistering pace”

Organizations competing in this customer-centric age are putting in the effort to provide a personalized experience that keeps customers engaged while attracting new visitors on board for lead generation.

The digital strategies run by marketing teams focus mainly on gradually increasing leads, conversions, and company revenue via digital channels.

As customers lie at the center of the organization, this can be accomplished by offering them a tailor-made experience across all channels.

The motive is the same, but the way it is rendered has evolved significantly, and Drupal has a large part to play in it. Here’s how:

Have a look at the video to learn more about Acquia Lift and how it can benefit marketers:

Focus on Personalization with Acquia Lift

There is no denying the fact that companies are leaving no stone unturned to provide personalized results to users. It’s not just about presenting content on the website but about tweaking the whole experience. Simply put, they are trying to ensure that every single user, prospect or customer, gets what they want, even before they realize it.

Acquia Lift is a powerful amalgamation of data collection, content distribution, and personalization, helping marketers deliver a refined user experience

Acquia Lift bolsters personalized customer experiences. It is a powerful amalgamation of data collection, content distribution, and personalization that helps enterprise marketing teams deliver a refined user experience without much dependency on development or IT teams.

(Image: Acquia Lift components diagram)

Source: Acquia

Acquia Lift encompasses three key elements to boost personalization:

Profile Manager

It helps you build a complete profile of your users, right from when they land on your website as anonymous visitors up until they are repeat visitors or loyal customers. Collecting user information such as demographics, historical behavior, and real-time interactions, it complements these insights on user preferences with the best-suggested next actions.

Content Hub

This cloud-based tool provides secure content syndication, discovery, and distribution. Any content created within the organization can be consolidated and stored here, readily available to broadcast on any channel, in any format.

Searches on varied topics and automatic updates give insights into the wide spectrum of content being developed within the enterprise: in different departments, across websites, and on different platforms.

Experience Builder

This is the most essential element of Acquia Lift. It lets you create a personalized experience for your users from the beginning.

The Experience Builder is an easy-peasy drag-and-drop tool that allows you to personalize every section of your website to showcase different content to different target audiences, based on the information retrieved from the Profile Manager.

Marketing teams can:
  1. Define the rules and protocols for how content should be displayed to different segments of site visitors
  2. Carry out A/B testing to determine what type of content drives more conversions for which user segments.


All this can be executed with simple overlays onto the existing website segments, without disturbing the core structure of the site and without depending on IT teams for implementation.


Multilingual

With companies expanding their reach into international markets, Drupal’s multilingual features are definitely worth capitalizing on. Supporting 94 languages, Drupal can deliver a fully translated website with just four core modules in action (Language, Interface Translation, Content Translation, and Configuration Translation), enabling marketing teams to deliver localized content experiences and increasing the probability of turning a visitor into a customer.

“Drupal’s multilingual features are definitely worth capitalizing on”

Whatever language a Drupal site serves to its audience, editors and admins can choose a different language for themselves on the backend. Marketing and editorial teams have the rights to create and manage multilingual sites, with no need for additional local resources.
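
For illustration, here is a minimal sketch of enabling those four core modules programmatically and registering an extra language. This assumes custom install or update code running in a bootstrapped Drupal 8 site, and the 'es' langcode is just an example:

<?php

// A minimal sketch: enable Drupal 8's four core multilingual modules.
// The same can be done through the UI or with `drush en`.
$installer = \Drupal::service('module_installer');
$installer->install([
  'language',            // Base language handling.
  'locale',              // Interface translation.
  'content_translation', // Translation of nodes and other content.
  'config_translation',  // Translation of site configuration.
]);

// Register an additional language, e.g. Spanish.
\Drupal\language\Entity\ConfigurableLanguage::createFromLangcode('es')->save();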

Layout Builder

The Layout Builder, stable as of Drupal 8.7, allows content marketers to create and edit page layouts and arrange the presentation of individual pieces of content, content types, media, and nodes. It also lets you place user data, views, fields, and menus.

"Marketing teams can preview the pages with ease without impacting the user-experience”

It acts as a huge asset for enterprise marketing and digital experience teams:

  1. It offers the flexibility to create custom layouts for pages and other specific sections of websites. You can override the predefined templates for individual landing pages when required (see the sketch after this list).
  2. Content authors can embed videos effortlessly across the site to enhance user experience and ultimately drive conversion rate.
  3. Marketers can preview pages with ease, without fear of impacting the user experience.
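
For site builders who want to switch this on programmatically rather than through the UI, here is a minimal sketch using the Layout Builder APIs that shipped with Drupal 8.7. The display ID node.article.default is illustrative:

<?php

use Drupal\layout_builder\Entity\LayoutBuilderEntityViewDisplay;

// A minimal sketch: enable Layout Builder on the article content type's
// default view display and allow editors to override layouts per node.
// 'node.article.default' is an illustrative display ID.
$display = LayoutBuilderEntityViewDisplay::load('node.article.default');
$display
  ->enableLayoutBuilder()
  ->setOverridable()
  ->save();

The same two calls can be made from an update hook or install profile, which keeps the configuration change in code.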


(Image: Layout Builder illustrated with flowers in a square gift box)

Source: Dri.es


All these features ensure that marketing teams have more control over the website and can work independently without needing help from developers. This ultimately reduces the turnaround time for launching campaigns and the dependency on development teams.

Improved UI with API-First Headless Architecture

With the ever-increasing demands of acquiring and retaining customers, marketing teams are always in a hurry to redesign and update the back end and front end in a short period. However, updating and redesigning digital properties rapidly while keeping up with evolving customer expectations is a strenuous task.

Traditional Drupal architecture can take an ample amount of time for updates and redesigns because refinement needs to happen at both the front end and the back end, resulting in a dependency on developers and designers to complete the project.

“A decoupled CMS strategy can work wonders for a website that is in desperate need of a change”

Now, with decoupled Drupal, marketing teams can become more agile, effectively segregating processes and streamlining upgrades without impacting the user experience at the front end. This feature of Drupal makes design and UX alterations easier to maintain.
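
To make the decoupled idea concrete, here is a hedged sketch of a front end pulling content over Drupal core’s JSON:API module. Any HTTP client in any language works the same way; PHP with Guzzle is used here only for illustration, and the domain is a placeholder:

<?php

require 'vendor/autoload.php';

// A minimal sketch: a decoupled client fetching article nodes from
// Drupal's JSON:API endpoint. The domain is a placeholder.
$client = new \GuzzleHttp\Client(['base_uri' => 'https://example.com']);

$response = $client->get('/jsonapi/node/article', [
  'headers' => ['Accept' => 'application/vnd.api+json'],
]);

$payload = json_decode((string) $response->getBody(), TRUE);

// The front end decides how to render; Drupal only serves structured data.
foreach ($payload['data'] as $article) {
  echo $article['attributes']['title'], PHP_EOL;
}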


(Image: decoupled Drupal architecture diagram)
Source: Acquia

Having the flexibility to come up with more ideas and implement them by easily adding them to the website is a huge uplift for marketers. New requirements can pop up in a marketer’s mind at any time and be added to the site for more customer engagement and lead generation. The decoupled approach gives the extra agility needed to keep improving the public-facing site.

Final Words

Drupal has the potential to provide a promising future for better digital experiences with every upcoming release. Each new feature appended to Drupal helps marketing teams become more flexible and scalable while still providing an outstanding customer experience in less time. Consequently, conversion rates, sales, and brand visibility increase manifold.

Marketing teams already running their sites on Drupal have a lot to be happy about. Specifically, the development around Acquia Lift and Acquia Journey frees them from reliance on developers for website updates and publishing content, letting them target the audience at the right time and increase ROI.

And for marketing teams at organizations planning a shift to Drupal for its all-inclusive features, the results delivered by a highly skilled and empowered team will make it worth all the effort.

Aug 23 2019

With enterprises looking for ways to stay ahead of the curve in the growing digital age, machine learning is providing them with the boost they need for a seamless digital customer experience.

Machine learning algorithms can transform your Drupal website into an interactive CMS, coming up with relevant service recommendations targeted at each individual customer’s needs by understanding their behavioural patterns.


A machine-learning-integrated Drupal website ensures effortless content management and publishing, better targeting, and empowers your enterprise to craft personalized experiences for your customers. It automates customer service tasks and frees up your customer support teams, ultimately improving ROI.

However, with various big names competing in the market, let’s look at how Amazon’s machine learning stands out and provides customised offerings when integrated with Drupal.

Benefits of Integrating AWS Machine Learning with Drupal

AWS offers a wide set of machine learning services, including pre-trained AI services for computer vision, language, recommendations, and forecasting. These capabilities are built on a comprehensive cloud platform and are optimized without compromising security. Let’s look at the host of advantages it offers when integrated with Drupal.

Search Functionality

One of the major problems encountered while searching a website is the need for exact keywords. If the content uses a related keyword, you will not find it without typing the correct keyword.

This problem can be solved by using machine learning to train the search algorithm to look for synonyms and display related results. The search functionality can also be improved by automatically filtering results according to past reads, click-through rates, and similar signals.

Amazon CloudSearch is designed to help users improve the search capabilities of their applications and services by setting up a scalable search domain with low latency and high throughput.

Image Captioning

Amazon’s machine learning services help automatically generate relevant captions for all images on a website by analyzing the image content. The admin can configure whether captions are added automatically or only after manual approval, saving a lot of time for content curators and website administrators.

Amazon Rekognition can search through many images to find content within them, helping to segregate them almost effortlessly with minimal human interaction.
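
As a hedged sketch of what such an integration could look like with the AWS SDK for PHP (the bucket, key, and region are illustrative, and a production Drupal integration would wrap this in a service or hook rather than a standalone script):

<?php

require 'vendor/autoload.php';

use Aws\Rekognition\RekognitionClient;

// A minimal sketch: ask Rekognition to label an image stored in S3 so the
// labels can seed a draft caption for editorial approval. The bucket and
// object names are illustrative.
$client = new RekognitionClient([
  'region'  => 'us-east-1',
  'version' => 'latest',
]);

$result = $client->detectLabels([
  'Image' => [
    'S3Object' => [
      'Bucket' => 'my-drupal-assets',
      'Name'   => 'uploads/team-photo.jpg',
    ],
  ],
  'MaxLabels'     => 5,
  'MinConfidence' => 80,
]);

// Join the top labels into a draft caption for editorial review.
$labels = array_column($result['Labels'], 'Name');
echo 'Suggested caption: ' . implode(', ', $labels) . PHP_EOL;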

Website Personalization

Machine learning ensures users view tailored website content based on their favorite reads and searches by assigning each user a unique identifier (UID) and tracking their behaviour (clicks, searches, favourite reads, etc.) on the website for a personalized web experience.

Machine learning analyzes the data connected with the user’s UID and provides personalized website content.

Amazon Personalize is a machine learning service that makes it easy for developers to create individualized recommendations for their customers. It saves up to 60% of the time needed to set up and tune the infrastructure for machine learning models, compared to building your own environment.

Amazon Comprehend is another natural language processing (NLP) service that uses machine learning to find insights and relationships in text. It identifies which topics are most popular for easy recommendation. So, when you’re trying to add tags to an article, instead of searching through all possible options, you can see suggested tags that sync up with the topic.
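
A hedged sketch of pulling tag suggestions from Comprehend, again with the AWS SDK for PHP (the region is illustrative, and wiring the phrases into Drupal’s tagging UI is left out):

<?php

require 'vendor/autoload.php';

use Aws\Comprehend\ComprehendClient;

// A minimal sketch: extract key phrases from an article body so they can
// be offered as suggested taxonomy tags.
$client = new ComprehendClient([
  'region'  => 'us-east-1',
  'version' => 'latest',
]);

$body = 'Drupal websites can integrate AWS machine learning services to ...';

$result = $client->detectKeyPhrases([
  'LanguageCode' => 'en',
  'Text'         => $body,
]);

foreach ($result['KeyPhrases'] as $phrase) {
  echo 'Suggested tag: ' . $phrase['Text'] . PHP_EOL;
}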

Vulnerability Scanning

A website is always exposed to potential threats, with the risk of losing confidential customer data.

Using machine learning, Drupal-based websites can be made secure and resistant to data loss by automatically scanning themselves for vulnerabilities and notifying the administrator about them. This gives websites a great advantage and also saves the extra cost of external scanning software.

Amazon Inspector is an automated security assessment service that helps improve the security and compliance of a website deployed on AWS, assessing it for exposure, vulnerabilities, and deviations from best practices.

Voice-Based operations

With machine learning, it’s possible to control and navigate your website by voice. Drupal stands by its commitment to accessibility, and when integrated with Amazon’s machine learning features, it promotes inclusion by making web content more accessible to people.

Amazon Transcribe is an automatic speech recognition (ASR) service. When integrated with a Drupal website, it benefits the media industry with live subtitling of news or shows, video game companies with streaming transcription to help hearing-impaired players, and the legal domain by enabling courtroom stenography and letting lawyers make annotations on top of live transcripts. It also boosts business productivity by leveraging real-time transcription to capture meeting notes.

The future of websites looks interesting and is predicted to benefit users through seamless experiences driven by data and behavior analysis. The benefits of integrating Amazon’s machine learning with Drupal give it a clear advantage over other CMSs and pave the way for a brighter future and a better roadmap.

Srijan has certified AWS professionals and expertise in AWS competencies. Contact us to get the conversation started.

Aug 21 2019

JS frameworks have changed quite a lot in Drupal, especially with the API-first concept added to the scenario. It is only natural that developers are inclined to learn more about JS and its possibilities.

Recently, I was tasked with rendering blocks on a page while keeping individual components encapsulated in their own directories. JavaScript has the potential to make web pages dynamic and interactive without hindering page speed, so I decided to opt for progressively decoupled blocks.

I came across the Progressive Decoupled Blocks module, which allows us to render blocks in a decoupled way and seemed a perfect fit for the situation.

Anatomy of Progressive Decoupled Blocks

The module is a JavaScript-framework-agnostic progressive decoupling tool that allows custom blocks to be written in the JavaScript framework of one’s choice. What makes it unique is that one can do all this without needing to know any Drupal API.

It keeps individual components compact in their own directories, each containing all the CSS, JS, and template assets necessary for it to work, and uses an info.yml file to declare these components and their framework dependencies to Drupal.

Did it work?

I decided to test the module, with React and Angular blocks, in the new Umami demo profile that comes out of the box with Drupal.

I used the Layout module and picked a page to place these blocks. This is one of the parts I liked best about the module: every piece of block content we need can be placed on the page without even visiting the block layout page and setting visibility.

Module Architecture

The module has a custom block deriver, which discovers all the components that have been added and exposes them as blocks. This architecture makes it super easy to add a new block.

You can refer to this git for the block derivative - https://git.drupalcode.org/project/pdb/blob/8.x-1.x/src/Plugin/Derivative/PdbBlockDeriver.php

Refer to this git for component discovery - https://git.drupalcode.org/project/pdb/blob/8.x-1.x/src/ComponentDiscovery.php
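
To make that architecture concrete, here is a simplified, hypothetical sketch of the deriver pattern (the module name, service ID, and info keys are illustrative, not the pdb module’s actual code; the linked files above are the real implementation):

<?php

namespace Drupal\pdb_example\Plugin\Derivative;

use Drupal\Component\Plugin\Derivative\DeriverBase;

/**
 * Illustrative sketch: expose each discovered component as a block plugin.
 */
class ComponentBlockDeriver extends DeriverBase {

  /**
   * {@inheritdoc}
   */
  public function getDerivativeDefinitions($base_plugin_definition) {
    // Hypothetical discovery service that parses each component's info.yml.
    $components = \Drupal::service('pdb_example.component_discovery')->getComponents();
    foreach ($components as $id => $info) {
      // One block derivative per component directory.
      $this->derivatives[$id] = [
        'admin_label' => $info['name'],
      ] + $base_plugin_definition;
    }
    return $this->derivatives;
  }

}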

Possibilities

For React, there are two examples in this module.

The first uses a simple React.createElement call, without JSX. This example renders simple text inside a React component and is a very good starter for anyone who wants to understand how to integrate a React component with Drupal in a progressive way.

The second example is a ToDo app, which allows you to add and remove to-do items in a list. It uses local storage to persist any item that has been added. The app creates React components, composes them inside one another, and renders a fully functional DOM inside Drupal’s DOM in a progressive way.

This example comes with a package.json whose dependencies need to be installed before it is functional. The component did not render perfectly on the Umami theme, so I made certain changes to ensure it renders correctly; the patch is attached at the end.


I decided to extend this module and added a new component to render a banner block (which comes out of the box with Umami), exposing it as an API via JSON:API. As this was a very simple block, I created it without JSX. I also generated the URL with a static ID in the path; this can be made dynamic, which I plan to do later in my implementation. To save on styling, I reused the same class names found in the Umami theme profile.

(function ($, Drupal, drupalSettings) {
  Drupal.behaviors.pddemo = {
    attach: function (context, settings) {
      // Fetch the banner node from the JSON:API URL exposed in drupalSettings.
      $.ajax(drupalSettings.pdb.reactbanner.url, {
        type: 'get',
        success: function (result) {
          var image = result.included[0].attributes.uri.url;
          var title = result.data.attributes.field_title;
          // Build the banner markup as nested React elements (no JSX).
          ReactDOM.render(React.createElement(
            'div',
            {
              className: 'block-type-banner-block',
              style: {backgroundImage: 'url(' + image + ')'}
            },
            React.createElement(
              'div',
              {className: 'block-inner'},
              React.createElement(
                'div',
                {className: 'summary'},
                React.createElement(
                  'div',
                  {className: 'field--name-field-title'},
                  title
                )
              )
            )
          ), document.getElementById('reactbanner'));
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
          console.log(textStatus || errorThrown);
        }
      });
    }
  };
})(jQuery, Drupal, drupalSettings);


Angular JS

The module ships with many more Angular examples, both easy and complex. Things didn’t work as smoothly as they did with React; there were a couple of changes required to make it work.

The patch is attached at the end; you can refer to this gist for it.

You will have to install the node modules first to make it work. Also, all the JS is written in TypeScript, which needs to be compiled to JS before it will run.

Conclusion

The module gives you a great kickstart for moving any block rendered by Drupal to a progressively decoupled block, using React, Angular, or Ember as the JS framework. However, you may want to extend the module or create your own to render the blocks.

Aug 20 2019

Welcome to Mediacurrent’s Open Waters, a podcast about open source solutions. In this episode, we catch up with Cristina Chumillas. Cristina comes from the design world and is passionate about front-end development. She works at Lullabot (though when we recorded this, she worked at Ymbra) and has been involved in the Drupal community for years, contributing with code, design, and organizing events. Her contributions to Drupal Core are mainly focused on front-end, design and UX. Nowadays, she's a co-organizer of the Drupal Admin UI & JS Modernization Initiative and a Drupal core UX maintainer.

Audio Download Link

Project Pick

 Claro

Interview with Cristina Chumillas

  1. Tell us about yourself: What is your role, who do you work for, and where are you from?
  2. You are a busy woman, what events have you recently attended and/or are scheduled to attend in the near future?
  3. Which Drupal core initiatives are you currently contributing to?
  4. How does a better admin theme UI help site owners?  
  5. What are the main goals?
  6. Is this initiative sponsored by anyone? 
  7. Who is the target for the initiative? 
  8. How is the initiative organized? 
  9. What improvements will it bring in a short/mid/long term?
  10. How can people get involved in helping with these initiatives?

Quick-takes

  • Has been contributing to the Out of the Box initiative for a while, together with podcast co-host Mario
  • Three reasons why Drupal needs a better admin theme UI: content productivity, savings, and less frustration
  • Main goals: There are two separate paths: the super-fancy JS app that will land at an undefined point in the future, and Claro as the realistic, releasable short-term work that introduces improvements with each release.
  • Why focus on admin UI? We’re focusing on the content author’s experience because that’s one of the main pain points mentioned in an early survey we did last year.
  • How is the initiative organized? JS, UX&User studies, New design system (UI), Claro (new theme)
  • What improvements will it bring in the short/mid/long term? Short term: new theme/UI. Mid term: an editor role with specific features, and autosave. Long term: the JS app.

That’s it for today’s show, thanks for joining us!  Looking for more useful tips, technical takeaways, and creative insights? Visit mediacurrent.com/podcast for more episodes and to subscribe to our newsletter.

Aug 19 2019

This is a short story on an interesting problem we were having with the Feeds module and Feeds directory fetcher module in Drupal 7.

Background on the use of Feeds

Feeds on this site is used to ingest XML from a third-party source (Reuters), perhaps a couple of hundred articles per day. There can be updates to existing imported articles as well, but typically they are only updated on the day the article is ingested.

Feeds had been working well for a few years, and then all of a sudden the ingests started to fail. The failure was only on production; in the lower environments the ingestion worked as expected.

The bizarre error

On production we were experiencing the error during import:

PDOStatement::execute(): MySQL server has gone away database.inc:2227 [warning]
PDOStatement::execute(): Error reading result set's header database.inc:2227 [warning]
PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has [error]

The error is not so much that the database server is not alive, more so that PHP's connection to the database has been severed due to exceeding MySQL's wait_timeout value.

The reason this occurs only on production, typically on Acquia, is heavy reading and writing to the shared filesystem. In the lower environments the filesystem is local disk (as those environments are not clustered), so access is a lot faster. On production, the public filesystem is a network file share, which is slower.

Going down the rabbit hole

Working out why Feeds wanted to read and/or write so many files on the filesystem was the next question, and one thing stood out immediately: the sheer size of the config column in the feeds_source table:

mysql> SELECT id,SUM(char_length(config))/1048576 AS size FROM feeds_source GROUP BY id;
+-------------------------------------+---------+
| id                                  | size    |
+-------------------------------------+---------+
| apworldcup_article                  |  0.0001 |
| blogs_photo_import                  |  0.0003 |
| csv_infographics                    |  0.0002 |
| photo_feed                          |  0.0002 |
| po_feeds_prestige_article           |  1.5412 |
| po_feeds_prestige_gallery           |  1.5410 |
| po_feeds_prestige_photo             |  0.2279 |
| po_feeds_reuters_article            | 21.5086 |
| po_feeds_reuters_composite          | 41.9530 |
| po_feeds_reuters_photo              | 52.6076 |
| example_line_feed_article           |  0.0002 |
| example_line_feed_associate_article |  0.0001 |
| example_line_feed_blogs             |  0.0003 |
| example_line_feed_gallery           |  0.0002 |
| example_line_feed_photo             |  0.0001 |
| example_line_feed_video             |  0.0002 |
| example_line_youtube_feed           |  0.0003 |
+-------------------------------------+---------+
What 52 MB of ASCII looks like in a single cell.

Having to deserialize 52 MB of ASCII in PHP is bad enough.

The next step was dumping the value of the config column for a single row:

drush --uri=www.example.com sqlq 'SELECT config FROM feeds_source WHERE id = "po_feeds_reuters_photo"' > /tmp/po_feeds_reuters_photo.txt
Get the 55 MB of ASCII in a file for analysis

Then open the resulting file in vim:

"/tmp/po_feeds_reuters_photo.txt" 1L, 55163105C
Vim struggles to open any file that has 55 million characters on a single line

And sure enough, inside this config column was a reference to every single XML file ever imported, a cool ~450,000 files.

a:2:{s:31:"feeds_fetcher_directory_fetcher";a:3:{s:6:"source";s:23:"private://reuters/pass1";s:5:"reset";i:0;
s:18:"feed_files_fetched";a:457065:{
s:94:"private://reuters/pass1/topnews/2018-07-04T083557Z_1_KBN1JU0WQ_RTROPTC_0_US-CHINA-AUTOS-GM.XML";i:1530693632;
s:94:"private://reuters/pass1/topnews/2018-07-04T083557Z_1_KBN1JU0WR_RTROPTT_0_US-CHINA-AUTOS-GM.XML";i:1530693632;
s:96:"private://reuters/pass1/topnews/2018-07-04T083557Z_1_LYNXMPEE630KJ_RTROPTP_0_USA-TRADE-CHINA.XML";i:1530693632;
s:97:"private://reuters/pass1/topnews/2018-07-04T083617Z_147681_KBE99T04E_RTROPTT-LNK_0_OUSBSM-LINK.XML";i:1530693632;
s:102:"private://reuters/pass1/topnews/2018-07-04T083658Z_1_KBN1JU0X2_RTROPTT_0_JAPAN-RETAIL-ZOZOTOWN-INT.XML";i:1530693632
457,065 is the array size in feed_files_fetched

So this was the root cause of the problem: Drupal was attempting to stat() ~450,000 files that do not exist, and these files are mounted on a network file share. This process took longer than MySQL's wait_timeout, so MySQL closed the connection. When Drupal finally wanted to talk to the database, the connection was gone.

Interestingly enough, the problem of the config column running out of space came up in 2012, and "the solution" was just to change the type of the column. Now you can store 4 GB of content in this one column. In hindsight, perhaps this was not the smartest solution.

Also in 2012, you see the comment from @valderama:

However, as feed_files_fetched saves all items which were already imported, it grows endless if you have a periodic import.

Great to see we are not the only people having this pain.

The solution

The simple solution to limp by is to increase the wait_timeout value of your database connection. This gives Drupal more time to scan for the previously imported files prior to importing the new ones.

$databases['default']['default']['init_commands'] = [
  'wait_timeout' => "SET SESSION wait_timeout=1500",
];
Increasing MySQL's wait_timeout in Drupal's settings.php.

As you might guess, this is not a good long term solution for sites with a lot of imported content, or content that is continually being imported.

Instead, we opted for a fairly quick update hook that loops through all of the items in the feed_files_fetched key and unsets the older items.

<?php

/**
 * @file
 * Install file.
 */

/**
 * Function to iterate through multiple strings.
 *
 * @see https://www.sitepoint.com/community/t/strpos-with-multiple-characters/2004/2
 * @param $haystack
 * @param $needles
 * @param int $offset
 * @return bool|int
 */
function multi_strpos($haystack, $needles, $offset = 0) {
  foreach ($needles as $n) {
    if (strpos($haystack, $n, $offset) !== FALSE) {
      return strpos($haystack, $n, $offset);
    }
  }
  return FALSE;
}

/**
 * Implements hook_update_N().
 */
function example_reuters_update_7001() {
  $feedsSource = db_select("feeds_source", "fs")
    ->fields('fs', ['config'])
    ->condition('fs.id', 'po_feeds_reuters_photo')
    ->execute()
    ->fetchObject();

  $config = unserialize($feedsSource->config);

  // We only want to keep the last week's worth of imported articles in the
  // database for content updates.
  $cutoff_date = [];
  for ($i = 0; $i < 7; $i++) {
    $cutoff_date[] = date('Y-m-d', strtotime("-$i days"));
  }

  watchdog('FeedSource records - Before trimmed at ' . time(), count($config['feeds_fetcher_directory_fetcher']['feed_files_fetched']));

  // We attempt to match based on the filename of the imported file. This works
  // as the files have a date in their filename.
  // e.g. '2018-07-04T083557Z_1_KBN1JU0WQ_RTROPTC_0_US-CHINA-AUTOS-GM.XML'
  foreach ($config['feeds_fetcher_directory_fetcher']['feed_files_fetched'] as $key => $source) {
    if (multi_strpos($key, $cutoff_date) === FALSE) {
      unset($config['feeds_fetcher_directory_fetcher']['feed_files_fetched'][$key]);
    }
  }

  watchdog('FeedSource records - After trimmed at ' . time(), count($config['feeds_fetcher_directory_fetcher']['feed_files_fetched']));

  // Save back to the database.
  db_update('feeds_source')
    ->fields([
      'config' => serialize($config),
    ])
    ->condition('id', 'po_feeds_reuters_photo', '=')
    ->execute();
}

Before the code ran, there were more than 450,000 items in the array; after, we were below 100. A massive decrease in database size.

More importantly, the importer now runs a lot quicker (as it is not scanning the shared filesystem for non-existent files).

Aug 16 2019

After a couple of months off, SC DUG met this month with a presentation on super cheap Drupal hosting.

Chris Zietlow from Mindgrub, Will Jackson from Kanopi Studios, and I all gave short talks on very cheap ways to host Drupal 8.

(Video embedded in the original post.)

Chris opened by talking about using AWS Micro servers. Will shared a solution using a Raspberry Pi for a fully wireless server. I closed the discussion with a review of using Drupal Tome on Netlify.

We all worked from a loose set of rules to help keep us honest and prevent overlapping:

Rules for Cheap D8 Hosting Challenge

The goal is to figure out the cheapest D8 hosting that would actually function for a project, even if it is deeply irresponsible to actually use.

Rules

  1. It has to actually work for D8 (so modern PHP version, working database, etc),
  2. You do not actually have to spend the money, but you do need to know all the steps required to make it work.
  3. It needs to honor the TOS for any networks and services you use (no illegal network taps – legal hidden taps are fair game).
  4. You have to share your idea with the other players so we don’t have two people propose the same solution (first-come-first-serve on ideas).

Reporting

Be prepared to talk for about 5 minutes on how your solution would work.  Your talk needs to include:

  1. Estimated Monthly cost for the first year.
  2. Steps required to make it work.
  3. Known weaknesses.

If you have a super cheap hosting solution for Drupal 8 we’d love to hear about it.

Aug 13 2019

Back in April, BigCommerce, in partnership with Acro Media, announced the release of the BigCommerce for Drupal module. This module effectively bridges the gap between the BigCommerce SaaS ecommerce platform and the Drupal open source content management system. It allows Drupal to be used as the frontend customer experience engine for a headless BigCommerce ecommerce store.

For BigCommerce, this integration provides a new and exciting way to utilize their platform for creating innovative, content-rich commerce experiences that were not possible via BigCommerce alone.

For Drupal, this integration extends the options its users and site-builders have for adding ecommerce functionality into a Drupal site. The flexibility of Drupal combined with the stability and ease-of-use of BigCommerce opens up new possibilities for Drupal that didn’t previously exist.

Since the announcement, BigCommerce and Acro Media have continued to educate and promote this exciting new headless commerce option. A new post published last week on the BigCommerce blog, titled Leverage Headless Commerce To Transform Your User Experience with Drupal Ecommerce (link below), is a recent addition to this information campaign. The BigCommerce teams are experts in what they do, and Acro Media is an expert in open source integrations and Drupal. They asked if we could provide an introduction for their readers to really explain what Drupal is and where it fits into the headless commerce mix. This, of course, was an opportunity not to be missed, and so our teams buckled down together once again to provide readers with the best information possible.

So without further explanation, click the link below to learn how you can leverage headless commerce to transform your user experience with Drupal.

Read the full post on the BigCommerce blog

Additional resources:

Aug 13 2019

Drupal Tome is a static site generator distribution of Drupal 8. It provides mechanisms for taking an entire Drupal site and exporting all the content to HTML for direct service. As part of a recent competition at SCDUG to come up with the cheapest possible Drupal 8 hosting, I decided to do a proof-of-concept level implementation of Drupal 8 with Docksal for local content editing, and Netlify for hosting (total cost was just the domain registration).

The Tome project has directions for setup with Docker, and for setup with Netlify, but they don’t quite line up with each other (I followed the Docker instructions, then the Netlify set, but had to chart my own course to get the site from the first project linked to the repo in the second). Since I’m getting used to using Docksal, when I had to fall back and do a bit of it myself I realized it was almost painfully easy to set up.

The first step was to go to the Tome documentation for Netlify and set up an account and a site from the template. There is a button in those directions to trigger the Netlify setup; I’ve added one here as well (but if this one fails, check to see if they updated theirs):

Deploy to Netlify

Login with Github or similar service, and let it create a repo for your project.

Follow Netlify’s directions for setting up DNS so you can have the domain you want, and HTTPS (through Let’s Encrypt). It took it a couple hours to get that detail to run right, but it eventually worked. For this project I chose a subdomain of my main blog domain: tome-netlify.spinningcode.org

Next go to GitHub (or whatever service you used) and clone the repository to your local machine. There is a generated README on that project, but the directions aren’t 100% correct if you aren’t cloning onto a machine with a working PHP environment. This is when I switched over to Docksal and ran the following series of commands:

fin init
fin composer install
fin drush tome:install
fin drush uli

Then log into your local site using the domain from docksal and the link from drush, and add some content.

Next we export the content from Drupal to send over to Netlify for deployment.

fin drush tome:static
git add .
git commit -m "Adding sample content"
git push

…now we wait while Netlify notices and builds the site…

If you look at the site a few minutes later the new content should be posted.

This is all well and good if I want to use the version of the site generated for the Netlify example, but I wanted to make sure I could do something more interesting. These days Drupal ships with an install profile called Umami that provides a more robust sample site than the more traditional Standard install.

So now let’s try to get Umami onto this site. Go back to the terminal and have Tome reset everything (it’ll warn you that you are about to nuke everything):

fin drush tome:init

…select Umami when it asks for a profile…and wait cause this takes a while…

Now just re-export the content and push it to your repo.

fin drush tome:static
git add .
git commit -m "Converting to Unami"
git push

And wait again, cause this also takes a few minutes…

The Umami home page on my subdomain hosted at Netlify.

That really was all that was involved for a simple site. You can see my repository on GitHub if you want to see everything that was generated along the way.

The whole process is pretty straightforward, but there are a few things that it helps to understand.

First, Netlify is actually regenerating the markup on their servers with this approach. The Drupal nodes and other entities are saved as JSON and then imported during the build. This makes the process reliable, but slow. Umami takes several minutes to deploy since Netlify is installing and configuring Drupal, loading the content, and generating the output. The build command provided in that template is clear enough to follow if you are familiar with Composer projects:

command = "composer install && ./vendor/bin/drush tome:install -y && ./vendor/bin/drush tome:static -l $DEPLOY_PRIME_URL" 

One upside of this, is that you can use a totally unrelated domain for your local testing and have it adjust correctly to the production domain. When you are using Netlify’s branching workflow for managing dev, test, and production it also protects your work that way.

My directions above load a standard Docksal container, which includes MySQL, because that’s quick and easy, but Tome falls back to using an SQLite database since it can be more confident that one is there. Again, this is reliable but slow. If I were going to do this on a more complete project, I’d want a smaller Docksal setup or would switch to using MySQL locally.

A workflow based on this approach might also struggle with concurrent edits or complex configuration on large sites. It would probably make more sense to have the content created on a hidden, but traditional, server and then run through a different workflow. But for someone working on a series of small sites that are rarely updated, it allows a totally temporary instance of the site to be rapidly deployed to a device, have its content updated, be pushed out to production, and then be deleted locally until needed again.

The final detail to note is that there is no support for forms built into this solution. Netlify has support for that, and Tome has a module that claims to connect to that service, but I wasn’t able to quickly determine how to get it connected. I am confident there are solutions to this problem, but it is something that would take a little additional work.

Aug 08 2019

For most Drupal projects, patches are inevitable. It’s how we, in the Drupal community, share code. If that scares you, don’t worry: the community is working hard to move to a pull/merge request workflow. Due to the collaborative nature of Drupal as a thriving open source community and the ever-growing ecosystem of contrib modules, patches are the glue that can hold a site together.

Before Drupal 8, you may have seen projects use drush make, a Drupal-specific solution. As part of the “get off the island” movement, Drupal adopted the existing dependency manager Composer. Composer does a decent job of alleviating the headaches of managing several sites with different dependencies. However, out of the box, Composer will revert patched core files and contrib modules, and it is for that reason that the composer-patches project was created. In this blog post, we will review how to set up composer-patches for a Composer-managed project and how to specify locally or remotely hosted patches.

The setup

In your favorite command line tool, you will want to add the composer-patches project:

composer require cweagans/composer-patches:~1.0 --update-with-dependencies

With this small change, your project is now set up for success because Composer can manage your patches.

Local patches

Sometimes you will find that you need to patch contrib or core specifically for your project, and therefore the patch exists locally. Composer-patches can apply that patch for you; we just need to tell it where it is. Let’s look at an example project that has a core patch applied and saved locally in the project root at ‘patches/core-invalid-config-structures.patch’:
    ...
    "extra": {
      "patches": {
        "drupal/core": {
          "Core Invalid config structures ":"patches/core-invalid-config-structures.patch"
        }
      }
    }

In your composer.json, you will want to add an “extra” section if it doesn’t already exist.  Composer-patches will take the packages listed in “patches” and try to apply any listed patches. In our above example, the package we are patching is “drupal/core”. Patches are declared as follows:

“Patch description”: “path to patch file”

This information will be printed on the command line while Composer tries to update the package, which makes it important to summarize the patch’s purpose well. If you would like to see what this looks like in the wild, take a look at our distribution Rain, which leverages a couple of contrib patches.

After manually updating composer.json, it is always a good idea to run composer validate to confirm the JSON syntax is right. If you get the green success message, run composer update drupal/[projectname], e.g. composer update drupal/core, to have the patch applied.

You will know that the patch is applied based on the output:

(Screenshot: patch output)

As you can see, the package being patched is removed, added back, and the patch is applied.

Note: Sometimes I feel like I have to give Composer a nudge. Feel free to delete /core, /vendor, or /modules/contrib, but if you delete composer.lock, know that your dependencies could update based on your constraints. composer.json tracks our package dependencies at certain version constraints, while composer.lock is the recipe of computed versions based on those constraints. I have found myself running the following:

rm -rf core && rm -rf modules/contrib && rm -rf vendor
composer install

Remote Patches

When possible, we should open issues on Drupal.org and post patches there. That way, the community can work together to solve a problem, and usually you’ll get a more reliable, lasting solution. Think about it this way: would you rather have only you or your team review a critical patch to your project, or hundreds of developers?

To make composer-patches grab a remote patch make the following changes:
    ...
    "extra": {
      "patches": {
        "drupal/core": {

          "#2925890-10: Invalid config structures ":"https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }
      }
    } 

The only change here is that rather than the path to the local patch, we have substituted the URL of the patch. You will see a similar success message when it is applied correctly:

(Screenshot: remote patch output)

Tips 

So far, I’ve shown you how to get going with the composer-patches project, but there are a lot of settings and plugins that can elevate your project. A feature I turn on for almost all sites is exit on patch failure, because it is a big deal when a patch fails. If you too want to turn this feature on, add the following line to the “extra” section of your composer.json:

"composer-exit-on-patch-failure": true,

I have also found it helpful to add a link back to the original issue in the composer.json patch declaration. Imagine working on a release when one of your patches fails, but the only reference you have to the issue is the patch file URL. It is times like these that a link to the issue can make your day. If we made the same change to our earlier example, it would look like the following:

 "drupal/core": {
          "#2925890-10: Invalid config structures (https://www.drupal.org/project/drupal/issues/2925890)" : "https://www.drupal.org/files/issues/2018-09-26/2925890-10.patch"
        }

Conclusion

Composer-patches is a critical package for any Drupal project managed by Composer. In this blog I showed you how to get started with the project and some of the tips and tricks I’ve learned along the way. How does your team use composer-patches? Do you have a favorite setting that I didn’t mention? Feel free to drop a comment and share what works for you and your team.

Aug 07 2019

Decoupled Drupal 8 and GatsbyJS webinar

How did the City of Sandy Springs, GA improve information system efficiency with a unified platform? Join our webinar to see how we built this city on decoupled Drupal 8, GatsbyJS, and Netlify.

We'll explore how a “build-your-own” software approach gives Sandy Springs the formula for faster site speed and the ability to publish messages across multiple content channels — including new digital signage.

What You'll Learn

  • The City of Sandy Springs’ challenges and goals before adopting Drupal 8 

  • How Sandy Springs manages multi-channel publishing across the website, social media, and a network of digital signage devices.

  • Benefits gained from Drupal 8 and GatsbyJS, including a fast, reliable site, hosting cost savings, and ease of development for their team.

Speakers

Jason Green, Visual Communications Manager at City of Sandy Springs, and Mediacurrent Director of Front End Development Zack Hawkins share an inside look at the project.

Registration

Follow the City of Sandy Springs on the path to government digital innovation.  Save your seat today!

Aug 05 2019

Using the cloud is about leveraging its agility, among other benefits. For a Drupal-powered website, the right service provider can impact how well the website performs and can affect business revenue.

When a robust server infrastructure such as AWS backs the most advanced CMS, Drupal, it accelerates the website’s performance, security, and availability.

But why AWS, and what benefits does it offer over others? Let’s dive deep to understand how it proves to be the best solution for hosting your Drupal websites.

Points To Consider For Hosting Drupal Websites

The following are the points to keep in mind while considering providers for hosting your pure or headless Drupal website.

Better Server Infrastructure: A Drupal-specialised cloud hosting provider should offer a server infrastructure that is specifically optimized for running Drupal websites the way they were designed to run.

Better Speed: It should help optimise the Drupal website to run faster and should be able to use caching tools such as Memcached, Varnish, etc.

Better Support: The provider should offer hosting support backed by real knowledge of Drupal websites.

Better Security and Compatibility: The hosting provider should be able to provide security notifications, server-wide security patches, and even pre-emptive server upgrades to handle nuances in upcoming Drupal versions.

Why not a traditional server method?

There are two ways of hosting a Drupal website via traditional server setups:

  • a shared hosting server, where multiple websites run on the same server
  • or a dedicated Virtual Private Server (VPS) per website.

However, there are disadvantages to this approach, which are:

  1. With a lot of non-redundant, single-instance services running on the same server, a crash in any one component can take the entire site offline.
  2. Such a server does not scale up or down automatically; it requires manual intervention to change the hardware configuration and may go down under an unexpected traffic boost.
  3. The setup constantly runs at full power irrespective of usage, wasting resources and money.

Hosting Drupal on AWS

Amazon Web Services (AWS) is a pioneer of the cloud hosting industry, providing hi-tech server infrastructure that has proved to be highly secure and reliable.
With serverless computing, developers can focus on their core product instead of worrying about managing and operating servers or runtimes, whether in the cloud or on-premises. It eliminates infrastructure management tasks such as server or cluster provisioning, patching, operating system maintenance, and capacity provisioning, and it lets you build modern applications with increased agility, lower total cost of ownership, and faster time-to-market.

With serverless computing being the fastest-growing cloud trend, at an annual growth rate of 75% and foreseen to be adopted at an even higher rate, let’s understand the significance of the AWS components in the Virtual Private Cloud (VPC). Each of these components shows why it is the right choice for hosting pure or headless Drupal websites.

Architecture diagram showcasing Drupal hosting on AWS

  •  Restrict connection: NAT Gateway

Network Address Translation (NAT) gateway enables instances in a private subnet to connect to the internet or other AWS services. Hence, the private instances in the private subnet are not exposed via the Internet gateway, instead, all the traffic is routed via the NAT gateway. 

The gateway ensures that the site will always remain up and running. AWS takes over the responsibility of its maintenance.

  • Restrict access: Bastion Host
A bastion host protects the system by restricting access to backend systems in protected or sensitive network segments. Its benefit is that it minimises the chances of a potential security attack.
  • Database: AWS Aurora

The Aurora database provides invaluable reliability and scalability, better performance and response times. With fast failover capabilities and storage durability, it minimizes technical obstacles.

  • Upload content: Amazon S3

With Amazon S3, you can store, retrieve, and protect any amount of data at any time in a scalable storage bucket. Recover lost data easily, pay only for the storage you actually use, protect data from unauthorized use, and easily upload and download your data with SSL encryption.

  • Memcached/Redis: AWS ElastiCache
ElastiCache is a web service that makes it easy to set up, manage, and scale a distributed in-memory data store in the cloud (see the settings.php sketch after this list).
  • Edge Caching: AWS CloudFront
CloudFront is an AWS content delivery network which provides a globally-distributed network of proxy servers which cache content locally to consumers, to improve access speed for downloading the content.
  • Web servers: Amazon EC2

Amazon EC2 is a web service that provides secure, resizable compute capacity in the cloud.

Amazon Route 53 effectively connects user requests to infrastructure running in AWS and can also be used to route users to infrastructure outside of AWS.
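
As a hedged sketch of the ElastiCache piece from the Drupal side (this assumes the contrib Redis module is installed; the endpoint hostname is illustrative):

<?php

// settings.php: a minimal sketch pointing Drupal's cache at an ElastiCache
// Redis endpoint via the contrib Redis module. The hostname is illustrative.
$settings['redis.connection']['interface'] = 'PhpRedis';
$settings['redis.connection']['host'] = 'my-cache.abc123.use1.cache.amazonaws.com';
$settings['redis.connection']['port'] = 6379;

// Route Drupal's default cache bins to Redis.
$settings['cache']['default'] = 'cache.backend.redis';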

Benefits of Hosting Drupal Website on AWS

Let’s look at the advantages of AWS for hosting pure or headless Drupal websites.

High Performing Hosting Environment

The kind of performance you want from your server depends upon the type of Drupal website you are building. A simple website with a decent amount of traffic can work well on a limited shared host platform. However, for a fairly complex interactive Drupal site, a typical shared hosting solution might not be feasible. 

Instead, opt for AWS, which provides server capacity billed as per your usage.

Improved Access To Server Environment 

A shared hosting environment restricts users from gaining full control, limits their ability to change configurations for Apache or PHP, and may cap bandwidth and file storage. These limitations are lifted only when you’re willing to pay a higher premium for advanced-level access and hosting services.

This is not true with AWS, which gives you direct control over your server instances, with permission to SSH in or use its control panel to adjust settings.

Control over the infrastructure 

Infrastructure needs do not remain constant and are bound to change with time. With traditional hosting, adding or removing resources can prove difficult or even impossible, and you end up paying for unused resources.

With AWS, you pay only for the services you use and can shut them off easily if you don’t need them anymore. On-demand virtual hosting and a wide variety of services and hardware types make AWS convenient for anyone and everyone.

No Long-term Commitments

If you are hosting a website just to gauge performance and responsiveness, you probably would not want to allocate a whole bunch of machines and resources for a testing project that might be over within a week or so.

The convenience of AWS on-demand instances means that you can spin up a new server in a matter of minutes, and shut it down (without any further financial cost) in just as much time.

Avoid Physical Hardware Maintenance

The advantage of using virtual resources is avoiding having to buy and maintain physical hardware.

Going with virtually hosted servers on AWS helps you focus on your core competency, creating Drupal websites, and frees you from dealing with data center operations.

Why Choose Srijan?

Srijan’s team of AWS professionals can help you migrate your website to the AWS cloud. With expertise in building Drupal-optimised hosting environments on AWS for reliable enterprise-level hosting, we can help you implement various AWS capabilities as per your enterprise’s requirements. Drop us a line and let our experts explore how you can get the best of AWS.

Aug 03 2019

Approaching 20 years old, the Drupal Community must prioritize recruiting the next generation of Drupal Professionals

Kaleem Clarkson

(Image: Ferris wheel in Centennial Olympic Park in Atlanta, Georgia)

Time flies when you are having fun. That is one of those sayings I remember my parents using that turned out to be quite true. My first Drupal experience was nearly 10 years ago, and in the blink of an eye we have seen enormous organizations adopt and commit to Drupal, such as Turner, the Weather Channel, The Grammys, and Georgia.gov.

Throughout the years, I have been very fortunate to meet a lot of Drupal community members in person but one thing I have noticed lately is that nearly everyone’s usernames can be anywhere between 10–15 years old. What does that mean? As my dad would say, it means we are getting O — L — D, old.

For any thriving community, family business, organization, or even your favorite band for that matter, all of these entities must think about succession planning. Succession what, what is succession planning?

Succession planning is a process for identifying and developing new leaders who can replace old leaders when they leave, retire or die. -Wikipedia

That's right, we need to start planning a process for identifying who can take over the leadership roles that continue to push Drupal forward. If we intend to keep Drupal a viable solution for large and small enterprises, then we should market Drupal to the talent pool as a viable career option and lure new talent to our community.

There are many different ways to promote our community and develop new leaders. One of them is mentorship. Mentorship helps lower the barrier to entry into our community by providing guidance on how our community operates. Our community does have some great mentoring efforts taking place, such as the Drupal Diversity & Inclusion (DDI) initiative, the core mentoring initiative and, of course, the code and mentoring sprints at DrupalCon and DrupalCamps. These efforts are awesome and should be recognized as part of a larger strategic initiative to recruit the next generation of Drupal professionals.

Companies spend billions of dollars a year on recruiting, but as an open-source community, we don't have billions, so

… what else can we do to attract new Drupal career professionals?

This year, the Atlanta Drupal Users Group (ADUG) decided to develop the Drupal Career Summit, all in an effort to recruit more professionals into the Drupal community. Participants will explore career opportunities, career development, and how open source solutions are changing the way we buy, build, and use technology.

  • Learn about job opportunities and training.
  • Hear how local leaders progressed through their careers and the change open source creates for their clients and businesses.
  • Connect one-on-one with professionals in the career you want and learn about their progression, opportunities, challenges, and wins.

Saturday, September 14, from 1pm to 4:30pm. Hilton Garden Inn Atlanta-Buckhead, 3342 Peachtree Rd. NE | Atlanta, GA 30326 | LEARN MORE

Students and job seekers can attend for FREE! The Summit will allow you to meet with potential employers and industry leaders. We'll begin the summit with a panel of marketers, developers, designers, and managers who have extensive experience in the tech industry and, more specifically, the Drupal community. You'll get a chance to learn about career opportunities and connect with peers with similar interests.

We’re looking for companies that want to hire and educate. You can get involved with the summit by becoming a sponsor for DrupalCamp Atlanta. Sponsors of the event will have the opportunity to engage with potential candidates through sponsored discussion tables and branded booths. With your sponsorship, you’ll get a booth, a discussion table, and 2 passes! At your booth, you’ll get plenty of foot traffic and a fantastic chance to network with attendees.

If you can't physically attend our first Career Summit, you can still donate to our fundraising goals. And if you are not in a position to donate, invite your employer, friends, and colleagues to participate. Drupal Career Summit.

Aug 01 2019
Aug 01

We’re thrilled to be attending Drupal Camp Pannonia from the 1st to 3rd August!

Ratomir and Dejan from the CTI Serbia office will be attending Drupal Camp to discover the biggest breakthroughs of the Open Source platform in 2019.

Grand-terrace-palic

The Grand Terrace in Palić, where Drupal Camp Pannonia is held.

 

Held in Palić, Serbia, near the Hungarian border, Drupal Camp Pannonia brings together some of Europe's top Drupal experts for a weekend of knowledge sharing and Open Source collaboration.

CTI Digital has long worked with a global roster of clients serving international customers. In early 2019, we made a permanent investment in Europe and opened a Serbia office. 

We selected Serbia as the Drupal community is growing quickly and filled with world-class talent. Supporting Drupal Camp Pannonia is a crucial part of our investment in the European Drupal community. If you’d like to chat with Dejan or Ratomir about CTI Digital or Drupal, please email [email protected] to set up a meeting.

Ratomir-and-Dejan-1-1

Dejan (Left) and Ratomir (Right)

We're pleased to announce that Dejan (@dekisha007) is also conducting a session on Day 1, Friday 1st August, at 15:45. He'll be delving into a new front-end practice we have developed at CTI Digital using Pattern Lab.

Here's what to expect:

  • Pattern Lab and Atomic Design in Drupal
  • Tailwind CSS
  • Advantages of Vue.js over React in Drupal
  • Wrapping all of the above up with CTI’s custom base theme

Be sure to catch Dejan’s talk on day 1, along with a host of brilliant sessions and workshops by checking out the online schedule.

Or follow @dcpannonia and @CTIDigitalUK for all the action on Twitter.

Jul 31 2019
Jul 31

Pantheon is an excellent hosting service for both Drupal and WordPress sites. But to make their platform work and scale well, they have built a number of limits into the platform. These include process time and memory limits that are large enough for the vast majority of projects but can, from time to time, run you into trouble on large jobs.

For data loading and updates, their official answer is typically to copy the database to another server, run your job there, and copy the database back onto their server. That's fine if you can afford to freeze updates to your production site, set up a process to mirror changes into your temporary copy, or accept some other project overhead that can be limiting and challenging. But sometimes that's not an option, or the data load takes too long for it to be practical on a regular basis.

I recently needed to do a very large import of records into a Drupal database, so I started to play around with solutions that would let me work around those limits. We were looking at about 50 million data writes, and the running time was initially over a week to complete the job.

Since Drupal's batch system was created to solve this exact problem, it seemed like a good place to start. For this solution you need a file you can load and parse in segments, like a CSV file, which you can read one line at a time. It does not have to represent the final state; you can use this to actually load data if the process is quick, or you can serialize each record into a table or a queue job to process later.

One quick note about the code samples: I wrote these based on the service-based approach outlined in my post about batch services and the batch service module I discussed there. It could be adapted to a more traditional batch job, but I like the clarity the wrapper provides for breaking this back down for discussion.

The general concept here is that we upload the file and then progressively process it from within a batch job. The code samples below provide two classes to achieve this. First is a form that provides a managed file field, which creates a file entity that can be reliably passed to the batch processor. From there the batch service takes over and, using a bit of basic PHP file handling, loads the file into a database table. If you need to do more than load the data into the database directly (say, create complex entities or handle other tasks) you can set up a second phase to run through the values and do that heavier lifting.

To get us started the form includes this managed file:

   $form['file'] = [
     '#type' => 'managed_file',
     '#name' => 'data_file',
     '#title' => $this->t('Data file'),
     '#description' => $this->t('CSV format for this example.'),
     '#upload_location' => 'private://example_pantheon_loader_data/',
     '#upload_validators' => [
       'file_validate_extensions' => ['csv'],
     ],
   ];

The managed file form element automagically gives you a file entity, and the value in the form state is the ID of that entity. The file will be temporary and have no references once the process is complete, so depending on your site setup it will eventually be purged. All of which means we can pass the values straight through to our batch processor:

$batch = $this->dataLoaderBatchService->generateBatchJob($form_state->getValues());
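If you are not using a batch service wrapper like the one from that earlier post, the rough equivalent with the core Batch API would look something like this (a sketch with illustrative callback names, not the wrapper's actual API; batch callbacks are serialized between requests, so static callables are the safe choice):

$batch = [
  'title' => $this->t('Processing uploaded data file'),
  'operations' => [
    // Each operation is a callable plus an array of arguments.
    [[static::class, 'processData'], [$form_state->getValue('file')]],
  ],
  'finished' => [static::class, 'processDataFinished'],
];
batch_set($batch);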

When the data file is small enough, a few thousand rows at most, you can load them all right away without the need for a batch job. But that runs into both time and memory concerns, and the whole point of this exercise is to avoid those. With this approach we can ignore those limits, and we're only bounded by Pantheon's upload file size. If the file size is too large, you can upload the file via SFTP and read it directly from there; so while the managed file is an easy way to load the file, you have other options.

As we set up the file for processing in the batch job, we really need the file path, not the ID. The main reason to use a managed file is that Drupal can reliably resolve the file's path on a Pantheon server without us needing to know anything about where files are stashed. Since we're about to use generic PHP functions for file processing, we need that path reliably:

// The form gives us the managed file ID; load the entity from it.
$fid = array_pop($data['file']);
$fileEntity = File::load($fid);
$ops = [];

if (empty($fileEntity)) {
  $this->logger->error('Unable to load file data for processing.');
  return [];
}

// Resolve the real path so plain PHP file functions can work with it.
$filePath = $this->fileSystem->realpath($fileEntity->getFileUri());
$ops = ['processData' => [$filePath]];

Now we have a file, and since it's a CSV we can load a few rows at a time, process them, and then start again.

Our batch processing function needs to track two things in addition to the file: the header values and the current file position. So on the first pass we initialize the position to zero and load the first row as the header. For every pass after that, we need to find the point where we left off. For this we use generic PHP file functions for loading and seeking to the current location:

// Old-school file handling: reopen the file and seek to where we left off.
$path = array_pop($data);
$file = fopen($path, "r");
...
fseek($file, $filePos);

// Each pass we process 100 lines; if you have to do something complex
// you might want to reduce the batch size.
for ($i = 0; $i < 100; $i++) {
  $row = fgetcsv($file);
  if (!empty($row)) {
    // Key the row values by the header names captured on the first pass.
    $data = array_combine($header, $row);
    $rowData = [
      'col_one' => $data['field_name'],
      'data' => serialize($data),
      'timestamp' => time(),
    ];
    $row_id = $this->database->insert('example_pantheon_loader_tracker')
      ->fields($rowData)
      ->execute();

    // If you're setting up for a queue you include something like this.
    // $queue = $this->queueFactory->get('example_pantheon_loader_remap');
    // $queue->createItem($row_id);
  }
  else {
    break;
  }
}
// Remember where we stopped and report progress as a fraction of the file.
$filePos = (float) ftell($file);
$context['finished'] = $filePos / filesize($path);
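For reference, here is a minimal sketch of how the first pass might capture the header and starting position before the loop above runs. The sandbox keys are illustrative; adapt them to however your batch context tracks state between passes:

// First pass only: read row one as the header and note where data begins.
if (!isset($context['sandbox']['file_pos'])) {
  $file = fopen($path, 'r');
  $context['sandbox']['header'] = fgetcsv($file);
  $context['sandbox']['file_pos'] = ftell($file);
  fclose($file);
}
$header = $context['sandbox']['header'];
$filePos = $context['sandbox']['file_pos'];
// After the processing loop, store the new offset for the next pass:
// $context['sandbox']['file_pos'] = $filePos;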

The example code just dumps this all into a database table. This can be useful as a raw data loader if you need to add a large data set to an existing site for reference data or something similar. It can also be used as the base for creating more complex objects. The example code includes comments about generating a queue worker that could then run over time on cron or as another batch job; the Queue UI module provides a simple interface to run those as a batch job.

I've run this process for several hours at a stretch. Pantheon does hit system errors if left to run a batch job for extreme durations (I ran into problems on some runs after 6-8 hours of run time), so a prep load into the database followed by a queue, or something else that's easy to restart, has been more reliable.

View the code on Gist.

Jul 31 2019
Jul 31

It's that time of the month again! The time when we express our thanks to those Drupal teams who've generously (and altruistically) shared valuable free content with us, the rest of the community. Content with an impact on our own workflow and our Drupal development process. In this respect, as usual, we've handpicked 5 Drupal blog posts, from all those that we've bookmarked this month, that we've found most “enlightening”.

From:
 

  • Drupal 8 site-building best practices
  • to valuable built-in tools for image optimization in Drupal
  • to altruistically shared tips on how to enforce coding standards in Drupal
  • to best modules for implementing social login functionality into one's Drupal website
  • to... specific metrics that one should prioritize when evaluating his/her website's marketing efforts
     

… the July edition of our “OPTASY favorites” series is, again, packed with high quality, and (most of all) useful content.

But, let's get straight to our selection: here are the 5 blog posts that we've added to our list of resources this month.

Integrating social login functionality into Drupal websites must be one of the most common tasks that we deal with as a development team.

So, having a list of modules designed precisely for this is so reassuring:
 

  • we save precious time
  • we know that it's a Drupal solution (not a different, third-party software component) that we integrate into our projects
     

So, the InternetDevels team has shared its list of Drupal 8 modules for social login just in time to... add to our bookmarks and resources & tools list.

Their blog post highlights 5 modules to evaluate first — to see if they fit a specific Drupal 8 web project's particularities — whenever we need to enable social login on the websites that we work on.

Our list of best practices for developing websites in Drupal 8 remains an open one. We keep on adding valuable suggestions on:
 

  • life-saving modules that we should be using, that we might have overlooked or underrated
  • new approaches to Drupal development that help us save valuable time and avoid embarrassing mistakes
     

That's why the ADCI Solution's team blog post on the best practices to adopt and to stick to when building a website in Drupal 8 couldn't have escaped our “radar”.

From:
 

  • module recommendations for SEO, security, maintenance and performance issues
  • to useful tips on how to address the editorial experience on the websites that we build
  • to what modules we should remove once we move a website from the dev environment to the live environment
     

… we've found a whole bunch of reasons why this post deserves its place in our top 5 Drupal blog posts from July.
 

Another one of those articles that we've “bumped into” precisely when we were looking for the solution that it presents.

In this case here we were looking to set up, document and enforce coding policy and procedures to ensure that our developers comply with the Drupal coding standards.

In their “revelatory” blog post, the Lullabot team puts the spotlight on a tool, GrumPHP, that checks your Drupal code before you commit it and detects any violation of the well-known Drupal coding standards.

Then they go on to anticipate and detail each challenge you might face when using this tool, and they even share their solutions for overcoming them.

Overall, GrumPHP is such an interesting discovery that we can't wait to leverage it in our next Drupal project.
 

Disclaimer: this is not a Drupal blog post, yet the tips included there are highly applicable to websites running on Drupal, as well.

For us, it's been more of a recap of all those key metrics to use for evaluating our marketing strategies.

And it's convenient to have all 6 of them listed in one post that we can keep at hand and use whenever we need to check whether our marketing efforts do have an impact.

From:
 

  • time on site
  • to number of pages visited
  • to traffic sources
     

… plus 3 other crucial metrics for us to turn into the main criteria when assessing our marketing strategy's efficiency, the article's such a welcome “reminder”.

A handy resource that we've added to our list of 5 favorite Drupal blog posts from July.
 

Optimizing images is at the top of our list of performance-boosting techniques. Therefore, the WishDesk post on the Drupal 8 built-in tools designed precisely for this purpose came in more than handy.

And not only do they outline all the image optimization features that Drupal 8 provides us with, right out of the box, but they:

  • go into the details of the whole process and its particularities in this version of Drupal
  • stress the crucial importance of optimizing one's images for SEO and performance purposes
     

The END!

These are the 5 Drupal blog posts of the month which, in our opinion, shared the most useful and usable content.

Do you have your own list of favorites, too?

Image by Gerd Altmann from Pixabay

Jul 30 2019
Jul 30

Testing integrations with external systems can sometimes prove tricky. Services like Acquia Lift & Content Hub need to make connections back to your server in order to pull content. Testing this requires that your environment be publicly accessible, which often precludes testing on your local development environment.

Enter ngrok

As mentioned in Acquia's documentation, ngrok can be used to facilitate local development with Content Hub. Once you install ngrok on your development environment, you can use the ngrok client to create an instant, secure URL to your local environment that allows traffic from the public internet to reach it. This works for integrations such as Content Hub testing, and it even lets someone remote view in-progress work on your local environment from anywhere in the world without the need for a screen share. You can also open the URL on mobile devices like your phone or tablet and easily test your local development work on other devices.

After starting the client, you’ll be provided the public URL you can plug into your integration for testing. You’ll also see a console where you can observe incoming connections.
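For reference, starting the client is a single command. Assuming your local Drupal site is served on port 8888 (the port here is just an example; use whatever your local stack listens on):

ngrok http 8888

ngrok then prints the public forwarding URL (something like https://<random-id>.ngrok.io) that you can plug into Content Hub or hand to anyone who needs to reach your local site.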

Resources

Jul 28 2019
Jul 28

PHP 7.3.0 was released in December 2018 and brings with it a number of improvements in both performance and the language. As always with Drupal, you need to strike a balance between adopting these new improvements early and running into issues that the community has not yet fixed.

Why upgrade PHP to 7.3 over 7.2?

What hosting providers support PHP 7.3?

All the major players have support; here is how you configure it for each.

Acquia

Somewhere around April 2019, Acquia released the option to choose PHP 7.3. You can opt into this version by changing a value in Acquia Cloud, on a per-environment basis.

The PHP version configuration screen for Acquia Cloud 

Pantheon

Pantheon has had support since April 2019 as well (see the announcement post). To change the version, you update your pantheon.yml file (see the docs on this).

# Put overrides to your pantheon.upstream.yml file here.
# For more information, see: https://pantheon.io/docs/pantheon-yml/
api_version: 1
php_version: 7.3
Example pantheon.yml file

On a side note, it is interesting that PHP 5.3 is still offered on Pantheon (it has been end of life for nearly 5 years).

Platform.sh

It is unclear exactly when Platform.sh released PHP 7.3, but the process to enable it is very similar to Pantheon's: you update your .platform.app.yaml file (see the docs on this).

# .platform.app.yaml
type: "php:7.3"
Example .platform.app.yaml file

Dreamhost

PHP 7.3 is also available on Dreamhost, and can be chosen in a dropdown in their UI (see the docs on this).

The 'Manage Domains' section of Dreamhost

Dreamhost also wins some kind of award for allowing the oldest version of PHP I have seen in a while (PHP 5.2).

When can you upgrade to PHP 7.3?

Drupal 8

As of Drupal 8.6.4 (6th December 2018), PHP 7.3 is fully supported in Drupal core (change record). I have been running PHP 7.3 with Drupal 8 for a while now and have seen no issues, and this includes running some complex installation profiles such as Thunder and Lightning.

Any Drupal 8 site that is reasonably up to date should be fine running PHP 7.3 today.

Drupal 7

PHP 7.3 support is slated for the next release of Drupal 7, Drupal 7.68 (see the drupal.org issue); however, there are a number of related tasks that seem like deal breakers. There are also no daily tests running Drupal 7 against PHP 7.3.

For the meantime, it seems best to hold off on the PHP 7.3 upgrade until 7.68 is out the door and contributed modules have had a chance to catch up with new stable releases.

A simple search on Drupal.org yields the following modules that look like they may need work (more are certainly possible):

  • composer_manager (issue)
  • scald (issue) [now fixed and released]
  • video (issue)
  • search_api (issue) [now fixed and released]

Most of the issues seem to be related to this deprecation: Deprecate and remove continue targeting switch (illustrated below). If you know of any other modules that have issues, please let me know in the comments.
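To illustrate, a contrived sketch of the deprecation: in PHP 7.3, a continue that targets a switch raises a warning because it actually behaves like break, not like continuing the loop.

foreach ([1, 2, 3] as $value) {
  switch ($value) {
    case 2:
      // Pre-7.3 code often wrote "continue;" here, which silently acted
      // like "break". PHP 7.3 warns about it; "continue 2" explicitly
      // continues the enclosing foreach instead.
      continue 2;

    default:
      echo $value, PHP_EOL;
  }
}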

Drupal 6

For all you die-hard Drupal 6 fans out there (I know of a few large websites still running it), you are going to be in for a rough ride. There is a PHP 7 branch of the d6lts GitHub repo, which is promising; however, the last commit was in September 2018, which does not bode well for PHP 7.3 support. I also doubt contributed modules are going to be up to scratch (drupal.org does not even list D6 versions of modules anymore).

To test this theory, I audited the current 6.x-2.x branch of Views:

$ phpcs -p ~/projects/views --standard=PHPCompatibility --runtime-set testVersion 7.3
................................................W.W.WW.W....  60 / 261 (23%)
................................E........................... 120 / 261 (46%)
...................................................EE....... 180 / 261 (69%)
............................................................ 240 / 261 (92%)
.....................                                        261 / 261 (100%)

Three errors in Views alone, and the errors are showstoppers too:

Function split() is deprecated since PHP 5.3 and removed since PHP 7.0; Use preg_split() instead
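The fix for this one is usually mechanical; a minimal sketch (the variable names are illustrative):

$value = 'one,two,three';

// Removed in PHP 7.0; fatals on any PHP 7.x:
// $parts = split(',', $value);

// The drop-in replacement suggested by the error message.
$parts = preg_split('/,/', $value);

// For a plain string delimiter, explode() avoids the regex entirely.
$parts = explode(',', $value);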

If this is the state of one of the most popular modules for Drupal 7, then I doubt the lesser known modules will be any better.

If you are serious about supporting Drupal 6, it would pay to get in contact with My Drop Wizard, as they are at least providing support for people looking to adopt PHP 7.

Jul 26 2019
Jul 26

Rain logo updated

The Mediacurrent team is excited to formally introduce Rain, an enterprise-grade Drupal install profile. Two years in the making, our team has spent countless hours developing these tools internally and deploying them to our client projects.

Our goal with this post is to share with the broader Drupal community, explain why we created Rain, and share how it can benefit your organization. We welcome feedback.

What is Rain? 

The Rain installation packages the best solutions the Drupal community has to offer so that organizations can build sites faster. We have used Rain internally for the last two years, making improvements along the way prior to its release as an open-source project.

Made by Mediacurrent

We designed the Rain install profile with the goal of capturing features that overlap all verticals and projects of every size and scope. As part of the development process, we examined features and content patterns from successful client projects over the past three years. Through our research, we found many content features and module configurations did, in fact, apply to projects across the spectrum. Our focus has been to build Rain with the features that create the best admin and authoring experience we can provide for our clients and the open-source community.

Mediacurrent believes strongly in open source solutions, so we have released all of our project tooling to the community. For more information and links to our projects, please visit the Mediacurrent Development Tools page.

Who is Rain for?

It’s a full-time job to simplify processes on the backend and deliver a consistent, high-impact customer experience. Mediacurrent created the Rain Install Profile to make these jobs easier. 

Clients using Rain today include large B2B enterprise, higher education, and non-profit organizations. The Rain distribution is flexible enough to serve large and small organizations alike. For large enterprise builds, the install profile will reduce overhead and take advantage of reusable features even if the project overall is a highly-customized implementation. Smaller, more budget-conscious organizations have the opportunity to reuse even more out-of-box functionality for further savings. The end result is a fully branded, enterprise-grade Drupal solution for your organization.

What makes the Rain Install profile different?

At DrupalCon this year, my co-worker Justin Rent and I presented on the idea of “Content Patterns.” This presentation gives a window into the challenges we were facing and our approach to addressing them. In short, we believe that to have long-term success as an agency, we need to pool together best practices and take advantage of repeatable patterns. While every project we work on is unique in some way, we also see many shared problems that could benefit from common-sense solutions. As an organization, we made a commitment to build tools that would address those problems and share them with the community.

We borrowed from other distributions, Acquia Lightning and Thunder CMS, which are both great projects in their own right. We did, however, add our own twist.

Our approach

1. Package and configure common modules

We package popular contributed modules like Metatag and many others that focus primarily on authoring and admin experience. Our installation comes with “sensible defaults” that offer a great out-of-box experience while saving developer and administrative time.

2. Offer flexible content features to jump-start development

Rain offers a variety of content features, including content types and paragraphs, not found in most distributions. These features are all optional and offer the opportunity to jump-start development. 

We believe our approach to content architecture gives editors a consistent, flexible interface for managing content. At the same time, it helps developers get going faster while being fully configurable. 

3. Ship with an extendable base theme and style guide

Rain ships with an excellent starter base theme. This offers several advantages but is not required (see our GatsbyJS + Rain webinar for a “decoupled” option).

The theme ships with a base style guide that includes the most common components you'll find on a website. These components provide basic styling and markup that save development time. The components are intended to be branded to match the design requirements of a given project. Additionally, all Paragraph features are pre-integrated with the base theme and style guide to once again save development time and reduce costs. The net result: the development team gets a running start on a new project instead of constantly reinventing the wheel.

Next steps

We would love to hear feedback from you. What features are your pain points, what features would you like to see? Interested in working with Mediacurrent? Contact us for a live Rain demo or for more information about our services. 

Jul 26 2019
Jul 26

The expanding data landscape is feeding the demand for higher operational agility. This calls for a more responsive, reliable IT infrastructure that doesn't rack up millions, minimizes delays and downtime, and improves security while making infrastructure more agile.

Amid capacity constraints and unpredictable pricing models, AWS offers workloads suited to growing infrastructure needs with a host of services (IaaS, PaaS, SaaS) for Drupal enterprises.

Here’s how you can run your Drupal up to 40% cheaper.

Keeping the Business Innovative and Agile

Demands for performance, scalability and agility have never been higher; business success and growth depend on them. At the same time, the changing landscape is forcing businesses to opt for lower costs, greater use of cloud resources and better customer service.

While these changes have implications on infrastructure, compute and networking resources, they also impact storage. 

Lack of enough database storage, for example, can adversely impact application performance. Fast-growing applications may need more storage than expected or immediate storage resources.

 

Capacity and storage issues can hinder business agility

 

The continuous need for speed and efficiency is driving businesses to opt for the storage as a service (STaaS) model. But there is more to it when it comes to the benefits. Businesses get:

  • Better insights at a reasonable cost: A highly scalable, low-cost environment capable of handling massive volumes and velocities of data lets organizations choose between the two available models (CapEx and OpEx) for more predictable costs.
  • Better collaboration: Cloud-based business solutions accelerate innovation, delivering business analytics at the point of impact and enabling collaboration by creating and linking business networks.
  • Innovation and variety of solutions: Forward-thinking enterprises adopt STaaS to speed up business innovation, improve overall data-centre efficiency, and achieve integrated and innovative business results.
  • Proven results: Organizations achieve their desired business outcomes by improving the responsiveness of their IT infrastructure without increasing risk or cost.

Capacity and storage issues can hinder your business agility. 

In order to avoid such challenges in the future, Drupal-powered enterprises need to constantly understand and adapt to the changing landscapes.

Azim Premji Foundation, Georgia Technology Authority and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS

While Drupal helps balance rapid data growth, the right cloud storage solution needs to offer security and robust scalability without constraining the budget, and prepare IT and marketing for what comes next.

Run your Drupal 40% cheaper

Choosing the right technology is crucial to avoid equipment failures and the costs of upgrading hardware. Small to medium enterprises and non-profits especially need sustainable solutions for future needs, so they can run their operations without overcommitting budgets today.

Finding the perfect match, organizations such as the Azim Premji Foundation, Georgia Technology Authority, UCAS and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS.

Enterprises need sustainable solutions without overcommitting budgets today

AWS offers cloud web hosting solutions that provide businesses, non-profits, and governmental organizations with low-cost ways to deliver their websites and web applications.

The pay-as-you-go approach lets you pay only for the individual services you need, for as long as you use them, without long-term contracts or complex licensing, much like you pay for utilities such as water and electricity.

You only pay for the services you consume, and once you stop using them, there are no additional costs or termination fees.

The pricing models give your enterprise the flexibility to grow your business unencumbered by IT:

  • Pay-as-you-go

With AWS you only pay for what you use, helping your organization remain agile, responsive, and always able to meet scale demands. You can easily adapt to changing business needs without overcommitting budgets, improving your responsiveness to change while reducing the risk of overprovisioning or missing capacity.

By paying for services on an as-needed basis, you can redirect your focus to innovation and invention, reducing procurement complexity and enabling your business to be fully elastic.

  • Save when you reserve

By using reserved capacity, organizations can minimize risks, more predictably manage budgets, and comply with policies that require longer-term commitments.

For certain services like Amazon EC2 and Amazon RDS, enterprises can invest in reserved capacity. With Reserved Instances, you can save up to 75% over equivalent on-demand capacity.

Drupal-AWS-Reserve-price

When you buy Reserved Instances, the larger the upfront payment, the greater the discount.

  • Pay less by using more

AWS provides volume-based discounts, so organizations save more by using more. For services such as S3 and data transfer OUT from EC2, pricing is tiered, meaning the more you use, the less you pay per GB.

In addition, data transfer IN is always free of charge.

As a result, as your AWS usage needs increase, you benefit from the economies of scale that allow you to increase adoption and keep costs under control.

As your organization evolves, AWS also gives you options to acquire services that help you address your business needs. For example, AWS's storage services portfolio offers options to help you lower pricing based on how frequently you access data and the performance you need when retrieving it.

Drupal-AWS-storage-price

To optimize the savings, choosing the right combinations of storage solutions can help reduce costs while preserving performance, security and durability.

The pricing models give your enterprise the flexibility to grow your business unencumbered by IT.

Case Study: Reducing cost & improving operational efficiency for Drupal application with AWS

Our client, a legal firm, provides jurisdictions and litigants simple, seamless, and secure access to the records of legal proceedings. They built a SaaS-based workflow management application on Drupal to manage and track digital recordings of legal proceedings and transcripts, including appeals, for the stakeholders.

The goal was to build a robust, cloud-based server to effectively handle the processing and access to a large volume of text, audio and video files.

Since the business model depended on frictionless uploading and downloading of text and media files, an AWS cloud-based server emerged as the unanimous choice.

Business benefits

  • Simplified integration of the client's Drupal application with AWS S3, to enable flexible, cloud-native storage
  • As a result of going all-in into the AWS Cloud, the client reduced costs by 40% and increased operational performance by 30-40%
  • Dynamic storage and pay-as-you-go pricing enabled the client to leverage a highly cost-effective cloud-storage solution

Read the complete case study on Cloud-Native Storage for Drupal Application with AWS

Get no-cost expert guidance

Designed to help you solve common problems and build faster, Amazon Web Services provides a comprehensive suite of solutions to secure and run your sophisticated and scalable applications.

Srijan is an AWS Advanced Consulting Partner. Schedule a meeting with our experts, with no cost and no sales pitch, and get started on your cloud journey.

Jul 25 2019
Jul 25

The ideas in this post were originally presented by Suzanne Dergacheva at Drupal North 2018.
 

If you've opted for Drupal, then you must be dealing with a large amount of content, right? Now, the question that arises is: how do you build out and structure a complex content architecture in Drupal 8?

You definitely don't run out of options when it comes to organizing your content:
 

  • content types
  • paragraph types
  • (custom) block types
  • custom fields
     

And things get even more complex when you start to consider all the various relationships between these entities. 

Now, let me help you structure this huge “pile” of different options, approaches and best practices for setting up an effectively organized content structure.
 

What Makes Drupal Ideal for Creating a Flexible Content Architecture?

One of Drupal's key selling points is that it ships with tools and workflows designed specifically to support a flexible content architecture.

And I'm talking here about its:
 

  • WYSIWYG editor
  • all the tools that streamline the content creation and publishing process
  • access control system based on user roles and permission levels
  • ecosystem of Drupal 8 content types (blocks, nodes, paragraphs, terms)
     

All these tools combined empower you to:
 

  • create any type of content (survey, landing page, blog entry...) nice and easy
  • control where and how that piece of content should be displayed on your website
  • categorize and structure your large amount of content using different content entity types
     

In short: Drupal 8 is built, from the ground up, to support a well-structured, yet highly flexible content architecture.
 

Step 1: Plan Out Your Content Architecture in Drupal 8: Identify the Needed Content Types 

A well-structured content architecture is, above all, a carefully planned out one. 

Start by analyzing your content wireframe to identify your content needs and to... fill in the blanks:
 

  • decide what content you need on your website, how/where it should be displayed and to whom it should be accessible
  • identify the various content entity types for each piece of content
  • set out all the fields that each content entity type requires
  • define your taxonomy term entities
     

 It's also that step where you gradually start to populate each category outlined in your content wireframe with the corresponding types of content.
 

Step 2: Set Out Your Well-Defined Content Types

I'm talking about those traditional, crystal-clear content types like articles or job postings, where the structure is pre-defined and it leaves no room for interpretation.

Those fixed content types that guarantee consistency across your website, that are easy to search and to migrate.

This is the step where you define each one of these content types' elements —  paragraphs, data, long text, images, etc. — and their order. 
 

Step 3: Set Out the Relationships Between Various Types of Content

Since you're dealing with a complex content structure, an intricate network of references between different pieces of content will be inevitable.

Now, it's time to set out all those explicit relationships between your node references and their referenced nodes, between term references and terms...

Note: needless to say, the implicit relationships will form by themselves; you have no control over those.
 

Step 4: Define the Multi-Purpose Content and the Reusable Pieces of Content

While at this phase, where you identify the content types that you'll need, remember to add the multi-purpose and the reusable content types, as well.

Speaking of multi-purpose content, that's the content type (e.g. the landing page) where you don't yet know what content it should include, or what order its content elements should be displayed in.

Therefore, you need to keep it flexible enough for future additions and modifications. In this respect, the Paragraphs module is the flexible page builder that you can rely on. The “secret” is to build your paragraph types — call to action, webform, view — along with the fields that they incorporate and to... leave them flexible for future updates.

Now, as for the reusable type of content, the best example is, again, that of a landing page with multiple reusable blocks that you can move around to your liking.

What you can do at this stage is to define your block types: image, view, call to action.
   

Step 5: Create Your Custom Entities and Custom Fields

While structuring a complex content architecture in Drupal 8 you'll inevitably need to create some custom entities and fields, as well.

With a large pile of content to deal with, there will be cases when the options that Drupal provides aren't suitable. For instance, you might need to define some special rules for a specific piece of content.

In this case, creating a custom entity is a must, but make sure you've carefully thought through all its potential use cases and specific workflows, and that you've invested enough time in prototyping it.

Also, you might find yourself in a situation where one of the fields needs to be stored or validated in a particular way. For instance, you might need to create a multi-value field. Since these scenarios call for a custom field, again, take your time to prototype it thoroughly.
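To make that last point concrete, here is a minimal, hypothetical sketch of a custom field type plugin; the module name, plugin ID and field semantics are all illustrative, not prescribed by anything above:

<?php

namespace Drupal\my_module\Plugin\Field\FieldType;

use Drupal\Core\Field\FieldItemBase;
use Drupal\Core\Field\FieldStorageDefinitionInterface;
use Drupal\Core\TypedData\DataDefinition;

/**
 * A hypothetical custom field type holding a single integer rating.
 *
 * @FieldType(
 *   id = "example_rating",
 *   label = @Translation("Example rating"),
 *   default_widget = "number",
 *   default_formatter = "number_integer"
 * )
 */
class ExampleRatingItem extends FieldItemBase {

  /**
   * {@inheritdoc}
   */
  public static function schema(FieldStorageDefinitionInterface $field_definition) {
    // One small integer column holds the rating value.
    return [
      'columns' => [
        'value' => ['type' => 'int', 'size' => 'small'],
      ],
    ];
  }

  /**
   * {@inheritdoc}
   */
  public static function propertyDefinitions(FieldStorageDefinitionInterface $field_definition) {
    $properties['value'] = DataDefinition::create('integer')
      ->setLabel(t('Rating value'));
    return $properties;
  }

  /**
   * {@inheritdoc}
   */
  public function isEmpty() {
    return $this->get('value')->getValue() === NULL;
  }

}

Custom validation constraints and widgets would be layered on top in the same module, which is exactly the kind of surface worth prototyping first.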
 

The END!

These are the main steps to properly structure your complex content architecture in Drupal 8. The golden rule should be: always leave some room for flexibility.

Photo by Alain Pham on Unsplash 

Jul 23 2019
Jul 23

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In the last developer tutorial, we covered how to customize and develop Drupal sites using Rain’s base install profile. This week, we will dive into theming to help frontend developers update the look and feel of the Rain starter theme.

As usual, if you have a question or comment feel free to reach out at @drupalninja

Sub-theme or clone?

The first thing to note is that you can use the Rain install profile with or without the included Rain theme package. The “rain_theme” project exists as its own Composer project which is included by default but can be easily removed. That said, there are benefits to using the Rain Theme as a base theme or starter. The primary benefit is that Paragraphs are integrated with a dozen or so pre-built components that can be easily customized and save you time. 

If you do choose to leverage the base theme, you need to decide whether to use Rain Theme as a “parent theme” or as a starter. We highly recommend the starter approach: this way, you gain full control over the theme and do not need to worry about downstream updates. Leveraging Rain Theme as a parent theme can cut down initially on files and code, but parent themes in Drupal can be restrictive and cumbersome at times. Ultimately, the decision is yours. With the “clone” approach you can grab everything and rename it, or grab only what you want. Anything you don't use can be discarded.

Using the style guide

The Rain Theme project includes a KSS-based living style guide that has a host of pre-built Twig components that can be integrated with Drupal theme templates.

KSS Style Guide

KSS style guide Example

In the previous step, we mentioned that we recommend cloning the theme so that you can fully control and customize any of these components. The idea is that developers waste less time rebuilding and re-styling the same common components on every new build. Instead, Drupal themers get a head-start while still having enough control of the theme to meet the design requirements of the project.

Working with the popular KSS node project is straightforward. For more information on how to compile and develop with KSS node, visit the project page at https://github.com/kss-node/kss-node.

Even if you are new to KSS, you will find making updates is easy. The “npm run build” command will compile all of your theme assets, as well as the style guide.

Component Integration

The main way components are integrated into Drupal templates is through includes found in Paragraph templates.

Rain Theme Paragraphs Twig Templates

Rain Theme Paragraphs Twig templates screenshot

Using the Components module to define the “@components” namespace, you can see an example below where field markup is passed in as parameters to the component Twig file. You will also see that it's in these templates where we typically attach a library (more on that in a bit). Of course, this is all yours to customize, and any time you add or modify fields you will need to adjust templates or components accordingly. JavaScript and CSS are for the most part encapsulated in their corresponding component, which keeps things organized. We do recommend you enable the “Twig debug” option in your Drupal services.yml to make it easy to find which templates are being used on any given page.

Rain theme quote paragraph template

Rain Theme Quote Paragraph template

Libraries

Out of the box, we have included many libraries that style components and include vendor libraries where appropriate. Note that these libraries reference compiled CSS and JavaScript found in the “dist” folder of the theme.

Rain theme libraries

Rain Theme Libraries screenshot

Additional Theming Helpers

In addition to the pre-configured templates, style guide and libraries included with Rain, we also ship a few helper modules to make theming easier. 

The “Twig Field Value” module makes it simpler to pull values out of fields, and the “Twig Tweak” module adds several utility methods. For a full list of those functions and more information, visit the “Cheatsheet” documentation page on Drupal.org. As mentioned earlier, the “Components” module is also included and enabled by default in order to allow namespaces to be defined in our theme.

Wrap-up

In this article, we showed frontend developers how to leverage the Rain base theme and style guide. Our recommendation is to clone the theme folder and customize our pre-built components to match your project’s unique requirements. We also covered how to integrate Drupal theme templates with components and define custom libraries. Finally, we covered a few helpful theming modules that ship with Rain and save development time.

In the next (and final) Rain tutorial, we will wrap up this series with a focus on content creation for authors.

Was this information helpful? Let us know! https://twitter.com/drupalninja/

Jul 22 2019
Jul 22

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In our previous article, we walked through each feature of the Rain Install Profile with tips on how to use and configure each feature. In today’s tutorial, we zoom in on some technical aspects that backend developers will need to understand to properly develop against the Rain distribution.

Have a question or comment? Hit me up at @drupalninja

Post-install configuration

The last article, What’s New in Rain 3.0, gives some basic instructions on how to install Rain successfully. The Drupal project template repository also gives additional details on how to fully set up the entire project, including the local environment. Here we will cover what developers should do next once they have successfully installed Rain for the first time.

Creating a sub-profile

By default, the Mediacurrent Drupal project template includes a core patch that allows for profiles to inherit features and configuration from a parent profile. This follows a similar pattern to base themes and sub-themes in Drupal. The project template also bundles with it a sample profile (“mis_profile”) to demonstrate how to set up sub-profiles. The screenshot below illustrates the new yml syntax that will enable the “base profile” inheritance.

sub profile

Do I need to create a custom install?

You might ask: why mess with a custom install profile at all? You certainly don't have to. You could simply run mis_profile or the base Rain profile as-is and then start making your customizations. That said, many organizations take the approach of creating a custom install profile for each project. This can be handy for encapsulating configuration, install functions and other aspects related to the project. Mediacurrent takes this approach, and we highly recommend other organizations implementing Drupal applications do the same.

To get started, simply rename the “mis_profile” folder to your project name, then search and replace any instances of the “mis_profile” text string in your project. Once this is done you can remove the reference to mis_profile in composer.json as it will no longer be needed. From that point on, when you run build.sh your config.yml will instruct the installer to use your custom profile.

sample profile

Files included in sample mis_profile

Exporting configuration

The easiest way to get started with Rain is to run the ./scripts/build.sh command once your initial sub-profile is set up from the previous step. The base Rain installation will configure common modules covered in the Rain Admin and Authors Overview article. All of the content features will be left disabled initially but can easily be enabled through Drush or the “Admin Modules” page. Once you have the initial configuration set, it’s generally best practice to export that configuration to a version-controlled folder. In your local settings file you should have a line that is commented out by default that looks like the following:

# $config_directories['sync'] = $app_root . '/profiles/mis_profile/config/sync';

The line should contain the name of your profile. Once you uncomment that line, you can run drush cex -y using your local site alias. This will export your configuration to your install profile folder (also considered best practice, but not required). Now, every time you execute ./scripts/build.sh, Drupal will create the site using your exported configuration. We love this approach because it ensures project developers are sharing the same configuration, and it pairs nicely with automated tests.
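For instance, assuming a custom profile named “my_profile” (an illustrative name), the uncommented line in your local settings file and the export itself would look roughly like this:

$config_directories['sync'] = $app_root . '/profiles/my_profile/config/sync';

drush @mysite.local cex -y

The alias is whatever your project defines for the local environment; a plain drush cex -y from the docroot works just as well.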

example sync folder

Example sync folder from Gatsby Drupal 8 demo

Customizing Content

The content features that ship with Rain can be easily customized. Custom paragraph types are integrated with the Rain base theme (“rain_theme”) which means that changes to Paragraphs often require updates to Twig templates as well. The Rain theme is optional but adds the benefit of pre-configuring Paragraphs to map to components provided in the theme’s style guide. We will cover theming with Rain more in-depth in the next article.

optional content features

Optional content type features

 

paragraphs pop up

Add Paragraphs pop-up dialog

Maintenance

The Rain install profile takes the role of a quick starter and package manager. This is a similar approach to other distributions like Acquia Lightning. What this means is that after install, developers will own their site’s configuration and primarily leverage Rain’s Composer updates (if desired) for package dependencies. The main project Composer contains references to the base Rain distribution as well as the “Rain Features” package, both of which have their own Composer files that pull in Drupal contributed modules. The benefit of delegating this work to Rain is that modules are continually tested together as a package and include patches where needed. This can dramatically cut down on the work on the part of the “consumer” of Rain versus managing these dependencies on your own.

When updating mediacurrent/rain and mediacurrent/rain_features be sure to check the UPDATE.md file included in the main Rain project repository. This document includes instructions on how to upgrade when significant updates are made. This could include a major version change or updates after a Drupal core minor version is released. Note that Drupal core minor versions often require some refactoring, such as removing a contributed dependency that was ported to core.
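Assuming a standard Composer workflow (the exact invocation depends on your project setup), the update itself is usually a one-liner:

composer update mediacurrent/rain mediacurrent/rain_features --with-dependencies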

Continuous Integration

Mediacurrent uses Bitbucket as our Git service of choice, which means we leverage Bitbucket Pipelines to execute CI tests. In our Drupal project template, we include a sample Pipelines file that can be leveraged, or refactored to match the yml syntax of other Git services like GitHub or GitLab.

Detaching

How married are you to Rain post-install? The answer: not very. If you wanted to detach from Rain Composer, you could easily back out the Composer dependencies, copy what you need over to your main project Composer, and remove the reference to Rain's base install profile in your custom profile's info.yml file. From that point on, you will be managing dependencies on your own but still have the option to leverage stand-alone projects like Rain Theme or Rain Admin.

In some cases, developers will want to use the base Rain project but not the corresponding Rain Features project. The rain.install does not include any of the optional features from that project which means that you can simply remove “rain_features” from Composer in instances where you will not utilize those features.

Mediacurrent Rain projects

Mediacurrent Rain projects

Local Environments

Mediacurrent currently uses a DrupalVM-based local environment bundled with scripts for managing builds and deployments. These packages are included under “require-dev” in your project Composer and are not required to use Rain. If you have another environment you prefer, such as Lando or DDEV, you can back out these dependencies from your project Composer.

Drupal VM

Deployments

Most Drupal web hosts (e.g. Acquia, Pantheon) have their own git repository that they use for deploying artifacts to the server. This includes all the actual Drupal files that make up core, contributed modules and libraries. We don't want our source git repository to store artifacts or conform to the folder layout required by the web host. Instead, we execute deploy commands to build the artifacts from Composer and commit them to the appropriate destination git repository. To make that process easier, we add some additional configuration to our config.yml file that instructs how and where code should be deployed.

The key aspects of both the Acquia and Pantheon configurations are pointing the release repo to the appropriate git URL provided by your host, naming the host, and setting the release Drupal root directory. Our examples use Acquia and Pantheon but also support generic git artifact repositories like AWS.

Acquia config:

project_repo: [email protected]

release_repo: [email protected]:mcABCProject.git

release_drupal_root: docroot

deploy_host: Acquia

Pantheon config:

project_repo: [email protected]

release_repo: ssh://codeserver.dev.YOUR_UUID_VALUES.drush.in:2222/~/repository.git

release_drupal_root: web

deploy_host: Pantheon

Additionally, for Pantheon, you will need to add a pantheon.yml file to the root directory with the following values:

api_version: 1

web_docroot: true

php_version: 7.1 (or latest PHP version supported)

This command also needs to be run in order to clean up the Pantheon git repo prior to our first deployment:

rm -r README.txt autoload.php core example.gitignore index.php modules profiles robots.txt sites themes update.php vendor web.config

Now we are ready to build and deploy for the first time. We do this with two commands, one to build the artifacts and one to push files to the release git repository.

Example (using Pantheon):

./scripts/hobson release:build Pantheon develop

./scripts/hobson release:deploy Pantheon develop -y

After you have deployed code to Acquia or Pantheon, you should be able to run a clean install using either the sample Rain child install profile (mis_profile) or the cloned profile you have created (see the Rain README for more details).

Final Thoughts

We hope this article helps developers get up and running smoothly with the Rain install profile and Drupal project template. At this point, you should be fairly comfortable installing Rain, creating a custom profile, exporting configuration, deploying to a Drupal host and maintaining Composer. As a package, Rain was designed to be as flexible as possible, allowing developers to customize their applications without being locked into a prescribed way of configuring Drupal.

Jul 19 2019
Jul 19

If your website is the hub for your audience to interact with your brand, then presumably you are doing all sorts of marketing tactics to get them there.

Once they are there, how are you tracking them? How do you know your efforts are effective?

There are a few key metrics you should be tracking to help you to optimize your marketing efforts, understand how the site is doing, continuously improve your website, and to allow you to report to others in your company about where the focus of marketing needs to lie.

First off, you need to determine what your goals are for each marketing activity. How are you performing on these goals now? What are you doing to affect those goals? How will you measure it? Identify not only your main conversions, like a form completion or a purchase, but also soft conversions like a newsletter sign-up or a PDF download.

Next, either review your metrics based on these items or put in these metrics to track moving forward. 

Here are some commonly reviewed and important items to track. Most of these will be familiar to you, but #6 can be a game changer!

1. Time on Site

This metric allows you to see an aggregate of how long your audience spends on your site. If your site is centered around exploration and information, you will want this number to increase over time.

2. Bounce Rate

Your bounce rate is the percentage of users who visit your site but view only one page before leaving. Google Analytics records these visits as 0-second sessions: the visitor sees one page of your site, and because no second interaction is triggered, Analytics has nothing to measure the session duration against.

Several reports suggest an “acceptable” bounce rate ranges between 26% and 70%. But this is a large range across multiple industries. Look deeper to learn what is considered “acceptable” in your industry, because whether a bounce rate is too high depends entirely on your industry and the goals of your site. For example, if you are a restaurant and the visitor simply visits to grab your phone number, then you have reached your goal!

Bounce rate should be looked at in combination with the other analytics in this article, since the bounce rate alone will not tell you an accurate story. Researching a good bounce rate for your website type and industry is worthwhile, but also look at where you are today and then focus on reducing it (if appropriate).

3. Number of Pages visited

Again, if your site is more informational and built to provide a “next step” for exploration with your users, then you will want this metric to increase. If the number is closer to 1 but you focus all your traffic on a single page, then look deeper into that single page’s analytics before you worry about this number.

4. New vs Returning Visitors

In Google Analytics, there is an overlap in these numbers. “New Visitor” is a unique visitor visiting your site for the first time, on a specific device. If you visit a site once on your phone, then again on your desktop, you will be counted as 2 new visitors.

Once the visitor visits your site again, on a device they already used, they will be counted as a “Returning Visitor” for the next two years (then the clock starts over again).

This could be a great metric to use when you are running a campaign in different areas or industries, for example. If you pay close attention, you can see which campaigns garnered more new traffic.

5. Traffic Sources

Analytics programs will report where your traffic is coming from, which illuminates your more and less popular sources. They will also show you referral sites, which helps you see your ROI if you partner with others to send traffic to your site.

Seeing how each traffic source performs will keep you honing in on what works well for you (and what does not).

6. Search

The most crucial advice we provide our clients is to track your in-site search.

This is done as an admin in your “view settings” for Google Analytics. It is so powerful because it shows you exactly what your visitors want from your site.

Behavioral studies from the Nielsen Group and other research show that more than 50% of people visiting a website’s start page go straight to the internal search box to navigate. Those figures show that the search box is an essential navigation tool on nearly every website.

From this data, you can organize, adjust, or create your content plan. You can revamp your navigation or the order in which content is laid out on your site. You can write relevant FAQs or shift your focus from one audience group to another. This can be so compelling for your business because you are directly answering the needs of your audience.

These will get you started!

Many more metrics exist which can help you to analyze your effectiveness in your marketing tools and traffic sources, but these six are the best ones with which to start. Once you have defined what is important for you, continue to review your analytics over the course of time so you can continually optimize your site’s effectiveness.

Your website is a living and breathing entity that needs nurturing and care to continue its growth and work harder for your business. If you need help with a strategy to define your metrics, contact us. We’d be glad to help.

Jul 19 2019
Jul 19

At Kanopi Studios, we believe that Drupal is an especially strong choice for government websites, a belief further validated by the fact that governments across more than 150 countries have turned to Drupal to power their digital experiences. This includes major sites in the United States like The White House and NASA.

What makes Drupal the best choice? Read on for our top 8 reasons why Drupal should be the content management system of choice for government websites.

1. Mobility 

Website traffic from mobile devices surpassed desktop traffic years ago. In fact, according to Pew Research Center, one in five adults in America are smartphone-only internet users, and that number is likely to continue to grow. Government websites need to prioritize a superior mobile experience so they can meet the needs of citizens of all ages and economic levels and allow users to access critical information on the go.

Drupal can help. Drupal 8 was built to scale across devices, load mobile content at top speeds, provide a wide selection of responsive themes, and more. Drupal also allows content editors the ability to add or update site content via mobile, unlocking the ability to make emergency updates from anywhere.

2. Security 

Offering a secure site that protects your content and sensitive user information is critical for maintaining your reputation and public trust. Drupal offers robust security capabilities, from regular patches to prominent notifications about updates to security modules you can install for additional peace of mind. Unlike other open source platforms, Drupal has a dedicated security council that keeps an eye out for potential issues and develops best practices to keep sites stable and secure.

3. Accessibility

A number of federal, state and local laws require government websites to serve the needs of all citizens, regardless of their abilities. Focusing on accessibility compliance from the very beginning of your website project can help your team avoid costly re-work and launch delays.

Drupal has accessibility baked in, with all features and functions built to conform to the World Wide Web Consortium’s Web Content Accessibility Guidelines (WCAG) and ADA guidelines, including the platform’s authoring experience. That means that people of all abilities can interact with your Drupal website, whether they are adding and editing content, reading news, filling out forms, or completing other tasks. Drupal allows screen readers to interpret text correctly, suggests accessible color contrast and intensity, builds accessible images and forms, supports skip navigation in core themes, and much more.

If you’re a content editor, we recently wrote about eight things you can do to make your site more accessible.

4. Simple content management

Drupal’s content editor helps busy government website administrators add posts, pages, and resources in an environment that’s nearly as simple and familiar as a Word document. The what you see is what you get (WYSIWYG) editing mode supports text formatting, links, embedded media, and more.

Drupal also enables administrators to set up customized roles, permissions, and content workflows. This allows any number of team members to contribute to the site while maintaining administrative control of the content that gets through to the public.

5. Ability to handle significant traffic and data

Many government websites store hefty data and resources and see significant spikes in traffic based on seasonal demand, news cycles, and many other factors. Drupal has the power to deal with large databases and intense site traffic with ease.

Drupal’s database capability includes a wide range of ways to sort and organize content via its module system, supporting the needs of almost any content library without the need to create custom code.

Drupal powers a number of heavily visited sites including NBC’s Olympics, The Grammy Awards, and Weather.com, keeping them going strong even when traffic levels are enormous.

6. Flexibility 

The helpful features included in Drupal core are just the beginning. Many, many additional modules have been contributed and tested by the Drupal community and are ready to be added to your site as needed. How many? The Drupal community has contributed well over 40,000 modules, so it’s a safe bet that there’s something already out there that can help meet the needs of your project.

Modules can be added to your site at any time, like building blocks. A few popular examples include social sharing, image editing, calendars, metatags, and modules that support integrations with external systems, from email platforms to customer databases. 

7. Affordability

Government budgets are often tight, with plenty of competing priorities for every dollar spent. With Drupal, you tap into a free, open-source system that’s supported by an enormous community of developers. Building your website on an open-source platform means you can focus your budget on creating an ideal experience for your citizens through professional services including content strategy, user experience, and design rather than dedicating funds to software licensing fees. And Drupal’s flexible modules reduce or even eliminate the need for custom code, helping you save even more.

8. Support for multiple sites in multiple languages

It’s not uncommon for government entities to have multiple websites. Whether your government maintains a few sites or hundreds, building each one individually would require an incredible amount of time and funds. Thankfully, Drupal’s multisite feature allows your site’s code base to be copied and adjusted to create as many new websites as you need, leveraging features that already exist without the need to build them from scratch. To meet language requirements, Drupal offers Content and Entity Translation modules that help content authors translate pages, individual elements, or specific fields into more than 100 languages.

Kanopi Studios loves government website projects

At Kanopi, we’re Drupal experts. We’ve harnessed its power to create citizen-focused sites for the San Francisco Police Department, San Francisco Health Service System and more. We’d love to hear from you, learn about the problems you are trying to solve, and share even more details about how you can put Drupal to work for your government website.

Jul 18 2019
Jul 18

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In this article, we will walk through each of the main features that ship with the Rain distribution. This tutorial is intended to help content authors and site administrators get set up quickly with all that Rain has to offer.

Have a question or comment? Hit me up @drupalninja on Twitter.

Content Moderation

A question we often hear when working with a client is, “how can Drupal help to build a publishing workflow that works for my team and business?”

Drupal 8 marked a big step forward for creating flexible editorial workflows. Building on Drupal 8's support for content moderation workflows, Rain comes pre-configured with a set of Workflow states. The term “states” refers to the different statuses your content can have throughout the publishing process - the four statuses available by default are “Draft”, “Needs Review”, “Published” and “Archived.” They can be easily enabled for any content type. As with everything in Drupal, these states and workflows are highly configurable. 

Once enabled, using content moderation in Drupal 8 is straightforward. After you save a piece of content, initially it will default to the “Draft” status which will remain unpublished. The “Review” status also preserves the unpublished status until the current edits get published. What’s great about Workflow in Drupal 8 is that you can make updates on a published piece of content without affecting the published state of that content until your changes are ready to be published. The video below demonstrates how to enable workflow and see draft updates before they are published.

[embedded content]
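
For reference, the states above come from Drupal core’s Workflows and Content Moderation modules, which Rain pre-configures. A minimal command-line sketch if you ever need to enable them by hand (both names are core module machine names):

# Enable Drupal core's moderation modules.
drush en workflows content_moderation -y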

To review any content currently in a workflow state you can click on the “Moderated Content” local task which is visible from the main Admin content screen (see below).
 

Admin content screen

Revisions

As a best practice, we recommend enabling revisions for all content. This allows editors to easily undo a change made by mistake, and revisions keep a full history of edits for each node. All of Rain’s optional content features ship with revisions enabled by default. As illustrated below, once you have saved a piece of content, the “Revisions” tab will appear with options for reviewing or reverting a change.

Rain Drupal Content Moderation - Revisions

Media Library

Coming soon to Drupal core is an overhauled Media library feature. In the meantime, Drupal contrib offers some very good Media library features that are pre-configured in Rain. The Rain media features are integrated with most image fields including the “thumbnail” field on all content type features that ship with Rain.

The video below demonstrates two notable features. First is the pop-up dialog that shows editors all media available to choose from within the site. Editors can search or browse for an existing image if desired. Second is the drag-and-drop file upload which lets the editor user drag an image onto the dialog to immediately upload the file. 

[embedded content]

 

Media Library

WYSIWYG Media

Media is commonly embedded within the WYSIWYG editor in Drupal. Rain helps improve this experience by adding a button that embeds the Media library within the WYSIWYG editor. The key difference between the Media library pop-up you see on fields and the pop-up you see within WYSIWYG is that here you have the option to select the image style. The video below illustrates how this is done.

[embedded content]

embed media

WYSIWYG Linkit

Another WYSIWYG enhancement that ships with Rain is the integrated “Linkit” module that gives users an autocomplete dialog for links. The short video below demonstrates how to use this feature.

[embedded content]

Content Scheduling

A common task for content editors is scheduling content to be published at a future date and time. Rain gives authors the ability to schedule content easily from the content edit screen. Note that this feature will override the Workflow state so this should be considered when assigning user roles and permissions. The screenshot below indicates the location of the “Scheduling options” feature that appears in the sidebar on node edit pages.

node edit


Clean Aliases

Drupal is usually configured with the ability to set alias patterns for content. This creates the meaningful content “slugs” visitors see in the browser, which also adheres to SEO best practices. Rain’s approach is to pre-load a set of sensible defaults that get administrators started quickly. The video below demonstrates how an admin user would configure an alias pattern for a content type.

[embedded content]

XML Sitemap

By default, the Rain distribution generates a sitemap.xml feed populated with all published content. For editors, it can be important to understand how to exclude content from a sitemap or update the priority for SEO purposes. The screenshot below indicates where these settings live on the node edit page.

xml sitemap

Metatag Configuration

The default configuration enabled by the Rain install profile should work well for most sites. Metatag, a core building block for your website’s SEO strategy, is also enabled for all optional content features that ship with the Rain distribution. To update meta tags settings on an individual piece of content, editors can simply edit the “Meta tags” area of the sidebar on the edit screen (see below).

Metatag in Drupal

Google Analytics

Enabling Google Analytics on your Drupal website is a very simple process. The Rain distribution installs the Google Analytics module by default but the tracking embed will not fire until an administrator has supplied a “Web Property ID.” The Google Analytics documentation shows you where to find this ID. To navigate to the Google Analytics settings page, look for the “Google Analytics” link on the main admin configuration page. Most of the default settings will work well without change and the only required setting is the “UA” ID highlighted below.

Google Analytics
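
If you manage configuration from the command line, the property ID can also be set with Drush. A hedged sketch, assuming the module stores the ID in the google_analytics.settings config object under the account key (verify against your module version); the UA value is a placeholder:

# Set the Google Analytics Web Property ID (config names assumed, ID is a placeholder).
drush config:set google_analytics.settings account UA-XXXXXXXX-1 -y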

Enabling Content Features

Rain comes with many optional content features that can be enabled at any time. This includes content types, vocabularies, paragraphs, search and block types. Enabling a content feature will create the corresponding content type, taxonomy, etc., which can then be further customized. Any paragraph feature that is enabled will be immediately visible on any Rain content type that has paragraphs enabled. Watch the video below to see an example of how to enable these features.

[embedded content]

Enabling Content Features
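
Because each content feature is a regular Drupal module, you can also enable one with Drush. A hedged sketch; the module name below is hypothetical, so check the Rain README for the actual machine names:

# Enable an optional Rain content feature (hypothetical module name).
drush en rain_features_blog -y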

Wrapping Up

Mediacurrent created Rain to jump-start development and give editors the tools they need to effectively manage content. All features that ship with Rain are optional and highly configurable. We tried to strike a balance of pre-configuring as many essential modules as possible while still allowing site administrators to own the configuration of their Drupal site.

In the next tutorial, we will “pop open the hood” for Drupal developers to explain in technical detail how to build sites with Rain.

Jul 11 2019
Jul 11

On September 12–14, at Hilton Garden Inn Atlanta-Buckhead

Kaleem Clarkson
Kyle Mathews, 2019 DrupalCamp Atlanta Keynote

This year, DrupalCamp Atlanta is honored to welcome Kyle Mathews, creator of the open source project Gatsby, as our keynote speaker. Gatsby was a hot topic at DrupalCon this year, and we’re ready to dive into the software at DrupalCamp this September.

Follow Kyle on Twitter and Github.

Session submissions are now open for DrupalCamp Atlanta 2019! With Kyle as our keynote, we’re interested to see how others are combining Drupal and Gatsby. In addition, we’re also accepting sessions in the following tracks:

  • Beginner
  • Design, Theming, and Usability
  • Development and Performance
  • Site Building
  • Business Leadership
  • Education and Training

Each session is 40 minutes with 10 minutes for Q&A. Each room will be set up classroom style and will have a projection screen and in-house audio.

Trainings

In addition to 50-minute sessions, we’re also looking for volunteer trainers for our full day of trainings on Thursday (9/12) and a half day on Friday (9/13). Training sessions can range across all experience levels. You can submit your training proposal here.

One of our goals for this year’s camp was to increase the number of case studies. We encourage web development companies and units to connect with their clients to co-present a session at this year’s DCATL.

We see this as an opportunity to re-engage with a client by highlighting the great work you have done together, all while introducing them to the awesome Drupal community we have. So, reach out to some of your clients and propose a presentation today!

SUBMIT YOUR PROPOSAL HERE

Jul 10 2019
Jul 10

Kanopi Studios is honored to have contributed to Mukurtu, a project that offers a powerful example of the importance of putting inclusivity, cultural sensitivity, and user needs at the center of design and development so that technology can be used as a force for good.

What is Mukurtu?

Mukurtu  (MOOK-oo-too) is a free content management system built with Drupal that helps indigenous communities manage, share, and exchange their heritage in culturally relevant and ethically-minded ways.

“Mukurtu” is a Warumungu word for safe keeping place, a name chosen in 2007 when Warumungu community members collaborated with developers and scholars on the first iteration of the platform to produce the Mukurtu Wumpurrarni-kari Archive.

Surviving cultures risk being drowned out or forgotten by modern society due to dwindling numbers, resources, and legal claim to land and heritage. By sharing their voices, indigenous cultures can preserve their history and way of life, educate others, and seek much-needed support. But by doing so, they run the risk of losing control and ownership of the narrative. The Mukurtu project helps to solve that problem. Mukurtu was created to allow indigenous cultures to share their heritage on their own terms, eliminating the potential for exploitation or misrepresentation. 

The power of Mukurtu comes from its complex and layered permission system that goes far beyond the capabilities of traditional content management systems. The system is purpose-built to allow indigenous people to maintain control over how information is shared, who they share it with, and how it can be used.

Mukurtu and Kanopi Studios

As the program expanded, Kanopi Studios joined the Center for Digital Scholarship and Curation at Washington State University and CoDA as a development partner. Kanopi led a research-guided approach that included focus groups and surveys with users to inform the project’s technical strategy and development.

Since Mukurtu’s original release, Kanopi has played a lead role in development, adding features based on user requests and ensuring that the system remains easy to use, secure, and scalable. New features include an improved mobile experience and robust collaboration capabilities, with a mobile app coming soon that will allow users to browse and add content from the field, even while offline.

Kanopi Studios also works directly with clients who want to use Mukurtu, but need to enhance the system to meet unique needs. Custom development examples include a site to help relocate indigenous people in Kivalina, Alaska, another to share the voices of Amiskwaciy people in Edmonton, Canada, and one of our earliest projects with Washington State University.

How Mukurtu can support indigenous communities 

Mukurtu was built to be flexible enough to support diverse communities while remaining easy enough that non-technical users can add and update content and permissions.

Core features include:

  • Traditional knowledge labels allow communities to add labels to content that describe how that content can be accessed, used and circulated, and to whom it needs to be attributed.
  • Cultural protocols allow for finely-grained content access settings that can be customized on an ongoing basis to meet the needs and values of each community, from wide open, to restricted at the individual level. 
  • Community records allow multiple ways to store information about cultural heritage so critical details and diverse perspectives can be maintained. 
  • Data integrity uses file hashes to ensure that files are not tampered with, ensuring that content remains intact over time.
  • Dictionary helps indigenous communities preserve their language, complete with translations, definitions, pronunciations, audio recordings, and other media.
  • Collaboration tools allow site members to share events on group calendars and engage in threaded discussions.
  • Unit plans and lessons give educators and students a platform to engage in online and field learning through a Mukurtu site.

Indigenous communities across the globe use Mukurtu to record, preserve, and share their heritage, including the Plateau Peoples’ Web Portal, Passamaquoddy People, Catawba Indian Nation Archives, and many more. 

Impacting our future

While indigenous people benefit from sharing their stories, modern society has much to learn from their cultures as well, from our relationship to the land in a time when climate change threatens us all, to staying connected during this time of individualism and political divide. We’re proud to continue expanding Mukurtu as a platform for telling these important stories and hope they will help us build a stronger future for everyone.

Getting started with Mukurtu

If you have technical support and hosting available, you can download Mukurtu on Github and begin using it for free. If you need technical support or additional customization of Mukurtu, contact us. We’d love to help.

Jul 09 2019
Jul 09

In our previous blog - Demystifying the Decoupled Architecture - we discussed how decoupled architecture has become an increasingly popular choice to build enterprise-grade websites.

Among the various choices available, Drupal delivers a breakthrough experience, offering powerful content modeling, workflow capabilities and UI creation that help evolve marketing, branding and lead generation efforts.

GatsbyJS, a React-based static site generator, is a great option with Drupal for a decoupled architecture. Drupal and Gatsby form a powerful combination, and here’s why this is a great choice.

Why Gatsby?

Let’s learn about the features of Gatsby that make it an ideal match for decoupling with Drupal.

1. Plugin usage makes life simpler

Plugins can make the site work offline, add Google Analytics, support inline SVGs, and much more, making Gatsby extensible and flexible.

Of the different types of Gatsby plugins, the gatsby-source plugins fetch data from a local or remote source and allow it to be used via GraphQL. This implies that almost anything can be used as a source to work with Gatsby and generate static sites.

Some plugins need only to be listed by name, while others take options. The Gatsby plugin library offers a large number of plugins and is continuously maintained by the community.
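
For instance, the community’s Drupal source plugin is installed from npm and then listed in gatsby-config.js with its options. A minimal sketch; the site URL is a placeholder:

# Install the Drupal source plugin for Gatsby.
npm install --save gatsby-source-drupal
# Then list it in gatsby-config.js with its options, e.g.
# { resolve: 'gatsby-source-drupal', options: { baseUrl: 'https://example.com/' } }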

2. Progressive Web Apps out-of-the-box

Gatsby enforces best practices that boost performance and smooth the user experience, giving site visitors an app-like experience.

The build process creates static HTML files for individual pages which offers swift loading of pages.

Once a page loads in the browser, Gatsby boots up React and turns navigation around your site into a single-page-application experience with near-instant transitions and no page reloads. It works by prefetching related page content in the background to eliminate any chance of delay when the user clicks.

Gatsby exceptionally improves the client-side experience with JavaScript and can offer offline support with the addition of a single plugin.

3. JAMstack setup

The JAMstack has taken web development by storm: client-side JavaScript, reusable APIs, and prebuilt markup offer a great combination for improving the speed and quality of your site as well as the overall developer experience. Gatsby acts as a JavaScript framework for a JAMstack-powered web application.

4. Built with performance in mind

The Gatsby framework is built to optimize website performance on its own, compiling the most performant webpack configuration when it builds your site from the source code.

It functions by prefetching resources to give website users a world-class browsing experience. It follows Google’s PRPL (Push, Render, Pre-cache, Lazy-load) architectural pattern, which aims to boost your website’s performance, especially on mobile devices, and is helpful for structuring and serving progressive web apps (PWAs). You can create a PWA with Gatsby by serving your page via HTTPS and installing plugins for the manifest and service worker, as sketched below.
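
A minimal sketch of those installs, using the community’s manifest and offline plugins:

# Add a web app manifest and a service worker for offline support.
npm install --save gatsby-plugin-manifest gatsby-plugin-offline
# Both plugins are then listed in gatsby-config.js.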

5. Easy to Use and Learn

Gatsby is based on React.js, a JavaScript library for building user interfaces using components. It’s relatively easy to learn, and if you have a command of JavaScript, you’re good to go.

Working with Gatsby doesn’t require you to learn everything from scratch; you can master it even without prior experience with React or GraphQL. Gatsby also has a vast and active community to help with your questions.

Benefits of Decoupling Drupal with Gatsby

Static site generators like Gatsby pre-generate all the pages of the website, unlike dynamic sites that render pages on demand. This alleviates the need for live database queries and template-engine rendering on every request, which enhances performance and brings down the maintenance cost.

Static site generators have seen a growth spurt over the past few years and have been the first preference of developers who wish to build a simple website/blog solution with a minimal server setup and a low maintenance cost.

Drupal proves to be a powerful back-end and is preferred by content editors for its WYSIWYG editor and content types, which help them manage content more easily and systematically. However, maintaining a CMS requires hosting a web server and database, which exposes security vulnerabilities.

diagram_gatsby_drupal

Gatsby in action (Source: Gatsby.org)

Gatsby stands between the robustness of a static site and the versatile back-end of a content management system. It allows you to host the CMS in-house and publish the content as a static website.

Watch the video to understand how Gatsby helps create blazing fast websites with Drupal:

Headless Drupal- Building blazing-fast websites with React-GatsbyJS + Drupal

Development approach

Gatsby, being a static site generator, produces a public folder at build time that behaves as a static website.

To make it serve requests, you can take that folder and deploy it on any server running Apache or nginx. Once the build is done, you can even take your Drupal server offline and deploy just the public folder generated during the build, as sketched below.
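
A minimal sketch of that flow, assuming an nginx or Apache docroot at /var/www/html and SSH access to the target server (all placeholders):

# Build the static site; Gatsby pulls content from Drupal at build time.
gatsby build
# Ship only the generated public folder to the web server.
rsync -avz --delete public/ deploy@example.com:/var/www/html/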

 


 

This means that your Gatsby site will communicate with your Drupal backend during the build, fetch all the details necessary, and create static files for all the paths that will be present on your site.

 

With enterprises opting for more scalable and flexible experiences, the popularity of Gatsby is increasing manifold.

Are you planning to implement the best decoupled strategy for your next project? Contact our experts to get a solution tailored to your needs.

Jul 09 2019
Jul 09

open waters

In this episode of Open Waters, we talk with our own Bob Kepford, creator of the weekly newsletter The Weekly Drop, to discuss Open Source Serverless solutions.  Bob presented this topic at DrupalCon Seattle and it was very well received.  You can catch the recording on the DrupalCon site.

Audio Download Link

Pro Project Pick: Serverless Framework

Interview with Bob Kepford

  • The big question: What is Serverless?
  • What are the 4 pillars of serverless?
  • What are the advantages and disadvantages?
  • What do I have to know to get started?
  • As a site owner, why would I consider using serverless?
  • What are the security implications when using serverless?
  • Who are the big players who are adopting and/or providing serverless solutions?

Subscribe

Apple Podcasts | Stitcher | Google Podcasts

Jul 08 2019
Jul 08

Part of my day job is to help tune the Cloudflare WAF for several customers. This blog post summarises some of the default rules I deploy to every Drupal (7 or 8) site as a baseline.

The custom WAF rules in this blog post are in YAML format (for humans to read); if you want to create these rules via the API, you will need them in JSON format (see the end of this blog post for a sample API command).

Default custom WAF rules

Unfriendly Drupal 7 URLs

I often see bots trying to hit URLs like /?q=node/add and /?q=user/register. These are the default unfriendly URLs to hit on a Drupal 7 site to see if user registration is open or if someone has messed up the permissions table (allowing content to be created as an anonymous user). Needless to say, these requests are rubbish and add no value to your site, so let’s block them.

description: 'Drupal 7 Unfriendly URLs (bots)'
action: block
filter:
  expression: '(http.request.uri.query matches "q=user/register") or (http.request.uri.query matches "q=node/add")'

Autodiscover

If your organisation has bought Microsoft Exchange, then your site will likely receive loads of requests (GET and POST) to Autodiscover URLs, which just ties up resources on your application server serving 404s. I am yet to meet anyone that actually serves back real responses from a Drupal site for Autodiscover URLs. Blocking is a win here.

description: Autodiscover
action: block
filter:
  expression: '(http.request.uri.path matches "/autodiscover\.xml$") or (http.request.uri.path matches "/autodiscover\.src/")'

Wordpress

Seeing as Wordpress has a huge market share (34% of all websites), a lot of Drupal sites get caught up in the mindless (and endless) crawling aimed at it. These rules will effectively remove all of this traffic from your site.

description: 'Wordpress PHP scripts'
action: block
filter:
  expression: '(http.request.uri.path matches "/wp-.*\.php$")'

description: 'Wordpress common folders (excluding content)'
action: block
filter:
  expression: '(http.request.uri.path matches "/wp-(admin|includes|json)/")'

I separate wp-content into its own rule, as you may want to disable this rule if you are migrating from an old Wordpress site (and want to put redirects in place, for instance).

description: 'Wordpress content folder'
action: block
filter:
  expression: '(http.request.uri.path matches "/wp-content/")'

SQLi

I have seen several instances in the past where obvious SQLi was being attempted and the default WAF rules from Cloudflare were not intercepting it. This custom WAF rule is an attempt to fill that gap.

description: 'SQLi in URL'
action: block
filter:
  expression: '(http.request.uri.path contains "select unhex") or (http.request.uri.path contains "select name_const") or (http.request.uri.path contains "unhex(hex(version()))") or (http.request.uri.path contains "union select") or (http.request.uri.path contains "select concat")'

Drupal 8 install script

Drupal 8's default install script will expose your major, minor and patch version of Drupal you are running. This is bad for a lot of reasons.

Drupal 8's default install screen exposes far too much information

It is better to just remove these requests from your Drupal site altogether. Note that this is not a replacement for upgrading Drupal; it just makes fingerprinting a little harder.

description: 'Install script'
action: block
filter:
  expression: '(http.request.uri.path eq "/core/install.php")'

Microsoft Office and Skype for Business

Microsoft sure is good at making products that attempt to DoS its own customers’ websites. These requests are always POST requests, often to your homepage, and you need partial string matching on the user agent, as it changes with the version of Office/Skype being run.

In large organisations, I have seen the requests here number in the hundreds of thousands per day.

description: 'Microsoft Office/Skype for Business POST requests'
action: block
filter:
  expression: '(http.request.method eq "POST") and (http.user_agent matches "Microsoft Office" or http.user_agent matches "Skype for Business")'

Microsoft ActiveSync

Yet another Microsoft product that, for reasons unknown, tries to hit a magic endpoint that doesn’t exist on your site.

description: 'Microsoft Active Sync'
action: block
filter:
  expression: '(http.request.uri.path eq "/Microsoft-Server-ActiveSync")'

Using the Cloudflare API to import custom WAF rules

It can be a pain to manually point and click a few hundred times per zone to import the above rules. Instead, you would be better off using the API. Here is a sample cURL command you can use to import all of the above rules in one go.

You will need to replace the redacted sections with your details.

curl 'https://api.cloudflare.com/client/v4/zones/XXXXXXXXXXXXXX/firewall/rules' \
  -H 'X-Auth-Email: XXXXXXXXXXXXXX' \
  -H 'X-Auth-Key: XXXXXXXXXXXXXX' \
  -H 'Accept: application/json' \
  -H 'Content-Type: application/json' \
  -H 'Accept-Encoding: gzip' \
  -X POST \
  -d '[{"ref":"","description":"Autodiscover","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path matches \"\/autodiscover\\.xml$\") or (http.request.uri.path matches \"\/autodiscover\\.src\/\")"}},{"ref":"","description":"Drupal 7 Unfriendly URLs (bots)","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.query matches \"q=user\/register\") or (http.request.uri.query matches \"q=node\/add\")"}},{"ref":"","description":"Install script","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path eq \"\/core\/install.php\")"}},{"ref":"","description":"Microsoft Active Sync","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path eq \"\/Microsoft-Server-ActiveSync\")"}},{"ref":"","description":"Microsoft Office\/Skype for Business POST requests","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.method eq \"POST\") and (http.user_agent matches \"Microsoft Office\" or http.user_agent matches \"Skype for Business\")"}},{"ref":"","description":"SQLi in URL","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path contains \"select unhex\") or (http.request.uri.path contains \"select name_const\") or (http.request.uri.path contains \"unhex(hex(version()))\") or (http.request.uri.path contains \"union select\") or (http.request.uri.path contains \"select concat\")"}},{"ref":"","description":"Wordpress common folders (excluding content)","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path matches \"\/wp-(admin|includes|json)\/\")"}},{"ref":"","description":"Wordpress content folder","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path matches \"\/wp-content\/\")"}},{"ref":"","description":"Wordpress PHP scripts","paused":false,"action":"block","priority":null,"filter":{"expression":"(http.request.uri.path matches \"\/wp-.*\\.php$\")"}}]'

How do you know the above rules are working

Visit the firewall overview tab in Cloudflare's UI to see how many requests are being intercepted by the above rules.

Cloudflare's firewall overview screen showing the custom WAF rules in action

Final thoughts

The above custom WAF rules are likely not the only custom WAF rules you will need for any given Drupal site, but it should at least be a good start. Let me know in the comments if you have any custom WAF rules that you always deploy. I would be keen to update this blog post with additional rules from the community.

This is likely the first post in a series of blog posts on customising Cloudflare to suit your Drupal site. If you want to stay up to date - subscribe to the RSS feed, sign up for email updates, or follow us on Twitter.

Jul 02 2019
Jul 02

As of May 31, 2020, Apigee will no longer sponsor hosting of Drupal 7-based developer portals (D7P). Prior to this, starting on May 31, 2019, Apigee stopped provisioning new Drupal sites for customers.

Developer portals are expected to give flexibility and control to the developers and keep users’ data secure on a daily basis.

In our previous blog - Should You Migrate Your Developer Portal To Drupal 8? - we detailed the security challenges your Drupal 7 developer portal could face. After all, security is important.

In this blog, let us understand the best action plan you can take to secure your developer portal. 

Action Plan to Secure Your Developer Portal

As a service provider, you have developed a set of APIs to provide access to your backend services. The announcement can put you in a fix, but whatever action steps you choose, ensure the end result can:

  1. Keep your data secure
  2. Avoid hampering the current flow of work
  3. Integrate with Apigee

Currently, you have three options in place:

  • Remain on Drupal 7 and assume hosting responsibility
  • Move to Apigee's integrated portal
  • Migrate your developer portal to Drupal 8

Remain on Drupal 7 and assume hosting responsibility

If you need (more) time to decide whether to opt for migration or not, depending on the existing running projects - you can consider remaining on D7P until you finalize on your choice. 

The support of the modules which integrate Drupal 7 with Apigee Edge will not be affected; however, cloud customers would need to assume direct account responsibility with their hosting providers.

After May 31, 2020, all Apigee-hosted Drupal 7 developer portals will be decommissioned and unavailable. You will not be able to administer or develop them post-May 2020.

Remaining on Drupal 7 could make you vulnerable to several security concerns as discussed in our previous blog.

Move to Apigee's integrated portal

Consider moving to an Apigee integrated portal, if you have been using Drupal 7 with a minimal amount of customization or prefer an all-in-one solution. 

It is integrated directly into Apigee Edge and includes a powerful API catalogue and a compelling markdown-based CMS with robust audience management tools.

However, if you are someone who has leveraged the functionality of Drupal 7 in conjunction with a high degree of customization and investment in crafting a specific developer experience, then you should consider switching to Drupal 8. 

Migrate your developer portal to Drupal 8

Drupal 8 remains a compelling option for those who wish to stay on Drupal for their developer portal. It is the option preferred by customers heading towards a self-managed developer portal, and a path forward to leverage the latest functionality of Drupal.

A host of functionality in Drupal 8 makes it the strongest of the three options: it is more secure than Drupal 7, offers more features, and proves to be an ideal foundation for a developer portal. Let’s learn about these features.

Your Drupal 8 Developer Portal will be Secure

Here is a list of new features introduced in Drupal 8 which aims to enhance the security of your developer portal:

  • Twig: A new theming layer that comes with a whole set of security features, such as auto-escaping of output.
  • Configuration management: All of your site’s configuration lives in code, so you can see which settings have changed, who changed them, when, and why.
  • No PHP filter: PHP can no longer be executed from inside a node, closing off a common attack vector.
  • Hashed session IDs: Session IDs are now hashed, so even if they become known, the sessions cannot be hijacked.
  • Trusted host patterns: The site knows which hostnames it should respond to and rejects requests that don’t match.
  • Single-statement limitation on DB queries: Multiple statements are not allowed in a single database query.
  • Mixed-mode SSL removed: SSL is now used throughout rather than being optional for parts of a session.
  • Automated CSRF token protection: Forms are protected against cross-site request forgery.

Some Best Practices To Mitigate API Threats

To secure your digital property from any possible threats, you need to follow certain best practices to ensure safe usage of APIs and prevent any critical customer information leakage.

  • Encrypt the APIs: Ensuring that API keys can be decrypted only by authorised users protects critical data from being misused.
  • Ensure role-based user authorization: Authorizing registered users to access APIs according to their roles lets you identify a user and classify them under varying levels of permissions.
  • Allow only registered users: The first and foremost step is to authenticate users of the API in order to protect information. It then becomes easy to track API usage and identify who is making what request.
  • Deploy rate limiting: It can detect and prevent an unusual number of requests from a given user in a given frame of time. In the exceptional case where an attacker manages to bypass your encrypted authentication and authorization protocols, rate limiting can prevent your API from being compromised.
  • Build security in layers: If the above steps don’t suffice, add an extra level of security through an outer firewall.
  • Security is required in the back-end too: Another checkpoint on the way out of the network can thwart attacks; securing the system end to end lets you track an attacker on the way out.

Srijan Can Help

Srijan is committed to providing customized developer portal to help you attract and engage developers with a comprehensive, customizable developer portal that offers a seamless onboarding experience. We offer secure migration of your existing developer portals to Drupal 8. 

Want to upgrade your developer portal? Let’s start the conversation.

Jul 02 2019
Jul 02

“DrupalCamp is a unique experience that you have to attend to understand” 

On the weekend of 15-16 June, I seized the opportunity to be a part of DrupalCamp in India’s capital city, New Delhi. The two-day camp was full of experiences, from the keynote to engaging sessions to Drupal Trivia to a stupendous after-party.

The fun giveaways at DrupalCamp Delhi, captured in an image by Anul Khadija

It wasn’t just my first DrupalCamp but my first Drupal event in general, and the professionalism and dedication of the organizing team were evident throughout. There were sponsor booths, giveaways and at least three sessions running in parallel at any time of the day.

I was quite amazed to see the event stand by Drupal’s commitment to inclusiveness, with specially-abled artists from the Kumawat Pahal Charitable Trust (KPCT) Foundation attending the camp. It was fitting to call the event “Drupal with a cause.” As part of the Drupal Association, I understand Drupal’s commitment to all and was pleased to see it in action at the camp.

Specially-abled people from the KPCT Foundation impressed the audience with paintings made with their feet.
Picture by Ankit Singh

Day 1 started with a prenote by Shadab Ashraf, who took everyone back to the early days of Drupal and how everything started. It was followed by an inspiring keynote from Sudhanshu Mani, the man with an out-of-the-box idea who led the Train 18 project, India’s first engineless train, sharing some of his biggest challenges and successes, and then by plenty of interesting sessions.

The two days were a mix of fun and inspiration. It was compelling to see speakers from various backgrounds speak at the platform with a deeper insight around Drupal and on various cutting-edge technologies.

Additionally, the Drupal India Association (DIA) was launched, aimed at building more momentum for the local chapter incorporated in India. As a Drupal Association staff member, I believe it is important to have local associations that support the local communities. The DIA will be a great place to collect stories, share resources, and raise the problems the Indian Drupal community faces, and together with the Drupal Association we can work to make things better.

Sunday morning (Day 2) started with a plethora of knowledge shared by the camp’s second keynote speaker, Prashant Singh, VP of Product Management at Paytm, who walked us through lessons and learnings on where to start and what one’s approach should be. He also talked about how to build technology in a startup. Another jam-packed schedule of regular sessions and workshops followed.

A Birds of a Feather (BoF) session was organized to surface the challenges people commonly face while contributing. I was surprised to learn that infrastructure is also a challenge, given that Delhi has many social spaces.

Then came the most special part, when Akanksha Singh and I hosted Trivia Night, the fun-filled quiz portion of the event. It was the first time I had organized and hosted something like this, and it was absolute fun geeking out. We grouped people into teams and asked them head-scratching questions about Drupal and other trending web technologies.

Candid snap of the Trivia Night hosts (Akanksha and I) by Suhail Lone

I was able to attend a few inspiring sessions given by various founders and co-founders. All of them motivated me, but these three had a lasting impact:

  1. Kickstart your success story by Sarita Chauhan
  2. Serendipity as a Business Strategy by Rahul Dewan
  3. Writing content like a pro by Aditi Syal

I also loved spending in-person time at the contribution sprint, where we discussed how to contribute to documentation on drupal.org. Kimi Mahajan made her first contribution by editing some documentation and improving its quality, and it was exciting to see her so happy about it.

The contribution sprint is not just for developers but also for people who contribute to Drupal in non-code ways. This makes us love Drupal even more, as almost anyone can get involved in contributing, making the community vibrant and full of inspirational contributions.

The act of contributing starts from getting involved and connecting with different people involved in various ways to make Drupal better. I found renewed motivation from the sheer joy of learning together and by volunteering to make the camp happen.

The energy and enthusiasm of organizers and volunteers were contagious, and it was inspiring to see them bring great ideas to make the event end on a high note.

Although I’ve been a part of DrupalCamps for quite some time, it was always from a distance. DrupalCamp Delhi made me realize what I had been missing. It was encouraging to see with my own eyes the great work and effort put in by the community to turn the event into a success.

Drupal community events are an interesting place to be, and I would not want to miss another. Such events give people the space and resources to come together under one roof to contribute, and to exchange high-fives in the hallway for their amazing work. I am absolutely looking forward to attending the next Drupal event.

Maybe DrupalCon Amsterdam?


Also revisit the journey on Google Drive:

https://drive.google.com/drive/folders/1-7xbDM7Do5RlaGPVoMXg_6hpS5aCy0qd

Jun 28 2019
Jun 28
Deirdre Habershaw

Oct 19, 2018

Today, more than 80% of people’s interactions with government take place online. Whether it’s starting a business or filing for unemployment, too many of these experiences are slow, confusing, or frustrating. That’s why, one year ago, the Commonwealth of Massachusetts created Digital Services in the Executive Office of Technology and Security Services. Digital Services is at the forefront of the state’s digital transformation. Its mission is to leverage the best technology and information available to make people’s interactions with state government fast, easy, and wicked awesome. There’s a lot of work to do, but we’re making quick progress.

In 2017, Digital Services launched the new Mass.gov. In 2018, the team rolled out the first-ever statewide web analytics platform to use data and verbatim user feedback to guide ongoing product development. Now our researchers and designers are hard at work creating a modern design system that can be reused across the state’s websites and conducting the end-to-end research projects to create user journey maps to improve service design.

If you want to work in a fast-paced agile environment with a good work-life balance, solving hard problems, working with cutting-edge technology, and making a difference in people’s lives, you should join Massachusetts Digital Services.

Check out some of our current postings here:

Creative Director (posting coming soon, submit your resume here)

Director of Technology (posting coming soon, submit your resume here)

Senior Drupal Developer (posting coming soon, submit your resume here)

Didn’t see a good fit for you? Check out more about hiring at the Executive Office of Technology and Security Services and submit your resume in order to be informed on roles as they become available.

Jun 28 2019
Jun 28

Valuable, easy to navigate, useful and (most of all) usable content. These are the criteria we use for selecting our favorite Drupal blog posts each month.

And this is precisely what all the pieces of content that we're about to highlight in this post here have in common.

From:
 

  • inspiring lessons learned while working on a specific Drupal project
  • to actionable insights and valuable advice on how to keep up with Drupal's own advancement
  • to highlighted modules that make the perfect fit for particular types of projects and challenges
     

… each one of these blog posts provided us with content with real impact.

But let's just get started: here are the 5 Drupal-related blog posts that won the “popularity contest” in our team this month:
 

Kemane Wright, from Texas Creative, challenges us to imagine (or recall) a scenario where we’d need to remove or style a field in a Drupal form.

Say you:
 

  • find one of your forms to be unnecessarily “cluttered” and you just want to reduce the number of fields for the end-user to fill in
  • just want to update that particular field’s functionality (say, turn it from an input field into a select field)
  • realize that there's no module available that you could use
     

Next, he reveals and details his 2 solutions to this dead end, both of them applied directly in code.

Now, don’t expect me to divulge them to you here. Do head to this team’s website and delve into the enlightening information there.

Another both useful and usable piece of content that we’ve added to our list of resources to tap into. And that’s because we’re on a constant “quest” for new ways of integrating artificial intelligence into our Drupal projects.

In this respect, we're more than grateful to the InternetDevels team for sharing their selection of Drupal modules that are using AI, all while outlining each one's functionalities.

If the topic of using Drupal for AI applications is of particular interest to you, we highly recommend this post, one of the best Drupal blog posts in June.
 

Justin Toupin, Aten’s CEO, is altruistic enough to share with us mortals his team’s discoveries from one of their previous Drupal projects.

More precisely, their set of 6 tools to keep at hand when dealing with data-intensive challenges and data visualization issues.
 

With this post, those at ADCI Solutions launched their “Drupal best practices series”.

For us, and particularly for the front-end developers here at OPTASY, it turned out to be a true gem, “stuffed” with valuable best practices on how to optimize one’s front-end code:
 

  • segmenting code into smaller chunks
  • integrating dynamic imports
  • ...
     

With everyone asking “How do I prepare for Drupal 9?” these days, Drudesk's piece of content stands out as one of the most useful Drupal blog posts of the month.

It lists all the critical preparations and To-Do's to consider for planning properly for Drupal 9:
 

  • from removing all deprecated code
  • to migrating your website from Drupal 7 to Drupal 8 instead of waiting to take the leap straight to Drupal 9, once released
     

The END!

These have been the 5 Drupal blog posts in June that had an impact here, at OPTASY:
 

  • on how we'll be using Drupal from now on
  • on how we'll be approaching our future Drupal projects
  • on how we'll improve our workflow to incorporate the best practices and the useful Drupal tools highlighted in these posts

Photo by Tony Hand on Unsplash

Jun 28 2019
Jun 28

Just imagine: you update content on one of your Drupal websites and content across your whole network gets automatically synchronized! That's Drupal on Blockchain in just a few words...

Say you manage a national library's infrastructure of Drupal websites, one for each of its local branches.

Now, here's how moving all the user data stored in there from your centralized database onto a decentralized blockchain system would benefit you:
 

  • readers get to validate their own user data since there's no central entity having full (and exclusive) control over it
  • once they've updated their user data on one of the library's websites, it'll get synchronized across the entire network
  • the well-known vulnerability to errors of centralized multi-site structures gets eliminated; there's no longer a centralized database acting as a single point of failure
  • the decentralized architecture speeds up any operation that gets performed across the network
  • you'd avoid scenarios where the same reader enters his login credentials on one of the library's websites and gets asked to enter them, once again, when accessing the website of another library branch
     

And I would also add: increased transparency, lower transaction costs...

But I'd better dive into more details on how Blockchain and Drupal can work together and how you can benefit from the decentralized architecture that they'd put together:
 

1. Blockchain: What You Need to Know About Its Potential

But first, here's Blockchain in plain words:

A decentralized, shared system where multiple participants store their data, interact directly with each other, and manage and keep a record of their transactions.

How is it different from the “old” way of managing transactions across a network?
 

  • there's no longer a centralized database for storing data and transactions; participants (nodes) store it among themselves
  • … this grants them total control over their own data/created content
  • users involved in a blockchain network get to interact with one another freely, with no need of a third-party as an intermediary
  • it establishes a system of rule-based transactions
  • each transaction — editing, deleting content, etc. — gets documented
  • it enhances communication between nodes/participants
  • transactions get carried out at higher speed and, implicitly, with fewer costs
  • with no central entity as a unique storage source, there's no single source of failure anymore
  • enhanced transparency
     

In short: blockchain enables you to set up a secure and immutable architecture for your network.
 

2. Blockchain and a New Content Distribution Model

“Transparency” is the keyword here. Decentralizing a content distribution platform would benefit both content creators and content consumers:
 

  • digital publishers become the only ones allowed to update or delete their own content
  • consumers pay producers directly for the content they consume (written content, songs, videos etc.)
     

This way: content creators get full control over their own content — there's no platform owner who could remove it to his/her liking — and get paid fairly and in real-time, for each piece of content that gets “consumed”.
 

3. Drupal on Blockchain: Why, How, and With What Costs?

Why would you want to decentralize your CMS — in this case, Drupal — and store your data on Blockchain? 

To answer your question, let me highlight just a few of the inconveniences of managing your content on a centralized Drupal database:
 

  • each transaction is explicit and irreversible
  • it poses a higher vulnerability to errors
  • multiple-user functionality can turn out to be a serious dread
  • the centralized database acts as a single point of failure: if something crashes in there, the whole system is at risk
  • updating content in your database doesn't automatically update it across your entire network of Drupal websites...
     

And how would the 2 technologies work together? Considering the fundamental differences between them:
 

  • Drupal uses a centralized architecture to manage content
  • Blockchain uses a decentralized, middleman-free workflow based on a verification element
     

Before I try to answer your legitimate question, let me ask you this:

Do you see any similarity between Drupal's “open data” philosophy and Blockchain's “decentralized data” principle?

Now, here's how your hypothetical “Drupal on Blockchain” architecture would look:
 

  • it'd be a much more secure, decentralized structure (you'd remove the single point of failure, remember?)
  • since a blockchain workflow would use an immutable validation of data, it'd act as a guarantee that no content can get modified by anyone other than its creator/distributor
  • user data/content would be easily accessible across the entire infrastructure (take the example of an enterprise-level business, running a multi-site Drupal network)
  • … and it'd synchronize in real-time across all your Drupal instances, as well...
  • transactions performed within this architecture would be rule-based: every single content update or removal will get documented
     

“But at what costs?” you might ask. What compromises would you need to make to run Drupal on Blockchain? 

What challenges should you get prepared for?

Here are 2 potential “dares” to ponder upon:
 

  • first of all, integrating your current Drupal data into a blockchain system won't be quite a “piece of cake”
  • secondly, getting the consensus of all the participants (say users whose data would be easily accessible network-wide) is also a serious issue to consider
     

4. Drupal Development Efforts in this Direction: The Blockchain Module

This duo — Drupal and Blockchain — has generated quite a lot of talk in recent years. And quite a handful of promising initiatives and even prototypes have been presented (integrations with Ethereum and Bitcoin...).

From all these initiatives of the Drupal community, I've decided to put the spotlight onto the Blockchain module (not yet covered by Drupal's security advisory policy).

Take it as a “scaffolding” to support your future “Drupal on Blockchain” architectures. It provides the functionality you need to:
 

  • set up your Drupal installations as blockchain nodes; independent “nodes”, meaning each one can get configured separately
  • ensure that your newly set up nodes are compatible with each other
     

The END!

This is the “why, how and at what costs” of this topic. One which has been on the lips (and on the DrupalCon slides) of members of the Drupal community for quite a while now.

What do you think?

Would such a decentralized Drupal on Blockchain architecture suit your own project's needs and constraints? Would you trade your central point of storage for the convenience of automated content synchronization?


Photo by Clint Adair on Unsplash

Jun 25 2019
Jun 25

I recently had reason to switch over to using Docksal for a project, and on the whole I really like it as a good easy solution for getting a project specific Drupal dev environment up and running quickly. But like many dev tools the docs I found didn’t quite cover what I wanted because they made a bunch of assumptions.

Most assumed either I was starting a generic project or that I was starting a Pantheon-specific project – and that I already had Docksal experience. In my case I was looking for a quick emergency replacement environment for a long-running Pantheon project.

Fairly recently Docksal added support for a project init command that helps set up for Acquia, Pantheon, and Platform.sh, but pull init isn’t really well documented and requires a few preconditions.

Since I had to run a dozen Google searches, and ask several friends for help, to make it work, I figured I’d write it up.

Install Docksal

First, follow the basic Docksal installation instructions for your host operating system. Once that completes, if you are using Linux as the host OS, log out and log back in (the installer just added your user to a group, and you need that access to start up Docker).

Add Pantheon Machine Token

Next you need a Pantheon machine token so that terminus can run within the new container you’re about to create. If you don’t have one already, follow Pantheon’s instructions to create one and save it someplace safe (like your password manager).

Once you have a machine token you need to tell Docksal about it. There are instructions for that (though they aren’t in the instructions for setting up Docksal with pull init); basically, you add the key to your docksal.env file:

SECRET_TERMINUS_TOKEN="HASH_VALUE_PROVIDED_BY_PANTHEON_HERE"

Also, if you are using Linux, note that the instructions linked above say the file goes in $HOME/docksal/docksal.env, but you really want $HOME/.docksal/docksal.env (note the dot in front of docksal to hide the directory).

Setup SSH Key

With the machine token in place you are almost ready to run the setup command; just one more precondition. If you haven’t been using Docker or Docksal, they don’t know about your SSH key yet, and pull init assumes it’s around. So you need to tell Docksal to load it by running:
fin ssh-key add  

If the whole setup is new, you may also need to create your key and add it to Pantheon. Once you have done that, if you are using a default SSH key name and location, it should get picked up automatically (I have not tried this yet on Windows, so mileage there may vary – if you know the answer please leave me a comment). It’s also a good idea to make sure the key itself is working right by getting the git clone command from your Pantheon dashboard and trying a manual clone on the command line (delete the clone once it’s done; this is just to prove you can get through).

Run Pull Init

Now finally you are ready to run fin pull init: 

fin pull init --hostingplatform=pantheon --hostingsite=[site-machine-name] --hosting-env=[environment-name]

Docksal will now set up the site, maybe ask you a couple of questions, and clone the repo. It will leave out a couple of things you may need: database settings and .htaccess.

Add .htaccess as needed

Pantheon uses nginx. Docksal’s formula uses Apache. If you don’t keep a .htaccess file in your project (and while there is no reason not to, some Pantheon setups don’t keep extra stuff around), you need to put it back. If you don’t have a copy handy, copy and paste the content from the Drupal project repo: https://git.drupalcode.org/project/drupal/blob/8.8.x/.htaccess

Finally, you need to tell Drupal where to find the Docksal copy of the database. For that you need a settings.local.php file. Your project likely has a default version of this, which may contain things you may or may not want, so adjust as needed. Docksal creates a default database (named “default”) and provides a user named… “user”, which has a password of “user”. The host’s name is ‘db’. So your settings.local.php file needs to include database settings at the very least:

<?php

// Point Drupal at the database Docksal provides inside its container stack.
// Docksal's defaults: database "default", user "user", password "user",
// reachable at the host named "db".
$databases = array(
  'default' =>
    array(
      'default' =>
        array(
          'database' => 'default',
          'username' => 'user',
          'password' => 'user',
          'host' => 'db',
          'port' => '',
          'driver' => 'mysql',
          'prefix' => '',
        ),
    ),
);

With the database now fully linked up to Drupal, you can now ask Docksal to pull down a copy of the database and a copy of the site files:

fin pull db

fin pull files

In the future you can also pull down code changes:

fin pull code

Bonus points: do this on a server.

On occasion it’s useful to have all this setup on a remote server not just a local machine. There are a few more steps to go to do that safely.

First, you may want to enable Basic HTTP Auth just to keep away from the prying eyes of Googlebot and friends. There are directions for that step (you’ll want the Apache instructions). Next, you need to make sure that Docksal is actually listening to the host’s requests and that they are forwarded into the containers. Lots of blog posts say DOCKSAL_VHOST_PROXY_IP=0.0.0.0 fin reset proxy. But it turns out that fin reset proxy has been removed; instead you want:

DOCKSAL_VHOST_PROXY_IP=0.0.0.0 fin system reset

Next you need to add the vhost to the docksal.env file we were working with earlier:

 VIRTUAL_HOST="test.example.org"

Run fin up to get Docksal to pick up the changes (this section is based on these old instructions).

Now you need to add either a DNS entry someplace, or update your machine’s /etc/hosts file to look in the right place (the public IP address of the host machine).
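For example, a hosts entry pointing the VIRTUAL_HOST above at your server might look like this (with a placeholder IP address):

203.0.113.10   test.example.org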

Anything I missed?

If you think I missed anything, feel free to let me know. Windows users in particular, feel free to let me know about changes related to doing things there. I’ll try to work those in if I don’t get to figuring that out on my own in the near future.

Jun 21 2019
Jun 21

This is no news anymore: preparing to upgrade to Drupal 9 is just a matter of... cleaning your website of all deprecated code. 

No major disruption from Drupal 8. No more compatibility issues to expect (with dread)...

“Ok, but how do I know if my website's using any deprecated APIs or functions? How do I check for deprecations, identify them and then... update my code?”

2 legitimate questions that must be “haunting” you these days, whether you're a:
 

  • Drupal 8 website owner
  • developer currently working on a Drupal project
     

Since the great news of this smooth Drupal upgrade ships with the answer to your “What” question (“What do I do to get my website ready for Drupal 9?”), but leaves the “How” question open:

“How precisely do I check my website for deprecated code?”

Are there any analysis tools available? Tools that you could run to get a thorough and accurate deprecated code report? 

Luckily, there are. And I'll be focusing on 2 of the most effective ones that you should consider integrating into your workflow: Drupal Check and the Upgrade Status module.
 

1. But What Is Deprecated Code? And What Website Elements Should You Audit?

A piece of code is considered deprecated if:
 

  • there's an upgraded alternative for it already available
  • it's still around, but no longer meant to be used
     
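A classic Drupal 8 example is drupal_set_message(), which got deprecated once the messenger service became its upgraded alternative:

// Deprecated in Drupal 8.5.0, to be removed in Drupal 9:
drupal_set_message(t('Settings saved.'));

// The upgraded alternative:
\Drupal::messenger()->addMessage(t('Settings saved.'));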

With this real “dilemma” now solved, there comes another one:

“What parts of my website should I check for deprecated code?”

Make sure you scan your:
 

  • Drupal core
  • Drupal modules
  • theme
     

Note: pay special attention to the contributed modules enabled on your Drupal 8 website; run a deep-scan and, if you get any deprecation warnings, make sure to alert those modules' maintainers to clean them up.
 

2. Drupal Check: Scan Your Codebase for Any Deprecations

A handy PHP static analysis tool to grab and run whenever you need to look for deprecated code in your codebase.

A command-line tool that Dries Buytaert recommends running the... automated way, closely integrated into your own workflow. What it'll do is track down instances of deprecated code for you.

Then, all that's left for you to do is to... remove them. And, depending on the context, to replace them with their upgraded alternatives.
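If you want to give it a spin, the setup typically looks something like this (the custom-module path is just an example; adjust it to your project layout):

composer require mglaman/drupal-check --dev
vendor/bin/drupal-check -d web/modules/custom

The -d flag restricts the analysis to deprecation checks.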
 

3. The Upgrade Status Module: Determine Your Site's Readiness to Upgrade to Drupal 9

If the idea of working with a command line doesn't sound too... “tempting” to you, how about adding a user-friendly graphical interface to the equation?

The Upgrade Status module, delivered to us by the Acquia team, led by Gábor Hojtsy, makes checking for deprecated code a lot more enjoyable and intuitive, thanks to its admin dashboard.

It's particularly handy if you're a Drupal site owner and not a senior Drupal developer highly familiar with CLIs.

Install it, enable it, and use it to evaluate your website and assess to what degree it is ready (meaning up to date) for the Drupal 9 upgrade.

But let's delve head first into details on:
 

  • what it takes to install it properly
  • what parts of your website it will deep scan
  • how you can narrow down its analysis to specific projects only
     

3.1. Use the Composer Package Manager to Install It

Since it ships with its own collection of third-party PHP dependencies, Composer is the way to install it.

Another key requirement to set the stage for the Upgrade Status module is to enable the Update Manager and the Git Deploy modules on your Drupal 8 website.
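Assuming a Composer-managed site with Drush available, the install usually boils down to:

composer require drupal/upgrade_status --dev
drush en upgrade_status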

Once installed, you can access its user interface at /admin/reports/upgrade.
 

3.2. Check Up Your Codebase, Modules and Themes

The great thing about this module is that you get to run your checks right in your admin UI and get a full report.

Another great aspect is that, when it comes to contributed modules, it will show you any available updates... inline.

Once it's completed its scan, it'll display either an “Errors found” or a “No known errors” message. To locate the identified deprecations on your website, just click “View errors”.
 

3.3. Run It on Specific Individual Projects, Too

Maybe you don't always need a full check. Maybe you'd like to scan only a specific project that you might be working on, to ensure that it's ready to upgrade to Drupal 9 when the time comes.

You can do that. The module allows you to cut down the time you'd spend on an unnecessary full scan by focusing on one target project only.

Furthermore, to streamline things even more, it enables you to export each deprecated code report individually...
 

4. So, You've Identified Your Deprecated Code: What Next?

In most cases, keeping your codebase up to date once you've detected the deprecated parts is just a matter of replacing those deprecations.

For the other few cases left, you'll need to carry out a more complex refactoring process.

Now that you know which tools to use (Drupal Check and the Upgrade Status module), your website's smooth upgrade to Drupal 9 depends on you exclusively.

On sticking to your own routine of checking up your Drupal core, modules and theme and keeping them up to date.

Image by fajar budiman from Pixabay

Jun 21 2019
Jun 21

I am working with a customer now that is looking to go through a JSON:API upgrade, from version 1.x on Drupal 8.6.x to 2.x and then ultimately to Drupal 8.7.x (where it is bundled into core).

As this upgrade will involve many moving parts, and it is critical to not break any existing integrations (e.g. mobile applications etc), having basic end-to-end tests over the API endpoints is essential.

In the past I have written a lot about CasperJS, and since then a number of more modern frameworks have emerged for end-to-end testing. For the last year or so, I have been involved with Cypress.

I won't go too much in depth about Cypress in this blog post (I will likely post more in the coming months), instead I want to focus specifically on JSON:API testing using Cypress.

In this basic test, I just wanted to hit some known valid endpoints, and ensure the response was roughly OK.

Rather than rinse and repeat a lot of boilerplate code for every API endpoint, I wrote a custom Cypress command which abstracts all of this away in a convenient function.

Below is what the spec file (the test definition) looks like; it is very clean, and is mostly just the JSON:API paths.

describe('JSON:API tests.', () => {

    it('Agents JSON:API tests.', () => {
        cy.expectValidJsonWithMinimumLength('/jsonapi/node/agent?_format=json&include=field_agent_containers,field_agent_containers.field_cont_storage_conditions&page[limit]=18', 6);
        cy.expectValidJsonWithMinimumLength('/jsonapi/node/agent?_format=json&include=field_agent_containers,field_agent_containers.field_cont_storage_conditions&page[limit]=18&page[offset]=72', 0);
    });
    
    it('Episodes JSON:API tests.', () => {
        cy.expectValidJsonWithMinimumLength('/jsonapi/node/episode?fields[file--file]=uri,url&filter[field_episode_podcast.nid][value]=4976&include=field_episode_podcast,field_episode_audio,field_episode_audio.field_media_audio_file,field_episode_audio.thumbnail,field_image,field_image.image', 6);
    });

});
jsonapi.spec.js

And as for the custom command implementation, it is fairly straightforward. The basic checks are:

  • Ensure the response is an HTTP 200
  • Ensure the content-type is valid for JSON:API
  • Ensure there is a response body and it is valid JSON
  • Enforce a minimum number of entities you expect to be returned
  • Check for certain properties in those returned entities.

Cypress.Commands.add('expectValidJsonWithMinimumLength', (url, length) => {
    return cy.request({
        method: 'GET',
        url: url,
        followRedirect: false,
        headers: {
            'accept': 'application/json'
        }
    })
    .then((response) => {
        // Parse the JSON body.
        let body = JSON.parse(response.body);
        expect(response.status).to.eq(200);
        expect(response.headers['content-type']).to.eq('application/vnd.api+json');
        cy.log(body);
        expect(response.body).to.not.be.null;
        expect(body.data).to.have.length.of.at.least(length);

        // Ensure certain properties are present.
        body.data.forEach(function (item) {
            expect(item).to.have.all.keys('type', 'id', 'attributes', 'relationships', 'links');
            ['changed', 'created', 'default_langcode', 'langcode', 'moderation_state', 'nid', 'path', 'promote', 'revision_log', 'revision_timestamp', 'status', 'sticky', 'title', 'uuid', 'vid'].forEach((key) => {
                expect(item['attributes']).to.have.property(key);
            });
        });
    });

});
commands.js

One of the neat things in this function is that it logs the parsed JSON response with cy.log(body); this allows you to inspect the response in Chrome. That makes it easy to extend the test function to meet your own needs (as you can see the full entity properties and fields).

Cypress with a GUI can show you detailed log information

Using Cypress is like having an extra pair of eyes on the Drupal upgrade. Over time Cypress will end up saving us a lot of developer time (and therefore money). The tests will be in place forever, and so regressions can be spotted much sooner (ideally in local development) and therefore fixed much faster.

If you do JSON:API testing with Cypress I would be keen to know if you have any tips and tricks.

Jun 20 2019
Jun 20

What does Drupal 8 do that Laravel does not? What key functionalities, that Drupal ships with, do you need to build from scratch in Laravel? And how would opting for Laravel benefit your specific type of project? In short: Laravel or Drupal 8?

“It's like comparing apples to oranges”, some might say, since one's a framework and the other one a CMS.

Even so, if it's unclear to you what are their particular use cases and their built-in features, you can't know whether it's a CMS or a framework that best suits your project type, right? That best serves your project-specific needs:
 

  • to be super fast
  • to leverage a solid, off-the-shelf content management system for publishing different pieces of content on the website
  • to feature an easy to scale database
  • to support multisite
  • to tap into robust user and content management features that are already implemented
  • to be built on top of a solid framework acting as a reliable back-end application
  • to leverage a highly intuitive admin user interface
  • to be 101% secure
  • to leverage a mixture of server and client-side logic
     

Now, keep your list of project requirements and constraints at hand to evaluate these 2 technologies' pros and cons against it:
 

1. Drupal 8: Top Benefits, Main Drawbacks, and Specific Use Cases

If a robust user and content management system is critical for your project, then Drupal 8 makes the smartest choice. It's that “thing” that Drupal excels at, and one that would take you a whole lot more time to build in Laravel.

And it's not just its robustness that might “lure you in”, but the level of convenience that it provides: a lot of the essential features and functionalities that you might need are already built-in.

Moreover, you can easily manage them and custom-tune them via your admin interface...

By comparison, you'd need to build these functionalities, from the ground up, if you chose to go with Laravel.
 

Top benefits:
 

  • you can rest assured that your website runs on a particularly robust, Symfony-based CMS
  • there's a huge, dedicated community backing it up
  • you get to create various content types, for different parts of your website, assigned with different roles; unlike basic CMSs, which only enable you to write... posts and to create new web pages
  • you can set up different editorial workflows and assign specific user roles, with fine-grained access control
  • you can always further extend its CMS-specific functionalities: extensibility is one of the strongest Drupal 8 benefits
     

Main drawbacks:
 

  • you do need a team of Drupal experts (senior-level preferably) to keep an eye on your Drupal 8 website/app and keep everything properly maintained
  • you can't get away with a “get it up and running and... move on” type of philosophy; Drupal 8 is more of a long-term commitment: there's always a newly launched promising module to consider adding on, a new update to run...
     

Specific Use Cases for Drupal 8:
 

  • large-scale projects that depend on a robust and reliable content management system; one that withstands an intense, ongoing process of creating, editing and publishing lots of fresh content
  • Laravel or Drupal 8? Definitely the latter if it's a multi-site, multi-language web project that you plan to develop; not only does it streamline content publishing across your whole network, but it significantly speeds up localization thanks to its server-side caching capabilities
     

It means that no matter where on the globe your users are located, they get to access your web pages and have them load... instantly.
 

2. Laravel: Pros, Cons, and Project Types that It's Best Suited For

Laravel stands out as a highly reputed, powerful PHP framework.

If:
 

  • maintainability is one of your biggest concerns
  • you're looking for a robust framework
  • you need to carry out your project fast enough
  • you need a framework that ships with all the latest functionalities
     

... then Laravel is what you need.
 

Top Benefits:
 

  • a fast-growing, devoted community
  • you can easily integrate LDAP authentication 
  • it leverages the Model-View-Controller architecture
  • it's just... fast
  • it provides you with a great admin user interface
  • it “spoils” you with intuitive, beautifully written code
  • it ships with a heavy “toolbox”: scan through and pick the most suitable one(s) for your project
  • in-built code for social login and sending out emails
  • everything you might need to set up during the development process is right there, already integrated into your code: cron jobs, database queries, routes... (see the sketch after this list)
     
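To make that concrete, here's a minimal sketch of the kind of routing-plus-query wiring Laravel hands you out of the box (the Post model and its 'published' column are hypothetical):

<?php
// routes/web.php
use App\Post;
use Illuminate\Support\Facades\Route;

Route::get('/posts', function () {
    // Eloquent query: the ten most recent published posts.
    $posts = Post::where('published', true)->latest()->take(10)->get();

    // Hand the data to a Blade view (resources/views/posts/index.blade.php).
    return view('posts.index', ['posts' => $posts]);
});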

Main drawbacks:
 

  • more often than not identifying performance issues isn't that straightforward
  • upgrading to the latest version of Laravel can turn out to be quite a challenge: be prepared for “buggy scenarios” and for the need to rewrite code
  • you can't just jump straight to Laravel: learning the basics of OOP first is a must
     

Specific Use Cases:
 

  • your project needs a back-end application (rather than an off-the-shelf CMS)
  • when the benefits of the MVC architecture (faster development process, suitable for large-scale projects, multiple views, etc.) are critical for the given project 
  • whenever you need to mix the client-side with server logic
  • whenever time is the main concern for you: you just need your project developed super fast
     

3. So... Laravel or Drupal 8? 

Now, I'm sure that you already anticipate my answer:

The choice depends strictly on your project requirements and objectives.

On your own hierarchy of priorities in terms of features and functionalities.

And depending on these key aspects, which should be clearly defined, one technology will benefit you over the other.

So... what type of project are you looking to build?

Photo by Raquel Martínez on Unsplash 

Jun 20 2019
Jun 20

We’ve been starting many of our projects using Acquia’s Lightning distribution. This gives us a good, consistent starting point and helps speed development through early adoption of features that are still in the works for Drupal 8. Like other distributions, Lightning bundles Drupal Core with a set of contributed modules and pre-defined configuration.

While Lightning is a great base to start from, sometimes you want to deviate from the path it provides. Say for example you want to use a Paragraphs based system for page components, your client has a fairly complex custom publishing workflow, and you also have different constraints for managing roles. Out-of-the-box, Acquia Lightning has a number of features you may find yourself in conflict with. Things like Lightning Layout provide a landing page content type that may not fit the needs of the site. Lightning Roles has a fairly hard-coded set of assumptions for role generation. And while it is a good solution for many sites, Lightning Workflow may not always be the right fit.

You may find yourself tempted to uninstall these modules and delete the configuration they brought to the party, but things are not always that simple. Because of the inter-relationships and dependencies involved, simply uninstalling these modules may not be possible. Usually all looks fine; then, when it comes time for a deployment, things fall apart quickly.

This is where sub-profiles can save the day. By creating a sub-profile of Acquia Lightning you can tweak Lightning’s out-of-the-box behavior and include or exclude modules to fit your needs. Sub-profiles inherit all of the code and configuration from the base profile they extend. This gives the developer the ability to take an install profile like Acquia Lightning and tweak it to fit her project’s needs. Creating a sub-profile can be as easy as defining it via a *.info.yml file.

In our example above, you may create a sub-profile like this:

name: 'example_profile'
type: profile
description: 'Lightning sub-profile'
core: '8.x'
base profile: lightning
themes:
  - mytheme
  - seven
install:
  - paragraphs
  - lightning_media
  - lightning_media_audio
  - lightning_media_video
exclude:
  - lightning_roles
  - lightning_page
  - lightning_layout
  - lightning_landing_page

This profile includes dependencies we’re going to want, like Paragraphs – and excludes the things we want to manage ourselves. This helps ensure that when it comes time for deployment, you should get what you expect. You can create a sub-profile yourself by adding a directory and info.yml file in the “profiles” directory, or if you have Drupal Console and you’re using Acquia Lightning, you can follow Acquia’s instructions. This Drupal Console command in Lightning will walk you through a wizard to pick and choose modules you’d like to exclude.

Once you’ve created your new sub-profile, you can update your existing site to use this profile. First, edit your settings.php and update the ‘install_profile’ settings.

$settings['install_profile'] = 'example_profile';

Then, use Drush to make the profile active.

drush cset core.extension module.example_profile 0

Once your profile is active and in-use, you can export your configuration and continue development.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
