Jul 26 2019

The expanding data landscape is feeding the demand for higher operational agility. This calls for a more responsive, reliable IT infrastructure that doesn’t cost millions, minimizes delays and downtime, and improves security while keeping infrastructure agile.

Between capacity constraints and unpredictable pricing models, growing infrastructure needs are hard to meet on your own. AWS supports these workloads with a host of services - IaaS, PaaS, SaaS - for Drupal enterprises.

Here’s how you can run your Drupal up to 40% cheaper.

Keeping the Business Innovative and Agile

Demands for performance, scalability and agility have never been higher, and business success and growth depend on meeting them. At the same time, the changing landscape is pushing businesses toward lower costs, greater use of cloud resources and better customer service.

While these changes have implications on infrastructure, compute and networking resources, they also impact storage. 

Lack of sufficient database storage, for example, can adversely impact application performance. Fast-growing applications may need more storage than expected, or may need additional storage immediately.


Capacity and storage issues can hinder business agility


The continuous need for speed and efficiency is driving businesses to opt for the storage-as-a-service (STaaS) model. But there is more to it when it comes to the benefits. Businesses get:

  • Better insights at a reasonable cost: A highly scalable, low-cost environment can handle massive data volume and velocity, letting organizations shift between the two available models (CapEx and OpEx) for more predictable costs.
  • Better collaboration: Cloud-based business solutions accelerate innovation, delivering business analytics at the point of impact and enabling collaboration by creating and linking business networks.
  • Innovation and variety of solutions: Forward-thinking enterprises adopt STaaS to speed up business innovation, improve overall data-centre efficiency, and achieve integrated, innovative business results.
  • Proven results: Organizations achieve their desired business outcomes by improving the responsiveness of their IT infrastructure without increasing risk or cost.


In order to avoid such challenges in the future, Drupal-powered enterprises need to constantly understand and adapt to the changing landscapes.

Azim Premji Foundation, Georgia Technical Authority and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS

While Drupal helps manage rapid data growth, the right cloud storage solution needs to offer security and robust scalability without straining the budget, while preparing IT and marketing for what comes next.

Run your Drupal 40% cheaper

Choosing the right technology is crucial to avoid equipment failures and the costs of upgrading hardware. Small and medium enterprises and non-profits especially need sustainable solutions that meet future needs without overcommitting budgets today.

Finding the perfect match, organizations such as Azim Premji Foundation, Georgia Technical Authority, UCAS and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS.

Enterprises need sustainable solutions without overcommitting budgets today

AWS offers cloud web hosting solutions that provide businesses, non-profits, and governmental organizations with low-cost ways to deliver their websites and web applications.

The pay-as-you-go approach lets you pay only for the individual services you need, for as long as you use them, without requiring long-term contracts or complex licensing.

Similar to how you pay for utilities like water and electricity.  

You only pay for the services you consume, and once you stop using them, there are no additional costs or termination fees.

The pricing models give your enterprise the flexibility to grow your business unencumbered by IT:

  • Pay-as-you-go

With AWS you only pay for what you use, helping your organization remain agile, responsive and always able to meet scale demands. You can easily adapt to changing business needs without overcommitting budgets, improving your responsiveness to change and reducing the risk of overprovisioning or running out of capacity.

By paying for services on an as-needed basis, you can redirect your focus to innovation and invention, reduce procurement complexity and make your business fully elastic.

  • Save when you reserve

By using reserved capacity, organizations can minimize risks, more predictably manage budgets, and comply with policies that require longer-term commitments.

For certain services like Amazon EC2 and Amazon RDS, enterprises can invest in reserved capacity. With Reserved Instances, you can save up to 75% over equivalent on-demand capacity.


When you buy Reserved Instances, the larger the upfront payment, the greater the discount.

  • Pay less by using more

AWS offers volume-based discounts, so organizations save more as usage grows. For services such as S3 and data transfer OUT from EC2, pricing is tiered, meaning the more you use, the less you pay per GB.

In addition, data transfer IN is always free of charge.

As a result, as your AWS usage needs increase, you benefit from the economies of scale that allow you to increase adoption and keep costs under control.
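To make the tiering concrete, here is a small PHP sketch of how a tiered per-GB price schedule is typically computed. The tier boundaries and per-GB prices below are hypothetical placeholders, not actual AWS rates; always check the current S3 pricing page.

```php
<?php
// Illustrative only: tier sizes and per-GB prices are hypothetical,
// not actual AWS rates.
function tieredStorageCost(float $gb): float {
  // Each tier: [size of tier in GB, price per GB within that tier].
  $tiers = [
    [50 * 1024, 0.023],     // first 50 TB
    [450 * 1024, 0.022],    // next 450 TB
    [PHP_FLOAT_MAX, 0.021], // everything beyond 500 TB
  ];
  $cost = 0.0;
  foreach ($tiers as [$size, $price]) {
    // Bill only the portion of usage that falls inside this tier.
    $used = min($gb, $size);
    $cost += $used * $price;
    $gb -= $used;
    if ($gb <= 0) {
      break;
    }
  }
  return $cost;
}

// 100 GB falls entirely within the first tier.
echo tieredStorageCost(100); // 2.3
```

The key point is that higher tiers only apply to the marginal usage above each boundary, which is why the average per-GB cost falls as consumption grows.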

As your organization evolves, AWS also gives you options to acquire services that address your business needs. For example, AWS’ storage services portfolio offers options to lower pricing based on how frequently you access data and the performance you need when retrieving it.


To optimize savings, choose the right combination of storage solutions to reduce costs while preserving performance, security and durability.


Case Study: Reducing cost & improving operational efficiency for Drupal application with AWS

Our client, a legal firm, provides jurisdictions and litigants with simple, seamless, and secure access to the record of legal proceedings. They built a SaaS-based workflow management application on Drupal to manage and track digital recordings of legal proceedings and transcripts, including appeals, for stakeholders.

The goal was to build a robust, cloud-based server to effectively handle the processing and access to a large volume of text, audio and video files.

Since the business model depended on frictionless uploading and downloading of text and media files, an AWS cloud-based server emerged as the unanimous choice.

Business benefits

  • Simplified integration of the client's Drupal application with AWS S3, to enable flexible, cloud-native storage
  • As a result of going all-in into the AWS Cloud, the client reduced costs by 40% and increased operational performance by 30-40%
  • Dynamic storage and pay-as-you-go pricing enabled the client to leverage a highly cost-effective cloud-storage solution

Read the complete case study on Cloud-Native Storage for Drupal Application with AWS.

Get no-cost expert guidance

Designed to help you solve common problems and build faster, Amazon Web Services provides a comprehensive suite of solutions to secure and run your sophisticated and scalable applications.

Srijan is an AWS Advanced Consulting Partner. Schedule a meeting with our experts, at no cost and with no sales pitch, and get started on your cloud journey.

Jul 26 2019

You know how important accessibility is, but now what? There are a lot of well-intentioned sites on the internet that aren’t accessible.

Is your website accessible?

How do you find out?

Well, it’s not as hard as it seems—and we’re here to help! Here are a few quick ways to measure the accessibility of your website.

1. Automated accessibility tests

While automated tools will only catch about 30% of accessibility bugs, they will give you a general idea of your site’s accessibility and show you some ways to make improvements.

Lighthouse: Chrome’s Accessibility Reporting Tool

Lighthouse is a free tool available right in Chrome. You can use it through Chrome’s online testing site, in your development tools when you inspect a page, or with a browser plugin. Keep in mind that manual testing is also required to get a full picture of accessibility—we’ll cover that in just a moment.

To use the tool by going to a URL: Visit https://web.dev/measure and paste the URL of the page you want tested into the form field, then click “Run Audit” to see results.

To use the tool through Inspect:

  1. Right click on the webpage you want to test and select “Inspect” from the dropdown, or press Command + Option + I on your keyboard. This will open the inspect tool on the last panel you used, so if the last thing you did was run an audit, it will bring you back to the Audits panel.

    Dropdown menu: Inspect

  2. In the inspection window at the top right, click on the button with a double arrow, or expand the window until you see “Audits.” Select “Audits.” Dropdown menu: Audits selected
  3. Select your device size (mobile or desktop), and select “Accessibility” from the Audit Type options.
  4. Click “Run Audits.”
  5. A report will pop up in the inspect window with your overall score and details about the results. Scores are out of 100, and 100 does not mean that a site is completely accessible; it means that it passed all automated tests. Lighthouse score display
  6. Below the score are details about accessibility errors. Toggle open these errors to see what element is failing and how to make fixes. Lighthouse error details

WAVE: Firefox and Chrome Extension

WAVE is a browser extension that allows you to run an automated accessibility test on a page of your website. It’s very thorough and one of our favorites for testing and fixing accessibility bugs.

To use WAVE:

  1. Install the WAVE Extension
  2. Go to the webpage you want to test, and click on the WAVE icon in the tools portion of your browser window. A report will pop up and your page will be marked up with the results of the review.

    Website in browser, with WAVE icon in toolbar

  3. A Summary will show up by default listing the number of Errors, Warnings, and other details on the page.

    WAVE tool summary screen

  4. Click on the Flag icon to see more details, including information about which errors are on the page.

    Wave tool details tab

  5. Clicking on the Tab at the bottom of the page that says “< code >” will show you the code marked up with the errors found.
  6. With the “< code >” tab open, you can click on the errors and warnings in the panel on the left to jump to the errors in the code. In the image below, clicking on the yellow rectangle “Redundant Link” icon in the report panel makes the code jump to the offending code.

    WAVE tool with code drawer open below website

2. Manual accessibility tests

A manual test will catch things automated tests can’t quite figure out. Robots are good for some things, but they can’t figure out human intention, so things like tab order, visual theming and good alt tags should be manually tested.

A toy robot from the 1950's

Keyboard testing makes sure that the site works for folks who are blind, who have low vision, who have limited mobility, or the person whose trackpad is broken. Conduct the following tests to see if your site is accessible to those using a keyboard to navigate:

  1. Go to the page you’d like to test. Start with your cursor in the address bar, and hit the “tab” button to navigate through the page. Each time you press tab, you should be moved to the next button, link or form input.
  2. Ideally, the first link you get to on the page is a “skip to main content” link that allows users to skip repeated navigation items.

    Webpage showing 'Skip to Main Content' link

  3. As you continue to tab through the page, you should be able to see where the focus is as it lands on each button, link and form field. Pro-tip: if you lose track of where the focus is because there’s no visual indication, that’s an accessibility issue.
  4. Check the order: Does pressing the tab key follow the natural flow of the page, or does it jump around? A good tab order follows the natural flow.
  5. Can you operate all menus, pop-ups, buttons, and forms?
  6. Can you press shift tab and navigate backwards?
  7. Are there items that are clickable that don’t receive focus?

Important Note: Keyboard testing needs to be done on mobile as well as desktop. Why? Some users who are blind don’t use full-sized computers or laptops because they don’t actually need a large display. Other users have low vision and magnify their screens. Which leads us to testing with zoom…

3. Testing with zoom

If you zoom a desktop screen to 400% on a responsive site you get…the mobile site! This is why testing on mobile and desktop is important.

Now that you’ve increased the screen to 400%, browse the page. As you browse ask yourself:

  1. Does text content get cut off?
  2. Do buttons get pushed off of the page?
  3. Is the functionality intact?
  4. Is there key functionality on desktop that’s no longer available on the mobile version?

4. Testing with a Screen Reader

Using a screen reader is a more advanced testing approach, and very helpful in identifying accessibility bugs on a site. If you use a Mac, VoiceOver is the built-in screen reader. To turn VoiceOver on or off, press Command + F5. Here’s a quick video tutorial on how to test your page using VoiceOver. The video description includes the full text of the captions as a quick reference.

[embedded content]

You can also turn on VoiceOver and tab through the page again to see if icon buttons are labeled properly, if the form labels you’ve applied make sense, and if alt tags on images are useful. If you press Control + Option + A all at once, VoiceOver will start reading every element from where you are on the page. If you tab, it will read the buttons, links and form inputs.

To sum it up:

Learning about different testing methods can help inform and add clarity to the process of making your site accessible. This is one of the most critical steps in your journey to making a website that everyone can experience. If you want to know how to transform these errors into a site that reads and navigates smoothly for all users, ThinkShout is here to help! Contact us to learn more about how we can partner to make your website more accessible.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Jul 26 2019

Is it possible to enhance your Drupal experience?

Drupal is a favorite content management system among professionals. It has been proven time and time again that it is reliable, scalable and can turn any website into a magical digital experience that your customers love. For these reasons, Drupal has gathered a passionate community that constantly wants to see it improve. Here at Sooperthemes, we are also driven by our passion for Drupal. We take Drupal and improve its shortcomings through our products. In other words, we enhance your Drupal experience with our framework theme and easy-to-use Drupal 8 & 7 visual content builder.

What are some examples of real-life organizations using Sooperthemes products?

It’s time to show you the results of using our easy-to-use drag and drop builder, and our framework theme. Here is a list of websites that were entirely built over the Drupal architecture using Glazed Builder and Glazed Theme:

1. U.S. Senate

The U.S. Senate is a core part of the legislative process of the United States. Such an important institution needed a really good website platform, and Drupal was chosen because it can handle large and complex websites. On top of that, the Senate chose to build all websites for newly inaugurated senators in 2019 with our Glazed Builder and Glazed Theme products. The result is modern-looking senator websites that provide a great experience at low cost to the Senate, because much of the page-building work can be done in-house thanks to our easy-to-use page builder.

2. Swarco


Swarco is a company that offers traffic technology for better and safer transportation. It is based in Innsbruck, Austria, and has an international network of production facilities that are sure to meet the needs of its clients. Swarco decided to improve its online presence by overhauling its website with Glazed Builder. This resulted in an unforgettable digital experience that leaves a long-lasting impression. Well done!

3. Body Worlds


Body Worlds is the biggest traveling exposition of dissected human bodies. The exposition has attracted more than 37 million visitors, which makes it one of the hottest tourist attractions to date. Such a successful exposition had to have an online presence that reflected its success. That's why Body Worlds built its website with Glazed Builder. This resulted in a gorgeous website that attracts clients from all over the world.

4. Monterrey Institute of Technology and Higher Education


Monterrey Institute of Technology and Higher Education is one of the most prestigious universities in Latin America. With its headquarters in Monterrey, Mexico, Tec offers the finest education to its students. Such a successful university required a beautiful website that can convince prospective students to join its ranks. That's why Tec decided to go for the combination of Drupal and Glazed Builder. The result is a beautiful website that handles the multilingual needs of the university while attracting a large number of students.

5. Open Medical


Open Medical is a company that wants to improve the delivery of healthcare services to the general public. To do this, they partner with various companies that help them reach their goal. Such an initiative also needed a good website to showcase their mission and values. That's where Glazed Builder came into play. The result is a practical website that demonstrates Open Medical's trustworthiness to potential customers, which has increased the number of clients and leads generated.

What are Sooperthemes' products?

Sooperthemes bases its products on the Drupal architecture. This means that you get the best that Drupal has to offer without any of its drawbacks, making it possible to enhance your Drupal experience. The Sooperthemes portfolio includes a large number of turn-key demo websites that can be used to quickly set up a gorgeous website that converts leads to customers right out of the box. There is a wide selection of demos to choose from based on the industry in which your company conducts business.

Here are a couple of examples of our demos, built entirely with our drag-and-drop Drupal content editor and our framework Drupal theme.

Marketing Drupal Theme Demo:


This theme is perfect for any marketing agency that wants a gorgeous website that looks professional and attracts high-caliber clients. The theme is highly customizable and can be adapted to the needs of every marketing agency.

Business Drupal Theme Demo:


Sooperthemes also provides a business website theme, perfect for people who want a new and astonishing website for their clients. The business theme focuses on a more professional look that conveys trust to your prospects, making it the perfect choice for any business owner who wants to provide a great online experience for their customers.

Agency Drupal Theme Demo:


Our agency theme is the perfect choice for any agency that wants to create or improve its digital presence, especially if you want to enhance your Drupal experience. It is designed to fit the needs of any agency that wants to impress its audience, and its intuitive design can surely make a great website for your agency.

Logistics Drupal Theme Demo:


Sooperthemes has the perfect website theme for any logistics company that wants an impressive online presence. The layout and design are specially adapted to convey the fluidity and speed with which logistics companies drive business. Moreover, these themes can be further customized to reflect your brand.

Photography Drupal Theme Demo:


Are you passionate about photography but don't know how to monetize your hobby? The Glazed Photography theme is the right answer for you. You can easily set up your website to show your clients your finest material, and it will give you an edge over your competition.

Construction Drupal Theme Demo:


Any construction company has to have a jaw-dropping online presence in order to be successful. This is what you get by building your website with the Glazed construction theme. The theme is perfectly adapted to reflect the seriousness and commitment of the construction industry. Whether you want to showcase your team or your portfolio, it is the perfect choice to make a lasting impression on any potential client.

Powerful content capabilities with Sooperthemes' easy-to-use visual content builder

These themed demo sites are further customizable to suit your needs with our Glazed Builder module. This module makes it easy to turn your dream website into reality. Glazed Builder is a powerful Drupal-based drag and drop visual builder that can make any Drupal website shine. One of the struggles that Drupal users seem to have at first is the steep learning curve, which can require a large number of hours, essentially bottlenecking the workflow. In order to bypass this struggle, Sooperthemes designed Glazed Builder, effectively helping website designers and marketers save countless hours and money on working with Drupal. The hours saved can be used for other important tasks.  One of the great points about Glazed Builder is that it makes designing a Drupal website seem effortless.

Why enhance your Drupal experience with Sooperthemes?

This is a great question that everybody should be asking themselves before making a purchase decision. Well, let me explain.


Sooperthemes is driven by its passion for Drupal. Our main goal is to enhance your Drupal experience. To do so, we address the most common pain points that Drupal has: long development times, a steep learning curve and a difficult user interface. With the Glazed theme, users quickly get a template for their Drupal website that is easy to customize and deploy. On top of that, Glazed Builder replaces Drupal's powerful but complex native user interface with drag-and-drop capabilities and an intuitive UI, and it incorporates a large number of elements that can be used to further customize your website: sections, panels, jumbotrons, wells, collapsibles, Drupal blocks, Drupal views and much more.


As you can see, imagination is the only limiting factor when it comes to web design with Glazed Builder and its ability to enhance your Drupal experience.


If you want to enhance your Drupal experience, then Sooperthemes is the right answer for you. Not only does it offer the best of what Drupal has to offer, but it also turns Drupal's weak points into strong points. If you're not convinced yet, no problem: try Sooperthemes for free here.

Jul 26 2019

Sometimes we just want to see if a thing works

Recently I ran into a situation while building out the Watson/Silverpop Webform Parser where I just wanted to test and see if a few things worked without having to reload and bootstrap Drupal every time I refreshed a page. I also wanted to utilize some of the classes and methods made available from Symfony and Drupal. Can you feel my dilemma?

Let's be honest, running drush cr and refreshing a page takes time, and I'm an impatient person, so I wanted to see if there was a way to use some of these things without going through the pain of waiting for a Drupal bootstrap. Turns out, the solution wasn't that difficult and it made development on many methods of my module more pleasant than it could have been.

Here's the scenario

I wanted to test a few things that didn't require the database connection. Specifically, Drupal\Core\Serialization\Yaml to parse an array that I was building into YAML. So, what I did was stub out what would become my class WatsonFormEntityForm in a file in my docroot that I creatively named test.php. Now I was able to navigate to local.docksal/test.php on my local machine and see things working.

Get to the good stuff, already!

I got to a point in my development where I was able to convert the submitted HTML, in this case from a call to file_get_contents('./test.html'); into an array that I could work with. Xdebug was going great, and so was the module, but I wanted to see if I could convert it into YAML using a native Drupal method. The solution came with one single line of code.

$autoloader = require_once 'autoload.php';

This tells PHP, "Hey, we got a file here that wants to use some of the methods and classes in the Autoloader. Let's go ahead and let it!" This variable doesn't need to be called anywhere in the file. It just needs to exist. Now I was able to update the file with a few friendly use statements from within the Drupal and Symfony ecosystem without having to wait for all the database connections to happen.

The end result was:


use Drupal\Core\Serialization\Yaml;

$autoloader = require_once 'autoload.php';

// Do all the things here, including:

$yaml = Yaml::encode($array);


It sped up development, and it made it so I didn't have to wonder if something wasn't working because I forgot to drush cr all the things or if it was just because I made some mistakes.


Be sure that any code you're running doesn't rely on database calls or the container. For instance, trying to run $nodes = \Drupal::entityTypeManager()->getStorage('node')->loadMultiple(); is going to throw a painful error.

Also, this is mainly for rapid prototyping or proving concepts. I don't recommend writing an entire module in procedural code and then trying to refactor later. Maybe take it one function at a time just to make sure it's doing what you want it to do.

Let me know if this helped you out or if you have better suggestions for rapidly testing some Drupal stuff without having to rely on a full bootstrap. As always, feel free to reach out to me on the Drupal Slack, where I'm always Dorf and always willing to help if I can.

Jul 25 2019

The ideas in this post were originally presented by Suzanne Dergacheva at Drupal North 2018.

If you've opted for Drupal, then you must be dealing with a large amount of content, right? Now, the question that arises is: how do you build out and structure a complex content architecture in Drupal 8?

You definitely don't run out of options when it comes to organizing your content:

  • content types
  • paragraph types
  • (custom) block types
  • custom fields

And things get even more complex when you start to consider all the various relationships between these entities. 

Now, let me help you structure this huge “pile” of different options, approaches and best practices for setting up an effectively organized content structure.

What Makes Drupal Ideal for Creating a Flexible Content Architecture?

One of Drupal's key selling points is that it ships with tools and workflows designed specifically to support a flexible content architecture.

And I'm talking here about its:

  • WYSIWYG editor
  • all the tools that streamline the content creation and publishing process
  • access control system based on user roles and permission levels
  • ecosystem of Drupal 8 content types (blocks, nodes, paragraphs, terms)

All these tools combined empower you to:

  • create any type of content (survey, landing page, blog entry...) nice and easy
  • control where and how that piece of content should be displayed on your website
  • categorize and structure your large amount of content using different content entity types

In short: Drupal 8 is built, from the ground up, to support a well-structured, yet highly flexible content architecture.

Step 1: Plan Out Your Content Architecture in Drupal 8: Identify the Needed Content Types 

A well-structured content architecture is, above all, a carefully planned out one. 

Start by analyzing your content wireframe to identify your content needs and to... fill in the blanks:

  • decide what content you need on your website, how/where it should be displayed and to whom it should be accessible
  • identify the various content entity types for each piece of content
  • set out all the fields that each content entity type requires
  • define your taxonomy term entities

It's also the step where you gradually start to populate each category outlined in your content wireframe with the corresponding types of content.

Step 2: Set Out Your Well-Defined Content Types

I'm talking about those traditional, crystal-clear content types like articles or job postings, where the structure is pre-defined and it leaves no room for interpretation.

Those fixed content types that guarantee consistency across your website, that are easy to search and to migrate.

This is the step where you define each one of these content types' elements —  paragraphs, data, long text, images, etc. — and their order. 

Step 3: Set Out the Relationships Between Various Types of Content

Since you're dealing with a complex content structure, an intricate network of references between different pieces of content will be inevitable.

Now, it's time to set out all those explicit relationships between your node references and their referenced nodes, between term references and terms...

Note: the implicit relationships will form by themselves; you have no control over those.

Step 4: Define the Multi-Purpose Content and the Reusable Pieces of Content

While at this phase, where you identify the content types that you'll need, remember to add the multi-purpose and the reusable content types, as well.

Speaking of multi-purpose content, it's the kind of content type (e.g. the landing page) where you don't yet know what content it should include, or what order its content elements should be displayed in.

Therefore, you need to keep it flexible enough for future additions and modifications. In this respect, the Paragraphs module is the flexible page builder that you can rely on. The “secret” is to build your paragraph types — call to action, webform, view — along with the fields that they incorporate and to... leave it flexible for future updates.

Now, as for the reusable type of content, the best example is, again, that of a landing page with multiple reusable blocks that you can move around to your liking.

What you can do at this stage is to define your block types: image, view, call to action.

Step 5: Create Your Custom Entities and Custom Fields

While structuring a complex content architecture in Drupal 8 you'll inevitably need to create some custom entities and fields, as well.

With a large pile of content to deal with, there will be cases when the options that Drupal provides aren't suitable. For instance, you might need to define some special rules for a specific piece of content.

In this case, creating a custom entity is a must, but make sure you've carefully thought through all its potential use cases and specific workflows, and that you've invested enough time in prototyping it.

Also, you might find yourself in a situation where one of the fields needs to be stored or validated in a particular way. For instance, you might need to create a multi-value field. Since these scenarios call for a custom field, again, take your time to prototype it thoroughly.

The END!

These are the main steps to properly structure your complex content architecture in Drupal 8. The golden rule should be: always leave some room for flexibility.

Photo by Alain Pham on Unsplash 

Jul 25 2019
Jul 25

Submitted by karthikkumardk on Thursday, 25 July 2019 - 15:29:44 IST

file_scan_directory is deprecated and has been moved to the file_system service.


Before:

$files = file_scan_directory($directory);

After:

$files = [];
if (is_dir($directory)) {
  $files = \Drupal::service('file_system')->scanDirectory($directory);
}

When possible, you should inject the FileSystemInterface into your constructor.
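
As a sketch of that pattern (the module, class, and method names below are hypothetical), the injection might look like this:

```php
<?php

namespace Drupal\mymodule;

use Drupal\Core\File\FileSystemInterface;

/**
 * Illustrative service that scans a directory via the injected file_system.
 */
class DirectoryScanner {

  /**
   * The file_system service.
   *
   * @var \Drupal\Core\File\FileSystemInterface
   */
  protected $fileSystem;

  public function __construct(FileSystemInterface $file_system) {
    $this->fileSystem = $file_system;
  }

  /**
   * Returns all files under $directory, or an empty array if it doesn't exist.
   */
  public function scan($directory) {
    if (!is_dir($directory)) {
      return [];
    }
    // scanDirectory() requires a preg mask; match everything here.
    return $this->fileSystem->scanDirectory($directory, '/.*/');
  }

}
```

The service would then be registered in mymodule.services.yml with `arguments: ['@file_system']`.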

Original source - https://www.drupal.org/node/3038437

Jul 24 2019
Jul 24

This article assumes that you've already heard the big news about Acquia acquiring Mautic, the largest open-source marketing automation platform. Chances are that you've already run demos on your Drupal 8 site.  If that is indeed the case and your Drupal 8 site uses the popular Webform module; you're probably wondering how you can send your existing Webform submissions to Mautic in order to convert them to contacts.

Well, the easy way is to simply create a Mautic form and embed it in your Drupal site. Mautic lets you create many different forms which can be easily embedded in your Drupal site using simple JavaScript and/or HTML embeds. However, you cannot simply let go of your current Webforms, and let's be frank: there are no form systems that outperform Drupal 8's Webform module.
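
For reference, a Mautic form's automatic embed is a single script tag (the instance URL and form id below are placeholders):

```html
<script type="text/javascript"
        src="https://your-mautic-instance.example/form/generate.js?id=1"></script>
```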

The good news is that Drupal 8's Webform module can easily send submissions to Mautic forms - thus connecting your Drupal site with Mautic in a seamless way.

This integration also takes into account that Mautic tracks anonymous visitors and then converts them into contacts once they submit a form - through Webform in this case.


Prerequisites:

  1. You have a Drupal 8 website with the Webform module installed
  2. You have a Mautic instance installed

Step by step guide:

  1. First, you'll need to create a Mautic form similar to your Drupal webform. You do not need to embed this form, you just need to create it so it receives the submissions by remote post from your Drupal webform.
  2. Download and install Webform Mautic module. This module adds a Webform handler to map submissions to Mautic forms.
  3. In your Drupal 8 site, go to the webform that you want to send its submissions to Mautic. 
    Navigate to the webform's "Settings", then go to "Emails / Handlers" to add a new handler.
    Create a new handler by clicking "Add handler" and choose "Mautic".
  4. Now you'll only need to configure the handler. Choose a meaningful title. I like to call this handler "Send submission to Mautic form".
  5. Save the handler and you're done.

Things to pay attention to:

On the Drupal side:

  • It's recommended that you have Mautic tracking code installed in your Drupal site.
    This will leverage the contact tracking capabilities of Mautic and link form submissions to the contact activities that were tracked when the contact used to be an anonymous visitor.
  • Test the handler by going to "Test" in the handler's action menu. Make sure Mautic is receiving your submissions.

On the Mautic side:

  • Make sure your form is published.
  • Make sure your form is not set to "Kiosk mode" if you want to get the full contact tracking activities when the form is submitted.
Jul 23 2019
Jul 23

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In the last developer tutorial, we covered how to customize and develop Drupal sites using Rain’s base install profile. This week, we will dive into theming to help frontend developers update the look and feel of the Rain starter theme.

As usual, if you have a question or comment feel free to reach out at @drupalninja

Sub-theme or clone?

The first thing to note is that you can use the Rain install profile with or without the included Rain theme package. The “rain_theme” project exists as its own Composer project which is included by default but can be easily removed. That said, there are benefits to using the Rain Theme as a base theme or starter. The primary benefit is that Paragraphs are integrated with a dozen or so pre-built components that can be easily customized and save you time. 

If you do choose to leverage the base theme, you need to decide whether to use Rain Theme as a “parent theme” or as a starter that you clone. We highly recommend against the parent-theme approach: cloning gives you full control over the theme, with no need to worry about downstream updates. A parent theme can cut down initially on files and code, but parent themes in Drupal can be restrictive and cumbersome at times. Ultimately, the decision is yours. With the “clone” approach you can grab everything and rename it, or grab only what you want; anything you don’t use can be discarded.

Using the style guide

The Rain Theme project includes a KSS-based living style guide that has a host of pre-built Twig components that can be integrated with Drupal theme templates.

KSS Style Guide


In the previous step, we mentioned that we recommend cloning the theme so that you can fully control and customize any of these components. The idea is that developers waste less time rebuilding and re-styling the same common components on every new build. Instead, Drupal themers get a head-start while still having enough control of the theme to meet the design requirements of the project.

Working with the popular KSS node project is straightforward. For more information on how to compile and develop with KSS node, visit the project page at https://github.com/kss-node/kss-node.

Even if you are new to KSS, you will find making updates is easy. The “npm run build” command will compile all of your theme assets, as well as the style guide.

Component Integration

The main way components are integrated into Drupal templates is through includes found in Paragraph templates.

Rain Theme Paragraphs Twig Templates


Using the Components module to define the “@components” namespace, you can see an example below where field markup is passed in as parameters to the component Twig file. You will also see that it's in these templates that we typically attach a library (more on that in a bit). Of course, this is all ready to customize, and any time you add or modify fields you will need to adjust templates or components accordingly. JavaScript and CSS are for the most part encapsulated in their corresponding components, which keeps things organized. We do recommend you enable the “Twig debug” option in your Drupal services.yml to make it easy to find which templates are being used on any given page.

Rain theme quote paragraph template

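
As a sketch of that integration (the paragraph bundle, field, and component names here are hypothetical), a quote paragraph template might include its component like this:

```twig
{# paragraph--quote.html.twig — illustrative sketch only. #}
{{ attach_library('rain_theme/quote') }}

{# Pass field markup into the component via the @components namespace. #}
{% include '@components/quote/quote.twig' with {
  'quote_text': content.field_quote,
  'quote_author': content.field_author,
} %}
```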


Out of the box, we have included many libraries that style components and include vendor libraries where appropriate. Note that these libraries reference compiled CSS and JavaScript found in the “dist” folder of the theme.

Rain theme libraries

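
Assuming a component named “quote” (the library name and asset paths below are hypothetical), an entry in rain_theme.libraries.yml pointing at those compiled assets could look like:

```yaml
# rain_theme.libraries.yml (sketch) — assets live in the compiled "dist" folder.
quote:
  css:
    component:
      dist/css/quote.css: {}
  js:
    dist/js/quote.js: {}
```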

Additional Theming Helpers

In addition to the pre-configured templates, style guide and libraries included with Rain, we also ship a few helper modules to make theming easier. 

The “Twig Field Value” module makes it simpler to pull out values from fields and the “Twig Tweak” module adds several utility methods. For a full list of those functions with information visit the “Cheatsheet” documentation page on Drupal.org. As mentioned earlier, the “Components” module is also included and enabled by default in order to allow namespaces to be defined in our theme.


In this article, we showed frontend developers how to leverage the Rain base theme and style guide. Our recommendation is to clone the theme folder and customize our pre-built components to match your project’s unique requirements. We also covered how to integrate Drupal theme templates with components and define custom libraries. Finally, we covered a few helpful theming modules that ship with Rain and save development time.

In the next (and final) Rain tutorial, we will wrap up this series with a focus on content creation for authors.

Was this information helpful? Let us know! https://twitter.com/drupalninja/

Jul 23 2019
Jul 23

This blog has been re-posted and edited with permission from Dries Buytaert's blog:

Coder Dojo

Volunteering as a mentor at CoderDojo to teach young people, including my own kids, how to write software.

Last week, I published an opinion piece on CNN featuring my thoughts on what is wrong with the web and how we might fix it.

In short, I really miss some things about the original web, and don't want my kids to grow up being exploited by mega-corporations.

I am hopeful that increased regulation and decentralized web applications may fix some of the web's current problems. While some problems are really difficult to fix, at the very least, my kids will have more options to choose from when it comes to their data privacy and overall experience on the web.

You can read the first few paragraphs below, and view the whole article on CNN.

I still remember the feeling in the year 2000 when a group of five friends and I shared a modem connection at the University of Antwerp. I used it to create an online message board so we could chat back and forth about mostly mundane things. The modem was slow by today's standards, but the newness of it all was an adrenaline rush. Little did I know that message board would change my life.

In time, I turned this internal message board into a public news and discussion site, where I shared my own experiences using experimental web technologies. Soon, I started hearing from people all over the world that wanted to provide suggestions on how to improve my website, but that also wanted to use my site's technology to build their own websites and experiment with emerging web technologies.

Before long, I was connected to a network of strangers who would help me build Drupal.

Jul 23 2019
Jul 23

Dear friends,

Let's Start a Conversation

Over the past few weeks there have been some conversations among many of us regarding the issue credit system and how it could be used to "game the system."  This started with a conversation in a Slack team asking if a d.o user was a bot because of the number of issue credits attached to their profile.  At the time it was around 550 in the past year.  There was assurance from someone who knew this person that this person was not a bot, however a cursory look at the issues they had been credited for revealed that the majority of these issues were issues that could be considered Novice level.

I didn't think much of it until I pushed a custom module into the contrib space, and within 30-60 minutes I received an email notice from d.o saying an issue had been opened on my module.  My brand new, still in alpha, less than an hour old module.  I read the issue and two things stuck out:

  1. The name of the person creating the issue.
  2. The verbiage used in this issue.

These stuck out because it was the exact same person who had been brought up in Slack and the words were an exact copypasta of issues I had looked at when trying to figure out how this person had earned so many credits without being a bot.  I did a little more looking into it and noticed that the issue description was exactly the same as other issues opened by this person, and so was the comment.  I copied the text of the comment and pasted it into Google and found that there were 8 pages of results with this exact same text, almost all of them were opened by this person.  This did not help claims of being a definite human not-bot.

What's Wrong With This Picture?

Now why is this an issue, you may be asking.  Why do I care and why am I making a fuss about it?

Let me preface this by saying that I don't care about the number of issue credits by my name.  My skills (or lack thereof) speak for themselves, as does my reputation.  I'm happy in my current position, I have a reasonable grasp on what I do, and even though I don't have a ton of credits next to my name, I consider myself a member of the community and an asset to the Drupal project.

As for making a fuss about it, there are three main reasons that I'm bothered by this situation.

Reason 1.

We've all seen the CMS Learning Curve image, right?

CMS learning curve meme

If not, here it is.

This has always been a joke within the community.  Drupal has a high barrier to entry from a developer perspective, and Drupal 8 even more so than Drupal 7, which (IIRC) is the version that earned Drupal's place on this infamous image.

Because of this barrier, we've tried to make it possible for new contributors to get comfortable with contributing through issues that are tagged "Novice".  These issues are often documentation, coding standards, or basic code fixes that can be handled by someone who may not have as much experience contributing to the project, whether it be the core Drupal project or contributed modules.

These novice issues are important to have.  Sure, they may take some of the core maintainers or more advanced developers a few minutes to handle, but if they're not critical then there's no reason to not utilize them to help bring in new talent and give novice contributors a chance to get into the weeds a bit.

Looking through this person's issue credits, every single issue I found has been a novice issue.  I did not take the time to look through all 570 credits to verify this, but my sample size was large and based on that sample, I'm willing to wager that the vast majority of these issues would have made for decent entry points for new contributors.   With one person collecting all of the proverbial "low-hanging fruit" to pad their profile, this effectively gates new contributors from having the opportunity to help the project and removes upwards of 570 novice issues that would make perfect entry points for novices, not counting issues that have not been credited yet.

Reason 2.

In speaking with some other developers, it's demoralizing.  This person was recently mentioned in a tweet as "the most impressive http://Drupal.org profile page I've ever seen".  I'm not trying to say that these issues aren't important, and if they improve the project overall, then it's a good thing that they're being handled, but the process and the "reward" system makes it seem like the problems are larger than they really are.

Imagine this scenario, please:  Developer A has taken the challenge to move from novice issues to something a little more advanced and spends twenty-four working hours debugging an issue in core that is preventing an entity from saving under certain circumstances.  They find the problem, discuss it with the core maintainers on Slack, toss ideas back and forth on the issue node, create a patch, create tests, submit the patch, wait for RTBC, and the patch gets committed.  The project is made better because Developer A is passionate about Drupal, they spent their free time trying to improve Drupal, and when the patch is committed, they get credit for the issue.  This credit is worn as a badge of honor on their profile.  They feel pride in what they've done and what they've given back.

Developer A is scrolling through Twitter and sees mention of Developer B as "the most impressive http://Drupal.org profile page I've ever seen" and wants to know what their secret is.  Developer A starts looking at the issues, and finds that the "impressive" profile is primarily novice issues that can be handled in a matter of minutes, yet they're the ones marked as "impressive".  Developer A feels like their credit is devalued because the only thing seen on the profile is the number, not the number of issues by difficulty or number of lines in a patch, but number of issue credits.

Developer A decides that contributing back isn't worth their time because the only thing that seems to matter is the number of credits.  Drupal loses a talented contributor, and the project is poorer for it.

Reason 3.

The Drupal Marketplace page puts weight on the number of contributions over the last 90 days in its ranking system.  Outwardly, this would seem to reward companies who invest in improving the Drupal project, and it does.  However, it also incentivizes companies who hire new developers who are just thrilled to have a job into positions that exist only for the sake of getting their company to the top of the marketplace.  This is unfair to other companies who may not have the budget, but still contribute how they can, and it's unfair to the employee who is stuck searching for module releases, looking for novice issues, and copying and pasting descriptions and comments to get more and more credits.

In a sense, it's gating the employee from being able to move on from this position because even though they have 570+ credits, they've potentially never done anything past that.  I'm not claiming that the person who sparked this conversation doesn't have skills, but churning out that many bug reports and quickfix patches doesn't really empower that person to show what they're really capable of.  I know if I spent all day doing the same thing over and over, I definitely would not want to even look at a text editor on my free time, let alone feel that it would be worth it to try and expand my skill set.

Possible Solutions

There is no one-size-fits-all solution for this.  Hell, there's probably no good solution at all.  However, here are some ideas that I would like to throw out there that are just that, ideas. They're not fully fleshed out, and they aren't ultimatums of "Do this or I'm going to walk!" They're potential starting points and hopefully help to spark a larger conversation.

Solution 1: Weighted Credits

Not every credit should be worth the same as others. There could be a number of ways to do this, but in my opinion the easiest way would be to weigh by difficulty of the task. A novice, non-critical issue could be weighted less than an advanced, critical issue.


Pros:

  • Could incentivize contributors to move up in difficulty of tasks.
  • Gives contributors who spend a lot of time on difficult issues more credit than contributors who seek out the easiest fixes.
  • Rewards growth as a developer.


Cons:

  • Difficult to implement.
  • Creating a weighting system can be difficult and fairness cannot be guaranteed.
  • Valuation of an issue is subjective and could be gamed if a module maintainer decides to start tagging every issue as "Critical" or similar.

Solution 2: Mandatory Difficulty Tagging

The word "Difficulty" is difficult here. Some other possibilities would be "severity", "level of effort", "need", or any number of words that could be placed there. The point is that for this solution, I recommend creating a specific taxonomy that relates to difficulty or [insert word here] of the issue. This would be mandatory before marking an issue as fixed by the maintainer, whether it be a contrib module maintainer or a core committer. Then, on a user's profile credits could be divided by rating. For example: User Credited on 50 "Novice" issues, 3 "Intermediate" issues, 15 "Advanced" issues, and 3 "Critical" issues.


Pros:

  • Makes it easier to discern how much effort is actually being put into contributions.
  • More transparent for potential employers looking at d.o for references.
  • Creates a weight system without actually having to weight contributions.


Cons:

  • Puts the responsibility on module/core maintainers to assign difficulty, which can be very subjective.
  • Does not prevent a module maintainer from marking their own issue as "Critical" when it could easily be tagged "Novice".
  • Potentially devalues novice credits and gates new contributors.

Solution 3: Credit limits

I'm of the opinion that if someone has over 100 novice credits, they should probably be at a point where they are taking on more serious issues and leaving the novice issues as entry points for new contributors. This solution would still require tagging of issues by maintainers, but instead of creating tiers based on difficulty it would limit the number of "novice" issues a contributor would receive credit points for. The number would have to be decided and I don't know if it should be 10 or 100. I'm just spitballing here.


Pros:

  • Incentivizes contributors to move on from novice issues.
  • Keeps novice issues for actual novices.
  • Creates more entry points for new contributors.


Cons:

  • May deter non-novice contributors from taking on novice issues that need to be done.
  • Again, module maintainers may tag issues inappropriately to ensure credit.
  • The term "novice" is very subjective and could easily be debated.

Solution 4: Redistribution of Credits

Sorry, my politics are showing on this one, but I don't mean take credits from the top 1% and give them to the bottom 25%. Instead, give others a chance to take on the tasks and open the gates to new contributors. The issues that are being created, fixed, and credited are issues, but they're mostly novice issues. My Google-fu returned over 20 pages with identical descriptions. I don't know exactly how many because I just stopped clicking "Next Page" when I hit about page 15, but it's safe to say that this person knows how to fix this issue.

Great. Good job. Now give someone else a chance!

If the goal of these issues is truly to move the project forward and not just farm credits like a World of Warcraft sweatshop circa 2008 CE, then let's get some new people contributing. Create the issue, tag it as novice, and let it be. Find an upcoming camp and tag it with that camp so that if there is a space for contribution and/or a beginning contributor workshop, there are plenty of issues to work on to help get the new contributors comfortable with the steps needed for code contribution. If it's critical, go ahead and fix it. If it's non-breaking and a good chance for someone to learn some git-fu, how to branch and create patches, how to comment in the issue queue, how to push up a patch in a comment, and the lifecycle of an issue from creation to RTBC to fixed, then let it be for a bit.

If there's no movement after a few days or after the camp it was tagged for, then go ahead and fix it. Sure, maybe the module maintainer will fix it in the meantime. Maybe it'll get marked as "Closed: Outdated" because it's already been taken care of on a dev branch that just hasn't made it up to d.o yet. Maybe, just maybe, a new contributor in the wild will be searching the issue queue for novice issues to get started and they'll take the reins on the issue and get their first credit. You may not get credit for authoring the issue, but you'll definitely get a warm fuzzy feeling for helping foster a new contributor into the world of Drupal. That's truly what building community is about.


Pros:

  • Gates are open for new contribution.
  • Knowledge is shared and the community grows stronger.
  • You can almost hear the DING when they level up into the world of open source contribution.


Cons:

  • Won't necessarily get credit for every issue created, but we shouldn't be here just for credits.
  • Issues may become stale and overlooked.
  • That's all I can think of...

Closing Thoughts

This post is not meant to attack any person or any company. If anything, I hope that it sparks some conversations and helps make the Drupal project richer and more diverse by helping spread the wealth of entry-level issues and credits to people who may or may not have opportunities otherwise. Since I began writing this post I've tried keeping an eye on things in the issue queues and I found that some people are taking the example from the account that got me thinking about this and running with it. I've even seen some people trying to patch core testing modules with the .info.yml Dependency Namespacing issue. Fortunately these were caught pretty early and core maintainers made it a point to redirect the issue reporter to fixing more than just one file in core in a patch.

The credit system is great and it rewards hard work, but the issue credit system has issues with credits and needs some work as well. I have brought this up to both the CWG and the DA and they're understandably in tough spots. Yes, fixing issues definitely makes the project better regardless of how small the issue is, but there are thousands upon thousands of developers out there just looking for a project to grab onto and start working with. We need to make it as easy as possible for these people to jump in and become part of the Drupal community. I travel for a lot of camps and, unfortunately, for the most part I see a lot of the same faces in the contrib rooms and areas. I would like to see some new people feel welcomed the same way I felt welcomed at my first contribution days when I was able to get my first patch committed to core. Sure, those maintainers could have taken care of the issue and taken credit for themselves, but instead they created the issue and passed it to me, an eager entry-level developer still trying to find my place in this crazy Drupal world.

We shouldn't have to rely on the CWG or DA to police these things. We should be able to police ourselves. We should be able to know if what we're doing is self-serving or if it for the betterment of the Drupal project and the Drupal community. We should be able to feel good about the credits we get and the credits we help others get. We should try our best to be good people and do the right thing whenever we can.

We shouldn't reward companies or individuals who hoard issue credits just to look better on the marketplace or in a wordcloud on a powerpoint. We can do better than that.

If you're a new developer or an aspiring contributor looking for something to do to contribute back to the project, contact me. I'll help you find an issue to work on and run you through the process of patching, commenting, testing, and whatever else it is that you need to succeed. I want to stress that you don't need to know how to code to contribute back to Drupal. Ask me how!

If you're a seasoned developer or contributor, I challenge you to mentor someone who's eager to learn. Give up a credit or two to someone who is looking for that first DING. Take the time and we'll all be better for it.


JD Flynn, Drupal Developer and Community Member

Jul 23 2019
Jul 23

Here at Phase2, we’re excited to participate in the premiere Drupal government event, Drupal GovCon, held at the National Institutes of Health campus in Bethesda, Maryland where we'll once again be a platinum sponsor.

We hope you will join us this week for one of North America’s largest Drupal camps.

You can find Phase2 at our sponsor booth and all over the session schedule:

Why Migrate To Drupal 8 Now

With Drupal 9’s release impending, there has been a resurgence of chatter around Drupal 6/7 migration. If your government organization is still on Drupal 6 or 7, you won’t want to miss this session. You can read more about the subject here from our on-site presenters, Tobby Hagler, Director of Engineering, and Felicia Haynes, Vice President of Accounts.

Measuring What Matters: Using Analytics To Inform Content Strategy

Content strategy is the backbone of any successful digital experience. It’s also rarely considered an analytical process, but it should be! Catch our session to learn more, and in the meantime, read about the top 3 content strategy tips for government ahead of Jason Hamrick’s session.

Accessible Design: Empathy In A World With No Average

Our Accessibility expert, Catharine McNally, has a special treat for her session’s attendees: she’ll be hosting some interactive empathy building exercises, to showcase the impact of accessible design for citizens. This is a unique opportunity for anyone involved in the design, content, and UX of their organization’s/agency’s digital experience.

Personalization & Government: The Odd Couple or The Perfect Match?

Personalization has become the new standard, across industries, for improving customer experience, but how can personalization be leveraged in government? Join our CEO, Jeff Walpole, and Vice President of Business Development, Ben Coit,  to explore use cases and how governments can continue thinking about the next generation of citizen experiences.

Read more about scaling personalization in our recent whitepaper

Whether you’re joining us on the ground (in session) or from afar (via the airwaves), be sure to read our latest issue of Contributed Magazine—specifically covering open source, and secure, digital governments.

Jul 23 2019
Jul 23

Solhem Companies

Website Refresh - Making It Easy to Decide on Apartments from Afar

Solhem Companies develops, owns and manages award-winning residential and office properties in Minneapolis, Minnesota’s most desirable neighborhoods. TEN7 has been working with Solhem since the beginning. We built their first Drupal site in 2009, and as they added buildings, the site soon evolved into a multi-site installation for their portfolio of boutique apartment properties. The Solhem websites, for existing properties as well as websites for properties under development, are managed with a single theme. In addition to providing cohesive visual branding for all the sites, this approach results in reduced support and maintenance costs, since updates are performed in one place and changes are propagated to all sites. Additionally, it spreads out the investment being made over multiple sites.

Solhem approached TEN7 in 2016 with a wish list that included developing a new theme to differentiate them from their competitors' sites, which generally use the standard real estate website template provided by their property management software and can be quite boring. The client's goal was to visually identify their buildings as belonging to the Solhem group of properties by embarking on a theme redesign that would match the Scandinavian ambiance of their buildings: sunny and light, friendly, modern and sustainable.

The design process went through a few iterations, until all of the client stakeholders were satisfied. Both teams were able to collaborate using InVision, a design prototyping platform. Eva, our UX designer, brought their vision to life in digital form, and our talented front-end team implemented the designs with great care. This project is a perfect example of how the design phase can take longer than planned, and how that extra time contributes to a more considered, user-focused site.

The old theme with a short-scroll home page, as seen on the former Soltva site. The new minimalist theme with a long-scroll home page, as featured on the new Solhavn site.

Meeting the Client's Goals

Solhem said, “One of our goals is to get our websites to the point where someone who lives in LA can get enough information from the site to confidently apply for a unit, without ever visiting us in person.” One feature that helps them fulfill that goal is the custom floor plans page we built for the sites. All sites have both 2D and 3D plans, with photos and virtual tours. On some sites, you can even select a floor and see which floor plan types are available on that floor. 

Solhem Companies floor plans

A Guide to the Neighborhood

Knowledge of the buildings' surrounding neighborhoods is vital. Solhem wanted to feature Instagram content from neighborhood amenities like attractions, food & drink, recreation and shopping. They also wanted to highlight selected Instagram content from the community of residents from each building on an updated version of their blog page. Obtaining these images required getting permissions from the content creators, a sensitive issue that we reviewed carefully with the client. 

A Valuable Editing Tool

In order for the Solhem staff to curate and update Instagram posts, Jason Cote, our front-end developer, created a custom Drupal module that allows site administrators to retrieve and display Instagram images (including information about the Instagram account owner) from within the site’s administration pages. “It was a farfetched idea in the design phase, and I had no idea that could be brought to life! But Jason figured out how to do it,” said Megan Glover, Solhem Companies’ Marketing Manager. “I’m already seeing other people steal this idea from us.”  It took a little extra development time upfront to make it work, but it will save Solhem time and money in the long run.

Living amenities in the North Loop Instagram content on the Solhavn site.

“We have a lot of awesome residents who are good at Instagram, as well as neighborhood businesses with great content. To be able to present this content keeps us totally on-brand, and it’s a WIN-WIN for everyone. We get fresh beautiful content and get to promote people’s Instagram accounts, and get them new followers!” —Megan Glover, Marketing Manager, Solhem Companies


TEN7 completed the Solhem redesign project on time and within budget, despite some unanticipated complications. It takes a surprising amount of work to render a clean, minimalist and elegant theme that works for multiple websites, while still being fully responsive!

The work was well worth it.

“Conversion rates and SEO have all been improved with the new design, and I can’t tell you the number of people who have said, ‘I LOVE your website, and it’s so easy to use!’ Oftentimes people come in to tour, and say they made up their mind by just viewing the website.”
—Megan Glover


So what’s next for Solhem? Tastes and design trends change quickly. We have been working on a new theme design with updated fonts featuring more vibrant color, just in time for their new Nordeast Minneapolis location, coming in 2020. Stay tuned.

Jul 23 2019
Jul 23

Everybody's favorite Drupal Camp is fast approaching!

This year (like every year) we promise to deliver the most forward-thinking Drupal knowledge out there. But we need a little help from our friends.

Be a part of humans sharing knowledge by giving a presentation at the 13th annual BADCamp. Remember, and this is important, the first step to presenting is submitting a session proposal!

Our Call For Papers is open and we are on the edge of our seats waiting to know what everyone has to offer. Are you a project manager with a knack for cat herding? Do you have mastery of JavaScript and can make your website perform magic? Do you have a community story you want to share? Are your DevOps chops dialed in? Do you know the ins and outs of web accessibility?

If you can answer yes to any of the questions, share what you know with the San Francisco Bay Area Drupal folks!!  Everybody has a unique Drupal story to tell, and we want to hear yours.

Submit Session

Session submissions close Thursday, August 1st at midnight PST, so be timely.

We welcome those from all levels of expertise, background, gender, ethnicity, sexual identity, religion, age, and ability. Our community is diverse and we know that Drupal can benefit when everyone is included.

Submission Requirements:

Before submitting a session proposal, we recommend that you:

 - Acquaint yourself with BADCamp’s Code of Conduct. All speakers will be required to adhere to this code.

 - Create a Drupal.org account - BADCamp does give Drupal Community attributions for speakers.

Speaker Resources:

We want to support you as a speaker along your journey. 

 - Pre-submission mentoring - We know submitting a conference proposal can be intimidating, especially for newer speakers.

 - We encourage you to use HelpMeAbstract to obtain feedback from veteran open source speakers

 - The #session-help Slack channel is dedicated to submission support and feedback. 

 - Speaker Diversity Workshop 

 Looking for more inspiration? Check out the videos from last year’s camp.

A BIG magical BADcamp thank you to our sponsors who have committed early. Without them, this magical event wouldn’t be possible. We are also looking for MORE sponsors to help keep BADCamp free and awesome. Interested in sponsoring BADCamp? Contact [email protected].

Jul 23 2019
Jul 23

Yes, you read that right. Having to take your application offline for updates or while launching a newer version is a daunting task and a real pain. It's true that you can mitigate this with scheduled downtimes, but that is not something that will make your customers happy. Some sites actually lose thousands of dollars every minute they are down! Should deployment or upgrading your application really be the reason behind your loss?

Blue-Green Deployment is one of the mainstays among deployment strategies, conquering the limitations of the traditional approach to application deployment. How much do you know about Blue-Green Deployment? And how is it implemented in Drupal?

What is Blue Green Deployment?

Blue-Green Deployment is a technique for releasing applications by shifting or moving the traffic between two identical environments running two different versions of an application. 

In brief, there is a version of an application, let's call it the "blue version," in production, and a router that routes traffic to the app. Now you need another version, the "green version," with some more goodies added, to be deployed. However, you also want to ensure that while this deployment is happening, users can still look at your application, press a button or do whatever they want, without your application going down! It's like secretly deploying the "green version" while the "blue version" handles all the traffic in the meantime, before eventually swapping the connections.



With near-zero downtime releases and rollback capabilities, the fundamental idea of Blue-Green Deployment is to shift traffic between two identical environments running two different versions of the application. The Blue environment runs the current version of the application, while the Green environment is staged with a different version.

Why Blue-Green Deployment? 

One of the major challenges with automating the deployment process is the cut-over itself, i.e. moving the application from the final stage of testing to the live stage. This has to happen quickly in order to minimize downtime, and it is exactly what blue-green deployment addresses. With two production environments (as identical as possible), at any given point in time one of them is live. As you prepare a new release, you do all your testing on the environment that is not live. Once you are ready to roll, you simply switch the router so that all traffic is directed to your latest release, and the other environment goes idle.

Also, blue-green deployment provides one of the most important features: rapid rollback. If anything goes wrong with your latest release, you just switch the router back. While there might still be an issue with transactions lost while the faulty environment was live, you can also design the system so that both environments are fed transactions and one acts as a backup for the other.

How Is Blue-Green Deployment Implemented? 

Now that we know about the two environments of Blue-Green Deployment, let us look at some of the best practices to implement the same. 

  • Load Balancing over DNS Switching

Keep in mind that when switching environments, your domain must eventually point to different servers. Instead of going into the DNS records and making changes in the DNS management interface, use load balancing. 

Making changes in the DNS records results in a long traffic tail: some of your users will still be served by the old environment, and you won't have full control over where your traffic is routed. 

Load balancers, however, allow you to set your new servers up immediately without depending on the DNS mechanism. This way, you have full control over the traffic and can be absolutely sure that all of it is routed to the new environment. 

  • Rolling Update

DO NOT switch all your servers at once. Execute a rolling update instead. That is, instead of switching over to the Green servers from the Blue ones all at once, work with an integrated environment. Add one new server, retire an old one. Repeat until every new server is in place. This will reduce your downtime by a huge margin! 

  • Environment Monitoring

While monitoring the live environment is obvious, you do not want to be caught off guard by failing to monitor the other environment. Yes, monitoring the other environment is less critical. However, since either environment can serve in either role, you will need an easy way to switch the alerting between the two. Set up different API tokens for the two environments to report back with, or programmatically change the alert policy on an environment when its role is switched. 

  • Automation

A manual set of actions just increases your work. Instead, script every action in the switch process. Automating the switch results in a quicker, easier and safer implementation of Blue-Green Deployment. 
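To make the last two practices concrete, here is a minimal sketch of a scripted switch. It assumes nginx includes an active.conf symlink that points at either a blue or a green upstream file; all paths and names are illustrative, and the sketch builds a throwaway copy of that layout in a temp directory so the steps are easy to follow.

```shell
#!/bin/sh
# Minimal blue-green switch sketch (illustrative paths and names).
# nginx is assumed to `include .../active.conf`, a symlink pointing at
# whichever upstream file is live. Swapping the symlink swaps environments.
set -e

# Build a throwaway demo layout; in production this would live under /etc/nginx.
NGINX_CONF_DIR=$(mktemp -d)
echo 'upstream app { server blue.internal:8080; }'  > "$NGINX_CONF_DIR/blue.conf"
echo 'upstream app { server green.internal:8080; }' > "$NGINX_CONF_DIR/green.conf"
ln -s blue.conf "$NGINX_CONF_DIR/active.conf"   # blue is live to start

# The scripted switch: flip the symlink to the idle environment.
CURRENT=$(readlink "$NGINX_CONF_DIR/active.conf")
if [ "$CURRENT" = "blue.conf" ]; then TARGET=green.conf; else TARGET=blue.conf; fi
ln -sfn "$TARGET" "$NGINX_CONF_DIR/active.conf"

# In production you would now health-check the new target and reload nginx:
#   curl -fsS http://green.internal:8080/ && nginx -s reload
echo "active environment: $(readlink "$NGINX_CONF_DIR/active.conf")"
```

Because the swap is a single symlink change followed by a reload, rolling back is the same one-line operation in reverse.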

Blue Green Deployment for Drupal Websites


  • During code deployments, Drupal needs to run database updates, and this is a major problem: the updates set the website into maintenance mode and your users are greeted with a plain white screen. 
Drupal maintenance screen. Source: drupal.org

If the deployment fails, a rollback is only possible by restoring the database from a dump, since the database schema may have been modified. This means downtime! 

Solution: Blue Green Deployment! 

Let us look at an infrastructure overview for implementing Blue-Green Deployment with Drupal. A typical Drupal stack requires Nginx and a PHP server (to run Drupal itself), a database, file storage and a cache, and the stack is designed to be highly available. 

With Blue-Green Deployment, you deploy a new stack with the new codebase and copy your database and state over from the previous stack, preferably using Drush. 

Infrastructure overview. Source: DrupalCon Seattle 2019

Once the two stacks are set up, the staging stack has to be tested independently from the production stack. Your users keep going to the production stack, using the version of the website deployed there. When the other version (on the staging stack) is ready, you just flip the two: users are now directed to the new stack, and the older version remains available for testing. 
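The copy step can be sketched with Drush. The @blue/@green site aliases and the scratch path below are assumptions, not part of the Rain or Drupal tooling, so each command is wrapped in a small guard that prints the plan when the binary is missing.

```shell
#!/bin/sh
# Hypothetical sketch: seed the green (staging) stack from the live blue stack.
# @blue/@green are assumed Drush site aliases; /tmp/blue.sql is a scratch path.
# `run` executes the command when its binary exists, otherwise prints the plan,
# so the sketch is safe to run anywhere.
run() { if command -v "$1" >/dev/null 2>&1; then "$@" || echo "failed: $*"; else echo "would run: $*"; fi; }

run drush @blue sql-dump --result-file=/tmp/blue.sql   # export the live database
run drush @green sql-drop -y                           # wipe the staging database
run drush @green sql-query --file=/tmp/blue.sql        # import the live data
run drush @green updatedb -y                           # apply new schema updates
run drush @green cache-rebuild                         # clear caches on green
```

Running updatedb on the green stack, not the live one, is the whole point: schema changes happen where no traffic is flowing.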

Blue Green Deployment with Docker

Deployments in Drupal are not easy. While you have to make sure that the code is deployed, Composer dependencies are handled, schema updates are applied and all the caches are cleared, you also need to ensure that the website stays up and running. And what happens if there is a problem and the deployment fails? Do you roll back? Or stop the deployment? 

The phrase you are looking for is Blue-Green Deployment.

During Drupal deployment, Docker enables an easy shift between applications and makes it easy to build and run different versions of the same application. On the EC2 instance, there is always a set of blue and green Docker containers, with nginx working as a reverse proxy in front of them. Blue-green deployment with Docker allows a user to run a Drupal website in two parallel environments.

Blue Green Deployment with AWS 

While historically blue-green deployment was not the first choice for deploying software on-premises because of the high cost and complexity involved, containers have changed this perception for good. 

Containers ease the adoption of blue-green deployment because they package easily and behave consistently as they are switched between environments. And to change the configuration of a container, you just update its Dockerfile, then rebuild and redeploy the container in place, rather than updating the software on a running server. 

Amazon ECS performs these rolling updates when you update an existing ECS service. A rolling update for ECS blue-green deployment involves replacing the existing version of the container with the latest one. The number of containers that ECS adds or removes during the update is controlled by adjusting the maximum and minimum number of healthy tasks allowed during service deployments. Once the service's task definition is updated with the latest version of the container image, Amazon ECS automatically starts replacing the old containers with the new ones. 
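As an illustration, the knobs described above map to the --deployment-configuration flag of the AWS CLI's ecs update-service command. The cluster, service and task-definition names below are placeholders, and the script only assembles and prints the call rather than executing it against a real account.

```shell
#!/bin/sh
# Illustrative only: build the AWS CLI call that controls how many tasks ECS
# may add (maximumPercent) or keep healthy (minimumHealthyPercent) during a
# rolling update. All names are placeholders; the command is printed, not run.
CLUSTER=drupal-cluster
SERVICE=drupal-service
TASKDEF=drupal-app:2   # new task definition revision with the updated image

CMD="aws ecs update-service \
  --cluster $CLUSTER \
  --service $SERVICE \
  --task-definition $TASKDEF \
  --deployment-configuration maximumPercent=200,minimumHealthyPercent=100"

echo "$CMD"
```

With maximumPercent=200 and minimumHealthyPercent=100, ECS may start a full replacement set of tasks before draining any old ones, which is the closest rolling-update analogue to a blue-green swap.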

AWS blue green deployment


Blue-green deployment with AWS ECS also provides optimization benefits, since you are not tied to the same set of resources: when the performance envelope of the application changes from one version to the next, you simply launch the new environment with appropriately sized resources (fewer in number, or a completely different set). 

Blue-green deployment in AWS also fits well with continuous integration and deployment workflows, keeping their complexity in check by allowing deployment automation with fewer dependencies on an existing environment. 

This solution allows users to easily manage the deployment and scalability of web platforms without wasting time, and helps configure a high-availability environment that will run a Drupal website without any problems.

Jul 22 2019
Jul 22

Since our initial release, we’ve been doing agile, iterative development on the software. Working with our partners at the University of Michigan and the State of Georgia, we’ve made refinements to both the application and the Drupal integration.

Better search results

Default searches now target the entire index and not the more narrow tm_rendered_item field. This change allows Solr admins to have better control over the refinement of search results, including the use of field boosting and elevate.xml query enhancements.

Autocomplete search results

We added support for search autocomplete at both the application and Drupal block levels -- and the two can use the same or different data sources to populate results. We took a configurable approach to autocomplete, which supports “search as you type” completion of partial text. These results can also include keyboard navigation for accessibility.

Since the Drupal block is independent of the React application, we made it configurable so that the block can have a distinct API endpoint from the application. We did this because the state of Georgia has specific requirements that their default search behavior should be to search the local site first, looking for items marked with a special “highlighted content” field.

Enter search terms field with list of suggested results

Wildcard searching

We fully support wildcard searches as a configuration option, so that a search for “run” will automatically pass “run” and “run*” as search terms.

Default facet control

The default facet sets for the application -- Site, Content Type, and Date Range -- can now be disabled on a per-site basis. This feature is useful for sites that contribute content to a network but only wish to search their own site’s content.

Enhanced query parameters

We’ve added additional support for term-based facets to be passed from the search query string. This means that all facet options except dates can be passed directly via external URL before loading the search form.

Better Drupal theming

We split the module’s display into proper theme templates for the block and its form, and we added template suggestions for each form element so that themes can easily enhance or override the default styling of the Drupal block. We also removed some overly opinionated CSS from the base style of the application. This change should give CSS overrides better control over element styling.

What’s Next for Users?

All of these changes should be backward compatible for existing users, though minor changes to the configuration may be required. Users of the Drupal 8.x-2.0 release will need to run the Drupal update script to load the new default settings. Sites that override CSS should confirm that they address the new styles.

Currently, the changes only apply to Drupal 8 sites. We’ll be backporting the new features to Drupal 7 in the upcoming month.

Users of the 1.0 release may continue to use both the existing Drupal module and their current JS and CSS files until the end of 2019. We recommend upgrading to the 2.0 versions of both, which requires minor CSS and configuration changes you can read about in the upgrade documentation.

Special Thanks

Palantir senior engineer Jes Constantine worked through the most significant changes to the application and integration code. Senior front-end developer Nate Striedinger worked through the template design and CSS. And engineer Matt Carmichael provided QA and code review. And a special shoutout to James Sansbury of Lullabot -- our first external contributor.

Jul 22 2019
Jul 22

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In our previous article, we walked through each feature of the Rain Install Profile with tips on how to use and configure each feature. In today’s tutorial, we zoom in on some technical aspects that backend developers will need to understand to properly develop against the Rain distribution.

Have a question or comment? Hit me up at @drupalninja

Post-install configuration

The last article, What’s New in Rain 3.0, gives some basic instructions on how to install Rain successfully. The Drupal project template repository also gives additional details on how to fully set up the entire project, including the local environment. Here we will cover what developers should do next once they have successfully installed Rain for the first time.

Creating a sub-profile

By default, the Mediacurrent Drupal project template includes a core patch that allows for profiles to inherit features and configuration from a parent profile. This follows a similar pattern to base themes and sub-themes in Drupal. The project template also bundles with it a sample profile (“mis_profile”) to demonstrate how to set up sub-profiles. The screenshot below illustrates the new yml syntax that will enable the “base profile” inheritance.

sub profile

Do I need to create a custom install?

You might ask - Why mess with a custom install profile at all? You certainly don’t have to. You could simply run mis_profile or the base Rain profile as-is and then start making your customizations. That said, many organizations take the approach of creating a custom install profile for each project. This can be handy for encapsulating configuration, install functions and other aspects related to the project. Mediacurrent takes this approach, and we highly recommend other organizations implementing Drupal applications do the same.

To get started, simply rename the “mis_profile” folder to your project name, then search and replace any instances of the “mis_profile” text string in your project. Once this is done, you can remove the reference to mis_profile in composer.json, as it will no longer be needed. From that point on, when you run build.sh, your config.yml will instruct the installer to use your custom profile.

sample profile

Files included in sample mis_profile
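The rename boils down to a move plus a recursive search-and-replace. The sketch below performs those steps on a tiny stand-in profile built in a temp directory ("my_profile" and the minimal file layout are placeholders; in a real project you would run the same mv/grep/sed commands against the actual mis_profile folder, which contains more files).

```shell
#!/bin/sh
# Sketch of renaming the sample "mis_profile" to a project-specific profile.
# A minimal stand-in profile is created in a temp dir so the steps are concrete.
set -e
WORK=$(mktemp -d) && cd "$WORK"
mkdir mis_profile
printf 'name: mis_profile\ntype: profile\n' > mis_profile/mis_profile.info.yml

NEW=my_profile   # placeholder project name

# 1) Rename the folder and the files that carry the profile name.
mv mis_profile "$NEW"
mv "$NEW/mis_profile.info.yml" "$NEW/$NEW.info.yml"

# 2) Replace every remaining "mis_profile" text string inside the profile.
grep -rl 'mis_profile' "$NEW" | xargs sed -i "s/mis_profile/$NEW/g"

cat "$NEW/$NEW.info.yml"
```

In a real project, remember to run the same search-and-replace across the rest of the repository too, since config.yml and composer.json may also reference the profile name.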

Exporting configuration

The easiest way to get started with Rain is to run the ./scripts/build.sh command once your initial sub-profile is set up from the previous step. The base Rain installation will configure common modules covered in the Rain Admin and Authors Overview article. All of the content features will be left disabled initially but can easily be enabled through Drush or the “Admin Modules” page. Once you have the initial configuration set, it’s generally best practice to export that configuration to a version-controlled folder. In your local settings file you should have a line that is commented out by default that looks like the following:

# $config_directories['sync'] = $app_root . '/profiles/mis_profile/config/sync';

The line should have the name of your profile. Once you uncomment that line, you can run drush cex -y using your local site alias. This will export your configuration to your install profile folder (also considered best practice, but not required). Now, every time you execute ./scripts/build.sh, Drupal will create the site using your exported configuration. We love this approach because it ensures project developers share the same configuration, and it pairs nicely with automated tests.

example sync folder

Example sync folder from Gatsby Drupal 8 demo

Customizing Content

The content features that ship with Rain can be easily customized. Custom paragraph types are integrated with the Rain base theme (“rain_theme”) which means that changes to Paragraphs often require updates to Twig templates as well. The Rain theme is optional but adds the benefit of pre-configuring Paragraphs to map to components provided in the theme’s style guide. We will cover theming with Rain more in-depth in the next article.

optional content features

Optional content type features


paragraphs pop up

Add Paragraphs pop-up dialog


The Rain install profile takes the role of a quick starter and package manager. This is a similar approach to other distributions like Acquia Lightning. What this means is that after install, developers will own their site’s configuration and primarily leverage Rain’s Composer updates (if desired) for package dependencies. The main project Composer contains references to the base Rain distribution as well as the “Rain Features” package, both of which have their own Composer files that pull in Drupal contributed modules. The benefit of delegating this work to Rain is that modules are continually tested together as a package and include patches where needed. This can dramatically cut down on the work on the part of the “consumer” of Rain versus managing these dependencies on your own.

When updating mediacurrent/rain and mediacurrent/rain_features be sure to check the UPDATE.md file included in the main Rain project repository. This document includes instructions on how to upgrade when significant updates are made. This could include a major version change or updates after a Drupal core minor version is released. Note that Drupal core minor versions often require some refactoring, such as removing a contributed dependency that was ported to core.

Continuous Integration

Mediacurrent uses Bitbucket as our Git service of choice, which means that we leverage Bitbucket Pipelines to execute CI tests. In our Drupal project template, we include a sample Pipelines file that can be leveraged or refactored to match the yml syntax of other Git services like GitHub or GitLab.


How married are you to Rain post-install? The answer is not too much. If you wanted to detach from Rain Composer you could easily back out the Composer dependencies, copy over what you need to your main project Composer and remove the reference to Rain’s base install profile in your custom profile’s info.yml file. From that point on, you will be managing dependencies on your own but still have the option to leverage stand-alone projects like Rain Theme or Rain Admin.

In some cases, developers will want to use the base Rain project but not the corresponding Rain Features project. The rain.install does not include any of the optional features from that project which means that you can simply remove “rain_features” from Composer in instances where you will not utilize those features.

Mediacurrent Rain projects

Mediacurrent Rain projects

Local Environments

Mediacurrent currently uses a DrupalVM-based local environment bundled with scripts for managing builds and deployments. These packages are included under “require-dev” in your project Composer file and are not required to use Rain. If you prefer another environment such as Lando or DDEV, you can back out these dependencies from your project Composer file.

Drupal VM


Most Drupal web hosts (e.g. Acquia, Pantheon) have their own git repository they use for deploying artifacts to the server. This includes all the actual Drupal files that make up core, contributed modules and libraries. We don’t want our source git repository to store artifacts or conform to the folder layout required by the web host. Instead, we execute deploy commands to build the artifacts from Composer and commit them to the appropriate destination git repository. To make that process easier, we add some additional configuration to our config.yml file that instructs how and where code should be deployed.

The key aspects of both the Acquia and Pantheon configurations are pointing the release repo at the appropriate git URL provided by your host, naming the host, and setting the release Drupal root directory. Our examples use Acquia and Pantheon, but generic git artifact repositories such as AWS are also supported.

Acquia config:

project_repo: [email protected]

release_repo: [email protected]:mcABCProject.git

release_drupal_root: docroot

deploy_host: Acquia

Pantheon config:

project_repo: [email protected]

release_repo: ssh://codeserver.dev.YOUR_UUID_VALUES.drush.in:2222/~/repository.git

release_drupal_root: web

deploy_host: Pantheon

Additionally, for Pantheon, you will need to add a pantheon.yml file to the root directory with the following values:

api_version: 1

web_docroot: true

php_version: 7.1 (or latest PHP version supported)

This command also needs to be run in order to clean up the Pantheon git repo prior to our first deployment:

rm -r README.txt autoload.php core example.gitignore index.php modules profiles robots.txt sites themes update.php vendor web.config

Now we are ready to build and deploy for the first time. We do this with two commands, one to build the artifacts and one to push files to the release git repository.

Example (using Pantheon):

./scripts/hobson release:build Pantheon develop

./scripts/hobson release:deploy Pantheon develop -y

After you have deployed code to Acquia or Pantheon you should be able to run a clean install using either the sample Rain child install profile (mis_profile) or cloned profile you have created (see Rain README for more details).

Final Thoughts

We hope this article helps developers get up and running smoothly with the Rain install profile and Drupal project template. At this point, you should be fairly comfortable installing Rain, creating a custom profile, exporting configuration, deploying to a Drupal host and maintaining Composer. As a package, Rain was designed to be as flexible as possible, allowing developers to customize their applications without being locked into a prescribed way of configuring Drupal.

Jul 22 2019
Jul 22

An opinion piece featuring my thoughts on what is wrong with the current web and how we might fix it, ran on CNN last week.

Coder DojoVolunteering as a mentor at CoderDojo to teach young people, including my own kids, how to write software.

Last week, I published an opinion piece on CNN featuring my thoughts on what is wrong with the web and how we might fix it.

In short, I really miss some things about the original web, and don't want my kids to grow up being exploited by mega-corporations.

I am hopeful that increased regulation and decentralized web applications may fix some of the web's current problems. While some problems are really difficult to fix, at the very least, my kids will have more options to choose from when it comes to their data privacy and overall experience on the web.

You can read the first few paragraphs below, and view the whole article on CNN.

I still remember the feeling in the year 2000 when a group of five friends and I shared a modem connection at the University of Antwerp. I used it to create an online message board so we could chat back and forth about mostly mundane things. The modem was slow by today's standards, but the newness of it all was an adrenaline rush. Little did I know that message board would change my life.

In time, I turned this internal message board into a public news and discussion site, where I shared my own experiences using experimental web technologies. Soon, I started hearing from people all over the world that wanted to provide suggestions on how to improve my website, but that also wanted to use my site's technology to build their own websites and experiment with emerging web technologies.

Before long, I was connected to a network of strangers who would help me build Drupal.

Jul 22 2019
Jul 22

Can't make a Drupal camp? Kevin Thull has you covered! Kevin donates his time recording sessions at most North American Drupal camps. I find out why, and what food to bribe him with.

Jul 22 2019
Jul 22

“Is it worth it?”
This summer, I’ve been asked a lot of questions concerning Promet Source’s commitment to sponsoring Drupal Camps, as well as our eagerness to lead training events and be present at them in every way that we can. 
I get that at any given moment, most of us have more on our plates than we can handle. It’s difficult to justify throwing travel and a two or three day camp or convention into the mix.  
That said, my answer to the “Is it worth it” question is an enthusiastic, “YES.”
Here’s why:

1. It’s good for business

Key among my objectives is the opportunity to connect with a new client or two, but regardless of the expectation that I have walking into an open-source event, I always learn something new or gain a new perspective or solidify a relationship in an expectation-exceeding kind of way. The basic maxim, “You don’t know what you don’t know,” rings very true during Drupal training events, and I’ll add to that, “You don’t know who you don’t know.”

2. You owe it to the community

The defining feature of any open source product is not the technology, it's the people. Open source thrives or dies based on the community that supports it. Any company that depends on open source has a responsibility to give back to the community. Giving back can come in many forms.

  • Contribute code back to the project, update documentation and other traditional "technical" facets of open source
  • Sponsor the project financially, either by funding critical work or by sponsoring events that bring the community together
  • Provide content, programming, and volunteer labor to the events
  • Just show up

If your company is built on open source, consider all of the above as goals. 

3. Jobs

This one should be obvious. If you need to hire open source savvy people you will find no greater concentration of them than at a camp or convention. Likewise, if you are looking for a job in the open source world, or simply to network among the movers and shakers, you can usually count on the most connected players participating in and sponsoring open source events.

4. You get smarter

It's probably impossible to attend a Drupal Camp or DrupalCon and not come out of it smarter. The sessions are led by passionate and really smart people sharing interesting ideas and technical solutions to complex problems. Product pitches are usually prohibited, so the presentations are actually useful. And the conversations you end up in on the "hallway track" are often the most interesting of all.

5. It is fun

Get 25 to 2,500 open source advocates together in one place and shenanigans will result. Decorum, and maybe a statute of limitations, prohibits me from going into too much detail here. And a lot of activity is actually very tame. Trivia contests, dinner meetups, and coffee exchanges are regular occurrences.

6. The people

Hey, it's a bonus 6th reason. Oh, the people you'll meet. This should probably be reason #1.

Drupal GovCon starts tomorrow, July 24, 2019, in Bethesda, Md., and runs through July 26. I would love to count you among the people I meet at this event. Contact me and let’s work out a good time to connect.

Jul 21 2019
Jul 21

I talk a lot about the Cloud.  In fact if I were doing shots every time I mentioned the word, I'd probably be facedown on my desk by 9AM.  Like many buzzwords in the C-Suite there's a lot of excitement and confusion around the term.  Unlike some other buzzwords, this movement really is so important it will change the way we all live and work forever.

Cloud is not the first (and probably not the last) disruptor to business via software.  Over the past 50 years or so there have been four major epochs of software, with Cloud being the latest:

  1. Commercialization
  2. Open Source
  3. SaaS
  4. The Cloud


Commercialization

The first major change in software after it was invented was the consolidation of disparate bits of code into commercialized software that could be sold to businesses and consumers alike.  The major ERP providers (Oracle, SAP, Microsoft) and many other companies were founded during this era (Oracle 1977, Microsoft 1975, SAP 1972), and over the next three decades commercial software was sold into businesses, resulting in tremendous consolidation and wealth creation for the players involved.  The peak output of this era was the creation of the ERP, an acronym so widespread in corporate America that few people even in the industry can tell you what it stands for off the top of their head (it's Enterprise Resource Planning).  ERP software is, in theory, software "to rule them all," handling every corner of what is required to run a business.

Open Source

Microsoft: join us or die

Then in the late 80s, Richard Stallman launched the GNU Project, along with a licensing arrangement designed to make software free and easy to share.  The term "Open Source" wasn't coined until the late 90s, around the time the code for the Netscape Navigator browser was released to the public.  For those not around at the time, many nerds (myself included) were in full revolt against the "Evil Empire" of Microsoft, whose dominance in operating systems and consumer and business software alike made them appear like the unstoppable force that Amazon appears to be today.

During this time there were a number of advancements in software: open source languages (e.g., PHP) evolved to support frameworks (e.g., Zend), and frameworks in turn gave way to open source platforms (e.g., Drupal).  The mood from the late 90s well into the decade that followed was a jubilant wave of enthusiasm: for proponents, there was no problem that open source software couldn't solve - and even better, the code was free!

The first cracks in the open source approach appeared during the upgrade cycles of the most well-developed platforms.  In many systems, great care was taken to "preserve the upgrade path," with the intention that the underlying code could be replaced with minimal impact to the application.  In other systems less care was applied, but no matter the approach the net result was the same: upgrading was more difficult and expensive than the initial build.  The reasons are only obvious in hindsight: when upgrading, not only did you need to write the custom code and migrate your existing content over, but you also needed to add enough functionality to rationalize a major re-investment in replacing something that already existed.


SaaS

In the late 2000s, just as the shine was coming off Open Source, the Software as a Service (SaaS) market was coming into full bloom.  If the problem with Open Source was the cost of code ownership, why not lease it instead?  Costs for hosting had dropped significantly, the internet was everywhere all the time, and the software itself (thanks to Open Source) had become a lot cheaper and easier to build.  The secret for SaaS products was the ability to build and sell code at scale.  In the CMS space, DIY projects like WordPress and Drupal came under fire from products like Squarespace and Adobe Experience Manager; Open Source CRM projects faced more challenges from SaaS providers like Salesforce and dozens of other smaller applications.  Even accounting software (never really Open Source) started to cannibalize its own userbase by moving to online versions.  For existing software manufacturers it was an opportunity to convert one-time purchases into recurring revenue streams that would last *forever*.  For many large Open Source projects and communities, it was a bloodbath.  In some cases the pull to SaaS was so strong that former benevolent corporate sponsors walked away with Open Source code and completely rebranded, leaving some Open Source developer communities dazed, confused - and angry.

As a service provider that prides itself on making the best recommendations for our customers, we couldn’t deny the value that existed on these platforms for many of them; conceptually, being aligned with a software platform’s robust roadmap just made sense.  As a result, there were a number of areas - video streaming, CMS, CRM - where we moved away from full software builds and instead helped with product selection and consulting.

The Cloud

Metal Toad at the Emmys

Way back in 2009 we won our first hosting client - the website for the Primetime Emmys.  The scale required was big and the visibility was high, so we set up a small cluster of big, beefy, physical servers in a physical facility to host the website - and it worked.  We hosted the website for several years with zero downtime (thanks to lots of hard work by a smart team and the Akamai CDN), and even though AWS had launched in 2006, we never thought of moving the site there.  Back then I was fond of saying “the Cloud will be great someday, but not today.”  For us, someday came in 2011, with a high-profile rebuild of a website.  The code was released to us one week before a major televised event, and the new code required a major increase in hardware to reach proper performance.  The short timeline meant we simply couldn’t purchase the physical hardware required and rack it in time.  In a Hail Mary attempt to get things across the finish line we set up the site on AWS, held our breath… and it worked perfectly.  From that point on we were true cloud converts.  “Someday” had arrived and the Cloud was great.

For a number of years after that, the Cloud was a cost-effective place to get storage and compute cycles.  The worst part about it was the pricing: Amazon kept dropping the price, a penny per hour at a time.  Google and Microsoft joined the fray (in 2008 and 2010, respectively), and more and more traditional hosting providers (Rackspace, etc.) started offering their own brand of cloud.  Virtualization was in full vogue and the marketplace was vibrant and varied.

Then with the launch of Lambda in 2015, things shifted again.  You didn’t need to reserve computing power for months, weeks, days or even hours - you could buy it in milliseconds.  This shift had profound implications not only for the costs of hosting, but for how software itself should be built.  The concept of serverless architecture started driving discussions and capturing the imagination of developers: software spun up on demand, using only the resources required and freeing them up as soon as they were no longer needed.  If the software in use today were rebuilt this way, there’s no doubt billions of hours of compute time and the associated watts of energy could be freed up - and that’s just the tip of the iceberg.  The AWS cloud now features over 120 products and services, including compute cycles optimized for machine learning, video, AR/VR, IoT, voice, analytics - and even satellites.
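
To make the pay-for-what-you-use model concrete, here’s a minimal sketch of a serverless function using the Python Lambda handler signature (`lambda_handler(event, context)`); the greeting logic and the `name` field in the event payload are purely illustrative, not part of any real service.

```python
# A minimal AWS Lambda-style handler: the function exists only while it
# runs, and you are billed only for the time each invocation executes.
def lambda_handler(event, context):
    # 'event' carries the invocation payload; 'name' is a
    # hypothetical field used here for illustration.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": f"Hello, {name}!",
    }
```

Deployed behind an API gateway, a function like this replaces an always-on web server: no instance is reserved, and concurrency scales up and down automatically with traffic.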

By comparison, the promise of SaaS has gone stale.  Years of deferred investment in software roadmaps, profit maximization, marketing fluff, and walled-garden philosophies have left many SaaS providers in a state of high technical debt and high pricing.  The Cloud is going to eat them for lunch.  By way of example, back in 2003 I built a video transcoding engine (using an Open Source project).  The experience was a labor-intensive, painful one, so in 2006, when scores of SaaS video management platforms were released, I readily endorsed them for my customers - Brightcove, Kaltura, Ooyala - they were all better than my homespun system.  Now, more than a decade later, many of these systems have fallen behind the times and include many features that customers simply don’t want.  On the cloud front, AWS Elemental offers à la carte video features on cloud instances optimized for various functions.  I’m also aware of not one but three large entertainment companies currently looking at overhauling how they manage video.  And that’s just one small corner of what’s possible!

The Cloud is changing how all software should - and will - be built.  Software is no longer a cost center or a mere need-to-have for most corporations, but a critical part of their customer experience and value chain.  If your competitors allow customers to buy online and you do not, you are in trouble.  If your competitors’ software costs less than yours and that allows them to beat you on pricing, you are in trouble.  Most of the software in use in the world right now is old, clunky, and outdated, built on aging, poorly architected infrastructure.  For the visionary CEO this presents an unparalleled opportunity; for the luddite it is an existential threat.  It’s likely that a full shift to the Cloud will take decades, but it will impact everyone.  If you haven’t looked at how your company’s and industry’s software will be impacted, it’s high time to elevate it on your priority list.  The Cloud is coming for all of us.

Jul 20 2019
Jul 20

For starters, over 200 Drupal 8 sites already run CiviCRM!  This post is based on my own research and conversations with those involved, and is intended to be informative and encouraging.  As you may know, CiviCRM works with no fewer than four CMSes at the moment, including three versions of Drupal, two 'officially'.   Understandably, with Drupal 7 end-of-support scheduled for Nov 2021, there has been recent discussion about which to use for your website.

  • Many Drupal shops already support and recommend Drupal 8 with CiviCRM
  • Preferred installation techniques for Drupal 8 have coalesced around the use of the Composer tool
  • Tutorials and how-tos for installation by leading experts are available

Let's back up a step... what is different about Drupal 8?  Well, Drupal 7 to Drupal 8 represents a major leap and upgrade, whereas Drupal 8 to 9 (and beyond) should be significantly less laborious, according to the Drupal head honcho.  The investment in the jump to Drupal 8 will be offset by the easier Drupal updates planned for the future.  Compatibility with CiviCRM should be similarly less laborious to maintain.

The Make it Happen fundraising campaign for Drupal 8 & Civi is currently halfway there.  But what's it for?

  • Improvements to the CiviCRM core installer for use with Composer and a clean build process
  • Funding the CiviCRM Core Team, who are already carefully considering and coordinating the sustainable architecture they will implement
  • An officially supported Drupal 8 release listed for download
  • Future maintenance

But what about other Drupal integration modules we know and love, such as Webform CiviCRM and CiviCRM Entity?  Again, great question...

  • Efforts to port these modules to Drupal 8 are already well underway, and nearing completion!
  • These modules have little to no dependency on the Drupal 8 & core CiviCRM Make it Happen effort
  • There are issue queues and sponsors for Entity and Webform. Code and financial contributions are welcome

Ok but "What do I do?" you may ask.  First off, that's a deeply individual question, depending on what your website already does and what you need it to do with CiviCRM.  But regardless it's a good idea to start this conversation right now.  Here's some food for thought:

  • Discuss with your CiviCRM consulting partner that supports Drupal.  If you are a consultant, consider a proactive chat with your clients
  • You don't need to wait for an official release, you can get started today building Drupal 8 with Civi using these tips and installation guides
  • Get involved technically and financially by supporting individual module efforts and/or the Make it Happen
  • Talk to others, ask their opinion, and get them involved.  You can find support on CiviCRM's Stack Exchange or the 'dev' and 'drupal' channels of Civi's MatterMost chat
  • Consider another CMS for your new website that already has official CiviCRM support.  WordPress and Joomla! come to mind of course.  But it's worth mentioning Backdrop specifically in this category, due to it being a Drupal offshoot with a supposedly clean upgrade path from Drupal 7
  • Or you could wait a little bit longer... and see what happens.  Drupal 7 isn't going away at the end of 2021, it's just no longer officially supported.  Third-party providers are already gearing up for long-term support of Drupal 7 into 2022 and beyond.  So if you've built a behemoth with Drupal 7, you may be able to keep it active for quite some time

Hope this helps people get their heads wrapped around what's going on...comments welcome!

Jul 19 2019
Jul 19

If your website is the hub for your audience to interact with your brand, then presumably you are doing all sorts of marketing tactics to get them there.

Once they are there, how are you tracking them? How do you know your efforts are effective?

There are a few key metrics you should be tracking to help you optimize your marketing efforts, understand how the site is doing, continuously improve your website, and report to others in your company about where the focus of marketing needs to lie.

First off, you need to determine what your goals are per marketing activity. How are you performing on these goals now? What are you doing to affect those goals? How will you measure it? Identify not only your main conversions, like a form completion or a purchase, but also soft conversions like a newsletter sign-up or a PDF download.

Next, either review your metrics based on these items or put in these metrics to track moving forward. 

Here are some commonly reviewed and important items to track. Most of these will be familiar to you, but #6 can be a game changer!

1. Time on Site

This metric allows you to see an aggregate of how long your audience spends on your site. If your site is centered around exploration and information, you will want this number to increase over time.

2. Bounce Rate

Your bounce rate is the percentage of users who visit your site but view only one page and then leave. Google Analytics records a bounce as a session lasting 0 seconds: the visitor sees one page of your site but never triggers a second interaction, so Analytics has nothing from which to calculate a session duration.
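
The arithmetic behind the metric is simple. This sketch is not a Google Analytics API call, just the underlying formula, so you can sanity-check the numbers in your own reports:

```python
# Bounce rate: the share of sessions that viewed exactly one page,
# expressed as a percentage of all sessions.
def bounce_rate(single_page_sessions, total_sessions):
    if total_sessions == 0:
        return 0.0  # avoid dividing by zero on a day with no traffic
    return 100 * single_page_sessions / total_sessions

# For example, 420 single-page sessions out of 1,000 total sessions
# is a 42% bounce rate.
```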

Several reports suggest that an “acceptable” bounce rate can range from 26% to 70%. But that is a large range across multiple industries. Look deeper to learn what is considered “acceptable” in your industry, because whether a bounce rate is high absolutely depends on your industry and the goals of your site. For example, if you are a restaurant and the visitor simply visits to grab your phone number, then you have reached your goal!

This should be looked at in combination with the other analytics in this article, since looking at the bounce rate alone will not tell you an accurate story. Researching a good bounce rate for your website type and industry is fantastic, but also look to see where you are today and then focus on reducing it (if appropriate).

3. Number of Pages visited

Again, if your site is more informational and built to provide a “next step” for exploration with your users, then you will want this metric to increase. If the number is closer to 1, but you focus all your traffic on a single page, then you should look deeper into that single page’s analytics before you are concerned with this number.

4. New vs Returning Visitors

In Google Analytics, there is an overlap in these numbers. “New Visitor” is a unique visitor visiting your site for the first time, on a specific device. If you visit a site once on your phone, then again on your desktop, you will be counted as 2 new visitors.

Once the visitor visits your site again on a device they already used, they will be counted as a “Returning Visitor” for the next two years (then the clock starts over again).

This could be a great metric to use when you are running a campaign in different areas or industries, for example. If you pay close attention, you can see which campaigns garnered more new traffic.

5. Traffic Sources

Analytics programs will report where your traffic is coming from, illuminating your more and less popular sources. They will also list referral sites, which helps you see your ROI if you partner with others to send traffic to your site.

Seeing how each traffic source performs keeps you on the path of honing what works well for you (and what does not).

6. Search

The most crucial advice we provide our clients is to track your in-site search.

You enable this as an admin in your “view settings” for Google Analytics. The reason this is so very powerful is that it shows you exactly what your visitors want from your site.

Behavioral studies from the Nielsen Group and other research show that more than 50% of people visiting a site’s start page go straight to the internal search box in order to navigate. Those figures suggest that the search box is an essential navigation tool on every website.

From this data, you can organize, adjust or create your content plan. You can revamp your navigation or the order at which content is laid out on your site. You can write relevant FAQs or shift your focus from one audience group to another. The reason this can be so compelling for your business is because you are directly answering the needs of your audience. 
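
As a sketch of how you might mine that data, here is a simple tally of exported in-site search queries; the query list is made up for illustration, and real data would come from your analytics tool’s Site Search report:

```python
from collections import Counter

# Hypothetical export of in-site search queries (illustrative only).
queries = [
    "opening hours", "returns policy", "opening hours",
    "contact", "returns policy", "opening hours",
]

# The most frequent queries are strong candidates for prominent
# navigation links, FAQ entries, or new content.
top_terms = Counter(queries).most_common(2)
print(top_terms)  # [('opening hours', 3), ('returns policy', 2)]
```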

These will get you started!

Many more metrics exist which can help you to analyze your effectiveness in your marketing tools and traffic sources, but these six are the best ones with which to start. Once you have defined what is important for you, continue to review your analytics over the course of time so you can continually optimize your site’s effectiveness.

Your website is a living and breathing entity that needs nurture and care to continue its growth and work harder for your business. If you need help with a strategy to define your metrics, contact us. We’d be glad to help. 

Jul 19 2019
Jul 19
August 2 - August 4, 2019 King Center, Auraria Campus, Denver, Colorado DrupalCamp Colorado 2019 (Official Site)

Palantir is excited to return to Denver as a sponsor for DrupalCamp Colorado 2019, featuring a keynote from our CEO, Tiffany Farriss. Tiffany will be discussing the role of organizational culture and open source projects like Drupal in the success of tech companies. We hope to see you there!

  • Location: TBD
  • Date: August 3rd, 2019
  • Time: 9 AM - 10 AM MDT
Jul 19 2019
Jul 19
July 16th, 2019 Oregon Convention Center, Portland, Oregon [email protected] (official site)

Open source looks very different now compared to 20 years ago, and with such a vast community of developers, it is difficult to define the exact role of a “good” open source citizen.

Palantir is thrilled to be participating in Keeping Open Source Open -- a panel including CEO, Tiffany Farriss for a spirited discussion on open source strategy and the future of open source.

Other panelists include Zaheda Bhorat (Amazon Web Services) and Matt Asay (Adobe). The panel will air some of the strongest opinions on Twitter.

  • Time: 1:30 PM - 2:20 PM
  • Location: F150/151


Jul 19 2019
Jul 19
July 17 - 18, 2019 John Jay College of Criminal Justice, New York City, New York Decoupled Days (Official Site)

Our team is so enthusiastic to participate in the third iteration of Decoupled Days. Palantir is excited to sponsor this year’s event and continue to share our insights into Content Management Systems.

Content Modeling for Decoupled Drupal

Join Senior Engineer and Technical Architect Dan Montgomery for a session on content modeling. He’ll break down:

  • How a master content model can enable scalable growth
  • How to create a standardized structure for content
  • How Drupal can function as a content repository that serves other products

You’ll walk away with an understanding of how to develop architecture and structures that are scalable for future, unknown endpoints.

  • Date: Thursday, July 18
  • Time: 9:00am
  • Location: Room 
Jul 19 2019
Jul 19

A design system gives you a “lego box” of components that you can use to create consistent, beautiful interfaces.

Design System artifacts go by many names - Living Style Guides, Pattern Libraries, UI Libraries, and just plain Design Systems. The core idea is to give digital teams greater flexibility and control over their website. Instead of having to decide exactly what all pages should look like in one big redesign and then sticking with those templates until the next redesign, a design system gives you a “lego box” of components the team can use to create consistent, beautiful interfaces. Component-based design is how you SCALE.

At Palantir we build content management systems, so we’ve named our design system artifact a “style guide” in a nod to the editorial space.

Our style guides are organized into three sections:

  1. 'Design Elements' which are the very basic building blocks for the website.
  2. 'Components' which combine design elements into working pieces of code that serve a defined purpose.
  3. 'Page Templates' which combine the elements and components into page templates that are used to display the content at destination URLs.

But how do we help our clients determine what the list of elements, components and page templates should be?

How to Identify Elements for Your Design System

In this post I’ll walk through how we worked with the University of Miami Health System to create a style guide that enabled the marketing team to build a consistent, branded experience for a system with 1,200 doctors and scientists, three primary locations, and multiple local clinics.

1. Start by generating a list of your most important types of content.

Why are people coming to your site? What content helps them complete the task they are there to do? This content list is ground zero for component ideation: how can design support and elevate the information your site delivers?

Table of content types

The list of content serving user needs is your starting point for components. In addition, we can use this list to identify a few page templates right off the bat:

  • Home page
  • Treatment landing page
  • Search page
  • Listing page: Search results, news, classes
  • Clinical trials landing page
  • Clinical trial detail page
  • Location landing page
  • Appointment landing page
  • Appointment detail page
  • Basic page (About us, contact us, general information)

This is just the start of the UHealth style guide; we ultimately created about 80 components and 17 page templates. But it gives you a sense of how we tackled the challenge!

2. Sort your list of important types of content into groups by similarities.

Visitors should be able to scan your website for the information they need, and distinctive component designs help them differentiate content without having to read every word. In addition, being rigorous about consistently using components for specific kinds of information creates predictable interfaces, and predictable interfaces are easy for your visitors to use.

In this step, you should audit the design and photo assets you have available now, and assess your capacity to create them going forward. If, for example, you have a limited photo library and no graphic artist on staff, you’ll want to choose a set of components that don’t heavily rely on photos and graphics.

Component example for UHealth site

In this example, we have three component types: News, Events/Classes, and a Simple Success story.

  1. News Component: This component has no images. This is largely about content management; UHealth publishes a lot of news, and they didn’t want to create a bottleneck in their publishing schedule by requiring each story to have a digital-ready photo.
  2. Events/Classes Component: This component has an option for images or a pattern. Because UHealth wants visitors to take action on this content by signing up, we wanted these to have an eye-catching image. Requiring a photo introduces a potential bottleneck in publishing, so we also gave them the option to make the image a pattern or graphic.
  3. Simple success story: This is the most visually complex component because successful health narratives are an important element of UHealth’s content strategy. We were able to create a complex component here because there’s a smaller number of success stories compared to news stories or classes and events. That means the marketing team can dedicate significant time and resources to making the content for this component as effective as possible.

3. Now that you’ve sorted your list by content, do a cross-check for functionality.

Unlike paper publications, websites are built to enable actions like searching, subscribing, and making appointments. Your component set should include interfaces for your functionality.

Some simple and common functions for the UHealth site included searching for a treatment by letter, map blocks, and step forms.

In a more complex example, the Sylvester Cancer Center included a dynamic “Find a lab” functionality that was powered by a database. We designed the template around the limitations of the data set powering the feature, rather than ideating the ideal interface. Search is another feature that benefits from planning during the design phase.

For example, these components for a side bar location search and a full screen location search require carefully structured databases to support them. The design and technical teams must be in alignment on the capacity and limits of the functionality underlying the interface.

4. Differentiate components by brand.

UHealth is an enormous health care system, and there are several centers of excellence within the system that have their own logos and distinct content strategies. As a result, we created several components that were differentiated by brand.

UHealth navigation bars

In this example, you see navigation interfaces that are different by brand and language. Incorporating the differentiated logos for the core UHealth system and the Centers of Excellence is fairly straightforward. But as you can see the Sylvester Center also has three additional top nav options: Cancer treatments, Research, and For Healthcare Professionals.

That content change necessitated a different nav bar - you can see that it’s longer. We also created a component for the nav in Spanish, because sometimes in other languages you find that the menu labels are different lengths and need to be adjusted for. In this case, they didn’t, but we kept it as a reference for the site builders.

5. Review the list: can you combine any components?

Your overall goal should be creating the smallest possible set of components. Depending on the complexity and variety of your content and functionality, this might be a set of 100 components or it might be just 20. The UHealth Design System has about 80 components, and another 17 page templates.

The key is that each of the components does a specific job and is visually differentiated from components that do different jobs. You want clear visual differences that signal clear content differences to your audience, and you don’t want your web team spending time trying to parse minor differences - that’s not how you scale!

In my experience, the biggest stumbling block to creating a streamlined list of components is stakeholders asking for maximum flexibility and control. I’ve found the best way to manage this challenge is to provide stakeholders with the option to differentiate their fiefdoms through content rather than components.

UHealth component examples

In this example, we have the exact same component featuring different images, which allows for two widely different experiences. You can also enable minor differentiation within a component: maybe you can leave off a sub-head, or allow for two buttons instead of one.

6. Start building your design system and stay flexible.

The list you generated here will get you 80% of the way there, but as you proceed with designing and building your design system, you will almost certainly uncover new component needs. When you do, first double check that you can’t use an existing component. This can be a little tricky, because of course content can essentially be displayed any way you want.

At Palantir, we solve for this challenge by building our Style Guide components with real content. This approach solves for a few key challenges with building a design system:

  1. Showing the “why” of a component. Each component is designed for a specific type of content - news, classes, header, testimonial, directory, etc. This consistency is critical for scaling design: the goal is to create consistent interfaces to create ease of use for your visitors. By building our Style Guides with real content, we document the thought process behind creating a specific component.
  2. Consistency. Digital teams change and grow. We use content in our Style Guide to show your digital team how each component should be used, even if they weren’t a part of the original design process.
  3. Capturing User Testing. Some of our components, like menus, are heavily user-tested to ensure that we’re creating intuitive interfaces. By building the components with the tested content in place, we’re capturing that research and ensuring it goes forward in the design.
  4. Identifying gaps. If you’ve got a piece of content or functionality that you think needs a new component, you can check your assumptions against the Style Guide. Does the content you’re working with actually fit within an existing pattern, or is it really new? If it is, add it to the project backlog!


The most important takeaway here is that design systems let your web team scale. Through the use of design systems, your digital team can generate gorgeous, consistent and branded pages as new needs arise.

But don’t take our word for it! Tauffyt Aguilar, the Executive Director of Digital Solutions for Miller School of Medicine and UHealth, describes the impact of their new design system:

“One of the major improvements is Marketing’s ability to maintain and grow their site moving forward. Previously each page was designed and developed individually. The ability to create or edit pages using various elements and components of the Design System is a significant improvement in the turnaround time and efficiency for the Marketing department.”

My favorite example of a new page constructed with the UHealth design system is this gorgeous interface for the Sports Medicine Institute.

Sports Medicine homepage

The Sports Medicine audience has unique needs and interests: they are professional and amateur athletes who need to get back in the game. The UHealth team used basic components plus an attention-grabbing image to create this interface for finding experts by issue.

And ultimately, that’s Palantir’s goal: your digital team should have the tools to create gorgeous, effective websites.

Content Strategy Design


Jul 19 2019
Jul 19
Patient engagement solutions company homepage on laptop

Content modeling as a practical foundation for future scalability in Drupal.


Palantir recently partnered with a patient engagement solutions company that specializes in patient and physician education designed to improve health outcomes and enhance the patient experience. They have an extensive library of patient education content that they use to build education playlists, which are delivered to more than 51,000 physician offices, 1,000 hospitals, and 140,000 healthcare providers - and they are still growing.

The company is in the process of completely overhauling their technical stack so that they can rapidly scale up the number of products they use to deliver their patient education library. Currently, every piece of content needs to be entered separately for each product it can be delivered on, which forces the content teams to work in silos. In addition, because they use a dozen different taxonomies and doing so correctly requires a high level of context and nuance, any tagging of content can only be done at the manager level or above. The company partnered with Palantir.net to remove these bottlenecks and plan for future scalability.

Key Outcome

Palantir teamed up with this patient engagement solutions company to develop a master content model that:

  • Captures key content types and their relationships
  • Creates a standardized structure for content, including fields that enable serving content variations based on end-point devices and localization
  • Incorporates a taxonomy that enables content admins to quickly filter and select content relevant to their needs and device

Enabling Scalable Growth

The company’s content library is only getting larger over time, so the core need driving the master content model is to enable scalable growth. Specifically, that means a future state where:

  • New products can be added and old products deprecated without restructuring content. 
  • Content filtering can scale up for new product capabilities, languages, and specialties without having to be fundamentally reworked. 
  • Clients using the taxonomy find it intuitive and require minimal specific training to create and amend their own patient education playlists. 

These principles guided our recommendations for the content model and taxonomy.

Example of content model and related content

Content Model

Our client’s content model is currently organized by the end product that content is delivered through - for example, a waiting room screen vs. an interactive exam room touchscreen. This approach requires the digital team to enter the same piece of content multiple times.

To streamline this process for the team, we recommended a master content model that is organized by the purpose of the content, including the mindset of the audience and the high-level strategy for delivering value with that content.

For example, a “highlight” is a small piece of content intended to engage the audience and draw them into deeper exploration, while a “quiz” is a test of knowledge of a particular topic as training or entertainment.

Example of quiz and responses

This approach allows the company to separate the content types from products, which in turn makes them easier to scale. For example, this wireframe shows how a single piece of quiz content can be delivered on a range of endpoint devices depending on which fields that device uses. This approach allows us to show how a quiz might be delivered on a voice device, which is a product the company does not yet support, but could in the future.
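A purpose-driven model like this can be sketched as one stored quiz item whose fields are selectively consumed by each endpoint. This is an illustrative sketch only; the field names and values are hypothetical, not the company's actual schema:

```json
{
  "type": "quiz",
  "title": "Healthy Eating with Diabetes: Test Your Knowledge",
  "question": "Which snack is the best choice for steady blood sugar?",
  "answers": ["Apple with peanut butter", "Fruit juice", "White crackers"],
  "correct_answer": 0,
  "question_audio": "quiz-123-question.mp3",
  "question_image": "quiz-123-question.jpg",
  "tags": ["Diabetes", "Nutrition"]
}
```

A waiting room screen might render only the question and image, an interactive touchscreen adds the tappable answers, and a future voice device could read the audio field aloud - all from the same stored item, with no re-entry by the content team.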

“Our content is tailored to different audiences with different endpoints. Palantir took the initiative to not only learn about all of our content paths, but to also learn how our content managers interact with it on a daily basis. We’ve relied heavily on their expertise, especially for taxonomy, and they delivered.”

Executive Vice President, Content & Creative


Taxonomy

The company’s taxonomy has 12 separate vocabularies, and using them to construct meaningful content playlists requires a deep understanding of both the content and the audience. Existing content has been tagged based both on the information it contains and on the patients to whom it would be relevant.

For example, a significant proportion of cardiology patients are affected by diabetes, so a piece of content titled "Healthy Eating with Diabetes" would be tagged with both "Diabetes" and "Cardiology". Additionally, many tags have subtle differences in how they are used — when do you use "cardiology" vs. "cardiovascular conditions"? "OB/GYN" vs. "Women's Health"?

This system requires that everyone managing the content — from content creators to healthcare providers and staff selecting content to appear in their medical practice — understand the full set of terms and the nuance of how they are applied in order to tag content consistently.

Our goal was to develop a taxonomy that can be used to filter content effectively without requiring deep platform-specific context and nuance.

Our guiding principles were to:

  • Tag based on the information in the content.
  • Use terms that are meaningful to a general audience.
  • Use combinations of tags to provide granularity.
  • Avoid duplicate information that is available as properties of the content.

Back-end of Drupal editorial experience

We ultimately recommended a set of eight vocabularies. Two of them are based on company-specific business processes, and the remaining six are standards-based so that any practitioner can use them. By using combinations of terms, users can create playlists that are balanced in terms of educational and editorial content.

For example, in our recommended taxonomy, relevant content is tagged as referencing diabetes, so that the person building the playlist can still construct effective content playlists, without needing to carry in their head the nuance that many cardiology patients are also diabetic.

Moving Forward With Next Steps

This content modeling engagement spanned 9 weeks, and the Palantir team delivered:

  • A high-level content model identifying the core content types and their relationships
  • A set of global content fields that all content types in the model should have
  • A field level content model for the four most important content types
  • A new taxonomy approach based on internal user testing
  • A Drupal demo codebase showing how the content types and taxonomy can be built in Drupal 8


In the future, the company’s ultimate goal for the platform is to scale their engagement offerings with new content and new technology. With our purpose-driven content model and refined taxonomy, the company can scale their business by breaking down internal content silos and making tagging and filtering content consistent and predictable for their internal team and eventually, their customers. Palantir’s master content modeling work forms a practical foundation for the company’s radical re-platforming work.

Jul 19 2019
Jul 19
Monday, June 17, 2019 | WeWork, 111 W Illinois Street, Chicago, IL | Chicago IA/UX Meetup (official site)

Facilitating design workshops with key stakeholders allows them to have insight into the process of "how the sausage is made" and provides the product team buy-in from the get-go.

Join Palantir's Director of UX Operations, Lesley Guthrie, for a session on design workshops. She'll go over:

  • How to choose the right exercises 
  • How to play to the team skill sets
  • Ways to adjust the workshop to fit the needs of the project 

You'll learn how to sell the idea of a design workshop to stakeholders and collaborate with them on a solution that can be tested and validated with real users.

Jul 19 2019
Jul 19

Content editors can help make the web a more accessible place, one published moment at a time.

Although web accessibility begins on a foundation built by content strategists, designers, and engineers, the buck does not stop there (or at site launch). Content marketers play a huge role in maintaining web accessibility standards as they publish new content over time.

“Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web.” - W3C

Why Accessibility Standards are Important to Marketers

Web accessibility standards are often thought to assist audiences who are affected by common disabilities like low vision/blindness, deafness, or limited dexterity. In addition to these audiences, web accessibility also benefits those with a temporary or situational disability. This could include someone who is nursing an injury, someone who is working from a coffee shop with slow wifi, or someone who is in a public space and doesn’t want to become a nuisance to others by playing audio out loud.

Accessibility relies on empathy and understanding of a wide range of user experiences. People perceive your content through different senses depending on their own needs and preferences. If someone isn’t physically seeing the blog post you wrote or can’t hear the audio of the podcast you published, that doesn’t mean you as a marketer don’t care about providing that information to that audience, it just means you need to adapt in the way you are delivering that information to that audience.

10 Tips for Publishing Accessible Content

These tips have been curated and compiled from a handful of different resources including the WCAG standards set forth by W3C, and our team of accessibility gurus at Palantir. All of the informing resources are linked in a handy list at the end of this post. 

1. Consider the type of content and provide meaningful text alternatives.

Text alternatives should help your audience understand the content and context of each image, video, or audio file. It also makes that information accessible to technology that cannot see or hear your content, like search engines (which translates to better SEO).

Icons to show image, audio, video

Types of text alternatives you can provide:

  • Images - Provide alternative text.
  • Audio - Provide transcripts.
  • Video - Provide captions and video descriptions in action.

This tip affects those situational use cases mentioned above as well. Think about the last time you sent out an email newsletter. If someone has images turned off on their email to preserve cellular data, you want to make sure your email still makes sense. Providing a text alternative means your reader still has all of the context they need to understand your email, even without that image.

2. Write proper alt text.

Alternative text or alt text is a brief text description that can be attributed to the HTML tag for an image on a web page. Alt text enables users who cannot see the images on a page to better understand your content. Screen readers and other assistive technology can’t interpret the meaning of an image without alt text.

With the addition of required alternative text, Drupal 8 has made it easier to build accessibility into your publishing workflow. However, content creators still need to be able to write effective alt text. Below I’ve listed a handful of things to consider when writing alt text for your content.

  • Be as descriptive and accurate as possible. Provide context. Especially if your image serves a specific function, people who don’t see the image should come away with the same understanding as if they had seen it.
  • If you’re sharing a chart or other data visualization, include that data in the alt text so people have all of the important information.
  • Avoid using “image of,” “picture of,” or something similar. It’s already assumed that the alt text is referencing an image, and you are losing precious character space (most screen readers cut off alt text at around 125 characters). The caveat to this is if you are describing a work of art, like a painting or illustration.
  • No spammy keyword stuffing. Alt text does help with SEO, but that’s not its primary purpose, so don’t abuse it. Find the happy medium between including all of the vital information and working in one or two of the keywords you’re trying to target.
Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.

Example of good alt text: “Red car in the sky.”
Example of better alt text: “Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.”
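In HTML, alt text lives in the image tag’s alt attribute. A sketch using the car illustration above (the file name is hypothetical):

```html
<!-- Weak: vague, tells the reader almost nothing -->
<img src="car-illustration.png" alt="Red car in the sky.">

<!-- Better: descriptive and within the ~125-character budget -->
<img src="car-illustration.png"
     alt="Illustration of red car with flames shooting out of the back, flying over line of cars on sunny roadway.">
```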

3. Establish a hierarchy.

Upside down pyramid split into three sections labeled high importance, medium importance, low importance

Accessibility is more than just making everything on a page available as text. It also affects the way you structure your content, and how you guide your users through a page. When drafting content, put the most important information first. Group similar content, and clearly separate different topics with headings. You want to make sure your ideas are organized in a logical way to improve scannability and encourage better understanding amongst your readers.

4. Use headings, lists, sections, and other structural elements to support your content hierarchy.

Users should be able to quickly assess what information is on a page and how it is organized. Using headings, subheadings and other structural elements helps establish hierarchy and makes web pages easily understandable by both the human eye and a screen reader. Also, when possible, opt for using lists over tables. Tables are ultimately more difficult for screen reader users to navigate.
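In markup terms, that hierarchy comes from properly nested heading levels and list elements. A generic sketch (the topic and headings are invented for illustration):

```html
<h1>Patient Education Resources</h1>

<h2>Nutrition</h2>
<p>A short overview of the section's content.</p>

<h3>Healthy Eating with Diabetes</h3>
<ul>
  <li>Plan meals around whole grains</li>
  <li>Watch portion sizes</li>
</ul>

<h2>Exercise</h2>
```

Heading levels should step down one at a time (h1 to h2 to h3, never h1 straight to h4); screen readers use this nesting to build the page outline a sighted reader gets by scanning.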

If you’re curious to see how structured your content is, scan the URL using WAVE, an accessibility tool that allows you to see an outline of the structural elements on any web page. Using WAVE can help you better visualize how someone who is using assistive technologies might be viewing your page.

5. Write a descriptive title for every page.

This one is pretty straightforward. Users should be able to quickly assess the purpose of each page. Screen readers announce the page title when they load a web page, so writing a descriptive title helps those users make more informed page selections.

Page titles impact:

  • Users with low vision who need to be able to easily distinguish between pages
  • Users with cognitive disabilities, limited short-term memory, and reading disabilities
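The page title is the head’s title element. A sketch (site and page names here are illustrative), with the most specific information first so it survives truncation in browser tabs and search results:

```html
<head>
  <title>10 Tips for Publishing Accessible Content | Example Site</title>
</head>
```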

6. Be intentional with your link text.

Write link text that makes each link’s purpose clear to the user. Links should provide info on where you will end up or what will happen if you click on that link. If someone is using a screen reader to tab through 3 links on a page that all read “click here,” that doesn’t really help them figure out what each link’s purpose is and ultimately decide which link they should click on.

Additional tips:

  • Any contextual information should directly precede links.
  • Don’t use URLs as link text; they aren’t informative.
  • Avoid writing long paragraphs with multiple links. If you have multiple links to share on one topic, it’s better to write a short piece of text followed by a list of bulleted links.

EX: Use "Learn more about our new Federated Search application" not "Learn more".
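In markup, the difference looks like this (the URL is hypothetical):

```html
<!-- Unhelpful: a screen reader tabbing through links hears only "Click here" -->
<a href="/federated-search">Click here</a> to learn more.

<!-- Better: the link text itself states the destination -->
Learn more about our <a href="/federated-search">new Federated Search application</a>.
```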

7. Avoid using images of text in place of actual text.

The exact guideline set forth by W3C here is “Make it easier for users to see and hear content including separating foreground from background.”

There are many reasons why this is a good practice that reach beyond accessibility implications. Using actual text helps with SEO, allows for on-page search ability for users, and creates the ability to highlight for copy/pasting. There are some exceptions that can be made if the image is essential to include (like a logo). Providing alt text also may be a solution for certain use cases.

8. Avoid idioms, jargon, abbreviations, and other nonliteral words.

The guideline set forth by W3C is to “make text content readable and understandable.” Accessibility aside, this is important for us marketers in the Drupal world, because it’s really easy to include a plethora of jargon that your client audience might not be familiar with. So be accessible AND client-friendly: if you have to use jargon or abbreviations, provide a definition of the word, link to the definition, or explain any abbreviations on first reference.

Think about it this way: if you are writing in terms people aren’t familiar with, how will they know to search for them? Plain language = better SEO.

9. Create clear content for your audience’s reading level.

For most Americans, the average reading level is a lower secondary education level. Even if you are marketing to a group of savvy individuals who are capable of understanding pretty complicated material, the truth is, most people are pressed for time and might become stressed if they have to read super complicated marketing materials. This is also important to keep in mind for people with cognitive disabilities, or reading disabilities, like dyslexia.

I know what you’re thinking, “but I am selling a complicated service.” If you need to include technical or complicated material to get your point across, then provide supplemental content such as an infographic or illustration, or a bulleted list of key points.

There are a number of tools online that you can use to determine the readability of your content, and WebAIM has a really great resource for guidelines on writing clearly.

10. Clearly label form input elements.

If you are in content marketing, chances are you have built a form or two in your time. No matter whether you’re creating those in Drupal or an external tool like Hubspot, you want to make sure you are labeling form fields clearly so that the user can understand how to complete the form. For example, expected data formats (such as day, month, year) are helpful. Also, required fields should be clearly marked. This is important for accessibility, but also then you as a marketer end up with better data.
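In markup, that means pairing every input with a label element whose for attribute matches the input’s id, stating the expected format and required status in the label itself (field names here are illustrative):

```html
<form>
  <label for="birth-date">Date of birth (DD/MM/YYYY), required</label>
  <input id="birth-date" name="birth-date" type="text" required>
</form>
```

Because the for/id pair is explicit, screen readers announce the label, format, and required status when the field receives focus; many form builders generate this pairing for you, but it is worth verifying.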

Helpful Resources

Here are a few guides I've found useful in the quest to publish accessible content:

Accessibility Tools

Jul 19 2019
Jul 19
NRHRC homepage on a laptop computer sitting on a table with a small plant next to the laptop

How we helped NRHRC conduct user testing to validate an audience-centric navigation. 

ruralcenter.org: User Testing to Validate an Audience-Centric Navigation

The National Rural Health Resource Center (The Center) is a nonprofit organization dedicated to sustaining and improving health care in rural communities by providing technical assistance, information, tools, and resources. Users on The Center’s site are looking for information relating to services they provide, programs and events they coordinate, and resources that have been developed to guide and support rural health stakeholders, like webinars, articles, and presentations.

The Center had been making iterative modifications to their Drupal site to improve wayfinding for their visitors, but the team had not yet been able to conduct any user testing on the organization of the site. The Center partnered with Palantir.net to build on previous architecture work and test, validate, and provide recommendations for a more effective, user-centric navigation that lowers user effort on their site.

The goals of the engagement were to: 


  • Make navigation labels and structure relevant and intuitive to users
  • Test and validate hypotheses with real user data
  • Have the web team partner hands-on with Palantir, so they could see how the user testing processes and tools work and execute these research methods on their own for future optimization efforts

The project had two key constraints:

  • Testing needed to focus on copy and labeling rather than new features. The Center’s goal was to surface UX improvements that their team could implement within the Drupal CMS by iterating on menu labels, menu structure, and copy.
  • Limited budget. The Center’s budget could cover a limited set of tests, so Palantir needed to formulate a testing plan that maximized the value of the user testing.

Palantir and the Center teamed up to run a Top Task survey to inform a new Information Architecture (IA) and then ran a tree test to validate the new IA.

Key results with the new Information Architecture and the optimized tree:

  • 17% higher success rate overall for users completing tasks
  • 8% increase in overall “directness” rate (tasks completed with fewer backtracks)

How did we get there?

Palantir implemented a three-step process:

  1. Work with key stakeholders at the Center to identify key metrics.
  2. Design and implement tests.
  3. Hand off our recommendations for the Center to implement.

Step 1: Work with key stakeholders at the Center to identify key metrics.

It was imperative to understand the Center’s goals as they relate to their user’s goals to be able to optimize the site structure and test against what users find important. 

Because the Center’s site is a resource site first, the goals focused on users being able to find the resources they are looking for.

Key Performance Indicators (KPIs)

How we planned to measure success against our established goals:

  • Customer-reported satisfaction with “findability”
    • “Did this content answer your question?” feature (example)
  • Improvement in task performance indicators
    • Webinar participation
    • Completion of Self-Assessment form
    • Download of publications
  • Qualified, interested service leads

Step 2: Design and implement tests.

Our testing approach was two-fold, with one underlying question to answer: what is the most intuitive site structure for users?

Test #1: Top Task survey

During the Top Task survey, we had users rank a list of tasks we believed they were trying to complete on the site, giving us visibility into their priorities. The results from this survey informed a revised version of the navigation labels and structure, which we then tested in the following tree test. The survey was conducted via Google Forms with existing Center audiences, aiming for 75+ completions.

We then used these audience-defined “top tasks” to inform the new information architecture, which we tested in our second test.

Test #2: IA tree test

During the tree testing of the Information Architecture, we stripped out any visuals and tested the outline of the menu structure. We began with a mailing list of about 2,500 people, split the list into two segments, and A/B tested the new proposed structure (Variant) vs. the current structure (Benchmark). Both trees were tested with the same tasks but using different labels and structure to see with which tree people could complete the tasks quicker and more successfully.

Step 3: Hand off our recommendations for the Center to implement.

Once the tests were completed, users’ behavior was compared to an “ideal” path, and success rates were analyzed. The test results informed our recommendations to help the Center think about label changes that are more user-centric as opposed to internal jargon. 

The Center has worked with Palantir on multiple projects. Palantir delivers their service in close partnership with our small team. This approach has allowed us to build our internal website development capacity and repeat success even after Palantir’s contract work was completed.

Phillip Birk

Senior IT Specialist

Chart showing overall success rate by task

The Outcomes

Overall, users had a 17% higher success rate with the optimized tree, and they completed the tasks with fewer “backtracks” (less second-guessing their path) on the variant.

One of the most impressive results for the Center was that 29% more users could find recorded webinars with the newly proposed tree. 

Next steps for the Center will be to implement the top-level navigation recommendations made by Palantir, and then select KPIs to monitor long-term. They’ll also follow up with program-specific tree test projects.

The greatest mark of success for this project is that the Center’s web team now has knowledge of the tools and processes needed to run these tests on their own, so they can continue to make iterative improvements over time. Websites are one of the most important tools used to deliver business value, and just like your business’ needs evolve over time, so do the needs of your audience. It’s never too late to perform user testing and improve upon your user experience.

Jul 19 2019
Jul 19
April 8 - 12, 2019 | Washington State Convention Center, Seattle, Washington | DrupalCon (official site)

Our team is always excited to catch up with fellow Drupal community members (and each other) in person during DrupalCon. Here’s what we have on deck for this year’s event:

Visit us at booth #709

Drop by and say hi in the exhibit hall! We’ll be at booth number 709, giving away some new swag that is very special to us. Have a lot to talk about? Schedule a meeting with us.

Palantiri Sessions

Keeping That New Car Smell: Tips for Publishing Accessible Content by Alex Brandt and Nelson Harris

Content editors play a huge role in maintaining web accessibility standards as they publish new content over time. Alex and Nelson will go over a handful of tips to make sure your content is accessible for your audience.

Fostering Community Health and Demystifying the CWG by George DeMet and friends

The Drupal Community Working Group is tasked with fostering community health. This Q&A format session hopes to bring to light our charter, our processes, our impact and how we can improve.

The Challenge of Emotional Labor in Open Source Communities by Ken Rickard

Emotional labor is, in one sense, the invisible thread that ties all our work together. Emotional labor supports and enables the creation and maintenance of our products. It is a critical community resource, yet undervalued and often dismissed. In this session, we'll take a look at a few reasons why that may be the case and discuss some ways in which open source communities are starting to recognize the value of emotional labor.

  • Date: Thursday, April 11
  • Time: 2:30pm
  • Location: Exhibit Stage | Level 4

The Remote Work Toolkit: Tricks for Keeping Healthy and Happy by Kristen Mayer and Luke Wertz

Moving from working in a physical office to a remote office can be a big change, yet have a lot of benefits. Kristen and Luke will talk about transitioning from working in an office environment to working remotely - how to embrace the good things about remote work, but also ways in which you might need to change your behavior to mitigate the challenges and stay mentally healthy.

Join us for Trivia Night 

Thursday night we will be sponsoring one of our favorite parts of DrupalCon, Trivia Night. Brush up on your Drupal facts, grab some friends, and don't forget to bring your badge! Flying solo to DrupalCon? We would love to have you on our team!

  • Date: Thursday, April 11
  • Time: 8pm - 11:45pm
  • Location: Armory at Seattle Center | 305 Harrison Street

We'll see you all next week!

Jul 19 2019
Jul 19

We have released version 2.0 of our Federated Search application and Drupal integration.

Since our initial release, we’ve been doing agile, iterative development on the software. Working with our partners at the University of Michigan and the State of Georgia, we’ve made refinements to both the application and the Drupal integration.

Better search results

Default searches now target the entire index rather than the narrower tm_rendered_item field. This change gives Solr admins better control over the refinement of search results, including the use of field boosting and elevate.xml query enhancements.
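For readers unfamiliar with these Solr features: elevate.xml lets an admin pin hand-picked documents to the top of results for a given query. A minimal sketch (the query text and document IDs are invented for illustration):

```xml
<elevate>
  <query text="diabetes">
    <doc id="node/123" />
    <doc id="node/456" />
  </query>
</elevate>
```

Field boosting, by contrast, is applied at query time, for example weighting title matches over body matches with a parameter like `qf=title^5 body^1` when using Solr’s eDisMax query parser.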

Autocomplete search results

We added support for search autocomplete at both the application and Drupal block levels -- and the two can use the same or different data sources to populate results. We took a configurable approach to autocomplete, which supports “search as you type” completion of partial text. These results can also include keyboard navigation for accessibility.

Since the Drupal block is independent of the React application, we made it configurable so that the block can have a distinct API endpoint from the application. We did this because the state of Georgia has specific requirements that their default search behavior should be to search the local site first, looking for items marked with a special “highlighted content” field.

Enter search terms field with list of suggested results

Wildcard searching

We fully support wildcard searches as a configuration option, so that a search for “run” will automatically pass “run” and “run*” as search terms.
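One way to implement that expansion (a sketch of the idea, not the module’s actual code) is to append a wildcard variant of each user-entered term before sending the query to Solr:

```javascript
// Expand plain search terms into term-plus-wildcard pairs, so a search
// for "run" also matches "running", "runner", etc. in the Solr index.
function expandWildcards(query) {
  return query
    .trim()
    .split(/\s+/)          // split on any whitespace
    .map((term) => `${term} ${term}*`)
    .join(' ');
}

console.log(expandWildcards('run'));        // "run run*"
console.log(expandWildcards('heart care')); // "heart heart* care care*"
```

Keeping the original term alongside the wildcard form preserves exact-match relevance scoring while still broadening recall.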

Default facet control

The default facet sets for the application -- Site, Content Type, and Date Range -- can now be disabled on a per-site basis. This feature is useful for sites that contribute content to a network but only wish to search their own site’s content.

Enhanced query parameters

We’ve added additional support for term-based facets to be passed from the search query string. This means that all facet options except dates can be passed directly via external URL before loading the search form.
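In practice, that means a link from any external page can land users on the search form with facets already applied. A hedged sketch (the parameter names and URL are illustrative, not necessarily the module’s exact query string format):

```javascript
// Build a search URL that pre-seeds the form with a term and two facets.
const params = new URLSearchParams({
  search: 'diabetes',
  site: 'Medical Campus',
  content_type: 'article',
});

const url = `https://example.edu/search?${params.toString()}`;
console.log(url);
// https://example.edu/search?search=diabetes&site=Medical+Campus&content_type=article
```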

Better Drupal theming

We split the module’s display into proper theme templates for the block and its form, and we added template suggestions for each form element so that themes can easily enhance or override the default styling of the Drupal block. We also removed some overly opinionated CSS from the base style of the application. This change should give CSS overrides better control over element styling.

What’s Next for Users?

All of these changes should be backward compatible for existing users, though minor changes to configuration may be required. Users of the Drupal 8.x-2.0 release will need to run the Drupal update script to load the new default settings. Sites that override CSS should confirm that they address the new styles.

Currently, the changes only apply to Drupal 8 sites. We’ll be backporting the new features to Drupal 7 in the coming month.

Users of the 1.0 release may continue to use both the existing Drupal module and their current JS and CSS files until the end of 2019. We recommend upgrading to the 2.0 versions of both, which requires minor CSS and configuration changes you can read about in the upgrade documentation.

Special Thanks

Palantir senior engineer Jes Constantine worked through the most significant changes to the application and integration code. Senior front-end developer Nate Striedinger worked through the template design and CSS, and engineer Matt Carmichael provided QA and code review. And a special shoutout to James Sansbury of Lullabot -- our first external contributor.

Development Drupal Open Source
Jul 19 2019
Jul 19

At Kanopi Studios, we believe that Drupal is an especially strong choice for government websites, further validated by the fact that governments across more than 150 countries have turned to Drupal to power their digital experiences. This includes major sites in the United States like The White House and NASA.

What makes Drupal the best choice? Read on for our top 8 reasons why Drupal should be the content management system of choice for government websites.

1. Mobility 

Website traffic from mobile devices surpassed desktop traffic years ago. In fact, according to Pew Research Center, one in five adults in America are smartphone-only internet users, and that number is likely to continue to grow. Government websites need to prioritize a superior mobile experience so they can meet the needs of citizens of all ages and economic levels and allow users to access critical information on the go.

Drupal can help. Drupal 8 was built to scale across devices, load mobile content at top speeds, provide a wide selection of responsive themes, and more. Drupal also allows content editors the ability to add or update site content via mobile, unlocking the ability to make emergency updates from anywhere.

2. Security 

Offering a secure site that protects your content and sensitive user information is critical for maintaining your reputation and public trust. Drupal offers robust security capabilities, from regular patches to prominent notifications about updates to security modules you can install for additional peace of mind. Unlike other open source platforms, Drupal has a dedicated security council that keeps an eye out for potential issues and develops best practices to keep sites stable and secure.

3. Accessibility

A number of federal, state and local laws require government websites to serve the needs of all citizens, regardless of their abilities. Focusing on accessibility compliance from the very beginning of your website project can help your team avoid costly re-work and launch delays.

Drupal has accessibility baked in, with all features and functions built to conform to the World Wide Web Consortium (WCAG) and ADA guidelines, including the platform’s authoring experience. That means that people of all abilities can interact with your Drupal website, whether they are adding and editing content, reading news, filling out forms, or completing other tasks. Drupal allows screen readers to interpret text correctly, suggests accessible color contrast and intensity, builds accessible images and forms, supports skip navigation in core themes, and much more. 

If you’re a content editor, we recently wrote about eight things you can do to make your site more accessible

4. Simple content management

Drupal’s content editor helps busy government website administrators add posts, pages, and resources in an environment that’s nearly as simple and familiar as a Word document. The what you see is what you get (WYSIWYG) editing mode supports text formatting, links, embedded media, and more.

Drupal also enables administrators to set up customized roles, permissions, and content workflows. This allows any number of team members to contribute to the site while maintaining administrative control of the content that gets through to the public.

5. Ability to handle significant traffic and data

Many government websites store hefty data and resources and see significant spikes in traffic based on seasonal demand, news cycles, and many other factors. Drupal has the power to deal with large databases and intense site traffic with ease.

Drupal’s database capability includes a wide range of ways to sort and organize content via its module system, supporting the needs of almost any content library without the need to create custom code.

Drupal powers a number of heavily visited sites including NBC’s Olympics, The Grammy Awards, and Weather.com, keeping them going strong even when traffic levels are enormous.

6. Flexibility 

The helpful features included in Drupal core are just the beginning. Many, many additional modules have been contributed and tested by the Drupal community and are ready to be added to your site as needed. How many? The Drupal community has contributed well over 40,000 modules, so it’s a safe bet that there’s something already out there that can help meet the needs of your project.

Modules can be added to your site at any time, like building blocks. A few popular examples include social sharing, image editing, calendars, metatags, and modules that support integrations with external systems, from email platforms to customer databases. 

7. Affordability

Government budgets are often tight, with plenty of competing priorities for every dollar spent. With Drupal, you tap into a free, open-source system that’s supported by an enormous community of developers. Building your website on an open-source platform means you can focus your budget on creating an ideal experience for your citizens through professional services including content strategy, user experience, and design rather than dedicating funds to software licensing fees. And Drupal’s flexible modules reduce or even eliminate the need for custom code, helping you save even more.

8. Support for multiple sites in multiple languages

It’s not uncommon for government entities to have multiple websites. Whether your government maintains a few sites or hundreds, building each one individually would require an incredible amount of time and funds. Thankfully, Drupal’s multisite feature allows your site’s code base to be copied and adjusted to create as many new websites as you need, leveraging features that already exist without the need to build them from scratch. To meet language requirements, Drupal offers Content and Entity Translation modules that help content authors translate pages, individual elements, or specific fields into more than 100 languages.

Kanopi Studios loves government website projects

At Kanopi, we’re Drupal experts. We’ve harnessed its power to create citizen-focused sites for the San Francisco Police Department, San Francisco Health Service System and more. We’d love to hear from you, learn about the problems you are trying to solve, and share even more details about how you can put Drupal to work for your government website

Jul 19 2019
Jul 19

On the way to Drupal 9, every site will need to update its custom code as well as its contributed projects. This time, however, instead of rewriting code, only smaller changes are needed. Most contributed modules will only need to deal with a couple of changes. Collaborating with project maintainers is the best way to get to Drupal 9. The first beta of the Upgrade Status module, alongside recent drupal.org changes, focuses on making this much easier.

Upgrade Status beta provides better insight into Drupal 9 readiness

Take the first beta of the Upgrade Status module and run it on your site. It provides an executive summary of results for all scanned projects and lets you inspect each one individually.

Custom and contributed projects are grouped and summarised separately. You should be able to make all the needed changes to your custom code yourself, while for contributed projects you should keep them up to date in your environment and work with the maintainers to get to Drupal 9. The latter is facilitated by displaying available update information inline and by pulling the Drupal 9 plan information from drupal.org projects and displaying it directly on the page.

This is how the summary looks after scanning a few projects:

Upgrade Status summary page after scanning several projects.

Digging deeper from the executive summary, you can review each error separately. The beta release now categorizes the issues found into actionable (Fix now) and non-actionable (Fix later) categories, with a Check manually category for items where it cannot decide based on the available information. For custom projects, any deprecation that has a replacement available in your environment is fixable. For contributed projects that support all core versions with security coverage, the window is shifted by a year: only deprecations from two or more releases before the latest Drupal release can be fixed while keeping Drupal core support. So, somewhat ironically, Upgrade Status itself has deprecated API uses that it cannot yet fix (alongside ones it could fix but that we keep specifically for testing purposes):

Upgrade Status project issue list categories

The module is able to catch some types of PHP fatal errors (unfortunately there are still some in projects that we need to figure out the best way to catch). The @deprecated annotation information guiding you on how to fix the issues found is also displayed, thanks to lots of work by Matt Glaman.

Own a Drupal.org project? Direct contributors to help you the way you prefer!

If you own a Drupal.org project that has Drupal 8 code, you should specify your Drupal 9 plans. It is worth spending time to fill in this field so contributors know how you prefer them to help you, making contributions a win-win for you and your users alike. Whether it is a META issue where you plan to collect work, a specific time in the future when you will start looking at Drupal 9 deprecations, or a funding need that must be met before you can move forward, letting the world know is important. This allows others to engage with you the way you prefer. In addition to being displayed in Upgrade Status's summary, it is also shown directly on your project page!

Go edit your project and find the Drupal 9 porting info field to fill in. Some suggestions are provided for typical scenarios:

Drupal 9 porting info field on a project

This will then be displayed on your project page alongside usage and security coverage information. For example, check it out on the Upgrade Status project page.


Special thanks to the dedicated contributors and testers of the Upgrade Status module who helped us get to beta, especially Karl Fritsche (bio.logis), Nancy Rackleff (ThinkShout), Tsegaselassie Tadesse (Axelerant), Bram Goffings (Nascom), Travis Clark (Worthington Libraries), Mats Blakstad (Globalbility), Tony Wilson (UNC Pembroke), Alex Pott (AcroMedia, Thunder), Charlie ChX Negyesi (Smartsheet), and Meike Jung (hexabinær Kommunikation). Thanks to Neil Drumm (Drupal Association) and Angela Byron (Acquia) for collaboration on the Drupal 9 plan field.

Jul 19 2019
Jul 19

Redis is an open-source, networked, in-memory, key-value data store that can be used as a drop-in caching backend for your Drupal site.

Add the Redis module from Drupal.org. You can install and enable the module from the command line.

Edit sites/default/settings.php to add the Redis cache configuration. These are the mandatory, required Redis configurations for every site.

// Use Redis for caching.
$conf['redis_client_interface'] = 'PhpRedis';
// Point Drupal to the location of the Redis plugin.
$conf['cache_backends'][] = 'sites/all/modules/redis/redis.autoload.inc';
$conf['cache_default_class'] = 'Redis_CacheCompressed';
$conf['cache_prefix'] = array('default' => 'pantheon-redis');
// Do not use Redis for cache_form (no performance difference).
$conf['cache_class_cache_form'] = 'DrupalDatabaseCache';
// Use Redis for Drupal locks (semaphore).
$conf['lock_inc'] = 'sites/all/modules/redis/redis.lock.inc';

Enable the module from /admin/modules.

Verify Redis is enabled by going to the Dashboard and clicking "Connection Info". If you see the Redis cache connection string, Redis is enabled.

Visit /admin/config/development/performance/redis and open Connection Information to verify the connection.

Once the Redis server is connected, you can use the PhpRedis functions to set key/value pairs in Redis, as seen below.

function _get_value_redis_cache($key) {
  try {
    $redis = Redis_Client::getClient();
    return $redis->get($key);
  }
  catch (Exception $exception) {
    watchdog('redis_cache', 'Error while getting the value from the Redis cache.', custom_log(array($key, $redis)), WATCHDOG_ERROR);
  }
}

function _set_value_redis_cache($key, $value) {
  try {
    $redis = Redis_Client::getClient();
    $redis->set($key, $value);
    // Expire the key after 15 minutes.
    $redis->expireAt($key, time() + 900);
  }
  catch (Exception $exception) {
    watchdog('redis_cache', 'Error while setting the value in the Redis cache.', custom_log(array($key, $value, $redis)), WATCHDOG_ERROR);
  }
}

Once these generic functions are ready, you can use them to set and get values based on their keys, as seen below.

$key = "_get_tmp_value_from_redis";
$tmp = _get_value_redis_cache($key);
if (empty($tmp)) {
  $tmp = get_actual_value_from_db();
  _set_value_redis_cache($key, $tmp);
  return $tmp;
}
else {
  return $tmp;
}

cheers :)

Jul 19 2019
Jul 19

Parsing HTML versus processing with regular expressions

Suppose you need to extract the URL from a bit of HTML markup like this:

<a href="https://www.drupal.org">Drupal home page</a>

Using regular expressions

If you are familiar with regular expressions, then it is pretty easy to come up with something like /<a href="([^"]+)"/ and use it with built-in PHP functions like preg_match().

Unfortunately, it is more complicated than that. For example:

  • The HTML tags are case-insensitive: you have to match <a> or <A>.
  • There might be other attributes, such as class, id, or name, before or after the href attribute.
  • The URL (value of the href attribute) might be enclosed in single quotes instead of double quotes.
  • There might be newlines within the HTML element.
  • Are you sure that an escaped quote (like \") is not allowed in a URL?

Before you start researching that last question, the point is that you should not spend your time reinventing the wheel.
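To make one of these failure modes concrete, here is a quick sketch (the pattern and URLs are illustrative only) showing how a pattern that assumes double quotes silently misses a single-quoted attribute:

```php
<?php
// Illustrative only: a naive pattern that assumes double-quoted attributes.
$pattern = '/<a href="([^"]+)"/';

// Matches when the attribute uses double quotes...
preg_match($pattern, '<a href="https://example.com">ok</a>', $m1);

// ...but finds nothing when the same link uses single quotes.
preg_match($pattern, "<a href='https://example.com'>missed</a>", $m2);

// $m1[1] is 'https://example.com'; $m2 is an empty array.
```

Each special case like this forces another tweak to the pattern, which is exactly the trap the next section warns about.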

There is an amusing answer on StackOverflow describing the dangers of trying to process HTML with regular expressions, and this practice has come to be known as Parsing Html The Cthulhu Way. The StackOverflow answer ends with the suggestion,

Have you tried using an XML parser instead?

Using the DOMDocument class

In PHP, we can use the DOMDocument and related classes to parse HTML markup. These classes use an HTML parser in the background rather than regular expressions. There are some steps to set things up:

$document = new \DOMDocument();
$document->loadHTML($html_string);
$xpath = new \DOMXPath($document);

After this bit of boilerplate code, we can search the $xpath object with any XPath query and extract whatever attributes we need. For example, to find the href attribute of each <a> element in the source,

foreach ($xpath->query('//a') as $html_node) {
  $href = $html_node->getAttribute('href');
  // Your processing goes here.
}

Using XPath queries gives us a lot of flexibility: we can find elements having a specific class, or we can select those that are nested inside some other HTML element. We did not even think about these possibilities when discussing regular expressions above.
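For example, here is a small self-contained sketch (the markup and the "external" class name are made up for illustration) that selects only the links carrying a specific class:

```php
<?php
// Select only <a> elements that carry the "external" class.
$html = '<p><a class="external" href="https://example.com">Out</a>'
  . '<a href="/about">About</a></p>';

$document = new \DOMDocument();
$document->loadHTML($html);
$xpath = new \DOMXPath($document);

$hrefs = [];
// The concat/normalize-space idiom matches a whole class token,
// so a class like "external-nav" would not match.
$query = '//a[contains(concat(" ", normalize-space(@class), " "), " external ")]';
foreach ($xpath->query($query) as $node) {
  $hrefs[] = $node->getAttribute('href');
}
// $hrefs now holds only the link with the "external" class.
```

Expressing this kind of filter as a regular expression would be far more fragile than the single XPath query above.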

When you are finished processing your DOMDocument element, you can convert it back to a string:

$processed_html = $document->saveHTML();

Migrate API and the ETL paradigm

In Drupal 8, the Migrate API follows the standard Extract, Transform, Load (ETL) structure, and we also keep the terminology from the contributed Migrate module in earlier versions of Drupal:

  • Extract (source plugin): read data from the source
  • Transform (process plugins): change data to match the site’s structure
  • Load (destination plugin): save the data

Each migration has a single source plugin and a single destination plugin, but each field uses at least one process plugin and may use several. I think this is the fun part: creating new, easy-to-configure process plugins is the best way to add reusable code to the framework.

The Transform/process phase is also the right place to handle HTML processing.


New process plugins for managing HTML

So far, Marco and I have contributed four process plugins to the Migrate Plus module. The goal of these plugins is to make it easy to process text fields with proper HTML parsing. The plugins create the required DOMDocument and related objects, so the person writing the migration only has to supply the XPath expression and other configuration.

The dom plugin

This plugin handles creating the DOMDocument object from a string, and then converting back to a string at the end. The other plugins go between these two steps, so they take a DOMDocument object as input, do some processing on it, and return the same object. This is what it looks like in practice:

process:
  'body/value':
    - plugin: dom
      method: import
      source: 'body/0/value'
    # Other plugins do their work here.
    - plugin: dom
      method: export

The dom_str_replace plugin

Suppose, as part of your site upgrade, you decide to change the subdomain. For example, you might decide to change documentation.example.com to help.example.com. If you have any links in your text fields, then you need to update them. You can do this with the dom_str_replace plugin:

- plugin: dom_str_replace
  mode: attribute
  xpath: '//a'
  attribute_options:
    name: href
  search: 'documentation.example.com'
  replace: 'help.example.com'

Warning: The xpath key was called expression in version 8.x-4.2 of the Migrate Plus module. Use xpath starting with the recently released version 8.x-5.0-rc1.

Like the str_replace plugin that is already part of the Migrate Plus module, this plugin supports either basic string replacement, using the PHP str_replace() or str_ireplace() function, or regular expressions, using preg_replace().
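As a hedged sketch of the regex mode: based on how the sibling str_replace plugin is configured, a regular-expression replacement would presumably look something like the following. The regex key and the search/replace values here are assumptions for illustration, so check the plugin's documentation before relying on them.

```yaml
- plugin: dom_str_replace
  mode: attribute
  xpath: '//a'
  attribute_options:
    name: href
  regex: true
  search: '@^http://@'
  replace: 'https://'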

The dom_apply_styles plugin

If you are using the Migrate API to import data from an external source, then you want the imported data to have formatting consistent with the rest of your site. Perhaps you have configured Drupal’s Editor module to add certain CSS classes from the Styles menu of the WYSIWYG editor, but you cannot add those classes to the external source.

This plugin lets you search for an XPath expression and replace the corresponding HTML elements with whatever is configured in the Editor module. For example,

- plugin: dom_apply_styles
  format: full_html
  rules:
    - xpath: '//b'
      style: Bold

This will replace <b>...</b> with whatever style is labeled “Bold” in the Full HTML text format, perhaps <strong>...</strong>.

The dom_migration_lookup plugin

If you are migrating from a Drupal 7 site, then perhaps node/123 on the old site becomes node/456 on the new site. If you have entity-reference fields, then you can update references like these using the migration_lookup plugin from the core Migrate module.

If those references are in links in a text field, then you can now use the dom_migration_lookup plugin:

- plugin: dom_migration_lookup
  mode: attribute
  xpath: '//a'
  attribute_options:
    name: href
  search: '@/node/(\d+)@'
  replace: '/node/[mapped-id]'
  migrations:
    - article
    - page

If either the article or page migration has mapped 123 to 456, then this will replace /node/123 in any href attributes with /node/456.

Like the core migration_lookup plugin, this one violates the strict ETL paradigm, since a process plugin (i.e., code in the Transform stage) has to “peek” at the destination database. Ditto for the dom_apply_styles plugin, which reads configuration from the destination database.



  • Migrate API documentation on drupal.org
  • Migrate Plus module home page
  • Release notes for migrate_plus 8.x-5.0-rc1
  • amusing answer on StackOverflow
  • Parsing Html The Cthulhu Way
  • Change record describing the new DOMDocument-based plugins
  • XPath documentation on MDN
Jul 18 2019
Jul 18

Rain logo

Mediacurrent created the Rain Install Profile to build fast, consistent Drupal websites and improve the editorial experience. Rain expedites website creation, configuration, and deployment.

In this article, we will walk through each of the main features that ship with the Rain distribution. This tutorial is intended to help content authors and site administrators get set up quickly with all that Rain has to offer.

Have a question or comment? Hit me up @drupalninja on Twitter.

Content Moderation

A question we often hear when working with a client is, “how can Drupal help to build a publishing workflow that works for my team and business?"

Drupal 8 marked a big step forward for creating flexible editorial workflows. Building on Drupal 8's support for content moderation workflows, Rain comes pre-configured with a set of Workflow states. The term “states” refers to the different statuses your content can have throughout the publishing process - the four statuses available by default are “Draft”, “Needs Review”, “Published” and “Archived.” They can be easily enabled for any content type. As with everything in Drupal, these states and workflows are highly configurable. 

Once enabled, using content moderation in Drupal 8 is straightforward. After you save a piece of content, initially it will default to the “Draft” status which will remain unpublished. The “Review” status also preserves the unpublished status until the current edits get published. What’s great about Workflow in Drupal 8 is that you can make updates on a published piece of content without affecting the published state of that content until your changes are ready to be published. The video below demonstrates how to enable workflow and see draft updates before they are published.

[embedded content]

To review any content currently in a workflow state you can click on the “Moderated Content” local task which is visible from the main Admin content screen (see below).

Admin content screen


As a best practice, we recommend enabling revisions for all content. This allows editors to easily undo a change made by mistake, and revisions keep a full history of edits for each node. All of Rain’s optional content features have revisions enabled by default. As illustrated below, once you have saved a piece of content, the “Revisions” tab will appear with options for reviewing or reverting a change.

Rain Drupal Content Moderation - Revisions

Media Library

Coming soon to Drupal core is an overhauled Media library feature. In the meantime, Drupal contrib offers some very good Media library features that are pre-configured in Rain. The Rain media features are integrated with most image fields including the “thumbnail” field on all content type features that ship with Rain.

The video below demonstrates two notable features. First is the pop-up dialog that shows editors all media available to choose from within the site. Editors can search or browse for an existing image if desired. Second is the drag-and-drop file upload which lets the editor user drag an image onto the dialog to immediately upload the file. 

[embedded content]


Media Library


Media is commonly embedded within the WYSIWYG editor in Drupal. Rain helps improve this experience by adding a button which embeds the Media library feature to be used within WYSIWYG. The key difference between the Media library pop-up you see on fields versus the pop-up you see within WYSIWYG is that here you will have an option to select the image style. The video below illustrates how this is done.

[embedded content]

embed media


Another WYSIWYG enhancement that ships with Rain is the integrated “Linkit” module that gives users an autocomplete dialog for links. The short video below demonstrates how to use this feature.

[embedded content]

Content Scheduling

A common task for content editors is scheduling content to be published at a future date and time. Rain gives authors the ability to schedule content easily from the content edit screen. Note that this feature will override the Workflow state so this should be considered when assigning user roles and permissions. The screenshot below indicates the location of the “Scheduling options” feature that appears in the sidebar on node edit pages.

node edit

Clean Aliases

Drupal is usually configured with the ability to set alias patterns for content. This will create the meaningful content “slugs” visitors see in the browser which also adheres to SEO best practices. Rain’s approach is to pre-load a set of sensible defaults that get administrators started quickly. The video below demonstrates how an admin user would configure an alias pattern for a content type.

[embedded content]

XML Sitemap

By default, the Rain distribution generates a sitemap.xml feed populated with all published content. For editors, it can be important to understand how to exclude content from a sitemap or update the priority for SEO purposes. The screenshot below indicates where these settings live on the node edit page.

xml sitemap

Metatag Configuration

The default configuration enabled by the Rain install profile should work well for most sites. Metatag, a core building block for your website’s SEO strategy, is also enabled for all optional content features that ship with the Rain distribution. To update meta tags settings on an individual piece of content, editors can simply edit the “Meta tags” area of the sidebar on the edit screen (see below).

Metatag in Drupal

Google Analytics

Enabling Google Analytics on your Drupal website is a very simple process. The Rain distribution installs the Google Analytics module by default but the tracking embed will not fire until an administrator has supplied a “Web Property ID.” The Google Analytics documentation shows you where to find this ID. To navigate to the Google Analytics settings page, look for the “Google Analytics” link on the main admin configuration page. Most of the default settings will work well without change and the only required setting is the “UA” ID highlighted below.

Google Analytics

Enabling Content Features

Rain comes with many optional content features that can be enabled at any time. This includes content types, vocabularies, paragraphs, search and block types. Enabling a content feature will create the corresponding content type, taxonomy, etc., which can then be further customized. Any paragraph feature that is enabled will be immediately visible on any Rain content type that has it enabled. Watch the video below to see an example of how to enable these features.

[embedded content]

Enabling Content Features

Wrapping Up

Mediacurrent created Rain to jump-start development and give editors the tools they need to effectively manage content. All features that ship with Rain are optional and highly configurable. We tried to strike a balance of pre-configuring as many essential modules as possible while still allowing site administrators to own the configuration of their Drupal site.

In the next tutorial, we will “pop open the hood” for Drupal developers to explain in technical detail how to build sites with Rain.

Jul 18 2019
Jul 18

Mike and Matt gather a fleet of Lullabots to talk the ins and outs of continuous integration (CI) in 2019.

Tools and Services mentioned in this episode:

...the efficiency gains that you get in day to day development from a finely tuned continuous integration setup outweighs the downsides of downtime, or even maintenance, of something like Jenkins... — Andrew Berry

This Episode's Guests

Andrew Berry


Andrew Berry is an architect and developer who works at the intersection of business and technology.

Sally Young

Sally Young

Senior Technical Architect working across the full-stack and specialising in decoupled architectures. Core JavaScript maintainer for Drupal, as well as leading the JavaScript Modernization Initiative.

James Sansbury

James Sansbury

An experienced Drupal developer and architect himself, James manages Lullabot's back-end development team. He lives near Atlanta, GA and enjoys fishing, hiking, and camping.

Jul 18 2019
Jul 18

We love to say that Drupal has modules for absolutely everything. Some modules are simple but still important because they cover specific details in the website’s work. They are like the missing pieces of the puzzle that make your website more user-friendly, secure, reliable, and so on. One of them is Registration Confirm Email Address, which we will describe today.

What is the Registration Confirm Email Address module for?

The Registration Confirm Email Address module is meant to create an extra field in the user registration form. As you might have guessed, this field asks users to confirm their email address during registration. They type the address twice, so they are protected against a misspelling or other error.

email confirmation field

Everyone’s happy — the users are not missing out on important emails, and the marketers are happy twice! ;) However, it’s about much more than marketing, because correct email addresses can be really crucial in different scenarios. Users might need them to manage their account, keep track of orders, and so on.

Who created the module?

We are happy to say that our Drupal developer knyshuk.vova is the co-owner of the Registration Confirm Email Address module. Together with the owner montesajudy, they have made it easy to confirm email addresses on Drupal websites.

Our parent company InternetDevels and our country’s Drupal Ukraine Community are listed as supporting organizations for the Drupal 8 branch of the module.

What makes the module special?

The Registration Confirm Email Address Drupal module extracts the email confirmation feature from another module — LoginToboggan. The latter is a complex module that offers numerous modifications to the Drupal login system.

However, if you only need the feature to confirm email address, you can install the Registration Confirm Email Address module instead of using the complex LoginToboggan module.

Having only the features you need, nothing less and nothing more, is good for usability and does not overload your Drupal website.

A closer look at the module’s work

1. Enabling the module

First, we install the module and enable it on the Drupal website.

enabling registration confirm email address drupal module

2. Enabling the option to confirm email address

After the module is enabled, we can select to use its functionality at admin/config/people/accounts. Under “Confirm email address”, check "Use two e-mail fields on registration form."

enabling option to confirm email address on drupal website

3. Creating test user accounts

When it’s configured, we can test the result by trying to create a Drupal website user account. As we see, the registration form now requires the user to retype their email address. It says “Please retype your email address to confirm it is accurate.”

confirm email address field on drupal user registration form

Use the desired registration features on your website

Our Drupal developers are ready to help you install and configure the Registration Confirm Email Address module on your website, or select other modules in this sphere. If there is no desired module available, we will create a custom one for you that will work exactly as you wish.

Let your website’s registration process fully reflect your requirements, and make you and your users happy. It’s easy with our Drupal team!

Jul 18 2019
Jul 18

Sometimes you may want to restrict access to certain pages on your site for users who do not have a specific role. For example, you may want users to upgrade to a paid plan, or you may just want to collect some more information from them.

The Rabbit Hole module controls what happens when a user clicks a link to an entity or enters its URL in the address bar. It can redirect such users to another page on the site.

The Rabbit Hole module works with different types of entities. They could be nodes, users, taxonomy terms and files, to name a few.

This tutorial will explain the basic usage of this module. Let’s start!

Step #1. Download the Required Modules

  • Open the terminal application on your PC
  • Go to the root of your Drupal installation (the composer.json file is inside this directory)
  • Type the following command:

composer require drupal/rabbit_hole

Enter the composer installation command

  • Click Extend
  • Scroll down and check Rabbit Hole and Rabbit Hole nodes
  • Click Install:

Click Install

Step #2. Create the VIP Role

You need a VIP role on your site for paid users who are allowed to view this content. Users without a paid membership will be authenticated users.

  • Click People > Roles > Add Role:

Click People > Roles > Add Role

  • Give this role a proper name
  • Click Save:

Click Save

Step #3. Create Users

For this tutorial, you are going to create one authenticated user and one VIP user.

  • Click People > Add user
  • Enter the user data and give them the role of VIP
  • Click Create new account:

Click Create new account

Notice that I’m working within the development environment. You should always be careful with passwords and make them as strong as possible.

  • Create another user, this time with the role of Authenticated. The People page should now look like the screenshot below:

The People page should look like this

Step #4. Create the VIP Content Type

  • Click Structure > Content Types > Add Content type:

Click Structure > Content Types > Add Content type

  • Give the content type a proper name and description
  • Scroll down and click the Rabbit Hole vertical tab
  • Leave the Allow these settings to be overridden for individual entities option checked. This lets the admin user override the Rabbit Hole behavior on a per-node basis
  • Choose the Page redirect option
  • Enter the URL to which non-paying users will be redirected. It can also be an external URL.
  • Change the Response code to 303
  • Click Save and manage fields:

Click Save and manage fields

  • For practice, add an image field and place it above the body field in the content type's display settings
  • Click Save:

Click Save
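Rabbit Hole stores these per-bundle settings as configuration as well. As a rough sketch, an export for this content type could look like the fragment below. The config name node_type_vip_content, the redirect URL, and the exact key names are assumptions here and may differ between Rabbit Hole versions, so compare against your own exported config:

```yaml
# rabbit_hole.behavior_settings.node_type_vip_content.yml (sketch)
id: node_type_vip_content
entity_type_id: node_type      # the settings apply to a content type
entity_id: vip_content         # assumed machine name of the VIP content type
action: page_redirect          # the "Page redirect" behavior chosen above
allow_override: true           # per-node overrides stay enabled
redirect: 'https://example.com/subscribe'   # hypothetical target URL
redirect_code: 303
```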

Step #5. Set Access Permissions

Users with the VIP role need to be able to bypass the Rabbit Hole control.

  • Click People > Roles
  • Locate the VIP role and click the dropdown arrow
  • Click Edit permissions:

Click Edit permissions

  • Look for the Rabbit hole permissions
  • Check the Bypass Rabbit Hole action for Content permission
  • Scroll down and click Save permissions:

Scroll down and click Save permissions
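After saving, the bypass permission is attached to the VIP role's config entity. A minimal sketch of the relevant part of the export follows, assuming the permission machine name defined by the Rabbit Hole nodes submodule is 'rabbit hole bypass node' (verify the exact string against your module version):

```yaml
# user.role.vip.yml after granting the bypass permission (sketch)
id: vip
label: VIP
permissions:
  - 'rabbit hole bypass node'
```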

  • Click Configuration > Performance > Clear all caches

Step #6. Create Content

  • Click Content > Add content > VIP Content
  • Create a node
  • Click Save:

Click Save

Notice that as the admin user, you can choose a different Rabbit Hole behavior for that particular node, because you left the override option checked in Step #4.

Step #7. Testing the Rabbit Hole module

  • Copy the node URL
  • Log in as the authenticated user:

Log in as an authenticated user

  • Go to the Home page and click the Teaser title of the VIP Content
  • You will be redirected to the access-denied page:

You will be redirected to the access-denied page

  • Now paste the URL you copied into the address bar. The system will redirect you to the same page.
  • Log out and log back in as the VIP user:

Log out and log back in as the VIP user

  • Go to the home page and click the teaser title
  • You will be able to access the node:

You will be able to access the node

There you have it! Thanks for reading. Please leave us your comments and suggestions below.

About the author

Jorge lived in Ecuador and Germany. Now he is back to his homeland Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.

