Oct 30 2018

Our client is migrating from Luminate CMS to Drupal because they want to improve performance without changing the look or feel of the site. Each page on a Luminate site is like a snowflake - unique. It doesn’t make sense to rebuild those features as structured blocks when they appear on only a single page. Being able to reuse existing JS and CSS lets us copy and paste markup without rebuilding a whole structure that would never be repurposed on other pages.

This technically savvy client wants a way to add existing JavaScript and CSS to Drupal pages, so let’s give them the ability to put raw CSS and JavaScript on their pages. This will help them complete the migration by moving their existing code into Drupal. These are the tools the content editors need to make their website beautiful and effective. If your content editors are comfortable writing JavaScript and CSS, here’s how to let them keep doing that.

To make this happen, first create a raw text format.

  • Go to Configuration > Content authoring > Text formats and editors.
  • Add a new text format called “Raw”. None of the filters should be enabled since this will be raw output.

Adding in raw text format

AND…No filters enabled!

Since our client wants to add raw CSS and JavaScript to landing pages, we will create a field on the ‘landing page’ content type. It will be of type Text (formatted, long), labeled “Inline CSS”, and limited to a single value.

Add field inline css

Have it use the Raw text format from the last step. You can limit the field to only this format by installing the Allowed Formats module:

composer require drupal/allowed_formats

Be sure to check the “Raw” box on the field’s settings page and save it.

Now make sure our field is being output.

  • Go to Structure > Content types > Landing Page > Manage display > Full content.
  • Make sure the field is enabled and its label is hidden. It should be output in the default format.

Making sure inline css is displayed

Visit a landing page content form by going to Manage > Content > Add content > Landing Page, and put some real CSS in our new field:

Adding map background raw

We also provide a WYSIWYG field for entering HTML. In this case we need some HTML - perhaps a div with class="map".
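For reference, the markup can be as minimal as this (illustrative only - any element matching the CSS selector will do):

```html
<!-- A hypothetical target for the .map rule entered in the Inline CSS field. -->
<div class="map"></div>
```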

We’re not finished yet! We need to provide a twig template. Look at the output HTML. We get:

<!-- THEME DEBUG -->
<!-- THEME HOOK: 'field' -->
* field--node--field-inline-css--landing-page.html.twig
* field--node--field-inline-css.html.twig
* field--node--landing-page.html.twig
* field--field-inline-css.html.twig
x field--text-long.html.twig
* field.html.twig
<!-- BEGIN OUTPUT from 'core/themes/classy/templates/field/field--text-long.html.twig' -->
<div data-quickedit-field-id="node/589/field_inline_css/en/full" class="clearfix text-formatted field field--name-field-inline-css field--type-text-long field--label-hidden field__item">.map {
background: url(http://www.example.com/assets/images/background-images/banner-landing-page/map.png) center no-repeat;
padding-top: 80px;
min-height: 350px;
}
</div>
<!-- END OUTPUT from 'core/themes/classy/templates/field/field--text-long.html.twig' -->

in our output! Notice the <div> surrounding our CSS - we don’t want that! So it’s time to create a Twig template without extra <div>s, one that outputs raw CSS.

The default template wraps our CSS in all of those extra <div>s. Our new template should do three things:

  1. Remove all <div> tags,
  2. Send the content through the raw filter, and
  3. Surround it with <style> tags.
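Putting those three steps together, the new template might look like this (a sketch - take the exact filename from the theme-debug suggestions above; reading item.content['#text'] and applying |raw are assumptions that rely on you trusting your editors):

```twig
{# field--node--field-inline-css--landing-page.html.twig #}
{# No wrapper divs - just the raw field value inside <style> tags. #}
{% for item in items %}
  <style>
    {{ item.content['#text']|raw }}
  </style>
{% endfor %}
```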
Then the output becomes:

<!-- THEME DEBUG -->
<!-- THEME HOOK: 'field' -->
x field--node--field-inline-css--landing-page.html.twig
* field--node--field-inline-css.html.twig
* field--node--landing-page.html.twig
* field--field-inline-css.html.twig
* field--text-long.html.twig
* field.html.twig
<!-- BEGIN OUTPUT from 'themes/custom/example/templates/field/field--node--field-inline-css--landing-page.html.twig' -->
.map {
background: url(http://www.example.com/assets/images/background-images/banner-section-landing-page/map.png) center no-repeat;
padding-top: 80px;
min-height: 350px;
}
<!-- END OUTPUT from 'themes/custom/example/templates/field/field--node--field-inline-css--landing-page.html.twig' -->

Tada! The CSS shows up ready to use on the page! The same technique lets content editors put JavaScript on the page: just surround the template’s output with <script> tags instead of <style> tags.

Meet your content editors where they are and give them tools they can use - but don’t use this technique with novice or non-technical content editors, since a raw, unfiltered format outputs whatever they enter.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Sep 24 2018

One of the most overlooked barriers to working with Drupal is learning its idiosyncratic naming conventions. Most CMSs use a fairly simple set of terms for things such as Pages, Widgets, and Plugins. Drupal uses more precise, computer-science-flavored language that is nevertheless confusing as heck when you first start working in the CMS.

Let’s review what some of those are!


Node

The basic unit of content in Drupal is the Node. This would be a Page (or Post) in most other CMSs. In the admin interface, however, the term Node is nowhere to be found - everything is simply called Content.

Developers call them nodes. Editors call them content. Why can’t we communicate?


Block

Small pieces of reusable content are called Blocks (and sometimes Beans in Drupal 7). These are Widgets in WordPress and elsewhere. They’re primarily managed through the Block Layout page, under the Structure tab.

Blocks, within the Structure tab.


Entity

This is a really confusing one, since it generally only appears on the programming side of things. Entities as used by Drupal are very similar to Objects. The D7 Intro to Entities page has a very good rundown, though it buries the Entity-to-Object mapping near the end of the page:

  • An *entity type* is a *base class*
  • A *bundle* is an *extended class*
  • A *field* is a *class member, property, variable or field instance* (depending on your naming preference)
  • An *entity* is an *object* or *instance* of a *base* or *extended class*

Most Drupal content - Blocks, Nodes, even Users - consists of different types of Entities.
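The mapping is easier to see in code. A sketch, assuming a site with the common ‘article’ content type (a bundle of the node entity type):

```php
<?php

use Drupal\node\Entity\Node;

// Entity type "node" is the base class; bundle "article" extends it.
// The $node object below is an entity: one instance of that class.
$node = Node::create([
  'type' => 'article',          // the bundle
  'title' => 'Hello, Drupal',   // a field (class member) value
]);
$node->save();
```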


Module

This is a plugin or add-on: it extends and expands the CMS. Modules come in two types: community-contributed (contrib) and custom. Custom modules are where you should write low-level programming for your site.

Drupal maintains a well-curated (but slow) system for vetting and approving contributed modules and patches on Drupal.org. The pages for managing Modules are hidden under the Extend tab.

Modules AKA Plugin or add-on.
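A custom module starts with a .info.yml file. A minimal Drupal 8 example (the module name and description here are placeholders):

```yaml
# mysite_common.info.yml
name: 'Mysite Common'
type: module
description: 'Custom low-level functionality for this site.'
package: Custom
core: 8.x
```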


Hook

Drupal contains a very large library of functions. While some can be used or re-used in a standalone manner, there are special functions called hooks that let your code hook into specific points in the Drupal execution thread. Some must be implemented in a custom module, and some can be used in the template layer (template hooks). To implement one, you rename the first part of the function after the module or theme implementing it.

For example, if you implement the template hook called hook_preprocess_page() in a theme called mytheme, you would rename it mytheme_preprocess_page(). If you implement hook_form_alter in a custom module called mysite_common, it would be mysite_common_form_alter().
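As a sketch, the hook_form_alter() implementation in mysite_common might look like this inside mysite_common.module (the form ID and CSS class are made up for illustration):

```php
<?php

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_alter().
 *
 * "hook" is replaced by the implementing module's name: mysite_common.
 */
function mysite_common_form_alter(array &$form, FormStateInterface $form_state, $form_id) {
  // Illustrative: tag the site-wide search form with an extra class.
  if ($form_id === 'search_block_form') {
    $form['#attributes']['class'][] = 'mysite-search';
  }
}
```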

The full, searchable list of Core hooks is in the Drupal Core API. Additional hooks can be implemented in modules.


Views

This is a UI for a custom database query generator, and it’s largely unique to Drupal. There’s an optional setting in Drupal 8 at /admin/structure/views/settings - ‘Show the SQL query’ - that can be helpful if you know SQL.

Views is used to build many Drupal lists, such as the primary list of content/nodes seen above, blocks, files, etc., and for content display, such as a dynamic list of upcoming events, or “related content” blocks. The place to add or edit Views is under the Structure tab:

Check out these views.


Taxonomy

Sometimes called ‘Tags’, Taxonomy is the part of the site that holds groups of things such as state names, cities, or tags (called Vocabularies); those in turn contain Terms. It is also under the Structure tab:

Taxonomy under Structure tab.

Machine Name

A slug! Not a terrestrial mollusk, but the machine-friendly version of a name - something that can be rendered easily in a URL.


D.o

This is shorthand for Drupal.org, the primary Drupal site, pronounced “dee-dot-oh”.


Drush

This is the command-line tool for Drupal - very useful for tasks like clearing your cache (drush cr).

And there you have it! That’s the most common set of (potentially) confusing terms that you’ll run across when you’re learning to use Drupal. Good luck!

Jul 09 2018

One of the most useful items in the Drupal 8 toolbox is the Paragraphs Module. By creating custom paragraph types, you can have much finer control over the admin and content creation process in Drupal.

A recent client of ThinkShout needed a content type (office locations) to include ‘sub-pages’ for things like office hours, services, and other items depending on the location. Most of the sub-page content was pretty simple, but they also needed to have direct links, be printable, and have the same header as the parent page. This ruled out an Ajax solution.

We’ve been using Paragraphs to make configurable content throughout the site, and since the sub-pages only have Title and Content fields, we thought they would be a good fit here as well. We then decided to explore the possibility of using custom entity routes to fulfill the other requirements.

To start, we created two additional view modes for the sub-page paragraphs called Sub-page and Menu link containing the Content and Title fields respectively. By keeping these fields in separate view modes, we make it much easier to work with them.

Next we created a custom module to hold all of our code, ts_sub_pages. In addition to the standard module files, we added the file ts_sub_pages.routing.yml, which contains the following:

# The route name here is illustrative; the rest restores the standard
# routing.yml structure (defaults, options, requirements).
ts_sub_pages.sub_page:
  path: '/node/{node}/sub-page/{paragraph}'
  defaults:
    _controller: '\Drupal\ts_sub_pages\Controller\TSSubPagesController::subPageParagraph'
    _title_callback: '\Drupal\ts_sub_pages\Controller\TSSubPagesController::getTitle'
  options:
    parameters:
      node:
        type: entity:node
      paragraph:
        type: entity:paragraph
  requirements:
    _permission: 'access content'

This defines a unique system path based on the parent node ID and the paragraph entity ID. It would look like https://example.org/node/12345/sub-page/321. It also defines the call to the controller and the title_callback, essentially a location where we can create functions to manipulate the entity and its route. The options define the things to pass into the controller and title callback functions, and we also define access permissions using requirements.

One of the odd things about the controller and title_callback calls is that they look like a path, but are not. They have a predefined (and minimally documented) structure. You must do the following to make them work:

  • Create the folder path src/Controller in your module (case is important).
  • Create a file called TSSubPagesController.php - this must match the call.
  • Define a class matching TSSubPagesController in TSSubPagesController.php
  • Define a function matching subPageParagraph inside the TSSubPagesController class.

Example below. The names of the controller file, class, and function are up to you, but they must have the same case, and the file and class must match.

Digging into the TSSubPagesController.php file, we have a setup like so:


<?php

namespace Drupal\ts_sub_pages\Controller;

use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\HttpFoundation\Request;
use Drupal\node\Entity\Node;
use Drupal\paragraphs\Entity\Paragraph;

/**
 * TS Sub Pages controller.
 */
class TSSubPagesController extends ControllerBase {

  /**
   * {@inheritdoc}
   */
  public function subPageParagraph(Paragraph $paragraph, Node $node, Request $request) {

Here we have the namespace - this is our module; note again that the src directory is taken for granted. Next come the Symfony/Drupal use statements that pull in the classes, interfaces, and traits we’ll need. Then we extend the ControllerBase class with TSSubPagesController and define our subPageParagraph function, which pulls in the $node and $paragraph parameters we defined in ts_sub_pages.routing.yml.

Now we can finally get to work on our sub-pages! Our goal here is to bring in the parent node header fields on every sub-page path. In the Drupal admin interface, go to ‘Manage Display’ for your content type. In our case it was /admin/structure/types/manage/location/display. Scroll to the bottom and under ‘Custom display settings’ you’ll find a link to ‘Manage view modes’. We added a mode called sub-page, and added all of the fields from our Location’s header.

Now we can bring that view of the node into the sub-page using the subPageParagraph function we defined above:


public function subPageParagraph(Paragraph $paragraph, Node $node, Request $request) {
  $node_view_builder = \Drupal::entityTypeManager()->getViewBuilder('node');
  $node_header = $node_view_builder->view($node, 'sub_page');

  $paragraph_view_builder = \Drupal::entityTypeManager()->getViewBuilder('paragraph');
  $paragraph_body = $paragraph_view_builder->view($paragraph, 'sub_page');

  return ['node' => $node_header, 'paragraph' => $paragraph_body];
}

We get view builders for the node and paragraph using getViewBuilder(), then render the appropriate view mode of each. The node’s ‘sub-page’ view mode contains all of the header fields for the node, and the paragraph’s ‘sub-page’ view mode contains the paragraph body. We return these, and the result looks like a page when we visit the base paragraph URL of /node/12345/sub-page/321. The title is missing though, so we can add that with another small function inside the TSSubPagesController class (called via the _title_callback in ts_sub_pages.routing.yml):


/**
 * Returns a page title.
 */
public function getTitle(Paragraph $paragraph, Node $node) {
  $node_title = $node->getTitle();
  $paragraph_title = $paragraph->field_title_text->value;

  return $node_title . ' - ' . $paragraph_title;
}

Now we need to build a menu for our sub-pages. For this we can just use the ‘sub-pages’ paragraph field on the parent node. In the admin display, this field is how we add the sub-page paragraphs, but in the public-facing display, we use it to build the menu.

First, make sure the field is included in the ‘default’ and ‘sub-page’ displays as a Rendered Entity, using the “Rendered as Entity” formatter; in its widget configuration, select the “Menu Link” view mode. When we set up the Paragraph, we put the Title field in the ‘Menu Link’ view, so the field now displays the titles of all the node’s sub-pages. To make them functional links, go to the ‘Menu Link’ view mode for your sub-page paragraph type, make the Title a ‘Linked Field’, and use the following widget configuration:

Destination: /node/[paragraph:parent_id]/sub-page/[paragraph:id]
Title: [paragraph:field_title_text]


Next we need to account for the fact that the site uses URL aliases. A node called ‘main office’ will get a link such as /locations/main-office via the Pathauto module. We want our sub-pages to use that path.

We do this by adding a URL Alias to the sub-page routes on creation (insert) or edit (update). In our module, we add the following functions to the ts_sub_pages.module:

// At the top of ts_sub_pages.module:
use Drupal\Component\Utility\Html;
use Drupal\Core\Entity\EntityInterface;

/**
 * Implements hook_entity_insert().
 */
function ts_sub_pages_entity_insert(EntityInterface $entity) {
  if ($entity->getEntityTypeId() == 'paragraph' && $entity->getType() == "custom_subpage") {
    _ts_sub_pages_path_alias($entity);
  }
}

/**
 * Implements hook_entity_update().
 */
function ts_sub_pages_entity_update(EntityInterface $entity) {
  if ($entity->getEntityTypeId() == 'paragraph' && $entity->getType() == "custom_subpage") {
    _ts_sub_pages_path_alias($entity);
  }
}

These run whenever an entity is inserted or updated; the condition limits them to our custom sub-page paragraphs, and they call a custom function we define just below. It’s important to note that we have a custom title field, field_title_text - your title field may be the Drupal default:

/**
 * Custom function to create a sub-path alias.
 */
function _ts_sub_pages_path_alias($entity) {
  $sub_page_slug = Html::cleanCssIdentifier(strtolower($entity->field_title_text->value));

  $node = \Drupal::routeMatch()->getParameter('node');
  $language = \Drupal::languageManager()->getCurrentLanguage()->getId();

  $nid = $node->id();
  $alias = \Drupal::service('path.alias_manager')->getAliasByPath('/node/' . $nid);
  $system_path = "/node/" . $nid . "/sub-page/" . $entity->id();

  if (!\Drupal::service('path.alias_storage')->aliasExists($alias . "/" . $sub_page_slug, $language)) {
    \Drupal::service('path.alias_storage')
      ->save($system_path, $alias . "/" . $sub_page_slug, $language);
  }
}

This function gets the sub-page paragraph title, and creates a URL-friendly slug. It then loads the paragraph’s node, gets the current language, ID, and alias. We also build the system path of the sub-page, as that’s necessary for the url_alias table in the Drupal database. Finally, we check that there’s no existing path that matches ours, and add it. This will leave old URL aliases, so if someone had bookmarked a sub-page and the name changes, it will still go to the correct sub-page.

Now we can add the ‘Home’ link and indicate when a sub-page is active. For that we’ll use a custom Twig template. The default field.html.twig file is the starting point; it’s located in core/themes/classy/templates/field/. Copy and rename it into your theme’s templates directory. Based on the field name, it can be called field--field-sub-pages.html.twig.

The part of the twig file we’re interested in is here:

{% for item in items %}
  <div{{ item.attributes.addClass('field__item') }}>{{ item.content }}</div>
{% endfor %}

This occurs three times in the template, to account for multiple fields, labels, etc. Just before each of the for loops, we add the following ‘home’ link code:

{% if url('<current>')['#markup'] ends with node_path %}
  <div class="field__item active" tabindex="0">Home</div>
{% else %}
  <div class="field__item"><a href="{{ node_path }}">Home</a></div>
{% endif %}

Next, we make some substantial changes to the loop:

{% set sub_text = item.content['#paragraph'].field_title_text.0.value %}
{% set sub_path = node_path ~ '/' ~ sub_text|clean_class %}
{% if url('<current>')['#markup'] ends with sub_path %}
  <li{{ item.attributes.addClass('field__item', 'menu-item', 'active') }}>{{ sub_text }}</li>
{% else %}
  <li{{ item.attributes.addClass('field__item', 'menu-item') }}><a href="{{ sub_path }}">{{ sub_text }}</a></li>
{% endif %}

Here, sub_text gets the sub-page title, and sub_path the path of each sub-page. We then check if the current url ends with the path, and if so, we add the active class and remove the link.

And that’s it! The client can now add as many custom sub-pages as they like. They’ll always pick up the parent node’s base path, so they’re printable, direct links. They’ll have the same header as the parent node, and they’ll automatically be added to or removed from the node’s custom, context-aware menu.

Hmm, maybe this would make a good contributed module?

May 24 2018

Last month I went to my first DrupalCon in Nashville. I met a lot of interesting people, had good conversations, and had a hard time choosing from the record number of sessions. As the week went on, I noticed a theme kept coming up. It showed up in sessions on how to create a better admin and content editing experience, in sessions on accessibility and what it’s like to be a blind or deaf engineer, in conversations about helping first-time users enjoy the experience of using Drupal, and in debates about what Drupal will look like in the future. What if the thing that will give Drupal a competitive advantage and improve the admin experience is the same thing that will attract new users and create sites that are accessible for all?

The idea that kept surfacing during my week at DrupalCon was this: we need empathy. The Drupal community has excelled at solving complex engineering problems, and the next challenge we face is just as critical, though it requires us to think a little differently: how do we make space for empathy in our work and in our community?

It’s time to shift our perspective. Photo Credit: Randy Jacob.

Our Bias is our Blindspot

Sometimes we don’t need more complex solutions, we need thoughtful ones. Building websites is challenging. There’s never enough time or resources. It’s easy to stick with what’s known and what works. But sometimes what I know is limited, and only works for people who look and think like me. It’s easy to become insular and indifferent to the needs of others because it’s hard to make everyone happy, and thinking about the effort required to change can be overwhelming.

If someone told me, “It’s really hard to talk to people with accents, so I just avoid them,” I’d be shocked. But I know I’ve created sites and tools that are difficult—if not impossible—for people with disabilities to use. Arriving in Nashville, I knew enough about accessibility to know that I needed to learn more. So I dove in and attended every session I could.

I kicked off my deep dive with Katherine Shaw and Fito Kahn’s awesome all-day Drupal Accessibility training. Check out Katherine Shaw’s great blog posts on accessibility.

Accessible Empathy

I learned that excuses like “accessibility is hard” or “it doesn’t affect me because I’m not working on a government site” won’t get me off the hook. Accessibility requirements are now being applied to websites under the Americans with Disabilities Act, and any site that is not accessible to all users carries legal risk. I met several engineers who are currently resolving warnings or navigating lawsuits for not meeting WCAG 2.0 guidelines.

But it’s about much more than just changing processes to avoid a lawsuit. Listening to the Core Accessibility panel, I was humbled when it was pointed out that we labor over fixes for Internet Explorer, which can make up 2-3% of users, while 12.6% of people in the US have disabilities (40.7 million people) - yet accessibility is still treated as an edge case. Building a website that works for more users is not difficult, but it takes intention, a willingness to learn, and empathy.

I also learned that having empathy for all types of users doesn’t mean everything has to change immediately. During his talk about accessibility, Everett Zufelt said, “The best place to start? Anywhere. If you fix one button, your site is that much more accessible than it was before.” So I’m challenging myself to build things the right way the first time, drop bad habits and to refine best practices so I can create sites and tools that serve all types of users.

Inward Empathy

For some of you reading this, the challenge might be that you have empathy for everyone in the room, except yourself. You take on multiple roles at work. You handle the backend and the frontend and design and project management. You say yes because you know you can do it and how will you get ahead if you don’t show how valuable you are by doing all of the things all the time? I get it. Now stop it.


“‘No’ might make them angry, but it will make you free.” –Nayyirah Waheed; Photo credit: Clem Onojeghuo

You deserve empathy too, so be kind to yourself. Good boundaries will keep you fresh and sane. A well cared for version of you will help your team more than the stretched and exhausted one that’s running on too little sleep and too much caffeine.

Something that stood out to me in sessions at DrupalCon was how people wouldn’t move over in their seats to make room for those arriving at an already crowded session, so they could sit in chairs instead of on the floor. People would have empty seats on either side and not move down the row to make it easier for others. There are people who have no issue taking up space, taking what they need, and not feeling bad about it for an instant. Let’s find some balance somewhere in the middle. Give yourself the empathy you need to succeed, and–for the love of god–let’s all scoot down so no one is left sitting on the floor.

Outward Empathy

A better admin experience, and faster and more accessible websites are only created when we think about how our work is used by everyone. Take a moment to walk a mile in someone else’s shoes. Now apologize for taking their shoes, sit down and talk to them about how they use your site, what the sticking points are, and how it can be improved. Most importantly, listen. Forget what you think you know, and learn about what it means to be someone else using your website. Then you just might have a week like mine where you were reminded: sometimes engineers are blind or deaf, or both. Sometimes keynotes are a she or a he or a they. Sometimes content editors know exactly what is needed to make a better editing experience–if you just ask.

Be Human. Think Digital.

Empathy is what makes us human. We all want to be seen and known and understood. And at the end of the day we all want to use tools that help us to accomplish a task, not remind us that we’re not who the engineer had in mind. Technology without empathy is hollow. Empathy without technology is limited. Let’s make space for empathy in our community and in our code, and let’s change the world for good—for everyone.

Driesnote Slide: There is no power for change greater than a community discovering what it cares about

One of my favorite slides from the Driesnote at DrupalCon Nashville

Women in Drupal Event

The ThinkShout team hanging out with some awesome folks at the Women in Drupal event.


If you’re interested in learning more about the sessions I attended this week, here are links to some of my favorite talks:

JavaScript and Accessibility: Don’t blame the language

DrupalCon Nashville 2018: Core Accessibility: Building Inclusivity into the Drupal Project

Usability testing is super important and easier than you think

A smarter Way to Test Accessibility - a comparison of top tools (Lighthouse, Tenon.io and WAVE API)

If you’re overwhelmed by accessibility and don’t know where to start, here’s a great video on how to do a very basic accessibility audit.

If you’re interested in refining your accessibility practices, there are some amazing tools and resources available. Here are some of my favorites. If you have tools or processes you love, please share in the comments below!

Style Guide Module: Allows you to run accessibility tests on one page that is automatically populated with all basic layout elements. This is also great as a living style guide for the site.

VoiceOver Screen Reader Getting Started Guide

A11y checklist: A11y has a ton of patterns and a useful checklist.

WAVE Accessibility Plugin: Described in the “A smarter Way to Test Accessibility” talk as the “Cadillac of accessibility plugins,” this free tool will catch errors, mark up the page with an outline of your headings, and make accessibility QA much easier.

Sim Daltonism tool: This overlay tool allows you to preview your site for multiple types of colorblindness.

Color Contrast Ratio Checker: This Chrome plugin will tell you whether the color contrast of fonts on your site passes WCAG 2.0 standards.

ARIA cheat sheet: This doc outlines all of the different ways you can use ARIA to make your site more accessible.

HTML Codesniffer by Squiz: Allows you to set the accessibility standard you want to meet (WCAG2AA is the new legal requirement), and creates a report identifying errors, warnings and notices.

May 13 2018

“We’ve recently updated our privacy policy.”

If you’ve ever given your email address to an online store, social media platform, or other organization - or done just about anything online - then you’ve probably received the above notice in your inbox from those entities with increasing regularity over the last month or two.

Most of these notices are related to the European Union’s General Data Protection Regulation (GDPR), which goes into effect later this month, on May 25, 2018.

To be clear, we at ThinkShout are not lawyers and we strongly encourage our clients and anyone collecting user information in some way, shape, or form to seek legal counsel for your own specific obligations related to the GDPR. Here’s how we’re viewing the regulations and what actions we are taking at ThinkShout.

The big picture

The regulations apply specifically to organizations that collect or process data associated with EU citizens. The overall intent is to give EU citizens control over how their own data is collected and used. The stick being wielded to enforce the regulations is the possibility of fines of up to €20 million or 4% of an organization’s global annual revenue (whichever is greater). Charitable organizations are not exempt from these penalties; however, it’s likely that the steepest fines will be reserved for recurring or significant privacy issues, and that the focus will be on fixing any issues that are discovered. There are questions about enforceability (particularly in the USA) that will likely need to be settled in court, but many of the regulations reflect smart privacy practices regardless of the penalties.

All the chatter and hand-wringing about the GDPR has led to a fast-growing industry of businesses offering compliance audits, consulting, and technical solutions. Some of the vendors offering these services are legitimate, while many are simply selling products or services based on embellished fears.

The principles of the GDPR can be broadly summed up as protecting personal data: individuals choose what data they allow to be collected and how that data is used or processed, and they retain control over that data even after it has been collected. The UK’s Information Commissioner’s Office provides an easy-to-read guide to the GDPR that goes into detail on the various provisions, while the EU provides a more graphical explanation. That last link might be more palatable for the visual learners reading this.

Portion of the EU’s graphical explanation of GDPR - full explanation can be found here.

Does the GDPR apply to you and your users?

In short, probably. While compliance is technically only required when handling data for EU citizens, discerning who is and isn’t an EU citizen can be difficult, and compliance in many cases isn’t all that cumbersome.

Documentation and communication are two of the key areas of responsibility.

Start with an audit of the data you collect from users, the data you collect from other sources and what is done with that data. Note that this isn’t just about new data but also any data already in your various systems (website, Salesforce, spreadsheets, etc.). Once you know what user information you have and why you have it, communicate that information to both your staff and your users by updating your privacy notices, and emailing constituents with that now famous subject line, “We’ve recently updated our privacy policy.”

Document how your data handling processes are shared with new staff. It’s also a good idea to revise privacy policies written by lawyers so that they are “concise, transparent, intelligible and easily accessible” and “written in clear and plain language.”

Here’s an example of good privacy notices and requests for consent.

Basically, ensure that the general population (who did not attend law school) can easily understand the language.

Processing must be allowed under a lawful basis.

Any processing of personal data must be supported by both the need to process that data as well as a lawful basis. Out of the eight lawful bases that the GDPR defines, consent, legal obligation and legitimate interest appear to be the most likely to be cited in the work of our clients. For consent to apply, it must be active (opt-in), current, specific and revocable.

Legal obligation covers data needed for accounting or other legal audit trails. Legitimate interest is less defined, but addresses situations where the usage of the data can be reasonably expected, has minimal privacy impact and there is strong justification for the processing of the data. Using a user’s email address on an account they created to send them a link to reset their password might be an example of legitimate interest as a lawful basis.

Individuals have defined rights to the protection and usage of their data.

  1. The right to be informed: privacy notices, accurate opt-in information, etc.
  2. The right of access: ability to see any data you have on an individual.
  3. The right to rectification: ability to correct any errors in the data you have - allowing users to update their own profiles covers much of this right.
  4. The right to erasure: ability to request data be removed. This is not all encompassing, nor does it need to be automated. Data needed for legal or other legitimate needs can be retained.
  5. The right to restrict processing: ability to request that their data not be processed but also not deleted.
  6. The right to data portability: ability to request a machine readable export of their data.
  7. The right to object: ability to opt out of marketing, processing based on legitimate interest or processing for research or statistical purposes. The process for opting out must be clearly explained in the privacy notice and at the point of first communication.
  8. Rights in relation to automated decision making and profiling: If you collect data to profile individuals for behavior tracking or marketing purposes then additional regulations apply.

What about cookies?

Cookies aren’t specifically called out in the GDPR, however some of its provisions can apply to them. Some experts recommend altering site behavior so that cookies aren’t created until after the user has provided, and the site has recorded, consent. Several vendors offer paid services that support this approach, although altering the code on your site is generally necessary to use them correctly. A few Drupal modules and WordPress plugins also seek to provide this functionality. It is expected that in 2019 the revised e-Privacy Directive will shift some or all of the obligations for managing consent related to cookies to the browser application.


We’re recommending that all our clients take the following steps to ensure compliance:

  • Evaluate your organization’s legal needs related to the GDPR. Consulting with your own counsel is recommended.
  • Appoint an internal person to take responsibility for data protection in your organization. While the GDPR includes provisions for appointing a Data Protection Officer (DPO), it’s specifically for public authorities and organizations whose core business is tracking behavior or processing criminal data. Appointing a staff person will help avoid a diffusion of responsibility regarding data security.
  • Audit your data collection and processing (here’s a sample template):
    • What is being held already and what is being collected?
    • Is there data being collected or stored that isn’t needed?
    • How is the collected data used within the organization?
    • Is there a legal basis for the different pieces of personal data being collected?
    • If consent is the legal basis, is the consent active (opt-in), granular and recent?
  • Review and revise privacy notices and cookie policies to be clearly written and comprehensive. Be sure to include information about third-party data collection (Google Analytics, AddThis, Facebook, etc). Here’s a privacy notice checklist to get you started.
  • Document processes for handling user requests as well as security breaches. Your organization has a month to respond to an individual’s request for export, access, or deletion of their data. In most cases this will currently be a manual process, although there is work happening in both the Drupal and WordPress communities to make these requests easier to accommodate. If there is a data breach, the GDPR states that the regulating agency must be notified within 72 hours. A good starting point is the Security Breach Response Plan Toolkit.
  • Evaluate if changes to your website (beyond the privacy/cookie notices) are necessary. Consider specifically:
    • Is Google Analytics configured properly? Ensure IP anonymization is enabled, data retention settings are correct, and that no personal information is being tracked (check page URLs, titles, etc.).
    • What third-party scripts or pixel trackers are included?
    • How is consent being collected in newsletter signup forms?
    • How is consent being collected in user registration forms?
    • Any other places that user data could be collected?
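To make the Google Analytics point above concrete: in Universal Analytics (the analytics.js version current as of this writing), IP anonymization is a single setting. This is a sketch only - the `UA-XXXXXX-1` property ID is a placeholder, not a real tracking code:

```javascript
// Hypothetical analytics.js snippet; replace 'UA-XXXXXX-1' with your property ID.
ga('create', 'UA-XXXXXX-1', 'auto');
// Ask Google to truncate the last octet of visitor IPs before storing them.
ga('set', 'anonymizeIp', true);
ga('send', 'pageview');
```

If your site uses Google Tag Manager or another tag loader, the equivalent setting lives in that tool’s configuration instead.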

What’s next for us?

Like most agencies, we’re continuing to learn more about the GDPR and the implications for our clients. We are working in partnership with them to help them understand and implement changes that might be needed on their sites or their internal processes. Internally we’re providing additional training on the principles of privacy by design to our team. In terms of our open source work we’ll be incorporating MailChimp’s GDPR consent forms into the Drupal MailChimp modules as soon as the functionality is available in their API. We see opportunities for including functionality related to subject access requests (export, deletion, etc) and consent tracking in our RedHen CRM suite of modules as well.

The bottom line is this: we all need to be cognizant of the GDPR; it’s not solely an EU issue. We’ll continue to keep a close eye on this as the GDPR gets rolled out, and there are many resources out there at your disposal (and within this blog post). You can be sure to get the latest from us on this and other digital trends by signing up for our newsletter and following us on Twitter. Good luck!

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
May 02 2018

It feels like it was just yesterday… probably because we still have visions of beignets dancing in our heads. But we had such a blast at 18NTC, and we have you, the nonprofit community, to thank for it.

Booth Beignets

Our pre-conference Drupal Day was jam-packed with content, from project management to content strategy to Bene.

Drupal Day booth

Brett and Mimi take over registration!

Here were some highlights:

Content Strategy

Brett and Chris Carrol from University of Chicago Graham School definitely schooled us (sorry) on Content Strategy in Drupal.


Lev shares a case study on Bene with the crowd. Bene is an open source distribution intended for small to medium-sized nonprofits. We have a whole page dedicated to it here.

Carie and Jessica talk redesign

Jessica and Carie Lewis Carlson (formerly from the Humane Society of the United States) shared some wisdom on the challenges often faced in a redesign process.

We want to give a special shout out to our collaborators from Gizra and Fionta - who contributed to the breakout sessions – and to anyone who led a BOF (birds of a feather) breakout and conducted a lightning talk to close. We appreciate you adding your voice to the day.

Next year we’re especially excited about NTC because it’s happening right in our home town of Portland, Oregon! We’re eager to participate and contribute to the rich and vibrant nonprofit community that we deeply care about, and also show you some of what we love best about our hometown.

To stay up to date on everything pre-con / Drupal Day related, (or just to stay in touch!) sign up for our email list. We’ll send a monthly newsletter with the latest trends and case studies to share…and as 19NTC approaches (too soon?) you’ll be first to hear about our plans and how you can join in the fun.

Below is the full agenda from Drupal Day 2018, please reach out if you would like slides from any of the presentations. Thanks and see you all soon!

Drupal Day 2018 Schedule

Apr 06 2018

We’re thrilled to announce that Teaching Tolerance, a program of the Southern Poverty Law Center, is up for a Webby! Their mission is to provide educators with the tools to create a kind and inclusive school climate. They do this through a library of articles and resources that they’ve turned into teachable materials via a Learning Plan Builder, and other classroom activities. It’s work that we feel is especially important, now more than ever.

This is a project that meant so much to everyone who touched it, and it was a true partnership every step of the way for both teams (Tolerance and ThinkShout). It certainly speaks to the passion that was put into it from all angles, and it’s an honor to be recognized for this work.

Our Case Study will give you the full scope of work. But for a quick summary: In redesigning their website, ThinkShout set out to turn the wealth of articles and resources Tolerance had into teachable materials, and did so by creating a guided Learning Plan Builder that makes all content classroom-ready. Tolerance grants free access to thousands of resources – from video to essays to proven teaching strategies – and everything within that catalogue is now actionable. We also took on the challenge of migrating their content from two older Drupal 6 and 7 sites into one new Drupal 8 site.

The result? Since launching in the summer of 2017, Tolerance.org has seen improvements across the board:

  • Pages per session up 21%
  • Session duration up 27%
  • Bounce rate decreased by 8%
  • Returning visitors up by 3%
  • Registrations nearly doubled (from 19,000 to 36,000)

Here’s where you come in: our nomination for a Webby means we need the people’s voice (aka VOTES) to actually win. Voting ends April 19th!

Vote for Tolerance.org in the Webby’s

Personally, we can’t think of anything more critical at this time than the work Tolerance.org is doing to ensure the next generation is primed to participate in our democracy. And winning the Webby will certainly help them gain visibility and advance their mission even further.

P.S. Travel Oregon also made it as an honoree in the Travel category, and they were up against some stiff competition! You can see their case study here.

Mar 28 2018

It unfolded like a sitcom. You know, the one where someone promises two separate people to be their date to the dance at school. It’s gonna end badly for everyone.

Well, we saw it coming, and like the hero of that sitcom, we concocted an elaborate scheme to be in two places at once. Can we pull it off? Join us at DrupalCon and #18NTC to find out!

Zack Morris


We’re heading to New Orleans for a week of jam-packed nonprofit goodness. The fun kicks off on April 10th with a pre-con event, Drupal Day:

Agenda Here!

We’re still looking for lightning talk speakers so contact us if you have a hot topic you want to share. Lightning talks are traditionally 5 minutes in length, and can cover any topic ranging from Drupal to nonprofit life to lessons learned and funny predicaments involving all of the above!

Brett Meyer and Jessica Tate will also be dropping some knowledge during the conference itself. Mark your calendars so you don’t miss their talks:

Supporting Culture with Multicultural Design

Content Strategy in Popular Culture

And no NTC would be complete without some ThinkShout classic tees! We’ll be handing out these beauties at our booth (number 606). We went with a 70s-inspired design this year, and they’ll give you all the posi-vibes you need.

ThinkShout 2018 shirts


They said it couldn’t be done…so naturally we took on the challenge! A handful of our team will also be in Nashville for DrupalCon to learn about and contribute to the latest innovations in the Drupal ecosystem. As long-time supporters of Drupal, we felt it was important to continue to deepen our involvement and commitment to the open source community.

Join us for the Nonprofit Summit, an interactive day of learning and networking with nonprofit leaders and Drupal experts. The summit features breakout sessions on a variety of topics – including a case study presented by Gabe Carleton-Barnes on Bene, the distribution we released this winter. Don’t miss it!

We’re also sponsoring a Women in Drupal event on April 10th. All women, transgender individuals, those who identify outside of the gender binary, and allies are welcome. Our goal is to foster inclusivity and embrace the involvement of individuals across the gender spectrum. We’ll be handing out those same tees at both events, so be sure to show up early and grab yours while supplies last!

All joking about conference logistics aside, we truly appreciate every opportunity to connect with the nonprofit tech community. It’s a chance to hear more about the inspiring work being done in the world. If you’d like to sit with us and talk about what we can do for your digital presence, let us know, we look forward to hearing from you!

Dec 12 2017

Bene is based on a simple goal: Smaller nonprofits deserve websites that are designed to increase their impact and grow alongside them. Sounds simple enough, but achieving it has always been a major challenge from both the nonprofit and agency perspective.

Nonprofits have traditionally been forced to choose their digital solutions from two extremes on a spectrum. On the one side, you have low cost do-it-yourself solutions like SquareSpace or Wix. While inexpensive and easy to launch, these solutions serve such a wide array of use cases that they have nothing unique to offer nonprofits in terms of best practices. The content strategy and designs are not intended to tell a nonprofit’s compelling story and drive engagement. In addition, as the needs of a nonprofit grow, out of the box solutions may not flex to meet those needs, forcing an organization to start over with its digital strategy.

On the other end of the spectrum are custom websites conceived through a rigorous discovery process and built on open source platforms like Drupal and WordPress. If done right, this approach yields the best results: a gorgeous responsive website that provides powerful storytelling tools, increases fundraising, advocacy, and other types of engagement, and serves as the hub of an organization’s digital ecosystem. But these projects take time and money, typically 6-9 months and upwards of $200k.

RedHen Engagement Rules

As ThinkShout has grown, our business has focused more on those larger projects. But in the process, it’s become harder to work with the thousands of organizations who can’t afford the time and cost involved in building a completely custom website. We wanted to find a way to partner with grassroots organizations, as helping them succeed is core to who we are and what we do.

Bene is our answer to this problem: A low cost website for small nonprofits bundling content management, mission critical features, a tailored user experience, hosting, and strategic support. All built on an open source Drupal distribution that can grow alongside the organization.

We first conceived of Bene during an open source “sprint for good” event nearly 2 years ago. This month, we’re proud to launch our first website on the platform for Free Geek, with two more on the way over the next few weeks. We do recognize there are other efforts to address this need, from Drutopia to WordPress.com. We support all of them, as the growing number of organizations working to make our world a better place have a wide range of requirements, and they need all the help they can get!

Learn more about Bene and please get in touch if you think it’s a good fit for you, or have some ideas on how to improve it. We realize we’ve only taken the first step.

Nov 06 2017

We’re fresh off of BADCamp (Bay Area Drupal Camp), and we’re eager to share our experience with you! If you’ve ever thought about going to one of the local Drupal Camps in your area, or attending BADCamp yourself, we hope our takeaways persuade you to seek this out as a professional development opportunity.

BADCamp is essentially three days of intense workshops and sessions for Drupal users to hone their skills, meet other open source contributors, and make valuable connections in the community. Amongst the ThinkShout team, two had never attended BADCamp before. We were eager to hear their perspective on the conference and their key takeaways.

Sessions they attended ranged from component-based theming tools, object-oriented PHP, module development, and debugging JavaScript to Drupal 9 and backward compatibility and the importance of upgrading to D8 now.

Let’s hear from Mario and Lui–I mean Amy and Jules, on what their first BADCamp experience was like!

Amy and Jules

Amy and Jules on Halloween. Costumes are not required at BADCamp.

What did you learn at BADCamp?

Amy: Component-based theming is a hot topic these days for those building sites, for a number of reasons. Here are a couple of them:

  • It encourages a DRY (Don’t Repeat Yourself) and more organized theming code base.
  • It decouples site building in such a way that backend and frontend developers can work on the site at the same time, rather than the backend code needing to be built first before the frontend developer can do their work.
  • It provides clients with an interactive experience of their site (including responsiveness) before the database and backend elements are hooked up to it. This allows the client more time to provide feedback in case they want to change behaviors before they’re completely built.

I also attended a session called React, GraphQL, and Drupal. This talk was largely about an opportunity to create multiple sites using the same API. The team used “headless Drupal” (to serve as the API), React.js to build the sites, and GraphQL to explore data coming from the API in a much more direct and clear way. It seemed like a great solution for a tricky problem, in addition to giving this team the opportunity to learn and use cutting edge technologies - so much fun!

Jules: I learned a lot about the Drupal Community. This was my first BADCamp, and also my first Drupal conference. I was excited about how generous the community is with knowledge and tools, working together so we can succeed together.

I learned about some of the changes to Drupal releases from @Webchick’s talk (Drupal 9 and Backward Compatibility: Why Now is the Time to Upgrade to Drupal 8). If I keep up with the incremental point releases (i.e. 8.x), upgrading to 9 should be pretty painless, which is a relief. Knowing the incremental releases will be coming out on a regular six-month-ish cadence will make planning easier. I’m also excited about the new features in the works, including Layouts, Workspaces, a better out-of-the-box experience on first install, and a better admin UI experience (possibly with React?).

What would you tell someone who is planning to attend BADCamp next year?

Amy: Definitely invest in attending the full-day sessions if they interest you. The information I took away from my Pattern Lab day was priceless, and I came back to ThinkShout excited and empowered to figure out a way to make component based theming part of our usual practice.

Jules: The full day sessions were a great way to dive into deeper concepts. It’s hard to fully cover a subject in a shorter session. It also helps to show up with an open mind. It’s impossible to know everything about Drupal, and there are so many tools available. It was valuable just meeting people and talking to them about their workflows, challenges, and favorite new tools.

Do you consider BADCamp to be better for networking, professional development, or both?

Amy: My big focus was on professional development. There were so many good training days and sessions happening that those filled my schedule almost entirely. Of course, attending sessions (and being a session speaker!) is a great way to network with like-minded people too.

Jules: My goal was to immerse myself in the Drupal community. Since I’m new to Drupal, the sessions were really valuable for me. Returning with more experience, that might not be the case. It was valuable to see new ideas being presented, challenged, discussed, and explored with mutual respect and support. We’re all in this together. Some talks were stronger than others, but every speaker had a nugget of gold I could take with me. It was encouraging to meet peers and to see all of the great work people are doing out in the world. It also served as a reminder that great strides can come from many small steps (or pushes)!

Make time to learn

It can be difficult to take time away from project work and dedicate yourself to two or three days of conferencing. But when you disconnect and dive into several days of learning, it makes your contributions back at the office invaluable. As Jules commented to me after her first day of sessions, “it was like php church!”

Getting out of your usual environment and talking to other people opens your mind up to other ways of problem solving, and helps you arrive at solutions you otherwise wouldn’t get through sitting in your cubicle. We hope you’re inspired to go to a local Drupal Meetup or Camp – or even better, meet us at DrupalCon or NTC’s Drupal Day!

Aug 29 2017

If you’ve ever implemented a WYSIWYG editor in Drupal, one thing that becomes apparent quickly is that the term (What You See Is What You Get) is a complete lie. None of the default theme styles appear in the editor, because the editor shows up in the admin theme. This obviously diminishes its value, and makes custom element styles useless. The good news is that it’s fairly simple to fix - once you know how.

Drupal 8’s default WYSIWYG is CKEditor, and it’s included as a core module with its own API. This is great, because they also added a way to get that default theme front-end code into the admin theme CKEditor. The description of how to manage this leaves a bit to be desired, as all they mention is ‘specifying a ckeditor_stylesheets key in the *.info.yml file’.

Let’s start from the beginning. Say you’ve been working on a D8 site and the intro has an H2, some text, and a call to action button:


That’s great! What does CKEditor show us?


Oh. What I see is certainly not what I get. Let’s start by showing the basic styles in CKEditor. Go to your current default theme (ours is in /web/themes/custom/) and find your THEMENAME.info.yml. Open it in your favorite editor and you’ll see something like this:

name: My Theme Name
type: theme
description: A base theme for My Site
package: Other
core: 8.x

base theme: classy

regions, etc...

Now add the ckeditor_stylesheets: key and the target file right below the core: 8.x line, like so:

package: Other
core: 8.x
ckeditor_stylesheets:
  - css/ckeditor.css

If there’s something already under core: 8.x just put the CKEditor lines below it.

Next you have to actually add a file there! Go to your theme’s /css/ directory and add an empty ckeditor.css file next to the site’s style.css.

Now, you could just tell CKEditor to load all of the site CSS - but that would be overkill for the poor little iframe. It’s better to just find the vanilla CSS styles you need in your style.css file and copy them over. In our case it’s only about 160 lines of CSS - the default styles for the site, plus some rendered Sass mixins for the button. How does our WYSIWYG look now?


Bazinga! What a difference.

Hmm, but our button is missing its styles because we haven’t configured the CKEditor for that yet.

Go into the Drupal configs to set that up at /admin/config/content/formats and click ‘configure’ for the CKEditor text format you want (Full HTML, etc).

If you don’t have ‘Styles’ in the ‘Active Toolbar’, add it by dragging it in. It looks good next to ‘Format’, and has a similar behavior:


Then scroll down to the ‘Styles dropdown’ tab and add the appropriate markup and class for the button.


In our case we want to turn an anchor link (a) into a button by adding a .button class, so we use a.button. The text after the pipe (|) is what will appear in the ‘Styles’ dropdown.

Finally, make sure you’ve added that markup to the ‘allowed HTML tags’ section if you’re adding it to a restricted markup configuration:


Important Note: style options won’t show up in the Styles dropdown unless you have clicked/selected an eligible piece of markup - in our case the a tag - in the CKEditor window. So in our example, we’d have to click on ‘read more’ before we click on the Styles dropdown.


As long as you have a.button styles in ckeditor.css, it should work right away. (Well, after a cache clear. It’s Drupal.)
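For reference, the a.button rules in ckeditor.css might look something like this. This is a sketch with placeholder values - the real properties should be copied from your theme’s own style.css:

```css
/* Hypothetical button styles copied over from the theme's style.css. */
a.button {
  display: inline-block;
  padding: 0.75em 1.5em;
  background-color: #c0392b; /* placeholder brand color */
  color: #fff;
  text-decoration: none;
  border-radius: 3px;
}
```

Because CKEditor renders content in its own iframe, only the rules present in ckeditor.css apply there, which is why the button styles must be duplicated rather than inherited from the front-end theme.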

And that’s it! From here you can continue to add styles to ckeditor.css, and to the Styles dropdown in the Drupal ‘Text formats and editors’ admin.

The WYSIWYG is no longer a lie!

Jul 06 2017

If you’ve ever built a Drupal 7 module, then you’ll be familiar with hooks: functions that allow modules to react to things happening in other modules. The hook system is functionally fine but, with so many hooks to implement, .module files often become bloated and difficult to manage.

Drupal 8’s event system does a lot to reduce the clutter of hooks. Now, instead of using a hook, you can create an event subscriber that will execute your code every time a module triggers an event. This is similar to the hook system only in the effect; the execution is very different.

Porting our popular MailChimp eCommerce module to Drupal 8 gave me the perfect opportunity to learn about the event system. I use the word “opportunity” to disguise the fact that I was forced to learn how events work because it was impossible to port the module without doing so.

The MailChimp eCommerce module depends on the Commerce module, naturally, and in Drupal 8, the Commerce module makes heavy use of events.

First, let’s look at an event. I’m using an example ripped straight from Commerce.

The Commerce submodule, Commerce Cart, contains a class named CartEntityAddEvent. You can find it here.

The class itself is simple; it’s designed to store a few values - the cart, the item being added to the cart, and the quantity of that item. The class also has a few getter functions for convenience.

Most importantly, this class represents an event that’s triggered every time a user adds an item to their shopping cart. This is done using just two lines of code:

$event = new CartEntityAddEvent($cart, $purchased_entity, $quantity, $saved_order_item);
$this->eventDispatcher->dispatch(CartEvents::CART_ENTITY_ADD, $event);

The event class is created with all the relevant values, then “dispatched” to any event subscribers configured to pay attention to it. When dispatched, the event is identified by a constant - CartEvents::CART_ENTITY_ADD. This constant is used by event subscribers, which we’ll take a look at now.

This is a cut-down version of an event subscriber used by our MailChimp eCommerce module.

/**
 * Event Subscriber for Commerce Carts.
 */
class CartEventSubscriber implements EventSubscriberInterface {

  /**
   * The Cart Handler.
   *
   * @var \Drupal\mailchimp_ecommerce\CartHandler
   */
  private $cart_handler;

  /**
   * The Order Handler.
   *
   * @var \Drupal\mailchimp_ecommerce\OrderHandler
   */
  private $order_handler;

  /**
   * CartEventSubscriber constructor.
   *
   * @param \Drupal\mailchimp_ecommerce\CartHandler $cart_handler
   *   The Cart Handler.
   * @param \Drupal\mailchimp_ecommerce\OrderHandler $order_handler
   *   The Order Handler.
   */
  public function __construct(CartHandler $cart_handler, OrderHandler $order_handler) {
    $this->cart_handler = $cart_handler;
    $this->order_handler = $order_handler;
  }

  /**
   * Respond to event fired after adding a cart item.
   */
  public function cartAdd(CartEntityAddEvent $event) {
    /** @var \Drupal\commerce_order\Entity\Order $order */
    $order = $event->getCart();

    /** @var \Drupal\commerce_order\Entity\OrderItem $order_item */
    $order_item = $event->getOrderItem();

    $product = $this->order_handler->buildProduct($order_item);

    $this->cart_handler->addCartLine($order->id(), $order_item->id(), $product);
  }

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    $events[CartEvents::CART_ENTITY_ADD][] = ['cartAdd'];

    return $events;
  }

}


Here’s the complete version, if you’re interested.

So what does it do, exactly?

Let’s start with the getSubscribedEvents() function. This is where we define which events we want to subscribe to, and assign each event a processing function. Here we are subscribing to just one event, the “cart entity add” event, and assigning the cartAdd() function as a processor.

Note that the cartAdd() function takes one argument, an instance of the CartEntityAddEvent class. That’s the same class we looked at earlier - the event class defined in the Commerce Cart module. This is where our module reacts to that event being triggered.

The cartAdd() function itself extracts the order and item information from the event and uses an instance of the CartHandler class, provided by the MailChimp eCommerce module, to send updated cart information to MailChimp’s API.

One final thing:

Event subscribers won’t work unless they are defined as a service. Services are defined in a module’s *.services.yml file, which you can learn more about here.

The service definition for the CartEventSubscriber looks like this:

services:
  mailchimp_ecommerce_commerce.cart_event_subscriber:
    class: '\Drupal\mailchimp_ecommerce_commerce\EventSubscriber\CartEventSubscriber'
    arguments: ['@mailchimp_ecommerce.cart_handler', '@mailchimp_ecommerce.order_handler']
    tags:
      - { name: event_subscriber }

We identify the class using its namespace, inject the “cart_handler” and “order_handler” services, then, finally, tag the service as an “event_subscriber”. Check out the full file here. Just for completeness, the two injected services are defined in here.

I’m a big fan of how Drupal 8 has shifted towards a more object-oriented way of doing things. It’s more organized, promotes consistency between modules, and, best of all, finally signals an end to massive .module files.

Jun 06 2017

fade-to-black-1.jpg

Responsive design brings a fascinating array of challenges to both designers and developers. Using background images in a call to action or blockquote element is a great way to add visual appeal to a design, as you can see in the image to the left.

fade-to-black-2.jpg

However, at mobile sizes, you’re faced with some tough decisions. Do you try to stretch the image to fit the height of the container? If so, at very tall/narrow widths, you’re forced to load a giant image, and it likely won’t be recognizable.

In addition, forcing mobile users to load a large image is bad for performance. Creating custom responsive image sets would work, but that sets up a maintenance problem, something most clients will not appreciate.

Luckily, there’s a solution that allows us to keep the image aspect ratio, set up standard responsive images, and it looks great on mobile as well. The fade-out!

I’ll be using screenshots and code here, but I’ve also made all 6 steps available on CodePen if you want to play with the code and try out different colors, images, etc…

Let’s start with that first blockquote:

fade-to-black-1.jpg(pen) This is set up for desktop - the image aspect ratio determines the height of the container using the padding ratio trick. Everything in the container is using absolute positioning and flexbox for centering. We have a simple rgba() background set using the :before pseudo-element in the .parent-container:

  :before {
    content: "";
    display: block;
    position: absolute;
    width: 100%;
    height: 100%;
    background-color: rgba(0,0,0,0.4);
    z-index: 10;
    top: 0;
  }
fade-to-black-3.jpg(pen) The issues arise once we get a quote of reasonable length, and/or the page width gets too small. As you can see, it overflows and breaks quite badly.

fade-to-black-4.jpg(pen) We can fix this by setting some changes to take place at a certain breakpoint, depending on the max length of the field and the size of the image used.

Specifically, we remove the padding from the parent element and make the .content-wrapper position: static. (I like to set a min-height as well, just in case the content is very small.)
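Those breakpoint changes might look something like this; the 580px breakpoint and the min-height value are placeholders to be tuned per design:

```scss
@media all and (max-width: 580px) {
  .parent-container {
    padding: 0; // drop the ratio-based padding
  }
  .content-wrapper {
    position: static;
    min-height: 200px; // guard against very short content
  }
}
```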

fade-to-black-5.jpg(pen) Now we can add the fader code - background-image: linear-gradient, which can be used unprefixed. This is inserted into the .image-wrapper using another :before pseudo-element:

  :before {
    content: "";
    display: inline-block;
    position: absolute;
    width: 100%;
    height: 100%;
    background-image: linear-gradient(
      // Fade over the entire image - not great.
      rgba(0, 0, 0, 0.0) 0%,
      rgba(255, 0, 0, 1.0) 100%
    );
  }
fade-to-black-6.jpg(pen) The issue now is that the gradient covers the entire image, but we can fix that easily by adding additional rgba() values, in effect ‘stretching’ the part of the gradient that’s transparent:

  :before {
    background-image: linear-gradient(
      // Transparent at the top.
      rgba(0, 0, 0, 0.0) 0%,
      // Still transparent through 70% of the image.
      rgba(0, 0, 0, 0.0) 70%,
      // Now fade to solid to match the background.
      rgba(255, 0, 0, 1.0) 100%
    );
  }
fade-to-black-7.jpg(pen) Finally, we can fine-tune the gradient by adding even more rgba() values and setting the percentages and opacity as appropriate.

Once we’re satisfied that the gradient matches the design, all that’s left is to make the gradient RGBA match the .parent-container background color (not the overlay - this tripped me up for a while!), which in our case is supposed to be #000:

  :before {
    background-image: linear-gradient(
      rgba(0, 0, 0, 0.0) 0%,
      rgba(0, 0, 0, 0.0) 70%,
      // These three 'smooth' out the fade.
      rgba(0, 0, 0, 0.2) 80%,
      rgba(0, 0, 0, 0.7) 90%,
      rgba(0, 0, 0, 0.9) 95%,
      // Solid to match the background.
      rgba(0, 0, 0, 1.0) 100%
    );
  }
We’ll be rolling out sites in a few weeks with these techniques in live code, and with several slight variations to the implementation (mostly adding responsive images and making allowances for Drupal’s markup), but this is the core idea used.

Feel free to play with the code yourself, and change the rgba() values so that you can see what each is doing.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
May 19 2017
May 19

I recently had the opportunity to migrate content from a Drupal 6 site to a Drupal 8 site. This was especially interesting for me as I hadn’t used Drupal 6 before. As you’d expect, there are some major infrastructure changes between Drupal 6 and Drupal 8. Those differences introduce some migration challenges that I’d like to share.

The Migrate module is a wonderful thing. The vast majority of node-based content can be migrated into a Drupal 8 site with minimal effort, and for the content that doesn’t quite fit, there are custom migration sources. A custom migration source is a small class that can provide extra data to your migration in the form of source fields. Typically, a migration will map source fields to destination fields, expecting the fields to exist on both the source node type and destination node type. We actually published an in-depth, two-part blog series about how we use Drupal Migrate to populate Drupal sites with content in conjunction with Google Sheets in our own projects.

In the following example, we are migrating the value of content_field_text_author from Drupal 6 to field_author in Drupal 8. These two fields map one-to-one:

id: book
label: Book
migration_group: d6
deriver: Drupal\node\Plugin\migrate\D6NodeDeriver
source:
  key: migrate
  target: d6
  plugin: d6_node
  node_type: book
process:
  field_author: content_field_text_author
destination:
  plugin: entity:node

This field mapping works because content_field_text_author is a table in the Drupal 6 database and is recognized by the Migrate module as a field. Everyone is happy.

However, in Drupal 6, it’s possible for a field to exist only in the database table of the node type. These tables look like this:

mysql> DESC content_type_book;
+------------------------+------------------+------+-----+---------+-------+
| Field                  | Type             | Null | Key | Default | Extra |
+------------------------+------------------+------+-----+---------+-------+
| vid                    | int(10) unsigned | NO   | PRI | 0       |       |
| nid                    | int(10) unsigned | NO   | MUL | 0       |       |
| field_text_issue_value | longtext         | YES  |     | NULL    |       |
+------------------------+------------------+------+-----+---------+-------+

If we want to migrate the content of field_text_issue_value to Drupal 8, we need to use a custom migration source.

Custom migration sources are PHP classes that live in the src/Plugin/migrate/source directory of your module. For example, you may have a PHP file located at src/Plugin/migrate/source/BookNode.php that would provide custom source fields for a Book content type.

A simple source looks like this:

namespace Drupal\custom_migrate_d6\Plugin\migrate\source;

use Drupal\node\Plugin\migrate\source\d6\Node;

/**
 * @MigrateSource(
 *   id = "d6_book_node",
 * )
 */
class BookNode extends Node {

  /**
   * {@inheritdoc}
   */
  public function query() {
    $query = parent::query();

    $query->join('content_type_book', 'book', 'n.nid = book.nid');
    $query->addField('book', 'field_text_issue_value');

    return $query;
  }

}

As you can see, we are using our migration source to modify the query the Migrate module uses to retrieve the data to be migrated. Our modification extracts the field_text_issue_value column of the book content type table and provides it to the migration as a source field.

To use this migration source, we need to make one minor change to our migration. We replace this:

plugin: d6_node

With this:

plugin: d6_book_node

We do this because our migration source extends the standard Drupal 6 node migration source in order to add our custom source field.

The migration now contains two source fields and looks like this:

id: book
label: Book
migration_group: d6
deriver: Drupal\node\Plugin\migrate\D6NodeDeriver
source:
  key: migrate
  target: d6
  plugin: d6_book_node
  node_type: book
process:
  field_author: content_field_text_author
  field_issue: field_text_issue_value
destination:
  plugin: entity:node

You’ll find you can do a lot with custom migration sources, and this is especially useful with legacy versions of Drupal where you’ll have to fudge data at least a little bit. So if the Migrate module isn’t doing it for you, you’ll always have the option to step in and give it a little push.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
May 15 2017
May 15

Websites, like most things, have a lifespan. At first, they are new and shiny and aligned with both your organization’s goals and current web trends and best practices. As time goes on, however, technology continues to progress, and your organizational goals will probably evolve as well.

If you’ve worked through a full Discovery process to develop an information architecture that supports your organization’s core mission, then all you may need to update is the look and some of the site content. But if you haven’t engaged in an in-depth Discovery process before, you may find that your site is not only technically outdated, but also no longer reflects who you are as an organization.

So it’s time to think about a redesign. The good news is, starting your new project with a full Discovery will help you create a site structure that will serve your needs not just for the new version of the site, but for years to come. Additionally, if you build your new site on a widely-used and well-supported open source CMS platform (like Drupal or WordPress), you won’t need to switch systems every couple of years. For example, Drupal 8, the latest version of Drupal, is expected to have a lifespan of 8-10 years.

Investing time and energy to develop a strong foundation now will set you up for success in the future. But how can you ensure your website redesign gets off to the right start?

Here at ThinkShout, we believe that technical excellence and award-worthy design should be a given, and that our focus should be on building you a site that helps you connect with your constituents and meet your goals. Through numerous discovery engagements with many different organizations, we’ve uncovered some key questions to ask during the initial requirements gathering phase that will help ensure the solution we create meets your needs and serves your mission.

Here are some things to think about when you’re thinking about a redesign:

What are Your Organizational Goals?

Before you dive into the specifics of your website, let’s take a step back and think at a higher level. Defining your organizational goals will help make sure that the solution you and your vendor create not only looks good and functions well, but will also support the fundamental mission of your organization.

So it’s important to take a moment to think about what your organization’s goals are. What issue are you working to address? What does success for your organization look like? The more specific and measurable these goals are, the better. Measuring your progress towards your higher level goals can help you assess the success of your project.

What are Your Project Goals?

Now it’s time to zoom in and focus on this project itself. Project goals should be tangible, attainable, and measurable. They may include a mixture of internal goals (perhaps relating to how you are able to manage the website) and external goals (how your users interact with the website: engagement, donations, tracking, etc.).

It may be helpful when thinking about your project goals to determine how they relate to your organizational goals. Can you map your project goals to the organizational goals they support? If not, perhaps you should consider if that particular goal for the project is even necessary – or if it can be deprioritized.

For example, if your organization is a local animal shelter, one of your organizational goals may be to increase pet adoption. Website project goals that support this higher-level goal might be to post profiles for adoptable pets online, or facilitate adoptions through your website.

Identifying and then prioritizing your project goals may also help you define what success will look like for your redesign project. How will you measure progress towards these goals? Which goals need to be met for the project to be successful?

Who are Your Audiences?

A website only adds value for your organization if your audiences use it, and most people will come to your website looking for information, driven by their own needs and motivations. If you focus primarily on your goals, you may end up with a website that is geared towards your organization’s needs and structures, but that does not allow your users to easily access the information they seek.

Defining who your audiences are will allow you to put your users first when redesigning your website. Once you know who your audiences are, you can determine what content will satisfy their needs, sparking the trust that will allow you to nudge them to take an action beneficial to you.

These questions are just a starting place for your website redesign. A full discovery process will delve more deeply into your programs and departments, your needs and wants, and what makes your organization tick. But asking yourself these three questions before you start will give you an anchor to help you ensure that your new website engages your users and supports your mission.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Apr 19 2017
Apr 19

We’re packing our bags for Baltimore and polishing up our slide decks for DrupalCon! We’re so excited to join the Drupal community for a full week of Drupal-y things. We’ve got some great content planned for this year’s conference, and we’re very excited to share it with you all - here’s what you need to know:

Exhibit Hall

The ThinkShout Headquarters this year is booth 432! We’ll be giving away free t-shirts and raffling off an Amazon Echo. You can enter to win for the low, low price of one business card. If you have any questions about our work, current available job opportunities, or what the weather’s like in Portland (spoiler: it’s probably raining), stop by - we’d love to chat with you!

ThinkShout Sessions

The ThinkShout team has two sessions in the DrupalCon agenda this year. We’re also very excited to be leading a discussion in our first DrupalCon Nonprofit Summit. Take a look at our lineup and mark your calendars!

“Rapid Response Campaigns & Digital Tools” - Monday (4/24), 12:30 - 1:15pm, Nonprofit Summit

The news cycle doesn’t stop, and your website must help you respond to emergencies, not act as a barrier. Drupal can help you react quickly, in concert with your other channels, to turn current events into opportunities to spread your message and further your mission. In this breakout session, Brett Meyer and Lev Tsypin will talk about the tools you have at your disposal in Drupal, scenarios that call for rapid response solutions and how to implement them, and strategies that will help you turn these situations into lasting engagement with your constituents.

“Demystifying Rendered Content in Drupal 8 Twig Files” - Tuesday (4/25), 3:45 - 4:45pm

Amy Vaillancourt-Sals is going to show you the ins and outs of Twig! Twig is a robust and elegant template engine for PHP. It’s lightweight, fairly quick to pick up, very readable, and it grants users ultimate control over the markup, including wrapping elements and rendering exactly the output you need. In this session, you’ll learn about the debugging process of sorting through twig variables, using xdebug in PHPStorm, the other helpful debugging tools at your disposal, plus common patterns Amy found helpful for rendering content in twig files.

“Content Strategy in Popular Culture, Part Deux” - Thursday (4/27), 10:45 - 11:45am

Brett Meyer’s got a sequel to his session from DrupalCon New Orleans. Another year, another array of pop culture obsessions to examine and apply to the work we do. By exploring how crucial aspects of content strategy play out in movies, music, comic books, and video games, we’ll continue to expand the palette of language we can use to explain and convince more people about the importance of content strategy online, and ensure they understand that it’s not just vital, but fun as well.

Let’s Chat

If you’d like to schedule some time to chat with us in advance, drop us a line via our contact form. We’d be happy to meet up with you in Baltimore!

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Mar 03 2017
Mar 03

The Nonprofit Technology Conference is just around the corner, and we’re hard at work making those final preparations for our trip to D.C. We have some exciting things in store for you this year, so mark your calendars!

Drupal Salon

This year, we’re honored to coordinate the first ever Drupal Salon at the NTC. In lieu of the traditional pre-conference days as we’ve done in previous years, NTEN set aside space and time for subject matter experts to present nine twenty-minute-long talks on all things Drupal on Thursday, March 23rd. These talks will take place as part of the formal NTC schedule, and sessions can be viewed in the Wordpress & Drupal Salon tracks on the NTC site.

We’re excited to have experts from the Southern Poverty Law Center, Shatterproof, and the Center for Strategic and International Studies share their Drupal insight and experiences. ThinkShout will also be providing one-on-one consulting at our Drupal Salon table, so bring us all of your Drupal questions! Drupal hosting providers Pantheon and Acquia will also be on hand to tackle whatever Drupal hosting questions you may have.

We hope you’ll be able to join us! Here’s what we’ll be talking about:

We’re confident that the Drupal Salon sessions will have a little something for everyone, and we look forward to connecting with the nonprofit community with this new format.

Meet the ThinkShout Team

Be sure to catch our team session on March 23rd, as well!

The ThinkShout team will have a presence in the Exhibit Hall this year, of course. Stop by our booth (#501) and chat; we’ll be debuting brand new t-shirts and we’re excited to share them with you all (for free)! This is a great opportunity to learn more about our work and the organizations we partner with. We’re also available to talk about anything B Corp related, so send those questions our way!

If you’d like to schedule a time to meet with our staff at the NTC in advance, drop us a line through our contact form. See you in the capital!

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Feb 06 2017
Feb 06

Front-end development is full of challenges - changing design trends, browser idiosyncrasies, client demands, and ever-evolving web standards to name a few. Over the last few years though, a new challenge has emerged. Which development stack should you choose?

Once upon a time, front-end development didn’t really have a “dev stack.” You had markup in files, maybe output from a CMS, a stylesheet, and some jQuery if you felt like getting fancy. Now, though, the array of options can be paralyzing. Pre-processors, post-processors, task runners, and package managers have made many aspects of development faster, but which ones are best? Where do you start?

Here at ThinkShout, under the watchful eye of Eric Paxton, our Senior Front End Engineer, we’ve been trying out the various options whenever we take on a new project, to see how well it fits in our theming process. We’re pretty busy, so this covers a lot of ground quickly. We’ve been careful to fully document the tools used in the past so that we don’t bedevil the maintenance folks. (We are often the maintenance folks).

The last few builds have seen our dev stack settle down to a flexible tool set that is easy to setup and maintain, while providing us with excellent modern theming tools. Let’s dive in!

Getting Started: Languages, Handlers, and Package Management

At the bottom of a development stack are the languages used, the language handlers, and the package managers that allow you to include pre-built tools and libraries in your project. Some of these are interchangeable, but it solves a lot of problems if everyone uses the same fundamental tools.

In our case, we use Ruby and JavaScript as the base languages, and rbenv and Node as their handlers. By using Ruby and JavaScript, we get access to an extremely wide array of applications, tools, plugins, and more. Once these are installed using an OS package manager (in our case Homebrew, since we all use Macs), we add package handling for each language: Bundler and NPM, respectively. This gives us the following base:

  • Ruby via rbenv, managing gems using Bundler
  • JavaScript via Node.js, managing packages using NPM

Now we can specify Ruby Gems and Node packages in a Ruby Make file (Rakefile), and a complex project setup is as simple as running rake install once from the theme directory, and starting the task watcher using rake serve. (To be more precise, we use the Rakefile to install the Ruby Gems as defined in the Gemfile, and the Node modules as specified in the package.json file).

The complete project setup for a new developer would be the following:

~: brew install rbenv
~: gem install bundler
~: brew install node
~: brew install npm
~: cd ~/path/to/theme/directory
~: rake install
~: rake serve

After that, any new projects would only need the last three lines run.

The key to making this work is to have a Rakefile, a Gemfile and a package.json set up in our project’s theme so that rake install works properly. In our case we use the Rakefile to first run bundle install, which installs the appropriate gems and their dependencies:


task :install do
  system 'bundle install' # this runs the Gemfile contents!
  system 'npm install -g browser-sync'
end

The Gemfile itself is just a few lines:

source 'http://rubygems.org'
gem 'sass'
gem 'sass-globbing'

This generates a Gemfile.lock listing all of the installed packages/versions.

The npm install lines in the Rakefile set up tools that we’ll discuss later. The next layer in our stack is the SASS tooling that Bundler installed.

SASS at ThinkShout (please pass the Bourbon)

In the middle of our stack is SASS. We use SASS in a fairly simple way at ThinkShout, installing it with sass-globbing. Globbing lets any file that follows the _filename.scss naming convention be picked up in the build automatically. We also tend to keep the directory structure fairly minimal:


@import 'lib/bourbon/bourbon';
@import 'lib/neat/neat';
@import 'lib/normalize/normalize';
@import 'global/*';
@import 'layout/*';
@import 'modules/*';

The first thing we include is the Bourbon mixin library. This includes coding shortcuts such as the pixels-to-rems syntax rem(24). This allows us to read a design’s pixel spacing and it converts them to the appropriate rem values. The Bourbon Docs are excellent and well-maintained as well. Never worry about browser prefixes or fallbacks again.

Next up is the Bourbon-related grid framework, Neat. A simple but powerful grid that uses semantic markup and easy-to-read terminology such as @include span-columns(9). No extra wrappers, no specific classes to add, and it’s extremely robust. We haven’t run into any cross-browser issues in over two years of using it, which says a lot, and since it’s only applied as you specify, it’s easy to break out of the grid if you need to.
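As a quick illustration of the two together (the selector names are made up for this sketch; rem() assumes a 16px base font size, and span-columns() assumes Neat’s default 12-column grid):

```scss
.page-content {
  @include span-columns(9); // main column spans 9 of 12
  margin-bottom: rem(24);   // 24px design spec compiles to 1.5rem
}

.page-sidebar {
  @include span-columns(3); // sidebar takes the remaining columns
}
```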

Next up is normalize.css, a modern update to the old CSS reset stylesheets. Not really much to add to that except it’s really well commented, so make sure you change it from normalize.css to _normalize.scss so that you don’t bloat your final site.css file.

The Global directory has the following:


The _01, _02, etc. prefixes take advantage of the sass-globbing’s alphabetical file inclusion. All our site variables (colors, font weights, and so forth) are in vars, our custom mixins are next, then extends. Base has all of the base markup styles:

body {
  font-size: rem(16);
  font-style: normal;
  font-weight: $regular;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
}

h1, h2, h3, h4, h5, h6 {
  text-rendering: optimizeLegibility; // Fix the character spacing for headings
}

p, a, ul, ol, etc...

The layouts directory usually has a _layouts.scss file, which covers the basics of site layout. Since we use Drupal, we’ll often add a _regions.scss as well to specify layout for the various Drupal-generated content zones on a site. These files are where we use the Neat framework the most.

Finally, we have the modules directory - where most of the theming takes place. These are usually named by content type (_basic-pages.scss, _articles.scss, etc.), though there are often files such as _forms.scss and _homepage.scss as well. Sometimes we don’t even have to use our source maps to know where code is!

One of our good habits is to start with our mobile-first, responsive _01.template.scss file:

// Default / Mobile

// Tablet (580px)
@media all and (min-width: $tablet) {
}

// Large Tablet (768px)
@media all and (min-width: $lg-tablet) {
}

// Desktop (1228px) $max-width: 1440px
@media all and (min-width: $desktop) {
}

When you want to add another theming module, you just make a copy of the template and your progressive breakpoints are included! (The $max-width: 1440px is there in a comment because it’s handy).

All of this gets handled by a task in our Rakefile, which sets a watcher for changes to any SASS file and compiles them into a single css/style.css:

desc 'Watch sass'
task :sasswatch do
  system 'sass -r sass-globbing --watch sass/style.scss:css/style.css'
end

Pulling It All Together: Browsersync!

Finally, at the top of our stack, we have Browsersync. Eric Paxton, our Senior Front End Engineer, wrote an excellent overview of why we use this amazing tool, what it does, as well as how to install it in detail for Drupal 8.

In our stack it’s as simple as another task in that Rakefile:

desc 'Running Browsersync'
task :browsersync do
  system 'browser-sync start --proxy "local.dev" --files "css/*.css" --no-inject-changes'
end

And adding the following (generated by running browser-sync start) to the site’s <head> :

<!-- <script id="__bs_script__">
  //<![CDATA[ document.write("<script async src='http://HOST:3000/browser-sync/browser-sync-client.2.12.3.js'><\/script>".replace("HOST", location.hostname));
</script> -->

This also sets a watcher on the CSS, and refreshes every browser you have open to localhost:3000 or the local network IP address it generates upon running rake serve.

The last part of the Rakefile implements the tasks we set up:

desc 'Serve'
task :serve do
  threads = []
  %w{sasswatch browsersync}.each do |task|
    threads << Thread.new(task) do |devtask|
      Rake::Task[devtask].invoke
    end
  end
  threads.each {|thread| thread.join}
  puts threads
end

This has the magical effect of opening a new browser window to localhost:3000 when you run rake serve, and reloading it every time you save any of your SASS files. It also scrolls all open windows together, even when you open up things on your phone using the local network proxy, which it helpfully provides as output:

>>> Sass is watching for changes. Press Ctrl-C to stop.
[BS] Proxying: http://site.dev
[BS] Access URLs:
       Local: http://localhost:3000
          UI: http://localhost:3001
 UI External:
[BS] Watching files...
[BS] File changed: css/style.css
      write css/style.css
      write css/style.css.map

This is really the cherry on top of the dev stack - after using it for a little while, you’ll wonder how you ever got along reloading everything manually.

Stack Overview

In summary, here’s that front-end stack:

  • Ruby via rbenv, managing gems using Bundler
  • JavaScript via Node.js, managing packages using NPM
  • SASS with globbing, set up in a simple directory structure
  • Bourbon Mixin library
  • Neat Grid system
  • Normalize.css as _normalize.scss
  • A simple module template containing responsive breakpoints
  • Browsersync

None of this is carved in stone of course, and it gets slightly tweaked for every new build based on the project requirements, such as internationalization, the base CMS (Drupal, WordPress, or Jekyll in our case), and the desire to try out something new, which is fairly constant. After all, that’s how we got to the stack we have today!

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Jan 16 2017
Jan 16

In Part 1, I talked about using Google Docs + Migrate to populate your site. Now we’re going to do that with the Migrate Google Sheets module. Below, I’ll provide the steps to get your own migration up and running, but if you prefer to experiment with a working example, check out a demo of the Migrate Google Sheets Example module (provided as a submodule within Migrate Google Sheets). All content on that site was built using this example Google Sheet.

Setup: Install the Module

If you’ve already got a Drupal 8 site up and running, you can install the module in any of the normal ways. I’m assuming here that you have access to the site using Drush, as it’s not possible to run migrations through anything but the command line at this time. At ThinkShout, we use composer to build our site distributions, and have a repo for building the demo site here.

Step 1: Creating Your Custom Migration Module

The easiest way to get started on your own set of migrations is to copy the migrate_google_sheets_example submodule and rename it something of your own. Let’s say we rename it “my_migration.” Follow these steps:

  1. Rename your .install file to “my_migration.install”, and change the function migrate_google_sheets_example_uninstall to “my_migration_uninstall”.
  2. Delete the helper submodule “migrate_google_sheets_example_setup” entirely – that is just necessary to build the content types required for the example module, but you shouldn’t need it for your migration module.
  3. Rename your migrate_google_sheets_example.info.yml as “my_migration.info.yml” and open it up. At the very least, you’ll want to change the name of the migration to “name: my_migration” but you’ll also likely wish to remove the migrate_google_sheets:migrate_google_sheets_example_setup dependency. Mine ended up looking like this:
name: my_migration
type: module
description: My Migrations
core: 8.x
package: Migrate
dependencies:
  - migrate_plus
  - migrate_tools
  - migrate_google_sheets
  - redirect

When completed, your module structure should look like this:

Module Structure

You are now ready to enable your My Migrations module. (Make sure you disable the migrate_google_sheets_example module first, if you’d previously enabled that!)

Step 2: Create Your Spreadsheet

Assuming you have the Game and Landing page content types, you could now run the migrations in your “My Migrations” module and it will pull the data from the Google Sheet.

But since you don’t have permissions to edit that sheet, you’re going to need to copy the existing sheet and create your own to do any customizations.


When this is done, you’ll get a url like this:

https://docs.google.com/spreadsheets/d/YourLongHashIDHere where YourLongHashIDHere is your feed key.

Now you’ll need to publish your new spreadsheet. This is an option under “File” -> “Publish to the web”.


To verify that your migration module will be able to see the Google sheet, try opening an anonymous browser window and visiting the Feed version of the url, whose format is this:


If visiting that URL returns a bunch of JSON, you’re ready to start migrating!

But of course, your current set of migration files still point to the old feed. In the my_migrations/config/install folder, you’ll need to find all instances of our feed string (1spS1BeUIzxR1KrGK2kKzAoiFZii6vBHyLx_SA0Sb89M) and replace them with your feed string.
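One way to do that swap is from the command line. Here is a hedged sketch: it demos the find-and-replace against a throwaway file in /tmp so it is safe to run anywhere (point the grep at your real my_migration/config/install directory instead; the sed -i.bak form works with both macOS and GNU sed):

```shell
OLD='1spS1BeUIzxR1KrGK2kKzAoiFZii6vBHyLx_SA0Sb89M'
NEW='YourLongHashIDHere'

# Demo directory standing in for my_migration/config/install.
mkdir -p /tmp/my_migration/config/install
echo "feed: $OLD" > /tmp/my_migration/config/install/example.yml

# Replace the old feed key in every file that contains it.
grep -rl "$OLD" /tmp/my_migration/config/install | while read -r f; do
  sed -i.bak "s/$OLD/$NEW/g" "$f"
done

grep "$NEW" /tmp/my_migration/config/install/example.yml
# -> feed: YourLongHashIDHere
```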

Step 3: Decide Which Migrations to Keep

The Migrate Google Sheets Example module provides one Migration Group (games_example) and 6 Migrations. Depending on your site configuration, some of these might be useful, like the menu_links and the blocks migrations, and some of them will not be so useful (like the node_game migration, likely). This is a good time to trim or modify any migrations that aren’t going to be useful for your Drupal site. That being said, here are a few things that the sample migrations demonstrate:

  • Block UUIDs: When you place a block using the Block Layout screen, the block’s UUID is saved in config. If you’re running a migration over and over, your block’s ID will iterate on its own, but the UUID can remain constant if you add it to the migration. In the demo site, this allows us to create a persistent CTA block in the header.

Module Structure

  • Menu Links parents: You can specify that a menu link item has a parent from within the current migration. This lets us say /bohnanza and /hanabi are children of /games.
  • Page and Game redirects: These sheets demonstrate how to add the redirect from the url of content on an old site to the new home right in the content sheet. Try going to https://live-mgs-demo.pantheonsite.io/that-fireworks-game and see where you end up.
  • Related content as strings or ids: On the Page sheet, we have a reference to the “Related games” for the given page. This is an entity reference which we could fill with a couple of things. We could refer to the ID of the related games, as they are stored in the Games sheet, or we could do what we’ve done here and use the migrate_plus plugin “entity_lookup” to search for the related game node by name. As long as there is a Game node called Bohnanza, we’ll always link to the right one. This is particularly useful with Term references, where the name of the item ideally remains constant.

Related Content

  • Game downloadable file: Games have associated images, which are files hosted externally to the spreadsheet. In order to relate my game content to its image, I need to download the image, get it into the file_managed database table (creating a file entity) and THEN relate that entity to the current node. This is done with the following lines in the “node_games” migration:
  public_file_directory:
    plugin: default_value
    default_value: 'public://'
  public_file_uri:
    plugin: concat
    delimiter: ''
    source:
      - '@public_file_directory'
      - imagefilename
  field_image/target_id:
    -
      plugin: file_copy
      source:
        - image
        - '@public_file_uri'
    -
      plugin: entity_generate
  field_image/alt: imagealt
  field_image/title: imagetitle
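As a companion to the related-content bullet above, an entity_lookup process step can be sketched like this. The field name and source column here are assumptions, not taken from the example sheet:

```yaml
# Hypothetical process snippet; field_related_games and the
# relatedgames column are assumed names.
field_related_games:
  plugin: entity_lookup
  source: relatedgames
  entity_type: node
  bundle: game
  bundle_key: type
  value_key: title
```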

You can keep as many or as few of the migration files as you’d like. You can also create new ones.

Step 4: Tell Drupal About Your Changes

Drupal 8 only sees the changes you’ve made to your migration yml files when you first install the module. That means that you need to uninstall and reinstall the module to make new changes show up. ThinkShout has a Robo script that does this, but the same thing can be done in Drush:

drush mr --all             # Rolls back all migrations
drush pmu my_migration -y  # Uninstalls my migration module
drush en my_migration -y   # Enable my migration module
drush ms                   # Displays my current migration status

You can also string these together as one line:

drush mr --all && drush pmu my_migration -y && drush en my_migration -y && drush ms

Step 5: Run your migrations

This part is simple. To run all migrations, it’s a quick drush command:

drush mi --all

If you’d like to find out about all the possible options for the migrate-import command, you can run

drush help mi

You can also see your list of migration groups at /admin/structure/migrate, and you can review your migrations by clicking “List Migrations.” The resulting page will give you the same output, more or less, that you get from running a drush ms.


These pages are helpful to know about, as they give you an easy place to find errors logged during the migration process. However, you can’t currently run a migration from the UI (although there is an issue for this).


But before we close, I do want to acknowledge some challenges we’ve seen in this approach.

Sad fact #1: HTML in a spreadsheet is ugly.

Google Spreadsheets won’t let you make a row shorter than the number of line breaks in its tallest cell. So if you have pretty HTML with a bunch of line breaks, your row might be too tall to fit on your screen. People have some clever workarounds for this, but so far we’ve not implemented any.

Sad fact #2: Sheet order matters (right now)

Maintaining the order of sheets isn’t top of mind for most people as they’re making changes to a spreadsheet, especially when duplicating tabs. Migrate Google Sheets requests each sheet by tab order, so if I make a copy of the Page tab, the Game tab becomes the fourth tab instead of the third.

Copy of page

As it stands now, the module will happily request columns that don’t exist on the third tab and then fail in puzzling ways.

There is currently only one issue in the issue queue for the Migrate Google Sheets module, and it’s to fix this.

Sad fact #3: Google sheets must be publicly viewable to work (again, right now)

As the module exists right now, there’s no authentication involved, so any migrated content must be publicly viewable. Google authorization is possible with OAuth2, but that is not currently implemented.


Thanks for following along! I hope you found this series helpful. And don’t forget to visit the Migrate Google Sheets issue queue if you find any bugs, have an idea for a feature, or need help!

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Jan 06 2017

The problem:

Content management systems are extremely powerful, in that they let developers focus on what they do best – build the infrastructure of a site, while allowing content editors to do what they do best – create content.

But this can be a problem when building a new feature. How often have you heard something to this effect:

Developer: “That blank spot right there will be a neat slideshow, once you build it.”

Client: “I thought I was paying you to build it.”

The separation between content and development can lead to missed edge cases, unfounded assumptions, and wasted time for everyone involved.

There are a few workarounds to this problem. We often prototype our sites with dummy content (insert your favorite Ipsum here). But this, without fail, leads to some nasty surprises when the client starts entering real content. It’s suddenly much longer (or shorter) than the designer or developer intended. Or maybe the images are far too big. Or they’re all portraits where we expected landscapes. In short, the arguments against using Lorem Ipsum in designs apply doubly once you start actually implementing fields on your Drupal site.

So what about more meaningful content – maybe exported from another source? Modules like Default Content allow developers to export certain content for import during the initial site build. But that content has the disadvantage of requiring a developer’s intervention. The more of a nuisance it is to update the content, sync the database, change the fields, etc, the less likely you are to keep the content up-to-date.

At ThinkShout, we want to populate our client’s sites with content as soon as possible.

It doesn’t need to be the final content…

But it should be real content.

It shouldn’t necessarily be exactly what’s on the old site…

But it ought to be close…

In other words, our initial content needs to be easy to change – easy enough that the client can do it. Easy enough that the developers don’t have to take a walk around the block to calm down when they find out the fields are changing (again). Easy.

Our Solution Part 1: Migrate

“But isn’t Migration to Drupal hard?” I hear you saying.

It certainly was in Drupal 7, where the Migrate module had a (deserved) reputation for being difficult to use. Migrating from one Drupal site to another, or even from WordPress to Drupal, was relatively smooth, but if you really wanted to do something unusual, like migrate from a less-common CMS to Drupal, you were going to be writing a lot of custom code “glue” to get it all working.

In D8, Migrations have been moved to core. This means a few things. First, it means the core concept of entities is baked right in. In D7 migrations, you often had to hunt around for a plugin, hoping someone had written a Destination Handler for your favorite oddball entities, like Redirects, or Addresses, or the dreaded Field Collections. In D8, an entity is an entity.

As such, with a solid knowledge of the helpful migration plugins and two essential contributed modules, Migrate Tools and Migrate Plus, you can write a robust migration of both content and config entities without writing code more complicated than a few .yml files. If you don’t believe me, I encourage you to try upgrading your D6 or D7 site to D8 on a local or dev environment to see how much of your data is already in there.

That being said, what if I don’t have an existing site? Or what if I want to implement a new content strategy to go along with my fancy new site?

Our Solution Part 2: Google Sheets

Throw that new content into a Google Doc!

Yes, spreadsheets are old school, but let’s take a minute to appreciate what they give us.

  • Spreadsheets are familiar. When it comes right down to it, spreadsheets are the universal language of business. Putting content into little boxes gives us the ability to move those boxes around, highlight them, and sort them – few UX experiences can get you so much information so quickly.

  • Spreadsheets are dynamic. It doesn’t take hours of database planning to get information into a spreadsheet. Nor does it take hours of testing to rearrange or remove items from a spreadsheet. It doesn’t demand anything of your data architect other than “organize things by columns and rows.”

  • Spreadsheets are sharable. We can create a Google spreadsheet and share it with the client in a few minutes. Clients can start entering their data from day 1 (alright, maybe day 2 or 3). And they can update content as needed, adding, changing, or removing things as the content evolves.

  • Google spreadsheets have revisioning built in. If someone really messes up a Google Doc, you can go back through its history and revert it. It’s a nice compromise between committing all your initial content to source control or just letting it live freely.

Ready to give it a shot?

Stay tuned for Part 2 of this series, where I go into detail about how to set up your own Google sheet Drupal 8 migration.

Can’t wait? Check out the Migrate Google Sheets module now! We’ve even set up a sample site where content comes entirely from an external spreadsheet to help you get started.

Dec 16 2016

We mention iATS Payments quite frequently on our blog. Why? Well, we did partner with them to create Commerce iATS for Drupal. It also happens to be one of our preferred online payment processors, due in no small part to its incredibly reasonable processing fees and the excellent support it offers nonprofits. It made sense for us to invest the time in making iATS Drupal-compatible, given the ever-increasing number of nonprofits using Drupal to host their sites and fundraising efforts.

We sat down with Mike Kim, Partner Account Manager at iATS Payments, and asked him to explain in his own words exactly what iATS brings to the table:

What is iATS Payments exactly? What differentiates it from other companies that offer similar services?

Mike: iATS Payments offers payment processing solutions to over 10,000 nonprofit organizations around the globe.

We take care of all of the backend processes that occur when an online or mobile transaction is initiated by one of your donors. To do so, we provide both the payment gateway (the interface for accepting online donations) and the merchant account (the bank account where funds are held as the transaction is being processed).

We enable nonprofits to accept all major credit cards and ACH (direct debit) payments to provide their supporters with the utmost flexibility when it comes to online giving.

What sets iATS apart from other payment processors is that we’re one of the only services designed specifically for nonprofits. Since we only work with nonprofit organizations, we have a firm understanding of nonprofits’ needs and the pain points they face when accepting online donations.

Additionally, we’ve established an extensive network of partners, so our services can easily be integrated with many donor databases, event planning or auction software platforms, and other fundraising software. The ability to integrate simplifies data collection significantly and provides your organization with the most up-to-date and comprehensive financial reports, since all transactions are handled through one service.

Why would someone use iATS Payments instead of PayPal or another similar commercial service?

Mike: There are many benefits of working with a dedicated payment processor like iATS instead of services like PayPal.

PayPal and similar services are aggregators, meaning that they don’t allow organizations to select their own merchant accounts (the bank account used to hold donations while transactions are being verified). Instead, aggregators require their clients to use the merchant account that they (the aggregator) have set up, which your nonprofit would have to share with all of their other clients.

Using a shared merchant account can put your donations in jeopardy, especially considering that aggregators likely process hundreds, if not thousands, of transactions each day. If anything were to happen to one of their clients, it could put the whole account (and thus, your funds) at risk. And since aggregators have so many clients to attend to, you wouldn’t receive attentive support in ironing out any issues.

Not to mention, since PayPal and similar services aren’t specialized to nonprofits, they’re not as in-tune with your unique needs as we would be at iATS.

Those are only a couple of the benefits of working with us over PayPal, but there are many, many more. If you’re interested in learning more about why your organization should choose a PayPal alternative, check out this article from @Pay.

A lot of nonprofits utilize nontechnical staff and volunteers. How does your company support them with what is very technical stuff?

Mike: Our services were designed so that organizations don’t have to deal with any of the technical aspects of payment processing. We take care of all of the backend parts of the transaction for you so that your organization can spend more time focusing on what’s truly meaningful: connecting with supporters and enacting the good work that you do.

That being said, if your organization has any questions or ever runs into any technical difficulties, we’re known for our excellent customer support. We offer live customer care, so your organization can quickly get in touch and we can solve issues with minimal turnaround time.

We also offer user-friendly tutorial videos and product guides that nonprofit staffs and volunteers can use to gain a better understanding of our services.

What about security? This is a big issue for nonprofits. What methods do you have in place to instill trust in the security of your service?

Mike: iATS takes data security very seriously. We are a level 1 PCI-compliant payment processor, meaning that our services adhere to the strictest security standards as outlined by the Payment Card Industry.

Our payment processing solutions include many fraud prevention features that your organization can turn off and on dynamically as you see fit. Here are just a few that we offer:

  • Address verification system (AVS). An AVS checks the billing address that the donor has submitted on a webform against the billing address on file with their bank account to help you spot potentially fraudulent transactions.

  • Card verification code requirement (CVV). Turning on this feature requires the donor to enter the CVV number (the three-digit number on the back of their credit or debit card) when making a transaction. Requiring another form of payment method identification can also deter fraud.

  • Minimum transaction limit. Fraudsters often test out stolen credit cards on donation forms by entering small, random amounts (think: $1.32). With our services, you can set a minimum donation amount so that your donation form is less likely to become a testing ground for fraudulent transactions.

Essentially, our services allow your organization to customize the level of security to your unique needs, providing both you and your donors with more peace of mind. At the same time, our security measures are non-invasive, keeping the donation process quick and convenient.

How can a nonprofit run by volunteers and non-technical staff offload the responsibilities that go with virtual transactions?

Mike: There can be a lot of tricky regulations to maneuver around when it comes to online transactions. The best way to avoid having to take on these crucial responsibilities is to work with a dedicated payment processor like iATS.

Our services are designed to be PCI-compliant and regulate the virtual transaction process so that organizations never have to bear the burden of these responsibilities (and the potential consequences that could arise as a result of failing to adhere to security standards and other regulations).

What platforms and implementations of your services do you support?

Mike: There are three ways that nonprofits can take advantage of iATS’ services. Here’s a quick rundown of what we offer:

  • Brickwork. Brickwork is a payment processing application offered on the Salesforce App Exchange. It’s compatible with both the Nonprofit Success Pack and Enterprise editions of Salesforce so that your organization can accept both credit cards and direct debit payments through your Salesforce CRM and Auctions for Salesforce platforms.

  • Aura. With Aura, your organization can use the iATS customer portal to create customizable donation forms for multiple campaigns. Then, you can seamlessly embed your forms into your website by placing a user-friendly Aura code on your site.

  • Partnerships. iATS also partners with over 130 donor database, event management, and fundraising software vendors. Chances are that our services can be integrated with the other nonprofit software platforms your organization is already using.

By providing multiple solutions, we can cater to nonprofits of all shapes and sizes, from those just starting out with online fundraising to those who already use an established suite of software to run their efforts.

How are you keeping up with the development and future of those platforms?

Mike: At iATS, we’re always looking toward improvement. We actively seek out feedback from clients, partners, and other stakeholders to help us identify areas where our products or services could be adjusted to make the virtual transaction process easier for all parties involved.

For example, one of the features we recently rolled out in the newest version of our Brickwork platform is card swipe reader support, so organizations can easily swipe credit cards on the go (which, as you know, is crucial now that donors are carrying cash less and less). We also added campaign and record type IDs so that transactions are recorded more accurately in the CRM.

How can a nonprofit get started with iATS if they decide today that they want to give it a try?

Mike: If your organization would like to get started with iATS, you can get in contact with us by visiting the contact page on our website. Simply fill out the quick form, and then we’ll be in touch to evaluate your needs and guide you through the sales process.

Alternatively, you can contact our sales department directly:

By phone: 1.866.300.4287 (#2)
By email: [email protected]

We hope to hear from you soon!

Nov 09 2016

Have you ever stared at your computer screen with a deer-in-headlights expression on your face thinking “I have no idea where to even start with this…”? That was me about a month ago when I was asked to help theme a Drupal 8 project for the very first time. Getting started with theming in Drupal 8, which was still fairly new and has a programming style that differs from Drupal 7, was both exciting and daunting. It was exciting in the sense that I’d heard good things from those who’d already started theming in D8, and daunting because there’d been a lot of changes between D7 and D8.

One of the differences between Drupal 7 and 8 is template files; PHPTemplate (.tpl.php) files were replaced with Twig (.html.twig) files. Twig is a robust and elegant template engine for PHP. Once I started working with Twig, I instantly loved it. I found it to be lightweight, fairly quick to pick up, and very readable. Not only that, but I had what felt like ultimate control over the markup, including wrapping elements and rendering exactly the output I needed. Often with Drupal 7, wrapping elements in a <div> requires assistance from a back-end developer.

With this newfound enthusiasm, I set out to write the best Twig code ever! In order to find the output I needed, I used the Twig function dump(). This function “dumps” a variable’s information right on the screen. This proved highly useful until I realized I needed to dig deeper into the arrays and objects contained within the variable. There was only so much guesswork I could do here before getting epically frustrated, seemingly wasting valuable time looking for an image file’s path.

Though there are a handful of debugging methods to choose from, I had the best luck getting what I needed by using PHPStorm to debug Twig files. That’s right, front-end friends, PHPStorm isn’t just for back-end PHP-coding devs. It can be a great tool for front-end programmers as well!

After following the steps listed in Lubomir Culen’s post about debugging Twig templates, I began to look for templates in the following path: sites/default/files/php. From my understanding, opening a template folder gives you access to the version of the template the project is currently using, hence the long hash.

Content rendering 1

If a change is made to the template, an additional hash file is created and a new breakpoint will need to be set. If at any point the hash template files get overwhelming, clearing the cache (running drush cr all) will reset the PHP folder and the template files, reducing the hash files to one per template folder.

First off, I needed to acclimate myself to translating PHPStorm syntax into Twig. For example, copying a variable name in PHPStorm produces a syntax like this: $context['page']['#title']->arguments['@name']. That gets translated into the Twig file like so: page['#title'].arguments['@name']. Here’s what my PHPStorm screen looked like while working on this solution:

Content rendering 2

Some patterns and tricks I found helpful:

  • Ignoring $context and starting with the main content variable.
  • Strip array syntax, i.e. ['page'] = page.
  • If arrays exist next to each other, separate them with periods. Ex. ['page']['content'] = page.content.
  • If an array has a #, @, or other symbol associated, keep its integrity. No period is needed here. Ex. ['page']['#title'] = page['#title'], and arguments['@name'] stays the same.
  • If an arrow exists, treat the method (what comes after the ->) in the same manner as arrays. Ex. ['#title']->arguments = ['#title'].arguments
  • If you’re having trouble rendering the desired output, try adding .value to the end of the render code and see if that does the trick.
  • Use dump() simultaneously with PHPStorm’s suggested variable path.
  • Refer to the Twig documentation for other handy built-in features.
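Putting those patterns together in a template, a quick debugging pass might look like this (field_subtitle is a made-up field name for illustration):

```twig
{# Hypothetical node template snippet; field_subtitle is assumed. #}
{{ dump(content.field_subtitle) }}   {# inspect the render array #}
{{ content.field_subtitle }}         {# render the field through its display settings #}
{{ content['#node'].title.value }}   {# the raw title string #}
```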

Up until the moment I got PHPStorm doing the heavy lifting, my team and I were relying solely on the dump() Twig function. We were halfway through the project when I discovered a value was no longer present. The disappearance was due to a template’s reliance on a value being rendered via an array placement, i.e. content.tile.3['#markup'], the ‘3’ referring to the 4th placement in the ‘tile’ array. To alleviate potential confusion, ‘tile’ happened to be the custom field group where the field_to_render existed, and the field_to_render was the 4th field in the list of fields. When a field was moved within the ‘tile’ field group, the code broke. Once I had access to the PHPStorm debugger, I was able to see a better way to render this element, i.e. content.field_to_render. It suddenly dawned on me that our project needed some tidying, so I rolled up my sleeves and got to work.

These are the strategies I established during my clean-up process:

  • Create the shortest render code possible with the closest placement to the main content variable. This will be the most stable. The array placement mentioned previously illustrates this well: the same value can be present and rendered in numerous ways.
  • If rendering a field, use this pattern: content.field_to_render. This will render the field object, inheriting any backend logic that’s been applied to that field existing in the view_mode you are theming.
  • If you prefer having just the markup or integer value, try adding a .value to the end. Examples: content['#node'].nid.value will provide just the node id, and content.node_title will render the title object whereas content['#node'].title.value will render the title as a string.
  • The first element in an array might be the most stable. For example, we often use the media module, which can add complexity to a media item’s data structure. In order to use a node’s image as a background for a <div>, this is the best approach we found: <div class="banner-image" style="background-image: url({{ file_url(content.field_banner_image.0['#item'].entity.uri.value) }})">.

Any change can be tough to navigate, but it’s often well worth the effort. My experience theming in Drupal 8 thus far has been lovely, and fairly intuitive. I find it offers front-end developers more authority over the markup than its predecessor, and makes me excited for the future of theming in Drupal 8. If you were at all daunted by the thought of theming in Drupal 8, I hope this post helps you in your future twig debugging endeavors!

Note: Devel and Kint are a couple additional tools available for debugging Twig variables, and I mention those in case others find them useful. More information on how to set those tools up for debugging Twig files (and more!) can be found in this Drupal 8 Theming Guide and on Amber Matz’s Let’s Debug in Drupal 8! post.

If you’re new to Drupal 8 theming, I would start with the resources Amber specifies in her “Editor’s notes”, and sqndr’s D8 theming docs. Debugging twig files is an intermediate topic.

If you have any personal experience with Drupal 8 theming, or insight you’d like to share, I’d love to hear about it in the comments section!

Oct 14 2016

Originally published Sept. 25, 2016 on GregBoggs.com.

In many modern frameworks, data modeling is done by building out database tables. In Drupal, we use a web-based interface to build our models. This interface makes building the database accessible for people with no database experience. However, this easy access can lead to overly complex content models because it’s so easy to build out advanced structures with a few hours of clicking. It’s surprising how often Drupal developers are expected to be content modeling experts. Rachel Lovinger wrote this great overview of content modeling for the rest of us who aren’t experts yet.

Data Modeling Goal

Our goal when modeling content in Drupal is to build out the structure that will become our editor interface and HTML output. We also need to create a model that supports the functionality needed in the website. While accomplishing this, we want to reduce the complexity of our models as much as possible.

Getting Started

One of the first things to do when building a Drupal site is to build content types. So, before you start a site build, start with either a content model or a detail page wireframe. This spreadsheet from Palantir will help you. The home page design may look amazing, but it’s unhelpful for building out content types. Get the detail pages before you start building.

Why Reduce Complexity?

The more content types you create, the more effort it will take to produce a site. Furthermore, the more types you have, the more time it will take to maintain the site in the future. If you have 15 content types and need to make a site-wide change, you need to edit 15 different pages.

The more pages you need to edit, the more mistakes you will make in choosing labels, settings, and formatters. Lastly, content can’t easily be copied from one type to another, which makes moving content around your site harder when there are many content types. So, the first thing you’ll want to do with your content model is collapse your types into as few types as feasible. How many is that?

5 Content Types is Enough

Drupal has many built-in entities like files, taxonomy, users, nodes, comments, and config. So, the vast majority of sites don’t need more than 5 content types. Instead of adding a new content type for every design, look for ways to reuse existing types by adding fields and applying layouts to those fields.

Break Up the Edit Form

Drupal 8 allows you to have different form displays for a single content type. With either Form Mode Control or Form Mode Manager, you can create different edit experiences for the same content type without overloading the admin interface.

By reducing the complexity of the content model, we decrease maintenance cost, improve the consistency of the website, and simplify the editing experience. Now that you’ve got some content modeling basics, look for opportunities to reduce and reuse content types in your Drupal projects. Content editors will thank you.

Sep 01 2016

Listen up, Drupal savvy MailChimp fans. We’ve got some news for you: MailChimp recently rolled out a newer and more robust version of their API - MailChimp API version 3.0! Now I can probably guess what you’re thinking so I’ll just come out and say it: this means MailChimp’s API version 2.0 is about to become deprecated, and we’re not monkeying around.

For those of you using the 8.x and 7.x-4.x branches of the MailChimp module, feel free to sit back and relax - you are already using MailChimp’s API v3.0. Those of you still using the 7.x-2.x and 7.x-3.x branches, get ready: API v2.0 will be phased out on December 31st, so we encourage you all to upgrade.

Don’t be a furious George - we’ve got you covered. Our documentation up on Drupal.org has been updated, and we’ve provided information that will help make your upgrade experience as seamless as possible. We’ve even included a shiny new FAQ page this go around. For additional support, feel free to post questions on Drupal Answers.

Alright, let’s get down to monkey business. Those of you who upgrade are about to have a module that is regularly maintained, has an improved infrastructure (see the README.txt on the 7.x-4.x branch for more info), and can be integrated with the new MailChimp E-Commerce module (more on that in a future blogpost) - now, that’s something to go bananas for!

Jul 20 2016

We recently launched a new case tracker for foster ed youth designed to improve their educational outcomes in partnership with The National Center for Youth Law (NCYL). The web application replaces their existing platform, Goal Book, which lacked the flexibility needed to meet their requirements. A web application differs from a website in that a website primarily provides content, whereas a web application primarily provides tools.

The project presented us with an opportunity to do extensive custom development with our favorite new platform, Drupal 8. D8’s many developer experience improvements, including standardized object-oriented development methods, allowed us to meet NCYL’s requirements efficiently and with a level of quality that would have been more difficult on Drupal 7. In addition, we were able to accelerate the release of RedHen CRM on Drupal 8, which lives at the heart of the application managing all of the contacts, organizations, and relationships.

To enhance the utility of the application, we made an early decision to customize every URL a user would interact with. As most of the functionality would revolve around nodes, we wanted to make sure we avoided URLs like /node/256/edit that don’t give the user any indication of which part of the application they’re using.


If you wanted to customize URLs in Drupal 7, you could use the Pathauto module. You can still do that in Drupal 8, but D8’s routing system can be coaxed into doing something similar. It works on admin pages, too, which was perfect for NCYL’s needs.

Overriding Existing Node Paths

As an example, let’s say you have a node type specifically for storing information about schools: a School Node. The standard admin path for adding a School Node would be something like this:

    /node/add/school

But, add a custom module with route configuration and you can have this:

    /school/add

For simplicity, we’ll call our module school.module. The directory structure looks like this:

    school/
      school.info.yml
      school.routing.yml

The route configuration sits inside school.routing.yml and looks like this:

school.add:
  path: '/school/add'
  defaults:
    _controller: '\Drupal\node\Controller\NodeController::add'
    _title: 'Add School'
    node_type: 'school'
  requirements:
    _node_add_access: 'node:school'

Line by line:


school.add:

This is the name of the route. Route names should be unique and usually start with the name of your module.

path: '/school/add'

The path the route points to. This is the part that comes after your site’s base URL.

_controller: '\Drupal\node\Controller\NodeController::add'

This tells the route to use the NodeController, provided by the Node module. No need for a custom controller here.

_title: 'Add School'

This sets the page title of the node add form.

_node_add_access: 'node:school'

This is an access handler that ensures the user has permission to add a node of type “school.”

Providing a custom path to edit School Nodes is even easier:

school.edit:
  path: '/school/{node}/edit'
  defaults:
    _entity_form: 'node.edit'
  requirements:
    node: \d+
    _entity_access: 'node.update'

We no longer need to tell the route which controller to use or what type of node we’re using. Drupal 8’s Entity API figures it out using the node ID passed in the URL.

Line by line again:

path: '/school/{node}/edit'

The path now contains a placeholder, {node}, which represents the node ID in the URL.

_entity_form: 'node.edit'

The form we want to use to edit the node.

node: \d+

Some validation to ensure the URL contains the right data type for a node ID. By specifying the regular expression pattern \d+, we are telling Drupal to only use this route when {node} is one or more digits. The route will match a URL like /school/32/edit, but will not match /school/lincoln-high/edit.

_entity_access: 'node.update'

An access handler, ensuring the user has permission to update this node. No need to specify the node type, as we did when adding a node.

Finally, a route for viewing the node:

school.view:
  path: '/school/{node}'
  defaults:
    _controller: '\Drupal\node\Controller\NodeViewController::view'
  requirements:
    node: \d+
    _entity_access: 'node.view'

Very similar to the node edit route, just with a different path and controller.

For a more thorough explanation of routes and route options not covered here, check out the official docs.
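As a taste of the route options covered there, access doesn’t have to run through an entity access handler at all. The sketch below is hypothetical - the school.list name, /schools path, and use of the node list builder are illustrative assumptions, not something from this project - but the _entity_list and _permission options are part of Drupal 8’s routing system:

```yaml
# Hypothetical route for illustration only: render the node entity's
# list builder at /schools, visible to anyone who can access content.
school.list:
  path: '/schools'
  defaults:
    _entity_list: 'node'
    _title: 'Schools'
  requirements:
    _permission: 'access content'
```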

Custom Controllers

What if you want to provide a custom controller for adding a node and still take advantage of Drupal’s permissions system? Routes can do that, too.

Let’s introduce a Teacher Node and an accompanying module. The directory structure looks like this:

    teacher/
      teacher.info.yml
      teacher.routing.yml
      src/
        Controller/
          TeacherController.php

teacher.routing.yml looks like this:

teacher.add:
  path: '/teacher/add'
  defaults:
    _controller: '\Drupal\teacher\Controller\TeacherController::addTeacher'
    _title: 'Add Teacher'
    node_type: 'teacher'
  requirements:
    _node_add_access: 'node:teacher'

Very similar to the route we used to add School Nodes, but with a custom controller.

TeacherController.php looks like this:

<?php

namespace Drupal\teacher\Controller;

use Drupal\Core\Controller\ControllerBase;
use Drupal\node\NodeTypeInterface;

class TeacherController extends ControllerBase {

  public function addTeacher(NodeTypeInterface $node_type) {
    // Custom code for adding Teacher Nodes goes here.
  }

}

The addTeacher function is where you would add your custom code for adding Teacher Nodes.

That’s how you can use core Drupal 8 functionality to make your Drupal admin pages a little more user friendly.

Jul 12 2016

Anyone who works on team-based projects knows how handy good project documentation is, and how frustrating it can be when that documentation is out of date, incomplete, or just not there. But there are other benefits to good documentation aside from convenience, and a solid system for writing and maintaining documentation is the key.

Defining Documentation

Before we begin, we should be clear about what we mean when we say ‘Project Documentation’ (Docs for short). We’re referring to the information for team members (developers, designers, project managers, and engineers) who join a project at some point after initial development has begun, or even long after a project is complete, such as a maintenance team. This is different from User/Tech docs (how things work on a site) and Code docs (comments, README files, etc.).

Good docs allow these team members to get up to speed on a project with a minimum of questions for existing or previous team members. In an ideal world, docs alone would suffice in getting someone set up and working on new features, bugfixes, or other project tasks.

Additional Benefits

The convenience of good docs is apparent to anyone who joins a project after it has begun, but consider some of the other benefits:

  • Junior developers will be able to reference docs, instilling confidence.
  • A team member leaving your company will not cause as much of a ‘knowledge drain’.
  • Consistent docs allow any team member to quickly jump in and out of projects as needed, providing project managers with additional flexibility in resource allocation.
  • Long-dormant projects can be resurrected quickly, even if none of the original team members are available.
  • Figuring out where a project’s code is, how to install it locally, how to make/commit changes to production, and tracking down the original wireframes, designs, and planning docs can take days if the original team members are not available. Good docs can cut this time to under an hour, or even minutes in some cases.
  • Docs that accompany open-source projects are especially useful in saving the end-user AND the maintainer’s time.

Location, Location, Location

Having your docs in one place, or in the same place on every project is the first step in making them easy to find - after all, what good are the docs if nobody can find them? ThinkShout uses GitHub for all of its projects, so we take advantage of the fact that every project on GitHub has a free Wiki. A link in the README.md to the wiki means everyone can find the docs in seconds.

A Solid Foundation

The keys to good docs are consistency, accuracy, and completeness:


For our Wiki, we have a template we use for every project’s docs, so we don’t have to search for the information among 40 different documentation styles. Your project’s needs may differ, but this should be a good starting point (this is in Markdown):

## Current Status

(Site Type / Status. Drupal, WordPress, under development, maintenance, etc...)

## Site Build Info

* [Wireframes](URL)
* [Budget](URL)
* [Implementation overview](URL)
* [Migration Spreadsheet](URL)
* [Style Guide](URL)

## Build Team

* Name (Team Lead)
* Name (Back-end)
* Name (Front-end)
* Name (PM)
* Name (Design/UX)

## Hosting

* [Dev](URL)
* [Test](URL)
* [Live](URL)

## Issue Tracking

[Redbooth Tasks](URL)

## Deploying Code  
Note: it is a good practice to run backups before deploying.

    `cd ~/projects/PROJECTNAME;git pull;./scripts/deploy.sh`  

## Installation Notes

Clone into `projects` folder, install to `~/Sites/`:

    cd ~/projects
    git clone [email protected]:thinkshout/PROJECTNAME.git
    composer update
    ./scripts/build.sh ~/Sites/PROJECTNAME root root PROJECTNAME

Download db and files from [production](production backup URL)

Install the db by opening Sequel Pro, deleting the PROJECTNAME db,  
adding a new PROJECTNAME db, and importing the live db, then truncating  
all of the cache_* tables. 

Install the files by unzipping the file download and copying them  
to `~/Sites/PROJECTNAME/sites/default/files`, then run:  

    chmod -R 777 ~/Sites/PROJECTNAME/sites/default/files
    drush cc all
    drush fra -y

Log in: drush uli

Disable cache and JS/CSS file aggregation   
at http://PROJECTNAME.dev/admin/config/development/performance

## Front-end Setup  
Theme directory is at:  

To get Sass running, `cd` to that directory and run `bundle`  
Thereafter, you only need to run `rake serve` from the theme directory.


The nice thing about having your docs in a wiki is that everyone in your organization can edit them if they discover they are out of date. When a new team member is added to a project, encourage them to work from the docs and see how far they can get without asking for clarification or dealing with an unexpected error. And make sure they update the docs to reflect their experience - the only time docs are ‘done’ is when anyone can use them reliably every time. If you have to ask what something means, it’s likely that the next person will need to know that too - so update the docs!


Every project has its quirks and exceptions to the standard procedures - usually for good reason. Good docs will not only note exceptions to standard procedures, but also explain why. In addition, sometimes a ‘Phase 2’ project will require additional information. Make note of these major updates with details such as planning info, principals, dates, and an overview of what was accomplished.

Sometimes a developer will run across coding environment issues that hold them up - this is quite common for the complex front-end setups needed to compile Sass into CSS. Front-end developers sometimes take these setups for granted, but documenting that install process can mean that your back-end developer can handle small CSS changes without assistance:

To get Sass running, `cd` to that directory and run `bundle`  
Thereafter, you only need to run `rake serve` from the theme directory.

NOTE: If you get a 'not found' error after running `bundle`,  
run `gem install bundler`, then `bundle install`.

Part of Your Process

Finally, it’s not enough to have all of these wonderful docs in place and then forget them - documentation has to be a part of your project setup and launch checklist, and it needs to be a part of every project, big or small.

Consistent, accurate, and complete project documentation will save time, make your code easier to maintain, improve team confidence, and do a great service to every developer who comes to your project after it’s finished. Docs Rocks!

May 05 2016

Next week, nine of us will ship out to New Orleans for DrupalCon. As excited as we are to finally enjoy an authentic beignet and explore the French Quarter, the real draw here is the opportunity to once again catch up with our friends and collaborators in the Drupal community.

We’ve been hard at work preparing our presentations and lining up BoF schedules, and putting those finishing touches on our booth. All that’s missing is you! We have five short days to see what all DrupalCon has to offer, and meet everyone we want to meet. Can we do it? Two words: Challenge accepted.

As always, if you’re reading this and you plan on attending DrupalCon, too, we want to meet you! Here’s what we’ve got lined up.

Exhibit Hall

This year, booth 216 is the ThinkShout headquarters! Rest assured, we brought the swag. Shirts, socks, and MailChimp hats await you, so be sure to stop by and grab one before they’re all gone! We’ll also be raffling off another BB-8. All you have to do is sign up for our newsletter by dropping off a business card. Painless, right?

We’ll also be on the lookout for folks interested in pursuing the senior engineer position we recently listed, so if you think you might be a good fit for the position and you’ll be at DrupalCon, then let’s chat! Drop us a line and we’ll set up a time to get to know each other a little better. Or just show up - that’s fine too.

ThinkShout Sessions

“Content Strategy in Pop Culture” - Wednesday (5/11), 1:00pm - 2:00pm

Join Brett Meyer for an unconventional look at content strategy through a pop culture lens. Draw parallels between the information you consume every day and the sites you build, and have a little fun while doing it!

“The Story of Successful Drupal Integrations in 3 Acts” - Thursday (5/12), 10:45am - 11:45am

Lev Tsypin will share the stories of the MailChimp, iATS Payments, and Salesforce Drupal integrations, and share some insight into how to be successful when setting out to integrate Drupal with other third party systems.

Birds of a Feather (BoF) Discussions

Our team has three birds of a feather discussions planned on the Tuesday of DrupalCon, so mark your calendars and join the conversation!

“Fundraising in Drupal” - Tuesday (5/10), 11:00am - 12:00pm

“Static Fanatic: Tips on Developing Static Sites” - Tuesday (5/10), 11:00am - 12:00pm

“Event Registration in Drupal” - Tuesday (5/10), 3:45pm - 4:45pm

Of course, we also plan on checking out the many wonderful social events that DrupalCon has in store in the evenings, so there’s a good chance you’ll see us there as well! It’s going to be a jam-packed week, and we can’t wait. As always, we’re looking forward to catching lots of sessions, brushing up on trends, and learning new things. We hope to see you there!

May 03 2016

As Drupal 7 developers, we know how risky it is to edit production code and configuration live. However, we often let clients do it because using Features is hard. Drupal 8 has solved a lot of this headache with file-based configuration management, which allows file-based workflows that gracefully avoid editing production directly. This article will show you how to use Drupal 8 configuration management and Pantheon’s amazing workflow tools to easily give your clients the ability to make configuration changes. We’ll show you how to seamlessly integrate those changes into your normal development workflow, so that you - and your clients - will win at Drupal!

Benefits of File-based Config

Storing active configuration directly in files has many benefits. The main benefit is that clients no longer have any reason to edit configuration directly on production. Further, file-based configuration removes the extra steps required to move configuration in and out of the database - steps that are confusing, can fail with fatal errors, and are unnecessary when configuration isn’t stored in the database.

How to Enable File-based Config

The documentation for enabling this isn’t too difficult. But, Pantheon recommends not storing the services.yml file in version control. So, we’ll create a new services YAML file and include that along with the active configuration settings in settings.php. Before you start, export your current configuration to the sites/default/config folder and deploy that to Pantheon. Next, enable file storage by adding the following config.services.yml to your sites folder and using the following settings.php.
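The embedded files from the original post aren’t reproduced above, so here is a minimal sketch of a config.services.yml along those lines. It follows the Drupal 8 core approach to file-based active configuration - overriding the config.storage.active service with file storage - but treat the exact service definitions as an assumption to verify against your core version. In settings.php you would also register this file in $settings['container_yamls'] and point $settings['bootstrap_config_storage'] at Drupal’s file-storage factory.

```yaml
# Sketch of sites/default/config.services.yml - assumed from the
# Drupal 8 core documentation for file-based active config storage;
# verify against your core version before deploying.
services:
  config.storage:
    class: Drupal\Core\Config\CachedStorage
    arguments: ['@config.storage.active', '@cache.config']
  config.storage.active:
    class: Drupal\Core\Config\FileStorage
    factory: ['Drupal\Core\Config\FileStorageFactory', 'getActive']
```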

Once deployed to Pantheon, the site will now be running in file-based configuration storage. To test this, go ahead and make a setting change in your local environment. You should see Drupal immediately write the change to sites/default/config. Deploying this edit to Pantheon should make the Pantheon site immediately update to reflect the new configuration change. You just won at Drupal!

Configuration Workflow on Pantheon

Now create a multidev environment for the client to use. Once the multidev is created, put it into SFTP mode, which allows Drupal to write to the configuration files directory. The client can then edit configuration in Drupal and commit their work from the Pantheon dashboard.

Changes ready to commit

Changes committed

Merge to development

Configuration deployed to development

When the client has completed their work, they can deploy it using the Pantheon deployment tools. Because the client is now using version control, you can easily merge their work into your own. Once the configuration is merged to Dev, the standard Pantheon workflow makes it easy to deploy these changes to production.

Don’t Edit Production Directly

If production is in SFTP mode, clients can still edit production live. To prevent this, either keep production in Git mode, or use the Config Readonly module to lock production configuration.

Drupal gives users the power to build and edit a website, making dramatic changes with just a few clicks in forms. With Pantheon’s tools and Drupal 8, we can now offer that power in a safe environment. Combined, these tools let us bring clients into the workflow and manage deployments as part of the team - Drupal 8 allows us to build robust, collaborative workflows like never before.

Apr 27 2016

Drupal 8 theming can be irksome with cache-rebuilding and browser refreshing, especially with responsive design. Wouldn’t it be great if you could just open your site on three different devices and have them update live as you edit your theme?

Let me introduce you to Browsersync. Browsersync is a module for Node.js that allows you to sync your changes across browsers and devices.

Preparing Drupal

This article assumes you have a working install of Drupal 8 and a theme in place. If you don’t, check out Joe Komenda’s post, Up and Theming with Drupal 8. This will get you going.

Once you have D8 installed, you’ll need to turn off caching. Copy sites/example.settings.local.php to sites/default/settings.local.php. You can copy the file with your editor of choice, if you prefer, or run the following command from your site root:

$ cp sites/example.settings.local.php sites/default/settings.local.php

To be sure your changes are included, we’ll need to enable Drupal’s null cache service. Uncomment the following line in sites/default/settings.php:

$settings['container_yamls'][] = DRUPAL_ROOT . '/sites/development.services.yml';

Next, let’s disable the render cache and dynamic page cache. Uncomment the following in the same file.

$settings['cache']['bins']['render'] = 'cache.backend.null';
$settings['cache']['bins']['dynamic_page_cache'] = 'cache.backend.null';

Finally, add the following to sites/development.services.yml:

parameters:
  twig.config:
    debug: true
    auto_reload: true
    cache: false

Run drush cr from the root of your site to rebuild the cache.

Installing Browsersync

Browsersync is installed using Node Package Manager (NPM). If you already have Node.js, then you already have NPM. If you don’t have it installed, head over to nodejs.org.

Once Node.js and NPM are set up, install Browsersync with npm install -g browser-sync. This will install it globally so that you don’t have to reinstall it every time you spin up a new project. Test that your installation is working by running browser-sync -h in your terminal. That should show all the usage, commands, and options for the plugin.

Connecting Browsersync to Drupal

Let’s make the magic happen by connecting Drupal and Browsersync. Go to the root of your Drupal theme folder and run browser-sync start. Browsersync will generate a script tag for you to place just before the closing body tag. Browsersync also has a UI. You’ll see a URL for your localhost and one for sharing the connection with other devices on the same network.

Browsersync start

Let’s add the script tag to your html.html.twig file just above the closing </body> tag. This will connect your Drupal environment to Browsersync.

Since Drupal will most likely be running on a local server configured by your LAMP stack, you’ll need to run Browsersync with the proxy option. Run browser-sync start --proxy <your site localhost> in your terminal. For example, if your site is running at http://mysite.dev, use browser-sync start --proxy mysite.dev. Your browser will open automatically to http://localhost:3000, and you should see “Connected to BrowserSync” in the top right of your browser.

Watching for Changes

Although Browsersync and Drupal are connected, we still need to watch for changes. Let’s run Browsersync with the --files option. We’ll watch changes to our CSS file and have it automatically update the browser with our changes. In your terminal run:

$ browser-sync start --proxy mysite.dev --files "css/*.css" --no-inject-changes

This command tells Browsersync to start and watch for changes to files with the .css extension in the css directory. The --no-inject-changes option tells Browsersync to refresh the browser window instead of just injecting a new version of the stylesheet. Injecting the changes won’t work because of the way Drupal imports our stylesheets. We need to reload to get the new version.

Try opening your site in Chrome, Firefox, and even on your mobile device browser. Once you make a change, you should see all of them automatically update.

Where to Go from Here

Browsersync is a great tool for fast development and syncing your changes across multiple devices without having to manually reload each one. I recommend integrating Browsersync with your task manager of choice. Here are some resources to help you integrate with Grunt or Gulp:

Apr 19 2016

We were recently asked by a client to edit the user profile view page on their site. This client needed us to move the link to the user’s contact form out of the tab area at the top of the profile and replace it with a link that appears further down in the content of the user’s profile. While this is not something you can do through the admin interface in Drupal 7, it is easy to do with just a few lines of code in a custom module, which I will show you how to do here.

Prior to adding our custom code, the link to the contact form appears as a tab.

Customize Menu 1

The “Contact” menu item starts out as a tab because the Drupal contact module originally creates the menu item and assigns it the type MENU_LOCAL_TASK. (See Menu item types for a list of the possible menu types and their uses in Drupal.) In order for us to change the type, we can use Drupal’s hook_menu_alter() function to change the item to the MENU_CALLBACK type, which will remove it from the display, but keep it available as a valid path.

/**
 * Implements hook_menu_alter().
 */
function mymodule_menu_alter(&$items) {
  // Remove the 'contact' tab.
  $items['user/%user/contact']['type'] = MENU_CALLBACK;
}

Now it is no longer a tab, but we still need to make use of Drupal’s hook_user_view_alter() to insert it into the content of the profile before it is rendered on the page.

/**
 * Implements hook_user_view_alter().
 */
function mymodule_user_view_alter(&$build) {
  // Check to see if this user has allowed others to contact him/her.
  if ($build['#account']->data['contact']) {
    // Create the text for the link using the account info to get the user’s first name.
    $link_text = $build['#account']->field_first_name['und'][0]['safe_value'] ? "email "
      . $build['#account']->field_first_name['und'][0]['safe_value'] : "email";
    // Use the l() function to create the link.
    $contact_link = l($link_text, 'user/' . $build['#account']->uid . '/contact');
    // Insert it into the $build array.
    $build['contact_link'][0]['#markup'] = "<div class=\"field\"><div class=\"field-label\">"
      . t('Contact') . ":&nbsp;</div><div class=\"field-items\"><div class=\"field-item even\">"
      . $contact_link . "</div></div></div>";
    // Insert into the user details group we created in the display mode in the admin interface.
    $build['#group_children']['contact_link'] = 'group_user_details';
  }
}

After the custom code and a quick cache clear, the tab is gone and there is a link to the form within the body of the profile.

Customize Menu 2

I won’t go into creating a custom module; that’s a bit beyond the scope of this post, but there is a tutorial for creating a custom module on drupal.org.

Shout out to Greg Boggs for his assistance!

Apr 12 2016

Have you heard the news? A ThinkShout-built site has been nominated for a Webby! And yes, it’s a Drupal site.

The nominated site is none other than Splcenter.org, the online voice of the Southern Poverty Law Center, an organization committed to teaching tolerance, battling institutionalized prejudice, and giving a voice to the most vulnerable people in our communities. Our partnership with the SPLC has been a source of great pride for us, as it’s led to an amazing collaboration for our respective teams.

The Southern Poverty Law Center demonstrated its commitment to web excellence during the redesign process by investing in a platform that supports the vital work they do in the fight against injustice in our country, making it available and accessible to the people who need it most. This was an incredible undertaking, and I encourage you to check out our case study on the journey to the relaunch of the SPLC to learn more about the process.

The Webbys are perhaps the most prestigious awards of their kind and simply being nominated is an honor. Splcenter.org is technically up for two awards in the “Websites - Law” category. The first is a Webby, chosen by the Webby Academy. The second is the People’s Voice Webby. While we can’t affect the outcome of the first award (beyond the work we put into making a great website), the community decides the winner of the second. Yes, you can help a great organization win by voting for the SPLC!

The site is the product of countless hours of work and careful planning. We strove to build the SPLC a site that would further their mission and protect them from cyber attacks carried out by the forces of hate they combat every day. Together, we’ve accomplished these goals. We believe that work speaks for itself on the new site, but the improvement is measurable, too: since launch, we’ve tracked a 55% increase in overall traffic year-over-year, with a 120% increase in mobile traffic. From the Hate Map – which provides a highly accessible, albeit terrifying view of the rise of hate groups in the U.S. – to the sheer volume of civil rights news and resources made readily available and searchable from any page, the SPLC continues to expand its award-winning voice online. We’re asking you to contribute your vote and show the world that the SPLC deserves the People’s Voice award.

Please take a moment to cast your vote for the Southern Poverty Law Center. With the Webbys, your voice makes a difference, so please vote and be heard!

Apr 01 2016

While many of you were on spring break, working on your tans… we spent our spring break in San Jose. We weren’t catching rays, we were surfing… surfing through oodles of great content and meeting many amazing people working for worthy causes at the Nonprofit Technology Conference (NTC).

For the uninitiated, the NTC is an annual conference that brings together roughly 2,000 nonprofit professionals from around the world. (I met someone who flew in from India to attend.) It is, without question, the banner technology event of the year in our field. Not only do the sessions (of which there were over 100) spark conversations, collaboration, and change, they also explore the latest and greatest strategies and technologies available to help all of these worthy causes achieve their goals.

We look forward to this event every year; it’s infectious to feel the energy in a room with like-minded individuals, and to be afforded the opportunity to solve problems and collaborate in the real world. You can certainly learn a lot from the articles and case studies sitting in your inbox, but sometimes, sitting across from someone at lunch and finding out how their organization approached a common struggle and found a resolution can be so much more illuminating. Remember talking to people? It’s that thing we all did before texting and tweeting took over our lives!

I’m (mostly) kidding.

Joking aside, if you’re in the nonprofit industry, whether it be on the tech side, marketing, fundraising, or leadership, this conference should be on your list for professional development. We’d like to share with you some of the sessions we were a part of – and keep an eye out for updates from NTEN (that’s the Nonprofit Technology Network) for info on conference follow-ups or other webinars they host.

Drupal Day

We look forward to Drupal Day all year, as Stephanie explained in her last blog post. Lev kicked things off for us with his “Fundraising in Drupal” talk. This session explored the toolsets that enable anyone with limited coding skills to run compelling online campaigns, create one page donation forms with multiple payment options, and run viral fundraising campaigns with tools like RedHen CRM and RedHen Raiser. There is a wide range in the efficacy of online fundraising, and Lev walked through how Drupal provides the tools to do it right. He even covered some recent case studies where these tools were implemented and why they were especially effective in those environments. You can find his slides on Slideshare.

Next, Amy dove into the (somewhat) intimidating world of Web Development. In her session, she covered ways to maintain and improve your website without breaking the bank – which is important to just about anyone running a business today, but especially to those in the nonprofit sector. She talked about the pros and cons of using in-house developers versus vendors, considerations for static sites versus a CMS, and of course, strategies to help you keep costs down. Lo and behold: it is possible to have a beautifully functional website in the face of budget constraints! Check out the slides from “Web Development Within Your Means.”


On day one of the formal conference, Lev and I led a session on digital storytelling, explaining why it’s relevant, and talking about some common tools and tips to effectively tell your organization’s story online. Stories are how you engage your constituents and build a relationship with them. More importantly, it’s how they connect to your cause. But the way we tell our stories and digest information is rapidly evolving, and text on a web page isn’t enough. In the Internet of Things, there is a multitude of ways to share your message. Christian Anthony from Earthjustice shared the inventive ways they utilize technology for greater engagement through photo essays, maps, and infographics, just to name a few. You can find the slides for our session, “Show, Don’t Tell: Online Storytelling Through Digital Media,” on Slideshare.

On day two, Brett teamed up with Melissa Barber from North Peak and Lara Koch of the Humane Society of the United States, and spoke about user experiences across 3rd-party systems. I’d wager that about 90% of nonprofits out there wish for nothing more than the opportunity to use a single platform to implement their digital strategy. Most of the time, though, business needs and historical requirements dictate that many projects require you to — somehow — create a cohesive user experience across multiple platforms, even when those platforms don’t provide you with extensive customization options.

It’s not necessarily a pretty job, but we can’t ignore it. Fortunately, there are tried and true ways to streamline the process and maintain as clean a user experience as possible. Topics covered were:

  • Requirements Gathering: What systems are in play?

  • Design: How can we create reusable components?

  • APIs: Can we hide the 3rd-party systems altogether?

  • Compromise: How can we change our perceived organizational needs to put users first?

  • Governance: What are the human systems we need to consider?

  • Post-launch: How can we prevent fragmentation of the experience after the solution is implemented?

And these were just the sessions we participated in! There is an unbelievable wealth of good information and content at the NTC. If you were there with us, you know it’s easy to get overwhelmed and come back to the office, head swirling with ideas, unsure of where to start. You’re likely already knee deep in your established day-to-day tasks and projects – but don’t forget what you learned.

My suggestion? Look back at your notes and session materials. Write your goals down and focus on accomplishing one of them this quarter; set a deadline with your supervisor and hold yourself accountable. After all, do or do not, there is no try…

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Mar 16 2016
Mar 16

It’s that Nonprofit Technology Conference time of year. The NTC is one of our favorite conferences, as it’s an amazing place for us to learn more about the exciting work being done in the nonprofit world. It’s also a great time for us to share what we’ve been doing, and make connections. So if you’re curious about a job opening we’ve posted, or have a project you’d like to talk to us about, the NTC is a great place to have those conversations!

There are a few places you can connect with us this year.

Drupal Day

We’re co-sponsoring Drupal Day on March 22nd along with the fine folks at Aten Design, Four Kitchens, Forum One, Message Agency, and Zivtech. We’ve got a jam-packed day of breakout sessions and nonprofit case studies lined up. There’s a great variety of session topics to choose from, from content strategy, Drupal fundraising, and theming to web projects on a nonprofit budget, so there will be something for everyone at Drupal Day!

Want to stop by a ThinkShout-led breakout session at Drupal Day? Lev Tsypin and Amy Vaillancourt-Sals have a couple of sessions on the docket you should check out. Lev will be leading “Fundraising in Drupal” at 9:30 am, and Amy will be leading “Web Development Within Your Means” at 1:45 pm.

You can check out the full Drupal Day schedule online. If you’re planning on attending the NTC, but haven’t registered for Drupal Day and would like to, no problem. Just show up that morning and we’ll take care of the rest.


There are two ThinkShout sessions in the official NTC schedule. Natania LeClerc and Lev Tsypin will be teaming up with Christian Anthony of Earthjustice.org for “Show, Don’t Tell: Online Storytelling on Digital Media” on Wednesday, March 23rd at 3:30 pm.

On Thursday, March 24th at 1:30 pm, you can catch Brett Meyer, Lara Koch from the Humane Society of the United States, Melissa Barber from North Peak at their session, “If I Only Had a Frame(work): Crafting User Experiences Across 3rd Party Systems.”

These sessions are great opportunities for you to see our staff in action, sharing their expertise with the nonprofit tech community. Be sure to follow Brett and Lev on Twitter for updates on their sessions – and feel free to pick their brains in person!

Exhibit Hall

Booth #715 is the ThinkShout HQ at the 2016 Nonprofit Technology Conference. We’re debuting a brand new, super comfy shirt, so be sure to stop by and pick one up. This year, the exhibit hall will be closed during sessions, but you’re guaranteed to find us there during the breaks in between. If you’re looking to talk shop, or learn more about the services we offer, definitely stop by the booth. If you’d like to plan on a specific time to meet in advance, drop us a line and we’ll make it happen.

At the very least, you should stop by our booth for a chance to win your very own Bluetooth BB-8. Or, as we like to call it, our new office best friend. Yes, you’ll get a chance to meet it before you take it home.

(Image: BB-8. “Beep boop”)

We’ll see you all in San Jose!

Mar 09 2016
Mar 09

It’s easy to take for granted all the great tools we use on the web, everything from email, to social media, to git hosting. Many of these services are free or use advertising to support the sometimes significant costs to run them. Mandrill is one of those great tools and one that we’ve been recommending to our clients on a consistent basis. If you’re unfamiliar with Mandrill, it’s a transactional email service for sending email ranging from password resets to Commerce receipts. Now we realize that your webserver can already send email, so why bother with a service like Mandrill? It offers three major advantages:

  • Deliverability
  • Reporting and accountability
  • Templated emails that look great across all email clients
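To make the idea of a transactional email service concrete, here is a minimal sketch of sending a single message through Mandrill’s REST API with `curl`. The addresses, subject, and API key are hypothetical placeholders; in practice the Drupal Mandrill module makes this call for you whenever your site sends mail.

```shell
# Placeholder key -- substitute your real Mandrill API key before sending.
MANDRILL_API_KEY="replace-with-your-key"

# Build the JSON payload Mandrill's messages/send endpoint expects.
payload=$(cat <<EOF
{
  "key": "${MANDRILL_API_KEY}",
  "message": {
    "from_email": "noreply@example.org",
    "to": [{"email": "donor@example.org"}],
    "subject": "Thank you for your donation",
    "text": "Your receipt is attached."
  }
}
EOF
)

# Uncomment to actually send (requires a valid key and network access):
# curl -s -X POST https://mandrillapp.com/api/1.0/messages/send.json \
#      -H 'Content-Type: application/json' -d "$payload"

echo "$payload"
```

Deliverability and reporting come from routing mail through this API instead of your webserver’s local mail daemon: every message gets tracked, and sends come from IPs with established sender reputations.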

As the team who originally partnered with MailChimp to build the Mandrill module, we might be a bit biased, but we think it’s one of the best transactional email services out there that integrates with Drupal. We have many clients that use both MailChimp and Mandrill, and many that use Mandrill by itself. In fact, we just completed a Mandrill 8.x-1.0-alpha1 release with all the great features of Mandrill for Drupal 8.

Change is inevitable though, and MailChimp has decided to refocus the Mandrill service on their core value of delivering customized email to users. This doesn’t mean Mandrill is going away by any means, just that its new focus will be on delivery of 1:1 customized emails as opposed to 1:many emails. MailChimp already provides a great way of delivering 1:many emails as a part of your email campaigns, and Mandrill will focus on the customized 1:1 delivery of email. This should reduce the number of users using Mandrill to send 1:many emails, many of which are spammers abusing the system.

You can read more about the announcement on the MailChimp blog along with the FAQ they prepared for customers. At the heart of the change is the requirement of having a paid MailChimp account and purchasing Mandrill credits to use the Mandrill service. Your MailChimp and Mandrill accounts can be merged on March 16th and a single monthly subscription will be maintained going forward. According to the MailChimp FAQ, the minimum cost of having both services will be $10 for the basic MailChimp account and $20 for 25,000 Mandrill emails sent in a month.

This does mean Mandrill as a free service is going away and for some people, this means that they’ll need to find an alternative transactional email service. The good news is that there are alternatives out there, and some are even honoring Mandrill’s free level. If you’re looking for one of those free alternatives, try SparkPost, Amazon SES, or SendGrid. Not all of these services have Drupal modules at this point, but many do, and could be worth trying out.

Our recommendation for many of our clients is to stick with Mandrill for a number of reasons. For those that already use MailChimp and Mandrill, the additional monthly cost is not significant enough to switch in many cases, and having a single account to manage can be beneficial. For those clients that only use Mandrill now, adding a new monthly cost can feel a bit more burdensome. The stability of the Mandrill service and Drupal module does outweigh the cost of switching in many situations, but we are sensitive to our clients’ budgets and may look at the alternatives if it makes sense. It’s important to note that all Mandrill accounts must be merged with an existing MailChimp account by April 27th.

What does this mean for the Mandrill module for Drupal and our ongoing support and maintenance of it? MailChimp has assured us they are committed to the Mandrill service for the foreseeable future and will be looking at ways MailChimp and Mandrill can work better together. We also see a lot of ways the two modules can be combined and managed as one, making it easier for users to view activity across the two services in one place. At this point, we will continue our support and development of the module, including the new Mandrill D8 release. We are excited to see where the Mandrill module goes and the additional value that MailChimp adds to the service going forward. Stay tuned for the next chapter of Mandrill!

Feb 23 2016
Feb 23

A fundamental part of ThinkShout’s philosophy and business is to not only use great open source tools, but to actively give back to the open source community. The most recognizable part of that contribution is in the form of Drupal modules: over 60,000 Drupal sites run on ThinkShout technology! This strategy has been a huge win for our clients, who get features, bugfixes, and security updates from the community because they are sharing code. We benefit from this work as well, as it enhances our profile in the community and helps us generate leads for new projects. Of course, the vast majority of these sites are built on Drupal 7, which was released in the same month that ThinkShout was founded.

The exciting and much-anticipated release of Drupal 8 has given us a lot to think about. Internally, we’ve been running a Drupal 8 “book club” to give our development team a chance to dive in and figure out what’s going on with D8. While we’ve bathed in the glories of Composer, Twig, and YAML files galore, we’ve also had a chance to start upgrading a few of our favorite modules to Drupal 8!

With support from the incredible folks over at MailChimp, we’ve already got a working release of the MailChimp integration module for Drupal 8, and are hard at work on the integration with Mandrill, MailChimp’s awesome transactional email service, which may have a Drupal 8 release before this blog post goes live.

This is a great start, but it’s really only the tip of the iceberg: ThinkShout has about 50 modules released for Drupal 7! As much as we’d love to dive in and update all of them for Drupal 8 today, that’s not particularly practical. To better focus our work, I analyzed some of these modules so we can prioritize them and look for opportunities to work with our partners and clients to get the most useful, popular, and important modules upgraded to Drupal 8 first.

Of our 50 modules, we started by de-prioritizing anything that was:

That left us with around 10 projects, among them MailChimp and Mandrill, which we were already working on. We wanted to pick a manageable number of these remaining modules to get started on.

Based on community usage, the priorities of our clients, and perceived usefulness, it was clear that the Registration module belonged on this list. The story of Registration’s development is connected to the story of ThinkShout’s fledgling years and open source philosophy, so it’s an added bonus that Registration will be part of our early push into Drupal 8.

ThinkShout has also carved out a reputation as experts in the CRM world, with RedHen, our leading Drupal-integrated CRM, and the Salesforce Suite, a fabulous tool for integrating Drupal sites with Salesforce. Though these modules don’t have the 5-digit usage numbers that Registration or MailChimp have, they still have lots of users who are very engaged, and are central to the needs of our clients. We added them to the top of the list for Drupal 8 consideration.

In thinking about the rest of our modules and the nature of our work, it became clear that these three projects really stand out from the rest: they are our “Big 3”, and we set about creating a roadmap for developing them on Drupal 8.

You can already see the beginnings of this work! At our team sprint on February 11, we put together an outline for bringing RedHen to Drupal 8, and pushed the first commits to Drupal.org.

(Image: These are our sprint faces!)

As of February 11, all of the Big 3 have nominal Drupal 8 branches.

As we kick off four Drupal 8 sites in the first part of this year, we will be working with our clients to bring Registration, RedHen CRM, and Salesforce Suite to Drupal 8. All three should update beautifully, as they are built on top of Entity API, which is part of Core in D8.

We will also be focusing our internal open source contribution hours on these three projects to kickstart their jump into the Drupal 8 sea. If you’re looking for awesome CRM or registration systems for your Drupal 8 site, fear not! They are on their way.

We have two Drupal 8 sites utilizing ThinkShout core technologies scheduled for launch this summer, so look for a release of RedHen in the spring!

Our next round of prioritization will depend significantly on the progress of Commerce solutions in Drupal 8: once that landscape settles, we have some projects that will jump up that priority list, including:

So if you’re a fan of our Commerce integrations, or Add to Cal, or even little Bean Entity View (I know I am): stay tuned! We love these tools, we love that you’re using them, and we look forward to bringing you even more awesome stuff for Drupal 8 than we have for Drupal 7!

Jan 28 2016
Jan 28

There are two exciting Drupal community events happening in Portland soon. The first is the Drupal Global Sprint Day on January 30th - this coming Saturday - which is a day focused on bringing together volunteers to contribute work such as documentation, testing, code, and design to the Drupal project. The project needs improvements from a wide variety of skill sets, and it’s a great way for new folks to contribute to Drupal. The second is Drupal Global Training Day, a free Drupal 8 training for new community members. We’re thrilled to be involved with both!

Drupal Global Sprint

Bring your projects and come code with us!

We’re hosting the Portland sprint at our office. Bring your projects and come code with us! If you’ve wanted to contribute to Drupal 8, but don’t know how to begin, we’re happy to help you get started. New contributors are encouraged to attend, as we will be providing sprint training and new contributor onboarding, so don’t worry if you’ve never contributed to Drupal before. The sprint starts at 9:00 am and goes until 5:00 pm. Programming help, snacks, coffee, tables, and wifi will all be provided by ThinkShout.

Drupal 8 Training

These two global Drupal events offer something for Drupal folks of all skill levels

February 6th is the Drupal Global Training Day. We will be leading the Portland training at the Drupal Association headquarters, and it’s open to everyone. This free training is ideal for new community members and people who are new to Drupal – but PHP developers not familiar with Drupal should also find the training valuable. The training includes coffee and snacks. Participants need only bring a laptop. Everything you need to know to get started will be discussed in detail at the event. We’ll cover:

  • An introduction to CMS
  • File management and databases
  • Site building basics with content types, fields, and views
  • Installation of modules and themes
  • Deploying to your web host with Git
  • Introduction to Drupal 8 theming with Twig templates
  • Drupal 8 configuration management
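One item in the outline above, deploying to your web host with Git, can be sketched in a few commands. Everything here is illustrative: the file name is a stand-in for your Drupal codebase, and the commented-out remote URL is hypothetical (your host will give you the real one).

```shell
# Work in a throwaway scratch directory.
demo_dir=$(mktemp -d)
cd "$demo_dir"

# Start tracking your Drupal codebase with Git.
git init -q .
echo "name: Example Theme" > example.info.yml   # stand-in for your site's code
git add example.info.yml
git -c user.email=demo@example.org -c user.name=Demo \
    commit -qm "Initial Drupal site"

git log --oneline   # shows the new commit

# Pushing to your web host (remote name and URL are hypothetical;
# hosts like Pantheon and Acquia expose a Git endpoint you push to):
# git remote add production ssh://user@example-host.com/~/site.git
# git push production master
```

The appeal of this workflow, and why it comes up in an introductory training, is that deployment becomes a single `git push` instead of hand-copying files over FTP.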

These two global Drupal events offer something for Drupal folks of all skill levels, helping us to tap into Portland’s strong Drupal community. I hope you’ll join us for either (or both!) of these great events.

Jan 18 2016
Jan 18

This past summer, Sean shed some light on one of our most exciting Drupal contributions to date: RedHen Raiser, our open source peer-to-peer fundraising solution, the first of its kind to be built entirely with Drupal. Thanks to the Capital Area Food Bank of Washington, D.C., we finally had the opportunity to build a platform that would provide fundraisers a highly-customizable, Drupal alternative to SAAS applications like Causevox and Crowdrise.

After a year of leveraging Raiser as their vehicle for crowdfunding, we’re thrilled to see that the Capital Area Food Bank of Washington, D.C. brought in over $324,000, representing tens of thousands of pounds of food, through 17 different campaigns to help fight hunger in their community.

The Food Bank recently shared an infographic with its constituents that breaks down the donations they received through their RedHen Raiser campaigns over this past year.


It’s really exciting for us to think of what other organizations will accomplish with RedHen Raiser. Raiser’s features allow fundraisers to tailor their campaigns to fit a variety of causes. The Food Bank utilized the Teams feature to give local law firms their own fundraising pages during the “Food from the Bar” campaign. There are so many other possible applications: walk-a-thons, food drives, even parties - as the Food Bank has shown us - have the potential to be great fundraising opportunities for your organization, and they can all be easily managed with RedHen Raiser.

We can’t thank the Food Bank enough for working with us to release RedHen Raiser as an open source platform so other members of the Drupal nonprofit community can benefit from it, too – and, of course, make it even better.

Is your organization using RedHen Raiser to crowdfundraise? Get in touch with us. We’d love to hear about your experience and share your story.

Jan 14 2016
Jan 14

"Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning."

- Winston Churchill

While I fully realize it’s not fair to compare a major turning point at an organization to one in an existential battle over the future of the civilized world, Winston Churchill was clearly onto something. At this point, the news is out that Sean is leaving ThinkShout. I still wince when I see those words, although we’re all getting used to the idea.

In my mind, the “beginning” in this case is the early stages of a company started by a couple of guys with a vague notion of wanting to do well by doing good. At this point, ThinkShout is so much more. It is 22 intelligent, compassionate, talented, and dedicated professionals. It is a collection of accumulated knowledge and processes that allow us to partner with organizations like the Southern Poverty Law Center that we look upon with awe and respect. It is the combined innovation of 60,000 websites running on our contributions. It is the experience and insights we can offer on how to best use those tools.

As we built this company, Sean and I were often challenged by mentors and our leadership team on what our goals were for ThinkShout. The answers varied over the years, but one key point was consistent: we both dreamed of building an organization that was much more than either of us, that could live on when we inevitably moved on.

For Sean, that time is now. And, even though he’s incredibly humble and won’t say it himself, he’s leaving with his head held high, incredibly proud of having accomplished what he set out to do.

I can’t thank him enough for going on this journey with me these past 6 years. I don’t want to sugarcoat things. There have been plenty of challenges, as there are in any relationship, but Sean’s been a wonderful business partner. He’s challenged me in countless ways, supported me when I’ve been down, and done the dirty work to make the business run when I’ve played the prima donna engineer role. In short, he’s set ThinkShout up for amazing success as he moves on. And for that, we’re all grateful.

Alright, so that’s the beginning. What’s next?

As part of Sean’s transition, we’ve made a number of promotions that were already under consideration that I’m incredibly proud to announce here.

Brett Meyer, Chief Strategy Officer

Brett will run point on our business development efforts, as well as continue to lead our strategy and UX teams. He will work closely with Stephanie Gutowski, who was recently promoted to Marketing Manager. He will also support Natania LeClerc, who will lead ThinkShout’s new digital fundraising practice area. We’re also growing our strategy team, with an immediate opening for a Senior UX Designer and more to come. Brett has been pivotal in getting ThinkShout to where it is today and his ascension to this leadership position is well deserved. He crafts our solutions in the sales process and during discovery, leads our UX and IA practice, and is a well-recognized thought leader in the nonprofit technology community. Just try getting a seat during one of his talks at the Nonprofit Technology Conference.

Alex MacMillan, Chief Operating Officer

In her role as Director of Project Management, Alex has led our PM team, defined our project delivery process, and managed resource allocation. In her 2 years with us, she has become the heartbeat of our projects, drastically increasing client satisfaction and project success rates. As COO, she’ll also take on oversight of the financial health of individual projects and the company as a whole. We’re also growing our project management team, looking for another senior project manager to partner with our clients to help them have a positive impact.

With Alex’s new role, Krista Van Veen, who has been our Operations Manager, has a new title: Manager of Culture and People. This represents her focus on HR, recruiting, and community engagement, although she’ll continue to play a critical role in managing ThinkShout’s finances. Krista has worked hard to convert our vision and values into actual cultural changes, as demonstrated by her leadership in attaining our recent B Corp certification.

Tauno Hogue, Chief Technology Officer

Tauno is ThinkShout’s most senior employee, having been with us, quite literally, since the beginning. In that time, he’s grown from a talented developer into the leader of our engineering team. He takes on some of our most complex technical challenges, provides mentorship, and helps define our development workflow. In his new role as CTO, he’ll continue doing more of the same, along with ensuring that we continue to offer innovative open source solutions for our forward-thinking clients. And, you guessed it, we’re also hiring on the engineering side of things.

Given that I had the title of CTO, clearly something else has to change. I am assuming Chief Executive Officer (CEO) responsibilities. This will entail setting the strategic direction of the company in partnership with the leadership team, along with being an external representative of ThinkShout. Most importantly to me, I will remain very actively involved in crafting innovative technical solutions for the challenges facing our clients. But, really, my biggest job will be to stay out of the way of the amazing leadership that we now have in place.

"To improve is to change; to be perfect is to change often."

- Winston Churchill

Sticking with the Churchill theme, it’s important to note that this is far from the end of the changes that we expect to see at ThinkShout in the coming months.

First, and I mean this sincerely, there is absolutely nothing that we are doing perfectly. Far from it. We’ll continue to iterate on everything, including our organizational structure. We’re going to get some stuff wrong, and we’ll adjust accordingly.

But, on the whole, I have never been more confident in, and excited for, the future. We’re due to kick off our largest project yet with an organization that we’ve all dreamed of working with. We have four Drupal 8 projects lined up to kick off in Q1. We’re launching a new Digital Fundraising practice area. We’re welcoming five new team members over the next few months.

None of this would be possible without everything that Sean has put into ThinkShout up to this point. For that, we thank him. And we wish him luck on his next grand adventure – which, rumor has it, might involve pickling or waxed canvas accessories… Stay tuned, I know it will be great.

Jan 14 2016
Jan 14

As most of my friends and colleagues in the Drupal and nonprofit technology communities know, I joyously became a dad about a year and a half ago. Stepping into fatherhood involved a move to Denver, away from our Portland-based team at ThinkShout. Since then, I’ve flown over 100,000 miles, juggling my responsibilities as CEO at ThinkShout with those as a parent in Colorado. It’s been tricky to say the least. I am incredibly thankful for the support of my team, especially that of my thoughtful and always encouraging business partner, Lev Tsypin.

That said, it’s time for a change. It’s time for me to be fully present in Colorado, focused on my family, embracing this new community.

After a lot of conversation with Lev over the past few months, I have decided to step down as CEO at ThinkShout. I have handed over my ownership for an incremental buyout that keeps the company in a strong place financially.

I will continue to serve this team and our clients through this transition for the next three to six months as our Director of Sales. Then, I look forward to taking a break from the world of professional services and technology to try some new things. (Honest to goodness, professional pickling is on my short list. Let’s chat more about that over a beer.)

I will always be an advocate for this incredible team.

I am sad to leave right as we start tackling a new wave of technology challenges with the release of Drupal 8, as we begin to serve exciting new clients like the Humane Society, and as we kick off a new digital fundraising practice area.

At the same time, I could not feel more confident in the future of this company. We have invested heavily the last two years in our management team, organizational structure, and processes. We have attracted talented engineers, project managers, designers, and strategists. While hopefully some folks on staff will notice that I’m gone, as a team, ThinkShout will not miss a beat in serving its mission-driven clients.

ThinkShout will continue telling stories of good people making lasting change. Now I just get to tell ThinkShout’s story as a fan and supporter in Colorado.


(My son Ernie is a big fan of ThinkShout, too.)

Jan 11 2016
Jan 11

Five years ago today, Sean and I met on the rainy steps of our attorney’s office in downtown Portland. We walked inside and proceeded to sign a small mountain of paperwork to form ThinkShout, Incorporated. Then, the two of us biked back to the 100 square foot office we’d been sharing for the previous year to figure out what the heck it meant to start a company together.

Apart from our name and commitment to helping organizations that are making a positive impact in the world, it’s safe to say that pretty much everything else at ThinkShout has changed since that first day, including the grey in our beards! Personally, the challenges we face with constant change are among the things I like most about my job. As I often say, running ThinkShout has been a new job every 6 months or so, and I’m more excited than ever to see what the future will hold.


In thinking about all that the team has accomplished and everything we’ve experienced these last five years, I spent some time looking at my first annual wrap-up post. What immediately stands out to me is that when we first started the company, most of our focus and pride came from talking about our technology contributions. We dreamt of being the “smartest geeks in the room” and bragged about how everyone at ThinkShout wrote code. Wow, was that ever naive and foolish. Just one of the many, MANY, lessons that Sean and I learned as we stumbled down this path. We now understand that it takes so much more than solid technology to help an organization accomplish its goals. You need strategists to define solutions, designers to create delightful user experiences, and project managers to guide projects to success.

Of course, we are still inspired by open source innovations. In fact, this year the aggregate install base of our Drupal contributions peaked at 60,000 websites and we released perhaps our most innovative solution yet, RedHen Raiser, a viral fundraising platform based on RedHen CRM.

At the same time, our understanding of what it means to build a sustainable, values-driven business has matured. We have grown our team to 21 full-time staff, adding service offerings such as user experience design and digital fundraising strategy. We have also grown our leadership team, creating a “small council” of directors who’ve been tasked with making sure that we are responsive to the needs of both our staff and our customers.

It’s because of the strength of this team that we’ve also been able to work with some amazing clients this last year, including the relaunch of the Southern Poverty Law Center’s website, and the kickoff of a new website for The Humane Society of the United States. We launched 11 client websites and helped 43 organizations craft compelling digital experiences for their constituents.

This last year saw us focus on more than just our client engagements, as we increased our investment into community engagement by almost 50%. We hosted and sponsored over a dozen local events, many focused on fostering diversity in the tech industry. Similarly, we sponsored six regional and five national conferences, with our team leading over 30 hours of community trainings at these events.

We also undertook a major initiative to redefine our mission, vision, and values. Having these guideposts is critical so we don’t “lose our way” as we continue to grow. In one of my proudest experiences of being part of this team, we achieved our longtime goal of becoming B Corp certified. B Corp certification provides us with guidelines, metrics, and a community of practice for all of the business decisions that we make in support of our team, our clients, our local community, and the environment. B Corp certification is our roadmap for continuing to make intentional choices that put people and the planet above profit as we grow and continue to change over the next five, ten, fifteen years and beyond.



